
Pyomo.DoE: Add symbolic differentiation gradient calculation option as an alternative to finite differences #7


Draft: wants to merge 33 commits into base: master

Conversation

@adowling2 adowling2 commented Jun 4, 2025

Fixes # .

Summary/Motivation:

Changes proposed in this PR:

  • Replaced FiniteDifferenceMethod with GradientMethod
  • Plan to deprecate fd_method in favor of gradient_method as an input for Pyomo.DoE
  • Added symbolic as an option in GradientMethod
  • Updated tests
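A minimal sketch of what the proposed enum might look like (the member names come from this PR's description and Pyomo.DoE's existing finite difference options; the actual implementation may differ):

```python
from enum import Enum

# Hypothetical sketch of the proposed GradientMethod enum. Member names
# mirror the PR description; the real Pyomo.DoE code may differ.
class GradientMethod(Enum):
    forward = "forward"
    central = "central"
    backward = "backward"
    symbolic = "symbolic"  # new option added by this PR

# Lookup by value, e.g. when parsing a user-supplied string:
assert GradientMethod("symbolic") is GradientMethod.symbolic
```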

Legal Acknowledgement

By contributing to this software project, I have read the contribution guide and agree to the following terms and conditions for my contribution:

  1. I agree my contributions are submitted under the BSD license.
  2. I represent I am authorized to make the contributions and grant the license. If my employer has rights to intellectual property that includes these contributions, I represent that I have received permission to make contributions and grant the required license on behalf of that employer.

@adowling2 (Owner Author)

Here is the current error message:

  File "/Users/adowling/GitHub/pyomo/pyomo/contrib/doe/doe.py", line 777, in _kaug_FIM
    self.kaug_FIM = self.kaug_jac.T @ cov_y @ self.kaug_jac + self.prior_FIM
ValueError: matmul: Input operand 1 has a mismatch in its core dimension 0, with gufunc signature (n?,k),(k,m?)->(n?,m?) (size 27 is different from 4)

Here is output from some extra print statements:

Dimensions of kaug_jac =  (4, 25)
Dimensions of cov_y (27, 27)
Dimensions of prior_FIM (4, 4)

Why does my new PyNumero code use 25 measurements but the old code considers 27? I think the old code is counting the ODE initial conditions as both a measurement (output) and experiment input. But this needs to be investigated.
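The mismatch is easy to reproduce with plain NumPy using the shapes from the log (a sketch; the arrays below are placeholders, not the real sensitivities):

```python
import numpy as np

# Shapes from the log: kaug_jac is (4, 25), cov_y is (27, 27).
jac = np.zeros((4, 25))
cov_y = 100.0 * np.eye(27)

raised = False
try:
    fim = jac.T @ cov_y @ jac  # (25, 4) @ (27, 27): core dims 4 vs 27
except ValueError:
    raised = True
assert raised

# With all 27 measurements present in the Jacobian, the product is well-posed:
jac_full = np.zeros((27, 4))
fim = jac_full.T @ cov_y @ jac_full
assert fim.shape == (4, 4)
```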

@adowling2 (Owner Author)

From my "shunk-works" branch dowlinglab/pyomo-doe#8 ...

Experiment model size:
  179 total variables
  27 outputs (measurements)
  10 inputs
  4 unknown parameters
  175 constraints

New thought: something is not getting fixed properly before calling my AD/SD code.

@adowling2 (Owner Author)

Good news: made the most recent change (fixing input parameters and variables).

Bad news: solver failure on a machine without a good Ipopt build:

WARNING: Loading a SolverResults object with a warning status into
model.name="unknown";
    - termination condition: other
    - message from solver: <undefined>
WARNING: :::::::::::Warning: Cannot converge this run.::::::::::::
--- Logging error ---
Traceback (most recent call last):
  File "/Users/adowling/DowlingLab/pyomo/pyomo/contrib/doe/doe.py", line 605, in _sequential_FIM
    pyo.assert_optimal_termination(res)
  File "/Users/adowling/DowlingLab/pyomo/pyomo/opt/results/solver.py", line 196, in assert_optimal_termination
    raise RuntimeError(msg)
RuntimeError: Solver failed to return an optimal solution. Solution status: warning, Termination condition: other

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/adowling/DowlingLab/pyomo/pyomo/contrib/doe/doe.py", line 1636, in compute_FIM_full_factorial
    self.compute_FIM(model=model, method=method)
  File "/Users/adowling/DowlingLab/pyomo/pyomo/contrib/doe/doe.py", line 515, in compute_FIM
    self._sequential_FIM(model=model)
  File "/Users/adowling/DowlingLab/pyomo/pyomo/contrib/doe/doe.py", line 609, in _sequential_FIM
    raise RuntimeError(
RuntimeError: Model from experiment did not solve appropriately. Make sure the model is well-posed.

@adowling2 (Owner Author)

Actually, there is still a dimension mismatch error:

Dimensions of kaug_jac =  (4, 25)
Dimensions of cov_y (27, 27)
Dimensions of prior_FIM (4, 4)

@adowling2 adowling2 marked this pull request as draft June 6, 2025 22:51
@adowling2 (Owner Author)

Confirmed the issue: my AD code is not considering that CA and CB at time zero are measurements. I suspect that is because these are experiment inputs and thus fixed.

@adowling2 (Owner Author)

Interesting... CA[0] is an experiment input, but CB[0] is not. By construction, for this problem, we fixed CB[0] = 0.

Should we allow the user to specify fixed quantities as measurements? If so, the gradient (sensitivity) with respect to the parameters is exactly zero, and I should be able to handle this zero gradient.
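One way to handle this, sketched with a hypothetical helper: build the Jacobian over the free measurements only, then explicitly insert zero rows at the indices of the fixed measurements (names, indices, and shapes below are illustrative):

```python
import numpy as np

def insert_zero_rows(jac_free, fixed_idx, n_total):
    """Expand an (n_free, n_param) Jacobian to (n_total, n_param),
    placing all-zero rows at the indices of fixed measurements
    (a fixed measurement has zero sensitivity to the parameters)."""
    full = np.zeros((n_total, jac_free.shape[1]))
    free_idx = [i for i in range(n_total) if i not in set(fixed_idx)]
    full[free_idx, :] = jac_free
    return full

# 25 free measurements, 4 parameters; two fixed initial-condition entries.
jac_free = np.ones((25, 4))
jac_full = insert_zero_rows(jac_free, fixed_idx=[0, 9], n_total=27)
assert jac_full.shape == (27, 4)
assert not jac_full[0].any() and not jac_full[9].any()
```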

@adowling2 (Owner Author)

Results from symbolic sensitivity analysis (reactor example):

example_reactor_compute_FIM_A_opt
example_reactor_compute_FIM_D_opt
example_reactor_compute_FIM_E_opt
example_reactor_compute_FIM_ME_opt

@adowling2 (Owner Author) commented Jun 11, 2025

Results with central finite difference and sequential:

example_reactor_compute_FIM_A_opt
example_reactor_compute_FIM_D_opt
example_reactor_compute_FIM_E_opt
example_reactor_compute_FIM_ME_opt

@adowling2 (Owner Author)

Results generated with kaug:

example_reactor_compute_FIM_A_opt
example_reactor_compute_FIM_D_opt
example_reactor_compute_FIM_E_opt
example_reactor_compute_FIM_ME_opt

@adowling2 (Owner Author)

Comparison of trace:

  • Same shape, but symbolic values are lower

Comparison of determinant:

  • Same shape, smallest value is lowest with symbolic

Ideas:

  • Check that symbolic is assembling the measurement errors correctly
  • Compare Jacobians at a single point
  • Symbolic explicitly adds a zero row for fixed measurements; the other methods may show a small nonzero gradient there due to solver convergence tolerances
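For the second idea, a quick sketch of a point-wise Jacobian comparison that tolerates the small nonzero entries expected from solver convergence tolerances (the tolerance value is an assumption):

```python
import numpy as np

def compare_jacobians(jac_a, jac_b, tol=1e-6):
    """Return the largest absolute entry-wise difference and its location;
    entries below tol are zeroed out first, so solver noise on the
    structurally-zero rows does not dominate the comparison."""
    a = np.where(np.abs(jac_a) < tol, 0.0, jac_a)
    b = np.where(np.abs(jac_b) < tol, 0.0, jac_b)
    diff = np.abs(a - b)
    loc = np.unravel_index(np.argmax(diff), diff.shape)
    return diff[loc], loc

a = np.array([[1.0, 2e-15], [0.5, 2.0]])
b = np.array([[1.0, 0.0], [0.5, 2.1]])
max_diff, loc = compare_jacobians(a, b)
assert loc == (1, 1) and np.isclose(max_diff, 0.1)
```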

@adowling2 (Owner Author) commented Jun 11, 2025

Results for a single point using kaug:

kaug_jac
: [[ 0.00000000e+00  0.00000000e+00  0.00000000e+00  0.00000000e+00]
 [-1.63468975e+00 -7.20926079e-15  3.09547195e+00 -3.79355202e-14]
 [-1.33002472e+00 -1.79960585e-14  2.89441710e+00  1.24206956e-14]
 [-1.02460753e+00 -1.21392639e-14  2.41080622e+00  3.24985417e-15]
 [-7.61613008e-01 -8.14617661e-15  1.88410535e+00 -7.98770880e-16]
 [-5.52026118e-01 -5.44195302e-15  1.41418067e+00 -2.27373609e-15]
 [-3.92682429e-01 -3.62105040e-15  1.03222972e+00 -2.53378302e-15]
 [-2.75314860e-01 -2.40099454e-15  7.38175156e-01 -2.28132668e-15]
 [-1.90812791e-01 -1.58704881e-15  5.19693626e-01 -1.86332539e-15]
 [ 0.00000000e+00  0.00000000e+00  0.00000000e+00  0.00000000e+00]
 [ 5.33726838e-01 -1.24246100e+00 -1.03703316e+00  4.53010295e+00]
 [ 1.88837814e-01 -1.33332768e+00 -7.33381777e-01  5.39096253e+00]
 [-1.19831767e-01 -1.42212773e+00 -1.97331587e-01  6.20642210e+00]
 [-3.55637395e-01 -1.49674329e+00  3.21685806e-01  6.90683175e+00]
 [-5.16043971e-01 -1.55195047e+00  7.31799560e-01  7.46377434e+00]
 [-6.12063742e-01 -1.58648627e+00  1.01535861e+00  7.87240556e+00]
 [-6.58324208e-01 -1.60126116e+00  1.18593874e+00  8.14070357e+00]
 [-6.68686268e-01 -1.59828246e+00  1.26716070e+00  8.28300293e+00]
 [ 0.00000000e+00  0.00000000e+00  0.00000000e+00  0.00000000e+00]
 [ 1.10096291e+00  1.24246100e+00 -2.05843879e+00 -4.53010295e+00]
 [ 1.14118691e+00  1.33332768e+00 -2.16103533e+00 -5.39096253e+00]
 [ 1.14443929e+00  1.42212773e+00 -2.21347463e+00 -6.20642210e+00]
 [ 1.11725040e+00  1.49674329e+00 -2.20579115e+00 -6.90683175e+00]
 [ 1.06807009e+00  1.55195047e+00 -2.14598023e+00 -7.46377434e+00]
 [ 1.00474617e+00  1.58648627e+00 -2.04758834e+00 -7.87240556e+00]
 [ 9.33639068e-01  1.60126116e+00 -1.92411390e+00 -8.14070357e+00]
 [ 8.59499060e-01  1.59828246e+00 -1.78685433e+00 -8.28300293e+00]]


cov_y
: [[100.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.
    0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.]
 [  0. 100.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.
    0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.]
 [  0.   0. 100.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.
    0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.]
 [  0.   0.   0. 100.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.
    0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.]
 [  0.   0.   0.   0. 100.   0.   0.   0.   0.   0.   0.   0.   0.   0.
    0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.]
 [  0.   0.   0.   0.   0. 100.   0.   0.   0.   0.   0.   0.   0.   0.
    0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.]
 [  0.   0.   0.   0.   0.   0. 100.   0.   0.   0.   0.   0.   0.   0.
    0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.]
 [  0.   0.   0.   0.   0.   0.   0. 100.   0.   0.   0.   0.   0.   0.
    0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.]
 [  0.   0.   0.   0.   0.   0.   0.   0. 100.   0.   0.   0.   0.   0.
    0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.]
 [  0.   0.   0.   0.   0.   0.   0.   0.   0. 100.   0.   0.   0.   0.
    0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.]
 [  0.   0.   0.   0.   0.   0.   0.   0.   0.   0. 100.   0.   0.   0.
    0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.]
 [  0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0. 100.   0.   0.
    0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.]
 [  0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0. 100.   0.
    0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.]
 [  0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0. 100.
    0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.]
 [  0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.
  100.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.]
 [  0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.
    0. 100.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.]
 [  0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.
    0.   0. 100.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.]
 [  0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.
    0.   0.   0. 100.   0.   0.   0.   0.   0.   0.   0.   0.   0.]
 [  0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.
    0.   0.   0.   0. 100.   0.   0.   0.   0.   0.   0.   0.   0.]
 [  0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.
    0.   0.   0.   0.   0. 100.   0.   0.   0.   0.   0.   0.   0.]
 [  0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.
    0.   0.   0.   0.   0.   0. 100.   0.   0.   0.   0.   0.   0.]
 [  0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.
    0.   0.   0.   0.   0.   0.   0. 100.   0.   0.   0.   0.   0.]
 [  0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.
    0.   0.   0.   0.   0.   0.   0.   0. 100.   0.   0.   0.   0.]
 [  0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.
    0.   0.   0.   0.   0.   0.   0.   0.   0. 100.   0.   0.   0.]
 [  0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.
    0.   0.   0.   0.   0.   0.   0.   0.   0.   0. 100.   0.   0.]
 [  0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.
    0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0. 100.   0.]
 [  0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.
    0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0. 100.]]


kaug_FIM
: [[  1745.81343391   1599.21859987  -3512.47892155  -7589.26220445]
 [  1599.21859987   3525.63856364  -2900.94638673 -16465.46338508]
 [ -3512.47892155  -2900.94638673   7190.13048958  13849.96839993]
 [ -7589.26220445 -16465.46338508  13849.96839993  77674.03976715]]


prior_FIM
: [[0. 0. 0. 0.]
 [0. 0. 0. 0.]
 [0. 0. 0. 0.]
 [0. 0. 0. 0.]]

@adowling2 (Owner Author) commented Jun 11, 2025

Results using symbolic:

kaug_jac
: [[ 0.          0.          0.          0.        ]
 [-0.01927928  0.          0.39787557  0.        ]
 [-0.0156861   0.          0.37203305  0.        ]
 [-0.01208406  0.          0.30987226  0.        ]
 [-0.00898234  0.          0.24217292  0.        ]
 [-0.00651051  0.          0.18177129  0.        ]
 [-0.00463124  0.          0.13267734  0.        ]
 [-0.00324702  0.          0.09488113  0.        ]
 [-0.00225042 -0.          0.06679867 -0.        ]
 [ 0.          0.          0.          0.        ]
 [ 0.00629469 -0.00334246 -0.13329475  0.30100352]
 [ 0.00222712 -0.00358691 -0.09426501  0.35820349]
 [-0.00141328 -0.0038258  -0.02536396  0.41238685]
 [-0.00419433 -0.00402653  0.04134779  0.4589257 ]
 [-0.00608614 -0.00417505  0.09406164  0.49593185]
 [-0.00721858 -0.00426796  0.13050882  0.52308343]
 [-0.00776417 -0.00430771  0.15243429  0.54091054]
 [-0.00788638 -0.00429969  0.16287413  0.55036564]
 [ 0.          0.         -0.         -0.        ]
 [ 0.01298458  0.00334246 -0.26458082 -0.30100352]
 [ 0.01345898  0.00358691 -0.27776804 -0.35820349]
 [ 0.01349734  0.0038258  -0.28450831 -0.41238685]
 [ 0.01317668  0.00402653 -0.28352071 -0.4589257 ]
 [ 0.01259665  0.00417505 -0.27583293 -0.49593185]
 [ 0.01184982  0.00426796 -0.26318616 -0.52308343]
 [ 0.01101119  0.00430771 -0.24731541 -0.54091054]
 [ 0.0101368   0.00429969 -0.22967279 -0.55036564]]


cov_y: 27x27 diagonal matrix with 100 on the diagonal (identical to the matrix printed above)


kaug_FIM
: [[ 2.42833478e-01  5.07396268e-02 -5.32463073e+00 -5.94728050e+00]
 [ 5.07396268e-02  2.55156086e-02 -1.00309988e+00 -2.94321189e+00]
 [-5.32463073e+00 -1.00309988e+00  1.18789370e+02  1.18285820e+02]
 [-5.94728050e+00 -2.94321189e+00  1.18285820e+02  3.42927958e+02]]


prior_FIM
: [[0. 0. 0. 0.]
 [0. 0. 0. 0.]
 [0. 0. 0. 0.]
 [0. 0. 0. 0.]]

@adowling2 (Owner Author)

I should make a much easier test example without ODEs/DAEs.

@adowling2 (Owner Author)

Set scaling to False; symbolic results did not change.

Results using kaug without scaling:

Dimensions of kaug_jac =  (27, 4)
Dimensions of cov_y (27, 27)
Dimensions of prior_FIM (4, 4)
kaug_jac
: [[ 0.00000000e+00  0.00000000e+00  0.00000000e+00  0.00000000e+00]
 [-1.92792752e-02 -1.93943312e-17  3.97875573e-01 -2.52063257e-15]
 [-1.56861036e-02 -4.84129411e-17  3.72033047e-01  8.25295390e-16]
 [-1.20840609e-02 -3.26570103e-17  3.09872265e-01  2.15937154e-16]
 [-8.98234471e-03 -2.19148192e-17  2.42172924e-01 -5.30744771e-17]
 [-6.51050971e-03 -1.46399253e-17  1.81771294e-01 -1.51078810e-16]
 [-4.63123516e-03 -9.74133864e-18  1.32677342e-01 -1.68357675e-16]
 [-3.24702040e-03 -6.45914811e-18  9.48811255e-02 -1.51583168e-16]
 [-2.25041622e-03 -4.26947383e-18  6.67986666e-02 -1.23808996e-16]
 [ 0.00000000e+00  0.00000000e+00  0.00000000e+00  0.00000000e+00]
 [ 6.29469086e-03 -3.34246476e-03 -1.33294751e-01  3.01003518e-01]
 [ 2.22712365e-03 -3.58691402e-03 -9.42650099e-02  3.58203491e-01]
 [-1.41327712e-03 -3.82580363e-03 -2.53639571e-02  4.12386850e-01]
 [-4.19433182e-03 -4.02653418e-03  4.13477900e-02  4.58925698e-01]
 [-6.08614189e-03 -4.17505238e-03  9.40616402e-02  4.95931850e-01]
 [-7.21858405e-03 -4.26796047e-03  1.30508819e-01  5.23083426e-01]
 [-7.76417276e-03 -4.30770785e-03  1.52434286e-01  5.40910537e-01]
 [-7.88638128e-03 -4.29969455e-03  1.62874127e-01  5.50365643e-01]
 [ 0.00000000e+00  0.00000000e+00  0.00000000e+00  0.00000000e+00]
 [ 1.29845844e-02  3.34246476e-03 -2.64580821e-01 -3.01003518e-01]
 [ 1.34589799e-02  3.58691402e-03 -2.77768037e-01 -3.58203491e-01]
 [ 1.34973380e-02  3.82580363e-03 -2.84508308e-01 -4.12386850e-01]
 [ 1.31766765e-02  4.02653418e-03 -2.83520714e-01 -4.58925698e-01]
 [ 1.25966516e-02  4.17505238e-03 -2.75832935e-01 -4.95931850e-01]
 [ 1.18498192e-02  4.26796047e-03 -2.63186162e-01 -5.23083426e-01]
 [ 1.10111932e-02  4.30770785e-03 -2.47315411e-01 -5.40910537e-01]
 [ 1.01367975e-02  4.29969455e-03 -2.29672793e-01 -5.50365643e-01]]


cov_y: 27x27 diagonal matrix with 100 on the diagonal (identical to the matrix printed above)


kaug_FIM
: [[ 2.42833478e-01  5.07396268e-02 -5.32463073e+00 -5.94728050e+00]
 [ 5.07396268e-02  2.55156086e-02 -1.00309988e+00 -2.94321189e+00]
 [-5.32463073e+00 -1.00309988e+00  1.18789370e+02  1.18285820e+02]
 [-5.94728050e+00 -2.94321189e+00  1.18285820e+02  3.42927958e+02]]


prior_FIM
: [[0. 0. 0. 0.]
 [0. 0. 0. 0.]
 [0. 0. 0. 0.]
 [0. 0. 0. 0.]]

Conclusion: I need to implement scaling!
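The scaled and unscaled kaug Jacobians above differ by a constant factor per column, consistent with scaled sensitivities of the form (dy/dtheta) * theta. A sketch of that scaling, assuming each column is multiplied by its parameter's nominal value (an assumed form; verify against the Pyomo.DoE source):

```python
import numpy as np

def scale_jacobian(jac, theta_nominal):
    """Scale each column of an (n_meas, n_param) Jacobian by the
    corresponding nominal parameter value:
    dy/dtheta -> (dy/dtheta) * theta.
    (Assumed form of the scaling used by Pyomo.DoE.)"""
    return jac * np.asarray(theta_nominal)  # broadcasts across columns

jac = np.array([[0.1, 0.2],
                [0.3, 0.4]])
scaled = scale_jacobian(jac, [10.0, 100.0])
assert np.allclose(scaled, [[1.0, 20.0], [3.0, 40.0]])
```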

@adowling2 (Owner Author)

Results with scaling for symbolic:

kaug_jac
: [[ 0.          0.          0.          0.        ]
 [-1.63468975  0.          3.09547195  0.        ]
 [-1.33002472  0.          2.8944171   0.        ]
 [-1.02460753  0.          2.41080622  0.        ]
 [-0.76161301  0.          1.88410535  0.        ]
 [-0.55202612  0.          1.41418067  0.        ]
 [-0.39268243  0.          1.03222972  0.        ]
 [-0.27531486  0.          0.73817516  0.        ]
 [-0.19081279 -0.          0.51969363 -0.        ]
 [ 0.          0.          0.          0.        ]
 [ 0.53372684 -1.242461   -1.03703316  4.53010295]
 [ 0.18883781 -1.33332768 -0.73338178  5.39096253]
 [-0.11983177 -1.42212773 -0.19733159  6.2064221 ]
 [-0.35563739 -1.49674329  0.32168581  6.90683175]
 [-0.51604397 -1.55195047  0.73179956  7.46377434]
 [-0.61206374 -1.58648627  1.01535861  7.87240556]
 [-0.65832421 -1.60126116  1.18593874  8.14070357]
 [-0.66868627 -1.59828246  1.2671607   8.28300293]
 [ 0.          0.         -0.         -0.        ]
 [ 1.10096291  1.242461   -2.05843879 -4.53010295]
 [ 1.14118691  1.33332768 -2.16103533 -5.39096253]
 [ 1.14443929  1.42212773 -2.21347463 -6.2064221 ]
 [ 1.1172504   1.49674329 -2.20579115 -6.90683175]
 [ 1.06807009  1.55195047 -2.14598023 -7.46377434]
 [ 1.00474617  1.58648627 -2.04758834 -7.87240556]
 [ 0.93363907  1.60126116 -1.9241139  -8.14070357]
 [ 0.85949906  1.59828246 -1.78685433 -8.28300293]]


cov_y: 27x27 diagonal matrix with 100 on the diagonal (identical to the matrix printed above)


kaug_FIM
: [[  1745.81343391   1599.21859987  -3512.47892155  -7589.26220445]
 [  1599.21859987   3525.63856364  -2900.94638673 -16465.46338508]
 [ -3512.47892155  -2900.94638673   7190.13048958  13849.96839993]
 [ -7589.26220445 -16465.46338508  13849.96839993  77674.03976715]]


prior_FIM
: [[0. 0. 0. 0.]
 [0. 0. 0. 0.]
 [0. 0. 0. 0.]
 [0. 0. 0. 0.]]

@adowling2 (Owner Author)

More results using symbolic with scaling. The results are now consistent!
example_reactor_compute_FIM_A_opt
example_reactor_compute_FIM_D_opt
example_reactor_compute_FIM_E_opt
example_reactor_compute_FIM_ME_opt

@adowling2 (Owner Author)

Original Pyomo.DoE for the reaction kinetics example:

This is Ipopt version 3.13.2, running with linear solver ma27.

Number of nonzeros in equality constraint Jacobian...:     6370
Number of nonzeros in inequality constraint Jacobian.:        0
Number of nonzeros in Lagrangian Hessian.............:     1026

Total number of variables............................:     1608
                     variables with only lower bounds:      732
                variables with lower and upper bounds:      256
                     variables with only upper bounds:        0
Total number of equality constraints.................:     1598
Total number of inequality constraints...............:        0
        inequality constraints with only lower bounds:        0
   inequality constraints with lower and upper bounds:        0
        inequality constraints with only upper bounds:        0

iter    objective    inf_pr   inf_du lg(mu)  ||d||  lg(rg) alpha_du alpha_pr  ls
   0 -9.9386618e+00 1.54e-01 1.18e+00  -1.0 0.00e+00    -  0.00e+00 0.00e+00   0
   1 -9.9420153e+00 4.81e-02 1.31e-01  -1.0 1.26e+02    -  9.90e-01 9.90e-01h  1
   2 -9.9949580e+00 7.14e-02 9.90e+00  -1.0 8.07e+01    -  1.00e+00 9.90e-01H  1
   3 -1.0165394e+01 4.24e-03 3.03e+00  -1.0 2.57e+02    -  1.00e+00 1.00e+00H  1
   4 -1.0287458e+01 1.97e+01 3.62e+00  -1.7 1.83e+02    -  1.00e+00 1.00e+00f  1
   5 -1.0543793e+01 1.83e+02 1.09e+01  -1.7 5.22e+02    -  1.00e+00 1.00e+00h  1
   6 -1.0533052e+01 4.34e+00 1.74e+00  -1.7 5.95e+02    -  1.00e+00 1.00e+00h  1
   7 -1.0532473e+01 4.53e-02 1.64e-02  -1.7 2.58e+01    -  1.00e+00 1.00e+00h  1
   8 -1.0782349e+01 2.24e+02 3.76e+05  -3.8 5.33e+02    -  8.11e-01 1.00e+00f  1
   9 -1.1083263e+01 5.07e+02 1.21e+05  -3.8 1.89e+03    -  6.78e-01 1.00e+00h  1
iter    objective    inf_pr   inf_du lg(mu)  ||d||  lg(rg) alpha_du alpha_pr  ls
  10 -1.1226667e+01 2.45e+01 1.24e+04  -3.8 4.39e+02    -  8.97e-01 1.00e+00h  1
  11 -1.1288911e+01 2.11e+01 8.72e-01  -3.8 4.28e+02    -  1.00e+00 1.00e+00h  1
  12 -1.1290305e+01 1.11e+00 4.54e-02  -3.8 3.14e+02    -  1.00e+00 1.00e+00h  1
  13 -1.1290370e+01 3.44e-04 2.32e-04  -3.8 3.58e+00    -  1.00e+00 1.00e+00h  1
  14 -1.1315467e+01 1.66e+00 1.26e+03  -5.7 1.03e+02    -  9.15e-01 1.00e+00h  1
  15 -1.1317504e+01 9.09e-02 3.82e-02  -5.7 2.33e+01    -  1.00e+00 1.00e+00h  1
  16 -1.1317664e+01 1.84e-03 2.72e-03  -5.7 5.37e+00    -  1.00e+00 1.00e+00h  1
  17 -1.1317669e+01 2.04e-05 6.62e-05  -5.7 3.82e-01    -  1.00e+00 1.00e+00h  1
  18 -1.1317669e+01 1.10e-08 3.54e-08  -5.7 7.69e-03    -  1.00e+00 1.00e+00h  1
  19 -1.1318050e+01 8.16e-04 2.57e-01  -8.6 2.28e+00    -  9.99e-01 1.00e+00h  1
iter    objective    inf_pr   inf_du lg(mu)  ||d||  lg(rg) alpha_du alpha_pr  ls
  20 -1.1318051e+01 2.84e-07 9.25e-07  -8.6 4.52e-02    -  1.00e+00 1.00e+00h  1
  21 -1.1318051e+01 2.91e-11 9.57e-12  -8.6 1.27e-04    -  1.00e+00 1.00e+00h  1

Number of Iterations....: 21

                                   (scaled)                 (unscaled)
Objective...............:  -1.1318050921093612e+01   -1.1318050921093612e+01
Dual infeasibility......:   9.5672042699536117e-12    9.5672042699536117e-12
Constraint violation....:   3.0004887463519481e-12    2.9103830456733704e-11
Complementarity.........:   2.5059065524033082e-09    2.5059065524033082e-09
Overall NLP error.......:   2.5059065524033082e-09    2.5059065524033082e-09


Number of objective function evaluations             = 26
Number of objective gradient evaluations             = 22
Number of equality constraint evaluations            = 26
Number of inequality constraint evaluations          = 0
Number of equality constraint Jacobian evaluations   = 22
Number of inequality constraint Jacobian evaluations = 0
Number of Lagrangian Hessian evaluations             = 21
Total CPU secs in IPOPT (w/o function evaluations)   =      0.086
Total CPU secs in NLP function evaluations           =      0.003

EXIT: Optimal Solution Found.
Optimal experiment values: 
	Initial concentration: 5.00
	Temperature values: [481.88, 300.00, 300.00, 300.00, 300.00, 300.00, 300.00, 300.00, 300.00]
FIM at optimal design:
 [[  1731.79089055   1442.18099318  -3647.74691348  -7115.51970487]
 [  1442.18099318   3479.17598013  -2710.24342004 -16836.77882308]
 [ -3647.74691348  -2710.24342004   7809.41772632  13457.72531516]
 [ -7115.51970487 -16836.77882308  13457.72531516  82181.28341606]]
Objective value at optimal design: 11.32

***Optimal Experiment Design***
  Decision Var   Value
0        CA[0]    5.00
1         T[0]  481.88
2     T[0.125]  300.00
3      T[0.25]  300.00
4     T[0.375]  300.00
5       T[0.5]  300.00
6     T[0.625]  300.00
7      T[0.75]  300.00
8     T[0.875]  300.00
9         T[1]  300.00
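As a cross-check on the printed FIM (a NumPy sketch, not part of Pyomo.DoE): the reported objective of 11.32 is the D-optimality criterion, log10(det(FIM)), which matches the Ipopt objective of -1.1318e+01 up to sign since Ipopt minimizes the negated criterion. The matrix is also symmetric positive definite, as a Fisher information matrix should be at a well-posed design:

```python
import numpy as np

# FIM copied from the run above (finite-difference gradients).
FIM = np.array(
    [[  1731.79089055,   1442.18099318,  -3647.74691348,  -7115.51970487],
     [  1442.18099318,   3479.17598013,  -2710.24342004, -16836.77882308],
     [ -3647.74691348,  -2710.24342004,   7809.41772632,  13457.72531516],
     [ -7115.51970487, -16836.77882308,  13457.72531516,  82181.28341606]])

assert np.allclose(FIM, FIM.T)               # symmetric
assert np.all(np.linalg.eigvalsh(FIM) > 0)   # positive definite

# slogdet avoids overflow in det() for ill-scaled matrices.
sign, logdet = np.linalg.slogdet(FIM)
print(sign, logdet / np.log(10))  # log10(det) ≈ 11.318, matching Ipopt
```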

@adowling2

With symbolic derivatives:

This is Ipopt version 3.13.2, running with linear solver ma27.

Number of nonzeros in equality constraint Jacobian...:     4341
Number of nonzeros in inequality constraint Jacobian.:        0
Number of nonzeros in Lagrangian Hessian.............:      886

Total number of variables............................:     1013
                     variables with only lower bounds:       95
                variables with lower and upper bounds:       32
                     variables with only upper bounds:        0
Total number of equality constraints.................:     1003
Total number of inequality constraints...............:        0
        inequality constraints with only lower bounds:        0
   inequality constraints with lower and upper bounds:        0
        inequality constraints with only upper bounds:        0

iter    objective    inf_pr   inf_du lg(mu)  ||d||  lg(rg) alpha_du alpha_pr  ls
   0 -1.1297038e+01 2.65e+00 1.10e+00  -1.0 0.00e+00    -  0.00e+00 0.00e+00   0
   1 -1.1242568e+01 3.72e+00 6.29e-01  -1.0 1.83e+02    -  9.89e-01 9.90e-01h  1
   2 -1.1179453e+01 1.80e+02 1.59e+01  -1.0 3.96e+03    -  9.74e-01 9.90e-01f  1
   3 -1.0834854e+01 9.82e+02 1.40e+01  -1.0 6.32e+03    -  1.00e+00 1.00e+00h  1
   4 -1.0698726e+01 3.29e+02 2.57e+00  -1.0 6.92e+03    -  1.00e+00 1.00e+00h  1
   5 -1.0688962e+01 4.53e+00 8.60e-01  -1.0 1.77e+02    -  1.00e+00 1.00e+00h  1
   6 -1.0874882e+01 1.41e+02 6.54e+00  -1.7 4.91e+02    -  1.00e+00 1.00e+00h  1
   7 -1.1055990e+01 1.88e+02 2.75e+00  -1.7 2.25e+03    -  1.00e+00 1.00e+00h  1
   8 -1.1047059e+01 1.87e+01 2.17e-01  -1.7 1.43e+03    -  1.00e+00 1.00e+00h  1
   9 -1.1046237e+01 4.14e-02 3.55e-03  -1.7 2.89e+01    -  1.00e+00 1.00e+00h  1
iter    objective    inf_pr   inf_du lg(mu)  ||d||  lg(rg) alpha_du alpha_pr  ls
  10 -1.1213820e+01 8.69e+01 5.06e+05  -3.8 6.72e+02    -  7.45e-01 1.00e+00f  1
  11 -1.1290645e+01 1.29e+01 6.48e+04  -3.8 2.89e+02    -  8.72e-01 1.00e+00h  1
  12 -1.1312328e+01 4.70e+00 1.84e-01  -3.8 1.55e+02    -  1.00e+00 1.00e+00h  1
  13 -1.1314140e+01 1.40e-01 2.21e-02  -3.8 2.84e+01    -  1.00e+00 1.00e+00h  1
  14 -1.1314255e+01 1.58e-03 1.10e-03  -3.8 2.71e+00    -  1.00e+00 1.00e+00h  1
  15 -1.1317925e+01 6.21e-02 1.54e+02  -5.7 2.00e+01    -  9.90e-01 1.00e+00h  1
  16 -1.1318010e+01 4.24e-04 6.08e-04  -5.7 1.54e+00    -  1.00e+00 1.00e+00h  1
  17 -1.1318012e+01 2.45e-06 5.87e-06  -5.7 1.17e-01    -  1.00e+00 1.00e+00h  1
  18 -1.1318060e+01 1.33e-05 3.48e-06  -8.6 2.91e-01    -  1.00e+00 1.00e+00h  1
  19 -1.1318060e+01 1.08e-10 2.61e-10  -8.6 8.44e-04    -  1.00e+00 1.00e+00h  1

Number of Iterations....: 19

                                   (scaled)                 (unscaled)
Objective...............:  -1.1318059589328048e+01   -1.1318059589328048e+01
Dual infeasibility......:   2.6113792358236406e-10    2.6113792358236406e-10
Constraint violation....:   1.0797918115201810e-10    1.0797918115201810e-10
Complementarity.........:   2.5068103979073895e-09    2.5068103979073895e-09
Overall NLP error.......:   2.5068103979073895e-09    2.5068103979073895e-09


Number of objective function evaluations             = 20
Number of objective gradient evaluations             = 20
Number of equality constraint evaluations            = 20
Number of inequality constraint evaluations          = 0
Number of equality constraint Jacobian evaluations   = 20
Number of inequality constraint Jacobian evaluations = 0
Number of Lagrangian Hessian evaluations             = 19
Total CPU secs in IPOPT (w/o function evaluations)   =      0.035
Total CPU secs in NLP function evaluations           =      0.003

EXIT: Optimal Solution Found.
INFO: Successfully optimized experiment. Solve time: 0.1 seconds
INFO: Total time for build, initialization, and solve: 0.4 seconds
Optimal experiment values: 
	Initial concentration: 5.00
	Temperature values: [481.88, 300.00, 300.00, 300.00, 300.00, 300.00, 300.00, 300.00, 300.00]
FIM at optimal design:
 [[  1731.78924161   1442.17360943  -3647.75456986  -7115.52005827]
 [  1442.17360943   3479.17107758  -2710.23709531 -16836.83659169]
 [ -3647.75456986  -2710.23709531   7809.45694289  13457.76589358]
 [ -7115.52005827 -16836.83659169  13457.76589358  82181.9737957 ]]
Objective value at optimal design: 11.32

***Optimal Experiment Design***
  Decision Var   Value
0        CA[0]    5.00
1         T[0]  481.88
2     T[0.125]  300.00
3      T[0.25]  300.00
4     T[0.375]  300.00
5       T[0.5]  300.00
6     T[0.625]  300.00
7      T[0.75]  300.00
8     T[0.875]  300.00
9         T[1]  300.00
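The two runs land on the same design, and the FIMs agree element-wise to roughly five significant figures, so the gap between the finite-difference and symbolic results is on the order of the finite-difference truncation error rather than a modeling difference. A quick NumPy comparison of the two printed matrices (a sketch, not Pyomo.DoE code):

```python
import numpy as np

# FIM from the finite-difference run.
FIM_fd = np.array(
    [[  1731.79089055,   1442.18099318,  -3647.74691348,  -7115.51970487],
     [  1442.18099318,   3479.17598013,  -2710.24342004, -16836.77882308],
     [ -3647.74691348,  -2710.24342004,   7809.41772632,  13457.72531516],
     [ -7115.51970487, -16836.77882308,  13457.72531516,  82181.28341606]])

# FIM from the symbolic-derivative run.
FIM_sym = np.array(
    [[  1731.78924161,   1442.17360943,  -3647.75456986,  -7115.52005827],
     [  1442.17360943,   3479.17107758,  -2710.23709531, -16836.83659169],
     [ -3647.75456986,  -2710.23709531,   7809.45694289,  13457.76589358],
     [ -7115.52005827, -16836.83659169,  13457.76589358,  82181.9737957 ]])

# Largest element-wise relative discrepancy between the two methods.
rel_err = np.max(np.abs(FIM_sym - FIM_fd) / np.abs(FIM_fd))
print(rel_err)  # on the order of 1e-5
```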

@adowling2

@djlaky @blnicho @michaelbynum @jsiirola I am excited to share we have a working draft implementation of Pyomo.DoE using symbolic differentiation to build the sensitivity matrix instead of finite differences. I tested this on the reactor example: the symbolic model is smaller (1,013 variables and 1,003 equality constraints versus 1,608 and 1,598) and Ipopt solves it roughly twice as fast (0.035 s versus 0.086 s, excluding function evaluations), converging to the same optimal design and objective value (11.32).

Now that there is a draft implementation, this is ready for a software design review. I appreciate your comments in advance. Also, is there a good way to move this PR from my fork to the main Pyomo repo?
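To illustrate the trade-off between the two GradientMethod options on a toy model (this is a self-contained sketch, not the Pyomo.DoE code; the decay model and step size are invented for illustration): a forward finite difference must re-evaluate the model at a perturbed parameter value for every parameter and carries O(h) truncation error, while a symbolic/analytic derivative is exact and needs no perturbed model copies, which is why the symbolic formulation above is smaller.

```python
import numpy as np

# Toy model resembling the reactor example's concentration profile:
# y(theta, t) = exp(-theta * t), first-order decay with rate theta.
def y(theta, t):
    return np.exp(-theta * t)

theta = 0.5
t = np.linspace(0.0, 1.0, 9)  # nine time points, like the T[...] grid

# Analytic sensitivity (what symbolic differentiation produces):
# dy/dtheta = -t * exp(-theta * t)
jac_exact = -t * y(theta, t)

# Forward finite difference (what the fd_method option approximates),
# requiring one extra model evaluation at theta + h:
h = 1e-6
jac_fd = (y(theta + h, t) - y(theta, t)) / h

print(np.max(np.abs(jac_fd - jac_exact)))  # O(h) truncation error
```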


blnicho commented Jun 13, 2025

@adowling2 thanks for the update, this is really exciting! You may be able to edit the target branch for this PR to point to the main Pyomo repo (I can't remember if GitHub allows you to edit the target to a different fork). If that doesn't work then you'll have to open a new PR into Pyomo.
