Commit 57aa7af

Merge branch 'Pyomo:main' into add-greybox
2 parents b5a6945 + a9ce9b4

File tree

32 files changed: +1641 −286 lines

.coin-or/projDesc.xml

Lines changed: 2 additions & 2 deletions
@@ -226,8 +226,8 @@ Carl D. Laird, Chair, Pyomo Management Committee, claird at andrew dot cmu dot e
     Use explicit overrides to disable use of automated
     version reporting.
   -->
-  <stableVersionNumber>6.9.3</stableVersionNumber>
-  <releaseNumber>6.9.3</releaseNumber>
+  <stableVersionNumber>6.9.4</stableVersionNumber>
+  <releaseNumber>6.9.4</releaseNumber>

 </developmentStatus>

.github/workflows/test_branches.yml

Lines changed: 1 addition & 1 deletion
@@ -23,7 +23,7 @@ env:
   PYTHON_CORE_PKGS: wheel
   PYPI_ONLY: z3-solver linear-tree
   PYPY_EXCLUDE: scipy numdifftools seaborn statsmodels linear-tree
-  CACHE_VER: v250820.0
+  CACHE_VER: v250827.0
   NEOS_EMAIL: [email protected]
   SRC_REF: ${{ github.head_ref || github.ref }}
   PYOMO_WORKFLOW: branch

.github/workflows/test_pr_and_main.yml

Lines changed: 1 addition & 1 deletion
@@ -31,7 +31,7 @@ env:
   PYTHON_CORE_PKGS: wheel
   PYPI_ONLY: z3-solver linear-tree
   PYPY_EXCLUDE: scipy numdifftools seaborn statsmodels linear-tree
-  CACHE_VER: v250820.0
+  CACHE_VER: v250827.0
   NEOS_EMAIL: [email protected]
   SRC_REF: ${{ github.head_ref || github.ref }}
   PYOMO_WORKFLOW: |

CHANGELOG.md

Lines changed: 29 additions & 0 deletions
@@ -1,6 +1,35 @@
 Pyomo CHANGELOG
 ===============

+-------------------------------------------------------------------------------
+Pyomo 6.9.4 (27 Aug 2025)
+-------------------------------------------------------------------------------
+
+- General
+  - Cleanup `collections` module (#3708)
+  - Work around environments where `IntEnum.__doc__` is None (#3711)
+  - Standardize 'multiprocessing' deferred import (#3702)
+  - Resolve `capture_output(capture_fd=True)` deadlock on Windows (#3679)
+- Core
+  - Bugfix: standard form with fixed variables (#3704)
+- DAE
+  - Fix simulator bug (#3692)
+- Solver Interfaces
+  - Add reporting of SCIP node count to solver results (#3691)
+  - Create API version attribute for different solver generations (#3699)
+  - Ipopt_v2: update options processing (#3693)
+  - Expanded LegacySolverWrapper fixes (#3700)
+- Testing
+  - Update GAMS download to new "latest" link (#3706)
+  - New TPL cache version on GHA workflows (#3705)
+- Contributed Packages
+  - benders: Added support for HiGHS solver (#3686)
+  - DoE: Add grey box objectives (#3606)
+  - Parmest: Correct two very old test skips (#3697)
+  - Parmest: Extend capability for weighted SSE objective (#3535)
+  - PyROS: Modify two solver tests (#3694)
+  - sensitivity_toolbox: Sensitivity calculation for named expressions (#3685)
+
 -------------------------------------------------------------------------------
 Pyomo 6.9.3 (6 Aug 2025)
 -------------------------------------------------------------------------------

RELEASE.md

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-We are pleased to announce the release of Pyomo 6.9.3.
+We are pleased to announce the release of Pyomo 6.9.4.

 Pyomo is a collection of Python software packages that supports a
 diverse set of optimization capabilities for formulating and analyzing

doc/OnlineDocs/_templates/recursive-module.rst

Lines changed: 1 addition & 1 deletion
@@ -94,7 +94,7 @@ Library Reference
       :toctree:
       :template: recursive-module.rst
       :recursive:
-   {% for item in modules %}
+   {% for item in all_modules %}
    {# Need item != tests for Sphinx >= 8.0; !endswith(.tests) for < 8.0 #}
    {% if item != 'tests' and not item.endswith('.tests')
        and item != 'examples' and not item.endswith('.examples') %}
doc/OnlineDocs/explanation/analysis/parmest/covariance.rst

Lines changed: 117 additions & 12 deletions
@@ -1,16 +1,121 @@
 Covariance Matrix Estimation
-=================================
+============================

-If the optional argument ``calc_cov=True`` is specified for :class:`~pyomo.contrib.parmest.parmest.Estimator.theta_est`,
-parmest will calculate the covariance matrix :math:`V_{\theta}` as follows:
+The uncertainty in the estimated parameters is quantified using the covariance matrix.
+The diagonal of the covariance matrix contains the variance of the estimated parameters.
+Assuming Gaussian independent and identically distributed measurement errors, the
+covariance matrix of the estimated parameters can be computed using the following
+methods, which have been implemented in parmest.

-.. math::
-   V_{\theta} = 2 \sigma^2 H^{-1}
+1. Reduced Hessian Method

-This formula assumes all measurement errors are independent and identically distributed with
-variance :math:`\sigma^2`. :math:`H^{-1}` is the inverse of the Hessian matrix for an unweighted
-sum of least squares problem. Currently, the covariance approximation is only valid if the
-objective given to parmest is the sum of squared error. Moreover, parmest approximates the
-variance of the measurement errors as :math:`\sigma^2 = \frac{1}{n-l} \sum e_i^2` where :math:`n` is
-the number of data points, :math:`l` is the number of fitted parameters, and :math:`e_i` is the
-residual for experiment :math:`i`.
+When the objective function is the sum of squared errors (SSE):
+:math:`\text{SSE} = \sum_{i = 1}^n \left(y_{i} - \hat{y}_{i}\right)^2`,
+the covariance matrix is:
+
+.. math::
+   V_{\boldsymbol{\theta}} = 2 \sigma^2 \left(\frac{\partial^2 \text{SSE}}
+   {\partial \boldsymbol{\theta} \partial \boldsymbol{\theta}}\right)^{-1}_{\boldsymbol{\theta}
+   = \boldsymbol{\theta}^*}
+
+When the objective function is the weighted SSE (WSSE):
+:math:`\text{WSSE} = \frac{1}{2} \left(\mathbf{y} - f(\mathbf{x};\boldsymbol{\theta})\right)^\text{T}
+\mathbf{W} \left(\mathbf{y} - f(\mathbf{x};\boldsymbol{\theta})\right)`,
+the covariance matrix is:
+
+.. math::
+   V_{\boldsymbol{\theta}} = \left(\frac{\partial^2 \text{WSSE}}
+   {\partial \boldsymbol{\theta} \partial \boldsymbol{\theta}}\right)^{-1}_{\boldsymbol{\theta}
+   = \boldsymbol{\theta}^*}
+
+where :math:`V_{\boldsymbol{\theta}}` is the covariance matrix of the estimated
+parameters, :math:`y` are the observed measured variables, :math:`\hat{y}` are the
+predicted measured variables, :math:`n` is the number of data points,
+:math:`\boldsymbol{\theta}` are the unknown parameters, :math:`\boldsymbol{\theta^*}`
+are the estimates of the unknown parameters, :math:`\mathbf{x}` are the decision
+variables, and :math:`\mathbf{W}` is a diagonal matrix containing the inverse of the
+variance of the measurement error, :math:`\sigma^2`. When the standard
+deviation of the measurement error is not supplied by the user, parmest
+approximates the variance of the measurement error as
+:math:`\sigma^2 = \frac{1}{n-l} \sum e_i^2`, where :math:`l` is the number of
+fitted parameters and :math:`e_i` is the residual for experiment :math:`i`.
+
+In parmest, this method computes the inverse of the Hessian by scaling the
+objective function (SSE or WSSE) with a constant probability factor.
+
+2. Finite Difference Method
+
+In this method, the covariance matrix, :math:`V_{\boldsymbol{\theta}}`, is
+calculated by applying the Gauss-Newton approximation to the Hessian,
+:math:`\frac{\partial^2 \text{SSE}}{\partial \boldsymbol{\theta} \partial \boldsymbol{\theta}}`
+or
+:math:`\frac{\partial^2 \text{WSSE}}{\partial \boldsymbol{\theta} \partial \boldsymbol{\theta}}`,
+leading to:
+
+.. math::
+   V_{\boldsymbol{\theta}} = \left(\sum_{i = 1}^n \mathbf{G}_{i}^{\mathrm{T}} \mathbf{W}
+   \mathbf{G}_{i} \right)^{-1}
+
+This method uses central finite differences to compute the Jacobian matrix,
+:math:`\mathbf{G}_{i}`, for experiment :math:`i`, which is the sensitivity of
+the measured variables with respect to the parameters, :math:`\boldsymbol{\theta}`.
+:math:`\mathbf{W}` is a diagonal matrix containing the inverse of the variance
+of the measurement errors, :math:`\sigma^2`.
+
+3. Automatic Differentiation Method
+
+Similar to the finite difference method, the covariance matrix is calculated as:
+
+.. math::
+   V_{\boldsymbol{\theta}} = \left( \sum_{i = 1}^n \mathbf{G}_{\text{kaug},\, i}^{\mathrm{T}}
+   \mathbf{W} \mathbf{G}_{\text{kaug},\, i} \right)^{-1}
+
+However, this method uses the model optimality (KKT) condition to compute the
+Jacobian matrix, :math:`\mathbf{G}_{\text{kaug},\, i}`, for experiment :math:`i`.
+
+The covariance matrix calculation is only supported with the built-in objective
+functions "SSE" or "SSE_weighted".
+
+In parmest, the covariance matrix can be calculated after defining the
+:class:`~pyomo.contrib.parmest.parmest.Estimator` object and estimating the unknown
+parameters using :class:`~pyomo.contrib.parmest.parmest.Estimator.theta_est`. To
+estimate the covariance matrix, call the
+:class:`~pyomo.contrib.parmest.parmest.Estimator.cov_est` function (the default
+method is "finite_difference"), e.g.,
+
+.. testsetup:: *
+   :skipif: not __import__('pyomo.contrib.parmest.parmest').contrib.parmest.parmest.parmest_available
+
+   # Data
+   import pandas as pd
+   data = pd.DataFrame(
+       data=[[1, 8.3], [2, 10.3], [3, 19.0],
+             [4, 16.0], [5, 15.6], [7, 19.8]],
+       columns=['hour', 'y'],
+   )
+
+   # Create the Experiment class
+   from pyomo.contrib.parmest.examples.rooney_biegler.rooney_biegler import RooneyBieglerExperiment
+
+   exp_list = []
+   for i in range(data.shape[0]):
+       exp_list.append(RooneyBieglerExperiment(data.loc[i, :]))
+
+.. doctest::
+   :skipif: not __import__('pyomo.contrib.parmest.parmest').contrib.parmest.parmest.parmest_available
+
+   >>> import pyomo.contrib.parmest.parmest as parmest
+   >>> pest = parmest.Estimator(exp_list, obj_function="SSE")
+   >>> obj_val, theta_val = pest.theta_est()
+   >>> cov = pest.cov_est()
+
+Optionally, one of three methods, "reduced_hessian", "finite_difference",
+or "automatic_differentiation_kaug", can be supplied for the covariance
+calculation, e.g.,
+
+.. doctest::
+   :skipif: not __import__('pyomo.contrib.parmest.parmest').contrib.parmest.parmest.parmest_available
+
+   >>> pest = parmest.Estimator(exp_list, obj_function="SSE")
+   >>> obj_val, theta_val = pest.theta_est()
+   >>> cov_method = "reduced_hessian"
+   >>> cov = pest.cov_est(method=cov_method)
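
As an aside on the finite difference method added above: the Gauss-Newton covariance, V_theta = (sum_i G_i^T W G_i)^{-1}, can be reproduced outside of parmest in a few lines of NumPy. This is an illustrative sketch, not parmest's implementation; the model function `predict`, the step size, and the measurement-error standard deviation `sigma` are assumptions for demonstration.

    # Sketch: Gauss-Newton covariance via a central-finite-difference Jacobian.
    import numpy as np

    def predict(theta, x):
        # Hypothetical scalar model, e.g. y = theta_0 * (1 - exp(-theta_1 * x))
        return theta[0] * (1.0 - np.exp(-theta[1] * x))

    def covariance_fd(theta_star, x_data, sigma, step=1e-6):
        p = len(theta_star)
        H = np.zeros((p, p))    # accumulates sum_i G_i^T W G_i
        w = 1.0 / sigma**2      # diagonal W: inverse measurement-error variance
        for x in x_data:
            G = np.zeros(p)     # dy/dtheta_j by central finite differences
            for j in range(p):
                d = np.zeros(p)
                d[j] = step
                G[j] = (predict(theta_star + d, x)
                        - predict(theta_star - d, x)) / (2 * step)
            H += w * np.outer(G, G)
        return np.linalg.inv(H)  # V_theta

    theta_star = np.array([19.14, 0.53])         # approximate Rooney-Biegler fit
    x_data = np.array([1., 2., 3., 4., 5., 7.])  # the 'hour' data above
    print(covariance_fd(theta_star, x_data, sigma=2.0))  # sigma is assumed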

doc/OnlineDocs/explanation/analysis/parmest/datarec.rst

Lines changed: 2 additions & 2 deletions
@@ -44,8 +44,8 @@ The following example returns model values from a Pyomo Expression.

    >>> # Define objective
    >>> def SSE(model):
-   ...     expr = (model.experiment_outputs[model.y]
-   ...             - model.response_function[model.experiment_outputs[model.hour]]
+   ...     expr = (model.experiment_outputs[model.y[model.hour]]
+   ...             - model.y[model.hour]
    ...     ) ** 2
    ...     return expr

doc/OnlineDocs/explanation/analysis/parmest/driver.rst

Lines changed: 28 additions & 23 deletions
@@ -16,19 +16,21 @@ the model and the observations (typically defined as the sum of squared
 deviation between model values and observed values).

 If the Pyomo model is not formatted as a two-stage stochastic
-programming problem in this format, the user can supply a custom
-function to use as the second stage cost and the Pyomo model will be
+programming problem in this format, the user can choose one of the
+built-in "SSE" or "SSE_weighted" objective functions, or supply a custom
+objective function to use as the second stage cost. The Pyomo model will then be
 modified within parmest to match the required specifications.
-The stochastic programming callback function is also defined within parmest. The callback
-function returns a populated and initialized model for each scenario.
+The stochastic programming callback function is also defined within parmest.
+The callback function returns a populated and initialized model for each scenario.

-To use parmest, the user creates a :class:`~pyomo.contrib.parmest.parmest.Estimator` object
-which includes the following methods:
+To use parmest, the user creates a :class:`~pyomo.contrib.parmest.parmest.Estimator`
+object which includes the following methods:

 .. autosummary::
    :nosignatures:

    ~pyomo.contrib.parmest.parmest.Estimator.theta_est
+   ~pyomo.contrib.parmest.parmest.Estimator.cov_est
    ~pyomo.contrib.parmest.parmest.Estimator.theta_est_bootstrap
    ~pyomo.contrib.parmest.parmest.Estimator.theta_est_leaveNout
    ~pyomo.contrib.parmest.parmest.Estimator.objective_at_theta

@@ -65,16 +67,9 @@ Section.
        columns=['hour', 'y'],
    )

-   # Sum of squared error function
-   def SSE(model):
-       expr = (
-           model.experiment_outputs[model.y]
-           - model.response_function[model.experiment_outputs[model.hour]]
-       ) ** 2
-       return expr
-
    # Create an experiment list
    from pyomo.contrib.parmest.examples.rooney_biegler.rooney_biegler import RooneyBieglerExperiment
+
    exp_list = []
    for i in range(data.shape[0]):
        exp_list.append(RooneyBieglerExperiment(data.loc[i, :]))

@@ -83,15 +78,15 @@ Section.
    :skipif: not __import__('pyomo.contrib.parmest.parmest').contrib.parmest.parmest.parmest_available

    >>> import pyomo.contrib.parmest.parmest as parmest
-   >>> pest = parmest.Estimator(exp_list, obj_function=SSE)
+   >>> pest = parmest.Estimator(exp_list, obj_function="SSE")

 Optionally, solver options can be supplied, e.g.,

 .. doctest::
    :skipif: not __import__('pyomo.contrib.parmest.parmest').contrib.parmest.parmest.parmest_available

    >>> solver_options = {"max_iter": 6000}
-   >>> pest = parmest.Estimator(exp_list, obj_function=SSE, solver_options=solver_options)
+   >>> pest = parmest.Estimator(exp_list, obj_function="SSE", solver_options=solver_options)


 List of experiment objects

@@ -137,17 +132,20 @@ expressions that are used to build an objective for the two-stage
 stochastic programming problem.

 If the Pyomo model is not written as a two-stage stochastic programming problem in
-this format, and/or if the user wants to use an objective that is
-different than the original model, a custom objective function can be
-defined for parameter estimation. The objective function has a single argument,
-which is the model from a single experiment.
+this format, the user can select the "SSE" or "SSE_weighted" built-in objective
+functions. If the user wants to use an objective that is different from the built-in
+options, a custom objective function can be defined for parameter estimation. However,
+covariance matrix estimation does not support custom objective functions. The objective
+function (built-in or custom) has a single argument, which is the model from a single
+experiment.
 The objective function returns a Pyomo
 expression which is used to define "SecondStageCost". The objective
 function can be used to customize data points and weights that are used
 in parameter estimation.

-Parmest includes one built in objective function to compute the sum of squared errors ("SSE") between the
-``m.experiment_outputs`` model values and data values.
+Parmest includes two built-in objective functions ("SSE" and "SSE_weighted") to compute
+the sum of squared errors between the ``m.experiment_outputs`` model values and
+data values.

 Suggested initialization procedure for parameter estimation problems
 --------------------------------------------------------------------

@@ -162,4 +160,11 @@ estimation solve from the square problem solution, set optional argument ``solve
 argument ``(initialize_parmest_model=True)``. Different initial guess values for the fitted
 parameters can be provided using optional argument ``theta_values`` (**Pandas Dataframe**)

-3. Solve parameter estimation problem by calling :class:`~pyomo.contrib.parmest.parmest.Estimator.theta_est`
+3. Solve the parameter estimation problem by calling
+   :class:`~pyomo.contrib.parmest.parmest.Estimator.theta_est`, e.g.,
+
+   .. doctest::
+      :skipif: not parmest_available
+
+      >>> pest = parmest.Estimator(exp_list, obj_function="SSE")
+      >>> obj_val, theta_val = pest.theta_est()
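
Since the diff above notes that a custom objective takes a single experiment model and returns a Pyomo expression, a minimal sketch may help. The structure mirrors the SSE function removed in this commit; the weight value is an illustrative assumption, not a parmest default.

    # Hedged sketch of a custom parmest objective (one experiment model in,
    # Pyomo expression out); parmest uses the result as "SecondStageCost".
    def custom_SSE(model):
        w = 0.5  # illustrative weight, not a parmest default
        expr = w * (
            model.experiment_outputs[model.y]
            - model.response_function[model.experiment_outputs[model.hour]]
        ) ** 2
        return expr

    # Passed in place of a built-in name (covariance estimation then unavailable):
    # pest = parmest.Estimator(exp_list, obj_function=custom_SSE)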

doc/OnlineDocs/explanation/analysis/parmest/overview.rst

Lines changed: 17 additions & 8 deletions
@@ -41,23 +41,32 @@ that for most experiments, only small parts of :math:`x` will change
 from one experiment to the next.

 The following least squares objective can be used to estimate parameter
-values, where data points are indexed by :math:`s=1,\ldots,S`
+values assuming Gaussian independent and identically distributed measurement
+errors, where data points are indexed by :math:`s=1,\ldots,S`

 .. math::

    \min_{{\theta}} Q({\theta};{\tilde{x}}, {\tilde{y}}) \equiv \sum_{s=1}^{S}q_{s}({\theta};{\tilde{x}}_{s}, {\tilde{y}}_{s}) \;\;

-where
+where :math:`q_{s}({\theta};{\tilde{x}}_{s}, {\tilde{y}}_{s})` can be:

-.. math::
+1. Sum of squared errors
+
+   .. math::
+
+      q_{s}({\theta};{\tilde{x}}_{s}, {\tilde{y}}_{s}) =
+      \sum_{i=1}^{m}\left({\tilde{y}}_{s,i} - g_{i}({\tilde{x}}_{s};{\theta})\right)^{2}
+
+2. Weighted sum of squared errors
+
+   .. math::

-   q_{s}({\theta};{\tilde{x}}_{s}, {\tilde{y}}_{s}) = \sum_{i=1}^{m}w_{i}\left[{\tilde{y}}_{si} - g_{i}({\tilde{x}}_{s};{\theta})\right]^{2},
+      q_{s}({\theta};{\tilde{x}}_{s}, {\tilde{y}}_{s}) =
+      \sum_{i=1}^{m}\left(\frac{{\tilde{y}}_{s,i} - g_{i}({\tilde{x}}_{s};{\theta})}{w_i}\right)^{2}

 i.e., the contribution of sample :math:`s` to :math:`Q`, where :math:`w
-\in \Re^{m}` is a vector of weights for the responses. For
-multi-dimensional :math:`y`, this is the squared weighted :math:`L_{2}`
-norm and for univariate :math:`y` the weighted squared deviation.
-Custom objectives can also be defined for parameter estimation.
+\in \Re^{m}` is a vector containing the standard deviations of the measurement
+errors of :math:`y`. Custom objectives can also be defined for parameter estimation.

 In the applications of interest to us, the function :math:`g(\cdot)` is
 usually defined as an optimization problem with a large number of
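
To make the two objectives in the diff above concrete, here is a worked example with illustrative numbers only (hypothetical residuals and standard deviations, not taken from any Pyomo example):

    # q_s for one sample under each objective; r_i = y~_{s,i} - g_i(x~_s; theta)
    import numpy as np

    r = np.array([0.8, -1.2, 0.5])  # hypothetical residuals
    w = np.array([2.0, 2.0, 1.0])   # hypothetical measurement-error std devs

    q_sse = np.sum(r**2)            # sum of squared errors
    q_wsse = np.sum((r / w)**2)     # weighted: each residual scaled by its std
    print(q_sse, q_wsse)            # ≈ 2.33 and 0.77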
