#30 implements an adapter to use Ax for Bayesian Optimization in CADET-Process. The aim of that PR is an MVP, allowing us to explore further options.
This issue collects further ideas to improve BO in CADET-Process.
- Enable specification of Hyperparameters of the optimizer (moved to Enable specification of Hyperparameters for Ax #96)
- Implement linear equality constraints (see also Add method to transform linear equality constraints to variable dependencies #97)
- Implement early stopping and batch trials for Ax (moved to Implement early stopping and batch trials for Ax #98)
- Thompson sampling for trivial parallelization. Maybe parallelize the minimization of the acquisition function, if this is possible in Ax; requires Implement early stopping and batch trials for Ax #98
- Add Noise / Errors:
- Standard Error of Measurement (SEM): ideally, the SEM can also be specified in the Objectives and Nonlinear Constraints.
- Explore whether adding artificial noise to the result in `CADETProcessRunner.run` helps the fitting. I think this is common in BO.
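A minimal sketch of the SEM idea above, assuming replicate evaluations of a noisy objective. The function names (`noisy_objective`, `evaluate_with_sem`) and the quadratic test objective are hypothetical stand-ins, not CADET-Process API; the point is only that reporting `(mean, SEM)` tuples instead of bare floats lets a BO backend like Ax model observation noise explicitly:

```python
import math
import random
import statistics

def noisy_objective(x: float, noise_sd: float = 0.1) -> float:
    # Hypothetical deterministic objective with additive Gaussian noise,
    # mimicking e.g. measurement error in a simulated chromatogram metric.
    return (x - 2.0) ** 2 + random.gauss(0.0, noise_sd)

def evaluate_with_sem(x: float, n_replicates: int = 5) -> tuple[float, float]:
    # Evaluate the objective several times and return (mean, SEM).
    # Ax-style optimizers can consume such tuples so the surrogate model
    # treats the SEM as known observation noise.
    samples = [noisy_objective(x) for _ in range(n_replicates)]
    mean = statistics.fmean(samples)
    sem = statistics.stdev(samples) / math.sqrt(n_replicates)
    return mean, sem

random.seed(0)
mean, sem = evaluate_with_sem(1.5)
print(f"mean={mean:.3f}, sem={sem:.3f}")
```

This keeps the noise handling entirely on the evaluation side, so the runner only needs to change what it reports, not how trials are generated.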
Other ideas (low priority)
- Explore "fidelity" / "multi-fidelity" concepts in Ax
- Staging can formalize constraint checking, etc. before trials are actually run (similar to `RepairIndividual` in pymoo). For new trials, there are different staging steps (e.g. to check rejection, feasibility, ...). How can we leverage this?
```python
class CADETProcessRunner(Runner):
    ...
    def staging_required(self) -> bool:
        # return True  # if staging should be a required step
        return False
    ...
```
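A self-contained sketch of how such a staging hook could gate infeasible candidates before they are run. The `Runner` base class here is a hypothetical minimal stand-in (not the actual Ax `Runner`), and `StagingRunner`, its bounds, and its toy objective are illustrative only:

```python
from abc import ABC, abstractmethod

class Runner(ABC):
    # Hypothetical minimal stand-in for an Ax-style Runner base class.
    @property
    def staging_required(self) -> bool:
        return False

    @abstractmethod
    def run(self, trial: dict) -> dict: ...

class StagingRunner(Runner):
    # Sketch: reject infeasible candidate points before running them,
    # similar in spirit to RepairIndividual in pymoo.
    def __init__(self, lower: float = 0.0, upper: float = 1.0):
        self.lower, self.upper = lower, upper

    @property
    def staging_required(self) -> bool:
        return True  # force a staging step for every new trial

    def stage(self, candidate: dict) -> bool:
        # Feasibility check: all parameters within box bounds.
        return all(self.lower <= v <= self.upper for v in candidate.values())

    def run(self, trial: dict) -> dict:
        # Toy objective standing in for a CADET-Process evaluation.
        return {"objective": sum(v ** 2 for v in trial.values())}

runner = StagingRunner()
candidate = {"x1": 0.3, "x2": 1.4}
if runner.staging_required and not runner.stage(candidate):
    print("candidate rejected before evaluation")
```

Keeping the feasibility check inside the runner means the optimizer never spends a (possibly expensive) simulation on a candidate that would be rejected anyway.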