Hi all, I saw by chance that when you use constraints on kernel parameters in BoTorch, you always set `transform=None`. Here is an example:

Edit 1: It seems that it is not like this everywhere; in the mixed GP, the categorical kernel is different, as the default transform is used there. Is this on purpose?

Best, Johannes
Hmm, good catch. I am not sure, but I don't think this is on purpose. @saitcakmak, @dme65, do you recall if there was a reason to use the transform, or was this an oversight?
Hi @jduerholt, the main reason for setting `transform=None` is that we typically use L-BFGS-B to optimize the MLL of the model, and we can include those constraints explicitly in the optimization rather than relying on the default transformations to the real line used in gpytorch. We have found that to work better in most cases. The parsing logic for this lives here: https://github.com/pytorch/botorch/blob/main/botorch/optim/utils/model_utils.py#L99-L101

One downside is that if you simply used an unconstrained optimizer (such as Adam) to optimize a BoTorch model's hyperparameters, you would have to make sure to handle the constraint manually. There is some related discussion here: #2542