Documentation

Welcome to SigOpt’s developer documentation. If you have a question you can’t answer, feel free to contact us!

Parameter Transformation

Choosing the appropriate scale when tuning hyperparameters can have a significant impact on the efficiency of the optimization process. SigOpt now allows users to specify a transformation on a parameter to indicate the scale on which it should be searched.

Logarithmic Transformation

In many machine learning applications, the learning rate hyperparameter is searched in log space. In other words, it is more efficient to sample values across the different orders of magnitude spanned by the bounds. SigOpt assumes base 10 when applying the logarithmic transformation. Below is a code snippet showing how to indicate the log transformation for the learning rate parameter inside a SigOpt experiment create call.

dict(
  name="learning_rate",
  bounds=dict(min=1e-4, max=1),
  type="double",
  transformation="log",
)

This is functionally equivalent to setting the parameter bounds in the log space, i.e.,

dict(
  name="log10_learning_rate",
  bounds=dict(min=-4, max=0),
  type="double",
)

And then manually exponentiating the assignments, e.g., learning_rate = 10 ** suggestion.assignments['log10_learning_rate'].
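The equivalence can be sketched in plain Python. The helper below is illustrative only (it is not part of the SigOpt API): sampling a value log-uniformly over [1e-4, 1] is the same as sampling uniformly over [-4, 0] in log10 space and then exponentiating.

```python
import math
import random

def sample_log_uniform(low, high, rng=random):
    """Sample log-uniformly (base 10) between low and high.

    Mimics the effect of transformation="log": the value is drawn
    uniformly in log10 space and then exponentiated back.
    """
    log10_value = rng.uniform(math.log10(low), math.log10(high))
    return 10 ** log10_value

# A sample always lands back inside the original bounds.
learning_rate = sample_log_uniform(1e-4, 1)
assert 1e-4 <= learning_rate <= 1
```

Note that the endpoints map exactly: log10(1e-4) = -4 and log10(1) = 0, so the two parameter definitions above cover the same range of learning rates.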

Changes in Web Visualization

The Analysis page now indicates when a parameter has a log transformation, and the Experiment History plots display such parameters on a log scale.

Limitations

  • Parameter transformation can only be defined for double parameters.
  • Parameter transformation does not support prior beliefs.
  • Parameter transformation cannot be updated.
  • Parameter transformation cannot be defined for parameters that are part of a Constraint Term.