Constraint Metric (Alpha)
Constraint Metric combines the ideas of Metric Strategy and Metric Thresholds to give you more control over where SigOpt searches for desirable outcomes. For example, some stored metrics might be guardrail metrics, where you want their values to satisfy some baseline. In other cases, a Multimetric experiment can be rephrased as a constrained single-metric experiment, where you want to optimize a primary metric subject to a secondary metric exceeding a certain threshold.
Example: Optimizing Accuracy with Constraint on Inference Time
We modify the neural network example from the Metric Thresholds documentation. Instead of being interested in the tradeoff between accuracy and inference time, we only want to optimize for the accuracy of the network. We now set the inference time as a Constraint Metric, with a threshold of 100ms. The Constraint Metric feature allows you to state limitations at experiment creation to help inform SigOpt of the practical restrictions in your problem.
Defining the Constraint Metric
To assign one of your metrics as a Constraint Metric, specify the `strategy` field as `constraint` in the desired Metric object when creating your experiment. You must specify a `threshold` for the Constraint Metric. When the objective is set to `minimize`, observations with constraint metric values lower than or equal to the threshold are considered feasible. When the objective is set to `maximize`, observations with constraint metric values greater than or equal to the threshold are considered feasible.
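The feasibility rule above can be sketched in plain Python. This helper is our own illustration, not part of the SigOpt client:

```python
def is_feasible(value, threshold, objective):
    """Check whether a constraint metric value satisfies its threshold.

    Mirrors the rule described above: for a "minimize" objective the value
    must be at or below the threshold; for "maximize", at or above it.
    """
    if objective == "minimize":
        return value <= threshold
    return value >= threshold

# An inference time of 80 ms satisfies a 100 ms "minimize" threshold;
# 120 ms does not.
print(is_feasible(80, 100, "minimize"))   # True
print(is_feasible(120, 100, "minimize"))  # False
```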
Creating the Experiment
Below, we create a new experiment, using the above example of optimizing the accuracy of the network subject to an inference time constraint.
```python
from sigopt import Connection

conn = Connection(client_token="SIGOPT_API_TOKEN")

experiment = conn.experiments().create(
    name="Neural network with inference time constraint",
    parameters=[
        dict(
            name="log10_learning_rate",
            bounds=dict(min=-4, max=0),
            type="double",
        ),
        dict(
            name="nodes_per_layer",
            bounds=dict(min=5, max=20),
            type="int",
        ),
    ],
    metrics=[
        dict(
            name="inference_time_milliseconds",
            objective="minimize",
            strategy="constraint",
            threshold=100,
        ),
        dict(
            name="validation_accuracy",
            objective="maximize",
            strategy="optimize",
        ),
    ],
    observation_budget=65,
    parallel_bandwidth=2,
    project="sigopt-examples",
    type="offline",
)
print("Created experiment: https://app.sigopt.com/experiment/" + experiment.id)
```
Like Metric Thresholds, Constraint Metrics allow you to update the `threshold` on the properties page of an experiment at any time while the experiment is in progress. The thresholds can also be updated directly through our API. An example of this is given below.
```python
# Run the experiment above for some number of observations,
# then lower the threshold on inference time.
experiment = conn.experiments(experiment.id).update(
    metrics=[
        dict(
            name="inference_time_milliseconds",
            objective="minimize",
            strategy="constraint",
            threshold=50,
        ),
        dict(
            name="validation_accuracy",
            objective="maximize",
            strategy="optimize",
        ),
    ],
)
```
If the `threshold` defined for a Constraint Metric cannot be satisfied anywhere in the domain, this feature will behave unpredictably. For example, if the inference time threshold is set to 0, SigOpt will assume 0 ms is actually achievable and explore the domain erratically trying to find an observation that satisfies this threshold. It is best to state thresholds at the start of an experiment that are well understood from previous experiments or prior knowledge.
The Best Assignments List will only consider observations that satisfy all constraint metric thresholds. In the early stages of an experiment, it is possible that no observation satisfies all thresholds; in that situation, the Best Assignments List will return no best assignments (an empty Pagination), and we recommend waiting until the experiment is further along before retrieving best assignments.
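As a rough local illustration of this filtering (a sketch of the idea, not the SigOpt implementation), only observations whose constraint metric values satisfy every threshold are candidates for best assignments; among those, the best optimized-metric value wins. The observation values below are hypothetical:

```python
# Hypothetical reported values for the example experiment.
observations = [
    {"validation_accuracy": 0.91, "inference_time_milliseconds": 140},
    {"validation_accuracy": 0.88, "inference_time_milliseconds": 95},
    {"validation_accuracy": 0.90, "inference_time_milliseconds": 99},
]

# Constraint: inference time must be <= 100 ms ("minimize" objective).
feasible = [o for o in observations if o["inference_time_milliseconds"] <= 100]

# Among feasible observations, the best is the one with the highest
# optimized metric; an empty list corresponds to an empty Pagination.
best = max(feasible, key=lambda o: o["validation_accuracy"], default=None)
print(best)  # {'validation_accuracy': 0.9, 'inference_time_milliseconds': 99}
```

Note that the 0.91-accuracy observation is excluded despite having the best optimized metric, because it violates the constraint.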
During development of this feature, we have produced some guidance regarding the successful use of Constraint Metrics to control the optimization of the desired metrics.
- There can be up to 4 Constraint Metrics for an experiment.
- Constraint Metrics must define a threshold.
- All constraint metrics must be reported for successful Observations.
- No constraint metrics can be reported for Failed Observations.
- observation_budget must be set for an experiment with Constraint Metrics.
- Conditional parameters are not permitted.
- Multisolution experiments are not permitted.
- Multitask experiments are not permitted.
- Consider setting the observation_budget higher. Constraint Metrics require additional Observations for SigOpt to understand the associated feasible parameter space. As a result, we recommend adding 25% to your original observation_budget for each additional Constraint Metric.
- Parallelism is another powerful tool for accelerating tuning by testing multiple Suggestions simultaneously. It can be used in conjunction with Constraint Metrics, but each Constraint Metric requires extra time for SigOpt to compute the associated feasible parameter space. We recommend limiting parallelism to no more than 5 simultaneously open suggestions.
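The budget guidance above is simple arithmetic. As a sketch (the helper and its round-up behavior are our own choices, not a SigOpt API), applied to the example experiment's budget of 65 with one Constraint Metric:

```python
import math

def recommended_budget(base_budget, num_constraint_metrics):
    """Add 25% of the original observation_budget per Constraint Metric,
    per the guidance above. Rounding up is our own convention."""
    return math.ceil(base_budget * (1 + 0.25 * num_constraint_metrics))

# The example experiment uses observation_budget=65 with one
# Constraint Metric, so a more generous budget would be:
print(recommended_budget(65, 1))  # 82
```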