Multimetric Experiments
In the standard optimization problem, a set of input parameters yields a metric of interest, such as trading portfolio profit or machine learning accuracy. SigOpt's goal is to determine the parameters which maximize that metric.
How Multimetric Experiments are Different
In many applications, however, it may be necessary to maximize two competing metrics whose maximal values occur at different parameter settings. Such a situation is referred to as a multimetric (or multicriteria/multiobjective) experiment. Further discussion of this topic appears on our blog.
Multimetric Example
The contour plots below depict two competing metrics: no single choice of the parameters x1 and x2 can simultaneously maximize both f1 and f2.
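If the plots themselves are unavailable, comparable contour plots can be generated with a minimal sketch like the one below. It assumes numpy and matplotlib are installed; neither is part of the SigOpt client, and the functions are the two quadratics defined in the next section.
import matplotlib.pyplot as plt
import numpy as np

# Evaluate both metrics on a grid over the parameter bounds [0, 10] x [0, 10]
x1, x2 = np.meshgrid(np.linspace(0, 10, 100), np.linspace(0, 10, 100))
f1 = -((x1 - 7) ** 2 + (x2 - 8) ** 2)  # maximized at (7, 8)
f2 = -((x1 - 1) ** 2 + (x2 - 4) ** 2)  # maximized at (1, 4)

fig, axes = plt.subplots(1, 2, figsize=(10, 4))
for ax, f, title in zip(axes, (f1, f2), ('f1', 'f2')):
    ax.contourf(x1, x2, f)
    ax.set_xlabel('x1')
    ax.set_ylabel('x2')
    ax.set_title(title)
plt.show()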
Defining a Multimetric Function in SigOpt
A SigOpt multimetric Experiment can be conducted to explore the maximum values achievable for both metrics. We define these metrics in Python in the code block below, along with the associated experiment metadata that will be used to define the experiment to SigOpt. Notice that, unlike a single-metric function, this multimetric function returns a list of dictionaries, each of which contains the name of a metric and the associated value for that metric.
def evaluate_function(assignments):
    x1 = assignments['x1']
    x2 = assignments['x2']
    # Two competing quadratics: f1 peaks at (7, 8), f2 peaks at (1, 4)
    f1_val = -((x1 - 7) ** 2 + (x2 - 8) ** 2)
    f2_val = -((x1 - 1) ** 2 + (x2 - 4) ** 2)
    return [{'name': 'f1', 'value': f1_val}, {'name': 'f2', 'value': f2_val}]
experiment_meta = {
    'name': '2D Quadratic Polynomials',
    'project': 'sigopt-examples',
    'parameters': [
        {'bounds': {'max': 10, 'min': 0}, 'name': 'x1', 'type': 'double'},
        {'bounds': {'max': 10, 'min': 0}, 'name': 'x2', 'type': 'double'},
    ],
    'observation_budget': 100,
    'metrics': [{'name': 'f1'}, {'name': 'f2'}],
    'parallel_bandwidth': 1,
}
Multimetric Suggestions and Observations
Using the standard SigOpt optimization loop, demonstrated below, each Suggestion is retrieved and evaluated, and the result is reported as an Observation. The values keyword is used to pass the result of the function evaluation.
experiment = conn.experiments().create(**experiment_meta)
while experiment.progress.observation_count < experiment.observation_budget:
    suggestion = conn.experiments(experiment.id).suggestions().create()
    values = evaluate_function(suggestion.assignments)
    # Report both metric values for this suggestion in a single Observation
    conn.experiments(experiment.id).observations().create(
        suggestion=suggestion.id,
        values=values,
    )
    experiment = conn.experiments(experiment.id).fetch()
Because these metrics compete (an improvement in one may require a penalty to the other), there is unlikely to be a single best solution to this, or any, multimetric optimization problem. See our discussion of Pareto Efficiency for help interpreting the results of this and other multimetric optimization problems.
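Once the loop completes, the efficient points SigOpt has found can be inspected. A minimal sketch, assuming the Best Assignments endpoint of the Python client returns the Pareto-efficient observations for a multimetric experiment:
# Fetch the best observations; for a multimetric experiment these are
# the Pareto-efficient points found so far
best_assignments = conn.experiments(experiment.id).best_assignments().fetch()
for best in best_assignments.data:
    print(best.assignments, [(value.name, value.value) for value in best.values])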
Thresholds for Multimetric Experiments
SigOpt will explore trade-offs between your metrics in order to map out the efficient frontier. To improve the quality of your experiment's results, you may want to quantify how much of a trade-off you are willing to make. This can be done with the Metric Thresholds feature, as sketched below.
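For example, thresholds might be attached to the metric definitions before the experiment is created. The sketch below assumes a threshold field on each metric, as described in the Metric Thresholds documentation; the numeric values are purely illustrative.
# Restrict the search to points where f1 >= -25 and f2 >= -36
# (illustrative thresholds for this example problem)
experiment_meta['metrics'] = [
    {'name': 'f1', 'threshold': -25},
    {'name': 'f2', 'threshold': -36},
]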
Limitations
- observation_budget must be set when a multimetric experiment is created
- The maximum number of optimized metrics is 2
- Multisolution experiments are not permitted
While this document discusses metric maximization, you can also minimize your metrics, as sketched below.
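A minimal sketch, assuming each metric definition accepts an objective field that defaults to 'maximize':
# Ask SigOpt to minimize both metrics instead of maximizing them
experiment_meta['metrics'] = [
    {'name': 'f1', 'objective': 'minimize'},
    {'name': 'f2', 'objective': 'minimize'},
]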