Documentation

Welcome to the developer documentation for SigOpt. If you have a question you can’t answer, feel free to contact us! If you're looking for the classic SigOpt documentation, you can find that here. Otherwise, happy optimizing!

Optimize Your Model

A key component of the SigOpt Platform is the ability to go from tracking your model with SigOpt Runs to optimizing that very same model with minimal changes to your code.

At a high level, a SigOpt Experiment is a grouping of SigOpt Runs over a user-defined parameter space and metric space. A SigOpt Experiment has a budget that determines the number of hyperparameter tuning loops to conduct. Each tuning loop produces a SigOpt Run with suggested assignments for each parameter. Different sets of hyperparameter values are suggested by the SigOpt algorithms, by the user, or both, with the goal of finding the optimal set(s) of hyperparameters. Over time, when using the SigOpt Optimizer, you can expect your model to perform better on your metrics.

Create a SigOpt Experiment

A SigOpt Experiment is created as an object containing all information needed to execute the optimization process. To create a SigOpt Experiment, you will define:

| Name               | Type    | Key                | Description                                                                   |
|--------------------|---------|--------------------|-------------------------------------------------------------------------------|
| Experiment Name    | String  | name               | Name of the experiment.                                                       |
| Experiment Type    | String  | type               | Identifier for the type of experiment.                                        |
| Parameter Space    | Array   | parameters         | List of objects, each defining a single parameter.                            |
| Metric Space       | Array   | metrics            | List of objects, each defining a single metric.                               |
| Budget             | Integer | budget             | Minimum number of SigOpt Runs in a given SigOpt Experiment.                   |
| Parallel Bandwidth | Integer | parallel_bandwidth | Number of machines/nodes running asynchronous SigOpt Runs at any given time.  |

All of the above information is available through the Experiment object returned when you create the experiment, as well as on the experiment's unique page in the web dashboard. For example, in Python:

import sigopt

# Define the parameter space, metric space, budget, and parallel bandwidth,
# then create the experiment. create_experiment returns the Experiment object.
experiment = sigopt.create_experiment(
  name="Keras Model Optimization (Python)",
  type="offline",
  parameters=[
    dict(
      name="hidden_layer_size",
      type="int",
      bounds=dict(
        min=32,
        max=128
      )
    ),
    dict(
      name="activation_fn",
      type="categorical",
      categorical_values=[
        "relu",
        "tanh"
      ]
    )
  ],
  metrics=[
    dict(
      name="holdout_accuracy",
      objective="maximize"
    )
  ],
  parallel_bandwidth=1,
  budget=30
)
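
Once the experiment is created, its budget is consumed by running the optimization loop: each iteration yields a SigOpt Run with suggested assignments, you train and evaluate your model with them, and you report the resulting metric back. The snippet below is a minimal sketch of that loop against the experiment created above, assuming the experiment.loop(), run.log_metric(), and experiment.get_best_runs() patterns from the SigOpt Python client; train_and_evaluate_model is a hypothetical placeholder for your own Keras training code.

# Minimal sketch: drive the optimization loop for the experiment created above.
# train_and_evaluate_model is a hypothetical placeholder for your own training code.
for run in experiment.loop():
  with run:
    # Suggested assignments for this Run are available on run.params
    accuracy = train_and_evaluate_model(
      hidden_layer_size=run.params.hidden_layer_size,
      activation_fn=run.params.activation_fn
    )
    # Report the metric defined in the experiment's metric space
    run.log_metric("holdout_accuracy", accuracy)

# After the budget is exhausted, retrieve the best-performing Runs
best_runs = experiment.get_best_runs()

Each completed Run counts toward the budget of 30 defined above, and the same progress is reflected on the experiment's page in the web dashboard.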