Hyperparameter tuning with external model files

Use APIs to run hyperparameter tuning on external models.

To prepare your external model files, do the following:
  1. Obtain proposed hyperparameters:
    The hyperparameters are supplied in a file named config.json, a JSON-formatted dictionary located in the current folder. It can be read with the following example snippet, which expects a hyperparameter named initial_learning_rate to be defined:
    import json
    with open("config.json") as f:
        hyper_params = json.load(f)
    learning_rate = float(hyper_params.get("initial_learning_rate", "0.01"))
    
  2. Output training results:

    At the end of your training run, your code must create a file named $RESULT_DIR/val_dict_list.json that contains the series of test metrics generated during training. The HPO algorithm analyzes this file and uses the statistics it contains to guide the choice of hyperparameters in subsequent runs; a minimal end-to-end sketch appears after this list.

    The content of val_dict_list.json looks like the example below. The "step" field is optional; either "loss" or "accuracy" can be the name of the target metric to optimize, and at least one metric must be included.
    [
    {"step": 1, "loss": 0.2487, "accuracy": 0.4523},
    {"step": 2, "loss": 0.1487, "accuracy": 0.5523},
    {"step": 3, "loss": 0.1087, "accuracy": 0.6523},
    …
    ]
  3. For an elastic distributed training model, consider using the callback functions to log the metrics; a generic callback sketch appears after this list. See Elastic distributed training usage and examples.
  4. To start using the HyperSearch API, see Deep learning API.
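
Putting steps 1 and 2 together, here is a minimal end-to-end sketch. It assumes that the tuning service sets the RESULT_DIR environment variable, that initial_learning_rate is the only tuned hyperparameter, and that the training loop itself is elided; the metric values reuse the example above.

import json
import os

# Step 1: read the proposed hyperparameters from config.json.
with open("config.json") as f:
    hyper_params = json.load(f)
learning_rate = float(hyper_params.get("initial_learning_rate", "0.01"))

# ... train the model with learning_rate, collecting one metrics
# dictionary per step in the format shown above ...
val_dict_list = [
    {"step": 1, "loss": 0.2487, "accuracy": 0.4523},
    {"step": 2, "loss": 0.1487, "accuracy": 0.5523},
]

# Step 2: write the metrics series to $RESULT_DIR/val_dict_list.json
# so that the HPO algorithm can analyze it.
result_dir = os.environ["RESULT_DIR"]  # assumed to be set by the tuning service
with open(os.path.join(result_dir, "val_dict_list.json"), "w") as f:
    json.dump(val_dict_list, f)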
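
For step 3, the elastic distributed training callbacks are specific to that framework; see its usage and examples. As a generic illustration of the logging pattern only, the sketch below uses a standard tf.keras callback that accumulates per-epoch metrics and writes the results file when training ends. The use of tf.keras and the metric names are assumptions for illustration, not the elastic distributed training API.

import json
import os

import tensorflow as tf

class HPOMetricsLogger(tf.keras.callbacks.Callback):
    """Accumulate per-epoch metrics and write val_dict_list.json at train end."""

    def __init__(self):
        super().__init__()
        self.val_dict_list = []

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        # "step" is optional; the epoch number is recorded as the step here.
        self.val_dict_list.append({
            "step": epoch + 1,
            "loss": float(logs.get("loss", 0.0)),
            "accuracy": float(logs.get("accuracy", 0.0)),
        })

    def on_train_end(self, logs=None):
        result_dir = os.environ["RESULT_DIR"]  # assumed to be set by the tuning service
        with open(os.path.join(result_dir, "val_dict_list.json"), "w") as f:
            json.dump(self.val_dict_list, f)

Pass an instance to model.fit, for example: model.fit(x, y, epochs=3, callbacks=[HPOMetricsLogger()]).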