This repository has been archived by the owner on Nov 16, 2023. It is now read-only.

Hyperparameter Tuning

The goal of hyperparameter tuning is to find good values for the constants supplied as input to each training job, such as the learning rate and the network architecture. Hyperparameter tuning can be treated as an outer layer around the actual learning problem, and it may require considerably more computational resources.

We provide a collection of helper classes and functions in job_factory.py and experiment.py for submitting and monitoring hyperparameter tuning jobs. Please refer to the README.md for details.

The Random-Search recipe shows how to implement basic random-search hyperparameter tuning using Batch AI.
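As a rough illustration of the idea behind that recipe, random search simply samples each hyperparameter independently from its search space, evaluates every sample, and keeps the best. The sketch below is a minimal, self-contained version; `train_and_evaluate` is a hypothetical stand-in for submitting a real Batch AI training job, and the search space is made up for the example.

```python
import random

def train_and_evaluate(params):
    # Toy loss surface standing in for a real training job's
    # validation loss; a real recipe would submit a Batch AI job here.
    return (params["learning_rate"] - 0.01) ** 2 \
        + (params["batch_size"] - 64) ** 2 / 1e4

def random_search(num_trials, seed=0):
    rng = random.Random(seed)
    best_params, best_loss = None, float("inf")
    for _ in range(num_trials):
        # Sample each hyperparameter independently from its search space.
        params = {
            "learning_rate": 10 ** rng.uniform(-4, -1),  # log-uniform
            "batch_size": rng.choice([16, 32, 64, 128]),
        }
        loss = train_and_evaluate(params)
        if loss < best_loss:
            best_params, best_loss = params, loss
    return best_params, best_loss

best_params, best_loss = random_search(num_trials=20)
print(best_params, best_loss)
```

Because the trials are independent, each sampled configuration can be submitted as a separate Batch AI job and evaluated in parallel.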

The Hyperband recipe shows how to implement the basic Hyperband hyperparameter tuning algorithm using Batch AI.
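For orientation, the standard Hyperband algorithm runs several brackets of successive halving: each bracket starts many configurations on a small budget, keeps the best 1/eta fraction, and reruns the survivors with eta times the budget. The sketch below is a minimal version under assumed defaults (max_iter=81, eta=3); `train_and_evaluate` and the learning-rate range are hypothetical stand-ins for real Batch AI training jobs.

```python
import math
import random

def train_and_evaluate(params, num_iters):
    # Toy stand-in: loss improves with training budget. A real recipe
    # would run a Batch AI job for `num_iters` epochs and report loss.
    return (params["learning_rate"] - 0.01) ** 2 + 1.0 / num_iters

def sample_params(rng):
    return {"learning_rate": 10 ** rng.uniform(-4, -1)}  # log-uniform

def hyperband(max_iter=81, eta=3, seed=0):
    rng = random.Random(seed)
    s_max = int(math.log(max_iter) / math.log(eta))
    budget = (s_max + 1) * max_iter
    best_params, best_loss = None, float("inf")
    # Each bracket s trades off number of configs vs budget per config.
    for s in range(s_max, -1, -1):
        n = int(math.ceil(budget / max_iter * eta ** s / (s + 1)))
        r = max_iter * eta ** (-s)  # initial per-config budget
        configs = [sample_params(rng) for _ in range(n)]
        # Successive halving within the bracket.
        for i in range(s + 1):
            n_i = int(n * eta ** (-i))
            r_i = int(r * eta ** i)
            losses = [train_and_evaluate(c, r_i) for c in configs]
            ranked = sorted(zip(losses, range(len(configs))))
            if ranked[0][0] < best_loss:
                best_loss = ranked[0][0]
                best_params = configs[ranked[0][1]]
            keep = max(1, int(n_i / eta))  # keep the top 1/eta configs
            configs = [configs[idx] for _, idx in ranked[:keep]]
    return best_params, best_loss

best_params, best_loss = hyperband()
print(best_params, best_loss)
```

Within each rung of a bracket, the evaluations are independent, so they map naturally onto parallel Batch AI jobs; only the selection step between rungs is sequential.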

Help or Feedback

If you have any problems or questions, you can reach the Batch AI team at AzureBatchAITrainingPreview@service.microsoft.com or you can create an issue on GitHub.

We also welcome your contributions of additional sample notebooks, scripts, or other examples of working with Batch AI.