Webinar

Population Based Training with Ray Tune

Wednesday, November 18, 6:00PM UTC

Population Based Training (PBT) is a hyperparameter optimization method that trains many models in parallel and uses information from the entire population to refine hyperparameters and allocate more resources to promising configurations. Ray Tune is one of the only libraries to offer a distributed implementation of PBT. In this talk, we will compare PBT against other hyperparameter optimization algorithms to show why it is more cost efficient, show how to use PBT with Tune, and give an update on recent PBT improvements in Ray. Finally, we will discuss a new algorithm, Population Based Bandits, which combines PBT with Bayesian Optimization and was added to Tune in v1.0.1.
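
For those who want a feel for the API before the session, here is a minimal sketch of how the PopulationBasedTraining scheduler plugs into Tune. The toy training function, the mean_score metric, and the lr/momentum hyperparameters are illustrative placeholders, not code from the talk.

    # Minimal PBT sketch with Ray Tune; the objective and hyperparameters are toy examples.
    import random

    from ray import tune
    from ray.tune.schedulers import PopulationBasedTraining


    def train_fn(config):
        # Stand-in for a real training loop; reports a running score each iteration.
        score = 0.0
        for _ in range(100):
            score += config["lr"] * random.random()
            tune.report(mean_score=score)


    pbt = PopulationBasedTraining(
        time_attr="training_iteration",
        metric="mean_score",
        mode="max",
        perturbation_interval=10,  # how often trials exploit better peers and explore
        hyperparam_mutations={
            # Hyperparameters to resample or perturb during training.
            "lr": lambda: random.uniform(1e-4, 1e-1),
            "momentum": [0.8, 0.9, 0.99],
        },
    )

    analysis = tune.run(
        train_fn,
        config={"lr": 1e-3, "momentum": 0.9},
        scheduler=pbt,
        num_samples=8,  # size of the population
    )

    print(analysis.get_best_config(metric="mean_score", mode="max"))

Each of the 8 trials trains in parallel; at every perturbation interval, underperforming trials copy the weights and hyperparameters of stronger ones and then mutate them, which is what lets PBT adapt hyperparameters over the course of a single run.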

Schedule

10:00am - 10:05am: Introductions and welcome

10:05am - 10:30am: Population Based Training with Ray Tune

10:30am - 11:00am: Audience Q&A

Speakers

Amog Kamsetty

Amog is a software engineer at Anyscale. He helps maintain and develop two machine learning libraries: Ray Tune for distributed hyperparameter tuning and RaySGD for distributed deep learning training.