Webinar

Simplify and scale your XGBoost model using Ray on Anyscale

Wednesday, March 9, 5:00 PM UTC

In this webinar, we will cover how Ray, a universal distributed computing framework running on Anyscale, simplifies the end-to-end machine learning lifecycle and provides serverless compute without limits. We will walk through an end-to-end example using XGBoost; a minimal code sketch of the workflow follows the list below.

See firsthand how to:

  • Load data with Ray Datasets

  • Train an XGBoost model on Ray

  • Perform hyperparameter tuning with Ray Tune

  • Scale from your laptop to Anyscale with zero code changes

  • Track experiments with Weights & Biases
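The sketch below illustrates the kind of workflow the webinar covers: distributed XGBoost training with the xgboost_ray package, wrapped in a Ray Tune hyperparameter search. It is a minimal, illustrative example, not the webinar's actual code: it assumes `ray[tune]`, `xgboost_ray`, and `scikit-learn` are installed, and it uses a small in-memory dataset in place of Ray Datasets to stay self-contained. The function name `train_model` and the hyperparameter ranges are placeholders.

```python
# Minimal sketch: distributed XGBoost training plus hyperparameter tuning with Ray Tune.
from ray import tune
from sklearn.datasets import load_breast_cancer
from xgboost_ray import RayDMatrix, RayParams, train

# In the webinar, data is loaded with Ray Datasets (e.g. ray.data.read_csv);
# a small in-memory dataset keeps this sketch self-contained.
data, labels = load_breast_cancer(return_X_y=True)

ray_params = RayParams(num_actors=2, cpus_per_actor=1)  # spread training over 2 workers


def train_model(config):
    """Trainable run by Ray Tune for each hyperparameter sample."""
    train_set = RayDMatrix(data, labels)
    evals_result = {}
    train(
        {"objective": "binary:logistic", "eval_metric": ["logloss"], **config},
        train_set,
        evals_result=evals_result,
        evals=[(train_set, "train")],
        num_boost_round=50,
        ray_params=ray_params,
    )


if __name__ == "__main__":
    # Experiment tracking with Weights & Biases can be added through Ray Tune's
    # W&B integration (WandbLoggerCallback); it is omitted here to stay self-contained.
    analysis = tune.run(
        train_model,
        config={
            "eta": tune.loguniform(1e-3, 1e-1),
            "max_depth": tune.randint(2, 8),
        },
        num_samples=4,
        metric="train-logloss",
        mode="min",
        # Reserve the resources each trial's distributed training actors will use.
        resources_per_trial=ray_params.get_tune_resources(),
    )
    print("Best hyperparameters:", analysis.best_config)
```

Because the same script runs unchanged on a laptop or an Anyscale cluster, scaling up is a matter of pointing Ray at more resources rather than rewriting the training code.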


Speakers

Phi Nguyen

GTM Tech Lead, Anyscale

Phi has worked with Fortune 500 customers in retail, CPG, HCLS, and financial services, as well as with startups, to accelerate their machine learning practices. His engagements range from helping teams organize and build a center of excellence for ML, establish MLOps processes and automation, and develop and assess the feasibility of ML use cases, to providing cloud best practices that combine Ray with public clouds such as AWS and GCP or with open source projects running on Kubernetes.

Antoni Baum

Software Engineer, Anyscale

Antoni is a Software Engineer at Anyscale, working on Ray Tune and other ML libraries, and a Computer Science & Econometrics MSc student. In his spare time, he contributes to various open source projects, trying to make machine learning more accessible and approachable.