
TRAINING: Introduction to Ray for distributed applications

An introduction to Ray, the system for scaling your Python and machine learning applications from a laptop to a cluster. We'll start with a hands-on exploration of the core Ray API for distributed workloads and then move on to a quick introduction to Ray's native libraries, covering the following (a short code sketch follows this list):
- Remote functions as tasks
- Remote objects as futures
- Remote classes as stateful actors
- Quick introduction to Ray's native libraries
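Each of these patterns maps to a small piece of the Ray Core API. As a rough, illustrative sketch of what the hands-on portion works with (the function, class, and variable names below are invented for illustration, not taken from the course material):

```python
import ray

ray.init()  # start (or connect to) a local Ray runtime

# Remote function as a task: .remote() schedules it on the cluster and
# immediately returns an object reference (a future) instead of a value.
@ray.remote
def square(x):
    return x * x

futures = [square.remote(i) for i in range(4)]
print(ray.get(futures))  # resolve the futures -> [0, 1, 4, 9]

# Remote class as a stateful actor: each instance runs in its own worker
# process and keeps state across method calls.
@ray.remote
class Counter:
    def __init__(self):
        self.count = 0

    def increment(self):
        self.count += 1
        return self.count

counter = Counter.remote()
print(ray.get([counter.increment.remote() for _ in range(3)]))  # [1, 2, 3]

ray.shutdown()
```

With the `ray[default]` extras installed, `ray.init()` also serves the Ray Dashboard locally (typically at http://127.0.0.1:8265), which is the inspection tool mentioned in the takeaways below.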

Key takeaways:
- Understand what the Ray ecosystem is and why you would use it
- Learn the basic Ray Core Python APIs
- Use Ray APIs to convert Python functions and classes into distributed stateless tasks and stateful actors
- Use the Ray Dashboard to inspect your applications
- Learn about the purpose of Ray's native libraries and how to use them (a brief sketch follows this list)
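Ray's native libraries are its higher-level toolkits for common ML workloads (as of Ray 2.x: Ray Data, Ray Train, Ray Tune, Ray Serve, and RLlib). As a minimal, illustrative taste of what using one looks like, here is a small Ray Tune sketch using the Tuner API available in Ray 2.x; the objective function and search space are invented for this example, and the session's own examples may differ:

```python
from ray import tune

# Toy objective: Tune runs it once per sampled config and records the
# metrics it returns.
def objective(config):
    return {"score": (config["x"] - 3) ** 2}

# Sweep a small grid of values for x and keep the best configuration.
tuner = tune.Tuner(
    objective,
    param_space={"x": tune.grid_search([0, 1, 2, 3, 4])},
)
results = tuner.fit()
print(results.get_best_result(metric="score", mode="min").config)  # {'x': 3}
```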

About Jules

Jules S. Damji is a lead developer advocate at Anyscale and an MLflow contributor. He is a hands-on developer with over 20 years of experience and has worked at leading companies such as Sun Microsystems, Netscape, @Home, Opsware/Loudcloud, VeriSign, ProQuest, Hortonworks, and Databricks, building large-scale distributed systems. He holds a BSc and MSc in computer science (from Oregon State University and Cal State, Chico, respectively), and an MA in political advocacy and communication (from Johns Hopkins University).

Jules Damji

Lead Developer Advocate, Anyscale

Ready to Register?

Come connect with the global community of thinkers and disruptors who are building and deploying the next generation of AI and ML applications.

Save your spot

Join the Conversation

Ready to get involved in the Ray community before the conference? Ask a question in the forums. Open a pull request. Or share why you’re excited with the hashtag #RaySummit on Twitter.