
08.12.2021

Writing Your First Distributed Python Application with Ray

Ray is a fast, simple distributed execution framework that makes it easy to scale your applications and to leverage state-of-the-art machine learning libraries. Using Ray, you can take Python code that runs sequentially and transform it into a distri...

08.10.2021

Introducing Distributed LightGBM Training with Ray

LightGBM is a gradient boosting framework based on tree-based learning algorithms. Compared to XGBoost, it is a relatively new framework, but one that is quickly becoming popular in both academic and production use cases. We’re excited to announce a...

07.22.2021

Best Reinforcement Learning Talks from Ray Summit 2021

An overview of some of the best reinforcement learning talks presented at the second Ray Summit.

07.20.2021

Best Machine Learning Talks from Ray Summit 2021

An overview of some of the best machine learning talks presented at Ray Summit 2021.

07.13.2021

Serverless Kafka Stream Processing with Ray

Learn how Ray can be paired with Apache Kafka to power streaming applications.

06.29.2021

Analyzing memory management and performance in Dask-on-Ray

Ray is a general-purpose distributed system. One of Ray's goals is to seamlessly integrate data processing libraries (e.g., Dask, Spark) into distributed applications. As part of this goal, Ray provides a robust distributed memory manager. The goal...

06.16.2021

Introducing Distributed XGBoost Training with Ray

XGBoost-Ray is a novel backend for distributed XGBoost training. It features multi-node and multi-GPU training, distributed data loading, advanced fault tolerance such as elastic training, and a seamless integration with hyperparameter optimization f...

06.14.2021

Ray Distributed Library Patterns

Ray has many library integrations, from machine learning libraries such as Horovod and Hugging Face to data processing frameworks such as Spark, Modin, and Dask. But what does it mean to be "integrated with Ray"? And what benefits does it provide to...

05.28.2021

Introducing Collective Communication Primitive APIs in Ray

In Ray 1.2.0, we’ve added a library for “collective communication primitives” to Ray. These primitives can be used in your Ray program to simplify the exchange of information across many distributed processes at the same time, speeding up certain dis...

05.17.2021

Autoscaling clusters with Ray

This post explores how Ray clusters work. To demonstrate the speedup a Ray cluster provides, it performs a compute-intensive image processing task (palette extraction) on a large dataset (thousands of movie posters).
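For reference, an autoscaling cloud cluster is described to Ray in a YAML file; the sketch below uses the 2021-era schema (`head_node`/`worker_nodes`, which later versions replace with `available_node_types`), and the instance types and region are placeholders:

```yaml
cluster_name: palette-demo
min_workers: 0          # scale down to just the head node when idle
max_workers: 4          # autoscaler adds workers up to this cap under load
idle_timeout_minutes: 5

provider:
    type: aws
    region: us-west-2

auth:
    ssh_user: ubuntu

head_node:
    InstanceType: m5.large
worker_nodes:
    InstanceType: m5.large
```

The cluster is then launched with `ray up <config>.yaml`, and the autoscaler grows and shrinks the worker pool between `min_workers` and `max_workers` based on load.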