All Posts

07.22.2021

Best Reinforcement Learning Talks from Ray Summit 2021

An overview of some of the best reinforcement learning talks presented at Ray Summit 2021.

07.20.2021

Best Machine Learning Talks from Ray Summit 2021

An overview of some of the best machine learning talks presented at Ray Summit 2021.

07.13.2021

Serverless Kafka Stream Processing with Ray

Learn how Ray can be paired with Apache Kafka to power streaming applications.

06.29.2021

Analyzing memory management and performance in Dask-on-Ray

Ray is a general-purpose distributed system. One of Ray's goals is to seamlessly integrate data processing libraries (e.g., Dask, Spark) into distributed applications. As part of this goal, Ray provides a robust distributed memory manager. The goal...

06.16.2021

Introducing Distributed XGBoost Training with Ray

XGBoost-Ray is a novel backend for distributed XGBoost training. It features multi-node and multi-GPU training, distributed data loading, advanced fault tolerance such as elastic training, and seamless integration with hyperparameter optimization f...

06.14.2021

Ray Distributed Library Patterns

Ray has many library integrations, from machine learning libraries such as Horovod and Hugging Face to data processing frameworks such as Spark, Modin, and Dask. But what does it mean to be "integrated with Ray"? And what benefits does it provide to...

05.28.2021

Introducing Collective Communication Primitive APIs in Ray

In Ray 1.2.0, we’ve added a library for “collective communication primitives” to Ray. These primitives can be used in your Ray program to simplify the exchange of information across many distributed processes at the same time, speeding up certain dis...

05.17.2021

Autoscaling clusters with Ray

This post explores how Ray clusters work. To demonstrate how Ray clusters can speed up computation, this post performs a compute-intensive image processing task (palette extraction) on a large dataset (thousands of movie posters).

05.12.2021

The 2021 Ray Community Pulse Survey is Now Open

Calling all Ray users! Take a few minutes to complete the Ray Community Pulse survey to let us know how you use Ray and help guide our roadmap.

05.11.2021

Why I joined Anyscale

A bold vision: to make distributed computing simple and accessible from anywhere. An open source model with a developer-first mindset. A vibrant and rapidly growing community. And a stellar team, with a track record of building great technology to sol...