All Posts

The 2021 Ray Community Pulse Survey is Now Open

Why I joined Anyscale

A bold vision: to make distributed computing simple and accessible from anywhere. An open source model with a developer-first mindset. A vibrant and rapidly growing community. And a stellar team with a track record of building great technology to solve hard problems.

Why you should build your AI Applications with Ray

Attention Nets and More with RLlib's Trajectory View API

Online Resource Allocation with Ray at Ant Group

Executing a distributed shuffle without a MapReduce system

How to Speed Up Pandas with Modin

Getting Started with Distributed Machine Learning with PyTorch and Ray

Ray is a popular framework for distributed Python that can be paired with PyTorch to rapidly scale machine learning applications.
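The post walks through this pairing in detail; as a rough, hedged illustration of the general pattern only (the function, model, and synthetic data below are placeholders, not code from the post), Ray tasks can train several small PyTorch models in parallel:

```python
# Minimal sketch: Ray tasks run independent PyTorch training jobs in parallel.
import ray
import torch
import torch.nn as nn

ray.init()

@ray.remote
def train_shard(seed: int, num_steps: int = 100):
    """Train a tiny regression model on synthetic data and return its weights."""
    torch.manual_seed(seed)
    model = nn.Linear(10, 1)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.MSELoss()
    x, y = torch.randn(256, 10), torch.randn(256, 1)
    for _ in range(num_steps):
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
    return {k: v.detach() for k, v in model.state_dict().items()}

# Launch four training tasks; Ray schedules them across available cores or nodes.
results = ray.get([train_shard.remote(seed) for seed in range(4)])
print(f"Trained {len(results)} models in parallel")
```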

Data Processing Support in Ray

Retrieval Augmented Generation with Huggingface Transformers and Ray

Huggingface Transformers recently added the Retrieval Augmented Generation (RAG) model, a new NLP architecture that leverages external documents (like Wikipedia) to augment its knowledge and achieve state-of-the-art results on knowledge-intensive tasks. In this blog post, we introduce the integration of Ray, a library for building scalable applications, into the RAG contextual document retrieval mechanism. This speeds up retrieval calls by 2x and improves the scalability of RAG distributed fine-tuning.
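The post describes the actual integration; the sketch below only illustrates the underlying idea of actor-based retrieval. All names and the scoring logic here are invented for illustration and are not the Transformers or RAG API: Ray actors host a document index so multiple fine-tuning workers can issue retrieval calls concurrently rather than funneling them through a single process.

```python
# Hedged sketch: a pool of Ray actors serves retrieval requests in parallel.
import ray

ray.init()

@ray.remote
class DocumentRetriever:
    """Toy stand-in for a retriever holding a document index (hypothetical)."""

    def __init__(self, documents):
        self.documents = documents

    def retrieve(self, query: str, k: int = 2):
        # Placeholder relevance scoring: count words shared with the query.
        scored = sorted(
            self.documents,
            key=lambda doc: len(set(query.split()) & set(doc.split())),
            reverse=True,
        )
        return scored[:k]

docs = [
    "ray scales python workloads",
    "rag retrieves wikipedia passages",
    "transformers models generate text",
]

# A small pool of retriever actors; workers spread their queries across them.
retrievers = [DocumentRetriever.remote(docs) for _ in range(2)]
queries = ["how does rag retrieve passages", "scaling python with ray"]
futures = [retrievers[i % len(retrievers)].retrieve.remote(q) for i, q in enumerate(queries)]
print(ray.get(futures))
```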