
Ray is Joining The PyTorch Foundation

By Robert Nishihara, Philipp Moritz and Ion Stoica   |   October 22, 2025

Today, Ray is joining the PyTorch Foundation, which is part of the Linux Foundation. The Linux Foundation is home to foundational infrastructure projects including Kubernetes, PyTorch, and vLLM.

Daily downloads of Ray have grown nearly 10x over the past year.

Today, Ray drives billions of dollars of annual compute consumption and powers AI workloads across the globe. Ray Summit 2025 featured use cases from companies across the industry, including:

  • AI startups: xAI, Thinking Machines, Perplexity

  • Established tech companies: Netflix, ByteDance, Apple, Tencent

  • Major enterprises: JPMorgan, BMW, Bridgewater, Workday

[Figure: Ray downloads per day]

Early Days

We started Ray at UC Berkeley because, as graduate students working on AI, we needed to scale our algorithms across CPU and GPU clusters, and we found ourselves spending most of our time building infrastructure to solve the distributed systems challenges of scaling (reliability, cost, performance).

From the start, we were interested in producing useful open-source software. And we were inspired by the previous generation of students in our lab who had created Apache Spark.

We started Anyscale in 2019 to develop and commercialize Ray. However, at the time, the need for scale existed only among the earliest of adopters. Ant Group was the first major production user of Ray. Uber and Pinterest were among the next. But the reality was that, at the time, the vast majority of companies didn’t need to scale compute for AI.

All of that changed with the advent of generative AI, LLMs, and today’s reasoning models. Now in 2025, the need for scale is clear. And the software engineering challenges around scaling AI workloads are far greater today than they were when we started Ray.

Full Circle

In some ways, we have come full circle. From the start, we envisioned Ray as a general-purpose framework (at its core, Ray is an actor framework), but the very first use case we pursued was reinforcement learning. We used RL to showcase Ray's advantages in the original Ray paper. We focused on reinforcement learning because we ourselves were working on RL (TRPO, RLlib), and a huge fraction of the AI community was working on playing Atari games and learning motor control in physics simulators like MuJoCo. However, that wave of excitement around reinforcement learning was relatively siloed and didn't translate into widespread practical application.
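The actor pattern mentioned above can be sketched in a few lines of plain Python. This is a hypothetical illustration of the core idea (private state plus a mailbox of messages), not Ray's actual API: in Ray, an actor is a class decorated with `@ray.remote` whose method calls return futures and can run anywhere on a cluster.

```python
# Minimal single-process sketch of the actor pattern (illustration only,
# not Ray's API): an actor owns private state and processes one message
# at a time from its mailbox, so no locks are needed around the state.
import threading
import queue

class CounterActor:
    def __init__(self):
        self._mailbox = queue.Queue()
        self._count = 0
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def _run(self):
        # The actor loop: messages are handled strictly one at a time.
        while True:
            msg, reply = self._mailbox.get()
            if msg == "stop":
                break
            if msg == "increment":
                self._count += 1
                reply.put(self._count)

    def increment(self):
        # Send a message and wait for the reply -- loosely analogous to
        # calling a method on a Ray actor and blocking on the result.
        reply = queue.Queue()
        self._mailbox.put(("increment", reply))
        return reply.get()

    def stop(self):
        self._mailbox.put(("stop", None))

actor = CounterActor()
results = [actor.increment() for _ in range(3)]
actor.stop()
print(results)  # -> [1, 2, 3]
```

Ray generalizes this idea by making actors remote and distributed: method calls become asynchronous tasks, and the scheduler places actors across a cluster of machines.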

Today, reinforcement learning has made a major comeback through its success in building reasoning models and shaping agentic behavior. And today, nearly every open-source RL framework for post-training LLMs is built on top of Ray.

PyTorch Foundation

When it comes to infrastructure software, open source wins. As the software compute stack for AI emerges, every layer of the stack will have an open-source standard. PyTorch is the dominant deep learning framework. Kubernetes is the dominant container orchestrator. We are building Ray to serve as the open-source distributed compute engine.

Ray adoption has exploded over the last year. Ray is joining the PyTorch Foundation now because we are growing a global community of contributors. The need for scale is only growing, and our intention is to invest heavily and to build the open-source community around Ray to meet the rapidly evolving needs of AI.

Join the Ray Community

This is just the beginning.

Together, we’ll continue shaping the future of AI infrastructure — open and community-driven.
