Posts by Mark Saroufim

03.24.2022

Large-scale distributed training with TorchX and Ray

We're excited to introduce a new Ray Scheduler for TorchX — a joint effort from the PyTorch and Anyscale teams that lets developers run scalable, distributed PyTorch workloads without setting up infrastructure or modifying their training scripts.