
Technology

Ray Tune

Ray Tune is a scalable, distributed hyperparameter optimization library built on Ray, moving seamlessly from a single machine to a large cluster.

Ray Tune manages hyperparameter search efficiently at any scale, from a laptop to a multi-node cluster. It integrates directly with major ML frameworks (PyTorch, TensorFlow, XGBoost) and offers advanced search algorithms such as Population Based Training (PBT) and ASHA (Asynchronous Successive Halving). You define a search space and a target metric (e.g., 'mean_accuracy'), and Ray's distributed actors run hundreds of trials in parallel. This cuts tuning time and improves model quality, without requiring code changes to scale out.

https://docs.ray.io/en/latest/tune/index.html
2 projects · 2 cities

Related technologies

Recent Talks & Demos

