Container orchestration for AI teams
dstack is an open-source alternative to Kubernetes and Slurm, designed to simplify container orchestration for AI workloads both in the cloud and on-prem. It supports NVIDIA and AMD GPUs, Google Cloud TPUs, and Intel accelerators.
Dev environments
Dev environments allow you to provision a remote machine, pre-configured with your code and your favorite IDE, with just one command.
They are perfect for interactively running code from an IDE or notebook before scheduling a task or deploying a service.
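For illustration, here is a minimal sketch of a dev environment configuration; the file name, IDE choice, and GPU size are placeholder values, not defaults:

    # .dstack.yml -- a minimal dev environment (illustrative values)
    type: dev-environment
    name: vscode-dev        # placeholder name
    python: "3.11"
    ide: vscode             # attach your desktop VS Code to the machine
    resources:
      gpu: 24GB             # any GPU with at least 24GB of memory

Applying the file with dstack apply provisions the machine and prints a link for opening it in your IDE.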


Tasks
A task allows you to schedule a job or run a web app. It lets you configure dependencies, resources, ports, and more. Tasks can be distributed and run on clusters.
Tasks are ideal for training and fine-tuning jobs or running apps for development purposes.
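As an illustrative sketch of a distributed task, where the name, script, node count, and resources are all placeholders:

    # task.dstack.yml -- an illustrative fine-tuning task
    type: task
    name: train-llm           # placeholder name
    python: "3.11"
    nodes: 2                  # run the job across two cluster nodes
    commands:
      - pip install -r requirements.txt
      - python train.py       # placeholder training script
    resources:
      gpu: 24GB               # per-node GPU memory request
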
Services
Services allow you to deploy web apps or models as private or public auto-scaling endpoints. You can configure dependencies, resources, authorization, auto-scaling rules, and more.
Once deployed, the web app or model can be used by anyone on the team.
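A service follows the same YAML schema; in this sketch, the image, model, port, and scaling numbers are illustrative:

    # service.dstack.yml -- an illustrative model endpoint
    type: service
    name: llm-service                # placeholder name
    image: ghcr.io/huggingface/text-generation-inference:latest
    env:
      - MODEL_ID=meta-llama/Llama-3.1-8B-Instruct
    port: 80                         # the port the container serves on
    resources:
      gpu: 24GB
    replicas: 1..4                   # auto-scale between one and four replicas
    scaling:
      metric: rps                    # scale on requests per second
      target: 10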


Fleets
Fleets enable efficient provisioning and management of clusters and instances, both in the cloud and on-prem.
Once a fleet is created, it can be reused by dev environments, tasks, and services.
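For illustration, a minimal cloud fleet configuration; the name, node count, and resources are placeholders:

    # fleet.dstack.yml -- an illustrative two-node cluster fleet
    type: fleet
    name: my-fleet            # placeholder name
    nodes: 2                  # provision two instances
    placement: cluster        # place them on the same network for fast interconnect
    resources:
      gpu: 24GB               # per-instance GPU memory request

Runs whose requirements fit the fleet reuse its instances instead of provisioning new ones.
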
Why ML engineers choose dstack

Andrew Spott
ML Engineer at Stealth Startup
Thanks to @dstack, I get the convenience of having a personal Slurm cluster and using budget-friendly cloud GPUs, without paying the super-high premiums charged by the big three.

Alvaro Bartolome
ML Engineer at Argilla
With @dstack it's incredibly easy to define a configuration within a repository and run it without worrying about GPU availability. It lets you focus on data and your research.

Park Chansung
ML Researcher at ETRI
Thanks to @dstack, I can effortlessly access the top GPU options across different clouds, saving me time and money while pushing my AI work forward.

Eckart Burgwedel
CEO at Uberchord
With @dstack, running LLMs on a cloud GPU is as easy as running a local Docker container. It combines the ease of Docker with the auto-scaling capabilities of K8S.

Peter Hill
Co-Founder at CUDO Compute
@dstack simplifies infrastructure provisioning and AI development. If your team is on the lookout for an AI platform, I wholeheartedly recommend @dstack.
Get started in under a minute
Have questions or need help? Talk to us on Discord.