AI container orchestration for everyone

dstack is an open-source alternative to Kubernetes, designed to simplify the development and deployment of AI. It works with top cloud providers and on-prem servers, and supports NVIDIA and AMD GPUs as well as TPUs.

Dev environments

Dev environments allow you to provision a remote machine, set up with your code and favorite IDE, with just one command.

Dev environments are perfect for interactively running code using your favorite IDE or notebook before scheduling a task or deploying a service.
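For illustration, a minimal dev environment configuration might look like the sketch below; the name, Python version, and GPU size are placeholder values, not a prescribed setup.

```yaml
type: dev-environment
# Placeholder name for this environment
name: vscode-demo
# Python version to pre-install
python: "3.11"
# Attach your desktop VS Code to the remote machine
ide: vscode
resources:
  # Request any GPU with at least 24GB of memory (placeholder)
  gpu: 24GB
```

A configuration like this is typically applied with the `dstack apply` CLI command, which provisions the machine and lets you attach your IDE to it.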

Learn more

Tasks

A task allows you to schedule a job or run a web app. It lets you configure dependencies, resources, ports, and more. Tasks can be distributed and run on clusters.

Tasks are ideal for training and fine-tuning jobs or running apps for development purposes.
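As a sketch, a task configuration could look roughly like this; the commands, node count, and GPU requirement are placeholders.

```yaml
type: task
# Placeholder name for the training run
name: train-demo
python: "3.11"
# Run across two nodes for a distributed job (optional)
nodes: 2
commands:
  - pip install -r requirements.txt
  - python train.py
resources:
  # Placeholder per-node GPU requirement
  gpu: 80GB
```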

Learn more

Services

Services allow you to deploy web apps or models as private or public auto-scalable endpoints. You can configure dependencies, resources, authorization, auto-scaling rules, and more.

Once deployed, the web app or model can be used by anyone on the team.
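A minimal service configuration might look along these lines; the model, port, and replica range shown here are placeholder choices.

```yaml
type: service
# Placeholder name for the endpoint
name: llm-demo
python: "3.11"
commands:
  - pip install vllm
  # Placeholder model; any web server exposing a port works similarly
  - python -m vllm.entrypoints.openai.api_server --model meta-llama/Llama-3.1-8B-Instruct
# Port the app listens on inside the container
port: 8000
# Scale between one and two replicas based on the auto-scaling rule below (placeholder values)
replicas: 1..2
scaling:
  metric: rps
  target: 10
resources:
  gpu: 24GB
```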

Learn more

Fleets

Fleets enable efficient provisioning and management of clusters and instances, both in the cloud and on-prem.

Once a fleet is created, it can be reused by dev environments, tasks, and services.
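For example, a small cloud fleet might be described like the sketch below; the name, node count, and GPU size are placeholders, and on-prem fleets are defined with the servers' SSH details instead.

```yaml
type: fleet
# Placeholder name for the fleet
name: demo-fleet
# Provision two instances that can be used as a cluster
nodes: 2
resources:
  # Placeholder per-instance GPU requirement
  gpu: 24GB
```

Once applied, runs whose requirements fit the fleet reuse its instances instead of provisioning new ones.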

Learn more

Why ML engineers love dstack

Andrew Spott

ML Engineer at Stealth Startup

Thanks to @dstack, I get the convenience of having a personal Slurm cluster and using budget-friendly cloud GPUs, without paying the super-high premiums charged by the big three.

Alvaro Bartolome

ML Engineer at Argilla

With @dstack it's incredibly easy to define a configuration within a repository and run it without worrying about GPU availability. It lets you focus on data and your research.

Park Chansung

ML Researcher at ETRI

Thanks to @dstack, I can effortlessly access the top GPU options across different clouds, saving me time and money while pushing my AI work forward.

Eckart Burgwedel

CEO at Uberchord

With @dstack, running LLMs on a cloud GPU is as easy as running a local Docker container. It combines the ease of Docker with the auto-scaling capabilities of K8S.

Peter Hill

Co-Founder at CUDO Compute

@dstack simplifies infrastructure provisioning and AI development. If your team is on the lookout for an AI platform, I wholeheartedly recommend @dstack.

Get started in under a minute

Open-source
Self-hosted. Use with your own cloud accounts or your own on-prem servers.
- Private subnets
- Multiple tenancies
- Control plane
- CLI & API
Install open-source. Always free.

dstack Sky
Hosted by dstack. Get the cheapest GPUs from the marketplace.
- Multiple tenancies
- Control plane
- CLI & API
Sign up now. Pay per compute.

Have questions or need help? Contact us or join the community.