# sml: Swiss AI Model Launch


Launch LLMs with ease 🚀

A CLI for launching LLMs on HPC clusters via SLURM directly or through FirecREST. Public serving endpoint: https://serving.swissai.svc.cscs.ch/.

## Quickstart

```shell
pip install git+https://github.com/swiss-ai/model-launch.git
sml init
sml
```

That's it: the first command installs the package, `sml init` sets up credentials, and `sml` launches a model interactively.

Prefer a script you can copy? Browse examples/ and run any of them after pip install.
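Once a launch is live, the public serving endpoint mentioned above can be queried. A minimal sketch, assuming the endpoint exposes an OpenAI-compatible `/v1/chat/completions` route (an assumption not confirmed by this README; the route and model identifier are illustrative):

```python
import json

# Hypothetical route: assumes an OpenAI-compatible chat completions
# API behind the public serving endpoint (not confirmed by the repo).
ENDPOINT = "https://serving.swissai.svc.cscs.ch/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> dict:
    """Assemble an OpenAI-style chat completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_chat_request("Apertus-8B", "Say hello.")
print(json.dumps(payload, indent=2))
```

Send the payload with any HTTP client (e.g. `curl -d @payload.json` against `ENDPOINT`); the exact route and model name depend on the deployed serving stack.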

## Documentation

| Topic | When to read |
| --- | --- |
| Getting Started | First time here |
| Initialization | Setting up credentials, FirecREST vs SLURM |
| Using SML | Day-to-day launches via the interactive CLI |
| Advanced Usage | Full SLURM/framework control |
| How to Size a Model | Picking replica/node layout for a given model |
| Benchmarking | Measuring throughput and latency |
| MCP Server | Driving SML from Claude Desktop / Cursor |
| Architecture | How SML fits with serving-api and opentela |
| Development | Contributing to SML itself |
| CI/CD | Pipeline structure |
| FAQ | Always-on hosting, common gotchas |

A rendered docs site is built from the same files via MkDocs: run `make docs` for a local preview, or browse the published site at https://swiss-ai.github.io/model-launch/.

*Launching Apertus-8B with `sml`*

## License

Apache 2.0.
