
Commit dbb00a4

Merge pull request #1328 from hughiwnl/litellmdocs
litellm docs
2 parents 97a3dd6 + d909534 commit dbb00a4

File tree

2 files changed: +268 −0 lines


docs/mkdocs.yml

Lines changed: 1 addition & 0 deletions
```diff
@@ -394,6 +394,7 @@ nav:
     - LLM Providers:
       - Language Models:
         - Overview: "swarms/examples/model_providers.md"
+        - LiteLLM: "swarms/examples/litellm.md"
         - OpenAI: "swarms/examples/openai_example.md"
         - Anthropic: "swarms/examples/claude.md"
         - Groq: "swarms/examples/groq.md"
```

docs/swarms/examples/litellm.md

Lines changed: 267 additions & 0 deletions
# LiteLLM with Swarms

LiteLLM provides a unified interface for 100+ LLM providers. Swarms uses LiteLLM to support multiple providers through a single API.

## Quick Start

```python
from swarms import Agent

# Use any LiteLLM-supported model
agent = Agent(
    model_name="gpt-4o-mini",  # Change this to any provider
    max_loops=1,
)

response = agent.run("Hello, world!")
```
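The agent above defaults to a hosted OpenAI model, so the matching API key must be present in the environment before `agent.run` is called. A minimal fail-fast guard might look like this (the helper name and error message are illustrative, not part of Swarms):

```python
import os

def require_env(name: str) -> str:
    """Fail fast if a provider credential is missing from the environment."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Set {name} before running the agent.")
    return value

# For the default OpenAI models, for example:
# require_env("OPENAI_API_KEY")
```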
## Supported Providers

Switch providers by changing `model_name`:

```python
# OpenAI
Agent(model_name="gpt-4o")
Agent(model_name="gpt-4o-mini")
Agent(model_name="gpt-3.5-turbo")

# Anthropic Claude
Agent(model_name="claude-3-5-sonnet-20241022")
Agent(model_name="claude-3-opus")

# Google Gemini
Agent(model_name="gemini/gemini-pro")
Agent(model_name="gemini/gemini-1.5-pro")

# Azure OpenAI
Agent(model_name="azure/gpt-4")

# Ollama (local)
Agent(model_name="ollama/llama2")
Agent(model_name="ollama/mistral")

# Cohere
Agent(model_name="command-r")
Agent(model_name="command-r-plus")

# DeepSeek
Agent(model_name="deepseek/deepseek-chat")
Agent(model_name="deepseek/deepseek-r1")

# Groq
Agent(model_name="groq/llama-3.1-70b-versatile")

# OpenRouter
Agent(model_name="openrouter/google/palm-2-chat-bison")

# X.AI
Agent(model_name="xai/grok-beta")
```
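Since only the `model_name` string differs between providers, it can be convenient to pull it from configuration instead of hard-coding it. A minimal sketch (`SWARMS_MODEL_NAME` is a hypothetical variable name chosen for this example, not a setting Swarms itself reads):

```python
import os

# Choose the provider at deploy time rather than in code.
# SWARMS_MODEL_NAME is a hypothetical environment variable for this example.
model_name = os.environ.get("SWARMS_MODEL_NAME", "gpt-4o-mini")

# agent = Agent(model_name=model_name, max_loops=1)
```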
## Using the LiteLLM Wrapper Directly

```python
from swarms.utils.litellm_wrapper import LiteLLM

llm = LiteLLM(
    model_name="gpt-4o",
    temperature=0.7,
    max_tokens=2000,
    verbose=True,
)

response = llm.run("What is machine learning?")
```
## Features

### 1. Vision (Image Input)

```python
from swarms import Agent

agent = Agent(model_name="gpt-4o", max_loops=1)

# Supports a file path, URL, or base64-encoded image
response = agent.run(
    "Describe this image",
    img="path/to/image.jpg",  # or a URL or base64 string
)
```
### 2. Tool/Function Calling

```python
from swarms.utils.litellm_wrapper import LiteLLM

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get weather for a location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string"}
                },
                "required": ["location"]
            }
        }
    }
]

llm = LiteLLM(
    model_name="gpt-4o",
    tools_list_dictionary=tools,
    tool_choice="auto",
)

response = llm.run("What's the weather in San Francisco?")
```
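When the model elects to call a tool, the response carries the tool name and JSON-encoded arguments, and your code must execute the matching function. The exact response shape depends on the provider and wrapper version, so the sketch below shows only the dispatch step, with a hand-written payload standing in for the model's output:

```python
import json

def get_weather(location: str) -> str:
    # Stand-in implementation for the tool declared above.
    return f"Sunny in {location}"

# Map tool names from the schema to local callables.
TOOL_REGISTRY = {"get_weather": get_weather}

def dispatch_tool_call(name: str, arguments: str) -> str:
    """Look up a tool by name and call it with its JSON-encoded arguments."""
    func = TOOL_REGISTRY[name]
    return func(**json.loads(arguments))

# A hand-written payload in the common OpenAI-style shape:
result = dispatch_tool_call("get_weather", '{"location": "San Francisco"}')
print(result)  # Sunny in San Francisco
```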
### 3. Reasoning Models

```python
from swarms.utils.litellm_wrapper import LiteLLM

llm = LiteLLM(
    model_name="openai/o1-preview",
    reasoning_enabled=True,
    max_tokens=4000,
)

response = llm.run("Solve this complex math problem...")
```
### 4. Streaming

```python
from swarms.utils.litellm_wrapper import LiteLLM

llm = LiteLLM(model_name="gpt-4o", stream=True)

for chunk in llm.run("Tell me a story"):
    print(chunk, end="", flush=True)
```
### 5. Audio Input

```python
from swarms.utils.litellm_wrapper import LiteLLM

llm = LiteLLM(
    model_name="gpt-4o",
    audio="path/to/audio.wav",
)

response = llm.run("Transcribe this audio")
```
### 6. Advanced Configuration

```python
from swarms.utils.litellm_wrapper import LiteLLM

llm = LiteLLM(
    model_name="gpt-4o",
    system_prompt="You are a helpful assistant.",
    temperature=0.7,
    max_tokens=4000,
    stream=False,
    verbose=True,
    retries=3,
    caching=False,
    top_p=1.0,
)

response = llm.run("Explain neural networks")
```
## Provider Setup

### Azure OpenAI

```python
import os

from swarms import Agent

os.environ["AZURE_API_KEY"] = "your-key"
os.environ["AZURE_API_BASE"] = "https://your-resource.openai.azure.com/"
os.environ["AZURE_API_VERSION"] = "2024-02-15-preview"

agent = Agent(model_name="azure/gpt-4", max_loops=1)
```
### Anthropic Claude

```python
import os

from swarms import Agent

os.environ["ANTHROPIC_API_KEY"] = "your-key"

agent = Agent(model_name="claude-3-5-sonnet-20241022", max_loops=1)
```
### Google Gemini

```python
import os

from swarms import Agent

os.environ["GEMINI_API_KEY"] = "your-key"

agent = Agent(model_name="gemini/gemini-pro", max_loops=1)
```
### Ollama (Local)

```python
from swarms import Agent

# No API key needed - ensure Ollama is running locally
agent = Agent(model_name="ollama/llama2", max_loops=1)
```
## Complete Examples

### Multi-Provider Comparison

```python
from swarms import Agent

models = ["gpt-4o-mini", "claude-3-5-sonnet-20241022", "gemini/gemini-pro"]
task = "Explain quantum computing in one paragraph."

for model_name in models:
    print(f"\n=== {model_name} ===")
    agent = Agent(model_name=model_name, max_loops=1)
    response = agent.run(task)
    print(response[:200])
```
### Vision Analysis

```python
from swarms import Agent

agent = Agent(model_name="gpt-4o", max_loops=1)

response = agent.run(
    "Analyze this image and describe what you see.",
    img="https://example.com/image.jpg",
)
print(response)
```
### Streaming Response

```python
from swarms.utils.litellm_wrapper import LiteLLM

llm = LiteLLM(model_name="gpt-4o", stream=True)

print("Response: ", end="")
for chunk in llm.run("Write a short poem about AI"):
    print(chunk, end="", flush=True)
```
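If the full text is needed after streaming finishes, collect the chunks as they arrive. The sketch below uses a stand-in generator in place of a live `llm.run(...)` call so the accumulation logic is runnable on its own:

```python
def fake_stream():
    # Stand-in for iterating llm.run(...) when stream=True.
    yield from ["Circuits hum, ", "ideas spark, ", "machines dream."]

chunks = []
for chunk in fake_stream():
    print(chunk, end="", flush=True)
    chunks.append(chunk)

full_response = "".join(chunks)
```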
## Resources

- **LiteLLM Docs**: https://docs.litellm.ai/
- **Providers**: https://docs.litellm.ai/docs/providers
- **Swarms Wrapper**: `swarms/utils/litellm_wrapper.py`
