4 changes: 4 additions & 0 deletions .env.example
@@ -2,6 +2,10 @@
MODEL_BASE_URL=
MODEL_KEY=

#MINIMAX (https://platform.minimax.io)
#Models: MiniMax-M2.7, MiniMax-M2.7-highspeed, MiniMax-M2.5, MiniMax-M2.5-highspeed (204K context)
MINIMAX_API_KEY=

#CUSTOM MODEL
QWEN_BASE_URL=
QWEN_KEY=
32 changes: 32 additions & 0 deletions README.md
@@ -34,6 +34,37 @@ MODEL_BASE_URL=<your model base url>
MODEL_KEY=<your model key>
```

## Supported LLM Providers

Aser works with any OpenAI-compatible API. Use the built-in `providers` module for quick setup:

| Provider | Models | Context |
|----------|--------|---------|
| OpenAI | gpt-4o, gpt-4.1-mini, … | up to 128K |
| [MiniMax](https://platform.minimax.io) | MiniMax-M2.7, MiniMax-M2.7-highspeed, MiniMax-M2.5, MiniMax-M2.5-highspeed | **204K** |
| Any OpenAI-compatible API | set `MODEL_BASE_URL` + `MODEL_KEY` | depends on model |
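
For any other OpenAI-compatible endpoint, the `custom_provider` helper in `aser.providers` builds the same two-key config; a minimal sketch (the endpoint, key, and model name below are placeholders, not real values):

```python
from aser.agent import Agent
from aser.providers import custom_provider

# Placeholder endpoint, key, and model name; substitute your provider's values.
agent = Agent(
    name="custom_agent",
    model="your-model-name",
    model_config=custom_provider(
        base_url="https://your-gateway.example.com/v1",
        api_key="your_api_key_here",
    ),
)
```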

### Using MiniMax

```bash
export MINIMAX_API_KEY=your_api_key_here
```

```python
from aser.agent import Agent
from aser.providers import MINIMAX, MINIMAX_MODELS

agent = Agent(
    name="minimax_agent",
    model=MINIMAX_MODELS["standard"],  # "MiniMax-M2.7"
    model_config=MINIMAX,
)
response = agent.chat("What is MiniMax AI?")
print(response)
```

See the full example at [examples/agent_minimax.py](./examples/agent_minimax.py).

## Usage

```python
@@ -66,6 +97,7 @@ If you clone the project source code, before running the examples, please run `p

- [Aser Agent](./examples/agent.py): Your First AI Agent
- [Model Config](./examples/agent_model.py): Customize the LLM configuration
- [MiniMax](./examples/agent_minimax.py): Use MiniMax M2.7 / M2.5 models (204K context)
- [Character](./examples/agent_character.py): Build an agent with character
- [Memory](./examples/agent_memory.py): Build an agent with memory storage
- [RAG](./examples/agent_knowledge.py): Build an agent with knowledge retrieval
32 changes: 32 additions & 0 deletions README_CN.md
@@ -34,6 +34,37 @@ MODEL_BASE_URL=<your model base URL>
MODEL_KEY=<your model key>
```

## Supported LLM Providers

Aser works with any OpenAI-compatible API. Use the built-in `providers` module for quick setup:

| Provider | Models | Context |
|----------|--------|---------|
| OpenAI | gpt-4o, gpt-4.1-mini, … | up to 128K |
| [MiniMax](https://platform.minimax.io) | MiniMax-M2.7, MiniMax-M2.7-highspeed, MiniMax-M2.5, MiniMax-M2.5-highspeed | **204K** |
| Any OpenAI-compatible API | set `MODEL_BASE_URL` + `MODEL_KEY` | depends on model |

### Using MiniMax

```bash
export MINIMAX_API_KEY=your_api_key_here
```

```python
from aser.agent import Agent
from aser.providers import MINIMAX, MINIMAX_MODELS

agent = Agent(
    name="minimax_agent",
    model=MINIMAX_MODELS["standard"],  # "MiniMax-M2.7"
    model_config=MINIMAX,
)
response = agent.chat("Hello, please introduce MiniMax.")
print(response)
```

See the full example at [examples/agent_minimax.py](./examples/agent_minimax.py).

## Usage

```python
@@ -66,6 +97,7 @@ aser = Agent(

- [Aser Agent](./examples/agent.py): Your First AI Agent
- [Model Config](./examples/agent_model.py): Customize the LLM configuration
- [MiniMax](./examples/agent_minimax.py): Use MiniMax M2.7 / M2.5 models (204K context)
- [Character](./examples/agent_character.py): Build an agent with character
- [Memory](./examples/agent_memory.py): Build an agent with memory storage
- [RAG](./examples/agent_knowledge.py): Build an agent with knowledge retrieval
1 change: 1 addition & 0 deletions aser/__init__.py
@@ -16,6 +16,7 @@
from .text2sql import Text2SQL
from .evolution import SelfCodingTool
from . import social, storage, utils
from . import providers



63 changes: 63 additions & 0 deletions aser/providers.py
@@ -0,0 +1,63 @@
"""
Provider presets for popular LLM APIs.

Usage:
    from aser.providers import MINIMAX, OPENAI
    from aser.agent import Agent

    agent = Agent(name="my_agent", model="MiniMax-M2.7", model_config=MINIMAX)
"""

import os


def _make_config(base_url: str, api_key_env: str, api_key: str | None = None) -> dict:
    """Build a model_config dict, preferring an explicit key over the env var."""
    return {
        "base_url": base_url,
        # Fall back to a visible sentinel so a missing key fails loudly at the
        # first API call rather than silently at import time.
        "api_key": api_key or os.getenv(api_key_env) or "MISSING_API_KEY",
    }


# ---------------------------------------------------------------------------
# MiniMax — OpenAI-compatible endpoint
# Models: MiniMax-M2.7, MiniMax-M2.7-highspeed, MiniMax-M2.5, MiniMax-M2.5-highspeed
# Context: 204K tokens for all models
# Docs: https://docs.minimax.io
# Note: temperature must be in (0.0, 1.0]; do NOT use temperature=0
# ---------------------------------------------------------------------------
MINIMAX = _make_config(
    base_url="https://api.minimax.io/v1",
    api_key_env="MINIMAX_API_KEY",
)

# Recommended models
MINIMAX_MODELS = {
    "standard": "MiniMax-M2.7",
    "fast": "MiniMax-M2.7-highspeed",
    "standard_v25": "MiniMax-M2.5",
    "fast_v25": "MiniMax-M2.5-highspeed",
}

# ---------------------------------------------------------------------------
# OpenAI
# ---------------------------------------------------------------------------
OPENAI = _make_config(
    base_url="https://api.openai.com/v1",
    api_key_env="OPENAI_API_KEY",
)

# ---------------------------------------------------------------------------
# Anthropic (via an OpenAI-compatible third-party gateway)
# Anthropic's native API is not OpenAI-compatible, so point
# ANTHROPIC_COMPAT_BASE_URL at your gateway's endpoint; the fallback URL below
# is only a placeholder and will not accept OpenAI-style requests.
# ---------------------------------------------------------------------------
ANTHROPIC_COMPAT = _make_config(
    base_url=os.getenv("ANTHROPIC_COMPAT_BASE_URL", "https://api.anthropic.com/v1"),
    api_key_env="ANTHROPIC_API_KEY",
)

# ---------------------------------------------------------------------------
# Generic helper: build a custom provider config on the fly
# ---------------------------------------------------------------------------
def custom_provider(base_url: str, api_key: str) -> dict:
    """Return a model_config dict for any OpenAI-compatible provider."""
    return {"base_url": base_url, "api_key": api_key}
72 changes: 72 additions & 0 deletions examples/agent_minimax.py
@@ -0,0 +1,72 @@
"""
Example: Using Aser with MiniMax models.

MiniMax offers OpenAI-compatible API access to its M2.7 and M2.5 model families,
all featuring 204K-token context windows.

Setup:
    1. Get your API key from https://platform.minimax.io
    2. Set the environment variable:
       export MINIMAX_API_KEY=your_api_key_here
    3. Run this example:
       python examples/agent_minimax.py
"""

import os
from dotenv import load_dotenv

load_dotenv()

from aser.agent import Agent
from aser.providers import MINIMAX, MINIMAX_MODELS

# ---------------------------------------------------------------------------
# Basic usage: use the default MODEL_BASE_URL / MODEL_KEY env vars
# ---------------------------------------------------------------------------
# You can also set MODEL_BASE_URL=https://api.minimax.io/v1 and
# MODEL_KEY=<your MiniMax API key> in your .env file, then construct the Agent
# without explicitly passing model_config:
#
# agent = Agent(name="aser agent", model="MiniMax-M2.7")

# ---------------------------------------------------------------------------
# Explicit usage: pass model_config=MINIMAX
# ---------------------------------------------------------------------------
print("=== MiniMax M2.7 (standard) ===")
agent = Agent(
    name="minimax_agent",
    description="You are a helpful AI assistant powered by MiniMax.",
    model=MINIMAX_MODELS["standard"],  # "MiniMax-M2.7"
    model_config=MINIMAX,  # base_url + MINIMAX_API_KEY
)
response = agent.chat("What is MiniMax AI? Introduce yourself briefly.")
print(response)

# ---------------------------------------------------------------------------
# High-speed variant — same capability, optimised for latency
# ---------------------------------------------------------------------------
print("\n=== MiniMax M2.7-highspeed (fast) ===")
fast_agent = Agent(
    name="minimax_fast_agent",
    description="You are a concise assistant. Always answer in one sentence.",
    model=MINIMAX_MODELS["fast"],  # "MiniMax-M2.7-highspeed"
    model_config=MINIMAX,
)
response = fast_agent.chat("What is the capital of France?")
print(response)

# ---------------------------------------------------------------------------
# Tip: you can also build the config inline without importing MINIMAX
# ---------------------------------------------------------------------------
print("\n=== Inline model_config ===")
inline_agent = Agent(
    name="inline_minimax_agent",
    description="Helpful assistant",
    model="MiniMax-M2.5",
    model_config={
        "base_url": "https://api.minimax.io/v1",
        "api_key": os.getenv("MINIMAX_API_KEY"),
    },
)
response = inline_agent.chat("Say hello in three different languages.")
print(response)