v0.4.3 - Universal System Instruction Abstraction
✨ Enhancement Release
This release provides universal system_instruction abstraction across all AI providers, building on the fix from v0.4.2.
What's New
- Universal `system_instruction`: the same parameter now works seamlessly across all providers
  - OpenAI/Ollama: automatically converted to a system message in the messages array
  - Gemini: passed as the native `system_instruction` parameter
  - Anthropic: converted to the `system` parameter
- Zero code changes needed: use the same `system_instruction` parameter everywhere
- Backward compatible: all existing code continues to work
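The per-provider behavior above can be sketched as a single normalization step. This is a minimal illustration, not the library's actual implementation: `apply_system_instruction` and its signature are hypothetical names chosen for this sketch.

```python
def apply_system_instruction(provider, messages, system_instruction, params):
    """Hypothetical sketch: route one system_instruction to each provider's shape."""
    if provider in ("openai", "ollama"):
        # OpenAI/Ollama: prepend a system message to the messages array
        messages = [{"role": "system", "content": system_instruction}] + messages
    elif provider == "gemini":
        # Gemini: pass through as the native system_instruction parameter
        params["system_instruction"] = system_instruction
    elif provider == "anthropic":
        # Anthropic: supply it as the top-level system parameter
        params["system"] = system_instruction
    return messages, params
```

Whatever the real internals look like, the point is that the caller never branches on provider; the dispatch happens once, inside the client.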
Example Usage
```python
from ai_proxy_core import CompletionClient

client = CompletionClient()

# Same system_instruction works for all providers!
response = await client.create_completion(
    messages=[{"role": "user", "content": "Hello"}],
    model="gpt-4",  # or "gemini-1.5-flash", "claude-3", "llama2"
    system_instruction="You are a helpful pirate. Speak like a pirate."
)
```
Installation
```shell
pip install --upgrade ai-proxy-core==0.4.3
```
Links
- 📦 PyPI Package
- 🐛 Related Issue #32
- 💻 Commit: 2ef184e
Full Changelog: v0.4.2...v0.4.3