
Add Tensorix as a new LLM provider#1022

Open
shanemort1982 wants to merge 1 commit into enricoros:main from shanemort1982:add-tensorix-provider

Conversation

@shanemort1982

Hey there 👋

This adds Tensorix as a new OpenAI-compatible vendor in big-AGI, letting users access models from DeepSeek, Meta Llama, Qwen, GLM, MiniMax and more through a single API key and endpoint.

Tensorix follows the same OpenAI-compatible pattern as DeepSeek/Together/etc., so the integration is straightforward: a new vendor folder plus wiring into the existing dialect system.

What's included

  • New vendor: src/modules/llms/vendors/tensorix/ — vendor definition + setup UI component
  • Dialect: tensorix added to openai.access.ts with default host https://api.tensorix.ai
  • Server env: TENSORIX_API_KEY for server-side deployment config
  • Backend capability: hasLlmTensorix for the green checkmark when pre-configured
  • Registry + UI wiring: vendor registered in vendors.registry.ts, LLMVendorSetup.tsx, and backend.router.ts
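The vendor definition registered in vendors.registry.ts follows the shape of big-AGI's other cloud vendors. The interface below is an illustrative sketch, not the repo's actual IModelVendor type; the field names are assumptions modeled on the PR description:

```typescript
// Sketch of the Tensorix vendor definition. The interface fields here
// (id, name, location, hasServerConfigKey) are assumptions modeled on
// big-AGI's other cloud vendors, not the exact IModelVendor contract.
interface VendorSketch {
  id: string;
  name: string;
  location: 'cloud' | 'local';
  hasServerConfigKey?: string; // backend capability flag for pre-configured deployments
}

const ModelVendorTensorix: VendorSketch = {
  id: 'tensorix',
  name: 'Tensorix',
  location: 'cloud',
  hasServerConfigKey: 'hasLlmTensorix', // drives the green checkmark when TENSORIX_API_KEY is set
};
```

The hasServerConfigKey linkage is what lets a server-side TENSORIX_API_KEY surface as "pre-configured" in the UI without exposing the key to the client.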

How it works

Users add their Tensorix API key under Models → Add → Tensorix, and models are fetched from the /v1/models endpoint. The setup UI includes an optional custom host field and a client-side fetch toggle, matching the pattern of other cloud vendors.
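The model-listing path can be sketched as below. This is a hedged approximation of the host-fixup pattern (default host, scheme insertion, trailing-slash stripping); the helper name is hypothetical and the real repo routes this through llmsFixupHost:

```typescript
// Hypothetical helper mirroring the host-fixup pattern: fall back to
// the default Tensorix API host, ensure an https:// scheme, strip
// trailing slashes, and target the OpenAI-compatible models endpoint.
const DEFAULT_TENSORIX_HOST = 'https://api.tensorix.ai';

function tensorixModelsUrl(customHost?: string): string {
  let host = (customHost || DEFAULT_TENSORIX_HOST).trim();
  if (!/^https?:\/\//.test(host))
    host = 'https://' + host; // assume TLS when the user omits a scheme
  host = host.replace(/\/+$/, ''); // drop trailing slashes before appending the path
  return host + '/v1/models';
}
```

For example, `tensorixModelsUrl()` yields `https://api.tensorix.ai/v1/models`, while a custom host like `my.proxy.local/` is normalized to `https://my.proxy.local/v1/models`.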

This also opens up cross-promotion opportunities — we'd be happy to feature big-AGI prominently in Tensorix's integration docs at docs.tensorix.ai.

Happy to adjust anything if needed!

Add Tensorix (https://tensorix.ai) as a new OpenAI-compatible vendor,
giving users access to models from DeepSeek, Meta Llama, Qwen, GLM,
MiniMax and others through a single API key.

Changes:
- New vendor: src/modules/llms/vendors/tensorix/ (vendor + setup UI)
- Register 'tensorix' dialect in openai.access.ts with default host
- Add TENSORIX_API_KEY server env variable
- Add hasLlmTensorix backend capability
- Wire up vendor in registry, LLMVendorSetup, and backend router

Co-authored-by: openhands <openhands@all-hands.dev>
@vercel

vercel Bot commented Mar 8, 2026

@shanemort1982 is attempting to deploy a commit to the Token Fabrics Pro Team on Vercel.

A member of the Team first needs to authorize it.

@DoS007

DoS007 commented Mar 8, 2026

The service seems similar to Avian (but maybe better)? #996

Could it be that people are trying to grab a slice of the OpenRouter use-case pie?

@shanemort1982
Author

shanemort1982 commented Mar 8, 2026

Hey — actually we're quite different from Avian/OpenRouter. Tensorix isn't an aggregator or proxy — we host all our models on our own hardware. Zero data retention, no third-party routing, your requests never leave our infrastructure.

So it's more like running your own inference stack but without the ops headache. We offer popular open-source models (DeepSeek, Llama, GLM, MiniMax, etc.) through a standard OpenAI-compatible API, pay-as-you-go. But the key difference is everything runs on infra we own and operate — nothing gets forwarded to another provider.

Happy to answer any other questions!

@shanemort1982 shanemort1982 marked this pull request as ready for review March 8, 2026 11:53
@DoS007

DoS007 commented Mar 9, 2026

@shanemort1982 Ah, OK. So more like DeepInfra and others.

@enricoros
Owner

Considering merging this one if we see interest or requests from our users.


@JiwaniZakir JiwaniZakir left a comment


In openai.access.ts, the guard condition if (!tensorixKey || !tensorixHost) will never trigger for the host portion, since tensorixHost is derived from llmsFixupHost(access.oaiHost || DEFAULT_TENSORIX_HOST, apiPath) — the DEFAULT_TENSORIX_HOST fallback ensures it's always non-empty. The error message "Missing Tensorix API Key or Host" is therefore misleading; the !tensorixHost branch is dead code and should be removed to keep the guard honest (just if (!tensorixKey)), consistent with how similar dialects like xai handle it.

Additionally, looking at other dialect cases in the same switch block, the tensorix case introduces a let declaration (let tensorixKey) followed by a const on the next line, then reassigns tensorixKey. This is consistent with other cases but placing the llmsRandomKeyFromMultiKey call directly in the initializer (let tensorixKey = llmsRandomKeyFromMultiKey(access.oaiKey || env.TENSORIX_API_KEY || '')) would make the mutation unnecessary and cleaner.
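The suggested cleanup could look like the following. Function and identifier names (llmsRandomKeyFromMultiKey, DEFAULT_TENSORIX_HOST) are taken from the comment above, but the surrounding signature and the multi-key implementation are a hedged sketch, not the file's actual code:

```typescript
// Sketch of the suggested tensorix case: put the key selection in the
// const initializer (no reassignment), and guard only on the key, since
// the host always falls back to the default and can never be empty.
// llmsRandomKeyFromMultiKey is a stand-in matching the name cited in
// the review, assumed to pick one key from a comma-separated list.
function llmsRandomKeyFromMultiKey(multiKey: string): string {
  const keys = multiKey.split(',').map(k => k.trim()).filter(Boolean);
  return keys.length ? keys[Math.floor(Math.random() * keys.length)] : '';
}

const DEFAULT_TENSORIX_HOST = 'https://api.tensorix.ai';

function tensorixAccess(oaiKey: string, envKey: string, oaiHost: string, apiPath: string) {
  // const initializer: no later mutation needed
  const tensorixKey = llmsRandomKeyFromMultiKey(oaiKey || envKey || '');
  const tensorixHost = (oaiHost || DEFAULT_TENSORIX_HOST).replace(/\/+$/, '');
  if (!tensorixKey) // host branch removed: tensorixHost is always non-empty
    throw new Error('Missing Tensorix API Key');
  return {
    headers: { Authorization: `Bearer ${tensorixKey}` },
    url: tensorixHost + apiPath,
  };
}
```

This keeps the guard honest (the only reachable failure is a missing key) and matches the single-guard style the review attributes to the xai dialect.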

It's also worth confirming whether https://api.tensorix.ai is publicly documented as stable — the comment references https://docs.tensorix.ai but the linked dashboard URL in TensorixServiceSetup.tsx (https://app.tensorix.ai/dashboard) suggests this is a relatively new provider; if the base API URL is subject to change during early access, a fallback note or env-var override documentation would be worth adding to .env.example.

4 participants