Replies: 2 comments 2 replies
-

Hi! Thanks for your question! We're planning to add Groq support this year. In the meantime, as far as I can see, Groq is OpenAI-compatible (https://console.groq.com/docs/openai), so please feel free to try implementing the `LLMProvider` yourself. Then, you'll only need to create your own instance of `OpenAILLMClient` pointed at Groq's endpoint.
1 reply
-
@snande Here is my code.

```kotlin
// Register Groq as a custom LLMProvider
val LLMProvider.Companion.Groq by lazy {
    object : LLMProvider("groq", "groq") {}
}

object GroqModels {
    val Kimi_K2_Instruct_0905 = LLModel(
        provider = LLMProvider.Groq,
        id = "moonshotai/kimi-k2-instruct-0905",
        contextLength = 262_144,
        maxOutputTokens = 16_384,
        capabilities = fullCapability // defined elsewhere in my code
    )
}
```

Update: then create the client against Groq's OpenAI-compatible endpoint:

```kotlin
val groqClient = OpenAILLMClient(
    apiKey = System.getenv("GROQ_API_KEY"),
    settings = OpenAIClientSettings(
        baseUrl = System.getenv("GROQ_OPENAI_URL"),
    )
)
```
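If you want to sanity-check the endpoint before wiring it into Koog, here is a minimal raw-HTTP sketch using only the JDK's built-in client. It assumes Groq's documented OpenAI-compatible base URL (`https://api.groq.com/openai/v1`, per the compatibility docs linked above) and a `GROQ_API_KEY` environment variable; nothing here is Koog API.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

// Build a minimal OpenAI-style chat completions request body.
// (Messages are not JSON-escaped here; this is only a sanity-check sketch.)
fun groqChatRequestBody(model: String, userMessage: String): String =
    """{"model":"$model","messages":[{"role":"user","content":"$userMessage"}]}"""

fun main() {
    val body = groqChatRequestBody("moonshotai/kimi-k2-instruct-0905", "Hello!")
    val request = HttpRequest.newBuilder()
        // Documented Groq OpenAI-compatible endpoint (assumption: unchanged).
        .uri(URI.create("https://api.groq.com/openai/v1/chat/completions"))
        .header("Authorization", "Bearer ${System.getenv("GROQ_API_KEY")}")
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()
    // Send with the JDK 11+ HttpClient and print the raw JSON completion.
    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())
    println(response.body())
}
```

If that returns a completion, the same base URL is what you would feed into `OpenAIClientSettings` as shown above.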
-
Is there any ETA on when Groq would be supported as a provider by Koog? Is there any way to use it right now? I'm really interested in using the openai/gpt-oss-120b model served by Groq in my Kotlin projects.
https://console.groq.com/docs/model/openai/gpt-oss-120b