GonkaGate/n8n-nodes-gonkagate

GonkaGate for n8n

Install the GonkaGate community nodes for self-hosted n8n.

Use this package name in n8n Community Nodes:

@gonkagate/n8n-nodes-gonkagate


Install

This package is for self-hosted n8n. If you already run self-hosted n8n, the preferred and fastest install path is the Community Nodes UI inside n8n.

Fastest Install

  1. Open your self-hosted n8n instance.
  2. Open Settings.
  3. Open Community Nodes.
  4. Click Install.
  5. Enter:
@gonkagate/n8n-nodes-gonkagate
  6. Confirm the community-node prompt if n8n shows it.
  7. Wait for installation to finish.
  8. Restart n8n if your deployment model requires it.

If you already know n8n, that is usually all you need. If you use queue mode, want Docker or Docker Compose, need a tarball, or want an unpublished build, use the Installation Guide.

See It In Action

From Community Nodes install to a working GonkaGate node in one short walkthrough:

Install GonkaGate in n8n

Overview

@gonkagate/n8n-nodes-gonkagate is the GonkaGate community node package for self-hosted n8n.

Use it if you want a GonkaGate-first integration instead of wiring stock OpenAI-compatible nodes by hand. The package gives you:

  • the root node GonkaGate
  • the additive AI model node GonkaGate Chat Model
  • the shared credential GonkaGate API

Today the package targets GonkaGate's OpenAI-compatible GET /v1/models and POST /v1/chat/completions surface, with the canonical base URL fixed to https://api.gonkagate.com/v1.
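
That surface can be sketched as a plain HTTP request. The Python sketch below only assembles the call; the base URL is the canonical one above, but the Bearer auth scheme, payload shape, and model ID follow the usual OpenAI-compatible convention and are assumptions, not values confirmed by GonkaGate.

```python
# Sketch of a non-streaming call against the surface described above.
# The Bearer header, payload shape, and model ID are assumptions based
# on common OpenAI-compatible convention, not guarantees.
BASE_URL = "https://api.gonkagate.com/v1"  # canonical, fixed by the package

def build_chat_request(api_key: str, model: str, prompt: str) -> dict:
    """Assemble the pieces of a POST /v1/chat/completions call."""
    return {
        "method": "POST",
        "url": f"{BASE_URL}/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",  # assumed auth scheme
            "Content-Type": "application/json",
        },
        "json": {
            "model": model,  # placeholder; discover real IDs via GET /v1/models
            "messages": [{"role": "user", "content": prompt}],
            "stream": False,  # the root node returns one final JSON response
        },
    }

req = build_chat_request("sk-example", "some-model-id", "Say hello.")
print(req["url"])  # https://api.gonkagate.com/v1/chat/completions
```

The same headers against GET https://api.gonkagate.com/v1/models would list available models, again assuming the standard OpenAI-compatible convention.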

First Check

After installation:

  1. Open your n8n UI.
  2. Click Start from scratch.
  3. Add Manual Trigger.
  4. Click + to add the next node.
  5. Search for gonka or GonkaGate.
  6. If n8n opens the AI Nodes picker first, check Results in other categories.
  7. Choose the plain GonkaGate node.
  8. Ignore GonkaGate Tool and GonkaGate Chat Model for the first check.
  9. Set Operation to List Models.
  10. Create GonkaGate API, paste your API key, and save.
  11. Click Execute step or Execute workflow.

If that works, change the same node to Chat Completion, choose a model, and run one short test message.
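
Because the root node returns one final JSON body rather than a stream, it helps to know where the reply text lives in that body. The sample below is an illustrative OpenAI-compatible response with placeholder values, not real GonkaGate output:

```python
# Illustrative non-streaming chat-completion response in the standard
# OpenAI-compatible shape; every field value here is a placeholder.
sample_response = {
    "id": "chatcmpl-example",
    "object": "chat.completion",
    "model": "some-model-id",
    "choices": [
        {
            "index": 0,
            "message": {"role": "assistant", "content": "Hello!"},
            "finish_reason": "stop",
        }
    ],
    "usage": {"prompt_tokens": 5, "completion_tokens": 2, "total_tokens": 7},
}

# The assistant text lives at choices[0].message.content.
reply = sample_response["choices"][0]["message"]["content"]
print(reply)  # Hello!
```

In an n8n expression you would reach the same field through the node's JSON output.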

For the full click-by-click flow, continue with Quickstart.

Other Install Paths

Use the Installation Guide if you need one of these:

  • local macOS smoke test with npm install -g n8n
  • manual npm install on a server
  • Docker or Docker Compose
  • advanced Docker install with your own custom image
  • queue mode or worker-based deployments
  • tarball install for staging or unpublished builds
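
For the Docker Compose path, a minimal self-hosted service can be sketched as below. The image name, port, volume, and environment variable follow common n8n defaults and are assumptions here, so treat the Installation Guide as authoritative:

```yaml
# Minimal sketch of a self-hosted n8n Compose service. Values follow
# common n8n defaults and are assumptions, not taken from this
# package's Installation Guide.
services:
  n8n:
    image: docker.n8n.io/n8nio/n8n
    ports:
      - "5678:5678"
    environment:
      # community-node installs via the UI require this to stay enabled
      - N8N_COMMUNITY_PACKAGES_ENABLED=true
    volumes:
      - n8n_data:/home/node/.n8n   # persists installed community nodes

volumes:
  n8n_data:
```

With the instance up, the Community Nodes UI install from the Fastest Install section works the same way.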

Which Node Should You Use?

  • GonkaGate: start here when you want the fastest first request, List Models, or easier setup.
  • GonkaGate Chat Model: start here when you are building AI Agent or other AiLanguageModel workflows.

Credentials

GonkaGate API is the shared credential for both node surfaces. Users normally only enter an API key. The canonical GonkaGate base URL stays hidden and defaults to https://api.gonkagate.com/v1.

If you still have credentials created before the canonical hidden default was added, recreate them before chasing other auth issues.

GonkaGate Chat Model

GonkaGate Chat Model is the additive AI-model surface inside the same package. It reuses GonkaGate API, the same live model discovery path, and the same manual Model ID fallback as the root node.

Use it when you are building AI Agent or other AiLanguageModel workflows. For first-time validation, start with the plain root node GonkaGate and move to the chat-model surface after auth and model selection are already proven.

What Works Today

  • GonkaGate with List Models
  • GonkaGate with non-streaming Chat Completion
  • GonkaGate Chat Model for n8n AI workflows
  • shared GonkaGate API credential across both node surfaces
  • live model discovery from GET /v1/models
  • manual Model ID fallback when the live list is empty or unavailable

Current Limits

  • no /v1/responses support
  • no blanket n8n version support claim
  • self-hosted first only, with no n8n Cloud promise
  • no verified-node approval yet; submission readiness does not equal Creator Portal approval
  • root-node Chat Completion returns one final JSON response instead of visible live streaming
  • GonkaGate Chat Model is the better fit for streaming-capable AI workflows, but not every workflow shape has been live-validated yet

For the exact support posture, see the Compatibility Matrix and Known Limitations.
