ShubhranshiPatel/QueryGiene

Natural Language SQL Generator

AI-powered business analytics app that converts plain English questions into safe SQL, executes them against connected data sources, and returns tables, charts, and insights.

Overview

This project is a full-stack analytics workspace built with:

  • FastAPI backend
  • Next.js + Tailwind CSS frontend
  • SQLite local analytics warehouse by default
  • SQLAlchemy, Pandas, and OpenPyXL for ingestion and querying
  • Optional LLM support with OpenAI-compatible APIs or Ollama

Users can:

  • Upload CSV and Excel files
  • Ingest JSON payloads from external APIs
  • Import tables from external SQL databases
  • Ask business questions in natural language
  • Review generated SQL with read-only safety guardrails
  • View results as tables and charts
  • Save query history and dashboards
  • Define reusable business rules like revenue and profit

Features

  • Multi-source ingestion
  • Schema profiling with column aliases and sample rows
  • Natural-language-to-SQL generation
  • SQL validation and read-only safety checks
  • Query history and saved dashboards
  • Role-aware business rules
  • Optional local LLM mode with Ollama
  • Automated backend test coverage

Project Structure

.
|-- backend/
|   |-- core/
|   |-- db/
|   |-- routes/
|   |-- schemas/
|   `-- services/
|-- database/
|-- frontend/
|   `-- src/
|-- main.py
|-- req.txt
`-- test_cases.py

Backend Architecture

The backend is organized into modular layers:

  • backend/main.py: FastAPI app creation and router registration
  • backend/routes/: API endpoints
  • backend/services/: business logic for ingestion, schema retrieval, SQL generation, safety, and execution
  • backend/db/: SQLAlchemy models, session setup, and bootstrap logic
  • backend/schemas/: request and response models

Core pipeline:

  1. User uploads or connects a data source
  2. Data is normalized into local SQL tables
  3. Schema metadata is profiled and stored
  4. Relevant tables are retrieved for a question
  5. SQL is generated with heuristic logic or configured LLM provider
  6. SQL is validated as read-only
  7. Query executes and returns rows, chart suggestions, and insights
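Step 6 of the pipeline, the read-only safety check, can be sketched in Python. This is a hypothetical `is_read_only` helper, not the project's actual implementation; the real service may use a SQL parser and check a broader keyword list:

```python
import re

# Write-oriented keywords that should never appear in a generated query.
# (Illustrative list; the actual guardrail may cover more.)
FORBIDDEN = re.compile(
    r"\b(insert|update|delete|drop|alter|create|truncate|attach|pragma)\b",
    re.IGNORECASE,
)

def is_read_only(sql: str) -> bool:
    """Accept only a single SELECT statement with no write keywords."""
    statements = [s for s in sql.strip().split(";") if s.strip()]
    if len(statements) != 1:
        return False  # reject multi-statement payloads outright
    stmt = statements[0].strip()
    return stmt.lower().startswith("select") and not FORBIDDEN.search(stmt)
```

A guard like this rejects both obvious writes (`DROP TABLE orders`) and stacked statements (`SELECT 1; DELETE FROM orders`) before anything reaches the database.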

Frontend Pages

  • /login
  • /dashboard
  • /upload
  • /workspace
  • /history
  • /admin-rules

Requirements

Backend

  • Python 3.11+ recommended
  • Virtual environment

Frontend

  • Node.js 18+
  • npm

Backend Setup

Create and activate a virtual environment, then install dependencies (commands below use Windows PowerShell; on macOS/Linux, activate with source venv/bin/activate instead):

python -m venv venv
.\venv\Scripts\activate
pip install -r req.txt

Start the FastAPI backend:

.\venv\Scripts\python.exe main.py

The backend runs on:

  • http://localhost:8000
  • health check: http://localhost:8000/health
  • API prefix: http://localhost:8000/api

Frontend Setup

Install frontend dependencies:

cd frontend
npm install

Run the Next.js app:

npm run dev

Frontend default URL:

  • http://localhost:3000

The frontend expects the backend at:

  • NEXT_PUBLIC_API_BASE=http://localhost:8000/api

Environment Variables

The backend supports these environment variables:

Variable                    | Default                        | Description
----------------------------|--------------------------------|----------------------------------
APP_NAME                    | Natural Language SQL Generator | FastAPI app name
API_PREFIX                  | /api                           | API route prefix
DATABASE_URL                | local SQLite DB                | SQLAlchemy connection string
JWT_SECRET                  | change-me-in-production        | Token signing secret
ACCESS_TOKEN_EXPIRY_MINUTES | 720                            | Auth token lifetime
DEFAULT_ROW_LIMIT           | 100                            | Default row limit
MAX_ROW_LIMIT               | 500                            | Maximum row limit
QUERY_TIMEOUT_SECONDS       | 10                             | Query timeout
LLM_PROVIDER                | heuristic                      | heuristic, openai, or ollama
OPENAI_BASE_URL             | OpenAI API URL                 | Base URL for OpenAI-compatible API
OPENAI_API_KEY              | (empty)                        | API key for remote LLM
OPENAI_MODEL                | gpt-4o-mini                    | Model name
OLLAMA_BASE_URL             | http://localhost:11434         | Ollama server URL
OLLAMA_MODEL                | llama3.1                       | Ollama model name
CORS_ORIGINS                | localhost URLs                 | Allowed frontend origins
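A minimal sketch of how the backend might read these variables with their documented defaults (a hypothetical load_settings helper; the real config module may be structured differently):

```python
import os

def load_settings() -> dict:
    """Read backend settings from the environment, falling back to the
    documented defaults. Keys mirror the variable table above."""
    return {
        "app_name": os.getenv("APP_NAME", "Natural Language SQL Generator"),
        "api_prefix": os.getenv("API_PREFIX", "/api"),
        "jwt_secret": os.getenv("JWT_SECRET", "change-me-in-production"),
        "default_row_limit": int(os.getenv("DEFAULT_ROW_LIMIT", "100")),
        "max_row_limit": int(os.getenv("MAX_ROW_LIMIT", "500")),
        "query_timeout_seconds": int(os.getenv("QUERY_TIMEOUT_SECONDS", "10")),
        "llm_provider": os.getenv("LLM_PROVIDER", "heuristic"),
    }
```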

Heuristic Mode

Default mode. No external model required.

$env:LLM_PROVIDER="heuristic"

OpenAI-Compatible Mode

$env:LLM_PROVIDER="openai"
$env:OPENAI_API_KEY="your_api_key"
$env:OPENAI_MODEL="gpt-4o-mini"

Ollama Mode

$env:LLM_PROVIDER="ollama"
$env:OLLAMA_MODEL="llama3.1"

Default Authentication

The first registered user becomes the admin user automatically.

Admin users can:

  • create business rules
  • manage admin-only features

Main API Endpoints

Auth

  • POST /api/auth/register
  • POST /api/auth/login
  • GET /api/auth/me

Data Sources

  • POST /api/upload
  • POST /api/connect-api
  • POST /api/connect-database
  • GET /api/datasources

Schema and Querying

  • GET /api/schema
  • GET /api/rules
  • POST /api/rules
  • POST /api/query

History and Dashboards

  • GET /api/history
  • GET /api/dashboards
  • POST /api/dashboards

Testing

Run the backend test suite:

.\venv\Scripts\python.exe -m pytest test_cases.py

Current automated tests cover:

  • registration and login
  • file upload and schema discovery
  • natural language query generation
  • repeat customer analysis
  • declining revenue analysis
  • business rule creation
  • API ingestion
  • external database ingestion
  • dashboard creation
  • You can also test manually using the test case files in the root folder

GitHub Upload Notes

Before pushing:

  1. Make sure secrets are stored in environment variables, not code.
  2. Do not commit local SQLite databases, caches, or virtual environments.
  3. Install frontend dependencies locally so package-lock.json can be committed if you want reproducible installs.

Production Notes

  • Replace the default JWT_SECRET
  • Use PostgreSQL or another managed database instead of local SQLite
  • Put FastAPI behind a production ASGI server and reverse proxy
  • Configure real authentication, HTTPS, and rate limiting
  • Add background jobs for scheduled reports and refresh workflows
