Real-time EEG streaming platform for PiEEG (8/16 ch) and IronBCI (8 ch BLE)
Reads at 250 Hz · streams over WebSocket · live dashboard with spectral analysis, topographic maps, experiences gallery, VRChat OSC, Lab Streaming Layer, webhook automation — Raspberry Pi (SPI) or IronBCI (Bluetooth LE).
Try it now — no hardware needed: click ▶ Use Demo Server, then Connect. Documentation — full guides, API reference, and integration docs. Chat with us — join the PiEEG Discord community.
```
curl -sSL https://raw.githubusercontent.com/pieeg-club/PiEEG-server/main/install.sh | bash
```

Getting Started | Features | Integrations | Reference
Pick one method — they all get `pieeg-server` ready to run.

| Method | Command | Notes |
|---|---|---|
| A) One-line (recommended) | `curl -sSL https://raw.githubusercontent.com/pieeg-club/PiEEG-server/main/install.sh \| bash` | Run `sudo reboot` the first time to enable SPI |
| B) Clone & setup | `git clone https://github.com/pieeg-club/PiEEG-server.git && cd PiEEG-server && ./setup.sh` | Same reboot note |
| C) pip | `pip install pieeg-server` | Requires Python 3.10+ |
| D) pip + IronBCI | `pip install pieeg-server[ironbci]` | Adds `bleak` for Bluetooth LE |
IronBCI / EAREEG users: install the BLE extra for Bluetooth Low Energy support:

```
pip install pieeg-server[ironbci]
```

Then run with `pieeg-server --device ironbci8`. See CLI Reference for BLE options.
pieeg-server ships with a pure-Python reference implementation for every
DSP hot path (24-bit ADC decode, Butterworth bandpass, Hampel spike
rejection). Installing the optional pieeg-core
Rust accelerator transparently swaps those hot paths for compiled
equivalents (~15–30× faster) — no config, no code changes:
```
pip install 'pieeg-server[fast]'
```

The active engine is announced on startup and reported in the WebSocket `connected` message as `"engine": {"native": true, "engine": "pieeg-core", "version": "..."}`,
and pieeg-server doctor prints a DSP Engine section. If the wheel
is missing or fails to import, pieeg-server falls back to the Python
implementation automatically — nothing breaks.
pieeg-core is licensed AGPL-3.0-or-later; installing it is strictly
opt-in so the default MIT-licensed pieeg-server install stays
license-clean.
Benchmark the two engines side-by-side with the bundled script:

```
python -m scripts.bench_native --seconds 5
```

Sample results (Windows, 5 s @ 250 Hz × 16 ch):

| Hot path | Python | pieeg-core | Speedup |
|---|---|---|---|
| MultichannelFilter (bandpass 1–40 Hz) | 992 samples/s | 1,048,658 samples/s | ~1057× |
| HampelFilter (spike removal) | 24,619 samples/s | 358,361 samples/s | ~15× |
| decode_channels (24-bit SPI → µV) | 121,352 samples/s | 1,134,301 samples/s | ~9× |
Real-time streaming at 250 Hz × 16 ch requires 4,000 samples/s — the
pure-Python MultichannelFilter is below that threshold, so installing
pieeg-core is effectively required when server-side filtering is
enabled on 16-channel rigs.
```
pieeg-server                                       # stream 16 ch (PiEEG)
pieeg-server --device pieeg8                       # 8-channel PiEEG shield
pieeg-server --device ironbci8                     # IronBCI via Bluetooth LE
pieeg-server --device ironbci8 --ble-name MyBoard  # custom BLE name
pieeg-server --filter                              # 1–40 Hz bandpass
pieeg-server --monitor                             # terminal sparklines
pieeg-server --mock                                # synthetic data, no hardware
pieeg-server --auth                                # require 6-digit access code
pieeg-server --osc                                 # VRChat OSC bridge
pieeg-server --lsl                                 # Lab Streaming Layer outlet
pieeg-server doctor                                # diagnose everything
```

| Port | Purpose |
|---|---|
| `:1616` | WebSocket data stream |
| `:1617` | Web dashboard |
Open http://raspberrypi.local:1617 in any browser on your network.
```
curl -sSL https://raw.githubusercontent.com/pieeg-club/PiEEG-server/main/install.sh | bash
sudo reboot    # first time only, enables SPI
pieeg-server   # or: pieeg-server --mock (no hardware)
```

Browse to http://raspberrypi.local:1617 — you'll land on the Session Lobby.
The server URL is pre-filled with ws://raspberrypi.local:1616 — click Connect to start streaming.
No hardware? Click ▶ Use Demo Server to connect to a public mock instance with synthetic EEG data.
```python
import asyncio, json, websockets

async def main():
    async with websockets.connect("ws://raspberrypi.local:1616") as ws:
        async for msg in ws:
            frame = json.loads(msg)
            print(f"#{frame['n']}: {frame['channels']}")

asyncio.run(main())
```

```js
const ws = new WebSocket("ws://raspberrypi.local:1616");
ws.onmessage = (e) => {
  const frame = JSON.parse(e.data);
  console.log(`#${frame.n}:`, frame.channels);
};
```

```
websocat ws://raspberrypi.local:1616   # CLI one-liner
```

That's it. Every frame is plain JSON — no SDK, no binary protocol, works in any language that has WebSocket support.
| Feature | Description |
|---|---|
| 250 Hz streaming | WebSocket (ws://<host>:1616), plain JSON, language-agnostic |
| 8 & 16 channel | PiEEG-8, PiEEG-16 (SPI) and IronBCI-8 (BLE) |
| IronBCI BLE | Bluetooth Low Energy support for IronBCI / EAREEG boards; auto-scan or direct MAC address |
| Bandpass filter | Butterworth IIR (SOS), per-channel state, adjustable live via WebSocket |
| CSV recording | Start/stop from dashboard or CLI; auto-timestamped; optional duration limit |
| Session annotations | Text notes on any frame; sidecar .annotations.json |
| Terminal monitor | Rich TUI with per-channel sparklines and µV readout; works over SSH |
| Mock mode | Realistic synthetic EEG (alpha rhythm, drift, noise, blink artifacts) |
| Authentication | Optional 6-digit code, rate limiting, HMAC timing-safe verify, HttpOnly cookies |
| VRChat OSC | Band powers via UDP OSC; chatbox + avatar parameters; rolling normalization |
| LSL | Push raw samples to LSL network; discoverable by OpenViBE, MNE, LabRecorder |
| Webhooks | HTTP callbacks on EEG events; IFTTT & Zapier presets; per-rule cooldown |
| ADS1299 registers | Live ADS1299 register config via WebSocket & dashboard; presets, noise test |
| Self-diagnostics | pieeg-server doctor checks Pi model, SPI/GPIO, ports, deps, systemd |
| Self-update | Detects pip/git install; checks PyPI or remote; one-click upgrade from dashboard |
| Systemd service | Auto-starts on boot; standard systemctl management |
| Zero-dep GPIO | Direct Linux chardev v1 ioctl; stable ABI since Linux 4.8 |
| Spike rejection | Two-layer: hardware delta-threshold + device-agnostic Hampel filter (per-channel median-based replacement) |
| Cloud-ready | Dockerfile + Fly.io config for mock-mode demo hosting |
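The spike-rejection row above ends in a Hampel filter, i.e. per-channel median-based replacement. A minimal Python sketch of that idea (window size and threshold here are illustrative, not the server's actual defaults):

```python
from statistics import median

def hampel(samples, window=5, n_sigmas=3.0):
    """Replace outliers with the local median (Hampel identifier).

    A sample is an outlier when it deviates from the median of its
    surrounding window by more than n_sigmas * 1.4826 * MAD.
    """
    out = list(samples)
    k = window // 2
    for i in range(k, len(samples) - k):
        win = samples[i - k:i + k + 1]
        med = median(win)
        mad = median(abs(x - med) for x in win)
        if abs(samples[i] - med) > n_sigmas * 1.4826 * mad:
            out[i] = med  # replace the spike with the local median
    return out

clean = hampel([0.0, 0.1, 0.0, 50.0, 0.1, 0.0, 0.1])
# the 50.0 spike is replaced by its window median (0.1)
```

The 1.4826 factor scales the median absolute deviation to an unbiased estimate of the standard deviation for Gaussian noise.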
| Feature | Description |
|---|---|
| Real-time waveforms | Canvas 2D, adaptive quality; time window 2–16 s, Y-scale ±50–500 µV |
| Session lobby | Enter server URL and click Connect; ▶ Use Demo Server prefills the public mock endpoint; Disconnect button returns to lobby |
| Signal quality badges | Live per-channel RMS with color feedback (green / yellow / red / gray) |
| Channel detail panel | Click to expand: zoomed trace, FFT, band power bars, histogram, statistics |
| Spectral analysis | 256-point FFT in Web Worker; PSD (log dB / linear); band power bars δ θ α β γ |
| Spectrogram | Scrolling time-frequency heatmap (Turbo colormap, –60 → 0 dB) |
| Topographic map | IDW-interpolated scalp heatmap over 10-20 montage; selectable band metric |
| Statistics panel | 10 metrics per channel; sortable columns; CSV export |
| Filter preview | Live Butterworth magnitude response with –3 dB reference |
| Session library | Browse, replay recordings; play/pause, seek, speed control (0.5×–2×); annotations |
| AI chat assistant | BYO provider (OpenAI, Anthropic, Ollama, Groq, LM Studio); SSE streaming |
| Webhook panel | Visual rule builder; POST/PUT/PATCH/GET; Authorization headers; IFTTT & Zapier |
| Webcam feed | Optional video overlay |
| Performance monitor | FPS, frame time, JS heap overlay |
| Update banner | Version upgrade notification |
| Responsive | Desktop: all 16 channels. Mobile: 4-channel subset |
| Key | Action | Key | Action |
|---|---|---|---|
| Space | Toggle pause | V | Experiences gallery |
| R | Toggle record | C | Toggle chat |
| F | Toggle FFT | W | Toggle webhooks |
| G | Toggle spectrogram | P | Toggle perf monitor |
| S | Toggle stats | Esc | Close overlays |
| ? | Show shortcut help | | |
Community-driven immersive EEG visualizations — lazy-loaded, card-based launcher, simple registry for contributors.
| Experience | Description |
|---|---|
| Neural Wave Space | Three.js 3D arc of 16 wave strips with amplitude-responsive color, starfield, WebXR + hand tracking |
| Blink Browser | Scroll articles via eye blinks; per-user calibration; frontal electrode monitoring |
| Neural Sonification | Brainwaves → live music; bands mapped to drone, FM pad, lead, harmonics, shimmer; DJ controls |
| VRChat OSC | Stream band powers into VRChat; chatbox + avatar parameter output; live config UI |
| Spoon Bend | Matrix-style telekinesis controlled by focus/beta/gamma; 3D spoon + digital rain |
| Webhook Wizard | Guided 60-second first-webhook setup; live EEG feedback; IFTTT/Zapier templates |
Time to first playable: ~15 minutes. One `.tsx` file + one line in the registry.
1. Create your component in `dashboard/src/experiences/my-game/MyGame.tsx`:

```tsx
import { useRef, useEffect } from "react";
import type { ExperienceProps } from "../registry";
import { useFocus, useRelax, useBlink } from "../../hooks/detectors";

export default function MyGame({ eegData, onExit }: ExperienceProps) {
  const { state: focus } = useFocus(eegData);
  const { state: relax } = useRelax(eegData);
  const { state: blink } = useBlink(eegData);
  const canvasRef = useRef<HTMLCanvasElement>(null);

  useEffect(() => {
    let raf: number;
    function loop() {
      const f = focus.current.focus;       // 0–1
      const r = relax.current.relaxation;  // 0–1
      const b = blink.current.blinked;     // true for one cycle per blink
      // --- your game logic here ---
      raf = requestAnimationFrame(loop);
    }
    raf = requestAnimationFrame(loop);
    return () => cancelAnimationFrame(raf);
  }, []);

  return (
    <div style={{ position: "fixed", inset: 0, background: "#000" }}>
      <canvas ref={canvasRef} />
      <button onClick={onExit} style={{ position: "absolute", top: 12, left: 12 }}>
        ← Exit
      </button>
    </div>
  );
}
```

2. Register in `dashboard/src/experiences/registry.ts`:

```ts
const MyGameExperience = lazy(() => import("./my-game/MyGame"));

// Add to EXPERIENCES array:
{
  id: "my-game",
  name: "My Game",
  description: "One-sentence summary.",
  tag: "Focus",
  gradient: ["#ec4899", "#8b5cf6"],
  component: MyGameExperience,
  author: "Your Name",
}
```

The gallery picks it up automatically. Each experience is code-split — no impact on initial load.
Browser-side EEG processing hooks — React refs, zero re-renders. Read `.current` in your `requestAnimationFrame` loop.

```ts
import { useBandPowers, useBlink, useFocus, useRelax } from "../../hooks/detectors";
```

`useBandPowers` — foundation layer. Single FFT instance, averaged spectral band powers.

| Field | Type | Description |
|---|---|---|
| `absolute` | `BandPowers` | Absolute power per band (µV²/Hz) — Delta, Theta, Alpha, Beta, Gamma |
| `relative` | `BandPowers` | Normalized to sum = 1 |
| `totalPower` | `number` | Sum across all bands |
| `dominantFrequency` | `number` | Peak PSD bin (Hz) |

Config: `{ updateHz?, channels?, smoothing? }`
`useBlink` — ocular artifact detector: amplitude-threshold state machine on the frontal channels (Fp1/Fp2).

| Field | Type | Description |
|---|---|---|
| `blinked` | `boolean` | `true` for exactly one poll cycle per blink |
| `count` | `number` | Cumulative blink count |
| `amplitude` | `number` | Current peak-to-peak µV |
| `lastBlinkTime` | `number` | Epoch ms of last confirmed blink |

Config: `{ channels?, threshold?, windowMs?, minDurationMs?, maxDurationMs?, refractoryMs?, pollHz? }`
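The amplitude-threshold state machine behind the blink detector can be sketched in Python (the actual hook lives in the dashboard's TypeScript; thresholds, durations, and the refractory length below are illustrative, not the hook's defaults):

```python
class BlinkDetector:
    """Toy blink state machine: a blink is a supra-threshold excursion
    lasting between min_samples and max_samples, followed by a
    refractory period. All numbers here are illustrative."""

    def __init__(self, threshold_uv=80.0, min_samples=10,
                 max_samples=100, refractory=50):
        self.threshold = threshold_uv
        self.min_s, self.max_s = min_samples, max_samples
        self.refractory = refractory
        self.state = "idle"   # idle -> above -> refractory -> idle
        self.run = 0          # consecutive samples above threshold
        self.cooldown = 0
        self.count = 0

    def push(self, sample_uv):
        """Feed one frontal-channel sample; True exactly once per blink."""
        if self.state == "refractory":
            self.cooldown -= 1
            if self.cooldown <= 0:
                self.state = "idle"
            return False
        above = abs(sample_uv) >= self.threshold
        if self.state == "idle" and above:
            self.state, self.run = "above", 1
        elif self.state == "above":
            if above:
                self.run += 1
                if self.run > self.max_s:   # too long: slow artifact, not a blink
                    self.state = "idle"
            else:                           # excursion ended; long enough?
                ok = self.run >= self.min_s
                self.state = "refractory" if ok else "idle"
                self.cooldown = self.refractory
                if ok:
                    self.count += 1
                return ok
        return False
```

Feeding it a 20-sample, 120 µV excursion fires exactly one event, on the first sample back below threshold.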
`useFocus` — cortical engagement index: (Beta + Gamma) / (Alpha + Theta + Delta).

| Field | Type | Description |
|---|---|---|
| `focus` | `number` | 0 (relaxed) – 1 (highly focused), smoothed |
| `raw` | `number` | Unsmoothed, uncalibrated ratio |
| `calibrated` | `boolean` | Whether baseline has been captured |

Config: `{ channels?, updateHz?, smoothing?, scaleDivisor? }` · Returns: `{ state, calibrate(), resetCalibration(), calibrating }`
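The engagement ratio is simple to reproduce from any band-power source. A sketch in Python (the divisor that squashes the raw ratio into 0–1 is an assumption, mirroring the hook's `scaleDivisor` config):

```python
def focus_index(powers, scale_divisor=2.0):
    """Cortical engagement: (beta + gamma) / (alpha + theta + delta),
    squashed into 0-1 by an assumed scale divisor (cf. scaleDivisor)."""
    fast = powers["beta"] + powers["gamma"]
    slow = powers["alpha"] + powers["theta"] + powers["delta"]
    raw = fast / slow if slow > 0 else 0.0
    return min(raw / scale_divisor, 1.0), raw

score, raw = focus_index(
    {"delta": 4.0, "theta": 3.0, "alpha": 3.0, "beta": 4.0, "gamma": 1.0})
# raw = (4 + 1) / (4 + 3 + 3) = 0.5; score = 0.25
```

Higher beta/gamma relative to the slow bands drives the score up; the `min(…, 1.0)` clamp keeps it in range.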
`useRelax` — alpha-dominance + theta-beta-ratio composite relaxation index.

| Field | Type | Description |
|---|---|---|
| `relaxation` | `number` | 0 (alert) – 1 (deeply relaxed), smoothed |
| `alphaRelative` | `number` | Alpha / total power (0–1) |
| `thetaBetaRatio` | `number` | θ / β raw ratio |
| `calibrated` | `boolean` | Whether baseline has been captured |

Config: `{ channels?, updateHz?, smoothing?, alphaWeight?, tbrCeiling? }` · Returns: `{ state, calibrate(), resetCalibration(), calibrating }`
PiEEG-server exposes real-time ADS1299 register configuration over WebSocket. Change channel input modes, run noise diagnostics, and apply presets — all live, without restarting the stream.
Channel registers CH1SET–CH8SET (addresses 0x05–0x0C) can be written at runtime. Configuration registers (CONFIG1–CONFIG3) are protected.
| Value | Input mode | Use case |
|---|---|---|
| `0x00` | Normal electrode | Standard EEG measurement |
| `0x01` | Input shorted | Baseline noise testing |
| `0x02` | Bias measurement | Bias current check |
| `0x04` | Temperature sensor | On-chip temp readout |
| `0x05` | Test signal | Internal 1× square wave |
```
{"cmd": "reg_read"}
{"cmd": "reg_write", "regs": {"0x05": "0x00", "0x06": "0x01"}}
{"cmd": "reg_preset", "preset": "internal_short"}
{"cmd": "noise_test", "duration": 3}
```

Presets: `normal`, `internal_short`, `test_signal`, `temp_sensor`.
The dashboard Register panel provides a visual interface — click to switch modes per channel, run noise tests with automated pass/fail verdicts, no code required.
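Because register commands are plain JSON on the same data socket, a short `websockets` script can run a noise test programmatically. A sketch, assuming the default hostname; the shape of the server's replies is an assumption, so it simply skips data frames (which carry a `"channels"` key) and prints everything else:

```python
import asyncio, json

def cmd(name, **kwargs):
    """Build one of the documented register commands as a JSON string."""
    return json.dumps({"cmd": name, **kwargs})

async def noise_test(url="ws://raspberrypi.local:1616"):
    """Short all inputs, run the documented 3 s noise test, print replies."""
    import websockets  # pip install websockets
    async with websockets.connect(url) as ws:
        await ws.send(cmd("reg_preset", preset="internal_short"))
        await ws.send(cmd("noise_test", duration=3))
        async for msg in ws:
            reply = json.loads(msg)
            if "channels" not in reply:  # skip ordinary data frames
                print(reply)

# asyncio.run(noise_test())  # needs a running pieeg-server
```

Run it against a `--mock` instance first to see what the replies look like before touching real hardware.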
Plain WebSocket, UTF-8 JSON frames on ws://<host>:1616. No SDK needed.
| Scenario | URL |
|---|---|
| No auth | ws://raspberrypi.local:1616 |
| With auth | ws://raspberrypi.local:1616?token=<ws-token> |
Authenticated connection flow (TypeScript)
With --auth enabled: POST /auth with the 6-digit code → GET /auth/ws-token → connect with ?token=....
```ts
const dashboardBase = "http://raspberrypi.local:1617";
const wsBase = "ws://raspberrypi.local:1616";

async function connectAuthenticated(code: string): Promise<WebSocket> {
  await fetch(`${dashboardBase}/auth`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    credentials: "include",
    body: JSON.stringify({ code }),
  });
  const { token } = await fetch(`${dashboardBase}/auth/ws-token`, {
    credentials: "include",
  }).then((r) => r.json());
  const ws = new WebSocket(`${wsBase}?token=${encodeURIComponent(token)}`);
  ws.addEventListener("message", (e) => {
    const frame = JSON.parse(e.data as string);
    console.log(`#${frame.n}:`, frame.channels);
  });
  return ws;
}
```

**Filter**

```
{"cmd": "set_filter", "enabled": true, "lowcut": 1.0, "highcut": 40.0}
{"cmd": "set_filter", "enabled": false}
```

**Recording**

```
{"cmd": "start_record"}
{"cmd": "stop_record"}
```

**Webhooks**

```
{"cmd": "webhook_list"}
{"cmd": "webhook_create", "rule": {"name": "Alpha alert", "trigger_type": "band_power_above", "params": {"band": "alpha", "threshold": 20}, "url": "https://example.com/hook", "method": "POST", "headers": {"Authorization": "Bearer ..."}, "cooldown": 30}}
{"cmd": "webhook_update", "rule": {"id": "abc123", "enabled": false}}
{"cmd": "webhook_delete", "id": "abc123"}
{"cmd": "webhook_test", "id": "abc123"}
```

**VRChat OSC**

```
{"cmd": "osc_start"}
{"cmd": "osc_stop"}
{"cmd": "osc_configure", "host": "127.0.0.1", "port": 9000, "mode": "both", "interval": 0.25}
{"cmd": "osc_status"}
```

**Lab Streaming Layer**

```
{"cmd": "lsl_start"}
{"cmd": "lsl_stop"}
{"cmd": "lsl_status"}
```

**ADS1299 Registers** (see full details)

```
{"cmd": "reg_read"}
{"cmd": "reg_write", "regs": {"0x05": "0x00", "0x06": "0x01"}}
{"cmd": "reg_preset", "preset": "internal_short"}
{"cmd": "noise_test", "duration": 3}
```

Every frame from the WebSocket is a JSON object:
```
{ "t": 1711234567.123, "n": 42, "channels": [12.34, -5.67, 8.90, ...] }
```

| Field | Type | Description |
|---|---|---|
| `t` | float | Unix timestamp (seconds, 6-decimal precision) |
| `n` | int | Sample number (monotonic, incremental) |
| `channels` | float[] | µV per channel (8 or 16 elements) |
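Given this frame shape, the per-channel RMS behind the dashboard's signal-quality badges is easy to reproduce client-side. A sketch; the badge thresholds below are illustrative, not the dashboard's actual cutoffs:

```python
import math

def channel_rms(frames):
    """Windowed RMS per channel over a list of decoded frames
    (each frame: {"t": ..., "n": ..., "channels": [uV, ...]})."""
    n_ch = len(frames[0]["channels"])
    sq = [0.0] * n_ch
    for f in frames:
        for i, v in enumerate(f["channels"]):
            sq[i] += v * v
    return [math.sqrt(s / len(frames)) for s in sq]

def quality(rms_uv):
    """Map RMS to a badge color (thresholds illustrative)."""
    if rms_uv < 1:
        return "gray"    # flat — likely disconnected
    if rms_uv < 30:
        return "green"   # plausible clean-EEG range
    if rms_uv < 80:
        return "yellow"  # noisy
    return "red"         # saturated / artifact

frames = [{"t": 0.0, "n": i, "channels": [10.0, 100.0]} for i in range(4)]
print([quality(r) for r in channel_rms(frames)])  # ['green', 'red']
```

Accumulate frames into a rolling window (e.g. one second, 250 frames) and recompute per update tick.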
```
{
  "status": "connected",
  "sample_rate": 250,
  "channels": 16,
  "filter": false,
  "recording": false
}
```

| Band | Range | Color |
|---|---|---|
| Delta (δ) | 0.5–4 Hz | 🟣 `#8b5cf6` |
| Theta (θ) | 4–8 Hz | 🔵 `#06b6d4` |
| Alpha (α) | 8–13 Hz | 🟢 `#22c55e` |
| Beta (β) | 13–30 Hz | 🟡 `#f59e0b` |
| Gamma (γ) | 30–100 Hz | 🔴 `#ef4444` |
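These band definitions plus a plain FFT are enough to compute band powers outside the dashboard. A numpy sketch for one channel (rectangular window, single epoch; the dashboard itself uses a Hanning-windowed 256-point FFT):

```python
import numpy as np

BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 100)}

def band_powers(x, fs=250):
    """Mean PSD per band for one channel of samples x (rectangular window)."""
    psd = np.abs(np.fft.rfft(x)) ** 2 / (fs * len(x))
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    return {name: float(psd[(freqs >= lo) & (freqs < hi)].mean())
            for name, (lo, hi) in BANDS.items()}

# 2 s of a pure 10 Hz tone lands squarely in the alpha band:
t = np.arange(0, 2, 1 / 250)
powers = band_powers(np.sin(2 * np.pi * 10 * t))
assert max(powers, key=powers.get) == "alpha"
```

For real recordings, average over several overlapping windows (Welch's method, as in notebook 03) to reduce variance.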
HTTP callbacks fired when EEG conditions are met. Build rules in the dashboard or via WebSocket commands.
| Type | Fires when… |
|---|---|
| `band_power_above` | Band power exceeds threshold |
| `band_power_below` | Band power drops below threshold |
| `amplitude_above` | Peak amplitude exceeds threshold |
| `amplitude_below` | Peak amplitude drops below threshold |
| `band_ratio_above` | Band ratio (e.g. β/θ) exceeds threshold |
| `band_ratio_below` | Band ratio drops below threshold |
```
{
  "event": "band_power_above",
  "rule": "Alpha alert",
  "value": 23.45,
  "threshold": 20,
  "channel": 0,
  "timestamp": 1711234567.12
}
```

Each rule has an independent cooldown (seconds) to prevent spam, enforced server-side.
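Anything that accepts an HTTP POST can consume this payload. A stdlib-only receiver sketch (the port is arbitrary; point a Generic-mode rule at `http://<your-machine>:8080/`):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class HookHandler(BaseHTTPRequestHandler):
    """Minimal receiver for the generic webhook payload shown above."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        event = json.loads(self.rfile.read(length))
        # React to the documented payload fields:
        print(f"{event['rule']}: {event['event']} = {event['value']} "
              f"(threshold {event['threshold']}, ch {event['channel']})")
        self.send_response(204)  # any 2xx satisfies the sender
        self.end_headers()

    def log_message(self, *args):  # silence default per-request logging
        pass

# HTTPServer(("0.0.0.0", 8080), HookHandler).serve_forever()
```

Swap the `print` for whatever the event should drive: a light, a log, a desktop notification.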
Webhook rules support three service modes — Generic, IFTTT, Zapier — selectable per rule.
| Service | Setup | Payload shape |
|---|---|---|
| IFTTT | Get key at ifttt.com/maker_webhooks → Documentation | `value1` (value), `value2` (trigger type), `value3` (full JSON) |
| Zapier | Create Zap → Webhooks trigger (Catch Hook) → paste URL | Flat JSON: `event`, `rule`, `value`, `threshold`, `channel`, `band`, `timestamp`, `source` |
| Generic | Any POST-accepting endpoint (n8n, Home Assistant, Node-RED…) | `event`, `rule`, `value`, `threshold`, `channel`, `timestamp` |
Example payloads:

```
// IFTTT
{"value1": "23.45", "value2": "band_power_above", "value3": "{\"event\":\"band_power_above\",\"rule\":\"Alpha alert\",\"value\":23.45,\"threshold\":20}"}
// Zapier
{"event": "band_power_above", "rule": "Alpha alert", "value": 23.45, "threshold": 20, "channel": 0, "band": "alpha", "timestamp": 1711234567.12, "source": "pieeg"}
// Generic
{"event": "band_power_above", "rule": "Alpha alert", "value": 23.45, "threshold": 20, "channel": 0, "timestamp": 1711234567.12}
```

Stream EEG band powers to VRChat via UDP OSC. Enable with `--osc` or from the dashboard.
```
pieeg-server --osc                                       # defaults
pieeg-server --osc --osc-mode chatbox --osc-interval 0.5 # chatbox only, 2 Hz
```

| Mode | Output |
|---|---|
| `chatbox` | Band powers as text in VRChat chatbox |
| `parameters` | Avatar parameter floats (normalized 0–1) |
| `both` | Chatbox + avatar parameters (default) |
WebSocket commands: osc_start, osc_stop, osc_configure, osc_status — see WebSocket API.
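Under the hood, `parameters` mode is just single-float OSC messages over UDP. A stdlib-only sketch of that wire format, useful for testing a receiver without the server; the `/avatar/parameters/EEG_…` address is an assumption, since actual parameter names depend on your avatar:

```python
import socket, struct

def osc_pad(b: bytes) -> bytes:
    """OSC strings are null-terminated and padded to a 4-byte boundary."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_float_message(address: str, value: float) -> bytes:
    """Encode a single-float OSC message: address, type tag ",f", big-endian float."""
    return osc_pad(address.encode()) + osc_pad(b",f") + struct.pack(">f", value)

def send_band(sock, band: str, value: float, host="127.0.0.1", port=9000):
    # Parameter path is a placeholder — match it to your avatar's parameters.
    sock.sendto(osc_float_message(f"/avatar/parameters/EEG_{band}", value),
                (host, port))

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_band(sock, "alpha", 0.42)  # normalized 0-1, like the server's parameters mode
```

In practice a library such as python-osc does this encoding for you; the sketch just makes the packet layout visible.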
Push raw EEG samples to the LSL network. Discoverable by OpenViBE, MNE-LSL, BCI2000, NeuroPype, LabRecorder.
```
pieeg-server --lsl                  # enable on startup
pieeg-server --lsl --lsl-name MyEEG # custom stream name
```

Toggle at runtime via dashboard or WebSocket (`lsl_start` / `lsl_stop`).

Receive in Python:

```python
from pylsl import StreamInlet, resolve_stream

streams = resolve_stream('name', 'PiEEG')
inlet = StreamInlet(streams[0])
sample, timestamp = inlet.pull_sample()
```

```
pieeg-server record session.csv                # standalone, Ctrl-C to stop
pieeg-server record session.csv --duration 300 # 5 minutes
pieeg-server --record session.csv              # record while streaming
```

| Feature | Details |
|---|---|
| Format | CSV: timestamp, ch1, ch2, ..., chN |
| Location | recordings/pieeg_YYYYMMDD_HHMMSS.csv |
| Annotations | Frame-based text markers; sidecar .annotations.json |
| Playback | Dashboard session library: play/pause, seek, speed (0.5×–2×) |
| Export | CSV with annotation column, or JSON with metadata |
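Because the on-disk format is just `timestamp, ch1, ..., chN`, recordings load with nothing but the stdlib (or one `pandas.read_csv` call). A sketch of a plain-csv loader, shown here against an in-memory demo string rather than a real recording:

```python
import csv, io

def load_session(file_like):
    """Parse a pieeg-server recording (header: timestamp, ch1, ..., chN).

    Returns (timestamps, channels) where channels[i] holds the i-th
    channel's samples in uV.
    """
    reader = csv.reader(file_like)
    header = next(reader)                       # timestamp, ch1, ..., chN
    n_ch = len(header) - 1
    times, chans = [], [[] for _ in range(n_ch)]
    for row in reader:
        times.append(float(row[0]))
        for i in range(n_ch):
            chans[i].append(float(row[i + 1]))
    return times, chans

demo = io.StringIO("timestamp,ch1,ch2\n0.000,1.5,-2.0\n0.004,1.6,-1.9\n")
t, ch = load_session(demo)
print(len(ch), "channels,", len(t), "samples")  # 2 channels, 2 samples
```

For real sessions, open the file with `open("recordings/pieeg_....csv")` and pass it in; the 0.004 s step between rows reflects the 250 Hz sample rate.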
| # | Notebook | Description |
|---|---|---|
| 1 | `01_load_and_plot_session` | Load CSV or stream live, plot all channels |
| 2 | `02_detect_blinks` | Eye-blink artifact detection (threshold-based) |
| 3 | `03_bandpower_and_alpha` | Band powers via Welch's method, alpha tracking |
| 4 | `04_export_features_for_ml` | Windowed features → ML-ready CSV (scikit-learn) |
```
cd notebooks && pip install -r requirements.txt
pieeg-server --mock &
jupyter lab
```

```
pieeg-server [OPTIONS] [COMMAND]
```
| Command | Description |
|---|---|
| `doctor` | Diagnose hardware, software, and configuration |
| `record FILE` | Record EEG to CSV (standalone, no server) |
| `monitor` | Live terminal display (standalone, no server) |
| Flag | Default | Description |
|---|---|---|
| `--device DEVICE` | `pieeg16` | `pieeg8`, `pieeg16`, or `ironbci8` |
| `--ble-name NAME` | `EAREEG` | BLE advertised device name (IronBCI only) |
| `--ble-address ADDR` | — | BLE MAC address — skip scan, connect directly |
| `--host HOST` | `0.0.0.0` | Bind address |
| `--port PORT` | `1616` | WebSocket port |
| `--dashboard-port PORT` | `1617` | Dashboard HTTP port |
| `--no-dashboard` | — | Disable web dashboard |
| `--auth` | — | Enable 6-digit access code |
| `--gpio-chip PATH` | `/dev/gpiochip4` | GPIO chip device |
| `--filter` | — | Enable 1–40 Hz bandpass filter |
| `--lowcut HZ` | `1.0` | Filter low cutoff |
| `--highcut HZ` | `40.0` | Filter high cutoff |
| `--record FILE` | — | Record to CSV while streaming |
| `--record-duration SEC` | — | Stop recording after N seconds |
| `--monitor` | — | Show terminal monitor alongside server |
| `--mock` | — | Synthetic EEG (no hardware needed) |
| `--no-webhooks` | — | Disable webhooks |
| `--osc` | — | Enable VRChat OSC bridge |
| `--osc-host HOST` | `127.0.0.1` | VRChat receiver |
| `--osc-port PORT` | `9000` | VRChat OSC port |
| `--osc-mode MODE` | `both` | `chatbox`, `parameters`, or `both` |
| `--osc-interval SEC` | `0.25` | OSC update rate |
| `--lsl` | — | Enable Lab Streaming Layer outlet |
| `--lsl-name NAME` | `PiEEG` | LSL stream name |
| `-v, --verbose` | — | Debug logging |
```
┌──────────────────────────────────────────────────────────┐
│  Raspberry Pi + PiEEG Shield (8 or 16 ch)                │
│  — or — IronBCI / EAREEG board (8 ch, Bluetooth LE)      │
│                                                          │
│  hardware.py    → SPI/GPIO init, ADS1299 config (PiEEG)  │
│  ironbci.py     → BLE scan + GATT notifications (IronBCI)│
│        ↓                                                 │
│  acquisition.py → 250 Hz read loop (SPI or BLE)          │
│        ↓ pub/sub                                         │
│   ├── server.py     → WebSocket broadcast                │
│   ├── recorder.py   → CSV writer                         │
│   ├── monitor.py    → Terminal sparklines                │
│   ├── osc_vrchat.py → VRChat OSC bridge                  │
│   └── lsl.py        → Lab Streaming Layer outlet         │
│                                                          │
│  webhooks.py  → event rules + HTTP relay                 │
│  dashboard.py → HTTP server (React UI)                   │
│  auth.py      → session + WS token management            │
│  filters.py   → Butterworth bandpass (SOS)               │
│  updater.py   → version check + upgrade                  │
│  doctor.py    → system diagnostics                       │
│        ↓                                                 │
│  ws://0.0.0.0:1616  │  http://0.0.0.0:1617               │
└──────────┬───────────────────────────────────────────────┘
           │ Local network / internet
           ├── Browser dashboard (React + Vite)
           ├── Python / Jupyter notebook
           ├── VRChat (OSC over UDP)
           ├── LSL consumers (OpenViBE, MNE, LabRecorder…)
           ├── IFTTT / Zapier (webhooks)
           └── Any WebSocket client
```
| Module | Responsibility |
|---|---|
| `hardware.py` | SPI/GPIO initialization, ADS1299 register config (PiEEG) |
| `ironbci.py` | BLE scan, GATT notifications, ADS1299 packet parsing (IronBCI) |
| `acquisition.py` | 250 Hz read loop (SPI, BLE, or mock), pub/sub queues |
| Layer | Tech |
|---|---|
| Framework | React 19 |
| Bundler | Vite 6 |
| Rendering | Canvas 2D (waveforms, spectra), Three.js (WebXR) |
| FFT | Pure-JS Cooley-Tukey radix-2 (256-point, Hanning window) |
| State | React hooks + refs (no external state library) |
| Styling | Plain CSS |
Setup creates a service that auto-starts on boot:
```
sudo systemctl status pieeg-server
sudo systemctl restart pieeg-server
journalctl -u pieeg-server -f
```

Start with `--auth` to require a 6-digit access code (printed at startup; it changes on every restart). Sessions persist for 24 hours via HttpOnly cookies.
```
╔══════════════════════════════════════╗
║       DASHBOARD ACCESS CODE          ║
║          Code: 847291                ║
╚══════════════════════════════════════╝
```
| Layer | Protection |
|---|---|
| Access code | HMAC timing-safe comparison |
| Sessions | `secrets.token_urlsafe` (256-bit), 24 h TTL |
| Cookies | HttpOnly, SameSite=Lax |
| WebSocket | Short-lived single-use token |
| Rate limit | 5 attempts / 60 s per IP |
| Recordings | Path traversal protection |
| CORS | Origin-reflecting (not wildcard) |
Without `--auth`, the dashboard and WebSocket are open to your local network.
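The same `/auth` → `/auth/ws-token` flow shown earlier in TypeScript works from Python with only the stdlib; the session cookie set by `/auth` must be carried into the token request, which a cookie-aware opener handles. A sketch, assuming the default hostname and ports:

```python
import json, urllib.request
from http.cookiejar import CookieJar

def get_ws_token(code: str, base="http://raspberrypi.local:1617") -> str:
    """POST the 6-digit code, then fetch a single-use WebSocket token.

    Mirrors the documented /auth -> /auth/ws-token flow; the opener's
    cookie jar carries the session cookie between the two requests.
    """
    opener = urllib.request.build_opener(
        urllib.request.HTTPCookieProcessor(CookieJar()))
    req = urllib.request.Request(
        f"{base}/auth",
        data=json.dumps({"code": code}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    opener.open(req)  # sets the session cookie
    with opener.open(f"{base}/auth/ws-token") as resp:
        return json.loads(resp.read())["token"]

# ws_url = f"ws://raspberrypi.local:1616?token={get_ws_token('847291')}"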
git clone https://github.com/pieeg-club/PiEEG-server.git
cd PiEEG-server
python -m venv .venv && source .venv/bin/activate # Windows: .venv\Scripts\Activate.ps1
pip install -e ".[dev,ironbci]"# Terminal 1
pieeg-server --mock
# Terminal 2
cd dashboard && npm install && npm run dev # → http://localhost:3000Vite proxies /auth and /ws to the Python server.
Build: cd dashboard && npm run build — output committed to pieeg_server/static/dashboard/ so pip users don't need Node.
Auto-build hook:
cp scripts/pre-commit .git/hooks/pre-commit && chmod +x .git/hooks/pre-commitpytest # all tests
pytest -v # verbose./scripts/build_release.sh # React build → wheel → dist/
./scripts/build_release.sh --upload # + upload to PyPI- Bump version in
pyproject.toml+pieeg_server/__init__.py git commit -am "release v0.X.0" && git tag v0.X.0./scripts/build_release.sh --upload && git push origin main --tags
Designed for trusted local networks (home lab, research bench). Not hardened for the public internet.
Assumes: private network, physical access to Pi, trusted client devices.
Does NOT defend against: MITM (no TLS), public internet exposure, untrusted users on the same network.
| Not secured | Mitigation |
|---|---|
| No TLS | Dedicated network; or nginx/caddy reverse proxy with self-signed cert |
| No user accounts | Single-user device on home network |
| No CSP headers | Static React build, no user-generated content |
| In-memory sessions | By design — code changes on restart |
| 6-digit code (900k possibilities) | Rate limiting + trusted network; use firewall/VLAN if concerned |
For sensitive deployments: private network only → TLS termination (nginx/caddy) → firewall rules → VPN (WireGuard/Tailscale) for remote access.
pieeg-server doctor # checks Pi model, SPI, GPIO, ports, deps, systemd
# exit 0 = ok, 1 = warnings, 2 = errorsWindows
Windows only supports mock mode (--mock). If pip install -e . fails with WinError 2:
pip uninstall pieeg-server -y
pip install -e .Or use the bundled wrapper: .\pieeg-server.cmd --mock
GPIO: no gpiod dependency
The server talks to GPIO directly via Linux chardev v1 ioctl — no gpiod pip package. Two GPIO pins (chip-select + data-ready) use ~20 lines of struct packing instead of an external dep that has breaking v1/v2 API changes, requires libgpiod-dev C headers, and only works on Linux. The chardev ABI has been stable since Linux 4.8 (2016).
PiEEG must operate from battery power (5V) only. Do NOT connect to mains-powered equipment via USB. PiEEG is NOT a medical device.
Built with guidance from Ildar Rakhmatulin, PhD, creator of the PiEEG platform.
MIT





