Monorepo for real-time digital avatar R&D across:
- MetaHuman + Unreal Engine control/runtime
- Video-generation avatar workflows
- Gaussian avatar runtimes
- WebRTC/LiveKit streaming infrastructure
- Evidence-backed slide/article evolution pipelines
- Primary: https://www.realtime-avatars.com
- Vercel fallback: https://realtime-avatars.vercel.app
- Slides deck entry: https://www.realtime-avatars.com/slides/1
- Repository: public at https://github.com/pajamadot/realtime-avatars
- Web stack: Next.js 16.1.6, React 19, TypeScript, Tailwind v4
- Slides: 35 slides in web/app/slides/SlidesDeck.tsx
- API routes live under web/app/api/*
- Latest evolution snapshot: full-modality-social-evolver cycle 31, multimodal-io-research-evolver cycle 102, slide-research-proofreader-evolver cycle 103 (0 warned slides)
- GitHub Actions automation: .github/workflows/update-research-feed.yml (daily feed refresh + build safety check)
- web/: Next.js app, slides, demos, learning tracks, API routes, public docs snapshots
- metahuman/: UE 5.7 project (avatar01), plugin, bridge service, architecture/E2E docs
- agents/: LiveKit Hedra avatar Python agent runtime
- workers/: Cloudflare Worker for LiveKit token issuance
- gaussian-avatar/: Dockerized OpenAvatarChat + LAM runtime
- .claude/skills/: self-evolving research/proofread/architecture automation
- research/: lightweight research notes
- secrets/: local secret-loading support (not for committed secrets)
- /: main survey page
- /slides, /slides/[id], /slides/demos/[slug]
- /learn/*: track pages and concept drilldowns
- /rapport: MetaHuman realtime control/talk panel
- /livekit: LiveKit streaming avatar demo
- /gaussian-video-wall: curated Gaussian-avatar video wall
- /presenter-script and /presenter-script/s/[token]
- web/app/api/metahuman/control/route.ts
- web/app/api/livekit/token/route.ts
- web/app/api/openai/realtime/client-secret/route.ts
- web/app/api/fal/proxy/route.ts
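As a hedged illustration, a minimal client for the LiveKit token route might look like the sketch below. The JSON field names (identity, room), the POST method, and the localhost base URL are assumptions for illustration only; check web/app/api/livekit/token/route.ts for the actual contract.

```python
"""Hypothetical client sketch for the LiveKit token API route."""
import json
import urllib.request

BASE_URL = "http://localhost:3000"  # assumed local `npm run dev` server


def token_request(identity: str, room: str) -> urllib.request.Request:
    """Build a POST to /api/livekit/token with an assumed JSON body shape."""
    body = json.dumps({"identity": identity, "room": room}).encode()
    return urllib.request.Request(
        f"{BASE_URL}/api/livekit/token",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# To actually fetch a token against a running dev server:
# with urllib.request.urlopen(token_request("demo-user", "demo-room")) as resp:
#     print(json.load(resp))
```

Keeping the request construction separate from the network call makes the assumed payload shape easy to adjust once the real route contract is confirmed.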
Install dependencies and start the dev server:

```
cd web
npm install
npm run dev
```

Production build:

```
cd web
npm run build
```

Update research feeds:

```
cd web
npm run feeds:update
```

Refresh research feeds:

```
cd web
npm run feeds:refresh
```

Run from repository root:
```
python .claude/skills/metahuman-evolver/scripts/evolve_metahuman.py --max-api-samples 30
python .claude/skills/full-modality-social-evolver/scripts/evolve_social_modality_research.py
python .claude/skills/multimodal-io-research-evolver/scripts/evolve_multimodal_io_research.py
python .claude/skills/realtime-gaussian-research-evolver/scripts/evolve_realtime_gaussian_research.py
python .claude/skills/gaussian-youtube-video-wall-evolver/scripts/evolve_gaussian_video_wall.py
python .claude/skills/slide-research-proofreader-evolver/scripts/evolve_slide_research_proofread.py
```

Published snapshots are mirrored to web/public/docs/ (research, claim checks, dependency graphs, proofread reports).
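The manual sequence above can be sketched as one wrapper. The script paths are copied from this README; the wrapper itself (and its stop-on-first-failure policy) is a hypothetical convenience, not part of the repo's tooling.

```python
"""Hypothetical helper that sequences the evolver skills from the repo root."""
import subprocess
import sys

# One entry per evolver skill, in the order documented above.
EVOLVER_SCRIPTS = [
    [".claude/skills/metahuman-evolver/scripts/evolve_metahuman.py",
     "--max-api-samples", "30"],
    [".claude/skills/full-modality-social-evolver/scripts/evolve_social_modality_research.py"],
    [".claude/skills/multimodal-io-research-evolver/scripts/evolve_multimodal_io_research.py"],
    [".claude/skills/realtime-gaussian-research-evolver/scripts/evolve_realtime_gaussian_research.py"],
    [".claude/skills/gaussian-youtube-video-wall-evolver/scripts/evolve_gaussian_video_wall.py"],
    [".claude/skills/slide-research-proofreader-evolver/scripts/evolve_slide_research_proofread.py"],
]


def evolver_commands():
    """Return the full command line for each evolver, using this interpreter."""
    return [[sys.executable, *script] for script in EVOLVER_SCRIPTS]


def run_all():
    """Run each evolver in order from the repository root; stop on the first failure."""
    for cmd in evolver_commands():
        subprocess.run(cmd, check=True)
```

Calling run_all() from the repository root mirrors running the six commands by hand.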
- Unreal plugin source:
metahuman/avatar01/Plugins/PajamaRealtimeControl/Source/PajamaRealtimeControl/*
- Local bridge:
metahuman/editor-bridge/server.mjs
- Web UI:
web/app/components/MetaHumanEditorControlPanel.tsx and web/app/components/MetaHumanRealtimeTalkPanel.tsx
- Runbook:
metahuman/UE57_REALTIME_E2E.md
- This is an actively iterated engineering/research repo, not a versioned product release.
- Some local-runtime-heavy folders may exist (for example agents/livekit-hedra-avatar/.venv and UE artifacts under metahuman/avatar01/).
- Source-of-truth operational guidance is in AGENTS.md and ITERATION-MEGA-PROMPT.md.