
tdbe/openxr-vulkan-gamedev-framework

 
 


TL;DR:

Performant open-source OpenXR, Vulkan, ECS, C++. A boilerplate -> framework -> mini "game engine" to quickly make an actual playable modern royalty-free XR game.

Demystifies ECS / Memory Management, Single Pass Rendering, XR Input, and XR gamedev fundamentals, on top of @janhsimon's excellent timesaving khronos setup openxr-vulkan-example.

[Note: you can leave github comments over on my blog post version 😉]


Demo video, summer 2025. (There's also a youtube hq 1600x1600/1440p 60fps version.)

Abstract:

⠀⠀⠀⠀⠀⠀⠀⠀ ⠀10 <-0-> 10⠀ ⠀⠀⠀⠀⠀⠀⠀⠀
"frictionlessly understandable by all humans" |•║••••••••|••••••••••| "hoo-man? what is, this, hoo-man?"
"human craft, real (abductive) attention" |••║•••••••|••••••••••| "ai vibecoding slop & copy pasta"
"beat all benchmarks, even the useless ones" |•••║••••••|••••••••••| "I like C# style garbage collection, and hate ECS"
"fix engine tropes, frictions, dev UX" |••••║•••••|••••••••••| "I don't make games I just implement standards and compile engines"
 (but we got 0 budget)  ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀  

*Trey Parker voice* Vulkan has a rich body of work, and many strengths as a people; but know not what is of hoo-man. I've managed to translate their work, for hoo-mans, whose lifetimes are too short to decipher all the khronos lunar manuals & messaging in hopes of achieving even the most basic contact.

It didn't help that they didn't want to touch Single-Pass rendering (the performant & industry-standard linchpin of (XR) rendering).

In any case, thanks to open-source (as opposed to commercial / AAA) you can now build something pretty good the right way, without worrying about mighty morphing license agreements or wetting the beaks of people with golden parachutes. PS: you'll find no insensitivity, abuse, or horrors beyond all comprehension, attached to this project.

Builds:

"Recommended Specs" and "Minimum Requirements":

  • The test scene runs on an RTX 4070 mobile at 90 FPS (the refresh rate of the headset) with 8 animated tube lights, 2 directional lights, and a lot of transparent-object overdraw.

  • It also runs on an RTX 2060 mobile (with occasional lag spikes if you have too much overdraw right in your face).

Controls:

My feature stack so far:

(sections ordered from high-level to low-level)

XR Locomotion

  • Rotating and (accelerated) Panning of the scene by grabbing with both hands: you retreat into a non-euclideanly warped pocket dimension (the world is pushed away from you non-linearly) while seeing a "tunnelvision" portal-style chaperone. Highest effectiveness and lowest sickness (carefully tweaked and tested across dozens of different people).

  • Uses state machines for movement and for visuals. Supports animated teleportation with targets.

    chaperone_demo_gif

Base XR gameplay mechanics

  • Mechanics system based on a list of GameBehaviours set up as FSMs.

  • Each behaviour is Created (with its own required references), Updated (with frame & input data etc), and Destroyed.

  • Mechanics for locomotion, hands, XR Input testing, world objects.

  • Any GameComponent has one or more GameEntity parents, with manual or automatic cleanup that prevents dangling components when all owners are freed.

Physics

  • There's support for running jobs on Bounds components (generated at model load time), with proper functions for AABB intersection or enclosure tests, plane tests, rectangular selection (even at non-Axis-Aligned angles) / frustum casting, raycasting.

  • There's a concept of ground in the locomotion system.

  • But TODO: no actual Physics library added.

Animation

  • *crickets* TODO: add it via/with full/extended gltf support.

  • TODO: find something open-source for: IK, gpu-skinning, and LoDs, or implement some of it.

Audio

  • *crickets* TODO: add Audio component, and threaded spatial sound library with audio queueing.

GUI

  • *crickets* TODO: vector or sdf text, and just textures. Then use previously made ui code.

  • TODO: main menu, in-game hands inventory

  • Use Blender instead of a scene-graph management system (game editor)?

    • TODO: Maybe use Blender scenes and exports instead of in-engine scene serialization and even prefabs.

Jobs / Threading / Queueing

  • Uses ./external/taskflow; you can run threads per tile (chunk) the right way: all components of an entity live in the same chunk and fit comfortably in CPU cache.

  • The objects and memory are set up in tiled (chunked) ECS pools, and Archetyped pools.

  • There's e.g. a SystemTransformPropagation you can use as an example of how to run jobs over entities and components.

  • There's tf::Taskflow::for_each, tasks and subtasks, task queuing, waiting, chaining, semaphores, parenting & graphs, work stealing, pooling, thread load balancing, and graph-dump debugging.

Systems

Transform Propagation System

  • SystemTransformPropagation - a parenting system that runs after the "simulation stage" and before the "rendering stage". It processes the chain of local (relative) Transform data (position, orientation, scale) using recursive jobs that query entity chunks with Parent Components, ignoring disabled entity trees. (This system is what allows children to move with their parent(s).)

Queries

  • If you're familiar with e.g. "Entity Queries" from Unity ECS, a "query" here is a job run on each chunk of a GameWorld's GameDataPool or ArchetypeGameDataPool, in which you check whatever you want yourself: the entity's ID, archetype mask, a specific component value, etc.

ECB (Entity Command Buffer)

  • TODO:

System Stages

  • For now we have our Main game loop with:

    • GameBehaviours loop

    • InputSystem update & poses & haptics

    • Your custom Systems

    • Engine systems e.g. SystemTransformPropagation

    • Renderer

GameData

  • GameWorld { ArchetypedGameDataPool { a variadic type Structure of tiled (chunked) Arrays (subpools) of GameEntity / GameEntityObject, and unique Components }, and GameDataPool (also cache friendly tiled (chunked)) { shared Components }, GameDataPool { sparse Components }, and GameDataPool { Buffered Components } }.

  • Everything is set in generic memory-span pools, by type, or by variadic types ("archetype"). You set up a game world with maximum allocated memory for each pool; then during setup or gameplay you can request a free object, or mark a used one as clear (free and reusable). There's no need for defragmenting or swap-and-pop (which would be slower in the average case) ("ted talk" in GameDataPool.h). There are archetype masks on entities and per archetype pool, but not per tile (chunk).

  • (non-shared) Pools are also split into Tiles (simple chunking): a component is always added to the tile where its owner entity lives. (Order within a tile doesn't matter.) (The purpose is cache coherency.) (No automatic archetype management, but the build order is customizable.)

  • TODO (nice to have, but too much accounting work): have each tile (chunk) enforce and expose an archetype, so queries know ahead of time what a chunk contains while iterating all chunks.

  • The chunky Tiles are kept in GameWorlds (e.g. main world, enemies world, bullets, vfx etc.). Each world has many Tiles, each holding e.g. 128 elements.

  • Entities and components are based on GameDataId (serving as a weak reference): [worldIndex][typeUID][tileIndex][index][version], plus a cached [typeIndex] of the top-level T for convenience.

  • Entity Parenting support with automatic (or manual) management, and a builtin Transform Propagation System.

  • Everything is easy to request and keep track of through various means, even by name in hash maps for the elements you set up with manual light scripting in mind.

  • Cleanup is either manual (jobs/systems) (and cache coherent) or automated via (cache-missing) awareness of component dependencies and function/event propagation.

  • GameEntity, and GameEntityObject : GameEntity.

  • Properties: isVisible, isEnabled; the "object" entity versions also have some events etc., but no real extra memory usage.

  • GameComponent: Material (shared), Model (shared), Transform, Bounds, Parent, Light (sparse component).

  • PlayerObjects {GameEntityObjects, PlayerActiveStates}.

  • Materials {Shader, Descriptor-set UniformData, instancing, optional/shared Pipeline (for e.g blend ops)}

  • AABB Bounds from the Mesh.

  • Light holding multiple kinds of lights based on properties and/or volume/sizes.

  • Transform, Parent

  • Children (just for local, cache-missing traversal, mainly for debugging or crawling e.g. a character rig)

Rendering

  • Implemented high-quality lighting, e.g. Disney BRDF equations for diffuse and specular, and MRP-based (Most Representative Point) shape lights. There's a blog post on correct tube lights: mrp_volumetric_lights_gif
  • TODO: does not include clearcoat.
  • TODO: does not include subsurface scattering,
  • TODO: does not include shadows,
  • TODO: no per-pixel transparent object sorting,
  • ^, ^^, ^^^: but someday I'll add raytracing into some form of nanite clumps or other semi-volumetric discrete mesh data, instead of going through the legacy shading/sorting timesinks again.
  • Per-material, per-model, per-pipeline properties. Easily create a material (e.g. transparent, double-sided), add new shaders with all the static and dynamic uniform data you need, instance geometry, etc.
  • The render pipeline knows if you modified any default properties and creates pipelines from unique materials. It batches per unique material and per model as best it can to minimise GPU-CPU communication, and has instancing, but it's not Indirect Rendering.
  • Expanded, added to, and explained Khronos' & janhsimon's Headset, Context, Renderer/Pipeline etc., and the easily misleading & hard-to-customize Khronos Vulkan <-> OpenXR implementation. Especially regarding multipass vs singlepass & multiview, and what it takes to use your own renderer or a different API like WebGPU. (Look for "// [tdbe]".)

Input class and InputData.

  • A 'proper' universal XR input class, supporting (probably) all controllers/headsets, with customizable binding paths and action sets.
  • Nicely accessible data through InputData and InputHaptics, including matrices and other tracked XR positional data.
  • Poses for controllers and for head.
  • Actions (buttons, sticks, triggers, pressure, proximity etc).
  • User presence / headset activity state.
  • Haptic feedback output.
  • Exposes action state data (e.g. lastChangeTime, isActive, changedSinceLastSync)

Utils

  • Utils and math for XR, input, general gamedev.

  • Debugging, logging to file.

  • TODO: Bring in gizmos, e.g. debug lines, wireframes for lights etc.

A typical run log:

Attributions #2

(attributions #1 here)

Asset Title Author License
models/SuzanneHighQuality20k.obj Suzanne Blender (but mesh smoothed and subdivided by tdbe) GNU GPL 2+
models/SudaBeam.obj SudaBeam tdbe, a simplified version of Suda 51's parody of a light saber GNU GPL 3+
models/Squid_Happy_Grumpy.obj Happy Grumpy Squid tdbe CC BY 4.0
models/ground_displaced_*.obj demo ground tdbe CC BY 4.0
models/icosphere_*.obj utility (ico)spheres tdbe CC BY 4.0
models/quad_*.obj utility quad tdbe CC BY 4.0
models/capsule_*.obj utility capsule tdbe CC BY 4.0
models/text_*.obj various texts tdbe CC BY 4.0



Below is Janhsimon's original readme:


Teaser

Overview

This project demonstrates how you can write your own VR application using OpenXR 1.1 and Vulkan 1.3. These are its main features:

  • Basic rendering of an example scene to the headset and into a resizeable mirror view on your desktop monitor.

  • Focus on easy-to-read and easy-to-understand C++ without smart pointers, inheritance, templates, etc.

  • Usage of the Vulkan multiview extension for extra performance.

  • Warning-free code base spread over a small handful of classes.

  • No OpenXR or Vulkan validation errors or warnings.

  • CMake project setup for easy building.

Integrating both OpenXR and Vulkan yourself can be a daunting and painfully time-consuming task. Both APIs are very verbose and require the correct handling of countless minute details. This is why there are two main use cases where this project comes in handy:

  1. Fork the repository and use it as a starting point to save yourself weeks of tedious integration work before you get to the juicy bits of VR development.
  2. Reference the code while writing your own implementation from scratch, to help you out if you are stuck with a problem, or simply for inspiration.

Running the OpenXR Vulkan Example

  1. Download the latest release or build the project yourself with the steps below.
  2. Make sure your headset is connected to your computer.
  3. Run the program!

Building the OpenXR Vulkan Example

  1. Install the Vulkan SDK version 1.3 or newer.
  2. Install CMake version 3.1 or newer.
  3. Clone the repository and generate build files.
  4. Build!

The repository includes binaries for all dependencies except the Vulkan SDK on Windows. These can be found in the external folder. You will have to build these dependencies yourself on other platforms. Use the address and version tag or commit hash in version.txt to ensure compatibility. Please don't hesitate to open a pull request if you have built dependencies for previously unsupported platforms.

Attributions

Asset Title Author License
models/Beetle.obj Car Scene toivo CC BY 4.0
models/Bike.obj Sci-fi Hover Bike 04 taktelon CC BY 4.0
models/Car.obj GAZ-21 Ashkelon CC BY 4.0
models/Hand.obj Hand Anatomy Reference Ant B-D CC BY 4.0
models/Ruins.obj Ancient Ruins Pack Toni García Vilche CC BY 4.0

About

Open Source cutting edge poor man's game framework. OpenXR, Vulkan single-pass renderer, ECS, artifact-free volumetric lights, advanced chaperone & locomotion etc. Gamedev Fundamentals in C++ & Vulkan.

Languages

  • C++ 72.5%
  • GLSL 15.9%
  • Batchfile 6.5%
  • C 3.0%
  • CMake 2.1%