A search application for discovering and exploring Maven workshops/talks with instructors. Built with modern web technologies and deployed on Cloudflare.
## Features

- Full-text search - Search workshops by title and description
- Advanced filtering - Filter by tags, instructors, and status (Scheduled/Live/Recorded)
- Sortable results - Sort by start time or duration (ascending/descending)
- Pagination - Configurable rows per page (10, 20, 50, 100)
- CSV export - Download filtered results as CSV
- RSS feed - Subscribe to filtered results via RSS
- Featured talks - Highlighted workshops marked as featured
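The CSV export step can be sketched as a small serializer. This is a minimal illustration, not the app's actual implementation; the `WorkshopRow` shape and column names are assumptions:

```typescript
// Hypothetical sketch of exporting filtered results as CSV.
// Column names are illustrative; the app's real schema may differ.
type WorkshopRow = { title: string; instructor: string; status: string };

function toCsv(rows: WorkshopRow[]): string {
  // Quote a field if it contains a comma, quote, or newline (RFC 4180 style)
  const escape = (v: string) =>
    /[",\n]/.test(v) ? `"${v.replace(/"/g, '""')}"` : v;
  const header = "title,instructor,status";
  const lines = rows.map((r) =>
    [r.title, r.instructor, r.status].map(escape).join(","),
  );
  return [header, ...lines].join("\n");
}
```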
## Tech Stack

| Category | Technology |
|---|---|
| Runtime | Bun |
| Monorepo | Bun workspaces |
| Database | Cloudflare D1 (SQLite) |
| ORM | Drizzle ORM |
| Frontend | React 19 |
| Routing | TanStack Router |
| Data Fetching | TanStack Query |
| SSR | TanStack Start |
| Styling | Tailwind CSS v4 |
| Components | shadcn/ui (Base UI) |
| Scraper | Cloudflare Workers |
| Hosting | Cloudflare Pages |
| Linting | Biome |
| Logging | tslog |
## Prerequisites

- Bun v1.0+
- Cloudflare account (for D1 database and deployment)
## Setup

Install dependencies:

```sh
bun install
```

Create a Cloudflare API token with the following permissions:
- D1: Edit
- Account Settings: Read
Add the token to your `.env` (create the file if needed):

```sh
# .env
CLOUDFLARE_API_TOKEN=...
```

Then run the setup script:

```sh
bun run setup_cf.ts
```

This will automatically:
- Create a D1 database (`maven_search_prod`)
- Overwrite `.env` with your credentials
- Create `wrangler.jsonc` configuration files
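The generated `wrangler.jsonc` for the scraper will look roughly like the sketch below. This is illustrative only: the worker name, cron schedule, and compatibility date are assumptions, and the `database_id` comes from your account via the setup script.

```jsonc
// wrangler.jsonc (illustrative sketch, not the generated file)
{
  "name": "maven-search-scraper",      // assumed worker name
  "main": "src/index.ts",
  "compatibility_date": "2024-01-01",  // placeholder date
  "triggers": {
    "crons": ["0 * * * *"]             // example schedule: hourly
  },
  "d1_databases": [
    {
      "binding": "DB",
      "database_name": "maven_search_prod",
      "database_id": "<your-database-id>"
    }
  ]
}
```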
You’ll also need to authenticate Wrangler for migrations and deployment:

```sh
bunx wrangler login
```

Generate and run migrations:
```sh
# Generate migrations from schema
bun drizzle-kit generate

# Run migrations on remote D1
bun drizzle-kit migrate
```

For local development, prefix commands with `LOCAL=1`:

```sh
LOCAL=1 bun drizzle-kit migrate  # Run migrations locally
LOCAL=1 bun drizzle-kit studio   # Open Drizzle Studio
```

Start the scraper locally and trigger a scrape:
```sh
bun run scrape:dev
curl http://localhost:8080/trigger
```

Run the web app locally:

```sh
bun run web:dev
```

Opens at http://localhost:3000.

The scraper runs as a Cloudflare Worker with:
- Cron trigger: Automated scheduled scraping
- HTTP trigger: Manual scrape via `GET /trigger`
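The two triggers above correspond to a Worker module exposing `scheduled` and `fetch` handlers. The sketch below is a minimal illustration, assuming a `scrape()` helper that stands in for the real scraping logic; the repo's implementation will differ.

```typescript
// Minimal sketch of the scraper Worker's entry points.
// scrape() is a hypothetical stand-in for the real scraping logic.
async function scrape(): Promise<number> {
  // ...fetch Maven listings and upsert into D1 (omitted)...
  return 0; // number of workshops processed (stubbed)
}

const worker = {
  // Cron trigger: invoked on the schedule configured in wrangler.jsonc
  async scheduled(): Promise<void> {
    await scrape();
  },
  // HTTP trigger: manual scrape via GET /trigger
  async fetch(request: Request): Promise<Response> {
    const url = new URL(request.url);
    if (request.method === "GET" && url.pathname === "/trigger") {
      const count = await scrape();
      return new Response(`Scraped ${count} workshops`, { status: 200 });
    }
    return new Response("Not found", { status: 404 });
  },
};
// (In the actual Worker, this object would be the module's default export.)
```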
## Code Quality

```sh
bun run check
bun run type-check
```

To auto-fix lint/format issues, run `bun run check:fix`.
## Deployment

Deploy the scraper as a Cloudflare Worker with scheduled cron triggers:

```sh
bun run scrape:deploy
```

Build and deploy the web app to Cloudflare Pages:

```sh
bun run web:deploy
```
## Scripts

| Script | Description |
|---|---|
| `bun install` | Install dependencies |
| `bun run web:dev` | Run the web app locally |
| `bun run scrape:dev` | Run the scraper locally |
| `bun run check` | Lint + format checks (Biome) |
| `bun run check:fix` | Auto-fix lint/format issues |
| `bun run type-check` | TypeScript type checking |

For the full list of scripts, see `package.json`.
## License

MIT License. See `LICENSE` for details.