🚌 Public Transport Demand Prediction using RWKV-Based Architecture

Author: Kaemon Ng Bao Leng (TP082265)
Course: Research Methods for Computing and Technology (CT098-3-2)
Institution: Asia Pacific University of Technology & Innovation (APU)


📄 Documents

| Document | Description |
|---|---|
| KAEMON NG BAO LENG PART1 RESUBMISSION.pdf | Part 1 – Project Proposal |
| KAEMON NG BAO LENG PART2.pdf | Part 2 – Literature Review & Methodology |

📌 Research Overview

This research investigates the effectiveness of a Receptance Weighted Key Value (RWKV)-based architecture in enhancing the operational efficiency of public transport systems.

Problem

Public transport operators struggle with supply-demand imbalances driven by dynamic urban data. Existing deep learning solutions such as the BERT Transformer Encoder achieve high prediction accuracy but suffer from quadratic computational complexity, O(N²), making real-time fleet management computationally expensive.
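To make the quadratic cost concrete, here is a minimal NumPy sketch of standard softmax self-attention (shapes and values are illustrative, not taken from the proposal): the N × N score matrix is what makes time and memory grow as O(N²) in sequence length.

```python
import numpy as np

def softmax_attention(Q, K, V):
    # Standard (BERT-style) self-attention: materializes an N x N
    # score matrix, so cost and memory scale as O(N^2) in N.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])          # shape (N, N)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # shape (N, d)

N, d = 512, 64                      # illustrative sequence length / dim
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((N, d)) for _ in range(3))
out = softmax_attention(Q, K, V)
print(out.shape)  # (512, 64)
```

Doubling N quadruples the size of the score matrix, which is the bottleneck the proposal's linear attention aims to remove.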

Proposed Solution

This research proposes an RWKV-based architecture built on a linear attention mechanism with O(N) complexity, combining:

  • Transformer's parallel training capability
  • RNN's constant-memory inference capability
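As a rough illustration of how the two properties above coexist, the sketch below implements a simplified RWKV-style WKV recurrence (a toy version, not the full published formulation; the decay `w` and bonus `u` parameters are illustrative assumptions): each step updates two fixed-size accumulators, so inference memory stays constant regardless of sequence length.

```python
import numpy as np

def wkv_recurrence(k, v, w, u):
    """Simplified RWKV-style WKV recurrence (sketch).

    Keeps only two running accumulators (num, den) of size d,
    so per-token inference uses constant memory: O(d), not O(N).
    """
    T, d = k.shape
    num = np.zeros(d)            # running weighted sum of values
    den = np.zeros(d)            # running sum of weights
    out = np.empty((T, d))
    for t in range(T):
        bonus = np.exp(u + k[t])              # current token's weight
        out[t] = (num + bonus * v[t]) / (den + bonus)
        num = np.exp(-w) * num + np.exp(k[t]) * v[t]  # decay past values
        den = np.exp(-w) * den + np.exp(k[t])         # decay past weights
    return out

T, d = 16, 8                     # illustrative sizes
rng = np.random.default_rng(1)
k, v = rng.standard_normal((T, d)), rng.standard_normal((T, d))
out = wkv_recurrence(k, v, w=np.ones(d), u=np.zeros(d))
print(out.shape)  # (16, 8)
```

Because the state is a fixed-size pair of vectors rather than a growing attention matrix, this form of the computation is what makes lightweight, real-time inference feasible.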

Research Objectives

  1. Examine how public transport operations are affected by inaccurate demand prediction
  2. Investigate the shortcomings of BERT Transformer Encoder in public transport
  3. Propose RWKV to improve prediction accuracy and lower resource consumption

🔬 Methodology

  • Direct Users: Public transport operators, fleet managers, dispatchers
  • Indirect Users: Daily commuters, urban policymakers
  • Sampling: Purposive sampling (operators) + Cluster sampling (commuters)
  • Data Collection: Survey forms (quantitative) + Interviews (qualitative)

⚖️ RWKV vs BERT Transformer Encoder

| Aspect | BERT Transformer Encoder | RWKV |
|---|---|---|
| Complexity | O(N²) | O(N) |
| Memory | Grows quadratically | Constant |
| Training | Parallel | Parallel |
| Inference | Heavy | Lightweight |
| Real-time use | Difficult | Feasible |

🔑 Key Terms

RWKV · Deep Learning · Public Transport · Demand Prediction · BERT · Transformer · Linear Attention · Sustainable Development Goals (SDG11)
