I Drive is a cloud storage system and online file browser that stores files on Discord.
It's basically like Google Drive, except all of your files are stored on Discord.
It's available at https://idrive.pamparampam.dev
Credentials: demo/demo
Sorry, no demo currently.
| Feature | Support |
|---|---|
| Login & Permission system | ✅ |
| Full file encryption | ✅ |
| Online streaming and viewing of files without downloading | ✅ |
| Locked folders | ✅ |
| Bulk zip download | ✅ |
| Mobile support | ✅ |
| Share files & folders | ✅ |
| Delete/move/rename files & folders | ✅ |
| Search | ✅ |
| Supports Polish & English languages | ✅ |
| Code editor with highlighting | ✅ |
| Docker support | ✅ |
| Dark theme | ✅ |
| Virtual lists to render tens of thousands of files in a single folder | ✅ |
| And a LOT more features! | ✅ |
For a list of bugs and planned features, click here.
In essence, I Drive takes your uploaded files and splits them into chunks that fit within Discord's 10 MB attachment size limit. The chunks are then encrypted and uploaded to Discord. After the upload is done, the file's metadata is sent to the backend and stored in a central database. This allows for a simple way of viewing, managing, and downloading your files.
In reality, the frontend does a LOT more than just splitting the file into chunks. It has to:
- Calculate CRC checksums
- Generate metadata
- Generate thumbnails
- Extract subtitles
- Encrypt the file
- And more!
The same thing applies to pretty much every part of this app. Even if something looks simple at first glance, it's most likely pretty complicated under the hood. After all, this entire project has more than 36k lines of code, plus another 2k lines of configuration and translations.
I Drive is made up of 5 main components.
The frontend is made with Vue 3 + Vite. Vue Router is used for routing and Pinia for global state management. It's then built and served statically by NGINX.
Why Vue? Its data-driven approach makes it ideal for an application whose DOM is driven by the underlying data.
The main backend is made with 🐍 Python, Django, Daphne, Channels, and Django REST Framework 🐍 It's responsible for authenticating users and communicating with the database. It exposes a REST API to both serve and modify data, with more than 80 different endpoints.
The backend uses websockets to communicate data changes to the clients. A list of all websocket events can be found here.
It's also responsible for streaming files from Discord. It supports partial (range) requests, streaming, in-browser video/audio seeking, and decryption.
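Partial requests boil down to parsing the HTTP `Range` header and slicing the byte stream accordingly. A minimal sketch of that parsing step (the function name and simplified error handling are illustrative, not I Drive's actual code):

```python
import re

def parse_range(header, file_size):
    """Parse a 'bytes=start-end' Range header into an inclusive (start, end).

    Returns None for malformed or unsatisfiable ranges. Multi-range
    requests (e.g. 'bytes=0-1,5-9') are not handled in this sketch.
    """
    m = re.fullmatch(r"bytes=(\d*)-(\d*)", header or "")
    if not m or (not m.group(1) and not m.group(2)):
        return None
    if m.group(1):
        start = int(m.group(1))
        # Open-ended range ('bytes=500-') runs to the last byte.
        end = int(m.group(2)) if m.group(2) else file_size - 1
    else:
        # Suffix range ('bytes=-500') means the last N bytes.
        start = max(file_size - int(m.group(2)), 0)
        end = file_size - 1
    if start >= file_size or start > end:
        return None
    return start, min(end, file_size - 1)
```

The server then responds with `206 Partial Content` and only the requested byte window, which is what makes in-browser seeking possible without downloading the whole file.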
Thanks to a custom zipFly library, it also supports streaming zip files "on the fly".
Postgres is currently used as the database.
Redis is used as a fast in-memory database for caching and as the message broker for Celery. It also serves as the channel layer for Django Channels websockets.
Celery is an asynchronous task queue for delegating long tasks, like file deletion, outside of the HTTP request lifecycle.
On average, Discord allows a single bot to make 1 request per second, and that's way too little! That's why, for I Drive to work, a single user needs at least a few bots; this way the backend can switch between tokens and work around Discord's rate limits. The same thing applies to Discord channels and webhooks. Sadly, Discord still groups all requests per IP as well, so the rate limits are still sometimes hit.
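The token-switching idea is essentially round-robin rotation over a pool of bots/webhooks. A minimal sketch, with made-up class and method names for illustration:

```python
import itertools
import threading

class TokenRotator:
    """Cycle through a pool of bot tokens or webhook URLs so that no
    single one exceeds Discord's roughly 1-request-per-second budget."""

    def __init__(self, tokens):
        if not tokens:
            raise ValueError("need at least one token")
        self._cycle = itertools.cycle(tokens)
        self._lock = threading.Lock()  # safe to share across worker threads

    def next_token(self):
        """Return the next token in round-robin order."""
        with self._lock:
            return next(self._cycle)
```

With N tokens in the pool, the backend as a whole can sustain roughly N requests per second, even though each individual token stays within its own limit.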
Discord issues Cloudflare bans if you make more than 10k 4xx requests in 10 minutes. I Drive tries to avoid this as much as possible, including returning 502 errors when it can't handle more requests.
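One way to stay under such a threshold is a sliding-window counter over recent 4xx responses, shedding load (the 502s mentioned above) before the ban limit is reached. This is an illustrative sketch, not I Drive's actual mechanism:

```python
import collections
import time

class ErrorBudget:
    """Track 4xx responses in a sliding time window and signal when to
    shed load, staying below a ban threshold like Discord/Cloudflare's
    ~10k errors per 10 minutes."""

    def __init__(self, limit=10_000, window=600.0):
        self.limit = limit      # max errors allowed in the window
        self.window = window    # window length in seconds
        self._events = collections.deque()

    def record(self, now=None):
        """Record one 4xx response."""
        self._events.append(time.monotonic() if now is None else now)

    def should_shed(self, now=None):
        """True if the budget is exhausted and we should answer 502."""
        now = time.monotonic() if now is None else now
        # Drop events that have aged out of the window.
        while self._events and self._events[0] <= now - self.window:
            self._events.popleft()
        return len(self._events) >= self.limit
```

A request handler would call `record()` on every 4xx from Discord and check `should_shed()` before issuing new upstream requests.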
Why are webhooks needed? Why not use Discord bots to upload files?
Discord bots are, in my opinion, too powerful to send their tokens back and forth in the browser. In an unlikely situation, a third party could steal a bot's token and access all files (encrypted or not) stored on a Discord server. Discord bots, if given too many permissions, would also allow for easy raiding and griefing.
Webhooks, on the other hand, can only send messages and delete/modify their own.
It's written in Python, what do you expect? Rewrite it in Rust!
I Drive is fully dockerized! Yay. There are 4 containers managed by Docker Compose:
- Backend, containing the backend server and Celery
- Nginx, responsible for reverse proxying, caching, and serving the static frontend files
- Redis
- Postgres
- Create a fresh directory, and in it:
- Create a `docker-compose.yml` file. Copy content from here to it.
- Create an `nginx.conf` file. Copy content from here to it.
- Create a `.env` file and copy these values:

  ```
  IS_DEV_ENV=True
  PROTOCOL=http
  DEPLOYMENT_HOST=localhost
  NGINX_PORT=80
  BACKEND_SECRET_KEY=very_secret_key
  BACKEND_BASE_URL=http://localhost
  REDIS_PASSWORD=1234
  POSTGRES_USER=admin
  POSTGRES_PASSWORD=1234
  BACKEND_PORT=8001
  ```

- Run `docker-compose up`
- Run `docker exec -it idrive-backend bash`
- Run `python manage.py migrate website` to set up the database
- Run `python manage.py createsuperuser` to create an admin user
- Go to your browser and type `localhost`
You need Python 3.11 and Node.js installed. Tested on Node v20.10.0.
- Clone this repository:

  ```
  git clone https://github.com/pam-param-pam/I-Drive
  ```

- Start Redis:

  ```
  docker run -d --name dev_idrive_redis -p 6379:6379 redis:latest redis-server --requirepass 1234
  ```

- Start Postgres:

  ```
  docker run -d --name dev_idrive_postgres -e POSTGRES_DB=dev_idrive_postgres -e POSTGRES_USER=admin -e POSTGRES_PASSWORD=1234 -p 5432:5432 -v dev_idrive_postgres_data:/var/lib/postgresql/data postgres:16
  ```

- Navigate to the cloned repo. Find the `frontend` dir. In it, create a `.env` file and put these variables:

  ```
  VITE_BACKEND_BASE_URL=http://localhost:8000
  VITE_BACKEND_BASE_WS=ws://localhost:8000
  ```

- Inside the `frontend` dir, run these commands:
  - `npm install` to install all requirements
  - `npm run dev -- --host 0.0.0.0 --port 5173` to start the frontend dev server

- Navigate back to the cloned repo root. Find the `backend` dir. In it, create a `.env` file and put these variables:

  ```
  IS_DEV_ENV=True
  PROTOCOL=http
  DEPLOYMENT_HOST=localhost
  NGINX_PORT=80
  BACKEND_SECRET_KEY=very_secret_key
  BACKEND_BASE_URL=http://localhost:8000
  REDIS_PASSWORD=1234
  REDIS_ADDRESS=localhost
  REDIS_PORT=6379
  POSTGRES_ADDRESS=localhost
  POSTGRES_PORT=5432
  POSTGRES_NAME=dev_idrive_postgres
  POSTGRES_USER=admin
  POSTGRES_PASSWORD=1234
  ```

- Inside the `backend` dir, run these commands.

  If on Windows:

  ```
  # 1. Create virtual environment
  py -3.11 -m venv .venv
  # 2. Activate the virtual environment
  .venv\Scripts\activate
  # 3. Install dependencies
  pip install -r requirements.txt
  # 4. Run migrations
  python manage.py migrate
  # 5. Create admin user
  python manage.py createsuperuser
  # 6. Start backend dev server
  python manage.py runserver 0.0.0.0:8000
  # 7. start both celeries #todo
  ```

  If on macOS/Linux:

- Everything should work now; head over to `localhost:5173` to see the website.
Dear Discord, please don't sue me 👉👈