AnoUpload is a compact, privacy-minded file uploader built with Node.js, Express and Multer. Version 2.0 focuses on reliability, a modern UI, and large-file resilience (client chunking + server reassembly).
- Modernized single-page UI with dark purple theme, responsive layout and mobile support
- Optional Discord notifications: the server no longer requires `DISCORD_WEBHOOK_URL` (notifications are skipped when it is unset)
- Robust `MAX_CONTENT_LENGTH` parsing (supports `1gb`, `1024MB`, etc.) and a human-readable `/config` endpoint
- Sequential uploads and client-side chunking (default 8 MB chunks) with server-side `/upload-chunk` support
- Improved error handling and debug logging (413 JSON responses include configured limits and bytes received)
- Local upload history (browser localStorage), drag-and-drop, previews, and copy/open shortcuts
- Runtime: Node.js + Express
- Upload handling: Multer (disk storage) + chunk reassembly endpoint
- Storage: Local filesystem by default, optional GitHub-backed storage
Key endpoints
- `GET /` → web UI (`public/index.html`)
- `GET /config` → server limits and storage mode
- `POST /` → single-file upload (small files)
- `POST /upload-chunk` → appendable chunk upload for large files
- `GET /uploads/:filename` → serve uploaded files
- Clone and install:

  ```bash
  git clone https://github.com/FreeCode911/AnoUpload.git
  cd AnoUpload
  npm install
  ```

- Create a `.env` file in the project root and set the minimum variables you need:
  ```env
  UPLOAD_FOLDER=uploads
  MAX_CONTENT_LENGTH=1gb    # accepts human formats like 1gb, 1024MB, 1000000000
  USE_GITHUB=false          # true to push to GitHub instead of local disk
  GITHUB_TOKEN=             # optional, required if USE_GITHUB=true
  GITHUB_REPO=              # optional
  DISCORD_WEBHOOK_URL=none  # optional; leave unset or "none" to disable notifications
  ```
- Start the server:

  ```bash
  npm start
  ```

- Open the web UI in your browser at the address printed by the server (or visit `http://localhost:8080`).
- The client uploads files one by one; files larger than the chunk threshold are split into chunks (default 8 MB).
- Each chunk is posted to `POST /upload-chunk` with metadata (`uploadId`, `filename`, `index`, `total`, `isLast`).
- The server appends chunks to a temporary file and finalizes the upload when the last chunk arrives.
This avoids many upstream single-request size limits. If your host still rejects chunked POSTs, try reducing the chunk size or increasing proxy limits (e.g., NGINX `client_max_body_size`).
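The client-side split described above can be sketched like this. The 8 MB default and the metadata field names come from this README; the helper `makeChunks` itself is a hypothetical illustration, not the project's actual client code.

```javascript
const CHUNK_SIZE = 8 * 1024 * 1024; // default chunk size: 8 MB

// Hypothetical helper: split a file buffer into ordered chunks, each
// carrying the metadata that a POST /upload-chunk request would include.
function makeChunks(buffer, filename, uploadId, chunkSize = CHUNK_SIZE) {
  const total = Math.max(1, Math.ceil(buffer.length / chunkSize));
  const chunks = [];
  for (let index = 0; index < total; index++) {
    chunks.push({
      uploadId,
      filename,
      index,
      total,
      isLast: index === total - 1, // tells the server to finalize
      data: buffer.subarray(index * chunkSize, (index + 1) * chunkSize),
    });
  }
  return chunks;
}
```

Each entry would then be posted sequentially; the server only finalizes when it sees `isLast: true`.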
- `UPLOAD_FOLDER` — directory to store uploads (default: `uploads`)
- `MAX_CONTENT_LENGTH` — maximum file size accepted by the server (bytes, or human-readable like `1gb`)
- `USE_GITHUB` — `true` to push files to GitHub instead of disk
- `GITHUB_TOKEN`, `GITHUB_REPO` — required when `USE_GITHUB=true`
- `DISCORD_WEBHOOK_URL` — optional; leave unset or `none` to disable notifications
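A human-readable size parser along the lines the README describes could look like the sketch below. This is an assumption-laden illustration, not the server's actual implementation; it accepts plain byte counts plus `kb`/`mb`/`gb`/`tb` suffixes using 1024-based units.

```javascript
// Hypothetical parser for values like "1gb", "1024MB", or "1000000000".
const UNITS = { b: 1, kb: 1024, mb: 1024 ** 2, gb: 1024 ** 3, tb: 1024 ** 4 };

function parseSize(value) {
  const m = String(value).trim().toLowerCase().match(/^(\d+(?:\.\d+)?)\s*([kmgt]?b)?$/);
  if (!m) throw new Error(`Unparseable size: ${value}`);
  return Math.floor(Number(m[1]) * UNITS[m[2] || 'b']); // bare numbers are bytes
}
```

Note that with 1024-based units, `1gb` and `1024MB` parse to the same byte count, matching the equivalent example values in the `.env` comments above.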
The server exposes `GET /config`, which returns `maxContentLength` and `maxContentHuman` for client-side enforcement.
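Client-side enforcement against that response could be sketched as follows. The field names `maxContentLength` and `maxContentHuman` come from the README; the pre-flight check itself is a hypothetical example.

```javascript
// Hypothetical pre-flight check a client might run before uploading,
// using the limits fetched from GET /config.
function checkFileSize(fileSize, config) {
  if (fileSize > config.maxContentLength) {
    return { ok: false, reason: `File exceeds server limit of ${config.maxContentHuman}` };
  }
  return { ok: true };
}
```

Rejecting oversized files in the browser avoids wasting bandwidth on uploads the server would refuse with a 413 anyway.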
- 413 Request Entity Too Large (an HTML nginx 413 page returned immediately): the upstream proxy likely rejected the request before Node received the body. Use client chunking or increase the proxy limits on the host.
- If server logs show `bytesReceived` ≈ 0 for a failed upload, that confirms the upstream rejection.
- Main files:
  - `index.js` — server entrypoint, parsing, chunk reassembly and debug logging
  - `public/index.html` — UI, drag/drop, chunked-upload client and history
  - `discordWebhook.js` — optional webhook notifier
- 2.0.0 — 2025-08-24
  - UI refresh, optional webhook, `MAX_CONTENT_LENGTH` improvements, client chunking + `/upload-chunk`, enhanced logging
GPL-3.0 — see LICENSE