Switch to a relational database (#637)
* It boots from SQL

* progress on loading playlists

* Use uppercase ID

* Search based on SQL media

* add userID to history entry schema

* stash

* Migrate history, read history

* Typed IDs; move mostly to new schema types

* Migrate authentication model to SQL

* Update unique constraints

* Fix lodash import

* Select the right stuff from users

* Use Object.groupBy

* Use order column for playlist sorting?

The other option is to have a JSON column with IDs on the playlists
table (a sketch of that approach follows the changed-file summary below).

* Add linting for JSDoc

* SQL config store

* stash

* Bump kysely

* Different way to store playlist item order

* Opaque -> Tagged

* Port bans

* deps: update better-sqlite3

* Remove mongodb connection code

* Adding playlist items with sql?

* Revert "Remove mongodb connection code"

This reverts commit 8b2ae37.

* Make migrations work in sql

* Try with SQLite

* Migrate auth passwords

* Better Date support for SQLite

* Use json_each

* use json_array_length

* SQLite utility functions

* Fix property name in test

* playlist shuffle and cycle with sqlite

* Use a flat list of permissions

* Various test fixes

* Ban test sorta working

* small test fixes

* acl fixes

* some more json sqlite fixes

* serialize active playlist id

* Implement playlist updates with sql

* More JSON fun

* users test fixes

* test fixes for bans and /now

* finish redis connection before changing configs

* User avatar / roles return values

* test ID fix

* Fix playlist item serialization

* implement removing playlist items

* put comment

* Fix issues due to playlist position options

* disable sql query logging

* various sql booth fixes

* Test fixes by moving to new data structure

* Inline the email function

* Fix email test

* This map is a multi map

* fix playlist item filter

* fix running into apparent bound param limit

* Fix serializing media items

* check passwords

* various type fixes

* Fix populating moderator in getBans

* Produce JSON-compatible type in serializers

* Miscellaneous type fixes

* Port favouriting

* Types in playlist advance

* Update lint settings

* Lint autofix

* slight jank but it's ok

* Type fixes post merge

* Only connect to mongo in the migration

* Remove mongo from tests

* Backwards compatibility for /api/now.roles

* Implement votes

* Move sqlite plugins etc into utils/sqlite

* Lint migration

* Fix vote queries in booth plugin

* Mutes from redis to sqlite

* Optionally run most functions in a transaction

* deps: update node types

* Record favorites in feedback table

* remove jsdoc linting

it gets in the way quite a bit. maybe later?

* Use config store for MOTD

* silence logs in tests

* use pino-pretty

* Fix vote test, do not re-emit unchanged vote submissions

* Fix timezone confusion in bans

* explicitly store UTC in sqlite

* Run eslint --fix

* Address lints

* Disable lints that I can't get to work

* ci: add node 22
goto-bus-stop authored Nov 22, 2024
1 parent 2a315bb commit 8765523
Showing 87 changed files with 3,261 additions and 2,012 deletions.
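
Several of the commits above weigh how to store playlist item order (an order column versus a JSON array of item IDs on the playlists table) and mention SQLite's json_each and json_array_length. Below is a minimal sketch of the JSON-array variant using Kysely's sql tag; the table and column names are assumptions for illustration, not necessarily what the PR settled on.

import { sql } from 'kysely';

// Hypothetical schema: playlists.items holds a JSON array of playlist item IDs.
async function getPlaylistSize(db, playlistID) {
  const { rows } = await sql`
    SELECT json_array_length(items) AS size
    FROM playlists
    WHERE id = ${playlistID}
  `.execute(db);
  return rows[0]?.size ?? 0;
}

// json_each() expands the JSON array into rows; ordering by its "key" column
// preserves the array order, so item order lives entirely in the JSON column.
async function getOrderedItems(db, playlistID) {
  const { rows } = await sql`
    SELECT playlist_items.*
    FROM playlists, json_each(playlists.items) AS entry
    JOIN playlist_items ON playlist_items.id = entry.value
    WHERE playlists.id = ${playlistID}
    ORDER BY entry.key
  `.execute(db);
  return rows;
}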
11 changes: 2 additions & 9 deletions .github/workflows/ci.yml
@@ -35,19 +35,12 @@ jobs:
     name: Tests
     strategy:
       matrix:
-        node-version: ['18.x', '20.x']
-        mongo-version: ['6.0', '7.0']
-        include:
-          - node-version: 'lts/*'
-            mongo-version: '5.0'
+        node-version: ['18.x', '20.x', '22.x']
     runs-on: ubuntu-latest
     services:
       redis:
         image: redis:6
         ports: ['6379:6379']
-      mongodb:
-        image: mongo:${{matrix.mongo-version}}
-        ports: ['27017:27017']
     steps:
       - name: Checkout sources
         uses: actions/checkout@v4
@@ -63,7 +56,7 @@ jobs:
           REDIS_URL: '127.0.0.1:6379'
           MONGODB_HOST: '127.0.0.1:27017'
       - name: Submit coverage
-        if: matrix.node-version == '20.x' && matrix.mongo-version == '7.0'
+        if: matrix.node-version == '20.x'
         uses: coverallsapp/github-action@v2.3.1
         with:
           github-token: ${{secrets.GITHUB_TOKEN}}
4 changes: 4 additions & 0 deletions .gitignore
@@ -35,3 +35,7 @@ package-lock.json
 .nyc_output
 
 .env
+
+*.sqlite
+*.sqlite-shm
+*.sqlite-wal
13 changes: 10 additions & 3 deletions package.json
@@ -32,6 +32,7 @@
     "ajv-formats": "^3.0.1",
     "avvio": "^9.0.0",
     "bcryptjs": "^2.4.3",
+    "better-sqlite3": "^11.2.1",
     "body-parser": "^1.19.0",
     "cookie": "^1.0.1",
     "cookie-parser": "^1.4.4",
@@ -47,15 +48,18 @@
     "ioredis": "^5.0.1",
     "json-merge-patch": "^1.0.2",
     "jsonwebtoken": "^9.0.0",
+    "kysely": "^0.27.3",
     "lodash": "^4.17.15",
     "minimist": "^1.2.5",
     "mongoose": "^8.6.2",
     "ms": "^2.1.2",
     "node-fetch": "^3.3.1",
     "nodemailer": "^6.4.2",
+    "object.groupby": "^1.0.1",
     "passport": "^0.5.0",
     "passport-google-oauth20": "^2.0.0",
     "passport-local": "^1.0.0",
+    "pg": "^8.10.0",
     "pino": "^9.0.0",
     "pino-http": "^10.1.0",
     "qs": "^6.9.1",
@@ -76,6 +80,7 @@
     "@eslint/js": "^9.12.0",
     "@tsconfig/node18": "^18.2.2",
     "@types/bcryptjs": "^2.4.2",
+    "@types/better-sqlite3": "^7.6.4",
     "@types/cookie": "^1.0.0",
     "@types/cookie-parser": "^1.4.2",
     "@types/cors": "^2.8.10",
@@ -90,9 +95,11 @@
     "@types/node": "~18.18.0",
     "@types/node-fetch": "^2.5.8",
     "@types/nodemailer": "^6.4.1",
+    "@types/object.groupby": "^1.0.3",
     "@types/passport": "^1.0.6",
     "@types/passport-google-oauth20": "^2.0.7",
     "@types/passport-local": "^1.0.33",
+    "@types/pg": "^8.6.6",
     "@types/qs": "^6.9.6",
     "@types/random-string": "^0.2.0",
     "@types/ratelimiter": "^3.4.1",
@@ -112,7 +119,7 @@
     "mocha": "^10.0.0",
     "nock": "^13.2.0",
     "nodemon": "^3.0.1",
-    "pino-colada": "^2.2.2",
+    "pino-pretty": "^11.2.2",
     "recaptcha-test-keys": "^1.0.0",
     "sinon": "^19.0.2",
     "supertest": "^7.0.0",
@@ -123,8 +130,8 @@
   "scripts": {
     "lint": "eslint --cache .",
     "test": "npm run tests-only && npm run lint",
-    "tests-only": "c8 --reporter lcov --src src mocha --exit",
+    "tests-only": "c8 --reporter lcov --src src mocha",
     "types": "tsc -p tsconfig.json",
-    "start": "nodemon dev/u-wave-dev-server.js | pino-colada"
+    "start": "nodemon dev/u-wave-dev-server.js | pino-pretty"
   }
 }
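
For context, the new runtime dependencies fit together roughly as follows. This is a minimal sketch assuming Kysely's SqliteDialect wrapping a better-sqlite3 database (pg is presumably added for an optional PostgresDialect); the file name and column names are illustrative, not taken from the PR.

import SQLite from 'better-sqlite3';
import { Kysely, SqliteDialect } from 'kysely';

// Open (or create) the SQLite database file and hand it to Kysely as a dialect.
const db = new Kysely({
  dialect: new SqliteDialect({
    database: new SQLite('uwave_local.sqlite'),
  }),
});

// Typical typed query-builder usage against the new relational schema.
const users = await db
  .selectFrom('users')
  .select(['id', 'username'])
  .where('username', '=', 'admin')
  .execute();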
4 changes: 2 additions & 2 deletions src/AuthRegistry.js
@@ -15,7 +15,7 @@ class AuthRegistry {
   }
 
   /**
-   * @param {import('./models/index.js').User} user
+   * @param {import('./schema.js').User} user
    */
   async createAuthToken(user) {
     const token = (await randomBytes(64)).toString('hex');
@@ -42,7 +42,7 @@
       throw err;
     }
 
-    return userID;
+    return /** @type {import('./schema.js').UserID} */ (userID);
   }
 }
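
The cast to UserID reflects the "Typed IDs" and "Opaque -> Tagged" commits: IDs stay plain strings at runtime but carry a type-level tag. A rough sketch of that pattern, assuming type-fest's Tagged helper; the real definitions live in src/schema.js, which is not shown in this excerpt.

/**
 * @typedef {import('type-fest').Tagged<string, 'UserID'>} UserID
 */

/**
 * Only accepts IDs that have been tagged as user IDs.
 *
 * @param {UserID} id
 */
function cacheKeyForUser(id) {
  return `user:${id}`;
}

// Plain strings no longer satisfy UserID for the type checker; values are cast
// once at a boundary, as in the AuthRegistry change above.
const userID = /** @type {UserID} */ ('3f8b1c2e-0d4a-4f6b-9a7e-1234567890ab');
console.log(cacheKeyForUser(userID));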
1 change: 0 additions & 1 deletion src/HttpApi.js
@@ -65,7 +65,6 @@ function defaultCreatePasswordResetEmail({ token, requestUrl }) {
  * @prop {import('nodemailer').Transport} [mailTransport]
  * @prop {(options: { token: string, requestUrl: string }) =>
  *   import('nodemailer').SendMailOptions} [createPasswordResetEmail]
- *
  * @typedef {object} HttpApiSettings - Runtime options for the HTTP API.
  * @prop {string[]} allowedOrigins
  */
41 changes: 21 additions & 20 deletions src/SocketServer.js
@@ -1,5 +1,4 @@
 import { promisify } from 'node:util';
-import mongoose from 'mongoose';
 import lodash from 'lodash';
 import sjson from 'secure-json-parse';
 import { WebSocketServer } from 'ws';
@@ -15,10 +14,9 @@ import LostConnection from './sockets/LostConnection.js';
 import { serializeUser } from './utils/serialize.js';
 
 const { debounce, isEmpty } = lodash;
-const { ObjectId } = mongoose.mongo;
 
 /**
- * @typedef {import('./models/index.js').User} User
+ * @typedef {import('./schema.js').User} User
  */
 
 /**
@@ -109,6 +107,7 @@ class SocketServer {
 
   /**
    * Handlers for commands that come in from clients.
+   *
    * @type {ClientActions}
    */
   #clientActions;
@@ -206,7 +205,7 @@ class SocketServer {
       logout: (user, _, connection) => {
        this.replace(connection, this.createGuestConnection(connection.socket));
        if (!this.connection(user)) {
-          disconnectUser(this.#uw, user._id);
+          disconnectUser(this.#uw, user.id);
        }
      },
    };
@@ -231,7 +230,6 @@ class SocketServer {
      this.broadcast('advance', {
        historyID: next.historyID,
        userID: next.userID,
-        itemID: next.itemID,
        media: next.media,
        playedAt: new Date(next.playedAt).getTime(),
      });
@@ -448,19 +446,22 @@ class SocketServer {
   /**
    * Create `LostConnection`s for every user that's known to be online, but that
    * is not currently connected to the socket server.
+   *
    * @private
    */
   async initLostConnections() {
-    const { User } = this.#uw.models;
-    const userIDs = await this.#uw.redis.lrange('users', 0, -1);
-    const disconnectedIDs = userIDs
-      .filter((userID) => !this.connection(userID))
-      .map((userID) => new ObjectId(userID));
-
-    /** @type {User[]} */
-    const disconnectedUsers = await User.find({
-      _id: { $in: disconnectedIDs },
-    }).exec();
+    const { db, redis } = this.#uw;
+    const userIDs = /** @type {import('./schema').UserID[]} */ (await redis.lrange('users', 0, -1));
+    const disconnectedIDs = userIDs.filter((userID) => !this.connection(userID));
+
+    if (disconnectedIDs.length === 0) {
+      return;
+    }
+
+    const disconnectedUsers = await db.selectFrom('users')
+      .where('id', 'in', disconnectedIDs)
+      .selectAll()
+      .execute();
     disconnectedUsers.forEach((user) => {
       this.add(this.createLostConnection(user));
     });
@@ -556,7 +557,7 @@ class SocketServer {
     connection.on('close', ({ banned }) => {
       if (banned) {
         this.#logger.info({ userId: user.id }, 'removing connection after ban');
-        disconnectUser(this.#uw, user._id);
+        disconnectUser(this.#uw, user.id);
       } else if (!this.#closing) {
         this.#logger.info({ userId: user.id }, 'lost connection');
         this.add(this.createLostConnection(user));
@@ -602,7 +603,7 @@ class SocketServer {
       // Only register that the user left if they didn't have another connection
       // still open.
       if (!this.connection(user)) {
-        disconnectUser(this.#uw, user._id);
+        disconnectUser(this.#uw, user.id);
       }
     });
     return connection;
@@ -659,7 +660,7 @@ class SocketServer {
    *
    * @param {string} channel
    * @param {string} rawCommand
-   * @return {Promise<void>}
+   * @returns {Promise<void>}
    * @private
    */
   async onServerMessage(channel, rawCommand) {
@@ -686,7 +687,7 @@ class SocketServer {
   /**
    * Stop the socket server.
    *
-   * @return {Promise<void>}
+   * @returns {Promise<void>}
    */
   async destroy() {
     clearInterval(this.#pinger);
@@ -707,7 +708,7 @@ class SocketServer {
    * Get the connection instance for a specific user.
    *
    * @param {User|string} user The user.
-   * @return {Connection|undefined}
+   * @returns {Connection|undefined}
    */
   connection(user) {
     const userID = typeof user === 'object' ? user.id : user;
39 changes: 25 additions & 14 deletions src/Source.js
@@ -1,31 +1,41 @@
 import { SourceNoImportError } from './errors/index.js';
 
 /**
- * @typedef {import('./models/index.js').User} User
- * @typedef {import('./models/index.js').Playlist} Playlist
+ * @typedef {import('./schema.js').User} User
+ * @typedef {import('./schema.js').Playlist} Playlist
  * @typedef {import('./plugins/playlists.js').PlaylistItemDesc} PlaylistItemDesc
  */
 
+/**
+ * @typedef {{
+ *   sourceType: string,
+ *   sourceID: string,
+ *   sourceData: import('type-fest').JsonObject | null,
+ *   artist: string,
+ *   title: string,
+ *   duration: number,
+ *   thumbnail: string,
+ * }} SourceMedia
+ */
+
 /**
  * @typedef {object} SourcePluginV1
  * @prop {undefined|1} api
- * @prop {(ids: string[]) => Promise<PlaylistItemDesc[]>} get
- * @prop {(query: string, page: unknown, ...args: unknown[]) => Promise<PlaylistItemDesc[]>} search
+ * @prop {(ids: string[]) => Promise<SourceMedia[]>} get
+ * @prop {(query: string, page: unknown, ...args: unknown[]) => Promise<SourceMedia[]>} search
 * @prop {(context: ImportContext, ...args: unknown[]) => Promise<unknown>} [import]
 *
 * @typedef {object} SourcePluginV2
 * @prop {2} api
- * @prop {(context: SourceContext, ids: string[]) => Promise<PlaylistItemDesc[]>} get
+ * @prop {(context: SourceContext, ids: string[]) => Promise<SourceMedia[]>} get
 * @prop {(
 *   context: SourceContext,
 *   query: string,
 *   page: unknown,
 *   ...args: unknown[]
- * ) => Promise<PlaylistItemDesc[]>} search
+ * ) => Promise<SourceMedia[]>} search
 * @prop {(context: ImportContext, ...args: unknown[]) => Promise<unknown>} [import]
 * @prop {(context: SourceContext, entry: PlaylistItemDesc) =>
 *   Promise<import('type-fest').JsonObject>} [play]
 *
 * @typedef {SourcePluginV1 | SourcePluginV2} SourcePlugin
 */
@@ -61,7 +71,7 @@ class ImportContext extends SourceContext {
    * @returns {Promise<Playlist>} Playlist model.
    */
   async createPlaylist(name, itemOrItems) {
-    const playlist = await this.uw.playlists.createPlaylist(this.user, { name });
+    const { playlist } = await this.uw.playlists.createPlaylist(this.user, { name });
 
     const rawItems = Array.isArray(itemOrItems) ? itemOrItems : [itemOrItems];
     const items = this.source.addSourceType(rawItems);
@@ -101,8 +111,9 @@ class Source {
    * Media items can provide their own sourceType, too, so media sources can
    * aggregate items from different source types.
    *
-   * @param {Omit<PlaylistItemDesc, 'sourceType'>[]} items
-   * @returns {PlaylistItemDesc[]}
+   * @template T
+   * @param {T[]} items
+   * @returns {(T & { sourceType: string })[]}
    */
   addSourceType(items) {
     return items.map((item) => ({
@@ -116,7 +127,7 @@
    *
    * @param {User} user
    * @param {string} id
-   * @returns {Promise<PlaylistItemDesc?>}
+   * @returns {Promise<SourceMedia?>}
    */
   getOne(user, id) {
     return this.get(user, [id])
@@ -128,7 +139,7 @@
    *
    * @param {User} user
    * @param {string[]} ids
-   * @returns {Promise<PlaylistItemDesc[]>}
+   * @returns {Promise<SourceMedia[]>}
    */
   async get(user, ids) {
     let items;
@@ -150,7 +161,7 @@
    * @param {string} query
    * @param {TPagination} [page]
    * @param {unknown[]} args
-   * @returns {Promise<PlaylistItemDesc[]>}
+   * @returns {Promise<SourceMedia[]>}
    */
   async search(user, query, page, ...args) {
     let results;
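
With PlaylistItemDesc replaced by SourceMedia in these signatures, a media source resolves IDs straight to SourceMedia objects. A minimal sketch of an api: 2 plugin shaped to the new typedef; every identifier and value here is made up for illustration.

/** A toy media source; not a real u-wave plugin. */
const exampleSource = {
  api: 2,

  /** Resolve source-specific IDs to SourceMedia objects. */
  async get(context, ids) {
    return ids.map((id) => ({
      sourceType: 'example',
      sourceID: id,
      sourceData: null,
      artist: 'Example Artist',
      title: `Example Track ${id}`,
      duration: 180,
      thumbnail: 'https://example.com/thumb.jpg',
    }));
  },

  /** Search just wraps get() in this toy example. */
  async search(context, query) {
    return this.get(context, [query]);
  },
};

export default exampleSource;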