FAQ
- It requires minimal installation. I don't need to download and maintain Docker, tamper with Dockerfile/docker-compose.yml files, avoid name or port conflicts, monitor logging output, or start/restart/stop containers. Even if all of that can be done relatively easily, there are still pain points.
- It's easy to keep everything clean. There are no environment variables, ports, massive memory allocations, file writes/reads, or mile-long Docker logs. There's zero risk of ever overwriting data by accident. You don't need to incessantly run migration scripts, and you need less work to handle the queries that repopulate mock databases. Nor is there a process taking up disk space or memory in the background once you close the project.
- It's fast. While in-memory queries should be considerably faster, I don't actually think that's the main reason: you can get your mock database up and running from an uninstalled folder in literal seconds, and close it down in a millisecond. That seems very nice to me.
The obvious possible drawback:
The library will, of course, not be a perfect representation of a real Postgres database. But I could absolutely see it being good enough for all testing purposes.
pg-mem does not implement any existing postgres extension (at the time of writing): it could be a bit too heavy to include everything, given that few people use them.
That said, it's pretty easy to implement them yourself. For instance, if you use the uuid_generate_v4() function of uuid-ossp, you could register an implementation like this:
import { newDb, DataType } from 'pg-mem';
import { v4 } from 'uuid';

const db = newDb();
db.registerExtension('uuid-ossp', (schema) => {
    schema.registerFunction({
        name: 'uuid_generate_v4',
        returns: DataType.uuid,
        implementation: v4,
        impure: true, // returns a different value on each call
    });
});
In which case, the following SQL statements will work as expected:
create extension "uuid-ossp";
select uuid_generate_v4();
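For instance, to check that the mocked extension behaves from code (a minimal sketch reusing the db instance from above; the id alias is just for illustration):

db.public.none('create extension "uuid-ossp"');
const [{ id }] = db.public.many('select uuid_generate_v4() as id');
console.log(id); // => a fresh v4 uuid, e.g. '1b9d6bcd-...'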
To import an existing schema, you could just dump it like this:
pg_dump --schema-only --no-owner --no-acl --disable-triggers --no-comments --no-publications --no-security-labels --no-subscriptions --no-tablespaces --host HOST --username USER --password DBNAME > dump.sql
Replacing HOST, USER and DBNAME with your configuration. This will generate a dump.sql file that should be executable in pg-mem.
⚠ If the dumped SQL is not supported by pg-mem, file an issue!
Then, pretty straightforward:
import fs from 'fs';
import {newDb} from 'pg-mem';
const db = newDb();
// create schema
db.public.none(fs.readFileSync('dump.sql', 'utf8'));
// then, create a backup (insert data that will be common to all tests before that if required)
const backup = db.backup();
Then use it like this:
// restore 'db' as original
backup.restore();
// => use 'db' !
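For instance, wired into a test suite (a sketch assuming Jest-style hooks; any runner with setup callbacks works the same way, and the users table is made up for illustration):

import fs from 'fs';
import { newDb } from 'pg-mem';

const db = newDb();
let backup: ReturnType<typeof db.backup>;

beforeAll(() => {
    // create the schema once, then snapshot it
    db.public.none(fs.readFileSync('dump.sql', 'utf8'));
    backup = db.backup();
});

beforeEach(() => {
    // every test starts from the same pristine state
    backup.restore();
});

test('inserts a user', () => {
    db.public.none(`insert into users (name) values ('alice')`);
    // ... assert against 'db'
});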
⚠ If your DB uses extensions, you'll have to mock them (see the dedicated question in this FAQ)
Yes, it does. I adapted the node-sqlite algorithm, which does just that. Here is their documentation about it at the time of writing:
This module comes with a lightweight migrations API that works with SQL-based migration files. With the default configuration, you can create a migrations/ directory in your project with SQL files, and call the migrate() method to run the SQL in the directory against the database. See this project's migrations/ folder for examples.
await db.public.migrate({
/**
* If true, will force the migration API to rollback and re-apply the latest migration over
* again each time when Node.js app launches.
*/
force?: boolean
/**
* Migrations table name. Default is 'migrations'
*/
table?: string
/**
* Path to the migrations folder. Default is `path.join(process.cwd(), 'migrations')`
*/
migrationsPath?: string
})
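All three options are optional, so with the default migrations/ folder at the project root, a minimal call (sketch) looks like this:

import { newDb } from 'pg-mem';

const db = newDb();
// runs the SQL files found in ./migrations against the in-memory database
await db.public.migrate();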
Since v2.3.0, you can use CREATE FUNCTION, but only the SQL language is supported. This means your functions can only consist of a single query, wrapped in a function.
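For instance, a single-query function along these lines should be declarable (a sketch; the hello function is made up for illustration, and if a valid SQL function fails to parse, file an issue):

db.public.none(`
    create function hello(name text) returns text as $$
        select 'hello ' || name
    $$ language sql;
`);
const { greeting } = db.public.one(`select hello('world') as greeting`);
// greeting === 'hello world'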
pg-mem does not natively implement PLPGSQL, nor other languages such as PLV8. You can, however, fake those by declaring a custom script compiler of your own, as sketched below.
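For instance, something along these lines (a sketch based on the registerLanguage() hook; double-check the exact compiler signature against the pg-mem documentation, and the canned return value is just for illustration):

import { newDb } from 'pg-mem';

const db = newDb();
// Register a fake 'plpgsql' compiler: it receives the function source
// and must return a plain JS implementation of it.
db.registerLanguage('plpgsql', ({ code, args, returns }) => {
    // 'code' is the raw function body; a real compiler would interpret it.
    // Here we just return a canned implementation for tests.
    return () => 42;
});

Once registered, CREATE FUNCTION ... LANGUAGE PLPGSQL statements will be compiled through this hook.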