
backup of Postgres fails if pg_hba.conf forces encryption #369

Open
jkoopmann opened this issue Sep 19, 2024 · 8 comments

@jkoopmann

Summary

I am unable to back up a PostgreSQL database, probably due to encryption problems.

Steps to reproduce

Set up a PostgreSQL server that requires SSL-encrypted user sessions.
Try to back up with the typical DB01_NAME / DB01_USER / DB01_HOST / DB01_PASSWORD settings.
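
A minimal compose sketch of that setup (variable names as written above; the values here are placeholders):

services:
  db-backup:
    image: tiredofit/db-backup:4.1.1
    environment:
      - DB01_HOST=x.x.x.x
      - DB01_NAME=postgres
      - DB01_USER=maitadmin
      - DB01_PASSWORD=...   # placeholder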

2024-09-19.10:26:14 [INFO] ** [db-backup04] Starting Manual Backup for db-backup04
2024-09-19.10:26:17 [NOTICE] ** [04-x.x.x.x__postgres] Dumping PostgresSQL database: 'postgres' and compressing with 'zstd'
pg_dump: error: connection to server at "x.x.x.x", port 5432 failed: could not open certificate file "/root/.postgresql/postgresql.crt": Permission denied
connection to server at "x.x.x.x", port 5432 failed: FATAL: pg_hba.conf rejects connection for host "x.x.x.x", user "maitadmin", database "postgres", no encryption
2024-09-19.10:26:17 [INFO] ** [04-x.x.x.x__postgres] DB Backup of 'pgsql_postgres_x.x.x.x_20240919-102617.sql.zst' completed successfully

If I exec into the container having the problem and use psql to log in to the server with the credentials provided in the compose.yml, the connection works like a charm. So I can only assume that the network and pg_hba.conf are correct; db_backup, however, fails with the above error.
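
For reference, the manual check was along these lines (the container name here is a placeholder):

docker exec -it db-backup04 psql -h x.x.x.x -p 5432 -U maitadmin -d postgres
# prompts for the password, then connects without any SSL or pg_hba.conf errors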

What is the expected correct behavior?

Login and backup should work from db_backup.

Relevant logs and/or screenshots

See above.

@jkoopmann jkoopmann added the bug label Sep 19, 2024
@jkoopmann
Author

Funny thing:

Running pg_dump from the normal prompt (as root, I assume) works:

[tiredofit/db-backup:4.1.1 10:42:10 /] $ pg_dump -h x.x.x.x -p 5432 -U maitadmin postgres
Password:
pg_dump: error: query failed: ERROR: permission denied for table postgres_log
pg_dump: detail: Query was: LOCK TABLE public.postgres_log IN ACCESS SHARE MODE
[tiredofit/db-backup:4.1.1 10:42:26 /] $

There is a locking error, but let's ignore that. If, however, I run it as the script does, as the dbbackup user:

[tiredofit/db-backup:4.1.1 10:41:41 /] $ sudo -Eu dbbackup pg_dump -h x.x.x.x -p 5432 -U maitadmin postgres
pg_dump: error: connection to server at "x.x.x.x", port 5432 failed: could not open certificate file "/root/.postgresql/postgresql.crt": Permission denied
connection to server at "x.x.x.x", port 5432 failed: FATAL: pg_hba.conf rejects connection for host "x.x.x.x", user "maitadmin", database "postgres", no encryption

This looks like a permission problem on the db_backup side of things.

@tiredofit
Owner

tiredofit commented Sep 19, 2024

I'm not sure I would qualify this as a bug; it's more an inability to back up over an encrypted connection.

You could try fiddling with the container options, specifically the USER_DBBACKUP options, or alternatively mount your certificate to /home/dbbackup/.... From a cursory look, there don't appear to be any command-line arguments for setting a custom certificate location that we could work into the pg_dump process. You could also get creative with the image's pre/post hooks and do some scripting to get the certificates into the right places; the base image's "hooks" (custom scripts / custom assets) functionality could likewise provide scaffolding to support this.
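
For the certificate-mount option, something like this in compose might do it (a sketch; the exact home directory inside the image is an assumption here):

services:
  db-backup:
    volumes:
      - ./postgres-client-certs:/home/dbbackup/.postgresql:ro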

Beyond that, it would have to be a feature added to the image to support this sort of thing. I see it as beneficial.

EDIT: Based on a random search, it does seem there is a way to pass some arguments regarding custom certificate paths. There are some environment variables that could append those strings onto your psql command line, which may work in the short term.
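
For example, libpq honours the standard PGSSL* environment variables (PGSSLMODE, PGSSLCERT, PGSSLKEY, PGSSLROOTCERT, PGSSLCRL), so a sketch like the following, with placeholder paths, might steer pg_dump away from /root/.postgresql in the short term:

export PGSSLCERT=/home/dbbackup/.postgresql/postgresql.crt   # client certificate, if one is used
export PGSSLKEY=/home/dbbackup/.postgresql/postgresql.key    # matching private key
export PGSSLCRL=/tmp/root.crl                                # revocation list; placeholder path
pg_dump -h x.x.x.x -p 5432 -U maitadmin postgres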

@jkoopmann
Author

If it were not for my second comment, I would tend to agree. However, the pg_dump installation in the container is obviously able to communicate with the database in question (TLS-encrypted) without any fiddling around, and I never installed any kind of certificate in that container. So the question is: why does this work as the root user in the container but not as the dbbackup user? Why would I need to do something with options/hooks or import a certificate into /home/dbbackup when the same container, network, database, etc. work like a charm, just as the root user?

sudo -Eu dbbackup sh -c 'export PGSSLCRL=/tmp; psql -h x.x.x.x -U maitadmin -d postgres'

actually does work. So the remaining question is: how can I set the environment variable PGSSLCRL (and, to be complete, presumably the other PGSSL* variables as well) from within docker-compose.yml? That would indeed solve the issue and would be an improvement.
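
For reference, what I am after would look something like this in docker-compose.yml (a sketch; whether the image actually forwards these variables to the pg_dump invocation is exactly the open question):

services:
  db-backup:
    environment:
      - PGSSLCRL=/tmp        # the value that worked in the sudo test above
      - PGSSLMODE=require    # hypothetical: force SSL explicitly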

Thanks very much for your support!

@tiredofit
Owner

Interesting that you don't even have a certificate to begin with! My assumption was that you did, due to the initial error message.

So it seems the execution of pg_dump is trying to look inside the /root/ folder and can't, so it's throwing the errors, even though there is no certificate within /root to begin with. Presumably sudo -E preserves HOME=/root, libpq then derives its default certificate path from ~/.postgresql, and the dbbackup user cannot even traverse /root to check whether the file exists.
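
If that reading is right, a quick sketch like this should confirm it:

sudo -Eu dbbackup sh -c 'echo "$HOME"; ls "$HOME/.postgresql"'
# expected: HOME still prints /root (preserved by -E), and the ls fails with
# "Permission denied" because only root can traverse /root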

I have a function that looks like this

run_as_user() {
    # Run the given command as ${DBBACKUP_USER}, preserving the caller's environment (-E)
    sudo -Eu "${DBBACKUP_USER}" "$@"
}

that executes the pg_dump command, which looks very similar to yours above. While not optimal, it certainly solves the problem: you should be able to set DBBACKUP_USER to root, and this should allow the backup. Looking at the README, this is actually undocumented, but it was the way things were done in the 3.xx series, where everything ran as the root user.

With regard to a prefix for inserting those environment variables: that isn't possible today, but I believe it would be fairly easy to add to the code. Let's see how this user-context change works first, and then see if there is an easy way.

@jkoopmann
Author

While not optimal, it certainly solves the problem: you should be able to set DBBACKUP_USER to root, and this should allow the backup

Agreed, but how? I am unable to find a suitable environment variable to set this globally or for one specific backup job. I agree, though, that this should solve this particular challenge.

@tiredofit
Owner

Set DBBACKUP_USER=root in the environment variables. As I wrote above, it appears to be undocumented, but my function makes it possible.
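
For example, in the compose file (a sketch; the variable exists in the function above but is undocumented):

services:
  db-backup:
    environment:
      - DBBACKUP_USER=root   # reverts to the 3.xx behaviour of running everything as root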

@jkoopmann
Author

Got it. I thought it would work on a per-backup-job basis, and I would not love to change the user for all jobs. But I checked it, and it works.

@tiredofit
Owner

Good, glad it works, with the obvious drawback. I don't think it would be too difficult to switch this over to a per-job basis, as the majority of the scaffolding is in place. A couple of enhancements came out of this that would be beneficial; a post hook or command would probably also work to let you chown the output files to something readable outside the container. I'm going to mark this as a feature request for future development, if I have the time.
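
Such a post hook might look roughly like this (a sketch only: the hook path, calling convention, and target ownership are assumptions, not a documented interface of the image):

#!/bin/sh
# Hypothetical post-backup hook: make the dumps readable outside the container.
# Adjust the UID:GID and path to match your host user and backup volume.
chown 1000:1000 /backup/*.zst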

@tiredofit tiredofit added enhancement and removed bug labels Sep 20, 2024