Reorganize setup files & update Docker configurations #620
Conversation
On the docker database setup process - how are you managing the postgres user accounts & passwords? From the postgres docker image documentation you can set an environment variable, as follows in the docker-compose.yml file:
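For example, something along these lines (the image tag and the password value are placeholders):

```yaml
# Sketch of a postgres service in docker-compose.yml; values are placeholders.
postgres:
  image: postgres
  environment:
    POSTGRES_PASSWORD: xxxxxx   # password for the default 'postgres' superuser
```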
Hmm, I see you have a create_database.sh script - I guess you are doing some things directly on the postgres instance (with psql) rather than installing psql on the redash client? I guess that should work. But you still need to connect from the redash client via the python tools - I am getting this to work by setting the DATABASE_URL to REDASH_DATABASE_URL=postgresql://postgres:xxxxxx@postgres/redash (where the xxxxxx is the POSTGRES_PASSWORD used in setting up the postgres container). However, you then also need to set up the 'redash' database in the postgres instance before anything else.
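In docker-compose.yml terms that is roughly (the image name and links are illustrative; the password has to match the POSTGRES_PASSWORD above):

```yaml
# Sketch of the redash service environment; the image name is illustrative.
redash:
  image: redash/redash
  links:
    - postgres
  environment:
    REDASH_DATABASE_URL: postgresql://postgres:xxxxxx@postgres/redash
```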
OK, I've been trying to get this to work, but it doesn't seem possible. If I try to run against a fresh postgres container (nothing in the database): docker-compose run postgres su postgres -c 'createdb redash --owner=postgres' then it can't connect to the database, because the container it brings up isn't running the database - it's running the createdb command. I think all the database and user initialization needs to be done from the redash container; it can't (easily) be done on the postgres container. If we don't want to install the psql client, then it needs to happen from python - I would guess via 'manage.py'. For example, the following seems to work: docker-compose run redash ./manage.py database create_tables. So I would propose adding commands for ./manage.py database create (to be run before create_tables) and ./manage.py users create --reader (to create the redash_reader account), as sketched below - I can take a look at this if you like.
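In other words, an initialization flow along these lines (only create_tables is known to work today; database create and users create --reader are the proposed additions):

```sh
# Proposed flow; the first and last commands are suggestions, not existing subcommands.
docker-compose run redash ./manage.py database create          # create the 'redash' database (proposed)
docker-compose run redash ./manage.py database create_tables   # works today
docker-compose run redash ./manage.py users create --reader    # create the redash_reader account (proposed)
```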
Oh - you're using the default 'postgres' database instead of 'redash'?! That does work then, ok! For reference here is what works for me now with the current redash docker image:
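Roughly this shape of compose file, with the default postgres database in the connection URL (image names, port, and password are illustrative, and redis/worker settings are omitted):

```yaml
# Illustrative compose file; not a complete configuration.
postgres:
  image: postgres
  environment:
    POSTGRES_PASSWORD: xxxxxx
redash:
  image: redash/redash           # illustrative image name
  links:
    - postgres
  ports:
    - "5000:5000"                # assumed web port
  environment:
    # default 'postgres' database and user, password matching POSTGRES_PASSWORD
    REDASH_DATABASE_URL: postgresql://postgres:xxxxxx@postgres/postgres
```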
I still think the docker-compose file should not assume anything about the host environment/filesystem, particularly for the volumes under the postgres section. What is /opt/postgres-data on your host system? Is it empty or populated, and what permissions does the path have? After attempting to create that path on my system (Mac, btw), I get the following:
I start with no /opt/postgres-data directory on the docker host; Docker creates it when it mounts the volume.
Oh, no, I'm sure that's a fine location. I just assumed it had to exist before it could be mounted by docker. I bet that's why I got that error. I re-ran after removing that directory and it deploys fine now. Very cool! I wonder if the whole thing could be better integrated into the 'docker-compose up' workflow. The order of operations sorta requires a sequential flow, which docker-compose doesn't really handle in a straightforward manner. Probably the easiest thing to do would be to prepend the redash entrypoint with a check for the existence of the redash db in postgres, and run create_database.sh if it isn't there.
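A rough sketch of that kind of entrypoint wrapper (the script name, the psycopg2 check, and the location of create_database.sh are assumptions, not the actual image's entrypoint):

```sh
#!/bin/sh
# Hypothetical docker-entrypoint.sh wrapper: create the database on first run,
# then hand off to the normal command. Assumes postgres is already reachable.

# Check whether the database in REDASH_DATABASE_URL exists, using python
# (psycopg2) instead of installing the psql client in the image.
if ! python -c "import os, psycopg2; psycopg2.connect(os.environ['REDASH_DATABASE_URL'])" 2>/dev/null; then
    echo "redash database not found - running create_database.sh"
    ./create_database.sh
fi

exec "$@"
```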
@astewart-twist all the docker-compose files we will provide in the repo will be for reference only, so it seems ok to have some assumptions there, and everyone can adapt them to their own environment. As for flow -- I will check again when writing the documentation, but iirc you can do docker-compose up and then run the create database script; re:dash can start without a database. I wouldn't want to add the create database script inside the container, mainly because it might be used in an environment where it's not relevant.
And I really appreciate the discussion here, as it helps me understand the gaps I need to fill when preparing this and the documentation.
FYI I thought we could get away without the nginx front-end, but it turned out the python flask server didn't behave well with Amazon elastic load balancers. I was able to use the standard nginx:1 container image with only a slight modification of your nginx.conf file (change the access_log and error_log entries so they don't require subdirectories of /var/log) and the following setting in docker-compose.yml:
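Presumably something along these lines (the service name and port mapping are guesses; the nginx:1 image and the /opt/redash-nginx.conf path are from the note below):

```yaml
# Illustrative nginx service for docker-compose.yml; port mapping is a guess.
nginx:
  image: nginx:1
  links:
    - redash
  ports:
    - "80:80"
  volumes:
    # nginx.conf copied onto the docker host beforehand (see note below)
    - /opt/redash-nginx.conf:/etc/nginx/nginx.conf
```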
Note that I also had to copy the nginx.conf file to /opt/redash-nginx.conf on the docker host to make this work. Depending on what the filesystem on your docker host is, you might want to do something different (as with the persistent postgres db files). Or maybe there should be a custom redash-nginx docker image that includes the conf file, so this isn't necessary.
Allow sending rd_ui/dist, remove rd_ui/node_modules.
- No need for pg client anymore. - Fix path to supervisord.conf.
…rent code base as context
Reorganize setup files & update Docker configurations
This is a WIP, but due to many people starting to look into the Docker support, I thought I should share it already.
Some changes here from master:
- setup/ folder with files for each kind of deployment (Amazon Linux, Ubuntu, Docker) in their own directory.
- Logs are sent to stdout, so we can easily access/collect them in a Docker environment.
- Processes are started without bin/run (bin/run is used to source the .env file, which isn't necessary in a Docker environment).
- Using the postgres database and postgres user instead of creating a special user for re:dash.
- Updates to docker-compose and the different containers.

TODO:
- bootstrap.sh script for the ubuntu-docker image.

Comments are welcome.
Closes #450.