
docker support #450

Closed
dmitry-saritasa opened this issue Jun 10, 2015 · 30 comments

@dmitry-saritasa

It would be nice to have Dockerfile in repo, so we can build a container automatically

@arikfr
Member

arikfr commented Jun 11, 2015

It is in the backlog. Can't commit to ETA, but I will comment here when it's available.

@KensoDev

@arikfr @Saritasa I plan on working on it next week.

@meetwudi
Contributor

Is anybody still working on this? I am currently looking into adding a Dockerfile to redash too. I would like to follow up if anybody has made a working version ;)

@KensoDev

@tjwudi I was planning on this but got pulled into other projects internally.

I don't have a working version yet, but it's definitely something I am planning on doing since we are dockerizing a lot of the infrastructure right now and redash is becoming a major part of our admin interfaces.

@meetwudi
Contributor

@KensoDev thanks for the info. I will start working on it, then. Once I get it working I will send in a PR.

@arikfr
Member

arikfr commented Sep 29, 2015

I'm adding some thoughts on how to add Docker support for re:dash, in case someone would like to pick up the glove :)

Basically the whole setup process is "documented" in the bootstrap script, but the Docker setup should only take inspiration from it rather than reuse it entirely.

What I would do is:

  1. Use PostgreSQL official container.
  2. Use Redis official container.
  3. Use Nginx official container.
  4. Create "base" re:dash container, which will be used for the workers and web server.
  5. Move the database bootstrap part into a script in the bin folder.

Then tie everything together with Docker Compose and a setup script that creates the database once everything is up. A basic re:dash setup should have the following containers: PostgreSQL, Redis, Nginx, the re:dash web server, a re:dash "master" celery worker (with the --beat option) & one additional re:dash celery worker.
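
A minimal docker-compose sketch of that layout, assuming hypothetical image names and the REDASH_* variables used elsewhere in this thread (illustrative only, not a final setup):

    # docker-compose.yml sketch -- service names, images and commands are assumptions
    postgres:
      image: postgres:9.4
    redis:
      image: redis:3.0
    redash:
      image: redash/base             # hypothetical "base" re:dash image, running the web server
      links:
        - postgres
        - redis
      environment:
        REDASH_DATABASE_URL: postgresql://postgres@postgres/postgres
        REDASH_REDIS_URL: redis://redis:6379/0
    worker:
      image: redash/base             # same base image, running the celery workers
      # command: celery worker (add --beat on the single "master" worker)
      links:
        - postgres
        - redis
    nginx:
      image: nginx
      links:
        - redash
      ports:
        - "80:80"

The point of the sketch is that the web server and the workers share one base image and one set of environment variables, per the outline above.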

I'm quite new to Docker, so if someone has other suggestions, I'll be happy to hear.

@meetwudi
Contributor

@arikfr Oh, you run nginx in a separate container? Is there any reason not to run nginx in the same container as redash?

@KensoDev

@tjwudi That's a Docker best practice, but you need to understand what your purpose in doing this is.
If the purpose is to make redash easy to distribute across machines, use @arikfr's suggestion.

This way, you can compose the workload more easily and not worry about managing processes inside a container; each container runs a single process that you can monitor using upstart or anything like that.

@arikfr
Member

arikfr commented Sep 29, 2015

I'm not using re:dash with Docker yet; the above outline was just a suggestion. As for reasoning, @KensoDev summed it up nicely. But I will add that it's not only about distributing across machines (which isn't really relevant for re:dash) but more about reusing existing work (in this case, the official nginx container) and following best practices.

@KensoDev

@arikfr yes.

Also, when each container has a single point of entry, there's less complexity in running and monitoring it.

This way, you upstart one container whose entry point is nginx, another whose entry point is celery, etc.

@arthurpsmith

Proper docker support would be nice. I did get re:dash running in docker via the naive method of running the provision script on top of a basic ubuntu instance (actually, wget and libffi-dev need to be installed before the provision script will run as-is; I also had to add a 'service postgresql start' in the middle of the script, otherwise it didn't see the database at all). So it is possible to do, anyway. But following docker best practices would be nice.
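
For reference, a rough sketch of that workaround on top of a plain ubuntu instance (the provision script path is whatever the repo ships, so treat this as an outline rather than a recipe):

    # naive workaround sketch, not best practice
    apt-get update
    apt-get install -y wget libffi-dev     # missing from the base image, needed before provisioning
    # run the repo's provision/bootstrap script here, with a `service postgresql start`
    # added in the middle of it so the database is actually reachable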

@meetwudi
Contributor

Docker support now in master from #588

@arikfr
Member

arikfr commented Oct 12, 2015

I will build "official" images later this week. But whoever wants to can use the Dockerfile directly.

@astewart-twist

I left the same comment at https://hub.docker.com/r/everythingme/redash/ so pardon the redundancy, but what is the current entry point for using a dockerized deployment of redash? Is the Dockerfile in this github repo self-sufficient, or does it need to be run in the context of docker-compose?

@arthurpsmith

It is not standalone - it requires connections to redis and postgres servers at least. docker-compose should work; I'm just starting to test...

@meetwudi
Contributor

Although you don't need to use docker-compose, you do need to provide the postgres and redis connection info through environment variables.

@astewart-twist

@arthurpsmith Thanks. I'm fairly knowledgeable in Docker, so perhaps I can help too.

@tjwudi Ah, so that .env file is the key to freeing the redash Dockerfile from the rest of the docker-compose environment. I'll give that a shot.

Is the postgres container referenced in the docker-compose file intended to be used for internal redash purposes or is it a 'batteries included' db instance intended for user content to be consumed in redash's interface?

@meetwudi
Contributor

@astewart-twist So re:dash's postgres is for its own usage, such as result caching, user management etc. You should not pipe your own data into re:dash's postgres manually.

@astewart-twist

@tjwudi Ok, good to know. I figured that was probably the case, but wasn't sure what the role of redis was then. That certainly simplifies the docker-compose deployment.

@arikfr
Member

arikfr commented Oct 20, 2015

@astewart-twist note that you don't necessarily need to use the .env file; you can instead pass environment variables to the re:dash processes directly. The .env file was used by docker-compose to pass env variables into the container, but the right way to do this depends on how you run your docker containers.
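
For example, without docker-compose, something along these lines passes the same settings directly (host names are placeholders; the image name is the one published on Docker Hub above):

    docker run -d \
      -e REDASH_DATABASE_URL="postgresql://postgres@your-postgres-host/postgres" \
      -e REDASH_REDIS_URL="redis://your-redis-host:6379/0" \
      everythingme/redash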

@meetwudi
Contributor

@astewart-twist redis acts as the message broker for celery.

@astewart-twist

@arikfr Right, I understand. I suspect docker-compose is going to offer the easiest out-of-the-box deployment. I'm running into a few issues, though.

redash is complaining that it can't connect to redis. The default REDASH_REDIS_URL in the .env is set to redis://localhost:6379/1, though I suspect that should actually be redis://redis:6379/1 in order to reflect the link set in the docker-compose file:

redash_1       | [2015-10-20 21:37:42,941: ERROR/Beat] beat: Connection error: Error 111 connecting localhost:6379. Connection refused.. Trying again in 26.0 seconds...
redash_1       | [2015-10-20 21:38:08,610: ERROR/MainProcess] consumer: Cannot connect to redis://localhost:6379/0: Error 111 connecting localhost:6379. Connection refused..
redash_1       | Trying again in 28.00 seconds...

I went ahead and tried that, but actually received the same error. It's also interesting that the error mentions /0 while the .env specifies /1 in the redis URL (I'm not familiar enough with redis to know if that means anything significant). Is it possible that redash isn't picking up REDASH_REDIS_URL as intended? I can see the variable is set on the container, but maybe redash isn't using it?

@astewart-twist

Aha! It looks like all the environment variables on the redash container are set literally as, for example, "export REDASH_REDIS_URL=...". The bash-style 'export' isn't needed, and stripping it out loads the env vars properly. Docker also doesn't seem to like the quotes around the values, for some reason.
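
In other words, the entries should be plain KEY=value lines, for example (placeholder values):

    # .env -- no `export` prefix and no quotes around the values
    REDASH_REDIS_URL=redis://redis:6379/0
    REDASH_DATABASE_URL=postgresql://postgres@postgres/postgres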

The redash-nginx section in the docker-compose file also appears to be mounting a file as a volume. I'm not sure there's any need to copy nginx.conf into redash-nginx at docker-compose time anyway, since the file is already copied into the image at docker build time.

I think REDASH_DATABASE_URL should be pointing to postgresql://postgres, as per docker-compose.yaml (since postgresql://redash would be pointing to localhost, where pg isn't running). Even with that modification, however, redash is still trying to connect to a local postgres socket.

redash_1       | [2015-10-20 22:47:39,869][PID:26][ERROR][redash.wsgi] Exception on / [GET]
redash_1       | Traceback (most recent call last):
redash_1       |   File "/usr/local/lib/python2.7/dist-packages/flask/app.py", line 1817, in wsgi_app
redash_1       |     response = self.full_dispatch_request()
redash_1       |   File "/usr/local/lib/python2.7/dist-packages/flask/app.py", line 1477, in full_dispatch_request
redash_1       |     rv = self.handle_user_exception(e)
redash_1       |   File "/usr/local/lib/python2.7/dist-packages/flask_restful/__init__.py", line 251, in error_router
redash_1       |     return original_handler(e)
redash_1       |   File "/usr/local/lib/python2.7/dist-packages/flask/app.py", line 1381, in handle_user_exception
redash_1       |     reraise(exc_type, exc_value, tb)
redash_1       |   File "/usr/local/lib/python2.7/dist-packages/flask/app.py", line 1473, in full_dispatch_request
redash_1       |     rv = self.preprocess_request()
redash_1       |   File "/usr/local/lib/python2.7/dist-packages/flask/app.py", line 1666, in preprocess_request
redash_1       |     rv = func()
redash_1       |   File "/opt/redash/current/redash/models.py", line 35, in connect_db
redash_1       |     self.database.connect()
redash_1       |   File "/usr/local/lib/python2.7/dist-packages/peewee.py", line 2811, in connect
redash_1       |     self.__local.closed = False
redash_1       |   File "/usr/local/lib/python2.7/dist-packages/peewee.py", line 2732, in __exit__
redash_1       |     reraise(new_type, new_type(*exc_value.args), traceback)
redash_1       |   File "/usr/local/lib/python2.7/dist-packages/peewee.py", line 2810, in connect
redash_1       |     **self.connect_kwargs)
redash_1       |   File "/usr/local/lib/python2.7/dist-packages/playhouse/postgres_ext.py", line 362, in _connect
redash_1       |     conn = super(PostgresqlExtDatabase, self)._connect(database, **kwargs)
redash_1       |   File "/usr/local/lib/python2.7/dist-packages/peewee.py", line 3115, in _connect
redash_1       |     conn = psycopg2.connect(database=database, **kwargs)
redash_1       |   File "/usr/local/lib/python2.7/dist-packages/psycopg2/__init__.py", line 164, in connect
redash_1       |     conn = _connect(dsn, connection_factory=connection_factory, async=async)
redash_1       | OperationalError: could not connect to server: No such file or directory
redash_1       |    Is the server running locally and accepting
redash_1       |    connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?

Lastly, redash-nginx is returning a 500.

[screenshot: nginx 500 error page]

@meetwudi
Contributor

@astewart-twist I've actually already removed those exports. You can check out the example .env file on master; you are probably using an old version of it.

@meetwudi
Contributor

The example .env file does not (and should not) assume the adoption of docker-compose, so I haven't changed its values to point to the postgres and redis instances brought up by docker-compose. You can always customize the configs for your own needs. :)

@astewart-twist

@tjwudi I'm understanding how the components come together as I work through this, and consequently how I might customize my config. If I may make a suggestion, though: I think it's important to provide first-time users with a fairly simple entry point that doesn't require that configuration knowledge, and docker-compose certainly provides that. You may want to include a ".env.docker", used by docker-compose.yaml, that is more or less guaranteed to work right out of the box. For example, the docker-compose.yaml file currently refers to host paths that might not exist or be relevant for a user who just cloned the repo.
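
Something like this in docker-compose.yaml would cover that (a sketch; ".env.docker" is just the name suggested above, not an existing file):

    redash:
      image: everythingme/redash
      env_file: .env.docker          # compose-ready defaults shipped with the repo (hypothetical)
      links:
        - postgres
        - redis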

If I can figure out the other issues I mentioned above, I'll send a PR.

@arikfr
Member

arikfr commented Oct 21, 2015

@tjwudi @astewart-twist see the updates I'm doing in #620. Also there is an example env file for docker-compose usage: https://github.com/EverythingMe/redash/pull/620/files#diff-74828010e796608ffd025ff947a84d10

I've also started working on a new image for AWS and GCE (but haven't pushed it yet) that will install all the components needed to run docker-compose in a "production" environment, for those who don't already have a Docker cluster in place. In time, it will replace the current images that install everything from packages.

@arthurpsmith

@astewart-twist I've created a running re:dash from scratch using the docker-compose file I posted just now in #620 (which uses the latest everythingme/redash image from docker-hub). It's not quite as simple as docker-compose up, but only a couple of extra steps - see my comment in #620. In particular, you don't need to mess around with the 'env' file; just put the environment variables in the docker-compose.yml file.
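
That is, something along these lines (placeholder values, assuming the compose services are named postgres and redis):

    redash:
      image: everythingme/redash
      environment:
        REDASH_DATABASE_URL: postgresql://postgres@postgres/postgres
        REDASH_REDIS_URL: redis://redis:6379/0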

@astewart-twist

@arthurpsmith Awesome! I'll check it out ASAP. One dumb question: which fork or branch do I need to clone to try it?

I also worked through most of the issues I was having, which are probably the same fixes you made. I agree that everything can be handled by the docker-compose file (no need to use a .env file at all).

@arthurpsmith

arikfr's changes are in the 'docker' branch. However, if you use the docker-compose file I posted, the only thing you need from that branch is the "create_database.sh" file (setup/docker/create_database.sh).
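
Putting it together, the flow is roughly (a sketch; the exact container name and script invocation depend on your compose file):

    docker-compose up -d
    # once the containers are up, create the re:dash database schema using the
    # script from the 'docker' branch (setup/docker/create_database.sh), e.g.:
    docker exec <redash-container-name> ./setup/docker/create_database.sh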
