files on the docker image #55
Hi Ben,

No magic here, since everything has to run on the Docker Hub. Recall that you can only refer to things in the directory containing the Dockerfile or subdirectories thereof; you cannot go back up past it.

Okay, questions for you: I keep going back and forth on whether it is actually a good idea to copy the data/manuscript files etc. onto the Docker container like this, or just omit that part. It sounds great to have everything together, but it creates potential for confusion when linking volumes. In practice, I tend instead to clone the repo from Github and then run the analysis in Docker; e.g.:
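A minimal sketch of that pattern; the clone URL, image name, mount path, and `make` target below are illustrative assumptions rather than the actual commands:

```bash
# Illustrative only: clone URL, image name, paths, and the `make` target are assumptions.
git clone https://github.com/cboettig/nonparametric-bayes.git
cd nonparametric-bayes

# Mount the host checkout into the container and run the build there, so the
# rendered pdf lands in the host directory rather than inside the container.
docker run --rm -v "$(pwd)":/home/rstudio/nonparametric-bayes \
  -w /home/rstudio/nonparametric-bayes/manuscripts \
  rocker/ropensci make
```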
This ensures that the pdf is on the host machine and not stuck inside the container. However, if you run the above with an interactive RStudio instance, you see the two different copies.
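To make the build-context point above concrete, here is a hedged sketch; the directory layout, image tag, and destination path are invented for illustration. In a `COPY` instruction, `.` means the root of the build context, i.e. whatever directory you hand to `docker build`, and nothing above it is reachable.

```bash
# Hypothetical layout (not the real repository):
#   manuscripts/
#   |-- analysis.Rmd
#   `-- data/
cd manuscripts

# A toy Dockerfile: `COPY . /home/rstudio/manuscripts` copies the entire build
# context (this directory and its subdirectories) into the image.
cat > Dockerfile <<'EOF'
FROM rocker/ropensci
COPY . /home/rstudio/manuscripts
EOF

# The trailing `.` is the build context sent to the Docker daemon; COPY can only
# see files inside it, never ../ or anything above this directory.
docker build -t demo/manuscripts .

# Files baked in by COPY are a separate copy from the host files; if you also
# mount the host directory with -v, both copies are visible in the container.
docker run --rm demo/manuscripts ls /home/rstudio/manuscripts
```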
Thanks very much for your quick and detailed reply.

Building from the local folder is a little awkward since I run boot2docker and it's not trivial (for me, at least) to have my local folders visible inside it. Looking into it a bit, I think I now see a flaw in using the vignettes directory: I cannot store the HTML output from the rendered Rmd in there... is that why you use a non-standard package directory like `manuscripts/`?

My first thought on your specific question is to make isolation the priority, since that seems to be what Docker does so well. So my preference for the final compendium is to include the data, manuscript source and built package all on the Docker image, and avoid linking or any other kind of connection with the host. If the data got too big, perhaps then a data-only image linked to a manuscript-source-and-package image. But when developing the analysis I find it much more convenient to have access to my host file system (relevant files are usually scattered widely until the last moment when I organise the compendium), so I do something similar to you, linking the docker container to a host directory.

I've had a bit of a look into how to do travis-style continuous integration for the docker image; circleci.com seems closest. After a bit of poking around I've got a circle.yml file, a shield on my readme, and my image passes their test, so that seems like a reasonable option. I can't see how to securely send my Docker Hub credentials to Circle to push the image after a successful build, though.

I'm looking forward to talking more about this with you at the rOpenSci event; perhaps we should propose a project during that event to document more of this process of using R packages and Docker containers to make research compendia.
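For reference, a hedged sketch of what a circle.yml along these lines might have looked like on 2015-era CircleCI; the image name, test command, and environment variable names are placeholders, not the actual file from the repository:

```yaml
# Sketch only: image name, commands, and environment variable names are placeholders.
machine:
  services:
    - docker

dependencies:
  override:
    - docker build -t benmarwick/myproject .

test:
  override:
    - docker run benmarwick/myproject R CMD check /home/rstudio/myproject

# One way to push after a successful build: store DOCKER_USER and DOCKER_PASS as
# private environment variables in the CircleCI project settings, then:
deployment:
  hub:
    branch: master
    commands:
      - docker login -u "$DOCKER_USER" -p "$DOCKER_PASS"
      - docker push benmarwick/myproject
```

With this sort of setup the Docker Hub credentials never live in the repository; they sit in CircleCI's private project settings, which is one common way to handle the push step.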
Inline replies below; good fodder for a wee blog post here.
Bingo, I go back and forth on this one too and would really like your thoughts.
WOW! Looks awesome, I've been wishing for something like that. At the moment, I'm using Drone CI on a private digital-ocean server. I'm looking forward to talking more about this with you at the rOpenSci event.
Thanks again, your comments are very instructive, as usual!

I'm not doing any linking (that I know of) for the compendium image, only the development image (which isn't part of the repository because of the relative paths and general disorder).

That's a neat private digital-ocean server you've set up; I can see how that would save time for testing.

Anyway, thanks again for your help with making sense of all this, really looking forward to talking more at the rOpenSci event.
Ah, good question about the github-only dependencies thing. Maybe that's a good way to go, but I got really annoyed at packrat.
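One common way to handle GitHub-only dependencies at image build time, sketched here with a placeholder package name (a generic pattern, not necessarily what the project's Dockerfile actually does), is to install them with devtools while building the image:

```bash
# Sketch only: the package name is an example. In a Dockerfile each line would
# be prefixed with RUN; run directly, the commands do the same thing.
Rscript -e "install.packages('devtools', repos = 'https://cran.r-project.org')"
Rscript -e "devtools::install_github('cboettig/nonparametric-bayes')"
```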
@benmarwick Thanks again for the pointer to Circle-CI. Looks like they have a max build time of 1.5 hrs, so I'm not sure if nonparametric-bayes or some of my other research papers could build in that time (mostly because I can't be bothered to make the code run faster, rather than anything more intrinsic), but I did set up a quick example for the RNeXML paper: https://circleci.com/gh/ropensci/RNeXML

This is really nice -- building from scratch on travis all the dependencies that rocker/ropensci already provides (compiling R packages from source, installing LaTeX) is prohibitively long, so travis has only ever tested the R package but not the compile. Super simple on Circle CI.
Yes, the circleci service looks pretty handy. Your description of the generic docker image plus a package that contains its own dependencies sounds like a good option too, though an easy method of having a package carry its own dependencies doesn't seem to be available yet. I really like the idea of packrat, but it's never worked properly for me.
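For context, a minimal sketch of the packrat workflow being alluded to, driven from the shell for consistency with the other examples; it assumes the compendium is the current working directory and shows the generic packrat API rather than a recipe known to work for this project:

```bash
# Sketch of the basic packrat workflow; run from the root of the project.
Rscript -e "install.packages('packrat', repos = 'https://cran.r-project.org')"
Rscript -e "packrat::init()"      # set up a private library and packrat/packrat.lock
Rscript -e "packrat::snapshot()"  # record the exact package versions in use
# Later, on another machine or inside a container:
Rscript -e "packrat::restore()"   # reinstall the recorded versions from the lockfile
```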
Hey all,

The installation mechanism in switchr (http://github.com/gmbecker/switchr) might be relevant here. See the vignette for a baby example, and please feel free to ping me with questions.

~G
Hi Carl,

I'm curious about how you got just the contents of `nonparametric-bayes/manuscripts` to show up in the docker image at `/home/rstudio/pdg-control`; can you help me make sense of that?

I gather that these lines in your dockerfile install the packages needed for the project, including the `nonparametric-bayes/manuscripts` directory, but I'm not quite sure how the `COPY` instruction is working. In this context, what is `.` referring to? How is it specifically getting the `nonparametric-bayes/manuscripts` directory? Or are there some other lines you run (that I can't see here) when configuring the image initially to get that directory onto the image?

thanks!