Docker images #1851
Comments
I'm very much interested in this. The use case is to allow creating scalable OpenFAST instances in the cloud so we can run any number of OpenFAST analyses in parallel and serverlessly (Docker is essential for this). So far, I've created a working Dockerfile here whose image is uploaded to Docker Hub. It was quite hard to get working: I had difficulty getting conda to install a specific version of OpenFAST (installing the latest version works fine), which is important so that other engineers can build tools on top of OpenFAST without them breaking each time a new version is uploaded to the conda repo. I think working together on this would be beneficial!
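For reference, a minimal sketch of the kind of Dockerfile described above, assuming the conda-forge `openfast` package; the base image and the `3.5.3` pin are illustrative assumptions, and the actual Dockerfile linked above may differ:

```dockerfile
# Minimal sketch: pin a specific OpenFAST release from conda-forge.
FROM condaforge/miniforge3:latest

# Pinning the version keeps downstream tools stable when new OpenFAST
# releases are published to the conda channel ("3.5.3" is a placeholder).
RUN conda install -y -c conda-forge openfast=3.5.3 && \
    conda clean -afy

# The conda package places the openfast executable on the PATH.
ENTRYPOINT ["openfast"]
```

An input file mounted into the container could then be run with something like `docker run -v "$(pwd)":/data <image> /data/model.fst`.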
Thanks for your interest in this idea! I definitely agree that we should collaborate on this as well as the openfast_toolbox. One thing I haven't thought through is where to host containerization recipes. I had been previously thinking they would just be added into the OpenFAST repository, but perhaps a separate repository as you have done is a more logical method. @deslaughter, @mayankchetan, @rafmudaf -- do any of you have thoughts on this, or the best way to work with @cortadocodes and @octue on this?
If there's any need to have the Dockerfiles match a particular OpenFAST version, I recommend keeping them in the OpenFAST repo. There are already some here. Being in the same repository as the code, you could automatically update them on a release, or at least have a release-checklist item to remind you to update them.
@cortadocodes, thank you for your interest in Dockerizing OpenFAST! I have been working on a few Docker images (heavily inspired by @deslaughter's) that include the source code so that we can develop within the instance. Your approach of using the OpenFAST Conda package definitely makes it attractive for cloud use. As you'll see, my Docker images (https://hub.docker.com/r/mayankchetan/openfast/tags) are very bulky; I have a few ideas from a different project to reduce their size that I still need to implement. For version control, I pass external environment variables to the Dockerfile and have GitHub Actions handle the image creation: https://github.com/mayankchetan/wind-energy-images. This lets me bump the version with a simple pull request. I do agree with @rafmudaf that having the Dockerfile within the OpenFAST repo would be better for automation. Something else we might want to discuss is the content of the Docker image (source code included, just the conda package, conda package + r-tests, etc.) and how those variants are versioned. @andrew-platt, for hosting the images, we can start with the GitHub Container Registry and then work towards adding them to the NREL Docker Hub. Here's an example for the above image: https://github.com/mayankchetan/wind-energy-images/pkgs/container/openfast.
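As a rough illustration of the build-arg approach described above (the argument name and version are assumptions, not necessarily what the linked repository uses):

```dockerfile
# Sketch: let CI choose the OpenFAST version at build time, e.g.
#   docker build --build-arg OPENFAST_VERSION=3.5.3 -t openfast:3.5.3 .
FROM condaforge/miniforge3:latest

# OPENFAST_VERSION is supplied by the CI workflow (for example a GitHub
# Actions job triggered on release), so bumping the version is a small,
# reviewable change rather than an edit to the Dockerfile itself.
ARG OPENFAST_VERSION
RUN conda install -y -c conda-forge "openfast=${OPENFAST_VERSION}" && \
    conda clean -afy

ENTRYPOINT ["openfast"]
```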
Alternatively, we can set up a meeting via Calendly: https://www.octue.com/getinvolved
Is your feature request related to a problem? Please describe.
It would be nice to add docker images of our releases and host them on the NREL dockerhub (https://hub.docker.com/u/nrel).
Describe the solution you'd like
Ideally these would be built on a really stripped-down Linux base so the images are as small as possible (a minimal Alpine Linux, for example), and ideally compiled with the Intel oneAPI compiler set. This should of course be automated (@mayankchetan has done some automation of this already).
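A rough multi-stage sketch of that idea, using gfortran on a Debian slim base rather than oneAPI and Alpine for simplicity; the tag, package names, and build paths are illustrative assumptions and may differ between OpenFAST versions:

```dockerfile
# Stage 1: build OpenFAST from source in a full build environment.
FROM debian:bookworm-slim AS builder
RUN apt-get update && apt-get install -y --no-install-recommends \
        build-essential gfortran cmake git ca-certificates \
        libblas-dev liblapack-dev \
    && rm -rf /var/lib/apt/lists/*
# "v3.5.3" is a placeholder release tag.
RUN git clone --depth 1 --branch v3.5.3 https://github.com/OpenFAST/openfast.git /openfast
RUN cmake -S /openfast -B /openfast/build -DCMAKE_BUILD_TYPE=Release \
    && cmake --build /openfast/build --target openfast -j"$(nproc)"

# Stage 2: copy only the executable and its runtime libraries into a small image.
FROM debian:bookworm-slim
RUN apt-get update && apt-get install -y --no-install-recommends \
        libgfortran5 libblas3 liblapack3 \
    && rm -rf /var/lib/apt/lists/*
# The build-tree location of the executable may vary with the OpenFAST version.
COPY --from=builder /openfast/build/glue-codes/openfast/openfast /usr/local/bin/openfast
ENTRYPOINT ["openfast"]
```

The same two-stage pattern would apply with the oneAPI compilers; only the builder stage and the copied runtime libraries would change.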
Describe alternatives you've considered
None.
Additional context
I'm not entirely certain how many users would be interested in this, so perhaps this isn't a really high priority.