new LTS release and recent weekly #544

Closed
wants to merge 3 commits into from

Conversation

ndeloof
Contributor

@ndeloof ndeloof commented Mar 10, 2015

No description provided.

@yosifkit
Member

Wow, 55 separate versions. Is there a point at which there will be fewer versions? Since these are weekly versions, would it make sense to stick to just 6 months of weeklies and 2-3 LTS versions?

@tianon
Member

tianon commented Mar 10, 2015

I'd really love to get some Jenkins user feedback on this. I know I haven't run into any serious plugin compatibility issues myself, and definitely not to the level of needing six full months' worth of weekly releases to track down something that works.

@ndeloof
Contributor Author

ndeloof commented Mar 10, 2015

I have customers running various Jenkins versions. Most of them do use LTS, but others may run a weekly ... 8 months old. As the Dockerfiles are generated via a stackbrew script I don't see the issue with this approach, as it ensures all tags get a common usage.
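
Roughly, the generation step just loops over the supported versions and emits one library entry per version, all pinned to the same commit and template. A minimal sketch (hypothetical script name and version list, not the actual jenkinsci generator):

#!/bin/bash
# hypothetical generate-stackbrew-library.sh - illustration only
versions="1.554.1 1.580.3 1.596.1 1.600"
commit="$(git rev-parse HEAD)"
for v in $versions; do
    # library entry format: <tag>: <git url>@<commit> <directory>
    echo "$v: git://github.com/cloudbees/jenkins-ci.org-docker@$commit $v"
done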

@yosifkit
Member

Build test of #544; 26cec21 (jenkins):

$ url="https://raw.githubusercontent.com/docker-library/official-images/26cec21018af5533bb31b8e602b0019c90539d23/library/jenkins"
$ bashbrew build "$url"
Fetching jenkins (git://github.com/cloudbees/jenkins-ci.org-docker) ...
Processing jenkins:1.554.1 ...
Processing jenkins:1.554.2 ...
Processing jenkins:1.554.3 ...
Processing jenkins:1.555 ...
Processing jenkins:1.556 ...
Processing jenkins:1.557 ...
Processing jenkins:1.558 ...
Processing jenkins:1.559 ...
Processing jenkins:1.560 ...
Processing jenkins:1.561 ...
Processing jenkins:1.562 ...
Processing jenkins:1.563 ...
Processing jenkins:1.564 ...
Processing jenkins:1.565.1 ...
Processing jenkins:1.565.2 ...
Processing jenkins:1.565.3 ...
Processing jenkins:1.565 ...
Processing jenkins:1.566 ...
Processing jenkins:1.567 ...
Processing jenkins:1.568 ...
Processing jenkins:1.569 ...
Processing jenkins:1.570 ...
Processing jenkins:1.571 ...
Processing jenkins:1.572 ...
Processing jenkins:1.573 ...
Processing jenkins:1.574 ...
Processing jenkins:1.575 ...
Processing jenkins:1.576 ...
Processing jenkins:1.577 ...
Processing jenkins:1.578 ...
Processing jenkins:1.579 ...
Processing jenkins:1.580.1 ...
Processing jenkins:1.580.2 ...
Processing jenkins:1.580.3 ...
Processing jenkins:latest ...
Processing jenkins:1.580 ...
Processing jenkins:1.581 ...
Processing jenkins:1.582 ...
Processing jenkins:1.583 ...
Processing jenkins:1.584 ...
Processing jenkins:1.585 ...
Processing jenkins:1.586 ...
Processing jenkins:1.587 ...
Processing jenkins:1.588 ...
Processing jenkins:1.589 ...
Processing jenkins:1.590 ...
Processing jenkins:1.591 ...
Processing jenkins:1.592 ...
Processing jenkins:1.593 ...
Processing jenkins:1.594 ...
Processing jenkins:1.595 ...
Processing jenkins:1.596 ...
Processing jenkins:1.597 ...
Processing jenkins:1.598 ...
Processing jenkins:1.599 ...
Processing jenkins:1.600 ...
Processing jenkins:weekly ...
$ bashbrew list "$url" | xargs test/run.sh
testing jenkins:1.554.1
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.554.2
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.554.3
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.555
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.556
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.557
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.558
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.559
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.560
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.561
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.562
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.563
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.564
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.565.1
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.565.2
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.565.3
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.565
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.566
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.567
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.568
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.569
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.570
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.571
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.572
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.573
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.574
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.575
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.576
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.577
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.578
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.579
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.580.1
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.580.2
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.580.3
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:latest
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.580
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.581
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.582
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.583
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.584
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.585
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.586
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.587
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.588
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.589
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.590
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.591
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.592
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.593
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.594
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.595
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.596
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.597
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.598
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.599
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:1.600
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed
testing jenkins:weekly
    'utc' [1/2]...passed
    'cve-2014--shellshock' [2/2]...passed

@yosifkit
Member

LGTM

@md5
Contributor

md5 commented Mar 10, 2015

As the Dockerfiles are generated via a stackbrew script I don't see the issue with this approach, as it ensures all tags get a common usage.

I'd point out these issues that I see with your approach:

  1. Build time and other resources
  2. Storage space
  3. Confusing UX when visiting https://registry.hub.docker.com/_/jenkins/

The first two items may be partially mitigated by layer caching on the build servers, but the bulk of each of these images beyond what the base java and debian images provide is going to be the layer that downloads Jenkins itself, which will differ for each tag.
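
To make that concrete, the version-specific part is essentially the step that downloads the war. An illustrative fragment (not the actual jenkins Dockerfile; the paths, env name, and availability of curl are assumptions):

FROM java:openjdk-7u65-jdk
# everything up to here is shared across all jenkins tags and cacheable
ENV JENKINS_VERSION 1.600
# this layer embeds the version in the URL, so it is unique to each tag
# and cannot be reused from the build cache across versions
RUN mkdir -p /usr/share/jenkins \
    && curl -fSL http://mirrors.jenkins-ci.org/war/${JENKINS_VERSION}/jenkins.war \
         -o /usr/share/jenkins/jenkins.war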

The third item is something that could be improved, but I honestly don't know what a user who has to scroll through two+ screens of Jenkins tags on that page is supposed to think.

I wish that Docker Hub published pull statistics on a per-tag basis, but I strongly suspect most people are downloading latest or an LTS version (if they have enough Jenkins knowledge to distinguish them).

@ndeloof
Contributor Author

ndeloof commented Mar 11, 2015

@md5 stop me if I'm wrong, but

  1. the stackbrew script only registers new commits for the images I've added, until I make a change to the underlying Dockerfile, which will indeed require a looooong build to rebuild them all. Is this really an issue? I mean, don't you use elastic cloud nodes on the build infra?
  2. as those images have been published to Docker Hub, even if I remove them from my stackbrew script they will remain, so they'll consume the same disk space, won't they?
  3. same issue as http://mirrors.jenkins-ci.org/war/ - http://jenkins-ci.org/ only exposes the LTS and the latest weekly. Maybe I should just add a note in the README to promote those ones.

@Vlatombe

I'm jumping in here, but it could be interesting to have two repositories: jenkins-weekly and jenkins-lts, just to ease people's lives and to answer @md5's concern about scrolling through lots of pages ;)

My 2 cents,

@md5
Contributor

md5 commented Mar 11, 2015

@ndeloof so looking at http://jenkins-ci.org/, it's only showing 1.602 and 1.596.1. It sounds like you'd be fine with the docs on Docker Hub showing only those two versions, as long as you can still rebuild new images of the old versions as needed (back to some unspecified oldest version). Is that accurate?

As for the setup of the official build server, I only know what I've gleaned from GitHub, since I don't have any official involvement with the team running the official images program. Perhaps I should not have speculated about that part of it, since I may have been speaking out of turn.

@ndeloof
Contributor Author

ndeloof commented Mar 11, 2015

I'd be fine to list only the most recent versions on the Docker Hub page, but I don't have control over this.

@Vlatombe tags are designed for this usage. We already have "weekly" and "latest" (i.e. LTS).

@Vlatombe

@ndeloof Yes I agree, I guess in the end it comes down to a usability issue of the https://registry.hub.docker.com/_/jenkins/tags/manage/ page when the repository contains loads of tags ;)

@yosifkit
Member

A few points:

  • images in the library file determine what is listed on the Docker Hub page
  • images in the file are rebuilt anytime the cache breaks (so if debian:jessie or java:openjdk-7u65-jdk updates, or they themselves change)
  • images not in the file remain on the Docker Hub indefinitely (but obviously are not rebuilt if the base image changes)
  • all items in the list are built locally (without cache) for testing PRs, since they are all supported (see the build test comment above).

@ricoli

ricoli commented Mar 17, 2015

by the way, Jenkins has a new LTS (1.596.1) so this PR needs another small update :)

@ndeloof
Contributor Author

ndeloof commented Mar 17, 2015

... as well as many new weekly releases.
This manual review process is cumbersome for a project as active as Jenkins, with weekly releases. I'd like us to be able to push a new Dockerfile as soon as a release is done (or even better: as part of the release process).

@md5
Contributor

md5 commented Mar 17, 2015

Looks like @yosifkit already gave this a LGTM, so it's just waiting on another from @tianon or @Moghedrin.

I still don't think that having all these old releases available in this way is a good idea, but that situation was already accepted when the jenkins images were first taken over by CloudBees, as I understand it, so it shouldn't block this merge.

Apologies for bike-shedding.

@md5
Contributor

md5 commented Mar 17, 2015

P.S. I really think the UX will be better if you merge jenkinsci/docker#71 and regenerate the library file from that. At least then the releases you actually want people to download will be listed first.

@md5
Contributor

md5 commented Mar 17, 2015

Never mind about @yosifkit's LGTM, I see this in the Travis output after your latest push:

Processing jenkins:1.601 ...
- failed; invalid ref: 
Processing jenkins:1.602 ...
- failed; invalid ref: 
Processing jenkins:1.603 ...
- failed; invalid ref: 
Processing jenkins:1.604 ...
- failed; invalid ref: 
Processing jenkins:1.605 ...
- failed; invalid ref: 
Processing jenkins:weekly ...
- failed; invalid ref: 

@ndeloof
Contributor Author

ndeloof commented Mar 17, 2015

Applied your proposed UI PR.
jenkinsci/docker@b75dc1a does exist in the repo.

@md5
Contributor

md5 commented Mar 17, 2015

I just realized that the output is incorrect for the current weekly and LTS releases. It's missing the directory name. I'll fix that and send you a new PR.
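
For reference, each library line has the form <tag>: <git url>@<commit> <directory>, and the trailing directory is what was presumably dropped from the newly generated entries. Purely illustrative (commit abbreviated):

# broken - no directory after the commit
weekly: git://github.com/cloudbees/jenkins-ci.org-docker@b75dc1a
# fixed - directory name appended
weekly: git://github.com/cloudbees/jenkins-ci.org-docker@b75dc1a 1.605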

Sorry!

@ndeloof
Contributor Author

ndeloof commented Mar 17, 2015

oh, indeed. np, I should have reviewed this more carefully :)

@tianon
Member

tianon commented Mar 17, 2015 via email

@ndeloof
Contributor Author

ndeloof commented Mar 17, 2015

All those tags are created from a template, as you suggested earlier, based on the postgres sample IIRC.
So I don't see why a longer review is needed when all versions are actually based on the same template.

@md5
Contributor

md5 commented Mar 17, 2015

@ndeloof can you re-run the script so we can get a clean Travis build?

@tianon I hear what you're saying, but in this case the actual git tag has only changed for the newly added releases. If those releases had each been submitted as their own PR, the review of them could have been quick.

That being said, in the case that the template actually does change in a way that the git tag updates for every Docker tag, then I agree that the amount of work involved increases.

@ndeloof
Contributor Author

ndeloof commented Mar 17, 2015

Then it would make more sense for the generation script to be run on your side, so that the pull request is just a minor metadata update: either changes to the template (simple review, all images regenerated) or just a new version added to the metadata with no template changes (trivial review).

@md5
Contributor

md5 commented Mar 17, 2015

@ndeloof Not sure how much of your comment was directed at me, if any. I'll just re-emphasize as I mentioned before that I'm not a maintainer of the official-images repo, just an interested third party.

@ndeloof
Contributor Author

ndeloof commented Mar 17, 2015

@md5 indeed, that wasn't directed at you but at @tianon

@yosifkit
Member

@ndeloof, could you go review and merge jenkinsci/docker#74 so that we can then get a working set here 🙏?

@ndeloof
Contributor Author

ndeloof commented Mar 17, 2015

I'm having a discussion with my colleague Michael Neale on ways to avoid continuous updates for the weekly releases. Will create a fresh new PR once we reach a reasonable conclusion.

@ndeloof ndeloof closed this Mar 17, 2015
@michaelneale
Contributor

I think this is a signal that the weekly releases each having their own tag isn't going to work with this process. I recommend just having LTS here and a big sign (somehow) saying to use jenkinsci/jenkins: for a specific weekly version if someone really needs it.

Otherwise have a "weekly" tag which changes week to week (but I think that is out of favour).

@yosifkit
Member

I think a latest and a weekly tag (with their applicable version numbers) would be superb and simple for reviewing. Could also drop in the set of LTS versions that are currently supported as well. There could also be a simple link in the docs on the Docker Hub pointing to "maintained" older versions.

@michaelneale
Contributor

Tags for every LTS are entirely reasonable - people do care about those specific version numbers, more so than the weekly ones.


@tianon
Member

tianon commented Mar 17, 2015

@yosifkit 👍 I think that sounds really reasonable

@michaelneale
Contributor

@yosifkit @tianon what would they refer to? latest being the latest LTS? weekly being the latest weekly?

@tianon
Member

tianon commented Mar 17, 2015 via email

@tianon
Member

tianon commented Mar 17, 2015

@michaelneale yeah, I think having jenkins:latest be the "latest LTS" makes sense, similar to how ubuntu:latest works

@michaelneale
Contributor

Should there be discrete tagged versions for past LTS releases - like ubuntu also has? (That analogy holds - the only difference is that the Jenkins tempo for LTS is faster than ubuntu's at the moment - but the same otherwise - people care about those versions.)

I think weeklies are possibly best somewhere else - or at most - a weekly tag (which is hopefully the latest one - but then, as this is a manual process, like getting an app through the app store, weekly is probably too frequent for this repo).

@md5
Contributor

md5 commented Mar 17, 2015

but then, as this is a manual process, like getting an app through the app store, weekly is probably too frequent for this repo).

I disagree with this part. iojs and crate are making nearly weekly releases that only involve simple version updates.

I think having a single weekly tag here for the latest weekly, latest for the latest LTS, and numbered tags for anything that is pushed sounds like the right way to go.

@yosifkit
Member

I think something like the below would work perfectly and would only be a version bump for the weekly (which should be like a 5 minute review). Then you can keep the LTS versions here for as long as they are supported. As I noted in a previous comment, the versions that drop out of this list will still be available on the Docker Hub, just not displayed as "supported" and not rebuilt with any parent image updates.

# group: Current Releases
1.596.1: git://github.com/cloudbees/jenkins-ci.org-docker@b75dc1ae86f9aac71fef358e737ae71dab8cc55b 1.596.1
latest: git://github.com/cloudbees/jenkins-ci.org-docker@b75dc1ae86f9aac71fef358e737ae71dab8cc55b 1.596.1

1.605: git://github.com/cloudbees/jenkins-ci.org-docker@b75dc1ae86f9aac71fef358e737ae71dab8cc55b 1.605
weekly: git://github.com/cloudbees/jenkins-ci.org-docker@b75dc1ae86f9aac71fef358e737ae71dab8cc55b 1.605

# group: Previous LTS Releases
1.580.1: git://github.com/cloudbees/jenkins-ci.org-docker@40c3e3f46939b9f9dcf8d46e62fa7daa80485588 1.580.1
1.565.3: git://github.com/cloudbees/jenkins-ci.org-docker@40c3e3f46939b9f9dcf8d46e62fa7daa80485588 1.565.3
1.565.2: git://github.com/cloudbees/jenkins-ci.org-docker@f313389f8ab728d7b4207da36804ea54415c830b 1.565.2
1.565.1: git://github.com/cloudbees/jenkins-ci.org-docker@f313389f8ab728d7b4207da36804ea54415c830b 1.565.1
1.554.3: git://github.com/cloudbees/jenkins-ci.org-docker@40c3e3f46939b9f9dcf8d46e62fa7daa80485588 1.554.3
1.554.2: git://github.com/cloudbees/jenkins-ci.org-docker@f313389f8ab728d7b4207da36804ea54415c830b 1.554.2
1.554.1: git://github.com/cloudbees/jenkins-ci.org-docker@f313389f8ab728d7b4207da36804ea54415c830b 1.554.1

@tianon
Member

tianon commented Mar 17, 2015 via email

@md5
Contributor

md5 commented Mar 17, 2015

@yosifkit I'm not sure exactly how Jenkins LTS releases work, but it seems like having only the latest 1.M.N tag per "1.M" would make more sense.

# group: Current Releases
1.596.1: git://github.com/cloudbees/jenkins-ci.org-docker@b75dc1ae86f9aac71fef358e737ae71dab8cc55b 1.596.1
latest: git://github.com/cloudbees/jenkins-ci.org-docker@b75dc1ae86f9aac71fef358e737ae71dab8cc55b 1.596.1

1.605: git://github.com/cloudbees/jenkins-ci.org-docker@b75dc1ae86f9aac71fef358e737ae71dab8cc55b 1.605
weekly: git://github.com/cloudbees/jenkins-ci.org-docker@b75dc1ae86f9aac71fef358e737ae71dab8cc55b 1.605

# group: Previous LTS Releases
1.580.1: git://github.com/cloudbees/jenkins-ci.org-docker@40c3e3f46939b9f9dcf8d46e62fa7daa80485588 1.580.1
1.565.3: git://github.com/cloudbees/jenkins-ci.org-docker@40c3e3f46939b9f9dcf8d46e62fa7daa80485588 1.565.3
1.554.3: git://github.com/cloudbees/jenkins-ci.org-docker@40c3e3f46939b9f9dcf8d46e62fa7daa80485588 1.554.3

@yosifkit
Member

I was just copying what was currently there, minus all the old weekly versions, so I am unsure which of them are still supported upstream.

@michaelneale
Contributor

Generally updates are for the last few LTSes. Not so much for weeklies (I think some security patches do get applied more widely from time to time).

The LTS versions are chosen from weeklies - one weekly gets picked to be the LTS. The numbering scheme doesn't necessarily reflect this. LTS releases get updates for some time, even long after a new LTS is available.

Hope that clears things up a bit.

So it sounds like LTS, with a latest tag, and past LTS tags is easily doable.

For weeklies - it is mentioned that other projects do this (e.g. io.js), so that is possible. Do they not tag the weekly releases but only keep one latest weekly?

@md5
Contributor

md5 commented Mar 17, 2015

They don't have a concept of "LTS" or "weekly" for iojs as far as I know, but they do push out point releases regularly and it has been my observation that @tianon and @yosifkit turn around the requests within a day at most.

See a list of the iojs PRs here for reference: https://github.com/docker-library/official-images/issues?q=+label%3Alibrary%2Fiojs+

@tianon
Member

tianon commented Mar 17, 2015 via email

@michaelneale
Contributor

@tianon is there a tag for each weekly release?

@tianon
Member

tianon commented Mar 18, 2015

https://github.com/docker-library/official-images/blob/74cda4688d1818920c4d96013245791b91b3c2be/library/iojs

Not sure exactly what you mean, but as you can see there, each version is tagged multiple times.

#543 is a good example PR for a representative update to that image -- the 1.5.0 tag is replaced by 1.5.1, but the 1.5.0 image remains on the Hub, it's just no longer listed as "supported".
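
Applied to jenkins, that kind of weekly bump would be the same one-line swap in the library file, for example (the next version number and commit are hypothetical):

1.605: git://github.com/cloudbees/jenkins-ci.org-docker@b75dc1ae86f9aac71fef358e737ae71dab8cc55b 1.605
weekly: git://github.com/cloudbees/jenkins-ci.org-docker@b75dc1ae86f9aac71fef358e737ae71dab8cc55b 1.605

becomes, a week later, something like

1.606: git://github.com/cloudbees/jenkins-ci.org-docker@<new commit> 1.606
weekly: git://github.com/cloudbees/jenkins-ci.org-docker@<new commit> 1.606

with the 1.605 image remaining on the Hub, just no longer listed as "supported".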

@michaelneale
Contributor

So what is the trouble that started all this? Too many tags in a PR?

ndeloof added a commit to jenkinsci/docker that referenced this pull request Mar 18, 2015
@ndeloof ndeloof mentioned this pull request Mar 18, 2015