
Go version in Admission Controller 0.9.2 #5455

Closed
coltonfreeman26 opened this issue Jan 31, 2023 · 19 comments
Assignees
Labels
area/vertical-pod-autoscaler kind/feature Categorizes issue or PR as related to a new feature. lifecycle/stale Denotes an issue or PR has remained open with no activity and has become stale.

Comments

@coltonfreeman26

coltonfreeman26 commented Jan 31, 2023

Good day all,
I am a member of the Iron Bank team on Platform One and we are consuming the vpa-admission-controller:0.9.2 image from k8s.gcr.io
Our scan tools have picked up quite a few findings regarding the version of Go currently being used (1.14.3).
We are using the full VPA suite and have these findings for all of its components: Recommender, Updater, and Admission Controller.
Do you know if there is any plan to upgrade the Go version for these products?
I did reach out to the security team via email and they pointed me back to the public forum.
Please let me know if you have any questions.

@coltonfreeman26 coltonfreeman26 added the kind/feature Categorizes issue or PR as related to a new feature. label Jan 31, 2023
@coltonfreeman26
Author

coltonfreeman26 commented Jan 31, 2023

My apologies,
Our Renovate tool picked up a newer version, 0.12.0, which resolved quite a few of the findings by updating to Go 1.19.
Do you know if there is any plan to update to the latest 1.19 patch release?
Also, github.com/emicklei/go-restful/v3 v3.8.0 has a finding as well. This has been fixed in v3.10.1:
getporter/operator#133
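(For context, the dependency bump being requested here could be sketched as follows, run inside the Go module that pins go-restful; the module path is taken from the comment above, and this assumes the Go toolchain and network access are available.)

```shell
# Hypothetical sketch: bump go-restful to the fixed release inside the
# module that declares it, then verify everything still builds.
go get github.com/emicklei/go-restful/v3@v3.10.1
go mod tidy
go build ./...
```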

@jbartosik
Collaborator

Currently I'm in the process of preparing VPA 0.13.0 (#5355). We build the release using an image based on the golang:1.19 image, so I'm not sure which patch version was used to build the images.

Can you check the 0.13.0 images (kubernetes/k8s.io#4703), or you can wait until I finish the release? I'd like to understand which dependencies in the latest image have known security vulnerabilities (sounds like github.com/emicklei/go-restful/v3 v3.8.0 and maybe an old version of Go).

And then, to decide if we can do a patch release, I want to understand:

  • what are those vulnerabilities,
  • how much work it would be to remove them,
  • if I can get some help with fixing them and doing a patch release.
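(A sketch of how those dependencies could be enumerated from a published image; the image tag and binary name follow the commands used later in this thread, the `govulncheck` step is an assumption, and Docker plus network access are required.)

```shell
# Extract the binary from the published image.
docker create --name vpa-inspect registry.k8s.io/autoscaling/vpa-updater:0.13.0
docker cp vpa-inspect:updater ./updater
docker rm vpa-inspect

# `go version -m` prints the toolchain plus every module version embedded
# in the binary, so a pinned-but-vulnerable dependency such as
# go-restful v3.8.0 shows up here.
go version -m ./updater

# Scan the binary against the Go vulnerability database (assumes
# govulncheck is installed: go install golang.org/x/vuln/cmd/govulncheck@latest).
govulncheck -mode=binary ./updater
```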

@coltonfreeman26
Author

Good morning @jbartosik
I can pull the latest 0.13.0 and look at it. Once our automation tool picks it up and runs it through our scanners, I can comment here again with whatever they are still catching. Thank you for this.

@coltonfreeman26
Author

Hello @jbartosik
I just pushed it through our pipelines and the findings are still present. There are a total of 8, 7 of which are related to Go 1.19.3 and prior.
Do you work on the autoscaler team as well, or should I open a new issue to see about getting that image bumped?

@jbartosik jbartosik self-assigned this Feb 3, 2023
@jbartosik
Collaborator

I can update go and go-restful

Is the list of vulnerabilities available in some public place?

@coltonfreeman26
Author

If this would also update the autoscaler, that would be awesome. I can post the CVEs here if you would like.

@jbartosik
Collaborator

If this would also update the autoscaler

Do you mean Cluster Autoscaler? I don't maintain CA, only VPA (Vertical Pod Autoscaler).

I can post the CVEs here if you would like.

Please do.

@voelzmo
Contributor

voelzmo commented Feb 6, 2023

It indeed looks like the VPA components were not built with the most recent Go version:

$ docker create registry.k8s.io/autoscaling/vpa-updater:0.13.0
fd2cbc57b979c5fa498c4bc0e75894f58f80eca37e90cd9338502151f3c22ebb

$ docker cp fd2cbc57b979c5fa498c4bc0e75894f58f80eca37e90cd9338502151f3c22ebb:updater - > ~/scratch/updater.tgz

$ tar -xvf updater.tgz
updater

$ go version -v updater
updater: go1.19

Whereas, when I build this locally in Docker, it builds correctly with go1.19.5:

$ make build-in-docker-amd64

$ go version updater-amd64
updater-amd64: go1.19.5

@jbartosik Did the image get cached somewhere so it used the very outdated go1.19? Tags for the Docker images are updated as soon as a new minor version is released. For the VPA 0.13.0 release, I would have expected the release job to have picked up go1.19.5, which was released on 2023-01-10.
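(The manual check above could be scripted. A minimal sketch follows; the function name and minimum version are illustrative assumptions, and `sort -V` requires GNU coreutils.)

```shell
# Fail when a binary's embedded toolchain (as reported by `go version <file>`,
# e.g. "updater: go1.19") is older than a required minimum.
check_go_version() {
  built="$1"  # toolchain the binary reports, e.g. go1.19
  min="$2"    # minimum acceptable version, e.g. go1.19.5
  # `sort -V` (GNU coreutils) orders go1.19 before go1.19.5
  newest=$(printf '%s\n%s\n' "$built" "$min" | sort -V | tail -n1)
  [ "$newest" = "$built" ]
}

check_go_version go1.19.5 go1.19.5 && echo "ok"
check_go_version go1.19   go1.19.5 || echo "too old: rebuild with a newer toolchain"
```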

@jbartosik
Collaborator

It's possible an old image got cached on my workstation; I can check tomorrow.

@jbartosik
Collaborator

Looks like my workstation had an old image: the one I had was 5 months old. When I did docker pull golang:1.19, it got updated to an image created 2 days ago.

@jbartosik
Collaborator

I think it would be better to specify exactly which Go version we're using.
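(One way to do that, as a sketch rather than the project's actual build file, is to pin the builder image to an exact patch release, or to a digest, instead of the floating `golang:1.19` tag.)

```dockerfile
# Pin to an exact patch release so the build does not silently depend on
# whatever the local cache holds for the floating golang:1.19 tag.
FROM golang:1.19.5 AS builder

# Or, for fully reproducible builds, pin by digest (placeholder shown;
# substitute the real digest from `docker images --digests`):
# FROM golang@sha256:<digest> AS builder
```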

@coltonfreeman26
Author

Hello all. Thank you for the work on this thus far; I am looking forward to the changes you have made.
As for any other findings: would you like me to use this issue or open a new one?

@jbartosik
Collaborator

I want to merge kubernetes/k8s.io#4973 before doing a release.

@k8s-triage-robot

The Kubernetes project currently lacks enough contributors to adequately respond to all issues.

This bot triages un-triaged issues according to the following rules:

  • After 90d of inactivity, lifecycle/stale is applied
  • After 30d of inactivity since lifecycle/stale was applied, lifecycle/rotten is applied
  • After 30d of inactivity since lifecycle/rotten was applied, the issue is closed

You can:

  • Mark this issue as fresh with /remove-lifecycle stale
  • Close this issue with /close
  • Offer to help out with Issue Triage

Please send feedback to sig-contributor-experience at kubernetes/community.

/lifecycle stale

@k8s-ci-robot k8s-ci-robot added the lifecycle/stale Denotes an issue or PR has remained open with no activity and has become stale. label Jun 15, 2023
@jbartosik
Collaborator

/remove-lifecycle stale

@k8s-ci-robot k8s-ci-robot removed the lifecycle/stale Denotes an issue or PR has remained open with no activity and has become stale. label Jun 16, 2023
@kgolab
Collaborator

kgolab commented Jun 22, 2023

Release 0.14.0 (https://github.com/kubernetes/autoscaler/releases/tag/vertical-pod-autoscaler-0.14.0) bumped golang to 1.20.5.

@coltonfreeman26 , could you please confirm this fixes the problem for you?

@k8s-triage-robot

The Kubernetes project currently lacks enough contributors to adequately respond to all issues.

This bot triages un-triaged issues according to the following rules:

  • After 90d of inactivity, lifecycle/stale is applied
  • After 30d of inactivity since lifecycle/stale was applied, lifecycle/rotten is applied
  • After 30d of inactivity since lifecycle/rotten was applied, the issue is closed

You can:

  • Mark this issue as fresh with /remove-lifecycle stale
  • Close this issue with /close
  • Offer to help out with Issue Triage

Please send feedback to sig-contributor-experience at kubernetes/community.

/lifecycle stale

@k8s-ci-robot k8s-ci-robot added the lifecycle/stale Denotes an issue or PR has remained open with no activity and has become stale. label Jan 23, 2024
@coltonfreeman26
Author

Sorry for the late response on this. 0.14.0 did resolve all of the original CVEs I posted. Now there are some new ones regarding 1.20.5, but that's just the way the game goes. Thank you for the help on this; I'll go ahead and close this out.
