[SNOW-843760] Update pom.xml corresponding to Wiz vulnerability scan #546
Conversation
Codecov Report

```
@@           Coverage Diff           @@
##           master     #546   +/-  ##
=======================================
  Coverage   78.30%   78.30%
=======================================
  Files          76       76
  Lines        4734     4734
  Branches      424      424
=======================================
  Hits         3707     3707
  Misses        846      846
  Partials      181      181
=======================================
```
Thanks Revi! Do you think we should provide a private jar to customers and ask them to run it with Blackduck, in case there are still some differences? My concern is that we may need another round of SDK/KC releases, which is time-consuming.

Agreed - discussed offline, let's ask them to run a Blackduck scan for us. Created Jira https://snowflakecomputing.atlassian.net/browse/SNOW-860497
lgtm, thanks!
I'm hitting the following error, so I think we need to retain the shaded Hadoop jar. I'll add it back in; it was originally removed because I thought it caused the protobuf vulnerability. @sfc-gh-lsembera any thoughts?
@sfc-gh-rcheng Putting versions into … The Hadoop errors seem to be caused by the linkage checker; you can fix it by adding a rule to …
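For context, pinning a transitive dependency's version in Maven is usually done through `<dependencyManagement>`. A minimal sketch of that pattern (the coordinates are taken from the change list in this PR, but the exact placement and file the comment refers to are not shown in the thread, so treat this as an illustration rather than the fix that was discussed):

```xml
<!-- Hedged sketch: force a transitive dependency to a fixed version.
     Entries here override versions that transitive dependencies would
     otherwise pull in. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.protobuf</groupId>
      <artifactId>protobuf-java</artifactId>
      <version>3.16.3</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```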
Thanks Lukas, you're right, adding the version makes no change to the dependency list. Closing this PR as the versions are high enough to resolve the vulnerabilities.
Discussing on Slack - how to find shaded dependencies
…546)

* update pom for vulns
* update pom
* use fasterxml version
* remove hadoop exclusion
* mvn install passes
* passes
* autoformatting
* remove dependency pom
* relocate parquet-hadoop
* dont relocate
* relocate just airlift not parquet-hadoop
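The "relocate just airlift not parquet-hadoop" step in the commit list refers to a maven-shade-plugin relocation. A hedged sketch of what such a relocation looks like (the `shadedPattern` prefix below is an assumption for illustration, not the prefix actually used in this repository):

```xml
<!-- Hedged sketch of a maven-shade-plugin relocation: rewrite io.airlift
     classes into a shaded package so they cannot clash with a consumer's
     own copy. parquet-hadoop is deliberately NOT relocated here, matching
     the commit note. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <relocations>
      <relocation>
        <pattern>io.airlift</pattern>
        <!-- Assumed prefix; the real connector may use a different one. -->
        <shadedPattern>shaded.io.airlift</shadedPattern>
      </relocation>
    </relocations>
  </configuration>
</plugin>
```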
Changes, full scan in Slack (ping Revi for the link):

- net.minidev:json-smart: 2.4.7 -> 2.4.9
  - com.nimbusds:nimbus-jose-jwt references json-smart
  - note: can't find a reference to this anymore, even using Lukas's process
- com.google.protobuf:protobuf-java: 3.7.1 -> 3.16.3 and com.google.guava:guava: 30.1.1-jre -> 32.0.0
  - org.apache.hadoop/hadoop-common (exclude io.dropwizard.metrics:metrics-core to converge dependencies)
  - org.apache.hadoop.thirdparty/hadoop-shaded-guava-1.1.1: 3.71
  - org.apache.hadoop.thirdparty/hadoop-shaded-protobuf_3_7-1.1.1.jar: 3.7.1
- com.fasterxml.jackson.core:jackson-databind: 2.13.2.2 -> 2.13.4.2
  - org.apache.parquet/parquet-hadoop references jackson-databind
- org.apache.commons/commons-configuration2: 2.13.3
  - org.apache.hadoop/hadoop-common
  - org.apache.parquet/parquet-hadoop
- parquet-jackson-1.13.1: 2.13.4.2
  - org.apache.parquet/parquet-jackson
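The metrics-core exclusion mentioned for hadoop-common is a standard Maven `<exclusions>` entry. A minimal sketch (the hadoop-common version is omitted because it is not stated in the list above; this shows the shape of the exclusion, not the exact diff merged in this PR):

```xml
<!-- Hedged sketch: exclude io.dropwizard.metrics:metrics-core from
     hadoop-common to converge dependency versions, as noted in the
     change list. -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-common</artifactId>
  <exclusions>
    <exclusion>
      <groupId>io.dropwizard.metrics</groupId>
      <artifactId>metrics-core</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```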