
[Enhancement] Exclude jars that Spark already provides from the build artifact #241

Open

melin opened this issue Dec 22, 2024 · 1 comment

Comments

@melin

melin commented Dec 22, 2024

Search before asking

  • I have searched the issues and found no similar issues.

Description

I suggest that the packaged artifact exclude jars that Spark already provides, such as jackson, org.apache.commons, and so on.
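In build-tool terms, the request amounts to marking libraries that Spark already ships as compile-time-only so they are left out of the packaged jar. A minimal sketch in Gradle Kotlin DSL (illustrative only; the coordinates and versions below are assumptions, and the connector's actual build may be Maven-based):

```kotlin
// build.gradle.kts — hypothetical standalone example, not the connector's real build file.
plugins {
    java
}

repositories {
    mavenCentral()
}

dependencies {
    // Spark is supplied by the cluster at runtime, so it is not bundled into the jar.
    compileOnly("org.apache.spark:spark-sql_2.12:3.5.1")

    // The issue proposes treating libraries Spark already ships the same way,
    // so they never end up inside the packaged connector jar.
    compileOnly("com.fasterxml.jackson.core:jackson-databind:2.15.2")
    compileOnly("org.apache.commons:commons-lang3:3.14.0")
}
```

With `compileOnly` (Maven's `provided` scope), those classes must come from the Spark runtime classpath, so the connector becomes sensitive to whichever versions a given Spark distribution ships; the maintainer's reply below explains why the project chose shading instead.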

Solution

No response

Are you willing to submit a PR?

  • Yes, I am willing to submit a PR!

Code of Conduct

@gnehil
Contributor

gnehil commented Jan 9, 2025

These dependencies have been shaded to avoid execution exceptions caused by differences in the Spark runtime environment version.
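That is, instead of dropping these libraries, the connector bundles privately relocated copies so they cannot collide with the possibly different versions on the Spark classpath. A minimal sketch of the relocation idea using the Gradle Shadow plugin in Kotlin DSL (the project presumably does this with a Maven shade configuration instead; the relocation prefix and versions here are assumptions for illustration):

```kotlin
// build.gradle.kts — hypothetical example of relocation ("shading") with the Gradle Shadow plugin.
plugins {
    java
    id("com.github.johnrengelman.shadow") version "8.1.1"
}

repositories {
    mavenCentral()
}

dependencies {
    implementation("com.fasterxml.jackson.core:jackson-databind:2.15.2")
    implementation("org.apache.commons:commons-lang3:3.14.0")
}

tasks.shadowJar {
    // Move the bundled copies under a private package prefix so they can never
    // clash with whatever jackson / commons versions the Spark runtime provides.
    relocate("com.fasterxml.jackson", "shaded.com.fasterxml.jackson")
    relocate("org.apache.commons", "shaded.org.apache.commons")
}
```

The trade-off versus simply excluding the jars: relocation makes the artifact larger, but the connector's own code stays pinned to the versions it was built and tested against, independent of the Spark distribution.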
