# [CodeRefactor] Modify FE modules #4146
Conversation
**build.sh** (Outdated)
    cp -r -p ${DORIS_HOME}/webroot/* ${DORIS_OUTPUT}/fe/webroot/
    # Copy Frontend and Backend
    if [ ${BUILD_FE} -eq 1 -o ${BUILD_SPARK_DPP} -eq 1 ]; then
        if [ ${BUILD_FE} -eq 1]; then
Suggested change:

    -        if [ ${BUILD_FE} -eq 1]; then
    +        if [ ${BUILD_FE} -eq 1 ]; then
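The missing space matters because `[` is itself a command, so `]` must arrive as a separate word. A minimal demonstration (the variable value is illustrative):

```shell
# `[` is a command; `]` must be its own argument. With "1]" glued
# together, the shell reports "missing `]'" and the test exits non-zero.
BUILD_FE=1

if [ ${BUILD_FE} -eq 1 ]; then      # correct: space before ]
    echo "building fe"
fi

# broken form: the condition errors out instead of evaluating to true
if [ ${BUILD_FE} -eq 1] 2>/dev/null; then
    echo "never reached"
else
    echo "syntax error without the space"
fi
```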
        </dependencies>

        <build>
            <finalName>spark-dpp-${version}</finalName>
I think it would be better to name it without the version, just like "palo-fe.jar", maybe "spark-dpp.jar", because we will have a DppVersion in FeConstant.
Without the version, it is difficult to find out which version it is when we see this file.
    This module is used to store some common classes of other modules.

    # spark-dpp
It would be better to add an explanation for DPP.
…4163)

### Resume

When users use Spark Load, they have to upload the dependent jars to HDFS every time. This CL adds a self-generated repository under the working_dir folder in HDFS for saving the dependencies of the Spark DPP program and the Spark platform.

Note that the dependencies we upload to the repository include:

1. `spark-dpp.jar`
2. `spark2x.zip`

1 is the DPP library built with the spark-dpp submodule. See details about the spark-dpp submodule in PR #4146. 2 is the Spark 2.x.x platform library, which contains all jars in $SPARK_HOME/jars.

**The repository structure** will be like this:

```
__spark_repository__/
|-__archive_1_0_0/
| |-__lib_990325d2c0d1d5e45bf675e54e44fb16_spark-dpp.jar
| |-__lib_7670c29daf535efe3c9b923f778f61fc_spark-2x.zip
|-__archive_2_2_0/
| |-__lib_64d5696f99c379af2bee28c1c84271d5_spark-dpp.jar
| |-__lib_1bbb74bb6b264a270bc7fca3e964160f_spark-2x.zip
|-__archive_3_2_0/
| |-...
```

The following conditions will force FE to upload the dependencies:

1. FE finds that its dppVersion is absent in the repository.
2. The MD5 value of the remote file does not match the local file.

Before FE uploads the dependencies, it will create an archive directory named `__archive_{dppVersion}` under the repository.
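The upload decision described above (re-upload when the version archive is missing or the MD5 differs) can be sketched as follows. This is an illustrative shell sketch, not the FE implementation; the function name is hypothetical, and the `__lib_<md5>_<name>` convention is taken from the repository layout shown above.

```shell
# Illustrative sketch, not the actual FE code: given a local dependency
# and the name of the corresponding remote file, decide whether it must
# be (re-)uploaded. Remote files follow the __lib_<md5>_<name> pattern.
needs_upload() {
    local local_file="$1" remote_name="$2"
    local md5
    md5=$(md5sum "${local_file}" | awk '{print $1}')
    # Upload when the expected name is absent, i.e. the version is new
    # or the file content changed (MD5 mismatch).
    [ "${remote_name}" != "__lib_${md5}_$(basename "${local_file}")" ]
}
```

With this naming convention a content change automatically produces a new remote name, so a plain existence check doubles as an MD5 check.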
yangzhg left a comment:

LGTM
### Proposed changes

This CL mainly changes:

1. Add 2 new FE modules:
   - `fe-common`: saves all common classes for other modules, currently only `jmockit`.
   - `spark-dpp`: the Spark DPP application for Spark Load. I moved all DPP-related classes into this module, including unit tests.
2. Change `build.sh`: add a new param `--spark-dpp` to compile the `spark-dpp` module alone, while `--fe` will compile all FE modules. The output of the `spark-dpp` module is `spark-dpp-1.0.0-jar-with-dependencies.jar`, and it will be installed to `output/fe/spark-dpp/`.

### Types of changes

### Checklist
### Further comments

After this PR is merged, the Spark Load feature will NOT work; it needs another PR to modify the way the new `spark-dpp.jar` is deployed.
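The flag handling described under "Proposed changes" could look roughly like this in `build.sh`. This is a hedged sketch, not the merged script: the variable names follow the diff quoted earlier, but the parsing loop itself is assumed.

```shell
# Illustrative sketch of the new build flags (not the merged build.sh).
BUILD_FE=0
BUILD_SPARK_DPP=0
for arg in "$@"; do
    case "${arg}" in
        --fe)        BUILD_FE=1 ;;        # compile all FE modules
        --spark-dpp) BUILD_SPARK_DPP=1 ;; # compile spark-dpp alone
    esac
done

# Matches the condition in the diff above: install the spark-dpp output
# when either flag is set.
if [ ${BUILD_FE} -eq 1 -o ${BUILD_SPARK_DPP} -eq 1 ]; then
    echo "installing spark-dpp output"
fi
```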