This repository was archived by the owner on May 9, 2024. It is now read-only.

Conversation

@skonto commented Jul 1, 2018

What changes were proposed in this pull request?

  • Adds custom integration tests. Modifies a limited number of files; we will try to upstream what is possible to minimize future merge conflicts.
  • Adds back some functionality from the repo here (https://github.com/databricks/spark-integration-tests).
    Cloud support was removed from the upstream Spark project.
  • This work has revealed the following bugs:
    https://issues.apache.org/jira/browse/SPARK-24711
    https://issues.apache.org/jira/browse/SPARK-24694
    Fixes have been upstreamed.
  • There is a comprehensive README describing how to run the tests. The next step is to add them to Jenkins.
  • The PR is against the k8s-develop branch, which is kept up to date with master. In the future its Jenkins build will try to merge the latest changes from upstream; in case of conflicts we will resolve them manually. The benefit is that we have the tests in one place.
  • Tests cover HDFS and Kafka. Tests are tagged so we can pick the ones we want, for example based on security (coming soon; a PR is already out). See the sketch after this list for how tag-based selection might look.
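
As a rough illustration of the tagging approach (the tag and suite names below are hypothetical, not necessarily the ones used in this PR), a ScalaTest setup could look like this:

```scala
// Hypothetical sketch only; the real tags and suites in this PR may differ.
import org.scalatest.{FunSuite, Tag}

object HdfsTest extends Tag("HdfsTest")
object KafkaTest extends Tag("KafkaTest")

class ExampleIntegrationSuite extends FunSuite {
  test("write and read back a file on HDFS", HdfsTest) {
    // ... exercise HDFS through a Spark job ...
  }

  test("consume records from a Kafka topic", KafkaTest) {
    // ... exercise Kafka through a Spark job ...
  }
}
```

A single group can then be selected from sbt, for example `sbt "testOnly *ExampleIntegrationSuite -- -n HdfsTest"` to run only the HDFS-tagged tests, or `-l KafkaTest` to exclude the Kafka ones.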

How was this patch tested?

Manually; see the image below:

[screenshot of the test run attached to the original PR]

@skonto requested review from deanwampler, maasg, seglo, and yuchaoran2011 and removed the request for seglo on July 1, 2018 at 20:35
@skonto (Author) Jul 1, 2018

Temporarily adds support for external HDFS clusters. This will be removed when the community adds support for HDFS.
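
Purely as an illustration (the property name and values below are made up, not necessarily what this PR uses), an external HDFS endpoint could be wired into a test suite roughly like this:

```scala
// Hypothetical sketch: point the test suite at an external HDFS cluster.
import org.apache.hadoop.conf.Configuration

// e.g. -Dspark.k8s.test.externalHdfsUri=hdfs://namenode.example.com:9000 (made-up property name)
val externalHdfsUri: Option[String] = sys.props.get("spark.k8s.test.externalHdfsUri")

val hadoopConf = new Configuration()
externalHdfsUri.foreach(uri => hadoopConf.set("fs.defaultFS", uri))
```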

;) almost there but +1 on this

@skonto (Author):

:) thanks @ifilonenko

@skonto (Author):

Mark this test as not appropriate for DC/OS: we run against an insecure API server, and when no scheme is defined the current implementation defaults to https. In the future I will remove this and target the secure API server on DC/OS.
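
To make the behaviour concrete, here is a minimal sketch of the default-scheme logic described above (an illustration only, not the actual code in this PR):

```scala
// Hypothetical sketch of "no scheme means https" URL handling.
def normalizeApiServerUrl(url: String): String =
  if (url.startsWith("http://") || url.startsWith("https://")) url
  else s"https://$url" // no scheme given: default to https

// On DC/OS the tests currently target an insecure (http) API server,
// so a test relying on this default is skipped there.
```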

@skonto (Author) commented Jul 2, 2018

The failed checks come from an old PR job; they are safe to ignore.

@deanwampler left a comment

LGTM. A few minor suggestions in the comments.

Would it work to have a conditional wrapping this one that checks whether $HADOOP_HOME/conf exists and contains the config files, and, if not, falls back to the URL? Or are you forcing an update to the correct versions of the files whether they exist or not?

@skonto (Author) Jul 3, 2018

Yes, for now I just update them if the URL exists. I am also waiting for the work @ifilonenko is doing; this is temporary anyway. By the way, it's the same thing DC/OS Spark does:
https://github.com/mesosphere/spark-build/blob/dd14958297da8080615fc431d6ef0ab6f791220c/conf/spark-env.sh#L8-L13
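
For concreteness, here is a minimal sketch of the conditional being discussed; the paths, file names, and URL are placeholders, and the actual change lives in a shell script similar to the DC/OS spark-env.sh linked above:

```scala
// Hypothetical sketch: prefer an existing local Hadoop conf, otherwise fetch it.
import java.nio.file.{Files, Paths}
import scala.sys.process._

val confDir  = Paths.get(sys.env.getOrElse("HADOOP_HOME", "/opt/hadoop"), "conf")
val required = Seq("core-site.xml", "hdfs-site.xml")

val haveLocalConf =
  Files.isDirectory(confDir) && required.forall(f => Files.exists(confDir.resolve(f)))

if (!haveLocalConf) {
  Files.createDirectories(confDir)
  // Placeholder URL; fall back to downloading the config files.
  required.foreach { f =>
    s"curl -fsSL -o ${confDir.resolve(f)} https://example.com/hadoop-conf/$f".!
  }
}
```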

Should we use the same version string we use in releases or perhaps "latest" between releases? This is something we should discuss in the standup.

@skonto (Author):

Yes, whatever makes sense; that's just a random version there.

@skonto (Author):

Will use latest.

Extra space after case ;)

@skonto (Author):

Will fix.

Let's bump this to 1.1.x. We need to convert all projects to an sbt version > 1.0 because of changes in the way you query settings, which I've needed to invoke in the release build.
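
For reference, a minimal sketch of what the bump might involve (version numbers are illustrative, not prescriptive); sbt 1.1+ also brings the unified slash syntax for querying settings:

```scala
// project/build.properties (illustrative version)
//   sbt.version=1.1.6

// build.sbt: under sbt 1.x, settings are queried with the unified slash syntax,
//   (Compile / scalaSource).value
// rather than the sbt 0.13 form
//   (scalaSource in Compile).value
lazy val root = (project in file("."))
  .settings(
    name := "spark-integration-tests", // placeholder project name
    scalaVersion := "2.11.12",
    Test / fork := true
  )
```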

@skonto (Author):

Will fix.

I suggest removing the Eclipse plugin. Users of Eclipse can add this to their local ~/.sbt/1.0/... environments.
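
For reference, a per-user setup along these lines might look roughly like the following, assuming the plugin in question is sbteclipse (the plugin version is illustrative):

```scala
// ~/.sbt/1.0/plugins/plugins.sbt (the usual location for user-global sbt 1.x plugins)
// Illustrative: add the Eclipse plugin per-user instead of in the project build.
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "5.2.4")
```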

@skonto (Author):

Sure ;)

@deanwampler:

One other thing: now that 1.2.0 is done, we can merge work like this into the develop branch when it's ready.

@skonto (Author) commented Jul 3, 2018

@deanwampler the branch I am merging to is k8s-develop; I would like to keep it distinct from the Mesos work, if that is not a big deal.

@skonto (Author) commented Jul 4, 2018

Updated the PR; I will merge it.

@skonto merged commit 218b8c1 into lightbend:k8s-develop on Jul 4, 2018