Conversation

@budde commented Mar 10, 2017

What changes were proposed in this pull request?

  • Add new KinesisDStream.scala containing KinesisDStream.Builder class
  • Add KinesisDStreamBuilderSuite test suite
  • Make KinesisInputDStream ctor args package private for testing
  • Add JavaKinesisDStreamBuilderSuite test suite
  • Add args to KinesisInputDStream and KinesisReceiver for optional
    service-specific auth (Kinesis, DynamoDB and CloudWatch)

How was this patch tested?

Added KinesisDStreamBuilderSuite to verify builder class works as expected

@budde (Author) commented Mar 10, 2017

Open questions I'd like feedback on:

  • Should the KinesisUtils.createStream() methods be marked as deprecated?
  • Should the KinesisUtils.createStream() methods be refactored to use the builder?
  • Should I add full docs for each method (e.g. including @param lists)? EDIT: has been added
  • Does the file name and class name I've added seem reasonable?
  • Is making the ctor args to KinesisInputDStream package private for testing reasonable?

I'd like to also extend this to allow configuring CloudWatch and DynamoDB-specific authorization, which I imagine will be quite helpful to users. I'm trying to decide whether I should do this as a separate PR or just roll it in here. EDIT: this has been added to this PR

Pinging @brkyvz and @srowen from #16744

@SparkQA commented Mar 10, 2017

Test build #74342 has finished for PR 17250 at commit 8552caf.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
  • class Builder[T: ClassTag](

@budde force-pushed the KinesisStreamBuilder branch from 8552caf to bcb7667 (March 10, 2017 22:03)
@budde (Author) commented Mar 10, 2017

Forgot to stop the StreamingContext added in KinesisDStreamBuilderSuite. Updated the code to stop the context after all tests have run.

@SparkQA commented Mar 10, 2017

Test build #74351 has finished for PR 17250 at commit bcb7667.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
  • class Builder[T: ClassTag](

@budde force-pushed the KinesisStreamBuilder branch from bcb7667 to a604dc5 (March 11, 2017 06:35)
@SparkQA commented Mar 11, 2017

Test build #74379 has finished for PR 17250 at commit a604dc5.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
  • class Builder[T: ClassTag](

@budde force-pushed the KinesisStreamBuilder branch from a604dc5 to 8aeef08 (March 13, 2017 02:18)
@budde (Author) commented Mar 13, 2017

Found a bit more time to work on this. Changes made:

  • Added implementations of KinesisDStream.builder() that take JavaStreamingContext instead of StreamingContext
  • Added JavaKinesisDStreamBuilderSuite to test that the builder interface is accessible in Java
  • Added support for DynamoDB and CloudWatch-specific authentication parameters
  • Made the documentation for KinesisDStream.Builder more thorough

@SparkQA commented Mar 13, 2017

Test build #74418 has finished for PR 17250 at commit 8aeef08.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
  • class Builder[T: ClassTag](

@brkyvz (Contributor) left a comment:

The direction is great. Major feedback is:

  1. Builder shouldn't have constructor params. It should check required values at the end.
  2. Please add @deprecated to the KinesisUtils.createStream methods.
  3. We can have the createStream methods re-use the builder pattern in a separate PR.
  4. We should think more about how to provide credentials. Ideally we only need 3 methods, not 3 x 6.
    Maybe we should also have an AWSCredentialsBuilder. What do you think?

Ideally, the Builder class will live in KinesisInputDStream, and KinesisInputDStream will have a private constructor, i.e.

private[kinesis] class KinesisInputDStream[T] private (
  ...)

@brkyvz (Contributor):

I feel that builders rarely have constructor params. I understand you wanted to make these the required options, but I would rather have the builder use a zero-param constructor and check the required fields upon build(). What do you think?
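To make that pattern concrete, here is a minimal, self-contained sketch (a toy builder, not the actual Spark code): the setters record optional values, and build() enforces the required ones instead of the constructor.

class ExampleStreamBuilder {
  private var maybeStreamName: Option[String] = None
  private var maybeCheckpointAppName: Option[String] = None

  def streamName(name: String): ExampleStreamBuilder = {
    maybeStreamName = Some(name)
    this
  }

  def checkpointAppName(name: String): ExampleStreamBuilder = {
    maybeCheckpointAppName = Some(name)
    this
  }

  // Required fields are validated here rather than in a constructor.
  def build(): String = {
    val stream = maybeStreamName.getOrElse(
      throw new IllegalArgumentException("Stream name is required"))
    val app = maybeCheckpointAppName.getOrElse(
      throw new IllegalArgumentException("Checkpoint app name is required"))
    s"stream=$stream, checkpointAppName=$app" // stand-in for constructing the real DStream
  }
}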

@budde (Author):

This is probably the first builder class I've implemented so I'll defer to your judgment here :)

@brkyvz (Contributor):

I would also make this a required field, otherwise people will face confusing issues when they start 2 streams from the same Spark application.

@budde (Author):

Sounds reasonable. Feel free to push back on any other defaults as well; I figured these would just be a starting point.

@brkyvz (Contributor):

It would be great to document what happens when both region and endpoint are set but refer to different regions.

@budde (Author):

I'll take a look and see. To be honest supplying both of these has always felt pretty redundant to me. The AWS SDK has changed a bit in how it handles endpoints and regions as well, so it may also be worth revisiting how KinesisReceiver uses these params.

Long term, it may also be nice to allow for different endpoints to be specified for Kinesis, DynamoDB and CloudWatch (I think the KCL should support this...)

@brkyvz (Contributor):

Could you link InitialPositionInStream for simplicity?

@budde (Author):

Will do

@brkyvz (Contributor):

We shouldn't default to the Spark app name

@budde (Author), Mar 14, 2017:

We'll make this required

@brkyvz (Contributor):

Have the builder take setStreamingContext with two overloads: one that takes a StreamingContext and another that takes a JavaStreamingContext.

@budde (Author):

These will be made required builder arguments.

@brkyvz (Contributor):

Let's also have a setMessageHandler function.

@budde (Author):

Will do

@brkyvz (Contributor):

nit: move = to line above

@budde (Author):

Will do

@brkyvz (Contributor):

why the underscore?

@budde (Author):

Declaring val storageLevel collides with DStream.storageLevel
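For context, a minimal toy sketch of the kind of collision being described (these are stand-in classes, not the actual Spark hierarchy): a constructor val can't silently reuse a member name inherited from the parent class, so an underscore-prefixed parameter name sidesteps the clash.

abstract class BaseStream {
  val storageLevel: String = "MEMORY_ONLY"
}

// Declaring the constructor parameter as `val storageLevel` would fail to compile with
// "overriding value storageLevel in class BaseStream ... needs `override' modifier",
// hence the underscore-prefixed parameter:
class MyStream(_storageLevel: String) extends BaseStream {
  override val storageLevel: String = _storageLevel
}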

@brkyvz (Contributor):

With this many methods for the credential provider, I feel we need a credential provider builder. I wouldn't want to re-enter everything between DynamoDB and CloudWatch if I want to keep them separate from the Kinesis credentials.

@budde (Author):

I'll look at introducing a credential builder

@budde (Author) commented Mar 14, 2017

@brkyvz Thanks for taking a look!

Re: major feedback:

  1. Sounds reasonable to me. I don't have strong feelings here.
  2. Will do.
  3. Sounds good.
  4. I agree that method overloading isn't great and could lead to a scenario similar to KinesisUtils.createStream() in the long run. I kept it here since I figured the scope was rather limited and I wanted to avoid dealing with too many builders, but I think you raise a good point. I'll look into adding a credentials builder. Ideally we wouldn't have to worry about Java/Python interoperability and could just use a Scala method with optionals or default args to solve this, but I guess we have to play the hands we're dealt :)

I can move the builder to the companion object of KinesisInputDStream (this is actually where I had this placed originally). I like the idea of the private constructor as well although I don't think this will be possible until KinesisUtils.createStream() is refactored to use the builder pattern.

@budde (Author) commented Mar 14, 2017

@brkyvz Actually, now that I think about it, do we need to make messageHandler a constructor arg since Builder is a generic class? There's probably a way we could get around this but I'd imagine it would be pretty complex...

@brkyvz (Contributor) commented Mar 14, 2017

Good point @budde. I can think of two options:

  1. Leave it as a constructor param
  2. Make the Builder class non-generic and have the build function take the message handler:
class Builder {
  
  def build(): KinesisInputDStream[Array[Byte]]

  def buildWithMessageHandler[T](f: Record => T): KinesisInputDStream[T]
}

It's a matter of taking it as the first parameter or the final parameter. There are other ways to do it as well, but they would throw runtime exceptions instead of failing at compile time.

cc @rxin for input on APIs
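A hypothetical usage of option 2, following the sketch above (here `builder` is assumed to be an instance of that non-generic Builder, and Record is com.amazonaws.services.kinesis.model.Record):

import java.nio.charset.StandardCharsets

// Without a handler, the element type is fixed to Array[Byte]:
val byteStream: KinesisInputDStream[Array[Byte]] = builder.build()

// With a handler, the type parameter moves to the build call:
val stringStream: KinesisInputDStream[String] = builder.buildWithMessageHandler { record =>
  val buf = record.getData
  val bytes = new Array[Byte](buf.remaining())
  buf.get(bytes)
  new String(bytes, StandardCharsets.UTF_8)
}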

@budde (Author) commented Mar 14, 2017

@brkyvz I think if we're eliminating the constructor arguments then the second approach you've proposed might make more sense. I can't think of anything cleaner.

@brkyvz (Contributor) commented Mar 18, 2017

@budde Do you think you can update this PR? The 2.2 branch will be cut on Monday (2017-03-18).

@budde (Author) commented Mar 19, 2017

@brkyvz A conference took up a lot of my time last week but I should have it updated later today

@budde force-pushed the KinesisStreamBuilder branch from 8aeef08 to d6afaef (March 20, 2017 19:28)
@budde (Author) commented Mar 20, 2017

@brkyvz The PR has been updated; apologies for the delay. I've added SerializableCredentialsProvider.Builder, though I'm open to suggestions for a better name. I wanted to stay away from something like AWSCredentials.Builder to avoid confusion with similarly named classes in the AWS Java SDK.

@SparkQA commented Mar 20, 2017

Test build #74901 has finished for PR 17250 at commit d6afaef.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
  • class Builder
  • class Builder

@brkyvz (Contributor):

Would you mind just re-using the code in KinesisUtils instead of copying it?

@budde (Author):

How about keeping it here and refactoring KinesisUtils to use it? I think that's what I originally intended to do; I just forgot to update the code.

@brkyvz (Contributor):

sounds good

@budde force-pushed the KinesisStreamBuilder branch from d6afaef to 3cc2df8 (March 20, 2017 21:55)
@budde (Author) commented Mar 20, 2017

@brkyvz Updated the PR to remove defaultMessageHandler() from KinesisUtils in favor of keeping this method in KinesisInputDStream. My thought here was that this would be a better place for it since we've put KinesisUtils on the deprecation path.

@SparkQA commented Mar 20, 2017

Test build #74909 has finished for PR 17250 at commit 3cc2df8.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
  • class Builder
  • class Builder

@brkyvz (Contributor) left a comment:

Just a bit more to go.

I think we can rename SerializableCredentialsProvider to CredentialProvider. It should then have methods for keys and STS, not independent variables.

@brkyvz (Contributor):

nit: move = to the line above

@budde (Author):

Will fix. Sorry I keep doing this :-/

@brkyvz (Contributor):

Hmm, this isn't a great name for a user-facing API. The user shouldn't have to care whether the provider is serializable or not; that's an implementation detail.

I understand your concerns with the AWSCredentials name collisions. However, I think it's the best name there is.

@brkyvz (Contributor):

How about simply CredentialProvider?

@budde (Author):

I agree we should definitely come up with a better name here. What about SparkAWSCredentials? Obviously it's not as succinct as AWSCredentials but I think it's a clear name that avoids collisions.

I'm okay with CredentialsProvider otherwise.

@brkyvz (Contributor):

I guess SparkAWSCredentials also works.

@brkyvz (Contributor):

Anyone who provides an accessKeyId should also provide a secretKey, so I would take both together:

.withKeys(awsAccessKey: String, awsSecretKey: String)

@budde (Author):

I'll rework this builder to take multiple arguments for the long-lived keypair and STS

@brkyvz (Contributor):

same here:

def withSts(roleArn: String, sessionName: String)
def withSts(roleArn: String, sessionName: String, externalId: String)
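Putting the two suggestions together, a hypothetical sketch of how the reworked credentials builder might be used (the class and method names below follow the suggestions in this thread, not necessarily the merged code, and the role ARN and session name are placeholder values):

val provider = new SerializableCredentialsProvider.Builder()
  .withKeys("ACCESS_KEY_ID", "SECRET_KEY")                  // long-lived key pair
  .withSts("arn:aws:iam::123456789012:role/example-role",   // assume this role via STS
           "example-session-name")
  .build()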

@budde (Author):

Will do

@brkyvz (Contributor):

I guess you no longer need this

@budde (Author):

You're right, thanks for catching it

@brkyvz (Contributor):

I wouldn't call these Stable just yet :) Let's mark them Evolving for one release cycle.

@budde (Author):

For sure. Honestly I just cribbed these annotations from SparkSession.Builder so I appreciate you letting me know what the proper convention is.

@brkyvz (Contributor):

ditto, let's call it evolving for now

@budde (Author):

Will do

@brkyvz (Contributor):

sounds good

@budde force-pushed the KinesisStreamBuilder branch from 3cc2df8 to 337b6ba (March 22, 2017 22:12)
@budde (Author) commented Mar 22, 2017

@brkyvz Updated per your feedback. The most significant change is renaming SerializableCredentialsProvider to SparkAWSCredentials (as well as renaming its subclasses) and refactoring its builder as you've suggested.

@SparkQA commented Mar 22, 2017

Test build #75065 has finished for PR 17250 at commit 337b6ba.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@brkyvz (Contributor) left a comment:

Looking really good. I think we can merge it after this pass

@brkyvz (Contributor):

The privates are unnecessary since the class itself is private. You can either:

  1. remove private[kinesis], or
  2. make it a private val and use PrivateMethodTester in the tests (see the sketch below).

I'm fine either way.
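For reference, a minimal, self-contained sketch of the PrivateMethodTester approach using a toy class (not the actual Spark suite); it relies on the private accessor that Scala generates for a private val and assumes a plain ScalaTest FunSuite:

import org.scalatest.{FunSuite, PrivateMethodTester}

class Widget {
  private val capacity: Int = 42
}

class WidgetSuite extends FunSuite with PrivateMethodTester {
  test("capacity is readable through its private accessor") {
    // Look up the private accessor by name and invoke it reflectively.
    val capacityAccessor = PrivateMethod[Int]('capacity)
    assert((new Widget invokePrivate capacityAccessor()) === 42)
  }
}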

@budde (Author):

Yeah, it's really just there so that I could access the values directly from the test. I'll look into using PrivateMethodTester, thanks for the suggestion

@brkyvz (Contributor):

If you keep it a val, it should be fine.

@budde (Author):

Yeah, was just thinking that'd be a lot simpler. I'll go that route. Thanks!

@brkyvz (Contributor):

SparkAWSCredentials

@budde (Author):

My bad, thanks

@budde (Author):

Figured out why I didn't catch this before: apparently I can't spell "serializable". Ugh!

@brkyvz (Contributor):

ditto

@budde (Author):

Will fix and do a "grep -r 'SerialziableCredentialsProvider' *" to make sure this isn't appearing anywhere else

@brkyvz (Contributor):

ditto

@brkyvz (Contributor):

where does this come from? LocalJavaStreamingContext? If so, I wouldn't stop it. In fact, you can use Mockito to create a mock JavaStreamingContext if you like

@budde (Author):

I'll probably just go the mock route then and ignore the context altogether. I was seeing a bunch of "Spark context is already running" error messages when I tried to run all of the streaming tests before I added this.

@budde (Author), Mar 23, 2017:

Using a mock here and in the other test might not be very practical after all; it looks like the DStream constructor hooks into StreamingContext. We would at least need to mock its getState() method as well as mocking a SparkContext along with its local properties.

Edit: this might not be as bad as I thought; I'll keep trying the mock approach.

@brkyvz (Contributor):

That's fine then. As long as we don't break the environment for other tests and do proper cleanup, it should be fine.

@budde (Author):

Looks like this will need to be left as-is. In the current test implementation we check that checkpointInterval isn't a required option and that its default value is obtained via ssc.graph.batchDuration, which we won't be able to mock because DStreamGraph is final.

@brkyvz (Contributor):

Since we're not really starting a stream and are only testing APIs, we should just mock it.

@budde (Author):

Will do

@brkyvz (Contributor):

I would split this into 3 very small tests (sketched below):

  • should raise an exception if StreamingContext is missing
  • should raise an exception if stream name is missing
  • should raise an exception if checkpoint app name is missing
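A hypothetical shape for two of those small tests, assuming a ScalaTest FunSuite, an `ssc` StreamingContext available to the suite, and that build() throws IllegalArgumentException for a missing required field (the builder method names are illustrative and the actual suite may differ):

test("should raise an exception if stream name is missing") {
  intercept[IllegalArgumentException] {
    KinesisInputDStream.builder
      .streamingContext(ssc)
      .checkpointAppName("test-app")
      .build()
  }
}

test("should raise an exception if checkpoint app name is missing") {
  intercept[IllegalArgumentException] {
    KinesisInputDStream.builder
      .streamingContext(ssc)
      .streamName("test-stream")
      .build()
  }
}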

@budde (Author):

Sounds good

@brkyvz (Contributor):

nit: could you make this a single line?

@budde (Author):

Will do

@brkyvz (Contributor):

nit: single line please.

@budde (Author):

I'll fix it. This happened because DefaultCredentialsProvider was shortened to DefaultCredentials, so I'll check for other places where a multiline statement can be rolled up into a single line.

@brkyvz (Contributor):

I know we didn't have it before, but could you also check that after deserialization, they're equivalent?

val creds = BasicCredentials("x", "y")
assert(Utils.deserialize[BasicCredentials](Utils.serialize(creds)) === creds)

@budde (Author):

Will do

@budde force-pushed the KinesisStreamBuilder branch from 337b6ba to 6f11978 (March 23, 2017 20:09)
@budde (Author) commented Mar 23, 2017

@brkyvz Updated per your feedback; thanks for taking a thorough look. I also renamed the longLivedCredsProvider of STSCredentials to just longLivedCreds to match the updated naming conventions.

@brkyvz (Contributor) commented Mar 23, 2017

Thanks a lot for the quick turnaround, @budde! Could you also update the docs with the new builder API?
https://spark.apache.org/docs/latest/streaming-kinesis-integration.html
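For a docs-style illustration, a rough sketch of creating a stream with the new builder. This assumes `ssc` is an existing StreamingContext, and the stream name, app name, endpoint, and region are placeholders; the method names loosely follow the setters discussed in this thread, so the merged API may differ slightly.

import com.amazonaws.services.kinesis.clientlibrary.lib.worker.InitialPositionInStream
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.Seconds
import org.apache.spark.streaming.kinesis.KinesisInputDStream

val kinesisStream = KinesisInputDStream.builder
  .streamingContext(ssc)                 // required
  .streamName("my-kinesis-stream")       // required
  .checkpointAppName("my-kinesis-app")   // required
  .endpointUrl("https://kinesis.us-east-1.amazonaws.com")
  .regionName("us-east-1")
  .initialPositionInStream(InitialPositionInStream.LATEST)
  .checkpointInterval(Seconds(10))
  .storageLevel(StorageLevel.MEMORY_AND_DISK_2)
  .build()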

@SparkQA commented Mar 23, 2017

Test build #75111 has finished for PR 17250 at commit 6f11978.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@budde (Author) commented Mar 23, 2017

@brkyvz Sure, want me to add it to this PR or open a new one?

@brkyvz (Contributor) commented Mar 23, 2017

Thanks! A new PR would be easier!

@brkyvz (Contributor):

Do you want to add the note here as well?

* @note The given AWS credentials will get saved in DStream checkpoints if checkpointing
* is enabled. Make sure that your checkpoint directory is secure.

@budde (Author):

Done

@budde force-pushed the KinesisStreamBuilder branch from 6f11978 to 5315f1e (March 24, 2017 18:29)
@brkyvz (Contributor):

nit: Make sure that your checkpoint directory is secure. Prefer using the [[https://link.to.amazon.docs default credential provider chain]] if possible

@budde (Author):

The link in this case will be quite long; the URL by itself pushes the line over the 100-char limit:

[[http://docs.aws.amazon.com/sdk-for-java/v1/developer-guide/credentials.html#credentials-default default credential provider chain]]

Do you know if there's a way to safely split this into multiple lines? Should I just turn style checks off for this comment?

@brkyvz (Contributor), Mar 24, 2017:

Feel free to add

// scalastyle:off

// scalastyle:on

around the doc, for example:
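A sketch of how that might look around the scaladoc in question; the doc text is abbreviated here to the @note and the long link line quoted earlier in this thread:

// scalastyle:off
/**
 * @note The given AWS credentials will get saved in DStream checkpoints if checkpointing
 * is enabled. Make sure that your checkpoint directory is secure. Prefer using the
 * [[http://docs.aws.amazon.com/sdk-for-java/v1/developer-guide/credentials.html#credentials-default default credential provider chain]]
 * if possible.
 */
// scalastyle:on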

@budde (Author):

Done

- Add KinesisInputDStream.Builder class
- Add KinesisInputDStreamBuilderSuite test suite
- Add JavaKinesisInputDStreamBuilderSuite test suite
- Rename SerializableCredentialsProvider -> SparkAWSCredentials
- Add SparkAWSCredentials.Builder
- Add SparkAWSCredentialsBuilderSuite test suite
- Make KinesisInputDStream ctor args package private for testing
- Add args to KinesisInputDStream and KinesisReceiver for optional
  service-specific auth (Kinesis, DynamoDB and CloudWatch)
@budde force-pushed the KinesisStreamBuilder branch from 5315f1e to 03f91da (March 24, 2017 19:01)
@brkyvz (Contributor) commented Mar 24, 2017

LGTM pending tests. Thanks a lot for this PR, @budde!

PS: It's okay to make new commits; you don't have to squash-commit every time :)

@SparkQA commented Mar 24, 2017

Test build #75170 has finished for PR 17250 at commit 5315f1e.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Mar 24, 2017

Test build #75172 has finished for PR 17250 at commit 03f91da.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@brkyvz (Contributor) commented Mar 24, 2017

Merging to master

@asfgit closed this in 707e501 on Mar 24, 2017
@budde (Author) commented Mar 24, 2017

@brkyvz Awesome, thanks for reviewing this!
