[SPARK-4688] Have a single shared network timeout in Spark #3562
Conversation
Can one of the admins verify this patch?
How about:
int t = conf.getInt("spark.shuffle.io.connectionTimeout", conf.getInt("spark.network.timeout", 100));
return t * 1000;
I wanted to know what the default spark.network.timeout value should be. I have kept it at 100 seconds; should it be different?
Made the changes as per review. Also updated configuration.md.
Why was the default value changed from 60 to 100?
@Lewuathe, that is exactly what I wanted to know: what should the default value be? Should we keep a single fixed timeout for spark.network.timeout, or vary its default to match the defaults of the earlier configurations it is intended to replace? As I am fairly new to Spark, I am not sure which default is suitable. Maybe @rxin can confirm.
I think 100 is OK, given that the Akka timeout was 100.
@rxin, any conclusion on what the default timeout values should be in these different cases?
You need to add a space before 100 here.
@rxin, I will summarize the configuration defaults I have used. I put a value of 100 in the initial pull request with the intention of having a further discussion on appropriate defaults. There are two possible approaches: we can keep the same defaults as before, i.e. spark.network.timeout would have a different default value in each place it is used, or we can decide on a single fixed default. I think the latter is preferable, but an appropriate value has to be chosen.
Based on these cases, I think we can fix a default of 120 seconds for spark.network.timeout.
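The fallback pattern discussed in this thread can be sketched as follows. Note this is a minimal illustration, not Spark's actual SparkConf API: TimeoutResolver is a hypothetical stand-in, and the 120-second fallback simply reflects the default proposed above. The idea is that a component-specific key (e.g. spark.shuffle.io.connectionTimeout) takes precedence, and the shared spark.network.timeout is used otherwise.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the shared-timeout fallback: a component-specific
// key wins if set; otherwise the shared spark.network.timeout applies;
// otherwise a hard-coded 120 s default (the value proposed in this PR).
public class TimeoutResolver {
    private final Map<String, Integer> conf = new HashMap<>();

    public void set(String key, int seconds) {
        conf.put(key, seconds);
    }

    public int getInt(String key, int defaultValue) {
        return conf.getOrDefault(key, defaultValue);
    }

    // Resolve a timeout in milliseconds for a component-specific key.
    public int resolveTimeoutMs(String specificKey) {
        int seconds = getInt(specificKey, getInt("spark.network.timeout", 120));
        return seconds * 1000;
    }

    public static void main(String[] args) {
        TimeoutResolver r = new TimeoutResolver();
        // Nothing set: falls through to the 120 s default.
        System.out.println(r.resolveTimeoutMs("spark.shuffle.io.connectionTimeout"));
        // Shared key set: used by every component without a specific override.
        r.set("spark.network.timeout", 100);
        System.out.println(r.resolveTimeoutMs("spark.shuffle.io.connectionTimeout"));
        // Specific key set: overrides the shared timeout for that component only.
        r.set("spark.shuffle.io.connectionTimeout", 60);
        System.out.println(r.resolveTimeoutMs("spark.shuffle.io.connectionTimeout"));
    }
}
```

This mirrors the nested getInt call suggested in the review comment above, just with the three-level precedence made explicit.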
@rxin, any conclusion on this?

ping @rxin

Jenkins, retest this please.
Test build #24963 has started for PR 3562 at commit

Test build #24963 has finished for PR 3562 at commit

Test PASSed.

Thanks. I'm merging this in master.
Thanks for the review and commit.