Disabling Utils.chmod700 for Windows #4299
Conversation
|
Can one of the admins verify this patch? |
|
(This needs a JIRA.) Did this work in 1.2.0? The question will be whether it's a regression or not. |
|
Yes, it's a regression. It worked with 1.2.0. |
|
I didn't try 1.2.1, but in 1.2.0 I didn't encounter such problems on Windows.
Doesn't this comment belong after if (!isWindows), and not in the else clause that presumably does work for Windows?
|
Moved the comment after the condition. |
|
ok to test |
|
Test build #26483 has started for PR 4299 at commit
|
|
Test build #26483 has finished for PR 4299 at commit
|
|
Test PASSed. |
|
Hey @MartinWeindel thanks for submitting the patch. Apparently this is a known issue: http://stackoverflow.com/questions/5302269/java-file-setwritable-and-stopped-working-correctly-after-jdk-6u18. For Spark 1.2.1 I think it makes sense to just commit the changes here after updating the javadoc comments. Also, any ideas why the diff for this PR is almost 2k lines? Is your IDE changing the line end characters somehow? |
|
Test build #26527 has started for PR 4299 at commit
|
@MartinWeindel, mind adding a comment to this Javadoc to explain that this method is a no-op under Windows?
What if we swapped the order of these conditionals so that we first attempt to change the permissions and then ignore failures if we're running on Windows? Would that work, or would it fail with an exception?
Actually, the current ordering seems more intuitive, so I'm going to keep the patch in its current form.
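For readers following the discussion above, here is a minimal sketch of the approach being debated. The names (`ChmodSketch`, `createDirSecurely`) and the exact sequence of permission calls are simplified assumptions modeled on this thread, not the actual patch:

```java
import java.io.File;

public class ChmodSketch {
    // Assumption: detect Windows via the os.name system property.
    static final boolean isWindows =
        System.getProperty("os.name").toLowerCase().startsWith("windows");

    // Restrict a directory to owner-only access (POSIX 700).
    // Returns true only if every permission change succeeded.
    // On Windows, calls such as setExecutable(false, false) return
    // false (the known JDK limitation linked earlier in the thread),
    // so this would always report failure there.
    static boolean chmod700(File file) {
        return file.setReadable(false, false)
            && file.setReadable(true, true)
            && file.setWritable(false, false)
            && file.setWritable(true, true)
            && file.setExecutable(false, false)
            && file.setExecutable(true, true);
    }

    // Hypothetical caller: the OS check lives here, so on Windows
    // chmod700 is simply skipped rather than treated as a failure.
    static boolean createDirSecurely(File dir) {
        if (!dir.mkdirs()) {
            return false;
        }
        return isWindows || chmod700(dir);
    }
}
```

The key point from the thread is that the condition sits in the caller, which avoids any ambiguity about what a `chmod700()` return value should mean on Windows.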
|
Test build #26536 has started for PR 4299 at commit
|
|
Test FAILed. |
|
Ah, the Git timeout problem resurfaces once again :(. Jenkins, retest this please. |
|
On 02.02.2015 at 22:07, UCB AMPLab wrote:
|
|
Jenkins, retest this please. |
|
Test build #26540 has started for PR 4299 at commit
|
|
I built this branch under Windows and tested it out using |
This patch makes Spark 1.2.1rc2 work again on Windows. Without it you get the following log output on creating a Spark context:

INFO org.apache.spark.SparkEnv:59 - Registering BlockManagerMaster
ERROR org.apache.spark.util.Utils:75 - Failed to create local root dir in .... Ignoring this directory.
ERROR org.apache.spark.storage.DiskBlockManager:75 - Failed to create any local dir.

Author: Martin Weindel <martin.weindel@gmail.com>
Author: mweindel <m.weindel@usu-software.de>

Closes #4299 from MartinWeindel/branch-1.2 and squashes the following commits:

535cb7f [Martin Weindel] fixed last commit
f17072e [Martin Weindel] moved condition to caller to avoid confusion on chmod700() return value
4de5e91 [Martin Weindel] reverted to unix line ends
fe2740b [mweindel] moved comment
ac4749c [mweindel] fixed chmod700 for Windows
|
Test build #26527 has finished for PR 4299 at commit
|
|
Test FAILed. |
|
Test build #26536 has finished for PR 4299 at commit
|
|
Test PASSed. |
|
Test build #26540 has finished for PR 4299 at commit
|
|
Test PASSed. |