
Auto convert to contiguous format for all_gather #4907

Merged
9 commits merged into Lightning-AI:master on Dec 5, 2020

Conversation

SkafteNicki
Member

What does this PR do?

Fixes #4781
When syncing with all_gather, tensors are required to be in contiguous memory format. This PR automatically converts tensors that are not contiguous and warns the user.
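The change can be sketched as a small helper (a hypothetical `to_contiguous` function for illustration only; the actual change lives in `pytorch_lightning/utilities/distributed.py`, assuming PyTorch is available):

```python
import warnings

import torch


def to_contiguous(tensor: torch.Tensor) -> torch.Tensor:
    """Return a contiguous version of `tensor`, warning if a conversion was needed.

    torch.distributed.all_gather requires its input tensors to be contiguous,
    so non-contiguous inputs are converted (which makes a compact copy).
    """
    if not tensor.is_contiguous():
        warnings.warn(
            "Input tensor is not in contiguous memory format; "
            "converting it automatically before all_gather."
        )
        return tensor.contiguous()
    return tensor


# A transposed tensor is a strided view and is typically non-contiguous.
x = torch.ones(2, 3).t()
assert not x.is_contiguous()
assert to_contiguous(x).is_contiguous()
```

An already-contiguous tensor is returned unchanged, so the common case pays no copy cost.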

Before submitting

  • Was this discussed/approved via a GitHub issue? (no need for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together? Otherwise, we ask you to create a separate PR for every change.
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?
  • Did you verify new and existing tests pass locally with your changes?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

PR review

Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing make sure you have read Review guidelines. In short, see the following bullet-list:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified; bug fixes should be included in bug-fix release milestones (m.f.X) and features in (m.X.b) releases.

Did you have fun?

Make sure you had fun coding 🙃

@pep8speaks

pep8speaks commented Nov 30, 2020

Hello @SkafteNicki! Thanks for updating this PR.

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2020-12-05 13:58:36 UTC

@SkafteNicki added the "bug (Something isn't working)" and "Metrics" labels on Nov 30, 2020
@codecov

codecov bot commented Nov 30, 2020

Codecov Report

Merging #4907 (30a883a) into master (7234970) will not change coverage.
The diff coverage is n/a.

@@          Coverage Diff           @@
##           master   #4907   +/-   ##
======================================
  Coverage      93%     93%           
======================================
  Files         129     129           
  Lines        9359    9359           
======================================
  Hits         8677    8677           
  Misses        682     682           

Review thread on tests/metrics/test_ddp.py (outdated, resolved)
@Borda Borda added this to the 1.1 milestone Nov 30, 2020
Review thread on tests/metrics/test_ddp.py (outdated, resolved)
Review thread on tests/metrics/test_ddp.py (outdated, resolved)
Contributor

@tchaton tchaton left a comment


LGTM!

Review thread on pytorch_lightning/utilities/distributed.py (outdated, resolved)
Review thread on tests/metrics/test_ddp.py (resolved)
@tchaton tchaton added the ready PRs ready to be merged label Dec 3, 2020
@Borda
Member

Borda commented Dec 4, 2020

@SkafteNicki mind checking the latest comments? 🐰

@Borda Borda added priority: 1 Medium priority task and removed ready PRs ready to be merged labels Dec 4, 2020
@SkafteNicki SkafteNicki merged commit 1b40a40 into Lightning-AI:master Dec 5, 2020
@SkafteNicki SkafteNicki deleted the memory_format branch December 5, 2020 14:49
Labels
bug (Something isn't working) · priority: 1 (Medium priority task)
Development

Successfully merging this pull request may close these issues.

Potential bug in metric when updated with a slice of tensor in DDP
6 participants
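As background for the linked issue: indexing a tensor along a non-leading dimension returns a strided view that is not contiguous in memory, which is what made all_gather fail before this PR. A minimal illustration (assuming PyTorch):

```python
import torch

x = torch.arange(12).reshape(3, 4)  # row-major (contiguous) 3x4 tensor

row = x[1, :]  # a row slice reuses consecutive memory
col = x[:, 1]  # a column slice strides over memory (step of 4 elements)

print(row.is_contiguous())  # True
print(col.is_contiguous())  # False
print(col.contiguous().is_contiguous())  # True: .contiguous() makes a compact copy
```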