Change iterator over multiple Queue wrappers to request all protocols simultaneously #769
Conversation
… simultaneously [ghstack-poisoned]
… simultaneously ghstack-source-id: 09787247b78b4054d16f606070fc00880a0763c9 Pull Request resolved: #769
…l protocols simultaneously" This is part of MPRS optimizations, changes are covered by the existing test_dataloader2.py test. [ghstack-poisoned]
… simultaneously ghstack-source-id: ed8aae4f86aaaa4157d8803e127ee2151c658b30 Pull Request resolved: #769
pass
BTW, it might be better to add an error message in __init__, e.g. super().__init__(msg).
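A minimal sketch of the suggestion, assuming the comment anchors on a pass-bodied exception class; the class name and message text here are illustrative, not the actual torchdata code:

```python
# Hypothetical sketch: give the exception a default message in __init__
# so every raise site gets it for free. Name and message are illustrative.
class InvalidStateResetRequired(Exception):
    def __init__(self, msg: str = "DataPipe is in an invalid state and must be reset"):
        super().__init__(msg)

try:
    raise InvalidStateResetRequired
except InvalidStateResetRequired as e:
    print(e)  # -> DataPipe is in an invalid state and must be reset
```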
for idx in range(total_pipes):
    self.datapipes[idx].protocol.request_next()
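For context, a runnable sketch of the pattern this diff introduces: request from every worker protocol up front, then receive and immediately re-request, so no worker sits idle waiting its turn in a strict request/receive round-robin. The classes and method shapes below are stand-ins, not the actual torchdata implementation:

```python
from collections import deque

class _StubProtocol:
    """Minimal stand-in for a request/response protocol client."""

    def __init__(self, items):
        self._items = deque(items)
        self._requested = False

    def request_next(self):
        self._requested = True  # would enqueue a GetNext request to the worker

    def get_response_next(self):
        assert self._requested, "response fetched without a pending request"
        self._requested = False
        return self._items.popleft() if self._items else None  # None = exhausted

def iterate_all(protocols):
    # Phase 1: fire a request at every protocol before reading anything.
    for p in protocols:
        p.request_next()
    # Phase 2: receive in round-robin order, re-requesting after each hit.
    live = list(protocols)
    while live:
        for p in list(live):
            response = p.get_response_next()
            if response is None:
                live.remove(p)  # this worker is exhausted
                continue
            p.request_next()    # keep its pipeline full
            yield response

assert list(iterate_all([_StubProtocol([1, 3]), _StubProtocol([2, 4])])) == [1, 2, 3, 4]
```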
This is the main review comment; the rest are nits.
Might be a noob question: we now request and receive data from the protocol object directly. Do we still need QueueWrapper? We could let _IterateQueueDataPipes store a list of protocol clients instead.
QueueWrapper handles termination (and snapshotting in the future). Direct access to the protocol here is only required to reorder traversals.
However, I'm still considering the possibility of merging _IterateQueueDataPipes and QueueWrapper into one class that supports 1:M queues.
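An illustrative, runnable sketch of the division of labor described in this thread; the method names are assumptions, not the verified torchdata API. QueueWrapper keeps per-worker lifecycle handling, while the iterator reaches through to the protocol client only to reorder request/receive:

```python
class _StubProtocolClient:
    def __init__(self):
        self.requested = False
        self.terminated = False

    def request_next(self):
        self.requested = True   # would enqueue a GetNext request

    def request_terminate(self):
        self.terminated = True  # would enqueue a Terminate request

class QueueWrapperSketch:
    """Per-worker wrapper that owns lifecycle handling (termination,
    and per the thread, snapshotting in the future)."""

    def __init__(self, protocol):
        self.protocol = protocol

    def shutdown(self):
        self.protocol.request_terminate()

wrappers = [QueueWrapperSketch(_StubProtocolClient()) for _ in range(2)]
for w in wrappers:
    w.protocol.request_next()   # direct protocol access, only to reorder
for w in wrappers:
    w.shutdown()                # termination still goes through the wrapper
assert all(w.protocol.requested and w.protocol.terminated for w in wrappers)
```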
…l protocols simultaneously" This is part of MPRS optimizations, changes are covered by the existing test_dataloader2.py test. [ghstack-poisoned]
LGTM
if isinstance(response, communication.messages.InvalidStateResponse):
    raise communication.iter.InvalidStateResetRequired
if isinstance(response, communication.messages.TerminateResponse):
    raise communication.iter.TerminateRequired
Question: shouldn't these be caught by QueueWrapper's method nonblocking_next?
No, because I'm not using QueueWrapper's next; instead, I'm accessing the protocol's next directly.
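A runnable sketch of this point: bypassing QueueWrapper.nonblocking_next means the iterator must translate control responses into exceptions itself. The classes below are stand-ins shaped like the diff, not the real torchdata.dataloader2.communication types:

```python
# Stand-in response and exception types (hypothetical, mirroring the diff).
class InvalidStateResponse: ...
class TerminateResponse: ...
class InvalidStateResetRequired(Exception): ...
class TerminateRequired(Exception): ...

def handle_direct_response(response):
    # What QueueWrapper's next would otherwise have done internally:
    # map control-flow responses to exceptions before yielding data.
    if isinstance(response, InvalidStateResponse):
        raise InvalidStateResetRequired
    if isinstance(response, TerminateResponse):
        raise TerminateRequired
    return response  # normal data responses pass straight through

assert handle_direct_response("payload") == "payload"
```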
@VitalyFedyunin has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
…l protocols simultaneously" This is part of MPRS optimizations, changes are covered by the existing test_dataloader2.py test. Differential Revision: [D39816752](https://our.internmc.facebook.com/intern/diff/D39816752) [ghstack-poisoned]
@VitalyFedyunin has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
…l protocols simultaneously" This is part of MPRS optimizations, changes are covered by the existing test_dataloader2.py test. Differential Revision: [D39816752](https://our.internmc.facebook.com/intern/diff/D39816752) [ghstack-poisoned]
@VitalyFedyunin has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
This is part of the MPRS optimizations; the changes are covered by the existing test_dataloader2.py test.
Stack from ghstack (oldest at bottom):
Differential Revision: D39816752