Batching for transforms #337

Merged
merged 16 commits into from
Nov 18, 2019
Conversation

@vincentqb (Contributor) commented Nov 11, 2019

Extends #327 to other transforms.

Also fixes #336.

@vincentqb vincentqb changed the title batching for transforms Batching for transforms Nov 11, 2019
@vincentqb vincentqb marked this pull request as ready for review November 18, 2019 16:25
@cpuhrsch (Contributor)

I'd also update the README alongside this to explain the ellipsis convention.

@vincentqb (Contributor, Author)

> I'd also update the README alongside this to explain the ellipsis convention.

Updated. We were already using ellipses in the README. :)

README.md Outdated
@@ -129,7 +129,7 @@ Transforms expect and return the following dimensions.
* `MuLawDecode`: (channel, time) -> (channel, time)
* `Resample`: (channel, time) -> (channel, time)

-Complex numbers are supported via tensors of dimension (..., 2), and torchaudio provides `complex_norm` and `angle` to convert such a tensor into its magnitude and phase.
+Complex numbers are supported via tensors of dimension (..., 2), and torchaudio provides `complex_norm` and `angle` to convert such a tensor into its magnitude and phase. Here, and in the documentation, we use ellipsis "..." as a placeholder for the rest of the dimensions of a tensor.
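For illustration, the (..., 2) convention packs the real and imaginary parts into the last dimension, and `complex_norm` / `angle` reduce that last dimension away. A minimal sketch of the idea, using NumPy in place of torch so it is self-contained (the actual torchaudio functions live in `torchaudio.functional` and operate on torch tensors):

```python
import numpy as np

def complex_norm(x: np.ndarray, power: float = 1.0) -> np.ndarray:
    # Magnitude of a (..., 2) tensor whose last dim holds (real, imag);
    # the output drops that last dimension.
    return np.sqrt((x ** 2).sum(axis=-1)) ** power

def angle(x: np.ndarray) -> np.ndarray:
    # Phase of a (..., 2) tensor; also drops the last dimension.
    return np.arctan2(x[..., 1], x[..., 0])

# e.g. a spectrogram shaped (channel, freq, time, real/imag)
spec = np.zeros((3, 201, 100, 2))
assert complex_norm(spec).shape == (3, 201, 100)
assert angle(spec).shape == (3, 201, 100)
```

Because the functions only touch the last axis, any number of leading "..." dimensions passes through unchanged.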
Contributor:

I'd also briefly mention the purpose of that (i.e. batching, etc.). "rest of the dimensions" to me at least doesn't imply a functionality.

Contributor Author:

Agree: "we use an ellipsis "..." as a placeholder for the rest of the dimensions of a tensor, e.g. an optional batching dimension."

@cpuhrsch (Contributor) left a comment:

LGTM. See small comment.

@vincentqb vincentqb merged commit f5f79d1 into pytorch:master Nov 18, 2019
@vincentqb vincentqb deleted the batching branch November 18, 2019 23:31
Successfully merging this pull request may close these issues.

Problems with compute_deltas