Slower performance for Torchmetrics PESQ #1223

Closed
ashinkajay opened this issue Sep 14, 2022 · 4 comments · Fixed by #1227
Labels
bug / fix · help wanted · topic: Audio

Comments

@ashinkajay

🐛 Bug

Multiprocessing support is missing from the Torchmetrics PESQ implementation, while the underlying library ludlows/python-pesq offers that option.
As a result, the Torchmetrics version of PESQ runs slower than ludlows/python-pesq.

To Reproduce

Run PESQ on a batch of 100 audio clips, each at least 10 seconds long.

Code sample

Python notebook code attachment here:
analysis.zip
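
The attached notebook is the actual reproduction; a minimal sketch of the kind of comparison it runs might look like the following. The sample rate, tensor shapes, and n_processor value are illustrative assumptions, and pesq_batch(fs, ref, deg, mode, n_processor=...) is the batch API from ludlows/PESQ:

```python
# Rough benchmark sketch (not the attached notebook): compare torchmetrics'
# functional PESQ against pesq_batch from ludlows/PESQ on one batch of clips.
import time

import torch
from pesq import pesq_batch
from torchmetrics.functional.audio import perceptual_evaluation_speech_quality

FS = 16_000                      # sample rate (wideband PESQ)
BATCH, SECONDS = 100, 10         # 100 clips, 10 s each
torch.manual_seed(42)
preds = torch.randn(BATCH, FS * SECONDS)   # stand-in for degraded audio
target = torch.randn(BATCH, FS * SECONDS)  # stand-in for reference audio

# torchmetrics processes the batch clip by clip in a single process
start = time.perf_counter()
tm_scores = perceptual_evaluation_speech_quality(preds, target, FS, "wb")
print(f"torchmetrics PESQ: {time.perf_counter() - start:.1f} s")

# pesq_batch can fan the clips out over a multiprocessing pool
start = time.perf_counter()
raw_scores = pesq_batch(FS, target.numpy(), preds.numpy(), "wb", n_processor=8)
print(f"pesq_batch:        {time.perf_counter() - start:.1f} s")
```

On a multi-core machine the pesq_batch call is expected to finish noticeably faster; that gap is what this issue is about.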

Expected behavior

Torchmetrics PESQ and ludlows/python-pesq should take roughly the same amount of time to compute the PESQ score.

Environment

  • Python - 3.10.4 (installed using conda)
  • OS: Ubuntu 20.04.4 LTS

The libraries below were installed using pip:

  • TorchMetrics - 0.9.3
  • PyTorch - 1.12.1
  • Torchaudio - 0.12.1
  • PESQ - 0.0.4

Additional context

References
ludlows/python-pesq: https://github.com/ludlows/PESQ

@ashinkajay added the bug / fix and help wanted labels on Sep 14, 2022
@stancld
Contributor

stancld commented Sep 14, 2022

Hi @ashinkajay, thank you for your post. As far as I know, the original PESQ code has a pretty restrictive license regarding how PESQ can be implemented in torchmetrics (see the discussion in #726).

cc: @Borda @SkafteNicki Have you had any discussion about this beyond the linked issue? :]

@SkafteNicki
Member

Yes, their license is very strict, so we cannot re-implement it. However, it seems like we could probably call pesq_batch from https://github.com/ludlows/PESQ instead of pesq to run the calculation in parallel. We would just need to add an n_processor argument to the modular and functional implementations.

@ashinkajay is this what you are thinking about when raising this issue?
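
A minimal sketch of what that could look like on the functional side, purely illustrative: the wrapper name pesq_with_workers and its n_processor default are assumptions, not the actual torchmetrics API.

```python
# Illustrative sketch only, not the torchmetrics implementation: a functional
# wrapper that forwards an n_processor argument to pesq_batch from ludlows/PESQ.
import numpy as np
import torch
from pesq import pesq, pesq_batch


def pesq_with_workers(preds: torch.Tensor, target: torch.Tensor, fs: int,
                      mode: str, n_processor: int = 1) -> torch.Tensor:
    """Compute PESQ for a batch of clips, optionally across multiple processes."""
    ref = target.detach().cpu().numpy()
    deg = preds.detach().cpu().numpy()
    if n_processor > 1:
        # pesq_batch distributes the clips over a multiprocessing pool
        scores = pesq_batch(fs, ref, deg, mode, n_processor=n_processor)
    else:
        # per-clip loop, roughly what the current single-process path does
        scores = [pesq(fs, r, d, mode) for r, d in zip(ref, deg)]
    return torch.tensor(np.asarray(scores, dtype=np.float32))
```

Called as pesq_with_workers(preds, target, 16_000, "wb", n_processor=8), it should return the same scores as the single-process path while spreading the work over eight workers.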

@ashinkajay
Author

Thank you @SkafteNicki and @stancld

@SkafteNicki Yes, I found pesq_batch but couldn't see that functionality in torchmetrics; that is why I raised the issue.
Thanks for the clarification about the license.
