
Pearson Correlation Coefficient raises error when 2D tensor but single task #1647

Closed
shenoynikhil opened this issue Mar 23, 2023 · 1 comment · Fixed by #1649
Labels
bug / fix Something isn't working help wanted Extra attention is needed

Comments


shenoynikhil commented Mar 23, 2023

🐛 Bug

I have a regression-based modelling repository where the predictions can be multi-output or single-output depending on configuration. My network outputs a tensor of shape [n_samples, n_tasks], where n_tasks varies with the task. If n_tasks is 1, then calling torchmetrics.functional.pearson_corrcoef(predictions, targets) raises the error:

ValueError: Expected argument `num_outputs` to match the second dimension of input, but got 1 and 1

Changing the output shape for a single task just to fit the metric function does not seem like a good solution, and I think a simple change should be able to fix it. My current workaround:

import torchmetrics.functional as Fm

# predictions are [n, 1] for a single task/output
pcc = (
    Fm.pearson_corrcoef(predictions, targets)
    if predictions.shape[1] > 1
    else Fm.pearson_corrcoef(predictions[:, 0], targets[:, 0])
)
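The fix could plausibly be a relaxed shape check. As a purely hypothetical sketch (this is not the actual torchmetrics implementation, just an illustration of the idea), a check that treats a trailing dimension of 1 as equivalent to a 1D input would accept the failing case:

```python
# Hypothetical sketch of a relaxed shape check (NOT torchmetrics code):
# a 2D input with a trailing dimension of 1 is treated the same as a 1D
# input when num_outputs == 1.
def check_shape(shape, num_outputs):
    """shape is a tuple like (n_samples,) or (n_samples, n_tasks)."""
    second_dim = shape[1] if len(shape) == 2 else 1
    if second_dim != num_outputs:
        raise ValueError(
            "Expected argument `num_outputs` to match the second dimension"
            f" of input, but got {num_outputs} and {second_dim}"
        )

check_shape((100, 3), 3)  # multi-output: accepted
check_shape((100, 1), 1)  # single-output, 2D: accepted (currently raises)
check_shape((100,), 1)    # single-output, 1D: accepted
```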

There are other metrics that handle this case without issue:

        metrics = {
            "mse": Fm.mean_squared_error(predictions, targets, squared=True),
            "rmse": Fm.mean_squared_error(predictions, targets, squared=False),
            "mae": Fm.mean_absolute_error(predictions, targets),
            "r2": Fm.r2_score(predictions, targets, multioutput="raw_values"),
            "mape": Fm.mean_absolute_percentage_error(predictions, targets),
            # TODO: Raise issue on torchmetrics
            "pcc": (
                Fm.pearson_corrcoef(predictions, targets) if predictions.shape[1] > 1 else
                Fm.pearson_corrcoef(predictions[:, 0], targets[:, 0])
            ),
        }

To Reproduce

Steps to reproduce the behavior...

Code sample
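As a stand-in illustration (plain Python, no torchmetrics), the metric itself is perfectly well-defined for a single [n, 1] column; the squeeze in the workaround above is lossless, since it only drops the trailing dimension:

```python
# Plain-Python illustration (not torchmetrics code) that squeezing the
# trailing dimension of a [n, 1] column loses nothing: the correlation is
# computed over the same n values either way.
import math

def pearson(xs, ys):
    """Pearson correlation of two equal-length 1D sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(var_x * var_y)

preds_2d = [[1.0], [2.0], [3.0], [4.0]]   # shape [n, 1], single task
targets_2d = [[1.2], [1.9], [3.1], [4.2]]

# "squeeze" the trailing dimension, as in the workaround
preds_1d = [row[0] for row in preds_2d]
targets_1d = [row[0] for row in targets_2d]

r = pearson(preds_1d, targets_1d)
```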

Expected behavior

Environment

  • TorchMetrics version (and how you installed TM, e.g. conda, pip, build from source):
  • Python & PyTorch Version (e.g., 1.0):
  • Any other relevant information such as OS (e.g., Linux):

Additional context

@shenoynikhil shenoynikhil added bug / fix Something isn't working help wanted Extra attention is needed labels Mar 23, 2023
@github-actions

Hi! Thanks for your contribution, great first issue!
