
Accessing True/Pred values for plotting Confusion Matrices #662

Open
Rinkachirikiari opened this issue Oct 10, 2024 · 1 comment

Comments

@Rinkachirikiari

Hey there,

Is there a way to easily access the true labels (y_true) and predicted labels (y_pred) from the model evaluation pipeline?
This would be helpful for plotting confusion matrices after training or cross-validation.

Best,

@bruAristimunha
Collaborator

Hey @Rinkachirikiari,

Apologies for the late reply. moabb currently follows a philosophy of relying on a single metric, to avoid some forms of cherry-picking the best metric. It's a philosophy I don't 100% agree with, and with the refactoring of the evaluation function, I hope we can move forward on this.

Since moabb is a volunteer project, where people work when they have free time, it may take a while for this to happen.

My suggestion for now is to directly modify the evaluation code to get what you want.
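In the meantime, one workaround is to bypass moabb's evaluation classes and run the cross-validation yourself with scikit-learn, which gives you aligned `y_true`/`y_pred` arrays directly. This is only a minimal sketch with a toy dataset and classifier standing in for your epoched EEG features and moabb pipeline:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import cross_val_predict

# Toy data standing in for your extracted features and labels
X, y_true = make_classification(n_samples=100, n_features=10, random_state=42)

clf = LogisticRegression(max_iter=1000)

# cross_val_predict returns one out-of-fold prediction per sample,
# so y_pred is aligned with y_true and usable for a confusion matrix
y_pred = cross_val_predict(clf, X, y_true, cv=5)

cm = confusion_matrix(y_true, y_pred)
print(cm)
```

From there, `sklearn.metrics.ConfusionMatrixDisplay(cm).plot()` renders the matrix. This sidesteps moabb's evaluation entirely, so you lose its dataset/session bookkeeping, but it avoids patching library code.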

Cc. @PierreGtch
