
Fixing bug for metacat #474

Merged
shubham-s-agarwal merged 1 commit into master from metacat_bug_resolve on Aug 9, 2024

Conversation

shubham-s-agarwal (Collaborator)

Fix issues with compute_class_weights JSON serialization and enforce fc2 usage when fc3 is enabled

  • Resolved an issue where compute_class_weights returns a NumPy array, which caused an error when saving the configuration as JSON (JSON does not support NumPy arrays). The fix converts the NumPy array to a JSON-serializable format (sketched after this list).

  • Added a safeguard to the model_architecture_config for meta_cat_config. The architecture assumes fc3 is only used when fc2 is enabled; if fc2 is set to False and fc3 is True, the model fails due to a mismatch in hidden layer sizes. The fix automatically enables fc2 whenever fc3 is set to True (also sketched below).
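
For illustration, a minimal sketch of the first fix; the helper name and the config shape are assumptions, not the actual MedCAT code:

```python
import json

import numpy as np

def to_json_serializable(value):
    """Convert a NumPy array to a plain Python list; pass other values through."""
    if isinstance(value, np.ndarray):
        return value.tolist()
    return value

# json.dumps() on a raw ndarray raises
# "TypeError: Object of type ndarray is not JSON serializable".
weights = np.array([0.5, 2.0, 1.5])  # e.g. output of compute_class_weights
config = {"train": {"class_weights": to_json_serializable(weights)}}
print(json.dumps(config))  # {"train": {"class_weights": [0.5, 2.0, 1.5]}}
```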

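And a sketch of the second fix under the same caveat; the dict-shaped config and the function name are placeholders (the real config is a typed object in MedCAT):

```python
import logging

logger = logging.getLogger(__name__)

def enforce_fc2_with_fc3(model_architecture_config: dict) -> dict:
    """fc3 consumes fc2's hidden size, so fc2 must be on whenever fc3 is."""
    if model_architecture_config.get("fc3") and not model_architecture_config.get("fc2"):
        logger.warning("fc3=True requires fc2; enabling fc2 automatically.")
        model_architecture_config["fc2"] = True
    return model_architecture_config

cfg = enforce_fc2_with_fc3({"fc2": False, "fc3": True})
assert cfg["fc2"] is True  # the safeguard switched fc2 on
```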
mart-r (Collaborator) left a comment


Looks good to me.

If you think it's worth it, you could also write a test for (some of) this: something that would break with the previous implementation but pass with the current one.
I.e., something that automatically fills the class_weights part of the config by calling train_model and then attempts to save the MetaCAT (a sketch follows below).
But use your best judgement. If you don't think it's needed, don't bother.
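
A minimal sketch of such a test, with the training step stubbed out (a real version would call train_model and MetaCAT's save rather than writing the array directly):

```python
import json
import unittest

import numpy as np

class ClassWeightsSerializationTest(unittest.TestCase):
    def test_config_serializes_after_class_weights_are_filled(self):
        # Stand-in for what train_model would write into the config.
        raw_weights = np.array([0.5, 2.0])

        # Previous implementation: a raw ndarray breaks json.dumps.
        with self.assertRaises(TypeError):
            json.dumps({"class_weights": raw_weights})

        # Current implementation: converted weights serialize cleanly.
        json.dumps({"class_weights": raw_weights.tolist()})

if __name__ == "__main__":
    unittest.main()
```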

@shubham-s-agarwal (Collaborator, Author)

We could write a test, but then it would make sense to perhaps test every combination (of config variations).
Do we want to do that now?

@mart-r (Collaborator) commented Aug 9, 2024

> We could write a test, but then it would make sense to perhaps test every combination (of config variations). Do we want to do that now?

You can do that if you like. But if you do, I wouldn't recommend creating a separate test method for each configuration; rather, generate the various combinations of config values automatically and go through them one by one, probably with a subTest (see the sketch below).

But then again, if you don't believe it's something we need to test, then feel free to omit.
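
A hedged sketch of that pattern; the option names are assumptions, and the body only mimics the fc2/fc3 safeguard rather than building a real MetaCAT:

```python
import itertools
import unittest

class ConfigCombinationTest(unittest.TestCase):
    def test_all_config_combinations(self):
        options = {"fc2": [True, False], "fc3": [True, False]}
        # One generated case per combination instead of one method per case.
        for values in itertools.product(*options.values()):
            combo = dict(zip(options, values))
            with self.subTest(**combo):
                # Placeholder for building/training a MetaCAT with this
                # config; here we only apply the safeguard and check its
                # invariant: after validation, fc3 implies fc2.
                if combo["fc3"] and not combo["fc2"]:
                    combo["fc2"] = True
                self.assertTrue(combo["fc2"] or not combo["fc3"])

if __name__ == "__main__":
    unittest.main()
```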

@shubham-s-agarwal (Collaborator, Author)

I went through the combinations we would test, and I feel we can omit the testing, since we previously tested the main module (two-phase learning).

@shubham-s-agarwal shubham-s-agarwal merged commit b7658ee into master Aug 9, 2024
8 checks passed
@shubham-s-agarwal shubham-s-agarwal deleted the metacat_bug_resolve branch August 9, 2024 11:26