
A Curious Question about Prediction Quality #214

Closed
reshalfahsi opened this issue Sep 19, 2021 · 7 comments
Labels: enhancement (New feature or request)

Comments

reshalfahsi commented Sep 19, 2021

Hi there,

Is it possible to obtain a measure of prediction quality? Is there any computation under the hood in this package to produce conformal predictions? If not, could you please consider this a feature request?

Thank you.

sonichi added the enhancement label Sep 20, 2021

sonichi (Contributor) commented Sep 20, 2021

I'm marking this as a feature request. What package do you use today to get prediction quality for models like lightgbm or random forest?

reshalfahsi (Author) commented

Have you heard of the nonconformist package? That's the one I currently use.

sonichi (Contributor) commented Sep 21, 2021

It seems you could first run flaml to do the hyperparameter tuning and model selection, and then pass flaml.model to that package. Could you give that a try?
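For context, the computation such a package performs is split (inductive) conformal prediction: calibrate a quantile of residuals on held-out data, then pad point predictions with it. A minimal pure-NumPy sketch of the idea, independent of flaml and nonconformist, with a least-squares model standing in for the tuned estimator:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = 3x + noise
X = rng.uniform(0.0, 10.0, size=(300, 1))
y = 3.0 * X[:, 0] + rng.normal(0.0, 1.0, size=300)

# Three splits: proper training set, calibration set, test set
X_train, y_train = X[:150], y[:150]
X_cal, y_cal = X[150:280], y[150:280]
X_test, y_test = X[280:], y[280:]

# Any point predictor works; here, ordinary least squares with an intercept
A = np.hstack([X_train, np.ones((len(X_train), 1))])
coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)

def predict(X):
    return np.hstack([X, np.ones((len(X), 1))]) @ coef

# Nonconformity scores on the calibration split: absolute residuals
scores = np.abs(y_cal - predict(X_cal))

# Conformal quantile: intervals of half-width q then have
# finite-sample marginal coverage of at least 1 - alpha
alpha = 0.1
n = len(scores)
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n)

lower = predict(X_test) - q
upper = predict(X_test) + q
coverage = np.mean((y_test >= lower) & (y_test <= upper))
print(f"interval half-width: {q:.2f}, empirical coverage: {coverage:.2f}")
```

The point predictor here is a placeholder: any fitted model with a predict step (for example one selected by flaml) can be slotted into the same calibrate-then-pad procedure.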

reshalfahsi (Author) commented

I haven't yet, but I will give it a try.

reshalfahsi (Author) commented Sep 21, 2021

Hi @sonichi,

It seems the nonconformist package requires the user to train the model through the package's own wrapper: you wrap the model and let the wrapper handle fitting and calibration. I'd recommend implementing the algorithm directly in this package, because when I applied an XGBoost model I had to tweak nonconformist a little.

sonichi (Contributor) commented Sep 21, 2021

Yes, it uses a wrapper. I was thinking you could take the tuned model from flaml, wrap it with the package, and retrain. Is there any issue with that approach? What tweak was required for XGBoost?
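The wrap-and-retrain pattern being discussed can be sketched as follows. All class and method names here (`ConformalWrapper`, `predict_interval`, `MeanModel`) are hypothetical, chosen only to illustrate the pattern; this is not the nonconformist API:

```python
import numpy as np

class ConformalWrapper:
    """Illustrative wrapper: takes an already-tuned point predictor,
    retrains it on the training split, and calibrates interval
    half-widths on a held-out calibration split."""

    def __init__(self, model, alpha=0.1):
        self.model = model  # e.g. an estimator previously tuned by flaml
        self.alpha = alpha
        self.q_ = None      # calibrated interval half-width

    def fit(self, X_train, y_train, X_cal, y_cal):
        self.model.fit(X_train, y_train)  # retrain the tuned model
        scores = np.abs(y_cal - self.model.predict(X_cal))
        n = len(scores)
        level = min(1.0, np.ceil((n + 1) * (1 - self.alpha)) / n)
        self.q_ = np.quantile(scores, level)
        return self

    def predict_interval(self, X):
        pred = self.model.predict(X)
        return pred - self.q_, pred + self.q_

# Stand-in for a tuned model: always predicts the training mean
class MeanModel:
    def fit(self, X, y):
        self.mu_ = float(np.mean(y))
        return self

    def predict(self, X):
        return np.full(len(X), self.mu_)

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = rng.normal(loc=5.0, scale=2.0, size=200)

cw = ConformalWrapper(MeanModel(), alpha=0.1)
cw.fit(X[:100], y[:100], X[100:180], y[100:180])
lo, hi = cw.predict_interval(X[180:])
```

Since the wrapper only needs a `fit`/`predict` interface, any scikit-learn-style estimator selected by flaml could be dropped in where `MeanModel` appears.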

sonichi (Contributor) commented Oct 19, 2021

@reshalfahsi I'm closing this issue as it has been inactive for a month. If you still want this feature, feel free to reopen and follow up.

sonichi closed this as completed Oct 19, 2021
2 participants