
Improve inference performance with loaded TransformerChain ML.NET model #370

Open
najeeb-kazmi opened this issue Nov 20, 2019 · 0 comments · May be fixed by #371
@najeeb-kazmi (Member)

PR #230 introduced the ability to load and score ML.NET models saved in the new ML.NET TransformerChain serialization format. This was done by checking whether "TransformerChain" exists among the members of the model archive. Currently, this check runs every time the test, predict, predict_proba, and decision_function methods call _predict. Performing the check only once, when the model is loaded, and caching the result would improve inference performance.
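
A minimal sketch of the proposed change, assuming a simplified Pipeline-like wrapper class: the load_model method, the _is_transformer_chain attribute, and the zipfile-based check are hypothetical stand-ins; only _predict, the method names listed above, and the "TransformerChain" archive-member check come from this issue.

```python
import zipfile


class Pipeline:
    """Simplified sketch of a nimbusml-style pipeline wrapper (hypothetical)."""

    def load_model(self, model_path):
        self.model = model_path
        # Proposed: inspect the archive once at load time and cache the
        # result, rather than re-checking on every call to _predict.
        with zipfile.ZipFile(model_path) as zf:
            self._is_transformer_chain = any(
                "TransformerChain" in name for name in zf.namelist()
            )

    def _predict(self, X):
        # test, predict, predict_proba, and decision_function all funnel
        # through here, so a per-call archive inspection is paid on every
        # inference. With the cached flag it is a cheap attribute lookup.
        if self._is_transformer_chain:
            pass  # score via the TransformerChain code path
        else:
            pass  # score via the legacy model format code path
```

This moves the archive scan, which grows with the number of archive members, from every inference call to a single point at model load time.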

@najeeb-kazmi najeeb-kazmi self-assigned this Nov 20, 2019
@najeeb-kazmi najeeb-kazmi changed the title Improve inference with ML.NET new TransformerChain format models Improve inference performance with loaded TransformerChain ML.NET model Nov 20, 2019