This repository has been archived by the owner on Nov 16, 2023. It is now read-only.
PR #230 introduced the ability to load and score ML.NET models trained in the new ML.NET `TransformerChain` serialization format. This was done by checking whether "TransformerChain" exists in the archive members. Currently, this check runs every time the `test`, `predict`, `predict_proba`, and `decision_function` methods call `_predict`. It could instead be performed only once, when the model is loaded, improving inference performance.
najeeb-kazmi changed the title from "Improve inference with ML.NET new TransformerChain format models" to "Improve inference performance with loaded TransformerChain ML.NET model" on Nov 20, 2019.