Use schema from trained model instead of inferring it again from type when creating prediction engine. #347
Comments
@TomFinley, this is not directly related to #216, in the sense that changes made for #216 will not directly solve it. The problem here is that during training in this particular example, the schema is explicitly provided to TextLoader. However, the BatchPredictionEngine uses the input type (IrisData) to infer the schema, and it fails there.
I have the same issue: I have written a custom loader with a custom schema, and it fails in the
@Zruty0 is this still valid?
It is still valid.
Not valid anymore; the new API allows you to pass in the schema, which you can get when you load the model.
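A minimal sketch of that pattern with the current Microsoft.ML API (the input/output class shapes and the model path are assumptions for illustration):

```csharp
using Microsoft.ML;
using Microsoft.ML.Data;

// Illustrative input/output types; the field names are assumptions.
public class IrisData
{
    public float SepalLength;
    public float SepalWidth;
    public float PetalLength;
    public float PetalWidth;
}

public class IrisPrediction
{
    [ColumnName("PredictedLabel")]
    public string PredictedLabel;
}

public static class Example
{
    public static IrisPrediction Predict(string modelPath, IrisData input)
    {
        var mlContext = new MLContext();

        // Model.Load returns the input schema that was saved with the model...
        ITransformer model = mlContext.Model.Load(modelPath, out DataViewSchema inputSchema);

        // ...which can be passed to CreatePredictionEngine, so the engine uses
        // the trained model's schema instead of re-inferring one from IrisData.
        var engine = mlContext.Model.CreatePredictionEngine<IrisData, IrisPrediction>(model, inputSchema);

        return engine.Predict(input);
    }
}
```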
System information
Issue
Should CreateBatchPredictionEngine use the schema from the trained model instead of inferring the schema again from the input type?
Source code / logs