Not able to serve bentoml API on pre-trained .h5 file #1169
Unanswered
arpitkjain10
asked this question in General
Replies: 1 comment 1 reply
-
For BentoML, it packages the model and all of its dependencies. It doesn't add additional methods onto the model itself. You will need to check your model framework for which methods are available on the model to call. For TensorFlow, I don't think the loaded model has a `predict` method; call the model object directly:

```python
@bentoml.api(input=TfTensorInput(), batch=True)
def predict(self, tensor):
    outputs = self.artifacts.model(tensor)
    return outputs
```

Notebook example: https://github.com/bentoml/gallery/blob/master/tensorflow/echo/tensorflow-echo.ipynb

Hope this helps, and let me know how it goes. I am happy to help.
-
Hello!

I am using BentoML to serve my TensorFlow model. I added my pre-trained Sequential model to the artifacts using the BentoML decorator.

When triggering the API created from Swagger, I get an invalid model error:

```
Exception happened in API function: '_UserObject' object has no attribute 'predict'
```

The failing line:

```python
prediction = self.artifacts.model.predict(testing_padded_seq)
```

I checked and found that the line above is loading a `_UserObject`:

```
<tensorflow.python.saved_model.load.Loader._recreate_base_user_object.._UserObject object at 0x7fb82c65af90>
```

Completely stuck. Please let me know if anyone knows the solution here.