Hi, I recently opened a PR that adds support for compilation to CoreML (Apple devices): #1007
It is as simple as dragging and dropping the model into the Xcode workspace. Xcode generates inference classes, and you can try your model right away.
As for TensorFlow runtimes, they have great documentation on how to convert to TFLite: https://www.tensorflow.org/lite/models/convert/
We recommend converting to TensorFlow first (preferably Keras if possible, as it converts better) and then to TFLite.
You can convert TensorFlow/Keras to TFLite, or ONNX to TFLite, following those docs.
I would recommend:
1. Convert PyTorch to ONNX using torch.onnx.export.
2. Convert ONNX to TensorFlow/Keras.
3. Convert Keras (h5) / TensorFlow (pb / SavedModel) to TFLite.
4. Use the checkpoint from step 3 with a TFLite interpreter (runtime) on your mobile device.
If you seek performance with TFLite, I recommend looking at the GPU delegates, together with MediaPipe by Google.
Worth noting that TFLite also runs in browsers and is cross-platform, unlike CoreML.
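As a rough sketch of step 4 (and of where a GPU delegate would plug in), here is the Python TFLite interpreter API; on-device you would use the Android/iOS Interpreter equivalents, and the delegate library name shown in the comment is platform-specific, so treat it as an assumption:

```python
import numpy as np
import tensorflow as tf

# Build a tiny TFLite model from Keras just so the example is self-contained.
keras_model = tf.keras.Sequential([tf.keras.Input(shape=(4,)),
                                   tf.keras.layers.Dense(2)])
tflite_bytes = tf.lite.TFLiteConverter.from_keras_model(keras_model).convert()

# Run it with the TFLite interpreter (CPU by default).
interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
# For GPU execution, pass a delegate; the .so name varies by platform:
#   interpreter = tf.lite.Interpreter(
#       model_content=tflite_bytes,
#       experimental_delegates=[
#           tf.lite.experimental.load_delegate("libtensorflowlite_gpu_delegate.so")])
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=np.float32))
interpreter.invoke()
result = interpreter.get_tensor(out["index"])
print(result.shape)  # (1, 2)
```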
Please share the documentation for testing the model in a mobile application.