
Linking Flex Delegates with Edge TPU Compiler #479

Closed
JoniSuominen opened this issue Sep 29, 2021 · 8 comments
Labels
comp:compiler Compiler related issues comp:model Model related issues type:support Support question or issue

Comments

@JoniSuominen

Hi!

I'm currently trying to compile an EfficientDet-D0 .tflite model for the Edge TPU, but it unfortunately contains some Flex ops, as can be seen below:

Edge TPU Compiler version 16.0.384591198
Started a compilation timeout timer of 180 seconds.
ERROR: Regular TensorFlow ops are not supported by this interpreter. Make sure you apply/link the Flex delegate before inference.
ERROR: Node number 0 (FlexTensorListReserve) failed to prepare.

Compilation failed: Model failed in Tflite interpreter. Please ensure model can be loaded/run in Tflite interpreter.
Compilation child process completed within timeout period.
Compilation failed!

I'm able to run it with the tflite interpreter outside edgetpu_compiler, since I built tflite-runtime from source (https://github.com/tensorflow/tensorflow/tree/master/tensorflow/lite/tools/pip_package) and added the Flex delegate to that build. Is this somehow supported by the Edge TPU compiler?
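
For reference, a minimal sketch of the check that works outside edgetpu_compiler, assuming a custom-built tflite-runtime wheel with the Flex delegate linked in (the model path here is a placeholder):

# Minimal sketch: with a tflite-runtime wheel built with the Flex delegate,
# the interpreter can prepare Flex ops such as FlexTensorListReserve.
# "efficientdet_d0.tflite" is a placeholder path.
import numpy as np
from tflite_runtime import interpreter as tflite

interpreter = tflite.Interpreter("efficientdet_d0.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
dummy_input = np.zeros(input_details[0]['shape'], dtype=input_details[0]['dtype'])
interpreter.set_tensor(input_details[0]['index'], dummy_input)
interpreter.invoke()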

@google-coral-bot google-coral-bot bot added comp:compiler Compiler related issues comp:model Model related issues labels Sep 29, 2021
@hjonnala hjonnala added the type:support Support question or issue label Sep 29, 2021
@hjonnala
Contributor

Hi, can you please share the tflite model?

@JoniSuominen
Author

Not able to share that specific model as it's private, but I'll see if I can reproduce it with a publicly available model.

@hjonnala
Contributor

Okay, can you try installing the tflite runtime and testing inference in a Colab notebook?

! python3 -m pip install "https://github.com/google-coral/pycoral/releases/download/v2.0.0/tflite_runtime-2.5.0.post1-cp37-cp37m-linux_x86_64.whl"

from tflite_runtime import interpreter as tflite
import numpy as np

# Load the model and allocate its tensors.
interpreter = tflite.Interpreter("model_path.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed random data of the correct shape and dtype, then run inference.
input_shape = input_details[0]['shape']
input_data = np.array(np.random.random_sample(input_shape), dtype=input_details[0]['dtype'])
interpreter.set_tensor(input_details[0]['index'], input_data)

interpreter.invoke()

output_data = interpreter.get_tensor(output_details[0]['index'])
print(output_data)

@mbrooksx
Member

EfficientDet-D0 is not compatible with the Edge TPU. You'll need to use an EfficientDet-Lite variant (e.g. EfficientDet-Lite0). You can either use the ones on TFHub or in the Coral test_data repo, or use TFLite Model Maker to retrain.
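
For the Model Maker route, a minimal retraining sketch (the CSV path, file names, and hyperparameters below are placeholders, not a verified recipe):

# Hedged sketch: retraining EfficientDet-Lite0 with TFLite Model Maker.
# 'your_dataset.csv' and the hyperparameters are placeholders.
from tflite_model_maker import model_spec, object_detector

spec = model_spec.get('efficientdet_lite0')
train_data, validation_data, test_data = object_detector.DataLoader.from_csv('your_dataset.csv')

model = object_detector.create(train_data,
                               model_spec=spec,
                               epochs=50,
                               batch_size=8,
                               train_whole_model=True,
                               validation_data=validation_data)

# The exported .tflite can then be passed to edgetpu_compiler.
model.export(export_dir='.', tflite_filename='efficientdet_lite0.tflite')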

@JoniSuominen
Author

EfficientDet-D0 is not compatible with the Edge TPU. You'll need to use an EfficientDet-Lite variant (e.g. EfficientDet-Lite0). You can either use the ones on TFHub or in the Coral test_data repo, or use TFLite Model Maker to retrain.

Sorry for the late reply! That solves the issue; the Lite models seem to work fine 👍

@google-coral-bot

Are you satisfied with the resolution of your issue?

@HeywardLiu

HeywardLiu commented Jul 15, 2022

Okay, can you try installing the tflite runtime and testing inference in a Colab notebook?

@hjonnala
Hi, I'm facing the same issue while compiling a model for the Edge TPU.
The error below is similar to the original poster's:

Edge TPU Compiler version 16.0.384591198
Started a compilation timeout timer of 180 seconds.
ERROR: Regular TensorFlow ops are not supported by this interpreter. Make sure you apply/link the Flex delegate before inference.
ERROR: Node number 68 (FlexErf) failed to prepare.

Compilation failed: Model failed in Tflite interpreter. Please ensure model can be loaded/run in Tflite interpreter.
Compilation child process completed within timeout period.
Compilation failed! 

The model I'm trying to compile was converted PyTorch -> ONNX -> TensorFlow -> fully quantized TFLite.
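
For context, a rough sketch of the final TensorFlow -> TFLite step of that pipeline (paths, input shape, and calibration data are placeholders; allowing SELECT_TF_OPS is what ends up emitting Flex ops such as FlexErf):

# Hedged sketch of the SavedModel -> fully quantized TFLite step.
# Paths, input shape, and the representative dataset are placeholders.
import numpy as np
import tensorflow as tf

def representative_dataset():
    for _ in range(100):
        # Placeholder calibration data; real preprocessed images belong here.
        yield [np.random.rand(1, 224, 224, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_saved_model("deit_saved_model")
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS_INT8,
    tf.lite.OpsSet.SELECT_TF_OPS,  # ops without a builtin kernel become Flex ops (e.g. FlexErf)
]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8
tflite_model = converter.convert()

with open("fully_quant_deit.tflite", "wb") as f:
    f.write(tflite_model)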
Here are the error messages from the inference code you posted:

ValueError                                Traceback (most recent call last)
<ipython-input-1-97f0669c5021> in <module>()
      1 from tflite_runtime import interpreter as tflite
      2 import numpy as np
----> 3 interpreter = tflite.Interpreter("fully_quant_from_onnx_deit_tiny_distilled_patch16_224.tflite")
      4 interpreter.allocate_tensors()
      5 

/usr/local/lib/python3.7/dist-packages/tflite_runtime/interpreter.py in __init__(self, model_path, model_content, experimental_delegates, num_threads, experimental_op_resolver_type, experimental_preserve_all_tensors)
    349               model_path, op_resolver_id, custom_op_registerers_by_name,
    350               custom_op_registerers_by_func,
--> 351               experimental_preserve_all_tensors))
    352       if not self._interpreter:
    353         raise ValueError('Failed to open {}'.format(model_path))

ValueError: No subgraph in the model.

I also found a post (#274) saying that the Edge TPU compiler doesn't support the 5D transpose operation.
So I took a look at my model in Netron, and there are indeed additional 5D transpose operations.
Is the 5D transpose related to this compilation error?
Based on that post, is it impossible to run my model on the Edge TPU until the compiler supports the 5D transpose operation?
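
For what it's worth, a small sketch of how to list the ops that actually ended up in the .tflite file (assuming TensorFlow 2.9+ for the experimental analyzer; the file name is the one from the traceback above):

# Hedged sketch: dump the ops in the converted model to see which Flex ops
# (e.g. FlexErf) and which transpose shapes are present. Requires TF >= 2.9.
import tensorflow as tf

tf.lite.experimental.Analyzer.analyze(
    model_path="fully_quant_from_onnx_deit_tiny_distilled_patch16_224.tflite")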

Thank you for your attention!

@ankitmaurya001

Hi @JoniSuominen, can you please share the steps to build a tflite_runtime wheel with TF ops (Flex) support for the Edge TPU?
