
Support for openVINO 2020 is missing #1179

Closed
MichelEhmen opened this issue Feb 21, 2020 · 9 comments · Fixed by #1767
@MichelEhmen

I converted my YOLOv3 .pb model with OpenVINO 2020. After a successful conversion I tried to upload the model and received the following error:
    Checking request has returned the "failed" status. Message: Exception: Model was not properly created/updated.
    Test failed: Cannot load library '/opt/intel/openvino_2020.1.023/deployment_tools/inference_engine/lib/intel64/libcpu_extension_avx2.so':
    /opt/intel/openvino_2020.1.023/deployment_tools/inference_engine/lib/intel64/libcpu_extension_avx2.so: cannot open shared object file: No such file or directory

I also tried a conversion with openVINO (2019 R4) which already failed in the conversion process with the following error:
    Model Optimizer version: 2019.3.0-375-g332562022
    [ ERROR ] List of operations that cannot be converted to Inference Engine IR:
    [ ERROR ] FusedBatchNormV3 (72)
    [ ERROR ] detector/darknet-53/Conv/BatchNorm/FusedBatchNormV3
    [ ERROR ] detector/darknet-53/Conv_1/BatchNorm/FusedBatchNormV3
    (… the same error repeats for detector/darknet-53/Conv_2 through Conv_51 …)
    [ ERROR ] detector/yolo-v3/Conv/BatchNorm/FusedBatchNormV3
    [ ERROR ] detector/yolo-v3/Conv_1/BatchNorm/FusedBatchNormV3
    (… and likewise for detector/yolo-v3/Conv_2 through Conv_21 …)
    [ ERROR ] Part of the nodes was not converted to IR. Stopped.

So at the moment I don't see any way to import a YOLOv3 model.

@benhoff
Contributor

benhoff commented Feb 23, 2020

I'm guessing the filename just changed; CVAT hardcodes the extension library name. See here: https://github.com/opencv/cvat/blob/8caa1695c52aab06574fc8651840ff6e80baa072/cvat/apps/auto_annotation/inference_engine.py#L28

You could list the files in /opt/intel/openvino_2020.1.023/deployment_tools/inference_engine/lib/intel64/ and change the above method call to match what's actually in the directory. The CVAT devs are pretty chill about accepting patches.
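For instance, rather than hardcoding one filename, the loader could scan the directory for whichever CPU-extension library is present. A minimal sketch (`find_cpu_extension` is a hypothetical helper, not part of CVAT's code):

```python
import os


def find_cpu_extension(lib_dir):
    # Hypothetical helper: return the path of the first CPU-extension
    # shared library found in an OpenVINO lib directory, or None if the
    # release ships none at all.
    try:
        entries = sorted(os.listdir(lib_dir))
    except FileNotFoundError:
        return None
    for name in entries:
        if name.startswith("libcpu_extension") and name.endswith(".so"):
            return os.path.join(lib_dir, name)
    return None
```

A `None` result would then signal that no extension needs to be loaded for this release.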

@MichelEhmen
Author

Unfortunately there is no similar file in the mentioned directory. I also found this answer: https://software.intel.com/en-us/forums/intel-distribution-of-openvino-toolkit/topic/848825

Is there any other possibility to fix this issue?

@benhoff
Contributor

benhoff commented Feb 24, 2020

Can you list the files in that directory? What operating system are you using? Docker? Mac OSX? Ubuntu?

Yes, it can be patched.

@MichelEhmen
Author

I attached a screenshot with the files in that directory. It's currently running in a Docker container with Ubuntu.

@benhoff
Contributor

benhoff commented Feb 26, 2020

Ah, thanks for the screenshot. It's not as easy as I thought it was.

In general, the OpenVINO code needs to be migrated to the new OpenVINO Core API anyway. See here for reference.

If I'm reading the documents correctly, line 28 in this code file needs to be changed to load libinference_engine.so instead of libcpu_extension_avx2:

https://github.com/opencv/cvat/blob/8caa1695c52aab06574fc8651840ff6e80baa072/cvat/apps/auto_annotation/inference_engine.py#L28

However, if we put this change into the main code base, it would fail for past versions of OpenVINO.
The best thing to do would be to check the version. It doesn't look like the Python package has a version attribute (from a quick glance; maybe it's somewhere in the tools module?). However, the inference_engine package in OpenVINO 2019 R3 has a get_version call.

Putting in an if statement that looked something like

if openvino.inference_engine.get_version() > XXXXXXX:
    # use libinference_engine.so instead of libcpu_extension_avx2
else:
    # use libcpu_extension_avx2

would suffice. Note that get_version appears to return a string, which would require some parsing.
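Since the version string has to be parsed anyway, the comparison could live in a small helper. A minimal sketch, where `needs_cpu_extension` is a hypothetical name and the shape of the string returned by get_version (something like "2.1.custom_releases/2019/R3_<hash>", with the release year embedded) is an assumption:

```python
import re


def needs_cpu_extension(version_string):
    # Hypothetical helper: OpenVINO releases before 2020 still ship
    # libcpu_extension_avx2.so; from 2020.1 on the extension no longer
    # exists as a separate library. We assume the version string embeds
    # the four-digit release year, e.g. "2.1.custom_releases/2019/R3_...".
    match = re.search(r"(20\d{2})", version_string)
    if match is None:
        return True  # cannot parse the version: fall back to the old behaviour
    return int(match.group(1)) < 2020
```

The loader would then pick libcpu_extension_avx2.so only when this returns True.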

@rvorias

rvorias commented Feb 27, 2020

@benhoff I tried to implement your suggestion; it's not working:

    Checking request has returned the "failed" status. Message: Exception: Model was not properly created/updated.
    Test failed: dlSym cannot locate method 'CreateExtension': /opt/intel/openvino_2020.1.023/deployment_tools/inference_engine/lib/intel64/libinference_engine.so: undefined symbol: CreateExtension

Currently trying to roll back to 2019 R3.

(A comment from @benhoff has been minimized.)

@alalek

alalek commented Feb 27, 2020

use libinference_engine.so instead of libcpu_extension_avx2

This is wrong.
You should do nothing in this case instead (OpenVINO 2020.1): the cpu_extension content has been merged into the MKLDNN plugin, and it is used automatically.

@nmanovic nmanovic added the enhancement New feature or request label Feb 29, 2020
@nmanovic nmanovic added this to the 1.0.0 - Release milestone Feb 29, 2020
@benhoff benhoff mentioned this issue Mar 15, 2020
@nmanovic nmanovic modified the milestones: 1.0.0-release, 1.1.0-beta May 23, 2020
@nmanovic nmanovic self-assigned this Jul 27, 2020
@nmanovic nmanovic linked a pull request Jul 29, 2020 that will close this issue
@nmanovic
Contributor

Should be fixed by #1767
