
OpNotImplemented: The following operators are not supported for frontend ONNX: Softplus #7176

Closed
FelixFu520 opened this issue Dec 29, 2020 · 5 comments


@FelixFu520

When I convert an ONNX model to a TVM library, I get the following error:

---------------------------------------------------------------------------
OpNotImplemented                          Traceback (most recent call last)
<ipython-input-2-4fbc9b21ba7b> in <module>
     19 shape_dict = {input_name: img.shape}
     20 # Use the Relay ONNX frontend to load the exported ONNX model
---> 21 sym, params = relay.frontend.from_onnx(onnx_model, shape_dict)
     22 
     23 # Build the optimized model with TVM

~/tvm/python/tvm/relay/frontend/onnx.py in from_onnx(model, shape, dtype, opset, freeze_params)
   2746     # Use the graph proto as a scope so that ops can access other nodes if needed.
   2747     with g:
-> 2748         mod, params = g.from_onnx(graph, opset, freeze_params)
   2749     return mod, params

~/tvm/python/tvm/relay/frontend/onnx.py in from_onnx(self, graph, opset, freeze_params, get_output_expr)
   2527             msg = "The following operators are not supported for frontend ONNX: "
   2528             msg += ", ".join(unsupported_ops)
-> 2529             raise tvm.error.OpNotImplemented(msg)
   2530         # construct nodes, nodes are stored as directed acyclic graph
   2531         for node in graph.node:

OpNotImplemented: The following operators are not supported for frontend ONNX: Softplus

The code is below:

# Load the ONNX model and convert it into a *.so shared library
import onnx
import time
import tvm
import numpy as np
import tvm.relay as relay
from PIL import Image

# Start by loading the .onnx model
onnx_model = onnx.load('../models/yolov4.onnx')  # load the model

img = Image.open("street.jpg")
img = img.resize((416, 416))
img = np.array(img, dtype=np.float32)
img /= 255.0
img = img.transpose((2, 0, 1))
img = np.expand_dims(img, axis=0)

# Test on the PC's CPU first, so export with the LLVM target
# target = tvm.target.create('llvm') # x86
target = tvm.target.Target('llvm') # x86
# target = tvm.target.arm_cpu("rasp3b") # raspi
# target = 'llvm'


input_name = "input_0"  # Note: this is the input name from the previously exported ONNX model (input id 0 here)
shape_dict = {input_name: img.shape}
# Use the Relay ONNX frontend to load the exported ONNX model
sym, params = relay.frontend.from_onnx(onnx_model, shape_dict)

# Build the optimized model with TVM
with relay.build_config(opt_level=2):
    graph, lib, params = relay.build_module.build(sym, target, params=params)
    

    
dtype = 'float32'
from tvm.contrib import graph_runtime

# Export the shared library; the output path can be customized
print("Output model files")
libpath = "../models/yolov4_pc.so"
lib.export_library(libpath)

# Export the network's graph structure as a JSON file
graph_json_path = "../models/yolov4_pc.json"
with open(graph_json_path, 'w') as fo:
    fo.write(graph)

# Export the network's weight parameters
param_path = "../models/yolov4_pc.params"
with open(param_path, 'wb') as fo:
    fo.write(relay.save_param_dict(params))
# ------------- Model export finished -------------

What should I do?
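For reference, a quick way to list every operator type the model actually uses (and confirm that Softplus is the one the frontend is missing) is a short sketch like the one below, using the onnx Python API; '../models/yolov4.onnx' is the same path as in the script above.

import onnx

model = onnx.load('../models/yolov4.onnx')
# Collect the distinct operator types appearing in the graph
op_types = sorted({node.op_type for node in model.graph.node})
print(op_types)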

@junrushao
Member

junrushao commented Dec 29, 2020

Looks like Softplus is not implemented yet, so a PR is more than welcome :-)
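For anyone picking this up, converters in the ONNX frontend follow a common pattern; a minimal sketch for Softplus (mirroring the existing converters in python/tvm/relay/frontend/onnx.py, using softplus(x) = log(exp(x) + 1)) could look like this:

class Softplus(OnnxOpConverter):
    """Operator converter for Softplus."""

    @classmethod
    def _impl_v1(cls, inputs, attr, params):
        # softplus(x) = log(exp(x) + 1)
        return _op.log(_op.exp(inputs[0]) + _expr.const(1.0))

It would also need an entry in _get_convert_map, e.g. "Softplus": Softplus.get_converter(opset).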

@junrushao
Member

Thanks for your interest in TVM. Please create a thread on the discuss forum (https://discuss.tvm.apache.org/) since this is not a bug. Thanks!

@insop
Contributor

insop commented Jan 1, 2021

Softplus was added on 12/10/2020 in #7089.

@FelixFu520, you might want to pull the latest main and retry.
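(After updating, one way to check whether the local frontend now registers Softplus is a quick look at the frontend's convert map; note that _get_convert_map is an internal helper of the ONNX frontend, so this is only a debugging aid, not a public API.)

from tvm.relay.frontend.onnx import _get_convert_map

# True if the ONNX frontend can translate Softplus for the given opset
print("Softplus" in _get_convert_map(opset=11))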

@junrushao1994 @jwfromm

However, I see that SoftPlus (note the capital P) was already present.
According to the ONNX spec, the operator is Softplus, not SoftPlus.
I am not sure we need to keep both (Softplus and SoftPlus).

I have a branch that removes SoftPlus; let me know if I should create a PR.
https://github.com/insop/incubator-tvm/commit/1e944644680188f31ada93a7c4ec7de797a1a0e1.patch

From 1e944644680188f31ada93a7c4ec7de797a1a0e1 Mon Sep 17 00:00:00 2001
From: Insop Song <x@y.z>
Date: Thu, 31 Dec 2020 18:53:33 -0800
Subject: [PATCH] Remove seemingly invalid SoftPlus

- `Softplus` was added on 12/10/2020 in https://github.com/apache/tvm/pull/7089
- However, `SoftPlus` (note the capital P) was already present.
According to the [Onnx spec](https://github.com/onnx/onnx/blob/master/docs/Operators.md), it is `Softplus`, not `SoftPlus`.
---
 python/tvm/relay/frontend/onnx.py          | 9 ---------
 tests/python/frontend/onnx/test_forward.py | 1 -
 2 files changed, 10 deletions(-)

diff --git a/python/tvm/relay/frontend/onnx.py b/python/tvm/relay/frontend/onnx.py
index 6122c81d321..1c544d30971 100644
--- a/python/tvm/relay/frontend/onnx.py
+++ b/python/tvm/relay/frontend/onnx.py
@@ -932,14 +932,6 @@ def _impl_v1(cls, inputs, attr, params):
         return _op.tanh(_expr.const(beta) * inputs[0]) * _expr.const(alpha)
 
 
-class SoftPlus(OnnxOpConverter):
-    """Operator converter for SoftPlus."""
-
-    @classmethod
-    def _impl_v1(cls, inputs, attr, params):
-        return _op.log(_op.exp(inputs[0]) + _expr.const(1.0))
-
-
 class Softsign(OnnxOpConverter):
     """Operator converter for Softsign."""
 
@@ -2661,7 +2653,6 @@ def _get_convert_map(opset):
         "OneHot": OneHot.get_converter(opset),
         # 'Hardmax'
         "Softsign": Softsign.get_converter(opset),
-        "SoftPlus": SoftPlus.get_converter(opset),
         "Gemm": Gemm.get_converter(opset),
         "MatMul": MatMul.get_converter(opset),
         "Mod": Mod.get_converter(opset),
diff --git a/tests/python/frontend/onnx/test_forward.py b/tests/python/frontend/onnx/test_forward.py
index 33dd048896b..3d95a9a83ee 100644
--- a/tests/python/frontend/onnx/test_forward.py
+++ b/tests/python/frontend/onnx/test_forward.py
@@ -1983,7 +1983,6 @@ def verify_single_ops(op, x, out_np, rtol=1e-5, atol=1e-5):
     verify_single_ops("Tanh", x, np.tanh(x))
     verify_single_ops("Sigmoid", x, 1 / (1 + np.exp(-x)))
     verify_single_ops("Softsign", x, x / (1 + np.abs(x)))
-    verify_single_ops("SoftPlus", x, np.log(1 + np.exp(x)))
 
 
 @tvm.testing.uses_gpu
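For what it's worth, test coverage for the spec-compliant name could be kept in the same style as the surrounding verify_single_ops calls, e.g. (a sketch that assumes the Softplus converter from #7089 is registered):

verify_single_ops("Softplus", x, np.log(1 + np.exp(x)))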

@jwfromm
Contributor

jwfromm commented Jan 2, 2021

Thanks for catching that, I actually totally missed that we had a SoftPlus operator. I agree it's silly to have both. Thanks for removing the redundant one!

@insop
Contributor

insop commented Jan 2, 2021

> Thanks for catching that, I actually totally missed that we had a SoftPlus operator. I agree it's silly to have both. Thanks for removing the redundant one!

@jwfromm
PR created: #7189
