
[Bug] [Relay] [Keras]relay.concatenate requires all tensors have the same shape on non-concatenating axes #15174

Closed
jikechao opened this issue Jun 28, 2023 · 0 comments
Labels: needs-triage (PRs or issues that need to be investigated by maintainers to find the right assignees to address it), type: bug

jikechao (Contributor) commented Jun 28, 2023

For the input:

keras.layers.Concatenate(axis=2)
input_shape1 = [1, 1, 4, 5]
input_shape2 = [1, 1, 8, 5]

The two input tensors have the same shape on all non-concatenating axes; they differ only on the concatenation axis (axis=2), so this concatenation is valid.
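As a sanity check (not part of the original report), the shapes above can be concatenated along axis 2 with plain NumPy, which confirms the operation itself is well-formed and the error lies in how the frontend/type inference handles it:

```python
import numpy as np

# Dummy tensors with the shapes from the bug report.
a = np.zeros((1, 1, 4, 5))
b = np.zeros((1, 1, 8, 5))

# Concatenation along axis 2 succeeds: all axes other than
# the concatenation axis agree between the two inputs.
out = np.concatenate([a, b], axis=2)
print(out.shape)  # (1, 1, 12, 5)
```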

However, TVM threw an unexpected exception:

Traceback (most recent call last):
  8: TVMFuncCall
  7: tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::runtime::TypedPackedFunc<tvm::IRModule (tvm::transform::Pass, tvm::IRModule)>::AssignTypedLambda<tvm::transform::$_6>(tvm::transform::$_6, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >)::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}> >::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)
  6: tvm::transform::Pass::operator()(tvm::IRModule) const
  5: tvm::transform::Pass::operator()(tvm::IRModule, tvm::transform::PassContext const&) const
  4: tvm::transform::ModulePassNode::operator()(tvm::IRModule, tvm::transform::PassContext const&) const
  3: tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::runtime::TypedPackedFunc<tvm::IRModule (tvm::IRModule, tvm::transform::PassContext)>::AssignTypedLambda<tvm::relay::transform::InferType()::$_2>(tvm::relay::transform::InferType()::$_2)::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}> >::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)
  2: tvm::relay::TypeInferencer::Infer(tvm::GlobalVar, tvm::relay::Function)
  1: tvm::relay::TypeSolver::Solve()
  0: _ZN3tvm7runtime6detail
  File "/workplace/software/tvm/tvm/src/relay/analysis/type_solver.cc", line 643
TVMError: 
---------------------------------------------------------------
An error occurred during the execution of TVM.
For more information, please see: https://tvm.apache.org/docs/errors.html
---------------------------------------------------------------
  Check failed: (false) is false: relay.concatenate requires all tensors have the same shape on non-concatenating axes
### Triage

* relay:analysis
* needs-triage


cc @shingjan
@jikechao jikechao added the needs-triage and type: bug labels on Jun 28, 2023
@jikechao jikechao changed the title from "[Bug] [Relay] relay.concatenate requires all tensors have the same shape on non-concatenating axes" to "[Bug] [Relay] [Keras] relay.concatenate requires all tensors have the same shape on non-concatenating axes" on Jun 28, 2023
@echuraev echuraev closed this as completed Jul 3, 2023