
[FRONTEND][TF][Bug] output type assignment not work for tf.range() #4265

Closed
FinnWeng commented Nov 6, 2019

This issue occurs when converting TensorFlow code that uses tf.range().

My environment is:
- development: Python 3.6, TensorFlow 1.14
- TVM conversion: the tvmai/demo-gpu container

The code triggering the issue is:

my_tensor = tf.reshape(tf.range(1, 256 + 1, 1, dtype=tf.float32), [1, 256])

The error log is:

tvm._ffi.base.TVMError: Traceback (most recent call last):
  [bt] (8) /usr/tvm/build/libtvm.so(TVMFuncCall+0x61) [0x7f8c084d70f1]
  [bt] (7) /usr/tvm/build/libtvm.so(+0xb1d64b) [0x7f8c083e264b]
  [bt] (6) /usr/tvm/build/libtvm.so(tvm::relay::ModuleNode::FromExpr(tvm::relay::Expr const&, tvm::Map<tvm::relay::GlobalVar, tvm::relay::Function, void, void> const&, tvm::Map<tvm::relay::GlobalTypeVar, tvm::relay::TypeData, void, void> const&)+0x17b) [0x7f8c083e236b]
  [bt] (5) /usr/tvm/build/libtvm.so(tvm::relay::ModuleNode::Add(tvm::relay::GlobalVar const&, tvm::relay::Function const&, bool)+0x344) [0x7f8c083df2d4]
  [bt] (4) /usr/tvm/build/libtvm.so(tvm::relay::InferType(tvm::relay::Function const&, tvm::relay::Module const&, tvm::relay::GlobalVar const&)+0x1fd) [0x7f8c082c7ced]
  [bt] (3) /usr/tvm/build/libtvm.so(tvm::relay::TypeInferencer::Infer(tvm::relay::Expr)+0x55) [0x7f8c082c6e65]
  [bt] (2) /usr/tvm/build/libtvm.so(tvm::relay::TypeSolver::Solve()+0x4e1) [0x7f8c08306781]
  [bt] (1) /usr/tvm/build/libtvm.so(std::_Function_handler<void (tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*), void tvm::runtime::TypedPackedFunc<bool (tvm::Array<tvm::relay::Type, void> const&, int, tvm::Attrs const&, tvm::relay::TypeReporter const&)>::AssignTypedLambda<bool (*)(tvm::Array<tvm::relay::Type, void> const&, int, tvm::Attrs const&, tvm::relay::TypeReporter const&)>(bool (*)(tvm::Array<tvm::relay::Type, void> const&, int, tvm::Attrs const&, tvm::relay::TypeReporter const&))::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}>::_M_invoke(std::_Any_data const&, tvm::runtime::TVMArgs&&, tvm::runtime::TVMRetValue*&&)+0xd4) [0x7f8c0809dc74]
  [bt] (0) /usr/tvm/build/libtvm.so(tvm::relay::BroadcastRel(tvm::Array<tvm::relay::Type, void> const&, int, tvm::Attrs const&, tvm::relay::TypeReporter const&)+0xb7c) [0x7f8c080d0efc]
  File "/usr/tvm/src/relay/ir/error.cc", line 133
TVMError:
Error(s) have occurred. The program has been annotated with them:

In `main`:
v0.0.4
fn () {
  %0 = arange(1f, 257f, 1f, start=meta[relay.Constant][0], stop=meta[relay.Constant][1], step=meta[relay.Constant][2], dtype="int32") unable to unify: `int32` and `float32`; ;
  %1 = reshape(%0, newshape=[1, 256]);
  multiply(%1, meta[relay.Constant][3]) an internal invariant was violated while typechecking your program [08:25:28] /usr/tvm/src/relay/op/type_relations.cc:121: Check failed: t0->dtype == t1->dtype (int32 vs. float32) :
;
}
// meta data omitted. you can use show_meta_data=True to include meta data
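The annotated Relay program above shows the symptom: arange is emitted with dtype="int32" even though its start/stop/step constants are float32, so the later multiply fails to unify int32 with float32. A plausible cause (a hypothetical sketch, not the actual TVM frontend source) is a Range converter that defaults the output dtype instead of reading the dtype attribute recorded on the TF node:

```python
# Hypothetical sketch of a TF -> Relay Range converter.
# Function names and the attrs dict are illustrative, not TVM's real code;
# "Tidx" is the attribute where a TF Range node records its dtype.

def convert_range_buggy(attrs):
    # Bug: always defaults to int32, ignoring the node's dtype attribute,
    # so tf.range(..., dtype=tf.float32) becomes arange(..., dtype="int32").
    return {"op": "arange", "dtype": "int32"}

def convert_range_fixed(attrs):
    # Fix: honor the dtype recorded on the TF node, falling back to int32
    # only when no attribute is present (tf.range's documented default).
    return {"op": "arange", "dtype": attrs.get("Tidx", "int32")}

tf_node_attrs = {"Tidx": "float32"}  # what tf.range(..., dtype=tf.float32) records
print(convert_range_buggy(tf_node_attrs)["dtype"])  # int32 -> downstream type error
print(convert_range_fixed(tf_node_attrs)["dtype"])  # float32 -> unifies with multiply
```

With the fixed behavior, the emitted arange dtype matches the float32 constants, and the BroadcastRel check in the traceback would pass.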

I worked around the issue with:

my_tensor = tf.cast(tf.reshape(tf.range(1, 256 + 1, 1), [1, 256]), tf.float32)

The code above converts fine, so I suspect the issue is with the output dtype assignment for tf.range().
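The two formulations can be mimicked in NumPy to confirm they produce identical values and differ only in when the dtype is assigned (a sketch for illustration, not TVM or TF code):

```python
import numpy as np

# Direct float range, mirroring the failing form:
#   tf.reshape(tf.range(1, 257, 1, dtype=tf.float32), [1, 256])
direct = np.arange(1, 257, 1, dtype=np.float32).reshape(1, 256)

# Workaround: integer range first, cast afterwards, mirroring:
#   tf.cast(tf.reshape(tf.range(1, 257, 1), [1, 256]), tf.float32)
workaround = np.arange(1, 257, 1).reshape(1, 256).astype(np.float32)

assert direct.dtype == workaround.dtype == np.float32
assert np.array_equal(direct, workaround)
```

The workaround sidesteps the converter by letting tf.range produce its default int32 output and moving the float32 conversion into an explicit Cast node, which the frontend handles correctly.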

Thanks!

@tqchen tqchen changed the title [RELAY][Bug] output type assignment not work for tf.range() in TVM [RELAY][FRONTEND][TF][Bug] output type assignment not work for tf.range() Nov 6, 2019
@tqchen tqchen changed the title [RELAY][FRONTEND][TF][Bug] output type assignment not work for tf.range() [FRONTEND][TF][Bug] output type assignment not work for tf.range() Nov 6, 2019
@srkreddy1238 (Contributor) commented:

@FinnWeng ref. #4294 please confirm if this is solved.
