I cannot run the code successfully. With the default configuration, the error "ConcatOp : Dimensions of inputs should match" appears, and when I change model_type to other values they all fail as well. Even with model_type = mlp, it fails as follows:
```
Traceback (most recent call last):
  File "./run_dnn.py", line 912, in
    train(wnd_conf, args['model_ckpt'])
  File "./run_dnn.py", line 154, in train
    tower_train_logits = inf.inference(tower_batch_features, is_train=True)
  File "/notebook/dmtfq/CIKM2020_DMT/DMT_code/model/inference_mlp.py", line 118, in inference
    return self.model.inference(inputs,is_train,is_predict)
TypeError: inference() takes from 2 to 3 positional arguments but 4 were given
```
When I modify the parameters of the mlp inference() function, it runs a few more steps but still fails in the end.
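For reference, the TypeError comes from inference_mlp.py forwarding three arguments (inputs, is_train, is_predict) to a model whose inference() only accepts two. A minimal sketch of the kind of signature change I made (class and helper names here are assumptions, not the repo's actual code):

```python
# Sketch only: class/helper names are assumptions, not the actual DMT code.
class MLPModel:
    def inference(self, inputs, is_train=False, is_predict=False):
        # Accepting is_predict with a default keeps the existing call in
        # inference_mlp.py -- self.model.inference(inputs, is_train, is_predict) --
        # from raising "takes from 2 to 3 positional arguments but 4 were given".
        return self._forward(inputs, is_train)

    def _forward(self, inputs, is_train):
        # Placeholder forward pass; the real model builds the MLP layers here.
        return inputs
```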
I solved this bug by deleting some data about emb and attention_embed in conf/settings/conf, because some of the data provided by the author is missing. By the way, there are also bugs in other models such as mmoe and mlp.
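For anyone hitting the ConcatOp error: it is the standard TensorFlow failure when tensors being concatenated disagree on a non-concat dimension, which is what happens when the conf lists an embedding whose data is missing. A minimal, standalone reproduction (not the repo's code; the shapes are made up):

```python
import tensorflow as tf

# Two "embeddings" with different widths, e.g. because a feature listed in the
# conf has no matching embedding data.
emb_a = tf.zeros([4, 8])    # batch of 4, embedding size 8
emb_b = tf.zeros([4, 16])   # batch of 4, embedding size 16

try:
    tf.concat([emb_a, emb_b], axis=0)  # axis-0 concat needs every other dim to match
except (tf.errors.InvalidArgumentError, ValueError) as e:
    print(e)  # e.g. "ConcatOp : Dimensions of inputs should match: ..."
```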