
Bypass the export model step when training on TPUs, since this needs inference to be supported on TPUs. Remove this check once inference is supported. (tensorflow#5209)
aman2930 authored and Taylor Robie committed Aug 30, 2018
1 parent 2aec950 commit 23b5b42
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion in official/transformer/transformer_main.py
@@ -610,7 +610,7 @@ def run_transformer(flags_obj):
         bleu_threshold=flags_obj.stop_threshold,
         vocab_file=flags_obj.vocab_file)
 
-  if flags_obj.export_dir:
+  if flags_obj.export_dir and not params["use_tpu"]:
     serving_input_fn = export.build_tensor_serving_input_receiver_fn(
         shape=[None], dtype=tf.int64, batch_size=None)
     # Export saved model, and save the vocab file as an extra asset. The vocab
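For context, below is a minimal sketch of the guarded export step as it reads after this change. It assumes a TF 1.x Estimator (`estimator`), the `flags_obj` and `params` objects used in `run_transformer`, and that `export` is the repo's export helper module; the `maybe_export_model` wrapper, the import path, and the "vocab.txt" asset name are illustrative assumptions, not the file's exact code.

# Minimal sketch of the guarded export step after this change; the
# maybe_export_model wrapper is hypothetical, and estimator, flags_obj,
# and params mirror run_transformer's locals.
import tensorflow as tf

from official.utils.export import export  # assumed path of the export helper


def maybe_export_model(estimator, flags_obj, params):
  # Bypass SavedModel export entirely when running on TPU, since inference
  # is not yet supported there (tensorflow#5209).
  if flags_obj.export_dir and not params["use_tpu"]:
    serving_input_fn = export.build_tensor_serving_input_receiver_fn(
        shape=[None], dtype=tf.int64, batch_size=None)
    # Export the saved model, and save the vocab file as an extra asset so
    # serving can tokenize inputs. The asset name here is an assumption.
    estimator.export_savedmodel(
        flags_obj.export_dir, serving_input_fn,
        assets_extra={"vocab.txt": flags_obj.vocab_file})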
