Fix doc examples: KeyError (#14699)
Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
ydshieh authored Dec 10, 2021
1 parent bab1556 commit 8395f14
Showing 2 changed files with 1 addition and 3 deletions.
@@ -512,7 +512,6 @@ def dummy_inputs(self):
     >>> UTTERANCE = "My friends are cool but they eat too many carbs."
     >>> print("Human: ", UTTERANCE)
     >>> inputs = tokenizer([UTTERANCE], return_tensors='pt')
-    >>> inputs.pop("token_type_ids")
     >>> reply_ids = model.generate(**inputs)
     >>> print("Bot: ", tokenizer.batch_decode(reply_ids, skip_special_tokens=True)[0])
     what kind of carbs do they eat? i don't know much about carbs.
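The deleted line is the source of the KeyError named in the commit title: the tokenizer output here contains no "token_type_ids" entry, so popping it fails. A minimal sketch with a hypothetical encoding dict (the real object is a BatchEncoding, but it pops like a dict) shows the failure mode and a tolerant alternative:

```python
# Hypothetical stand-in for the tokenizer output in the doc example.
encoding = {"input_ids": [[42, 7, 2]], "attention_mask": [[1, 1, 1]]}

try:
    encoding.pop("token_type_ids")  # key is absent, so this raises KeyError
except KeyError as err:
    print("raised KeyError:", err)

# Passing a default makes the pop a no-op when the key is missing:
encoding.pop("token_type_ids", None)
print(sorted(encoding))  # ['attention_mask', 'input_ids']
```

The commit simply removes the pop, which is the cleaner fix when the key is never produced in the first place.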
@@ -517,12 +517,11 @@ def serving(self, inputs):
     >>> from transformers import BlenderbotSmallTokenizer, TFBlenderbotSmallForConditionalGeneration
     >>> mname = 'facebook/blenderbot_small-90M'
     >>> model = BlenderbotSmallForConditionalGeneration.from_pretrained(mname)
-    >>> tokenizer = TFBlenderbotSmallTokenizer.from_pretrained(mname)
+    >>> tokenizer = BlenderbotSmallTokenizer.from_pretrained(mname)
     >>> UTTERANCE = "My friends are cool but they eat too many carbs."
     >>> print("Human: ", UTTERANCE)
     >>> inputs = tokenizer([UTTERANCE], return_tensors='tf')
-    >>> inputs.pop("token_type_ids")
     >>> reply_ids = model.generate(**inputs)
     >>> print("Bot: ", tokenizer.batch_decode(reply_ids, skip_special_tokens=True)[0])
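Removing the pop is what the commit does; a more defensive pattern (not used by the commit, shown here only as an illustrative sketch with a toy `fake_generate` stand-in) is to forward only the keys the target callable actually declares, so a stray tokenizer output key can never surface as an unexpected argument:

```python
import inspect

def fake_generate(input_ids=None, attention_mask=None):
    """Toy stand-in for model.generate; the real method accepts many more kwargs."""
    return input_ids, attention_mask

encoding = {
    "input_ids": [[1, 2]],
    "attention_mask": [[1, 1]],
    "token_type_ids": [[0, 0]],  # not a parameter of fake_generate
}

# Keep only the entries whose names are parameters of the callable.
accepted = set(inspect.signature(fake_generate).parameters)
filtered = {k: v for k, v in encoding.items() if k in accepted}
print(sorted(filtered))  # ['attention_mask', 'input_ids']
```

This trades a little introspection overhead for robustness against future changes in what the tokenizer returns.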
