Merge pull request #10 from ammarasmro/patch-1
Fix (are -> care) typo in README.md
jacobdevlin-google authored Oct 31, 2018
2 parents aff0e6a + 57f9528 commit 6d6e691
Showing 1 changed file with 1 addition and 1 deletion.
README.md
@@ -43,7 +43,7 @@ minutes.

 BERT is a method of pre-training language representations, meaning that we train a
 general-purpose "language understanding" model on a large text corpus (like
-Wikipedia), and then use that model for downstream NLP tasks that we are about
+Wikipedia), and then use that model for downstream NLP tasks that we care about
 (like question answering). BERT outperforms previous methods because it is the
 first *unsupervised*, *deeply bidirectional* system for pre-training NLP.

