From fa2ff5fffe77f8b47def19385f294d8441b21e06 Mon Sep 17 00:00:00 2001
From: Steven
Date: Thu, 11 Aug 2022 18:03:50 -0700
Subject: [PATCH 1/4] 📝 update docs landing page
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 docs/source/en/index.mdx | 23 ++++++++---------------
 1 file changed, 8 insertions(+), 15 deletions(-)

diff --git a/docs/source/en/index.mdx b/docs/source/en/index.mdx
index 82053b11effdda..3d5bf8d563c368 100644
--- a/docs/source/en/index.mdx
+++ b/docs/source/en/index.mdx
@@ -12,18 +12,13 @@ specific language governing permissions and limitations under the License.

 # 🤗 Transformers

-State-of-the-art Machine Learning for PyTorch, TensorFlow and JAX.
+State-of-the-art Transformers models in PyTorch, TensorFlow, and JAX for research and production.

-🤗 Transformers provides APIs to easily download and train state-of-the-art pretrained models. Using pretrained models can reduce your compute costs, carbon footprint, and save you time from training a model from scratch. The models can be used across different modalities such as:
+🤗 Transformers is dedicated to providing inference, finetuning, and pretraining for Transformer-based models on text, computer vision, speech, and multimodal tasks. Each model is defined by three classes - a configuration, a model, and a preprocessor - which can be initialized from pretrained instances stored on the [Hugging Face Hub](https://huggingface.co/models). The library also offers convenient helper tools like a simple pipeline to use a model for inference on a given task and a feature-complete training loop optimized for 🤗 Transformers models.

-* 📝 Text: text classification, information extraction, question answering, summarization, translation, and text generation in over 100 languages.
-* 🖼️ Images: image classification, object detection, and segmentation.
-* 🗣️ Audio: speech recognition and audio classification.
-* 🐙 Multimodal: table question answering, optical character recognition, information extraction from scanned documents, video classification, and visual question answering.
+Models are available in PyTorch, TensorFlow, and JAX. This framework interoperability grants the flexibility to use a different framework at each stage of a model's lifetime. 🤗 Transformers also supports exporting a model to a format like ONNX and TorchScript for deployment in production environments.

-Our library supports seamless integration between three of the most popular deep learning libraries: [PyTorch](https://pytorch.org/), [TensorFlow](https://www.tensorflow.org/) and [JAX](https://jax.readthedocs.io/en/latest/). Train your model in three lines of code in one framework, and load it for inference with another.
-
-Each 🤗 Transformers architecture is defined in a standalone Python module so they can be easily customized for research and experiments.
+Join the growing community on the [Hub](https://huggingface.co/models), [forum](https://discuss.huggingface.co/), or [Discord](https://discord.com/invite/JfAtkvEtRb) today.

 ## If you are looking for custom support from the Hugging Face team

@@ -33,7 +28,7 @@ Each 🤗 Transformers architecture is defined in a standalone Python module so

 ## Contents

-The documentation is organized in five parts:
+The documentation is organized into five sections:

 - **GET STARTED** contains a quick tour and installation instructions to get up and running with 🤗 Transformers.
 - **TUTORIALS** are a great place to begin if you are new to our library. This section will help you gain the basic skills you need to start using 🤗 Transformers.
@@ -41,11 +36,9 @@ The documentation is organized in five parts:
 - **CONCEPTUAL GUIDES** provides more discussion and explanation of the underlying concepts and ideas behind models, tasks, and the design philosophy of 🤗 Transformers.
 - **API** describes each class and function, grouped in:

-  - **MAIN CLASSES** for the main classes exposing the important APIs of the library.
-  - **MODELS** for the classes and functions related to each model implemented in the library.
-  - **INTERNAL HELPERS** for the classes and functions we use internally.
-
-The library currently contains JAX, PyTorch and TensorFlow implementations, pretrained model weights, usage scripts and conversion utilities for the following models.
+  - **MAIN CLASSES** details the most important classes like configuration, model, tokenizer, and pipeline.
+  - **MODELS** details the classes and functions related to each model implemented in the library.
+  - **INTERNAL HELPERS** details utility classes and functions used internally.

 ### Supported models

From f1866a04fc07540e1651c6128af28163bc9b8c9c Mon Sep 17 00:00:00 2001
From: Steven
Date: Wed, 17 Aug 2022 15:45:14 -0700
Subject: [PATCH 2/4] 🖍 apply feedback
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 docs/source/en/index.mdx | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/source/en/index.mdx b/docs/source/en/index.mdx
index 3d5bf8d563c368..70af62a4ade719 100644
--- a/docs/source/en/index.mdx
+++ b/docs/source/en/index.mdx
@@ -16,7 +16,7 @@ State-of-the-art Transformers models in PyTorch, TensorFlow, and JAX for researc

 🤗 Transformers is dedicated to providing inference, finetuning, and pretraining for Transformer-based models on text, computer vision, speech, and multimodal tasks. Each model is defined by three classes - a configuration, a model, and a preprocessor - which can be initialized from pretrained instances stored on the [Hugging Face Hub](https://huggingface.co/models). The library also offers convenient helper tools like a simple pipeline to use a model for inference on a given task and a feature-complete training loop optimized for 🤗 Transformers models.

-Models are available in PyTorch, TensorFlow, and JAX. This framework interoperability grants the flexibility to use a different framework at each stage of a model's lifetime. 🤗 Transformers also supports exporting a model to a format like ONNX and TorchScript for deployment in production environments.
+🤗 Transformers supports framework interoperability between PyTorch, TensorFlow, and JAX. This provides the flexibility to use a different framework at each stage of a model's life; train a model in three lines of code in one framework, and load it for inference in another framework. Models can also be exported to a format like ONNX and TorchScript for deployment in production environments.

 Join the growing community on the [Hub](https://huggingface.co/models), [forum](https://discuss.huggingface.co/), or [Discord](https://discord.com/invite/JfAtkvEtRb) today.
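The landing-page copy introduced by the two patches above rests on two API surfaces: the task `pipeline` helper and the configuration/model/preprocessor triple loaded with `from_pretrained`. A minimal sketch of what that description maps to in code, assuming `transformers` with a PyTorch backend is installed; the task string and the `bert-base-uncased` checkpoint are illustrative choices, not part of the patches:

```py
from transformers import (
    AutoConfig,
    AutoModelForSequenceClassification,
    AutoTokenizer,
    pipeline,
)

# The "simple pipeline" the new copy mentions: preprocessing, the model
# forward pass, and postprocessing bundled behind one call for a named task.
classifier = pipeline("text-classification")
print(classifier("Using pretrained models saves compute and time."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}], depending on the default checkpoint

# The three classes behind every model - configuration, preprocessor, model -
# each initialized from a pretrained instance stored on the Hugging Face Hub.
config = AutoConfig.from_pretrained("bert-base-uncased")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

# The classification head here is freshly initialized (the base checkpoint
# carries no sequence-classification weights), so a warning is expected.
inputs = tokenizer("Hello, 🤗 Transformers!", return_tensors="pt")
print(model(**inputs).logits.shape)  # torch.Size([1, 2])
```

The feature-complete training loop referred to alongside is the `Trainer` class, which consumes these same model and tokenizer objects.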
From ff20425ba8f3f8a6fc0e20d51d514aa675a9dc57 Mon Sep 17 00:00:00 2001
From: Steven
Date: Thu, 1 Sep 2022 15:22:44 -0700
Subject: [PATCH 3/4] apply feedback

---
 docs/source/en/index.mdx | 23 ++++++++++++++---------
 1 file changed, 14 insertions(+), 9 deletions(-)

diff --git a/docs/source/en/index.mdx b/docs/source/en/index.mdx
index 70af62a4ade719..c5742f537c23d8 100644
--- a/docs/source/en/index.mdx
+++ b/docs/source/en/index.mdx
@@ -12,13 +12,18 @@ specific language governing permissions and limitations under the License.

 # 🤗 Transformers

-State-of-the-art Transformers models in PyTorch, TensorFlow, and JAX for research and production.
+State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.

-🤗 Transformers is dedicated to providing inference, finetuning, and pretraining for Transformer-based models on text, computer vision, speech, and multimodal tasks. Each model is defined by three classes - a configuration, a model, and a preprocessor - which can be initialized from pretrained instances stored on the [Hugging Face Hub](https://huggingface.co/models). The library also offers convenient helper tools like a simple pipeline to use a model for inference on a given task and a feature-complete training loop optimized for 🤗 Transformers models.
+🤗 Transformers provides APIs and tools to easily download and train state-of-the-art pretrained models. Using pretrained models can reduce your compute costs and carbon footprint, and save you the time and resources required to train a model from scratch. These models support common tasks in different modalities, such as:

-🤗 Transformers supports framework interoperability between PyTorch, TensorFlow, and JAX. This provides the flexibility to use a different framework at each stage of a model's life; train a model in three lines of code in one framework, and load it for inference in another framework. Models can also be exported to a format like ONNX and TorchScript for deployment in production environments.
+📝 **Natural Language Processing**: text classification, named entity recognition, question answering, language modeling, summarization, translation, and multiple choice.
+🖼️ **Computer Vision**: image classification, object detection, and segmentation.
+🗣️ **Audio**: automatic speech recognition and audio classification.
+🐙 **Multimodal**: table question answering, optical character recognition, information extraction from scanned documents, video classification, and visual question answering.

-Join the growing community on the [Hub](https://huggingface.co/models), [forum](https://discuss.huggingface.co/), or [Discord](https://discord.com/invite/JfAtkvEtRb) today.
+🤗 Transformers supports framework interoperability between PyTorch, TensorFlow, and JAX. This provides the flexibility to use a different framework at each stage of a model's life; train a model in three lines of code in one framework, and load it for inference in another. Models can also be exported to a format like ONNX or TorchScript for deployment in production environments.
+
+Join the growing community on the [Hub](https://huggingface.co/models), [forum](https://discuss.huggingface.co/), or [Discord](https://discord.com/invite/JfAtkvEtRb) today!

 ## If you are looking for custom support from the Hugging Face team

@@ -30,11 +35,11 @@

 The documentation is organized into five sections:

-- **GET STARTED** contains a quick tour and installation instructions to get up and running with 🤗 Transformers.
-- **TUTORIALS** are a great place to begin if you are new to our library. This section will help you gain the basic skills you need to start using 🤗 Transformers.
-- **HOW-TO GUIDES** will show you how to achieve a specific goal like fine-tuning a pretrained model for language modeling or how to create a custom model head.
-- **CONCEPTUAL GUIDES** provides more discussion and explanation of the underlying concepts and ideas behind models, tasks, and the design philosophy of 🤗 Transformers.
-- **API** describes each class and function, grouped in:
+- **GET STARTED** provides a quick tour of the library and installation instructions to get up and running.
+- **TUTORIALS** are a great place to start if you're a beginner. This section will help you gain the basic skills you need to start using the library.
+- **HOW-TO GUIDES** show you how to achieve a specific goal, like finetuning a pretrained model for language modeling or how to write and share a custom model.
+- **CONCEPTUAL GUIDES** offers more discussion and explanation of the underlying concepts and ideas behind models, tasks, and the design philosophy of 🤗 Transformers.
+- **API** describes all classes and functions:

   - **MAIN CLASSES** details the most important classes like configuration, model, tokenizer, and pipeline.
   - **MODELS** details the classes and functions related to each model implemented in the library.

From 4499bce8af60f69655f71d6319ebc8084f7ba933 Mon Sep 17 00:00:00 2001
From: Steven
Date: Fri, 2 Sep 2022 09:29:22 -0700
Subject: [PATCH 4/4] apply feedback, use <br> for list
---
 docs/source/en/index.mdx | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/source/en/index.mdx b/docs/source/en/index.mdx
index c5742f537c23d8..2103f6af59ef74 100644
--- a/docs/source/en/index.mdx
+++ b/docs/source/en/index.mdx
@@ -12,13 +12,13 @@ specific language governing permissions and limitations under the License.

 # 🤗 Transformers

-State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
+State-of-the-art Machine Learning for [PyTorch](https://pytorch.org/), [TensorFlow](https://www.tensorflow.org/), and [JAX](https://jax.readthedocs.io/en/latest/).

 🤗 Transformers provides APIs and tools to easily download and train state-of-the-art pretrained models. Using pretrained models can reduce your compute costs and carbon footprint, and save you the time and resources required to train a model from scratch. These models support common tasks in different modalities, such as:

-📝 **Natural Language Processing**: text classification, named entity recognition, question answering, language modeling, summarization, translation, and multiple choice.
-🖼️ **Computer Vision**: image classification, object detection, and segmentation.
-🗣️ **Audio**: automatic speech recognition and audio classification.
+📝 **Natural Language Processing**: text classification, named entity recognition, question answering, language modeling, summarization, translation, multiple choice, and text generation.<br>
+🖼️ **Computer Vision**: image classification, object detection, and segmentation.<br>
+🗣️ **Audio**: automatic speech recognition and audio classification.<br>
 🐙 **Multimodal**: table question answering, optical character recognition, information extraction from scanned documents, video classification, and visual question answering.

 🤗 Transformers supports framework interoperability between PyTorch, TensorFlow, and JAX. This provides the flexibility to use a different framework at each stage of a model's life; train a model in three lines of code in one framework, and load it for inference in another. Models can also be exported to a format like ONNX or TorchScript for deployment in production environments.
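The interoperability paragraph that patches 2 through 4 converge on translates to code along these lines. A minimal sketch under stated assumptions: both `torch` and `tensorflow` are installed, and the `distilbert-base-uncased` checkpoint and local path are chosen purely for illustration:

```py
from transformers import (
    AutoModelForSequenceClassification,
    TFAutoModelForSequenceClassification,
)

# One stage of a model's life: work in PyTorch, then persist the checkpoint.
pt_model = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased")
pt_model.save_pretrained("./my-model")  # writes config.json plus the PyTorch weights

# Next stage: reload the same weights through the TensorFlow class of the
# same architecture; from_pt=True converts the PyTorch checkpoint on load.
tf_model = TFAutoModelForSequenceClassification.from_pretrained("./my-model", from_pt=True)
```

For the export path the copy mentions, one route documented around the time of this patch series was the `transformers.onnx` module (`python -m transformers.onnx --model=distilbert-base-uncased onnx/`), while TorchScript export goes through `torch.jit.trace` on a model loaded with `torchscript=True`.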