diff --git a/assets/86_bloom/bloom-examples.jpg b/assets/86_bloom/bloom-examples.jpg
new file mode 100644
index 0000000000..5884a6aeb6
Binary files /dev/null and b/assets/86_bloom/bloom-examples.jpg differ
diff --git a/bloom.md b/bloom.md
index c86e16710f..8fc8009730 100644
--- a/bloom.md
+++ b/bloom.md
@@ -11,7 +11,7 @@ thumbnail: /blog/assets/86_bloom/thumbnail.png
     display: block;
     margin-left: auto;
     margin-right: auto;
-    width: 50%;
+    width: 100%;
 }
 
 🌸 Introducing The World's Largest Open Multilingual Language Model: BLOOM 🌸
@@ -39,7 +39,7 @@ With its 176 billion parameters, BLOOM is able to generate text in 46 natural la
 
 Researchers can [now download, run and study BLOOM](https://huggingface.co/bigscience/bloom) to investigate the performance and behavior of recently developed large language models down to their deepest internal operations. More generally, any individual or institution who agrees to the terms of the model’s [Responsible AI License](https://bigscience.huggingface.co/blog/the-bigscience-rail-license) (developed during the BigScience project itself) can use and build upon the model on a local machine or on a cloud provider. In this spirit of collaboration and continuous improvement, we’re also releasing, for the first time, the intermediary checkpoints and optimizer states of the training. Don’t have 8 A100s to play with? An inference API, currently backed by Google’s TPU cloud and a FLAX version of the model, also allows quick tests, prototyping, and lower-scale use. You can already play with it on the Hugging Face Hub.
 
- 
+
 This is only the beginning. BLOOM’s capabilities will continue to improve as the workshop continues to experiment and tinker with the model. We’ve started work to make it instructable as our earlier effort T0++ was and are slated to add more languages, compress the model into a more usable version with the same level of performance, and use it as a starting point for more complex architectures… All of the experiments researchers and practitioners have always wanted to run, starting with the power of a 100+ billion parameter model, are now possible. BLOOM is the seed of a living family of models that we intend to grow, not just a one-and-done model, and we’re ready to support community efforts to expand it.
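The patch above mentions that BLOOM can be queried through a hosted inference API instead of running the 176B-parameter model locally. As a minimal sketch of what such a call looks like: the model ID `bigscience/bloom` and the `api-inference.huggingface.co` endpoint come from the Hugging Face Hub, but the helper name `build_request`, the placeholder token, and the exact parameter set are illustrative assumptions, not part of the patch. The sketch only constructs the HTTP request; actually sending it requires a valid access token and network access.

```python
# Hypothetical sketch of calling BLOOM via the Hugging Face Inference API.
# The endpoint and model ID are real; build_request, the "hf_xxx" token
# placeholder, and the chosen parameters are illustrative assumptions.
import json
import urllib.request

API_URL = "https://api-inference.huggingface.co/models/bigscience/bloom"


def build_request(prompt: str, token: str = "hf_xxx",
                  max_new_tokens: int = 32) -> urllib.request.Request:
    """Build (but do not send) a text-generation request for BLOOM."""
    payload = {
        "inputs": prompt,
        "parameters": {"max_new_tokens": max_new_tokens},
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )


req = build_request("A baguette is")
print(req.full_url)  # → https://api-inference.huggingface.co/models/bigscience/bloom
# To send for real: urllib.request.urlopen(req) with a valid token.
```

Separating request construction from sending keeps the example runnable offline; swapping in a real token and calling `urllib.request.urlopen(req)` would perform the actual generation.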