English | 简体中文
[InternLM](https://github.com/InternLM/InternLM) is a series of high-performance multimodal large language models developed by Shanghai AI Laboratory. This product is based on Huawei Cloud EulerOS 2.0 64-bit for the Arm architecture and provides an out-of-the-box InternLM2.5-1.8B-Chat model.
- Trained on trillions of tokens of high-quality corpora to build a strong knowledge base for the model.
- Supports a context window of 8k tokens, enabling longer inputs and a stronger inference experience.
- Provides general tool-calling capabilities, allowing users to flexibly build their own workflows.
- Ships a lightweight training framework for model pre-training that requires few dependency packages. A single codebase supports both pre-training on thousands of cards and human-preference alignment training on a single card, with deep performance optimization reaching nearly 90% acceleration efficiency at thousand-card scale.
The open-source image product InternLM Large Language Model provided by this project comes with InternLM2.5-1.8B-Chat and its runtime environment pre-installed, and also includes deployment templates. Refer to the usage guide to easily start your "out-of-the-box" efficient experience!
System requirements are as follows:
- CPU: 4vCPUs or higher
- RAM: 16GB or larger
- Disk: At least 40GB
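Before deploying, you may want to confirm that a host meets the minimums listed above. The following is a minimal sketch (assuming a Linux host with GNU coreutils, not an official check from this project) that reads the CPU count, total RAM, and free disk space and compares them against the thresholds in this README:

```shell
#!/bin/sh
# Sketch: check this host against the image's minimum requirements
# (4 vCPUs, 16 GB RAM, 40 GB disk), as listed in this README.

cpus=$(nproc)
mem_gb=$(awk '/MemTotal/ {printf "%d", $2/1024/1024}' /proc/meminfo)
disk_gb=$(df -BG --output=avail / | tail -1 | tr -dc '0-9')

echo "vCPUs: $cpus (need >= 4)"
echo "RAM: ${mem_gb}GB (need >= 16)"
echo "Free disk on /: ${disk_gb}GB (need >= 40)"

if [ "$cpus" -ge 4 ] && [ "$mem_gb" -ge 16 ] && [ "$disk_gb" -ge 40 ]; then
    echo "Minimum requirements met"
else
    echo "Minimum requirements NOT met"
fi
```

Note that the disk check only inspects the root filesystem; adjust the path if the image is deployed to a different volume.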
Register a Huawei account and activate Huawei Cloud
| Image Specification | Feature Description | Remarks |
|---|---|---|
| internlm2_5-1_8b-chat-kunpeng | Installed and deployed on Kunpeng servers + Huawei Cloud EulerOS 2.0 64-bit | |
- For more questions, contact us by filing an issue or through the service support of the specified product in the Huawei Cloud Marketplace.
- For other open - source images, refer to open-source-image-repos
- Fork this repository and submit a merge request.
- Update README.md with your open-source image information accordingly.