Baichuan-7B is an open-source, large-scale pretrained language model developed by Baichuan Intelligence. Built on the Transformer architecture, the 7-billion-parameter model is trained on approximately 1.2 trillion tokens, supports both Chinese and English, and has a context window of 4096 tokens. Among models of its size, it achieves the best results on the authoritative Chinese and English benchmarks (C-EVAL/MMLU).
- Wallet Address: 0x835E98f15640348040C5B9a24E7fd47e872d60D5
- Space Link: Lagrange Baichuan2-LLM-13B-Chat Space
- Space Page: -
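
The model described above can be sketched in use as follows. This is a minimal, hedged example with the Hugging Face `transformers` library: the repo id `baichuan-inc/Baichuan-7B` and the `trust_remote_code=True` flag are assumptions based on common Hugging Face conventions, not details confirmed by this document. The small helper also illustrates respecting the 4096-token context window stated above.

```python
MAX_CONTEXT = 4096  # context window length stated in the description above


def load_baichuan(repo_id: str = "baichuan-inc/Baichuan-7B"):
    """Load tokenizer and model.

    Requires the `transformers` package and downloads model weights;
    the repo id is an assumed Hugging Face identifier.
    """
    # Deferred import so the rest of this module works without transformers.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)
    return tokenizer, model


def clamp_to_context(token_ids: list[int], max_len: int = MAX_CONTEXT) -> list[int]:
    """Keep only the most recent `max_len` tokens so a prompt fits the window."""
    return token_ids[-max_len:]
```

Truncating from the left (keeping the most recent tokens) is one common way to fit long prompts into a fixed window; whether Baichuan-7B's own tooling does this is not specified here.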