Baichuan2-LLM-13B-Chat LDL

Introduction to the AI Model

Baichuan-7B is an open-source large-scale pretrained model developed by Baichuan Intelligence. Built on the Transformer architecture, the 7-billion-parameter model is trained on approximately 1.2 trillion tokens, supports both Chinese and English, and has a context window of 4,096 tokens. It achieves the best results among models of the same size on the authoritative Chinese and English benchmarks (C-Eval/MMLU).
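Below is a minimal sketch of loading the chat model for inference with Hugging Face transformers. The checkpoint name baichuan-inc/Baichuan2-13B-Chat, the bfloat16/device_map settings, and the chat() helper provided via trust_remote_code are assumptions based on the public Hugging Face release, not on this repository's deployment scripts.

```python
# Minimal inference sketch for Baichuan2-13B-Chat (assumed Hugging Face checkpoint).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.generation.utils import GenerationConfig

MODEL_ID = "baichuan-inc/Baichuan2-13B-Chat"  # assumed Hub checkpoint name

# trust_remote_code pulls in the custom Baichuan modeling and chat code
# shipped alongside the checkpoint.
tokenizer = AutoTokenizer.from_pretrained(
    MODEL_ID, use_fast=False, trust_remote_code=True
)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,
)
model.generation_config = GenerationConfig.from_pretrained(MODEL_ID)

# Single-turn chat: the checkpoint's remote code exposes a chat() helper that
# applies the Baichuan conversation template and runs generation.
messages = [{"role": "user", "content": "What is the capital of France?"}]
response = model.chat(tokenizer, messages)
print(response)
```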

Deployment Confirmation