Baichuan Intelligence Launches 'Pengcheng-Baichuan·Mindsea 33B' Large Model with 128K Long Context Window
Baichuan Intelligence and Pengcheng Laboratory announced a collaboration to develop the longest-context large model trained on domestic computing power. The partnership overcomes technical limitations of models built on domestic compute and sets an example for the development of China's large model industry.
The collaboration will leverage the strengths of both parties to support China's large model innovation, promote the open-source development of domestic large models, and facilitate intelligent transformation.
The jointly developed 'Pengcheng-Baichuan·Mindsea 33B' model features a 128K context window, currently the longest available, which captures richer semantic information and improves the accuracy and fluency of generated content. A planned upgrade will extend the window to 192K, making it the model with the longest context window trained on domestic computing power.
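To illustrate what a 128K-token window means in practice, the minimal sketch below estimates whether a long document fits within the window before it is sent to a model. The tokens-per-character ratio and the helper function are illustrative assumptions, not published details of Pengcheng-Baichuan·Mindsea 33B.

```python
# Illustrative sketch only: estimate whether a long document fits within a
# 128K-token context window. The chars-per-token ratio is an assumption for
# mixed Chinese/English text, not a figure published for Mindsea 33B.

CONTEXT_WINDOW_TOKENS = 128_000   # current window per the announcement
PLANNED_WINDOW_TOKENS = 192_000   # planned upgrade per the announcement
AVG_CHARS_PER_TOKEN = 1.5         # rough assumption, varies by tokenizer and language


def fits_in_window(text: str, window: int = CONTEXT_WINDOW_TOKENS) -> bool:
    """Roughly estimate the token count of `text` and compare it to the window size."""
    estimated_tokens = len(text) / AVG_CHARS_PER_TOKEN
    return estimated_tokens <= window


if __name__ == "__main__":
    # A ~150,000-character report: roughly 100K tokens under this estimate,
    # so it fits in the 128K window and comfortably within the planned 192K one.
    document = "示例" * 75_000
    print(fits_in_window(document))                         # True
    print(fits_in_window(document, PLANNED_WINDOW_TOKENS))  # True
```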
Both parties will continue to deepen their cooperation to advance the technological innovation and practical deployment of large models built on domestic computing power.