Cerebras releases 7 new GPT models trained on CS-2 wafer-scale systems

Cerebras Systems has trained and is releasing a series of seven GPT-based large language models (LLMs) for open use by the research community, according to the company. This is the first time a company has trained LLMs of up to 13 billion parameters on non-GPU AI systems and shared the models, weights, and training recipe under the industry-standard Apache 2.0 license. All seven models were trained on the 16 CS-2 systems in the Cerebras Andromeda AI supercomputer.
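Because the models, weights, and training recipe are released under the permissive Apache 2.0 license, they can in principle be loaded with standard open-source tooling. The sketch below is a minimal illustration, assuming the checkpoints are hosted on Hugging Face under an identifier such as cerebras/Cerebras-GPT-1.3B and are compatible with the transformers causal-language-model interface; both the hosting location and the exact model name are assumptions, not details stated in the article.

```python
# Minimal sketch: loading one of the openly released GPT-style models
# for text generation. The model identifier below is an assumption.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "cerebras/Cerebras-GPT-1.3B"  # assumed Hugging Face identifier

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Generate a short continuation from a prompt.
inputs = tokenizer("Wafer-scale AI systems are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```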
