Introducing gigaGPT: GPT-3 sized models in 565 lines of code
gigaGPT is Cerebras' implementation of Andrej Karpathy's nanoGPT – the simplest and most compact code base for training and fine-tuning GPT models.
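To give a sense of what a nanoGPT-style code base looks like, here is a minimal sketch of a causal self-attention module and a single transformer decoder block written in plain PyTorch. This is an illustrative sketch with made-up hyperparameters, not gigaGPT's actual code; the class names and toy sizes are assumptions for demonstration only.

```python
# Illustrative sketch only: a minimal nanoGPT-style decoder block in plain PyTorch.
# Names and hyperparameters are assumptions for illustration, not gigaGPT's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CausalSelfAttention(nn.Module):
    def __init__(self, n_embd: int, n_head: int):
        super().__init__()
        assert n_embd % n_head == 0
        self.n_head = n_head
        self.qkv = nn.Linear(n_embd, 3 * n_embd)   # fused query/key/value projection
        self.proj = nn.Linear(n_embd, n_embd)      # output projection

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, C = x.shape
        q, k, v = self.qkv(x).split(C, dim=2)
        # reshape to (batch, heads, sequence, head_dim) for multi-head attention
        q = q.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
        k = k.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
        v = v.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
        # causal attention: each token attends only to earlier positions
        y = F.scaled_dot_product_attention(q, k, v, is_causal=True)
        y = y.transpose(1, 2).contiguous().view(B, T, C)
        return self.proj(y)


class Block(nn.Module):
    """One pre-norm transformer decoder block: attention + MLP, each with a residual."""

    def __init__(self, n_embd: int, n_head: int):
        super().__init__()
        self.ln1 = nn.LayerNorm(n_embd)
        self.attn = CausalSelfAttention(n_embd, n_head)
        self.ln2 = nn.LayerNorm(n_embd)
        self.mlp = nn.Sequential(
            nn.Linear(n_embd, 4 * n_embd),
            nn.GELU(),
            nn.Linear(4 * n_embd, n_embd),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x + self.attn(self.ln1(x))
        x = x + self.mlp(self.ln2(x))
        return x


if __name__ == "__main__":
    # toy sizes for a quick smoke test; GPT-3-scale models use far larger values
    block = Block(n_embd=128, n_head=4)
    out = block(torch.randn(2, 16, 128))
    print(out.shape)  # torch.Size([2, 16, 128])
```

A full GPT model stacks many such blocks between a token/position embedding and a final language-modeling head; the compactness nanoGPT is known for comes from expressing the whole model in a few small modules like these.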
Related Keywords: PyTorch APIs, Condor Galaxy, Cerebras Wafer-Scale Clusters, Cerebras Wafer-Scale Engine