The Technology Innovation Institute (TII) has upped its generative AI credentials with the launch of “Falcon LLM,” a foundational large language model (LLM) with 40 billion parameters.

TII claims the model uses only 75 percent of GPT-3’s training compute, 40 percent of DeepMind’s Chinchilla’s, and 80 percent of Google’s PaLM-62B’s.