Google's Gemini - a family of large, natively multimodal generative AI models - can take in roughly 700,000 words or 30,000 lines of code in a single prompt. Gemini 1.5 is also more efficient to train and serve: rather than one large monolithic neural network, it uses a mixture-of-experts (MoE) architecture, in which a gating mechanism routes each input to the most relevant specialized sub-networks instead of activating the whole model.
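To give a feel for why MoE routing saves compute, here is a minimal toy sketch of top-1 expert routing. Everything in it (the experts, the gating weights, the function names) is a made-up illustration, not Gemini's actual architecture: a gate scores every expert, but only the highest-scoring expert actually runs, so per-input compute stays flat even as the total number of experts (and parameters) grows.

```python
import math

def softmax(scores):
    # Standard softmax: turns raw gate scores into a probability distribution.
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Each "expert" here is a trivial scalar function standing in for a full
# feed-forward sub-network. These are arbitrary placeholders.
experts = [
    lambda x: 2.0 * x,    # expert 0
    lambda x: x + 10.0,   # expert 1
    lambda x: -x,         # expert 2
]

# Hypothetical gating weights: one linear score per expert.
gate_weights = [0.5, -0.2, 0.1]

def moe_forward(x):
    # Score every expert, but execute only the top-scoring one
    # (top-1 routing). Real MoE layers often route to the top-k experts.
    probs = softmax([w * x for w in gate_weights])
    best = max(range(len(experts)), key=lambda i: probs[i])
    return experts[best](x)

print(moe_forward(4.0))   # gate picks expert 0, so output is 2.0 * 4.0
```

The efficiency argument is visible in `moe_forward`: all three experts exist (total parameters grow with the expert count), but only one is evaluated per input, so inference cost depends on the size of a single expert, not the whole model.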
Moving beyond casual conversation, Gemini 1.5 can reason over complex domains such as geopolitics, positioning it directly against competitors like Anthropic's Claude and OpenAI's ChatGPT.