H2O.ai Builds Smaller AI Models | Sramana Mitra

The growth of the AI industry has led to ever larger AI models. OpenAI's ChatGPT and Google's Bard, for instance, are composed of more than 100 billion parameters, and GPT-4 is estimated to be built out of over 1 trillion. However, models of this size require substantial computing power, carry high operating costs, and can perpetuate harmful biases if not carefully monitored. The heavy resource requirements also make these large AI models inaccessible to smaller players. Mountain View-based H2O.ai is helping democratize AI adoption by working on smaller AI models, some with as few as 1.8 billion parameters.

H2O.ai's Offerings

Founded in 2012 by the open-source-focused Cliff Click and Satish Ambati, H2O.ai began with the idea that there should be freedom around the creation and use of AI. H2O.ai's open-source framework, known as H2O, gives data scientists and developers access to a fast machine learning engine for their applications. It works both on top of
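To make the H2O engine concrete, the sketch below shows how a data scientist might train and evaluate a model with the open-source H2O Python client. The dataset file, column names, and hyperparameters are illustrative assumptions rather than details from the article.

    import h2o
    from h2o.estimators import H2OGradientBoostingEstimator

    # Start (or connect to) a local H2O cluster.
    h2o.init()

    # Load a CSV into an H2OFrame. The file and columns here are hypothetical.
    frame = h2o.import_file("loans.csv")
    frame["default"] = frame["default"].asfactor()  # treat the target as categorical
    train, test = frame.split_frame(ratios=[0.8], seed=42)

    # Train a gradient boosting model on the H2O engine.
    model = H2OGradientBoostingEstimator(ntrees=50, max_depth=5, seed=42)
    model.train(x=["income", "age", "loan_amount"], y="default", training_frame=train)

    # Evaluate on the held-out split.
    print(model.model_performance(test_data=test).auc())

In this sketch the training work runs inside the H2O cluster started by h2o.init(), while the Python process only orchestrates it, which is the division of labor the framework is designed around.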
