When legal research company LexisNexis created its AI assistant Protégé, ...
Bigger AI isn’t always better. Here’s why smaller, task-specific models deliver faster performance, lower costs and better ...
There’s a paradox at the heart of modern AI: The kinds of sophisticated models that companies are using to get real work done and reduce head count aren’t the ones getting all the attention. Ever-more ...
Google's DeepMind AI research team has unveiled a new open-source AI model today, Gemma 3 270M. As its name suggests, this is a 270-million-parameter model — far smaller than the 70 billion or ...
The original version of this story appeared in Quanta Magazine. Large language models work well because they’re so large. The latest models from OpenAI, Meta, and DeepSeek use hundreds of billions of ...
‘Tis the week for small AI models, it seems. Nonprofit AI research institute Ai2 on Thursday released Olmo 2 1B, a 1-billion-parameter model that Ai2 claims beats similarly sized models from Google, ...
Blazor creator Steve Sanderson presented a keynote at the recent NDC London 2025 conference where he previewed the future of .NET application development with smaller AI models and autonomous agents, ...
The AI startup focuses on BFSI use cases, where low latency, native-language voice, and security-first deployments are critical Cofounder Akshat Mandloi believes distribution and customer access now ...