Patronus AI unveiled “Generative Simulators,” adaptive “practice worlds” that replace static benchmarks with dynamic ...
On the Humanity’s Last Exam benchmark, Deep Research Agent scored 46.4%, outperforming OpenAI’s GPT-5 Pro (38.9%).
The Nemotron 3 lineup includes Nano, Super and Ultra models built on a hybrid latent mixture-of-experts (MoE) architecture.
Google rolls out Gemini Deep Research via the Interactions API, along with DeepSearchQA, enabling developers to build ...
Google today announced a “significantly more powerful Gemini Deep Research agent” that will soon be available in consumer ...
Google has released a “reimagined” version of its Gemini Deep Research agent, offering developers access to the tech giant's ...
Google is now amping up the AI competition against its rivals – and its response to OpenAI’s release of GPT-5.2 is the ...
Built on a hybrid mixture-of-experts architecture, these models aim to help enterprises implement multi-agent systems.
The Nemotron 3 family — in Nano, Super and Ultra sizes — introduces the most efficient family of open models ...
Nvidia is leaning on the hybrid Mamba-Transformer mixture-of-experts architecture it’s been tapping for models for its new ...
Nvidia Corp. today announced the launch of Nemotron 3, a family of open models and data libraries aimed at powering the next ...
Microsoft Foundry brings a model catalog, visual workflows, and 1,000 connectors so teams ship safer AI agents faster with ...