ChatGPT: Everything you need to know about the AI-powered chatbot
A timeline of ChatGPT product updates and releases, starting with the latest, which we’ve been updating throughout the year.
Learning science consistently shows us that true learning requires active engagement. This is fundamental to how Gemini helps you learn. Going beyond simple text and sta…
As AI models grow in complexity and hardware evolves to meet the demand, the software layer connecting the two must also adapt. We recently sat down with Stephen Jones, a Distinguished Engineer at NVIDIA and one of the original architects of CUDA. Jones, whose background spans from fluid mechanics to aerospace engineering, offered deep insights into NVIDIA’s latest software innovations, including the shift toward tile-based programming, the introduction of “Green Contexts,” and how AI is rewriting the rules […]
Schools across Northern Europe are safely and responsibly integrating Google and Gemini for Education tools in the classroom, saving teachers and administrations signifi…
Ayn Rand described Thanksgiving as “a typically American holiday . . . its essential, secular meaning is a celebration of successful production. It is a producers’ holiday. The lavish meal is a symbol of the fact that abundant consumption is the result and reward of production.”
In part 2 of our two-part series on generative artificial intelligence’s environmental impacts, MIT News explores some of the ways experts are working to reduce the technology’s carbon footprint. The energy demands of generative AI are expected to continue increasing dramatically over the next decade. For instance, an April 2025 report from the International Energy Agency predicts that the global electricity demand from data centers, which house the computing infrastructure to train and deploy AI models, will more than double by […]
Can a 3B model deliver 30B-class reasoning by fixing the training recipe instead of scaling parameters? Nanbeige LLM Lab at Boss Zhipin has released Nanbeige4-3B, a 3B-parameter small language model family trained with an unusually heavy emphasis on data quality, curriculum scheduling, distillation, and reinforcement learning. The research team ships two primary checkpoints, Nanbeige4-3B-Base and Nanbeige4-3B-Thinking, and evaluates the reasoning-tuned model against Qwen3 checkpoints from 4B up to 32B parameters (https://arxiv.org/pdf/2512.06266). Benchmark results: on AIME […]
Scout24 has created a GPT-5-powered conversational assistant that reimagines real-estate search, guiding users with clarifying questions, summaries, and tailored listing recommendations.
Gradient Canvas is a new art exhibition celebrating a decade of creative collaborations between artists and artificial intelligence.