

Microsoft won’t stop buying AI chips from Nvidia, AMD, even after launching its own, Nadella says

Microsoft this week deployed the first crop of its homegrown AI chips in one of its data centers, with plans to roll out more in the coming months, the company says.

The chip, named the Maia 200, is designed to be what Microsoft calls an “AI inference powerhouse,” meaning it’s optimized for the compute-intensive work of running AI models in production. The company released some impressive processing-speed specs for Maia, saying it outperforms Amazon’s latest Trainium chips and Google’s latest Tensor Processing Units (TPUs).

All of the cloud giants are turning to their own AI chip designs in part because of the difficulty, and expense, of obtaining the latest and greatest from Nvidia — a supply crunch that shows no signs of abating.  

But even with its own state-of-the-art, high-performance chip in hand, Microsoft CEO Satya Nadella said the company will still be buying chips made by others. 

“We have a great partnership with Nvidia, with AMD. They are innovating. We are innovating,” he explained. “I think a lot of folks just talk about who’s ahead. Just remember, you have to be ahead for all time to come.” 

He added: “Because we can vertically integrate doesn’t mean we just only vertically integrate.” By vertical integration, he means building Microsoft’s own systems from top to bottom, without using wares from other vendors.

That said, Maia 200 will be used by Microsoft’s so-called Superintelligence team, the AI specialists building the software giant’s own frontier models. That’s according to Mustafa Suleyman, the former Google DeepMind co-founder who now leads the team. Microsoft is working on its own models to perhaps one day lessen its reliance on OpenAI, Anthropic, and other model makers.


The Maia 200 chip will also support OpenAI’s models running on Microsoft’s Azure cloud platform, the company says. But, by all accounts, securing access to the most advanced AI hardware is still a challenge for everyone, paying customers and internal teams alike.

So in a post on X, Suleyman clearly relished sharing the news that his team gets first dibs. “It’s a big day,” he wrote when the chip launched. “Our Superintelligence team will be the first to use Maia 200 as we develop our frontier AI models.”


