Microsoft’s next-generation Maia AI chip is facing a delay of at least six months, pushing its mass production to 2026 from 2025, The Information reported on Friday, citing three people involved in the effort.
When the chip, code-named Braga, goes into production, it is expected to fall well short of the performance of Nvidia’s Blackwell chip, released late last year, the report said.
Microsoft had hoped to use the Braga chip in its data centers this year, the report said, adding that unanticipated changes to its design, staffing constraints and high turnover were contributing to the delay.
Microsoft did not immediately respond to a Reuters request for comment.
Like its Big Tech peers, Microsoft has focused heavily on developing custom processors for artificial intelligence operations and general purpose applications, a move that would help reduce the tech giant’s reliance on pricey Nvidia chips.
Cloud rivals Amazon and Alphabet’s Google have both raced to develop chips in-house, customized for their specific needs with the goal of improving performance and reducing costs.
Microsoft introduced the Maia chip in November 2023 but has lagged its peers in scaling up production.
Google, meanwhile, has seen success with its custom AI chips – called Tensor Processing Units – and in April unveiled its seventh-generation AI chip designed to speed the performance of AI applications.
Amazon also unveiled its next-generation AI chip, Trainium3, in December; it is set to be released late this year.
© Thomson Reuters 2025