    Technology

    Microsoft launches Maia 200 chip for Azure AI inference

    January 28, 2026

    MENA Newswire, SAN FRANCISCO: Microsoft on Jan. 26 introduced Maia 200, the second generation of its in-house artificial intelligence accelerator, built to run AI models in production across Azure data centres. The company said Maia 200 is designed for inference, the stage where trained models generate responses to live requests, and will be used to support a range of Microsoft AI services.

Microsoft Maia 200 targets faster AI inference in Azure data centres using custom silicon. (AI-generated image)

    Maia 200 is manufactured on TSMC’s 3-nanometer process and includes more than 140 billion transistors, Microsoft said. The chip pairs compute with a new memory system that includes 216 gigabytes of HBM3e high-bandwidth memory and about 272 megabytes of on-chip SRAM, aimed at sustaining large-scale token generation and other inference-heavy workloads.
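As a rough illustration of why that memory capacity matters for inference (back-of-the-envelope arithmetic, not figures from Microsoft), the number of model parameters that fit in 216 GB depends directly on the numeric precision used to store the weights:

```python
# Back-of-the-envelope estimate: how many model parameters fit in
# 216 GB of accelerator memory at different weight precisions.
# Illustrative arithmetic only; ignores KV cache, activations and overhead.

HBM_BYTES = 216 * 10**9  # 216 GB of HBM3e (decimal gigabytes)

# bytes per parameter at common inference precisions
precisions = {"fp16": 2.0, "int8/fp8": 1.0, "4-bit": 0.5}

for name, bytes_per_param in precisions.items():
    params = HBM_BYTES / bytes_per_param
    print(f"{name:>8}: ~{params / 1e9:.0f}B parameters")

# Output:
#     fp16: ~108B parameters
# int8/fp8: ~216B parameters
#    4-bit: ~432B parameters
```

Lower-precision formats halve or quarter the memory per weight, which is one reason inference-focused chips emphasize 8-bit and 4-bit arithmetic.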

    Microsoft said Maia 200 delivers more than 10 petaflops of performance at 4-bit precision and about 5 petaflops at 8-bit precision, formats commonly used to run modern generative AI efficiently. The company also said the system is designed around a 750-watt power envelope and is built with scalable networking so chips can be linked for larger deployments.
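To make the precision figures concrete, the sketch below shows symmetric 8-bit weight quantization, the general kind of low-precision representation those petaflops numbers refer to. This is an illustrative textbook scheme, not a description of Maia 200's internals, which Microsoft has not published:

```python
import numpy as np

# Minimal sketch of symmetric int8 weight quantization. Illustrative
# only; Microsoft has not disclosed Maia 200's numeric formats in detail.

def quantize_int8(w: np.ndarray):
    scale = np.abs(w).max() / 127.0          # map [-max, max] onto [-127, 127]
    q = np.round(w / scale).astype(np.int8)  # store each weight in 1 byte
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale      # recover approximate weights

w = np.random.randn(4, 4).astype(np.float32)
q, scale = quantize_int8(w)
print("max abs error:", np.abs(w - dequantize(q, scale)).max())
```

The trade-off is a small approximation error per weight in exchange for roughly double the arithmetic throughput and half the memory traffic at each halving of precision.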

The company said the new hardware has begun coming online in Azure's Central US region in Iowa, with an additional site planned in Arizona. Microsoft described Maia 200 as its most efficient inference system deployed to date, reporting a 30% improvement in performance per dollar compared with its existing inference systems.

    AI inference focus and Azure deployment

    Microsoft said Maia 200 is intended to support AI products and services that rely on high-volume, low-latency model execution, including workloads running in Azure and Microsoft’s own applications. The company said it has designed the chip and the surrounding system as part of an end-to-end infrastructure approach that includes silicon, servers, networking and software for deploying AI models at scale.

    Alongside the chip, Microsoft announced early access to a Maia software development kit for developers and researchers working on model optimization. The company said the tooling is aimed at helping teams compile and tune models for Maia-based systems, and is structured to fit into common AI development workflows used for deploying inference in the cloud.
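Microsoft has not published the Maia SDK's interface, so the sketch below only illustrates the general "compile and deploy" workflow the company describes, using PyTorch's real ONNX export as the portable first step; the Maia-specific calls in the comments are placeholders, not a real API:

```python
import torch

# A common compile-and-deploy workflow of the kind described above:
# export a trained model to a portable graph format, then hand it to a
# vendor toolchain. The export step is standard PyTorch; the
# Maia-specific step is hypothetical, as the SDK's API is unpublished.

model = torch.nn.Sequential(
    torch.nn.Linear(128, 256),
    torch.nn.ReLU(),
    torch.nn.Linear(256, 16),
).eval()

example_input = torch.randn(1, 128)
torch.onnx.export(model, example_input, "model.onnx")

# Hypothetical vendor-SDK step (names are placeholders):
#   compiled = maia_sdk.compile("model.onnx", target="maia200")
#   maia_sdk.deploy(compiled, endpoint="my-azure-endpoint")
```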

    Performance claims and model support

    Microsoft said Maia 200 is built to run large language models and advanced reasoning systems, and that it will be used for internal and hosted model deployments in Azure. The company has positioned the chip as a production inference accelerator, distinguishing it from training-focused systems that are typically used to build models before deployment.
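The training/inference distinction the company is drawing can be seen in miniature in the PyTorch sketch below: training computes gradients and updates weights, while inference, the workload Maia 200 targets, runs only the forward pass with gradient bookkeeping disabled:

```python
import torch

# Training vs. inference in miniature. Training computes gradients and
# updates weights; inference runs the forward pass only.

model = torch.nn.Linear(8, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)

# Training step: forward, loss, backward, weight update.
x, y = torch.randn(32, 8), torch.randn(32, 1)
loss = torch.nn.functional.mse_loss(model(x), y)
opt.zero_grad()
loss.backward()
opt.step()

# Inference step: forward pass only, weights fixed, no gradients.
model.eval()
with torch.inference_mode():
    prediction = model(torch.randn(1, 8))
```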

Microsoft has accelerated custom silicon work as demand has grown for compute to serve generative AI applications, where costs and availability of accelerators can affect how quickly services scale. Maia 200 follows Maia 100, which Microsoft introduced in 2023, and represents the company's latest iteration of its dedicated AI accelerator line for data centre inference.
