The new "Maia 200" chip comes online this week in a data center in Iowa, with plans for a second location in Arizona, ...
Today, we’re proud to introduce Maia 200, a breakthrough inference accelerator engineered to dramatically improve the ...
Like Nvidia’s newly announced flagship “Vera Rubin” chips, the Maia 200 is manufactured by Taiwan Semiconductor Manufacturing ...
In a blog post, Microsoft said it has added capabilities to its Quantum Development Kit (QDK), an open source developer toolkit for building quantum applications, including domain-specific toolkits ...
Microsoft is testing AI-assisted coding for non-developers, signalling a shift in how ideas move to prototypes.
Calling it the highest-performance custom cloud accelerator, the company says Maia 200 is optimized for AI inference across multiple models.
Microsoft and the Linux community are both adding AI and Rust to their development pipelines. Microsoft is leaning much harder into AI development than Linux is. Both are expanding their use of Rust, but neither operating system will be fully Rust anytime soon.
The two tech giants remain the most balanced plays in the booming AI market.
Microsoft has introduced the second generation of its in-house artificial intelligence processor, the Maia AI chip.
Microsoft has announced that Azure’s US central datacentre region is the first to receive a new artificial intelligence (AI) inference accelerator, Maia 200.
Microsoft unveils Maia 200, a custom AI chip designed to power Copilot and Azure, challenging Amazon and Google in the ...