
Microsoft announced Azure Maia 100 and Cobalt 100 processors: Here are the details

Microsoft has created its own custom AI chip that can be used to train large language models, potentially reducing a costly dependence on Nvidia. Microsoft has also built its own Arm-based CPU for cloud workloads. Both custom silicon chips are designed to power Azure data centers.

Microsoft’s Azure Maia AI chip and Arm-powered Azure Cobalt CPU arrive in 2024, following a surge in demand this year for Nvidia’s H100 GPUs, which are widely used to train and run generative vision tools and large language models. Both chips are manufactured on the “cutting edge process node” at TSMC, Microsoft said.

Microsoft Azure Maia 100 and Cobalt 100 introduced

These chips will appear in Microsoft’s data centers “early next year”, where they will begin running services such as Microsoft Copilot and Azure OpenAI. How many chips will be deployed remains unanswered for now; Microsoft said only that it is in the “early stages of deployment”, with the “first servers” already online in its data centers. Nvidia and AMD solutions will therefore continue to be used for a while.

Microsoft has not, however, shared detailed technical specifications, which makes it difficult to compare the new chips with the Nvidia and AMD options.

Microsoft Azure Maia 100

Azure Maia 100 is manufactured on a 5-nanometer process and contains 105 billion transistors, roughly 30 percent fewer than AMD’s MI300X; according to Microsoft, that makes it one of the largest chips produced on this process node. Microsoft highlights that Maia servers are designed around a fully custom Ethernet-based networking protocol with 4.8 terabits of aggregate bandwidth per accelerator, for better scalability and end-to-end workload performance. The Maia 100 AI accelerator is built to run cloud AI workloads such as large language model training and inference, and Microsoft says it has been developed in close cooperation with OpenAI.
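As a quick sanity check on those figures, the short sketch below compares Maia 100’s published transistor count with the roughly 153 billion transistors AMD has quoted for the MI300X (a figure from AMD’s own announcements, not from this article) and converts the per-accelerator bandwidth into gigabytes per second.

# Rough arithmetic behind the comparison above. The MI300X transistor
# count (~153 billion) is AMD's published figure, assumed here rather
# than stated by Microsoft.
maia_transistors = 105e9
mi300x_transistors = 153e9  # assumption based on AMD's announcement
deficit = 1 - maia_transistors / mi300x_transistors
print(f"Maia 100 has about {deficit:.0%} fewer transistors than MI300X")  # ~31%

# 4.8 terabits per second of aggregate bandwidth per accelerator,
# expressed in gigabytes per second (1 byte = 8 bits).
bandwidth_gbps = 4.8 * 1000 / 8
print(f"4.8 Tb/s is {bandwidth_gbps:.0f} GB/s per accelerator")  # 600 GB/s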

Microsoft Azure Cobalt 100

The other custom part, Cobalt 100, is an Arm-based CPU that, according to the company, was designed with power efficiency in mind. Azure Cobalt 100 uses a licensed design from Arm based on the Arm Neoverse CSS platform. The 64-bit, 128-core chip delivers performance improvements of up to 40 percent over the Arm server processors currently used in Azure. Microsoft is already testing the Cobalt CPU on workloads such as Microsoft Teams and SQL Server, and plans to make virtual machines built on it available to customers for a variety of workloads next year.
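Customer workloads will need arm64 builds to run on such machines. As a minimal sketch (standard-library Python only, with no Azure-specific API assumed), the check below reports at runtime whether a process is actually executing on an Arm CPU.

import platform

# Report the machine architecture; on an Arm-based server such as one
# powered by an Azure Cobalt CPU this is typically "aarch64" or "arm64".
arch = platform.machine().lower()
if arch in ("aarch64", "arm64"):
    print(f"Running on an Arm CPU ({arch}); arm64 binaries apply.")
else:
    print(f"Running on {arch}; this is not an Arm-based machine.")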
Microsoft has also developed custom rack and cooling designs specifically for Azure Maia 100, and these are being shared with industry partners. The company is already working on second-generation versions of both the Maia AI accelerator and the Azure Cobalt CPU; as the naming makes clear, these are the first generation.


Microsoft also announced new partnerships with AMD and Nvidia. On the AMD side, Microsoft will add AMD MI300X virtual machines to the Azure platform. On the Nvidia side, there is the new NC H100 v5 virtual machine series designed around Nvidia’s H100 Tensor Core GPUs. There are no detailed specifications or benchmarks for Azure Cobalt 100 and Azure Maia 100 yet, but we expect them to arrive soon.
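For anyone who ends up on one of these GPU-backed VMs, a minimal sketch like the one below (assuming PyTorch with CUDA support is installed; nothing here is specific to Microsoft’s announcement) shows how to confirm which accelerator a workload can actually see.

import torch

# Check that a CUDA-capable GPU (for example an Nvidia H100 inside an
# NC H100 v5 virtual machine) is visible, and report its name and memory.
if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"GPU: {props.name}, {props.total_memory / 2**30:.0f} GiB memory")
else:
    print("No CUDA device visible; the workload would fall back to the CPU.")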
