
Microsoft (NASDAQ: MSFT) has announced plans to introduce an artificial intelligence (AI) chip to support the development of large language models (LLMs), along with a second chip for general-purpose computing tasks.

The big tech company confirmed plans to enter the chipmaking business at its Ignite conference after months of planning. Called the Maia 100, Microsoft’s new AI chip is expected to compete closely with Nvidia’s (NASDAQ: NVDA) chips while serving as a cheaper alternative.

Priced at around $40,000, Nvidia’s H100 chips have seized a large chunk of the market, with the company scrambling to keep up with demand. Microsoft is also keen to meet the needs of AI developers with its own offering for running cloud AI workloads.

Microsoft’s testing of Maia with OpenAI’s systems has yielded impressive results, with the chip offering faster model training and inference.

“We were excited when Microsoft first shared their designs for the Maia chip, and we’ve worked together to refine and test it with our models,” said OpenAI CEO Sam Altman. “Azure’s end-to-end AI architecture, now optimized down to the silicon with Maia, paves the way for training more capable models and making those models cheaper for our customers.”

Experts opine that Microsoft can catch up with industry leaders like Nvidia and Advanced Micro Devices (NASDAQ: AMD) with its debut Maia 100. The new chip packs 105 billion transistors, supports 8-bit data types, and is designed for liquid-cooled servers.

The company says it will share the designs for its custom rack housing with its partners but will keep the chip designs private.

“The goal here was to enable higher density of servers at higher efficiencies,” said Microsoft. “Because we’re reimagining the entire stack we purposely think through every layer, so these systems are actually going to fit in our current data center footprint.”

Alongside Maia 100, Microsoft announced a second chip, Cobalt 100, designed for general computing tasks and expected to be a direct competitor to Intel processors.

Microsoft keeping pace with rivals

Given the pace of industry innovation, Microsoft faces an uphill battle in its attempt to keep pace with Nvidia and AMD. Barely months after rolling out the H100 chips, Nvidia has hinted at a successor, the H200, expected to launch in 2024.

Flush with cash from its AI chip business, Nvidia has poured a fortune into research and development for advanced chips, confirming an annual innovation cycle rather than one every two years. Big Tech firms are no longer throwing their weight behind chips for block reward mining but are scrambling to satisfy the demand stemming from generative AI.

In order for artificial intelligence (AI) to work right within the law and thrive in the face of growing challenges, it needs to integrate an enterprise blockchain system that ensures data input quality and ownership, allowing it to keep data safe while also guaranteeing the immutability of data. Check out CoinGeek’s coverage on this emerging tech to learn more about why Enterprise blockchain will be the backbone of AI.

Watch: Artificial intelligence needs blockchain
