Microsoft’s new Maia 200 inference accelerator enters this overheated market with a chip that aims to cut the price ...
Today, we’re proud to introduce Maia 200, a breakthrough inference accelerator engineered to dramatically improve the economics of AI token generation. Maia 200 is an AI inference powerhouse: an ...
The Maia 200 AI chip is described as an inference powerhouse, meaning it is optimized for running already-trained AI models, applying their learned knowledge to ...
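To make the "inference" and "token generation" language above concrete, here is a minimal toy sketch of the autoregressive generation loop that inference accelerators are built to speed up. The `next_token` function is a hypothetical stand-in for a model's forward pass (real LLM inference runs a large neural network at each step); only the loop shape, where one token is produced at a time and each step re-reads the growing context, is the point, since that per-token cost is what drives the economics these chips target.

```python
def next_token(context: list[int]) -> int:
    # Hypothetical stand-in for a model forward pass: in real
    # inference this is the expensive step an accelerator runs.
    # Here it just derives a deterministic "token" from the context.
    return (sum(context) * 31 + len(context)) % 50_000

def generate(prompt: list[int], n_tokens: int) -> list[int]:
    """Generate n_tokens autoregressively: each new token is
    appended to the context consumed by the next step."""
    out = list(prompt)
    for _ in range(n_tokens):
        out.append(next_token(out))
    return out[len(prompt):]  # return only the generated tokens

tokens = generate([1, 2, 3], 5)
print(len(tokens))  # 5 generated tokens
```

Because every generated token requires another pass over the model, total serving cost scales roughly with tokens produced, which is why vendors pitch inference chips in terms of cost per token.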
Microsoft is taking on Amazon and Google with the launch of its Maia 200 AI accelerator chip, calling the processor “the most performant, first-party silicon from any hyperscaler.” ...