The furious battle to topple the world’s most valuable company

The increased competition was evident on Tuesday, when Amazon announced the availability of computing services based on its new Trainium 2 AI chips, along with testimonials from potential users including Apple. The company also unveiled computers containing either 16 or 64 of the chips, with ultrafast networking connections that particularly accelerate inferencing performance.

Amazon is even building a kind of giant AI factory for the startup Anthropic, which it has invested in, said Matt Garman, chief executive of Amazon Web Services. That computing “cluster” will have hundreds of thousands of the new Trainium chips and will be five times as powerful as any that Anthropic has ever used, said Tom Brown, a founder and the chief compute officer of the startup, which operates the Claude chatbot and is based in San Francisco.

AMD’s new MI300 artificial intelligence chip is expected to generate more than $US5 billion in sales in its first year of release. Credit: NYT

“This means customers will get more intelligence at a lower price and at faster speeds,” Brown said.

In total, data centre operators, which provide the computing power needed for AI tasks, are expected to increase their spending on computers without Nvidia chips by 49 per cent this year, to $US126 billion, according to Omdia, a market research firm.

Even so, the increased competition does not mean Nvidia is in danger of losing its lead. A spokesperson for the company pointed to comments made by Jensen Huang, Nvidia’s chief executive, who has said his company has major advantages in AI software and inferencing capability. Huang has added that demand is torrid for the company’s new Blackwell AI chips, which he says perform many more calculations per watt of energy used, despite an increase in the power they need to operate.

“Our total cost of ownership is so good that even when the competitor’s chips are free, it’s not cheap enough,” Huang said in a speech at Stanford University this year.

The changing AI chip market has partly been propelled by well-funded startups such as SambaNova Systems, Groq and Cerebras Systems, which have lately claimed big speed advantages in inferencing, with lower prices and power consumption. Nvidia’s current chips can cost as much as $US15,000 each, and its Blackwell chips are expected to cost tens of thousands of dollars each.

That has pushed some customers toward alternatives. Dan Stanzione, executive director of the Texas Advanced Computing Centre, a research centre, said the organisation planned to buy a Blackwell-based supercomputer next year but would most likely also use chips from SambaNova for inferencing tasks because of their lower power consumption and pricing.

“That stuff is just too expensive,” he said of Nvidia’s chips.

AMD said it expected to target Nvidia’s Blackwell chips with its own new AI chips arriving next year. In the company’s Austin labs, where it exhaustively tests AI chips, executives said inferencing performance was a major selling point. One customer is Meta, the owner of Facebook and Instagram, which says that it has trained a new AI model, called Llama 3.1 405B, using Nvidia chips but that it uses AMD MI300 chips to provide answers to users.

Jensen Huang has built Nvidia into the world’s most valuable company. Credit: Bloomberg

Amazon, Google, Microsoft and Meta are also designing their own AI chips to speed up specific computing chores and achieve lower costs, while still building big clusters of machines powered by Nvidia’s chips. This month, Google plans to begin selling services based on a sixth generation of internally developed chips, called Trillium, which is nearly five times as fast as its predecessor.

Amazon, sometimes seen as a laggard in AI, seems particularly determined to catch up. The company allocated $US75 billion this year for AI chips and other computing hardware, among other capital spending.

At the company’s Austin offices — run by Annapurna Labs, a startup that it bought in 2015 — engineers previously developed networking chips and general-purpose microprocessors for Amazon Web Services. Its early AI chips, including the first version of Trainium, did not gain much market traction.

Amazon is far more optimistic about the new Trainium 2 chips, which are four times as fast as their predecessors. On Tuesday, the company also announced plans for another chip, Trainium 3, which is expected to be even more powerful.

Eiso Kant, chief technology officer of Poolside, an AI startup in Paris, estimated that Trainium 2 would provide a 40 per cent improvement in computing performance per dollar compared with Nvidia-based hardware.

Kant added that Amazon also plans to offer Trainium-based services in data centres across the world, which helps with inferencing tasks.

“The reality is, in my business, I don’t care what silicon is underneath,” he said. “What I care about is that I get the best price performance and that I can get it to the end user.”

This article originally appeared in The New York Times.
