- Amazon announced another $4 billion investment in AI startup Anthropic on Friday.
- The deal includes an agreement for Anthropic to use Amazon’s AI chips more.
- The cloud giant is trying to challenge Nvidia and get developers to switch away from Nvidia's GPUs.
Amazon's Trainium chips are about to get a lot busier. At least that's what Amazon hopes will happen after it pumps another $4 billion into AI startup Anthropic.
The two companies announced a huge new deal on Friday that brings Amazon's total investment in Anthropic to $8 billion. The main goal of all that money is to get Amazon's AI chips used more often to train and run large language models.
In return for this latest cash injection, Anthropic said it will use AWS as its “primary cloud and training partner.” It’s also going to help Amazon design future Trainium chips and contribute to building out an Amazon AI model development platform called AWS Neuron.
This is a full-frontal assault on Nvidia, which currently dominates the AI chip market with its GPUs, servers, and CUDA platform. Nvidia stock dropped more than 3% on Friday after the Amazon-Anthropic news broke.
The challenge will be getting Anthropic to actually use Trainium chips in big ways. Switching away from Nvidia GPUs is complicated, time-consuming, and risky for AI model developers, and Amazon has struggled with this, as Business Insider has reported.
Earlier this week, Anthropic CEO Dario Amodei didn’t sound like he was all-in on Amazon’s Trainium chips, despite another $4 billion coming his way.
“We use Nvidia, but we also use custom chips from both Google and Amazon,” he said at the Cerebral Valley tech conference in San Francisco. “Different chips have different trade-offs. I think we’re getting value from all of them.”
Back in 2023, Amazon made its first investment in Anthropic, agreeing to put in $4 billion. That deal came with similar strings attached. At the time, Anthropic said it would use Amazon’s Trainium and Inferentia chips to build, train, and deploy future AI models, and the two companies would collaborate on the development of future chip technology.
It’s unclear if Anthropic followed through on these intentions. The Information reported recently that Anthropic prefers to use Nvidia GPUs, rather than Amazon AI chips. And the talks on this latest investment focused on getting Anthropic more committed to using Amazon’s offerings, the publication reported.
There are signs that Anthropic could be more committed now, after getting another $4 billion from Amazon.
In Friday’s announcement, Anthropic said it is working with Amazon on its Neuron software, which provides the crucial connective tissue between the chips and the AI models. This competes with Nvidia’s CUDA software stack, which is the real enabler of Nvidia’s GPUs and makes those components very hard to swap out for other chips. Nvidia has a decade-long head start on CUDA, and so far competitors have found that difficult to overcome.
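For a sense of what that "connective tissue" looks like in practice, here is a minimal, illustrative sketch, not drawn from the companies' announcement, of compiling a small PyTorch model for Neuron devices using the publicly documented torch-neuronx integration. Actually running it would require AWS Trainium or Inferentia2 hardware with the Neuron SDK installed.

```python
# Illustrative sketch only: compiling a small PyTorch model for AWS Neuron
# devices (Trainium / Inferentia2) via the torch-neuronx integration.
# On Nvidia hardware, the equivalent step would be moving the model to a
# CUDA device; here the Neuron compiler plays that chip-to-model role.
import torch
import torch_neuronx  # part of the AWS Neuron SDK

# A toy model and example input, stand-ins for a real LLM workload.
model = torch.nn.Sequential(torch.nn.Linear(128, 64), torch.nn.ReLU()).eval()
example = torch.rand(1, 128)

# torch_neuronx.trace compiles the model ahead of time for NeuronCores.
neuron_model = torch_neuronx.trace(model, example)

# The compiled module is called like an ordinary PyTorch module.
print(neuron_model(example).shape)
```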
Anthropic’s “deep technical collaboration” may be significant because it suggests a new level of commitment to using and improving Amazon’s Trainium chips.
Though several companies make chips that compete with or even beat Nvidia’s in certain elements of computing performance, no other chip has so far touched the company in terms of market share or mindshare.
Amazon’s AI chip journey
Amazon is on a short list of cloud providers attempting to stock their data centers with their own AI chips and avoid spending heavily on Nvidia GPUs, which carry profit margins that often exceed 70%.
Amazon debuted its Trainium and Inferentia chips — named after the training and inference tasks they are built for — in 2020.
The aim was to become less dependent on Nvidia and find a way to make cloud computing in the AI age cheaper.
“As customers approach higher scale in their implementations, they realize quickly that AI can get costly. It’s why we’ve invested in our own custom silicon in Trainium for training and Inferentia for inference,” Amazon CEO Andy Jassy said on the company’s October earnings call.
But like its many competitors, Amazon has found that breaking the industry’s preference for Nvidia is difficult. Some say that’s due to CUDA, which offers an abundant software stack with libraries, tools, and troubleshooting help galore. Others say it’s simple habit or convention.
In May, Bernstein analyst Stacy Rasgon told Business Insider he wasn’t aware of any companies using Amazon chips at scale.
With Friday’s announcement, that might change.
Jassy said in October that the next-generation Trainium 2 chip is ramping up.
“We’re seeing significant interest in these chips, and we’ve gone back to our manufacturing partners multiple times to produce much more than we’d originally planned,” Jassy said.
Still, earlier this week, Anthropic’s Amodei sounded like he was hedging his bets.
“We believe that our mission is best served by being an independent company,” he said. “If you look at our position in the market and what we’ve been able to do, the independent partnerships we have with Google, with Amazon, with others, I think this is very viable.”