Amazon is making its own chips to offer generative AI on AWS

The company is betting on low-cost, high-throughput chips that let customers train and run AI models on its cloud services.
Ameya Paleja
Stock image of an Amazon office in Houston, Texas


Even as the world looks to Microsoft and Google to reveal the next big thing in generative artificial intelligence (AI), Jeff Bezos-founded Amazon has been quietly building ways for its customers to use the technology directly. In an unmarked building in Austin, Texas, Amazon engineers are developing two types of microchips that will be used to train and run AI models, CNBC reported.

The world took notice of generative AI when OpenAI launched ChatGPT last year. Microsoft, which had previously partnered with OpenAI, was quick to use the association and incorporate the AI model's features into its existing products.

Yet, months later, we are yet to see a significant development in real-world use cases of the technology. Amazon, which holds 40 percent of the cloud computing market, attributes this to the lack of tools that let businesses leverage their existing data and train models on it.

Executives at the company told CNBC that businesses were not keen on migrating their cloud data to Microsoft just because it was the market leader in generative AI. Therefore, Amazon is using its time to build tools that businesses can use on their data already in the cloud.

Tools built by Amazon

Instead of looking to let users run language models like GPT on its cloud servers, Amazon has built its own family of large language models called Titan. Developers can use another service called Bedrock for generative AI applications, which also gives users access to other models developed by companies like Anthropic, Stability AI, and AI21 Labs.
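As a rough sketch of how a developer might call a Titan model through Bedrock's runtime API: the model ID, payload shape, and inference parameters below are assumptions to be checked against current AWS documentation, and the actual request requires AWS credentials and Bedrock model access in your account.

```python
import json

def build_titan_request(prompt, max_tokens=256, temperature=0.5):
    """Build the JSON body Bedrock is documented to expect for Titan
    text models (field names assumed; verify against AWS docs)."""
    return json.dumps({
        "inputText": prompt,
        "textGenerationConfig": {
            "maxTokenCount": max_tokens,
            "temperature": temperature,
        },
    })

body = build_titan_request("Summarize our Q3 sales data in one paragraph.")

# The live call needs credentials and an enabled model, so it is shown
# commented out; "amazon.titan-text-express-v1" is an illustrative ID.
# import boto3
# client = boto3.client("bedrock-runtime", region_name="us-east-1")
# response = client.invoke_model(
#     modelId="amazon.titan-text-express-v1", body=body)
# print(json.loads(response["body"].read())["results"][0]["outputText"])
```

Swapping the `modelId` is how Bedrock exposes the third-party models mentioned above, which is the point of the service: one API surface, multiple model providers.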

Even though Amazon has built its own LLM, it does not assume that it is the best option for all use cases and wants to offer users the option to pick the model they deem best for their application.

So far, the company has unveiled tools like HealthScribe, which lets doctors generate patient summaries using AI, while its CodeWhisperer assistant helps reduce the workload for developers. As the company looks to build more language models and even unveil a ChatGPT rival, its plans also revolve around data processed by in-house designed microchips.

Powered by self-designed chips

Amazon may well be following the trend in Silicon Valley, where companies have ditched legacy chipmakers and designed their chips. However, this isn't a recent shift in its strategy.

Stock image of a microchip during its design and fabrication process

Nearly a decade ago, Amazon's custom-designed silicon made its way into its cloud infrastructure. Dubbed Nitro, the chip has been produced more than 20 million times, putting at least one in practically every AWS server.

In 2018, the company built Graviton, an Arm-based server chip that competes with x86 offerings from AMD and Intel. Around the same time, it began working on its AI-focused chips, now looking to challenge Nvidia's dominance in the area.

Amazon's offerings have been dubbed Trainium and Inferentia to help distinguish their capabilities for training and running AI models, respectively. The latter is currently in its second generation and is designed to offer a low-cost, high-throughput option for running models.

Trainium, too, offers a 50 percent improvement in price performance compared with other methods of training models on AWS, company executives told CNBC. Amazon is confident that companies will pick its offerings to train their models rather than share data with OpenAI.

With the infrastructure and tools in place, Amazon might not just catch up with the likes of Google and Microsoft; it might even surpass them quickly.
