Amazon jumps into the generative AI race with new cloud service and its own large language models
The announcement indicates that the largest provider of cloud infrastructure won’t be leaving a trendy growth area to challengers such as Google and Microsoft, both of which have started offering developers large language models they can tap into. Generally speaking, large language models are AI programs trained on vast amounts of data that can compose human-like text in response to prompts people type in.
Through its Bedrock generative AI service, Amazon Web Services will offer access to its own first-party language models called Titan, as well as language models from startups AI21 and Google-backed Anthropic, and a model for turning text into images from startup Stability AI. One Titan model can generate text for blog posts, emails or other documents. The other can help with search and personalization.
The Bedrock initiative comes one month after OpenAI announced GPT-4, a large language model that powers ChatGPT, a chatbot that went viral after its launch in November. The most formidable competition for Amazon’s AWS business comes from Microsoft, which has invested billions in OpenAI and supplies the startup with computing power through its Azure cloud.
People using ChatGPT and Microsoft’s Bing chatbot, which is based on OpenAI language models, have at times encountered inaccurate information, owing to a behavior called hallucination, in which a model produces output that appears convincing but is not supported by its training data. Amazon is “really concerned about” accuracy and ensuring its Titan models produce high-quality responses, Bratin Saha, an AWS vice president, told CNBC in an interview.
Clients will be able to customize Titan models with their own data, but that data will never be used to train the Titan models themselves, ensuring that other customers, including competitors, don’t end up benefiting from it, said Swami Sivasubramanian, another AWS vice president.
Sivasubramanian and Saha declined to talk about the size of the Titan models or identify the data Amazon used to train them, and Saha would not describe the process Amazon followed to remove problematic parts of the model training data.
Amazon isn’t disclosing the cost of the Bedrock service, because for now it’s starting a limited preview. Customers can add themselves to a waiting list, a spokesperson said. Microsoft and OpenAI have announced prices for using GPT-4, which start at a few cents per 1,000 “tokens,” with one token being equal to about four characters of English text. Google has not released pricing for its PaLM language model.
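As an illustration of that rule of thumb, a rough cost estimate can be computed from a document's character count. The per-token rate below is an assumed placeholder for "a few cents per 1,000 tokens," not Amazon's, OpenAI's, or Google's actual pricing:

```python
def estimate_cost(text: str,
                  price_per_1k_tokens: float = 0.03,  # assumed rate, not real pricing
                  chars_per_token: int = 4) -> float:
    """Estimate API cost using the ~4-characters-per-token rule for English text."""
    tokens = len(text) / chars_per_token
    return tokens / 1000 * price_per_1k_tokens

# A 2,000-character document is roughly 500 tokens:
print(round(estimate_cost("x" * 2000), 4))  # 0.015
```

Actual token counts depend on the model's tokenizer, so real bills can diverge noticeably from this character-based approximation.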
Sivasubramanian, who has been at Amazon since the mid-2000s, said that Amazon has worked on AI for more than two decades and that AWS has racked up over 100,000 AI customers. Amazon has been using a fine-tuned version of Titan to deliver search results through its homepage, he added.
But Amazon is just one of the big companies that have rushed to bring out generative AI capabilities after ChatGPT appeared and became a hit. Expedia, HubSpot, Paylocity and Spotify are among the companies that have committed to integrating OpenAI technology.
Morgan Stanley analysts said in a Wednesday note that, based on a February survey of chief information officers, they expect AI to become a larger part of cloud spending, with Google and Microsoft being the largest beneficiaries, not Amazon.
“We always actually launch when things are ready, and all these technologies are super early,” Sivasubramanian said. He said Amazon wants to ensure Bedrock will be easy to use and cost-effective, thanks to the use of custom AI processors.
C3.ai, Pegasystems, Accenture and Deloitte are among the companies looking forward to using Bedrock, Sivasubramanian wrote in a blog post announcing the service.