Amazon has announced a $4 billion investment in Anthropic, the firm behind the generative AI model Claude. The partnership will let the retailer and cloud computing giant integrate a foundational AI platform into its Amazon Web Services (AWS) cloud computing division, and, as part of the arrangement, Anthropic will also use Amazon's chips to develop its future models.
Through the deal, Anthropic will have a steady flow of orders for its software, as Claude will be available to Amazon's AWS customers, and developers will be able to use Anthropic's AI as a foundational layer for building customized variants. Amazon claims that the deal allows it to expand its AI offerings across all three layers of the generative AI stack.
Amazon Beefs Up Middle Layer Of Generative AI Stack By Taking Minority Position In AI Firm Anthropic
AI is one of the major trends of 2023 and has mitigated some of the effects of the downturn in the personal computing market. Within a year, NVIDIA Corporation's revenue has rebounded after tanking when GPU sales dried up. Other chip firms have also felt the effects of the PC downturn, but it is clear that the impact of the inventory correction would have been greater on nearly every major chip firm if businesses had not ramped up their spending on AI chips.
This latest wave is different from previous ones since it involves models capable of generating their own content, whether text or images. Amazon's Bedrock service, which incorporates the generative AI model Claude, is one such platform; it lets businesses shop for off-the-shelf foundation models that they can use to build their own variants. Amazon will also invest $4 billion in Anthropic, becoming a minority shareholder in the firm.
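For context, the sketch below shows what calling Claude through Bedrock can look like from a developer's perspective. It is a minimal example, assuming the boto3 SDK is installed, AWS credentials with Bedrock access are configured, and the anthropic.claude-v2 model ID is enabled for the account; the prompt contents and parameters are illustrative assumptions rather than a definitive reference.

```python
# Minimal sketch: invoking Anthropic's Claude through Amazon Bedrock.
# Assumes boto3 is installed, AWS credentials are configured, and the
# account has been granted access to the "anthropic.claude-v2" model.
import json
import boto3

# Bedrock exposes model inference through the "bedrock-runtime" client.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Claude v2 on Bedrock expects the Human/Assistant prompt format.
body = {
    "prompt": "\n\nHuman: Summarize why foundation models matter for AWS customers.\n\nAssistant:",
    "max_tokens_to_sample": 300,
    "temperature": 0.5,
}

response = client.invoke_model(
    modelId="anthropic.claude-v2",  # model ID is an assumption; check Bedrock's model catalog
    body=json.dumps(body),
    contentType="application/json",
    accept="application/json",
)

# The completion text is returned in the JSON response body.
result = json.loads(response["body"].read())
print(result.get("completion", ""))
```

A customer building a customized variant would typically layer its own prompts, retrieval, or fine-tuning on top of a foundation model accessed this way, rather than training a model from scratch.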
The fresh capital should allow Anthropic to develop its future models. Anthropic has already announced Claude 2, and Amazon began offering Claude to customers in April, at the peak of the hype surrounding ChatGPT.
With the latest announcement, it seems that Amazon will not only offer foundational AI models to its customers but will also ramp up its own AI transformation. The Anthropic partnership already allows customers such as the asset management firm Bridgewater Associates to shift some analyst functions to AI.
Another key part of the deal is Anthropic agreeing to use Amazon's Trainium and Inferentia chips to train its future models. This will provide Amazon with a steady stream of orders for its custom chip lineup. Silicon investments are quite common among big technology firms, and Amazon announced the Trainium processors in 2020.
At the time of the Trainium announcement, Amazon claimed that the chips would offer higher throughput and lower costs than traditional GPU-based systems. The Trainium chips are designed for training machine learning models, while inference is handled by a separate line of custom chips, the Inferentia accelerators.
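To illustrate the split between the two chip families, here is a minimal sketch of compiling a PyTorch model for Inferentia with the AWS Neuron SDK's torch_neuronx package. The toy model, input shape, and availability of a Neuron-capable instance are assumptions, and training workloads on Trainium typically go through the related torch-xla/NeuronX tooling instead.

```python
# Minimal sketch: compiling a PyTorch model for Inferentia with the AWS
# Neuron SDK (torch-neuronx). The toy model and input shape are assumptions;
# a real workload would compile a full trained network instead.
import torch
import torch_neuronx  # assumes the Neuron SDK is installed on a Neuron-capable instance

# A stand-in model; in practice this would be the trained network.
model = torch.nn.Sequential(
    torch.nn.Linear(128, 256),
    torch.nn.ReLU(),
    torch.nn.Linear(256, 10),
).eval()

example_input = torch.rand(1, 128)

# trace() compiles the model ahead of time into a Neuron-executable graph,
# which then runs on the Inferentia accelerators at inference time.
neuron_model = torch_neuronx.trace(model, example_input)

# The compiled artifact can be saved and loaded later on an Inferentia instance.
torch.jit.save(neuron_model, "model_neuron.pt")
print(neuron_model(example_input).shape)
```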
Amazon's press release also included comments from small and large companies that are already using Amazon Bedrock. Amazon and Anthropic will provide Claude and Claude 2 to customers via the AWS Generative AI Innovation Center. The list of customers includes Bridgewater, which is using AWS to create an Investment Analyst Assistant.
This assistant will "be able to generate elaborate charts, compute financial indicators, and create summaries of the results, based on both minimal and complex instructions," according to Bridgewater's co-chief investment officer Greg Jensen. Other Amazon customers using foundation models to create their own variants include travel companies and law firms, which use Claude for customized travel plans, summarization, and legal drafting.