
AWS is shelling out free credits for startups to use AI models on its platform, again


Amazon Web Services (AWS) has expanded its initiative to support startups by offering free credits they can use to access prominent AI models, including those developed by Anthropic, Meta, Mistral AI, and Cohere. The move is part of Amazon's strategy to drive adoption of its own AI platform, Amazon Bedrock.
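
For context, Bedrock exposes these third-party models through a single runtime API. The snippet below is a minimal sketch of how a startup might call Anthropic's Claude through Bedrock using the boto3 SDK; the region, model ID, and prompt are illustrative, and the model must already be enabled in the AWS account.

# Minimal sketch: invoking an Anthropic Claude model via Amazon Bedrock with boto3.
# The model ID, region, and prompt are illustrative; check the Bedrock console for
# the models enabled in your account.
import json
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 256,
    "messages": [
        {"role": "user", "content": "Summarize our product pitch in one sentence."}
    ],
}

response = client.invoke_model(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # illustrative model ID
    body=json.dumps(body),
)

result = json.loads(response["body"].read())
print(result["content"][0]["text"])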

The program, called the AWS Generative AI Accelerator, was initially launched in 2023 to assist promising startups globally by providing networking opportunities, access to potential investors, and up to $300,000 in AWS credits. This time, Amazon has partnered with Y Combinator to offer $500,000 in credits that can be spent on AI models and Amazon's own chips.

Howard Wright, vice president and global head of startups at AWS, said in an interview with Reuters:

"This is another gift that we're making back to the startup ecosystem, in exchange for what we hope is startups continue to choose AWS as their first stop."

The program is open to startups applying generative AI across various industries, such as legal, marketing, software engineering, green energy, and life sciences. Selected startups should have a minimum viable product (MVP), customer traction, and a focus on enhancing their product's value proposition in order to scale. The initiative also offers a dedicated support team for startups already building on AWS, including AWS Solutions Architects to guide product development.

Amazon is one of the major investors in Anthropic, the company behind the popular AI chatbot Claude. The cloud and e-commerce giant has invested more than $4 billion in Anthropic, with AWS serving as Anthropic's primary cloud provider, much like Microsoft Azure is the exclusive cloud provider for OpenAI. Amazon says it has provided more than $6 billion in credits to startups over the past decade.

Amazon is pushing hard not only on training its own LLMs but also on building efficient, powerful chips to train and run AI models. The company claims its Trainium and Inferentia chips offer better price performance than Nvidia GPUs, with Trainium delivering roughly a 50% improvement in price performance for training machine learning models. Inferentia, meanwhile, is aimed at low-cost, high-throughput, low-latency inference, making it well suited to serving predictions from generative AI models.

Via Reuters
