TechRadar
Craig Hale

Amazon is the latest player in the generative AI game, but with a twist

AWS re:Invent 2022 logo sign

As OpenAI has thrown down the gauntlet to big tech with ChatGPT, Amazon has announced its own generative AI project to meet the challenge, highlighting a two-decade history of machine learning and artificial intelligence developments.

The company has reminded us several times during the new AI era that its own e-commerce model, warehouse robots, and logistical operations are based on ML, but it seems its cloud division is finally ready to commit to generative AI.

With the launch of Bedrock, Amazon will let third-party companies and startups develop their own generative AI apps on top of pre-trained models.

Amazon Bedrock generative AI

Democratizing is a word that gets thrown around a lot in 2023, and that’s what Amazon plans to do for machine learning. But what exactly is the tech giant planning to do with Bedrock?

According to the firm, the announcement addresses three customer demands: access to high-performing foundation models (FMs), seamless and affordable integration into applications, and customizability.

Bedrock makes FMs available via an API from companies including AI21 Labs, Anthropic, and Stability AI, as well as Amazon's own models, covering multilingual text generation, text-to-image, and conversation.
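To illustrate what API access to a hosted foundation model typically looks like, here is a minimal Python sketch. It assumes boto3's `bedrock-runtime` client and its `invoke_model` call; the request-body schema and the model identifier shown are illustrative assumptions (each provider's model defines its own payload format), not details confirmed by the announcement.

```python
import json


def build_text_request(prompt: str, max_tokens: int = 200) -> str:
    """Assemble a JSON request body for a text-generation model.

    The exact field names vary by model provider; this shape is an
    illustrative assumption, not a documented Bedrock contract.
    """
    return json.dumps({"prompt": prompt, "max_tokens": max_tokens})


def invoke(prompt: str) -> dict:
    """Send a prompt to a hosted foundation model and return its reply.

    Requires AWS credentials and Bedrock access (limited preview at the
    time of writing). The client name, invoke_model call, and model ID
    below are assumptions based on AWS's usual SDK conventions.
    """
    import boto3  # AWS SDK for Python; not needed for offline use

    client = boto3.client("bedrock-runtime")
    response = client.invoke_model(
        modelId="anthropic.claude-v1",  # hypothetical model identifier
        body=build_text_request(prompt),
    )
    return json.loads(response["body"].read())
```

Keeping payload construction separate from the network call, as above, makes the request format easy to adapt per model without touching credential or transport code.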

Currently in limited preview with hand-picked companies, AWS plans to make Bedrock available to more customers this year.

It has also shared a use case for FMs in its CodeWhisperer tool, now generally available and free for individual developers following last year's preview. Much like GitHub's GPT-powered Copilot, it is designed to save time by generating code suggestions, with the aim of avoiding the vulnerabilities and inefficiencies that typically come with copying code from the web.

Amazon has also announced the general availability of Amazon EC2 Trn1n and EC2 Inf2 instances, giving potential customers a chance to evaluate AWS's cloud infrastructure before committing. These instances use the company's custom AWS Trainium and AWS Inferentia chips, with headline figures promising large savings on training costs while delivering high performance.

While significant time likely stands between Amazon’s announcements and the effective democratization of AI and ML, support for startups is clearly there as we enter the era of AI.
