Tribune News Service
Davey Alba

Google is weaving generative AI into online shopping features

Google is bringing generative AI technology to shopping, aiming to get a jump on e-commerce sites like Amazon.com Inc.

The Alphabet Inc.-owned company announced features Wednesday aimed at helping people understand how apparel will fit on them, no matter their body size, and added capabilities for finding products using its search and image-recognition technology. Additionally, Google introduced new ways to research travel destinations and map routes using generative AI — technology that can craft text, images or even video from simple prompts.

“We want to make Google the place for consumers to come shop, as well as the place for merchants to connect with consumers,” Maria Renz, Google’s vice president of commerce, said in an interview ahead of the announcement. “We’ve always been committed to an open ecosystem and a healthy web, and this is one way where we’re bringing this technology to bear across merchants.”

Google is the world’s dominant search engine, but 46% of respondents in a survey of US shoppers conducted last year said they still started their product searches and research on Amazon, according to the research firm CivicScience. TikTok, too, is making inroads, CivicScience’s research found — 18% of Gen Z online shoppers turn to the platform first. Google is taking note, with some of its new, AI-powered shopping exploration features aimed at capturing younger audiences.

A new virtual “try-on” feature, launching on Wednesday, will let people see how clothes fit across a range of body types, from XXS to 4XL sizes. Apparel will be overlaid on top of images of diverse models that the company photographed while developing the capability.

Google said it was able to launch such a service because of a new image-based AI model it developed internally, and the company is releasing a research paper detailing the work alongside the announcement. The depictions of the clothing take into account the way fabric stretches and wrinkles as it's worn, producing lifelike images. The try-on feature will start with women's tops, in partnership with retailers like Anthropologie and Everlane, with men's clothing to follow.

The company also said it will begin pulling in more sources of information as people test out its new “search generative experience” — a service that Google first announced at its I/O developers conference last month. For now, that offering is only available through the company’s experimental Search Labs product.

Google earlier announced that it was using a variety of web-based sources to display AI-generated information on, say, the best hotel for families at a particular vacation destination or the best waterproof Bluetooth speaker. Now it is adding user reviews for its AI model to draw on as well.

The company is rolling out new additions to existing features on Google Maps, too. Immersive view, which uses AI to show people 3D tours of landmarks, is coming to four new cities: Amsterdam and Dublin, as well as Florence and Venice in Italy. Google is expanding its collection of landmarks available in immersive view to more than 500 on both the iOS and Android apps, adding destinations such as the Sydney Harbour Bridge and Prague Castle.

Glanceable directions, meanwhile, will let people see turn-by-turn directions for walking, cycling and driving modes from their phone lock screens. Google said people will also be able to see updated ETAs as they follow their routes in real time. That feature is rolling out globally in June.

Some AI features on Google Lens — the image-recognition app that uses a phone camera to identify objects and text — have been around for a while, like discovering the name of a local dish by snapping a photo of it while traveling. But on Wednesday, Google said it is launching the ability for users to search for skin conditions using the app.

After a user takes a photo of a rash or a skin bump, Lens will find visual matches for the image that could help inform people’s searches, the company said. The feature is meant to be a starting point for research, not certified medical advice, Google said.
