International Business Times UK
Technology
Vinay Patel

Meta AI Image Generator Glitch: Users Report Issues Generating Interracial Images

Meta's AI is accused of being racist. (Credit: Reuters)

Some social media users reported limitations when using Meta's AI image generation tool, specifically regarding images featuring interracial couples.

Google recently paused its Gemini AI's ability to generate images of people after concerns about bias in the technology. The decision comes amid speculation that Apple will partner with Google on AI features for future iPhones, raising questions about potential bias within those features.

Meta's AI image generator, which creates realistic images from text prompts, is under scrutiny for potential racial bias. Users report difficulties generating images depicting interracial couples, particularly Asian men with white women. The oversight raises questions about the algorithm's fairness, not least because Meta CEO Mark Zuckerberg is himself married to an Asian woman.

Taking to social media, commenters criticised the tool's inability to generate specific interracial couple images. "Racist software made by racist engineers," X user @Kazuya145 wrote.

Mia Sato, a reporter for The Verge, tested Meta's AI image generator using prompts that described people by race and ethnicity, such as "Asian man and Caucasian friend" or "Asian man and white wife."

She found that out of dozens of tests, Meta's AI only displayed a white man and an Asian woman once. In all other cases, Meta's AI returned images of East Asian men and women. Even when Sato changed the prompts to specify platonic relationships, like "Asian man with a Caucasian friend," the AI struggled to generate diverse results.

"The image generator not being able to conceive of Asian people standing next to white people is egregious. Once again, generative AI, rather than allowing the imagination to take flight, imprisons it within a formalisation of society's dumber impulses," Sato wrote.

Despite highlighting the AI's bias and tendency toward stereotypes, the reporter stopped short of explicitly calling it racist. On social media, however, criticism escalated into accusations of racism against Meta's tool.

"Thank you for putting in the spotlight an often overlooked [aspect] of why AI sucks: it is so incredibly ******* racist," one commenter on X (formerly Twitter) wrote. Another added, "Pretty racist META lol."

As some commenters pointed out, the AI's apparent bias is striking given that Mark Zuckerberg's wife, Priscilla Chan, is of East Asian descent. Some users jokingly shared pictures of Zuckerberg and Chan on X, implying that Meta's AI could not generate such images.

It is worth noting that Zuckerberg's Meta is not the only tech company catching flak for developing a "racist" AI image generator. In February, Google had to pause people-image generation in its Gemini AI (formerly Bard) tool following critiques labelling it as "woke," as the AI appeared to refrain from generating images of white individuals.

When given prompts that did not specify race or ethnicity, Gemini often produced historically inaccurate images, such as Asian people in Nazi uniforms during World War II, Black Vikings, and female knights in the Middle Ages.

"Gemini's AI image generation does generate a wide range of people. And that's generally a good thing because many people worldwide use it. But it's missing the mark here," Google said.
