In recent days, everyone from government agencies to celebrities to pro sports teams has hopped on the latest artificial intelligence-generated trend, using programs like ChatGPT to create images of themselves in Barbie-like plastic packaging.
However, tech experts warn that the light-hearted trend carries some risks, ranging from potentially inviting cyber scams to raising ethics and sustainability concerns.
Participants in the trend often generate images featuring items that reference various aspects of their lives, whether it’s where they live, what they do for a living, or a favorite pastime. That type of disclosure could help scammers trick people down the line.
"Oh, you know we had to hop on the trend. The fellas, reimagined as action figures. #GuardsBall pic.twitter.com/0u60uUfF2j" — Cleveland Guardians (@CleGuardians) April 14, 2025
"Clark County employees, assemble! 💪 We joined the action figure trend to spotlight some real-life heroes who make our community stronger every day. Because when we work #togetherforbetter, anything is possible. 💙 Want to join our team? Apply now: https://t.co/w8zNIqUen9" — Clark County Nevada (@ClarkCountyNV) April 16, 2025
“The fact that you are showing people, ‘Here are the three or four things I’m most interested in at this point’ and sharing it to the world, that becomes a very big risk, because now people can target you,” Dave Chronister, the CEO of cybersecurity company Parameter Security, told HuffPost. “Social engineering attacks today are still the easiest, most popular way for attackers to target you as an employee and you as an individual.”
Jennifer King, a privacy and data policy fellow at the Stanford University Institute for Human-Centered Artificial Intelligence, told the outlet that users should keep in mind that their images will likely be used to train future AI models, the same tools increasingly being integrated into corporate and military applications.
Others have urged users to be careful about incorporating trademarked material into their action figures.

"Mattel has been known to pretty actively enforce protections against their marks," attorney Charles Gallagher told Fox13. "Having a Barbie logo on your action figure would probably be something you don’t want to have."
Amid the action figure meme, some have sought to remind the public of the enormous amounts of energy and water needed to run the advanced computers that power AI models.
"ChatGPT Barbie represents a triple threat to our privacy, our culture and our planet," Professor Gina Neff of Queen Mary University London said in an interview with BBC.