Pedestrian.tv
Lachlan Hodson

Both Parties Are Under Fire For Using Weird AI Videos To Attack Their Opposition

A series of TikToks has caused a storm in Australian politics, after parties used AI deepfakes to make videos of their political opponents dancing. One politician has described the videos as a “turning point” for democracy.

You might have been surprised recently to see either Queensland Premier Steven Miles or federal Opposition Leader Peter Dutton grooving it out on your TikTok For You Page.

Unfortunately, neither of these politicians has secretly been taking dance lessons (that we know of) to appeal to the Gen Z voter base. Instead, these are AI-generated deepfakes.

You’ve probably seen similar strange dance videos on your TikTok FYP lately. Thanks to constant progress in AI video generation, using these programs to create content is becoming more accessible to everyone.

[Embedded TikTok by @jessika_r9: an AI-generated dance video, original sound by Jessica Ramos]

And while it’s excellent for cooked memes that entertain my broken Zoomer humour, political parties using AI for attack ads has caused issues for a few politicians.

Steven Miles had an AI dance video generated of him

This week, Queensland’s Labor Premier Steven Miles slammed the state’s Liberal National Party (LNP) for sharing a video to its TikTok that used AI to depict him dancing.

The video (which has since been removed from the LNP’s TikTok) featured a model of Miles ripped straight from the uncanny valley, busting a move to the song “Closer” by Ne-Yo.

“POV: my rent is up $60 a week, my power bill is up 20%, but the premier made a sandwich on TikTok,” the controversial video was captioned.

Miles slammed the Liberal National Party for using AI to depict his likeness, calling the video a “turning point for our democracy”, especially in the lead-up to the state’s October election.

“Until now we’ve known that photos could be doctored or Photoshopped. But we’ve been trained to believe what we see in videos, and for a political party, now, to be willing to use AI to make deepfake attack videos. It’s a very dangerous turning point,” Miles said.

“And that means that Queenslanders between now and October will have to question everything that they see from the LNP and ask themselves, is this real? Or is this a deepfake?”

A spokesperson for the LNP defended the video, saying it was clearly marked as AI-generated on social media. They also pointed out that this format of entertaining video is a popular trend online, and the party was simply following it.

Steven Miles doubled down and said his party has “no intention of using artificial intelligence to create deep fake videos”.

Except as the LNP pointed out… someone from Labor already had! Oops, caught red handed.

Peter Dutton got one too

On June 6, the federal Labor Party’s TikTok page posted a similar video of Opposition Leader Peter Dutton, long before the Queensland Liberal Nationals posted their video.

Labor’s attack ad on Dutton came out just after the Liberal Leader announced his controversial nuclear energy policy, and featured an AI-deepfake model of the politician doing the “Gimme Head Top” TikTok dance trend.

“Dance if you want to build nuclear power plants in everyone’s backyard,” read the video’s caption.

The video remains online, and has been labelled as created by AI.

Neither Peter Dutton nor a Labor spokesperson has commented on the video yet.

How are these AI dance videos made?

Seeing AI versions of politicians do or say uncharacteristic things on TikTok is hardly new territory. I think we’ve all seen enough videos of Barack Obama, Joe Biden, and Donald Trump playing Minecraft together by now to know we can’t believe everything we see online.

However, this trend of creating passably lifelike models and making them dance is new.

These videos are created using an artificial intelligence program called Viggle.ai, which can be accessed for free online after signing up to the program’s beta.

All the user needs to do is choose a photo of a person, which the AI then creates a 3D model of. Then it makes that model copy the body movements of another chosen video subject.
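
For the curious, here’s a rough sketch of that workflow in Python. To be clear, this is illustrative pseudocode under assumptions, not Viggle.ai’s actual interface: every function and field name below is made up for the example.

# Illustrative sketch only: NOT Viggle.ai's real API. All names here are
# invented to mirror the general workflow described above: one photo
# supplies the person, one video supplies the movement to copy.
from dataclasses import dataclass

@dataclass
class DanceJob:
    subject_photo: str        # the photo of the person to animate
    motion_video: str         # the video whose body movements get copied
    label_as_ai: bool = True  # AI content should be disclosed when posted

def build_character(photo: str) -> dict:
    # Stand-in for the step where the tool builds a model from the photo.
    return {"source": photo, "rig": "estimated-from-photo"}

def transfer_motion(character: dict, motion_video: str) -> str:
    # Stand-in for the step where the model copies the video's movements.
    return f"clip of {character['source']} moving like {motion_video}"

def make_dance_video(job: DanceJob) -> str:
    character = build_character(job.subject_photo)
    clip = transfer_motion(character, job.motion_video)
    return clip + (" [AI-generated]" if job.label_as_ai else "")

print(make_dance_video(DanceJob("politician.jpg", "dance_trend.mp4")))

The real service does the heavy lifting with a video-generation model, but the two inputs, a still photo and a motion reference, are the whole trick.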

After only minutes on the site, I was able to render AI versions of whoever I wanted doing a variety of dances.

Naturally, I created a video of former prime minister Tony Abbott doing a Fortnite dance in his iconic budgie smugglers.

Can’t wait for the Boomers to get tricked by that one.

However, the program doesn’t just make silly dance videos. It can overlay an AI-generated model of anyone onto any other clear footage of someone’s movement.

For example, here is a deepfake of billionaire and wannabe politician Clive Palmer playing soccer.

I was able to create dozens of these videos in minutes.

So what does this accessibility mean for the future of political ads? Is everyone going to start using deepfakes of their opposition, or is it already banned?

Why are politicians concerned about AI deepfakes?

The creation of deepfake technology has raised countless concerns, especially over threats such as deepfake pornography.

In the political sphere, AI deepfakes can cause problems for politicians when a deepfake of their likeness is used to push a message they never endorsed or believed in, as highlighted by Greens Senator David Shoebridge on ABC Radio.

“Imagine if it was a more serious video, imagine if it was, you know, the premier allegedly saying things he didn’t believe in and that may have been contrary to his party platform,” he said.

Shoebridge highlighted that laws in Australia had been proposed that would make it illegal for deepfakes to be made of politicians before and during an election campaign — with similar laws already existing in South Korea.

However, we’re talking about silly dancing videos here.

The content, regardless of whether it is any good, was labelled as AI by those uploading it. Both examples were made for parody, not to mislead voters.

In these cases, Shoebridge said that any laws created would have to allow for “things like parody” to be made.

At the end of the day, the internet is still the same cooking pot of misleading and confusing content it’s always been. And until legislation comes in to stop misinformation and AI deepfakes, it’s up to you to make sure you don’t fall for fake news.

Good luck out there, folks.
