The “first deepfake” of Russia’s invasion of Ukraine is… bad?

But that doesn’t mean it couldn’t be effective.

Grant Currin
The deepfake itself (left); Zelenskyy responding to the deepfake (right).

For years, researchers and activists have warned that AI is getting so good at manipulating video and audio that bad actors can make it look like people said things they never did. 

These counterfeit videos, which experts call “deepfakes,” could have devastating consequences.

Bad actors could stoke civil unrest by sowing fear and hate. They could start wars. They could spread almost any kind of misinformation. 


That’s exactly what just happened in Ukraine.

Earlier today, Ukrainian news station Channel 24 announced that hackers who’d broken into its website and TV channel had shared a deepfake that depicted Ukrainian president Volodymyr Zelenskyy telling Ukrainians to lay down their arms.

The station explained in a Facebook post that the announcement was fake and the result of a hack.

“The running line of the ‘Ukraine 24’ TV channel and the ‘Today’ website were hacked by enemy hackers and broadcast Zelenskyy’s message about alleged ‘capitulation’❗️❗️❗️ THIS IS FAKE! FAKE!” the post reads in English translation.

A transcript of the fabricated audio was posted to Channel 24’s hacked website. An automatically generated translation of the message reads, in part, “I advise you to lay down your arms and return to your families. You should not die in this war. I advise you to live, and I’m going to do the same.”

A screenshot of the announcement posted to Channel 24’s hacked website. Source: Web Archive

This incident did not come as a big surprise to anyone who’s been watching the situation in Ukraine closely. Earlier this month, the Ukrainian Army issued a warning about fabricated videos.

“Imagine seeing Vladimir Zelensky on TV making a surrender statement. You see it, you hear it – so it’s true. But this is not true! This is deepfake technology,” the post reads, translated from Ukrainian.

“This will not be a real video, but created through machine learning algorithms. Videos made through such technologies are almost indistinguishable from real ones,” the post continues.

What is surprising is that the deepfake was, well, not very good. The technology for creating truly indistinguishable deepfakes doesn’t just exist; it’s easily accessible to almost anyone. A study published last month showed that ordinary people could do little better than chance at telling a still image of a real face from one of a deepfake.

But the Zelenskyy deepfake wasn’t quite so convincing. It looks more like a copy-paste job than AI-enabled super-villainy. 

Of course, it doesn’t really matter how good the deepfake is. What’s important is whether people are convinced. 

It’s impossible to say what — if any — effect this propaganda move has had, though Euronews reports that one version on Twitter was viewed more than 120,000 times.

Sam Gregory, a program director at the human rights and technology group Witness, told the outlet, “[t]his is the first deepfake that we’ve seen used in an intentional and broadly deceptive way.”

No one has claimed responsibility for the video as of this writing.

Zelenskyy responded earlier today with a short video on Instagram assuring viewers that he was not backing down.

“The only ones who should give up arms are Russian soldiers,” he says in the video.
