Zelensky and Putin ‘deepfake’ video emerges on brink of Ukraine conflict

Ukrainian media outlets were hacked on March 16 to broadcast a video of Volodymyr Zelensky announcing the surrender of Ukrainian troops. The low-quality video is a "deepfake", a fabricated clip that puts words in someone's mouth by animating their face and modifying their voice. In response, other hackers circulated a similar video in which Vladimir Putin announced he had made peace with Ukraine.

A video of Volodymyr Zelensky, circulated through multiple channels including Facebook, Twitter and Telegram, shows him telling the camera in Ukrainian:

Dear defenders [of Ukraine]. As it turns out, being president is not that easy. I have to make difficult decisions. First I decided to return the Donbass [to Russia]. Time to face the facts: it didn't work. […] Now I have made another difficult decision: to say goodbye to you. I suggest you lay down your arms and go back to your families. You should not die in this war. I advise you to live. I will do the same.

As shown in the screenshot, the video was posted on the social networks of the Ukrainian Russian-language tabloid Segodnya, as well as on the Ukraine 24 website. Both media outlets responded quickly, saying they had been hacked and denying the authenticity of the video.

The Ukrainian president himself released a video on the afternoon of March 16 mocking the deepfake, which he deemed "naive". "The only thing I can advise is that Russian soldiers put down their weapons and go home!" he told the camera.

The virality of the video is difficult to assess, because it was quickly removed from major platforms such as Twitter and Facebook.

Still, by the end of the day on March 16, the main Twitter post had reached 80,000 views and just over 400 retweets. A pro-Ukrainian account mocked the deepfake's poor quality: "Who are we up against? These people can't even make a proper deepfake."

The tweet, from a pro-Ukrainian account, mocked the low-quality Zelensky deepfake. © Twitter

According to the Atlantic Council, the video was first shared on the pro-Russian Telegram channel "Operational", which encouraged subscribers to repost it and distributed the video file.

According to its metadata, the video was created on March 16 and shared on the channel two hours later. It then spread widely, boosted mainly by Russia's main social network, VK.

A Vladimir Putin deepfake in response

Hours after the deepfake of the Ukrainian president, another video surfaced, this time of Vladimir Putin delivering a pacifist speech far removed from his recent statements. In the video, he can be heard saying in Russian:

"We have just negotiated with Ukraine, and Russia has had some success. I can tell you now: we have made peace with Ukraine. Ukraine within its internationally recognized borders, including the Donetsk and Luhansk regions. We have agreed to establish a joint fund with the EU and the United States to rebuild infrastructure in these Ukrainian regions. We have also signed a five-year roadmap to restore Crimea as an autonomous Ukrainian republic. […] There will be no repression [in Ukraine], just as there will be no repression against the Russian people. All of this is enshrined in the peace agreement. Life and peace go on."

According to the American outlet the Daily Dot, the video had been circulating for several days. Again, its source is hard to trace, and it was not widely shared: it reached only 70,000 views and is still visible on Twitter, reposted by pro-Ukrainian accounts.

A warning about the spread of deepfakes less than two weeks earlier

On March 2, the Ukrainian Center for Strategic Communications and Information Security warned its followers in a Facebook post: "Imagine you see Volodymyr Zelensky on TV making a statement about the country's surrender. You see it, you hear it. But it isn't real. It's a deepfake created by a machine-learning algorithm."

Can such content really deceive people? According to Gérald Holubowicz, a media product manager and deepfake specialist contacted by the France 24 Observers editorial team, caution is warranted:

Looking at the credibility of the content, there is first a quality issue: the head [of Volodymyr Zelensky, editor's note] is visibly disproportionate to the neck, and the distribution channel proved relatively ineffective. But above all there is a question of context: President Zelensky has a strong reputation among Ukrainians and direct lines of communication, so the fake was immediately called into question.

However, accusations of political deepfakes often involve highly contested figures, as in Gabon [where opponents accused the government of broadcasting a deepfake of a speech by President Ali Bongo after his stroke, believing him incapable of governing, editor's note]. The idea is to create confusion between the real and the fake, making it difficult for politicians to prove whether a video is genuine.

"This type of content can be very dangerous for an unprepared audience"

Still, according to Gérald Holubowicz, the conflict in Ukraine is the first in which we have seen deepfakes multiply, including ones of Vladimir Putin, with the aim of misleading public opinion. He continues:

We are seeing synthetic media begin to play a major role in this disinformation battle. There are real gaps in public knowledge on the subject, and this type of content can be very dangerous for an unprepared audience. The manipulation is obvious to the trained eye, but the situation Ukrainians face leaves them in limbo, and a video like this, broadcast on a mass scale on Telegram or WhatsApp, could clearly sow panic among the unaccustomed.

We might also worry that this is a testing or learning phase for the hackers: they post videos to gauge how misleading they are, and study what mistakes they made before producing a more convincing fake. At the same time, as videos of Zelensky pile up, they provide material the forgers can process to improve their fakes.

If the conflict continues, I fear we should expect ever more convincing deepfakes, which is why it is important to raise public awareness of these issues, especially of the provenance of such videos and of the openly available tools used to create this kind of content.

Detecting deepfakes

There are relatively few deepfake videos aimed at large-scale deception, although there have been some recent cases, such as in Cameroon or, more recently, in Mali.

Because producing them well requires significant investment, they are often of poor quality and usually easy to spot. Here are some tips listed in our Info or Intox episodes.
