Credit: Unsplash/CC0 Public Domain
Imagine an October surprise like no other: Only a week before November 5, 2024, a video recording reveals a secret meeting between Joe Biden and Volodymyr Zelensky. The US and Ukrainian presidents agree to immediately admit Ukraine into NATO under a “Special Emergency Membership Protocol” and to prepare a nuclear strike against Russia. Suddenly, the world is on the brink of Armageddon.
While journalists could point out that no such protocol exists, and social media users might notice the video’s strange, video-game-like qualities, others may feel that their worst fears have been confirmed. When Election Day arrives, these worried citizens may let the video sway their votes, unaware that they have just been manipulated by a situation deepfake – an event that never actually happened.
This scenario represents the next step in the evolution of deepfake technologies that have already shaken viewers’ perception of reality. In our research on the DeFake project, my colleagues at the Rochester Institute of Technology, the University of Mississippi and Michigan State University and I study how deepfakes are made and what voters can do to protect themselves from them.
Imagining events that never happened
Deepfakes are created when someone uses artificial intelligence tools, especially deep learning, to manipulate or generate a face, a voice or – with the rise of large language models such as ChatGPT – conversational language. These can be combined to make a “situation deepfake”.
The basic idea and techniques of a situation deepfake are the same as for any other deepfake, but with a bolder ambition: to manipulate a real event or invent one out of thin air. Examples include depictions of Donald Trump’s perp walk and Trump hugging Anthony Fauci, neither of which happened. The hug shot was promoted by a Twitter account associated with the presidential campaign of Trump rival Ron DeSantis. An attack ad targeting Joe Biden’s 2024 campaign, published by the Republican National Committee, was made entirely with AI.
As our research on the DeFake project has found, deepfakes, including situation deepfakes, are usually created by some mixture of techniques: splicing one piece of media with another; using a video to animate an image or alter another video, dubbed puppeteering; conjuring a piece of media into existence, typically using generative AI; or some combination of these.
To be clear, many situation deepfakes are created for innocent purposes. For example, Infinite Odyssey Magazine creates fake stills from movies that were never produced or could never have existed. But even innocent deepfakes give reason for pause, as with the nearly believable fake photos depicting the Apollo Moon landing as a film production.
Deepfaking the election
Now put yourself in the position of someone trying to influence an upcoming election. What are the possible situations that you would like to create?
For starters, it will matter whether you want to tilt the vote toward or away from a specific outcome. Maybe you’d depict a candidate acting heroically by pulling a pedestrian out of the path of a speeding car, or instead committing an offensive or criminal act. The format of the situation deepfake will also matter. Instead of a video, it could be a photo, perhaps blurred and angled as if taken with a smartphone camera, or bearing the forged logo of a news agency.
Your target audience will be important. Rather than aiming at ordinary voters or a party’s base, you could target conspiracy theorists in key voting districts. You could portray the candidate or their family members as attending a satanic ritual, partying at the exclusive and controversial Bohemian Grove, or holding a secret rendezvous with a supernatural being.
If you have the ambition and the abilities for it, you could even try to deepfake the election itself. In June 2023, Russia’s television and radio stations were hacked to broadcast full mobilization orders delivered by a deepfake of Russian President Vladimir Putin. While this would be more difficult to do in a US election, in principle any news outlet could be hacked to make its anchors announce false results or a candidate’s defeat.
Defending reality
There are a variety of technical and psychological methods for detecting and preventing deepfakes.
On the technical front, all deepfakes contain some evidence of their true nature. Some of these tells can be seen by the human eye – such as overly smooth skin or inconsistent lighting or architecture – while others can be detected only by deepfake-hunting AI.
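To give a sense of what machine detection of such tells can look like, here is a toy sketch – not the DeFake project’s actual method – of one crude cue mentioned above: overly smooth skin. The `smoothness_score` function and its threshold-free comparison are illustrative assumptions; real detectors use trained deep networks, not a single texture statistic.

```python
import numpy as np

def smoothness_score(gray: np.ndarray, patch: int = 8) -> float:
    """Mean local variance over small patches of a grayscale image.
    Unnaturally low local variance (too little texture) is one crude,
    illustrative cue of AI-smoothed skin. Toy heuristic only."""
    h, w = gray.shape
    h, w = h - h % patch, w - w % patch  # crop to a multiple of patch size
    blocks = gray[:h, :w].reshape(h // patch, patch, w // patch, patch)
    # Variance within each patch, averaged over all patches.
    return float(blocks.var(axis=(1, 3)).mean())

# Toy comparison: noisy "camera" texture vs. an artificially smoothed region.
rng = np.random.default_rng(0)
natural = rng.normal(128, 20, (64, 64))   # sensor-noise-like texture
smoothed = np.full((64, 64), 128.0)       # perfectly uniform, zero texture

assert smoothness_score(smoothed) < smoothness_score(natural)
```

In practice a single statistic like this is easy to fool, which is why serious detection efforts combine many learned cues rather than one hand-picked threshold.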
We are building DeFake, a detector that uses AI to catch the telltale signs of deepfakes, and we are working to have it ready in time for the 2024 election. But even if a sufficiently powerful deepfake detector like ours can’t be deployed by Election Day, there are psychological tools that you, the voter, can use to identify deepfakes: background knowledge, curiosity and healthy skepticism.
If you come across media content about a person, place or event that seems out of character, rely on your background knowledge. For example, in the recent hoax about a fire at the Pentagon, the building shown looks more square than pentagonal, which could be a giveaway.
However, try not to rely solely on your background knowledge, which may be incorrect or incomplete. Never be afraid to learn more from reliable sources, such as fact-checked news reports, peer-reviewed academic articles or interviews with certified experts.
Additionally, be aware that deepfakes can be used to exploit what you already believe about a person, place or event. The best way to deal with this is to be aware of your own biases and to be a little wary of any media content that seems to confirm them.
Even if it becomes possible to create perfect situation deepfakes, how believable their subject matter is will likely remain their weak point. So, with or without a technical solution, you still have the power to defend the election against the influence of fake events.
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Citation: Fake videos could affect the 2024 presidential election – a cybersecurity researcher explains the state of deepfakes (2023, July 17). Retrieved July 17, 2023.
This document is subject to copyright. No part may be reproduced without written permission, except in any fair dealing for the purpose of personal study or research. The content is provided for information purposes only.