The authors suggest that we do not need technological advances to distort memory; we can do it easily and effectively using non-technological means. Credit: Anemone123, Pixabay, CC0 (creativecommons.org/publicdomain/zero/1.0/)
In a new study, deepfake video clips of movie remakes that don't actually exist led participants to falsely remember the films – but simple text descriptions of the fake movies induced similar false memory rates. Gillian Murphy of University College Cork, Ireland, and colleagues from Lero, the Science Foundation Ireland Research Centre for Software, present these findings in the open-access journal PLOS ONE.
Deepfake videos are clips that have been altered using artificial intelligence (AI) technology to replace one person’s voice or face with another person’s. The tools to create deepfakes have recently become much cheaper and more accessible, raising conversations about potential creative applications as well as potential risks – such as spreading misinformation and manipulating audience memories.
To explore the potential risks and benefits, Murphy and colleagues invited 436 people to complete an online survey that involved watching deepfake videos of fake movie remakes starring different actors, such as Will Smith as the character Neo in The Matrix – a role originally played by Keanu Reeves – and Brad Pitt and Angelina Jolie in The Shining. Other mock remakes in the study were Indiana Jones and Captain Marvel.
For comparison, participants also viewed clips from actual remakes, including Charlie and the Chocolate Factory, Total Recall, Carrie, and Tomb Raider. Also, in some cases, participants read text descriptions of remakes instead of viewing deepfakes. Participants were not told until later in the survey that the deepfakes were false.
In line with prior studies, the deepfake videos and descriptions produced false memories of the fake remakes, with an average of 49 percent of participants believing each fake remake was real. Several participants even reported remembering the fake remakes as better than the originals. However, false memory rates from the text descriptions were similarly high, suggesting that deepfake technology may be no more powerful than other tools at distorting memory.
Most participants reported being uncomfortable with deepfake technology being used to recreate movies, citing concerns including disrespecting artistic integrity and disrupting shared social experiences of movies.
These findings may help inform future design and regulation of deepfake technology in movies.
The authors state, "While deepfakes are of great concern for a number of reasons, such as non-consensual pornography and bullying, the current study suggests that they are not uniquely powerful in distorting our memories of the past. Though deepfakes did cause people to form false memories at quite high rates in this study, we achieved the same effects using simple text. In essence, this study shows we don't need technological advances to distort memory; we can do it very easily and effectively using non-technical means."
More information:
Gillian Murphy et al, Face/Off: Changing the face of movies with deepfakes, PLOS ONE (2023). DOI: 10.1371/journal.pone.0287503
Citation: Deepfake videos give half of participants false memories of movies (2023, July 13). Retrieved July 13, 2023.
This document is subject to copyright. No part may be reproduced without written permission, except in any fair dealing for the purpose of personal study or research. The content is provided for information purposes only.