A deepfake video of actor Rashmika Mandanna entering an elevator has gone viral on social media, sparking calls for legal action. The video is so well-made that it is difficult to tell that it is not real.
Fellow actors and other users on social media platforms like X (formerly Twitter) quickly pointed out that the video is fake, and it has since been taken down from many platforms. However, the video has already been seen by millions of people.
The video has shocked actor Amitabh Bachchan, who has called for legal action against those who created it. Union minister Rajeev Chandrasekhar has also spoken out, calling it a “serious issue” and saying the government is “working on a comprehensive legislative framework to deal with the issue of deepfakes”.
What are Deepfake videos?
Deepfakes are videos or audio recordings that have been manipulated to make it look or sound like someone is saying or doing something they never actually said or did. They are created using artificial intelligence (AI) and can be very difficult to detect.
The spread of deepfake videos is a growing concern, as they can be used to spread misinformation, damage someone’s reputation, or even commit fraud.
What are deepfakes used for?
Most deepfake videos are pornographic in nature. But around election time, digitally altered clips of politicians are also circulated to falsely attribute a statement or promise to them.
A few years ago, former US President Barack Obama appeared to call Donald Trump a “complete dipshit” in a widely circulated deepfake video. Similarly, Meta chief Mark Zuckerberg was shown bragging about having “total control of billions of people’s stolen data” in another such video.
AI firm Deeptrace found 15,000 deepfake videos online in September 2019, nearly double the number from nine months earlier.