Prashant Kumar

Can Congress Save MrBeast and Tom Hanks From AI Deepfakes?

Deepfake technology is getting easier to use, and it’s becoming a problem for both famous people and regular folks. Experts say it’s risky because scammers are using this tech to create fake ads that look real.

An ad on TikTok claimed to sell iPhones for $2 each, with famous YouTuber MrBeast appearing to promote the deal. But it was all a scam. MrBeast and other celebrities, including Tom Hanks and Gayle King, are facing the same problem: scammers are using AI to insert their faces into fake ads for products like phones and weight-loss remedies.

Some politicians are trying to solve this. They want to make laws to stop these fake ads. If these laws pass, it could help people take legal action against scammers using their faces.

But it’s not clear if these laws will stop all the harmful fake videos. Social media companies have rules against fake videos, but these ads still manage to get through.

The people behind these scams are hard to catch. They appear and disappear quickly, moving on to new targets. This means anyone who has posted their face online could become a target for these scammers, not just famous people.

Some think the best solution is to use existing laws that protect a person’s image and likeness. These laws exist in some states and are used by celebrities to fight against unauthorized ads. However, not all states have these laws, so some politicians are pushing for a new law called the No Fakes Act. This law would create a rule across the country to let people sue those who use their fake images without permission, with fines and penalties for the offenders.

Recently, a tweet from a social and digital strategist and fan posed the question at the heart of the debate: “Can Congress Save MrBeast and Tom Hanks From AI Deepfakes?” The post has since drawn speculation and mixed reactions from fans and the wider public.

Face Swap Lawsuit

Kyland Young, a former contestant on the TV show “Big Brother,” sued the face-swap app Reface in April in a Los Angeles court. He claimed the company made money from his face and likeness without asking him first, and that the app used his image to promote paid versions of its software.

In September, Judge Wesley Hsu denied Reface’s request to throw out the case. Reface argued that because people using the app create new funny videos and memes, it should be protected by the First Amendment’s guarantee of free speech. The judge disagreed.

Eleanor Lackman, an intellectual property lawyer, wasn’t surprised by the decision. She explained that even though California law allows free-speech exceptions, Reface’s use of Young’s image in an ad was seen as a way to sell something, which meant the case could proceed rather than being thrown out early.

“When a product is being sold using someone’s face, it’s more obvious that there might be a problem,” she said.

What is a “deepfake” and why does it matter?

The term “deepfake” refers to an AI-driven method of creating synthetic media by superimposing one person’s features onto another person’s body, or by manipulating audio to produce a lifelike imitation of a human voice. Actor Val Kilmer, who lost his distinct voice to throat cancer in 2015, recently had it “restored” using Sonantic’s deepfake technology. The moment moved Kilmer’s son to tears upon hearing his father’s voice again.
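Real deepfakes rely on neural networks (autoencoders or GANs) that match pose, lighting, and expression. The core idea of superimposing one image region onto another can be shown with a toy NumPy sketch; the function name, images, and blending weight below are purely illustrative, not from any actual deepfake tool.

```python
import numpy as np

def naive_face_overlay(target, source_face, top, left, alpha=0.8):
    """Toy illustration of the 'superimpose' step: paste source_face
    onto target at (top, left), blending with the pixels underneath.
    Actual deepfake systems learn this mapping with neural networks
    instead of copying raw pixels."""
    h, w = source_face.shape[:2]
    region = target[top:top + h, left:left + w].astype(float)
    blended = alpha * source_face.astype(float) + (1 - alpha) * region
    out = target.copy()
    out[top:top + h, left:left + w] = blended.astype(target.dtype)
    return out

# Two tiny fake grayscale "images": a dark 8x8 body, a bright 4x4 face.
body = np.zeros((8, 8), dtype=np.uint8)
face = np.full((4, 4), 200, dtype=np.uint8)
result = naive_face_overlay(body, face, top=2, left=2)
```

Blending with `alpha=0.8` gives pixel value 160 inside the pasted region while the rest of the image stays untouched, a crude stand-in for the seamless compositing that makes real deepfakes convincing.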

Furthermore, deepfakes have been employed to overcome language barriers, as exemplified by English soccer icon David Beckham in his Malaria No More campaign. Deepfake technology allowed Beckham to convey his message in nine different languages. Additionally, deepfakes are sometimes utilized for entertainment purposes, as seen in an art installation that permits users to take a “surreal” selfie with Salvador Dalí.

Final Verdict

The rise of deepfake technology poses challenges as fake ads misuse famous faces. Politicians propose laws like the No Fakes Act, but effectiveness is uncertain due to differing state rules. Catching culprits is tough, affecting not only celebrities but anyone online. Legal cases, like Kyland Young’s against Reface, reveal the complexity of balancing free speech and commercial use of identities.

Proposed laws and legal actions aim to address the problem, but the fast-changing technology and shifting legal interpretations make it a moving target. Collaboration among lawmakers, tech companies, and the courts will be crucial to protect people from deceptive deepfakes.
