Prashant Kumar

MrBeast and BBC stars used in deepfake scam videos

In the digital realm, deepfake videos have targeted YouTube’s MrBeast and BBC presenters Matthew Amroliwala and Sally Bundock, using AI to create deceptive content. A recent TikTok ad featured a fake MrBeast offering iPhones for $2, while a Facebook scam showed the BBC presenters apparently introducing Elon Musk to endorse an investment opportunity. Meta and TikTok removed the content quickly, but the incidents raise concerns about the platforms’ ability to combat deepfake threats. This article explores how these deepfake videos work, revealing the signs that distinguish them from reality as high-profile individuals grapple with the aftermath. How can the public discern fact from fiction in an era when technology blurs reality?

MrBeast and BBC stars used in deepfake scam

The world’s largest YouTuber, MrBeast, along with two BBC presenters, has become the target of deepfake videos aimed at deceiving unsuspecting individuals online.

Deepfakes utilize artificial intelligence (AI) to fabricate videos by manipulating facial or bodily features of a person. A recent instance on TikTok purported to feature MrBeast, falsely offering new iPhones for a mere $2 (£1.65).

Similarly, the likenesses of BBC personalities Matthew Amroliwala and Sally Bundock were exploited in a known scam. A Facebook video presented the journalists supposedly introducing Elon Musk, the billionaire owner of X (formerly Twitter), endorsing an investment opportunity. Previous deepfake videos of Musk falsely portrayed him giving away money and cryptocurrency.

After the BBC contacted Meta, the owner of Facebook, the content was removed. Previously, such videos had been flagged with a warning image highlighting false information, as reported by the independent fact-checkers Full Fact.

A spokesperson from Meta stated, “We don’t allow this kind of content on our platforms and have removed it,” emphasizing ongoing efforts to enhance systems and urging users to report rule-breaking content.

TikTok, where the MrBeast deepfake appeared, removed the ad within hours of it being uploaded. The associated account was also taken down for violating TikTok’s policies against “synthetic media” featuring the likeness of real individuals.

MrBeast, in a post on X viewed by over 28 million people, shared the fake video, raising concerns about social media platforms’ readiness to tackle the growing prevalence of deepfakes.

BBC News presenter Sally Bundock shared her experience of being deepfaked, expressing shock at the convincing video circulating on social media. The deepfake falsely presented her delivering a “breaking business news” story and directed viewers to a financial scam, claiming that a new investment project by Elon Musk could lead to significant returns, allowing individuals to quit their jobs. Bundock emphasized the scam’s unrealistic nature and the potential harm it could cause.

How To Spot a Deepfake Video

There has been a recent surge in notable instances of deceptive deepfake content, prompting concerns about the escalating capabilities of AI systems in generating highly convincing virtual replicas of real individuals. Tom Hanks, for instance, issued a cautionary message on Monday regarding a misleading advertisement featuring him endorsing a dental plan.

Typically, a key indicator that a video may be deceptive is its promise of something for nothing. However, the situation is more intricate in the case of the MrBeast video. The YouTuber, renowned for his generous giveaways, has distributed cars, houses, and cash, even gifting iPhones to trick-or-treaters last Halloween. This generosity could lead people to believe he is giving away devices online.

Nevertheless, astute viewers and listeners may discern subtle signs of deception. The scammers attempted to appear authentic by adding MrBeast’s name and a blue verification mark to the video, mimicking the badges used on various social media platforms. However, TikTok automatically displays the genuine uploader’s name beneath the TikTok logo, exposing the mismatch.

The unverified account that posted the video no longer exists. In videos featuring BBC presenters, errors are more noticeable. For instance, in the Sally Bundock video, the imposter mispronounces “15” as “fife-teen” and says “pro-ject” instead of “project.” Verbal errors, like spelling mistakes in scam emails, can be valuable in identifying fakes as technology advances and visual cues become less evident.

Deepfake creators used a genuine video of Sally Bundock discussing Mr Musk’s takeover of X but altered it to make her appear to discuss an investment opportunity. The Matthew Amroliwala video carries similar audio cues, with garbled sounds at the start of sentences. It also shows visual giveaways: on-screen text that does not look like genuine BBC News graphics, spelling errors, and phrasing out of line with BBC News style.

There is also a tell-tale visual glitch: at one point the fake Elon Musk appears to have an extra eye above his left eye, the kind of rendering error deepfakes commonly produce. When uncertain about a video’s authenticity, a golden rule applies: unless MrBeast or Elon Musk is physically present, there is no such thing as a free iPhone.

What does the law say about deepfakes?

The issue becomes intricate due to variations in laws across different countries, as highlighted by law professor Lilian Edwards.

Taking the example of an American company deceiving a British consumer through a Chinese platform, Edwards asks which country’s law would apply, a question that also complicates cross-border copyright infringement cases.

Potential legal avenues include defamation law or data protection law when a person’s voice or image is used without consent and without justifiable cause.

However, Edwards cautions against a blanket criminalization of deepfakes. Criminalizing the creation of any deepfake, she emphasizes, could inadvertently harm the special effects industry in film and criminalize many forms of artistic expression.
