
Scammers have turned to AI-generated images as they create fake fundraising pages for the Turkish and Syrian earthquakes. So far more than 37,000 people have died, a further 80,000 have been injured, and an estimated 1,192 buildings have been destroyed leaving people without shelter – and unscrupulous people are capitalizing on the tragedy using images generated by artificial intelligence. 

There has been a huge amount of controversy surrounding AI recently, from an AI image appearing on the cover of a photography magazine to an AI image fooling judges in a photo competition, but this latest scandal tips the ethical scales. In an attempt to pull on people’s heartstrings, scammers have been creating fake appeal pages that include AI-generated imagery of firemen “rescuing” toddlers from buildings or of children running away from explosions.

• These are the best AI image generators – please use them responsibly!

With fundraising efforts taking place all over the world to get lifesaving aid to the areas that need it most, scammers are using these fake appeal pages for their own benefit. Although it’s not uncommon to use PayPal as a means of collecting donations, many of these pages have very few followers, and the bulk of the donations appear to be self-made to make the scam seem more authentic.

Despite how good AI image generation is getting, it still struggles with one very specific thing: hands. And people have noticed that some of the photos on these pages depict rescuers with six fingers. Rather than reusing images of past natural disasters, scammers are using AI text-to-image generation to create photos that look like scenes from the aftermath of an earthquake.

At first glance, and from a distance, the photos look convincing, but up close they have a plastic, cinematic look that you certainly wouldn’t see in documentary-style photographs.

The BBC spotted one such scam advertising an appeal on Twitter, but when asked about the fake fundraiser, the account holder denied it was a scam: “My aim is to be able to help people affected by the earthquake if I manage to raise funds. Now people are cold in the disaster area and especially babies do not have food. I can prove this process with receipts.” So far, the BBC has not received any such receipts.

TikTok has also been targeted by scammers, who use the Live video feature to stream looped footage of the devastation in the hope that viewers will send digital gifts, which can be redeemed through TikTok for money.

A spokesperson for TikTok told the BBC, “We are deeply saddened by the devastating earthquakes in Turkey and Syria and are contributing to aid earthquake relief efforts. We’re also actively working to prevent people from scamming and misleading community members who want to help.”

Not only does this show a darker side to the uses of AI, but a darker side to humanity too. Some people will try to make easy money at any opportunity, with no regard for those whose lives depend on that aid. For now, AI images are still not perfect – but it’s worrying to think that, as AI models improve, it could become impossible to tell whether an image is authentic or fake, even in the most dire circumstances.

