With reports suggesting that adults across the US lost a record $10 billion to fraud in 2023, the Federal Trade Commission has warned of an increase in scams that use artificial intelligence.

It’s extremely important to know how to safeguard against AI-driven scams, which are expected to become more prominent over the next few years.

Eager to help adults around the world, Christoph C. Cemper, an AI expert speaking on behalf of AIPRM, identifies the most common forms of AI scams and how to avoid them.

The most common forms of AI scams 

  1. Deepfakes 

Scammers use deepfakes to create manipulated images, audio and video content. Cybercriminals compile large databases of images and videos in order to replicate the voice and appearance of an individual, usually somebody in the public eye.

Celebrities such as Martin Lewis have appeared in viral deepfake videos over the past year; one showed Lewis apparently endorsing a fake investment project from Elon Musk. More recently, amid circulating conspiracy theories, eagle-eyed users have suggested that recent images of Kate Middleton may be deepfakes.

To limit your chances of being stung by a deepfake, be very cautious about the personal information you share online, watermark your photos and enable strong privacy settings.

Christoph C. Cemper, on behalf of AIPRM, shares how to spot a deepfake:

“AI allows scam artists to produce very convincing materials, whether it be through text, images, videos or audio clips.”

“If the deepfake is in the form of a video clip, look for unnatural behaviour such as limited blinking and a lack of facial expression, which AI can find hard to mimic. Many deepfake videos rely on lip syncing, so monitor the mouth carefully to ensure speech looks natural.”

  2. Voice Cloning 

A form of deepfake AI, Voice Cloning replicates the voice of an individual in order to convince someone that they are having a genuine conversation with that person.

According to security firm McAfee, it takes only three seconds of audio for artificial imposters to create a convincing AI voice clone. This is especially concerning in an age when 53% of adults share their voice data online at least once per week via social media and voice notes, making it increasingly easy for cybercriminals to harvest the samples they need.

To reduce your chances of falling victim to AI scams such as Voice Cloning, limit the personal information you share about yourself, especially in voice recordings. It’s also advisable to verify a caller’s identity and to report any suspicious activity to the relevant authorities.

Christoph C. Cemper shares how to spot Voice Cloning scams:

“If you think you are being conned by a Voice Cloning scam, be sure to ask the caller for as much detail as possible, as only the individual they are pretending to be will know the correct answers.”

“Many Voice Cloning scams pretend to be family or friends in distress, so it’s wise to agree on a verbal safety question or phrase with loved ones that only they will know the answer to.”

“Be sure to listen for unexpected noises or changes in the caller’s tone of voice too, such as odd pauses that suggest you aren’t having a real-time conversation with the individual.”

  3. Verification Fraud

We’ve all become accustomed to using passwords and biometrics to access apps on our mobile devices. In Verification Fraud, scammers use AI-generated images and videos of non-existent people to deceive these security protocols, gaining access to financial and other sensitive information.

Christoph C. Cemper shares how to spot Verification Fraud:

“It’s important to spend time educating yourself on recognising and avoiding AI scams such as Verification Fraud. Requests for personal information, unrealistic pricing or pressure to act quickly are all major red flags.”

Credit: https://www.aiprm.com/
