
Recent research found that one in 12 Britons has fallen victim to AI scams, and with the technology continuing to develop and advance, this figure is likely to rise.

Due to AI, long-standing scam awareness tips, such as checking for poor spelling, grammar, or clumsy use of English, are no longer reliable. ChatGPT allows fraudsters to create convincing emails and messages with very little effort.

AI also allows fraudsters to mimic voices, and even bypass certain forms of verification. As such, the anti-fraud experts at Scams.info wanted to raise awareness of the most common AI scams, and offer their expertise so that consumers can avoid becoming victims of fraud.

Top AI scams to watch out for, and how to avoid becoming a victim

  1. Investment Fraud

Scammers play on the hype around AI and use it to their advantage with fledgling investors. Social media platforms like TikTok have seen sharp increases in financial disinformation, often spread by faux financial gurus who profit from driving referrals to investment opportunities without giving clear information. There have been instances of platforms claiming to use AI to generate returns for investors; these are often scams designed to trick investors into handing over money.

Nicholas Crouch at Scams.info says: “Watch out for investments that promise high returns with little risk, and be sure to do comprehensive research before handing over money. Budding investors should also be wary of opportunities that ask you to recruit new investors; these often operate as Ponzi or pyramid schemes which, while benefiting those at the top of the pyramid, very rarely benefit anyone else involved. And finally, be conscious of the financial information you learn online, particularly on social media; there is an increasing amount of financial disinformation around investing that lures investors into sophisticated scams.”

  2. Impersonating Loved Ones to Extort

Perhaps one of the cruelest forms of AI scamming is where criminals replicate the voice of a loved one and pretend they are in danger in order to extort money. This can be anything from a WhatsApp message to an actual phone call. Scammers take audio samples from social media and use them to create messages that can manipulate family members into sending over money.

Nicholas Crouch says: “It’s vital that people protect their social media accounts to prevent scammers from accessing recordings of their voice and details of their wider family. For people with public social media accounts for content creation purposes, try creating a family ‘password’ or ‘codeword’ that can be used to verify identity verbally. Even those with private accounts may choose to do so as a precautionary measure. This password should be kept top secret, limited to family and close members of your support network, and shouldn’t be anything predictable from your social media accounts, such as family names or pets. If you struggle to remember the password, keep a physical note of it rather than a digital one.”

  3. Bypassing Security with Voice

Many banks throughout Europe and the US rely on voice identification for telephone banking; you might be familiar with ‘using your voice as a password’, which has been likened to a fingerprint in terms of uniqueness. And while this seems secure, the rising ability to clone voices with AI means this form of verification has been found to be penetrable.

“Many banks use multi-phase verification, so in addition to your voice you’ll likely be asked to confirm the sum of a recent transaction, or where an amount was spent, making it unlikely that fraudsters would be able to bypass this step. However, if you’re concerned, you can reach out to your bank for advice, as well as following the precautions mentioned above for protecting your voice,” according to Nicholas Crouch.

Courtesy: https://www.scams.info
