The emergence of AI systems such as ChatGPT marks an important development because, for the first time, the general public can directly access and exploit what it understands to be AI technology.
How AI works, and how it might be useful to the everyday person, remains a mystery to most. The vast majority will no doubt tinker with it but otherwise sit on the sidelines and watch as this new world develops. They will be, or at least hope to be, mere observers as the inevitable peaks, troughs, scandals and triumphs of AI unfold.
Fraudsters tend to be among the more innovative and bold of us, and there are already clear indications that they are not sitting on the sidelines when it comes to AI. While the presumption might be that AI will lead to a wealth of futuristic hi-tech fraud, the public is more likely to fall victim to well-established scams that have been enhanced using AI technology.
Standard phishing emails, for example those purporting to come from a bank, can be made more realistic and less prone to the 'easy to spot' red flags such as poor spelling or grammar. It also means more people can create convincing phishing emails, regardless of their grasp of English. The result is phishing email of both higher quality and greater quantity, which matters because scammers expect only a tiny success rate, so volume is everything.
Spear phishing is highly targeted and relies on tailoring a communication to an individual so convincingly that they believe it must be genuine. AI can be used to scale up spear phishing by automating the information-gathering process, scanning for publicly available information about a person, typically from their social media profiles.
Perhaps a more disturbing development is the emergence of voice cloning scams, in which AI is used to find samples of an individual's voice online and create fake messages of distress that are then sent to loved ones. Previously such scams were carried out by email or text/instant message. Typical voice messages claim the person has been in a car accident, been robbed, lost their wallet or needs help abroad.
So what can people do to protect themselves?
Just as these are enhanced versions of standard scams, so enhanced diligence is required of the public. People should be sceptical of any electronic communication that arrives unexpectedly, even more so if it places time pressure on them to complete some act, and more so again if it contains a link or requires the divulgence of personal data. Anyone who suspects an electronic communication is not genuine should verify it independently by looking up the company or person in question (never using any contact details or websites provided in the message itself) and contacting that company directly to confirm whether the communication is genuine. Most companies will never contact customers by email or text to solicit personal data.
For the voice scam it is much harder to respond in a measured way if you believe a loved one is in difficulty. Awareness of such scams is the best way to stop people falling for them. Sadly this means adopting a sceptical mindset even when a voice message has seemingly been received from a distressed loved one. The rules of independent verification still apply: the recipient should use tried and trusted ways of communicating with the person the distress claim has been made about, or, failing that, with others close to them who may be able to confirm or debunk the message very quickly. Never use any contact details provided through the voice message or any connected communication.
The point is that AI is already affecting the types of everyday scam we are used to, making them both more convincing and more common. It remains in its infancy as a tool for the masses to play with and exploit, and only a select few can predict on any informed basis where it will go next. The rest of us simply have to be more alert and sceptical than ever before.
Mike Jackson