October 13, 2019
Author: John Martin, Senior Security Architect, IBM New Zealand
Earlier this year, the head of a UK-based energy firm received a call from the CEO of his German parent company. The CEO issued an urgent directive: send funds to one of the company’s suppliers in Hungary within the hour. The UK executive actioned the order immediately and transferred €220,000 as requested.
It seemed like an everyday business scenario, but in reality it was far from it. That’s because the caller wasn’t the CEO at all, but rather a fraudster who used Augmented Intelligence to mimic the CEO’s voice. The long-predicted nightmare scenario of law enforcement agencies – that criminals would use Augmented Intelligence to automate cyberattacks – had come true. It’s known as ‘whaling’, and it’s uncharted territory for security professionals.
An evolutionary leap in social engineering
As improved security has made it harder to gain physical access to buildings or to hack into computer networks, criminals have increasingly targeted the people in an organisation with social engineering attacks.
But the incident described above represents a major escalation, using not just the persona but the actual voice of a major figure in an enterprise to convince others to bypass basic security protocols.
How would the people in your organisation cope if your CEO’s voice were artificially reproduced to make a request, with sufficient believability that they immediately associated the voice on the telephone with the real person? Have you implemented and practised an incident response plan for such a situation?
Simple steps for countering ‘whaling’
Should you find yourself in a similar high-pressure position, there are steps you can take to verify that the person you are talking to actually is who they claim to be, including:
- Ask questions that only you and the caller would know the answers to, like the name of their pet or when you last saw them in person.
- Do they immediately attempt to put you under more pressure by stating they don’t have time for questions? Listen to your instincts – does the conversation feel ‘wrong’?
- Do their responses differ from the person’s normal attitude? Has their tone changed, or does it sound unnatural?
- Follow the Cold War maxim of “Trust but verify” – establish that the caller is genuine by having them confirm as many facts about themselves as possible.
- If you can, record the telephone conversation on your mobile or another device.
- Ensure your incident response plan includes these scenarios and make sure they’re regularly practised, so it becomes automatic to verify that a caller is genuine.
- Avoid saying the word “yes” in a suspect conversation, as this can be played back later to confirm an agreement.
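The human checks above can also be mirrored in a simple call-screening checklist. The following is a toy sketch, not IBM guidance or a product feature; every function name, challenge question, and scoring rule is invented for illustration:

```python
# Toy caller-verification checklist mirroring the steps above.
# All names, questions, and rules here are invented for illustration.

CHALLENGE_QUESTIONS = {
    "What is the name of your pet?": "rex",
    "Where did we last meet in person?": "berlin office",
}

def verify_caller(answers, refused_questions=False, pressured_for_speed=False):
    """Return (verified, reasons). Any single red flag is grounds to
    end the call and ring back on a known-good number."""
    reasons = []
    if refused_questions:
        reasons.append("caller refused verification questions")
    if pressured_for_speed:
        reasons.append("caller applied time pressure")
    for question, expected in CHALLENGE_QUESTIONS.items():
        given = answers.get(question, "").strip().lower()
        if given != expected:
            reasons.append("wrong answer to: " + question)
    return (len(reasons) == 0, reasons)

# A caller who answers correctly and applies no pressure passes.
ok, why = verify_caller({
    "What is the name of your pet?": "Rex",
    "Where did we last meet in person?": "Berlin office",
})
print(ok)  # True

# Time pressure alone is enough to fail verification.
ok, why = verify_caller({
    "What is the name of your pet?": "Rex",
    "Where did we last meet in person?": "Berlin office",
}, pressured_for_speed=True)
print(ok, why)  # False ['caller applied time pressure']
```

Note that the design deliberately treats time pressure as a failure condition in itself: as the steps above suggest, urgency is the attacker’s main lever, so it should never lower the bar for verification.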
Technological countermeasures to ‘whaling’
Soon we may see automatic speaker recognition embedded into mobile phones and landlines, which could recognise a speaker based on analysis of their voice patterns. This would allow a computer to recognise a person who was talking to it earlier and continue the conversation where it left off, just as humans do.
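At its core, automatic speaker recognition compares a sample of the caller’s voice against an enrolled reference. The sketch below illustrates only the comparison step, using cosine similarity between voice “embeddings”; real systems derive those embeddings from audio with trained models, and the vectors and threshold here are invented for the example:

```python
import math

# Toy speaker-verification sketch: compare a caller's voice embedding
# against an enrolled reference using cosine similarity. Real systems
# derive embeddings from audio with trained models; these vectors and
# the threshold are invented for illustration.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_same_speaker(enrolled, sample, threshold=0.95):
    return cosine_similarity(enrolled, sample) >= threshold

ceo_enrolled = [0.9, 0.1, 0.4, 0.7]        # stored reference embedding
todays_caller = [0.88, 0.12, 0.41, 0.69]   # close to the reference
impostor = [0.1, 0.9, 0.2, 0.3]            # far from the reference

print(is_same_speaker(ceo_enrolled, todays_caller))  # True
print(is_same_speaker(ceo_enrolled, impostor))       # False
```

The caveat, of course, is that a well-cloned voice may produce an embedding close enough to the genuine one to pass such a check – which is precisely why the procedural verification steps above remain essential even once this technology arrives.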
In the meantime, protecting your organisation against ‘whaling’ comes down to following verification procedures, using common sense, and listening to your instincts.