
Not again! Another Phishing Simulation Goes Awry – Key lessons from the West Midlands Railway incident

By Omer Taran
May 12, 2021 | 5 MIN READ

Key lessons from the West Midlands Railway incident and other common mistakes in employee training

How resilient are your employees in the face of a phishing scam? That’s the question many security teams ask themselves on a regular basis. One transport organization, West Midlands Railway in the UK, was looking to answer that same question when it created a phishing simulation to test its employees this month.

So far, so good. After all, testing your employees and establishing your high-risk group is an important part of validating your organization’s security posture. However, the simulation, which promised a financial reward for the hard work employees had invested during the COVID pandemic, soon backfired: employees saw it as a decidedly unfunny joke at their expense, and naturally, the train company soon hit the headlines.

This is not the first time a phishing simulation email has aggravated employees. It happened to ABN Amro back in 2017, and more recently to GoDaddy and the Tribune. These are just the cases that made the news; the problem is far more common than you may think, and it suggests that security teams are often unaware of the psychological aspects of training in general and of phishing simulations in particular.

We have only basic facts about the newest incident – the West Midlands case – but there is enough to learn from. As reported in the press, the company sent an email to employees announcing that its Managing Director, Julian Edwards, was offering a one-off payment as a reward for their hard work during the COVID-19 pandemic. Employees who clicked the link leading to the supposed thank-you note were soon sent a second email letting them know this was only an exercise.

In response, the TSSA employee union has called the security training “crass and reprehensible behavior.”

So what made this training email so hurtful? Let’s take a look at the errors of this specific simulation, and peel back what went so badly wrong.

Thinking like a hacker may have its downside

When you create the content of a phishing simulation test, you may ask yourself: “What would encourage my employees to click on this link?” or “Whose voice should I imitate, and what incentive should I offer to sound most believable?” You might recite the common security mantra that to beat a hacker, you need to think like a hacker.

Of course, you want your phishing simulations to be believable and to catch employees off guard. However, unlike a hacker, you also go to lunch with these employees and rely on their goodwill to report the security flaws they uncover. Hackers have no relationship with the organization; the security team is deeply involved with employees, and backlashes like this one usually result from failing to understand that complex dynamic.

In this case, the train company also failed to consider the context of what it was sending its workers. TSSA General Secretary Manuel Cortes spelled out just how difficult a year West Midlands Railway staff have had: “Our members have made real sacrifices these past 12 months and more. Some WMT staff have caught the disease at work, one has tragically died, and others have placed family members at great risk.”

We all know how hard the pandemic hit, and that public-service employees were on the front line keeping economies running. A hacker would naturally exploit this to their advantage, but a security team inside the organization cannot simply “play the hacker”, or it will find itself winning the battle but losing the war. It’s important to realize that there is no such thing as neutral content: it will always be interpreted by your employees based on factors such as internal culture, recent events, hierarchies, and more. It’s the security team’s job to find and consider all of these factors when creating security awareness training.

Phishing is not a technical test, it’s a cognitive one

Phishing doesn’t succeed because DKIM isn’t aligned or because employees fail to hover over links and inspect the URL. Phishing succeeds when a split-second decision goes wrong. That split-second decision happens when we look at our cluttered inbox and have to decide: should I open this particular email or not? It’s not a well-thought-out decision but an instinctive one, based on System 1 thinking.

Transforming instincts is achievable, but it requires security teams to step away from their technical comfort zone and the “think like a hacker” approach, and into the world of psychological cues and how they affect employees. A recent example: one of our customers debated with managers in India whether a specific phishing simulation about the availability of COVID vaccines was appropriate for that part of the world. Such a discussion factors in employees’ reactions before a simulation is ever sent. Sure, one can argue that hackers might send exactly that kind of phishing email and that it’s our role to prepare employees for any scenario. But when employees perceive a message as hurtful, the discussion becomes about the security team’s ethics and competence rather than about phishing and its dangers. It drives employees away from learning and lets them feel they failed the test only because it wasn’t fair, so there was nothing to learn. This is a lose-lose situation – and that’s exactly what happened in the West Midlands case.

Not everything that can be done should be done

As mentioned, this isn’t the first time a company has generated negative feedback with phishing simulations as part of security awareness training. Recently, we published an article on Cloud Security Alliance analyzing the GoDaddy phishing simulation that was dubbed “the cruelest prank you could play on employees.” In an approach similar to the one behind the WMT incident, GoDaddy announced a $650 holiday bonus that was in fact a phishing simulation.

In both cases, training was not provided immediately after the mistake was made; employees left the initial email with a good feeling about their workplace, only to later develop a negative sentiment toward themselves for being fooled and toward the company for putting them in that position. In the GoDaddy case, we know the follow-up took two days; in the WMT case, we don’t know how much time elapsed between the initial email and its follow-up.

In both cases, these emails could have been sent by hackers, but they shouldn’t have been sent by security teams. The case for impersonating an internal official or the CEO is real, and so is the case for faking a financial benefit; but the way they were combined here completely derailed the learning effect of the training.

In the face of the actual result, does it even matter that hackers could have sent such messages?

There are two sides to the equation. You should know yours

You can’t trick your staff into a better security posture. For security awareness training to work, you need employee engagement. By design, this requires your employees to buy into the program. In phishing simulations, it’s important to remember that security teams and employees are on the same side, against hackers. Some security teams feel it’s them and the hackers against the employees – and that’s where things go terribly wrong.

Ask yourself – do I know how our employees learn best? Do we consider employee feelings when we design our security solutions and programs? Have we considered the downsides of our simulations of choice? And most importantly: will that phishing simulation generate a real learning experience? If the answer to any of these questions is NO, stop and reevaluate your security training approach.

Want to avoid such common mistakes, and learn more about creating security awareness training at scale? Talk to our experts and schedule a demo of our autonomous training platform, built by training experts.
