How do I know if my security awareness training is working? This is one of the most common questions that we get asked, and it’s usually because our customer has traditionally relied on click rate to measure success. Let’s look at the fallacy of the click rate metric in more detail.

What is Click Rate Measuring?

Click rate measures how many employees click on your phishing simulations, and often starts out as the champion of security teams. After all, if you’re looking to validate the necessity of security awareness training – there’s no better tool than a data point that proves that 30% or 40% of your employees are falling for phishing simulations.

Now let’s fast forward to your next meeting and the one after that. What’s your goal? Is it to lower the click rate as close to 0% as possible? If so, that means one of two things:

  1. That none of the employees are engaging with phishing simulations, which means they aren’t seeing the training content.
  2. That the phishing simulations sent to employees are so obvious that they can all identify them and avoid clicking them. In reality, there are no “easy” or “hard” phishing emails; they are all context-based (more on this later). So this is a problematic assumption that can only indicate an error in the training method.

On the other side of the coin, if your click rate doesn’t fall, how do you show the ROI of your training program? And if the click rate rises, more and more employees are clicking on the simulations, so learning is clearly not occurring and you appear to be doing something wrong.

Add Context to Your Security Awareness Program

When you’re measuring the success of your security awareness program, you want to think about context. This is intrinsically linked with how much data you have as a company. Most programs measure click rate, which shows mere participation, because they simply don’t have enough data to measure with context and prove employee progress over time.

Let’s say your company sends out 3 phishing simulations over a year. There’s no way of knowing whether one was sent while an employee was on holiday, while another employee clicked because they were new to the company and a bit ‘green’, or whether another simply missed the email altogether in a flurry of meetings and emergencies. Even if all three simulations are clicked on, is one email every four months really enough to tell you anything valuable about a daily threat?

With click rate, you’re only seeing how many times links are clicked, a metric you constantly have to balance when reporting to the board, caught between the expectations of improvement and of efficacy described above.

Look for Progress, not Participation

In contrast, when progress is measured instead of participation, your teams are getting a clear view of the benefits of a security awareness training solution over time.

To make this happen, your awareness program needs to be continuous, with at least 10-12 data points for each employee in a single year. This should also be measured over time, so that an employee who falls for simulations 1, 2, and 3 is not weighted equally in risk to an employee who falls for simulations 10, 11, and 12, for example.
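The weighting idea above can be sketched in a few lines: discount older simulation outcomes so that recent failures count for more than early ones. The function name, the decay factor, and the scoring scheme below are illustrative assumptions, not CybeReady’s actual algorithm.

```python
def risk_score(results, decay=0.8):
    """Recency-weighted failure score for one employee.

    results: chronological list of simulation outcomes, True = clicked
    (failed). Older outcomes are discounted by `decay` per step, so a
    failure on simulation 12 weighs more than one on simulation 1.
    The decay factor is an illustrative assumption.
    """
    n = len(results)
    # Most recent outcome gets weight 1.0; each step back is discounted.
    weights = [decay ** (n - 1 - i) for i in range(n)]
    score = sum(w for w, failed in zip(weights, results) if failed)
    return score / sum(weights)  # normalize to the 0..1 range

# Two hypothetical employees over 12 monthly simulations:
early = [True, True, True] + [False] * 9   # failed sims 1-3, then learned
late  = [False] * 9 + [True, True, True]   # failed sims 10-12

print(risk_score(early) < risk_score(late))  # True: recent failures weigh more
```

With equal weighting both employees would score identically (3 failures out of 12); the decay is what lets the score distinguish someone who learned early from someone still failing today.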

By using this data, you can then use this context to identify more valuable metrics such as:

  • High-risk employees: The number of employees who are not learning at a good pace and avoiding more scams over time.
  • Resilience: The level of security awareness in your company, or even inside specific teams.
  • Mean time between failures (MTBF): Proving that employee learning is occurring and that retention is improving over time. In machinery, MTBF measures the average time elapsed between one failure and the next. In the security awareness industry, MTBF shows the resilience of an organization. If you see that employees have fewer simulation failures and that these mistakes are getting fewer and further between over time, your employees are gaining knowledge from the program, and best of all, retaining it.
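As a rough illustration of the MTBF metric described above, the sketch below averages the gaps, in days, between one employee’s consecutive simulation failures; a rising value means failures are getting further apart. The function name and the sample dates are hypothetical.

```python
from datetime import date

def mtbf_days(failure_dates):
    """Mean time between failures, in days, for one employee.

    failure_dates: chronologically sorted dates on which the employee
    clicked a simulated phishing link. At least two failures are needed
    to form an interval.
    """
    if len(failure_dates) < 2:
        return None  # no interval to measure
    gaps = [(b - a).days for a, b in zip(failure_dates, failure_dates[1:])]
    return sum(gaps) / len(gaps)

# Hypothetical employee whose failures are spacing out, i.e. retention
# is improving: gaps of 30, 60, and 120 days.
fails = [date(2021, 1, 10), date(2021, 2, 9),
         date(2021, 4, 10), date(2021, 8, 8)]
print(mtbf_days(fails))  # 70.0
```

Tracking this number per employee, or averaged across a team, turns individual simulation results into the resilience trend the bullet above describes.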

Still Considering Click Rate to Measure Success? Read the full video transcript here:

Click Rate is Detached From The Learning Curve

“Most programs measure participation and not progress. They measure how many employees have enrolled in the program, how many employees watched the videos, how many employees interacted with their phishing simulations or with other security awareness content, but that’s not progress. Progress is being able to show a change in behavior. Now, most measurements, for instance phishing simulations, are about the click rate, but click rate is detached from progress. It’s detached from learning curves.

Measuring Change in Employee Behavior Over Time

Click rate doesn’t show us if an employee was better before or after. It’s only a point in time, and it doesn’t distinguish between different risk groups within the organization, the new employees, the repeat offenders; it doesn’t show us any of this. If we want to measure progress, we need to take it a step beyond the click rate. We need to be able to measure behavioral change. Not how many times an employee took the training, and not how many videos they watched, but did they change their behavior?

How Many Data Points are Enough?

The only way to do that is to actually test them and run them through phishing simulations month after month. Only in that way will we have enough data points to build the right measurements. What counts as enough data points? At least 10 different data points per employee in a year. That would allow us to build a learning curve. Getting enough data points will allow us to prove to ourselves, but also to our managers, that our program is really working and changing behavior.”

Want to learn more about how the CybeReady algorithm identifies high-risk employees? Schedule a call with one of our security experts. 

Author:
Omer Taran
March 16 2021