Benchmarking in Security Awareness

The following is a personal opinion article on the use of Benchmarking in Security Awareness.

In my more than seven years working at SMARTFENSE, I have often been asked questions like: what percentage of users at SMARTFENSE’s clients fall for Phishing traps?

And the goal is always the same: to compare one’s own organization with the rest.

As if the fact that organizations have an average Phishing link click rate of 25% could serve as consolation when my own organization also hovers around that concerning figure. Or as if having a lower percentage than others were enough to consider my awareness efforts successful and sufficient.

This, which is basically Benchmarking applied to Security Awareness, has two problems from my point of view:

  • Benchmarking numbers are not always what they seem
  • It’s difficult to find an organization that is similar enough to ours for the comparison to make sense

Benchmarking numbers are not always what they seem

To keep the reading light, let’s assume a very simple example based on end-user engagement with awareness video campaigns.
Usually, we will have metrics like:

  • Number of users who were assigned to the campaign
  • Number of users who started watching the video
  • Number of users who finished watching the video

Now let’s assume two organizations:
Organization 1

  • Users assigned: 20,000 (100%)
  • Videos started: 10,000 (50%)
  • Videos finished: 5,000 (25%)

Organization 2

  • Users assigned: 50 (100%)
  • Videos started: 45 (90%)
  • Videos finished: 40 (80%)

What does Benchmarking indicate here?

In cases like this, if our provider does not clarify how the figure is calculated, we can end up with results that admit very different interpretations:

Average of interactions

Let’s analyze the Benchmarking of finished videos. For that, we sum the number of users who finished the videos across both organizations and divide it by the total number of assigned users:
Organization 1: 5,000 videos finished

Organization 2: 40 videos finished

Total videos finished across both organizations: 5,040

Total assigned users across both organizations: 20,050

Average of users who finish the videos: 5,040 / 20,050 ≈ 25.14%

Average of percentages

Another possible calculation, however, is to focus on the proportions and detach from the number of users, for example:
Organization 1: 25% of users finish the videos

Organization 2: 80% of users finish the videos

Average of users who finish the videos: (25% + 80%) / 2 = 52.5%

As we can see, depending on the approach taken to calculate the Benchmarking, the results can vary significantly.
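
To make the difference concrete, the following is a minimal sketch in Python that computes both versions of the benchmark using the made-up figures from the example above (the variable names are purely illustrative):

```python
# Two ways of computing the "benchmark" of finished videos,
# using the made-up figures from the example above.
orgs = [
    {"name": "Organization 1", "assigned": 20_000, "finished": 5_000},
    {"name": "Organization 2", "assigned": 50, "finished": 40},
]

# Pooled average: total finished divided by total assigned.
# The largest organizations dominate the result.
pooled = sum(o["finished"] for o in orgs) / sum(o["assigned"] for o in orgs)

# Average of percentages: every organization weighs the same,
# regardless of how many users it has.
rates = [o["finished"] / o["assigned"] for o in orgs]
unweighted = sum(rates) / len(rates)

print(f"Pooled average:         {pooled:.2%}")      # ~25.14%
print(f"Average of percentages: {unweighted:.2%}")  # 52.50%
```

Neither figure is wrong in itself; the point is that a single published benchmark can hide two very different calculations, one dominated by the largest organizations and one that ignores size completely.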

Filtering by industry and number of users

One common way to reduce some bias that can come from an example like the previous one is to compare our organization against others in the same sector and similar number of employees.
This seems to be a very good idea, but when we talk about awareness, it’s not that straightforward. As mentioned above, “it’s difficult to find an organization that is similar enough to ours for the comparison to make sense.”

In Security Awareness, the industry and number of users are not sufficient filters for comparing ourselves against another organization.

Let’s remember that with our awareness programs, we can have different objectives even in similar organizations:

  • Regulatory compliance
  • Development of safe habits
  • Development of a security culture

According to these objectives, the way to manage the awareness program can vary in factors like:

  • Level of difficulty of the content used
  • Number of campaigns sent per year
  • Channels used to reach the user
  • Messages tailored to different user groups based on their age, position, interests, etc.
  • Use of complementary techniques such as gamification
  • Etc.

And not only that. Let’s remember that we are dealing with people, and the results of awareness campaigns can be affected by some general factors such as:

  • Number of users with technical knowledge
  • Age of the users
  • Personal and family situations of each user
  • Heterogeneity of users
  • Etc.

Moreover, the results of campaigns can also be determined by the reality that users experience in that organization:

  • Work environment
  • Work overload
  • Stress levels
  • Stage the organization is going through
  • Etc.

And also by social and political issues of the city and country in which each individual lives.
Finally, and no less important, with awareness we usually aim to manage the risk of social engineering. And the acceptable risk level for our organization does not have to be the acceptable level for others. So, in terms of risk management, the results that may be good for one organization could be unacceptable for another.

Some specific examples

Let’s assume that we have filtered to obtain Benchmarking information comparing our organization with another in the same sector and a similar number of users.

What use is it to know the average attachment-opening rate in our industry?

Perhaps some of the organizations we compare ourselves with use low-difficulty content, easily detectable by the user, with the sole purpose of meeting a certain number of annual simulations.
Perhaps some of these organizations do not complement the simulations with awareness actions such as sending videos and interactive modules, and their users are not motivated with gamification techniques to better absorb the content and improve their habits.

A cold figure like “20% of users in organizations of your size and sector open unsolicited attachments” actually provides very little value for understanding the current reality of our organization, and even less for setting objectives for our awareness program.

What use is it to know the average exam pass rate in our industry?

Perhaps some of the organizations we compare ourselves with have a larger number of technical users who find it easy to pass exams on certain topics.
Additionally, the average age of users in many of those organizations is around 35 years, which means their users have a close relationship with technology and can more easily recognize certain suspicious situations.

If in our organization the majority of users do not have technical knowledge and the average age is 45 years, the reality can be very different. And having a lower passing percentage does not mean we are doing things wrong.

Why do we want to compare ourselves with other organizations?

According to the previous analysis, Benchmarking is not a useful or representative parameter for guiding our awareness plans and only serves to satisfy curiosity about how others are performing in their campaigns.
In light of this, it’s worth analyzing why this need to compare ourselves with others arises and how we can satisfy it through a better understanding of our own results, which is ultimately what will allow us to improve.

1. Understanding our current situation

In order to compare ourselves with others, we need some level of information about them, which means measuring our results against their benchmarks.
And here lies an interesting paradox: if we rely on their information to understand our current situation, we are more likely to reproduce their reality instead of constructing our own.

Moreover, what we need to understand our current situation and its improvement is clear data, not comparative figures.

If we have the percentage of users who fell for a Phishing trap, it should make us focus on how to improve that percentage rather than on trying to find out whether it is good or bad compared with others.

2. Having clear objectives

As I mentioned before, when we analyze the objectives of awareness programs, it becomes clear that we should never set our objectives based on those of other organizations.
Instead, we should create objectives that respond to our own reality based on our own risks and the situation we want to achieve.

Again, focusing on the real problems we have rather than on arbitrary numbers of others will help us set realistic objectives that can be achieved and that truly reflect an improvement in the situation of our users.

3. Measuring progress

Benchmarking, as I mentioned before, serves only to satisfy curiosity. However, we do need to know what progress looks like in our own organization to understand whether we are improving or not.
For this, we must create our own key indicators that allow us to know if we are making progress with our users. This requires a prior analysis of the risks in our organization, the objectives of the program, and a good measurement plan.
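
As a purely hypothetical sketch of what such an internal indicator could look like (the campaign data and field names below are invented for illustration and do not come from any real program), one could track the click rate of each Phishing simulation campaign over time and compare it only against the organization’s own previous results:

```python
# Hypothetical internal KPI: Phishing click rate per simulation campaign.
# All campaign data below is invented for illustration only.
campaigns = [
    {"name": "2024-Q1", "sent": 480, "clicked": 120},
    {"name": "2024-Q2", "sent": 495, "clicked": 94},
    {"name": "2024-Q3", "sent": 502, "clicked": 71},
]

previous = None
for c in campaigns:
    rate = c["clicked"] / c["sent"]
    trend = "" if previous is None else f" ({rate - previous:+.1%} vs. previous campaign)"
    print(f"{c['name']}: click rate {rate:.1%}{trend}")
    previous = rate
```

The value of an indicator like this lies in the trend relative to our own history and objectives, not in how it compares with an external average.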

4. Learning from mistakes

Lastly, and equally important, when we manage a security awareness program, we need to learn from our mistakes. Understanding why users fell for a trap or failed an exam is vital to guide future campaigns.
Here, in our context, our own mistakes will give us more information than the averages of other organizations. It is by learning from our errors that we can better shape the future of our programs and avoid repeating them.

Conclusion

In conclusion, Benchmarking has little usefulness for our awareness programs. What truly matters is understanding our users and the reality of our organization.
If you want to compare yourself with others, analyze their approaches and find out how to incorporate them into your program. However, it is crucial to always focus on what really matters: understanding your own risks, setting objectives based on them, and measuring your progress.

Nicolás Bruna

Product Manager at SMARTFENSE. His mission at the company is to improve the platform day by day and to evangelize about the importance of awareness. He has written two whitepapers and more than 150 articles on managing the risk of social engineering, building secure cultures, and regulatory compliance. He is also one of the authors of the OWASP Ransomware Guide and the Ransomware Cost Calculator, among other free resources.
