Lessons Learned: Human Factors in Security Awareness


We’re hosting a new series highlighting members of our Cyber Security Awareness Manager community and their lessons learned while creating and running awareness programs that go beyond checking the box and actually make an impact.

This week we're excited to feature Robin Bylenga, a human factors and internal threat management specialist. She currently serves as Human Factor Performance Lead for scoutbee.

What are human factors, and how do they affect performance from a security standpoint? In a very simplistic definition, it's looking at the people, processes, and technology and ensuring everything is set up with the human in mind.

Human Factor Risk Assessment

Every business typically performs a business risk assessment covering the technological aspects - firewalls, endpoints, etc. - along with tools that help monitor user behavior, like what is being downloaded, permission levels, and so on. A human factor risk assessment adds a human element to the interpretation of the technical data those tools provide. For instance, it can flag an employee whose changing routines might signal an upcoming departure from the company, or explain that a login at 3 am was just a new dad trying to squeeze in a little work while feeding the newborn.
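To make that idea concrete, here is a minimal sketch in Python of how human context might be layered onto raw behavioral alerts. This is an illustration only, not Robin's actual tooling; the alert fields, names, and context entries are all hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Alert:
    """A raw behavioral alert produced by a monitoring tool."""
    user: str
    event: str  # e.g. "off-hours login" or "bulk download"
    timestamp: datetime

# Hypothetical human context a manager or HR partner might supply.
HUMAN_CONTEXT = {
    "dana": "new parent, often online during night feedings",
    "sam": "resignation submitted, last day in two weeks",
}

def triage(alert: Alert) -> str:
    """Interpret a technical alert in light of what is known about the person."""
    context = HUMAN_CONTEXT.get(alert.user)
    if context is None:
        return f"{alert.event} by {alert.user}: no human context on file, review manually"
    return f"{alert.event} by {alert.user}: known context ({context}), adjust priority"

print(triage(Alert("dana", "off-hours login", datetime(2024, 5, 1, 3, 0))))
```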

As Robin likes to put it, "Know your people, people!". 

Human Factors Analysis and Classification System adapted to Cyber

What is the Human Factors Analysis and Classification System, or HFACS? It's an investigative tool developed by behavioral scientists for the United States Navy to assess why its aviation programs were experiencing more accidents than their counterparts in other branches of the military.

HFACS is applied post-incident: it identifies the points leading up to an accident in order to determine the underlying cause at the organizational level. The creators of the system, Drs. Shappell and Wiegmann, applied a scientifically valid framework for accident investigation developed by Dr. James Reason.

Dr. Reason's model is known as the "Swiss cheese model" of accident causation. Each slice of cheese represents an organizational barrier created to prevent unwanted events (i.e., a layer of security), and the holes represent lapses in those defenses. When the holes in successive slices line up, an incident slips through every barrier - and those connected holes leave a trail back to the underlying cause at the organizational level.

Working backwards from an incident or accident, the four levels of organizational barriers are:

Active Failure (Incident) is preceded by...

4. Unsafe Acts which were preceded by...

3. Preconditions that were preceded by...

2. Unsafe Supervision which was influenced by...

1. Organizational Influences (aka underlying organizational cause)

Robin shared the scenario of a plane crash as an example for understanding this framework. The crash is the active failure. What was happening in the moments just before the crash constitutes the unsafe acts, and before those, the existing preconditions. These in turn were preceded by a failure at the supervisory level that was left unaddressed. Finally, there is the initial cause, identified at the organizational level. The holes at each preceding level are latent, dormant conditions that set the stage, so to speak, for the active failure (the plane crash) to occur.
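As a rough illustration - my own sketch in Python, not an official HFACS artifact, and with entirely hypothetical findings - the four levels can be modeled as an ordered taxonomy that an investigator walks backwards through from the active failure:

```python
from enum import IntEnum

class HfacsLevel(IntEnum):
    """The four HFACS levels, ordered from closest to the incident (4)
    back to the underlying organizational cause (1)."""
    UNSAFE_ACTS = 4
    PRECONDITIONS = 3
    UNSAFE_SUPERVISION = 2
    ORGANIZATIONAL_INFLUENCES = 1

def walk_back(findings: dict) -> None:
    """Print the causal chain from the active failure back to the org level."""
    print("Active Failure (incident)")
    for level in sorted(findings, reverse=True):  # levels 4 -> 1
        label = level.name.replace("_", " ").title()
        print(f"  preceded by {label}: {findings[level]}")

# Hypothetical plane-crash findings, one per level.
walk_back({
    HfacsLevel.UNSAFE_ACTS: "crew continued the approach in poor visibility",
    HfacsLevel.PRECONDITIONS: "fatigue from back-to-back shifts",
    HfacsLevel.UNSAFE_SUPERVISION: "scheduling conflicts left unaddressed",
    HfacsLevel.ORGANIZATIONAL_INFLUENCES: "staffing policy that rewarded overtime",
})
```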

Robin noted there is always a direct correlation that can be traced back to the organizational level. Another interesting aspect of Dr. Reason's model is that it completely removes blame from the equation (though not accountability) when identifying each step leading up to an incident.

HFACS identifies what the 'holes in the cheese' are - the breaks in the organization's security chain - and unveils the unifying correlation that leads back to the top, at the organizational level.

Their system has been implemented across industries for human factor analysis of incidents globally - including nuclear, rail, engineering, and more - but had not yet been adapted to cyber incidents. Robin's work is changing that.

Integrating HFACS with Security Awareness - a contradiction?

Not only has Robin's work adapted the HFACS tool to cyber - coding it under the guidance of Dr. Shappell, who serves on her research committee - but in her position with scoutbee, Robin is also using HFACS as a preventative tool, proactively identifying issues before or as they arise.

Robin culled through the major cyber frameworks - NIST, CIS, ISO, and more - along with breaches and other incidents, breaking them down and layering in the human factors involved, including skill, decision-making, errors based on misperception, and the like, with the goal of reducing the impact of an error.

The preventative integration of the human factor into cybersecurity comes down to knowing your people. As an organization grows, Robin stresses, it becomes a management training issue: managers need to be able to recognize flags in their areas that need addressing.

The blame-free model of HFACS supports the trend in cybersecurity awareness programs toward two-way communication and a more positive security culture overall within an organization.

At the end of the day, we want to reduce risk - that is what cybersecurity awareness is all about.

It's by no means a cut-and-dried approach, and dealing with people always comes with unknowns. But by recognizing when an individual under your management is going through a particularly challenging time, you can, as Robin says, help [reduce the impact of errors] by understanding your people. For instance, if you are aware of which key workflows or platforms a stressed worker is involved in, you are better equipped as a manager to consider the unintentional impact that distracted employee may have - and you have an opportunity to proactively address the situation.

Robin notes that insider threats are very rarely malicious; far more often they are attributable to human error. When a potential issue is identified, mitigation can look like temporarily reassigning the employee away from certain sensitive tasks, or consulting with HR to see what services might be offered to reduce stress. It's definitely not an easy task, but even just being aware is a start.

Lessons Learned 

One of Robin's lessons learned in working with her own team to build a cybersecurity awareness program for their developers is to involve members from each department and let them own the process of guiding the program, so that it is relevant and useful to them and something they will want to engage with.

Also, top-down buy-in from org leaders is critical to building a positive security culture. One idea she hopes to implement to support this is creating personal videos from key leaders about their own experiences with phishing links or the like - humanizing security, reinforcing its importance at all levels of management, and supporting a blame-free environment.

Robin also made the point that user-generated content - such as the personal videos from management and open reporting from employees - becomes a source of data in its own right. This qualitative data helps the cybersecurity awareness manager understand what scams employees are facing in real time, so that, as an added value, they can bring more relevant insights on scams targeting each particular department.

Her final nugget on understanding the human factor and security awareness reiterated the importance of knowing your people by having more conversations with them, no matter how trivial they may seem.

Thanks again, Robin!

Resources mentioned:

HFACS Framework

Dr. Calvin Nobles' work

"Just Culture" by Sydney Dekker

Lessons Learned with Dennis Legori and Paula West


Looking for awareness training that is short, relevant and engaging? Check out Wizer’s free security awareness video library.