
Building Effective Security Programmes: Part 10 – The Human Factors Domain

Author: Greg Van Der Gaast

Apr 10, 2024

Welcome back to our series on building security programmes. It’s a series we hope will not only help you better secure your organisation, but also highlight CDW’s commitment to helping customers approach security holistically and effectively, with their unique business context in mind.

This is likely to be an unusual instalment for some readers, as the field of Human Factors isn’t one that’s yet widely applied or recognised in most security functions.  

It relates to human error and behaviours, and deals with everything from training and understanding human actions to improving adherence to instructions and processes, driving cultural change, and more.

How critical a focus on Human Factors will be in your security programme depends entirely on your business, and particularly on its level of organisational health. (I highly recommend Patrick Lencioni’s books on this subject.)

What we need from this function will also vary wildly based on the people and cultural challenges in your organisation. 

As a result, I can only offer an example from my personal experience to give you a feel for what can be achieved.

In my last role as a CISO, I employed a human factors expert on the team. The core driver for this was the number of cultural problems within the organisation. This manifested itself in a lack of structure and discipline, distrust of management, resistance to change, and an unwillingness to accept outside criticism.  

Plus, there had never been a security function in the organisation before, let alone one that wanted to change the way everyone operated. Resistance was expected. 

The initial response to our expert was the same as it was to the rest of us: polite acceptance to our face, with back-channel communications of, “Why are these people here? We have better things to do. Get rid of them.”  

This was, in many ways, a reflection of the general culture at the time, and no number of reports or plans was going to change that.  

Instead, we tasked our expert, who was also a student of Fair Culture (worth a search if you are unfamiliar), with picking random people to talk to, asking them about their work… and showing an interest.

The first thing this gave us was lots of insights into what teams did. The next thing it did was build relationships and trust so that people would share more with us. This, in turn, yielded some significant finds about processes not documented anywhere, sensitive data being used in places and ways that few knew about, some dangerous practices, and many of what turned out to be our most significant business risks. Risks that technical solutions would have been unlikely to detect.


People manipulating production data using system accounts, entire undocumented environments with no controls being used to test with live data, 30 people sharing the admin account of a departed employee, code being left in internet-facing repositories, and much more.

Assessing what training we needed and getting people to take it up was another responsibility, as was building up a positive, proactive, and altruistic culture throughout the business to ensure people would care enough to help us achieve our security goals. 

That led our Human Factors expert to work side by side with HR to educate people about Fair Culture and promote its adoption.

In many ways, our expert acted as my right hand when it came to building relationships once I became too busy to do so myself. A personal and hands-on approach (with people, not tools) has always been something I strive for, but you quickly reach a point where it’s no longer possible to have regular interaction with everyone, or even every team.

Another critical aspect of Human Factors is human error and process engineering.  

The first part aims to determine why people make mistakes and rectify the root causes. These can include a poor culture with a slapdash (or even contemptuous) approach, poor training, unclear processes (due to badly formatted instructions, ineffective distribution, or misunderstandings arising from cultural differences), management pressure, personal difficulties, inadequate tools, confusing systems, and any number of other factors.

Process engineering involves making sure processes, whether they are instructions, the mechanisms that distribute those instructions, prompts on a form, and so on, are optimised (including psychologically) to minimise human error and maximise people’s inclination to perform the task as intended rather than avoid or shortcut it.

It’s all about making sure your people not only work according to the system you design, but also act in the ideal way even when the situation isn’t defined.

I’ll sum it up as follows:  

Security, or quality, is about defining the desired state and maintaining that state. In many ways, it is very much about fighting and limiting entropy so we can maintain control. 

Almost everything in our framework is there to create the mechanisms and structure to fight that entropy. We define how all the things that could have an impact on security should be done, and we get the support to be able to enforce those definitions. 

There is no greater source of entropy in our battle than human beings. They can cause chaos, whether accidentally or intentionally.  

But they can also be powerful allies and champions that not only reduce their own entropy but also help spot and rectify deviations elsewhere. All of which helps us maintain better control over our environments. 

The Human Factors domain is likely to be the least structured in the framework, but it’s important to define its activities so they can be supported and measured as needed.

As always, if we can help you with advisory services along with tools and platforms for your training and cultural change needs, we’d love to hear from you. 

That’s all for this instalment. Join us next time for a look at how Security Operations should fit within our framework.
