The “Four Eyes” in Information Security

This is not a post about:
  • Middle-school-level verbal abuse of those who wear glasses.
  • The notion that a specific transaction must be approved by at least two people.
  • Clever wordplay about the dynamics of the relationship between the “Five Eyes” nations – US, Canada, UK, Australia, New Zealand – as it relates to surveillance and any recent bad blood between leaders.

Rather, it’s about a different way of looking at problems. Two questions inspired this post:
 
What if we broaden the arsenal of tools/methods we use for making progress with security initiatives?

What if we have been misdiagnosing issues and thus keep applying the wrong remedies?

ClearerThinking is an initiative/site/movement/… founded by Spencer Greenberg, a mathematician and entrepreneur, with a simple mission: “help people make better decisions”. I came across his site while researching cognitive biases in the context of behavioural economics, and have been an avid reader ever since. He’s got tonnes of material on all sorts of topics, from helping to bridge the political divide, to evaluating how rational you are, to – one of my favourites – an analysis of just how much your time is really worth. If you have the time, you might want to take a look. If you don’t have the time, then you absolutely must…

In “Learn a simple model for tackling complex problems”, Spencer describes the “4 Is” framework that he recommends for looking at issues. His post includes a link to a short video of a presentation he gave on the topic. In essence, his message – advice to fellow entrepreneurs – boiled down to:

When looking at a “persistent problem” (something that is important, looks insurmountable, and has not yet been resolved), it is critical to understand where others have failed. This applies both at a societal/world scale and within organizations. The failure will usually derive from one of (or a combination of) the following causes:
  • Individuals or groups were not exposed to the right incentives – positive or negative – to solve the problem.
  • There is ignorance about how to handle the problem, or the information needed for progress was missing.
  • While other elements were in place, the initiative suffered a severe lack of resources due to limited investment in the issue.
  • Finally, while all the elements might have been in place, human irrationality – through cognitive biases or a poor decision-making process – impeded action.
Hence, the “4 Is”: incentives, ignorance (or information), investment, irrationality.

Once the issue has been properly diagnosed, there are different types of remedies for each:
  • Incentives. Well, create the right incentives: these can be positive (monetary rewards, recognition, etc.) or negative (introduce regulations/rules). There’s a famous example of how FedEx solved issues with delivery delays by changing the compensation model for its workers, so that it rewarded them for finishing the job faster.
  • Ignorance. In this case, identify how to provide the additional information. Is it a matter of simply educating the participants about something they didn’t know or understood incorrectly? Spencer uses the example of AIDS-prevention campaigns that failed because local participants held wildly incorrect views about how contraceptives work. Or is it a matter of the information needed not existing in the first place? In that case, the answer might be basic research, or data collection outside of the organization.
  • Investment. Here the answer is, quite simply, to find ways to redirect more resources to the problem. It might be a matter of justifying additional budget, or perhaps redirecting resources from elsewhere. The example Spencer uses is poignant: depending on your values, you may find it troubling that far more resources are spent saving pets from cruelty than non-pet animals. Should the money and attention be redirected?
  • Irrationality. Finally, the way to address human irrationality (be it cognitive biases or flaws in decision making) can include the use of checklists (to reduce mental strain during stressful times) or the proper design of system defaults. This is what behavioural economics practitioners refer to as “choice architecture”, and there are fantastic examples of its effect with organ donations and medical prescriptions.
To be clear, a particular issue might result from a combination of these; but without applying this kind of clearer thinking, we’re bound to misdiagnose it and fail to address the problem.
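To make the diagnosis step concrete, here is a minimal sketch of the four questions as a checklist in Python. It is purely illustrative: the function, the question wordings, and the category labels are my own invention, not something from Spencer’s post or clearerthinking.org.

```python
# A purely illustrative sketch of the "4 Is" as a diagnostic checklist.
# All names and question wordings are hypothetical, not from the source.

DIAGNOSTIC_QUESTIONS = {
    "incentives": "Do the people involved gain (or lose) anything by solving this?",
    "ignorance": "Do we know how to solve it, and does the needed information even exist?",
    "investment": "Are enough resources (budget, time, people) directed at it?",
    "irrationality": "Are biases or a poor decision process blocking action?",
}

def diagnose(problem: str, answers: dict[str, bool]) -> list[str]:
    """Flag the 'I's that look like root causes of a persistent problem.

    `answers` maps each category to True if that dimension is healthy;
    any category answered False is flagged as a candidate cause.
    """
    causes = [category for category, healthy in answers.items() if not healthy]
    print(f"Problem: {problem}")
    for category in causes:
        print(f"  Possible cause ({category}): {DIAGNOSTIC_QUESTIONS[category]}")
    return causes
```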

This is a great framework for looking at problems. I love it!

Applying it to Information Security

To me, the application of the “4 Is” framework to security is direct, simple, and essential. Let’s look at a few scenarios:
  • Effectiveness of security awareness training. Security awareness is a common component of security programmes, but it is often structured heavily on the “information” side of things. Could the answer to better security behaviour be better incentives (again, positive or negative)? Or perhaps the matter is irrationality, and we need to review the choice architecture (defaults)?
  • Deficiencies in rolling out patches. Is poor patch deployment a matter of information (teams don’t know when to roll them out), investment (it’s too onerous to roll them out using current mechanisms), incentives (no one other than security cares – a “don’t fix what ain’t broke” mentality), or even irrationality (patches sit too low on the checklist of things to do)? A hypothetical walk-through of this scenario follows after this list.
  • Non-compliance with internal or external requirements. This is another area where a deeper look into the issue using the “4 Is” framework can yield interesting results. In many cases, we seem to jump to conclusions and infer the cause of failure from our preconceived notions. Is that really the case?

The list goes on. We could cover software quality/security, risk management, technology adoption, security culture, …
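As a hypothetical illustration, the patch-deployment scenario could be run through the checklist sketched earlier; the True/False answers below are invented purely for the sake of the example.

```python
# Hypothetical walk-through of the patch-deployment scenario, reusing the
# diagnose() sketch from above. The answers are invented for illustration.
patching_answers = {
    "incentives": False,    # no one outside security cares about patching
    "ignorance": True,      # teams do know how and when to patch
    "investment": False,    # current tooling makes rollouts too onerous
    "irrationality": True,  # no obvious bias is blocking action
}

diagnose("Deficiencies in rolling out patches", patching_answers)
# Flags "incentives" and "investment" as the dimensions to remedy first.
```

Under those assumed answers, the remedy would not be more awareness material but a change to incentives and to investment in tooling.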

Moving forward

I really like looking at other areas of knowledge for how we can apply their learnings to information security. This post was an example of that.

Hopefully, this post gives you another tool in your toolset when going about your work with security.

When looking at a security issue, think it through: how much of it is a matter of incentives, information, investment, or irrationality? The answer might not be obvious, and will likely help you …
Note: all images in this post are from clearerthinking.org.