Liars and Outliers - Headcount Matters

Sebastian Whiting | Apr 11, 2024

I recently finished reading Liars and Outliers by Bruce Schneier, and it was not what I expected; in a lot of ways, that is a good thing. It is so easy to get focused on the technology and forget that, at the end of the day, people are at the core of what we do. In his book, Schneier does an excellent job of abstracting the concept of security to the point where we can understand contracts and societal pressures as forms of security alongside cryptography and EDR tools. This allows him to draw an accurate picture of the relationship between trust and security.

Incentivize the Right Behavior

There was a lot to unpack! A good portion of the book covers one of my favorite topics: competing interests and unintended consequences. In the spirit of application, I try to always ask myself:

What behavior is this going to incentivize?

Another way of thinking about it is:

What new behaviors will come about because of my decision or change?

  • Prohibition incentivized organized criminals to engage in bootlegging.
  • Tax laws and loopholes drive accounting practices and investment decisions.
  • Unrealistic standards tend to incentivize apathy, corner cutting, or outright lying.

Back in the boat times (my submarining days) I had a lot of time, seriously a lot, to ponder these topics. Being trapped in a steel tube with all your best friends for months on end gives a unique opportunity to observe behavior over a long period of time. You could see how decisions made by peers and leadership, or changes in the schedule, could impact an individual’s attitude and behavior.

I love this topic so much I wrote my capstone paper for my master’s degree on it: capstone paper

You can read it if you want; if not, I won’t be offended. That style of writing is not my preferred one, but I poured the figurative ‘blood, sweat, and tears’ into that paper, so I figure I’d better share it.

Human Capacity for Trust

What really struck me, though, came in the first several chapters and formed part of the foundation for the book: Dunbar’s numbers.

To quote Wikipedia:

Dunbar’s number is a suggested cognitive limit to the number of people with whom one can maintain stable social relationships—relationships in which an individual knows who each person is and how each person relates to every other person

This wasn’t the first time I’d encountered this number, but it was the first time I had really given it much thought. What struck me was this quote from Dunbar:

…when there are more than about 150 individuals, they cannot control the behaviour of members by peer pressure.

This quote was in reference to the Hutterites, and Schneier, most likely to soften the message, included a second quote from Hardin: “Perhaps we should say a community below 150 really is managed–managed by conscience.”

Fast attack submarines usually have around 120 crew members, and I can attest that peer pressure is one of the biggest drivers of accountability and behavior. Submariners are constantly observing and correcting the behavior of other sailors, both up and down the chain of command. It turns out that holding a standard matters when everyone’s lives are literally on the line.

Size of the Organization and the Human Capacity for Trust

Schneier spends the rest of the book referencing Dunbar’s numbers (because yes, there is actually more than one). The general idea is that as social groups, whether communities, companies, or families, expand, people need systems, processes, and abstract entities like governments to allow them to continue to trust. This is as opposed to smaller groups, where peer pressure is sufficient.

As an organization grows, it would do well to keep these numbers in mind and have a plan for implementing scalable systems in place of peer pressure. If that transition doesn’t happen as it should, there is a good chance that the culture of excellence the organization (hopefully) built will start to degrade.

As a security professional, I often find myself tasked with maintaining trust with end users. I find that trust to be important not only to help build security into the culture but also to make sure the end user feels like they can come to me when they need to. I’m a big fan of Jason Meller’s Honest Security principles (I usually refer to this as his Honest Security Manifesto, but that’s my term, not his). Honest Security

Traditional corporate security sends the end user a message of “We don’t trust you,” and it often doesn’t scale. Requesting new tooling is an arduous process that leaves most employees making do with what might be a substandard tool for the job. Agents and scanners are installed on the end user’s device without their knowledge, and the first time they hear from security is when something goes wrong. That sets an odd tone and can also feel accusatory. These points and more are covered in Jason’s principles. Tools like Kolide, which is Jason’s company, can help fix this transparency issue.

But I digress; we are supposed to be talking about the human capacity for trust here. Vulnerability scanners and EDR agents are important, right? Absolutely. They provide much-needed telemetry about devices on our networks and are critical components of the IR process. But many users are put off by EDR tools. Partly they don’t like losing control over their device, but often it comes down to “What, you don’t trust me?”

Here is the big takeaway from Schneier’s book and Dunbar’s numbers: it isn’t that I don’t trust you, it’s that I can’t. I literally lack the capacity. There are 1,000 of you and one of me. I’d love to know each and every one of you, but I can’t even remember every member of my family tree! I need the EDR agents and vulnerability scanning agents in place because I can’t enforce good behavior through peer pressure alone. Even if I could, I need a way to hand auditors and regulators objective evidence that everything is as it is supposed to be (yay compliance).

So where does that leave us? When users push back on rolling out EDR for the first time, there are two points that I think are crucial to make:

  1. It isn’t that I don’t trust you, it’s that I can’t. I can’t possibly personally verify every user’s technical aptitude.
  2. I need this data to prove that it isn’t you, should something malicious originate from your device.

So what is that second point about? If a user’s laptop is identified as the source of malicious activity, then without the right telemetry it might be impossible to say whether the end user is to blame or whether their device has been compromised and they are just a victim.

Conclusion

We as humans have a limited capacity for trust and relationships. This ends up putting a limit on how large our social groups can be while still maintaining order through peer pressure (conscience) alone. Scale your systems accordingly, and make sure you have a plan for transitioning to tooling and systems for trust. Hopefully this proves useful; it was a somewhat stream-of-consciousness type of post.