Hidden in Plain Sight

What I Do with My Bias

It’s no secret that many organizations and individuals work hard to fight the sorts of bias that affect our interpersonal relationships, work lives, and basically every facet of existence. I spend a large part of my time ruminating on and otherwise dissecting the biases we hear about most; lately, however, I’ve been contemplating other types of bias, specifically those that can have potent effects on both research and analysis.

The discussion that spurred my focus on biases in analysis went something like this:

“We should only scrutinize x kind of traffic because y kind of traffic is probably safe to ignore.”

Hmm…need I remind you that even DNS has been leveraged as a C2 channel?
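To make that concrete, here’s a minimal sketch (my own toy heuristic, not anyone’s production detection logic, and the thresholds are invented) of one way to surface DNS queries worth a second look, based on label length and character entropy:

```python
import math
from collections import Counter

def shannon_entropy(s: str) -> float:
    """Bits per character; random-looking strings score higher."""
    counts = Counter(s)
    return -sum((n / len(s)) * math.log2(n / len(s)) for n in counts.values())

def looks_suspicious(qname: str, max_label_len: int = 40, entropy_cutoff: float = 3.5) -> bool:
    """Flag queries whose subdomain labels are unusually long or high-entropy,
    a common (though far from conclusive) sign of DNS tunneling or C2."""
    labels = qname.rstrip(".").split(".")[:-2]  # drop registered domain + TLD (naive; no PSL handling)
    return any(
        len(label) > max_label_len or shannon_entropy(label) > entropy_cutoff
        for label in labels
        if label
    )

# A random-looking exfil-style label vs. an ordinary hostname:
print(looks_suspicious("a9f3k2q8x7z1m4n6b5v0c8d2e7f1g3h9.evil.example.com"))  # True
print(looks_suspicious("www.example.com"))                                     # False
```

It won’t catch a patient attacker who keeps labels short and boring, which is rather the point: even the traffic you’ve labeled “probably safe” deserves periodic scrutiny.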

I’ve also heard “we should only review failed logins because brute-force attacks!”

In the age of credential theft (I’m looking at you, Mimikatz), it could be unwise to assume your attacker will use what is possibly the noisiest avenue into a system. According to Verizon’s awesome DBIR (Data Breach Investigations Report), use of stolen credentials was the top “threat action variety” across all breaches.
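That stat is exactly why failed-login dashboards alone don’t cut it. As a hedged illustration (the event fields and values here are invented for the example; real events would be parsed out of your SIEM, Windows 4624 events, auth.log, and so on), even a simple first-seen check on successful logins can surface stolen-credential use that a brute-force filter never would:

```python
from collections import defaultdict

# Hypothetical pre-parsed auth events; the schema is made up for this sketch.
events = [
    {"user": "alice", "src": "10.0.4.17",   "result": "success"},
    {"user": "alice", "src": "10.0.4.17",   "result": "success"},
    {"user": "alice", "src": "203.0.113.9", "result": "success"},  # never-before-seen source
]

seen_sources = defaultdict(set)  # user -> sources observed so far

for e in events:
    if e["result"] != "success":
        continue  # the whole point: scrutinize the *successful* logins
    user, src = e["user"], e["src"]
    if src not in seen_sources[user]:
        # Skip alerting on the very first source (a stand-in for a real
        # baselining window), then flag anything new after that.
        if seen_sources[user]:
            print(f"first-seen source for {user}: {src}; worth a closer look")
        seen_sources[user].add(src)
```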

Most of my coworkers, both now and in the past, have had significantly more experience in the field than I do. Undoubtedly, they hold valuable historical knowledge and experience, which means lots of wisdom for me to absorb! Unfortunately, though, I sometimes run into the “seen it all before/been there, done that” attitude. There are a couple of reasons I take issue with this mindset:

  1. Information security, like technology as a whole, is rapidly changing. The threat landscape evolves. Evolve with it or risk falling behind.
  2. While that experience might cover ~90% of the attacks an organization faces, there’s still the ~10% outlier that, though less likely, has the potential to be the most damaging precisely because it’s not a known quantity and could be the work of a determined attacker.

That unknown quantity is part of the reason I do my best to think like an attacker while working as a defender. I take a significant interest in knowing and understanding what the traffic on my network looks like, while also thinking about the ways I’d circumvent the defenses and blend into the environment as an attacker.

Every day that I work in information security, I become aware of a new way that things just aren’t always as they appear. Lately I’ve been thinking about HAMMERTOSS, a piece of malware that used dynamically generated Twitter handles to grab instructions from its handlers. Its behavior was configurable such that its activity could be synced up with the target’s work schedule in order to blend in even further. Taking a piece of data at face value is at best shortsighted and at worst flat-out hazardous. At the same time, it’s hard to know where to focus my efforts. Cutting out the noise in system logs or network traffic can be nearly impossible when there are myriad possibilities for that faint, fleeting signal to hide in plain sight. I’m still working on refining my own techniques and practices for collecting actionable telemetry, but I am wholeheartedly in favor of subjecting successful logins to scrutiny.
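As far as I know, the actual handle-generation algorithm was never published, so this is only a sketch of the general idea (the hashing scheme and seed are my inventions): a date-seeded generator lets an implant and its operator rendezvous on a fresh handle every day with nothing hardcoded to block or hunt for.

```python
import hashlib
from datetime import date

def handle_for(day: date, seed: str = "shared-secret") -> str:
    """Derive a deterministic, different handle for each day. Implant and
    operator compute it independently, so no account name is ever hardcoded."""
    digest = hashlib.sha256(f"{seed}:{day.isoformat()}".encode()).hexdigest()
    return "user_" + digest[:12]  # squash the hash into something handle-shaped

# Same date in, same handle out, and tomorrow's handle shares nothing
# obvious with today's:
print(handle_for(date(2015, 7, 29)))
print(handle_for(date(2015, 7, 30)))
```

Defending against that kind of rendezvous by blocklisting accounts is hopeless, which is why behavior (who talks to Twitter, when, from what process) matters more than any single indicator.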

I’m certainly not an expert in bias, but I have recently been making more of an effort to combat my own bias when I detect it. As a defender, I feel it’s important not to get boxed in by my own baggage. Here are a few of the steps I take to fight my own pesky bias when doing my job as a security analyst (with a rough sketch after the list of how I might write the answers down):

  1. Stop and consider why I am inclined to go in direction a. What evidence leads me there? (documented IOCs, sketchy executables, weird host behavior…)
  2. Ask “is this an assumption based on real-world statistics, trends, or other information that I have analyzed and understand, or is it influenced primarily by my own availability bias?”
  3. Consider the alternatives and ask myself “what do I risk by excluding them?”
  4. Ask “is there additional data I can collect before I decide on direction a? Do I have evidence in support of this decision outside of my own intuition or experience?” (threat intel, situational or organizational context, additional indicators…)
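None of this requires tooling, but writing the answers down keeps me honest. Here’s a hypothetical sketch of capturing that checklist as a structured note (every field name here is my own invention, not any standard schema):

```python
from dataclasses import dataclass, field

@dataclass
class TriageNote:
    """One note per investigative direction; each field maps to a checklist step."""
    direction: str
    evidence: list[str] = field(default_factory=list)               # 1: what leads me here?
    grounded: bool = False                                          # 2: data, or availability bias?
    excluded_alternatives: list[str] = field(default_factory=list)  # 3: what am I ruling out?
    additional_data: list[str] = field(default_factory=list)        # 4: what else could I collect?

    def ready_to_commit(self) -> bool:
        # Refuse to commit on intuition alone: require real evidence and a
        # deliberate look at what is being excluded.
        return self.grounded and bool(self.evidence) and bool(self.excluded_alternatives)

note = TriageNote(
    direction="host compromised via stolen credentials",
    evidence=["successful login from first-seen source", "odd parent process touching lsass"],
    grounded=True,
    excluded_alternatives=["VPN egress change", "newly provisioned workstation"],
    additional_data=["EDR process tree", "VPN logs", "threat intel on the source IP"],
)
print(note.ready_to_commit())  # True
```

Even as a plain-text template rather than code, forcing every field to be filled in makes the availability-bias shortcut harder to take.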

I am working hard each day to become better at what I do and to find more ways to conquer bias: documenting the hard evidence, visualizing the connections where I can (Maltego is pretty sweet for this), and building my analytical skills overall. If you have any advice or feedback for me, please reach out on Twitter or email me at hello[at]alexia.codes.