
Investigation Theory Course On Site in Augusta, GA!

I’m really excited to announce the first ever public, live, in-person offering of my Investigation Theory course. The two-day course will be taught on site in Augusta, Georgia on September 13th and 14th, right ahead of the Security Onion Conference and BSides Augusta. If you were planning on coming for those conferences, you can come in a couple of days early for training. Alternatively, you can come for the course and stay for what I think is the best pair of defensively focused security conferences in the country.

This offering of Investigation Theory is delivered in person over two days. You’ll participate in lectures, individual labs, and team exercises aimed at strengthening the underlying processes that make you an effective security analyst. This will be a very interactive class designed to take advantage of the fact that we’ll be together in person. You’ll also get to use Investigation Ninja, our custom simulation platform designed to teach investigation skills in a tool-agnostic, data-focused way!

Detailed course information and tickets can be found here: https://www.eventbrite.com/e/investigation-theory-2-day-training-course-tickets-34912113070

Register in the next couple of weeks to take advantage of early bird pricing. Seating is limited.

Source Code S1: Episode 5 – Gerald Combs

Did you know that Wireshark was almost named Etherweasel? I brought in Gerald Combs to tell us about the history of Wireshark, from a small tool he wrote for his own use to one of the world’s most popular open source projects, with over a million downloads a month. We also talk about growing up in Kansas City (with a good BBQ recommendation) and why open source is important to him.


If you like what you hear, I’d sincerely appreciate you subscribing, “liking”, or giving the podcast a positive review on whatever platform you use. You can also let Gerald know by tweeting at him @geraldcombs. As always, I love hearing your feedback, and you can reach me @chrissanders88.

5 Human-Centered Takeaways from the SANS SOC Survey

SANS recently released the results of their SOC survey, which was put together by Chris Crowley. The report has a lot of useful data points and is worth your time to go through, whether you’re in a SOC and wondering how you stack up against others or you’re thinking about establishing a SOC and need to see where the goal posts currently are.

In this post, I want to focus on five takeaways I garnered from the report*. These takeaways will revolve around the human analyst, just as all investigations do.

Heavily Regulated Industries (and vendors) are Leading the Way

Figure 1 illustrates the distribution of SOCs across specific industries. Setting aside dedicated cyber security and technology companies, the industries that appear to have a greater number of SOCs share the common trait of being heavily regulated. This includes government, finance, manufacturing, and healthcare. This seems consistent with the notion that many organizations develop their security operations by first embracing required compliance.

SOC Survey Industries Represented

By virtue of being the first and most prolific adopters of SOCs, these industries will naturally dictate best practices across the field as they mature. The common traits and mindsets predominant in these industries will influence the direction of the SOC as we know it. This influence will also carry over to cyber security vendors, who will inevitably swap staff with practitioners in these SOCs. Combined with vendors focusing their sales goals on these industries, this ensures that vendors are also more likely to build products and produce educational materials that promote the mindsets predominant in these fields.

A mindset is neither good nor bad, and bias can be both helpful and harmful. It’s important that we identify the common trait distributions and mindset biases associated with these fields so that the evolution of the SOC concept benefits from a diversity of opinion.


SIEM as the Investigative Centerpiece

Figure 13 shows how SOC analysts correlate and analyze event data, IOCs, and other security and threat-related data. This chart essentially identifies the tool at the center of the investigative process: 77% of respondents cited the use of a SIEM to facilitate the investigation process.

In my experience, many SOCs tend to let the workflow inherent to their SIEM dictate their analysts’ investigation workflow. New analysts learn primarily via on-the-job training and through the lens of the workflow dictated by the SIEM. Given that there are only a handful of widely popular and accepted SIEMs and that investigation theory training isn’t widespread, most analysts currently practicing likely learned their craft through a tool like ArcSight, QRadar, or another popular SIEM. I would posit that you could present an analyst with investigation scenarios, monitor how they solve them, and arrive at an accurate assessment of which SIEM they cut their teeth on.

If the SOC doesn’t provide training in fundamental investigation concepts, then an additional concern moving forward is that analysts are more likely to become “SIEM-locked,” where they don’t know how to perform investigations without the use of a specific SIEM. SOC managers must be certain that their SIEM supports their human-centered workflow rather than developing a workflow solely because it aligns with a SIEM. Currently, my assessment of the SIEM market is that most existing tools don’t adequately consider or deliver workflow features focused on the needs of the human analyst. It’s likely that many of the 23% of organizations identified as having built their own SIEM-like tool share this opinion.


Investigation Metrics are Non-Existent

Collecting actionable metrics has been a pain point for most SOCs I’ve worked in or consulted with. Figure 18 describes metrics that are used, enforced, and consistently met. There are very few metrics associated with the investigation experience itself, except for the time from detection to containment and eradication. As the investigation function is the central workflow of the SOC, this continues to be an area where improvement is desired. Instead, most metrics that are considered are focused on SOC output, and not the efficiency of the SOC itself. While this is helpful for justifying the existence of the SOC (why an org spends money on the function), it isn’t as helpful for improving the SOC (reducing the cost of the function).

SOC Metrics Collected

Investigation-centric metrics might include tracking the usage of specific data sources during investigations (assists), the number of times a data source would have been helpful but was unavailable (turnovers), the most commonly aggregated fields, and average time spent viewing specific data sources. An investigation-centric metric is one that seeks to better understand how the human analyst spends their time while attempting to connect the dots in pursuit of greater speed and accuracy.
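
To make these investigation-centric metrics more concrete, here is a minimal sketch (in Python) of how a SOC might accumulate assists, turnovers, commonly aggregated fields, and time spent per data source. The names and structure are my own assumptions for illustration; they don’t describe any existing tool.

```python
# Hypothetical sketch of investigation-centric metric tracking.
# "Assists" and "turnovers" follow the definitions given above.
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class DataSourceMetrics:
    assists: int = 0          # times the source helped answer a question
    turnovers: int = 0        # times the source was needed but unavailable
    seconds_viewed: float = 0.0
    views: int = 0

    @property
    def avg_seconds_per_view(self) -> float:
        return self.seconds_viewed / self.views if self.views else 0.0


class InvestigationMetrics:
    """Accumulates human-centered metrics across investigations."""

    def __init__(self):
        self.sources = defaultdict(DataSourceMetrics)
        self.aggregated_fields = defaultdict(int)

    def record_assist(self, source: str) -> None:
        # The data source helped answer an investigative question.
        self.sources[source].assists += 1

    def record_turnover(self, source: str) -> None:
        # The data source would have helped but wasn't available.
        self.sources[source].turnovers += 1

    def record_view(self, source: str, seconds: float) -> None:
        # Time the analyst spent looking at this data source.
        m = self.sources[source]
        m.views += 1
        m.seconds_viewed += seconds

    def record_aggregation(self, field_name: str) -> None:
        # A field the analyst aggregated on (e.g., destination port).
        self.aggregated_fields[field_name] += 1


# Example usage during a single investigation:
metrics = InvestigationMetrics()
metrics.record_view("flow", 45.0)
metrics.record_assist("flow")        # flow data answered the question
metrics.record_turnover("pcap")      # full packet capture wasn't retained
metrics.record_aggregation("dst_port")
print(metrics.sources["flow"].avg_seconds_per_view)  # 45.0
```

In practice, counters like these would be fed by whatever case management or analysis tooling the SOC already uses; the point is that the data needed to measure the human side of the investigation is straightforward to capture.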


Internationalization of SOCs

A significant number of SOC practitioners exist outside the United States, as shown in Figure 2. However, a much smaller percentage of the organizations that responded to the survey are headquartered outside the US. The disproportionate number of international analysts is likely attributable to organizations attempting to cut costs by hiring in lower-income regions, and to organizations that staff 24×7 operations across different time zones (thereby avoiding having to hire a night shift in the US, which is notoriously difficult).

Locale of Security Operations

There are significant differences in how people think based on the culture they hail from. By nature, most Americans tend to be less sensitive to these variances and project their way of thinking onto others. As the number of international practitioners grows, it’s critical to consider the biases inherent in how Americans think so that we can identify where they may not hold up for international practitioners. As an example, people from Asian cultures tend to require more certainty about a conclusion than their American counterparts before feeling confident in it. Put more simply, someone from Kansas may view the investigation process completely differently than someone from Kazakhstan. By understanding how differing cultural mindsets impact how people approach investigations, we can gather useful data and draw conclusions that move us toward a more universal investigation process.

It’s worth noting that SANS is a US-based company with larger market penetration in the US (I don’t know this for a fact; it is an assumption). Therefore, the respondents to this survey question probably under-represent the number of international practitioners, and the survey may not represent an adequate global sample. Without access to the source data, I’m unable to assert confidence regarding the sample distribution. Nonetheless, this only strengthens the points mentioned here.


Distributed Environments Require Unique Communication Skills

The increased number of international SOC practitioners in remote SOCs (Figure 2) and the significant number of distributed SOCs (Figure 3) stress the importance of communication between analysts who are not in the same room.

SOC Architectural Approaches

Communication is a critical function of the SOC, and it must be facilitated with appropriate tools. This stresses the importance of investigation tools that provide built-in collaboration features such as the assignment of cases, shared notes, and context tagging. It also stresses the importance of data access and information sharing via tools like wikis or knowledge bases.
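
As a rough sketch of what those collaboration features imply at the data level, here is a minimal case record (in Python) that supports assignment, shared notes, and context tagging. The names and fields are hypothetical and purely illustrative, not a reference to any particular product.

```python
# Hypothetical case record illustrating assignment, shared notes,
# and context tagging for a distributed analyst team.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional, Set


@dataclass
class Note:
    author: str
    text: str
    created: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


@dataclass
class Case:
    case_id: str
    title: str
    assignee: Optional[str] = None                    # case assignment
    notes: List[Note] = field(default_factory=list)   # shared notes
    tags: Set[str] = field(default_factory=set)       # context tagging

    def assign(self, analyst: str) -> None:
        self.assignee = analyst

    def add_note(self, author: str, text: str) -> None:
        self.notes.append(Note(author=author, text=text))

    def tag(self, label: str) -> None:
        self.tags.add(label)


# Example: a distributed team working the same case asynchronously.
case = Case(case_id="2017-0142", title="Suspicious outbound DNS volume")
case.assign("analyst_manila")
case.add_note("analyst_augusta", "Flow data shows the spike began at 02:10 UTC.")
case.tag("dns")
case.tag("possible-exfil")
```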

Not lost here is the ability to identify and hire staff who excel at communicating when they aren’t physically present. As someone who has managed remote teams, I quickly learned that some people simply aren’t effective communicators via text-based mechanisms like chat rooms. Managers should strive to develop strategies for identifying analysts who can excel specifically in these environments. Furthermore, if we can identify these traits and qualities, we should strive to enhance our ability to teach and improve this communication skillset.

Conclusion

I enjoyed parsing through the SOC survey and want to thank SANS and Chris Crowley for putting it together. In the future, I’d love to see more questions relating to how human analysts perform their jobs and what pain points they have beyond just the tools they use. I’d also love to see this survey repeated on a periodic basis so that trends could be highlighted.

Finally, I’d love to see the methodology used for data collection described, along with why they chose the questions they did. I appreciate SANS identifying that the research is sponsored, but citing the methodological approach would shed light on how much influence the vendors had on the questions and the interpretation of their output. A positive step would be making the raw source data publicly available for additional analysis.

I’d love to hear your thoughts on my analysis of the report, including both things you agree and disagree with. You can reach me via Twitter @chrissanders88 or you can e-mail me.


*Note: I was not asked to do this by SANS. This post only reflects my analysis and opinions. 

Source Code S1: Episode 4 – Mike Poor

This week, I’m joined by Mike Poor of InGuardians. We spoke about growing up in Brazil, how a few individuals in his early life led him towards an interest in computers, how he got involved teaching with SANS, the formation of InGuardians, fostering a family environment at work, and some stories from his long career teaching people about packets.


If you like what you hear, I’d sincerely appreciate you subscribing, “liking”, or giving the podcast a positive review on whatever platform you use. You can also let Mike know by tweeting at him @mike_poor. As always, I love hearing your feedback, and you can reach me @chrissanders88.

Know Your Bias – Availability Heuristic

This is part three in the Know Your Bias series, where I examine a specific type of bias, show how it manifests in a non-technical example, and provide real-world examples where I’ve seen that bias negatively affect a security practitioner. You can view part one here, and part two here. In this post, I’ll discuss the availability heuristic.

The availability heuristic is a mental shortcut that relies on recalling the most recent or prevalent example that comes to mind when evaluating data to make a decision.

For the security practitioner, this type of bias is primarily an attack on your time more than your accuracy. Let’s go through a few examples both inside and outside of security before discussing ways to mitigate the negative effects the availability heuristic can have.

Availability Heuristic Outside of Security

Are you more likely to be killed working as a police officer or as a fisherman? Most people select police officer. However, statistics show that you are as much as 10x more likely to meet your end while working on a fishing boat [1]. People get this wrong because of the availability heuristic. Whenever a police officer is killed in the line of duty, it is often a major news event. Police officers are often killed in the pursuit of criminals and this is typically viewed as a heroic act, which means it becomes a human interest story and news outlets are more likely to cover it.

Try this yourself. Go to Google News and search for “officer killed”. You will almost certainly find multiple recent stories and multiple outlets covering the same story. Next, search for “fisherman killed”, and you’ll find far fewer results. When there are results, they are typically covered only by outlets local to where the death happened and not picked up by national outlets. The news disproportionately covers the deaths of police officers over fishermen. To be clear, I’m not questioning that practice at all. However, this does explain why most people tend to think that police work is more deadly than being a fisherman. We are more likely to trust the information we can recall more quickly, and by virtue of seeing more news stories about police deaths, the availability heuristic tricks us into thinking that police work is more deadly. I’d hypothesize that if we posed the same question to regular viewers of the Discovery Channel show “The Deadliest Catch”, they might recognize the danger associated with commercial fishing and select the correct answer.

One thing we know about human memory and recall is that it is efficient. We often go with the first piece of information that can be recalled from memory. Not only does the availability of information drive our thoughts, it also shapes our behavior. It’s why advertisers spend so much money to ensure that their product is the first thing we associate with specific inputs. When many Americans think of cheeseburgers, they think of McDonald’s. When you think of coffee, you think of Starbucks. When you think of APT, you think of Mandiant. These aren’t accidental associations; a lot of money has been spent to ensure those bonds were formed.

Availability Heuristic in Security

Availability is all about the things you observe the most and the things you observe most recently. Consider these scenarios that highlight examples of how availability can affect decisions in security practice.

Returning from a Security Conference

I recently attended a security conference where multiple presenters showed examples that included *.top domains that were involved with malicious activity. These sites were often hosting malware or being used to facilitate command and control channels with infected machines. One presenter even said that any time he saw a *.top domain, he assumed it was probably malicious.

I spoke with a colleague who had really latched on to that example. He started treating every *.top domain he found as inherently malicious and driving his investigations with that in mind. He even spent time actively searching out *.top domains as a function of threat hunting to proactively find evil. How do you think that turned out for him? Sure, he did find some evil. However, he also found out that the majority of *.top domains he encountered on his network were actually legitimate. It took him several weeks to realize that he had fallen victim to the availability heuristic. He put too much stock in the information he had received because of the recency and frequency of it. It wasn’t until he had gathered a lot of data that he was able to recognize that the assumption he was making wasn’t entirely correct. It wasn’t something that warranted this much of his time.

In another recent example, I saw a colleague purchase a lot of suspected APT-owned domains with the expectation that sinkholing them would result in capturing a lot of interesting C2 traffic. He had seen someone speak on the topic and assumed his success rate would be higher than it turned out to be, because the speaker didn’t cover that aspect in depth. My colleague had to purchase a LOT of domain names before he got any interesting data, and by that point, he had pretty much decided to give up after spending a lot of both time and money on the task.

It is very hard for someone giving a 30-minute talk to fully support every claim they make. It also isn’t easy to stop and cite additional sources in the middle of a verbal presentation. Because our industry isn’t strict about providing papers to support talks, we end up with a lot of opinions and not much fact. Those opinions get wheels and they may be taken much farther than the original presenter ever intended. This tricks people who are less metacognitively aware into accepting opinions as fact. 

Data Source Context Availability

If you work in a SOC, you have access to a variety of data sources. Some of those are much lower context like flow data or DNS logs, and some are much higher context like PCAP data or memory. In my research, I’ve found that analysts are much more likely to pursue high-context data sources when working an investigation, even when lower context data sources contain the information they need to answer their questions.

On one hand, you might say that this doesn’t matter: if you are arriving at the correct answer, why does it matter how you got there? Analytically speaking, we know that the path you take to an answer matters. It isn’t enough just to be accurate in an investigation; you also need to be expedient. Security is an economic problem wherein the cost to defend a network needs to be low and the cost to attack it needs to be high. I’ve seen that analysts who start with higher context data sources when it isn’t entirely necessary often spend much more time in an investigation. Using higher context data sources when it isn’t necessary introduces opportunities for distraction in the investigation process. The more opportunity for distracting information, the more opportunity for availability bias to creep in as the new information is given too much priority in your decision making. That isn’t to say that all new information should be pushed aside, but you do have to carefully control what information you allow to hold your attention.

Structured Adversary Targeting

In the past five years, the security industry has become increasingly dominated by fear-based marketing. A few years ago it was the notion that sophisticated nation-state adversaries were going to compromise your network no matter who you were. These stories made national news and most security vendors began to shift their marketing towards guaranteeing protection against these threats.

The simple truth is that most businesses are unlikely to be targeted by nation-state level threat actors. But because the news and vendor marketing have made this idea so prevalent, the availability of it has led an overwhelming number of people to believe that this could happen to them. When I go into small businesses to talk about security, I generally want to talk about things like opportunistic attacks, drive-by malware, and ransomware. These are the things small businesses are most likely to be impacted by. However, most of these conversations now involve a discussion about structured threat actors because of the availability of that information. I don’t want to talk about these things, but people ask about them. While this helps vendors sell products, it takes some organizations’ eyes off the things they should really be concerned about. I’m certain Billy Ray’s Bait Shop will never get attacked by the Chinese PLA, but a ransomware infection has the ability to destroy the entire business. In this example, the abundance of information associated with structured threat actors clouds perspective and takes time away from more important discussions.

Diminishing Availability Heuristic

The stories above illustrate common places where the availability heuristic manifests in security. Above all else, the availability of information is most impactful in how you spend your time and where you focus your attention. Attention is a limited resource, as we can only focus on one or two things at a time. Consider where you place your attention and what is causing you to place it there.

Over the course of the next week, start thinking about the places you focus your attention and actively question what information led you there. Is that information based on fact or opinion? Are you putting too much or too little time into your effort? Is your decision making slanted in the wrong direction?

Here are a few ways you can recognize when availability heuristic might be affecting you or your peers and strategies for diminishing its effects:

Carefully consider the difference between fact and opinion. In security, most of the publicly available information you’ll find is a mix of opinions and facts, and the distinction isn’t always clear. Whenever you make a judgment or decision based on something you’ve read or heard, spend a few minutes considering the source of the information and doing some research to see if you can validate it elsewhere.

Use patience as a shield. Since your attention is a limited resource, you should protect it accordingly. Just because new information has been introduced doesn’t mean it is worthy of shifting your attention to it. Pump the brakes on making quick decisions. Take a walk or sleep on new information before acting to see if it still matters as much tomorrow as it does today. Patience is a valuable tool in the fight to diminish the effects of many biases.

Practice question-driven investigating. A good investigator is able to clearly articulate the questions they are trying to answer, and only seeks out data that will provide those answers. If you go randomly searching through packet capture data, you’re going to see things that will distract you. By only seeking answers to questions you can articulate clearly, you’ll diminish the opportunity for availability bias to distract your attention.

Utilize a peer group for validation. By definition, we aren’t good at recognizing our own biases. When you are pursuing a new lead or deciding whether to focus your attention on a new task or goal, consider bouncing that idea off of a peer. They are likely to have had different experiences than you, so their decision making could be less clouded by the recency or availability of the information that is affecting you. A question to that group can be as simple as “Is ____ as big of a concern as I think it is?”

If you’re interested in learning more about how to help diminish the effects of bias in an investigation, take a look at my Investigation Theory course where I’ve dedicated an entire module to it. This class is only taught periodically, and registration is limited.

[1] http://www.huffingtonpost.com/blake-fleetwood/how-dangerous-is-police-w_b_6373798.html