Friday, March 20, 2009

Mapping the Unknown Unknowns

There comes a time in an InfoSec professional’s career when they’re forced to do a risk assessment. I know, they’re a big pain in the butt and no one ever reads them, but some people seem to think they’re kind of important1. I say if you’re going to do it, you might as well get some use out of the thing.

First of all, I’m not going to explain some formal risk assessment methodology. There are far too many other sources out there for that. What I am going to talk about is the general stance you bring to an analysis. As the poet Rumsfeld asked, how do we deal with the unknown unknowns? This is where your prejudices can color an analysis and you could miss something important. Hopefully, by better defining the known unknowns, we can shrink the size of the unknown unknowns. Here’s where I start:

Who is qualified to be working on this?
1. You? Do you really understand what is going on here? Were you paying careful attention to what was presented? One way to check yourself is to paraphrase things back. Seriously, I can’t tell you how many times I’ve started solving the wrong problem simply because I misunderstood what I was being told.

2. Are the people giving you data qualified to give you what they’re giving you? Nothing seems complicated to the person who doesn’t know what they’re talking about.

How are people involved?
1. Generally, the more people are involved, the greater the chance of error. And hastily implemented automation can magnify that.

2. Will people have the opportunity to take reckless actions? Recklessness boils down to knowing what a reasonable person would have done, knowing the possible outcomes, and going ahead and doing the dangerous thing anyway. I’m willing to say this is somewhat uncommon in infosec, because people rarely understand what a reasonable person should be doing, or the real probability of a bad outcome.

3. Speaking of reckless, how can someone’s personal misjudgment compromise the entire operation? For example, one guy surfing porn could bring down a costly lawsuit. You need to be aware of whether those kinds of situations exist in whatever you’re examining.

4. Can you truly say you understand all of the users’ intentions, all of the time? Unless you’re Professor Charles Xavier, this is another unknown that should be considered.

How is technology involved?
1. Software will always be buggy; hardware will always eventually fail; and operational and project emergencies will always occur. What happens when they do?

2. If you’ve got a track record of the technology involved, it’s helpful to look not just at the failures but at the “near misses”. How many close calls were there with that tech, and what could have happened if it had gone pear-shaped? Just because it has worked up to now doesn’t mean it will keep working.

3. How polluted is the technology? Is it well-maintained and well-understood? What are the dependent linkages? How many moving parts, software or hardware? How resilient is the system to delays or failures? How many outside parties have their fingers in the system? Are you sure you’re aware of all the outside parties and linkages?
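If you want to make that last question concrete, it helps to actually enumerate the linkages instead of trusting your memory. Here’s a minimal sketch in Python, assuming you’ve already collected each system’s direct dependencies into a simple map (the system names are invented for illustration):

```python
from collections import deque

# Hypothetical direct-dependency map: system -> things it relies on.
# In practice you'd build this from interviews, configs and diagrams.
deps = {
    "payroll-app": ["payroll-db", "ldap", "smtp-relay"],
    "payroll-db": ["san-storage"],
    "smtp-relay": ["external-mail-provider"],  # an outside party
}

def all_linkages(system, deps):
    """Walk the map breadth-first and return everything the given
    system ultimately relies on, directly or indirectly."""
    seen, queue = set(), deque([system])
    while queue:
        for dep in deps.get(queue.popleft(), []):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return seen

print(sorted(all_linkages("payroll-app", deps)))
# ['external-mail-provider', 'ldap', 'payroll-db', 'san-storage', 'smtp-relay']
```

The point isn’t the code; it’s that the transitive closure is almost always bigger than the list anyone keeps in their head.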

Some specific known unknowns about technology
1. The systems you don’t know about
2. The data that you didn’t know existed
3. The systems storing data that shouldn’t be on that system
4. The connections you don’t know about
5. The services on those systems that you don’t know about (see the sketch after this list)
6. The accounts, privileges or firewall rules that you don’t know about
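One way to start converting those into known knowns is to go looking for them. Here’s a minimal sketch that sweeps a subnet for listening TCP services; the subnet and port list are placeholders, and you should only point this at networks you’re authorized to scan:

```python
import socket
from ipaddress import ip_network

SUBNET = "192.168.1.0/28"  # placeholder -- substitute your own range
COMMON_PORTS = [22, 80, 443, 445, 1433, 3306, 3389]

def find_listeners(subnet, ports, timeout=0.5):
    """Try a TCP connect to each host/port pair and report whatever
    answers. Anything listening that isn't in your inventory is an
    unknown unknown becoming a known one."""
    found = []
    for host in ip_network(subnet).hosts():
        for port in ports:
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
                s.settimeout(timeout)
                if s.connect_ex((str(host), port)) == 0:
                    found.append((str(host), port))
    return found

for host, port in find_listeners(SUBNET, COMMON_PORTS):
    print(f"{host}:{port} is listening -- is it in your inventory?")
```

Compare the output against your asset list; the interesting rows are the ones you can’t explain.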

Conclusions
These are all things that you’ll need to account for when you’re doing a risk analysis and filling out those worksheets or forms. And hopefully the solution deals with these things in one way or another – if nothing else, at least accepting the risk that these things exist and crossing your fingers.
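If the worksheet format is up to you, even a dead-simple likelihood-times-impact score beats pure gut feel, because it forces you to write the accepted risks down next to everything else. A minimal sketch, assuming a 1–5 scale on both axes (the risks listed are invented examples):

```python
# Hypothetical risk register: (description, likelihood 1-5, impact 1-5).
# Accepted risks stay on the list -- that's the whole point.
risks = [
    ("Unknown services listening on legacy servers", 4, 3),
    ("Vendor VPN account we can't fully audit", 2, 5),
    ("Only one admin knows the backup procedure", 3, 4),
]

# Score = likelihood * impact; sort so the worst float to the top.
for desc, likelihood, impact in sorted(risks, key=lambda r: -(r[1] * r[2])):
    print(f"{likelihood * impact:>2}  {desc}")
```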

All of this stuff can be a lot to keep in your head, but I’ve extracted a few insights from this process to keep me on track:

o It will not always be obvious which technologies or processes are relevant to the security of a system. Follow the money (or data, or control).

o It is difficult to maintain a secure, operational system in a changing environment. Assume things will get broken and be prepared to deal.

o Listen to complaints. Make sure there is a way for complaints to get to you, from both the people and the systems. Even if the complaints are wrong, they’re complaining for a reason. Figure out the reason.

o There will always be people in positions of trust who can hurt you occasionally.

o Security policies should allow for workarounds.

o Workarounds will create vulnerabilities.

o There will always be residual risks.

o Assume everything is insecure until proved otherwise (see name of blog).


1 Okay, I’m kidding and you know it. You can probably get through your entire career without doing risk assessments. Just keep buying firewalls and hope for the best.
