Tuesday, September 21, 2010

Things I hate about security reports, a rant

This post is by request from @shrdlu, and how can I say no to that?

I am frequently dismayed by the quality (or lack thereof) of what we security professionals choose to present outside our little geeky enclave.   I’ve covered some of this before when talking about pen-testing / vuln assessment.

Sadly, it hasn’t improved much.  I am frequently put in the position of having to apologize for our profession’s inability to craft a document that anyone but a security professional would consider a “business document”*.  This doesn’t even cover the persuasiveness (or lack thereof) of most “security recommendations”.  The icing on the cake is that these documents are often the work product of consulting engagements costing tens of thousands of dollars.  When someone spends thirty grand on a pen-test or a firewall recommendation, the value of the work needs to show in the document.  And I’m not talking about color glossy graphics.  I’m talking about clarity, relevance and clear reasoning.

You wonder why the executives ignore us?  This is one big reason.

Now, I’ll just grab a random VENDOR$ report off my desk here and get into some specifics.

Your template makes you look lazy.  And the fact that you used it improperly makes you look sloppy.

It’s got hooks for things that I didn’t buy, yet there are orphaned headers and text in there from them.  It’s an awkward one-size-fits-all affair.  Does the advice you dispense also fall into that category?  I’m tempted to believe that.

Executive summaries that aren’t summaries and aren’t written for executives

Here’s how the current exec summary reads:

1. Client hired Consultant to do job XYZ
2. Consultant did job XYZ using generic technical process blahblahblah
3. More detail on generic technical process blahblahblah
4. Job XYZ was done on date ABC, the end.

Huh?  What is this a summary of?  The proposal?  Here’s how I would expect it to read:

1. Client hired Consultant to do job XYZ and the job was performed on date ABC
2. Consultant found MOST-HEINOUS-FINDING1, explained in one sentence of non-technical language covering likelihood and impact (repeat as necessary) — or, Consultant found no significant vulnerabilities and the security of Client appears sufficient in comparison to comparable organizations
3. Consultant also found OTHER-FINDINGS, but they aren’t that important because of low likelihood or low impact
4. We’re not perfect and were given constraints in our testing; other vulns could be there, please plan accordingly

Chart junk

Graphics, diagrams and charts that convey almost no useful information or are so confusing that they actually detract from the report.   More common than not in technical reports.  Sadly.  Do yourself a favor and read some Tufte.

Technical Tables of Torpor

Trying to read through most tables in reports usually causes blurry vision, dizziness, and finally sleep.  Sweet, sweet sleep.  The purpose of a table in a report (especially if non-techies are going to see it) is to make your reasoning clear, to invite easy comparisons, or to clarify a difficult concept.   Think about what you want to convey with a table before you start slapping text and numbers into boxes.   What decisions do you want the reader to make using the table? (besides being impressed with your ability to cite lots of data)  Then eliminate everything else that doesn’t need to be there.

Apparently arbitrary ratings

There are long strings of “high” attached to things like “Total risk” or “Cost to mitigate”.  Executives wonder whether this is canned bs (yes, it is) or was calculated for their organization in a meaningful way (likely not).  This just makes us want to see how you came up with the choices.  And often those details aren’t there.  How did you decide that this is a “Magenta priority” and the probability is “Unlikely”?  What does that mean anyway?  Where are you getting your data? (out of your posterior cavity, I bet)

Frontloading reams of technical detail

Technical detail needs to be there.  It falls under the category of showing your work and how you came to your conclusions.  But put this stuff in the back.  No one wants to wade through it on the first reading of the report.   It gives me the nagging suspicion that you’re trying to impress me with your technical prowess.   Hint: Good work should not need to call attention to itself.  When it tries to, I suspect it’s the opposite of good work.

Qualitative Quantitative

The security person’s trap – mixing and matching Quantitative (real numbers) and Qualitative (subjective wild guesses).  Both have their place (as long as they’re explained), but when they’re mixed together, or worse, multiplied together, it sets my teeth on edge.  And anyone who looks closely at whatever is being measured is going to ask “What exactly is being measured here?”  Cut it out.
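To see why multiplying qualitative labels bothers me, here’s a minimal sketch (the label-to-number scales are made up for illustration; neither is more “correct” than the other).  The ranking of two findings flips depending entirely on which arbitrary numbers you assign to “Low”, “Medium” and “High”:

```python
# Two equally arbitrary ways to turn ordinal labels into numbers.
scale_a = {"Low": 1, "Medium": 2, "High": 3}
scale_c = {"Low": 2, "Medium": 3, "High": 5}

def risk_score(likelihood, impact, scale):
    """The classic 'risk = likelihood x impact' multiplication."""
    return scale[likelihood] * scale[impact]

finding_1 = ("High", "Low")       # high likelihood, low impact
finding_2 = ("Medium", "Medium")  # medium likelihood, medium impact

# Under scale_a: finding_1 = 3, finding_2 = 4 -> finding_2 ranks higher.
print(risk_score(*finding_1, scale_a), risk_score(*finding_2, scale_a))

# Under scale_c: finding_1 = 10, finding_2 = 9 -> finding_1 ranks higher.
print(risk_score(*finding_1, scale_c), risk_score(*finding_2, scale_c))
```

Same findings, same labels, opposite priority order.  If the report doesn’t say where the numbers came from, the “risk score” measures the consultant’s coding choices, not the client’s risk.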

Lack of examples

Whatever you’re doing, the more real-world examples you cite, the more credibility you gain.  Screen shots, legal citations, news clippings, hacker emails, quotes, whatever.  Put them in the report.  Cherry-pick a few and put the rest in the back (again, don’t frontload).

* Before you say it, let me add that if an organization spends a bunch of money on a security report, you can bet your sweet weasel that someone in a suit and tie is going to at least look it over.  So don’t go playing the “these reports aren’t meant for non-techies” card on me.  In any case, I’m a techie and think these reports are terribly written. So there.