Your template makes you look lazy. And the fact that you used it improperly makes you look sloppy.
It’s got hooks for services I didn’t buy, yet orphaned headers and text from them are still sitting in there. It’s an awkward, one-size-fits-all affair. Does the advice you dispense also fall into that category? I’m tempted to believe that.
Executive summaries that aren’t summaries and aren’t written for executives
Here’s how the current exec summary reads:
1. Client hired Consultant to do job XYZ
2. Consultant did job XYZ using generic technical process blahblahblah
3. More detail on generic technical process blahblahblah
4. Job XYZ was done on date ABC, the end.
Huh? What is this a summary of? The proposal? Here’s how I would expect it to read:
1. Client hired Consultant to do job XYZ and job was performed on date ABC
2. Consultant found MOST-HEINOUS-FINDING1, explained in one sentence of non-technical language covering likelihood and impact (repeat as necessary), or Consultant found no significant vulnerabilities and Client’s security appears sufficient compared to similar organizations
3. Consultant also found OTHER-FINDINGS but they aren’t that important because of low likelihood or low impact
4. We’re not perfect and were given constraints in our testing, other vulns could be there, please plan accordingly
Graphics, diagrams and charts that convey almost no useful information or are so confusing that they actually detract from the report. More common than not in technical reports. Sadly. Do yourself a favor and read some Tufte.
Technical Tables of Torpor
Trying to read through most tables in reports usually causes blurry vision, dizziness, and finally sleep. Sweet, sweet sleep. The purpose of a table in a report (especially if non-techies are going to see it) is to make your reasoning clear, to invite easy comparisons, or to clarify a difficult concept. Think about what you want to convey with a table before you start slapping text and numbers into boxes. What decisions do you want the reader to make using the table? (Besides being impressed with your ability to cite lots of data.) Then eliminate everything that doesn’t need to be there.
Apparently Arbitrary ratings
There are long strings of “high” attached to things like “Total risk” or “Cost to mitigate”. Executives wonder whether this is canned bs (yes, it is) or whether it was calculated for their organization in some meaningful way (likely not). This just makes us want to see how you came up with the choices. And often those details aren’t there. How did you decide that this is a “Magenta priority” and the probability is “Unlikely”? What does that mean anyway? Where are you getting your data? (Out of your posterior cavity, I bet.)
Frontloading reams of technical detail
Technical detail needs to be there. It falls under the category of showing your work and how you came to your conclusions. But put this stuff in the back. No one wants to wade through it on the first reading of the report. It gives me the nagging suspicion that you’re trying to impress me with your technical prowess. Hint: good work should not need to call attention to itself. When it tries to, I suspect it’s the opposite of good work.
The security person’s trap – mixing and matching Quantitative (real numbers) and Qualitative (subjective wild guesses). Both have their place (as long as they’re explained), but when they’re mixed together, or worse multiplied together, it just sets my teeth on edge. And it confuses people: anyone who looks closely at whatever is being measured is going to ask, “What exactly is being measured here?” Cut it out.
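To see why multiplying ratings sets teeth on edge, here’s a minimal sketch (the label-to-number mappings and findings are made up for illustration, not from any real methodology): a “likelihood × impact” score depends entirely on which arbitrary numbers you bolt onto the ordinal labels, and a different but equally defensible mapping can flip which finding ranks as the bigger risk.

```python
# Ordinal labels ("Low"/"Medium"/"High") have an order but no real arithmetic
# meaning. Two arbitrary numeric mappings, both order-preserving:
scale_a = {"Low": 1, "Medium": 2, "High": 3}
scale_b = {"Low": 1, "Medium": 2, "High": 10}

# Hypothetical findings as (likelihood, impact) label pairs:
findings = {
    "weak passwords":      ("Medium", "Medium"),
    "obscure RCE exploit": ("Low",    "High"),
}

def score(scale, likelihood, impact):
    # The classic "risk = likelihood x impact" move applied to mapped labels.
    return scale[likelihood] * scale[impact]

for name, (lik, imp) in findings.items():
    print(f"{name}: scale_a={score(scale_a, lik, imp)}, "
          f"scale_b={score(scale_b, lik, imp)}")
# Under scale_a, "weak passwords" (2*2=4) outranks the exploit (1*3=3);
# under scale_b, the exploit (1*10=10) outranks the passwords (2*2=4).
```

Same labels, same findings, opposite ranking. The number the executive sees is an artifact of the mapping, not a measurement of anything.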
Lack of examples
Whatever you’re doing, the more real-world examples you cite, the more credibility you gain. Screenshots, legal citations, news clippings, hacker emails, quotes, whatever. Put them in the report. Cherry-pick a few and put the rest in the back (again, don’t frontload).