I was just listening to the Exotic Liability Podcast and once again, Chris and the gang were lamenting the sorry state of pen-testing. While I've ranted before on the poor quality of risk reporting in pen-tests, EL was lamenting the watered-down nature of most testing.
Specifically, they asked "Why are pentests so limited?" And it's a fair question. In most external security testing (which includes both pen-testing and vulnerability scanning), there is often no intelligence gathering, no social engineering testing, and no physical security testing. Of course, no "cheating", like hitting DNS or business partners, either. Very often the scope of the attack is limited in targets (only touch these assets and these IP addresses) and limited in time (you can only attack us during this timeframe and spend only 40 hours on the testing). These restrictions imply others: no time for extensive manual testing, deep analysis, or reverse engineering.
A watered-down test of your defenses means a myopic analysis of the strength of your perimeter. And remember, even in the best of times, security testing only tells you two things: where some of the holes are and a measure of the skill of the attacker. Passing a security test never means you are secure. The more "real world" your testing, the closer you approach some kind of reasonable measure of useful information about possible holes. So why water it down?
Well, the obvious reason for these limitations is not wanting to spend a lot of money on consultants. Of course, I think this is a distraction. Having been a tester and now one who hires testers, I can tell you a bigger reason is not wanting the liability. Consider: most testing going on right now is happening because of compliance. PCI requires vulnerability scanning. Most organizations acting as custodians of other organizations' data are obliged to demonstrate "best practices" - and that includes pen-testing. And here's the real rub - many auditors and customers want to see the results of those security tests.
As a tester, I've also been told by very large e-tailers that they were limiting the scope of our engagement not because they knew we wouldn't find anything, but because they knew we would. They knew we would find too many security issues for them to feasibly fix without going out of business. And if they had a report of all those holes, well, now they're liable for fixing them.
So what's a poor organization to do? They need to hire someone to do security testing who has a strong reputation but, at the same time, won't do too good a job. Credibility but not competence. Or, barring incompetence, someone who will sell them a testing service so cookie-cutter that the scope is automatically limited to basic scan-and-patch findings. Enter the big organizations, like Verizon Cybertrust, IBM, HackerSafe, etc. Yes, there is some collusion there. But hey, it's all about staying in business and meeting unreal expectations. After all, most people don't actually want to pay to have their data protected. At least, not pay what it would really cost.
BTW, you can lather, rinse, repeat this post for the entire financial audit industry. See Enron, WorldCom, Lehman Brothers, WaMu, etc.
“We’ve just traced the attack... it’s coming from inside the house!” How do you secure your network when the bad guys already have control of your servers? It’s so hard to keep up with the attacks, maybe it’s safer to architect with the assumption that you’ve already been breached. What does this entail?
Tuesday, October 27, 2009
Monday, October 26, 2009
The art and science of infosec
"The art of war and the science of war are not coequal. The art of war is clearly the most important. It's science in support of the art. Any time that science leads in your ability to think about and make war, I believe you're headed down a dangerous path. "
Lieutenant General Paul K. Van Riper
I think it's no different in infosec, especially in the senior decision-maker roles. Sure, there are cool technologies to learn, awesome risk analysis models to study, complex financial calculations to crunch, but in the end, these are but tools for the practitioner, not ends in themselves. Just because some report said some risk should be rated high doesn't mean it should be taken at face value. Nor should any defense be considered adequate for any length of time.
Too many security folk, especially consultants and auditors, seem to fall into the trap of having the science drive their work more than the art. I think there is a tendency to do this since many of us infosec folks started off in engineering. And yeah, in theory, engineering should be tamed by mathematics and science. But security, especially defense, has a huge human element. And this is where the art is necessary.
Optimizing specific defenses with statistical analysis is useful, but remember that attacks evolve. By the time you perfect a defensive technique, it'll be obsolete. For an example, read up on the history of the invincible Fort Pulaski.
But, it's still better than the cargo cult science of best practices in security.
What skills are useful in the art? Obviously experience and people skills. But to be more specific... well, off the top of my head: good threat modelling (with a healthy dose of game theory), logistics, behavioral economics, theory of mind, what my boss calls "BS detection", projecting integrity (not tripping other people's BS detectors), conviction, and courage.