Monday, November 5, 2012

Losing the war

I'm tired of hearing "we're losing the war with the hackers"

You know it's really bad when a top cybercop is saying things like: "You never get ahead, never become secure, never have a reasonable expectation of privacy or security"

Okay, enough.  Let's decompose.

Last year, TechNews ran a story claiming "2011 Set to Be Worst Year Ever for Security Breaches"

How bad was it?  Well, Privacy Rights claimed 30.4 million records breached.  Wow, big scary numbers.

But what's that per capita?  In 2011, there were approximately 7 billion people on the planet.

That means fewer than one-half of one percent of people were victimized.  Because math.

Let's put that in perspective: in 2011, the average American had about a 3% chance of being the victim of a property crime.

That means you were roughly seven times more likely to have something stolen than to be hacked.   Hmm.
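The arithmetic above is easy to sanity-check. Here's a quick back-of-the-envelope sketch (the population and crime-rate figures are the rough ones quoted above, not precise statistics):

```python
# Back-of-the-envelope comparison of breach risk vs. property-crime risk.
# Figures are the rough ones quoted above, not precise statistics.
records_breached = 30.4e6      # Privacy Rights estimate for 2011
population = 7e9               # approximate world population, 2011

breach_rate = records_breached / population
property_crime_rate = 0.03     # ~3% chance of property-crime victimization

print(f"Breach rate: {breach_rate:.2%}")                   # 0.43%
print(f"Ratio: {property_crime_rate / breach_rate:.1f}x")  # 6.9x
```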

But hacking is so much worse!  Terrors.

Right, when your credit card gets stolen off a website, how much does it really cost you?  Directly, not very much, because the credit card companies absorb the loss.  Worst case, you're without the use of your card for a few days while it's re-issued.

Shut up, you, you're mentioning identity theft. That's real bad.  Right.  In 2010, a quarter million Americans had their identity stolen.

Wait, a quarter million sounds scary but really, that’s only 0.08% of the population.

And remember, not all identity theft is cyber-driven.

Hmmm.  How bad is this really?  We hear a lot of stories about scary hacks going on.  Right, we do.  Because it's NEWS.   We never hear anything about the millions and millions of records being protected every day... because it's boring.  Let me introduce you to my little friend, the Availability Heuristic, which commonly drives poor risk judgements.

In closing, I'd also like to point out -

It’s not a game or a war to be “won” or “lost”

It never was.

You wanna talk about game metaphors? Let's quote Fight Club -
"A new car built by my company leaves somewhere traveling at 60 mph. The rear differential locks up. The car crashes and burns with everyone trapped inside. Now, should we initiate a recall? Take the number of vehicles in the field, A, multiply by the probable rate of failure, B, multiply by the average out-of-court settlement, C. A times B times C equals X. If X is less than the cost of a recall, we don't do one."

That is how it's done. It's not win/lose; it's a risk spectrum with tradeoffs between risk and cost.   When cybercrime gets too unbearable, we'll turn up the controls and the enforcement until the cost balance is met.    It has always worked this way.  It probably always will.
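That Fight Club quote is just an expected-value calculation. A minimal sketch of the A × B × C formula, with all the numbers invented for illustration:

```python
def recall_decision(vehicles_in_field, failure_rate, avg_settlement, recall_cost):
    """Return True if a recall is cheaper than paying out settlements.

    Implements the A * B * C formula from the quote: expected payout is
    vehicles in the field (A) times failure probability (B) times
    average settlement (C); recall only if it costs less than X.
    """
    expected_payout = vehicles_in_field * failure_rate * avg_settlement
    return recall_cost < expected_payout

# Invented illustrative numbers: 500k cars, 0.01% failure rate,
# $2M average settlement, $150M recall cost.
# Expected payout is $100M, so the recall isn't worth it.
print(recall_decision(500_000, 0.0001, 2_000_000, 150_000_000))  # False
```

Swap in a cheaper recall and the answer flips. That's the whole point: it's a cost threshold, not a scoreboard.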

Now stop whining about losing "the war against the hackers."  It sounds amateurish.

This post was derived from a lecture given recently for the University of Washington Certificate Program in Information Systems Security program.

Tuesday, September 11, 2012

Threat or menace?

Sorry to be ranting again, but I am doing a survey that was sent via a certain large certifying organization.  One of the questions for "information security professionals" to answer was:

Thinking about your own organization, please rate the following potential security threats on the degree of concern you have for each.
  • Trusted third parties                   
  • Hacktivists                   
  • Hackers                   
  • Internal employees                   
  • Contractors                   
  • Mobile devices                   
  • Cloud-based services                   
  • Malware                   
  • State sponsored acts                   
  • Cyber terrorism                   
  • Organized crime                   
  • Application vulnerabilities    

Mobile devices are a threat?   Cloud-based services are a threat?    Really?  I think of those two things as technologies.  Neutral technologies.  Now these technologies may be full of vulnerabilities.  And there is a probability that these vulnerabilities may be exploited by threats... which will have impacts.

Heck, I'd even say malware isn't a threat.  An attacker using malware is.

And this from a survey targeting certified professionals who are supposedly tested on the basics of the risk equation.   But maybe different folks have a different way of thinking about threats?   Am I taking crazy pills?

So the question I put to all of you - What is a threat? 
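My own toy answer is the classic decomposition: a threat acts, a vulnerability lets it succeed, and an impact is what you lose. A sketch of that distinction in code (the class names, probabilities, and dollar figure are all mine, not any standard's):

```python
from dataclasses import dataclass

@dataclass
class Threat:
    """An actor or event that can act against you: attacker, insider, storm."""
    name: str
    likelihood: float  # probability the threat acts in a given year

@dataclass
class Vulnerability:
    """A weakness in a technology or process that a threat could exploit."""
    name: str
    exploitability: float  # probability an acting threat succeeds

def annualized_risk(threat: Threat, vuln: Vulnerability, impact: float) -> float:
    """Risk = threat likelihood x vulnerability exploitability x impact ($)."""
    return threat.likelihood * vuln.exploitability * impact

# A mobile device isn't a threat; it's where a vulnerability lives.
attacker = Threat("attacker using malware", likelihood=0.30)
weak_mobile = Vulnerability("unpatched mobile OS", exploitability=0.20)
print(annualized_risk(attacker, weak_mobile, impact=100_000))  # dollars/year
```

Notice the mobile device only shows up as the thing carrying the vulnerability; the threat is the attacker.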


Thursday, August 2, 2012

Hofstadter's Corollary on Remediation

Any long-time readers of this blog know that when I get a pen-test or vulnerability scan report, I am usually displeased.  One of the things that gets under my skin is the apparently wild-guess remediation estimates they put in for discovered problems.*

These estimates are usually done by a third-party (consultant, auditor, industry busy-body) who has not asked any detailed questions about my operational, development or business processes. They rarely even understand the business value of the service they're evaluating. They have no clue about the current workload or pipeline. Yet somehow they can pop off in a report that goes to my boss, my customers, and my regulators proclaiming that it shan't be any trouble 'tall to fix.

Now any of us in the security business for more than a year knows to take anything an outsider puts in a report with a grain of salt or two.  Unfortunately, our boss, our customers and our regulators tend to take these reports as gospel.

And then I hafta 'splain why it takes so darned long to fix those things. I swear, sometimes I feel like Admiral Hopper explaining nanoseconds.

So here I present "Hofstadter's Corollary on Remediation".

Let's start with Hofstadter's Law, which is:

"It always takes longer than you expect, even when you take into account Hofstadter's Law."

Hofstadter's Corollary on Remediation states that remediation efforts always take longer than an outsider's estimate, even when the estimate takes into account Hofstadter's Corollary on Remediation.

But why is this? Let's unpack. Here's an example:

"Finally, let's estimate that each SQL injection vulnerability will require 40 developer hours x $100 per hour to fix, or about $4,000 total in labor costs." - Well Respected Industry Smart Person

Variable and fuzzy costs

Is that all it takes?  Well, no.  Let's put aside opportunity cost, which means you stall the development pipeline of requested customer features: a large but kinda fuzzy cost.

And let's also put aside the flippant assumption that fixing a SQL injection vulnerability is just that simple, with no underlying major software infrastructure components that need to be reworked. It happens, but let's just leave that aside because it's also hard to quantify until you dig into the particulars.

And let's also not include the possibility that an organization has vulnerabilities because the whole development process is fubar - which actually is likely in places with lots of discovered vulnerabilities because of Boulding's Backward Basis - but that's also pretty variable for now so let's leave that aside too.

Known costs

So, let's focus on the tangible costs. For one, most organizations that care enough about security in their products usually also have very defined operational processes.

And some of us in highly regulated industries can't take advantage of new-fangled rapid deployment models.

So any code change means it goes through release planning, functional requirements capture, use case development, specification development, specification review, test plan development, test case development, coding (the only part covered in the estimate quoted above), code review, feature testing, regression testing, integration testing (sometimes against KLOCs of code and third-party libraries), readiness review, release management, package development, documentation of release, site deployment to user acceptance testing, site deployment to production in an acceptable change window, and site deployment to recovery sites.  Each one of these steps is at least a few hours of someone's time. And most painful of all, this whole wheel takes several months to turn... even for a seemingly minor code change.
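To see how the 40-hour estimate balloons, just add up the wheel above. A rough sketch, where every per-step hour figure is my own invented placeholder, not a measured number:

```python
# Hypothetical hours per release-process step for one "simple" code fix.
# The step list mirrors the process above; the hour figures are invented.
process_hours = {
    "release planning": 4,
    "functional requirements capture": 4,
    "use case development": 4,
    "specification development": 8,
    "specification review": 2,
    "test plan development": 8,
    "test case development": 8,
    "coding (the part the assessor estimated)": 40,
    "code review": 4,
    "feature testing": 8,
    "regression testing": 16,
    "integration testing": 16,
    "readiness review": 2,
    "release management": 4,
    "package development": 4,
    "release documentation": 4,
    "UAT deployment": 4,
    "production deployment": 4,
    "recovery-site deployment": 4,
}

total = sum(process_hours.values())
print(f"Total: {total} hours vs. the 40-hour estimate "
      f"({total / 40:.1f}x over)")  # Total: 148 hours ... (3.7x over)
```

And that's hours of labor, not elapsed calendar time, which is worse.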

In other words, remediation will slam right up into the immovable object of business requirements. 

A simpler example of this: you find a vulnerability on your mail site. Do you fix it immediately, even if doing so shuts off your CEO's email while he's at a customer site in a tense negotiation?  Ummm. It's not an easy call, nor should it be.

But wait, there’s more

Suppose you find something that appears minor, like a single XSS on a help screen.  Well, fine and dandy, a quick fix. But wait, your developers just certified their product as 100% OWURST-compliant with the Brand Spanking Impenetrable Cross-Site Scripting Defense Force-field.  So how did this XSS bug slip through?  You may have an endemic problem you weren't aware of.  And it's likely that there are more of them to be found.   So now a minor remediation effort becomes a major bug hunt. Or it should, if you're Doing The Right Thing(tm).  And some bugs, like XSS, are often part of the overall I/O engine of an application, which may entail a major overhaul.

Now why would an outsider stamp low remediation estimates on things they know we have to hand-craft a solution for?  I'll just slice it with Hanlon's razor.  But...

What about vendor-patched vulnerabilities?

So far I've just focused on the big and nasty vulnerabilities found in apps that you're responsible for fixing yourself. 

Suppose you have a minor vulnerability in your Pythia database server.  Well, you can just download the fix. Remediation effort: low.  Just apply the patch. 

Whoa boy, you can't just do that.  You're talking about a production database on a high-volume financial transaction system.  We gots a procedure here. A procedure that's checked and audited, with paperwork up the wazoo. Can't do squat without six managers' sign-offs, two DBAs testing queries to make sure nothing done broke (regardless of how banal the change actually is), at least a full month of regression testing, and then finally waiting for the appropriate change window, which comes once a month at 3am on a Sunday.  When the moon is blue.
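Even the "just apply the patch" path has a calendar cost. A toy tally of the elapsed time under the procedure above (every duration here is invented for illustration, not measured):

```python
# Hypothetical elapsed calendar days to apply one "low effort" patch
# in a regulated production environment. All durations are invented.
steps_days = {
    "collect six managers' sign-offs": 10,
    "DBA query testing": 5,
    "full regression test cycle": 30,
    "wait for the monthly 3am Sunday change window": 15,  # average wait
}

total_days = sum(steps_days.values())
print(f"'Low effort' patch lands in ~{total_days} calendar days")  # ~60
```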

Remediation effort low, my ass.

* Remediation guesstimation is the second most annoying thing in assessor reports.  This is the first.

Tuesday, January 31, 2012

The Vulnerability Assessor Risk Rating System revealed

Vulnerability Assessor Risk Rating System:
  • Server has an IP address
  • Web server apparently is serving up web pages
  • Server running SSL instead of TLS: Super High
  • Directory content listing found in default Apache directory: Double-plus ungood
  • Self-signed certificate found on test web server
  • Correctly guessed login name is “admin”
  • DoS vulnerability found in version number in banner grab: Fraught with peril
  • Developer email address found in HTML source
  • Password autocomplete not disabled on login form
  • Non-persistent cross-site scripting found
  • SQL injection found on admin SQL database query tool