Once again I'm ranting about things that I thought should have been settled long ago. And in some ways, they have… but on the ground, I see nothing but confusion. So today's rant is: what is this thing that we do?
I purposely chose the blandest and most overused term, "cyber security", because I see it thrown around by the folks who seem the most clueless about it. The simple question of what is expected of a cyber security professional gets particularly problematic when an organization goes to staff up and build its first cyber security program. You're hiring your first security professionals: what should they know? You're reorging your IT group: where does the security department fit in? Who does your head of security report to? What should the dedicated cyber security team be responsible for and, more importantly, not be allowed to do? I've seen a wide variety of expectations across industries and organizations. In some cases, the role is defined by regulation, but when it's not, it tends to be squishy and inefficient.
As previously alluded to, old-school cyber security equated IT security entirely with info security. A tactical place to start, but definitely not a mature or effective model for thriving in a DevOps-meets-APT world. And in the most ineffective and inefficient cases, you'd see normal IT and development work going on with the security team coming in afterwards to "make things secure".
First off, let me say that I don't know exactly what cyber security should definitively encompass. In this post, I'm only going to discuss my own thoughts and experience. A bit of background: I'm an IT ops guy (first 10 years of my career) who grew into IT security (next 5 to 7 years) and now does information assurance (last 10 to 12 years). I think the best way to break this down in this first cut is to roll thru the ISC2's CBK. It ain't perfect and it's kinda old, but its general shape matches what we're supposed to be doing in cyber security.
Access Control
I see this as split between the security group and the IT operations group. The security group should definitely consult, design, and assess against the big-picture stuff, like models, techniques, and threats around access control. The IT ops group should deal with the specific mechanisms. You will not see me tinkering with AD permission groups or managing two-factor tokens. But I will provide guidance on how these things should be done to match the risk and business requirements of the organization.
Telecommunications and Network Security
Same as access control. I grew up as a firewall guy, but I haven't configured a firewall in years. The IT network engineering team is best suited to do this, especially since they understand all the nuances and implications of opening this port over that port, splitting a particular VLAN, or how the VPN links should be laid out. I absolutely consult on and oversee the designs and configurations to make sure they meet our requirements. I also keep up on the latest DDoS techniques, research new perimeter controls, and make sure the infrastructure team gets the highlights regularly.
InfoSec Governance and Risk Management
The security department pretty much entirely drives this. That doesn't mean everyone else is off the hook for understanding and following the policies to the best of their abilities; hopefully, the message that security is everyone's job applies here. I often spend the bulk of my time translating and interpreting the organization's policies for individuals and departments wanting to accomplish some business thing in accordance with policy (well, actually they want to make sure they don't flag the attention of the auditors, but that's mostly because I've trained them that way).
Software Development Security
Realistically, this is its own thing, looking like network security but for programmers: a group of security consultants and assessors doing the big-picture stuff, embedded with the developers who do the actual implementation of more secure code. I can think of no better concrete example of how to organize this than Microsoft Trustworthy Computing. I'd even slice web security off as its own specialty here, but still following the same model.
Cryptography
Beyond the specialized discipline, the security team should know the concepts, the algorithms, and the pitfalls well enough to advise the various IT and developer teams on the implementation of specific controls.
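As a minimal sketch of the kind of advice I mean (the code and numbers here are illustrative, not a mandated standard), here's the classic password-storage pitfall versus the guidance I'd hand a developer team, using only Python's standard library:

```python
import hashlib
import os

password = b"correct horse battery staple"

# The pitfall I still see in code reviews: a fast, unsalted hash,
# trivially cracked with rainbow tables or commodity GPUs.
weak = hashlib.md5(password).hexdigest()

# The advice: a per-user random salt plus a deliberately slow derivation.
# The iteration count is a placeholder -- tune it to current guidance and
# your own hardware, and store the salt alongside the derived hash.
salt = os.urandom(16)
strong = hashlib.pbkdf2_hmac("sha256", password, salt, 600_000)
```

And honestly, the even better advice is usually to point the developers at a maintained library (bcrypt, scrypt, or Argon2) rather than have them assemble the pieces by hand.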
Security Architecture and Design
This is something that consumes a lot of my time… and something I enjoy doing. This is definitely where I would expect a "cyber security professional" to know things deep and wide. Lots of advising and assessing both up and down the organization chart on how and why some designs are better than others. And yes, lots of research and upkeep to keep abreast of new threats and technologies.
Operations Security
Ideally, I think this should be something designed with direction from the security team but run at the edges of, or even outside, the security team's domain. Realistically, it's often part of the day-to-day duties of the security group. Vulnerability management should really be part of the IT group's daily grind. Incident response is led by security but should pull from all groups, though forensics is its own thing and usually owned by security or legal (see below).
Legal and Compliance
Security should definitely understand a lot here, especially because in some organizations the legal department can often lack cyberlaw knowledge. The minimum expected of any security professional should be the ability to translate compliance requirements into technical standards. If the security team is lucky, they've got a technically savvy legal team that collaborates on cyber security matters. I haven't been lucky. Cyber forensics and investigations should also live in the legal department, with some assist from security.
Business Continuity
Again, I think this should be its own group, but often security owns most if not all of this function. It's a deep and wide discipline that affects everyone in the organization, which demands a lot of resources. Because of this, when it's owned solely by security, it's often not done very effectively. The security team is usually so busy they rarely have the time (or interest or skills) to do a great job here.
Physical and Environmental Security
In large organizations, this is a separate group standing parallel to cyber security. In most, the security team owns it as well. Typically, the security group does the design and sets the standards, then works with facilities and/or the IT team to make sure things are enforced. Things definitely get strained when a physical security event occurs and three different groups come to the table to figure out what the heck happened and where the fingers should be pointed. Since I'm in a medium/small organization with high assurance requirements, I own most of this and have to know a bit about alarms, door locks, and building materials.
Okay, that's it. Rant off. What's your experience? What do you think we should be doing?
Extra credit: does infosec require an IT background or can you be a pure infosec professional with minimal engineering training?
“We’ve just traced the attack... it’s coming from inside the house!” How do you secure your network when the bad guys already have control of your servers? It’s so hard to keep up with the attacks that maybe it’s safer to architect with the assumption that you’ve already been breached. What does this entail?
Friday, November 8, 2013
Thursday, October 31, 2013
13 Warning signs that your infosec program is stuck in the wrong decade
- Over-focus on operational controls: Firewalls, anti-virus, passwords. Under-focus on security architecture, systems analysis, and business needs.
- Risk analysis means doing a gap analysis against the list of best practices du jour.
- Incident response means everyone running around with their hair on fire.
- A "post-mortem" means patch the hole and move on. Root cause? It was that hole. And we just fixed it!
- "We recognize that privacy is very important to our customers. Our website uses Secure Sockets Layer so that information you provide to us is protected over the Internet."
- Security is all about the CIA triad, but when push comes to shove, it's availability that wins.
- Vulnerability scanning is done once every few months. Against the Internet perimeter. And you only look at the "highs".
- Access controls are binary: you can either see nothing or you can see everything.
- Security policies are inches thick, nobody reads them.
- Authentication is entirely about really complicated passwords, rotated frequently.
- Paper shredders everywhere because physical security is important, dammit! Laptops and drives rarely encrypted.
- The IT department manages the IT and the security department adds the security afterwards.
- Application security means our software supports strong passwords.
Any additions? Please post here or on your Myspace page!
Wednesday, October 30, 2013
Assuming the assumption
Fresh out of SIRACon 2013, I'm a-bubble with ideas and rants, ready to get cracking on the security trail. But back in the real world, I get a taste of some of the same-old same-old problems that we security folks stumble over: bad assumptions.
It bears repeating that we should always question our assumptions, at least until they're formally validated in a meaningful way. But many of us still take simple things at face value when we shouldn't, especially if we're told the information third-hand. After all, people will tell the security director what they think she wants to hear.
For example, I recently ran into an organization that got hit with a nasty outbreak of malware. They had assumed that every workstation was decently patched and running updated AV. My rule of thumb is to expect 80% average coverage out of the gate in a well-run organization with no prior checking. And of course, I'm inclined to use my handy-dandy scanning tools to bring that number up to at least 95%.
In the aftermath of the outbreak, it turned out my assumption was closer to the reality than theirs. The unprotected machines were slammed pretty hard... and now they know better.
Not that this was an aberration. I've seen this time and time again with firewall rules, encryption policies, user privileges, facility authorization lists, key inventories... anything that is complex, tedious to manage and invisible without formal review. Don't assume something that's been sitting for a long while is still the same. Don't guess on how something was set up by a previous administration. Don't believe what you're told. Double check. It's your responsibility to know better. That's what you're paid to do.
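For what it's worth, the double-checking doesn't have to be fancy. Here's a minimal sketch, assuming your scanner or endpoint tool can export results to a CSV (the column names are made up -- adjust them to whatever your tooling actually emits):

```python
import csv
from datetime import datetime, timedelta

# AV signatures older than a week count as unprotected in this sketch.
STALE = datetime.now() - timedelta(days=7)

def coverage(path="scan_export.csv"):
    """Rough percentage of hosts that are both patched and running fresh AV."""
    total = protected = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            total += 1
            patched = row["patch_status"].strip().lower() == "current"
            av_fresh = datetime.strptime(row["av_sig_date"], "%Y-%m-%d") >= STALE
            if patched and av_fresh:
                protected += 1
    return protected / total if total else 0.0

print(f"Verified coverage: {coverage():.0%}")
```

The point isn't the script; it's that the number comes from a fresh export you pulled yourself, not from what somebody told you last year.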
Monday, August 5, 2013
CVSS is insufficient for risk determination
No, I didn't attend BH/DC/BSLV this past week... that makes it about a decade since I last attended. I did try to catch up on some of the more interesting talks.
This one from the Risk I/O gang on CVSS trends really struck me. A good talk; watch the whole thing if you haven't. They talk about how organizations are using CVSS scores to determine patching priority (or in many cases, patch or no patch) and how this yields very low effectiveness against breaches.
My question: do people not read? The "V" in CVSS stands for vulnerability. Vulnerability != Risk. Vulnerability is part of the risk equation, roughly one third of the standard risk formula. And in any case, are people reading the actual vulnerability? Or are they just taking the score and blindly applying it to their patch program?
Never mind.
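Snark aside, here's the point as a minimal sketch, using one common three-factor formulation of risk (the numbers are made up for illustration, not pulled from any framework):

```python
def risk_score(threat_likelihood, vuln_severity, business_impact):
    """Toy three-factor risk estimate; each input normalized to 0..1."""
    return threat_likelihood * vuln_severity * business_impact

# A CVSS 10.0 vulnerability on an isolated lab box nobody targets...
lab_box = risk_score(threat_likelihood=0.1, vuln_severity=1.0, business_impact=0.2)

# ...versus a CVSS 6.5 vulnerability on the Internet-facing payment server.
payment_server = risk_score(threat_likelihood=0.9, vuln_severity=0.65, business_impact=1.0)

# The "lower" CVSS score carries far more risk, so the patch priority flips.
print(round(lab_box, 2), round(payment_server, 2))
```

CVSS only informs the middle term. Feed it straight into a patch program without the other two and you're prioritizing on a third of the picture.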
- BTW, I remember when Risk I/O was HoneyApps and they were just launching. Bright guys but when they pitched to me, I asked if I could apply risk models to my stored vuln data. They said no, I couldn't. So I wasn't really interested in their service. I see they've since pivoted to match the market demand. Good for them.
Monday, July 29, 2013
What's a normal person supposed to do if they're hacked?
Today I had to do some quickie malware clean-up for my daycare provider... she's a sweet woman who doesn't know much about technology, much less the complex cat-and-mouse game of security. Like many of you, I'm the one who has to step in and clean friends-and-family computers when this happens. But if it wasn't for us volunteering our services, what are these people supposed to do?
I'm well versed in the organizational response and the accompanying responsibilities. But the ordinary Joe is left to the mercy of figuring it out themselves. I'm betting many of them end up going to a computer repair shop (big or small) and paying hundreds of dollars for clean-up. And honestly, maybe that is a fair price for the level of expertise required to fix things. Provided they do a good job.
But even the simple basics of prevention aren't making it down to the street level. After I did the clean up, I ended up doing a whole round of patching and anti-virus updating. Then a short lecture on choosing good passwords and being careful about what one loads. Yeah, anyone under the age of 25 probably knows this stuff already. But how many over the age of 50?
At a recent security seminar, TJ Campana of Microsoft Digital Crimes Unit said something that really resonated with me. He said good or bad, there are millions of Windows users out there counting on them for protection. And realistically, whatever software update gets automatically pushed down to those thousands of infected boxes is probably the only help these people may get.
I'm hoping with all the hoopla and sploiting and games going on in Vegas this week, someone is thinking about how we can do a better job of reaching out to the technically challenged.
Wednesday, April 17, 2013
Which certification should I get for my organization to prove it's "secure"?
Of course, we all know you can't prove anything is secure. The best you can hope for is to show that a specific, scoped set of things was tested against a specific set of criteria at a specific point in time. And the quality of that testing definitely varies.
But anywho, there are a handful of big, well-known ones to pick from.
The answer: Whatever certification your customers/regulators/suppliers will accept.
That's it. It ain't about validity or rationality or economics or appropriateness or truthiness. No matter what you do, if you don't do the one your third parties are asking for, they won't be satisfied.
Of course, some folks choose to do none of the above and simply say "hey, just come audit me", but that has a price too.
In the end, it'll always be a trade-off between what you do for compliance and what you do for security. The trade-off could be big or small, but we know that no standard fits every organization, and this is just what you have to do.
Saturday, March 16, 2013
Footnotes and further research for my Cascadia talk
These are additional notes and research links for folks who attended my Cascadia IT conference talk "Into the Breach - Transitioning into an Infosec Career"
Talk Slides here
- Visible Ops Security - a basic how-to for security in the enterprise
- How to do a risk analysis - an overview of the methods and types
- Avoid becoming one of these - Who in infosec gets a bad rap
- Dealing with people, politics and non-technical folks - Great podcast, great site
- The Six Dumbest Ideas in Computer Security - Marcus Ranum is very wise
- Ten ways to build/improve your infosec career
- UW program for budding CISOs
- UW program for security engineers
- Online Coursera class on security strategy - highly recommended!!