They are etched into the conventional wisdom of IT security, but are these 12 articles of faith (to some) actually wise, or are they essentially myths? We’ve assembled a panel of experts to offer their judgments.
1. There’s security in obscurity.
David Lacey, Jericho Forum founder and researcher: Yes, there is. Not everything is known or knowable to an attacker. This uncertainty prevents and deters the vast majority of attacks.
Nick Selby, analyst, The 451 Group: No, there’s convenience in security. Say you’re trying to keep your kid from discovering the birthday party plans you’re making, and you don’t want the workaday toil of waiting until he’s asleep to discuss them. So around the dinner table, speak German. Now, for protection of … well, anything, it’s just not on. Wherever you hide the front door, it is trivially discovered, so recognize you live in a bad area, get a strong front door with good locks — and don’t hide the key under the garden gnome.
Bruce Schneier, crypto expert, chief security technology officer at BT: All security requires some secrets: a cryptographic key, for example. But good security comes from minimizing and encapsulating those secrets. The more parts of a system you can make public — the less you have to rely on secrecy or obscurity — the more secure your system is.
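Schneier's point about minimizing secrets is, in essence, Kerckhoffs's principle: assume everything about the system is public except the key. As a loose editorial illustration (not something the panel supplied), here is a minimal Python sketch using the open source cryptography package, in which the algorithm, the source code and the message format are all public and the only thing that needs to stay obscure is one key:

    # Kerckhoffs's principle in miniature: the cipher (Fernet, built on AES plus HMAC)
    # is fully published; the only secret the system depends on is the key.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()      # the one secret worth protecting and rotating
    cipher = Fernet(key)             # the algorithm itself gains nothing from hiding

    token = cipher.encrypt(b"board meeting minutes")
    print(cipher.decrypt(token))     # b'board meeting minutes'

Publish the code and the format, guard only the key. That is the opposite of hiding the front door and hoping nobody rattles the lock.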
Peter Johnson, global information security architect, Lilly UK: It can slow down the bad guys, but they will find out in the end. It is like closing the front door at home, and hoping nobody will try opening it.
John Pescatore, Gartner analyst: Only true within the bounds of the tried and true concept of ‘need to know.’ For example, keeping your password obscure is obviously a smart strategy — only you have a need to know. … Where this one falls apart is when the assumption is that ‘obscurity means security.’ This is never true — and worse, when people design software with this concept in mind, all kinds of bad things happen.
Richard Stiennon, independent analyst: I was thinking about this in terms of Web application firewalls. There are 70 million Web sites but probably only a few thousand Web application firewalls sold so far. Most Web sites are protected by the principle of security through obscurity.
Andrew Yeomans, vice president global information security at an investment bank, and Jericho Forum member: Obscurity buys you time, but doesn't last forever. Obscurity can add an extra barrier, and may deter poorly resourced attacks. But a better-resourced attacker may succeed, and as costs keep dropping, such an attack may need only low-cost resources in the future. And once obscurity is lost, security is lost forever, too.
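Selby's point that a hidden front door "is trivially discovered" and Yeomans's point that obscurity only buys time are easy to demonstrate. The sketch below is an editorial illustration rather than anything the panel provided: a crude Python TCP sweep against a host you control, showing that moving a service to an unusual port costs an attacker only the seconds it takes the loop to reach it.

    import socket

    # A crude TCP connect sweep: relocating a service to an "obscure" port
    # only costs an attacker the time it takes this loop to reach it.
    def find_open_ports(host, ports, timeout=0.2):
        open_ports = []
        for port in ports:
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
                s.settimeout(timeout)
                if s.connect_ex((host, port)) == 0:   # 0 means the connection succeeded
                    open_ports.append(port)
        return open_ports

    # Sweep a small non-standard range on your own machine.
    print(find_open_ports("127.0.0.1", range(2000, 2300)))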
2. Open source software is more secure than closed source.
Yeomans: At least when open source breaks you get to keep the pieces, and might be able to glue them together yourself. Some open source software has been well inspected (‘many eyes make bugs shallow’) but conversely other open source software is relatively insecure. There’s probably little to choose between comparable open and closed source software on pure security grounds. But open source has the advantage that you can do a code review yourself, or pay to have one done, and also that it is possible to fix problems yourself without having to wait for the vendor.
Lacey: They present a different set of risks. Neither is more secure than the other.
Schneier: Secure software is software that has been analyzed by smart security programmers. There are two basic ways to get software analyzed: You can pay people, or you can make the code public and hope they do it for free. Open source software has the potential to be more secure than proprietary software, but making code public doesn’t magically make it more secure.
Johnson: At least you know what you’re getting [with open source] — but it requires a different approach to support it, particularly in a regulated environment.
Pescatore: This one is not that far off, but still not true. The most secure software is software that is developed with the most attention to security. Most open source development projects do not have much of a secure development life cycle. But I do believe that software developed knowing the source will be open is more secure than software developed depending on security through obscurity.
Developers are less likely to build in Easter eggs, back doors and other stupid things when they know the source will be widely viewed.
3. Regulatory compliance is a good measure of security.
Lacey: Yes, it is. I have always found a direct correlation between the number of controls implemented and the level of incidents and vulnerability.
Selby: (laughter)
Stiennon: Obviously not. You can be extremely secure but not compliant. Just as you can easily be compliant but not secure.
Schneier: Compliance is a good measure of the regulation. If the security regulation is a good one, then compliance improves security. If it’s a bad one, then it doesn’t.
Yeomans: It’s not always a measure of good security. Regulatory compliance will help provide a reasonable base level of security, and may make it easier to justify the budget cost. But it may sometimes lead to good security measures being non-compliant, and compliant measures being more expensive than is justified.
Johnson: There are usually many ways to comply with a regulation — not all are as secure as the others. Experience has shown this, and now the regulators are starting to try to specify requirements, which is going to be difficult as they generally do not understand security.
Pescatore: No-brainer, dead wrong. Especially for something like Sarbanes-Oxley, which actually has nothing to do with security. What we tell c