Once upon a time, IT people worried about keeping systems running. Auditors worried about financial statements. The twain rarely met.
Then along came the Internet, which begat IT security and privacy requirements. Then fell Enron, which begat compliance requirements. Now auditors worry about systems. IT people worry about auditing. Worlds are colliding. People are not happy.
Many system auditors say IT professionals lack a broad understanding of the controls needed to maintain systems integrity. At a fundamental level, a control is a process to prevent, detect or compensate for risk. The big, implicit question in that simplistic definition is: risk of what?
Therein lies the first problem: lack of training. “Most general computer science programs focus on systems administration, networks, development and so on. There’s little to teach them about the pitfalls of corporations, different security breaches or opportunities for fraud. The reason you’re unlikely to find that in a general computer science program is because each business environment is going to be different,” says Will O’Brien, president of the Manta Group, a Toronto-based IT governance consulting firm.
This leads to problem number two: a little knowledge is a dangerous thing. Incomplete knowledge of potential risks may lull IT people into a false sense of security. “Traditionally, IT controls have been more tailored to deal with operational risks rather than, say, internal controls over financial reporting. When you talk to IT people, they may believe they have the risks addressed, but they haven’t in a lot of cases had the chance to learn about what’s involved in Sarbanes-Oxley,” says Mario Durigon, senior manager within the information risk management practice at KPMG.
An IT professional might retort: but it’s up to the business units to have the requisite understanding of the risks in their own areas. But reality does not always match the ideal. “You’ve got situations in larger organizations where the core understanding of responsibilities isn’t there,” says Aron Feuer, president of Ottawa-based consultancy Cygnos IT Security. “If handled appropriately, people would have a clear understanding based on policies about who has responsibility for implementing controls.”
But IT people, he says, often get stuck, and this conundrum leads to audit failures in many organizations.
Feuer adds that the burden is unfair, since an IT department’s primary mandate should be to “keep the lights on or to develop new business functions. But the fact is, because security is so broad and deep, no single entity can avoid responsibility and IT people do get an unfair burden because they’re the ones responsible for stringing things together,” he says.
Controls in 3-D

Broadening the context of controls means looking at controls in three dimensions, explains Feuer. “A techie may interpret security controls as the application of a hardening template and maybe removing some questionable source code. But when we look at controls in the enterprise, we should really be saying the function of a control defines its design,” he says.
The first dimension centres on determining what kind of control is actually needed: is it preventative, detective, corrective or deterrent in nature? The second is determining where in the system the control should be placed to be effective, with a view to measuring its ability to achieve business goals such as confidentiality, integrity, availability and mitigation of vulnerabilities.

The third is determining how a control is applied: is it technical, architectural or procedural in nature, and does it map back to policy, regulation and governance objectives?

Awareness training can help IT people gain a better understanding of controls, but there are some psychological stumbling blocks to overcome, says Feuer.
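Read literally, those three dimensions look like the fields you would expect on any single entry in a control register. The sketch below is only one illustration of that reading; the field names and the example control are assumptions made for this article, not anything Feuer or the Manta Group prescribes.

```python
from dataclasses import dataclass

# Hypothetical field names; invented for illustration only.
@dataclass
class Control:
    name: str
    function: str    # dimension 1: preventative, detective, corrective or deterrent
    placement: str   # dimension 2: where in the system the control sits
    goal: str        # business goal it is measured against (confidentiality, integrity, availability)
    applied_as: str  # dimension 3: technical, architectural or procedural
    maps_to: str     # the policy, regulation or governance objective it traces back to

# Example entry: one control described along all three dimensions.
failed_login_alerting = Control(
    name="Failed-login alerting on the financial reporting system",
    function="detective",
    placement="application and directory-service log pipeline",
    goal="integrity of financial reporting data",
    applied_as="technical",
    maps_to="Sarbanes-Oxley internal controls over financial reporting",
)
```

Cataloguing a control this way is the easy part; getting people to think in those terms is where the stumbling blocks appear.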
One issue centres around techies coming in with a lot of attitude. “In one client’s case, we did two separate sessions for developers and operational people,” says Feuer. “The developers said, ‘I understand that hosts can be vulnerable but why should I care about that, it’s operations’ job.’ When we sat with the operations folks and talked about buffer overflows and poor coding practice, they said, ‘That’s development’s problem.’”
Communicating the vast interdependencies in systems is often an eye-opening experience for trainees, says Feuer. “It’s important for them to understand security and controls have higher breadth and priority than many of the activities they’re engaged in independently.”
Developers don’t appreciate that a poorly hardened host system can let an attacker bypass all the security they’ve built into their .Net application, he says, adding that the same thing happens from an operations perspective. “If you have an application running at root- or system-level privileges that is compromised, it doesn’t matter what you’ve done around the host.” As in many spheres, people want to be shown, not told. Feuer believes the challenge is to give them real-world examples that make sense to them.
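The root-privilege point has a standard operational counterpart: a service that needs root only for setup should give those privileges up before it handles any requests. The sketch below is a generic illustration of that pattern, not anything described in the article; the port number and the “nobody” account are assumptions.

```python
import os
import pwd
import socket

def drop_privileges(username: str = "nobody") -> None:
    """Give up root after privileged setup, so a compromise of the
    application does not hand an attacker the whole host."""
    if os.getuid() != 0:
        return  # already unprivileged; nothing to drop
    pw = pwd.getpwnam(username)
    os.setgroups([])      # shed supplementary groups first
    os.setgid(pw.pw_gid)  # set group before user, or setuid would block it
    os.setuid(pw.pw_uid)

# Privileged step: binding port 80 requires root on most systems.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("0.0.0.0", 80))
listener.listen(5)

# Everything after this point runs as an unprivileged user.
drop_privileges()
```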
When doing training for developers, he says, the Cygnos team broke into a demo online shopping application, similar to a system the trainees might have developed themselves, and walked them through the actual exploits. They were shown, in a live environment, what the attacks meant for the areas they were responsible for. “Every time we took a break, the discussion was, ‘Holy crap, I didn’t realize these things could be done.’ They just didn’t have the core awareness of how compromises might be executed.”
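The article doesn’t say which exploits the Cygnos team demonstrated, but SQL injection is a classic example of the kind of coding flaw such a walkthrough exposes. The sketch below is purely hypothetical: the products table and the two search functions are invented to contrast a string-built query with a parameterized one.

```python
import sqlite3

# Hypothetical in-memory stand-in for a shopping application's database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE products (id INTEGER, name TEXT, hidden INTEGER)")
db.execute("INSERT INTO products VALUES (1, 'Widget', 0), (2, 'Unreleased item', 1)")

def search_vulnerable(term: str):
    # Building SQL by string concatenation lets the caller rewrite the query.
    query = f"SELECT name FROM products WHERE hidden = 0 AND name LIKE '%{term}%'"
    return db.execute(query).fetchall()

def search_safe(term: str):
    # Parameter binding keeps user input as data, never as SQL.
    query = "SELECT name FROM products WHERE hidden = 0 AND name LIKE ?"
    return db.execute(query, (f"%{term}%",)).fetchall()

# A benign search behaves the same either way...
print(search_vulnerable("Widget"))          # [('Widget',)]
# ...but a crafted term defeats the hidden-item filter in the vulnerable version.
print(search_vulnerable("' OR 1=1 --"))     # exposes 'Unreleased item' as well
print(search_safe("' OR 1=1 --"))           # [] - the input is treated as a literal string
```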
After showing them, Feuer and his team found that resistance to the discussion stopped. “If you can’t show folks why they need to care in their own environment, then it’s just academic (to them) and they don’t get involved.”