Time for IT security professionals to do a gut check over the risks of consumerization.
If you don’t do a quantitative risk analysis of employee-owned devices in the enterprise, “you might as well go with your gut,” rather than rely on a qualitative or quasi-quantitative risk analysis, said Peter Davis, principal of risk analysis firm Peter Davis & Associates.
The good news: subjectivity and objectivity aren’t discrete binaries, but rather a spectrum of data-supported decision-making. “It always comes down to a human at some point,” Davis said. But the idea is to reduce the subjectivity in risk analysis, since subjective judgment is less precise and more uncertain.
But just having data doesn’t make your risk analysis any more accurate. You can’t aggregate the colours on a threat survey, and polar diagrams “represent and mean nothing.” Just because a method is structured and formal doesn’t make it meaningful. “Astrology is both structured and formal,” Davis said.
Good data and good models are necessary for a good risk analysis. It starts with the information you have within your own system and supply chain, but there’s plenty of other data to draw on to determine the value of assets, systems and processes.
“You can find exemplars in other industries,” he said. Much of the modeling of risk analysis for disaster recovery, for example, is drawn from the nuclear power industry. There are also many relevant surveys that can address the probability of, for example, someone losing a smart phone, which can be factored into the analysis.
People’s shaky grasp of mathematics is a roadblock, though. Ask a person to draw a red marble blindly out of one of two urns, Davis suggests, telling them one holds one red marble in 10 while the other holds eight in 100. Most people will choose the urn with the most red marbles, even though the other offers a probability two percentage points higher (10 per cent versus eight per cent), he said.
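The arithmetic is easy to check. A quick simulation in Python (an illustration, not something from Davis’s talk) bears out why the urn with fewer red marbles is the better bet:

```python
import random

# Urn A: 1 red marble out of 10; Urn B: 8 red marbles out of 100.
urn_a = ["red"] + ["white"] * 9       # P(red) = 1/10  = 10%
urn_b = ["red"] * 8 + ["white"] * 92  # P(red) = 8/100 =  8%

trials = 100_000
hits_a = sum(random.choice(urn_a) == "red" for _ in range(trials))
hits_b = sum(random.choice(urn_b) == "red" for _ in range(trials))

print(f"Urn A: {hits_a / trials:.1%}")  # ~10.0%
print(f"Urn B: {hits_b / trials:.1%}")  # ~ 8.0%
```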
“Information security people don’t like to talk about probability,” Davis said; they prefer to focus on the possibility of a threat.
Davis advocates an open risk analysis process, the FAIR (Factor Analysis of Information Risk) standard, standardized by The Open Group. The standard models risk as a function of loss frequency and loss magnitude. Loss frequency breaks down into threat event frequency and vulnerability (which he defines as the probability that the threat capability exceeds the ability to resist the threat; think Superstorm Sandy versus the New York subway). Loss magnitude breaks down into primary and secondary losses, and so on.
“It keeps decomposing,” Davis said.
Analysts supply minimum, maximum and most likely (or mode) values for each variable, which then feed a Monte Carlo simulation that repeatedly samples those ranges at random.
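As a rough sketch of what that looks like in practice, the following Python snippet runs a FAIR-style Monte Carlo over triangular distributions built from minimum, most-likely and maximum estimates. The variable names, dollar figures and distribution choice are illustrative assumptions, not taken from The Open Group’s standard:

```python
import random

TRIALS = 10_000

def sample(low, mode, high):
    """Draw one value from a triangular distribution defined by
    the analyst's minimum, most-likely and maximum estimates."""
    return random.triangular(low, high, mode)

losses = []
for _ in range(TRIALS):
    # Loss event frequency: estimated loss events per year.
    frequency = sample(0.1, 0.5, 2.0)
    # Loss magnitude: primary plus secondary loss per event, in dollars.
    magnitude = sample(5_000, 20_000, 250_000) + sample(0, 5_000, 100_000)
    losses.append(frequency * magnitude)

losses.sort()
print(f"Median annual loss exposure: ${losses[TRIALS // 2]:,.0f}")
print(f"95th percentile:             ${losses[int(TRIALS * 0.95)]:,.0f}")
```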
Eugene Taylashev, manager of information security for International Financial Data Services, described three other models of risk analysis.
A simple threat assessment uses business language and processes oriented toward line-of-business managers. The seven-step process relies on the knowledge of the businesspeople involved.
It begins with a list of concerns. For example, moving the front end of a Web site to a cloud services vendor might cause the front end to become unavailable, possibly violating a customer SLA; the data might be exposed or corrupted, with either leading to penalties for the company. The subsequent steps involve separating events from impacts (unavailability is an event, the SLA violation an impact); estimating frequency and impact on a logarithmic scale; producing a risk heat map; and developing risk treatment options.
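To make the scoring step concrete, here is a hypothetical sketch of the log-scale banding in Python; the bands, concerns and figures are assumptions for illustration, not Taylashev’s actual scales:

```python
import math

def band(value: float) -> int:
    """Map a raw estimate onto a logarithmic (order-of-magnitude) band."""
    return round(math.log10(value))

concerns = [
    # (event, impact, frequency per year, loss per event in dollars)
    ("front end unavailable", "SLA violation", 2.0, 50_000),
    ("data exposed",          "penalties",     0.1, 1_000_000),
]

# Each (frequency band, impact band) pair becomes a cell on the heat map.
for event, impact, freq, loss in concerns:
    print(f"{event} -> {impact}: "
          f"frequency band {band(freq)}, impact band {band(loss)}")
```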
A failure modes and effects analysis (FMEA) is better suited to technologists, but still requires only a single spreadsheet page. Threat probability, vulnerability severity and ability to detect are each rated from one to three and multiplied together for each asset, producing a risk priority number.
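The arithmetic behind the risk priority number is trivial; a sketch with made-up ratings:

```python
# FMEA-style risk priority numbers: probability x severity x detectability,
# each rated 1 (low) to 3 (high). Assets and ratings are illustrative only.
assets = {
    # asset: (threat probability, vulnerability severity, ability to detect)
    "employee smartphone": (3, 2, 3),
    "web front end":       (2, 3, 1),
    "payroll database":    (1, 3, 2),
}

rpn = {name: p * s * d for name, (p, s, d) in assets.items()}
for name, score in sorted(rpn.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: RPN = {score}")
```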
The ISO 27005 standard is more rigorous and in-depth. Anything with business value is listed: primarily business processes and information, but also the supporting systems, personnel and supply chain. All related threats are evaluated on the basis of their likelihood versus their impact on business value, and risks over a certain threshold are prioritized for treatment.
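The prioritization step reduces to a simple comparison. A sketch, assuming a likelihood-times-impact score and an arbitrary treatment threshold (both assumptions; the standard leaves scales to the organization):

```python
# ISO 27005-style prioritization: score each (asset, threat) pair as
# likelihood x impact and flag anything above the treatment threshold.
register = [
    # (asset, threat, likelihood 1-5, business impact 1-5)
    ("order-processing workflow", "cloud outage",  4, 5),
    ("customer database",         "data exposure", 2, 5),
    ("build server",              "malware",       3, 2),
]

THRESHOLD = 12  # risks scoring above this must be treated

for asset, threat, likelihood, impact in register:
    score = likelihood * impact
    flag = "TREAT" if score > THRESHOLD else "accept/monitor"
    print(f"{asset} / {threat}: {score} -> {flag}")
```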