It’s time IT departments overhauled the way software is developed internally, a U.S. researcher and author has told a Canadian security audience.
Gene Kim told the annual SecTor conference in Toronto on Tuesday that collaboration between software developers and IT operations staff — dubbed DevOps — is the only way for troubled organizations to pull themselves out of a dangerous downward spiral of putting out buggy and insecure code.
“There’s no doubt in my mind (DevOps) is an inherent competitive advantage — great for dev, great for ops and great for information security,” he said.
It’s “an almost existential threat to the way we (information security personnel) work,” he said, “and yet I believe it’s one of the best opportunities for us to genuinely shape the quality of what dev and ops do.”
The two-day conference will see speakers advising on how to secure Big Data, explaining how malware creators automate their payloads, and detailing why cloud computing demands new strategies for protecting the edge of corporate networks.
The DevOps movement dates back to 2009 and has passionate adherents, including Kim.
Developers are often pressured to add more features to applications, he argued, sometimes taking shortcuts to meet deadlines. As a result, work on stability and security suffers.
Meanwhile IT operations, which has to run flaky production code, is left putting out fires. DevOps forces the two to work together so that, for example, test environments are available for developers to use.
It involves new ways of thinking about developing software, he said, such as focusing on ways to increase the pace of development, not passing on defects before code is released, deploying small changes instead of a lot of big ones at once, understanding the needs of internal and external customers, and not being afraid to learn from failure.
Those who practice DevOps offer amazing performance statistics: Amazon says it deploys new code once every 11 seconds, Kim said.
In 2012 he helped conduct a study of 4,200 organizations which found that high performers were deploying code 30 times more frequently than others, with twice the success rate. When problems did arise, they were fixed 12 times faster than at other organizations.
Elsewhere the conference heard a consultant warn that vendors pushing the nascent concept of software-defined networking aren’t talking enough about the security solutions that will be needed.
SDN is a way of virtualizing data centre networks, with a software controller overseeing switches and routers. That means a sensor at the Layer 2/3 level can automatically signal the controller when it detects an anomaly in traffic, said Llewelyn Derry, vice-president of business development at Texas-based ISC8, which makes threat detection solutions.
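To make the pattern Derry describes concrete, here is a minimal sketch of a sensor notifying an SDN controller when it flags suspicious traffic. The controller URL, JSON schema and suggested action are all hypothetical stand-ins; real controllers such as OpenDaylight or ONOS each expose their own northbound REST APIs.

```python
# Sketch: a Layer 2/3 sensor pushing an anomaly event to an SDN
# controller's northbound API so switches can be reprogrammed.
# Endpoint and payload fields are hypothetical.
import json
import urllib.request

CONTROLLER_URL = "https://sdn-controller.example.com/anomalies"  # hypothetical

def report_anomaly(src_ip: str, dst_ip: str, reason: str) -> None:
    """Send one anomaly event to the controller as a JSON POST."""
    event = {
        "source": src_ip,
        "destination": dst_ip,
        "reason": reason,                  # e.g. "port scan detected"
        "suggested_action": "quarantine",  # controller decides what to do
    }
    req = urllib.request.Request(
        CONTROLLER_URL,
        data=json.dumps(event).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        print("controller replied:", resp.status)

# Example: the sensor has spotted a host probing many ports.
# report_anomaly("10.0.0.42", "10.0.0.7", "port scan detected")
```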
Rob Johnson, lead security architect at Unisys, said the trend toward running flatter networks means it’s hard for organizations to separate sensitive data to meet regulatory demands by building separate networks.
Instead, he advocates restricting access to certain data pools to defined groups of people through what he calls cryptographically isolated virtualized networks.
“It’s a way of virtualizing the network and enforcing it using cryptography” rather than a VLAN, he said. Briefly, when a user logs in to the network, the identity management software (Active Directory or LDAP) determines what data the person is permitted to access and creates a secure tunnel to it.
Johnson said the solution is a software layer on top of an existing network.
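A rough sketch of the flow Johnson outlines appears below: on login, consult the directory for group membership, and only then open an encrypted tunnel to the data pool. The group name, host and in-memory directory lookup are hypothetical placeholders, not Unisys’s implementation; a real deployment would query Active Directory or LDAP and enforce isolation below the application layer.

```python
# Sketch: identity-driven access to a data pool over a TLS tunnel.
# Directory contents, group name and host are hypothetical.
import socket
import ssl

ALLOWED_GROUP = "finance-data-readers"  # hypothetical directory group

def user_groups(username: str) -> set[str]:
    """Stand-in for an AD/LDAP lookup of the user's group memberships."""
    directory = {"alice": {"finance-data-readers"}, "bob": {"staff"}}
    return directory.get(username, set())

def open_data_tunnel(username: str, host: str, port: int = 8443):
    """Open a TLS connection to the data pool only for authorized users."""
    if ALLOWED_GROUP not in user_groups(username):
        raise PermissionError(f"{username} is not cleared for this data pool")
    context = ssl.create_default_context()  # verifies the server certificate
    sock = socket.create_connection((host, port))
    return context.wrap_socket(sock, server_hostname=host)

# tunnel = open_data_tunnel("alice", "finance-data.example.com")
```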
Lucas Zaichkowsky of AccessData said organizations spend too much time and money on defensive solutions. In particular, he said there’s an over-reliance on network threat detection. Instead they should be proactive, combing network traffic and endpoint logs for anomalies.
Usually intruders spend time looking around the network before they begin stealing data, he said, which can give organizations time to find them.
In his words, that’s a “golden opportunity” to catch them before damage is done.
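As a toy illustration of the kind of hunting Zaichkowsky recommends, the sketch below combs endpoint logs for process names seen on only a handful of hosts, one common tell of an intruder’s tools during the look-around phase. The log format and sample data are invented; a real hunt would parse EDR or syslog output.

```python
# Sketch: flag processes that appear on very few hosts across the fleet.
# Input format (hostname, process name) is hypothetical.
from collections import defaultdict

def rare_processes(events, max_hosts=2):
    """Return processes seen on at most `max_hosts` distinct hosts."""
    hosts_by_process = defaultdict(set)
    for host, process in events:
        hosts_by_process[process].add(host)
    return {p: h for p, h in hosts_by_process.items() if len(h) <= max_hosts}

logs = [
    ("ws-001", "outlook.exe"), ("ws-002", "outlook.exe"),
    ("ws-003", "outlook.exe"), ("ws-117", "rar.exe"),  # data-staging tool?
]
print(rare_processes(logs))  # {'rar.exe': {'ws-117'}}
```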