Predictive analytics is going to become the norm for businesses in 2015, but perhaps not where you might think, according to Hewlett-Packard.
The company’s executives believe that predictive analytics – where large amounts of data are analysed to forecast future trends – will move, for many companies, from an experimental project to a ‘must have’.
Predictive analytics has often required a deep understanding of data science. Will it really make any difference to smaller businesses that don’t have the IT expertise or capital to buy specialised big data servers?
“It should make all of the difference,” said Chris Surdak, global subject matter expert for HP (NYSE: HPQ). “As more big data tools become commoditized, cloud-based, and delivered as utilities, and as more data becomes accessible through open data initiatives, smaller companies can use their greater agility to put predictive tech to use more quickly than larger firms. Costs are falling and availability is growing.”
The idea for smaller companies is to process these large data sets in the cloud, using managed services that crunch the numbers while keeping the cost as operational expenditure rather than capital investment on the balance sheet.
Eventually, an independent restaurant with a couple of locations could aggregate its own sales data with other external data – everything from traffic to the weather – and draw inferences about how many customers will visit and what they will eat. This could produce actionable guidance, such as how many pounds of prawns to buy for the following week.
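To make that concrete, here is a minimal sketch of how such a forecast might look in Python with pandas and scikit-learn. Everything in it is invented for illustration – the column names, the sample figures and the half-pound-of-prawns-per-customer conversion are assumptions, not anything HP describes.

```python
# A minimal sketch: predict customer counts ("covers") from external
# signals, then turn the forecast into a purchasing estimate.
# All data and column names here are fabricated for illustration.
import pandas as pd
from sklearn.linear_model import LinearRegression

# Historical data: daily customer counts alongside external signals.
history = pd.DataFrame({
    "temp_c":       [12, 18, 25, 9, 21, 15, 28],          # daily high temperature
    "foot_traffic": [300, 450, 620, 210, 500, 380, 700],  # passers-by counted
    "covers":       [80, 120, 170, 60, 140, 100, 190],    # customers served
})

# Fit a simple regression: covers as a function of weather and traffic.
model = LinearRegression()
model.fit(history[["temp_c", "foot_traffic"]], history["covers"])

# Forecast next week's covers from forecast weather and expected traffic,
# then convert to a purchasing estimate (assume 0.5 lb of prawns per cover).
next_week = pd.DataFrame({"temp_c": [22, 17], "foot_traffic": [550, 400]})
predicted_covers = model.predict(next_week)
print("Predicted covers:", predicted_covers.round())
print("Prawns to order (lb):", round(predicted_covers.sum() * 0.5, 1))
```

A real deployment would use far more history and richer features, but the shape of the pipeline – join internal sales data with external signals, fit a model, act on the prediction – is the same.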
That’s the dream, but don’t expect every big data user to reach those kinds of applications by 2015. HP’s experts expect these insights to be applied first to specific internal domains, with IT operations near the top of the list – and, within that area, IT security.
“From our analysis of the 2014 breaches, there are similar cause-and-effect matters that can be identified,” said Dragan Rakovich, CTO of analytics and data management at HP Enterprise Services. He cited one breach where attackers got in via a vulnerability in heating, ventilation and air conditioning (HVAC) equipment. “All interconnected devices need to be thought of as threat vectors in the larger ecosystem.”
If we accept that IT infrastructure is complex – and that bad actors are more sophisticated still – then scavenging data from IT operations to detect trends and anticipate potential threats makes a lot of sense.
Log analysis in IT has traditionally been notoriously difficult, but with the emergence of business-intelligence-style log management applications such as Splunk, things are improving. Big data excels at detecting emergent behaviour and picking up early signals of growing trends. There may be an opportunity to use it to harden IT infrastructure, especially if it can draw from multiple sources, including security information and event management (SIEM) tools.
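As an illustration of the kind of emergent signal such tools surface, here is a toy Python sketch that counts failed logins per source address and flags outliers. The log lines, their format and the threshold are all fabricated for the example; they are not output from Splunk or any real SIEM.

```python
# A toy illustration of log analysis: count failed logins per source
# address and flag suspicious spikes. Log lines and threshold are invented.
import re
from collections import Counter

log_lines = [
    "Jan 05 10:01:12 srv1 sshd: Failed password for root from 10.0.0.5",
    "Jan 05 10:01:14 srv1 sshd: Failed password for root from 10.0.0.5",
    "Jan 05 10:01:15 srv1 sshd: Accepted password for alice from 10.0.0.9",
    "Jan 05 10:01:16 srv1 sshd: Failed password for admin from 10.0.0.5",
    "Jan 05 10:01:20 srv1 sshd: Failed password for root from 10.0.0.7",
]

# Count failed logins per source address.
pattern = re.compile(r"Failed password for \S+ from (\S+)")
failures = Counter(m.group(1) for line in log_lines
                   if (m := pattern.search(line)))

# Flag sources whose failure count stands out -- the kind of early signal
# a log-management tool hunts for across millions of events.
THRESHOLD = 3  # arbitrary for this toy data set
for source, count in failures.items():
    if count >= THRESHOLD:
        print(f"possible brute-force attempt: {source} ({count} failures)")
```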
“Big data analytics is used to quickly identify malware and other vulnerabilities within an information ecosystem by simultaneously analysing all relevant data to understand new patterns and behaviours and to promptly react to new threats and avoid serious business damages,” said Rakovich.
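A rough sketch of that “analyse all relevant data simultaneously” idea is to correlate feeds, so that a host flagged in more than one stands out. The intrusion-alert and network-flow records below, and the byte threshold, are invented for illustration and do not reflect any HP product.

```python
# A toy cross-source correlation: a host that both triggered an intrusion
# alert and moved an unusual volume of data outward is a stronger signal
# than either feed alone. All records here are fabricated.
ids_alerts = [
    {"host": "10.0.0.5", "signature": "known-malware-beacon"},
    {"host": "10.0.0.7", "signature": "port-scan"},
]
netflow = [
    {"host": "10.0.0.5", "dest": "203.0.113.44", "bytes_out": 9_800_000},
    {"host": "10.0.0.9", "dest": "198.51.100.2", "bytes_out": 1_200},
]

# Correlate the two feeds and flag hosts that appear suspicious in both.
alerted = {a["host"] for a in ids_alerts}
for flow in netflow:
    if flow["host"] in alerted and flow["bytes_out"] > 1_000_000:
        print(f"investigate {flow['host']}: alert plus "
              f"{flow['bytes_out']:,} bytes to {flow['dest']}")
```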
Great oaks grow from little acorns – and in the case of predictive analytics, big data insights grow from thousands of little data points. Could this powerful tool give CIOs valuable new insights into the workings of their IT infrastructure?