The storage story of 2008 was growth: An accelerating explosion of information, much of it in the form of video, led IT administrators to try to make better use of their capacity and staff.
Overall demand for storage capacity is growing by about 60 percent per year, according to IDC. Another research company, Enterprise Strategy Group, pegs the annual growth rate of data between 30 percent and 60 percent.
“Organizations are having a hard time getting their arms around all that data,” said ESG analyst Lauren Whitehouse. Economic woes are making it even harder, with frozen or scaled-back budgets, while the downturn isn’t expected to significantly slow data growth next year.
Stuck in that bind, organizations don’t want to have to roll out a gigabyte of capacity in their own data centers for every new gigabyte that’s created, analysts said.
“What we’ll see more of in companies is a focus on efficiency,” IDC analyst Rick Villars said. They’re seeking to increase the utilization of their storage capacity as well as other IT resources.
The storage industry saw meagre innovation this year, judging by the handful of startups that emerged in the space compared with the beginning of the decade, according to Gary Francis, storage technical director at Santa Clara, Calif.-based Sun Microsystems Inc.
A big part of that effort is virtualization of storage, which often goes hand in hand with server virtualization and became a mainstream technology in 2008, according to analyst John Webster of Illuminata. Storage vendors are offering more virtualization products and seeing more demand for them, he said. A virtualization capability such as thin provisioning, which lets administrators assign storage capacity to a new application without having to figure out how much it ultimately will need, helps make better use of resources, Webster said.
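To make the idea concrete, here is a minimal Python sketch of thin provisioning, with hypothetical names and sizes rather than any vendor's actual API: each volume advertises a large logical size up front, but physical blocks are drawn from the shared pool only when data is actually written.

```python
# Minimal sketch of thin provisioning: logical capacity is promised up front,
# but physical blocks are taken from the shared pool only on first write.

class ThinPool:
    def __init__(self, physical_blocks):
        self.free_physical = physical_blocks  # real capacity in the pool
        self.volumes = {}                     # name -> {"logical": size, "mapped": blocks}

    def create_volume(self, name, logical_blocks):
        # No physical capacity is reserved here; the volume is only a promise.
        self.volumes[name] = {"logical": logical_blocks, "mapped": {}}

    def write(self, name, block_no, data):
        vol = self.volumes[name]
        if block_no >= vol["logical"]:
            raise ValueError("write beyond the volume's logical size")
        if block_no not in vol["mapped"]:
            if self.free_physical == 0:
                raise RuntimeError("pool exhausted: time to add real capacity")
            self.free_physical -= 1           # allocate a physical block on demand
        vol["mapped"][block_no] = data

# A 1,000-block pool can back volumes whose logical sizes add up to far more,
# as long as actual writes stay within the real capacity.
pool = ThinPool(physical_blocks=1_000)
pool.create_volume("app1", logical_blocks=10_000)
pool.create_volume("app2", logical_blocks=10_000)
pool.write("app1", 0, b"hello")
print(pool.free_physical)  # 999
```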
But in addition to the trend toward decoupling logical from physical resources, a handful of acquisitions this year signaled other trends in the storage world.
1. Brocade-Foundry

On Dec. 19, Brocade Communications and Foundry Networks completed a deal they had announced in July before navigating the roughest waters the financial and credit markets have seen in a generation. The merger, now valued at $2.6 billion, is intended to address a coming convergence of SAN (storage area network) and LAN technology.
SAN builders have long relied on Fibre Channel, a specialized networking technology designed not to drop packets. But in most cases, the rest of the enterprise network is based on Ethernet, which is cheaper than Fibre Channel and now available at higher speeds. Maintaining both requires more adapters on storage equipment and adds to an IT department's workload. The two types of networks are headed toward gradual consolidation under the FCoE (Fibre Channel over Ethernet) standard, which is intended to make Ethernet reliable enough for storage networks. Then Ethernet can be the network of choice across data centers and keep getting faster.
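Conceptually, FCoE simply wraps an unmodified Fibre Channel frame inside an Ethernet frame carrying the FCoE EtherType (0x8906), so storage traffic can share a lossless Ethernet fabric with everything else. The Python sketch below shows only that layering idea; the real standard adds version, start-of-frame and end-of-frame fields that are omitted here.

```python
import struct

FCOE_ETHERTYPE = 0x8906  # EtherType assigned to FCoE traffic

def encapsulate(dst_mac: bytes, src_mac: bytes, fc_frame: bytes) -> bytes:
    """Wrap a raw Fibre Channel frame in a simplified Ethernet/FCoE frame.

    Real FCoE adds a version field, SOF/EOF delimiters and padding;
    this sketch only illustrates the layering.
    """
    ethernet_header = dst_mac + src_mac + struct.pack("!H", FCOE_ETHERTYPE)
    return ethernet_header + fc_frame

# The storage payload (fc_frame) is untouched; only the outer envelope changes,
# which is why one converged Ethernet fabric can carry both LAN and SAN traffic.
frame = encapsulate(b"\x01\x02\x03\x04\x05\x06", b"\xaa\xbb\xcc\xdd\xee\xff",
                    fc_frame=b"...raw FC frame bytes...")
print(len(frame))
```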
Brocade wasn’t the only company thinking this way. Cisco, which will be the main competitive target of the merged company, bought out Nuova Systems in April and simultaneously announced a line of routing switches designed to connect the whole data center. The flagship Nexus 7000, which Cisco has positioned as one of its most important products ever, is built to scale to 15T bps (bits per second) and runs a virtualized version of IOS (Internetwork Operating System) called NX-OS. Like the combination of Brocade and Foundry, the Nexus line is likely to help enterprises virtualize their storage and computing resources and eventually streamline networking and management.
EMC and NetApp also introduced FCoE products this year, but the protocol is not expected to be in widespread use until 2010.
2. IBM-Diligent
In April, IBM acquired Diligent Technologies, which specializes in data de-duplication for large enterprise storage systems. IBM didn’t reveal how much the acquisition cost, but it was a key move in a market that could grow to US$1 billion in annual revenue by 2009, according to research company The 451 Group.
De-duplication systems find identical pieces of data in a storage system, treat them as redundant, and eliminate them, leaving references to a single stored copy. So if there are several nearly identical copies of a document, only one copy is kept in full, along with the differences that are unique to each of the other copies.
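A minimal Python sketch of that idea, using fixed-size chunks and content hashes (real products typically use variable-size chunking and additional integrity checks):

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunks; production systems often chunk variably

class DedupStore:
    def __init__(self):
        self.chunks = {}  # content hash -> chunk bytes, stored once
        self.files = {}   # file name -> list of chunk hashes

    def put(self, name: str, data: bytes) -> None:
        hashes = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            digest = hashlib.sha256(chunk).hexdigest()
            # Store the chunk only if this content has never been seen before.
            self.chunks.setdefault(digest, chunk)
            hashes.append(digest)
        self.files[name] = hashes

    def get(self, name: str) -> bytes:
        # Rebuild the file from its chunk references.
        return b"".join(self.chunks[h] for h in self.files[name])

store = DedupStore()
doc = b"quarterly report " * 1000
store.put("report_v1.txt", doc)
store.put("report_v2.txt", doc + b"one new paragraph")  # nearly identical copy
assert store.get("report_v2.txt").endswith(b"one new paragraph")
# Both versions are recoverable, but the chunks they share are stored only once.
print(len(store.chunks), "unique chunks for two copies")
```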
The Diligent deal was an early move in a year full of de-duplication activity. In June, Hewlett-Packard introduced a suite of de-duplication systems for small and medium-sized businesses and added some features to its HP StorageWorks backup line. And in November, EMC, Quantum and Dell said they would use a common software architecture for data de-duplication products. Dell will enter the de-duplication business next year. It is already a major reseller of EMC gear, under a partnership that in December was extended until 2013.
Data de-duplication can reduce the amount of storage capacity an enterprise requires by as much as two-thirds, said ESG’s Whitehouse. The technology has been available for some time, but this year vendors began integrating it with storage arrays or selling it in appliances, bringing it closer to a turnkey solution, she said. They also established data de-duplication as a technology customers could trust, at least for archived material.
“If you eliminate a block of data that somehow negates the value of that data when you recover it … that’s a really scary prospect for some companies,” Whitehouse said.