It may not have the buzzword appeal of VoIP. It’s not as trendy as RFID, or as passionately discussed as security. But storage has effectively become a frontline technology – and is likely to remain so.
From blade servers and virtualized arrays to magneto-optical disks and new information lifecycle management (ILM) solutions, storage vendors continue to push the envelope in this fiercely competitive market. An abundance of options, however, has not made choice any easier.
Public sector CIOs – faced with shrinking IT budgets and pressures to justify investments – may be challenged to identify and implement the “right” storage products and processes.
But the cost of getting storage wrong can be steep. A case in point: the mysterious disappearance, last July, of two removable computer disks containing classified nuclear weapons data from the Los Alamos National Laboratory in New Mexico. The loss was discovered during an inventory for a forthcoming experiment.
Soon after the incident – the third disk disappearance at the facility since 2000 – the lab introduced new storage and security safeguards, a move some observers likened to shutting the stable door after the horse had bolted. Secure storage and disaster-management systems implemented after a major incident like the New Mexico case – an increasingly common response – often cannot reverse the damage already done. Industry insiders agree that in today’s fragile security environment, enterprises in both the public and private sectors need to put in place storage policies, procedures and systems that pre-empt rather than react to data threats.
STORAGE CONUNDRUM
Such programs, however, sometimes lead to a kind of Catch-22. It’s become much too precarious, for example, for many government agencies to store mission-critical or sensitive data at a single location. On the other hand, data repositories at many different sites can trigger other challenges – mostly relating to the low speed, steep costs and high risks associated with inter-site data transfer.
A recent federally supported storage and information management (IM) initiative may prove to be the answer to this quandary. Dubbed the Global Data Habitat (GDH), this project seeks to change the IM paradigm to enable both secure storage and multi-site data collaboration.
Technology Partnerships Canada, an Industry Canada agency, has invested $7.67 million in the initiative, which is spearheaded by Edmonton-based Yotta Yotta Inc. and carries a total cost of $30.1 million. Venture capital firms are providing the remaining funds. “Our goal is to create a new data habitat that doesn’t live in one place but adjusts to where information is being used,” said Wayne Karpoff, chief technology officer at Yotta Yotta. Karpoff said the initiative combines heterogeneous storage and supercomputing to enable high-performance data sharing, accomplished through “mirrored” copies of data at multiple locations. The GDH is expected to manage two or more data centres separated by thousands of kilometres as a single, seamless location.
GLOBAL STORAGE POOL
According to Karpoff, using multiple data storage centres to provide scalability and fault tolerance is not enough. Under the traditional “insular” model, he said, data storage centres act like islands. “Losing a data centre could mean losing access to data associated with it. Besides, collaboration between sites is slow and expensive as speed limitations impede data transfer across networks.” The antidote, he said, is a global storage pool that can be managed as a single entity. “We need a technology that links data centres together, allows site failures without loss of data access, and enables information to be shared between sites at local performance rates.”
A tall order?
“Not at all,” said Karpoff, adding that Yotta Yotta’s NetStorager manages many of those tasks. Karpoff said NetStorager’s “intelligence layer” resides between data consumers (computers, servers, clusters) and physical heterogeneous storage systems developed by vendors like EMC, IBM and Hitachi. “Coherence operations within NetStorager ensure a consistent, worldwide data image,” Karpoff said.
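Conceptually, such an intelligence layer behaves like a thin proxy: every write is propagated to all mirrored sites before it is acknowledged, and reads are served from the nearest available copy, so each location sees the same data image. The short Python sketch below illustrates that general idea only; it is not NetStorager’s design, and the class and method names (StorageSite, CoherenceLayer and so on) are invented for the example.

```python
# Conceptual sketch of a coherence layer keeping mirrored sites consistent.
# Illustrative only; all names here are invented, not drawn from NetStorager.

class StorageSite:
    """One physical storage back end (e.g. an array in Ottawa or Edmonton)."""
    def __init__(self, name):
        self.name = name
        self.blocks = {}          # block_id -> data
        self.online = True

    def write(self, block_id, data):
        self.blocks[block_id] = data

    def read(self, block_id):
        return self.blocks.get(block_id)


class CoherenceLayer:
    """Sits between data consumers and heterogeneous storage systems.

    Writes are applied to every online mirror before being acknowledged,
    so all sites present a single, consistent data image; reads are served
    from the site nearest the caller.
    """
    def __init__(self, sites):
        self.sites = list(sites)

    def write(self, block_id, data):
        # Synchronous mirroring: every reachable site sees the same update.
        for site in self.sites:
            if site.online:
                site.write(block_id, data)

    def read(self, block_id, preferred_site):
        # Serve from the local site when possible; fall back to any survivor.
        if preferred_site.online:
            return preferred_site.read(block_id)
        for site in self.sites:
            if site.online:
                return site.read(block_id)
        raise RuntimeError("no online storage site available")


if __name__ == "__main__":
    ottawa, edmonton = StorageSite("Ottawa"), StorageSite("Edmonton")
    layer = CoherenceLayer([ottawa, edmonton])
    layer.write("block-42", b"shared dataset")
    # Both sites now hold the same image; a reader in either city sees it.
    assert layer.read("block-42", edmonton) == b"shared dataset"
```

Real systems add caching, locking and wide-area optimizations, but the basic contract described in the article – one consistent image across sites – is the same.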
DATA FUSION
Yotta Yotta’s new storage model, he said, can be used effectively by government departments, as well as by private enterprises across the board – from Fortune 500 companies to mid-sized firms. However, he said, the most compelling applications are in defence, aerospace and security.
It was the security aspect that Deputy Prime Minister Anne McLellan emphasized when announcing the government’s investment in the new technology. The project, McLellan suggested, is in line with the government’s efforts to “provide an integrated response to public safety and national security emergencies and threats.” According to Karpoff, the Global Data Habitat can be a tremendous asset to Canadian intelligence or law enforcement agencies. “It’s absolutely crucial for these bodies to protect their data storage infrastructure, so if a particular installation gets taken out others can continue functioning.”
Technology used in the GDH makes this possible, he said. “If one data centre loses some or all of its systems, access automatically shifts to a surviving data centre. If an entire centre is lost, data access continues using other locations. When sites reappear, storage is automatically rebuilt with only data changes transmitted across the network. Many sites can be involved in both data replication and collaboration.”
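The recovery behaviour Karpoff describes, in which surviving sites keep serving data and only the changes are shipped back when a failed site returns, can be sketched in the same spirit. The Python below is a simplified illustration of delta resynchronization under basic assumptions (block-level mirroring, a single peer); the names and structure are invented and do not reflect the GDH’s actual implementation.

```python
# Conceptual sketch of failover and delta resynchronization between mirrors.
# A surviving site records which blocks change while a peer is down, then
# ships only those changes when the peer comes back. Illustrative only.

class MirrorSite:
    def __init__(self, name):
        self.name = name
        self.blocks = {}                  # block_id -> data
        self.online = True
        self.dirty_since_peer_loss = set()

    def apply_write(self, block_id, data, peer_online):
        self.blocks[block_id] = data
        if not peer_online:
            # Remember what changed so the peer can be caught up later.
            self.dirty_since_peer_loss.add(block_id)

    def resync_peer(self, peer):
        """Send only the blocks that changed while the peer was offline."""
        for block_id in self.dirty_since_peer_loss:
            peer.blocks[block_id] = self.blocks[block_id]
        sent = len(self.dirty_since_peer_loss)
        self.dirty_since_peer_loss.clear()
        return sent


if __name__ == "__main__":
    site_a, site_b = MirrorSite("A"), MirrorSite("B")
    site_b.online = False                      # site B is "taken out"
    site_a.apply_write("b1", b"new data", peer_online=site_b.online)
    site_a.apply_write("b2", b"more data", peer_online=site_b.online)
    site_b.online = True                       # site B reappears
    changed = site_a.resync_peer(site_b)
    print(f"resynchronized {changed} changed blocks")  # 2 blocks, not the whole store
```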
This paradigm, he said, can foster collaboration between agencies by providing an effective way for the Canada Border Services Agency, for example, to work with U.S. outfits like the CIA, FBI and National Geospatial-Intelligence Agency (NGA). “It also allows intelligence agencies to operate satellite data centres with real-time images of the same data. It’s a concept called Data Fusion.”
BREAKING BARRIERS
The ability to dramatically accelerate file retrieval over huge distances is another NetStorager feature that will be harnessed by the GDH. For instance, in a trial involving data transfer between Vancouver, Chicago and Ottawa, Yotta Yotta and its Canadian and U.S. partners reportedly broke the previous world record for bulk data transfer by a factor of 16. This capability, Karpoff said, is vital in matters relating to domestic security and infrastructure, where large amounts of data need to be moved quickly across networks. In another test, in which data centres in Ottawa and Edmonton shared a single data image, NetStorager delivered performance more than 10 times faster than a single site operating on its own. Yotta Yotta says the test – conducted by Silicon Graphics Inc. of Mountain View, Calif. – is another indication that, with the right technology, long-distance shared storage really works.