CERN, the Geneva-based particle physics research centre, has launched a collaborative effort with some of the biggest names in IT to tighten up security on its landmark Large Hadron Collider (LHC) project, and to work on platform virtualization and the interoperability of grid software.
CERN is using grid technology to give scientists around the world access to the huge quantities of data the LHC is expected to produce. The Openlab project teams CERN with large IT companies to ease that task, and to see that the technologies developed for the LHC make their way into the world of business.
In the first phase of Openlab, CERN has worked with Enterasys, HP, IBM, Intel and Oracle over the past three years to develop a computing and storage cluster. As part of the second phase, CERN is bringing in Finnish security specialists Stonesoft and F-Secure to deploy their products in the test environment CERN uses for LHC-related applications.
The idea is to work on virus protection, antispyware, intrusion detection and intrusion prevention, with a particular focus on client security and mail-server security, all in the context of a global grid environment. “The weakest link in the chain remains local security on any given site open to visiting scientists and other staff,” said Stonesoft.
Second, CERN is creating a Platform Competence Centre to look at platform virtualization and hardware and software optimization, both of which are expected to play a key role in helping grid applications cope with LHC data. “Optimization can help to cope with the expected huge demand for computing resources from the scientists involved in the LHC experiments, and prevent that demand from outstripping the available resources of the grid,” said CERN.
Finally, a Grid Interoperability Centre will focus on the integration and certification of grid middleware, starting with that established by the CERN-led Enabling Grids for E-sciencE (EGEE) project. HP, Intel and Oracle are working with CERN on the two centres.
The LHC will be the largest scientific instrument on the planet when it begins operation next year. Its particle-collision tests are expected to generate data at around 1GB per second while the machine is running, amounting to some 15 petabytes per year over an operating life of 10 years.
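Those two figures fit together only if the machine records data for part of each year; the following back-of-the-envelope check (a sketch in Python, with the duty cycle inferred here rather than quoted by CERN) makes the relationship explicit.

```python
# Back-of-the-envelope check on the LHC data-rate figures quoted above.
# The 1 GB/s rate and 15 PB/year total come from the article; the implied
# duty cycle is an inference, not an official CERN figure.

RATE_GB_PER_S = 1.0                 # quoted sustained rate while running
TOTAL_PB_PER_YEAR = 15.0            # quoted yearly data volume
SECONDS_PER_YEAR = 365 * 24 * 3600  # ~31.5 million seconds

# Volume if the detectors streamed 1 GB/s non-stop for a full year
# (1 PB = 1e6 GB in decimal units):
continuous_pb = RATE_GB_PER_S * SECONDS_PER_YEAR / 1e6
print(f"Continuous operation would yield ~{continuous_pb:.1f} PB/year")

# The quoted 15 PB/year therefore implies the machine records data
# for only part of each year:
implied_seconds = TOTAL_PB_PER_YEAR * 1e6 / RATE_GB_PER_S
print(f"15 PB/year implies ~{implied_seconds / 1e6:.0f} million seconds "
      f"(~{implied_seconds / SECONDS_PER_YEAR:.0%}) of running time per year")
```

Running the numbers shows that continuous 1GB/s operation would produce roughly 31.5 petabytes a year, so the 15-petabyte figure corresponds to the accelerator taking data for a little under half of each year.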
The infrastructure for storage and analysis of the data will be the LHC computing grid, designed to give thousands of users access at a time. A grid, broadly speaking, is an infrastructure that allows resources such as storage and processing power to be coordinated and shared dynamically across different departments or institutions.
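To make that definition concrete, here is a toy sketch of the coordination idea. The site names, job names and capacity figures are invented for illustration; this is not the LHC grid middleware, just the matchmaking pattern it embodies.

```python
# Toy illustration of a grid: independent sites contribute resources, and a
# coordinator places each job on whichever site currently has spare capacity.

from dataclasses import dataclass

@dataclass
class Site:
    name: str
    free_cpus: int
    free_storage_tb: float

@dataclass
class Job:
    name: str
    cpus: int
    storage_tb: float

def schedule(job: Job, sites: list[Site]) -> str:
    """Dynamically place a job on the first site with enough spare capacity."""
    for site in sites:
        if site.free_cpus >= job.cpus and site.free_storage_tb >= job.storage_tb:
            site.free_cpus -= job.cpus
            site.free_storage_tb -= job.storage_tb
            return site.name
    return "queued"  # no site has capacity right now

# Hypothetical sites and jobs; the second job spills over to another
# institution once the first site's CPUs are mostly taken.
sites = [Site("CERN", 64, 500.0), Site("FNAL", 32, 200.0)]
for job in [Job("analysis-1", 48, 100.0), Job("analysis-2", 24, 50.0)]:
    print(job.name, "->", schedule(job, sites))
```

The real LHC computing grid performs this kind of matchmaking at far larger scale, layering authentication, data replication and accounting on top of the basic resource-sharing idea.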