Keeping cool
There’s a power-saving paradox when it comes to blades. “You can overconsolidate in a given facilities infrastructure,” says Eunice. “That’s been one of the problems with a lot of the so-called green technologies.”
Part of it is the power draw versus floor space issue outlined above; a corollary, says Eunice, is that a chassis full of blades generates more heat than the same volume of standalone servers. Is there enough air flow to cool it? Call it the Toyota Corollary: take your 132-horsepower Corolla S and swap in a 450-horsepower racing engine. Is your gas tank big enough?
Aaron Hay, research consultant with Info-Tech Research in London, Ont., says that in many ways, we in the Great White North have a data centre cooling advantage thanks to an overlooked natural resource: cold weather.
But Step 1, he says, is making sure you’re not overdoing it with the A/C to begin with.
“In most data centres, the temperature is set too low,” Hay says. It’s a throwback to 10 or 15 years ago, when data centre equipment was more sensitive to heat. “There was a lifecycle impact,” he says.
Running a little warm might have an impact toward the end of your new server’s lifespan, but “data centre equipment doesn’t hang around that long” – it’s usually phased out long before the end of its useful life.
According to the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE), data centres have a much wider latitude for temperature and humidity than they once did – from about 14 to 28 degrees Celsius, and 20 to 80 per cent humidity, Hay says.
“Most data centres can bump their temperature up 10 degrees (Fahrenheit) without affecting their equipment,” Hay says.
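For illustration only, here’s a minimal sketch of how a monitoring script might check readings against the envelope Hay cites; the set point, humidity reading and 10-degree-Fahrenheit bump below are hypothetical examples, not figures from Info-Tech.

```python
# Rough check of a reading against the temperature and humidity envelope
# Hay cites (about 14-28 C, 20-80% relative humidity). The set point and
# sensor values below are made up for illustration.

TEMP_RANGE_C = (14.0, 28.0)        # inlet temperature envelope, Celsius
HUMIDITY_RANGE_PCT = (20.0, 80.0)  # relative humidity envelope, per cent

def within_envelope(temp_c: float, humidity_pct: float) -> bool:
    """Return True if a reading sits inside the cited envelope."""
    temp_ok = TEMP_RANGE_C[0] <= temp_c <= TEMP_RANGE_C[1]
    humidity_ok = HUMIDITY_RANGE_PCT[0] <= humidity_pct <= HUMIDITY_RANGE_PCT[1]
    return temp_ok and humidity_ok

# Example: a set point raised by 10 degrees Fahrenheit (about 5.6 C)
old_setpoint_c = 18.0                       # hypothetical "too cold" set point
new_setpoint_c = old_setpoint_c + 10 / 1.8  # 10 F expressed in Celsius
print(f"New set point: {new_setpoint_c:.1f} C,",
      "inside envelope" if within_envelope(new_setpoint_c, 45.0) else "outside envelope")
```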
Once you’ve raised the set point, it’s time to take advantage of the wintry conditions. An airside economizer will draw cold air from the outside, filter it and complement or replace air conditioning. It can be an expensive job, depending on your ventilation system, and in some office towers, not possible at all. But the payoff can be quick, since chillers eat up as much as a third of the data centre’s power bill.
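As a back-of-envelope illustration of that payoff, the sketch below plugs the “as much as a third of the power bill” figure into a hypothetical annual bill, free-cooling fraction and retrofit cost; every number except the one-third share is an assumption made for the example.

```python
# Back-of-envelope payback estimate for an airside economizer. The only
# figure taken from the article is that chillers can eat up to a third of
# the data centre's power bill; the bill, free-cooling fraction and
# installation cost below are hypothetical.

annual_power_bill = 300_000.00   # $/year, hypothetical
chiller_share = 1 / 3            # "as much as a third" of the bill
free_cooling_fraction = 0.5      # hypothetical share of hours outside air carries the load

chiller_cost = annual_power_bill * chiller_share
annual_savings = chiller_cost * free_cooling_fraction

economizer_install_cost = 80_000.00  # hypothetical retrofit cost
payback_years = economizer_install_cost / annual_savings

print(f"Chiller portion of bill: ${chiller_cost:,.0f}/year")
print(f"Estimated savings:       ${annual_savings:,.0f}/year")
print(f"Simple payback:          {payback_years:.1f} years")
```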
And there are other small things within the data centre that make incremental savings add up: make sure filters and vents are clean; make sure all ceiling tiles are in place and unbroken; and use rubber grommets where cables enter the server cabinet, sealing the openings so cooled air doesn’t leak out.
“You could potentially save 10 to 15 per cent of your power bill,” Hay says. Like installing energy-efficient light bulbs or caulking the windows of your house, they’re incremental savings, but they add up. And your data centre draws the power of 20 or more houses.
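To put that 10 to 15 per cent in rough dollar terms, here’s a quick calculation built on the article’s 20-house comparison; the household consumption and electricity rate are assumptions for illustration, not figures from Hay.

```python
# Rough dollar value of a 10-15% saving for a data centre drawing the power
# of 20 houses. Household consumption and electricity rate are assumed;
# only the 20-house comparison and the 10-15% range come from the article.

houses_equivalent = 20
kwh_per_house_per_year = 10_000   # hypothetical average household use
rate_per_kwh = 0.12               # hypothetical $/kWh

annual_kwh = houses_equivalent * kwh_per_house_per_year
annual_bill = annual_kwh * rate_per_kwh

for saving in (0.10, 0.15):
    print(f"{saving:.0%} saving: about ${annual_bill * saving:,.0f} a year")
```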