If you sit down with pretty much any enterprise-level executive nowadays, the topic of conversation invariably turns green. This happened throughout my recent trip to Silicon Valley, as Sun Microsystems hosted journalists from around the world to highlight the company’s new strategic initiatives and overall direction.
One common thread throughout the company’s systems, servers, software and storage business was the need to be ecologically responsible. Sun stressed that it was “green before green was cool,” a statement I’ve heard from more than a few companies. But in this instance I believed it, simply because Sun has focused much of its green IT effort on the heart of the company: the data centre. And as is the case with many large companies, Sun’s data centres account for a significant share of its total power, cooling and energy consumption.
“Thirty years ago, data centres were basically a mainframe with a few card punchers and printers attached,” said Subodh Bapat, vice-president of Sun’s eco-computing team. “But as we got into the 1980s, data centres got bigger, and all of a sudden we had 500,000-square-foot client-server back-ends. So it wasn’t just the space that expanded, but the power consumption as well.”
While the price of pumping bandwidth has gone down in recent years, the price of pumping kilowatts of energy has gone way up. And with the average data centre now extremely complex, as new technologies, acquisitions and security concerns change like the seasons, enterprises have been forced to completely rethink the facility.
Last summer, in its own efforts to reduce costs as well as its carbon footprint, Sun unveiled a trio of its “next-generation” data centres. With these newly designed data centres — opened in Blackwater, U.K., Bangalore, India, and Santa Clara, Calif. — Sun hopes its best practices in design and hardware consolidation will serve as a blueprint and eventually work its way into the redesign initiatives at other enterprises.
With a group that included journalists from South America, Asia and Europe, I headed over to the Santa Clara campus to see what advice the company could give to IT managers looking to run their infrastructure as efficiently as Sun does.
Sun’s Santa Clara campus lives up to its name, as it very much resembles what you’d see at any university across Canada. Upon entering a brick building that looked like all the other research facilities I’d seen on the campus, I was told that I had arrived at the data centre.
My tour guide was Dean Nelson, director of global lab and data centre design services, who leads a small team at Sun tasked with consolidating the company’s data centres and labs around the world.
Tight squeeze
At the Santa Clara location, Sun compressed 152 data centres occupying 202,000 square feet at its Newark and Sunnyvale, Calif., locations down to 14 next-generation data centres in 76,000 square feet of space.
The new data centre meant Sun avoided US$9 million in future construction requirements and cut its overall real estate costs by 88 per cent. The company also said the consolidation project will save it US$1.1 million in power costs per year, reduce its power load by 60 per cent, and compress its server and storage footprint by 88 per cent.
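For readers who want to check the math, here is a rough, back-of-the-envelope calculation in Python using only the figures cited above. Note that the 88 per cent numbers Sun quotes refer to real estate costs and rack footprint; the cut in raw floor space works out smaller.

```python
# A rough sanity check on Sun's consolidation figures, using only
# the numbers cited in this article. Sun's 88 per cent figures refer
# to real estate costs and rack footprint, not raw floor space.

before_centres, after_centres = 152, 14
before_sqft, after_sqft = 202_000, 76_000

centre_cut = 1 - after_centres / before_centres   # ~0.91
space_cut = 1 - after_sqft / before_sqft          # ~0.62

print(f"Data centres reduced by {centre_cut:.0%}")  # 91%
print(f"Floor space reduced by {space_cut:.0%}")    # 62%
```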
According to Nelson, one of the most important concepts enterprises need to get their heads around when considering a data centre restructure is designing for the future.
“It’s all about modularity and flexibility,” Nelson said. “You need to be able to deal with acquisitions or reorganizations as they come. Modularity basically means you can snap things in as you need them.” For Sun, that meant “future proofing” the data centre with a modular approach to organizing its server, cooling and power infrastructure, a strategy that is immediately evident upon entering one of the rooms.
Pod palace
For starters, all the server racks, cooling fans and cables are consolidated into pods. I entered a room that held a multitude of server pods, all of which could be easily accessed and walked through. It was basically structured as a bunch of little server rooms within a big server room. Nelson estimated that in a 3,000-square-foot room, a company would have space for about six pods, each of which could house at least 20 racks.
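Nelson’s estimate is easy to turn into a quick density calculation. The sketch below uses the figures from this article, plus the 40-systems-per-rack count from the cabling discussion further on; treat it as illustrative arithmetic, not Sun’s planning tool.

```python
# A minimal sketch of Nelson's pod-density estimate. The room size,
# pod count and rack count come from the article; the 40 systems per
# rack is borrowed from the cabling discussion later in the piece.

room_sqft = 3_000
pods = 6
racks_per_pod = 20        # "at least 20", so treat this as a floor
systems_per_rack = 40

racks = pods * racks_per_pod         # 120 racks
systems = racks * systems_per_rack   # 4,800 systems

print(f"{racks} racks, {systems:,} systems "
      f"({systems / room_sqft:.1f} systems per square foot)")
```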
As I was taken through one of the pods, down what was referred to as the “hot aisle,” I could feel the cooling system working. The pods are designed with in-row units that automatically detect the temperature inside and speed up or slow down the fans as necessary. Data centre managers can also opt for overhead cooling units if space constraints call for it.
“Really, the philosophy comes down to closely coupled cooling,” Nelson said. “This allows you to run servers at a higher level if need be, but also cut overall usage costs.”
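To make the closely coupled idea concrete, here is a toy proportional controller in the spirit of what Nelson described: fans idle when the pod is cool and ramp up as the temperature climbs. The setpoint, gain and idle floor are invented for illustration; this is not Sun’s actual control logic.

```python
# A toy illustration of closed-loop, closely coupled cooling: in-row
# units sense pod temperature and scale fan speed up or down. The
# setpoint, gain and 20-per-cent idle floor are assumed values.

SETPOINT_C = 24.0   # assumed target hot-aisle temperature
GAIN = 0.08         # assumed fan-speed fraction added per degree C of error

def fan_speed(measured_c: float) -> float:
    """Return a fan speed in [0.2, 1.0], ramping with temperature error."""
    error = max(measured_c - SETPOINT_C, 0.0)
    return min(0.2 + GAIN * error, 1.0)

for temp in (22.0, 26.0, 30.0, 36.0):
    print(f"{temp:4.1f} C -> fans at {fan_speed(temp):.0%}")
```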
On the cabling side of things, having 300 cables per 40-system rack means thousands of cables are housed in every pod. Nelson said that instead of going up to a patch panel or back to the main distribution frame, all the cables can be collapsed into an intermediate distribution frame and consolidated down to just a few cables coming out of each pod.
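The arithmetic behind that consolidation is stark. A rough sketch, with the uplink count assumed since the article says only that “a few” cables come out of each pod:

```python
# Rough cable math for the pod approach, using the article's figures.
# The number of consolidated uplinks is a hypothetical value.

cables_per_rack = 300
racks_per_pod = 20
uplinks_per_pod = 8   # assumed, for illustration

without_idf = cables_per_rack * racks_per_pod  # every run goes to the MDF
with_idf = uplinks_per_pod                     # runs terminate at the in-pod IDF

print(f"Without an in-pod IDF: {without_idf:,} cable runs leave the pod")
print(f"With an in-pod IDF:    {with_idf} uplink cables leave the pod")
```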
Retro style
And to further emphasize the “plug and play” style of the pods, the power comes from an overhead system equipped with modules that allow enterprises to snap in or snap out power capabilities when needed.
“You need the retrofitting options within these spaces, so you can not only take advantage of what you have, but also build out when the time calls for it,” Nelson said. “Even with the housing market going down in the U.S., data centres are going up, as it’s all about content on the Web. I’d say even data centres that are 10 years old can limit a company from being flexible when business changes.”
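The snap-in power approach can be pictured as a simple data structure: an overhead busway whose capacity is just the sum of whatever modules are currently plugged in. The sketch below is a loose illustration, with the class name and 40 kW module size invented for the example.

```python
# A minimal sketch of the snap-in power idea: overhead busway capacity
# modelled as modules that can be added or removed as pods come and go.
# The class name and module sizes are illustrative assumptions.

class OverheadBusway:
    """Tracks snap-in power modules feeding the pods below."""

    def __init__(self) -> None:
        self.modules_kw: list[float] = []

    def snap_in(self, kw: float) -> None:
        self.modules_kw.append(kw)

    def snap_out(self) -> float:
        return self.modules_kw.pop()

    @property
    def capacity_kw(self) -> float:
        return sum(self.modules_kw)

bus = OverheadBusway()
bus.snap_in(40.0)   # provision power for a new pod
bus.snap_in(40.0)   # and another as the business grows
print(f"{bus.capacity_kw:.0f} kW available overhead")  # 80 kW
```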
Nelson views IT as a weapon, but it’s one that can’t be used if enterprises have to wait weeks to provision more servers. The ability to drop in another pod, Nelson said, makes all the difference in the world to improving cost efficiency and productivity.
But the greatest benefit, according to many at Sun, is the fact that companies can get a head start on complying with future environmental initiatives and reduce their impact on the planet.
“It’s true that people are still doing things for business reasons as opposed to for the environment,” Peter Ryan, senior vice-president of the Americas sales region at Sun, said. “But, with environmental legislation changing in many parts of the world, companies are going to have to look into this if they don’t want to make a handbrake turn to green their IT later.”
And with government-based programs such as Energy Star certain to take their regulations beyond servers or consumer products and into entire data centres, showing more interest in greening your IT infrastructure could be beneficial both today and in the future. And as Sun continually said, being green doesn’t mean having to compromise anymore.