With VMware’s success, virtualization has taken on a life of its own. Beyond the server, vendors are now touting products for virtualizing any and every layer of the infrastructure: network, storage, desktop, application, database, user interface – even security and mobility.
This technology explosion makes sense.
Although enterprises gain incremental benefits from applying virtualization in one area, they gain much more by using it across every tier of the IT infrastructure.
Virtualization and you
Before you decide how much virtualization is right for your enterprise, ask these questions.
Ask yourself
Does my security infrastructure rely on IP addresses, and how will that affect my virtualization plans?
What processes do I use to troubleshoot root causes, and how can I modify them to support virtualization?
Ask your vendors
What ties to physical devices does the product require, and how do you ensure those ties will not interfere with the product’s flexibility?
How well does it integrate with other types of virtualization (say, server, storage, network and desktop)?
What management tools support the product and how do such tools manage underlying physical devices?
Is it possible to manage the product via a single console that provides an overview of the whole environment?
Does your virtualization technique follow accepted standards, or will it lock me into you as a sole vendor? (Users can run both VMware and Microsoft Virtual Server within one environment, as each supports importing the other’s virtual machines. But that’s about as far as interoperability currently extends.)
“It’s very difficult to apply virtualization to one part of your infrastructure unless you apply it to many or most parts of your infrastructure,” says Andreas Antonopoulos, senior vice-president of Nemertes Research, a firm that specializes in analyzing and quantifying the business value of emerging technologies. “If you decouple some of your resources from the physical, yet they interact with other resources that are coupled [with] the physical, it lessens the benefits.”
Antonopoulos offers the example of implementing server virtualization without network or storage virtualization. “Some of the biggest benefits you get from server virtualization, like the ability to boot a given server in a different data center for disaster-recovery purposes, you can only do if your storage is virtualized and you have a [storage-area network] replicated between the two sites. Once you have those pieces in place, the benefits from server virtualization become huge.”
The problem is that not all virtualization technologies are equally mature. Whereas server virtualization seems to have hit its stride, other areas are not as far along, especially in the management and security realms. And getting the various virtualized pieces to work together cohesively can be a big challenge.
Virtual frauds
Watch for application vendors that say their applications are “virtualization-ready.” Application vendors have been known to overplay the virtualization card, says Paul Winkeler, founder of PBnJ Solutions, an IT consulting firm. “Application vendors realize customers are thinking about virtualization, so they will happily say their app runs fine in virtualized environments.”
But as Winkeler points out, that’s the whole idea behind virtualization: the application can’t tell whether or not it’s virtualized. “So they’re not saying anything.”
Companies should also guard against application vendors who say their isolation tools are virtualized. “Some application vendors use the term virtualization when they are really just isolating,” says Andy Gerringer, senior network administrator at Alamance Regional Medical Center, in Burlington, N.C. (see Alamance’s award-winning virtualization project).
“To isolate an application means that files are still installed and simply redirected or shielded from the operating system. That’s not virtualization,” he says.
Neal Tisdale, vice president of software development at New Energy Associates in Atlanta, agrees. “Sun says Solaris Containers is virtualization, and it’s not full virtualization – it’s more isolation,” he says. “At the application level it is because your application thinks it has its own machine, but full virtualization allows you to change even network settings and [basic input/output system] settings and operating system settings and have entire copies of the operating system running.”
Eating the layer cake
Baptist Healthcare System in Louisville, Ky., has struggled with this challenge firsthand. It uses VMware’s ESX Server to consolidate as many as five Citrix servers onto one hardware box. It then lays Softricity’s SoftGrid on top of the Citrix servers to isolate each application and deliver it to users on the fly.
“So now we have multiple points of virtualization. We have SoftGrid on top of Citrix, running on top of ESX,” says Tom Taylor, corporate manager for client/server infrastructure at the hospital group. “That’s all running on [virtual LANs] and connected through a VPN and running on a SAN.”
For the most part, the architecture works well and runs smoothly, Taylor says. But when performance issues crop up, pinpointing the problem through all those layers of virtualization is difficult.
“It’s been a struggle eating that layer cake, if you will,” he says. “The drawback to virtualization is added complexity. If all these different layers are virtualized, and there’s a problem, who owns it? Ultimately, it falls on the poor guy putting it into the enterprise, and in my environment, that’s me. It’s my responsibility to work with the vendors to find root causes, and when you’re dealing with all these different layers, it’s complex and it’s frustrating.”