Pitching virtualization: Benefits go far beyond cost cutting
Oct. 10
Madison, Wis. – One would be hard pressed to think of organizations that haven’t embarked on some sort of consolidation effort to cut costs, but until recently virtualization wasn’t part of the consolidation package.
Yet by automating network infrastructure through virtualization, a technique that pools physical resources into a more efficient virtual environment, information technology providers are helping business organizations improve network manageability in a consolidated data-center environment.
The word “decoupling” turns up in definitions of virtualization because the technology lets virtual servers and their operating systems run side by side on shared hardware, decoupling the operating system and the machine from the physical hardware beneath them. Virtualization can involve taking a single physical component – a server, operating system, application, or storage device – and making it function as multiple resources, or taking multiple pieces and making them appear as a single entity.
In effect, a virtual architecture is created, moving the computing workload that sits on physical servers to a smaller number of virtual servers, which allows IT departments to leverage their storage, network, and other computing resources to control costs, simplify deployment, and respond faster to business needs.
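The two directions of pooling described above, one resource presented as many and many resources presented as one, can be sketched in a few lines of Python. The class, host, and capacity names here are invented purely for illustration, not drawn from any vendor's product:

```python
# One-to-many: a single physical host carved into several virtual servers
class PhysicalHost:
    def __init__(self, name, cpu_cores):
        self.name = name
        self.cpu_cores = cpu_cores
        self.vms = []  # (vm_name, cores) pairs

    def provision_vm(self, vm_name, cores):
        # A new VM fits only if the host still has unallocated cores
        allocated = sum(c for _, c in self.vms)
        assert allocated + cores <= self.cpu_cores, "host is full"
        self.vms.append((vm_name, cores))

# Many-to-one: several physical disks pooled into one logical volume
def pooled_capacity(disk_sizes_gb):
    return sum(disk_sizes_gb)

host = PhysicalHost("esx01", cpu_cores=16)
host.provision_vm("web", 4)
host.provision_vm("db", 8)
print(pooled_capacity([500, 500, 250]))  # → 1250
```

The sketch only models the bookkeeping; a real hypervisor also schedules CPU time, memory, and I/O across the guests it hosts.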
Cost redux
Virtualization also has been touted as a remedy for excess (i.e. unused) server capacity.
Scott Severson, director of systems and storage for CDW Berbee, said virtualization lowers the cost of server infrastructure because in a non-virtual environment, organizations have a lot of excess (wasted) processor capacity on multiple servers. When these applications are moved to either a single server or a few servers in a virtual environment, not only are many servers eliminated, but the remaining server(s) use their processing capacity to the fullest.
“If you optimize the virtualized server environment, you eliminate a lot of this excess capacity,” Severson noted. “Eliminate excess capacity (servers), and you eliminate cost.”
According to an estimate provided by Paragon Development Systems, by allowing multiple virtual servers to reside on a single physical server, a company can take its physical server utilization rates from the traditional five to eight percent range to anywhere from 60 to 80 percent.
This permits organizations to improve upon a one-to-one ratio of applications to servers and maximize their investments in the server space. Applications can then be allocated on the basis of capacity and performance rather than the sheer number of physical servers available.
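The consolidation arithmetic behind those utilization figures is easy to sketch. The percentages below come from the estimates quoted above; the server counts are illustrative, not from the article:

```python
import math

def hosts_needed(physical_servers, avg_utilization, target_utilization):
    """Estimate how many virtualized hosts can absorb the same total workload."""
    # Total CPU demand, expressed in "fully busy server" equivalents
    workload = physical_servers * avg_utilization
    return math.ceil(workload / target_utilization)

# 100 servers idling at 6% utilization, consolidated onto hosts run at 70%
print(hosts_needed(100, 0.06, 0.70))  # → 9
```

Under those assumptions, roughly nine well-loaded hosts do the work of a hundred idle ones, which is the kind of ratio behind the "five to eight percent up to 60 to 80 percent" claim.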
Austin Park, vice president of infrastructure services for PDS, cautioned that the business case for virtualization should not be based on cost-cutting alone. He said the cost case should be made over time, especially with some upfront investment required, and he said decision-makers should know there are a range of benefits that can be derived from a successful implementation.
Operations, licensing, facility, manpower, and connection costs all factor into this part of the business case. A major chunk of the savings comes from the reduced cost of operating servers. Because of the power draw of modern microprocessors, an individual server consumes about 700 to 800 watts, which translates into monthly per-unit electricity and cooling costs of approximately $3,000. (Servers cost about $9,000 to purchase.)
Additional savings come from off-site storage. Storage area networks have relied on Fibre Channel connections that can run anywhere from $1,500 to $2,500 per connection, and the cost of such “intelligent connections” is beginning to rival system acquisition costs. Reducing the number of servers through virtual consolidation reduces those connection costs.
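Folding the article's cost estimates into a rough annual-savings model (the dollar figures are the estimates quoted above; the number of retired servers is illustrative):

```python
def annual_savings(servers_retired,
                   monthly_power_cooling=3_000,  # per-server electricity + cooling (article estimate)
                   san_connection_cost=2_000):   # per-connection SAN cost (midpoint of $1,500–$2,500)
    recurring = servers_retired * monthly_power_cooling * 12  # yearly power and cooling avoided
    one_time = servers_retired * san_connection_cost          # storage connections no longer needed
    return recurring + one_time

# Retiring 10 physical servers under these assumptions
print(annual_savings(10))  # → 380000
```

Even this back-of-the-envelope model shows why the operating side of the ledger, not the $9,000 purchase price, dominates the savings.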
While slashing these costs is important, it’s not the whole business case. If the approach to virtualization is strictly cost cutting, organizations will see it as a failure, Park explained. The cost reduction comes at the tail end, when there is some type of scale.
“You will almost certainly get a cost reduction,” Park said, “but if that’s all you’re looking at, you’re doing virtualization a disservice.”
Performance
Given the explosive growth in systems, storage, and connections, performance gains can be realized in each area, especially with storage virtualization.
Many think of virtualization as strictly a server technology, but servers are only a small part of the architecture. As with server virtualization, deploying a storage virtualization strategy should enable an organization to use its storage resources more efficiently, and it carries a significant side benefit: with storage virtualization, a typical organization can deploy lower-cost storage subsystems without sacrificing the performance of higher-cost, higher-performing ones, Severson observed.
While performance may not be the key driver behind consolidation, Park noted that new systems with future processors could be more than 16 times faster than existing ones. Virtualization enables organizations to take advantage of this by separating applications from the underlying systems, easing migration to a new tier of services, and by sizing for peak workloads rather than architecting a system around its slowest component.
When the fastest point becomes the benchmark, “virtualization allows us to match resources to that so we can get a consistent performance,” Park said.
For back-up purposes, the new system can be copied to a remote location, permitting businesses to replicate a consolidated data center at a second site. Park, in fact, believes virtualization, which has also evolved into a disaster-recovery toolset, is the way to architect the future data center, with reduced operational expenses for power and cooling and reduced capital expenses because fewer servers are needed.
Yet there are impacts on software licensing for individual servers and applications that must be factored into the business case. The software licensing issue is a big challenge, enough of a speed bump to suppress the adoption of virtualization. It’s not necessarily in the interest of vendors like Oracle to support virtualization, which requires them to change their licensing models.
Eventually, Park believes competitive pressure will convince vendors to be more accommodating with their licensing policies.
Responsiveness
The true promise of virtualization, Park said, is greater flexibility and technological agility. Virtualization enables IT departments to be more responsive to the business and ultimately to customers. For example, without a virtualization strategy, Severson said, it can take an organization days or even weeks to provision a server – to justify it, then order, receive, install, configure, and test it. In contrast, a virtual server can be provisioned in a matter of minutes, a time savings that frees the staff for innovation and other activities that drive IT-business alignment.
“Some IT organizations are now to the point where they have enabled their users to self-provision virtual servers,” Severson said. “This lets the IT staff spend more time on more strategic initiatives for the business.”
There also are staff benefits because virtualization builds high availability (much less downtime) into the core infrastructure, and even automates recovery when hardware issues arise. This enables the IT staff to get out of the 24/7 service mindset. “They don’t have to come in at midnight to make changes,” Severson remarked. “They can do that at a more convenient time. It lends itself to a better work experience for your staff.”
Promega: A case study
Jim O’Donnell, director of IS enterprise services for Promega Corp., has introduced virtualization to the company’s server environment. In addition to saving on capital costs and energy consumption, the company pursued virtual servers because of the amount of heat output being generated by physical servers in its data center. As technology made smaller servers possible, the company packed more of them into the data center’s finite amount of space, so heat was the primary business driver.
Even though the data center has dual air conditioning systems, simply standing in the room was enough to make a person perspire, conditions that could hardly be healthy for servers and switching equipment.
The company has chosen to take an attrition approach to replacing physical servers. As a server upgrade or replacement approaches, the company determines whether it can be “virtualized,” and thus far 19 percent of the company’s production servers have been. O’Donnell did not have to make a business case to upper management because the minimal amount of investment that needed to be made was immediately offset by the physical servers he no longer has to buy.
Excessive heat was not the only justification; O’Donnell said Promega is green conscious, so transitioning to a data center with less electricity draw was another lure.
The virtualization vendor, VMware, helps Promega conduct an audit of its server environment so that it can evaluate what can and can’t be “virtualized.” Based on that, the company expects to reach 50 percent virtualization in its server environment, but it wants to be selective and avoid “force fitting” things into the virtual arena.
It has not experienced licensing problems with vendors, who seem to be well aware of what virtualization brings to the marketplace and are rearranging their product lines and pricing accordingly, O’Donnell said.
While Promega’s virtualization aims were not necessarily strategic, the benefits have gone above and beyond what the company expected. O’Donnell said the company has made a dent in its heat issue and is working with Madison Gas & Electric to try to get an assessment of the true environmental cost impact.
The company not only enjoys faster speeds that give applications an extra punch, it has been able to maximize the hardware that it has acquired.
“Some of the hidden benefits that we didn’t expect are that because we’re building a higher-capacity server that has much more horsepower to serve up computing power to applications,” he said. “We’re an internationally based company, so for example our ERP system, while being used in the United States during our work time, when we go home and go to sleep that horsepower can be given to our customers in China and around the globe wherever our branch locations are.”
O’Donnell said the only potential downside is “putting your eggs in one basket” with more of a centralized, single point of failure, but there are ways around that by “mirroring what you’re doing” elsewhere.
A virtual reality check
Virtualization is something that must be architected; it’s not just a matter of downloading VMware. But whether or not an organization actively chooses to deploy it, virtualization is gradually finding its way into various components of network infrastructure.
The Gartner Group says even organizations that are not ready to take this step must be prepared for the reality that virtualization will become a default feature of much of the infrastructure on the market, including PCs and servers. At some point, technology managers will have to figure out what to do with it.
Call it “creeping virtualization,” but the cost savings that result from greater adoption will add some flexibility. “I believe we are still on the forefront of virtualization benefits,” Severson said. “Most organizations have deployed it for testing, development, or production. What’s next is leveraging newer virtualization technology to enable high-availability for business-critical applications.”