iXBT Labs - Computer Hardware in Detail

Key Problems of Virtualization Deployment

The history of virtualization technologies spans more than forty years. IBM was the first to think about creating virtual environments for various user tasks, on its mainframes. In the 1960s virtualization was of purely scientific interest: an original way to isolate several computer systems within a single physical machine. After the appearance of personal computers, interest in virtualization waned thanks to the rapid development of operating systems whose hardware requirements matched the hardware of the day. However, the rapid growth of hardware capacity in the late nineties made the IT community recall virtualization technologies. VMware (not yet a part of EMC at that time) was one of the first companies to stake everything on virtualization, and time showed that the bet was justified. IT specialists are now very interested in running several virtual systems on a single physical machine, not only because it makes the IT infrastructure more flexible, but also because virtualization actually saves money. So why hadn't OS virtualization been popular before? Let's recap the main problems that stood in the way of this technology:

  • Low hardware performance
  • No simple and powerful tools to manage virtualization
  • No OS support
  • Low reliability of virtualization software
  • Reluctance of large companies to deal with "untried" technology
  • Low activity of companies that provide the new technology

The situation is absolutely different now: both server and client systems can support several operating systems simultaneously, virtualization providers offer reliable and easily manageable platforms to large companies, and the market of these technologies is going through a real boom. Virtualization is currently estimated to be one of the three most promising computer technologies. Many experts predict that half of computer systems will be virtual by 2015.

At the same time, virtualization is popular beyond the IT industry itself. Clients of large virtualization platforms include hospitals, universities, architectural firms, and even the US Navy. That's in Western countries. In Russia, virtualization technologies have so far failed to interest either big business or the SMB (small and medium business) sector. According to analysts, Russia is not yet psychologically ready to accept the new technology, but this situation will change in the near future. It's very difficult to persuade CIOs (Chief Information Officers) that virtualization is efficient, mostly because there are no good tools for evaluating an IT infrastructure with respect to virtualization, tools that could visualize the effect of the new technology.

To better understand how virtualization can make a company's life easier, let's list the main advantages of virtualization technologies.

Why We Need Virtualization Now

Virtual machines offer the widest scope for upgrading an IT infrastructure to a new, more flexible and technologically advanced level, built from independent, isolated units that work together without being tied to physical equipment. Here are the key prerequisites for virtualizing computer systems:

  • Low average load of production servers (below 40%)
  • No mandatory specific hardware
  • A company has to manage a large park of servers
  • A company has to maintain uninterrupted server operation with minimal downtime
  • High expenses on hardware and energy in data centers
  • A company has to maintain a sufficient number of systems for software testing

If your IT infrastructure has the above-mentioned prerequisites, you need virtualization in order to:

  • Consolidate several virtual systems on a single physical one, reducing hardware and energy expenses and making systems more flexible to migrate to other equipment
  • Manage computer systems efficiently and ensure their high availability by using backup solutions and instant restores after failures
  • Make software testing easier by creating a library of ready-to-use virtual machines, which can be deployed in a matter of minutes and started on different computers in a virtual test lab
  • Create isolated, hardware-independent environments for presentations and training

Key Problems of Virtualization Projects

Despite all the advantages of virtualization, companies of different sizes face problems at different stages of adopting it. These problems can be conventionally divided into three stages:

  • Analysis and planning
  • Adaptation and post-adaptation period
  • Maintenance of the virtual infrastructure

Analysis and Planning

This group includes problems that a company or independent consultants face as they analyze the existing IT infrastructure and plan migration to a virtual one. In particular, we can mention the following immediate tasks:

  1. Compatibility and support

    This problem includes analyzing the hardware and software components of an IT infrastructure and evaluating the feasibility of migrating them to a virtual environment. Software developers are not ready to guarantee fail-safe operation of all their programs in virtual machines, so solving this problem requires a thorough hardware and software inventory, and you should ask vendors how well their products will work in a virtual infrastructure. Many developers of virtualization platforms publish reports on compatibility and performance of popular programs in virtual machines. For example, in June 2007 VMware published a report on Microsoft Exchange performance on the ESX Server platform, which concludes that an Exchange server can be virtualized with room for a further increase in the number of clients and only small expenses on virtualization platform support.

    Virtualization support problems can be divided into three types:

    • Technical limitations (support for specific hardware, hardware requirements of a virtualization platform)
    • Marketing strategies (for example, a decision to support only a certain range of software)
    • Political strategies (decisions on whether to address virtualization in the context of supported software, software vs. hardware virtualization, or platforms from a certain vendor)

  2. Licensing

    Planning a virtualization project, you must thoroughly examine the licenses of OS developers, as well as of independent software vendors (ISVs), with respect to virtualization. Some operating systems or programs may not be licensed to run in virtual machines, for example Microsoft's Windows Vista Home Basic and Home Premium.

    As it's very easy to move virtual machines to other physical platforms, OS manufacturers introduce limitations on using their products in virtual machines (especially OEM versions); such scenarios are often described in separate license chapters. There may also be problems with software licensed per processor, as a virtual machine can use SMP (Symmetric Multiprocessing) to present a different number of processors than the host system has. Besides, you should take into account some pleasant aspects of OS licensing under virtualization: for example, Windows Server 2003 Datacenter Edition allows an unlimited number of virtual instances at no extra cost.

  3. Planning Deployment

    This group of tasks includes planning the deployment of consolidated virtual servers, migrating physical servers, and estimating the virtualization ratio (the number of virtual machines per physical server), which is the most important task at the planning stage. Some manufacturers of virtualization platforms offer their potential clients tools to evaluate the virtualization ratio (for example, VMware Capacity Planner); otherwise clients have to use third-party software (as a rule, rather expensive), for example PowerRecon from PlateSpin. Here is a screenshot of VMware Capacity Planner:

    Planning virtual servers in VMware Capacity Planner

    The sector of software for collecting information on server load is not overcrowded yet, so it's currently hard to find a proper tool.
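    At their core, such capacity-planning tools rest on simple resource arithmetic. A rough sketch of the idea follows; all figures are hypothetical examples and the function is ours, not any vendor's API:

```python
# Rough virtualization-ratio estimate: how many typical workloads
# fit on one physical host. All numbers below are hypothetical
# illustrations, not vendor data.

def virtualization_ratio(host_cpu_ghz, host_ram_gb,
                         vm_cpu_ghz, vm_ram_gb,
                         headroom=0.3):
    """Return the number of VMs per host, keeping `headroom`
    (e.g. 30%) of capacity free for load spikes."""
    usable_cpu = host_cpu_ghz * (1 - headroom)
    usable_ram = host_ram_gb * (1 - headroom)
    # The scarcer resource limits the ratio.
    return int(min(usable_cpu / vm_cpu_ghz, usable_ram / vm_ram_gb))

# A host with 20 GHz of total CPU and 32 GB RAM, hosting workloads
# that average 1.2 GHz CPU and 2 GB RAM each:
print(virtualization_ratio(20.0, 32.0, 1.2, 2.0))  # -> 11
```

    Real tools refine this with measured load histories and peak-overlap analysis, but the limiting-resource logic stays the same.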

    As for migrating physical systems to virtual ones, there are many tools from virtualization vendors and third parties that automate migration of physical servers. Many virtualization systems, for example XenServer, also include built-in P2V (Physical to Virtual) migration tools.

    Specialized products dedicated to P2V migration also exist.

  4. Staff training

    This problem is currently one of the most pressing, as virtualization experts who can deploy and maintain a virtual infrastructure are in short supply. "Heavy" virtualization platforms (VMware ESX, XenEnterprise, Virtual Iron) require serious training of the staff who will maintain them; such training is rather expensive and not always available. The lack of specialists of the required level is one of the three main reasons why companies refuse to adopt virtualization.

  5. Evaluating ROI (Return on Investment)

    In practice, the inability to evaluate the return on virtualization investments is the main reason why almost half of companies cannot call their virtualization projects successful. There is a lack of tools to measure the quantitative and qualitative indicators of a virtual infrastructure, and until such tools exist, it's hard to speak convincingly of virtualization efficiency despite all its advantages. Nevertheless, you should be well aware of what you need virtualization for and what economic effect you expect from it.
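    Even without specialized tools, a back-of-the-envelope estimate helps frame the discussion. A minimal sketch; every figure here is a hypothetical placeholder, not data from any real project:

```python
# Back-of-the-envelope virtualization ROI estimate.
# All numbers used below are hypothetical placeholders; a real
# project must substitute measured costs.

def payback_months(servers_before, servers_after,
                   monthly_cost_per_server,   # power, cooling, space, admin
                   project_cost):             # licenses, hardware, training
    """Months until consolidation savings cover the project cost."""
    monthly_savings = (servers_before - servers_after) * monthly_cost_per_server
    if monthly_savings <= 0:
        raise ValueError("consolidation yields no savings")
    return project_cost / monthly_savings

# Consolidating 40 physical servers onto 8 hosts at $300/month
# per retired server, with a $120,000 project budget:
print(round(payback_months(40, 8, 300.0, 120_000.0), 1))  # -> 12.5
```

    A real ROI model would also weigh qualitative factors (downtime reduction, faster provisioning), which is exactly what current tools struggle to quantify.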

Adaptation and Post-Adaptation Period

At the stage of deploying virtualization platforms, companies usually experience problems integrating the virtual infrastructure with the existing heterogeneous elements of the real IT infrastructure, as well as the necessity to deploy specialized servers to manage virtual systems, monitor load, allocate servers for backup jobs, and introduce high-availability solutions. Besides, at this stage large companies experience problems using virtualization platforms with SANs: not all platforms support a wide range of equipment and the popular iSCSI protocol, and you cannot introduce virtualization into data storage networks without trained specialists. The following problems may appear as a company deploys virtual systems and starts using them:

  1. Reliability

    As several virtual servers run on a single physical server, you must plan and implement disaster recovery strategies, because failures of server hardware components are not rare. A platform with integrated tools to ensure the reliability of a virtual infrastructure is the better choice, and it's a bonus if the platform can also integrate with recovery solutions from other software developers. Virtual machines can be backed up on three levels:

    • guest system level (installing special agents in a guest system, for example, Symantec products)
    • host system level (copying images of virtual machines or files in a guest system, for example, using esxRanger from Vizioncore)
    • SAN storage level (where virtual infrastructure units correspond to certain segments)

    High availability (HA) of virtual machines can be provided by the platforms themselves (for example, VMware HA), which combine several elements: an optimized file system and automatic restart of virtual machines on another server if one of the cluster servers fails. It is anticipated that third-party developers will start working on high-availability tools for virtual machines in the near future; several companies are already active in this field.
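    Of the three backup levels listed above, the host-system level is conceptually the simplest: copy the image files of a suspended (or snapshotted) virtual machine to a safe location. A minimal sketch with hypothetical paths; real platforms have their own image formats and snapshot mechanisms that production backups should rely on:

```python
# Host-level VM backup sketch: archive a virtual machine's files
# into a timestamped directory. Paths and file layout are
# hypothetical; real products (esxRanger, platform snapshots)
# handle open images and incremental copies, which a raw copy does not.
import shutil
import time
from pathlib import Path

def backup_vm(vm_dir, backup_root):
    """Copy all files of a (suspended) VM to a timestamped folder."""
    vm_dir = Path(vm_dir)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    target = Path(backup_root) / f"{vm_dir.name}-{stamp}"
    shutil.copytree(vm_dir, target)  # creates the target directory tree
    return target

# Usage (hypothetical paths):
# backup_vm("/vmfs/volumes/vms/mail-server", "/backups")
```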

  2. Deployment and Preparation in an Industrial Environment

    The simplicity of deploying virtual machines often tempts users to host them on various servers without any control, which leads to problems with performance and load distribution. It's hard to evaluate objectively how virtual machines affect a virtualization server and how many of them can be hosted in its resource pool. Software developers are trying to create comprehensive systems to deploy and control virtual machines; Dunes, with its Virtual Desktop Orchestrator, is one of the companies active in this field.

    Here is a screenshot of Virtual Desktop Orchestrator:

    Dunes VDO console

  3. Evaluating efficiency

    After virtual machines are deployed, you should evaluate their efficiency in various respects. It's much more difficult to detect bottlenecks in virtual infrastructures than in physical ones: performance bottlenecks depend not only on the standard factors, but also on virtualization-specific issues (efficiency of hardware emulation, performance problems of the host platform, virtualization settings, etc). Locating such bottlenecks is the task of special programs. Developers of performance evaluation software will soon adapt their products to virtualization platforms, and some virtualization vendors offer such tools already. VMware, for instance, has recently released VMmark, a comprehensive benchmark for virtual machines on ESX Server that produces a report with a score for each tested aspect.

    These scores can be used by hardware manufacturers to recommend their components for virtualization systems, and by specialized software to plan a virtual infrastructure. Hardware manufacturers such as Intel are actively cooperating with virtualization companies in this respect.
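    One common way to fold per-workload scores into a single figure is a geometric mean over scores normalized to a reference system, so that no single workload dominates the result. The sketch below only illustrates that general approach; it does not reproduce VMmark's actual methodology, and the scores are hypothetical:

```python
# Aggregating per-workload benchmark scores into one figure via a
# geometric mean. This illustrates the general approach only; it is
# not VMmark's actual formula, and the scores are made up.
from math import prod

def aggregate_score(normalized_scores):
    """Geometric mean of workload scores normalized to a reference system."""
    if not normalized_scores or any(s <= 0 for s in normalized_scores):
        raise ValueError("scores must be positive")
    return prod(normalized_scores) ** (1 / len(normalized_scores))

# Hypothetical scores for mail, web, database and file-server workloads:
print(round(aggregate_score([1.2, 0.9, 1.5, 1.0]), 3))  # roughly 1.128
```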

Further Maintenance of a Virtual Infrastructure

As a virtual infrastructure is deployed and put into operation, system administrators and analysts face the tasks of scaling, maintenance, centralized control, security, and access rights management.

  1. Scalability

    As a virtual infrastructure is deployed and corporate demand for virtual machines grows, you face the problem of scalability. Its main component is the fact that hardware performance does not grow evenly. For example, multi-CPU systems are developing rapidly and make it possible to increase the number of virtual systems per physical one, but some other hardware components do not grow as fast; network throughput is a typical example. No doubt we'll soon be able to run several dozen virtual machines on a single physical server, but what about their network traffic?
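    The network concern is easy to quantify with toy numbers (all hypothetical): divide one shared link among the virtual machines on a host.

```python
# Toy illustration of the network bottleneck: as the number of VMs
# per host grows, each VM's share of a single NIC shrinks.
# All figures are hypothetical.

def per_vm_bandwidth_mbps(link_mbps, vm_count, overhead=0.1):
    """Average bandwidth per VM on one shared link,
    reserving `overhead` (e.g. 10%) for host/management traffic."""
    return link_mbps * (1 - overhead) / vm_count

# On a single gigabit link, each VM's average share drops quickly:
for vms in (5, 10, 20, 40):
    print(vms, per_vm_bandwidth_mbps(1000, vms), "Mbps")
```

    At forty VMs per gigabit link, each machine averages under 25 Mbps, which is why NIC counts and link speeds must scale along with CPU cores.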

    Besides, large-scale virtual infrastructures require homogeneous software that is managed at the guest system level, which may pose problems. So far only VMware is working on moving control over software deployment and maintenance to the host platform.

  2. Security

    The problem of security of a virtual infrastructure can be divided into two components:

    • security of a virtual machine
    • security of a virtualization platform

    In the first case, just as on a physical platform, security software must be installed in the guest operating system (antivirus, firewall, etc), and virtual machines and networks must be properly configured. The virtual world may also require a different notion of security, since requirements for securing virtual systems are not yet well defined. For example, virtual switches in some VMware products (Workstation, Server) act as hubs, which opens up possibilities for other virtual machines to intercept unprotected traffic and deceive system administrators.

    As for virtualization platform security, when you plan a virtual machine deployment strategy, you must analyze reports on platform vulnerabilities and take this factor into account as you choose a platform. Timely updates of the virtualization system and tracking of critical vulnerabilities must be part of virtual infrastructure maintenance. You should also take into account the possibility of internal unauthorized access to the host system. For now, attacks on virtual systems are extremely rare because virtualization platforms are not yet widespread, so the true security level of these platforms is unknown. But analysts predict that every second operating system will be virtual by 2015, so the security issue will only become more serious.

  3. Responsibility

    Introducing a virtual infrastructure creates new roles, whose rights should be delegated to existing specialists (system administrators, network administrators, tech support specialists, security specialists). This requires a crystal-clear model of how each employee uses virtual systems; otherwise, you may face conflicts between employees over new responsibilities that no one wants to take on.

  4. Evaluation of the Virtualization Market

    The virtualization market is growing rapidly, and virtualization platform vendors compete for support of their products from hardware manufacturers and software developers. VMware is currently the absolute leader in the corporate virtualization market; Microsoft, XenSource, SWsoft, and Virtual Iron put up a weak fight so far. But the situation may change: Microsoft will launch its own virtualization platform integrated into Windows Server in 2008, which may reshape the market, considering the software giant's partner network. So you should keep tabs on virtualization support from various software developers and stake on a platform with its market prospects in mind.


Virtualization technologies radically change the standard approach to deploying an IT infrastructure. Despite all the evident advantages of virtualization, its deployment raises the serious problems discussed above. These problems can certainly be solved with a competent approach, a key element of which is thorough planning of all virtualization deployment stages. Special software may be required at each stage, and it is not always free of charge. Many large virtualization projects have failed because it's currently hard to estimate their efficiency in terms of numbers, and tools to maintain virtualization platforms at all stages are lacking.

When you plan your virtualization project, you should pay special attention to disaster recovery strategies, OS licensing, and the peculiarities of integration with the existing infrastructure. You should also keep in mind that virtualization technologies facilitate computer management on the one hand, and significantly complicate the infrastructure on the other. This calls for highly skilled specialists, who are currently few (though their numbers will no doubt grow rapidly with the popularity of the technology). Virtualization is already used by oil, financial, telecommunications, and other companies as an indispensable element of their IT infrastructure. But the full effect of virtualization can be achieved only if you understand your requirements, take virtualization's own requirements into account, and thoroughly plan your project.

Alexander Samoilenko (admin@vmgu.ru, www.vmgu.ru)
September 21, 2007
