Virtualization and 'the new availability'.

Author: Abbott, John
Position: DATABASE AND NETWORK INTELLIGENCE: WHITE PAPER

Long-established techniques for application resilience, data availability and disaster recovery are being challenged by server virtualization technologies from companies such as VMware, Citrix and Microsoft. Where do we go from here?

High availability, business continuity, fault tolerance, disaster recovery: all of these labels reflect different approaches to keeping data and applications available to computer system users when something goes wrong. Established techniques have been built up over the years to help achieve this, ranging from fully redundant hardware (using standby components to eliminate single points of failure) through to software techniques, such as replication, that maintain identical copies of data in multiple locations.

Vendors offering high-availability features or products have typically approached the problem from three different directions. One camp focuses on the infrastructure layer, making sure that servers, operating systems and databases keep running. Another starts with the application and tries to ensure users aren't affected even if there is a failure somewhere in the layers beneath the surface. And a third camp relies heavily on tools that run in conjunction with shared storage resources in the form of storage area networks.

Of course, there is a huge difference in terms of technology requirements between providing on-site and remote "availability". And there are various types of "failure", ranging from data corruption and user errors, through application, hard drive and server failures, right through to a site-wide disaster. All require different technical approaches and different configurations of supporting infrastructure.

Virtually available

However, things move rapidly in the computer industry, and over the last few years increased use of server and storage virtualization has added a new element to the mix. Virtualization layers from companies such as VMware, Citrix and Microsoft enable "virtual machines" to be configured in software, each of which looks to an application exactly like a physical computer. Pre-virtualization, users would often devote a single server to running a single application in order to isolate workloads that would otherwise conflict with each other. By encapsulating those applications within VMs, they could run multiple applications on a single server for the first time. VMware has claimed that its customers run an average of 10-12 virtual servers on a single physical server, though...
