Desktop computing is vital to the efficiency and management of a business and to increasing its productivity. It not only boosts the utilization of resources but also improves the effectiveness and productivity of desktop users. With a growing number of employees using computers, each with diverse computing needs and issues, managing their IT requirements, such as updating hardware and software, controlling access to programs and data, addressing security issues, and troubleshooting individual computers, can be complicated. Given these considerations, companies are now finding ways to move away from the conventional desktop infrastructure. Virtualization is one perceived solution to these computing problems, and one that also changes the design of the traditional desktop infrastructure.
Virtualization is a technique that means to “create a virtual version of a device or resource, such as a server, storage device, network or even an operating system” (Webopedia), which can then be shared by several computing devices and users. It may be as simple as sharing a server, a computer’s storage device, or an operating system. Thus, instead of having several computers, each with its own operating system and the same applications installed, a virtualized computing system may serve as an abstract computing resource accessed by several operating systems on different computers. These computers then “share” all the applications on the machine as if the applications were actually installed on the virtual units (IBM 3). In short, virtualization is a replication of software or hardware “upon which other software runs” (Scarfone, Souppaya, and Hoffman ES-1). Users’ files and data are all stored in a central location, and while some desktop settings may be customizable, customization affects neither the virtual storage nor how other users access the system. Thus, when one or more of the virtual computers crash, troubleshooting is limited to those units only, and the other virtual machines are left unaffected (IBM 3).
Among the many benefits of moving to virtualized environments are ease in identifying a misbehaving program, isolation of applications, less hardware dependency on a particular vendor, and a “reduced data center footprint” (Marshall). For example, computers that perform redundant functions or are underutilized may be repurposed as environments that isolate “computing platforms from potentially unstable applications” (Williams). In addition, encapsulating some functions of an application within a virtual machine helps control how data is treated during a particular process; software testing therefore produces more accurate results than in a traditional desktop setup. This is particularly true for legacy applications with compatibility issues with newer software. Legacy applications often fail to run on modern operating systems, just as they fail to work with new programs. There is also the possibility that the original vendors of legacy systems are no longer in business, making updates to the old applications impossible. In such cases, dedicating a separate workstation to perform the old tasks is a good option. Aside from yielding accurate data processing results, treating legacy programs separately from the rest keeps them from complicating planning and management decisions pertaining to the new programs (Williams).
Isolation of applications in virtual machines from other virtual units is a crucial feature of virtualization, considering that issues and bugs confronting one virtual machine will neither have a direct impact on other virtual machines nor affect their performance (Rosenblum). To users, it appears that they are still accessing data separately, while in reality, all the necessary data and processing reside in one central unit.
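The fault isolation described above can be illustrated with a small toy model. This is a sketch only; the `Hypervisor` and `VirtualMachine` classes and their methods are invented for this example and do not correspond to any real virtualization API.

```python
# Toy model of hypervisor-style fault isolation (illustrative only;
# all class and method names here are invented for this sketch).

class VirtualMachine:
    """A guest whose faults are contained within itself."""
    def __init__(self, name):
        self.name = name
        self.crashed = False

    def run(self, task):
        try:
            return task()
        except Exception:
            # A fault is confined to this VM alone; the host
            # and the other guests are unaffected.
            self.crashed = True
            return None

class Hypervisor:
    """One physical host multiplexed among several guest VMs."""
    def __init__(self):
        self.vms = {}

    def create_vm(self, name):
        vm = VirtualMachine(name)
        self.vms[name] = vm
        return vm

    def healthy_vms(self):
        return [v.name for v in self.vms.values() if not v.crashed]

host = Hypervisor()
stable = host.create_vm("stable-app")
legacy = host.create_vm("legacy-app")

legacy.run(lambda: 1 / 0)   # the legacy guest faults...
stable.run(lambda: 2 + 2)   # ...but other guests keep running

print(host.healthy_vms())   # ['stable-app']
```

The point of the sketch is only the containment boundary: the crash in one guest changes that guest's state, while the host's view of the remaining guests is intact, which is the behavior Rosenblum describes.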
The concept of virtualization began in the 1960s with IBM at the forefront. A new programmer, Jim Rymarczyk, joined IBM at a time when the company was developing its model of virtualization. According to Rymarczyk, IBM used CP-67 software during its initial efforts at “virtualizing mainframe operating systems” (Brodkin). As Rymarczyk states, “Back in the mid-60s, everyone was using key punches and submitting batch jobs. It was very inefficient and machines were quite expensive” (qtd. in Brodkin). Thus was born virtualization, which allowed customers to increase their “hardware utilization by running many applications at once” (Brodkin).
The original operating system was CP-40, developed for the System/360 mainframe by IBM’s Robert Creasy and Les Comeau in 1964. In 1968, CP-67 replaced it, around the time Rymarczyk joined IBM (Brodkin). Each mainframe user was given a “conversational monitor system (CMS), essentially a single-user operating system [that] supported the time-sharing capabilities. CP-67 enabled memory sharing across VMs while giving each user his own virtual memory space” (Brodkin).
While virtualization was an accepted technology for mainframe computers, the concept slowly began to wane with the advent of desktop computers in the 1980s (Hand). To apply the same virtualization concepts to x86 platforms, VMware stepped in and developed its own virtualization model derived from the IBM technology. By 1999, “VMware introduced virtualization to x86 systems” (Hand), while noting how mainframe and desktop VMs differ. Its model turned x86 systems into fully isolated virtual machines sharing a common hardware infrastructure. By 2008, more companies had begun to embrace virtualizing their “not-business-critical” applications (Hand), and soon vendors such as Microsoft and Citrix began exploring and developing their own virtualization solutions, the Hyper-V hypervisor and XenServer, respectively (Hand). Other vendors that develop virtualization software include Oracle, Red Hat, Virtual Computer, Wanova, and Scense.
Depending on the needs of a company, various kinds of virtualization tools are available to address desktop, server, or storage virtualization requirements, among others. There are products that aid clients in business operations and leverage “physical and virtual infrastructures” (TechXtend), support software testing and development, and supervise server and workstation processes, including computer “configuration, inventory, software deployment, [and] patch management” (TechXtend). Other factors to consider when choosing virtualization software include the current number of workstations the company has, the technical expertise of the IT group, and the level of technical support available when the need arises.
Performance management and “meeting service level agreements” (Tom) can be complex in traditional desktop infrastructures. Without proper visibility into how the network system works, including how one module behaves with another, performance issues and potential risks cannot be properly identified. With virtual environments, however, managing these types of data centers becomes possible. Netuitive, for instance, aids in monitoring the performance of interrelated “virtual and physical systems [especially in] dynamic and constantly changing metrics” (Tom). It helps capture reliable and precise reports on performance and quality.
Veeam is another virtualization product, an “easy to deploy, frame-work independent solution for VMware ESX(i) host performance monitoring” (Tom). It has a support group that clients can approach for issues and troubleshooting, and it can produce reports for trend analysis and capacity planning. It works well with other VMware and Microsoft products, and it offers free, downloadable software alongside its paid editions (Tom).
VMware View is another product that “delivers desktop services [that] enable end user freedom and IT management and control” (VMware View 1). By isolating the operating system, data, and applications, and moving them to a central location that can be controlled easily, data becomes more secure and organized, and data and user management become simpler as well. Beyond this, the use of VMware software proves to be a cost-effective measure for managing data (VMware View 1). Because data is centrally located in a secure data center, authorized users can also easily connect from a variety of devices, such as desktops, iPads, and other mobile devices (VMware View 2).
IBM started the concept of virtualization, but with the modernization of computers and the efforts of VMware to bring the technology to modern desktop systems, virtualization is here to stay. As more companies join in developing virtualization software, competition will intensify, but the market is unlikely to saturate, considering that more companies now move toward virtualized environments to improve the stability of processes, increase security, simplify data and user management, and decrease operational costs. With changes in computer architecture, virtualization will surely help raise the capability and flexibility of users’ desktop infrastructure. It will also help cut the costs of supporting a large number of users with differing equipment and data requirements.
Brodkin, Jon. “With long history of virtualization behind it, IBM looks to the future.” Network World. 30 April 2009. Web. 17 January 2013. <http://www.networkworld.com/news/2009/043009-ibm-virtualization.html>.
Hand, Joseph. “Virtualization History.” VMblog.com. n.d. Web. 17 January 2013. <http://vmblog.com/archive/2012/02/02/virtualization-history-has-an-impact-on-windows-server-backup.aspx>.
IBM. “Virtualization in Education.” 2007. Web. 17 January 2013.
Marshall, David. “Top 10 benefits of server virtualization.” InfoWorld. 2011. Web. 18 January 2013. <http://www.infoworld.com/d/virtualization/top-10-benefits-server-virtualization-177828>.
Rosenblum, Mendel. “The Reincarnation of Virtual Machines.” ACM Queue. 1 July 2004. Web. 17 January 2013.
Scarfone, Karen, Murugiah Souppaya, and Paul Hoffman. Guide to Security for Full Virtualization Technologies. 2011. PDF file. 16 January 2013.
TechXtend. “Virtualization World View.” n.d. Web. 17 January 2013. <http://www.techxtend.com/ppi_us/content.aspx?name=virtualization-world-view-products>.
Tom. “A List of Virtualization Management Software.” Real User Monitoring Blog. Correlsense. 28 November 2011. Web. 17 January 2013. <http://www.real-user-monitoring.com/a-list-of-virtualization-management-software/>.
VMware View. VMware.com. n.d. PDF file. 18 January 2013. <http://www.vmware.com/files/pdf/view/VMware-View-Datasheet.pdf>.
Webopedia. “Virtualization.” IT Business Edge. n.d. Web. 17 January 2013. <http://www.webopedia.com/TERM/V/virtualization.html>.
Williams, Margi. “What is Virtualization.” WiseGeek. Conjecture Corporation. n.d. Web. 16 January 2013.