TG Daily Special: Virtualization Explored - Part 2

Posted by Rick C. Hodgin

We continue with part 2 of our 3-part series on virtualization. In this installment, we look at applications, benefits and challenges. We will discuss server farms, ISPs and businesses that have made virtualization a key component of their strategy. And we will look at end-user applications and the reasons why an individual might choose virtualization for the desktop.

 



We are heading into the second installment of our virtualization special. After learning about the basics of virtualization, we are now taking a closer look at the benefits you can expect from this technology. On Monday, we will explain how to install the VMware virtualization environment and look at where this technology is heading, along with the author's opinion on what is still needed.


Virtual machines in business

Businesses are typically the first adopters of new technologies. They have far greater resources and much more riding on a technology's success or failure. They look at what's available and, despite the often enormous up-front costs, they see the long-term savings and move early.

When you look at the core of an enterprise architecture and the way it has to evolve over time to keep pace with changes required by competition, the path almost always leads through a server operating system and an enterprise application. As soon as you consider re-engineering the software that keeps an enterprise alive, it is clear that you are talking about a very costly – and risky – process. Compatibility can break when things change, and the time required to develop and test new versions is often prohibitive. When corporate bigwigs sit down and crunch numbers, the solutions they ask for demand the least risk, the fastest roll-out, the biggest savings and the largest returns.

Virtualization hits all of these points.  It gives businesses what they need:  more.


Server farms

Server farm administrators are turning to virtualization as an absolutely indispensable component of their success. Whereas previous configurations might have had specialized machines running particular applications, the new model allows very densely utilized machines to be created from simple OS installs.

These machines are set up with a small host OS. From there, multiple canned, virtualized guest machines are rolled out as needed. Server farm managers can redirect their utilization as workloads change over time, even throughout the course of a day. This adds a type of real-time management that simply isn't possible without virtualization. By wielding in software what was previously hardware-only, the sky becomes the limit.

Businesses spend a great deal of time searching for ways to squeeze more from their machines – their hardware investments need to be utilized. And when analysts determine their server farms are running at less than full utilization, it's time to virtualize. Such machines are often stripped down to the barest essentials for the host operating system. The previous single-job workload and original OS install are then migrated to a virtual install, which runs on the machine alongside other virtual OSes in parallel. This maximizes machine use with only a very small overhead incurred (for the initial conversion). Depending on how big the original utilization gap was, the savings can be great.
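
To put rough numbers on that gap, here is a minimal back-of-the-envelope sketch in Python. The server names and utilization figures are invented for illustration, and real capacity planning would of course account for memory, I/O and peak loads as well:

    import math

    # Average CPU utilization of each existing single-job server,
    # expressed as a fraction of one host (figures are made up).
    servers = {
        "mail":     0.15,
        "web":      0.20,
        "database": 0.35,
        "file":     0.10,
        "intranet": 0.05,
    }

    HEADROOM = 0.80  # keep 20% of each host free for load spikes

    total_load = sum(servers.values())            # 0.85 "hosts" of work
    hosts_needed = math.ceil(total_load / HEADROOM)

    print(f"{len(servers)} physical servers before virtualization")
    print(f"{hosts_needed} physical hosts after consolidating into VMs")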


Variable targets

So, why does this need arise at all? Don't corporations analyze equipment needs before spending millions of dollars on hardware? And any server farm manager worth his salt would be able to determine in advance, with a high degree of accuracy, how much utilization his machines will see, right?

Not really. The reality in server farms today is that machines are purchased for the maximum workload expected. Most servers follow variable workload patterns that track daily activity. Companies can see huge cost savings by shifting workloads across machines as necessary. The ability to turn virtual machines and physical machines on and off as needed reduces overall cost. But it also exposes an interesting new ability: unused time on excess machines can be leased to outside entities for additional virtual machine processing. What was previously a fixed cost can now become an additional source of revenue.

ISPs

Many ISPs today are also taking advantage of this scaling. By offering what outwardly appears to be a full, dedicated machine, they attract more customers. To the customer, these look like stand-alone servers, but they're actually virtual machines running alongside many others on the same physical hardware, letting the ISP maximize the utilization of its equipment.

There might be 50 websites hosted on a single machine, for example. But if each one sees only a small amount of traffic or utilization, that single machine can handle them all.  It might be running at near capacity servicing all 50 websites, but it is still providing enough power for each virtual machine based on their use. The ISP sees that single machine as a source of revenue for 50 clients – a very attractive value proposition.  But still, that's not even the really cool part.

The really cool thing is this: As the workload of a given virtual machine goes up and the physical machine becomes taxed, that virtual machine can be moved to another server with more processing power available. And it can happen in near real-time. Only a few seconds of downtime are needed while the physical switch is made inside the ISP. This momentary blip is often not even visible to the outside world.
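
A simplified sketch of that suspend-copy-resume sequence is shown below, using VMware's vmrun command-line tool. The paths and host name are placeholders, the exact vmrun invocation varies by VMware product, and a production ISP would typically use shared storage rather than copying files across the wire:

    import subprocess

    VMX = "/vms/webserver/webserver.vmx"        # placeholder VM path
    TARGET = "admin@host2.example.com"          # placeholder target host

    # 1. Suspend the guest, writing its full state to disk.
    subprocess.run(["vmrun", "suspend", VMX], check=True)

    # 2. Copy the VM's directory to the target host (the momentary "blip").
    subprocess.run(["rsync", "-a", "/vms/webserver/",
                    f"{TARGET}:/vms/webserver/"], check=True)

    # 3. Resume the guest on the new hardware, right where it left off.
    subprocess.run(["ssh", TARGET, "vmrun", "start", VMX], check=True)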




Cost savings

It is abilities like these that provide the cost-saving advantages of virtualization. A minimal investment in additional hard drive capacity and memory buys a great deal of computing ability. This is especially true with today's multi-core servers. The machines do more with less equipment, and without any specialized operating system development or custom applications. Commercial off-the-shelf (COTS) applications and OSes can be employed at any time and any place with little fuss. Virtualization drives it all.

Virtualization in business has so many advantages over individual hardware machines that we have no doubt it will be fully embraced and eventually trickle down into the mainstream. Cost savings are just one factor. The ease of roll-out, the minimal downtime, and the ability to move things around immediately when hardware fails are all huge concerns for the business world. The truth is that once businesses buy into these solutions, consumers will get them nearly for free on desktops and notebooks.


Software lead

If we consider the direction many software technologies are headed today, virtualizing the hardware is really the next logical step. Extremely popular managed runtimes like Java and .NET, where the compiled output is no longer written for the physical machine, serve as beacons. They indicate where software developers are headed: they no longer care which machine runs the code. Today's machines are fast enough for most workloads without running directly on the physical hardware. Virtualization is just an extension of that mindset. We will discuss opportunities like these in greater detail in part three of this series.

Security and VDI

Security is of great concern these days, and virtual desktops served remotely have the potential to increase it greatly. An initiative begun last year by VMware, called Virtual Desktop Infrastructure (VDI), has been put into practice by several large server companies, including HP.

These configurations allow the user's operating system to run on a secure piece of equipment at a remote location (such as a server room). The user wields their local machine like a remote desktop client, but with richer utilization of local hardware resources. Complete control over the user's desktop is no longer a matter of server-based group policy alone. While that control is still in effect, the real control comes from the fact that the user's machine physically exists someplace away from their desk, in soft form. Because of this isolation, it can be manipulated in many ways: launched, shut down, restarted, isolated, dumped, or whatever else is required.

New abilities also exist in this model. Things like service packs, software upgrades, new installs and full inventories can all be run without ever powering up the machine on the user's desk. An automated remote client simply "taps in" to the virtual machine, performs the upgrade and shuts it down. When the user comes back the next day, everything is ready to go. Not once did the user's desktop machine have to turn on for the upgrade. Everything was handled automatically, behind the scenes, in a remote server room.

The truth is, automated tools like this can cycle through thousands of virtual machines each night – upgrading, installing, backing up or whatever else is required – all without ever having to touch any desktop machine at all. This means tremendous power savings on the one side, but the real key here is security and control.
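
As a rough illustration, such a nightly maintenance pass might look like the Python sketch below. It assumes VMware's vmrun tool is available; the inventory file, guest credentials and update command are all placeholders:

    import subprocess

    def maintain(vmx_path):
        """Power on a VM headless, apply updates in the guest, shut it down."""
        def run(*args):
            subprocess.run(["vmrun", *args], check=True)

        run("start", vmx_path, "nogui")               # power on, no console
        run("-gu", "admin", "-gp", "secret",          # placeholder credentials
            "runProgramInGuest", vmx_path,
            "/usr/bin/apt-get", "-y", "upgrade")      # placeholder update step
        run("stop", vmx_path, "soft")                 # clean guest shutdown

    # vm_inventory.txt: one .vmx path per line (hypothetical inventory file)
    with open("vm_inventory.txt") as inventory:
        for line in inventory:
            maintain(line.strip())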

These machines can be locked down very tightly. Access is granted only via the correct protocol, which can be proprietary in nature, though Remote Desktop Protocol is commonly used. Further security is possible by restricting the IP address(es) that can access the remote machine. There are time-sharing features as well: a particular machine might only be accessible from 8 am to 5 pm.
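
Those two checks – an address whitelist and a time window – boil down to something like this toy Python sketch (the addresses and hours are examples, not from any particular product):

    from datetime import datetime

    ALLOWED_IPS = {"10.0.5.12", "10.0.5.13"}   # desks permitted to connect
    OPEN_HOUR, CLOSE_HOUR = 8, 17              # 8 am to 5 pm

    def may_connect(client_ip, now=None):
        """Allow a connection only from a known address during business hours."""
        now = now or datetime.now()
        return client_ip in ALLOWED_IPS and OPEN_HOUR <= now.hour < CLOSE_HOUR

    print(may_connect("10.0.5.12"))    # True during business hours
    print(may_connect("192.168.1.9"))  # always False: not on the whitelist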

VDI gives results. It offers a great deal of isolation and control. Administrators have abilities that wouldn't otherwise exist. And the typical users won't ever see any changes in how they use their machine. To them, it's completely transparent. It saves money while providing usable performance.




Home user benefits

We've seen many reasons why businesses use virtualization, most of them about saving money. But what about home users? What advantages can they expect? Quite a few, as it turns out. Read on to find out more.

Pain-free migration

If we begin with the basics, we can ask this: how many times have we upgraded our physical hardware only to have to re-install the OS and everything else? Or upgraded from one OS version to another? Usually it's the same thing, a complete re-install. With a virtual machine instead of a real machine, the entire system can be migrated without changing a single thing.

A simple host OS is installed on the new hardware, and all virtual machines are copied over as necessary. With the small Linux install and the copy operation, you can be up and running exactly as you were before, with all of your old settings and files intact, in a matter of minutes.

Easier OS upgrades

Installing a new OS version is just as easy. You simply create a new virtual machine, then power it on. It's ready to receive the new OS install immediately. And if you want to try upgrading your old OS to a new version, that's simple as well: just make a backup of your virtual machine before you start. This way, if the upgrade hoses anything, nothing is really lost because you can revert to your previously saved state. I can only imagine how many tens of hours (more likely hundreds) I could've saved over my computing career with this ability. But the real benefits extend further still.
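
For the curious, that "snapshot first, revert if it hoses anything" workflow can even be scripted. The sketch below assumes VMware's vmrun tool and its snapshot commands; the path and snapshot name are placeholders, and products without snapshot support can substitute a plain copy of the VM's directory:

    import subprocess

    VMX = "/vms/desktop/desktop.vmx"    # placeholder VM path
    SNAP = "before-os-upgrade"          # placeholder snapshot name

    # Take a snapshot of the entire machine state before touching anything.
    subprocess.run(["vmrun", "snapshot", VMX, SNAP], check=True)

    # ... run the OS upgrade inside the guest ...

    upgrade_failed = True  # set according to how the upgrade actually went
    if upgrade_failed:
        # Roll the whole machine back to its pre-upgrade state.
        subprocess.run(["vmrun", "revertToSnapshot", VMX, SNAP], check=True)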

Protection from viruses

Let's look at the security aspect of virtualization. If your virtual machine is completely isolated (something that can be changed as needed, by the way), and you make regular backups (something that's very easy to do with virtualization), then there is a chance that you will depend less on properly functioning anti-virus software down the road.

That alone should speed up your system considerably. Scaling back anti-virus software becomes thinkable because the machine itself can be fully encapsulated. If a virus were to hit your virtual machine, it would stop right there at the virtual walls. It could go no further, at least if we consider today's common virus architectures. As far as the virus knows, it has attacked the whole machine. It may have hit every possible file, destroying everything in its wake. It might even have brought the virtual machine to a screeching halt. But it can't go any further. The real machine – the physical machine – is still running the host OS without error. No harm, no foul. And by turning to a recent backup, you're back up and running in no time.

The virus, no matter how malicious, can only take down the virtual machine. And with a good backup policy, the impact of (today's) viruses can be softened. Of course, virtualization will not free you from subscribing to anti-virus software. Viruses can still extract data and send it to someone else, so being careful will be just as important in virtualized environments as it is today (and in some cases even more so, as you may have to take care of multiple operating environments rather than just one). But if a virus strikes, you have a better defense system and a much better recovery path than what is available today – at least until we see viruses that can reach across virtual systems and through to the physical core.

Easier backups

Backing up a complete system has never been easier. With a virtualized machine, regular full backups can be automated through scripts that copy data from one location to another. It's that easy. And if you prefer a DVD-based backup, you can ZIP the entire thing and burn it.
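
As an example of how little scripting this takes, here is a minimal Python sketch that ZIPs a powered-off (or suspended) virtual machine's directory into a dated archive. The paths are placeholders:

    import shutil
    from datetime import date

    VM_DIR = "/vms/desktop"                    # the virtual machine's folder
    DEST = f"/backups/desktop-{date.today()}"  # e.g. /backups/desktop-2007-06-15

    # make_archive appends ".zip" and recursively zips the whole directory.
    archive = shutil.make_archive(DEST, "zip", VM_DIR)
    print(f"Backup written to {archive}")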

Virtualization and a good backup policy make virus recovery a no-brainer. With virtualization, you can even try to remove the virus with anti-virus software. If that blows up, just copy back the backup you made before you started and try again. And if worse comes to worst, simply go back to your last backup from before the virus hit and you're home free. Again, all of these become simple file copy operations. There's no true reinstalling and no time-consuming effort wasted on, for example, software residing on a system damaged to the point where the OS can no longer boot.



Instant on computing

This is one of my favorite virtual machine abilities. Virtual engines like VMware come with a pause button, which stops execution of a virtual machine right where it is. When paused, it won't consume any compute resources, though its memory allocation remains. The state can also be swapped out to disk, which takes a few seconds when first clicked. Pausing enables an ability that is often difficult to implement in hardware: instant-on computing. To return to exactly where you were, simply resume the paused virtual machine. Everything will be exactly as it was. In testing this feature for this article, I even paused the machine during the start-up music Windows XP plays while booting. It paused mid-melody and began saving its state to disk. When I resumed, the melody picked up exactly where it had left off and finished the tune. Now don't tell me that isn't an enthusiast's "that is really cool" moment.
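
For those who prefer the command line, the same trick can be scripted. What I describe above as pausing to disk corresponds to VMware's suspend operation; the sketch below assumes the vmrun tool is installed and uses a placeholder path:

    import subprocess

    VMX = "/vms/desktop/desktop.vmx"    # placeholder VM path

    # Freeze the guest mid-song, writing its full state to disk.
    subprocess.run(["vmrun", "suspend", VMX], check=True)

    # ...hours or days later...

    # Resume the guest exactly where it left off.
    subprocess.run(["vmrun", "start", VMX], check=True)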


Multiple OSes simultaneously

Many users may not see the advantages of running multiple OSes initially. However, many are appearing on the horizon already. When the time is ripe, adding additional virtual machines with new OSes is very easy. For example, right now many of us don't run websites because we don't want the hassle of maintaining another machine. With virtualization, a single copy of a virtual machine can be created and distributed, complete with all the software necessary to be up and running immediately. In fact, many Linux distros are already provided this way in server form. With virtualization, no new hardware is required. Suddenly, a new ability is exposed where none existed before. The limit is set by the creativity of the developer and user community.

“Virtualization Ready” is cheap

Linux is free. And distributions like Ubuntu will install on almost any machine without any problems whatsoever. This means your small-footprint host machine can be set up very quickly (about 20 minutes). When the host OS boots, it will consume about 100 MB of RAM. This can be tweaked down if you want to spend the time; it can go as low as about 50 MB. I haven't seen any Linux distros which use less, but if anyone reading this knows of some, please post a note in the comments at the end of this article.

Upgrading a machine to support virtualization often involves little more than buying more memory and a bigger hard drive. Neither is strictly required, but your machine will run closer to native speed if you do. Hardware on the order of 2+ GB of memory, 500+ GB hard drives and lower-end dual- and quad-core processors is most likely within an enthusiast's budget. Upgrading existing components also costs much less than buying a whole new machine. And for those of us fortunate enough to have purchased dual-core processors recently, future upgrade paths to quad-core will mean the end of the new-machine buying cycle for at least a few years. We can upgrade just our processors and memory to gain much computing power. We can buy bigger video cards and monitors, faster networks, larger hard drives, etc. But we can do it all piecemeal over time. And all the while, we'll still have the same virtual machines running atop the same small Linux install. Nothing in the virtual machines has changed.


Simplicity wrapped up in elegance

These virtual configurations are just crying out to be had by all, and virtualization makes it all possible. Even a casual user who spends hours posting messages on various websites could instead be hosting their own website and starting up the next big thing. Little more than running a virtual machine server in the background is required – just the setup to host their own URL. Virtualization lets them do this in the background while running other OSes in the foreground for normal computer use. Having abilities like these available opens up opportunities.



Challenges

The challenges facing the future of virtualization are not just technical ones. While there is much need for additional hardware support in the virtualization engine, that isn't the main concern or focus. The reality is that we currently have two disparate technologies moving forward in the x86 space: AMD-V and Intel VT.

These technologies do the same basic thing, but they are not compatible. Nor do they see identical futures ahead. A true standard is needed above all else. Without one, we'll have two competing camps going about their business, which will most likely divide the virtualization community. And in the end, virtualization engines like VMware will require more than double the effort to develop and maintain. It will be a mess unless one path is chosen. Companies like VMware might just choose one path, leaving the others outside.

We believe that virtualization must choose one path soon, and then run with it. We need to let AMD-V and Intel VT fight it out to see which one performs better and is more desirable. Once a winner emerges, it hopefully will develop into a standard.


Today's conclusion

Virtualization is absolutely key to successful server farm utilization in business, no doubt about it. Security is enhanced with virtualization, and initiatives to bring its power to business desktops are well under way. End-users will also have no problem finding very good reasons to make the switch. While some performance loss is occasionally observed when moving into a virtualized environment, the improved usability and flexibility of the machine more than compensate for anything lost.

Java and .NET brought virtualized binary code to the end-user with a "write once, run anywhere" attitude. Virtualization technology takes that idea a step further, bringing the whole machine forward with a "create once, run anywhere" attitude.

There is a really good chance that, if you try to squeeze more out of your PC than writing occasional letters and compiling your tax returns, virtualization could become your new best friend.

Read in the next article:  VMware installation, the future of virtualization and the author's opinion

 

Previously published articles in this series:

Part 1: What is Virtualization?