The need for virtualization in cloud computing

Why is virtualization needed for cloud computing?

Virtualization is the key to cloud computing. It is the enabling technology behind the creation of an intelligent abstraction layer that hides the complexity of the underlying hardware or software.
Virtualization prevents faulty or misbehaving guest code from damaging the underlying system.
It provides better resource utilization and lower cost.
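As a sketch of that abstraction layer, the snippet below uses the libvirt Python bindings (an assumption; the notes name no particular toolkit) to define and boot a VM from a declarative XML description. The guest sees only generic virtual devices, never the physical hardware underneath; the VM name, disk path, and sizes are hypothetical.

    import libvirt

    # Hypothetical guest description: the tenant sees a generic virtio
    # disk, not the physical disk controller underneath.
    DOMAIN_XML = """
    <domain type='qemu'>
      <name>tenant-vm-01</name>
      <memory unit='MiB'>512</memory>
      <vcpu>1</vcpu>
      <os><type arch='x86_64'>hvm</type></os>
      <devices>
        <disk type='file' device='disk'>
          <source file='/var/lib/libvirt/images/tenant-vm-01.qcow2'/>
          <target dev='vda' bus='virtio'/>
        </disk>
      </devices>
    </domain>
    """

    conn = libvirt.open("qemu:///system")  # connect to the local hypervisor
    dom = conn.defineXML(DOMAIN_XML)       # register the guest definition
    dom.create()                           # boot the VM
    conn.close()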

Virtualization suits cloud computing because cloud computing is much more than a web app running in IIS. Active Directory isn't a web app. SQL Server isn't a web app. To get the full benefit of running code in the cloud, you need the option to install a wide variety of services on cloud nodes, just as you would in your own IT data center. Many of those services are not web apps governed by IIS. If you look at the cloud only as a web-app host, you'll have difficulty building anything that isn't a web app.

Virtualization is also needed because the folks running and administering the cloud hardware under the covers need ultimate authority and control to shut down, suspend, and occasionally relocate your cloud code to a different physical machine. If some bit of code in your cloud app goes rogue and runs out of control, it is much harder to shut down that service or that machine when the code runs directly on the physical hardware than when it runs in a VM managed by a hypervisor, which can act on the guest from outside without any cooperation from the code inside it, as sketched below.
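A minimal sketch of that outside control, again assuming the libvirt Python bindings as the management interface; the guest name is hypothetical:

    import libvirt

    conn = libvirt.open("qemu:///system")
    dom = conn.lookupByName("tenant-vm-01")  # hypothetical rogue guest

    dom.suspend()   # freeze the guest's vCPUs; the code inside cannot object
    dom.resume()    # thaw it again after inspection

    # If the guest is truly out of control, pull the virtual plug.
    # (Live relocation to another physical host is also possible
    # via dom.migrate().)
    if dom.isActive():
        dom.destroy()  # hard power-off, no cooperation from the guest needed
    conn.close()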

For resource utilization: multiple tenants (VMs) execute on the same physical hardware, but with much stronger isolation from each other than IIS's process walls provide. The result is a lower cost per tenant and a higher income per unit of hardware.
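A back-of-the-envelope sketch of that economics point, with purely hypothetical numbers:

    # Illustrative figures only; not from the notes.
    server_cost_per_month = 400.0   # amortized hardware + power + admin
    tenants_bare_metal = 1          # one tenant per physical box
    tenants_virtualized = 12        # VMs consolidated onto the same box

    cost_bare = server_cost_per_month / tenants_bare_metal
    cost_virt = server_cost_per_month / tenants_virtualized
    print(f"bare metal:  ${cost_bare:.2f} per tenant per month")
    print(f"virtualized: ${cost_virt:.2f} per tenant per month")

With these assumed numbers, consolidation drops the cost per tenant from $400.00 to about $33.33 per month on the same hardware.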