
Cloud Computing vs. Virtualization

Cloud computing and virtualization are both technologies developed to maximize the use of computing resources while reducing their cost. They are also mentioned frequently in discussions of high availability and redundancy. While it is not uncommon to hear them discussed interchangeably, they are very different approaches to the problem of maximizing the use of available resources, and those differences lead to some important considerations when selecting between the two.

Virtualization: More Servers on the Same Hardware



It used to be that if you needed more computing power for an application, you had to purchase additional hardware. Redundancy was achieved by keeping duplicate hardware sitting in standby mode in case something failed. The problem was that as CPUs grew more powerful and gained multiple cores, much of that computing capacity went unused, which cost companies a great deal of money.


Enter virtualization. Simply stated, virtualization is a technique that allows you to run more than one server on the same hardware. Typically, one server is the host and controls access to the physical server's resources. One or more virtual servers then run within containers provided by the host. The container is transparent to the virtual server, so its operating system does not need to be aware of the virtual environment. This allows servers to be consolidated, which reduces hardware costs. Fewer physical servers also means less power consumption, which further reduces cost.
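To make the host/guest relationship concrete, here is a minimal sketch using the libvirt Python bindings (assuming a KVM/QEMU host with the libvirt-python package installed) that connects to a physical host and lists the virtual servers sharing its hardware:

import libvirt  # libvirt-python bindings; assumes a local KVM/QEMU host

# Connect to the hypervisor managing this physical server's resources
conn = libvirt.open('qemu:///system')

# Each "domain" is one virtual server consolidated onto this host
for dom in conn.listAllDomains():
    state, max_mem_kib, mem_kib, vcpus, cpu_time = dom.info()
    status = 'running' if dom.isActive() else 'stopped'
    print("%s: %d vCPUs, %d MiB RAM, %s" % (dom.name(), vcpus, mem_kib // 1024, status))

conn.close()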

Most virtualization systems allow virtual servers to be moved easily from one physical host to another. This makes it very simple for system administrators to reconfigure servers based on resource demand or to move a virtual server off a failing physical node.
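The same libvirt API can drive that kind of move. Here is a sketch of a live migration between two nodes; the host and guest names are invented, and it assumes SSH connectivity and shared storage between the physical hosts:

import libvirt  # assumes libvirt-python plus SSH access between the hosts

src = libvirt.open('qemu:///system')                  # the current (perhaps failing) node
dst = libvirt.open('qemu+ssh://standby-host/system')  # hypothetical destination host

# Move the guest while it keeps running; its OS never notices the new hardware
dom = src.lookupByName('web-server-01')               # hypothetical guest name
dom.migrate(dst, libvirt.VIR_MIGRATE_LIVE, None, None, 0)

src.close()
dst.close()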

Virtualization helps reduce complexity by reducing the number of physical hosts, but it still involves purchasing servers and software and maintaining your own infrastructure. Its greatest benefit is reducing the cost of that infrastructure by maximizing the usage of the physical resources.

Cloud Computing: Measured Resources, Pay for What You Use


While virtualization may be used to provide cloud computing, cloud computing is quite different from virtualization. Cloud computing may look like virtualization because your application appears to run on a virtual server detached from any reliance on a single physical host, and the two are similar in that respect. However, cloud computing is better described as a service, whereas virtualization is part of a physical infrastructure.

Cloud computing grew out of the concept of utility computing: the idea that computing resources and hardware would become a commodity, to the point that companies would purchase them from a central pool and pay only for the CPU cycles, RAM, storage, and bandwidth they actually used. These resources would be metered to enable a pay-for-what-you-use model, much like buying electricity from the electric company, which is how it became known as utility computing.
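To see how such metering works in practice, here is a small Python sketch of how a monthly bill might be computed; all of the rates and usage figures are invented for illustration:

# Hypothetical per-unit rates, in the spirit of an electric bill
RATE_CPU_HOUR = 0.05       # dollars per CPU-hour
RATE_GB_RAM_HOUR = 0.01    # dollars per GB of RAM per hour
RATE_GB_STORAGE = 0.10     # dollars per GB-month of storage
RATE_GB_BANDWIDTH = 0.12   # dollars per GB transferred

def monthly_bill(cpu_hours, ram_gb_hours, storage_gb, bandwidth_gb):
    """Pay only for what was used this month; idle capacity costs nothing."""
    return (cpu_hours * RATE_CPU_HOUR
            + ram_gb_hours * RATE_GB_RAM_HOUR
            + storage_gb * RATE_GB_STORAGE
            + bandwidth_gb * RATE_GB_BANDWIDTH)

# Example month: 720 CPU-hours, 1440 GB-hours of RAM, 50 GB stored, 200 GB transferred
print("$%.2f" % monthly_bill(720, 1440, 50, 200))  # => $79.40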

It is common for cloud computing services to be distributed across many dedicated servers. This provides redundancy, high availability, and even geographic redundancy, and it also makes cloud computing very flexible. It is easy to add resources to your application: you simply use them, just as you use electricity when you need it. Cloud computing is designed with scalability in mind.
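Because the resources come from a shared pool, adding capacity is typically a single API call. The sketch below uses the Python requests library against a hypothetical provider endpoint; the URL, token, and payload fields are all invented for illustration:

import requests

API = "https://api.example-cloud.com/v1"              # hypothetical provider endpoint
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}  # placeholder credential

# Scale out: ask the provider for two more application instances when load rises
resp = requests.post(API + "/instances",
                     headers=HEADERS,
                     json={"image": "my-app", "count": 2, "size": "small"})
resp.raise_for_status()
print(resp.json())  # metering begins as soon as the new instances are running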

The biggest drawback of cloud computing is, of course, that you do not control the servers. Your data is out in the cloud, and you have to trust the provider to keep it safe. Many cloud computing services offer SLAs that promise a certain level of service and safety, but it is critical to read the fine print: a failure of the cloud service could still result in the loss of your data.

Which One is Right for My Application?

How do you decide whether you need virtualization or cloud computing? Both can save money, but they do it in different ways, and one key consideration is when you need to save that money. A new application needs servers, and with virtualization you have to purchase that infrastructure yourself. You will spend less upfront than you would on dedicated hardware, and you will save money over time, but a large amount of capital is still spent early on. Cloud computing works in just the opposite fashion. Your new application may not need many resources initially, so cloud computing will likely cost very little in the beginning. However, as your application becomes popular and consumes more resources, paying by the resource may become more expensive than running virtual servers on your own infrastructure.
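One way to reason about those two cost curves is a simple break-even calculation; every figure below is invented purely for illustration:

# Hypothetical costs -- all figures here are invented for illustration
VIRT_UPFRONT = 20000.0    # servers, virtualization software, initial setup
VIRT_MONTHLY = 500.0      # power, maintenance, administration
CLOUD_MONTHLY = 1200.0    # metered usage once the application is established

# Find the month where owning virtualized infrastructure becomes cheaper
# than continuing to pay the cloud provider by the resource
for month in range(1, 61):
    if VIRT_UPFRONT + VIRT_MONTHLY * month < CLOUD_MONTHLY * month:
        print("Break-even around month %d" % month)  # about month 29 here
        break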

Another important consideration is how safe your data will be. Are you comfortable with the cloud computing vendor? In a virtualized environment, your data is on your own hardware: you know who has access to it, where it is, and how it is being backed up, and you know exactly how you will handle a disaster recovery scenario. Cloud computing, on the other hand, places much of that control in the hands of the vendor. While you will likely have an SLA to fall back on, it may not be enough. In 2009, Microsoft had a failure in a data center that provided cloud computing services for T-Mobile's Sidekick service. The failure resulted in the loss of customer data and was a huge blow to T-Mobile's reputation. While the SLA will likely provide some monetary compensation to T-Mobile, it cannot repair the company's standing with the customers who lost data. You will want to consider carefully whether an SLA covers all your bases as well.

Virtualization and cloud computing are both ways to reduce infrastructure cost by maximizing the utilization of computing resources. They are not the same thing, however. Virtualization enables server consolidation by hosting many virtual servers on a single piece of hardware, whereas cloud computing is a service that delivers computing resources on a metered, pay-as-you-go model. Both have advantages, but you will want to weigh factors such as upfront cost versus long-term cost, and the possible loss of control over your infrastructure, when deciding which model to use.
