Cloud computing is the delivery of computing services—including servers, storage, databases, networking, software, analytics, and intelligence—over the Internet to offer faster innovation, flexible resources, and economies of scale. It facilitates access to applications and data from any location worldwide and from any device with an Internet connection.

Cloud Computing

Cloud computing provides a means by which we can access applications as utilities over the Internet. It allows us to create, configure, and customize applications online.

What is Cloud?

The term Cloud refers to a network or the Internet. In other words, a cloud is something that is present at a remote location. A cloud can provide services over public or private networks, i.e., WAN, LAN, or VPN. Applications such as e-mail, web conferencing, and customer relationship management (CRM) all run in the cloud.

What is Cloud Computing?

Cloud computing refers to manipulating, configuring, and accessing applications online. It offers online data storage, infrastructure, and applications.

Cloud computing can be defined as delivering computing power (CPU, RAM, network speed, storage, OS software) as a service over a network (usually the Internet) rather than physically having the computing resources at the customer's location.

Examples: AWS, Azure, Google Cloud


Cloud computing is a recently developed paradigm of distributed computing, though it is not an idea that emerged only recently. In 1969 [16] L. Kleinrock anticipated, “As of now, computer networks are still in their infancy. But as they grow up and become more sophisticated, we will probably see the spread of ’computer utilities’ which, like present electric and telephone utilities, will service individual homes and offices across the country.” His vision was a true indication of today's utility-based computing paradigm. One of the giant steps towards this world was taken in the mid 1990s, when the term grid computing was coined to describe letting consumers obtain computing power on demand. The origin of cloud computing can thus be seen as an evolution of grid computing technologies. The term cloud computing was first given prominence by Google's CEO Eric Schmidt in late 2006 (he may even have coined the term) [6]. So the birth of cloud computing is a very recent phenomenon, although its roots belong to some old ideas with new business, technical, and social perspectives. From the architectural point of view, the cloud is naturally built on an existing grid-based architecture: it uses grid services and adds technologies like virtualization along with new business models.

In brief, a cloud is essentially a collection of commodity computers networked together in the same or different geographical locations, operating together to serve a number of customers with different needs and workloads on an on-demand basis with the help of virtualization. Cloud services are provided to cloud users as utility services, like water, electricity, or telephone, using a pay-as-you-use business model. These utility services are generally described as XaaS (X as a Service), where X can be Software, Platform, Infrastructure, etc. Cloud users use these services provided by the cloud providers to build their applications on the Internet and thus deliver them to their end users. So the cloud users don't have to worry about installing and maintaining the hardware and software needed, and they can afford these services because they pay only for what they use. Cloud users can therefore reduce their IT expenditure and effort by using cloud services instead of establishing IT infrastructure themselves. The cloud is essentially provided by large distributed data centers. These data centers are often organized as grids, and the cloud is built on top of the grid services.
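The pay-as-you-use model can be made concrete with a small sketch. The resource names and rates below are invented for illustration; real providers meter many more dimensions and publish their own price lists.

```python
# Illustrative utility-billing sketch (not any provider's real pricing API):
# each metered resource has a unit rate, and the bill is usage times rate.
RATES = {
    "cpu_hours": 0.05,          # assumed price per CPU-hour
    "storage_gb_month": 0.02,   # assumed price per GB-month stored
    "requests_per_1000": 0.01,  # assumed price per 1000 API requests
}

def monthly_bill(usage):
    """usage maps a metered resource name to the quantity consumed."""
    return sum(RATES[resource] * quantity for resource, quantity in usage.items())

bill = monthly_bill({"cpu_hours": 100, "storage_gb_month": 50, "requests_per_1000": 20})
print(f"${bill:.2f}")  # 100*0.05 + 50*0.02 + 20*0.01 = $6.20
```

The key property of the model is visible in the code: cost is zero when usage is zero, so there is no upfront infrastructure expenditure.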

Cloud users are provided with virtual images of the physical machines in the data centers. This virtualization is one of the key concepts of cloud computing, as it essentially builds an abstraction over the physical system. Many cloud applications are gaining popularity day by day for their availability, reliability, scalability, and utility model. These applications make distributed computing easy, as the critical aspects are handled by the cloud provider itself.

Cloud computing is growing nowadays in the interest of technical and business organizations, but it can also be beneficial for solving social issues. In recent times, e-governance has been implemented in developing countries to improve the efficiency and effectiveness of governance. This approach can be much improved by using cloud computing instead of traditional ICT. In India, the economy is agriculture-based and most citizens live in rural areas. The standard of living, agricultural productivity, etc. can be enhanced by utilizing cloud computing in a proper way. Both of these applications of cloud computing have technological as well as social challenges to overcome.

Here we try to clarify some of these ideas: Why is cloud computing a buzzword today, i.e., what benefits do the provider and the users get from the cloud? Though the idea goes back to the 1990s, what situation has made it indispensable today? How is a cloud built? What differentiates it from similar terms like grid computing and utility computing? What different services are provided by the cloud providers? Though cloud computing nowadays centers on business enterprises rather than non-profit organizations, how can this new paradigm be used in services like e-governance and in the social development issues of rural India?

Cloud Computing Basics

Cloud computing is a paradigm of distributed computing that provides customers with on-demand, utility-based computing services. Cloud users can in turn provide more reliable, available, and up-to-date services to their own clients. The cloud itself consists of physical machines in the data centers of cloud providers, with virtualization provided on top of these physical machines; the resulting virtual machines are provided to the cloud users. Different cloud providers offer cloud services at different abstraction levels: e.g., Amazon EC2 enables users to handle very low-level details, whereas Google App Engine provides a development platform for developers to build their applications on. So cloud services are divided into types such as Software as a Service, Platform as a Service, and Infrastructure as a Service. These services are available over the Internet across the whole world, with the cloud acting as a single point of access for serving all customers. Cloud computing architecture addresses the difficulties of large-scale data processing.

Types of Cloud

Clouds can be of three types.

1. Private Cloud – This type of cloud is maintained within an organization and used solely for its internal purposes, so the utility model is not a big factor in this scenario. Many companies are moving towards this setting, and experts consider it the first step for an organization in moving into the cloud. Security and network bandwidth are not critical issues for a private cloud.

2. Public Cloud – In this type, an organization rents cloud services from cloud providers on an on-demand basis. Services are provided to the users using a utility computing model.

3. Hybrid Cloud – This type of cloud is composed of multiple internal or external clouds. This is the scenario when an organization moves from its internal private cloud to the public cloud computing domain.

Cloud Stakeholders

To know why cloud computing is used, let's first consider who uses it; then we can discuss what advantages they get from the cloud. There are three types of stakeholders: cloud providers, cloud users, and end users. Cloud providers provide cloud services to the cloud users. These cloud services take the form of utility computing, i.e., the cloud users use these services on a pay-as-you-go model. The cloud users develop their products using these services and deliver the products to the end users.

Advantages of using Cloud

The advantages of using cloud services can be technical, architectural, business-related, etc.

1. Cloud Providers’ point of view

(a) Most of the data centers today are underutilized; they are typically only around 15% utilized. These data centers need spare capacity just to cope with the huge spikes that sometimes occur in server usage. Large companies having those data centers can easily rent that computing power to other organizations, earn a profit from it, and also properly utilize the resources (like power) needed to run the data center.

(b) Companies having large data centers have already deployed the resources, so providing cloud services requires very little additional investment and the cost is incremental.

2. Cloud Users’ point of view

(a) Cloud users need not take care of the hardware and software they use, and they don't have to worry about maintenance. The users are no longer tied to a single traditional system.

(b) Virtualization technology gives users the illusion that all the resources they need are available to them.

(c) Cloud users can use the resources on an on-demand basis and pay only for what they use. So the users can plan well to reduce their usage and minimize their expenditure.

(d) Scalability is one of the major advantages for cloud users. Scalability is provided dynamically: users get as many resources as they need. Thus this model fits perfectly for managing rare spikes in demand.
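Point (d) can be illustrated with a toy autoscaling calculation. The per-instance capacity figure is an assumption made for the example; the point is that rented capacity tracks demand instead of being provisioned for the peak.

```python
# Hypothetical sketch of dynamic scaling: the number of rented instances
# tracks demand, so rare spikes are absorbed without owning peak capacity.
CAPACITY_PER_INSTANCE = 100  # requests/sec one instance can serve (assumed)

def instances_needed(load):
    """Smallest instance count that covers the current load (ceiling division)."""
    return max(1, -(-load // CAPACITY_PER_INSTANCE))

hourly_load = [80, 120, 950, 110]        # a one-hour spike at 950 req/s
plan = [instances_needed(l) for l in hourly_load]
print(plan)  # [1, 2, 10, 2] - extra capacity is rented only while the spike lasts
```

With fixed on-premise hardware, the operator would have to own ten instances' worth of capacity at all times; here nine of them are paid for during only one hour.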

Motivation towards Cloud in Recent Time

Cloud computing is not a new idea but an evolution of some old paradigms of distributed computing. The enthusiasm about cloud computing in the recent past is due to some recent technology trends and business models [5].


1. High demand for interactive applications – Applications with real-time response and the capability of providing information from other users or from nonhuman sensors are gaining more and more popularity today. These are generally attracted to the cloud not only because of high availability but also because these services are generally data-intensive and require analyzing data across different sources.

2. Parallel batch processing – The cloud inherently supports batch processing and can analyze terabytes of data very efficiently. Programming models like Google's MapReduce [18] and Yahoo!'s open source counterpart Hadoop can be used to do this, hiding the operational complexity of parallel processing across hundreds of cloud computing servers.

3. New trends in the business world and scientific community – In recent times business enterprises have become interested in discovering customers' needs, buying patterns, and supply chains to make top management decisions. This requires analysis of very large amounts of online data, which can be done very easily with the help of the cloud. The Yahoo! homepage is a very good example: it shows the hottest news in the country, and according to users' interests it changes the ads and other sections of the page. Beyond these, many scientific experiments need very time-consuming data processing jobs, such as the LHC (Large Hadron Collider). Those can be done in the cloud.

4. Extensive desktop applications – Some desktop applications like Matlab and Mathematica are becoming so compute-intensive that a single desktop machine is no longer enough to run them. So they are being developed to be capable of using cloud computing to perform extensive evaluations.
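The map-reduce model mentioned under parallel batch processing can be illustrated in miniature. Real systems like Hadoop distribute the map and reduce phases across many machines and handle failures transparently, but the core programming model fits in a few lines (this is a single-process sketch, not a distributed implementation):

```python
# A minimal map-reduce word count in plain Python.
from collections import defaultdict

def map_phase(document):
    """Map: emit a (word, 1) pair for every word in one document."""
    return [(word, 1) for word in document.split()]

def reduce_phase(pairs):
    """Reduce: sum the counts for each word (shuffle and reduce in one loop)."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["the cloud", "the grid and the cloud"]
pairs = [p for d in docs for p in map_phase(d)]   # map over every document
print(reduce_phase(pairs))  # {'the': 3, 'cloud': 2, 'grid': 1, 'and': 1}
```

Because each `map_phase` call touches only one document and each word's counts can be summed independently, both phases parallelize naturally across machines, which is exactly what makes the model attractive for cloud-scale batch jobs.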

Cloud Architecture

The cloud providers actually have physical data centers from which they provide virtualized services to their users over the Internet. The cloud providers often provide separation between application and data. This scenario is shown in Figure 2. The underlying physical machines are generally organized in grids, and they are usually geographically distributed. Virtualization plays an important role in the cloud scenario. The data center hosts provide the physical hardware on which the virtual machines reside. Users can potentially use any OS supported by the virtual machines used.

Operating systems are designed for specific hardware and software. This results in a lack of portability of operating systems and software from one machine to another machine that uses a different instruction set architecture. The concept of the virtual machine solves this problem by acting as an interface between the hardware and the operating system; such VMs are called system VMs. Another category of virtual machine, the process virtual machine, acts as an abstraction layer between the operating system and applications. Virtualization can very roughly be described as software translating the hardware instructions generated by conventional software into a format understandable by the physical hardware. Virtualization also includes the mapping of virtual resources like registers and memory onto real hardware resources. The underlying platform in virtualization is generally referred to as the host, and the software that runs in the VM environment is called the guest. Here the virtualization layer covers the physical hardware: the operating system accesses the physical hardware through the virtualization layer, and applications can issue instructions using the OS interface as well as directly through the virtualization layer interface. This design enables users to run applications that are not compatible with the operating system.
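The virtual-to-real resource mapping described above can be illustrated with a toy layer that maps each guest's virtual registers onto distinct slots of one physical register bank. This is a deliberately simplified model of the idea, not how a real hypervisor is implemented:

```python
# Toy illustration of virtual-to-physical resource mapping: each guest sees
# registers "r0", "r1", ... while the virtualization layer transparently
# maps them onto distinct slots of one shared physical bank.
class VirtualizationLayer:
    def __init__(self, physical_slots):
        self.physical = [0] * physical_slots
        self.maps = {}          # guest name -> {virtual reg -> physical slot}
        self.next_free = 0

    def add_guest(self, name, num_regs):
        self.maps[name] = {f"r{i}": self.next_free + i for i in range(num_regs)}
        self.next_free += num_regs

    def write(self, guest, reg, value):   # a guest issues a virtual write
        self.physical[self.maps[guest][reg]] = value

    def read(self, guest, reg):
        return self.physical[self.maps[guest][reg]]

layer = VirtualizationLayer(physical_slots=8)
layer.add_guest("vmA", 2)
layer.add_guest("vmB", 2)
layer.write("vmA", "r0", 7)
layer.write("vmB", "r0", 9)   # same virtual name, different physical slot
print(layer.read("vmA", "r0"), layer.read("vmB", "r0"))  # 7 9
```

Each guest believes it owns register `r0`, yet the two writes land in different physical slots; this isolation through indirection is the essence of the abstraction the paragraph describes.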

Virtualization enables the migration of a virtual image from one physical machine to another. This feature is useful for the cloud, as data locality enables many optimizations, and it also helps with taking backups in different locations. It further enables the provider to shut down some of the data center's physical machines to reduce power consumption.

Comparison between Cloud Computing and Grid Computing

Most cloud architectures are built on grid architecture and utilize its services. The grid is also a form of distributed computing architecture in which organizations owning data centers collaborate with each other for mutual benefit. At first look it may seem that cloud computing is no different from its originator, but there are substantial differences between them in spite of the many similarities.

Relation between Cloud Computing and Utility Computing

Cloud users enjoy a utility computing model when interacting with cloud service providers, but utility computing is not essentially the same as cloud computing. Utility computing is the provision of computing resources, such as computation and storage, as a metered service similar to a traditional public utility like the electricity, water, or telephone network. This service might be provided by a dedicated computer cluster specifically built for the purpose of being rented out, or even by an underutilized supercomputer. The cloud is one option for providing utility computing to users.

Popular Cloud Applications: A Case study

Applications using cloud computing are gaining popularity day by day for their high availability, reliability, and utility service model. Today many cloud providers are in the IT market. Of those, Google App Engine, Windows Azure, and Amazon EC2/S3 are prominent ones for their popularity and technical merits.

Amazon EC2 and S3 Services

Amazon Elastic Compute Cloud (EC2) [13] is one of the biggest offerings of Infrastructure as a Service. It provides the computer architecture with the XEN virtual machine, and Amazon EC2 is one of the biggest deployments of the XEN architecture to date. Clients can install their preferred operating system on the virtual machine. EC2 uses the Simple Storage Service (S3) for storage of data. Users can hire a suitable amount of CPU power, storage, and memory without any upfront commitment, and can control the entire software stack from the kernel upwards. The architecture has two components: EC2 for computing purposes and S3 for storage purposes.

• Simple Storage Service: S3 can be thought of as a globally available distributed hash table with high-level access control. Data is stored as name/value pairs. Names are like UNIX file names, and the value can be an object of size up to 5 GB with up to 4 KB of metadata per object. All objects in Amazon's S3 must fit into the global namespace, which consists of a “bucket name” and an “object name”. Bucket names are like user names in a traditional email account and are provided by Amazon on a first-come-first-served basis; an AWS (Amazon Web Services) account can have a maximum of 100 buckets. Data can be sent to S3 via a SOAP-based API or with raw HTTP “PUT” commands, and retrieved using SOAP, HTTP, or BitTorrent. When using BitTorrent, the S3 system operates as both the tracker and the initial seeder. There are also some tools available that enable users to view S3 as a remote file system. Upload and download rates to and from S3 are not that exciting: one developer from Germany reported experiencing 10-100 KBps, and this rate can go up to 1-2 MBps on the higher side depending on the time of day. Although the speed is not fascinating, it is good enough for delivering web objects and for backup purposes, though it is not suitable for computation.
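As an illustration, the two-level bucket/object namespace described above can be modeled as a small in-memory class. This is a toy model whose method names and shapes are invented for the sketch; it is not the real AWS API:

```python
# Toy model of S3's namespace: globally unique bucket names, objects stored
# as name/value pairs inside a bucket, and a per-account bucket limit.
class ToyS3:
    MAX_BUCKETS_PER_ACCOUNT = 100   # the per-account limit mentioned above

    def __init__(self):
        self.buckets = {}           # bucket name -> {object name -> bytes}
        self.owners = {}            # bucket name -> owning account

    def create_bucket(self, account, name):
        if name in self.buckets:                   # global namespace:
            raise ValueError("bucket name taken")  # first come, first served
        if sum(1 for o in self.owners.values() if o == account) >= self.MAX_BUCKETS_PER_ACCOUNT:
            raise ValueError("bucket limit reached")
        self.buckets[name] = {}
        self.owners[name] = account

    def put(self, bucket, key, value):
        self.buckets[bucket][key] = value

    def get(self, bucket, key):
        return self.buckets[bucket][key]

s3 = ToyS3()
s3.create_bucket("alice", "photos")
s3.put("photos", "2009/cat.jpg", b"...bytes...")
print(s3.get("photos", "2009/cat.jpg"))
```

The model captures the two properties the text emphasizes: bucket names live in one global namespace shared by all accounts, while object names only need to be unique within their bucket.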

Amazon S3 has very impressive support for privacy, integrity, and short-term availability. Long-term availability is unknown, as it depends on the internal commitment of the Amazon data centers. Data privacy can be obtained by encrypting the data to be stored, but this encryption must be done by the user before storing the data in S3. One can use SSL with HTTPS to connect to S3 for more security, but this use of SSL also increases upload/download time. Data integrity can be achieved by end-to-end MD5 checking: when an object is stored into S3, S3 returns the MD5 of that object, which can easily be checked against a previously computed hash value to guarantee data integrity. Short-term availability depends upon Amazon's connectivity and the load on its servers at that instant. Once the data is actually in S3, it is Amazon's responsibility to take care of its availability. Amazon claims that the data is backed up on multiple hard drives in multiple data centers, but doesn't guarantee this through any Service Level Agreement. There is no backup or recovery mechanism if the user accidentally deletes any data. Amazon has a very impressive scheme of authentication in comparison to other cloud services. Every AWS account has an Access Key ID and a Secret Key; the ID is a 20-character string and the Key is a 41-character string. When signing, an HMAC is first computed over the request parameters using that Key. On the Amazon server the HMAC is computed again and compared with the value previously computed on the client side. These requests also include a timestamp to prevent replay attacks.
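The integrity and authentication mechanisms just described can be sketched with Python's standard `hashlib` and `hmac` modules. The key, parameter string, and hash choice below are illustrative only; they do not reproduce Amazon's exact wire format:

```python
# (1) End-to-end MD5: the client re-hashes the object and compares against
#     the digest the store returned at upload time.
import hashlib
import hmac

obj = b"backup contents"
stored_md5 = hashlib.md5(obj).hexdigest()          # digest returned by the store
assert hashlib.md5(obj).hexdigest() == stored_md5  # client verifies integrity

# (2) HMAC request signing: client signs the request parameters (including a
#     timestamp to prevent replays) with its secret key; the server recomputes
#     the HMAC over the same parameters and compares.
secret_key = b"41-character-secret-key-goes-here"  # illustrative value
params = "GET /photos/cat.jpg ts=1234567890"
signature = hmac.new(secret_key, params.encode(), hashlib.sha1).hexdigest()

expected = hmac.new(secret_key, params.encode(), hashlib.sha1).hexdigest()
print(hmac.compare_digest(signature, expected))  # True
```

Note the asymmetry: the MD5 check protects against corruption in transit or at rest, while the HMAC proves the request came from someone holding the secret key, since the key itself is never sent over the wire.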

• Elastic Compute Cloud: As the name implies, EC2 rents a cloud of computers to users, with the flexibility of choosing the configuration of the virtual machine, such as RAM size, local disk size, and processor speed. Machines that deliver EC2 services are actually virtual machines running on top of the XEN platform. Users can store a disk image inside S3 and create a virtual machine in EC2 from it using tools provided by Amazon. This virtual machine can be easily instantiated using a Java program and can also be monitored. As EC2 is based on XEN, it supports any Linux distribution as well as other OSes. Amazon makes no promises about the reliability of the EC2 computers: any machine can crash at any moment, and they are not backed up. Although these machines generally don't crash, according to users' experience, it is safer to store information in S3, which is a more reliable and replicated service. The EC2 security model is similar to that of S3; the only difference is that the commands are signed with an X.509 private key. But this key is downloaded from the AWS account, so security still depends fundamentally on the AWS username and password.

Windows Azure

Windows Azure is an intermediate point in the spectrum of flexibility vs. programmer convenience. The system uses .NET libraries to provide a language-independent managed environment. This service falls under the category of Platform as a Service, though it actually sits between a complete application framework like Google App Engine and hardware virtual machines like EC2. Azure applications run on machines in Microsoft data centers; customers can use the service to run applications and store data on Internet-accessible machines owned by Microsoft.

The Windows Azure platform provides three fundamental components: a compute component, a storage component, and a fabric component.

• The Compute Service: The primary goal of this platform is to support a large number of simultaneous users. (Microsoft also said that it would use Azure to build its own SaaS applications, which motivated many potential users.) To allow applications to scale out, Microsoft runs multiple instances of an application on virtual machines provided by a hypervisor. Developers use the Windows Azure portal through a Web browser, signing in with a Windows Live ID to a hosting account, a storage account, or both. Two different types of Azure instances are available: Web role instances and Worker role instances.

  • Web role instance: As the name implies, this type of instance can accept HTTP or HTTPS requests. For this facility Microsoft uses IIS (Internet Information Services) as a web server inside the provided VM. Developers can build applications using ASP.NET, Windows Communication Foundation (WCF), or any other .NET technology, as well as native code like C++; PHP- and Java-based technologies are also supported in Azure. Azure scales applications by running multiple instances without any affinity to a particular Web role instance, so it is perfectly natural for an Azure application to serve multiple requests from a single user via multiple instances. This requires writing the client state to Azure storage after each client request.
  • Worker role instance: This type of instance is very similar to the Web role instance, but unlike Web role instances, these don't have IIS configured. They can be configured to run an executable of the user's choice; a Worker role instance is more likely to function like a background job. Web role instances can be used to accept requests from users, which can then be processed by Worker role instances at a later point in time. For compute-intensive work, many Worker role instances can run in parallel.
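The statelessness requirement described for Web role instances can be sketched as follows: because any of several identical instances may serve a client's next request, per-client state must be written to shared storage after every request. In this hypothetical sketch a plain dict stands in for the Azure storage service:

```python
# Why role instances must externalize state: no instance keeps client data
# in local memory, so any instance can serve any request.
shared_storage = {}   # stand-in for the external storage service

def handle_request(instance_id, client_id, item):
    """A stateless handler: reads and writes client state via shared storage."""
    cart = shared_storage.get(client_id, [])
    cart.append(item)
    shared_storage[client_id] = cart      # persist after every request
    return f"instance {instance_id} -> cart {cart}"

# Consecutive requests from one client land on different instances,
# yet the cart survives because it lives in shared storage.
print(handle_request(0, "alice", "book"))
print(handle_request(1, "alice", "pen"))   # instance 1 still sees "book"
```

If the handler had instead kept `cart` in a module-level variable per instance, the second request would have lost the first item, which is exactly the failure mode the no-affinity scaling model rules out.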

Logging and monitoring of Azure applications is made easy by the provision of an application-wide log. A developer can collect performance-related information, such as measures of CPU usage, and store crash dumps in storage. Azure doesn't give developers the freedom to use their own VM image for Windows Azure; the platform maintains its own Windows. Applications in Azure run only in user mode: no administrative access is allowed. So Windows Azure can update the operating system in each VM without any concern about affecting the applications running on it. This approach separates administrative work from the user domain.

Cloud Computing Application in Indian context

Today most studies in cloud computing relate to commercial benefits. But the idea can also be successfully applied to non-profit organizations and to social benefit. In developing countries like India, cloud computing can bring about a revolution in the field of low-cost computing with greater efficiency, availability, and reliability. Recently e-governance has started to flourish in these countries, and experts envision that utility-based computing has a great future there. Cloud computing can also be applied to the development of rural life in India by building information hubs that give the concerned people greater access to required information and enable them to share their experiences to build new knowledge bases.


Cloud computing is a newly developing paradigm of distributed computing. Virtualization in combination with the utility computing model can make a difference in the IT industry as well as from a social perspective. Though cloud computing is still in its infancy, it is clearly gaining momentum. Organizations like Google, Yahoo!, and Amazon are already providing cloud services, and products like Google App Engine, Amazon EC2, and Windows Azure are capturing the market with their ease of use, availability, and utility computing model. Users don't have to worry about the intricacies of distributed programming, as these are taken care of by the cloud providers; they can devote more time to their own domain work rather than to administrative tasks. Business organizations are also showing increasing interest in using cloud services. There are many open research issues in this domain, such as security in the cloud, virtual machine migration, and dealing with large data for analysis purposes. In developing countries like India, cloud computing can be applied to e-governance and rural development with great success, although, as we have seen, there are some crucial issues to be solved before cloud computing can be deployed for these social purposes. They can be addressed by detailed study of the subject.

Advantages and Disadvantages of Cloud Computing


Advantages:

1) Easy implementation. Cloud hosting allows businesses to retain the same applications and business processes without having to deal with the backend technicalities. Readily manageable over the Internet, a cloud infrastructure can be accessed by enterprises easily and quickly.

2) Accessibility. Access your data anywhere, anytime. An Internet cloud infrastructure maximizes enterprise productivity and efficiency by ensuring your application is always accessible. This allows for easy collaboration and sharing among users in multiple locations.

3) No hardware required. Since everything will be hosted in the cloud, a physical storage center is no longer needed. However, a backup could be worth looking into in the event of a disaster that could leave your company's productivity stagnant.

4) Cost per head. Overhead technology costs are kept at a minimum with cloud hosting services, enabling businesses to use the extra time and resources for improving the company infrastructure.

5) Flexibility for growth. The cloud is easily scalable so companies can add or subtract resources based on their needs. As companies grow, their system will grow with them.

6) Efficient recovery. Cloud computing delivers faster and more accurate retrievals of applications and data. With less downtime, it is the most efficient recovery plan.


Disadvantages:

1) No longer in control. When moving services to the cloud, you are handing over your data and information. Companies that have an in-house IT staff will no longer be able to handle issues on their own.

2) You may not get all the features. Not all cloud services are the same. Some cloud providers tend to offer limited versions with only the most popular features enabled, so you may not receive every feature or customization you want. Before signing up, make sure you know what your cloud service provider offers.

3) It doesn't mean you should do away with servers. You may have fewer servers to handle, which means less for your IT staff to manage, but that doesn't mean you can let go of all your servers and staff. While it may seem costly to maintain both data centers and a cloud infrastructure, redundancy is key for backup and recovery.

4) No redundancy. A cloud server is neither redundant nor backed up. As technology may fail here and there, avoid getting burned by purchasing a redundancy plan. Although it is an extra cost, in most cases it will be well worth it.

5) Bandwidth issues. For ideal performance, clients have to plan accordingly and not pack large amounts of servers and storage devices into a small set of data centers.