Cloud Computing: Cloud computing is the on-demand availability of computer system resources, especially data storage and computing power, without direct active management by the user. The term is generally used to describe data centers available to many users over the Internet. Large clouds, predominant today, often have functions distributed over multiple locations from central servers. If the connection to the user is relatively close, it may be designated an edge server.
A cloud may be limited to a single organization (enterprise cloud), be available to many organizations (public cloud), or be a combination of both (hybrid cloud).
Cloud computing relies on resource sharing to achieve consistency and economies of scale.
Advocates of public and hybrid clouds point out that cloud computing allows companies to avoid or minimize up-front IT infrastructure costs. Proponents also claim that cloud computing allows businesses to get their applications up and running faster, with improved manageability and less maintenance, and that it enables IT teams to adjust resources more rapidly to meet fluctuating and unpredictable demand. Cloud providers typically use a “pay-as-you-go” model, which can lead to unexpected operating expenses if administrators are not familiar with cloud pricing models.
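To make the pricing point concrete, here is a minimal back-of-the-envelope sketch in Python of how a monthly bill accrues under a pay-as-you-go model; the resource names, unit rates, and usage figures are illustrative assumptions, not any particular provider’s prices.

```python
# Back-of-the-envelope monthly cost under a pay-as-you-go model.
# All rates and usage figures are illustrative assumptions, not any
# provider's actual prices.

HOURS_PER_MONTH = 730  # average number of hours in a month

usage = {
    # resource: (quantity, unit price in dollars)
    "vm_hours":   (4 * HOURS_PER_MONTH, 0.10),  # 4 small VMs, billed per hour
    "storage_gb": (500, 0.02),                  # object storage, per GB-month
    "egress_gb":  (1200, 0.09),                 # data transferred out, per GB
}

total = 0.0
for name, (quantity, price) in usage.items():
    cost = quantity * price
    total += cost
    print(f"{name:12s}: {quantity:8.0f} x ${price:.2f} = ${cost:10,.2f}")
print(f"{'total':12s}: ${total:,.2f} per month")
```

The point of the sketch is that charges accrue continuously with usage, so idle or forgotten resources keep generating cost until they are released.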
The availability of high-capacity networks, low-cost computers and storage devices, as well as the widespread adoption of hardware virtualization, service-oriented architecture, and autonomic and utility computing, has led to the growth of cloud computing.
The term “cloud computing” was popularized when Amazon released its Elastic Compute Cloud product in 2006, although references to the phrase date back to 1996, with the first known mention appearing in an internal Compaq document.
As early as 1977, the cloud symbol was used to represent networks of computing equipment in the original ARPANET, and by 1981 it was used to depict CSNET – both predecessors of the Internet itself. The word cloud is used as a metaphor for the Internet, and a standardized cloud-like shape was used to denote a network on telephony schematics. With this simplification, the implication is that the specifics of how the endpoints of a network are connected are not relevant to understanding the diagram.
The term cloud was used to refer to platforms for distributed computing as early as 1993, when Apple spin-off General Magic and AT&T used it in describing their (paired) Telescript and PersonaLink technologies. In the April 1994 Wired feature “Bill and Andy’s Excellent Adventure II”, Andy Hertzfeld commented on General Magic’s distributed programming language Telescript:
“The beauty of Telescript … is that now, instead of just having a device to program, we now have the entire Cloud out there, where a single program can go and travel to many different sources of information and create sort of a virtual service. No one had conceived that before. The example Jim White [the designer of Telescript, X.400 and ASN.1] uses now is a date-arranging service where a software agent goes to the flower store and orders flowers and then goes to the ticket shop and gets the tickets for the show, and everything is communicated to both parties.”
During the 1960s, the initial concepts of time-sharing became popularized via RJE (remote job entry); this terminology was mostly associated with large vendors such as IBM and DEC. Full time-sharing solutions were available by the early 1970s on platforms such as Multics (on GE hardware), Cambridge CTSS, and the earliest UNIX ports (on DEC hardware). Yet the “data center” model, in which users submitted jobs to operators to run on IBM mainframes, remained overwhelmingly predominant.
In the 1990s, telecommunications companies that had previously offered dedicated point-to-point data circuits began offering virtual private network (VPN) services with comparable quality of service, but at a lower cost. By switching traffic as they saw fit to balance server use, they could use overall network bandwidth more effectively. They began to use the cloud symbol to denote the demarcation point between what the provider was responsible for and what the user was responsible for. Cloud computing extended this boundary to cover all servers as well as the network infrastructure. As computers became more widespread, scientists and technologists explored ways to make large-scale computing power available to more users through time-sharing. They experimented with algorithms to optimize the infrastructure, platforms, and applications, to prioritize CPUs, and to increase efficiency for end users.
The use of the cloud metaphor for virtualized services dates at least to General Magic in 1994, where it was used to describe the universe of “places” that mobile agents in the Telescript environment could go. As described by Andy Hertzfeld:
“The beauty of Telescript,” says Andy, “is that now, instead of just having a device to program, we now have the entire Cloud out there, where a single program can go and travel to many different sources of information and create sort of a virtual service.”
2000s: Cloud computing in its modern form has been around since the early 2000s.
In August 2006, Amazon created its subsidiary Amazon Web Services and launched its Elastic Compute Cloud (EC2).
In April 2008, Google released a beta version of Google App Engine.
In early 2008, NASA’s OpenNebula, enhanced in the RESERVOIR European Commission-funded project, became the first open source software for deploying private and hybrid clouds and for the federation of clouds.
By mid-2008, Gartner saw an opportunity for cloud computing “to shape the relationship among consumers of IT services, those who use IT services and those who sell them” and observed that “organizations are switching from company-owned hardware and software assets to per-use service-based models,” so that the projected shift to computing would result in dramatic growth in IT products in some areas and significant reductions in other areas.
In 2008, the National Science Foundation began the Cluster Exploratory program to fund academic research that used Google-IBM cluster technology to analyze massive amounts of data.
2010s: In February 2010, Microsoft released Microsoft Azure, which had been announced in October 2008.
In July 2010, Rackspace Hosting and NASA jointly launched an open source cloud software project called OpenStack. The OpenStack project is intended to help organizations offer cloud computing services running on standard hardware. The early code came from NASA’s Nebula platform and from Rackspace’s Cloud Files platform. As an open source offering, and along with other open source solutions such as CloudStack, Ganeti, and OpenNebula, it has attracted attention from several key communities. Several studies aim to compare these open source offerings against a range of criteria.
On March 1, 2011, IBM announced the IBM SmartCloud framework to support Smarter Planet. Among the various components of the Smarter Computing foundation, cloud computing is a critical part. On June 7, 2012, Oracle announced the Oracle Cloud. This offering was positioned as the first to provide users with access to an integrated set of IT solutions, including the application (SaaS), platform (PaaS), and infrastructure (IaaS) layers.
In May 2012, Google Compute Engine was released in preview, before being rolled out to general availability in December 2013.
The goal of cloud computing is to allow users to benefit from all of these technologies without needing in-depth knowledge of or expertise in each of them. The cloud aims to reduce costs and help users focus on their core business rather than being impeded by IT obstacles. The main enabling technology for cloud computing is virtualization. Virtualization software separates a physical computing device into one or more virtual devices, each of which can easily be used and managed to perform computing tasks. With operating-system-level virtualization essentially creating a scalable system of multiple independent computing devices, idle computing resources can be allocated and used more efficiently. Virtualization provides the agility required to speed up IT operations and reduces cost by increasing infrastructure utilization. Autonomic computing automates the process through which users can provision resources on demand. By minimizing user involvement, automation speeds up the process, reduces labor costs, and reduces the possibility of human error.
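As a concrete illustration of operating-system-level virtualization, the sketch below carves a single physical host into isolated containers using a local Docker Engine and its Python SDK; it assumes Docker and the docker Python package are installed, and the image name and resource limits are arbitrary example choices.

```python
# Minimal sketch of operating-system-level virtualization: one physical
# host is divided into isolated, independently managed environments
# (containers). Assumes a local Docker Engine and the `docker` Python SDK.
import docker

client = docker.from_env()  # connect to the local Docker daemon

# Launch two lightweight, isolated environments on the same machine,
# each with its own slice of the host's CPU and memory.
containers = [
    client.containers.run(
        "alpine",                            # small example base image
        ["sh", "-c", f"echo hello from container {i}"],
        detach=True,
        mem_limit="64m",                     # cap this container's memory
        nano_cpus=250_000_000,               # roughly a quarter of one CPU
    )
    for i in range(2)
]

for container in containers:
    container.wait()                         # block until it exits
    print(container.logs().decode().strip())
    container.remove()                       # free the resources
```

Each container behaves like an independent computing device, yet all of them share, and can be packed onto, the same underlying hardware, which is what allows idle capacity to be reallocated.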
Users often face difficult business problems. Cloud computing adopts concepts from service-oriented architecture (SOA) that help users break these problems down into services that can be integrated to provide a solution. Cloud computing provides all of its resources as services and draws on proven standards and best practices from the SOA domain to offer global and easy access to cloud services in a standardized way.
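As a rough illustration of that SOA idea, the sketch below composes two hypothetical services (an inventory service and a pricing service) into a single solution for quoting an order; the endpoints, field names, and helper function are invented for the example.

```python
# Sketch of the SOA idea: a business problem ("quote a price for an order")
# is broken into independent services and composed into one solution.
# The two endpoints below are hypothetical placeholders, not real APIs.
import requests

INVENTORY_SERVICE = "https://inventory.example.com/api/stock"  # hypothetical
PRICING_SERVICE = "https://pricing.example.com/api/quote"      # hypothetical

def quote_order(item_id: str, quantity: int) -> dict:
    """Compose two independent services into a single order-quoting solution."""
    stock = requests.get(f"{INVENTORY_SERVICE}/{item_id}", timeout=5).json()
    if stock["available"] < quantity:
        return {"item": item_id, "status": "insufficient stock"}

    quote = requests.post(
        PRICING_SERVICE,
        json={"item": item_id, "quantity": quantity},
        timeout=5,
    ).json()
    return {"item": item_id, "status": "ok", "price": quote["total"]}

if __name__ == "__main__":
    print(quote_order("sku-123", 10))
```

The composing code knows only the service interfaces, not how either service is implemented or hosted, which is what makes the services reusable across solutions.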
Cloud computing also leverages concepts from utility computing to provide metrics for the services used. Such metrics are at the core of the public cloud pay-per-use model. In addition, measured services are an essential part of the feedback loop in autonomic computing, allowing services to scale on demand and to perform automatic failure recovery. Cloud computing is a kind of grid computing; it has evolved by addressing QoS (quality of service) and reliability problems. Compared to traditional parallel computing techniques, cloud computing provides the tools and technologies to build data- and compute-intensive parallel applications at a much more affordable price.
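A minimal sketch of that metering-driven feedback loop follows: a measured utilization figure drives the number of replicas up or down. The read_cpu_utilization and set_replica_count helpers are hypothetical stand-ins for whatever monitoring and provisioning APIs a real deployment would use.

```python
# Sketch of the metering feedback loop in autonomic computing: measured
# usage drives automatic scaling of a service. The two helper functions
# are hypothetical stand-ins for a provider's monitoring/provisioning APIs.
import random
import time

def read_cpu_utilization() -> float:
    """Hypothetical metric source; a real system would query a monitoring API."""
    return random.uniform(0.0, 1.0)

def set_replica_count(n: int) -> None:
    """Hypothetical provisioning call; a real system would call a cloud API."""
    print(f"scaling service to {n} replica(s)")

def autoscale(min_replicas=1, max_replicas=10, high=0.8, low=0.3, iterations=5):
    replicas = min_replicas
    for _ in range(iterations):
        utilization = read_cpu_utilization()
        if utilization > high and replicas < max_replicas:
            replicas += 1      # demand is high: scale out
        elif utilization < low and replicas > min_replicas:
            replicas -= 1      # demand is low: scale in, stop paying for idle capacity
        set_replica_count(replicas)
        time.sleep(1)          # a real loop would poll far less aggressively

if __name__ == "__main__":
    autoscale()
```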
Cloud computing shares characteristics with the following:
Client-Server Model – Client-server computing refers broadly to any distributed application that distinguishes between service providers (servers) and service requesters (clients); see the sketch after this list.
Computer Bureau – A service bureau that provides computer services, especially from the 1960s to the 1980s.
Grid Computing – A form of distributed and parallel computing where “super and virtual machines” consist of a group of networked, loosely coupled computers that work together to perform very large tasks.
Fog Computing – A distributed computing paradigm that provides data, computing, storage, and application services closer to the client or near-user edge devices, such as network routers. In addition, fog computing handles data at the network level, on smart devices, and on end-user clients (e.g., mobile devices), rather than sending data to a remote location for processing.
Mainframe Computers – Powerful computers used primarily by large organizations for critical applications, typically bulk data processing such as censuses; industrial and consumer statistics; police and secret intelligence services; enterprise resource planning; and financial transaction processing.
Utility Computing – “Packaging of computing resources, such as computing and storage, as a metering service similar to traditional utilities, such as electricity.”
Peer-to-peer – A distributed architecture that does not require central coordination. Participants are both resource providers and consumers (as opposed to the traditional client-server model).
Cloud Sandbox – A live, isolated computer environment in which a program, code, or file can run without affecting the application in which it runs.
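To illustrate the client-server model referenced above, here is a minimal Python sketch with a TCP server acting as the service provider and a client acting as the requester; the host, port, and the “upper-casing” service are arbitrary choices made for the example.

```python
# Minimal client-server sketch: the server is the service provider
# (it upper-cases text) and the client is the service requester.
# The host and port are arbitrary values chosen for this example.
import socket
import threading

HOST, PORT = "127.0.0.1", 5050
server_ready = threading.Event()

def serve_once() -> None:
    """Service provider: accept one connection and answer the request."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        server_ready.set()                  # signal that the server is listening
        conn, _ = srv.accept()
        with conn:
            request = conn.recv(1024)
            conn.sendall(request.upper())   # the "service": upper-case the text

def request_service(message: str) -> str:
    """Service requester: send a request and return the server's reply."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(message.encode())
        return cli.recv(1024).decode()

if __name__ == "__main__":
    server = threading.Thread(target=serve_once)
    server.start()
    server_ready.wait()                     # don't connect before the server listens
    print(request_service("hello from the client"))
    server.join()
```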