1.1 Evolution of Cloud Computing
Cloud computing isn’t new. In fact, much of what we do on our personal computers today already relies on it. What is changing is the way we look at what cloud computing can do for us. The idea of cloud computing emerged after the mainframe era of the 1960s, when the possibility of utility computing had already been proposed by MIT computer scientist John McCarthy, who suggested that “computation may someday be organized as a public utility.”
In 1961, John McCarthy proposed [1]: “If computers of the kind I have advocated become the computers of the future, then computing may someday be organized as a public utility just as the telephone system is a public utility... The computer utility could become the basis of a new and important industry.” Utility computing is a service-provisioning model in which the service provider makes computing resources and infrastructure management available to the client on demand. It works much like metered, pay-per-use services: clients pay according to their use of network access, file sharing and other applications. In 1966, Douglas F. Parkhill published the book The Challenge of the Computer Utility, in which he explored the ideas of elastic provisioning and resource sharing.
The sequential development of the computing environment can be arranged year by year as shown in Figure 1.1. The IBM System/360 entered the global market in 1964. This model and other members of the same family attracted attention from the business community because the peripheral components were interchangeable and the product line was implemented consistently across all systems of the family [1]. The scaling down of mainframe systems and further improvements over time led to free-standing machines, the so-called minicomputers; for example, DEC introduced the PDP-8 minicomputer in 1965 and Xerox built the Alto in 1973 [2].
The microprocessor era began in the early 1970s with the release of the first Intel 4004 microprocessor (MP) in 1971, followed by the Intel 8008 MP in 1972. The first personal home computer, the Micral, was created by André Truong Trong Thi [2] and based on the Intel 8008 MP. Hobbyist projects such as the Mark-8 and the TV Typewriter were among the first aimed at microcomputer enthusiasts. The MITS Altair 8800 microcomputer kit of 1975, advertised in several scientific and hobby magazines, is credited with having popularized microcomputers; this machine embodied the underlying idea of the home computer. The first programming language implementation for it was Microsoft’s founding product, Altair BASIC. Subsequently, Apple, Commodore, Atari and others entered the personal home computer market. IBM then introduced its first personal computer, commonly known as the IBM PC. Microsoft supplied the operating system (OS) for the IBM PC, which became established, was standardized, and ended up being used by numerous PC makers. There have been many successive periods of improvement, with advances in technology stirring up the market. The creation of the graphical user interface (GUI) prompted the next stage of improvements.
While considering how to significantly improve interaction among numerous computers, another milestone appeared in the business sector: the Internet. The Advanced Research Projects Agency (ARPA) introduced the idea of the Internet as an experimental project. Each connecting point in such a network is known as a node. With the backing of the U.S. Department of Defense, a communication framework was built so that if any node failed, the rest of the network remained connected. In the long run this effort produced the ARPANET, and nearly 200 institutions were connected to the system. TCP/IP was introduced in 1983, and the network switched to TCP/IP, which joined entire subnets to the ARPANET. The Internet has since become known as a network of networks. With the development of the World Wide Web (WWW) by the British computer scientist Sir Timothy John Berners-Lee in 1989, the web achieved its decisive breakthrough. Berners-Lee proposed an information management system for CERN (the European Organization for Nuclear Research) based on hyperlinks. End users then needed web browsers, and the WWW became ubiquitous when the Mosaic browser was introduced to the market.
Figure 1.1: Evolution of cloud computing.
Today, the entire information technology sector is working to improve the quality of web software by increasing bandwidth and by applying innovative ideas to application development. User-interactive websites can now be developed easily with Java, PHP or AJAX. These advances have led to a variety of multimedia websites and interactive applications for the business sector.
Meanwhile, in the 1990s, the idea of grid computing was introduced in academia. Carl Kesselman and Ian Foster published their book The Grid: Blueprint for a New Computing Infrastructure. The name draws an analogy with the electric grid, and we can relate grid computing to a day-to-day example. When a device is plugged into a power outlet, we are unaware of how electric power is generated and delivered to the outlet; we just use it. This is the essence of virtualization: we do not know the underlying architecture or mechanism behind the scenes, or how resources are made available, yet we use them actively. In this sense electric power is virtualized; the virtualization conceals a vast distribution grid and the power generation stations. The same idea can be adapted to computing, where distinct distributed components, such as storage, data management and software assets, are integrated [3]. Technologies such as cluster, grid and now cloud computing have all focused on enabling access to a large amount of computing resources in a fully virtualized fashion, aggregating pooled resources into a single system image. These resources are offered to users, organizations and customers on a “pay-per-use” or “pay-as-you-go” basis, that is, payment based on utilization.
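The “pay-per-use” model described above can be sketched as a simple metered-billing calculation, in the spirit of a utility meter. This is only an illustrative sketch; the rate names and prices below are hypothetical and do not reflect any real provider’s pricing.

```python
# Illustrative pay-per-use billing sketch.
# All rates and usage dimensions are hypothetical, for demonstration only.
RATES = {
    "compute_hours": 0.10,     # $ per instance-hour (hypothetical)
    "storage_gb_month": 0.02,  # $ per GB-month of storage (hypothetical)
    "network_gb": 0.05,        # $ per GB transferred (hypothetical)
}

def monthly_bill(usage: dict) -> float:
    """Charge only for metered consumption, like an electricity bill."""
    return round(sum(RATES[kind] * amount for kind, amount in usage.items()), 2)

# A small customer pays only for what was actually consumed:
usage = {"compute_hours": 200, "storage_gb_month": 50, "network_gb": 10}
print(monthly_bill(usage))  # 200*0.10 + 50*0.02 + 10*0.05 = 21.5
```

The key property, as with an electric meter, is that there is no up-front capacity purchase: a month with zero usage produces a zero bill.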
In 1997, the term “cloud computing” was introduced in academia by Ramnath Chellappa, who defined it as a “computing paradigm where the boundaries of computing will be determined by economic rationale rather than technical limits alone.” In 1999, Salesforce began delivering applications to its clients through a simple website. The applications themselves were deployed and distributed over the web; in this manner, utility-based computing began to be used in the real world. Amazon set its own milestone by creating Amazon Web Services (AWS) and offering storage, computation and other services in 2002. Amazon allowed clients to integrate its vast online content with their own websites, and its web services and computing facilities gradually expanded on demand. In 2006, Amazon launched its Elastic Compute Cloud (Amazon EC2) as a commercial web service that allows small enterprises and individuals to lease infrastructure (compute resources, storage, memory) on which they can deploy and run their own applications. With the introduction of Amazon storage (Amazon S3), a “pay-per-use” model was also implemented. Google App Engine, Force.com, Eucalyptus, Windows Azure, Aneka and many more platforms of their kind are capturing the cloud business.
The next section is about cluster, grid and mobile computing.