3.1 Solving Problems with Computers
Computer programming is a catch-all term for problem solving with a computer. The core task of programming is to create a sequence of instructions in a computer language that will automate a given task or find a solution to a problem of interest.
Although early computer programmers devised their solutions by directly manipulating or “setting up” the wiring and circuitry of computers, by the late 1950s most programmers used computer languages to automate problem solving. Low-level languages (such as machine language and assembly language) provide instructions that are closely related to a computer’s underlying architecture. High-level languages (such as FORTRAN and Java) provide instructions that are optimized for specific problem-solving requirements. Programs written in a high-level language usually have an additional advantage: they are more easily moved from one computing platform to the next.
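To illustrate the difference in abstraction, the short sketch below performs a simple addition as a single high-level statement (written here in Python), with the kind of low-level instructions an Intel 8080 assembly programmer might write for the same task shown only as comments. The mnemonics are illustrative and are not drawn from this chapter’s sources.

# A single high-level statement expresses the task in one portable line.
total = 2 + 3
print(total)

# A low-level version of the same addition is tied to the machine's
# registers and memory layout. The 8080-style mnemonics below are
# illustrative comments only, not code from the book:
#
#   MVI  A, 2      ; load the value 2 into register A
#   MVI  B, 3      ; load the value 3 into register B
#   ADD  B         ; add register B to A (result left in A)
#   STA  TOTAL     ; store the result at memory location TOTAL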
In a modern context, writing a program usually entails the use of numerous software development tools, each with a specific purpose. The activity often takes place in a comprehensive integrated development environment (IDE), where the programmer can design the user interface, enter program code, adjust settings, review documentation, test and debug the application, and interact with online support communities. Present-day examples of an IDE include Microsoft Visual Studio for Windows, Xcode for Apple platforms, and Eclipse for Java development.
In the early years of computing, however, the programming tools for software developers were much more limited. A typical programmer would write out his or her instructions for the computer by hand, and then prepare them for entry into the computer’s memory using input media such as punched cards, punched tape, magnetic tape, or (in later years) keyboard input. In these contexts, programming was a deeply mental exercise that only involved a computer in the later stages. For this reason, a fundamental step in the programming process was planning and research. Engineers worked to solve a computational problem as efficiently as possible, and a major concern was always making the most of limited computer resources. Programmers prepared a program for the computer in its near-final form, and only later loaded the routines into memory. Fixing problems that arose often involved a painful process of trial and error.
Figure 3.1 Photograph of the punched paper tape for MITS Altair BASIC 1.0, created by Bill Gates, Paul Allen, and Monte Davidoff for the Altair 8800 microcomputer. Dated March 2, 1975. (Courtesy of the Computer History Museum)
A concrete example of this problem-solving approach comes from one of the first commercial programming languages written for microcomputers, the original BASIC interpreter created for the MITS Altair 8800. (See Figure 3.1.) This program was written by Bill Gates, Paul Allen, and Monte Davidoff while Gates and Davidoff were students at Harvard.
When the Altair was announced in early 1975, there was no commercial software available for the machine. But computing enthusiasts soon realized that if someone could create a BASIC interpreter for the Altair, then hobbyists could write their own BASIC programs on the computer and get the device to perform non-trivial work. (Chapter 4 explains why BASIC was chosen for this duty, despite its limitations.) Gates and his friends researched the Altair’s specifications, and then they bought a book about the Intel 8080 microprocessor, the electronic “brain” of the Altair. As luck would have it, they had been working for some time with time-sharing systems, and they also knew how to write assembly language programs that could make efficient use of a computer’s internal architecture.
To prepare for this project, Gates studied recent versions of the BASIC language (originally created by John Kemeny and Thomas Kurtz) and he scrutinized how the language operated. Then he designed an interpreter program that would provide Altair users with essential BASIC features while consuming as little computer memory as possible. At this stage in its development, the Altair microcomputer only had 4K of system memory to work with (a tiny amount). Gates studied the instruction set for the Intel 8080 microprocessor carefully, and he was determined to fit the new BASIC into available memory and leave a little room for the user’s programs.
How did he go about solving this problem?
We know more about Gates’s solution than we do about those of many of his contemporaries, because the program that he built became well known, and Gates wrote about his method in two texts that have attracted the attention of journalists. The first comments about his approach were published in September 1975, followed by a longer interview with Susan Lammers about Gates’s coding procedures in 1986.1 I am especially interested in Gates’s advice to new programmers of the Intel 8080 microprocessor, because I am fascinated with stories about how novices learn to program. In this case, we can observe a 19-year-old coding prodigy teaching others how to code.
Gates suggested that the best way to familiarize yourself with a new instruction set was “to go out of your way to use every instruction at least once.”2 As you are learning the syntax and architecture of a new chip, Gates advised, “go through the instruction set… and look closely at the instructions you seem to use very rarely.” He suggested that successful programmers will continually search for better commands and more efficient ways to solve their problems. But in the final routines, when efficiency is the most pressing concern, it was important to flip the strategy and “use the least number of instructions possible to perform each function.” Finally (and I love this recommendation as a technical writer), Gates cautioned that one should not trust the instruction books on programming too much, because they sometimes neglect important shortcuts.3 Programmers, in other words, should learn by doing, internalizing every aspect of the instruction set and hardware features until they are deeply ingrained in the coding psyche. This sentiment would eventually become a mantra of the learn-to-program movement.4
As the team of students completed their work, Bill Gates wrote out the assembly language routines for the interpreter on yellow legal pads, drawing explanatory charts to document what was happening in computer memory.5 Paul Allen and Monte Davidoff also collaborated at this stage, competing with Gates to make the code as tight as possible. At one point, Gates called Ed Roberts, the owner of MITS, to ask how the Altair processed characters typed on a keyboard, because they had no access to the actual Altair device.6 Finally, the group typed the program into Harvard’s DEC PDP-10 minicomputer using a console display and a keyboard. The PDP-10 was running a software emulator, created by Paul Allen, that was designed to mimic the Altair microcomputer. When Gates, Allen, and Davidoff finished inputting and debugging their program, they tested its operations by keying in several BASIC programs of the kind pioneered by Kemeny and Kurtz at Dartmouth College over a decade earlier.
The programmer-entrepreneurs then used the PDP-10 to create a spool of punched paper tape containing the completed BASIC interpreter. (See Figure 3.1.) Allen flew to Albuquerque, New Mexico, and fed the punched tape into the Altair test machine at MITS, establishing what would become the first commercial high-level programming language for a PC.7 The entire process took a little over eight weeks. In the coming years, the nascent Micro-Soft team (later Microsoft) adapted their solution to work on computer systems with different hardware characteristics, expanding the interpreter’s abilities as microcomputers and PCs became more powerful. They applied the same basic approach when managing system resources for operating systems. (See Figure 3.2.)
Figure 3.2 Bill Gates and Paul Allen pose for a portrait at Microsoft in 1984. Behind the programmers is a whiteboard with illustrations of computer memory, including a plan for allocating resources in an IBM PC that contains 64KB of RAM. (Photo by ©Doug Wilson/CORBIS/Corbis via Getty Images)
As this example demonstrates, there is more to building non-trivial computer programs than meets the eye, and much of the process takes place well before the programmer loads the code into memory and actually runs the program. The emphasis here is on teamwork, and it serves as a corrective to the misconception that programming is usually the work of a solitary coder sitting alone in front of a computer screen.
Conceptually, programming involves refining algorithms, the ordered collections of steps that are proposed to automate processes and solve problems elegantly and efficiently. Some algorithms are limited in scope, like the eight or ten steps that might be necessary to receive contact information from a user and store it in a computer file. (To complete this task, an algorithm might prompt the user for a name and address, assign the input to temporary variables, check the variables for suitable content, format the content, and then insert the information into a database at the appropriate location.) Algorithms can also be incredibly complex, such as the comprehensive searching and sorting schemes that Google uses to sift through data gathered from the World Wide Web, then present this information to a user via a commercial web browser.
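To make this small algorithm concrete, here is a minimal sketch in Python of the contact-entry steps just described. The database file name, the table “contacts,” and the simple validation and formatting rules are illustrative assumptions, not code from any historical program.

# A minimal sketch, in Python, of the contact-entry algorithm described
# above. The database file name, the table "contacts," and the simple
# validation and formatting rules are illustrative assumptions.
import sqlite3

def collect_contact():
    """Prompt the user for a name and address and return cleaned values."""
    name = input("Name: ").strip()        # assign input to temporary variables
    address = input("Address: ").strip()

    # Check the variables for suitable content.
    if not name or not address:
        raise ValueError("Both a name and an address are required.")

    # Format the content (here, simple title-casing of the name).
    return name.title(), address

def store_contact(db_path, name, address):
    """Insert the contact information into a database table."""
    with sqlite3.connect(db_path) as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS contacts (name TEXT, address TEXT)")
        conn.execute("INSERT INTO contacts (name, address) VALUES (?, ?)",
                     (name, address))

if __name__ == "__main__":
    contact_name, contact_address = collect_contact()
    store_contact("contacts.db", contact_name, contact_address)
    print("Saved contact for", contact_name)

Even in this compact form, the sketch follows the outline given above: gather the input, check it, format it, and store it at the appropriate location.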
Systematic attempts to teach programming must somehow train students to become proficient in coding skills, the use of algorithms, debugging techniques, and other important abilities. To appreciate how this relatively obscure problem-solving process became a popular movement, we turn now to the proliferation of programming languages in the 1950s, and the development of an extremely successful programming language, FORTRAN.