Inside Intel - Tim Jackson

8

The Microprocessor

THE NEWS OF INTEL’S BREAKTHROUGH in MOS memory technology with the 1101 and 1103 chips caused some jealousy inside competing electronics companies, but there was one place where it provoked outright despair. Inside Fairchild Semiconductor’s research department, a young Italian engineer called Federico Faggin was on loan from a European joint-venture company in which Fairchild had a stake. He had spent his time at Fairchild Semiconductor working on silicon gate – the technology that proved to be the key to Intel’s 1101 memory chip – and had watched, with growing frustration, Fairchild’s failure to do anything with the knowledge that had been created inside the lab. Noyce and Moore had hired dozens of people from Fairchild, down to the lab technician who had been working most closely on the silicon-gate process; now their new company seemed destined to commercialize it independently. But Faggin himself couldn’t follow the stars of the Fairchild research effort across to Intel. He had decided to stay in America, and his change of status from temporary exchange visitor to permanent resident meant that he couldn’t change jobs while his application was being processed.

It was another year before Faggin was ready to make his move. He called Les Vadasz at Intel, and told him that he would be interested in joining the young company. He made only one stipulation: he wanted to work on chip designs, not production processes. Vadasz invited him over to Mountain View for a chat, but refused to say anything in detail about the work that Faggin might do if he came to Intel. So Faggin was taking a leap in the dark when he accepted an offer from Vadasz and Grove, gave notice to Fairchild, and reported for work at Intel two weeks later to the day.

Boredom was evidently not going to be a problem in the new job. Vadasz explained on Faggin’s first day that Intel had entered into a contract to design a set of chips for a Japanese firm called the Nippon Calculating Machine Corporation. Faggin was to be the project leader, and his first job would be to meet a client representative who was flying in from Tokyo the very next day to check on how the project was progressing.

An afternoon of reading the files and talking to his new colleagues gave Faggin the background he needed. The product Intel was working on was a desktop calculating machine sold under the brand name of Busicom, and the approach had come from Japan at a time when Intel was desperate for work of any kind. At first the client’s technical people had asked Intel to tender for a project to design and manufacture a set of eight logic chips customized specially for their calculator and pre-programmed to carry out the basic arithmetical functions it would need to offer. But with three different memory circuit projects already in progress, Intel simply did not have the resources to carry out eight new logic chip designs. It was Ted Hoff, the brilliant Stanford engineer who had come up with the idea for the DRAM cell, who proposed an alternative solution. Why not build a miniaturized general-purpose computer, he suggested, which could then be programmed to do the arithmetic for the client’s desktop calculator?

The key difference between a general-purpose computer such as the PDP-8 and the customized logic circuits required by the Busicom specification was that the PDP had a subroutine capability: it could stop in the middle of a series of program steps, go off and carry out another job, and then return to where it had left off. Hoff saw that if he could only add a subroutine capability to the Busicom design, he could then take all the high-level functions that the calculator required and turn them into a set of subroutines. The basic computer could then be stripped down to the point where it could perform only the simplest tasks, and everything else – even something as apparently basic as adding a pair of integers together – could be reduced to combinations of these simple tasks.

This was an insight of dazzling brilliance. Discussing it afterwards, however, Hoff managed to make it seem almost obvious. ‘I’d been using a digital PDP-8 computer to run full-scale FORTRAN programs,’ he recalled, ‘yet the PDP-8’s central processing unit was a great deal simpler than the Busicom machine. It got all its complexity from memory. The subtle bits were in the program, not the hardwired logic.’
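The decomposition Hoff describes can be sketched in modern terms. In this illustration – which shows the principle only, not the 4004’s actual instruction set – the ‘hardware’ knows just two primitive operations, and everything the calculator needs is built up as subroutines that combine them:

```python
# Sketch of Hoff's idea: strip the machine down to trivial primitives,
# then build the calculator's functions as layered subroutines.
# Purely illustrative - not the real Busicom firmware or 4004 opcodes.

def increment(n):          # primitive: add one
    return n + 1

def decrement(n):          # primitive: subtract one
    return n - 1

def add(a, b):             # subroutine: addition from repeated increments
    while b > 0:
        a, b = increment(a), decrement(b)
    return a

def multiply(a, b):        # subroutine built on top of the add subroutine
    total = 0
    for _ in range(b):
        total = add(total, a)
    return total

print(multiply(6, 7))  # 42
```

The complexity lives in the layering of the subroutines, not in the primitives – exactly the point Hoff makes about the PDP-8 getting ‘all its complexity from memory’.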

After a few days of thinking about it, Hoff sketched out a plan which involved a four-chip set: a central processing unit (CPU), a memory chip for working data, a read-only memory chip (ROM) where the program written specially for the Busicom functions could be stored, and a fourth device to deal with input and output, or I/O. Such a plan would be considerably less complex to design than a set of specialized logic chips – and it might well prove cheaper.

The Japanese company had initially been sceptical, particularly since its engineers had already done months of preliminary work on the logic design in the specification. But gradually, the Japanese managers came to accept the merits of the idea. Hoff’s general-purpose design would allow the company to offer a range of more complex calculators in future without having to build an entirely new set of logic chips. In a market that was becoming increasingly competitive, anything that allowed it to sell a better calculator at a lower price than its competitors was to be welcomed. And if Hoff was confident that he could deliver its four-chip set, with a tiny general-purpose computer running the arithmetical programs, then who were the engineers in Japan to doubt him? The company had handed over an advance of $100,000, and it was expecting to see some evidence that the chips for its new product were well on their way to production.

There was just one problem. After coming up with this brilliant idea, Hoff was told to concentrate his efforts on other projects. The project made some progress when another engineer named Stan Mazor, who had joined Intel from Fairchild, added a few instructions to the architecture and wrote some sample programs to prove the feasibility of the calculator design. But by April 1970, when Faggin arrived at Intel, the chip was still very far from complete. The set of instructions that the CPU would handle had been agreed, and confirmed by the Japanese engineers as correct. Hoff had drawn up an overall design for the chip, indicating broadly how many transistors it would require and which jobs would be carried out where. But when Faggin asked to see the detailed design, he got a shock. There wasn’t one – but a Busicom engineer was about to arrive in California for a progress check.

Masatoshi Shima, the Busicom engineer, was all smiles when Faggin and another designer met him at the airport. His tune changed when he saw the materials that Faggin had been ordered to show him. Shima, a talented engineer in his own right, realized immediately that it was no advance on what he had seen on his last visit. ‘You bad!’ he shouted. ‘You promised! You said design done! No design! This is just idea! This is nothing! I came here to check, but there is nothing to check!’

Faggin had been briefed not to let on that he had just arrived, but he realized that hiding the truth would merely make his position more untenable. He confessed to Shima what had happened and agreed to start work immediately. Carrying out the first piece of processor work of his career, he was now faced with a schedule that was almost impossible to meet.

Working with Stan Mazor, a fellow circuit designer, Faggin managed to turn the concepts into working prototypes of four chips at extraordinary speed. So rigorous was he as a circuit designer that the manufacturing prototypes of the first three devices of the four – the ROM, the RAM and the input-output chip – worked perfectly when they came off the line. The fourth, the processor itself, proved a little more problematic. The first prototype was absolutely dead, and it took Faggin some time to work out what had happened: the circuit was supposed to be built on the chip in a series of layers, and one of the layers had been accidentally missed out of the manufacturing process. The second prototype worked, and needed only minor adjustments. Three months after Faggin’s arrival from Fairchild and Shima’s arrival from Japan, defying the ‘one man, one chip, one year’ rule, Faggin and Mazor had produced four working chips.

By then, however, the calculator market in Japan had become a great deal more competitive. Shima’s bosses back in Tokyo decided that they could not build the Busicom machine profitably if they were to pay Intel the price for its chips that had originally been agreed. They came back to Intel, demanding a price cut.

Had Intel responded differently to this demand, it could never have become the company it is today. But Bob Noyce, fed with good advice from Hoff and Faggin, knew exactly what he wanted. He was willing to refund $60,000 to the client – but in return, he demanded a change in the licence terms. Instead of giving the client exclusive rights over the chip design, he said, Intel wanted the right to sell the design to other customers. The response from Tokyo was a qualified ‘yes’. As long as Intel would agree not to sell it to competing calculator companies, the general-purpose processor that had been designed for the Busicom machine was Intel’s to keep.

Faggin’s extraordinary achievement in delivering the chips so swiftly had not been without cost. With his wife and new baby back in Italy, Faggin had worked twelve, fourteen, sometimes even sixteen hours a day for weeks on end. Yet Andy Grove was in the middle of a campaign to turn Intel into a more serious, more professional outfit – and Les Vadasz, following Grove’s lead, relentlessly complained at Faggin when he arrived late for work. The complaints irritated Faggin mightily. Everyone in the lab, he said, knew that there were nights when he would still be at his workbench until dawn, going home only to snatch a few hours’ rest before returning to the plant. But Vadasz would not be moved. The result was that relations between Faggin and Vadasz began to deteriorate – and the talented young circuit designer began to find his work in the Intel research department increasingly miserable.

But there were compensations. Although he still missed his home town of Vicenza, Faggin was confident that he no longer wanted to return to Italy. He was becoming accustomed to the brilliant blue skies of northern California, and the beauty of the fruit orchards to the south that were still beyond the reach of the electronics industry. He was coming to terms with the strange food and drinks consumed by the Americans – Jell-O, cold milk, and weak coffee by the pint – and with their strange Midwestern objection to the civilized custom of drinking a glass of wine with one’s lunch. He was also beginning to appreciate the contrast between the orderliness of the engineers he was working with and the relaxed, timeless chaos still prevalent in Italy. And he devised a simple rule of thumb for life in America as a European. The ratio of everything was the same as the ratio of an inch to a centimetre. Cars, houses, refrigerators, shopping carts – everything in the States was two and a half times the size of its equivalent in Europe. On balance, Faggin decided, he might as well stay in California. The future looked interesting.

* * *

A NEW ERA IN INTEGRATED ELECTRONICS. This was the headline that Intel used to announce the launch of its 4004 microprocessor in an ad in Electronic News in November 1971. Gordon Moore went further. He described the microprocessor as ‘one of the most revolutionary products in the history of mankind’.

That wasn’t how it looked at the time.

People in the computer business viewed the 4004 as a fascinating novelty. They knew that it matched the power of the ENIAC, the world’s first vacuum-tube computer, which was completed in 1946 at the University of Pennsylvania under contract to the United States government. Built to calculate ballistics and detonation tables for American weaponry, ENIAC occupied an entire room, used 18,000 vacuum tubes, and consumed 200 kilowatts of power – enough to heat several family houses. The 4004, by contrast, was small enough to rattle around in a matchbox, and cost under $100.

But a 1946 computer wasn’t a practical comparison. In the quarter-century that had passed since then, the ENIAC had long been superseded by more modern machines using integrated circuits. By 1971 the 4004 offered an extraordinary combination of price and performance, but in absolute terms it wasn’t a serious contender for work inside ‘real’ computers. With the ability to process only 4 bits of information at a time, it was many times slower and many times punier than the 1971 state of the art in mainframe central processing units. One executive from the computer industry, entirely missing the point of the revolution that was about to take place, joked to Bob Noyce that he wouldn’t want to lose his whole computer through a crack in the floor. Even Stan Mazor, one of the members of the microprocessor design team, was famous for telling his friends that they should never trust a computer they could lift. Only if you could foresee that the 4004 would be followed by improved versions that would double its performance every eighteen months for the next quarter-century was it clear that microprocessors would eventually displace the great monoliths of the mainframe era, and bring computing power to every office desktop.
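The compounding implied by that last sentence is easy to check: doubling every eighteen months for twenty-five years is roughly seventeen doublings, or around a hundred-thousand-fold improvement.

```python
# Back-of-envelope arithmetic for the doubling claim above:
# performance doubling every 18 months, sustained for 25 years.
years = 25
doubling_period = 1.5                 # years per doubling
doublings = years / doubling_period   # about 16.7 doublings
gain = 2 ** doublings                 # about 100,000x overall
print(f"{doublings:.1f} doublings, roughly {gain:,.0f}x faster")
```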

Intel’s marketing people looked at the new chip, and made pessimistic noises. Even if the part’s performance were to increase as quickly as the performance of Intel’s memory devices had done – and that was a big if – there was a worry about how big the market for microprocessors would be. After all, only 20,000 mainframe computers had been sold in the entire world in 1971. Assume optimistically that Intel could gain a 10% market share, and you were left with sales of only 2,000 units a year. This was about a week’s production – and nowhere near enough to justify a serious R&D budget.

So if it wasn’t going to put the mainframe computer out of business, what was this new gadget for? Its creators inside Intel’s research lab were full of ideas. If the chip could go into a calculator like the Busicom machine, said the 4004’s creators, then it could also add intelligence to a whole range of electrical business machines – cash registers, coin-change machines, traffic lights, weighing machines, blood analysers, cocktail dispensers, microwave ovens, cars, whatever. Until now, building intelligence into such machines had been prohibitively expensive, because it required designing a special dedicated piece of computer hardware for each application. The 4004 would change all that. Since it was a miniature general-purpose computer, it could be used by industrial designers to do any number of jobs. The customization would be in the software – in the program that controlled the chip.

The target customers for this use of the 4004 were engineers in America’s biggest industrial companies. But most of these engineers knew nothing about computer programming. Instead, it was smaller, hungrier companies without a strong entrenched market position that saw the potential of the tiny chip first. This gave rise to a problem in Intel’s sales department. The customer list for the company’s memory products made up a Who’s Who of the computer industry: big, reliable, blue-chip firms that could be counted on not only to pay this month’s outstanding bill, but also to carry on sending in orders month after month to the far blue yonder. The early adopters of the 4004 were much more obscure. Ed Gelbach, the new sales and marketing VP who had been brought in to replace Bob Graham, described the 4004 customer list as ‘not so much Who’s Who as Who’s That’.

In August 1972 Intel released a second microprocessor. Like the 4004, it had started out as a custom design project – this time for a company called Computer Terminals Corporation, which wanted to build a new display terminal. In accordance with the specification that CTC had given to Intel, the new processor handled data in chunks of 8 bits at a time rather than 4. It was known as the 1201 during its development. When the time came for launch, however, Ed Gelbach discovered that most Intel customers were utterly mystified by the company’s obscure system of numbering parts by function, capacity and start date. They thought the 4004 had earned its name because it was the company’s first four-bit microprocessor. It seemed only natural to call Intel’s first 8-bit machine the 8008.

The release of the 8008 helped to awaken glimmers of interest in the idea of using Intel microprocessors inside business computers as well as to add intelligence to industrial products. But the 8008 was little easier to program than the 4004. Until compilers were developed, you had to write assembly-language instructions for the chip, telling it step by step to input this chunk of data, store it in that register, add it to the contents of the other register, output the result, and so on. Once you had your program ready in assembler, you then had to turn it into machine code – a set of two-digit hexadecimal numbers that could be fed to the processor one by one from the memory chip where the program was to be stored.
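The hand-assembly step described above can be sketched as a toy program. The mnemonics and opcode values here are invented for illustration – they are not the real 8008 encoding – but the mechanical process is the same: look each instruction up in a table and emit it as a two-digit hexadecimal byte.

```python
# Toy illustration of hand-assembling a program into hex machine code.
# Opcode values are hypothetical, NOT the actual 8008 instruction set.
OPCODES = {
    "INP": 0x41,   # read a chunk of data from the input port (hypothetical)
    "STA": 0x32,   # store the accumulator into a register (hypothetical)
    "ADD": 0x80,   # add a register to the accumulator (hypothetical)
    "OUT": 0x51,   # write the accumulator to the output port (hypothetical)
}

def assemble(program):
    """Turn a list of mnemonics into two-digit hex machine-code bytes."""
    return [f"{OPCODES[m]:02X}" for m in program]

machine_code = assemble(["INP", "STA", "ADD", "OUT"])
print(machine_code)  # ['41', '32', '80', '51']
```

In 1972 this lookup was done by a human with a printed opcode table, and the resulting bytes were then loaded one by one into the memory chip holding the program.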

If it sounds complex and unfriendly, that’s because it was. Only two categories of people were likely to go to the trouble: engineers who could see a real commercial advantage from incorporating the chip into a product, and teenage hackers who thought the idea of messing about with their own computers was cool.

Two of the earliest hackers in this category were a pair of kids, aged seventeen and nineteen, from a private high school in northern Seattle. Bill Gates and Paul Allen clubbed together to raise the $360 they needed to buy an 8008 chip from a local electronics store. But not even the founders of what would later be Microsoft could make the 8008 support the BASIC programming language. Instead, they made an abortive attempt to use the chip to build a machine for a local traffic consulting company that would analyse tickertape counts of cars on suburban streets.

What made engineers in American industry take the microprocessor seriously was Intel’s first development system. Sold in a big blue box and known as the Intellec 4, the system was a tool that made it much easier and faster for outside engineers to develop and test programs for the new microprocessors.

Development systems proved a neat way of hooking customers into Intel’s product line. When a customer spent $5,000 on an Intellec 4, the chances were that it would spend another $50,000 on microprocessors over the next year or so. Then for every microprocessor, another half-dozen or more peripheral chips – memory, ROM and input–output devices – would be needed. This was the kind of business Ed Gelbach liked.

Even to those who didn’t believe the new device was the biggest thing that had ever happened to the electronics industry, the microprocessor began to look as though it had some promise. Perhaps it would never be a big money-spinner on its own account – but hell, if it helped to sell more memory, then that was just fine too.
