The Advent of the Microcomputer Era:  

An Eyewitness Account


Photograph of the desktop console of the world's first microcomputer system, which utilized the Intel 8008 single-chip 8-bit microprocessor. The system was developed and manufactured by Q1 Corporation and delivered to the Litcom Division of Litton Industries in Melville, New York, on December 11, 1972. In April 1974, Intel introduced the 8080, the second-generation single-chip 8-bit microprocessor. Until then, Q1 was the world's only company to deliver self-contained, general-purpose microcomputer systems.



Introduction of the 1,024-bit memory chip.


Advanced Memory Systems.
In 1969, Advanced Memory Systems developed the first integrated circuit containing 1,024 bits (one kilobit, or 1 Kb) of dynamic random-access memory. The Wall Street firm of Philips, Appel & Walden considered underwriting the company's initial public offering, and I evaluated the technology of Advanced Memory Systems on the firm's behalf.


1960’s – onset of the semiconductor revolution.
The one-kilobit memory chip represented the endpoint of a decade of explosive development in the semiconductor industry. It is also a natural entry point for understanding the emergence of the computer-on-a-chip. The transistor was invented at Bell Telephone Laboratories at the end of 1947. In 1958, Jack Kilby of Texas Instruments invented the integrated circuit: an entire electronic circuit implemented on a single silicon chip. In 1959, Robert Noyce, then at Fairchild Semiconductor, independently invented an improved version of the integrated circuit. By the end of the decade, Advanced Memory Systems had introduced an integrated circuit storing 1,024 bits (1 Kb).


Figure 1.3  The distance between transistors is reduced with each generation of RAM chips. Halving the linear distance between adjacent transistors doubles the speed and quadruples the transistor density. If the cost of the chip per unit area remains the same, then the cost per transistor drops by a factor of four, and the computing cost by a factor of eight.



Memory chip generation as a measure of progress.

During the 1960’s, the number of transistors in a chip grew a thousandfold. The simplest way to view that development is as the result of five generations of doubling linear transistor density, and therefore of five generations of quadrupling the number of transistors per unit area, while the chip area itself remained unchanged. That development had three implications, taken up in turn below.


Figure 1.3.1   Five generations of doubling linear density increase the transistor density per unit area by a factor of 1,024. Each such five-generation stage accounts for one step of the progression from bit to kilobit, from kilobit to megabit, and from megabit to gigabit.



Processing speed increased.

Cutting the linear distance between adjacent transistors in half also halves the time needed to transmit a signal between them. In effect, doubling the linear transistor density is equivalent to doubling the processing speed. Five generations of doubling linear transistor density therefore increase the effective processing speed by a factor of 32.


The cost per transistor decreased.

Five generations of doubling linear transistor density increase it by a factor of 32, which yields a 32 × 32 = 1,024-fold, or approximately thousandfold, increase in the number of transistors per unit area. The production cost per unit area of silicon remained roughly the same over that period. The increase in the number of transistors per unit area therefore represented a thousandfold drop in the cost per transistor.


Computing cost decreased.

Multiplying the time required to compute a task by the hardware cost provides an estimate of the computing cost. The 1960’s thus brought a thousandfold drop in the cost per transistor together with a 32-fold increase in processing speed. Computing cost therefore dropped by a factor of roughly 32,000 (1,024 × 32 = 32,768) during that decade.
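The chain of factors above follows mechanically from the per-generation rules of Figure 1.3. The short sketch below is offered purely as an illustration of that arithmetic; the variable names are mine and are not part of the original analysis:

```python
# Scaling model from Figure 1.3: halving the distance between adjacent
# transistors doubles linear density and speed, and quadruples areal density.
# With cost per unit of chip area held constant, cost per transistor falls
# 4x per generation and computing cost (cost / speed) falls 8x per generation.

GENERATIONS = 5  # the 1960s: five generations of doubling linear density

speed = 2 ** GENERATIONS                      # 32x faster
areal_density = 4 ** GENERATIONS              # 1,024x transistors per unit area
cost_drop_per_transistor = areal_density      # 1,024-fold cheaper per transistor
computing_cost_drop = areal_density * speed   # 32,768-fold, i.e. roughly 32,000

print(f"processing speed: x{speed}")
print(f"transistors per unit area: x{areal_density}")
print(f"cost per transistor: 1/{cost_drop_per_transistor}")
print(f"computing cost: 1/{computing_cost_drop}")
```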


Extrapolation into the 1970’s.

It is the future, however, that matters, especially when it is not a linear extrapolation of the past. While evaluating Advanced Memory Systems for the Wall Street underwriter, I consulted several experts in the field about extrapolating the increase in transistor density into the 1970’s. One of them was Professor Carver Mead, then at the California Institute of Technology. He impressed on me that physical laws set limits to increases in transistor density. The management of Advanced Memory Systems was not marketing oriented, but the company clearly had lead-time in a crucial field of technology. I therefore recommended that the initial public offering take place.


Speculating about a computer on a chip.

The rapidly increasing number of transistors per unit area made it inevitable that, at some point, an entire central processing unit would be implemented on a single chip. I did not have the competence to determine the minimum number of transistors necessary to implement a central processing unit as a single integrated circuit, but it seemed plausible that one or two more generations of quadrupling the number of transistors per unit area would prove sufficient.


Microprocessor as a negative cost factor.

The drop in computing cost is steeper than the reduction in the cost per transistor. In a device such as a computer terminal, a computer on a chip could take over the tasks of more expensive components and still have processing time left over for local data processing, essentially for free.


Implication for remote computing.

Databases are intrinsically non-local. At the time of the evaluation, information processing was similarly non-local. But the prospect of effectively negative processing cost implied an impending dissociation between centralized data storage and point-of-use information processing. It therefore appeared inevitable to me that there would be a shift away from remote and locally shared computing, and that computing at the point-of-use would emerge as a fundamental new trend.




The computer terminal Datapoint 3300.

Philips, Appel & Walden then considered underwriting the initial public offering of Computer Terminal Corporation (later renamed Datapoint) of San Antonio, Texas. Its initial product, the Datapoint 3300, was a terminal for remote computing environments. I evaluated the company for Philips, Appel & Walden, while William Hambrecht performed the due diligence on behalf of the investment firm Hambrecht & Quist of San Francisco. I was impressed by the engineering and manufacturing expertise of the group, but I considered their beautifully designed Datapoint 3300 to be conceptually outdated.


Discussing future trends with Gus Roche.

During that initial visit, I had an extended discussion with Austin (Gus) Roche, who was then the vice president for research and development. I conveyed to him my conviction that point-of-use computers would replace dumb computer terminals. Gus met me halfway by agreeing that an intelligent terminal with a central processing unit would be a natural next step. Following my recommendation, Philips, Appel & Walden underwrote, with Hambrecht & Quist, the initial public offering of Computer Terminal Corporation.


The intelligent terminal Datapoint 2200.
A month prior to the public offering, the company began development of the Datapoint 2200. In November of that year, the design of its central processing unit was given to Victor (Vic) D. Poor and Harry S. Pyle, who were outside consultants to Datapoint. Although I was not in communication with either of them at the time, my understanding is that they expected to implement the central processing unit by exploiting the potential of large-scale-integration semiconductor technology. In December of 1969, Datapoint asked Intel and Texas Instruments to implement the central processing unit design using large-scale integration. Intel agreed. Intel initially designated the 8-bit microprocessor project the 1201; it was subsequently renamed the 8008.


The delay in the introduction of the 8008.

Intel’s main business at the time was designing and manufacturing 1,024-bit dynamic random-access memory chips. Intel had also undertaken the design of a programmable calculator chip set for Busicom of Japan (later named the 4004). When Intel realized that its technical staff could not develop both products at the same time, it shelved development of the 8008. Datapoint then let Texas Instruments implement the design on a single microprocessor chip, and Texas Instruments promptly filed a patent application on that implementation. After Texas Instruments encountered yield problems, Datapoint implemented the central processing unit of the 2200 using random-logic integrated circuits (Malone, 1995; Noyce and Hoff, 1981; Poor, 1996).


Meeting with Robert Noyce of Intel.

As soon as I heard that Intel had shelved the 8008 development, I met with Dr. Noyce, who was Intel’s CEO at the time. I expressed to him my view that the 4004 was unsuitable for general-purpose use because, among other things, its 4-bit word size was insufficient even to represent alphabetic characters. I conveyed to him my conviction that an 8-bit single-chip microprocessor would revolutionize the computer industry, and I urged him to resume the 8008 project, suggesting that I might be the 8008's first customer. Dr. Noyce agreed to complete the development of the 8008 after first completing the 4-bit chip set for Busicom. He added that Intel would need to obtain a release from Datapoint before the 8008 project could resume. I told Dr. Noyce that I would talk to Phil Ray, the president of Datapoint, about granting the needed release. I returned to San Antonio and met with Phil Ray, who agreed to provide Intel with the requested release.


The Intel 8008 and 8080 single chip 8-bit microprocessors.

The 8008 led first to the design of the 8080 and eventually to the Pentium processor. Naturally, there was no design commonality between the 4004 and this subsequent microprocessor family. Faced with the facts that the logic design of the 8008 was made by Datapoint and that its initial chip implementation was covered by a Texas Instruments patent application, Intel conferred on the 4004 the status of the first microprocessor.


Gus Roche of Datapoint.

Datapoint’s contribution to the advent of the microprocessor era was virtually unknown, and I felt that this ought to be corrected. In 1975, I was scheduled to chair the opening session, titled “The Microcomputer Revolution,” at the IEEE International Conference in New York. I asked Gus Roche to present a paper on Datapoint’s contribution. He agreed. Shortly before the conference, he died in a car accident.

Figure 3.   Daniel Alroy comments on the Microcomputer Revolution, the opening session of the 1975 IEEE International Conference, which he organized and chaired.



Q1 Corporation.



James Walden.

Prior to Datapoint’s public offering, I discussed with Jim Walden, the managing partner of Philips, Appel & Walden, my view that the Datapoint 3300 was based on flawed assumptions. In response, he suggested that if I could design a better product, I should do so, and that he would invite some private investors to provide me with seed capital. As a result, Q1 Corporation was formed.


System design considerations.

The computer system I set out to develop was based on two assumptions. The first related to semiconductor technology: I assumed that there would be a continued, rapid increase in the number of transistors per unit area and a corresponding decline in the cost per transistor. From this, I concluded that a basic new trend of computing at the point-of-use would emerge. The second assumption, about system design, borrowed from the philosophy of science the view that a major new theory is typically characterized by:

*   A wider scope of applicability

*   A smaller number of necessary assumptions

*   More precise measurements

If this holds for science, it ought to hold for technology, the application of science. This assumption contrasts with the commonly held view that custom-made solutions are the most effective. Microfilm, as a method of saving space in information storage, represents that common-sense view; but it has since been recognized that information storage cannot feasibly be treated in isolation (Metz, 1971). The central processing unit is a general-purpose computer. In a complete computer system, however, this general-purpose character is lost to limited-purpose peripherals and software. My aim was to make a general-purpose system (Alroy, 1978). Such a general-purpose design would:

*    Replace a multiplicity of limited-purpose systems

*    Perform those functions with greater specificity

*    Accomplish both of the above goals at a lower cost


Figure 4.   The Q1 Lite, which was installed at all eleven NASA bases. It utilized the Intel 8080, the second-generation microprocessor.



The first two generations of 8-bit microprocessors


The 8008-based Q1 microcomputer system.

In April 1972, Intel introduced the 8008, the first single-chip 8-bit microprocessor. On December 11, 1972, Q1 Corporation delivered a microcomputer system based on the Intel 8008 to the Litcom Division of Litton Industries in Melville, Long Island. This was the very first delivery of a microcomputer system.


Sale of know-how to Nixdorf Computer Company.

Early in 1973, Dr. Ron Sommer arranged for me to meet with Mr. Heinz Nixdorf, who was the founder and president of Nixdorf Computer Company of Paderborn, Germany. Following that meeting, Q1 Corporation received ten monthly payments of $40,000 from Nixdorf Computer Company in exchange for a sale of know-how (Electronic News, June 18, 1973). The income from the know-how sale expedited the development of the 8080-based Q1 microcomputer system.


Figure 4.3    Printed circuit board of the Q1 Lite microcomputer system.


The 8080-based Q1 microcomputer system.

In April of 1974, Intel introduced a second-generation 8-bit microprocessor, the 8080. That month, Q1 shipped a pre-production unit of its 8080-based microcomputer system, on loan and with a buy option, to the Israeli Air Force. In June 1974, Q1 received a follow-up order for a number of 8080-based systems, which were subject to acceptance tests. The first two 8080-based systems were delivered in August 1974, and the pre-production unit was returned to Q1.


The limited-purpose microcomputer market.

In May of 1973, the Micral, a special-purpose process controller, was introduced in France. In the fall of that year, Jonathan Titus offered the Mark-8, an 8008-based assembly kit for the hobbyist market (Titus, 1974). Neither product was a self-contained, general-purpose computer system, nor was either intended to be. Late in 1974, MITS began offering the Altair 8800 kit for the hobby market, based on the Intel 8080 chip. The Altair 8800 kit, like the 8008-based Micral and Mark-8 before it, was never intended to be a self-contained, general-purpose computer system. During the period of 1973-1974, Q1 Corporation was the only company in the world to deliver self-contained, general-purpose microcomputer systems.


The National Aeronautics and Space Administration (NASA).

In 1974, Computer Sciences Corporation made a study of microcomputer systems for NASA. Based on its recommendation, Q1 Lite computer systems were installed at all eleven NASA bases.


Figure 4.6    A sophisticated customer reports on its experience with the Q1 Lite.



UK’s National Enterprise Board.

In 1979, the National Enterprise Board, an entity of the British Government, invested $11.5 million in a joint venture with Q1 Corporation (Electronic News, August 20, 1979).


Figure 5.    Q1 Facility in Hauppauge. The flags indicate some of the countries where Q1 computers were installed.



Returning to my core interest.

My involvement in the computer field was subsidiary to my interest in the relation of mind and brain. Running a public company is not a 9–5 job; it did not allow me, for example, to complete my doctoral studies. The time had come for me to return to my core interest, and the joint venture with the UK’s National Enterprise Board provided the opportune moment to install a president in my place.


A postscript


Recapitulating the two basic assumptions.

Q1 was formed on the basis of two assumptions. The first related to semiconductor technology: that the increase in the number of transistors per unit area, and the corresponding decrease in the cost per transistor, would continue. The second related to computer system design: that there would be an advantage in convergent system design. Looking back, each is reviewed in turn.


The semiconductor technology.


Figure 6.1.1   The semiconductor revolution is best exemplified by the billion-fold increase in transistor density over 15 generations of RAM chips.



Fifteen generations of memory chips.

A present-day dynamic random-access memory chip contains about one billion transistors. That growth may be viewed as three stages of thousandfold increase in the number of transistors per unit area, where each such major stage consists of five generations of quadrupling the number of transistors per chip.


A major development in human history.

The size of the silicon chip has remained substantially the same. Hence, the present-day memory chip represents a billion-fold increase in the number of transistors per unit area. Moreover, the production cost per unit area has also remained about the same. Consequently, the billion-fold increase in the number of transistors per unit area reflects a billion-fold decrease in the cost per transistor, as if a ten-million-dollar mansion in 1960 cost a penny now. Multiplying the reduction in the cost per transistor by the increase in processing speed yields an estimate of the drop in computing cost, which comes to about thirty-two trillion-fold. If the above reasoning is basically correct, then the semiconductor revolution is the most significant technological event in human history.
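As a check on these figures, the per-generation arithmetic used above for the 1960’s can be extended to the fifteen generations of Figure 6.1.1. The sketch below merely illustrates that extrapolation under the same constant-area, constant-cost assumptions; the variable names are mine:

```python
# Extending the Figure 1.3 scaling model across 15 generations of RAM chips:
# three stages of five generations each (bit -> kilobit -> megabit -> gigabit).

for generations, label in [(5, "one stage (the 1960s)"),
                           (15, "fifteen generations")]:
    areal_density = 4 ** generations             # transistors per unit area
    speed = 2 ** generations                     # effective processing speed
    computing_cost_drop = areal_density * speed  # equals 8 ** generations
    print(f"{label}: density x{areal_density:,}, speed x{speed:,}, "
          f"computing cost 1/{computing_cost_drop:,}")

# Fifteen generations give a density increase of x1,073,741,824 (about a
# billion-fold), a speed increase of x32,768, and a computing-cost drop of
# about 1/35,184,372,088,832; the "thirty-two trillion-fold" figure in the
# text uses the rounded billion-fold density (10**9 * 32,768 = 3.28e13).
```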



The convergent system design.

Convergence involves reducing the number of types of equipment needed to perform a set of tasks. It implies the counterintuitive notion that a general-purpose system can perform tasks with greater specificity than equipment custom made for a given purpose. Consider, for example, the facsimile function: a ubiquitous personal computer with an inkjet printer, a scanner, and a modem can replace a fax machine, perform fax functions better than a fax machine, and do so at zero additional hardware cost. Convergence was chosen as a design criterion in the belief that it is an elusive concept and would thus allow the company to extend its lead-time. This assumption proved correct. Andrew Grove, the former president of Intel, denied the validity of convergence as a design criterion for information technology products. Lou Gerstner, the former president of IBM, while recognizing the validity of convergence, viewed it primarily in terms of interoperability. To this day, there is no adequate and explicit characterization of what convergence entails.


Convergence through generalization.

Inductive generalization is a key to convergence. In most cases, a solution to a stated problem cannot be effective unless the stated problem is first replaced by its generalized version; the proposed solution then needs to be generalized as well. The office function is an example of a problem we sought to generalize from the outset. Office functions differ by industry type and by company size; we focused on those functions that are common to companies regardless of either. We were then surprised to discover that, so defined, the office function is a major, if not the major, segment of the economies of developed countries. Yet economists have not recognized that market segment by name. Section 3.2, above, indicates some of our considerations in generalizing solutions in our design of information systems.


Teleological considerations in convergence.

Sub-goals are defined relative to goals, and a goal typically lies further in the future than its sub-goals. Any proposed information system should therefore be designed to constitute a sub-goal stage that converges toward subsequent information systems.


The next major phase in information systems.

The proliferation of personal information devices with overlapping functions signals the end of an era. Most current information products fail to satisfy the convergence requirements indicated above. Despite having been away from the information technology field for some decades, I seem to be in a unique position to outline the next major phase in information systems and consumer electronics. I intend, therefore, to share this outline with some congenial interested party, provided that doing so would not take me away from my present focus on the mind/brain. This summer I will seek to identify and establish a relationship with such a party.



Alroy, D. The microcomputer revolution. IEEE International Conference, New York, NY, April 1975.

Alroy, D. The convergence towards the multifunction microcomputer systems. International Word Processing Association, New Orleans, LA, February 1978.

Electronic News. January 29, 1973; June 18, 1973; August 20, 1979.

Malone, M. S. The Microprocessor: A Biography. New York, NY: Springer-Verlag, 1995.

Metz, R. Mini-Computer: Some Concern. An interview with Daniel Alroy. New York Times, October 6, 1971.

Noyce, R. N. and Hoff, M. E., Jr. A history of microprocessor development at Intel. IEEE Micro, pp. 8-21, February 1981.

Poor, V. D. Personal communication, September 1996.

Titus, J. Build the Mark-8: Your Personal Minicomputer. Radio-Electronics, July 1974.

© 2017 Daniel Alroy