Computer History 101: The Development Of The PC

From the Tom's Hardware website. Tom's Hardware has a series of articles that I will be reposting here as I go through them, because they are very informative.

The First Electronic Computer

A physicist named John V. Atanasoff (with associate Clifford Berry) is officially credited with creating the first true digital electronic computer from 1937 to 1942, while working at Iowa State University. The Atanasoff-Berry Computer (called the ABC) was the first to use modern digital switching techniques and vacuum tubes as switches, and it introduced the concepts of binary arithmetic and logic circuits. This was made legally official on October 19, 1973 when, following a lengthy court trial, U.S. Federal Judge Earl R. Larson voided the ENIAC patent of Eckert and Mauchly and named Atanasoff as the inventor of the first electronic digital computer.

Military needs during World War II caused a great thrust forward in the evolution of computers. In 1943, Tommy Flowers completed a secret British code-breaking computer called Colossus, which was used to decode German secret messages. Unfortunately, that work went largely uncredited because Colossus was kept secret until many years after the war.

ENIAC Is Born

Besides code-breaking, systems were needed to calculate weapons trajectories and other military functions. In 1946, John P. Eckert, John W. Mauchly, and their associates at the Moore School of Electrical Engineering at the University of Pennsylvania built the first large-scale electronic computer for the military. This machine became known as ENIAC, the Electrical Numerical Integrator and Calculator. It operated on 10-digit numbers and could multiply two such numbers at the rate of 300 products per second by finding the value of each product from a multiplication table stored in its memory. ENIAC was about 1,000 times faster than the previous generation of electromechanical relay computers.

ENIAC used approximately 18,000 vacuum tubes, occupied 1,800 square feet (167 square meters) of floor space, and consumed around 180,000 watts of electrical power. Punched cards served as the input and output; registers served as adders and as quick-access read/write storage.

The executable instructions composing a given program were created via specified wiring and switches that controlled the flow of computations through the machine. As such, ENIAC had to be rewired and switched for each program to be run.

Although Eckert and Mauchly were originally given a patent for the electronic computer, it was later voided and the patent awarded to John Atanasoff for creating the Atanasoff-Berry Computer.

Programs: Change The Software, Not The Hardware

Earlier in 1945, the mathematician John von Neumann demonstrated that a computer could have a simple, fixed physical structure and yet be capable of executing any kind of computation effectively by means of proper programmed control without changes in hardware. In other words, you could change the program without rewiring the system. The stored-program technique, as von Neumann’s ideas are known, became fundamental for future generations of high-speed digital computers and has become universally adopted.

The first generation of modern programmed electronic computers to take advantage of these improvements appeared in 1947. This group of machines included EDVAC and UNIVAC, the first commercially available computers. These computers included, for the first time, the use of true random access memory (RAM) for storing parts of the program and the data that is needed quickly. Typically, they were programmed directly in machine language, although by the mid-1950s progress had been made in several aspects of advanced programming. The standout of the era is the UNIVAC (Universal Automatic Computer), which was the first true general-purpose computer designed for both alphabetical and numerical uses. This made the UNIVAC a standard for business, not just science and the military.

From Tubes To Transistors

From UNIVAC to the latest desktop PCs, computer evolution has moved very rapidly. The first-generation computers were known for using vacuum tubes in their construction. The generation to follow would use the much smaller and more efficient transistor.

From Tubes…

Any modern digital computer is largely a collection of electronic switches. These switches are used to represent and control the routing of data elements called binary digits (or bits). Because of the on-or-off nature of the binary information and signal routing the computer uses, an efficient electronic switch was required. The first electronic computers used vacuum tubes as switches, and although the tubes worked, they had many problems.

The three main components of a basic triode vacuum tube.

The type of tube used in early computers was called a triode and was invented by Lee De Forest in 1906. It consists of a cathode and a plate, separated by a control grid, suspended in a glass vacuum tube. The cathode is heated by a red-hot electric filament, which causes it to emit electrons that are attracted to the plate. The control grid in the middle can control this flow of electrons. By making it negative, you cause the electrons to be repelled back to the cathode; by making it positive, you cause them to be attracted toward the plate. Thus, by controlling the voltage on the grid, you can control the on/off output of the plate.

Unfortunately, the tube was inefficient as a switch. It consumed a great deal of electrical power and gave off enormous heat—a significant problem in the earlier systems. Primarily because of the heat they generated, tubes were notoriously unreliable—in larger systems, one failed every couple of hours or so.

…To Transistors

The invention of the transistor was one of the most important developments leading to the personal computer revolution. The transistor was invented in 1947 and announced in 1948 by Bell Laboratory engineers John Bardeen and Walter Brattain. Bell associate William Shockley invented the junction transistor a few months later, and all three jointly shared the Nobel Prize in Physics in 1956 for inventing the transistor. The transistor, which essentially functions as a solid-state electronic switch, replaced the less-suitable vacuum tube. Because the transistor was so much smaller and consumed significantly less power, a computer system built with transistors was also much smaller, faster, and more efficient than a computer system built with vacuum tubes.

The conversion from tubes to transistors began the trend toward miniaturization that continues to this day. Today’s small laptop PC (or netbook, if you prefer) and even Tablet PC systems, which run on batteries, have more computing power than many earlier systems that filled rooms and consumed huge amounts of electrical power.

Although there have been many designs for transistors over the years, the transistors used in modern computers are normally Metal Oxide Semiconductor Field Effect Transistors (MOSFETs). MOSFETs are made from layers of materials deposited on a silicon substrate. Some of the layers contain silicon with certain impurities added by a process called doping or ion bombardment, whereas other layers include silicon dioxide (which acts as an insulator), polysilicon (which acts as an electrode), and metal to act as the wires to connect the transistor to other components. The composition and arrangement of the different types of doped silicon allow them to act as either a conductor or an insulator, which is why silicon is called a semiconductor.

MOSFETs can be constructed as either NMOS or PMOS types, based on the arrangement of doped silicon used. Silicon doped with boron is called P-type (positive) because it lacks electrons, whereas silicon doped with phosphorus is called N-type (negative) because it has an excess of free electrons.

MOSFETs have three connections, called the source, gate, and drain. An NMOS transistor is made by using N-type silicon for the source and drain, with P-type silicon placed in between. The gate is positioned above the P-type silicon, separating the source and drain, and is separated from the P-type silicon by an insulating layer of silicon dioxide. Normally there is no current flow between N-type and P-type silicon, thus preventing electron flow between the source and drain. When a positive voltage is placed on the gate, the gate electrode creates a field that attracts electrons to the P-type silicon between the source and drain. That in turn changes that area to behave as if it were N-type silicon, creating a path for current to flow and turning the transistor “on.”

Cutaway view of an NMOS transistor.

A PMOS transistor works in a similar but opposite fashion. P-type silicon is used for the source and drain, with N-type silicon positioned between them. When a negative voltage is placed on the gate, the gate electrode creates a field that repels electrons from the N-type silicon between the source and drain. That in turn changes that area to behave as if it were P-type silicon, creating a path for current to flow and turning the transistor “on.”

When both NMOS and PMOS field-effect transistors are combined in a complementary arrangement, power is used only when the transistors are switching, making dense, low-power circuit designs possible. Because of this, virtually all modern processors are designed using CMOS (Complementary Metal Oxide Semiconductor) technology.
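
To make the complementary idea concrete, here is a toy software model of a CMOS inverter. This is my own sketch of the switching logic only (not of the electrical behavior), and the function name and structure are illustrative rather than anything from the original article:

    # Toy model of a CMOS inverter built from the complementary pair described
    # above: the NMOS device conducts when its gate is high, the PMOS device
    # conducts when its gate is low, so exactly one of the two is on in either
    # steady state and there is no static current path from power to ground.

    def cmos_inverter(input_high: bool) -> bool:
        nmos_on = input_high        # NMOS pulls the output low when the input is high
        pmos_on = not input_high    # PMOS pulls the output high when the input is low
        assert nmos_on != pmos_on   # never both on at steady state -> no static power
        return pmos_on              # output is high only while the PMOS conducts

    for level in (False, True):
        print(f"in={int(level)} -> out={int(cmos_inverter(level))}")

Power is drawn only at the moment the input changes, when the output has to be charged or discharged, which is exactly the low-power property described above.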

Compared to a tube, a transistor is much more efficient as a switch and can be miniaturized to microscopic scale. Since the transistor was invented, engineers have strived to make it smaller and smaller. In 2003, NEC researchers unveiled a silicon transistor only 5 nanometers (billionths of a meter) in size. Other technologies, such as graphene and carbon nanotubes, are being explored to produce even smaller transistors, down to the molecular or even atomic scale. In 2008, British researchers unveiled a graphene-based transistor only 1 atom thick and 10 atoms (1 nm) across, and in 2010, IBM researchers created graphene transistors switching at a rate of 100 gigahertz, thus paving the way for future chips denser and faster than possible with silicon-based designs.

Integrated Circuits: The Next Generation

The third generation of modern computers is known for using integrated circuits instead of individual transistors. Jack Kilby at Texas Instruments and Robert Noyce at Fairchild are both credited with having invented the integrated circuit (IC) in 1958 and 1959. An IC is a semiconductor circuit that contains more than one component on the same base (or substrate material), which are usually interconnected without wires. The first prototype IC constructed by Kilby at TI in 1958 contained only one transistor, several resistors, and a capacitor on a single slab of germanium, and it featured fine gold "flying wires" to interconnect them. However, because the flying wires had to be individually attached, this type of design was not practical to manufacture. By comparison, Noyce patented the "planar" IC design in 1959, where all the components are diffused in or etched on a silicon base, including a layer of aluminum metal interconnects. In 1960, Fairchild constructed the first planar IC, consisting of a flip-flop circuit with four transistors and five resistors on a circular die only about 20 mm² in size. By comparison, the Intel Core i7 quad-core processor incorporates 731 million transistors (and numerous other components) on a single 263 mm² die!

Birth Of The Personal Computer

The fourth generation of the modern computer includes those that incorporate microprocessors in their designs. Of course, part of this fourth generation of computers is the personal computer, which itself was made possible by the advent of low-cost microprocessors and memory.


In 1973, some of the first microcomputer kits based on the 8008 chip were developed. These kits were little more than demonstration tools and didn’t do much except blink lights. In April 1974, Intel introduced the 8080 microprocessor, which was 10 times faster than the earlier 8008 chip and addressed 64 KB of memory. This was the breakthrough that the personal computer industry had been waiting for.

A company called MITS introduced the Altair 8800 kit in a cover story in the January 1975 issue of Popular Electronics. The Altair kit, considered by many to be the first personal computer, included an 8080 processor, a power supply, a front panel with a large number of lights, and 256 bytes (not kilobytes) of memory. The kit sold for $395 and had to be assembled. Assembly back then meant you got out your soldering iron to actually finish the circuit boards—not like today, where you can assemble a system of premade components with nothing more than a screwdriver.

Ed Roberts, The "Father Of The Personal Computer"

Micro Instrumentation and Telemetry Systems was the original name of the company founded in 1969 by Ed Roberts and several associates to manufacture and sell instruments and transmitters for model rockets. Ed Roberts became the sole owner in the early 1970s, after which he designed the Altair. By January 1975, when the Altair was introduced, the company was called MITS, Inc., which then stood for nothing more than the name of the company. In 1977, Roberts sold MITS to Pertec, moved to Georgia, went to medical school, and became a practicing physician. Considered by many to be the “father of the personal computer,” Roberts passed away in 2010 after a long bout with pneumonia.

The Altair included an open architecture system bus later called the S-100 bus, so named because it became an industry standard and had 100 pins per slot. The S-100 bus was widely adopted by other computers that were similar to the Altair, such as the IMSAI 8080, which was featured in the movie WarGames. The S-100 bus open architecture meant that anybody could develop boards to fit in these slots and interface to the system, and it ensured a high level of cross-compatibility between different boards and systems. The popularity of 8080 processor–based systems inspired software companies to write programs, including the CP/M (control program for microprocessors) OS and the first version of the Microsoft BASIC (Beginner's All-purpose Symbolic Instruction Code) programming language.

IBM’s First Personal Computer (But Not PC Compatible)

IBM introduced what can be called its first personal computer in 1975. The Model 5100 had 16 KB of memory, a built-in 16-line-by-64-character display, a built-in BASIC language interpreter, and a built-in DC-300 cartridge tape drive for storage. The system's $8,975 price placed it out of the mainstream personal computer marketplace, which was dominated by experimenters (affectionately referred to as hackers) who built low-cost kits ($500 or so) as a hobby. Obviously, the IBM system was not in competition for this low-cost market and did not sell as well by comparison.

The Model 5100 was succeeded by the 5110 and 5120 before IBM introduced what we know as the IBM Personal Computer (Model 5150). Although the 5100 series preceded the IBM PC, the older systems and the 5150 IBM PC had nothing in common. The PC that IBM turned out was more closely related to the IBM System/23 DataMaster, an office computer system introduced in 1980. In fact, many of the engineers who developed the IBM PC had originally worked on the DataMaster.

Apple’s First Personal Computer (But Not Macintosh Compatible)

In 1976, a new company called Apple Computer introduced the Apple I, which originally sold for $666.66. The selling price was an arbitrary number selected by one of Apple's cofounders, Steve Jobs. This system consisted of a main circuit board screwed to a piece of plywood; a case and power supply were not included. Only a few of these computers were made, and they have reportedly sold to collectors for more than $20,000. The Apple II, introduced in 1977, helped set the standard for nearly all the important microcomputers to follow, including the IBM PC.

The microcomputer world was dominated in 1980 by two types of computer systems. One type, the Apple II, claimed a large following of loyal users and a gigantic software base that was growing at a fantastic rate. The other type, CP/M systems, consisted not of a single system but of all the many systems that evolved from the original MITS Altair. These systems were compatible with one another and were distinguished by their use of the CP/M OS and expansion slots, which followed the S-100 standard. All these systems were built by a variety of companies and sold under various names. For the most part, however, these systems used the same software and plug-in hardware. It is interesting to note that none of these systems was PC compatible or Macintosh compatible, the two primary standards in place today.

A new competitor looming on the horizon was able to see that to be successful, a personal computer needed to have an open architecture, slots for expansion, a modular design, and healthy support from both hardware and software companies other than the original manufacturer of the system. This competitor turned out to be IBM, which was quite surprising at the time because IBM was not known for systems with these open-architecture attributes. IBM, in essence, became more like the early Apple, whereas Apple became like everybody expected IBM to be. The open architecture of the forthcoming IBM PC and the closed architecture of the forthcoming Macintosh caused a complete turnaround in the industry.

The IBM Personal Computer

At the end of 1980, IBM decided to truly compete in the rapidly growing low-cost personal computer market. The company established the Entry Systems Division, located in Boca Raton, Florida, to develop the new system. The division was intentionally located far away from IBM's main headquarters in New York, or any other IBM facilities, so that it would be able to operate independently as a separate unit. This small group consisted of 12 engineers and designers under the direction of Don Estridge and was charged with developing IBM's first real PC. (IBM considered the previous 5100 system, developed in 1975, to be an intelligent programmable terminal rather than a genuine computer, even though it truly was a computer.) Nearly all these engineers had come to the new division from the System/23 DataMaster project, which was a small office computer system introduced in 1980 and the direct predecessor of the IBM PC.

Enter The First PC-Compatible System, Using Intel’s 8088 CPU

Much of the PC’s design was influenced by the DataMaster design. In the DataMaster’s single-piece design, the display and keyboard were integrated into the unit. Because these features were limiting, they became external units on the PC, although the PC keyboard layout and electrical designs were copied from the DataMaster.

Several other parts of the IBM PC system also were copied from the DataMaster, including the expansion bus (or input/output slots), which included not only the same physical 62-pin connector, but also almost identical pin specifications. This copying of the bus design was possible because the PC used the same interrupt controller as the DataMaster and a similar direct memory access (DMA) controller. Also, expansion cards already designed for the DataMaster could easily be redesigned to function in the PC.

The DataMaster used an Intel 8085 CPU, which had a 64 KB address limit and an 8-bit internal and external data bus. This arrangement prompted the PC design team to use the Intel 8088 CPU, which offered a much larger (1 MB) memory address limit and an internal 16-bit data bus, but only an 8-bit external data bus. The 8-bit external data bus and similar instruction set enabled the 8088 to be easily interfaced into the earlier DataMaster designs.
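
A quick back-of-the-envelope check of those address limits (my own illustration, not part of the original article; it assumes the usual figures of 16 address lines on the 8085 and 20 on the 8088):

    # Address-space arithmetic behind the 64 KB vs. 1 MB limits mentioned above.
    KB = 1024
    print(f"8085 (16 address lines): {2**16:,} bytes = {2**16 // KB} KB")
    print(f"8088 (20 address lines): {2**20:,} bytes = {2**20 // KB} KB = 1 MB")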

IBM brought its system from idea to delivery of functioning systems in one year by using existing designs and purchasing as many components as possible from outside vendors. The Entry Systems Division was granted autonomy from IBM’s other divisions and could tap resources outside the company, rather than go through the bureaucratic procedures that required exclusive use of IBM resources. IBM contracted out the PC’s languages and OS to a small company named Microsoft. That decision was the major factor in establishing Microsoft as the dominant force in PC software.

Digital Research Makes Way For Microsoft

It is interesting to note that IBM had originally contacted Digital Research (the company that created CP/M, then the most popular personal computer OS) to have it develop an OS for the new IBM PC. However, Digital was leery of working with IBM and especially balked at the nondisclosure agreement IBM wanted Digital to sign. Microsoft jumped on the opportunity left open by Digital Research and, consequently, became the largest software company in the world. IBM's use of outside vendors in developing the PC was an open invitation for the aftermarket to jump in and support the system—and it did.

On August 12, 1981, a new standard was established in the microcomputer industry with the debut of the IBM PC. Since then, hundreds of millions of PC-compatible systems have been sold, as the original PC has grown into an enormous family of computers and peripherals. More software has been written for this computer family than for any other system on the market.

The PC Industry 30 Years Later

In the 30 years since the original IBM PC was introduced, many changes have occurred. The IBM-compatible computer, for example, advanced from a 4.77 MHz 8088-based system to 3 GHz (3,000 MHz) or faster systems—about 100,000 or more times faster than the original IBM PC (in actual processing speed, not just clock speed). The original PC had only one or two single-sided floppy drives that stored 160 KB each using DOS 1.0, whereas modern systems can have several terabytes (trillions of bytes) or more of hard disk storage.

A rule of thumb in the computer industry (called Moore’s Law, originally set forth by Intel cofounder Gordon Moore) is that available processor performance and disk-storage capacity doubles every one and a half to two years, give or take.

Since the beginning of the PC industry, this pattern has held steady and, if anything, seems to be accelerating.

Moore’s Law

In 1965, Gordon Moore was preparing a speech about the growth trends in computer memory and made an interesting observation. When he began to graph the data, he realized a striking trend existed. Each new chip contained roughly twice as much capacity as its predecessor, and each chip was released within 18–24 months of the previous chip. If this trend continued, he reasoned, computing power would rise exponentially over relatively brief periods.

Moore’s observation, now known as Moore’s Law, described a trend that has continued to this day and is still remarkably accurate. It was found to not only describe memory chips, but also accurately describe the growth of processor power and disk drive storage capacity. It has become the basis for many industry performance forecasts. As an example, in less than 40 years the number of transistors on a processor chip increased more than half a million fold, from 2300 transistors in the 4004 processor in 1971 to 1.17 billion transistors in the six-core versions of the Core i-series processors released in 2010.
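
As a rough illustration of that arithmetic (my own sketch, using the transistor counts and the 18–24 month doubling period quoted above), the following snippet projects the 1971 count forward at a two-year doubling rate and compares it with the actual 2010 figure:

    # Moore's Law as simple arithmetic: double the count every `doubling_period`
    # years. The 4004 and Core i-series figures are the ones cited in the text.
    def project_transistors(start_count, start_year, end_year, doubling_period=2.0):
        doublings = (end_year - start_year) / doubling_period
        return start_count * 2 ** doublings

    actual_1971 = 2_300              # Intel 4004 (1971)
    actual_2010 = 1_170_000_000      # six-core Core i-series (2010)

    predicted = project_transistors(actual_1971, 1971, 2010)
    print(f"Predicted 2010 count at a 2-year doubling: {predicted:,.0f}")
    print(f"Actual growth, 1971 to 2010: {actual_2010 / actual_1971:,.0f}x")

The two-year projection lands in the same order of magnitude as the real figure, which is about as much as a rule of thumb can be asked to do.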

The PC-Compatible Open Standard

In addition to performance and storage capacity, another major change since the original IBM PC was introduced is that IBM is not the only manufacturer of PC-compatible systems. IBM originated the PC-compatible standard, of course, but today it no longer sets the standards for the system it originated. More often than not, new standards in the PC industry are developed by companies and organizations other than IBM.

Today, Intel, Microsoft, and AMD are primarily responsible for developing and extending the PC hardware and software standards. Some have even taken to calling PCs “Wintel” systems, owing to the dominance of the first two companies. Although AMD originally produced Intel processors under license and later produced low-cost, pin-compatible counterparts to Intel’s 486 and Pentium processors (AMD 486, K5/K6), starting with the Athlon AMD has created completely unique processors that are worthy rivals to Intel’s own models.

In more recent years, the introduction of hardware standards such as the universal serial bus (USB), the Peripheral Component Interconnect (PCI) bus, the Accelerated Graphics Port (AGP) bus, the PCI Express bus, the ATX motherboard form factor, and processor socket and slot interfaces shows that Intel is the driving force behind PC hardware design. Intel's ability to design and produce motherboard chipsets as well as complete motherboards has enabled Intel processor–based systems to be first to adopt newer memory and bus architectures as well as system form factors. Although in the past AMD has on occasion made chipsets for its own processors, the company's acquisition of ATI has allowed it to become more aggressive in the chipset marketplace.

PC-compatible systems have thrived not only because compatible hardware can be assembled easily, but also because the most popular OS was available not from IBM but from a third party (Microsoft). The core of the system software is the basic input/output system (BIOS); this was also available from third-party companies, such as AMI, Phoenix, and others. This situation enabled other manufacturers to license the OS and BIOS software and sell their own compatible systems. The fact that DOS borrowed functionality and its user interface from both CP/M and UNIX probably had a lot to do with the amount of software that became available. Later, with the success of Windows, even more reasons would exist for software developers to write programs for PC-compatible systems.

The Apple Macintosh Closed Standard

One reason Apple’s Macintosh systems have never enjoyed the market success of PC systems is that Apple has often used proprietary hardware and software designs that it was unwilling to license to other companies. This proprietary nature has unfortunately relegated Apple to a meager 5% market share in personal computers.

One fortunate development for Mac enthusiasts was Apple's shift to Intel x86 processors and PC architecture in 2006, resulting in greatly improved performance and standardization as compared to the previous non-PC-compatible Mac systems. Although Apple has failed to adopt some of the industry-standard component form factors used in PCs (rendering major components such as motherboards non-interchangeable), the PC-based Macs truly are PCs from a hardware standpoint, using all the same processors, chipsets, memory, buses, and other system architectures that PCs have been using for years. I've had people ask me, "Is there a book like Upgrading and Repairing PCs that covers Macs instead?" Well, since 2006 Macs have essentially become PCs, so they are now covered in this book by default! The move to a PC-based architecture is without a doubt the smartest move Apple has made in years—besides reducing Apple's component costs, it allows Macs to finally perform on par with PCs.

Apple could even become a real contender in the OS arena (taking market share from Microsoft) if the company would only sell its OS in an unlocked version that would run on non-Apple PCs. Unfortunately for now, even though Apple’s OS X operating system is designed to run on PC hardware, it is coded to check for a security chip found only on Apple motherboards. There are ways to work around the check (see OSx86project.org), but Apple does not support them.

Apple’s shift to a PC-based architecture is one more indication of just how popular the PC has become. After 30 years the PC continues to thrive and prosper. With far-reaching industry support and an architecture that is continuously evolving, I would say it is a safe bet that PC-compatible systems will continue to dominate the personal computer marketplace for the foreseeable future.

Mechanical To Modern

Many discoveries and inventions have directly and indirectly contributed to the development of the PC and other personal computers as we know them today. Examining a few important developmental landmarks can help bring the entire picture into focus.

The Timeline Of Computer Advancements:

The following is a timeline of significant events in computer history. It is not meant to be complete, just a representation of some of the major landmarks in computer development:

Pre-1900s: Mechanical Computers

1617:       John Napier creates “Napier’s Bones,” wooden or ivory rods used for calculating.

1642:       Blaise Pascal introduces the Pascaline digital adding machine.

1822:       Charles Babbage introduces the Difference Engine and later the Analytical Engine, a true general-purpose computing machine.

The Early 1900s: The Vacuum Tube Era

1906:       Lee De Forest patents the vacuum tube triode, used as an electronic switch in the first electronic computers.

1936:       Alan Turing publishes “On Computable Numbers,” a paper in which he conceives an imaginary computer called the Turing Machine, considered one of the foundations of modern computing. Turing later worked on breaking the German Enigma code.

1936:       Konrad Zuse begins work on a series of computers that will culminate in 1941 when he finishes work on the Z3. These are considered the first working electric binary computers, using electromechanical switches and relays.

1937:       John V. Atanasoff begins work on the Atanasoff-Berry Computer (ABC), which would later be officially credited as the first electronic computer. Note that an electronic computer uses tubes, transistors, or other solid-state switching devices, whereas an electric computer uses electric motors, solenoids, or relays (electromechanical switches).

1943:       Thomas (Tommy) Flowers develops the Colossus, a secret British code-breaking computer designed to decode teleprinter messages encrypted by the German army.

1945:       John von Neumann writes “First Draft of a Report on the EDVAC,” in which he outlines the architecture of the modern stored-program computer.

1946:       ENIAC is introduced, an electronic computing machine built by John Mauchly and J. Presper Eckert.

1947:       On December 23, William Shockley, Walter Brattain, and John Bardeen successfully test the point-contact transistor, setting off the semiconductor revolution.

1949:       Maurice Wilkes assembles the EDSAC, the first practical stored-program computer, at Cambridge University.

1950:       Engineering Research Associates of Minneapolis builds the ERA 1101, one of the first commercially produced computers.

1952:       The UNIVAC I delivered to the U.S. Census Bureau is the first commercial computer to attract widespread public attention.

1953:       IBM ships its first electronic computer, the 701.

1954:       A silicon-based junction transistor, perfected by Gordon Teal of Texas Instruments, Inc., brings a tremendous reduction in costs.

1954:       The IBM 650 magnetic drum calculator establishes itself as the first mass-produced computer, with the company selling 450 in one year.

1955-1981: From Transistors In Labs, To Integrated Circuits In The Home

1955:       Bell Laboratories announces the first fully transistorized computer, TRADIC.

1956:       MIT researchers build the TX-0, the first general-purpose, programmable computer built with transistors.

1956:       The era of magnetic disk storage dawns with IBM’s shipment of a 305 RAMAC to Zellerbach Paper in San Francisco.

1958:       Jack Kilby creates the first integrated circuit at Texas Instruments to prove that resistors and capacitors can exist on the same piece of semiconductor material.

1959:       IBM’s 7000 series mainframes are the company’s first transistorized computers.

1959:       Robert Noyce’s practical integrated circuit, invented at Fairchild Camera and Instrument Corp., allows printing of conducting channels directly on the silicon surface.

1960:       Bell Labs designs its Dataphone, the first commercial modem, specifically for converting digital computer data to analog signals for transmission across its long-distance network.

1961:       According to Datamation magazine, IBM has an 81.2% share of the computer market in 1961, the year in which it introduces the 1400 series.

1964:       IBM announces System/360, a family of six mutually compatible computers and 40 peripherals that can work together.

1964:       Online transaction processing makes its debut in IBM’s SABRE reservation system, set up for American Airlines.

1965:       Digital Equipment Corp. introduces the PDP-8, the first commercially successful minicomputer.

1969:       The root of what is to become the Internet begins when the Department of Defense establishes four nodes on the ARPAnet: two at University of California campuses (one at Santa Barbara and one at Los Angeles) and one each at Stanford Research Institute and the University of Utah.

1971:       A team at IBM’s San Jose Laboratories invents the 8-inch floppy disk drive.

1971:       The first advertisement for a microprocessor, the Intel 4004, appears in Electronic News.

1971:       The Kenbak-1, one of the first personal computers, is advertised for $750 in Scientific American.

1972:       Intel’s 8008 microprocessor makes its debut.

1973:       Robert Metcalfe devises the Ethernet method of network connection at the Xerox Palo Alto Research Center.

1973:       The Micral is the earliest commercial, nonkit personal computer based on a microprocessor, the Intel 8008.

1973:       The TV Typewriter, designed by Don Lancaster, provides the first display of alphanumeric information on an ordinary television set.

1974:       Researchers at the Xerox Palo Alto Research Center design the Alto, the first workstation with a built-in mouse for input.

1974:       Scelbi advertises its 8H computer, the first commercially advertised U.S. computer based on a microprocessor, Intel’s 8008.

1975:       Telenet, the first commercial packet-switching network and civilian equivalent of ARPAnet, is born.

1975:       The January edition of Popular Electronics features the Altair 8800, which is based on Intel’s 8080 microprocessor, on its cover.

1976:       Steve Wozniak designs the Apple I, a single-board computer.

1976:       The 5 1/4-inch floppy disk drive is introduced by Shugart Associates.

1977:       Tandy/Radio Shack introduces the TRS-80.

1977:       Apple Computer introduces the Apple II.

1977:       Commodore introduces the PET (Personal Electronic Transactor).

1979:       Motorola introduces the 68000 microprocessor.

1980:       Seagate Technology creates the first hard disk drive for microcomputers, the ST-506.

1981-1995: The PC-Compatible Standard Is Entrenched

1981:       Xerox introduces the Star, the first personal computer with a graphical user interface (GUI).

1981:       Adam Osborne completes the first portable computer, the Osborne I, which weighs 24 pounds and costs $1795.

1981:       IBM introduces its PC, igniting a fast growth of the personal computer market. The IBM PC is the grandfather of all modern PCs.

1981:       Sony introduces and ships the first 3 1/2-inch floppy disk drive.

1981:       Philips and Sony introduce the CD-DA (compact disc digital audio) format.

1983:       Apple introduces its Lisa, which incorporates a GUI that’s similar to the one introduced on the Xerox Star.

1983:       Compaq Computer Corp. introduces its first PC clone that uses the same software as the IBM PC.

1984:       Apple Computer launches the Macintosh, the first successful mouse-driven computer with a GUI, with a single $1.5 million commercial during the 1984 Super Bowl.

1984:       IBM releases the PC-AT (PC Advanced Technology), three times faster than original PCs and based on the Intel 286 chip. The AT introduces the 16-bit ISA bus and is the computer on which all modern PCs are based.

1985:       Philips introduces the first CD-ROM drive.

1986:       Compaq announces the Deskpro 386, the first computer on the market to use Intel’s 32-bit 386 chip.

1987:       IBM introduces its PS/2 machines, which make the 3 1/2-inch floppy disk drive and VGA video standard for PCs. The PS/2 also introduces the MicroChannel Architecture (MCA) bus, the first plug-and-play bus for PCs.

1988:       Apple cofounder Steve Jobs, who left Apple to form his own company, unveils the NeXT Computer.

1988:       Compaq and other PC-clone makers develop Enhanced Industry Standard Architecture (EISA), which unlike MicroChannel retains backward compatibility with the existing ISA bus.

1988:       Robert Morris’s worm floods the ARPAnet. The 23-year-old Morris, the son of a computer security expert for the National Security Agency, sends a nondestructive worm through the Internet, causing problems for about 6,000 of the 60,000 hosts linked to the network.

1989:       Intel releases the 486 (P4) microprocessor, which contains more than one million transistors. Intel also introduces 486 motherboard chipsets.

1990:       The World Wide Web (WWW) is born when Tim Berners-Lee, a researcher at CERN—the high-energy physics laboratory in Geneva—develops Hypertext Markup Language (HTML).

1993-2005: Windows 95 to XP, Pentiums, And Athlons

1993:       Intel releases the Pentium (P5) processor. Intel shifts from numbers to names for its chips after the company learns it’s impossible to trademark a number. Intel also releases motherboard chipsets and, for the first time, complete motherboards.

1995:       Intel releases the Pentium Pro processor, the first in the P6 processor family.

1995:       Microsoft releases Windows 95 in a huge rollout.

1997:       Intel releases the Pentium II processor, essentially a Pentium Pro with MMX instructions added.

1997:       AMD introduces the K6, which is compatible with the Intel P5 (Pentium).

1998:       Microsoft releases Windows 98.

1998:       Intel releases the Celeron, a low-cost version of the Pentium II processor. Initial versions have no cache, but within a few months Intel introduces versions with a smaller but faster L2 cache.

1999:       Intel releases the Pentium III, essentially a Pentium II with SSE (Streaming SIMD Extensions) added.

1999:       AMD introduces the Athlon.

1999:       The IEEE officially approves the 5 GHz band 802.11a 54 Mb/s and 2.4 GHz band 802.11b 11 Mb/s wireless networking standards. The Wi-Fi Alliance is formed to certify 802.11b products, ensuring interoperability.

2000:       The first 802.11b Wi-Fi-certified products are introduced, and wireless networking rapidly builds momentum.

2000:       Microsoft releases Windows Me (Millennium Edition) and Windows 2000.

2000:       Both Intel and AMD introduce processors running at 1GHz.

2000:       AMD introduces the Duron, a low-cost Athlon with reduced L2 cache.

2000:       Intel introduces the Pentium 4, the latest processor in the Intel Architecture 32-bit (IA-32) family.

2001:       The industry celebrates the 20th anniversary of the release of the original IBM PC.

2001:       Intel introduces the first 2 GHz processor, a version of the Pentium 4. It takes the industry 28 1/2 years to go from 108 KHz to 1 GHz but only 18 months to go from 1 GHz to 2 GHz.

2001:       Microsoft releases Windows XP, merging the consumer and business OS lines under the same 32-bit Windows NT code base (NT 5.1).

2001:       Atheros introduces the first 802.11a 54 Mb/s high-speed wireless chips, allowing 802.11a products to finally reach the market.

2002:       Intel releases the first 3 GHz-class processor, a 3.06 GHz version of the Pentium 4. This processor also introduces Intel’s Hyper-Threading (HT) technology, appearing as two processors to the OS.

2003-Present: Multiple CPU Cores And 64-bits

2003:       Intel releases the Pentium M, a processor designed specifically for mobile systems, offering extremely low power consumption that results in dramatically increased battery life while still offering relatively high performance.

2003:       AMD releases the Athlon 64, the first x86-64 (64-bit) processor for PCs, which also includes integrated memory controllers.

2003:       The IEEE officially approves the 802.11g 54 Mb/s high-speed wireless networking standard.

2004:       Intel introduces a version of the Pentium 4 codenamed Prescott, the first PC processor built on 90-nanometer technology.

2004:       Intel introduces EM64T (Extended Memory 64 Technology), which is a 64-bit extension to Intel’s IA-32 architecture based on (and virtually identical to) the x86-64 (AMD64) technology first released by AMD.

2005:       Microsoft releases Windows XP x64 Edition, which supports processors with 64-bit AMD64 and EM64T extensions.

2005:       The era of multicore PC processors begins as Intel introduces the Pentium D 8xx and Pentium Extreme Edition 8xx dual-core processors. AMD soon follows with the dual-core Athlon 64 X2.

2006:       Apple introduces the first Macintosh systems based on PC architecture, stating they are four times faster than previous non-PC-based Macs.

2006:      Intel introduces the Core 2 Extreme, the first quad-core processor for PCs.

2006:       Microsoft releases the long-awaited Windows Vista to business users. The PC OEM and consumer market releases would follow in early 2007.

2007:       Intel releases the 3x series chipsets with support for DDR3 memory and PCI Express 2.0, which doubles the available bandwidth.

2007:       AMD releases the Phenom processors, the first quad-core processors for PCs with all four cores on a single die.

2008:       Intel releases the Core i-series (Nehalem) processors, which are dual- or quad-core chips with optional Hyper-Threading (appearing as four or eight cores to the OS) that include an integrated memory controller.

2008:       Intel releases the 4x and 5x-series chipsets, the latter of which supports Core i-series processors with integrated memory controllers.

2009:       Microsoft releases Windows 7, a highly anticipated successor to Vista.

2009:       AMD releases the Phenom II processors in 2-, 3-, and 4-core versions.

2010:       Intel releases six-core versions of the Core i-series processor (Gulftown) and a dual-core version with integrated graphics (Clarkdale). The Gulftown is the first PC processor with more than 1 billion transistors.

2010:       AMD releases six-core versions of the Phenom II processor.

2011:       Intel releases the second-generation Core i-series processors along with new 6-series motherboard chipsets. The chipsets and motherboards are quickly recalled due to a bug in the SATA host adapter. The recall costs Intel nearly a billion dollars and results in a delay of several months before the processors and chipsets reach the market.


If you would like to comment, please refer to which section of the article you liked so I know you are not a spammer. Too many out there :)
