Aug 21, 2011

Generation of Computer, History




The history of computer development is often discussed in terms of the different generations of computing devices. Each generation is characterized by a major technological development that fundamentally changed the way computers operate, resulting in increasingly smaller, cheaper, more powerful, and more efficient and reliable devices. Read about each generation and the developments that led to the devices we use today.



      First Generation (1940-1956) Vacuum Tubes

      The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate and in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions.
      First generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts.
      The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercially produced computer in the United States; its first unit was delivered to the U.S. Census Bureau in 1951.

      Second Generation (1956-1963) Transistors

      Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.
      Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology.
      The first computers of this generation were developed for the atomic energy industry.

      Third Generation (1964-1971) Integrated Circuits

      The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on chips of silicon, a semiconductor material, which drastically increased the speed and efficiency of computers.
      Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors, and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.

      Fourth Generation (1971-Present) Microprocessors

      The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer—from the central processing unit and memory to input/output controls—on a single chip.
      In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.
      As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth generation computers also saw the development of GUIs, the mouse, and handheld devices.

      Fifth Generation (Present and Beyond) Artificial Intelligence

      Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.

      Computer History

      About this section...

      This section contains links and resources related to computer history.

       

      Computer History - Pre-1945

      An example of an abacus, a digital computer driven by fingers instead of electricity, which has been used for over 2,000 years.

      The Antikythera Mechanism - A Roman-Era Analog Computer (2,000 years ago)

      http://www.antikythera-mechanism.gr
      Excellent article on the mechanism
      http://www.newyorker.com/reporting/2007/05/14/070514fa_fact_seabrook
      The Antikythera Mechanism is the only surviving example of ancient computing technology. Sometime around 2,100 years ago, Greek scientists and engineers created this device. The extreme complexity of the mechanism, as well as scattered references in classical texts, indicate that many such devices must have existed. They have not survived to our era because they were made of bronze, which was often melted down and recycled.
      The mechanism, as originally found in a Roman-era shipwreck.
      The mechanism was finally reconstructed after researchers examined it with advanced X-ray and other scanning techniques.
      Side view of the gears in the mechanism.
      Reconstruction of the front gearset of the mechanism. Note the small balls representing the moon and sun. Additional lost gears probably indicated the positions of the five known planets at that time.
      The complete mechanism, with dials indicating calendar data and important festival dates (e.g. the Olympics). The right image shows detail of the back panel: the large calendar dial, and the smaller dial recording times of Olympic festivals.
      Images of the reconstructed Antikythera mechanism - a complex analog computer that predicted the motions of the sun, moon, and planets in the era of the Roman Empire.

      Computing's First Dark Age (400AD-1000AD in Europe)

      During this period few technical advances were made, due to the collapse of the Roman Empire in Europe and the Han empire in China. In China, computing machines were re-discovered in the early 1100s, but were wiped out during the Mongol invasions led by Genghis Khan.

      The Re-Discovery of Mechanical Computation (1500-1800 in Europe)

      The astrolabe, a very simple mechanical computing system, was perfected in the Islamic world and later used by European navigators, including the English, to establish mastery over the seas. It is used to calculate the angle of the sun, moon, stars, and planets above the horizon to determine one's position on Earth - a sort of proto-GPS.
      Astrolabe
      During the 1400s to 1800s, increasingly complex mechanical clocks and simple computers were constructed. Unlike computers built after 1800, they could not be programmed. However, they could perform simple arithmetic calculations: Pascal's calculator, for example, could do addition and subtraction.

      Pascal's Calculator (1642)

      This is a mechanical calculator that could add and subtract, built by the philosopher Blaise Pascal in the 1640s. The system of cranks and gears is comparable in complexity to the Antikythera mechanism from some 1,800 years earlier.

      The mathematician Gottfried Wilhelm von Leibniz built a better gear-driven calculating instrument, called the Stepped Reckoner. It was faster than Pascal's design, and could also multiply and divide.
      However, most scientists and engineers used a much simpler analog computer called a Slide Rule. Below is an example. Slide Rules were common well into the 1970s.
      Slide Rule

      Jacquard Loom - the first industrial robot (1801)

      The Jacquard Loom is a mechanical loom invented by Joseph Marie Jacquard in 1801. It is controlled by cards with punched holes, each row of which corresponds to one row of the design. Changing the cards changes the design of the textile produced.
      The punched cards used to encode data are to the right in the image above. Note their similarity to the IBM punched card system used 150 years later (see below).
      Another close-up of the punched-card system on the Jacquard Loom - in effect the first automatic programmable robot.
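      The loom's control principle - each row of holes on a card decides which warp threads are lifted for one pass of the shuttle - is easy to sketch in modern code. Below is a hypothetical Python illustration (the card patterns here are invented, not taken from a real loom):

```python
# Each "card row" is a string: 'O' = hole punched (thread lifted), '.' = no hole.
# The loom reads one row per pass of the shuttle; changing the cards
# changes the woven design.
cards = [
    "O.O.O.O.",
    ".O.O.O.O",
    "OO..OO..",
    "..OO..OO",
]

def weave(cards):
    """Render the textile pattern the cards encode: '#' where a thread was lifted."""
    return ["".join("#" if c == "O" else " " for c in row) for row in cards]

for line in weave(cards):
    print(line)
```

      Swapping in a different list of cards "reprograms" the loom - exactly the idea later borrowed by punched-card computing.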

      Charles Babbage (1791-1871) and his Difference Engines (circa 1830)

      English scientist Charles Babbage began to extend computing in the 1820s and 1830s. During his lifetime, he designed several mechanical computers much more complex than any made before. The most advanced of these systems included all the components found in a modern digital computer. However, they were mechanical instead of electrical - turned by crank, or by a steam engine, as in the larger Analytical Engine.
      Babbage had an extraordinary career apart from his computing machines, and held many patents for new technology in the 1800s. He was also a codebreaker - he cracked the supposedly unbreakable Vigenère cipher. Code creation and codebreaking formed a major part of the drive to automate computing in the 19th and 20th centuries.
      An early form of mechanical computer developed by Babbage in the 1820s as a "test system" for his difference engines.

      Difference Engine No. 2 (reconstruction from 1830s design)

      Charles Babbage's Difference Engine No. 2
      A more complex difference engine designed by Babbage. This design was never fully completed, due to Babbage's fights with various individuals and with the British government, which partly sponsored the effort.

      The Difference Engine in Action

      High-quality:
      http://www.youtube.com/watch?v=Lcedn6fxgS0&feature=related
      http://www.youtube.com/watch?v=aCsBDNf9Mig&feature=fvw
      http://www.youtube.com/watch?v=Iv2rCMqW-tg&feature=related
      The Difference Engine implemented as LEGO!
      http://www.youtube.com/watch?v=KL_wy-CxBP8&feature=related

      The Scheutz Difference Engine (1850s)

      Other people built functional difference engines during the 1800s. Here is an example of a successful engine from the 1850s.
      Father and son Georg and Edvard Scheutz built a Difference Engine based on the ideas of British computing pioneer Charles Babbage, with the numbers impressed into papier-mâché or metal strips. These were then used as moulds from which multiple copies of the results could be printed. The engine was bought in 1859 by the General Register Office, which used it in 1864 to produce the 'English Life Table', a set of life assurance tables.
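      The mathematical trick all of these difference engines mechanized is the "method of differences": for any polynomial, the differences between successive values eventually become constant, so an entire table can be produced using only repeated addition - no multiplication gears required. A hypothetical Python sketch of the idea:

```python
def difference_engine(initial, steps):
    """Tabulate a polynomial using only addition.

    `initial` holds the starting value and its finite differences,
    e.g. for f(x) = x**2 at x = 0: [f(0), delta f(0), delta^2 f(0)] = [0, 1, 2].
    """
    regs = list(initial)
    table = [regs[0]]
    for _ in range(steps):
        for i in range(len(regs) - 1):  # each difference column adds into the one above
            regs[i] += regs[i + 1]
        table.append(regs[0])
    return table

# Table of squares: the second differences of x**2 are the constant 2.
print(difference_engine([0, 1, 2], 6))  # [0, 1, 4, 9, 16, 25, 36]
```

      Each turn of the crank on a real engine performed one pass of those additions across the gear columns.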

      Babbage's Analytical Engine (fragment)

      Babbage's Analytical Engine was never fully built due to cost overruns and the inventor's cranky personality. Study of the designs shows that the system would have worked, and would have been comparable to mechanical computers built 100 years later at the end of WWII. The Harvard Mark I, built in the early 1940s, has many features borrowed from the Analytical Engine's design. If it had been built, the Analytical Engine would have incorporated memory, a central processor or "mill", an input system, and even a graphical plotter for producing charts and images from the computations.
      The image on the left shows part of the "mill", or computing portion, of Babbage's engine. The right image shows the punched-card system storing data along with part of the processing unit. The right component was built by Henry Prevost Babbage, the youngest son of Charles Babbage, after his father's death.

      Ada Byron (Lovelace) 1815-1852

      Ada Byron is sometimes called the world's first programmer. She was one of the few people who fully understood the potential of Babbage's machines, and also realized that the code, or software operating them, was as important as their hardware.
      Ada Lovelace was the only legitimate child of the poet Lord Byron. From 1832, when she was seventeen, her remarkable mathematical abilities began to emerge, and she knew and corresponded with many scientists of her day, including Mary Somerville, noted researcher and scientific author of the 19th century, who introduced her to Charles Babbage on 5 June 1833.
      Ada Lovelace met and corresponded with Charles Babbage on many occasions, including socially and in relation to Babbage's Difference Engine and Analytical Engine. Babbage was impressed by Lovelace's intellect and writing skills. He called her "The Enchantress of Numbers".
      During a nine-month period in 1842-43, Lovelace translated Italian mathematician Luigi Menabrea's memoir on Babbage's newest proposed machine, the Analytical Engine. To the article she appended a set of notes. The notes include the world's first computer program, designed for Babbage's Analytical Engine.
      Lovelace is now widely credited as being the first computer programmer. As the patron saint of programmers, she is honored each year on Ada Lovelace Day.
      Ada died at age thirty-six - what would have happened if she had lived, and Babbage had completed his design for the Analytical Engine and she could have tested her programs?
      More in Wikipedia - http://en.wikipedia.org/wiki/Ada_Lovelace
      Ada's 1842 article containing the first computer program, designed for Babbage's Analytical engine (HEAVY math).
      http://www.fourmilab.ch/babbage/sketch.html
      Another good book (Amazon)
      http://www.amazon.com/exec/obidos/ASIN/0912647094/vintagecompute0d?creative=327641&camp=14573&link_code=as1


      Computing's Second "Dark Age"

      Between the end of the 19th century and the period just before WWII, very little work on machine computing was done. In some ways, it constitutes a "dark age" of computing. Interest in computing re-emerged in the 1930s as the world headed for a global war.

      Mechanical calculators make a comeback

      For the 1890 census, Herman Hollerith developed an electro-mechanical tabulating machine to compile statistics.
      The images below show the Hollerith punched-card system used in the 1890 census, and a 1920s-era punched-card system that created data for these mechanical calculators.
      Hollerith founded the Tabulating Machine Company, which later merged into what became IBM, the main computing company of the 20th century. IBM produced ever more sophisticated "partial" computers based on the Hollerith model, but stopped short of creating a fully functional digital computer. The Hollerith tabulator below (from 1928) was typical of mechanical computing machinery in the 1920s and 1930s. To the right, an example of the 80-column punched card for storing data - a descendant of the cards used by the Jacquard Loom and Babbage's Analytical Engine.
      A 1928 Hollerith machine.
      More on the Hollerith machines and IBM's early history at - http://www.columbia.edu/acis/history/tabulator.html

      The Rise of Electronic Computers

      By the 1930s, it was practical to implement computing machinery using electricity instead of mechanical parts. This allowed calculation speed to jump 1000-fold, and ushered in the modern computer age.

      Vannevar Bush and the Differential Analyzer (1930)

      In 1930, Vannevar Bush introduced the Differential Analyzer, one of the first large-scale electrically driven computers in the United States. Bush worked on developing machines that could automate human thinking. The Differential Analyzer was an analog device rather than a digital one - a sort of electric slide rule that did not use symbol processing the way a modern digital computer (and Babbage's engines) does.



      Atanasoff-Berry Computer (1937)

      Unlike the Differential Analyzer, the Atanasoff-Berry computer used digital symbols to compute. It used electricity, along with vacuum tube technology to reach speeds impossible for a mechanical computer. However, it was severely limited in the types of computations it could do.
      Atanasoff-Berry Computer

      Harvard Mark I - The first modern digital computer (1944)

      Developed by Howard H. Aiken, built at IBM and shipped to Harvard in February 1944. The machine was directly inspired by Babbage's Analytical Engine - in some ways it is the realization of Babbage's vision. Howard Aiken saw himself as something close to a reincarnated Babbage, and even invited Babbage's grandson to participate in the first conference on computers held after the war.
      It is a hybrid electrical-mechanical device, with a large shaft synchronizing the operation of its parts. It was made up of 78 adding machines and desk calculators connected by almost 500 miles of wire. It could perform three additions or subtractions per second. It was 51 feet long and 8 feet high, and had over 750,000 parts.
      The Mark I was also unique in not being an experimental system - it was immediately put into use by the US military. Part of the complex computations needed to produce the first atomic bomb were performed on the Mark I. One of its remarkable features was its reliability - unlike the early electronic computers, it could compute 20 hours a day without breaking down. The first programmers of the Mark I were computing pioneers Richard Milton Bloch, Robert Campbell, and Grace Hopper.
      Postwar, there was a serious rivalry between Aiken and IBM. IBM had built the machine, but did not fully understand what Aiken had created. The Mark I was superior to any of IBM's machines.

      Grace Murray Hopper

      Of the Mark I programmers - probably the first people to create and run useful computing in a data center - Grace Hopper stood out. With a doctorate in mathematics, Hopper joined the Navy in WWII and later joined Aiken's group. She worked closely with Aiken, and in some ways the old Babbage/Byron hardware/software duo had reappeared.
      Hopper went on to define many features of modern computing, including freely shared code and high-level programming languages. Her creation of software "compilers" was critical in that it brought large numbers of new programmers into the profession in the 1950s. COBOL, an early high-level programming language she helped develop circa 1959, is still in use - by some estimates, a large share of the world's business code still runs on COBOL. In part due to Hopper (and Ada Lovelace before her), computing was seen as a profession open to women, and by the 1960s computing was the preferred "high-tech" choice of many women.
      first computer bug
      An image of the first "computer bug". On September 9, 1947, Grace Hopper was working on the Harvard University Mark II and discovered that a problem with the program was actually a moth trapped between the points of Relay #70, in Panel F. When it was fixed, word got out that the program was "debugged" - and a new word in computer jargon was born.

      The Navy was so proud that it eventually promoted her to Rear Admiral, and named a guided missile destroyer, the USS Hopper (seen here launching), after her!
      grace hopper

      ACE (1945)

      ACE (Automatic Computing Engine): In 1945, Alan Turing presented a detailed paper to the National Physical Laboratory (NPL) Executive Committee, giving the first reasonably complete design of a stored-program computer. However, because of the strict and long-lasting secrecy around his wartime work at Bletchley Park, he was prohibited (having signed the Official Secrets Act) from explaining that he knew his ideas could be implemented in an electronic device. The full ACE was never built as Turing designed it, but a scaled-down version, the Pilot ACE, ran its first program in 1950.
      Turing also published the first formal descriptions of digital computing. All modern computers are a form of "Turing Machine" following the principles he described. He also helped to found the science of mathematical biology.
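      A Turing machine, in the sense Turing described, is just a tape, a read/write head, and a table of rules saying what to write, which way to move, and which state to enter next. A toy Python simulator (an illustrative sketch; the rule table below implements unary "add one"):

```python
def run_turing_machine(tape, rules, state="start", pos=0, halt="halt"):
    """Run a Turing machine: rules map (state, symbol) -> (write, move, next_state)."""
    cells = dict(enumerate(tape))          # sparse tape; blank cells read as '_'
    while state != halt:
        symbol = cells.get(pos, "_")
        write, move, state = rules[(state, symbol)]
        cells[pos] = write                 # write, then move the head one cell
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Rules for unary increment: scan right past the 1s, write one more 1, halt.
rules = {
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("1", "R", "halt"),
}
print(run_turing_machine("111", rules))  # 1111
```

      Despite its simplicity, this model can express any computation a modern machine can perform - which is exactly Turing's point.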
      Alan Turing unfortunately did not live long enough to see the computing revolution he helped start. In 1952 he was prosecuted because of his sexual orientation and subjected to court-ordered female hormone treatments (chemical castration), and he died in 1954, apparently by suicide. At the time in England, there was acute public anxiety about spies and homosexual entrapment by Soviet agents, which helps explain society's intolerance.

      Colossus (1943)

      Developed to break German encrypted communications, 10 of these machines were built in Britain.


      Movie showing the Colossus rebuild:
      http://www.youtube.com/watch?v=O8WXNPn1QKo&feature=related
      Closeup of the Lorenz machine - an electromechanical cipher machine used by the German high command, whose code Colossus was developed to crack. Superiority in computing was one factor in the Allies winning WWII.
      Lorenz

      Vannevar Bush and the Memex (1945)


      Bush later wrote an astounding article in 1945 which fully envisioned the World Wide Web 50 years before it actually appeared. Like Babbage's Analytical Engine, it was never built. He called his "web browser" a Memex. The design used electrical wires and microfilm files filling a desk to create the equivalent experience of web surfing - in particular, "hyperlinking" one page of data to another. There are two screens, one for visual display of information, and another - a touch-sensitive pad - for writing and drawing. Bush also imagined a user would have a small camera connected to the system, similar to webcams today.
      Link to the original Atlantic Monthly article - http://www.theatlantic.com/doc/194507/bush

       

      Computer Hardware


      Computer Circuits - a lot of switches

      Computers perform logical operations by opening and closing switches. The "1s" and "0s" computers are often said to use are actually the presence or absence of electricity in their circuits. Since there is no "2" inside a computer, the "1s" and "0s" are called by a special name - Boolean values - which map to "TRUE" and "FALSE" in logic.
      http://www.technologystudent.com/elec1/dig2.htm

      AND OR
      The logic of these circuits can be written out formally (as when designing a computer chip) using a "truth table". Note that the "1s" and "0s" are really TRUE or FALSE - the presence or absence of electric current, or whether a switch is open (0) or closed (1).
      AND truth table OR logic
      To make a circuit that does something useful requires LOTS of these gates - hundreds of millions in a modern PC.
      logic complex
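      The gates and truth tables above map directly onto code - the AND and OR of a programming language are the same operations the circuits perform. A small Python illustration:

```python
def AND(a, b):  # output is 1 only when both inputs are 1 (both switches closed)
    return a & b

def OR(a, b):   # output is 1 when either input is 1
    return a | b

def NOT(a):     # inverts the input
    return 1 - a

# Print the AND and OR truth tables, just as a chip designer would write them.
print("a b | AND OR")
for a in (0, 1):
    for b in (0, 1):
        print(f"{a} {b} |  {AND(a, b)}   {OR(a, b)}")

# Larger circuits are just gates wired together - for example, XOR:
def XOR(a, b):
    return AND(OR(a, b), NOT(AND(a, b)))
```

      A real chip builds adders, memory cells, and everything else out of exactly these kinds of combinations.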

      What computer circuits are made from

      vacuum tube
      Early electronic computers in the 1940s used vacuum tubes to create logic switches.
      vacuum tube logic
      The transistor replaced the vacuum tube in the 1950s. Transistors are smaller (they can be made VERY small - see the rightmost dot in the left image), don't burn out over time, and use silicon to make the switches. Many 1950s computers used hand-wired transistor boards, like the one shown on the right.
      transistor versus vacuum tube transistor circuits
      computer logic realized with vacuum tubes, 1940s
      transistor clock transistor pcb
      Logic circuits made with large transistors (a digital clock) and part of a computer circuit board. Today's computers still require a number of large discrete transistors to interconnect the more complex computer chips (see below).
      http://blog.makezine.com/archive/2008/06/transistor_clock_kit_uses.html

      logic chip
      Logic circuits physically realized on a silicon computer chip. The small squares are individual transistors.

      Computer Hardware Miniaturization 1945-2010

      Modern electronic computer technology uses logic circuits created on silicon wafers, or "chips", which don't have to be hand-wired. This is accomplished by "printing" the transistors on a single chip of silicon. The process is very similar to traditional stone lithography used in graphic design for centuries.
      first integrated circuit Intel 4004
      Images of the first integrated circuit printed on silicon (circa 1960) and the Intel 4004, the first full computer on a chip (1971)
      Pentium 4
      Image of the Pentium 4 (2000). Each small square in the processor is roughly as complex as the entire 4004 processor. See "Microprocessor hall of fame" for more examples: http://www.tayloredge.com/museum/processor/processorhistory.html

      How Integrated circuits are made

      The circuits of a chip are too small to create using lasers, robots, or other devices. Instead, they are made by a process similar to classic lithography, used by book printers and graphic designers to print images.
      photolithography photolithography
      silicon ingot and wafers wafer photoresist wafer cutting chips
      Silicon ingots (single crystals of pure silicon), showing cut wafers; then a wafer during the photoresist process; finally, a wafer being cut into individual silicon chips. Wires ("pins") are then attached, and the whole chip is encased in plastic to protect it.
      chip board circuit board
      Silicon chips encased in plastic and soldered onto a circuit board. Right, a closeup of the pins connecting the chip to the board. The parallel lines on the green board are wires interconnecting the chips. The other objects are individual transistors, which must be large since they handle lots of power.
      Videos on Computer Chip manufacture
      Good technical video showing photolithography
      http://www.youtube.com/watch?v=stdnSB0A3S8&feature=related
      AMD
      http://www.youtube.com/watch?v=-GQmtITMdas&feature=related

      All computers today use a von Neumann architecture

      von Neumann architecture John von Neumann
      Current computers use a "von Neumann" architecture, named after John von Neumann, who proposed the design in 1945
      • Memory - stores both the program's instructions and the data it operates on
      • Control Unit - fetches the commands of the computer program from memory, and executes them one at a time
      • Arithmetic Logic Unit - performs both math and logic calculations
      • Accumulator - where results of a computation are stored
      • Input - a device that allows information from outside the computer to be sent into the Logic Unit (e.g. data from a webcam or a keyboard)
      • Output - a device that allows the results to be sent outside the computer (e.g. to a computer monitor or speaker)
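      The essential point of the design - the program lives in the same memory as the data, and the control unit fetches and executes it one instruction at a time - can be sketched as a toy fetch-execute loop. A hypothetical Python illustration (this tiny instruction set is invented for the example, not a real machine):

```python
def run(memory):
    """Toy von Neumann machine: program and data share one memory.

    Instructions: ("LOAD", addr), ("ADD", addr), ("STORE", addr), ("HALT",)
    """
    acc = 0                        # accumulator: holds the current result
    pc = 0                         # program counter: control unit state
    while True:
        op, *args = memory[pc]     # fetch the next instruction from memory
        pc += 1
        if op == "LOAD":
            acc = memory[args[0]]
        elif op == "ADD":
            acc += memory[args[0]]            # the arithmetic logic unit
        elif op == "STORE":
            memory[args[0]] = acc
        elif op == "HALT":
            return memory

# Program in cells 0-3, data in cells 4-6: compute memory[6] = memory[4] + memory[5].
memory = {0: ("LOAD", 4), 1: ("ADD", 5), 2: ("STORE", 6), 3: ("HALT",), 4: 2, 5: 3, 6: 0}
print(run(memory)[6])  # 5
```

      Because instructions sit in ordinary memory cells, a program could in principle modify itself - one of the ideas that made the stored-program design so powerful.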

      Hardware Versus Software

      Under the von Neuman architecture:
      • Hardware is the actual electrical circuits used to create a computer
      • Software is a list of the order in which switches open and close during computation.
      • Computer programs are files holding these instructions (e.g. with an .exe file extension under Windows)

      Types of Computers - Mainframes

      supercomputer
      Supercomputer (array of smaller computers)
      IBM Mainframe 2008mainframe installed
      Mainframes (IBM)

      Internet Servers

      Internet server Internet server
      Internet server computers with 9 or 15 (count them) hard drives. There is no monitor or keyboard, since these computers are always accessed via the Internet.
      Internet Server Rackmountdata center
      Internet servers rack-mounted in a data center. Such computers can handle thousands of users per hour. Companies like Google have 25,000 or more of these computers, the equivalent of several mainframes.

      Workstations and "Gamer" Computers

      Gamer PC HP science workstation
      Workstation - Scientific and "Gamer" PCs are about the same in power and configuration. Prices range from the low thousands to $20,000 or more. These computers typically excel at 3D graphics and fast rendering.

      Consumer PCs

      Standard Dell PC
      Standard Consumer PCs are similar to workstations, but generally have slower hardware and cheaper video cards. Prices range from a few hundred to a few thousand dollars.
      consumer PC
      "Low end" consumer PC suitable for word processing and web-surfing (and not much else). They often cannot play 3D games.
      Xbox

      The XBox - a specialized low-end PC coupled with high-performance graphics.
      TiVo
      TiVo, a dedicated low-end, Linux-based PC optimized for video recording and display

      Laptop PCs

      Notebook PC ibook
      Notebook PCs have slower hardware and limited graphics to conserve battery power. They make up for it in portability. Left, a Windows portable; right, the Apple iBook.

      Netbooks


      Netbook
      The newest type of portable is the "Netbook" - a very small notebook with very limited onboard storage and no 3D graphics. Netbook users typically use their portables for net surfing and little else. Due to their low power draw, battery life is generally longer than a notebook's.

      eBooks

      eBook readers are slow portable computers optimized for the display of text. Frequently, they have a special wireless download system (NOT Wi-Fi) tying them to a specific vendor, in a manner similar to iTunes and Apple hardware. Compared to other portables they have very long battery life (about a week), since they do not display color images or movies. Popular with people who travel extensively.
      kindle sony ebook
      Left to right: the Amazon Kindle, Sony eBook

      PDAs (Personal Digital Assistants)

      Palm 5000 Apple Newton
      Early PDAs (Personal Digital Assistants) from the 1990s - the Palm 5000 and the Apple Newton
      Dell PDA Iphone Linux PDA
      Modern PDAs (Personal Digital Assistants): a Pocket PC (Dell), Apple's iPhone, and a Linux-based PDA
      portable playstation
      Portable game systems (a PDA with higher-performance graphics)

      Smartphones

      cellphone smartphone
      Smartphones are cellphones with additional features installed, like Internet surfing. Unlike a PDA, their primary function remains phone calls, and their screens are very small.

      Consumer Media Players

      iPodPVR
      Smaller computers - iPod, and portable DVD player

      Embedded computers

      single-board computer
      Single-board computer, similar to the one in your car engine. These computers work behind the scenes, and typically do not require a monitor or keyboard.

      SPOT computers

      SPOT computer watch
      SPOT computers are very small systems created by Microsoft to relay data including the weather and stock quotes. Developed around 2002, they have not been successful competing with smartphones and PDAs, and Microsoft plans to "shut them off" in 2012.

      Single Microprocessor Computers

      Single-chip computers ("Microprocessors") - a kind of very simple embedded computer found in microwaves, video cameras and digital cameras, parking meters, and even cooking knives. Despite their tiny size, they are complete "computers on a chip".

      Computer History - Mainframe Era (1944-1978)

      ENIAC (1944)

      ENIAC, short for Electronic Numerical Integrator And Computer, was the first large-scale electronic digital computer capable of being reprogrammed to solve a full range of computing problems. Designed and built by John Mauchly and J. Presper Eckert at the University of Pennsylvania to calculate artillery firing tables, it stands as the first modern digital computer. Interestingly, all of its original programmers were women: Kay McNulty, Betty Jennings, Betty Snyder, Marlyn Wescoff, Fran Bilas, and Ruth Lichterman.
      Programmers Betty Jean Jennings (left) and Fran Bilas (right) operate the ENIAC's main control panel at the Moore School of Electrical Engineering.
      Programming ENIAC
      Internal wiring of ENIAC
      More on the ENIAC (Wikipedia)
      http://en.wikipedia.org/wiki/ENIAC

      IBM SSEC (1948)

      This image shows the gigantic size of early electronic digital computers. Often, their design seems a little like a temple, with the terminal as an altar where the "high priests" - the programmers - did their work.

      The Artificial Intelligence dream starts early!

      The cover of Time magazine showing Tom Watson, head of IBM in the 1950s

      Univac (1951)

      The UNIVAC I was the world's first commercial computer, delivered to the U.S. Census Bureau in 1951.
      Grace Hopper with fellow programmers, early 1950s
       

      The SAGE Computer Network (about 1954)

      IBM's SAGE (Semi-Automatic Ground Environment) was a large semi-automated air defense system created in the early 1950s for real-time analysis of radar data.
      Each machine weighed 300 tons and occupied one floor of a concrete blockhouse. It had dual processors: one online, the other used for training, maintenance, and hot backup. The system cost approximately $10 billion in 1954 dollars. At its peak, the project employed 20% of the world's programmers. Users interacted with the system via a "light pen" - they drew directly onto the computer screen.
      http://www.youtube.com/watch?v=06drBN8nlWg&feature=channel

      1950s Mainframes

      RAMAC - the first computer with a hard drive

      CDC 6600 (1964)

      Control Data Corporation's 6600 machine was designed by noted computer architect Seymour Cray, who later created the Cray supercomputers of the 1980s and 1990s. It was the fastest computer in the world for much of the 1960s.
      The DEC PDP-1, shown at right, ran the first real-time video game, Spacewar!, at MIT in the early 1960s, and (linked to an electric typewriter for output) the first word-processing program.

      SABRE Reservation system (1960)

      An outgrowth of the SAGE military computer system for tracking aircraft, SABRE was the first commercial computer-based airline reservation system when it came online in 1962. A version of this system still handles airline reservations today.


      The IBM 360 (1960s)

      Many mainframe models were developed by IBM in the 1950s and 1960s, but the IBM 360 is particularly important, since it was the first mainframe to enjoy widespread use in business and government. Large numbers of men and women sought careers in computers and data processing during this era.

      An IBM 360 mainframe in use, compared to a Hollerith calculator on the right. Images from this era show many women, for two reasons. First, new business environments often don't have prebuilt gender rules, so it wasn't obvious that a woman couldn't work with computers, or even program them. Second, entering data and managing software was seen as inferior to creating hardware, an exclusively male profession at the time. Today, the relative values of hardware and software are reversed.

      Computing in the 1960s

      Some images of Univac computers from the 1960s. Note the use of female models. This is not just advertising - computer programming and data processing became a common career path for young women in the 1960s. As with the telegraph 100 years earlier, they encountered less prejudice in "high tech" industries.
      PDP ad

      Timesharing and "Dumb Terminals"

      By the 1960s a model for computing had emerged featuring a central computer with a large set of "dumb" terminals. This allowed many people to use the mainframe at the same time, and caused an enormous expansion of the computing community. The scenes resemble a data center today, but remember that there is only one computer here, connected to a large number of screens. Single-person (or "personal") computers did not become widespread until the 1980s.
      The left image shows IBM computers being used to manage airline reservations in a timesharing environment. The larger machines on the left are microfilm readers, which substituted for computers for many years. The other images show examples of "dumb" terminals, along with a typical text-based computer interface of the era.
      Timeshare computers at a college in the 1960s, with an older punched-card interface.

      Apollo Guidance Computer (1968)

      This computer was used to guide the Apollo lander to the surface of the moon. It was also used in other NASA spacecraft of the 1970s, including Skylab. Moon "hoaxers" sometimes claim that moon landings were impossible due to lack of computers. In reality, powerful computers were available by the mid-1960s - they were just too expensive for ordinary people to own.
      Transcript of the first lunar landing (including reboots of the guidance computer)



      Computer History - Workstation Era (1968-1985)

      The PDS-I at NASA-Ames (1972)

      PDS-1

      Xerox Alto (1976)

      The Xerox Alto, shown running Smalltalk, and views of the Alto desktop and screen

      Xerox Star (1981)

      Developed during the late 1970s by Xerox, the Star incorporated features that today define personal computers: a bitmapped display, mouse, windows, local hard disk, network connectivity via Ethernet, and laser printing. The Star also refined the "desktop metaphor" - showing files and folders as icons, dialog boxes, and a "point and click" style of interaction. It was the first commercial object-oriented computer interface. Apple essentially stole key Star concepts to produce the Apple Lisa, and later, the Apple Macintosh. Today, virtually all personal computers have an interface directly descended from the Star.
      Xerox Star
      Xerox star fonts
      Xerox Star Interface 2
      Xerox Star interface with person
      Xerox star with printing
      Xerox star icons Xerox star icons 2
      The development of an icon-driven user interface for the Xerox Star
      http://www.digibarn.com/collections/software/xerox-star/xerox-world-according-to-norm.html


      Smalltalk, Object-Oriented Programming, and Adele Goldberg
      Adele Goldberg
      Smalltalk screen


      Computer History - Microcomputer Era (1978-1994)

      Visions of the Personal Computer - Alan Kay's Dynabook (1969)
      Alan Kay's Dynabook concepts from the 1970s and early 1980s (on the left), versus the 2010 Apple iPad (on the right). Note that Kay predicted the form factor of the portable notebook computer, as well as tablet computers, over 30 years ago. The Dynabook on the far left is functional, and uses a 1970s flatscreen display. However, since the electronics were too large for the model, it was connected to a much larger computer. Today, the iPad replicates this behavior by connecting to a collection of "big computers" - in other words, the Internet.

      BYTE Magazine 1976-1982

      Images showing a vision of small computers used as word processors, computer data analogized to documents, computers used in the stock market, a wrist-computer, and computers applied to music and the arts.

      Computers as tools of liberation and the Baby Boomer "counterculture"

      Computer Lib cover
      Microcomputers were not promoted simply as "the next great technology"; rather, they were seen through the social and political lens of the 1960s and 1970s. The oppression-liberation thesis familiar from the civil rights, Black Power, and women's movements was applied to computers. The personal computer was developed and marketed as a way for the people to "fight the system" and gain individual freedom. This attitude of "think different" and "question authority" is still part of Apple's marketing strategy today.
      Links to "Computer Lib" and "Dream Machines"
      http://www.digibarn.com/collections/books/computer-lib/index.html
      Page from an early issue of BYTE magazine
      More pages at Digibarn
      http://www.digibarn.com/collections/mags/byte-sept-oct-1975/one/index.html

      Altair 8800 (1975)

      This is the computer for which Microsoft wrote and sold its first programs. It had no mouse, keyboard, or printer, or even a monitor. Programs were entered by flipping switches on the front panel, and blinking lights showed the program's execution.

      Pirates of Silicon Valley

      The early pioneers of personal computing considered themselves rebels in the 1960s counterculture sense. Here we see Bill Gates in jail, and Steve Jobs shortly after he left the hippie commune (where he fathered and abandoned an illegitimate daughter).

      Apple I (1976)

      The Apple I was originally designed by Steve Wozniak, but his friend Steve Jobs had the idea of selling the computer. 50 units of the original Apple I were sold at a price of $666.66 (because Wozniak liked repeating digits and because they originally sold it to a local shop for $500 and added a one-third markup). To make a working computer, users still had to add a case, power supply, keyboard, and display. An optional board providing a cassette interface for data storage was later released for $75.

       

      Osborne portable computer (1981)

      An early entry into the world of business computing, the Osborne was the first computer that could be carried onboard and used on a plane.

      The IBM PC (1981)

      The IBM PC helped to convert the microcomputer from a hobbyist geekland to a serious business tool.

      The Apple Macintosh (1984)

      Created by Apple Computer under Steve Jobs, the Macintosh borrowed heavily from the Xerox Star but was affordable (about $2,500 in 1984 dollars) to the masses. The user interface of the original Mac is remarkably similar to that of current Macintosh computers, despite running hundreds of times slower.
      Steve Jobs as a young visionary in 1984, and an older one in 2009.

      The user interface of personal computing was complete by 1984

      Pictures of the original 1984 Mac, showing external floppy and keyboard.
      The Macintosh operating system, showing the windows, menus, control panels, calculator, and other features found on all GUI (Graphical User Interface) computers today.
      MacDraw (an ancestor of Adobe Illustrator) and Alice, a simple 3D animated game, on the original 1984 Mac.

      Screenshot of MacPaint, the ancestor of Adobe Photoshop and other bitmap-drawing programs.
      An early version of Microsoft Excel - note the similarity to the current Excel user interface.

      New is not always better... a Macintosh from 1986 BEATS a 3.0GHz dual-core AMD computer from 2007!

      For the functions that people use most often in Microsoft Office, the 1986-vintage Mac Plus beats the 2007 AMD Athlon 64 X2 4800+: 9 tests to 8! Out of the 17 tests, the antique Mac won 53% of the time, including a jaw-dropping 52-second whipping of the AMD from the time the power button is pushed to the time the desktop is up and usable.
      We also didn't want to overly embarrass the AMD by comparing the time it takes to install the OS vs. the old Mac. The Mac's average of about a minute is dwarfed by the approximately one hour install time of Windows XP Pro.
      http://hubpages.com/hub/_86_Mac_Plus_Vs_07_AMD_DualCore_You_Wont_Believe_Who_Wins