This article describes the historical development of computers. The generations of computers are listed and briefly described.
The development of computing dates back to the early Sumerian civilization (4000-1200 BC), when people started keeping records of transactions on clay tablets. Computing with a machine started around 3000 BC, when the Babylonians invented the abacus; the Chinese began using it around 1200 BC.
No significant development took place until the seventeenth century. In 1642-43 Blaise Pascal created a gear-driven adding machine named the Pascaline. It was the first mechanical adding machine. Later, in 1673-74, Leibniz followed with his own mechanical calculator, the “Stepped Reckoner”, which could also multiply. In 1801 Jacquard constructed a loom, the first machine programmed with punched cards. In 1822, Charles Babbage designed and began building his first Difference Engine, which is credited as the first mechanical computer. For this machine, Babbage is known as the “Father of the Computer”. In 1842-43, Ada Augusta Lovelace wrote what is considered the first computer program, for Babbage’s proposed Analytical Engine. A programming language (Ada) has been named after her. In 1847-49, Babbage designed a second version of the Difference Engine but could not complete it. The same machine was finally built in 1991 by the Science Museum in Kensington, England, and it worked!
In 1854 George Boole developed Boolean logic, a system for symbolic and logical reasoning that became the basis for computer design. In 1890, computations for the US Census were carried out by Herman Hollerith’s punched-card machine. Hollerith founded the Tabulating Machine Company, which eventually became IBM.
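To see why Boole’s logic became the basis for computer design, consider a small illustrative sketch (not from the article): a half adder, the elementary building block of binary arithmetic inside every processor, is nothing more than two Boolean operations, XOR and AND.

```python
def half_adder(a: bool, b: bool):
    """Add two one-bit values; return (sum_bit, carry_bit)."""
    sum_bit = a != b      # XOR: true when exactly one input is true
    carry = a and b       # AND: true when both inputs are true
    return sum_bit, carry

# Truth table: note that 1 + 1 gives sum 0 with carry 1 (binary 10)
for a in (False, True):
    for b in (False, True):
        print(int(a), "+", int(b), "->", half_adder(a, b))
```

Chaining half adders (with an OR to combine carries) yields full adders and, from there, arbitrary-width binary arithmetic, which is exactly the sense in which Boolean logic underlies hardware design.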
Before the 20th century, all these machines were mechanical. With the advent of vacuum tubes, the electronic era of computers began. In 1944, the Harvard Mark I, the first large-scale, automatic, general-purpose, electromechanical calculator, driven by a paper tape containing the instructions, was constructed. On the second version of this machine, Grace Hopper’s team found the first literal computer bug: a moth trapped in the contacts of a relay, which was taped into the logbook. The logbook is now in the National Museum of American History of the Smithsonian.
The first general-purpose electronic computer, ENIAC (Electronic Numerical Integrator And Computer), was completed in 1946. It was programmed by rewiring connections between its various components. After this, many machines such as EDVAC and the IBM 701 and 704 were built, and some were commercially successful. Later machines replaced vacuum tubes with transistors, greatly increasing reliability and performance.
In 1961, integrated circuits (ICs) became commercially available, and computers have used ICs instead of individual transistors and other discrete components ever since. The IBM System/360 and the DEC PDP-8 were the most popular machines of that era. The invention of the microprocessor further reduced the size and increased the performance of computers. The Altair 8800 was the first widely available microcomputer, for which Bill Gates co-wrote a BASIC interpreter.
1981 was the year of IBM, when it launched the PC. Since then, there has been significant development in microelectronics and microprocessors. Today, we run a 2 GHz processor on our desk with 256 MB of memory (640 KB was once considered more than sufficient) and more than 40 GB of storage.
Five Generations of Modern Computers
The development of computers has been divided into generations. Computers of each generation have certain common characteristics in terms of components used, computing power, reliability etc.
First Generation (1945-1956)
The computers of this generation were electromechanical or electronic, built from relays and vacuum tubes. Some of the important machines of this generation were Colossus, the Mark I, ENIAC and EDVAC. The machines were very slow (from 3-5 seconds per calculation up to a few thousand calculations per second) and inflexible (mostly built for a special purpose), and could perform only basic operations. The machines were huge (e.g., ENIAC: 18,000 ft³, 18,000 vacuum tubes) and consumed a lot of power (160 kW for ENIAC).
Second Generation Computers (1957-1963)
By 1948, the invention of the transistor had greatly changed the course of computer development. The transistor replaced the large, cumbersome vacuum tube in televisions, radios and computers. As a result, the size of electronic machinery has been shrinking ever since. The transistor was at work in computers by 1956. Coupled with early advances in magnetic-core memory, transistors led to second-generation computers that were smaller, faster, more reliable and more energy-efficient than their predecessors.
Throughout the early 1960s, there were a number of commercially successful second-generation computers used in business, universities, and government. These second-generation computers contained transistors in place of vacuum tubes. They also contained all the components we associate with the modern-day computer: printers, tape storage, disk storage, memory, operating systems, and stored programs. One important example was the IBM 1401. High-level languages such as COBOL (Common Business-Oriented Language) and FORTRAN (Formula Translation) came into common use during this time and are still in use today. These languages replaced cryptic binary machine code with words, sentences, and mathematical formulas, making it much easier to program a computer. New types of careers (programmer, analyst, and computer systems expert) and the entire software industry began with second-generation computers.
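The point of “formula translation” can be shown with a modern illustration (in Python rather than FORTRAN, purely for convenience): a programmer writes the quadratic formula almost exactly as it appears in mathematics, and the language translates it into machine code.

```python
import math

def quadratic_roots(a, b, c):
    """Return the two real roots of a*x**2 + b*x + c = 0."""
    d = math.sqrt(b * b - 4 * a * c)   # the discriminant, written as a formula
    return (-b + d) / (2 * a), (-b - d) / (2 * a)

print(quadratic_roots(1, -3, 2))  # roots of x^2 - 3x + 2 = 0, i.e. 2.0 and 1.0
```

Written in raw binary machine code, the same computation would be dozens of opaque numeric instructions, which is exactly the gap high-level languages closed.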
Third Generation Computers (1964-1971)
Though transistors were clearly an improvement over the vacuum tube, they still generated a great deal of heat, which could damage the computer’s sensitive internal parts. The integrated circuit (IC), invented in 1958, combined many components on a single small chip. As a result, computers became ever smaller as more components were squeezed onto each chip. Another third-generation development was the operating system, which allowed machines to run many different programs at once, with a central program monitoring and coordinating the computer’s memory and other resources.
Fourth Generation (1971-Present)
After the integrated circuit, the only place to go was down – in size, that is. Large Scale Integration (LSI) could fit hundreds of components onto one chip. By the 1980s, Very Large Scale Integration (VLSI) squeezed hundreds of thousands of components onto a chip, and Ultra Large Scale Integration (ULSI) increased that number into the millions. This miniaturization also increased computers’ power, efficiency and reliability. The Intel 4004 chip, developed in 1971, took the integrated circuit one step further by locating all the components of a computer (central processing unit, memory, and input and output controls) on a minuscule chip. Whereas previously the integrated circuit had had to be manufactured to fit a special purpose, now one microprocessor could be manufactured and then programmed to meet any number of demands. Soon everyday household items such as microwave ovens, television sets and automobiles with electronic fuel injection incorporated microprocessors.
Such condensed power allowed everyday people to harness a computer’s power. Computers were no longer developed exclusively for large businesses or government contracts. By the mid-1970s, computer manufacturers sought to bring computers to general consumers. These microcomputers came complete with user-friendly software packages that offered even non-technical users an array of applications, most popularly word processing and spreadsheet programs. Pioneers in this field were Commodore, Radio Shack and Apple Computer.
In 1981, IBM introduced its personal computer (PC) for use in the home, office and schools. The 1980s saw an expansion in computer use in all three arenas as clones of the IBM PC made the personal computer even more affordable. The number of personal computers in use more than doubled from 2 million in 1981 to 5.5 million in 1982. Ten years later, 65 million PCs were in use. Computers continued their trend toward a smaller size, working their way down from desktop to laptop computers (which could fit inside a briefcase) to palmtops (which could fit inside a pocket).
As computers became more widespread in the workplace, new ways to harness their potential developed. As smaller computers became more powerful, they could be linked together, or networked, to share software and information and to communicate with each other. Whereas a mainframe was a single powerful computer that shared its time among many terminals for many applications, networking allowed individual computers to communicate with one another. The Internet, a global web of computer circuitry, links computers worldwide into a single network of information.
Fifth Generation (Present and Beyond)
Defining the fifth generation of computers is somewhat difficult because the field is in its infancy. With artificial intelligence, these computers will be able to hold conversations with their human operators, use visual input, and learn from their own experiences. Using recent engineering advances, computers are able to accept spoken-word instructions (voice recognition) and imitate human reasoning. The ability to translate a foreign language is also moderately possible with fifth-generation computers. This feat seemed a simple objective at first, but appeared much more difficult when programmers realized that human understanding relies as much on context and meaning as it does on the simple translation of words.
Many advances in the science of computer design and technology are coming together to enable the creation of fifth-generation computers. One such advance is parallel processing, which replaces von Neumann’s single central-processing-unit design with a system that harnesses the power of many CPUs working as one. Another is superconductor technology, which allows the flow of electricity with little or no resistance, greatly improving the speed of information flow. Computers today have some attributes of fifth-generation computers. For example, expert systems assist doctors in making diagnoses by applying the problem-solving steps a doctor might use in assessing a patient’s needs. It will take several more years of development before expert systems are in widespread use.
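The essence of parallel processing described above is splitting one task across several workers and combining their partial results. A minimal sketch in Python (illustrative only, using threads for simplicity; with separate processes on a multi-CPU machine the same divide-and-combine pattern lets many CPUs work as one):

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_sum(data, workers=4):
    """Sum a list by dividing it among several workers."""
    chunk = (len(data) + workers - 1) // workers
    parts = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(sum, parts)   # each worker sums one slice
    return sum(partials)                  # combine the partial results

print(parallel_sum(list(range(1, 101))))  # 5050, same answer as a single CPU
```

The result is identical to a sequential sum; the gain of real parallel hardware is that the per-slice work happens simultaneously rather than one slice at a time.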