Computer generation refers to the major developments in electronic data processing. Each generation of computers is characterized by a major technological development that fundamentally changed the way computers operate, resulting in increasingly smaller, cheaper, more powerful, and more efficient and reliable devices.
First Generation: Vacuum Tube (1940-1956)
The major characteristic of first-generation computers was the use of vacuum tubes for internal computing operations; these devices were considerably faster than electromechanical devices for processing data. Calculations could be performed in milliseconds.
The vacuum tube, invented by John Fleming in 1904, was a device designed to control the flow of electrical current. It is composed of metal plates and wires sealed in a glass enclosure. It performs special tasks such as receiving radio signals, amplifying sound, and switching electrical signals ON and OFF.
The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were expensive to operate and, in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions.
First-generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts.
The UNIVAC and ENIAC computers are examples of first-generation computing devices.
Second Generation: Transistor (1956-1963)
Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 by three American physicists at Bell Telephone Laboratories: John Bardeen, Walter H. Brattain, and William B. Shockley.
Transistors were initially manufactured using a semiconductor material called germanium. However, germanium had the drawback that it broke down above a certain temperature, so it was eventually replaced by silicon.
Second-generation computers still relied on punched cards for input and printouts for output. They moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in memory, which moved from magnetic drum to magnetic core technology.
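To make the contrast between machine language and assembly language concrete, here is a minimal sketch in Python of what an assembler does: it translates symbolic mnemonics into the raw binary words a machine executes. The four-instruction machine, its opcodes, and the sample program are all hypothetical, invented only for illustration.

```python
# Hypothetical 8-bit machine: 4-bit opcode in the high nibble,
# 4-bit operand in the low nibble. Invented only for this sketch.
OPCODES = {
    "LOAD":  0b0001,  # load a value into the accumulator
    "ADD":   0b0010,  # add a value to the accumulator
    "STORE": 0b0011,  # store the accumulator at a memory address
    "HALT":  0b0000,  # stop the machine
}

def assemble(source):
    """Translate symbolic assembly source into binary machine words."""
    program = []
    for line in source.strip().splitlines():
        mnemonic, *operand = line.split()
        word = (OPCODES[mnemonic] << 4) | (int(operand[0]) if operand else 0)
        program.append(word)
    return program

source = """
LOAD 7
ADD 5
STORE 9
HALT
"""

for word in assemble(source):
    print(f"{word:08b}")  # the 'cryptic' binary that was once written by hand
```

The symbolic form reads as words; the printed output is the equivalent binary machine language that first-generation programmers had to write directly.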
The first computers of this generation were developed for the atomic energy industry.
Third Generation: Integrated Circuit (1964-1970)
The development of the integrated circuit was the hallmark of the third generation of computers. The concept behind the integrated circuit is simple: an entire electrical circuit, with numerous transistors, wires, and other electrical devices, is built into a single square of silicon, a semiconductor. This drastically increased the speed and efficiency of computers.
Instead of punched cards and printouts, users interacted with third-generation computers through keyboards and monitors interfaced with an operating system, which allowed the device to run many different applications at one time, with a central program that monitored the memory. For the first time, computers became accessible to a mass audience because they were smaller and cheaper than their predecessors.
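The idea of one central program sharing the machine among many applications can be sketched in a few lines. The following Python sketch is a toy round-robin scheduler; the job names and time slice are hypothetical, chosen only to illustrate how an operating system can interleave several programs on one processor.

```python
from collections import deque

# Hypothetical jobs, each with an amount of remaining work (arbitrary units).
jobs = deque([("editor", 3), ("compiler", 5), ("print spooler", 2)])

TIME_SLICE = 2  # how much work a job may do before the OS switches to the next

while jobs:
    name, remaining = jobs.popleft()
    ran = min(TIME_SLICE, remaining)
    print(f"running {name} for {ran} unit(s)")
    remaining -= ran
    if remaining > 0:
        jobs.append((name, remaining))  # not finished: go to the back of the queue
    else:
        print(f"{name} finished")
```

Each job runs briefly and then yields to the next, so all of them appear to run at once even though the processor only ever executes one at a time.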
Fourth Generation: Microprocessor (1971-Present)
The microprocessor brought in the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip using LSI (Large-Scale Integration). As a result, thousands of transistors, diodes, and resistors were packed into a silicon chip less than 0.2 inch (5 mm) square. During the early 1980s, very-large-scale integration (VLSI), which holds hundreds of thousands of electronic components, increased the circuit density of the microprocessor. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer, from the central processing unit and memory to input/output controls, on a single chip.
In 1981, IBM (International Business Machines) introduced the first computer for the home user, and in 1984, Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use them.
As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth-generation computers also saw the development of GUIs, the mouse, and handheld devices. Other innovations of this generation include multiprocessing, multiprogramming, time-sharing, higher operating speeds, and virtual storage.
Fifth Generation: Present and Beyond (Artificial Intelligence)
Fifth-generation computing devices, based on artificial intelligence, are still in development, though some applications, such as voice recognition, are already in use today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in the years to come. The goal of fifth-generation computing is to develop devices that respond to natural-language input and are capable of learning and self-organization.
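Parallel processing, one of the techniques mentioned above, simply means splitting a task across several processors that work at the same time. The short Python sketch below illustrates the idea using the standard concurrent.futures module; the workload (summing ranges of numbers) is hypothetical, chosen only because it divides cleanly into independent pieces.

```python
from concurrent.futures import ProcessPoolExecutor

def partial_sum(bounds):
    """Sum one independent slice of the overall range."""
    lo, hi = bounds
    return sum(range(lo, hi))

if __name__ == "__main__":
    # Split the work of summing 0..999,999 into four independent chunks.
    chunks = [(0, 250_000), (250_000, 500_000),
              (500_000, 750_000), (750_000, 1_000_000)]
    # Each chunk is summed in a separate worker process, in parallel.
    with ProcessPoolExecutor() as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total)  # identical to the result of a single sequential sum
```

Because the chunks are independent, four processors can each take one and finish in roughly a quarter of the time a single processor would need.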