Computer Generations
The history of computer development is often described in terms of the different generations of computing devices. A generation refers to the state of improvement in the product development process. The term is also used to describe successive advances in computer technology. With each new generation, the circuitry has become smaller and more advanced than in the previous generation. As a result of this miniaturization, the speed, power, and memory of computers have increased proportionally. New discoveries are constantly being made that affect the way we live, work and play.
Each generation of computers is characterized by a major technological development that fundamentally changed the way computers operate, resulting in increasingly smaller, cheaper, more powerful, and more efficient and reliable devices. Read about each generation and the developments that led to the devices we use today.
First Generation - 1940-1956: Vacuum Tubes
The first computers used vacuum tubes for circuitry and magnetic
drums for memory, and were often enormous, taking up entire rooms. A
magnetic drum, also referred to simply as a drum, is a metal cylinder coated with magnetic iron-oxide material on which data and programs can be stored. Magnetic drums were once used as a primary storage device but have since been used as auxiliary storage devices.
Second Generation - 1956-1963: Transistors
Transistors replaced vacuum tubes and ushered in the second generation of computers. A transistor is a device composed of semiconductor material that amplifies a signal or opens or closes a circuit. Invented in 1947 at Bell Labs, transistors have become the key ingredient of all digital circuits, including computers. Today's microprocessors contain tens of millions of microscopic transistors.
Prior to the invention of transistors, digital circuits were composed
of vacuum tubes, which had many disadvantages. They were much larger,
required more energy, dissipated more heat, and were more prone to
failures. It's safe to say that without the invention of transistors,
computing as we know it today would not be possible.
The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that could damage the computer, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.
Second-generation computers moved from cryptic binary machine
language to symbolic, or assembly, languages, which allowed programmers
to specify instructions in words. High-level programming languages were
also being developed at this time, such as early versions of COBOL and
FORTRAN. These were also the first computers that stored their
instructions in their memory, which moved from a magnetic drum to
magnetic core technology.
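As a rough illustration (not period-accurate code for any real second-generation machine), the comments in the sketch below spell out a simple addition the way a programmer might write it in symbolic assembly, one instruction per machine step, while a high-level language expresses the same computation in a single statement.

```python
# Illustrative only: the assembly mnemonics below are generic, not the
# instruction set of any particular second-generation computer.
#
#   LOAD  A        ; fetch the value stored at A into a register
#   ADD   B        ; add the value stored at B
#   STORE C        ; write the result back to memory at C

a, b = 3, 4
c = a + b          # one high-level statement replaces several machine steps
print(c)           # -> 7
```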
Third Generation - 1964-1971: Integrated Circuits
The development of the integrated circuit was the hallmark of the
third generation of computers. Transistors were miniaturized and placed
on silicon chips, called semiconductors, which drastically increased the
speed and efficiency of computers.
Silicon is a nonmetallic chemical element in the carbon family of elements. Silicon (atomic symbol "Si") is the second most abundant element in the earth's crust, surpassed only by oxygen. Silicon does not occur uncombined in nature. Sand and almost all rocks contain silicon combined with oxygen, forming silica. When silicon combines with other elements, such as iron, aluminum or potassium, a silicate is formed. Compounds of silicon also occur in the atmosphere, natural waters, many plants and in the bodies of some animals.
Silicon is the basic material used to make computer chips,
transistors, silicon diodes and other electronic circuits and switching
devices because its atomic structure makes the element an ideal
semiconductor. Silicon is commonly doped, or mixed, with other elements, such as boron, phosphorus and arsenic, to alter its conductive
properties.
A chip is a small piece of semiconducting material (usually silicon) on which an integrated circuit is embedded. A typical chip is less than ¼ square inch and can contain millions of electronic components (transistors). Computers consist of many chips placed on
electronic boards called printed circuit boards. There are different
types of chips. For example, CPU chips (also called microprocessors)
contain an entire processing unit, whereas memory chips contain blank
memory.
A semiconductor is a material that is neither a good conductor of electricity (like copper) nor a good insulator (like rubber).
Fourth Generation - 1971-Present: Microprocessors
The microprocessor brought about the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. A microprocessor is a silicon chip that contains a CPU. In the world of personal computers, the terms microprocessor and CPU are used interchangeably. At
the heart of all personal computers and most workstations sits a
microprocessor. Microprocessors also control the logic of almost all
digital devices, from clock radios to fuel-injection systems for
automobiles.
Three basic characteristics differentiate microprocessors:
- Instruction Set: The set of instructions that the microprocessor can execute.
- Bandwidth: The number of bits processed in a single instruction.
- Clock Speed: Given in megahertz (MHz), the clock speed determines how many instructions per second the processor can execute.
For the latter two characteristics, the higher the value, the more powerful the CPU. For example, a 32-bit microprocessor that runs at 50 MHz is more powerful than a 16-bit microprocessor that runs at 25 MHz.
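That comparison can be made concrete with a rough, back-of-the-envelope calculation. The sketch below is illustrative only: it assumes one instruction per clock cycle, which real processors do not necessarily achieve, and simply multiplies bandwidth by clock speed to estimate raw throughput.

```python
# Illustrative estimate: bits processed per second =
#   bits per instruction (bandwidth) x instructions per second (clock speed),
# assuming one instruction per clock cycle.

def throughput_bits_per_second(bandwidth_bits, clock_mhz):
    return bandwidth_bits * clock_mhz * 1_000_000

old = throughput_bits_per_second(16, 25)   # 16-bit CPU at 25 MHz
new = throughput_bits_per_second(32, 50)   # 32-bit CPU at 50 MHz
print(new / old)                           # -> 4.0: four times the raw throughput
```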
What in the first generation filled an entire room could now fit in
the palm of the hand. The Intel 4004 chip, developed in 1971, located all
the components of the computer - from the central processing unit and
memory to input/output controls - on a single chip.
CPU is an abbreviation of central processing unit, pronounced as separate letters. The CPU is the brains of the computer. Sometimes referred to simply as the processor or central processor, the CPU is where most calculations take place. In terms of computing power, the CPU is the most important element of a computer system.
On large machines, CPUs require one or more printed circuit boards.
On personal computers and small workstations, the CPU is housed in a
single chip called a microprocessor.
Two typical components of a CPU are:
- The arithmetic logic unit (ALU), which performs arithmetic and logical operations.
- The control unit, which extracts instructions from memory and decodes and executes them, calling on the ALU when necessary.
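The division of labor between these two components can be illustrated with a toy example. The sketch below uses a made-up instruction set (LOAD, ADD, SUB, HALT), not that of any real CPU: a control-unit loop fetches and decodes instructions from a program in memory and calls on a tiny ALU whenever arithmetic is needed.

```python
# Illustrative toy CPU: a hypothetical instruction set, not a real processor.

def alu(op, a, b):
    """Arithmetic logic unit: performs arithmetic and logical operations."""
    if op == "ADD":
        return a + b
    if op == "SUB":
        return a - b
    if op == "AND":
        return a & b
    raise ValueError("unknown ALU operation")

def run(program):
    """Control unit: fetch, decode, and execute instructions until HALT."""
    acc, pc = 0, 0                       # accumulator register, program counter
    while True:
        op, operand = program[pc]        # fetch the next instruction
        pc += 1
        if op == "HALT":                 # decode ...
            return acc
        if op == "LOAD":
            acc = operand                # ... and execute,
        else:
            acc = alu(op, acc, operand)  # calling on the ALU when necessary

print(run([("LOAD", 7), ("ADD", 5), ("SUB", 2), ("HALT", None)]))  # -> 10
```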
Fifth Generation - Present and Beyond: Artificial Intelligence
Fifth generation computing devices, based on artificial intelligence,
are still in development, though there are some applications, such as
voice recognition, that are being used today.
Artificial Intelligence is the branch of computer science concerned
with making computers behave like humans. The term was coined in 1956 by John McCarthy, then at Dartmouth College. Artificial
intelligence includes:
- Games Playing: programming computers to play games such as chess and checkers
- Expert Systems: programming computers to make
decisions in real-life situations (for example, some expert systems help
doctors diagnose diseases based on symptoms)
- Natural Language: programming computers to understand natural human languages
- Neural Networks: systems that simulate intelligence by attempting to reproduce the types of physical connections that occur in animal brains (a minimal sketch of a single artificial neuron follows this list)
- Robotics: programming computers to see and hear and react to other sensory stimuli
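To make the neural-network entry above a little more concrete, the sketch below models a single artificial neuron (an illustrative toy, not a description of any production system): inputs are combined through weighted connections and passed through a simple threshold, loosely mimicking the physical connections in animal brains.

```python
# Illustrative single neuron: weighted sum of inputs plus a threshold.

def neuron(inputs, weights, bias):
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if total > 0 else 0

# With these (hand-picked) weights the neuron behaves like a logical AND.
weights, bias = [1.0, 1.0], -1.5
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", neuron([a, b], weights, bias))
```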
Currently, no computers exhibit full artificial intelligence (that
is, are able to simulate human behavior). The greatest advances have
occurred in the field of games playing. The best computer chess programs
are now capable of beating humans. In May 1997, an IBM supercomputer called Deep Blue defeated world chess champion Garry Kasparov in a chess match.
In the area of robotics, computers are now widely used in assembly
plants, but they are capable only of very limited tasks. Robots have
great difficulty identifying objects based on appearance or feel, and
they still move and handle objects clumsily.
Voice Recognition
Voice recognition is the field of computer science that deals with designing computer systems that can recognize spoken words. Note that voice recognition
implies only that the computer can take dictation, not that it
understands what is being said. Comprehending human languages falls
under a different field of computer science called natural language
processing. A number of voice recognition systems are available on the
market. The most powerful can recognize thousands of words. However,
they generally require an extended training session during which the
computer system becomes accustomed to a particular voice and accent. Such
systems are said to be speaker dependent.
Many systems also require that the speaker speak slowly and
distinctly and separate each word with a short pause. These systems are
called discrete speech systems. Recently, great strides have been made
in continuous speech systems -- voice recognition systems that allow you
to speak naturally. There are now several continuous-speech systems
available for personal computers.
Because of their limitations and high cost, voice recognition systems
have traditionally been used only in a few specialized situations. For
example, such systems are useful in instances when the user is unable to
use a keyboard to enter data because his or her hands are occupied or
disabled. Instead of typing commands, the user can simply speak into a
headset. Increasingly, however, as the cost decreases and performance
improves, speech recognition systems are entering the mainstream and are
being used as an alternative to keyboards.
The use of parallel processing and superconductors is helping to make
artificial intelligence a reality. Parallel processing is the
simultaneous use of more than one CPU to execute a program. Ideally,
parallel processing makes a program run faster because there are more
engines (CPUs) running it. In practice, it is often difficult to divide a
program in such a way that separate CPUs can execute different portions
without interfering with each other.
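A minimal sketch of the idea, using Python's standard multiprocessing module: a pool of worker processes splits a simple, independent workload so that separate CPUs can work on different portions at the same time. The function and numbers here are made up for the illustration; real programs are rarely this easy to divide.

```python
# Illustrative parallel processing: four worker processes share one workload.

from multiprocessing import Pool

def square(n):
    return n * n

if __name__ == "__main__":
    numbers = range(1_000_000)
    with Pool(processes=4) as pool:          # one worker per CPU, ideally
        results = pool.map(square, numbers)  # the work is divided automatically
    print(sum(results))
```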
Most computers have just one CPU, but some models have several. There
are even computers with thousands of CPUs. With single-CPU computers,
it is possible to perform parallel processing by connecting the
computers in a network. However, this type of parallel processing
requires very sophisticated software called distributed processing
software.
Note that parallel processing differs from multitasking, in which a single CPU executes several programs at once.
Parallel processing is also called parallel computing.
Quantum computation, along with molecular and nanotechnology, will radically change the face of computers in years to come. First proposed in the 1970s, quantum computing relies on quantum physics, taking advantage of certain properties of atoms or nuclei that allow them to work together as quantum bits, or qubits, which serve as the computer's processor and memory. By interacting with each other while isolated from the external environment, qubits can perform certain calculations exponentially faster than conventional computers.
Qubits do not rely on the traditional binary nature of computing. Traditional computers encode information into bits using binary numbers, either a 0 or a 1, and can only do calculations on one set of numbers at once. Quantum computers instead encode information as a series of quantum-mechanical states, such as spin directions of electrons or polarization orientations of a photon. These states might represent a 1 or a 0, a combination of the two, a value somewhere between 1 and 0, or a superposition of many different numbers at once. A quantum computer can do an arbitrary reversible classical computation on all the numbers simultaneously, which a binary system cannot do, and it also has some ability to produce interference between the various numbers. By doing a computation on many different numbers at once, then interfering the results to get a single answer, a quantum computer has the potential to be much more powerful than a classical computer of the same size. In using only a single processing unit, a quantum computer can naturally perform myriad operations in parallel.
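The sketch below is a classical simulation written purely for illustration, not a program for a real quantum computer, and the function names are ours rather than part of any quantum programming standard. It shows why qubits are described this way: an n-qubit register is represented by 2**n complex amplitudes, and applying a Hadamard gate to every qubit places the register in an equal superposition of all 2**n basis states at once.

```python
import numpy as np

# Illustrative classical simulation of a small qubit register.

def apply_hadamard(state, target, n):
    """Apply a Hadamard gate to one qubit of an n-qubit state vector."""
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    op = np.eye(1)
    for q in range(n):
        op = np.kron(op, H if q == target else np.eye(2))
    return op @ state

n = 3
state = np.zeros(2**n, dtype=complex)   # 2**n amplitudes describe n qubits
state[0] = 1.0                          # start in the basis state |000>

for q in range(n):                      # put every qubit into superposition
    state = apply_hadamard(state, q, n)

print(np.round(np.abs(state)**2, 3))    # eight equal probabilities of 0.125
```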
Quantum computing is not well suited for tasks such as word
processing and email, but it is ideal for tasks such as cryptography and
modeling and indexing very large databases.