The History of Computers


The computer owes its existence to the fact that people, lazy by nature, have always sought to improve their ability to calculate, in order to reduce errors and save time.

Origins: The abacus

The "abacus" was invented in the year 700; it was in use for a long time, and still is in some countries.

Then came the logarithm

The invention of the logarithm is generally credited to the Scotsman John Napier (1550-1617). In 1614, he showed that multiplication and division could be reduced to a series of additions and subtractions. This discovery led, around 1620, to the invention of the slide rule.
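To see the idea at work, here is a small worked example (using modern base-10 logarithm tables for illustration, rather than Napier's original ones):

  log(a × b) = log(a) + log(b)
  log(27) + log(41) ≈ 1.4314 + 1.6128 = 3.0442, and 10^3.0442 ≈ 1107 = 27 × 41

Two table look-ups and one addition thus replace a multiplication; division works the same way, with a subtraction.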
Credit is also often given to Muhammad ibn Musa al-Khwarizmi, a ninth-century scholar from the Central Asian region of Khwarezm; his name is in fact the origin of the word "algorithm", not "logarithm". He did, however, develop algebra, a term which comes from the Arabic "al-jabr", meaning restoration or balancing: rearranging an equation and balancing its two sides in order to find the unknown X.

The first calculating machines

In 1623, Wilhelm Schickard invented the first mechanical calculating machine.
In 1642, Blaise Pascal created the arithmetic machine (called the Pascaline), a machine that could add and subtract, intended to help his father, a tax collector.
In 1673, Gottfried Wilhelm Leibniz improved on the Pascaline by adding multiplication and division.
In 1822, Charles Babbage designed the difference engine, which could evaluate polynomial functions.
Later, once he learned that a weaving machine (the Jacquard loom) could be programmed with punched cards, he began designing a calculating machine, the analytical engine, that could take advantage of this revolutionary idea.
In 1820, the first four-function mechanical calculators debuted. They could:
  • add
  • subtract
  • multiply
  • divide
By 1885, they were being built with keyboards for entering data. Electric motors quickly supplanted hand cranks.

Programmable computers

Starting in 1938, Konrad Zuse built a series of calculating machines, culminating in 1941 in the Z3, a computer based on electromechanical relays and the first to work in binary instead of decimal.
In 1937, Howard Aiken began developing a programmable computer 17 meters long and 2.5 meters high, which could calculate about 5 times faster than a human: IBM's Mark I, completed in 1944.
It was built using 3300 gears and 1400 switches linked by 800 km of electrical wiring.
In 1947, the Mark II appeared, with its predecessor's gears replaced by electromagnetic relays.


Vacuum tube computers

In 1942, the ABC (Atanasoff-Berry Computer), named after its designers J. V. Atanasoff and C. Berry, was introduced.
Construction of the first computer with no mechanical parts began in 1943 under J. Mauchly and J. Presper Eckert: the ENIAC (Electronic Numerical Integrator And Computer). Completed in 1946, it was made using 18000 vacuum tubes and took up about 170 m² of floor space. It was used for calculations required for designing the H-bomb.
The ENIAC's main drawback was its programming: it could only be programmed manually, by flipping switches or plugging in cables.
The story goes that the first computer error was caused by an insect which, attracted by the heat, became lodged in the machine and caused a short circuit. This is how "bug" came to mean a computer error.
The vacuum tubes did indeed consume a great deal of electrical energy, much of which they released as heat. The programming problem, for its part, was solved in 1945 with the design of the EDVAC (Electronic Discrete Variable Automatic Computer), which could store programs in memory (1024 words in central memory and 20000 words in magnetic memory).

The transistor

In 1947, the transistor was invented at Bell Labs, thanks to the work of John Bardeen, Walter Brattain and William Shockley. With transistors, the computers of the 1950s could be made less bulky, less energy-hungry and therefore less expensive: this marked a turning point in computing history.

The integrated circuit

The integrated circuit was developed in 1958 by Jack Kilby at Texas Instruments. By placing multiple transistors on the same chip without discrete wiring between them, it made even smaller and cheaper computers possible.

The first transistor computers

In 1960, the IBM 7000 series became IBM's first transistor-based computers.
In 1964, the IBM System/360 appeared, followed in 1965 by the DEC PDP-8.

Microcomputers

In 1971, the first microcomputer came out: the Kenbak-1, with 256 bytes of memory.

Microprocessors

In 1971, the first microprocessor, the Intel 4004, appeared. It could process data 4 bits at a time.
Around the same time, Hewlett-Packard released the HP-35 calculator.
The Intel 8008 processor (which could process 8 bits at a time) was released in 1972.
In 1973, the Intel 8008 powered one of the first microcomputers, the Micral. It was followed in 1975 by the Altair 8800, which shipped with 256 bytes of memory and was built around the Intel 8080, a processor released in 1974 that was roughly 10 times faster than the 8008 and could address 64 KB of memory.
In 1976, Steve Wozniak and Steve Jobs created the Apple I in a garage. This computer had a keyboard, a 1 MHz microprocessor, 4 KB of RAM and 1 KB of video memory.
The story goes that the two friends didn't know what to name the computer; Steve Jobs, seeing an apple tree in the garden, decided he would call the computer "apple" if he couldn't think up another name in the next five minutes.
In 1981, IBM sold the first "PC", built around an 8088 processor with a clock speed of 4.77 MHz.

Computers today

It is very difficult today to tell where computers are going. So far, their development has followed Moore's Law, named after Intel co-founder Gordon Moore: "Every three years, four times as many transistors can be put on a chip."
This would imply that there will be 1 billion transistors on a chip around the year 2010.
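To see roughly where that projection comes from, here is a back-of-the-envelope check (the starting point of about 10 million transistors for a late-1990s processor is an assumption chosen purely for illustration):

  10 million × 4^n ≥ 1 billion  requires  4^n ≥ 100, i.e. n ≈ 3.3 three-year periods

That is roughly ten years, which, counted from the late 1990s, does indeed land in the neighborhood of 2010.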