What is the history of the computer?

The history of the computer stretches back hundreds of years, and over that time many people and institutions contributed to its evolution. However, the computer as we know it today had its beginnings in the 19th century. Let's review some of that history:

The first mechanical calculating machine, a precursor to the digital computer, was invented in 1642 by the French mathematician Blaise Pascal. That device used a series of ten-toothed wheels, each of which represented a digit from 0 to 9. The wheels were connected in such a way that numbers could be added by advancing them the correct number of teeth. In 1670 the German philosopher and mathematician Gottfried Wilhelm Leibniz perfected this machine and invented one that could also multiply.
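
To make the wheel mechanism concrete, here is a minimal sketch in Python of the same idea. It is a modern illustration rather than a description of Pascal's actual device: each wheel holds a digit from 0 to 9, and turning a wheel past 9 carries one step of motion into the next wheel.

```python
# A minimal sketch (a modern illustration, not Pascal's actual mechanism) of the
# ten-toothed-wheel idea: each wheel shows a digit 0-9, and advancing a wheel
# past 9 carries one tooth of motion into the next wheel to its left.

def advance(wheels, position, teeth):
    """Advance the wheel at `position` by `teeth` steps, propagating carries.

    `wheels` is a list of digits, least significant wheel first.
    """
    while position < len(wheels) and teeth > 0:
        total = wheels[position] + teeth
        wheels[position] = total % 10      # the digit the wheel now shows
        teeth = total // 10                # a full revolution becomes a carry
        position += 1                      # the carry turns the next wheel

def add(wheels, number):
    """Add `number` by advancing each wheel by the matching digit of `number`."""
    position = 0
    while number > 0:
        advance(wheels, position, number % 10)
        number //= 10
        position += 1

wheels = [0, 0, 0, 0]      # a four-wheel machine showing 0000
add(wheels, 278)
add(wheels, 345)
print(wheels[::-1])        # -> [0, 6, 2, 3], i.e. 278 + 345 = 623
```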

French inventor Joseph Marie Jacquard, in designing an automatic loom, used thin perforated wooden plates to control the weaving of complex designs. During the 1880s the American statistician Herman Hollerith conceived the idea of using punched cards, similar to Jacquard's plates, to process data. Hollerith managed to compile the statistical information for the 1890 United States census by using a system that passed punched cards over electrical contacts.

The Analytical Engine
Also in the 19th century, the British mathematician and inventor Charles Babbage worked out the principles of the modern digital computer. He invented a series of machines, such as the Difference Engine, designed to solve complex mathematical problems. Many historians consider Babbage and his associate, the British mathematician Augusta Ada Byron (1815-1852), daughter of the English poet Lord Byron, to be the true inventors of the modern digital computer.

The technology of the time was not capable of translating his sound concepts into practice, but one of his inventions, the Analytical Engine, already had many of the features of a modern computer. It included an input stream in the form of a deck of punched cards, a memory for storing data, a processor for mathematical operations, and a printer to make a permanent record of the results.

Early Computers
Analog computers began to be built in the early 20th century. The earliest models performed their calculations with rotating shafts and gears, and were used to evaluate numerical approximations of equations too difficult to solve by any other means. During the two world wars, analog computing systems, first mechanical and later electrical, were used to predict torpedo trajectories in submarines and for the remote control of bombs in aircraft.

Electronic Computers
During World War II (1939-1945), a team of scientists and mathematicians working at Bletchley Park, north of London, created what was considered the first fully electronic digital computer: the Colossus.

By December 1943 the Colossus, which incorporated 1,500 valves or vacuum tubes, was already operational. It was used by the team led by Alan Turing to decode encrypted radio messages from the Germans.

In 1939, and independently of this project, John Atanasoff and Clifford Berry had already built a prototype of an electronic machine at Iowa State College (USA). This prototype and the research that followed were carried out in relative obscurity and were later overshadowed by the development of the ENIAC (Electronic Numerical Integrator and Computer) in 1945. The ENIAC, which was shown to have drawn heavily on the Atanasoff-Berry Computer (ABC), obtained a patent that was invalidated in 1973, several decades later.

The ENIAC

The ENIAC contained 18,000 vacuum tubes and had a speed of several hundred multiplications per minute, but its program was wired into the processor and had to be modified manually. A successor to the ENIAC was built with program storage, based on the concepts of the Hungarian-American mathematician John von Neumann. The instructions were stored in a memory, which freed the computer from the speed limitations of the paper-tape reader during execution and allowed problems to be solved without the need to rewire the machine.
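
As a modern illustration of the stored-program idea described above, here is a minimal sketch in Python of a toy machine whose instructions and data share one memory. The instruction names (LOAD, ADD, STORE, HALT) and the memory layout are invented for this example and do not correspond to any historical machine.

```python
# A minimal sketch of the stored-program idea: instructions live in the same
# memory as data, so changing the program means changing memory, not rewiring.
# The instruction set here is invented purely for illustration.

def run(memory):
    """Interpret instructions stored in `memory` as (opcode, operand) pairs."""
    acc = 0          # accumulator register
    pc = 0           # program counter: index of the next instruction in memory
    while True:
        op, arg = memory[pc]
        pc += 1
        if op == "LOAD":      # acc <- constant
            acc = arg
        elif op == "ADD":     # acc <- acc + value stored at address `arg`
            acc += memory[arg][1]
        elif op == "STORE":   # write acc back into memory at address `arg`
            memory[arg] = ("DATA", acc)
        elif op == "HALT":
            return acc

# Program and data share one memory; address 4 holds a data word.
program = [
    ("LOAD", 2),      # 0: acc = 2
    ("ADD", 4),       # 1: acc += memory[4]  (the data word below)
    ("HALT", 0),      # 2: stop and return acc
    ("DATA", 0),      # 3: unused
    ("DATA", 40),     # 4: data word
]

print(run(program))   # -> 42
```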

In the late 1950s, the use of the transistor in computers marked the advent of logic elements that were smaller, faster, and more versatile than vacuum tubes allowed. Because transistors need much less power and have a much longer useful life, their development led to more sophisticated machines, which became known as second-generation computers. Components were made smaller, as were the spaces between them, making the systems cheaper to manufacture.

Integrated Circuits
In the late 1960s the integrated circuit (IC) appeared, making it possible to manufacture several transistors on a single silicon substrate onto which the interconnecting wires were soldered. The integrated circuit brought a further reduction in price, size, and error rates. The microprocessor became a reality in the mid-1970s with the introduction of the Large Scale Integration (LSI) circuit and, later, the Very Large Scale Integration (VLSI) circuit, with several thousand interconnected transistors soldered onto a single silicon substrate.
Source: Microsoft Encarta
