The Evolution of Computers: From the First Generation to the Fifth

Computer history and stages of development 

The word "computer" dates back to the sixteenth century, when it referred to a person employed to carry out mathematical calculations. The term remained attached to humans until the end of the nineteenth century, when it came to mean machines that perform calculations. The first design for a computer appeared in 1833, when Charles Babbage, often called the father of the computer, designed a mechanical computing machine intended for general-purpose use.

Babbage called his design the Analytical Engine, and it included several parts analogous to those found in modern computers: the "mill", corresponding to the central processing unit; the "store", corresponding to memory; an input unit known as the "reader"; and a "printer" for outputting the results of its processing.
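As a rough illustration of how Babbage's terms map onto a modern machine, the following minimal Python sketch models a toy engine with a store, a mill, a reader, and printed output. The class name and instruction format are invented here purely for illustration; they are not a faithful emulation of Babbage's design.

# A toy model of the Analytical Engine's layout. All names are
# illustrative inventions, not a real emulator of Babbage's machine.
class AnalyticalEngine:
    def __init__(self, program):
        self.store = {}               # the "store": memory for named values
        self.reader = iter(program)   # the "reader": the input unit

    def mill(self, op, a, b):
        # the "mill": the processing unit that performs arithmetic
        if op == "ADD":
            return a + b
        if op == "MUL":
            return a * b
        raise ValueError("unknown operation: " + op)

    def run(self):
        for op, dest, x, y in self.reader:
            a = self.store.get(x, x)  # use a stored value if x names one
            b = self.store.get(y, y)
            self.store[dest] = self.mill(op, a, b)
        for name, value in self.store.items():
            print(name, "=", value)   # the "printer": output of results

# Compute (2 + 3) * 4: prints "t = 5" then "r = 20"
AnalyticalEngine([("ADD", "t", 2, 3), ("MUL", "r", "t", 4)]).run()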

First generation of computers 

The period surrounding the outbreak of World War II was of great importance in the development of computers. In 1938, the German engineer Konrad Zuse completed the first programmable binary computer in history, the Z1, and a year later the American physicist John Atanasoff and the engineer Clifford Berry began building an electronic digital computer known as the Atanasoff-Berry Computer (ABC).

The ABC was an electronic computer that used roughly 300 vacuum tubes for its numerical, arithmetic, and logical operations. In 1943, construction began on the first general-purpose electronic computer, the ENIAC (Electronic Numerical Integrator and Computer), which was completed in 1946.

The ENIAC and its predecessors make up the first generation of computers, which lasted until the end of the 1950s. These machines were slow, physically enormous, and expensive, because they used vacuum tubes as the basic components of both the CPU and the memory, and they relied on magnetic and paper tapes as input and output devices.

Second generation computers 

First-generation computers relied on vacuum tubes, but growing commercial interest in computing led to a new generation of machines, known as the second generation, built around the transistor as its essential component. The Transac S-2000, manufactured in 1958 by Philco Corporation, was among the first transistorized computers.

IBM adopted the transistor in its own machines with the production of the IBM 7090. Second-generation computers mainly used magnetic disks and tapes to store data, and they were programmed in languages such as COBOL (COmmon Business-Oriented Language) and FORTRAN (FORmula TRANslation), which was used primarily in scientific and engineering work.

The second generation of computers was in use from roughly 1959 to 1965. The transistor reduced manufacturing cost, shrank machine size, and increased speed compared with first-generation devices, and machines of this generation also consumed less power.

Third generation computers 

The third generation of computers arose from the integrated circuit (IC), invented independently by Jack Kilby in 1958 and Robert Noyce in 1959. Third-generation machines were the first steps toward the computers in use today. The most important third-generation computer was the IBM System/360, a series on whose development IBM spent nearly 5 billion US dollars.

The IBM System/360 was used for many workloads that required fast data processing, such as weather forecasting, astronomy, space exploration, and other specialized scientific fields. The third generation lasted until 1971; compared with second-generation machines, its devices were faster, smaller, and more efficient, and they were programmed in high-level languages.

Fourth generation of computers 

The invention of the microprocessor brought about the fourth generation of computers. The Intel 4004, released in 1971, was the first commercially available microprocessor, followed in 1972 by the 8-bit Intel 8008. Microprocessors dramatically reduced the cost of producing computers, which contributed to the emergence of the personal computer (PC), the laptop, and eventually the mobile phone.

The Altair 8800, the IBM 5100, and the Micral are examples of early fourth-generation machines, and the microprocessor, serving as the central processing unit, remains at the heart of computers today.

The fifth generation of computers 

Fifth-generation devices appeared around 2010. Machines of this generation rely on artificial intelligence: they can interact with natural-language input and can learn and self-organize in ways that give them, to some extent, intelligence resembling that of humans. The Watson computer produced by IBM is one of the most famous examples of a fifth-generation device.
