Early computers

What is a computer? – Early computers

Early computers were meant to be used only for calculations. Simple manual instruments like the abacus have aided people in doing calculations since ancient times. Early in the Industrial Revolution, some mechanical devices were built to automate long, tedious tasks, such as guiding patterns for looms. More sophisticated electrical or electromechanical machines (e.g. Konrad Zuse's Z3) performed specialized analog or digital calculations in the early 20th century. The first digital electronic calculating machines were developed during World War II. The first programmable, electronic, general-purpose digital computer was the famous ENIAC, completed in 1945.
    Vacuum tubes and the first discrete semiconductor components, such as diodes and bipolar transistors, were followed by silicon-based monolithic integrated circuit (chip) technologies in the late 1950s. Integrated circuits packed a gradually increasing number of tiny transistors onto a single semiconductor chip. This led to the development of the microprocessor and, in turn, the microcomputer revolution in the 1970s. The speed, power and versatility of computers have been increasing dramatically ever since, with transistor counts growing at a rapid pace (as predicted by Moore's law). Along with the rapid development of wide-area computer networks like the Internet, this led to the Digital Revolution during the late 20th and early 21st centuries.

What does electronic mean?

The main hardware units of an electronic computer (e.g. the central unit, the motherboard, extension cards etc.) consist mostly of electronic components, such as resistors, capacitors, diodes, transistors and integrated circuits.

Logic gates, and the circuits built from them, implement the logic functions and logical operations that form the basis of how a computer works.
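As a minimal sketch of this idea (not from the original text), the following Python functions model how a single gate type, NAND, can be composed into every other basic gate, and how those gates in turn implement simple binary arithmetic. All function names here are illustrative assumptions.

```python
def nand(a: int, b: int) -> int:
    """NAND is functionally complete: all other gates can be built from it."""
    return 0 if (a and b) else 1

def not_(a: int) -> int:
    # NOT is NAND with both inputs tied together.
    return nand(a, a)

def and_(a: int, b: int) -> int:
    # AND is the negation of NAND.
    return not_(nand(a, b))

def or_(a: int, b: int) -> int:
    # OR via De Morgan's law: a OR b = NOT(NOT a AND NOT b).
    return nand(not_(a), not_(b))

def xor(a: int, b: int) -> int:
    # XOR is true when exactly one input is true.
    return and_(or_(a, b), nand(a, b))

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Adds two bits, returning (sum, carry) -- the core of binary arithmetic."""
    return xor(a, b), and_(a, b)

print(half_adder(1, 1))  # 1 + 1 = binary 10, i.e. sum 0 with carry 1
```

Chaining such half adders (with carry logic) yields full adders and, ultimately, the arithmetic units inside a processor.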

Boda István, 2023.