What is the definition of a computer?

Computer terminology and definition

Today computers are part of almost every area of life. Often unnoticed, they are built into devices in the medical field, they control automatic machines of all kinds and, of course, they are found above all in consumer electronics. Computers were originally intended only as an aid for arithmetic tasks that were too complex or too extensive for conventional calculation. Today, however, they can do much more.

Explanation of terms and origins

The word computer has its origin in English and Latin: both "to compute" and the Latin "computare" mean "to add up" or "to calculate". The real purpose behind the invention of the computer can also be found in this word origin: a calculating machine was to be developed that would relieve people of this burden.

Originally, the "computer" was a job title for people who were well versed in all areas of mathematics, that is, for arithmetic specialists. Konrad Zuse developed the first mechanical calculating machine in 1938 and in 1946 the term computer was first used in connection with a calculating machine (Electronic Numerical Integrator and Computer).

Historical development of the computer

As a calculating machine, the computer is based on the well-known mathematical principles of basic arithmetic. The development of the various number systems and the discovery of the individual arithmetic operations go back hundreds of years, yet they still form the essential basis of how computers work today.

Calculating aids that could in principle already be called computers were developed in earlier centuries. The abacus, which was developed more than 1000 years BC, is the first noteworthy example here. The Antikythera mechanism from the 1st century BC is a geared mechanism that uses many cogwheels to model a solar, lunar and eclipse calendar.

In the 17th century, the "four-species machine" was developed as the first mechanical computer of modern times, which according to the concept can perform all four basic arithmetic operations. Its inventor Wilhelm Schickard is therefore also referred to as the "father of the computer era". The development of the binary number system by Gottfried Wilhelm Leibniz 1673. This forms the basis for today's digital computers, but was already used at that time, for example, in various applications of punch card systems.

With the Industrial Revolution, development in the area of computers also advanced rapidly. In the 19th century a large number of different calculating machines were developed, most of them based on the principles of the Difference Engine and the Analytical Engine by Charles Babbage. At the end of the 19th century the first "game computer" was developed: a simple chess machine that could deliver checkmate with king and rook.

Forerunners of the modern computer

In the 20th century a real competition developed between the major powers. Particularly noteworthy are the calculating machines of Konrad Zuse and the first electronic digital computer, the Atanasoff-Berry Computer. During the Second World War the British "Colossus" was added. Especially during the war, resources were concentrated on developing encryption and decryption machines.

After the war, the civilian sector also developed very quickly. As early as 1949, "Simon", the first digital computer for private home use, was presented. In the fifties and sixties, mainly commercial computers based on transistors were built. Only at the end of the sixties was the HP-9100A developed, the first calculator to be officially designated a personal computer.

The definition of a new computer architecture and the invention of the integrated circuit were particularly groundbreaking for further increases in performance. From today's perspective, the development of the computer mouse in 1968 and the connection of the first computers to the ARPANET, the forerunner of the Internet, in 1969 were equally important milestones.

The first home computers

In the seventies, development in the field of personal computers in particular accelerated considerably. Various companies brought their own models onto the market; above all, Apple, Atari, IBM and Commodore set the tone. With the development of the microprocessor, computers became smaller and more powerful. In 1974, for example, the first programmable pocket calculator became available.

Some of the best-known home computers of all time were produced in the 1980s. The C64, the Amiga, the Apple Macintosh and the IBM PC are pioneers in the field of the PC. During this time the development of processors advanced rapidly: at the beginning there were 8-bit processors with 64 KB of RAM, but the Amiga already used a 32-bit processor. In the mid-eighties, Usenet and dial-up data transmission (DFÜ) were used for the first time for serious online work.

At the end of the 20th century, development focused mainly on progress in the field of processors and on exploring the possibilities of the World Wide Web. In the future, the use of computers for biological systems, that is, the link between nature and technology, is becoming more and more likely. Artificial intelligence, so far largely fiction, could then become reality. In addition, the integration of new computing models, for example from quantum physics, will be the order of the day.

How a computer works

Basically, a distinction is made between digital and analog computers. The digital variant can process both numbers and text characters (i.e. digital data), while the analog variant can only process analog data, for example measured variables.
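
As a purely illustrative sketch (not from the original article), the following Python snippet shows what "digital data" means in practice: both a number and a text character end up as a bit pattern, whereas an analog quantity such as a measured voltage would first have to be sampled and quantized before a digital computer could process it.

```python
# Illustration only: numbers and text characters are both just bit patterns.
number = 42
character = "A"

print(format(number, "08b"))          # 00101010 - the integer 42 as 8 bits
print(format(ord(character), "08b"))  # 01000001 - the character "A" via its code point 65

# An analog quantity, e.g. a measured voltage, has no such exact bit pattern;
# it would first need to be sampled and quantized (analog-to-digital conversion).
```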

Physically, the computer consists of various components. This hardware is logically divided into different areas of responsibility. The arithmetic logic unit (ALU) and the control unit are usually combined in the central processing unit (CPU) today. According to the von Neumann architecture, the hardware also includes the bus system, the memory and the input and output units.
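
This division of responsibilities can be pictured with a loose Python sketch; the class and attribute names below are illustrative assumptions, not a real API or the article's own model.

```python
# Hypothetical sketch of the hardware areas described above.
from dataclasses import dataclass, field

@dataclass
class ALU:
    """Arithmetic logic unit: carries out the actual operations."""
    def execute(self, op, a, b):
        return {"ADD": a + b, "SUB": a - b, "AND": a & b, "OR": a | b}[op]

@dataclass
class CPU:
    """Today the ALU and the control unit are combined in the CPU."""
    alu: ALU = field(default_factory=ALU)
    program_counter: int = 0   # part of the control unit's state

@dataclass
class Computer:
    cpu: CPU = field(default_factory=CPU)
    memory: list = field(default_factory=lambda: [0] * 256)  # holds programs and data
    # the bus system and the input/output units are only hinted at here
```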

The arithmetic unit has various modules for the individual arithmetic and logic operations, with which a wide variety of operations can be processed. The memory is used by the system in such a way that data and programs are automatically stored in different areas. In the von Neumann architecture these areas are only logically separated; in the Harvard architecture, programs are prevented from being overwritten by data through physical separation.
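
A minimal, hypothetical sketch of this difference: in the von Neumann model, program and data share one memory, so a program could in principle be overwritten by data, while the Harvard model keeps them in physically separate memories.

```python
# Illustration only - the instruction strings are invented for the example.

# von Neumann: one shared address space for program and data
shared_memory = ["LOAD 3", "ADD 4", "STORE 5",   # program ...
                 7, 35, 0]                        # ... data right behind it

# Harvard: two separate memories; writes to data can never reach the program
program_memory = ["LOAD 0", "ADD 1", "STORE 2"]
data_memory    = [7, 35, 0]
```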

The input devices are used by the user to specify the desired task. The control unit regulates the reading of data, which is forwarded to the arithmetic unit and combined there using the specified operations. The results of these internal, invisible calculations are then made visible via the output unit.
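
A toy fetch-decode-execute loop, purely as an illustration of this flow (the instruction names and memory layout are invented for the example): the control unit reads the next instruction, the arithmetic unit combines the values, and the output unit makes the result visible.

```python
# Illustrative toy machine, not a real instruction set.
memory = {"A": 7, "B": 35, "RESULT": 0}
program = [("LOAD", "A"), ("ADD", "B"), ("STORE", "RESULT"), ("PRINT", "RESULT")]

accumulator = 0
for opcode, operand in program:           # control unit: fetch and decode
    if opcode == "LOAD":
        accumulator = memory[operand]
    elif opcode == "ADD":                 # arithmetic unit: combine the values
        accumulator += memory[operand]
    elif opcode == "STORE":
        memory[operand] = accumulator
    elif opcode == "PRINT":               # output unit: make the result visible
        print(operand, "=", memory[operand])   # prints: RESULT = 42
```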

However, this principle would be impractical for the user, since he would have to enter the commands for each individual memory cell. This effort was therefore reduced considerably by the invention and use of programming languages. A large number of text commands are stored in the corresponding libraries: the user issues a simplified command and thereby triggers a large number of automatic steps in the computer, at the end of which the desired information is output.
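
As a small example of this idea (a sketch, not the article's own illustration), one high-level library command in Python stands in for the many individual steps the user would otherwise have to spell out.

```python
values = [3, 9, 12, 18]

# high-level: one command, the library does the rest
print(sum(values))        # 42

# roughly the individual steps hidden behind that one command
total = 0
for v in values:          # each value is fetched and added step by step
    total += v
print(total)              # 42
```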

This control of the computer by typing commands has nowadays usually been replaced by the use of a graphical user interface and the provision of various programs. Here the user selects the desired function with a mouse or keyboard; this selection is automatically translated into the corresponding commands and then processed by the hardware.
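
As a made-up miniature example (the widget and function names are purely hypothetical), a click on a GUI element can be thought of as a lookup that maps the event to a command, which is then executed just as a typed command would be.

```python
# Illustration only: mapping GUI events to commands.
commands = {
    "copy_button":  lambda text: text,            # stands in for a real copy routine
    "upper_button": lambda text: text.upper(),
}

def on_click(widget_name, selected_text):
    # the GUI event is translated into a command and executed
    return commands[widget_name](selected_text)

print(on_click("upper_button", "hello"))   # HELLO
```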
