Computer Definition



A computer can be defined broadly as any of a class of man-made devices or systems that can modify data in some meaningful way.

Data is a collection of distinct pieces of information, particularly information that has been formatted (i.e., organized) in some specific way for use in analysis or making decisions. Information can be broadly defined as any pattern that can be recognized by some system (e.g., a living organism, an electronic system or a mechanical device) and/or that can influence the formation or transformation of other patterns.

All computers make use of both hardware and software and utilize some form of energy. Hardware refers to the physical components of computers, both those that are directly involved in processing the data, such as the CPU (central processing unit) and memory chips, and peripheral devices, such as storage devices (e.g., hard disk drives) and devices for human-computer interface (e.g., display monitors and keyboards).

Software can be divided into two broad categories. One is data, and the other is programs, which are sequences of instructions to manipulate data. Programs, in turn, can likewise be divided into two broad categories: application programs, which users work with directly to manipulate data, and operating systems, which are collections of programs that manage all the other programs as well as the allocation and use of hardware resources.
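
As a minimal illustration of this distinction, consider the following short Python program, which consists of nothing more than some data and a sequence of instructions that manipulate that data:

    # Data: a small collection of formatted information.
    temperatures = [18.5, 21.0, 19.75, 22.25]

    # Program: a sequence of instructions that manipulates the data.
    average = sum(temperatures) / len(temperatures)
    print("Average temperature:", average)

When such an application program runs on a general purpose computer, it is the operating system that allocates the memory and processor time the program needs and that handles its input and output.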

Computers can be designed for various levels of specificity of application, ranging from a single application, through a class of applications, to general purpose use. The type of data that can be accommodated depends on the type of computer: computers designed for a specific application might be able to accommodate only one type of data, whereas general purpose computers can accommodate a wide range of data types, including text, numeric, image and audio data.
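
This flexibility is possible because all types of data are ultimately stored in the same binary form. As a rough illustration, the following short Python sketch treats the same few stored bytes first as text and then as numbers:

    # The same stored bytes can be interpreted as different types of data.
    data = bytes([72, 105, 33])     # three bytes held in memory

    print(data.decode("ascii"))     # interpreted as text: Hi!
    print(list(data))               # interpreted as numbers: [72, 105, 33]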

General purpose computers are programmable; that is, programs can easily be entered into them, stored in them, created and modified within them, and removed from them by users. This is in contrast with some highly specific computers that contain only a single program that cannot be modified or replaced without great difficulty.

Electronic computers process data by breaking it down into the smallest practical units, called bits (short for binary digits), and manipulating those bits at extremely high speed with very simple arithmetic and logic operations. The speed is so great that it can seem as if some sort of magic is taking place.
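
For instance, even ordinary addition can be built up from operations on individual bits. The following Python sketch adds two numbers using nothing but the bitwise operations AND, XOR and shift, which is essentially how an adder circuit inside a processor works:

    def add(a, b):
        """Add two non-negative integers using only bitwise operations."""
        while b:
            carry = (a & b) << 1   # positions where a carry is generated
            a = a ^ b              # sum of the bits, ignoring carries
            b = carry              # feed the carries back in
        return a

    print(add(19, 23))               # 42
    print(format(ord("A"), "08b"))   # the letter A stored as bits: 01000001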

Although nearly all computers today perform their operations on data via electronic circuits, this was not always the case. For example, early computers were mechanical, such as the abacus, Charles Babbage's uncompleted difference engine and early calculators. Even some of the earliest programmable computers, such as the Z3, which was completed in Germany in 1941, relied on non-electronic parts (i.e., electro-mechanical relays) for their operation.

The first fully electronic computer was the Colossus, which became operational in early 1944; however, it was designed to perform a single task (i.e., code breaking) and was programmable only in a very limited way. The first fully electronic computer that was both general purpose and programmable was ENIAC (Electronic Numerical Integrator And Computer), which began test operation in late 1945.

Tremendous progress has been made in improving the performance and reducing the size and cost of computers since the development of ENIAC. For example, whereas ENIAC measured some 2.4 m by 0.9 m by 30.5 m and weighed roughly 30 metric tons, today the same amount of computational power can be accommodated on a single chip of silicon about the size of a grain of rice. Moreover, whereas ENIAC cost approximately half a million dollars just for construction (and much more when the cost of operation is included), computers with superior performance are available today for just a few cents and are even used in wristwatches and cheap toys.

The term computer originally referred to a person whose profession was to spend all day at the tedious task of performing calculations with pencil and paper. A major use for such calculations was producing trajectory tables for military ballistics. The desire to speed up such calculations was the driving force behind the development of ENIAC.

The term today is no longer used to refer to such human computers. Now it generally means the electronic device that does the data processing together with the peripheral hardware for input, output and storage, such as a display screen, disk drives, a keyboard and a mouse. But it can also refer to a single chip which performs the basic computer functions, referred to as a computer on a chip. Such chips can be used in conventional, general purpose computers, such as desktop or notebook computers, but their largest application is in embedded systems, which are products into which computer functions are built (e.g., aircraft, electronic medical equipment, industrial production controls, communications equipment, elevators, locomotives, and test and measurement instruments).
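
To suggest what such an embedded computer spends its time doing, here is a highly simplified Python sketch of the endless read-decide-act loop at the heart of a thermostat-style controller; read_temperature and set_heater are hypothetical stand-ins for the hardware interfaces a real device would provide:

    import time

    TARGET = 20.0   # desired temperature in degrees Celsius

    def read_temperature():
        # Hypothetical stand-in for reading a hardware sensor.
        return 19.5

    def set_heater(on):
        # Hypothetical stand-in for switching a hardware output.
        print("heater", "on" if on else "off")

    # The endless read-decide-act cycle typical of embedded controllers.
    while True:
        set_heater(read_temperature() < TARGET)
        time.sleep(1.0)   # check once per second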

Computers can also be looked at as being much broader than just single machines at single locations. This is because computing has become increasingly distributed, with various aspects of data entry, processing and storage occurring at different locations on a network, including the Internet (the largest of all networks). Thus the expression "the network is the computer"1 is probably the best way to describe modern computers.
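
As a simple illustration of such distribution, the following Python sketch retrieves data that is stored on, and served by, a distant machine, leaving only the final step of processing to the local computer (the reserved address example.com is used here merely as a placeholder):

    from urllib.request import urlopen

    # The page itself is stored on, and served by, a remote computer;
    # the local machine merely requests it and processes the reply.
    with urlopen("http://www.example.com/") as response:
        page = response.read()

    print(len(page), "bytes received from another computer on the network")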

Despite their already astonishing performance, a tremendous amount of room remains for further improvement in computers. Such advances could be attained in large part through the use of non-electronic technologies. In particular, much work has been conducted in recent years on the use of purely optical components instead of electronic components in order to attain both higher speeds and lower power consumption. Research is also being conducted on biological computers, which would use materials and processes similar to those used by living organisms.

In some ways, today's computers have capabilities that vastly exceed those of the human brain, such as performing mathematical calculations at extremely high speeds and transforming complex images. In other ways, however, they are still extremely dumb, as demonstrated by their inability to make sense of simple situations or to be creative.


________
1This phrase was coined in 1984 by John Gage, chief researcher at Sun Microsystems, and it became that company's motto.






Created May 14, 2006.
Copyright © 2006 The Linux Information Project. All Rights Reserved.