Computer


What is a computer?

Broadly, a computer is any device used to process information according to a well-defined procedure. The word was originally used to describe people employed to do arithmetic calculations, with or without mechanical aids, but was transferred to the machines themselves. Originally, the information processing was almost exclusively related to arithmetical problems, but modern computers are used for many tasks unrelated to mathematics. Within such a definition sit mechanical devices such as the slide rule, the gamut of mechanical calculators from the abacus onwards, as well as all contemporary electronic computers.

However, the above definition includes many special-purpose devices that can compute only one function, or a limited range of functions. The most notable characteristic distinguishing modern computers from earlier computing devices is that, given the right programming, any computer can emulate the behaviour of any other (though perhaps limited by storage capacity and running at a different speed), and it is believed that current machines can emulate any future computing device we invent (though undoubtedly more slowly). In this sense, the threshold capability is a useful test for separating "general-purpose" computers from earlier special-purpose devices. It can be formalised into the requirement that a machine be able to emulate the behaviour of a universal Turing machine; machines meeting this definition are referred to as Turing-complete.
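
To give a flavour of what such a machine is, here is a minimal sketch in Python of a (deliberately simple, non-universal) Turing machine simulator. The transition table is an invented example that adds one to a binary number; a genuine Turing-completeness test would involve emulating a universal version of such a machine.

    # A minimal Turing machine simulator (illustrative sketch only).
    # A machine is a transition table mapping (state, symbol) to
    # (symbol to write, head movement, next state).

    def run_turing_machine(table, tape, state="start", halt="halt"):
        cells = dict(enumerate(tape))      # sparse tape: position -> symbol
        head = 0
        while state != halt:
            symbol = cells.get(head, "_")  # "_" is the blank symbol
            write, move, state = table[(state, symbol)]
            cells[head] = write
            head += 1 if move == "R" else -1
        return "".join(cells[i] for i in sorted(cells)).strip("_")

    # Invented example table: add one to a binary number, with the head
    # starting on the leftmost digit.
    increment = {
        ("start", "0"): ("0", "R", "start"),  # scan right to the end
        ("start", "1"): ("1", "R", "start"),
        ("start", "_"): ("_", "L", "carry"),  # step back onto the last digit
        ("carry", "1"): ("0", "L", "carry"),  # 1 + 1 = 0, carry the 1
        ("carry", "0"): ("1", "L", "done"),   # absorb the carry
        ("carry", "_"): ("1", "L", "done"),   # the number was all 1s; grow it
        ("done", "0"):  ("0", "L", "done"),   # walk back to the start
        ("done", "1"):  ("1", "L", "done"),
        ("done", "_"):  ("_", "R", "halt"),
    }

    print(run_turing_machine(increment, "1011"))  # prints 1100 (11 + 1 = 12)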

In the last 20 years or so, however, many devices - most notably video game consoles, but also mobile telephones, video cassette recorders, and myriad other household, industrial, and automotive electronics - have come to contain circuitry meeting the above Turing-completeness requirement (with the proviso that their programming is often hardwired into a ROM chip, which would need to be replaced to change the machine's programming). The computers inside such special-purpose devices are commonly referred to as "microcontrollers" or "embedded computers". Many people therefore restrict the definition of a computer to a device whose primary purpose is information processing - rather than being part of a larger system such as a telephone, microwave oven, or aircraft - and which can be adapted to a variety of purposes by its user without physical modification. Mainframe computers, Minicomputers, and Personal Computers are the main types of computer meeting this definition.

Finally, many people who are unfamiliar with other forms of computers use the term exclusively to refer to Personal Computers.

How Computers Work

While the technologies used in digital computers have changed dramatically since the first computers of the 1940s (see History of computing for more details), most still use the von Neumann architecture, proposed in the mid-1940s by John von Neumann.

Basically, a computer fetches an instruction from an addressed memory cell. Then it adds one to the address, to find the address of the next instruction. Then it executes the instruction it just fetched, and repeats the process. This cycle - "fetch, increment, execute, repeat" - runs millions of times per second. If the program is right, the computer calculates more quickly and more accurately than any human could.

An instruction can change the address of the next instruction to fetch, which permits repetitive operation: the address is simply set back to the start of a list of instructions. An instruction may also check an arithmetic result - say, to see whether it is zero - and change the address only if it is. Since a complex calculation can lead up to that simple arithmetic test, a computer program can make complex decisions.
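
As an illustrative sketch - the three-instruction repertoire here is invented for the example, not taken from any real machine - the whole "fetch, increment, execute, repeat" cycle, including a conditional jump, fits in a few lines of Python:

    # Toy von Neumann machine: the program and its data share one memory.

    def run(memory):
        address = 0
        while True:
            instruction = memory[address]     # fetch
            address += 1                      # increment
            op = instruction[0]               # execute:
            if op == "ADD":                   # memory[a] += memory[b]
                _, a, b = instruction
                memory[a] += memory[b]
            elif op == "JUMP_IF_ZERO":        # change the next address to
                _, a, target = instruction    # fetch, but only if cell a
                if memory[a] == 0:            # holds zero
                    address = target
            elif op == "HALT":
                return memory

    # Cells 0-4 hold the program, cells 5-9 the data: add 5 to a total
    # three times by counting a counter down to zero.
    memory = [
        ("ADD", 5, 6),            # cell 0: total += 5
        ("ADD", 7, 8),            # cell 1: counter += -1
        ("JUMP_IF_ZERO", 7, 4),   # cell 2: leave the loop once counter is 0
        ("JUMP_IF_ZERO", 9, 0),   # cell 3: cell 9 is always 0, so always jump
        ("HALT",),                # cell 4
        0,                        # cell 5: the running total
        5,                        # cell 6: the amount to add
        3,                        # cell 7: the loop counter
        -1,                       # cell 8: constant -1
        0,                        # cell 9: constant 0
    ]

    print(run(memory)[5])         # prints 15 (5 added 3 times)

Note that the program and the numbers it works on sit in the same memory, a point taken up next.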

The very clever thing about this scheme is that it uses the memory to store the program as well as the data. Early computers only stored data in memory. The program was a complex set of plugs and wires in a plug-board. The idea that a program was also a sort of data was a brilliant generalization by a famous mathematician, John von Neumann.

Von Neumann's architecture describes a computer with four main sections: the Arithmetic and Logic Unit (ALU), the control circuitry, the memory, and the input and output devices (collectively termed I/O). These parts are interconnected by a bundle of wires, a "bus."

In this system, memory is a sequence of numbered "cells" or "pigeon holes," each containing a small piece of information. The information may be an instruction to tell the computer what to do. The cell may contain data that the computer needs to perform the instruction. Any slot may contain either, and indeed what is at one time data might be instructions later.

In general, memory can be rewritten millions of times - it is a scratchpad rather than a stone tablet. Memory may also be volatile (losing its contents when the power is removed) or non-volatile; typically, programs and data are kept on non-volatile storage such as a disk drive or CD-ROM, and transferred into volatile RAM while the computer is working on them.

The size of each cell, and the number of cells, varies greatly from computer to computer, and the technologies used to implement memory have also varied enormously - from electromechanical relays, to mercury-filled tubes (and later springs) in which acoustic pulses were formed, to matrices of permanent magnets, to individual transistors, to integrated circuits with millions of capacitors on a single chip.

The Arithmetic-Logic Unit, or ALU, is the device that performs elementary operations such as arithmetic operations (addition, subtraction, and so on), logical operations (AND, OR, NOT), and comparison operations (for example, comparing the contents of two "slots" for equality). This unit is where the "real work" is done.

The Control Unit keeps track of which slot contains the current instruction. It tells the ALU which operation to perform, retrieves the information (from the memory) that the ALU needs to perform it, and transfers the result back to the appropriate memory location. Once that is done, the control unit moves on to the next instruction (typically the one in the next slot, unless the instruction is a jump that names some other location).

The I/O allows the computer to obtain information from the outside world and send the results of its work back there. There is an incredibly broad range of I/O devices, from the familiar keyboards, monitors, and floppy disk drives to the more unusual, such as webcams.

The instructions discussed above are not the rich instructions of a human language. A computer only has a limited number of well-defined, simple instructions. Typical sorts of instructions supported by most computers are "copy the contents of cell 123, and place the copy in cell 456", "add the contents of cell 666 to cell 042, and place the result in cell 013", and "if the contents of cell 999 are 0, your next instruction is at cell 345".

Instructions are represented not in English or any other human-readable language, but as numbers - the code for "copy" might be 001, for example. The particular instructions that a specific computer supports, and the numeric codes for them, are known as the computer's [machine language]?. In practice, people rarely write instructions for computers directly in machine language; instead they use a more convenient, higher-level description of the algorithm in some programming language, and the translation into machine language is done automatically by special computer programs.
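
As a sketch of that automatic translation - the mnemonics, opcode numbers, and instruction layout here are all invented for the example - a translator from human-friendlier notation into numeric machine code can be tiny:

    # Hypothetical machine language: each instruction becomes its numeric
    # opcode followed by its operands.  The opcode values are invented.
    OPCODES = {"COPY": 1, "ADD": 2, "JUMP_IF_ZERO": 3}

    def assemble(program):
        """Translate mnemonic instructions into flat numeric machine code."""
        machine_code = []
        for mnemonic, *operands in program:
            machine_code += [OPCODES[mnemonic], *operands]
        return machine_code

    # The three example instructions from the text:
    program = [
        ("COPY", 123, 456),          # copy cell 123 into cell 456
        ("ADD", 666, 42, 13),        # cell 13 = cell 666 + cell 42
        ("JUMP_IF_ZERO", 999, 345),  # if cell 999 is 0, continue at cell 345
    ]

    print(assemble(program))
    # prints [1, 123, 456, 2, 666, 42, 13, 3, 999, 345] - the numbers
    # the machine actually stores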

Contemporary computers put the ALU and control unit into a single integrated circuit known as the Central Processing Unit or CPU. Typically, the computer's memory is located on a few small integrated circuits near the CPU. The overwhelming majority of the computer's mass is either ancillary systems (for instance, to supply electrical power) or I/O devices.

Some larger computers differ from the above model in one major respect - they have multiple CPUs and control units working simultaneously. Additionally, a few computers, used mainly for research purposes and scientific computing, have differed significantly from the above model, but they have found little commercial application.

[Computer programs]? are simply large lists of instructions for the computer to execute, perhaps with tables of data. Many computer programs contain millions of instructions, and many of those instructions are executed repeatedly. A typical modern PC (in the year 2001) can execute around 1 billion instructions per second. Computers do not gain their extraordinary capabilities through the ability to execute complex instructions. Rather, they execute millions of simple instructions arranged by clever people, "programmers". Good programmers develop sets of instructions to do common tasks (for instance, draw a dot on screen) and then make those sets of instructions available to other programmers.
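
A sketch in Python of that idea of shared building blocks; the character-grid "screen" and the routine names are made up for the example:

    # A made-up "screen": a grid of characters standing in for pixels.
    WIDTH, HEIGHT = 40, 8
    screen = [[" "] * WIDTH for _ in range(HEIGHT)]

    def draw_dot(x, y):
        """The common task, written once and shared with other programmers."""
        screen[y][x] = "*"

    def draw_horizontal_line(x, y, length):
        """A new routine built entirely out of the shared one."""
        for i in range(length):
            draw_dot(x + i, y)

    draw_horizontal_line(5, 3, 20)
    print("\n".join("".join(row) for row in screen))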

Nowadays, most computers appear to execute several programs at the same time. The CPU executes some instructions from the first program, then after a time jumps to the second program and executes some of its instructions, then jumps back, and so forth. This creates the illusion of parallel processing by sharing the CPU's time between the programs. The operating system is the program that does this "time-sharing".
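
Here is a toy sketch of such time-sharing in Python, using generators to stand in for programs that can be suspended and resumed. (A real operating system interrupts programs pre-emptively rather than waiting for them to yield, but the round-robin interleaving is the same idea.)

    from collections import deque

    def program(name, steps):
        """A stand-in 'program' that pauses after each step of work."""
        for i in range(steps):
            print(f"{name}: step {i}")
            yield                      # give up the CPU

    def scheduler(programs):
        """Round-robin: run each program a little, then move to the next."""
        queue = deque(programs)
        while queue:
            current = queue.popleft()
            try:
                next(current)          # execute one slice of this program
                queue.append(current)  # not finished: back of the queue
            except StopIteration:
                pass                   # this program finished; drop it

    scheduler([program("A", 3), program("B", 2)])
    # prints A: step 0 / B: step 0 / A: step 1 / B: step 1 / A: step 2,
    # interleaving the two programs as if they ran at the same time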

The operating system is a sort of catch-all of useful pieces of code. Whenever some kind of code proves useful to many different types of computer program over many years, programmers eventually move it into the operating system.

The operating system, for example, decides which programs get to run, and when, and what resources (such as memory or I/O) they get to use. The operating system also provides services to other programs, such as code ("drivers") which allow programmers to write programs for a machine without needing to know the intimate details of all attached electronic devices.

Nowadays, widely used programs are often included in the operating system simply because that is an economical way to distribute them. It is now commonplace for operating systems to include web browsers, text editors, e-mail programs, network interfaces, movie players, and other programs that were once quite exotic special-order software.

Uses of computers

The first digital computers, with their massive size and cost, mainly performed scientific calculations. ENIAC, an early US computer, calculated neutron cross-sectional densities to see if the [hydrogen bomb]? would explode. The CSIR Mk I, the first Australian computer, evaluated rainfall patterns for the catchment of the [Snowy Mountains scheme]?, a large hydroelectric? generation project. Early visionaries also anticipated that programming would allow chess playing, moving pictures, and other uses.

People in governments and large corporations also used computers to automate many of the data collection and processing tasks previously performed by humans - for example, maintaining and updating accounts and inventories. In academia, scientists of all sorts began to use computers for their own analyses. Continual reductions in the costs of computers saw them adopted by ever-smaller organizations.

With the invention of the microprocessor in the 1970s, it became possible to produce very inexpensive computers. [Personal computers]? became popular for many tasks, including keeping books, writing and printing documents, calculating forecasts and other repetitive mathematics with spreadsheets, and communicating via e-mail and the Internet. However, computers' wide availability and easy customization have seen them used for many other purposes.

At the same time, small computers, usually with fixed programming, began to find their way into other devices such as home appliances, automobiles, aeroplanes, and industrial equipment. These embedded? processors made the behaviour of such devices easier to control and allowed more complex control behaviours (for instance, the development of [anti-lock brakes]? in cars). By the start of the twenty-first century, most electrical devices, most forms of powered transport, and most factory production lines are controlled by computers, and most engineers predict that this trend will continue. Computers have become cheap, plentiful, and pervasive, and their economic, social, and political effects have been enormous.


Information about many different aspects of computer-related mathematics, technology, usage, and history can be found on the Computing page.



The word "computer"

Over the years the word "computer" has had several slightly different meanings, and several different words have been used for the thing we now usually call a computer.

For instance, "computer" was once commonly used to mean a person employed to do arithmetic calculations, with or without mechanical aids. Charles Babbage designed one of the first computing machines, the Analytical Engine, but due to technological problems it was not built in his lifetime. Various simple mechanical devices such as the slide rule have also been called computers. In some cases they were referred to as "analogue computers", as they had no discrete or digital computational ability. What are now simply called "computers" were once commonly called "digital computers" to distinguish them from these other devices.

It is worth noting that in other languages the chosen word does not always have the same literal meaning as the English one. In French, for example, the word is "ordinateur", which means approximately "organizer" or "sorting machine". The Spanish word is "ordenador", with the same meaning, although in some countries the anglicism "computadora" is used. In Italian, a computer is a "calcolatore", calculator, emphasizing its computational uses over logical ones such as sorting. In Swedish, a computer is called a "dator", from "data"; in the 1950s, at least, they were called "matematikmaskin" ("mathematics machine").


In English too, other words and phrases have been used, such as "data processing machine".

There is also a quote, of uncertain attribution: "They are called computers because computing is the only non-trivial use they have been put to so far." That was said quite a long time ago, when "computing" just meant "arithmetic calculating". Since then, data storage and retrieval, communications applications, and many embedded processing tasks have become important uses of computers.



