Thursday, October 25, 2007

How Computers Work

Main articles: Central processing unit and Microprocessor
A general purpose computer has four main sections: the arithmetic and logic unit (ALU), the control unit, the memory, and the input and output devices (collectively termed I/O). These parts are interconnected by buses, often made of groups of wires.
The control unit, ALU, registers, and basic I/O (and often other hardware closely linked with these) are collectively known as a central processing unit (CPU). Early CPUs were composed of many separate components, but since the mid-1970s CPUs have typically been constructed on a single integrated circuit called a microprocessor.

Control unit
Main articles: CPU design and Control unit
The control unit (often called a control system or central controller) directs the various components of a computer. It reads and interprets (decodes) the program's instructions one by one, turning each into a series of control signals that operate the other parts of the computer.[11] Control systems in advanced computers may change the order of some instructions so as to improve performance.
A key component common to all CPUs is the program counter, a special memory cell (a register) that keeps track of which location in memory the next instruction is to be read from.[12]

[Diagram: how a particular MIPS architecture instruction would be decoded by the control system.]
The control system's function is as follows. Note that this is a simplified description, and some of these steps may be performed concurrently or in a different order depending on the type of CPU. A sketch of this cycle in code follows the list:
1. Read the code for the next instruction from the cell indicated by the program counter.
2. Decode the numerical code for the instruction into a set of commands or signals for each of the other systems.
3. Increment the program counter so it points to the next instruction.
4. Read whatever data the instruction requires from cells in memory (or perhaps from an input device). The location of this required data is typically stored within the instruction code.
5. Provide the necessary data to an ALU or register.
6. If the instruction requires an ALU or specialized hardware to complete, instruct the hardware to perform the requested operation.
7. Write the result from the ALU back to a memory location or to a register or perhaps an output device.
8. Jump back to step (1).
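
To make these steps concrete, here is a minimal sketch of the cycle in Python. The toy machine, its instruction set (LOAD, ADD, STORE, JNZ, HALT), and its memory layout are invented purely for illustration; no real CPU is encoded this way.

# A toy machine that follows the numbered steps above. The opcodes
# and memory layout are hypothetical, for illustration only.
def run(memory):
    pc = 0                        # program counter
    acc = 0                       # a single register (accumulator)
    while True:
        op, arg = memory[pc]      # step 1: fetch the instruction at pc
        pc += 1                   # step 3: increment the program counter
        # step 2: the if/elif chain decodes the opcode
        if op == "LOAD":          # steps 4-5: read data into the register
            acc = memory[arg]
        elif op == "ADD":         # step 6: an ALU operation
            acc = acc + memory[arg]
        elif op == "STORE":       # step 7: write the result back to memory
            memory[arg] = acc
        elif op == "JNZ":         # a conditional jump (discussed below)
            if acc != 0:
                pc = arg
        elif op == "HALT":
            return memory
        # step 8: fall through and fetch the next instruction

# Add the numbers in cells 10 and 11 and store the sum in cell 12.
memory = {0: ("LOAD", 10), 1: ("ADD", 11), 2: ("STORE", 12),
          3: ("HALT", 0), 10: 2, 11: 3, 12: 0}
print(run(memory)[12])            # prints 5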
Since the program counter is (conceptually) just another set of memory cells, it can be changed by calculations done in the ALU. Adding 100 to the program counter would cause the next instruction to be read from a place 100 locations further down the program. Instructions that modify the program counter are often known as "jumps" and allow for loops (instructions that are repeated by the computer) and often conditional instruction execution (both examples of control flow).
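
The JNZ ("jump if not zero") opcode in the toy machine above is exactly such an instruction: when the accumulator is non-zero it overwrites the program counter instead of letting it advance, so the same stretch of memory executes again. Using the same hypothetical encoding, this three-instruction loop counts a cell down to zero:

# Countdown loop: the ADD at address 1 runs three times; once the
# accumulator reaches zero, JNZ falls through and the machine halts.
memory = {0: ("LOAD", 10), 1: ("ADD", 11), 2: ("JNZ", 1),
          3: ("HALT", 0), 10: 3, 11: -1}
run(memory)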
It is noticeable that the sequence of operations that the control unit goes through to process an instruction is in itself like a short computer program - and indeed, in some more complex CPU designs, there is another yet smaller computer called a microsequencer that runs a microcode program that causes all of these events to happen.

HISTORY OF THE COMPUTER

A HISTORY OF THE COMPUTER: PREHISTORY

The Antikythera mechanism, used for registering and predicting the motion of the stars and planets, is dated to the first century B.C. It was discovered off the coast of Greece in 1901.
Arabic numerals are introduced to Europe in the eighth and ninth centuries A.D. Roman numerals remain in use in some parts of Europe until the seventeenth century. The Arabic system introduces the concepts of zero and fixed places for tens, hundreds, thousands, etc., and greatly simplifies mathematical calculations.
John Napier, Baron of Merchiston, Scotland, invents logarithms in 1614. Logarithms allow multiplication and division to be reduced to addition and subtraction.
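In modern notation, the identities behind this are log(xy) = log x + log y and log(x/y) = log x - log y. For example, 1024 x 32 can be computed by adding log2(1024) = 10 and log2(32) = 5, then taking the antilogarithm: 2^15 = 32,768.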
Wilhelm Schickard builds the first mechanical calculator in 1623. It can work with six digits, and carries digits across columns. It works, but never makes it beyond the prototype stage. Schickard is a professor at the University of Tübingen, Germany.
Blaise Pascal builds a mechanical calculator in 1642. It has the capacity for eight digits, but has trouble carrying and its gears tend to jam.
Joseph-Marie Jacquard invents an automatic loom controlled by punch cards.
Charles Babbage conceives of a "Difference Engine" in 1820 or 1821. It is a massive steam-powered mechanical calculator designed to print astronomical tables. He attempts to build it over the course of the next 20 years, only to have the project cancelled by the British government in 1842. Babbage's next idea is the Analytical Engine - a mechanical computer that can solve any mathematical problem. It uses punch cards similar to those used by the Jacquard loom and can perform simple conditional operations.
Augusta Ada Byron, the countess of Lovelace, meets Babbage in 1833. She describes the Analytical Engine as weaving "algebraic patterns just as the Jacquard loom weaves flowers and leaves." Her published analysis of the Analytical Engine is our best record of its programming potential. In it she outlines the fundamentals of computer programming, including data analysis, looping and memory addressing.

A HISTORY OF THE COMPUTER: ELECTRONICS

Konrad Zuse, a German engineer, completes the first general purpose programmable calculator in 1941. He pioneers the use of binary math and Boolean logic in electronic calculation.
Colossus, a British computer used for code-breaking, is operational by December 1943.
ENIAC, the Electronic Numerical Integrator and Computer, is developed by the Ballistics Research Laboratory in Maryland to assist in the preparation of firing tables for artillery. It is built at the University of Pennsylvania's Moore School of Electrical Engineering and completed in November 1945.
Bell Telephone Laboratories develops the transistor in 1947.
UNIVAC, the Universal Automatic Computer, is developed in 1951. It can store 12,000 digits in random-access mercury delay lines.
EDVAC, the Electronic Discrete Variable Automatic Computer, is completed under contract for the Ordnance Department in 1952.
In 1952 G.W. Dummer, a radar expert from the British Royal Radar Establishment, proposes that electronic equipment be manufactured as a solid block with no connecting wires. The prototype he builds doesn't work and he receives little support for his research.
Texas Instruments and Fairchild Semiconductor both announce the integrated circuit in 1959.
The IBM 360 is introduced in April 1964 and quickly becomes the standard institutional mainframe computer. By the mid-80s the 360 and its descendants will have generated more than $100 billion in revenue for IBM.

A HISTORY OF THE COMPUTER: MINI

Ivan Sutherland demonstrates a program called Sketchpad on a TX-2 mainframe at MIT's Lincoln Labs in 1962. It allows him to make engineering drawings with a light pen.
A typical minicomputer costs about $20,000.
1965: An IC that cost $1000 in 1959 now costs less than $10. Gordon Moore predicts that the number of components in an IC will double every year. This is known as Moore's Law.
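Written as a formula, the 1965 prediction says a chip holds roughly N(t) = N0 x 2^t components t years from a baseline count of N0; Moore later revised the doubling period to about two years.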
In 1968 Doug Engelbart demonstrates a word processor, an early hypertext system, and a collaborative application - three now-common computer applications.
Gordon Moore and Robert Noyce found Intel in 1968.
Xerox creates its Palo Alto Research Center - Xerox PARC - in 1969. Its mission is to explore the "architecture of information."
Fairchild Semiconductor introduces a 256-bit RAM chip in 1970.
In late 1970 Intel introduces a 1K RAM chip. The 4004, a 4-bit microprocessor, follows in late 1971, and the 8-bit 8008 in 1972.

A HISTORY OF THE COMPUTER: MICRO

Bill Gates and Paul Allen form Traf-O-Data in 1971 to sell their computer traffic-analysis systems.
Steve Jobs and Steve Wozniak are building and selling "blue boxes" in Southern California in 1971.
1972: Gary Kildall writes PL/M, the first high-level programming language for the Intel microprocessor.
April 1972: Intel introduces the 8008, the first 8-bit microprocessor.
Jonathan A. Titus designs the Mark-8, "Your Personal Minicomputer," according to the July 1974 cover of Radio-Electronics.
Popular Electronics features the MITS Altair 8800 on its cover, January 1975. It is hailed as the first "personal" computer. Thousands of orders for the 8800 rescue MITS from bankruptcy.
[Photo: the Homebrew Computer Club in 1975.]
Paul Allen and Bill Gates develop BASIC for the Altair 8800. Microsoft is born.
1977: Apple is selling its Apple II for $1,195, including 16K of RAM but no monitor.
Software Arts develops the first spreadsheet program, VisiCalc, by the spring of 1979. It is released in October and is an immediate success. Copies shipped per month rise from 500 to 12,000 between 1979 and 1981.
By 1980 Apple has captured 50% of the personal computer market.
In 1980 Microsoft is approached by IBM to develop BASIC for its personal computer project. The IBM PC is released in August 1981.
The Apple Macintosh debuts in 1984. It features a simple, graphical interface, uses the 8-MHz, 32-bit Motorola 68000 CPU, and has a built-in 9-inch B/W screen.
Microsoft Windows 1.0 ships in November 1985.
Motorola announces the 68040, a 32-bit, 25-MHz microprocessor.
Microsoft's sales reach $1 billion in 1989, its first billion-dollar year.

A HISTORY OF THE COMPUTER: NETWORK

Timesharing, the concept of linking a large number of users to a single computer via remote terminals, is developed at MIT in the late 1950s and early 1960s.
1962: Paul Baran of RAND develops the idea of distributed, packet-switching networks.
ARPANET goes online in 1969.
Bob Kahn and Vint Cerf develop the basic ideas of the Internet in 1973.
In 1974 BBN opens the first public packet-switched network - Telenet.
A UUCP link between the University of North Carolina at Chapel Hill and Duke University establishes USENET in 1979. The first MUD is also developed in 1979, at the University of Essex.
TCP/IP (Transmission Control Protocol and Internet Protocol) is established as the standard for ARPANET in 1982.
1987: the number of network hosts breaks 10,000.
1989: the number of hosts breaks 100,000.
Tim Berners-Lee develops the World Wide Web. CERN releases the first Web server in 1991.
1992: the number of hosts breaks 1,000,000.
World Wide Web service traffic grows by 341,634% in 1993, the Web's third year.
The main U.S. Internet backbone traffic begins routing through commercial providers as NSFNET reverts to a research network in 1994.
The Internet 1996 World Exposition is the first World's Fair to be held on the Internet.