Thursday, October 25, 2007

Computer Trend

HISTORY OF THE COMPUTER

A HISTORY OF THE COMPUTER: PREHISTORY

The Antikythera mechanism, used for registering and predicting the motion of the stars and planets, is dated to the first century B.C. It was discovered off the coast of Greece in 1901.
Arabic numerals are introduced to Europe in the eighth and ninth centuries A.D. Roman numerals remain in use in some parts of Europe until the seventeenth century. The Arabic system introduces the concepts of zero and fixed places for tens, hundreds, thousands, etc., and greatly simplifies mathematical calculations.
John Napier, Baron of Merchiston, Scotland, invents logarithms in 1614. Logarithms allow multiplication and division to be reduced to addition and subtraction (a short numeric sketch follows this list).
Wilhelm Schickard builds the first mechanical calculator in 1623. It can work with six digits, and carries digits across columns. It works, but never makes it beyond the prototype stage. Schickard is a professor at the University of Tubingen, Germany.
Blaise Pascal builds a mechanical calculator in 1642. It has the capacity for eight digits, but has trouble carrying and its gears tend to jam.
Joseph-Marie Jacquard invents an automatic loom controlled by punch cards.
Charles Babbage conceives of a "Difference Engine" in 1820 or 1821. It is a massive steam-powered mechanical calculator designed to print astronomical tables. He attempts to build it over the course of the next 20 years, only to have the project cancelled by the British government in 1842. Babbage's next idea is the Analytical Engine - a mechanical computer that can solve any mathematical problem. It uses punch-cards similar to those used by the Jacquard loom and can perform simple conditional operations.
Augusta Ada Byron, the Countess of Lovelace, meets Babbage in 1833. She describes the Analytical Engine as weaving "algebraic patterns just as the Jacquard loom weaves flowers and leaves." Her published analysis of the Analytical Engine is our best record of its programming potential. In it she outlines the fundamentals of computer programming, including data analysis, looping, and memory addressing.
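
To make Napier's point concrete, here is a minimal Python sketch of how logarithms turn a multiplication into an addition. The sample values 37 and 52 are arbitrary choices for illustration, not figures from the text.

```python
import math

# Napier's insight: log(a * b) == log(a) + log(b), so a product can be
# found by adding two logarithms and converting back with an anti-log
# (math.exp plays the role of the anti-log table here).
a, b = 37.0, 52.0

direct = a * b
via_logs = math.exp(math.log(a) + math.log(b))

print(direct)    # 1924.0
print(via_logs)  # 1924.0, up to floating-point rounding
```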

A HISTORY OF THE COMPUTER: ELECTRONICS

Konrad Zuse, a German engineer, completes the first general-purpose programmable calculator in 1941. He pioneers the use of binary arithmetic and Boolean logic in automatic calculation.
Colossus, a British computer used for code-breaking, is operational by December 1943. ENIAC, the Electronic Numerical Integrator and Computer, is developed for the Ballistics Research Laboratory in Maryland to assist in the preparation of firing tables for artillery. It is built at the University of Pennsylvania's Moore School of Electrical Engineering and completed in November 1945.
Bell Telephone Laboratories develops the transistor in 1947.
UNIVAC, the Universal Automatic Computer, is developed in 1951. It can store 12,000 digits in random-access mercury delay lines.
EDVAC, the Electronic Discrete Variable Automatic Computer, is completed under contract for the Ordnance Department in 1952.
In 1952 G.W. Dummer, a radar expert from the British Royal Radar Establishment, proposes that electronic equipment be manufactured as a solid block with no connecting wires. The prototype he builds doesn't work and he receives little support for his research.
Texas Instruments and Fairchild Semiconductor both announce the integrated circuit in 1959.
The IBM 360 is introduced in April 1964 and quickly becomes the standard institutional mainframe computer. By the mid-1980s the 360 and its descendants will have generated more than $100 billion in revenue for IBM.

A HISTORY OF THE COMPUTER: MINI

Ivan Sutherland demonstrates a program called Sketchpad on a TX-2 mainframe at MIT's Lincoln Labs in 1962. It allows him to make engineering drawings with a light pen.
In the mid-1960s a typical minicomputer costs about $20,000.
1965: An IC that cost $1,000 in 1959 now costs less than $10. Gordon Moore predicts that the number of components in an IC will double every year, a prediction that becomes known as Moore's Law (a short sketch of the doubling arithmetic follows this list).
In 1968 Doug Engelbart demonstrates a word processor, an early hypertext system, and a collaborative application: three now-common computer applications.
Gordon Moore and Robert Noyce found Intel in 1968.
Xerox creates its Palo Alto Research Center - Xerox PARC - in 1969. Its mission is to explore the "architecture of information."
Fairchild Semiconductor introduces a 256-bit RAM chip in 1970.
In late 1970 Intel introduces a 1K RAM chip; the 4004, a 4-bit microprocessor, follows in late 1971, and the 8-bit 8008 arrives in 1972.
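
As a rough illustration of the doubling arithmetic behind Moore's prediction, here is a minimal Python sketch. The starting count of 64 components and the ten-year horizon are illustrative assumptions, not figures from the text.

```python
# Moore's 1965 prediction: the number of components on an IC doubles
# roughly every year (he later revised the period to about two years).
# The starting count and ten-year horizon are assumed for illustration.
start_year = 1965
components = 64

for year in range(start_year, start_year + 11):
    print(year, components)
    components *= 2  # one doubling per year under the original prediction
```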

A HISTORY OF THE COMPUTER: MICRO

Bill Gates and Paul Allen form Traf-O-Data in 1971 to sell their computer traffic-analysis systems.
1972: Gary Kildall writes PL/M, the first high-level programming language for the Intel microprocessor.
Steve Jobs and Steve Wozniak are building and selling "blue boxes" in Southern California in 1971.
April 1972: Intel introduces the 8008, the first 8-bit microprocessor.
Jonathan A. Titus designs the Mark-8, "Your Personal Minicomputer," according to the July 1974 cover of Radio-Electronics.
Popular Electronics features the MITS Altair 8800 on its cover, January 1975. It is hailed as the first "personal" computer. Thousands of orders for the 8800 rescue MITS from bankruptcy.
The Homebrew Computer Club begins meeting in 1975.
Paul Allen and Bill Gates develop BASIC for the Altair 8800. Microsoft is born.
1977: Apple is selling its Apple II for $1,195, including 16K of RAM but no monitor.
Software Arts develops the first spreadsheet program, VisiCalc, by the spring of 1979. It is released in October and is an immediate success. Copies shipped per month rise from 500 to 12,000 between 1979 and 1981.
By 1980 Apple has captured 50% of the personal computer market.
In 1980 Microsoft is approached by IBM to develop BASIC for its personal computer project. The IBM PC is released in August, 1981.
The Apple Macintosh debuts in 1984. It features a simple, graphical interface, uses the 8-MHz, 32-bit Motorola 68000 CPU, and has a built-in 9-inch B/W screen.
Microsoft Windows 1.0 ships in November, 1985.
Motorola announces the 68040, a 32-bit, 25-MHz microprocessor.
Microsoft's annual sales reach $1 billion for the first time in 1989.

A HISTORY OF THE COMPUTER: NETWORK

Timesharing, the concept of linking a large number of users to a single computer via remote terminals, is developed at MIT in the late 1950s and early 1960s.
1962: Paul Baran of RAND develops the idea of distributed, packet-switching networks.
ARPANET goes online in 1969.
Bob Kahn and Vint Cerf develop the basic ideas of the Internet in 1973.
In 1974 BBN opens the first public packet-switched network - Telenet.
A UUCP link between the University of North Carolina at Chapel Hill and Duke University establishes USENET in 1979. The first MUD is also developed in 1979, at the University of Essex.
TCP/IP (Transmission Control Protocol and Internet Protocol) is established as the standard for ARPANET in 1982.
1987: the number of network hosts breaks 10,000.
1989: the number of hosts breaks 100,000.
Tim Berners-Lee develops the World Wide Web. CERN releases the first Web server in 1991.
1992: the number of hosts breaks 1,000,000.
World Wide Web service traffic grows at an annual rate of 341,634% in the Web's third year, 1993.
The main U.S. Internet backbone traffic begins routing through commercial providers as NSFNET reverts to a research network in 1994.
The Internet 1996 World Exposition is the first World's Fair to be held on the Internet.
