Tuesday, January 22, 2013

A Very Brief History of Computer Science

Before 1900

People have been using mechanical devices to aid calculation for thousands of years. For example, the abacus probably existed in Babylonia (present-day Iraq) about 3000 B.C.E. The ancient Greeks developed some very sophisticated analog computers. In 1901, an ancient Greek shipwreck was discovered off the island of Antikythera. Inside was a salt-encrusted device (now called the Antikythera mechanism) that consisted of corroded metal gears and pointers. When this c. 80 B.C.E. device was reconstructed, it turned out to be a mechanism for predicting the motions of the stars and planets. John Napier (1550-1617), the Scottish inventor of logarithms, invented Napier's rods (sometimes called "Napier's bones") c. 1610 to simplify the task of multiplication.
In 1641 the French mathematician and philosopher Blaise Pascal (1623-1662) built a mechanical adding machine. Similar work was done by Gottfried Wilhelm Leibniz (1646-1716). Leibniz also advocated use of the binary system for doing calculations.
Recently it was discovered that Wilhelm Schickard (1592-1635), a graduate of the University of Tübingen (Germany), constructed such a device in 1623-4, before both Pascal and Leibniz. A brief description of the device is contained in two letters to Johannes Kepler. Unfortunately, at least one copy of the machine burned up in a fire, and Schickard himself died of bubonic plague in 1635, during the Thirty Years' War.
Joseph-Marie Jacquard (1752-1834) invented a loom that could weave complicated patterns described by holes in punched cards. Charles Babbage (1791-1871) worked on two mechanical devices: the Difference Engine and the far more ambitious Analytical Engine (a precursor of the modern digital computer), but neither worked satisfactorily. (Babbage was a bit of an eccentric -- one biographer calls him an "irascible genius" -- and was probably the model for Daniel Doyce in Charles Dickens' novel, Little Dorrit. A little-known fact about Babbage is that he invented the science of dendrochronology -- tree-ring dating -- but never pursued his invention. In his later years, Babbage devoted much of his time to the persecution of street musicians (organ-grinders).) The Difference Engine can be viewed nowadays in the Science Museum in London, England.
One of Babbage's friends, Augusta Ada Byron, Countess of Lovelace (1815-1852), is sometimes called the "first programmer" because of a report she wrote on Babbage's machine. (The programming language Ada was named for her.)
William Stanley Jevons (1835-1882), a British economist and logician, built a machine in 1869 to solve logic problems. It was "the first such machine with sufficient power to solve a complicated problem faster than the problem could be solved without the machine's aid." (Gardner) It is now in the Oxford Museum of the History of Science.
Herman Hollerith (1860-1929) invented the modern punched card for use in a machine he designed to help tabulate the 1890 U.S. census.

1900 - 1939: The Rise of Mathematics

Work on calculating machines continued. Some special-purpose calculating machines were built. For example, in 1919, E. O. Carissan (1880-1925), a lieutenant in the French infantry, designed and had built a marvelous mechanical device for factoring integers and testing them for primality. The Spaniard Leonardo Torres y Quevedo (1852-1936) built some electromechanical calculating devices, including one that played simple chess endgames. In 1928, the German mathematician David Hilbert (1862-1943) addressed the International Congress of Mathematicians. He posed three questions: (1) Is mathematics complete, i.e., can every mathematical statement be either proved or disproved? (2) Is mathematics consistent, that is, is it true that statements such as "0 = 1" cannot be proved by valid methods? (3) Is mathematics decidable, that is, is there a mechanical method that can be applied to any mathematical assertion and (at least in principle) will eventually tell whether that assertion is true? This last question was called the Entscheidungsproblem.
In 1931, Kurt Gödel (1906-1978) answered two of Hilbert's questions. He showed that every sufficiently powerful formal system is either inconsistent or incomplete, and that if an axiom system is consistent, its consistency cannot be proved within the system itself. The third question remained open, with 'provable' substituted for 'true', since Gödel had shown that truth and provability do not coincide.
In 1936, Alan Turing (1912-1954) provided a solution to Hilbert's Entscheidungsproblem by constructing a formal model of a computer -- the Turing machine -- and showing that there were problems such a machine could not solve. One such problem is the so-called "halting problem": given a Pascal program, does it halt on all inputs?
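
Turing's proof is a diagonalization argument that still fits in a few lines of modern code. Here is a minimal sketch in Python (the names halts, make_paradox, and paradox are illustrative, not from Turing's paper): assume a perfect halting-checker exists, then build a program it must misjudge.

    # Sketch of Turing's diagonalization. Suppose halts(prog, arg) always
    # correctly reports whether prog(arg) halts; we derive a contradiction.
    def make_paradox(halts):
        """Given a claimed halting-checker, build a program that defeats it."""
        def paradox(prog):
            if halts(prog, prog):    # checker says prog(prog) halts...
                while True:          # ...so do the opposite: loop forever
                    pass
            else:
                return               # checker says it loops, so halt at once
        return paradox

    # For any candidate checker, paradox(paradox) halts exactly when the
    # checker says it does not; hence no correct halts() can exist.
    def naive_halts(prog, arg):      # a deliberately wrong stand-in checker
        return True
    paradox = make_paradox(naive_halts)   # paradox(paradox) would loop forever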

1940's: Wartime brings the birth of the electronic digital computer

The calculations required for ballistics during World War II spurred the development of the general-purpose electronic digital computer. At Harvard, Howard H. Aiken (1900-1973) built the Mark I electromechanical computer in 1944, with the assistance of IBM. Military code-breaking also led to computational projects. Alan Turing was involved in breaking the codes of the German Enigma machine at Bletchley Park in England. The British built a computing device, the Colossus, to assist with code-breaking.
At Iowa State University in 1939, John Vincent Atanasoff (1904-1995) and Clifford Berry designed and built an electronic computer for solving systems of linear equations, but it never worked properly.
Atanasoff discussed his invention with John William Mauchly (1907-1980), who later, with J. Presper Eckert, Jr. (1919-1995), designed and built the ENIAC, a general-purpose electronic computer originally intended for artillery calculations. Exactly what ideas Mauchly got from Atanasoff is not completely clear, and whether Atanasoff or Mauchly and Eckert deserve credit as the originators of the electronic digital computer was the subject of legal battles and remains a topic of historical debate. The ENIAC was built at the Moore School at the University of Pennsylvania, and was finished in 1946.
In 1944, Mauchly, Eckert, and John von Neumann (1903-1957) were already at work designing a stored-program electronic computer, the EDVAC. Von Neumann's report, "First Draft of a Report on the EDVAC", was very influential and contains many of the ideas still used in most modern digital computers, including a mergesort routine. Eckert and Mauchly went on to build UNIVAC.
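
The mergesort idea in von Neumann's report is essentially the algorithm still taught today: split the data in half, sort each half recursively, and merge the results. A minimal reconstruction in Python (the idea, not von Neumann's actual EDVAC code):

    def merge_sort(items):
        """Sort a list by recursively splitting it and merging sorted halves."""
        if len(items) <= 1:
            return items
        mid = len(items) // 2
        left, right = merge_sort(items[:mid]), merge_sort(items[mid:])
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):   # merge the sorted halves
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        return merged + left[i:] + right[j:]

    # Example: merge_sort([5, 2, 4, 6, 1, 3]) returns [1, 2, 3, 4, 5, 6]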
Meanwhile, in Germany, Konrad Zuse (1910-1995) built the first operational, general-purpose, program-controlled calculator, the Z3, in 1941.
In 1945, Vannevar Bush published a surprisingly prescient article, "As We May Think", in the Atlantic Monthly about the ways information processing would affect the society of the future.
Maurice Wilkes (1913-2010), working in Cambridge, England, built the EDSAC, a computer based on the EDVAC. F. C. Williams (1911-1977) and others at Manchester University built the Manchester Mark I, one version of which was working as early as June 1948. This machine is sometimes called the first stored-program digital computer.
The invention of the transistor in 1947 by John Bardeen (1908-1991), Walter Brattain (1902-1987), and William Shockley (1910-1989) transformed the computer and made possible the microprocessor revolution. For this discovery they won the 1956 Nobel Prize in physics. (Shockley later became notorious for his racist views.)
Jay Forrester (b. 1918) invented magnetic core memory c. 1949.

1950's

Grace Murray Hopper (1906-1992) invented the notion of a compiler, at Remington Rand, in 1951. Earlier, in 1947, Hopper found the first computer "bug" -- a real one -- a moth that had gotten into the Harvard Mark II. (Actually, the use of "bug" to mean defect goes back to at least 1889.) John Backus and others developed the first FORTRAN compiler in April 1957. LISP, a list-processing language for artificial intelligence programming, was invented by John McCarthy about 1958. Alan Perlis, John Backus, Peter Naur, and others developed Algol.
In hardware, Jack Kilby (Texas Instruments) and Robert Noyce (Fairchild Semiconductor) independently invented the integrated circuit, Kilby in 1958 and Noyce in 1959.
Edsger Dijkstra invented an efficient algorithm for shortest paths in graphs as a demonstration of the ARMAC computer in 1956. He also invented an efficient algorithm for the minimum spanning tree in order to minimize the wiring needed for the X1 computer. (Dijkstra was famous for his caustic, opinionated memos, including his scathing assessments of several programming languages.)
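
The shortest-path algorithm that bears Dijkstra's name is still the textbook method. A minimal sketch in Python follows; the priority queue (heap) used here is a later refinement rather than the simple scan Dijkstra used for the ARMAC demonstration:

    import heapq

    def dijkstra(graph, source):
        """Shortest distances from source; graph maps each node to a list of
        (neighbor, weight) pairs, with all weights nonnegative."""
        dist = {source: 0}
        heap = [(0, source)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist.get(u, float("inf")):
                continue                      # skip stale heap entries
            for v, w in graph[u]:
                if d + w < dist.get(v, float("inf")):
                    dist[v] = d + w           # found a shorter route to v
                    heapq.heappush(heap, (dist[v], v))
        return dist

    # Example: dijkstra({'a': [('b', 1), ('c', 4)], 'b': [('c', 2)], 'c': []}, 'a')
    # returns {'a': 0, 'b': 1, 'c': 3}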
In a famous paper that appeared in the journal Mind in 1950, Alan Turing introduced the Turing Test, one of the first efforts in the field of artificial intelligence. He proposed a definition of "thinking" or "consciousness" using a game: a tester would have to decide, on the basis of written conversation, whether the entity in the next room responding to the tester's queries was a human or a computer. If this distinction could not be made, then it could be fairly said that the computer was "thinking".
In 1952, Alan Turing was arrested for "gross indecency" after a burglary led to the discovery of his affair with Arnold Murray. Overt homosexuality was taboo in 1950's England, and Turing was forced to take estrogen "treatments" which rendered him impotent and caused him to grow breasts. On June 7, 1954, despondent over his situation, Turing committed suicide by eating an apple laced with cyanide.

1960's

In the 1960's, computer science came into its own as a discipline. In fact, the term was coined by George Forsythe, a numerical analyst. The first computer science department was formed at Purdue University in 1962. The first person to receive a Ph.D. from a computer science department was Richard Wexelblat, at the University of Pennsylvania, in December 1965. Operating systems saw major advances. Fred Brooks at IBM designed System/360, a line of different computers with the same architecture and instruction set, from small machines to top-of-the-line models. Edsger Dijkstra at Eindhoven designed the THE multiprogramming system.
At the end of the decade, ARPAnet, a precursor to today's Internet, began to be constructed.
Many new programming languages were invented, such as BASIC (developed c. 1964 by John Kemeny (1926-1992) and Thomas Kurtz (b. 1928)).
The 1960's also saw the rise of automata theory and the theory of formal languages. Big names here include Noam Chomsky and Michael Rabin. Chomsky later became well-known for his theory that language is "hard-wired" in human brains, and for his criticism of American foreign policy.
Proving correctness of programs using formal methods also began to be more important in this decade. The work of Tony Hoare played an important role. Hoare also invented Quicksort.
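
Quicksort's divide-and-conquer idea is easy to convey in code. A compact sketch in Python; this functional version builds new lists for clarity, whereas Hoare's original formulation partitioned the array in place:

    def quicksort(items):
        """Sort by picking a pivot and recursively sorting the two sides."""
        if len(items) <= 1:
            return items
        pivot = items[len(items) // 2]
        less = [x for x in items if x < pivot]
        equal = [x for x in items if x == pivot]
        greater = [x for x in items if x > pivot]
        return quicksort(less) + equal + quicksort(greater)

    # Example: quicksort([3, 1, 4, 1, 5, 9, 2, 6]) returns [1, 1, 2, 3, 4, 5, 6, 9]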
Douglas C. Engelbart invented the computer mouse c. 1968, at SRI.
Ted Hoff (b. 1937) and Federico Faggin at Intel designed the first microprocessor (computer on a chip) in 1969-1971.
A rigorous mathematical basis for the analysis of algorithms began with the work of Donald Knuth (b. 1938), author of the multi-volume treatise The Art of Computer Programming.

1970's

The theory of databases saw major advances with the work of Edgar F. Codd on relational databases. Codd won the Turing Award in 1981. Unix, a very influential operating system, was developed at Bell Laboratories by Ken Thompson (b. 1943) and Dennis Ritchie (1941-2011). Ritchie also developed C, an influential programming language, which he and Brian Kernighan described in a classic book.
Other new programming languages, such as Pascal (invented by Niklaus Wirth) and Ada (developed by a team led by Jean Ichbiah), arose.
Work on the first RISC architecture was begun by John Cocke in 1975, at IBM's Thomas J. Watson Research Center. Similar projects started at Berkeley and Stanford around this time.
The 1970's also saw the rise of the supercomputer. Seymour Cray (1925-1996) designed the CRAY-1, which was first shipped in March 1976. It could perform 160 million operations per second. The Cray X-MP came out in 1982. Cray Research was later taken over by Silicon Graphics.
There were also major advances in algorithms and computational complexity. In 1971, Steve Cook published his seminal paper on NP-completeness, and shortly thereafter, Richard Karp showed that many natural combinatorial problems were NP-complete. In 1976, Whit Diffie and Martin Hellman published a paper that introduced the theory of public-key cryptography, and a public-key cryptosystem known as RSA was soon invented by Ronald Rivest, Adi Shamir, and Leonard Adleman.
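
The arithmetic behind RSA is compact enough to show directly. A toy walkthrough in Python (3.8+ for the modular inverse via pow), using the standard textbook primes 61 and 53; real keys use primes hundreds of digits long:

    # Toy RSA with tiny primes -- illustrative only, far too small to be secure.
    p, q = 61, 53                 # two secret primes
    n = p * q                     # public modulus: 3233
    phi = (p - 1) * (q - 1)       # Euler's totient: 3120
    e = 17                        # public exponent, coprime to phi
    d = pow(e, -1, phi)           # private exponent: e*d = 1 (mod phi) -> 2753

    message = 65
    ciphertext = pow(message, e, n)     # encrypt: m^e mod n -> 2790
    recovered = pow(ciphertext, d, n)   # decrypt: c^d mod n -> 65
    assert recovered == message
    # Security rests on the difficulty of recovering p and q from n.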
In 1979, three graduate students in North Carolina developed a distributed news server which eventually became Usenet.

1980's

This decade saw the rise of the personal computer, thanks to Steve Wozniak and Steve Jobs, founders of Apple Computer. The first computer viruses were developed c. 1981. The term was coined by Leonard Adleman, now at the University of Southern California.
In 1981, the first truly successful portable computer, the Osborne 1, was marketed. In 1984, Apple first marketed the Macintosh computer.
In 1987, the US National Science Foundation started NSFnet, precursor to part of today's Internet.


1990's and Beyond

Parallel computers continue to be developed. Biological computing, with the work of Len Adleman on doing computations via DNA, has great promise. The Human Genome Project, completed in 2003, sequenced essentially all the DNA of a single human genome.
Quantum computing gets a boost with the discovery by Peter Shor that integer factorization can be performed efficiently on a (theoretical) quantum computer.
The "Information Superhighway" links more and more computers worldwide.
Computers get smaller and smaller; nanotechnology is born.




