Originally, the term "computer" referred to a person who performed numerical calculations, often with the aid of a mechanical calculating device or analog computer. Examples of these early devices, the ancestors of the computer, included the abacus and the Antikythera mechanism, an ancient Greek device for calculating the movements of planets, dating from about 87 BC.[1] The abacus in particular is often cited as an early computer, since it served as a calculating aid. The end of the Middle Ages saw a reinvigoration of European mathematics and engineering, and Wilhelm Schickard was the first of a number of European engineers to construct a mechanical calculator, completing his device in 1623.[2]

In 1801, Joseph Marie Jacquard made an improvement to existing loom designs, using a series of punched paper cards as a program to weave intricate patterns. The resulting Jacquard loom is not considered a true computer, but it was an important step in the development of modern digital computers. Charles Babbage was the first to conceptualize and design a fully programmable computer, his Analytical Engine of 1837, but due to a combination of the limits of the technology of the time, limited finances, and an inability to resist tinkering with his design, the device was never actually constructed in his lifetime. A number of technologies that would later prove useful in computing, such as the punched card and the vacuum tube, had appeared by the end of the 19th century, and large-scale automated data processing using punched cards was performed by tabulating machines designed by Herman Hollerith.
During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated, special-purpose analog computers, which used a direct mechanical or electrical model of the problem as a basis for computation. These became increasingly rare after the development of the programmable digital computer.
A succession of steadily more powerful and flexible computing devices were constructed in the 1930s and 1940s, gradually adding the key features of modern computers, such as the use of digital electronics (largely invented by Claude Shannon in 1937)[3] and more flexible programmability. Defining one point along this road as "the first digital electronic computer" is exceedingly difficult. Notable achievements include the Atanasoff-Berry Computer (1937), a special-purpose machine that used valve-driven (vacuum tube) computation, binary numbers, and regenerative memory; the secret British Colossus computer (1944), which had limited programmability but demonstrated that a device using thousands of valves could be made reliable and reprogrammed electronically; the Harvard Mark I, a large-scale electromechanical computer with limited programmability (1944); the decimal-based American ENIAC (1946) — which was the first general purpose electronic computer, but originally had an inflexible architecture that meant reprogramming it essentially required it to be rewired; and Konrad Zuse's Z machines, with the electromechanical Z3 (1941) being the first working machine featuring automatic binary arithmetic and feasible programmability.
The team that developed ENIAC, recognizing its flaws, came up with a far more flexible and elegant design, which has become known as the von Neumann architecture (or "stored-program architecture"): the program itself is held in the same memory as the data it operates on. This stored-program architecture became the basis for virtually all modern computers. A number of projects to develop computers based on it commenced in the mid-to-late 1940s; the first of these were completed in Britain. The first to be up and running was the Manchester Small-Scale Experimental Machine (1948), but EDSAC (1949) was perhaps the first practical, full-scale implementation.
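The stored-program idea can be sketched in a few lines of code. This is a hypothetical toy machine invented for illustration, not the instruction set of any historical computer: instructions and data occupy the same memory, so a program is loaded (and could even be modified) just like any other data, rather than being "wired in" as on the original ENIAC.

```python
def run(memory):
    """Fetch-decode-execute loop over a shared instruction/data memory."""
    acc = 0   # accumulator
    pc = 0    # program counter
    while True:
        op, arg = memory[pc]          # fetch the instruction at pc
        pc += 1
        if op == "LOAD":              # acc = memory[arg]
            acc = memory[arg]
        elif op == "ADD":             # acc += memory[arg]
            acc += memory[arg]
        elif op == "STORE":           # memory[arg] = acc
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Program and data share the same memory cells 0..5.
memory = [
    ("LOAD", 4),     # cell 0: load the value in cell 4
    ("ADD", 5),      # cell 1: add the value in cell 5
    ("STORE", 4),    # cell 2: store the sum back into cell 4
    ("HALT", None),  # cell 3: stop
    2,               # cell 4: data
    3,               # cell 5: data
]
print(run(memory)[4])  # → 5
```

Because the program lives in ordinary memory, rewiring is replaced by simply loading a different list of instructions — the key flexibility the ENIAC team was after.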
Valve (tube) driven computer designs were in use throughout the 1950s but were eventually replaced in the 1960s by transistor-based computers, which were smaller, faster, cheaper, and much more reliable, allowing them to be produced commercially. By the 1970s, the adoption of integrated-circuit technology had lowered costs to the point where individuals could own a personal computer.
answer #2 · answered by patni_ankit · 2006-06-14 06:24:39
So many places to start. Try Charles Babbage or Herman Hollerith. http://en.wikipedia.org/wiki/Hollerith_card
I Corinthians 13:8a, Love never fails!!!
answer #3 · answered by ? · 2006-06-14 04:23:32
Go to this site and it'll have a list of sites you can visit: http://www.google.ch/search?hl=fr&q=history+of+computers&meta=
answer #4 · answered by Brian Reed · 2006-06-14 04:18:39
http://inventors.about.com/library/blcoindex.htm
http://www.nuhsd.k12.ca.us/brhs/faculty/Stephan/history/
answer #8 · answered by Amy · 2006-06-14 04:19:49