Check out the link below for the history of computing.
http://en.wikipedia.org/wiki/Computer#History_of_computing
Best of luck!!
2006-10-10 00:09:15 · answer #1 · answered by Asher 3 · 1⤊ 0⤋
Pentium I is history, Pentium IV is now, and who knows what will come next!?
2006-10-10 07:15:40 · answer #2 · answered by haringrobert 3 · 0⤊ 0⤋
The computers that you see and use today did not come from any single inventor in one go. Rather, it took centuries of rigorous research to reach the present stage. And scientists are still working hard to make them better and better. But that is a different story.
First, let us see when the very idea of computing with a machine or device, as opposed to conventional manual calculation, first took shape.
Though experiments were going on even earlier, the first successful such device dates back to the 17th century. Edmund Gunter, an English mathematician, is credited with its development in 1620, yet it was too primitive to be recognized even as the forefather of computers. The first mechanical digital calculating machine was built in 1642 by the French scientist-philosopher Blaise Pascal. Since then, the ideas and inventions of many mathematicians, scientists, and engineers paved the way for the development of the modern computer in the years that followed.
But the world had to wait another couple of centuries for the next milestone. It was the English mathematician and inventor Charles Babbage who worked wonders with his designs during the 1830s. In fact, he was the first to work on a machine that could compute and store the values of large mathematical tables. The most important idea behind this machine was that numbers and the operations on them could be encoded and stored, an idea that later machines would realize electrically using the very simple binary system of only two symbols.
This was a big leap towards the basics on which computers work today. However, there was still a long way to go, and compared to present-day computers, Babbage's machines could be regarded as little more than high-speed counting devices, for they could work on numbers alone.
The Boolean algebra developed in the 19th century removed this numbers-only limitation. This branch of mathematics, invented by George Boole, related binary digits to our language of logic: the value 0 corresponds to a false statement and 1 to a true one. The British mathematician Alan Turing made further progress with his theoretical model of computation, and the technological advances of the 1930s did much to push computing devices forward.
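As a small aside (my own illustration, not part of the history itself), here is a rough sketch of Boole's correspondence in modern terms: the digits 0 and 1 behave exactly like false and true under AND, OR, and NOT.

```python
# Sketch: Boole's correspondence between binary digits and truth values.
# In Python, True and False are in fact the integers 1 and 0 underneath.

for a in (0, 1):
    for b in (0, 1):
        conjunction = a & b   # AND: 1 only when both inputs are 1 ("true")
        disjunction = a | b   # OR: 1 when at least one input is 1
        negation = 1 - a      # NOT: flips 0 and 1
        print(f"a={a} b={b}  AND={conjunction}  OR={disjunction}  NOT a={negation}")

print(int(True), int(False))  # prints "1 0": the same correspondence built into the language
```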
But the direct forerunners of present-day computer systems evolved in the 1940s. The Harvard Mark I, designed by Howard Aiken and built jointly by International Business Machines (IBM) and Harvard University in 1944, was one of the world's first large-scale digital computers and made use of electro-mechanical devices.
But the real breakthrough was the concept of the stored-program computer, introduced when the Hungarian-American mathematician John von Neumann described the Electronic Discrete Variable Automatic Computer (EDVAC). The idea that instructions as well as data should be stored in the computer's memory set this design apart from its counting-device forerunners. Since then, computers have become steadily faster and more powerful.
Still, compared with present-day personal computers, these machines had the simplest of designs: a single CPU performing operations such as addition and multiplication, carried out in a prescribed sequence of instructions, called a program, to produce the desired result.
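To make the stored-program idea a bit more concrete, here is a minimal toy sketch (my own invention for illustration; the LOAD/ADD/MUL/HALT instruction set is not any historical machine's). Instructions and data sit in the same memory, and one loop fetches and executes them in order, which is exactly the program-as-data idea described above.

```python
# A toy stored-program machine: instructions and data share one memory list.

def run(memory):
    acc = 0       # accumulator register
    pc = 0        # program counter: index of the next instruction in memory
    while True:
        op, arg = memory[pc]          # fetch
        if op == "LOAD":              # decode and execute
            acc = memory[arg][1]      # read a data cell
        elif op == "ADD":
            acc += memory[arg][1]
        elif op == "MUL":
            acc *= memory[arg][1]
        elif op == "HALT":
            return acc
        pc += 1                       # move on to the next stored instruction

# Program and data in the same memory: cells 0-3 hold instructions, cells 4-5 hold data.
memory = [
    ("LOAD", 4),   # acc = 2
    ("ADD", 5),    # acc = 2 + 3
    ("MUL", 5),    # acc = 5 * 3
    ("HALT", 0),
    ("DATA", 2),
    ("DATA", 3),
]

print(run(memory))   # prints 15
```

The only point of the toy is that the program itself lives in the same memory cells as the numbers it works on.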
This basic design was followed, with little change, even in the more advanced computers developed later. The later versions divided the machine into memory and an arithmetic logic unit (ALU), with separate input and output sections.
In fact, the first four generations of computers followed this basic design; it was mainly the type of hardware used that distinguished one generation from the next. The first generation was based on vacuum-tube technology. This was upgraded with the arrival of transistors and printed-circuit-board technology in the second generation, and upgraded again with integrated-circuit chips, in which small chips replaced a large number of separate components. Thus the size of the computer was greatly reduced in the third generation, while it became more powerful. But the real marvel came during the 1970s with the introduction of very-large-scale integration (VLSI) in the fourth generation. Aided by this technology, a tiny microprocessor can handle millions of pieces of data.
Based on this technology, IBM introduced its famous Personal Computer. Since then IBM itself, and other makers including Apple, Sinclair, and so forth, have kept developing more and more advanced personal computers, along with bigger and more powerful machines such as mainframes and supercomputers for more complicated work.
Meanwhile, smaller versions such as laptops and even palmtops have appeared with more advanced technologies over the past couple of decades. But the advancement of hardware alone cannot take full credit for the amazing progress of computers over the past few decades. Software, the built-in logic that runs the computer the way you want, has kept developing at an equal pace. The rise of famous software makers such as Microsoft, Oracle, and Sun has helped speed up this development. The result of all this is that we can solve complex problems at lightning speed with an ever handier version of the device called the computer.
2006-10-10 07:19:13 · answer #3 · answered by Anonymous · 0⤊ 0⤋
Long, long story, and depends what you think of as a computer.
Look here ---> http://www.computerhistory.org/
2006-10-10 07:12:29 · answer #4 · answered by Anonymous · 0⤊ 0⤋