This is actually a fascinating question, so I'm going to give a long answer.
To the average user, it seems like there are a lot of similarities between a human and a computer:
they both have memory,
they both take in information and put out information,
they both need sources of power,
they both "think," sort of.
But really, these are only surface similarities. Although both you and your computer have memory, there are vast differences in how that memory works, to the point where using the same word for the two types of memory is misleading. While both you and the computer "think," how you do so is vastly different: you work in patterns, while the computer works in digital representations, which are more precise but far less efficient.
Computers are phenomenally fast calculators, but that's all. They can add, subtract, multiply, divide, compare two numbers (determining which is greater or whether they are equal), and move information around. And nothing else. I'm a programmer, trust me on this. Everything they do is built from those basic operations. Playing a video game is just basic math done incredibly fast. Displaying an e-mail is just basic math on memory addresses: moving a block of numbers (representing the letters) from one location to another, where it can be turned into pixels (complicated math, but math nonetheless), and moving the results to specific memory ports where the display adapter turns the numbers into colors on the screen. Computers seem "smart" because they're wickedly fast at processing numbers.
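To make that concrete, here's a tiny sketch in Python (my choice of language, nothing from the question itself): the letters of a message are stored as plain numbers, and "displaying" the message is really just copying those numbers from one chunk of memory to another. The screen_buffer list is a made-up stand-in for the memory a display adapter would read.

# A minimal sketch: text is just numbers to the machine, and "displaying" it
# is copying those numbers from one place to another.
message = "Howdy there!"

codes = [ord(ch) for ch in message]      # the letters as the numbers actually stored
print(codes)                             # [72, 111, 119, 100, 121, 32, ...]

screen_buffer = []                       # hypothetical stand-in for video memory
for code in codes:                       # "move a block of numbers from one location to another"
    screen_buffer.append(code)

print("".join(chr(c) for c in screen_buffer))   # Howdy there!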
You, on the other hand, are lousy with numbers. (And by 'you', of course, I mean humans. But you knew that.) The fastest human savants calculate thousands of times slower than the computer in your microwave oven.
You have advantages, though. Your brain is a phenomenal pattern matcher. Your pattern matching abilities exceed anything any computer can achieve. For example, if I scrawl the words "Howdy there!" on a piece of paper and hand it to you, you'd instantly know what it said, even if I was a wee bit sloppy with the letters or gave it to you upside-down. The letters (and even the words) are patterns you've seen before, so you can interpret them immediately. A computer would have to analyze it pixel by pixel and MIGHT be able to interpret the letters eventually. But in all probability, the computer is going to have trouble figuring out where the lines start and stop, and which line goes with which letter in the text. Character recognition is a tricky thing for a computer... and if you give it the text upside-down it'll never figure it out.
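If you want a feel for why that's hard, here's a toy Python sketch (nothing like real OCR software, just an illustration): a computer comparing pixel grids literally fails the moment a letter is a bit sloppy or upside-down, while you'd still read it without blinking.

# Toy illustration: literal pixel-by-pixel matching of a 3x3 "letter L".
LETTER_L = [
    [1, 0, 0],
    [1, 0, 0],
    [1, 1, 1],
]

sloppy = [
    [1, 0, 0],
    [1, 0, 0],
    [1, 1, 0],      # one stray pixel missing
]

upside_down = [row[::-1] for row in reversed(LETTER_L)]   # the page rotated 180 degrees

print(sloppy == LETTER_L)        # False: one bad pixel and the exact match fails
print(upside_down == LETTER_L)   # False: upside-down, it doesn't "match" at all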
Even a six-month-old baby is better at recognizing different faces than the best computer program running on the fastest computer that Homeland Security has. And trust me, they're working on improving that so they can recognize bad guys from security camera footage.
Now we get back to memory and how vastly different the two kinds are. A week from now, you'll remember that the paper said "Howdy there!" and may or may not vaguely remember whether the handwriting was neat or sloppy. That's because your memory stores the pattern. The computer, however, could reproduce the note exactly.
Studies have been done on what types of things people remember. When people are briefly shown an image of a cluttered desk, for example, and then asked to write down what was on it, they tend to remember the things that were out of place (a shoe, a tennis ball) and not the things that were expected there (a stapler, a coffee mug). That's because the powerful pattern matcher in your head evaluates the pattern (an office desk) and notes anything that doesn't match. Your memory doesn't need to store the typical desk items so much as the exceptions. It's very efficient that way: you filter out the "ordinary" and spend almost no time thinking about it, which frees up your brain for the stuff that matters.
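Here's that idea as a toy Python sketch (the "expected desk" set is just something I made up for illustration): compare what you saw against the pattern you expected and keep only the differences.

# Remember only the exceptions: diff what was seen against the expected pattern.
expected_desk = {"stapler", "coffee mug", "keyboard", "monitor", "pen"}
seen = {"stapler", "coffee mug", "keyboard", "monitor", "pen", "shoe", "tennis ball"}

worth_remembering = seen - expected_desk    # only the things that break the pattern
print(worth_remembering)                    # {'shoe', 'tennis ball'}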
And then there's intelligence. You need intelligence to take full advantage of the patterns your brain processes and analyzes. Patterns are no good if you can't draw useful information out of them.
If I dump 500 pennies on the table in front of you and ask both you and a computer (one with a sufficiently good visual sensor system and the software to use it) how many there are, the computer will figure it out long before you do. It's good at counting and repetitive stuff. But if I then take one penny away and ask how many there are now, you'd say "499" immediately. The computer would have to count them all again. That's the difference.
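If it helps, here's that contrast as a toy Python sketch (purely illustrative, not a claim about how brains actually store counts): the "computer" re-counts the pile from scratch, while the "human" keeps the remembered total and just adjusts it.

pennies = ["penny"] * 500

def recount(pile):                 # the computer's approach: start over every time
    total = 0
    for _ in pile:
        total += 1
    return total

remembered_total = recount(pennies)    # you counted (or were told) once: 500
pennies.pop()                          # I take one penny away

print(recount(pennies))                # the computer counts all 499 again
print(remembered_total - 1)            # you just update what you remember: 499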
Let's go back to that "Howdy there!" note. When you read it, it never occurred to you that there was a 'W' in the word. There was, and if you think about it you know it was there, because the word 'howdy' always has a W in it. But you didn't see the W; you saw the word. If you think about it, the W is made of four diagonal lines. You didn't see those either, even when you looked right at the letter. You only saw the letter. That's pattern matching: you see the pattern, then work down to the specifics. Computers start with the pixels, then figure out where the diagonal lines are, and then try to work out whether those lines are a W or \/\/ or VV.
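For what it's worth, here's how different those three scribbles look from the computer's side (a quick Python illustration): to your eye they can all read as the same letter, but to the machine they're unrelated sequences of numbers.

# The same "letter" to your eye, three unrelated number sequences to the machine.
for s in ["W", "\\/\\/", "VV"]:
    print(s, [ord(ch) for ch in s])
# W     [87]
# \/\/  [92, 47, 92, 47]
# VV    [86, 86]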
Now I'm starting to ramble, so I'll end it here. I hope you found this helpful in some way.
2006-06-20 06:12:50 · answer #1 · answered by Mantis 6
"A computer is a device with a CPU, some RAM, an address bus, data bus, and so forth. We could construct an elaborate definition or a simple one. But any such definition is easily shattered. I could readily build an absurd machine to meet any such specification. I could bollix up the ALU with inverted arithmetic relations, or scramble the address lines so that it could not retrieve data reliably."
2006-06-20 13:09:46 · answer #3 · answered by poetic_lala 5