
I know that they have programmed the computer to understand what the words and signs mean, but how does the computer know what the letters in the words mean? Or if you give the computer the definition of "alphabet" and say something like "the 26 letters of the English alphabet," how does it know the definition of the words in the definition of "alphabet"? I could also be thinking too hard about this and be making it more complicated than it really is. I've been wondering about this for some time now and would REALLY appreciate it if someone could answer my question.
Thank You.

2007-02-12 12:53:08 · 9 answers · asked by Lawton 3 in Computers & Internet > Programming & Design

9 answers

Actually, that's a great question. Computers are really quite dumb.

Think of English or Spanish: they are human languages, and each word has a meaning, or several meanings, in its language. Words are strung together into sentences to communicate more complicated things than any one word can.

Computer languages don't work quite like that, but there is a pattern you can follow: think of them like the layers of a cake, or the generations of a family.

In a computer language called BASIC, the computer doesn't "know" much of anything except the BASIC language itself, which in turn is built upon machine language (see below). BASIC was designed to be VERY simple to use, and mostly it is.

The original BASIC language was built up of only 20 or 30 WORDS, or COMMANDS, that it "KNEW".

So here is a simple program in BASIC:

print "Please type a letter";
input A$
if A$ = "A" then print "You typed an A".
if A$ = "B" then print "You typed a B".
...

When I run this program, I would see something like this:

Please type a letter?_

And as a user I might do what the computer asked and type a letter.

The computer doesn't KNOW anything about the alphabet; it just appears that way to the person running the program above - the user. The computer just knows enough to follow the simple instructions of the BASIC language.

The INPUT, IF/THEN, and PRINT commands ARE all known by the computer, and they translate into even longer sequences of assembly-language instructions and, ultimately, machine language made of 1s and 0s.

The PRINT command itself, for instance, is actually a complicated set of instructions telling the computer to evaluate whatever is on the RIGHT side of the PRINT command, take that result, and place it on the computer's screen.
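
To give a feel for how much hides under one command, here is a toy sketch in C (my own illustration, not the real BASIC internals) of a print routine that walks a piece of text and sends it to the screen one character at a time:

#include <stdio.h>

/* Toy sketch of a PRINT-like routine: walk the text and push
   each character to the screen one at a time. The real command
   does many more small steps than this. */
void basic_print(const char *text)
{
    while (*text != '\0') {   /* stop at the end-of-text marker */
        putchar(*text);       /* one character out to the screen */
        text++;               /* step to the next character */
    }
    putchar('\n');            /* finish the line */
}

int main(void)
{
    basic_print("Please type a letter");
    return 0;
}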

As I said before, computer languages are made in "layers" like a cake, with each language layer built on top of the previous one. You could also consider it like generations of a family, with 1st, 2nd, 3rd, 4th, and 5th generations of languages - this is how computer folks actually talk about them.

The first, and still the "simplest," computer language is machine language. This is the language of ones and zeros, which originally corresponded to the physical switches a machine operator would have to flip to get the computer to perform calculations.

Basically it's unreadable by almost everyone except the systems engineers and electrical engineers who design the hardware itself or do very "low-level" programming. Don't let "low" fool you: it's extremely hard to learn and use, for most people at least.

The 2nd generation is really a language of one - assembly - with differing variations for different types of computers (Apples, PCs, etc.). Assembly is NOT a simple language from a human perspective; it's actually moderately to very hard for humans to program in, and it is not widely used by any but the best human programmers.

But it's relatively simple for a computer to understand. You can think of assembly as essentially the "2nd generation language" of computers.

In assembly, doing even a simple thing like adding two numbers is not a pretty picture, but back in the day it sure beat going to an adding machine and doing it manually.
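
As a rough illustration, here is the same idea in C (the instruction names in the comment are generic stand-ins, not any particular chip's mnemonics): one line of higher-level code expands into several machine steps.

#include <stdio.h>

int main(void)
{
    int a = 2, b = 3;
    int sum = a + b;   /* one line of C, but the compiler emits
                          several machine instructions, roughly:
                            LOAD  register, a
                            ADD   register, b
                            STORE sum, register
                          (generic names, for illustration only) */
    printf("%d\n", sum);   /* prints 5 */
    return 0;
}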

3rd generation languages are the OLDEST languages that most programmers have used or are still using; these include COBOL, FORTRAN, C, and BASIC. These languages were designed for various needs: COBOL for business, FORTRAN for science, and BASIC for students.

The language C was a bit strange in that it was developed independently, designed by software engineers for software engineers. It has REALLY stood the test of time and has spawned a whole host of "sibling" 3rd generation languages itself: C++, C# (the language at the heart of Microsoft's .NET platform), Java, and AWK.

These languages have even gone on to spawn some of their own: Ruby, PHP, and Perl are languages built upon these C/C++ variants, and they are used widely on the Internet.

The beauty of these languages is that most are relatively easy to learn and master, and you can perform a wide variety of tasks with them. Newer versions have been created over time, and it's not just C that had a clutch of children of its own: the old language ALGOL spawned a half-dozen languages and could in fact be considered the inspiration for C itself.

Recently Microsoft has sought to merge its variant of C++ and its brand of BASIC into C#, which is much more like C++ than BASIC but has characteristics of both.

In the last 20 years or so, engineers have developed "4th generation languages." These languages are usually VERY easy to program in, so programmers can be MUCH more productive with them than with 3rd generation languages.

However, since these were hard to develop, are almost universally proprietary, and are typically very expensive, they are endangered and could become obsolete or extinct.

5th generation languages were developed around the same time as 4th generation languages. The original idea was that 5th generation languages would eventually be like human languages - "English," or at least English-like - so that everyone could use them. While these languages have been extremely powerful from a research perspective, it's not incorrect to say they have been a dismal failure at making computers easier to use for everyone. Perhaps someday they will, but for the moment - no such luck.

Each generation of languages is increasingly "abstract" - written in such a way as to be increasingly EASY for people to use. At least, that was the plan; it's not quite how things turned out.

Instead, humans have settled into the idea of a professional "class" of workers who specialize in talking to computers. Like other kinds of specialists, programmers typically know one or two languages; good ones may know several and be able to program competently in all of them, but they usually have mastery of only a couple at a time.

In this way, computer languages are a lot like human languages: it takes a little while to speak a little of any human language, and a pretty concerted effort to learn that language well. The languages of computers have been developed to be about as easy to learn as the EASIEST human language.

Hope this helps.

2007-02-12 14:08:51 · answer #1 · answered by Mark T 7 · 4 0

By words and signs, do you mean English words and signs? Are you presuming that computers have an ability to understand human language? Technically speaking, computers do not understand English any more than your DVD player understands English when it plays a movie with an English soundtrack. The text you see on the computer screen, and the audio that is part of a computer program, were put together by a computer programmer and a team of specialized technical people. A computer that could truly understand any English sentence - listen to you speak, or read anything you type or write, and understand it - would be an amazing thing, but technically there is no such thing.

There is a tradition in the computer world of programmers trying to create programs that appear as if they understand English words and sentences, but these pieces of software are ultimately just toys.

The languages that computers can "understand" - or, more properly speaking, "parse" - are extremely simple (in terms of the number of rules or principles involved) and are deliberately constructed to be unambiguous, that is, to be understandable in only one way. Human languages have ambiguities, and human understanding of speech is more subtle and sophisticated than a computer's completely static, non-intuitive process for evaluating input.
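
One small illustration, using C just as an example: a computer language's grammar allows exactly one reading of every expression, where an English sentence can often be read two ways.

#include <stdio.h>

int main(void)
{
    /* The C grammar fixes the reading: * binds tighter than +,
       so this is 1 + (2 * 3), never (1 + 2) * 3. There is no
       second interpretation for the machine to weigh. */
    printf("%d\n", 1 + 2 * 3);   /* prints 7 */
    return 0;
}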

If you ask a computer "what do you think we should do next?", it cannot make a creative decision. It can be programmed to respond by any algorithm that a human programmer can conceive, but it does not, strictly speaking, have the intelligence which human beings have.

There is a long-standing idea that some day a breakthrough in "Artificial Intelligence" will create a computer that is indistinguishable in intellectual ability from a human, or perhaps that computers will at some point possess "a superior intellect." Computers are sublime machines when it comes to manipulating data - displaying, storing, processing, analyzing, transmitting, and transforming information - but they do not think, and I do not think they ever will. I could, of course, be wrong.

2007-02-12 13:32:55 · answer #2 · answered by InternetsDood 1 · 0 2

To put it simply, as you may already know, a computer can only understand binary. So, regarding the alphabet, let's say you print out the letter 'A' in a C program. The computer, of course, cannot read letters, but it is designed to convert every character to its equivalent ASCII code. So the letter 'A', after compilation, will eventually be read as 01000001 (0x41 in hex; I just converted it to the binary equivalent). The computer can read this, and it will associate this particular binary combination with the letter 'A'. Likewise, the same is done with each character.
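
You can watch this happen in a couple of lines of C (a minimal sketch; the output formatting is just for illustration). The same byte prints as a letter or as a number depending only on how you ask for it:

#include <stdio.h>

int main(void)
{
    char c = 'A';
    /* The same bits, shown three ways: as a character, as a
       decimal number, and in hexadecimal. */
    printf("%c %d 0x%X\n", c, c, c);   /* prints: A 65 0x41 */
    return 0;
}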

2007-02-12 19:24:38 · answer #3 · answered by Entfusion 3 · 0 0

The computer is an unbelievably simple system for storing numbers and following very simple instructions to copy or tabulate them.

Back in the old days (Apple II), the processor could only move data around and add or subtract. It could divide, but only by two. To divide by anything else, you had to teach it how, in very simple instructions. But then, any time you wanted to divide in the future, you could just reuse those instructions!
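
Here is a rough sketch of that idea, written in C for readability rather than the old machine instructions (an illustration only, not the Apple II's actual code): teach the computer to divide by repeated subtraction once, then reuse the routine forever.

#include <stdio.h>

/* Divide by repeated subtraction: count how many times the
   divisor fits into the dividend. Illustration only; assumes
   non-negative numbers and a divisor greater than zero. */
int divide(int dividend, int divisor)
{
    int quotient = 0;
    while (dividend >= divisor) {
        dividend -= divisor;   /* take away one more divisor's worth */
        quotient++;            /* ...and count it */
    }
    return quotient;
}

int main(void)
{
    printf("%d\n", divide(17, 5));   /* prints 3 */
    return 0;
}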

It took 2048 bytes of instructions to give it a "Monitor", a very simple ability to interact with the user through keyboard and screen. Once that was done, 5000 more bytes of instructions gave it a respectable "BASIC" programming language. You could write many cool games in BASIC.

That's how all programming is done: layer upon layer upon layer. There you had three layers: Monitor -> BASIC -> your game.

Computers are very similar now, at their core. Just now they have much more memory and much faster processors, so the computer has time and space for more layers and more complexity. Now we play World of Warcraft instead of Pong.

But it's the same thing, pretty much. Just a lot more of it.

2007-02-12 16:12:50 · answer #4 · answered by Wolf Harper 6 · 0 0

It's kind of complicated to explain, but words are stored as strings. Say you spell a word: the program takes each individual letter and stores it in a string. If one letter is misspelled, it finds the closest match, so the more letters that are out of place or missing, the less likely it is to find one.
So if I spell out "Bace", it stores b, a, c, e, and it cycles through the 26 letters until replacing the "c" with "s" finds a match: "Base". But it also keeps the other possibilities - replacing the last two letters with "ss" instead gives "Bass".
Get the idea?
Neither do I.
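
For the curious, here is a very rough sketch in C of that closest-match idea (my own simplification - real spell checkers use smarter measures, such as edit distance): score each dictionary word by how many letters line up with the guess, and keep the best one.

#include <stdio.h>
#include <string.h>

/* Score a guess against a dictionary word: one point for each
   position where the letters match. A crude stand-in for real
   edit-distance measures; illustration only. */
static int score(const char *guess, const char *word)
{
    size_t len_g = strlen(guess);
    size_t len_w = strlen(word);
    size_t n = len_g < len_w ? len_g : len_w;
    int points = 0;
    for (size_t i = 0; i < n; i++)
        if (guess[i] == word[i])
            points++;
    return points;
}

int main(void)
{
    const char *dictionary[] = { "base", "bass", "bake", "race" };
    const char *guess = "bace";
    int best = 0, best_score = -1;

    for (int i = 0; i < 4; i++) {
        int s = score(guess, dictionary[i]);
        if (s > best_score) {   /* keep the highest-scoring word */
            best_score = s;
            best = i;
        }
    }
    printf("closest match to \"%s\": %s\n", guess, dictionary[best]);
    return 0;
}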

2007-02-12 13:04:50 · answer #5 · answered by jlp.media 3 · 0 2

Computers understand only machine language. Programmers use higher-level languages to tell the computer what to do, and these get translated into machine language for the computer to understand.

2007-02-12 13:00:58 · answer #6 · answered by bigboywasim 2 · 0 1

Everything you're asking about belongs to a specific academic field called Artificial Intelligence; you don't see it in action on an ordinary computer.

2007-02-12 13:00:10 · answer #7 · answered by Andy T 7 · 0 1

very carefully

2007-02-12 15:49:19 · answer #8 · answered by FrankyJay 1 · 0 0
