At the heart of the computer are base 2 numbers -- on or off, 0 or 1 -- the binary. String those 0s and 1s together and you can group them into octal (base 8) or hexadecimal (base 16).
Two hex digits describe one byte, and each byte value maps to a letter of the Latin alphabet through an ordered chart agreed upon by a group of computer people. The chart is called the ASCII chart.
We use computer languages like Java, written as ASCII text, as a human interface to those base 2 numbers. After we code in Java, we compile back down into binary, since binary is the only thing the machine actually executes, inside the computer and over the internet.
The ASCII chart ran out of room for all the world's characters. Today we have Unicode, most often encoded as UTF-8, a much bigger number chart that covers most of the world's alphabets.
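You can see the whole chain -- character, ASCII code, binary, hex -- in a few lines of Java (a minimal sketch; the class name is just for illustration):

public class CharCodes {
    public static void main(String[] args) {
        char c = 'A';
        int code = c;  // 'A' sits at 65 on the ASCII chart
        System.out.println(c + " = " + code
                + " = binary " + Integer.toBinaryString(code)   // 1000001
                + " = hex " + Integer.toHexString(code));       // 41, two hex digits per byte
    }
}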
2006-10-06 07:49:02 · answer #1 · answered by Anonymous
Rehash + more.
Binary = 1,0 (switch on,off)
Bit = 1 binary switch
Byte = 8 binary switches
Have "endianness" which means that two systems may read your binary data opposite of each other.
Intel processors are typically 'little-endian' and sparc and others 'big-endian'.
This means that when assigning a value to a binary set of bits, big-endian would be added as:
128 64 32 16 8 4 2 1
... and little-endian as
1 2 4 8 16 32 64 128
So the binary value of 11000000 could be either 192 or 3.
This is important in a minute.
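Here is that arithmetic in Java (a minimal sketch; Integer.parseInt with radix 2 just adds up the bit weights for us, and the class name is illustrative):

public class BitWeights {
    public static void main(String[] args) {
        int usual = Integer.parseInt("11000000", 2);                          // 128 + 64 = 192
        String flipped = new StringBuilder("11000000").reverse().toString();  // "00000011"
        int reversed = Integer.parseInt(flipped, 2);                          // 2 + 1 = 3
        System.out.println(usual + " vs " + reversed);                        // prints: 192 vs 3
    }
}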
ASCII and other encodings assign characters to numbers, and since numbers can be written in binary, text can be stored and transmitted on binary media.
ASCII = American Standard Code for Information Interchange
- Uses 7 bits to assign characters (not all printable)
- 1111111 = 1+2+4+8+16+32+64 = 127, giving 128 possible values (0 through 127)
EBCDIC = Extended Binary Coded Decimal Interchange Code
- Used mostly on IBM systems (mainframes, AS/400)
- Uses 8 bits to assign characters (not all printable)
- 11111111 = 1+2+4+8+16+32+64+128 = 255, giving 256 possible values (0 through 255)
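Those ranges are easy to check in Java (a minimal sketch; the class name is illustrative):

public class AsciiRange {
    public static void main(String[] args) {
        // A few printable characters and their ASCII codes, all below 128
        for (char c : new char[] {'A', 'a', '0', ' '}) {
            System.out.println("'" + c + "' = " + (int) c);       // 65, 97, 48, 32
        }
        System.out.println("7 bits cover 0.." + ((1 << 7) - 1));  // 0..127, 128 values (ASCII)
        System.out.println("8 bits cover 0.." + ((1 << 8) - 1));  // 0..255, 256 values (EBCDIC)
    }
}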
The reason these matter is that if you need to send data from a system that uses one character set to a system that uses another, a character set conversion has to happen. This automagically takes place in most file transfer programs, but it's not always what you want to happen.
For instance, with FTP you can use the keyword 'ascii' to do the conversion or 'binary' to keep the bytes exactly as they are.
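What that conversion amounts to can be sketched with Java's Charset API (this assumes your JDK build includes the IBM037 EBCDIC charset, which standard builds normally do; the class name is just illustrative):

import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

public class CharsetConvert {
    public static void main(String[] args) {
        String text = "HELLO";
        byte[] ascii  = text.getBytes(StandardCharsets.US_ASCII);   // 48 45 4C 4C 4F in hex
        byte[] ebcdic = text.getBytes(Charset.forName("IBM037"));   // C8 C5 D3 D3 D6 in hex
        // Same five letters, completely different byte values -- hence the conversion step.
        for (byte b : ascii)  System.out.printf("%02X ", b);
        System.out.println();
        for (byte b : ebcdic) System.out.printf("%02X ", b);
        System.out.println();
    }
}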
Back to the endian issue. If one computer is big-endian and the other little-endian and they send data to each other without agreeing on a standard, not only could your character encoding be wrong, but the bytes of every multi-byte value would be read in the wrong order!
That's why multi-byte values on an IP network are transmitted in 'Network Byte Order', which is big-endian. So the software needs to know what endianness its own host has and swap the bytes accordingly.
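In Java you can see and control byte order with ByteBuffer and ByteOrder (a minimal sketch; the class name is just illustrative):

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class Endian {
    public static void main(String[] args) {
        int value = 1;  // one int, four bytes
        byte[] big    = ByteBuffer.allocate(4).order(ByteOrder.BIG_ENDIAN).putInt(value).array();
        byte[] little = ByteBuffer.allocate(4).order(ByteOrder.LITTLE_ENDIAN).putInt(value).array();
        for (byte b : big)    System.out.printf("%02X ", b);   // 00 00 00 01
        System.out.println(" <- big-endian, i.e. network byte order");
        for (byte b : little) System.out.printf("%02X ", b);   // 01 00 00 00
        System.out.println(" <- little-endian, the typical Intel layout");
    }
}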
2006-10-10 22:14:15 · answer #2 · answered by Anonymous
Binary is a string of zeros and ones. Each byte holds 8 of them (i.e. 5 = 00000101). It is used to send info between the CPU and the keyboard, for example; each key has its own code.
ASCII (American Standard Code for Information Interchange) assigns a number to every character so text can be stored as binary. (Numbers 0-9 plus letters A-F is actually hexadecimal notation, a different thing: 932 in decimal is 3A4 in hex, which is the kind of conversion you can do in the Calculator application.)
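Both notations in one small Java sketch (class name just for illustration):

public class Notations {
    public static void main(String[] args) {
        System.out.println(Integer.toBinaryString(5));   // 101, i.e. 00000101 padded to a byte
        System.out.println(Integer.toHexString(932));    // 3a4, so 932 decimal = 3A4 hex
        System.out.println((int) 'A');                   // 65, the ASCII code of 'A'
    }
}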
2006-10-06 07:42:06 · answer #3 · answered by John C. 4
Binary code is just ones and zeros: 1, 0
2006-10-06 07:34:53 · answer #4 · answered by Anonymous
0 and 1
ASCII means American Standard Code for Information Interchange
2006-10-06 07:36:49 · answer #5 · answered by JayHawk 5
Binary code -- 1010101010 -- only zeros and ones.
ASCII -- American Standard Code for Information Interchange.
2006-10-06 07:41:06 · answer #6 · answered by Chirag G 1
I will give it to you plain and simple, which makes it easier to understand, OK?
Binary consists of ones and zeros.
ASCII codes represent text in computers.
2006-10-06 07:59:32 · answer #7 · answered by Matthew B 2