It refers to the number of bits the processor can work on in one go. The more bits, the more data the processor can handle in a single operation. Obviously, that's a boiled-down answer. For a more technical, accurate, and complete answer, try "How Computers Work" (a cheap, decent reference), or just Google it.
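If you want to see that width on your own machine, here is a minimal C sketch (my illustration, not part of the answer above): on a typical 64-bit build the pointer prints as 8 bytes, on a 32-bit build as 4.

/* Minimal sketch: the pointer size is a reasonable proxy for the
   native word width on most common platforms. */
#include <stdio.h>

int main(void) {
    printf("sizeof(void *) = %zu bytes (%zu bits)\n",
           sizeof(void *), sizeof(void *) * 8);
    printf("sizeof(long)   = %zu bytes\n", sizeof(long));
    printf("sizeof(int)    = %zu bytes\n", sizeof(int));
    return 0;
}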
2007-02-05 04:01:35
·
answer #1
·
answered by piperjoe68 3
·
1⤊
0⤋
A 16-bit computer works with data in groups of 16 bits, or 2 bytes. A 32-bit computer works with data in groups of 32 bits, or 4 bytes, and a 64-bit one does the same with 64 bits, or 8 bytes. The bit size refers to the native word size of the machine, i.e. the width of its registers and of the data it handles in one operation. On many architectures the instruction encoding matches that width too (for example, 32-bit instructions on a typical 32-bit RISC machine), though instruction length really depends on the instruction set. The size of the native integer and pointer types is also typically defined by the processor architecture.
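A hedged C illustration (mine, not the answerer's) of those 2 / 4 / 8-byte groups: the fixed-width types from <stdint.h> pin the sizes down exactly, while the "native" int and long sizes depend on the platform's data model (int is usually still 4 bytes even on 64-bit systems; long is 4 bytes under Windows' LLP64 model and 8 under Linux's LP64).

/* Fixed-width types make the 16/32/64-bit groupings explicit;
   plain int and long vary with the data model. */
#include <stdio.h>
#include <stdint.h>

int main(void) {
    printf("int16_t: %zu bytes\n", sizeof(int16_t)); /* always 2 */
    printf("int32_t: %zu bytes\n", sizeof(int32_t)); /* always 4 */
    printf("int64_t: %zu bytes\n", sizeof(int64_t)); /* always 8 */
    printf("int    : %zu bytes\n", sizeof(int));     /* data-model dependent */
    printf("long   : %zu bytes\n", sizeof(long));    /* 4 on LLP64, 8 on LP64 */
    return 0;
}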
2007-02-05 03:35:37
·
answer #2
·
answered by Pfo 7
·
0⤊
0⤋
The present standard is a 32-bit operating system. So, unless you specifically ordered a 64-bit system when you bought the computer, you want the 32-bit version of the program.
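If you are not sure which kind of build you are dealing with, one way to tell at compile time is to check the compiler's predefined macros. This is only a sketch under the assumption of GCC/Clang or MSVC macro names; other compilers use different ones.

/* Report the target's bitness using compiler-specific predefined macros
   (GCC/Clang: __x86_64__, __aarch64__, __i386__, __arm__; MSVC: _WIN64, _WIN32).
   Purely illustrative. */
#include <stdio.h>

int main(void) {
#if defined(__x86_64__) || defined(__aarch64__) || defined(_WIN64)
    printf("built as a 64-bit program\n");
#elif defined(__i386__) || defined(__arm__) || defined(_WIN32)
    printf("built as a 32-bit program\n");
#else
    printf("unknown target width\n");
#endif
    return 0;
}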
2016-10-17 05:28:15
·
answer #3
·
answered by chicklis 4
·
0⤊
0⤋
It tells you how big the numbers are: 2^16, 2^32 and 2^64 correspond to roughly 10^4.816, 10^9.63 and 10^19.26 in decimal, which are some pretty big numbers.
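Spelled out, the largest unsigned value at each width is 2^n - 1. A small C sketch (my addition, not the answerer's) prints them exactly:

/* Largest unsigned value representable at each width, i.e. 2^n - 1. */
#include <stdio.h>
#include <inttypes.h>

int main(void) {
    printf("2^16 - 1 = %" PRIu64 "\n", (uint64_t)UINT16_MAX); /* 65,535    ~ 10^4.8  */
    printf("2^32 - 1 = %" PRIu64 "\n", (uint64_t)UINT32_MAX); /* ~4.29e9   ~ 10^9.6  */
    printf("2^64 - 1 = %" PRIu64 "\n", (uint64_t)UINT64_MAX); /* ~1.84e19  ~ 10^19.3 */
    return 0;
}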
2007-02-05 03:39:15
·
answer #4
·
answered by jimmymae2000 7
·
0⤊
0⤋
It refers to the memory: when you talk about 32 and so on, it is megabytes; when you have single figures, it relates to gigabytes (a gigabyte being about a thousand million bytes).
2007-02-05 03:33:37
·
answer #5
·
answered by zerocool 3
·
0⤊
2⤋
http://en.wikipedia.org/wiki/64-bit
http://en.wikipedia.org/wiki/32-bit
http://en.wikipedia.org/wiki/16-bit
You must have tried real hard when you were researching by yourself.
2007-02-05 03:33:14
·
answer #6
·
answered by Andrew G 2
·
3⤊
0⤋