The answer to your question is twofold.
When you are talking about something like computer RAM, the binary counting method is used:
1 gigabyte of RAM = 1,024 megabytes (2 GB = 2,048 MB, 4 GB = 4,096 MB)
However, when you are talking about something like a USB key, hard drive, etc., those "storage" devices use the convention that 1 GB = 1,000,000,000 bytes, which comes out to only about 954 binary megabytes.
This is why a "300 gigabyte" hard drive formats out at around 279 binary gigabytes.
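You can check that number yourself. Here is a quick sketch (the variable names are mine, just for illustration) converting the advertised decimal size into the binary gigabytes your OS reports:

```python
# A "300 GB" drive as advertised: decimal gigabytes (1 GB = 10**9 bytes)
advertised_bytes = 300 * 10**9

# What the OS typically displays: binary gigabytes (1 GiB = 2**30 bytes)
reported_gb = advertised_bytes / 2**30

print(f"{reported_gb:.1f}")  # prints 279.4
```

The bytes on the drive never go anywhere; only the unit used to count them changes.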
Who is correct? It's a matter of semantics.
Memory manufacturers treat a "giga" as 1,024 "megas," because memory sizes naturally follow the powers of 2
(2, 4, 8, 16, 32, 64, 128, 256, 512, 1024, 2048, 4096, 8192, 16384, 32768, 65536, etc....)
However, "giga" is the standard prefix meaning "billion," and a byte is the basic tiny unit, so
giga + byte DOES also, technically, equal 1,000,000,000 bytes.
Computer terminology allows for:
1 byte = 1 byte
1,024 bytes = 1 kilobyte
1,024 x 1,024 bytes (1,048,576) = 1 MEGAbyte
1,024 x 1,024 x 1,024 bytes (1,073,741,824) = 1 GIGAbyte
Compare 1,073,741,824 with 1,000,000,000 and you can see where the apparent loss comes from when moving between "RAM-style" and "hard-drive-style" size calculations.
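Putting numbers on that comparison (again just a sketch, with illustrative variable names), the gap between the two definitions of a gigabyte is roughly 7%:

```python
# The two competing definitions of one "gigabyte"
decimal_gb = 1_000_000_000   # storage makers: giga = billion
binary_gb = 1024 ** 3        # memory makers: 1,073,741,824 bytes

print(binary_gb)                  # prints 1073741824
print(binary_gb - decimal_gb)     # prints 73741824 (the "missing" bytes)
print(f"{decimal_gb / binary_gb:.3f}")  # prints 0.931, i.e. ~7% smaller
```

That ~7% ratio is exactly why every decimal gigabyte of advertised storage "shrinks" when it is displayed in binary gigabytes.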
answer #4 · answered by A N 3 · 2006-12-18 05:27:42