First, it has to do with the way hard drive size is measured. Hard drive manufacturers measure in decimal - 1 gigabyte being 1,000,000,000 bytes. However, a computer measures in binary - 1 gigabyte being 1,073,741,824 bytes.
Hard drive manufacturers use decimal because it makes the hard drive size appear bigger - 300,000,000,000 bytes being 300 gigabytes on the box label, but only 279.4 gigabytes using the computer's measurements.
Then you lose a bit of additional space due to formatting, but it's negligible. Mostly, the difference comes from the measurement techniques.
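To put numbers on it, here's a quick sketch of the arithmetic (Python, purely for illustration) showing the same 300 GB drive measured both ways:

```python
# A "300 GB" drive as the manufacturer labels it (decimal)
# versus what the computer reports (binary).
labeled_bytes = 300 * 1000**3          # 300,000,000,000 bytes on the box

decimal_gb = labeled_bytes / 1000**3   # manufacturer: 1 GB = 1,000,000,000 bytes
binary_gb = labeled_bytes / 1024**3    # computer:     1 GB = 1,073,741,824 bytes

print(f"Manufacturer label: {decimal_gb:.1f} GB")   # 300.0 GB
print(f"Computer reports:   {binary_gb:.1f} GB")    # 279.4 GB
```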
2007-04-02 14:17:17 · answer #1 · answered by cs_gmlynarczyk 5 · 0⤊ 0⤋
How much exactly does it show? If it's in the 275-290 gig range, that's because 300 gigs is the pre-formatted capacity. Once you format it, it decreases to about 290 gigs, and once you install an OS on it, it's even less.
If it shows MUCH less, like say a hundred gigs less, then it's because your system is not set up to recognize volumes that big.
2007-04-02 21:10:00 · answer #2 · answered by Jack 3 · 1⤊ 1⤋
Very simple.
Hard disk manufacturers use 1000MB = 1GB
That is a FACT.
Windows reports hard disk size based on 1024MB = 1GB.
That is also a FACT.
Now ... besides that, you also give up some available disk space to the overhead of your partitioning and formatting options.
However ... all else being equal ... the mismatch between the Windows standard and the manufacturers' standard is clearly the prime explanation.
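For illustration only, here is a rough sketch (Python, not tied to any particular drive) of how far the two standards drift apart at each unit step:

```python
# Manufacturers step up by 1000 per unit; Windows steps up by 1024.
# The reported shortfall therefore grows with each larger unit.
units = ["KB", "MB", "GB", "TB"]
for power, unit in enumerate(units, start=1):
    decimal = 1000**power              # manufacturer's definition of the unit
    binary = 1024**power               # Windows' definition of the unit
    shortfall = 100 * (1 - decimal / binary)
    print(f"1 {unit}: reported size is about {shortfall:.1f}% smaller")
# Output: KB ~2.3%, MB ~4.6%, GB ~6.9%, TB ~9.1%
```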
regards,
Philip T
2007-04-02 21:17:58 · answer #3 · answered by Philip T 7 · 0⤊ 0⤋