The answer isn't exactly 1 billion, because of differences in how a gigabyte is defined. This is taken directly from Wikipedia:
There are three slightly different definitions of the size of a gigabyte in use:
* 1,000,000,000 bytes, or 10^9 bytes, is the decimal definition used in telecommunications (such as network speeds) and by most computer storage manufacturers (such as hard disks and flash drives). This usage is compatible with SI. HP claims media makers Maxtor, Iomega and Western Digital adhere to using this amount.[1]
* 1,073,741,824 bytes, equal to 1024^3, or 2^30 bytes. This is the definition used for computer memory sizes, and most often used in computer engineering, computer science, and most aspects of computer operating systems. The IEC recommends that this unit instead be called a gibibyte (abbreviated GiB), as it conflicts with SI units used for bus speeds and the like. HP states Microsoft normally adheres to this definition.[1]
* 1,024,000,000 bytes is the definition used by hard-disk maker Seagate.[1]
However, in general, when not talking about computer memory, the prefix giga- means 1 billion, so I would guess a gigabuck is 1 billion bucks.
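The three definitions above are easy to check with a bit of arithmetic; here is a quick Python sketch (the variable names are mine, the byte counts come straight from the list):

```python
# Three competing "gigabyte" sizes, as described above.
decimal_gb = 10**9          # SI / telecom / most drive makers
binary_gb = 2**30           # memory sizes; the IEC calls this a gibibyte (GiB)
mixed_gb = 1024 * 10**6     # Seagate's mixed definition

print(decimal_gb)  # 1000000000
print(binary_gb)   # 1073741824
print(mixed_gb)    # 1024000000
```

Note the binary and decimal definitions differ by about 7.4%, which is why a "500 GB" drive shows up as roughly 465 GB in an operating system that counts in powers of two.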
2006-08-22 10:59:09 · answer #1 · answered by abcdefghijk
1,073,741,824 bytes in a gigabyte.
I've never heard of a buck or a gigabuck. Sure you're not making that up? ;)
2006-08-22 10:49:01 · answer #2 · answered by ♫☼♥ ≈ Debbi ≈ ♥☼♫