
6 answers

Gigahertz refers to the system clock speed, which is used to measure a processor's speed.
Gigahertz = 1 billion hertz
Gigabytes refers to the amount of memory or space you have on your hard drive.
Gigabytes = 1 billion bytes
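
To make the "one billion" concrete, here is a small Python sketch (the processor speed and drive size are made-up example values, not figures from any answer above) that converts a clock speed and a drive capacity into their base units:

```python
# Illustrative only: "giga" means 10**9 in both gigahertz and (decimal) gigabytes.
GIGA = 1_000_000_000

cpu_speed_ghz = 3.2    # example processor speed
drive_size_gb = 500    # example hard drive size

cycles_per_second = cpu_speed_ghz * GIGA   # clock ticks per second
bytes_of_storage = drive_size_gb * GIGA    # bytes the drive can hold

print(f"{cpu_speed_ghz} GHz = {cycles_per_second:,.0f} cycles per second")
print(f"{drive_size_gb} GB  = {bytes_of_storage:,.0f} bytes of storage")
```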

2006-09-25 14:11:30 · answer #1 · answered by Anonymous · 0 0

Gigahertz is used for the processor; 1 gigahertz is 1,000 megahertz (one billion hertz).
Computers now have processors running from about 2 gigahertz to 5 gigahertz.

A gigabyte is 1,024 megabytes. A hard drive commonly has around 80 gigabytes.

1 gigabyte is about half a movie or 350 songs.
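
As a rough sanity check on the songs-per-gigabyte figure, here is a quick Python estimate (the average song size is my own assumption; typical MP3s run roughly 3 to 5 MB):

```python
# Back-of-the-envelope check, assuming about 3.5 MB per song.
megabytes_per_gigabyte = 1024   # using the binary gigabyte, as in the answer above
song_size_mb = 3.5              # assumed average MP3 size

songs_per_gigabyte = megabytes_per_gigabyte / song_size_mb
print(f"About {songs_per_gigabyte:.0f} songs fit in one gigabyte")
# Prints "About 293 songs" - in the same ballpark as the answer's estimate.
```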

2006-09-25 14:11:38 · answer #2 · answered by carlos o 4 · 1 0

Hertz was a scientist who worked with electricity. AC current frequency used to be measured in 'cycles,' referring to how many times per second the electrical circuit alternates. Your home AC is usually 60 cycles per second. This was changed to honor Professor Hertz, and the word 'cycles' became 'hertz.'

So, your home electricity - in America - is 60 Hz instead of 60 cycles.

Your computer works using a cyclical pattern in the CPU, the computer's brain. The more cycles it completes each second, the sooner it finds the solution it is 'thinking' about. Faster is better, but hotter. My old computer is chugging away at 800,000,000 cycles per second, or 800 MHz (mega = 1 million), but modern computers today are zipping along at 2 GHz (giga = 1 billion), which is 2,000,000,000 cycles per second! So they really are faster than my old eMac, which I like anyway! So ha!

When you eat, you take a bite. When a computer 'eats,' it takes a 'byte.' Digital data is stored as zeros and ones, called 'bits,' and eight of these bits together make one 'byte.' The number of bits a computer can process or 'think about' at one time is its word size: computers have worked with 4, 8, 16, and 32 bits at a time, and now they are transitioning to 64 bits at a time. All of these bits are stored as bytes on the computer's hard drive or in RAM.
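
A tiny sketch of the bit/byte relationship described above, using the common word sizes the answer mentions:

```python
# Eight bits make one byte; processors handle data in fixed-size "words".
BITS_PER_BYTE = 8

for word_bits in (8, 16, 32, 64):
    word_bytes = word_bits // BITS_PER_BYTE
    print(f"A {word_bits}-bit processor works on {word_bytes} byte(s) at a time")
```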

So, a computer eats 'bytes,' and the more it has, the more it likes it. All the information you use, whether visual or auditory, is somehow saved and accessed as bytes in the computer. A gigabyte is 1 billion bytes. Modern hard drives are now reaching toward 1,000-gigabyte capacity. My old eMac has a 60-gigabyte hard disk and 1 gigabyte of RAM. My new hard drive can store 200 gigabytes of information. It seems like a lot, but then, I thought that 60 gigabytes was a lot when I got my eMac!

;-D Technology is still getting better all the time!

2006-09-25 18:26:40 · answer #3 · answered by China Jon 6 · 1 0

Gigahertz is the speed of the processor. The higher the number, the more things it can process in a second.

Gigabytes is the amount of memory (either RAM or hard drive space). The higher the number, the more memory the computer has.

2006-09-25 14:12:31 · answer #4 · answered by Flip 3 · 1 0

Giga is a prefix meaning one billion.
Hertz is a measure of cycles per second. A one-gigahertz CPU chip completes one billion clock cycles per second.
Bytes are a measure of file size. A one-gigabyte hard drive can hold one billion bytes of information.
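
Another way to picture one billion cycles per second: at 1 GHz, each clock cycle lasts one nanosecond. A short Python sketch (the example speeds are arbitrary):

```python
# The clock period is the reciprocal of the clock frequency.
GIGA = 1_000_000_000
NANOSECONDS_PER_SECOND = 1_000_000_000

for speed_ghz in (1, 2, 3.5):
    frequency_hz = speed_ghz * GIGA
    period_ns = NANOSECONDS_PER_SECOND / frequency_hz
    print(f"{speed_ghz} GHz -> one cycle every {period_ns:.2f} nanoseconds")
```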

2006-09-25 14:14:45 · answer #5 · answered by Anonymous · 0 0

gigahertz measures speed

gigabytes measures space.

2006-09-25 14:13:38 · answer #6 · answered by thunder2sys 7 · 0 0
