
I keep hearing the term megabits on a TV commercial.

2006-08-22 20:05:30 · 4 answers · asked by quizkid 3 in Computers & Internet Other - Computers

4 answers

Yes, the first poster got it right. There are 8 bits in every byte.

The reason you hear both terms (bit and byte) usually comes down to context. For example, when it comes to networking or transfer speeds, you typically hear "bits." Broadband companies might advertise 2.0 Mbps (megabits per second) as their average speed. Divide by 8 and that works out to 250 KB/s (kilobytes per second).
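
To make that arithmetic concrete, here is a minimal Python sketch (the function name is mine, just for illustration; it uses the decimal convention, 1 MB = 1000 KB, which the 250 KB/s figure above assumes):

# Advertised megabits per second -> kilobytes per second.
def mbps_to_kb_per_s(mbps):
    megabytes_per_second = mbps / 8     # 8 bits in every byte
    return megabytes_per_second * 1000  # decimal convention: 1 MB = 1000 KB

print(mbps_to_kb_per_s(2.0))  # 250.0, matching the figure above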

"Bytes" are typically used to describe storage compacity.

____________________
Update:

Zerggle has part of it right below, but not the last part. There are:

1024 bytes in a kilobyte
1024 kilobytes in a megabyte
1024 megabytes in a gigabyte
etc...

and there are:

1024 bits in a kilobit
1024 kilobits in a megabit
etc...


Everything is a power of 2 (since we're talking about binary, you can't land evenly on 1000). You get the idea...
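
Here is a quick Python sketch of that 1024-based ladder (the constant names are mine, just for illustration):

# Each step up the ladder is a factor of 2**10 = 1024.
KILOBYTE = 1024             # bytes in a kilobyte
MEGABYTE = 1024 * KILOBYTE  # bytes in a megabyte
GIGABYTE = 1024 * MEGABYTE  # bytes in a gigabyte

print(MEGABYTE)              # 1048576 bytes in a megabyte
print(GIGABYTE // MEGABYTE)  # 1024 megabytes in a gigabyte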

2006-08-22 20:18:02 · answer #1 · answered by SirCharles 6 · 1 0

One byte is 8 bits.

One bit is a single piece of digital information; it cannot be subdivided any further. A bit is simply on or off. This is where the term binary comes from: "bi" means two, and there are only two possible values for a bit.

To represent something more complex, like a single character (say, the letter e) or a number, computer programmers decided to use 8 bits in varying patterns to represent it.
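
For instance, this Python one-liner prints the 8-bit pattern for the letter e (assuming ASCII, which the answer doesn't name but is the classic example of such an encoding):

# Print the 8-bit pattern for the letter 'e' in ASCII.
print(format(ord('e'), '08b'))  # 01100101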

In my experience, bits are generally used for talking about data communications - as in - a T1 line can send and receive data at 1.544 Mb (Megabits) per second.

Bytes are generally used for talking about data storage - as in - a 512 MB (Megabyte) flash disk.

A Kilobit is 1000 bits, and a Kilobyte is 1000 bytes.
A Megabit is 1000000 bits, and a Megabyte is 1000000 bytes.
A Gigabit is 1000000000 bits and a Gigabyte is 1000000000 bytes.
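
For what it's worth, both conventions show up in practice (decimal for transfer speeds, binary for memory and storage), and a short Python sketch shows how far apart they land:

# This answer's decimal convention vs. the 1024-based one from answer #1.
decimal_megabit = 1000 ** 2  # 1,000,000 bits
binary_megabit = 1024 ** 2   # 1,048,576 bits

print(binary_megabit - decimal_megabit)  # 48576 bits apart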

The following Wikipedia article discusses a bit at length.
http://en.wikipedia.org/wiki/Bit

2006-08-22 20:18:55 · answer #2 · answered by zerggle 2 · 0 0

1 byte = 8 bits.
1 megabyte = 8 megabits.

1 bit = 1 binary digit: either a 1 or a 0.

2006-08-22 20:08:08 · answer #3 · answered by Anonymous · 0 0

A megabit is one million bits.

2006-08-22 20:12:38 · answer #4 · answered by Zeta 5 · 0 0
