
5 answers

Bit rate is the number of bits actually transmitted per second through a transmission medium.
Bandwidth is the maximum number of bits that can be transferred per second through that transmission medium.
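
As a quick illustration of these definitions (a minimal Python sketch; the numbers and variable names are made up for the example):

link_bandwidth_bps = 100_000_000   # the most this medium can carry: 100 Mbit/s
actual_bit_rate_bps = 42_000_000   # what was actually sent this second: 42 Mbit/s
# The bit rate achieved on a link can never exceed its bandwidth.
assert actual_bit_rate_bps <= link_bandwidth_bps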

2006-10-01 03:25:20 · answer #1 · answered by nick 2 · 0 0

These are terms people use interchangeably, but they are actually a little different.

Bandwidth in a wired medium like Ethernet, a serial port, or a dial-up line is the clock rate: one clock, one signal. (Strictly speaking, this is not necessarily true, but for this discussion it is close enough.)

The bit rate is how many 0s and 1s can be transmitted in a period of time. This may differ from the bandwidth depending on which physical-layer protocol is used. If you pack two bits into a four-level signal (00, 01, 10, 11), the bit rate can be twice the bandwidth. If your data is encoded with an error-detection or error-correction code (redundant information is added so you can detect or correct transmission errors), your bit rate may be less than the bandwidth.
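
To make this concrete, here is a minimal Python sketch assuming the simple model described above, where bit rate = symbol rate x bits per symbol x code rate; the function name and parameters are hypothetical:

import math

def bit_rate(symbol_rate_hz, signal_levels, code_rate=1.0):
    # Bits carried per symbol: log2 of the number of distinct signal levels.
    bits_per_symbol = math.log2(signal_levels)
    # Effective bit rate after subtracting coding overhead (code_rate <= 1).
    return symbol_rate_hz * bits_per_symbol * code_rate

# A 1 MHz clock with a four-level signal (00, 01, 10, 11) carries 2 Mbit/s,
# twice the bandwidth.
print(bit_rate(1_000_000, 4))                 # 2000000.0

# The same link with a rate-1/2 error-correction code delivers only 1 Mbit/s
# of useful data, since half the transmitted bits are redundancy.
print(bit_rate(1_000_000, 4, code_rate=0.5))  # 1000000.0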

2006-10-01 02:47:52 · answer #2 · answered by muon 3 · 0 0

They're very similar, but bit rate is usually used to describe the maximum speed at which data can be transmitted or received, while bandwidth is primarily used to describe what you can realistically expect to transmit or receive on average.

2006-10-01 02:36:19 · answer #3 · answered by nospamcwt 5 · 0 0

Bit rate - speed of transmission of data
Bandwidth - capacity or volume of data that can be transmitted

2006-10-02 02:58:43 · answer #4 · answered by Maxx S 1 · 0 0

http://en.wikipedia.org/wiki/Bitrate#Bandwidth_conversion

2006-10-01 02:33:10 · answer #5 · answered by Splishy 7 · 0 0
