
If the bit rate is 5 Mbit/s, what is the period of one bit?

2007-02-16 02:48:06 · 3 answers · asked by Anonymous in Science & Mathematics Engineering

3 answers

The period of one bit is the reciprocal of the bit rate:
period = 1 / bit rate
NOW,
1 megabit = 10^6 = 1,000,000 bits
5 megabit = 5*10^6 = 5,000,000 bits
Period for 1 bit
= 1/(5*10^6)
= 0.2*10^(-6) second/bit
= 2*10^(-7) second/bit (0.2 microseconds)
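The arithmetic above can be checked with a short sketch (variable names are mine, not from the question):

```python
# Bit period is the reciprocal of the bit rate: T = 1 / R.
bit_rate = 5_000_000        # 5 Mbit/s = 5 * 10^6 bits per second
bit_period = 1 / bit_rate   # seconds per bit

print(bit_period)           # 2e-07 seconds, i.e. 0.2 microseconds
```

This confirms the result: 2 x 10^-7 seconds per bit.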

In digital telecommunication, the bit rate is the number of bits that pass a given point in a telecommunication network in a given amount of time, usually one second. A bit rate is therefore usually measured in some multiple of bits per second - for example, kilobits, or thousands of bits per second (kbit/s). The term bit rate is a synonym for data transfer rate (or simply data rate). Bit rate tends to be used when discussing transmission technology details, and data transfer rate (or data rate) when comparing transmission technologies for the end user.
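Following that definition, the time to move any payload is just its size divided by the bit rate. A minimal sketch (the helper name `transfer_time` is my own, not from the answer):

```python
def transfer_time(num_bits, bits_per_second):
    """Seconds needed to send num_bits at the given bit rate."""
    return num_bits / bits_per_second

# Sending 1 megabit (10^6 bits) over a 5 Mbit/s link:
print(transfer_time(1_000_000, 5_000_000))  # 0.2 seconds
```

At 5 Mbit/s, one million bits take 0.2 s, consistent with a per-bit period of 0.2 microseconds.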

2007-02-16 03:18:20 · answer #1 · answered by Anonymous · 3 0

One caveat about the equations above: in memory and storage contexts, 1 megabit is sometimes taken to mean 2^20 = 1,048,576 bits (strictly, a mebibit), which is slightly more than 1 million. In data transmission, however, the SI decimal prefixes apply, so 1 megabit = 10^6 bits and the computation above stands.

2007-02-16 11:49:27 · answer #2 · answered by cw 3 · 0 0

Since 1 Mbit is 1,000,000 bits, the answer is 1/5,000,000,
or 0.0000002 seconds,
or better yet, 0.2 microseconds.

2007-02-16 10:59:35 · answer #3 · answered by governorkickass 2 · 0 0
