I understand dispersion.

How does 400 MHz of bandwidth deteriorate to 200 MHz of bandwidth while traveling through the fiber?

2007-03-04 08:21:11 · 2 answers · asked by Chi Guy 5 in Science & Mathematics Engineering

2 answers

I assume you are asking why a short line can have a higher bandwidth than a longer line.

It all comes back to dispersion. As the bits are traveling down the line, some of the light modes travel faster than others. If you try to send the bits too fast, some of the faster modes/wavelengths will overrun the slower ones and the bits will blend together. The receiver won't be able to discern between the 1's and 0's. The longer the line, the more the dispersion factors in, which means you need to slow the bits down more in order to keep them separate.

Imagine four cars: one traveling at 40 mph, one at 50 mph, one at 60 mph and one at 70 mph. Start them one second apart (first the 40, then the 50, then the 60, then the 70) and send them down the road. 100 feet down the road they would still be in the same order, even though the ones in the back would be gaining. Two miles down the road they would be in a different order. Just as the cars pass each other, the faster modes and wavelengths overtake the slower ones in a fiber. A rough sketch with numbers is below.
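To put rough numbers on that idea, here is a small Python sketch (my own illustration, not part of the original answer). The 1.5 refractive index and the 0.5% spread in mode speeds are made-up, assumed values; the point is only that the arrival-time spread grows linearly with length, so the usable bit rate falls as the line gets longer.

# Sketch: how a spread in mode speeds limits the usable rate of a fiber.
# All numbers are illustrative assumptions, not measured values.
c = 3e8                    # speed of light in vacuum, m/s
n = 1.5                    # core refractive index (typical silica, assumed)
v_fast = c / n             # fastest mode, m/s
v_slow = v_fast * 0.995    # slowest mode: 0.5% slower (assumed spread)

for length_km in (1, 2, 10, 50):
    L = length_km * 1e3
    # Arrival-time spread between fastest and slowest mode over length L
    spread = L / v_slow - L / v_fast          # seconds
    # To keep bits separate, the bit period must exceed the spread,
    # so the usable rate scales roughly as 1/spread.
    max_rate_mhz = 1e-6 / spread
    print(f"{length_km:>3} km: spread = {spread*1e9:6.1f} ns, "
          f"usable rate ~ {max_rate_mhz:7.1f} MHz")

Doubling the length doubles the spread and halves the usable rate, which is the same kind of drop as the 400 MHz to 200 MHz in the question.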

2007-03-04 09:01:15 · answer #1 · answered by buckeye_brian31 2 · 0 0

Because the Fourier components of a square wave are delayed by different amounts (due to dispersion), the rise and fall times of the wave deteriorate. This makes the exact 'edge' of the wave more difficult to detect as dispersion increases (i.e., as length increases), so the equivalent bandwidth must be reduced to keep errors within acceptable limits.
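A small Python sketch of that argument (my own illustration; the per-hertz delay value is an arbitrary assumption): build a square wave from its odd harmonics, delay the higher harmonics more, and the steepest part of the edge gets noticeably shallower.

import numpy as np

f0 = 1e6                         # fundamental of the bit stream, 1 MHz (assumed)
t = np.linspace(0, 2 / f0, 2000) # two periods
harmonics = range(1, 40, 2)      # odd harmonics of an ideal square wave

def square(t, delay_per_hz=0.0):
    # Sum the odd harmonics; delay_per_hz adds extra delay proportional to
    # each harmonic's frequency (a crude stand-in for dispersion).
    y = np.zeros_like(t)
    for k in harmonics:
        f = k * f0
        tau = delay_per_hz * f
        y += (4 / (np.pi * k)) * np.sin(2 * np.pi * f * (t - tau))
    return y

clean = square(t)                          # sharp edges
dispersed = square(t, delay_per_hz=2e-16)  # higher harmonics arrive later

# The steepest slope on the rising edge is a simple proxy for how easy the
# edge is to detect; dispersion flattens it, so bits must be sent more slowly.
print(f"steepest edge, no dispersion:   {np.gradient(clean, t).max():.2e} /s")
print(f"steepest edge, with dispersion: {np.gradient(dispersed, t).max():.2e} /s")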

HTH ☺

Doug

2007-03-04 16:56:54 · answer #2 · answered by doug_donaghue 7 · 0 0
