Each computer (data terminal) transmits data at a selected data rate (for example, 1.1-19.2 kbps).
This is the rate at which it sends data to the modem.
But each modem also has a second data rate that it uses to transmit over the line (to the other modem); in a synchronous modem this line rate is normally much higher than the terminal's data rate.
Why are there two different data rates?
If the computer transmits at 19.2 kbps and the modem transmits at, for example, 2.048 Mbps, what happens to the extra line capacity that is not being used?
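A quick way to put numbers on the mismatch described above (using the question's own figures, and assuming the 2.048 Mbps line is shared by time-division multiplexing, as on an E1 link):

```python
# Illustrative arithmetic for the rates quoted in the question.
terminal_bps = 19_200      # terminal-to-modem rate (19.2 kbps)
line_bps = 2_048_000       # modem-to-line rate (2.048 Mbps, e.g. an E1 link)

# How many such terminals could share one line via time-division multiplexing?
channels = line_bps // terminal_bps
print(channels)  # 106

# Capacity left over if only a single terminal is connected:
idle_bps = line_bps - terminal_bps
print(idle_bps)  # 2028800 bps would be carried as idle/fill bits
```

This shows the scale of the gap: the line rate is roughly 100 times the terminal rate, which is why a single terminal alone cannot fill it.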
2006-12-20 04:20:09 · 2 answers · asked by Anonymous in Computers & Internet ➔ Computer Networking