No.
I wouldn't expect an amplifier to be driven into saturation.
It's a technique that applies when the transistor is used as a switch.
At saturation the collector-emitter voltage will be low (less than the base-emitter voltage), so the dissipation in the transistor will be low.
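A quick back-of-the-envelope calculation shows why. This is a minimal Python sketch; the 12 V supply, 120 ohm load, and 0.2 V V_CE(sat) are illustrative assumptions, not values from the question:

```python
# Dissipation in the transistor: saturated switch vs. linear operation.
V_SUPPLY = 12.0   # supply voltage, V (assumed)
R_LOAD = 120.0    # collector load, ohm (assumed)

# Saturated ("switch fully on"): V_CE drops to roughly 0.2 V.
v_ce_sat = 0.2
i_c_sat = (V_SUPPLY - v_ce_sat) / R_LOAD
p_sat = v_ce_sat * i_c_sat          # ~20 mW burned in the transistor

# Linear (active) region, biased mid-rail: V_CE is about 6 V.
v_ce_lin = V_SUPPLY / 2
i_c_lin = (V_SUPPLY - v_ce_lin) / R_LOAD
p_lin = v_ce_lin * i_c_lin          # ~300 mW burned in the transistor

print(f"saturated: {p_sat * 1e3:.0f} mW, linear: {p_lin * 1e3:.0f} mW")
```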
2006-08-02 02:30:23 · answer #1 · answered by dmb06851
Well, there are two main cases:
a) amplification of a wideband signal, like speech or music [range of roughly (100 - 15,000) Hz];
b) amplification of a narrowband signal, for example an FM radio transmission at 100 MHz [range of roughly (99.9 - 100.1) MHz].
In case a) it is really important not to distort the signal, i.e. to preserve its spectral characteristics (this is what we perceive when listening to music, speech, etc.).
For this reason we should not saturate the transistor, because that would lead to clipping, i.e. cutting off the signal in the amplitude domain and changing its spectral characteristics (enriching the spectrum).
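Here is a small numerical sketch of that clipping effect, assuming a made-up 1 kHz test tone (Python with numpy; not part of the original answer):

```python
import numpy as np

# Hard-clip a pure 1 kHz tone and compare spectra: clipping adds
# odd harmonics (3 kHz, 5 kHz, ...) that the clean tone never had.
fs = 48_000                               # sample rate, Hz (assumed)
t = np.arange(fs) / fs                    # one second of samples
tone = np.sin(2 * np.pi * 1000 * t)
clipped = np.clip(2.5 * tone, -1.0, 1.0)  # overdrive, then clip

for name, sig in [("clean", tone), ("clipped", clipped)]:
    spectrum = np.abs(np.fft.rfft(sig)) / len(sig)
    freqs = np.fft.rfftfreq(len(sig), 1 / fs)
    print(name, freqs[spectrum > 0.005][:5])
    # clean:   [1000.]
    # clipped: [1000. 3000. 5000. 7000. 9000.]
```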
However, in case b) we can saturate the transistor. This enriches the spectrum, i.e. adds new spectral components at other frequencies, far away (!) from the (99.9 - 100.1) MHz band we are working in. Then, even a simple analog filter can remove those other frequencies so that we transmit only the band we are interested in.
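A companion sketch for the narrowband case: a hard-saturated carrier is (roughly) a square wave, yet a band filter around the carrier recovers a clean sine. The carrier here is scaled down from 100 MHz to 100 kHz just to keep the simulation small; all values are illustrative:

```python
import numpy as np

# A fully saturated stage turns the carrier into a square wave, adding
# harmonics at 3*f0, 5*f0, ... far from the working band.
fs = 10_000_000                                  # sample rate, Hz (assumed)
f0 = 100_000                                     # scaled-down "carrier", Hz
t = np.arange(100_000) / fs                      # 10 ms of samples
saturated = np.sign(np.sin(2 * np.pi * f0 * t))  # hard-limited carrier

# Brick-wall band-pass via FFT masking: keep only bins within 10 kHz of f0.
spectrum = np.fft.rfft(saturated)
freqs = np.fft.rfftfreq(len(saturated), 1 / fs)
spectrum[np.abs(freqs - f0) > 10_000] = 0
recovered = np.fft.irfft(spectrum, n=len(saturated))

# What survives is the fundamental: a sine of amplitude 4/pi.
print(round(recovered.max(), 3))                 # ~1.273
```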
The good thing about saturating a transistor is that it is driven over a much (!) bigger range (i.e. right up to its limits), so its efficiency can be much higher.
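A rough efficiency sketch under assumed values (12 V supply, 0.2 V V_CE(sat), 120 ohm load; none of these numbers come from the answer):

```python
# Fraction of supply power that reaches the load, switched vs. linear.
V, V_SAT, R = 12.0, 0.2, 120.0   # illustrative supply, V_CE(sat), load

# Saturated switching at 50% duty: the transistor is either fully on
# (V_CE ~ 0.2 V) or fully off (no current), so it wastes almost nothing.
i_on = (V - V_SAT) / R
p_supply = 0.5 * V * i_on            # average power drawn from the supply
p_load = 0.5 * (V - V_SAT) * i_on    # average power delivered to the load
print(f"switching: {p_load / p_supply:.1%}")        # ~98%

# Linear stage biased mid-rail: at the quiescent point the transistor
# itself already dissipates half of what the supply provides.
i_bias = (V / 2) / R
print(f"linear bias point: {(i_bias**2 * R) / (V * i_bias):.1%}")  # 50%
```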
2006-08-02 09:40:35 · answer #2 · answered by Marian