
2007-09-13 11:08:07 · 3 answers · asked by lord_andys_new_id 1 in Science & Mathematics Engineering

3 answers

Yes, you can, but the quality of the resulting DAC is not likely to be a true 16 bits.

What you have to do is attenuate the output of the 2nd DAC so that its full-scale output equals one LSB of the 1st DAC, and then sum the two signals. You can do this with a resistive network.

The accuracy of such a 16-bit DAC is not likely to be a true "16 bits", because the linearity of the 1st DAC is not likely to be at the 16-bit level. For example, if the linearity of the 8-bit DAC is +/- LSB/2, the error is 1 part in 512 of full scale, so the best linearity spec you can claim is 9 bits. You end up with a DAC that has 16 bits of resolution but is linear to only 9 bits. Not very useful.
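The attenuate-and-sum scheme described above can be sketched behaviorally. This is an idealized model with names of my own choosing; a real resistor network would introduce matching errors of its own, which is exactly why the linearity caveat matters.

```python
# Idealized model of two 8-bit DACs combined into one 16-bit DAC.
# The 16-bit code is split into a coarse byte (top 8 bits) and a
# fine byte (bottom 8 bits); the fine DAC's full-scale output is
# attenuated by 256 so it spans exactly one LSB of the coarse DAC.

def two_dac_output(code16, vref=1.0):
    coarse = (code16 >> 8) & 0xFF        # drives the 1st (coarse) DAC
    fine = code16 & 0xFF                 # drives the 2nd (fine) DAC
    v_coarse = coarse / 256 * vref
    v_fine = (fine / 256 * vref) / 256   # attenuated by the resistive network
    return v_coarse + v_fine

# With ideal DACs this reproduces the ideal 16-bit output exactly:
assert two_dac_output(0xABCD) == 0xABCD / 65536
```

With ideal elements the arithmetic works out perfectly; the point of the answer is that real 8-bit parts are only trimmed to 8-bit-class linearity, so the low bits of the combined converter are swamped by the coarse DAC's errors.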

2007-09-16 20:31:09 · answer #1 · answered by Robert T 4 · 0 0

In a straight 'linear' fashion, you can only make a 9-bit ADC out of two 8-bit ADCs.

If you add an 8-bit DAC and some analog circuitry, it can be done. LeCroy used such a circuit in some of the first 200+ MSPS digitizers in the mid-1980s: two 4-bit flash converters, a fast 4-bit DAC, feedback, and some front-end analog circuitry combined to make a fast 8-bit ADC.

I'd have to do a little more research to remember exactly how it is done, but I know it can be done.
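The subranging (two-step) approach tlbs101 recalls can be modeled roughly like this. This is a behavioral sketch with hypothetical names, not the actual LeCroy circuit: a coarse flash conversion, a DAC and subtractor to form the residue, and a second flash pass on the amplified residue.

```python
# Behavioral sketch of a subranging ADC built from two 4-bit flash
# converters and a 4-bit DAC (idealized; real designs need overlap
# bits and error correction between the two stages).

def flash4(v, vref=1.0):
    """Ideal 4-bit flash ADC: quantize v to a code 0..15."""
    code = int(v / vref * 16)
    return max(0, min(15, code))

def subranging_adc(v, vref=1.0):
    coarse = flash4(v, vref)              # first pass: top 4 bits
    residue = v - coarse / 16 * vref      # 4-bit DAC + analog subtractor
    fine = flash4(residue * 16, vref)     # amplify residue by 16, second pass
    return (coarse << 4) | fine           # combine into an 8-bit code

assert subranging_adc(0.5) == 128         # mid-scale input -> code 0x80
```

The key idea is that neither flash converter ever has to resolve more than 4 bits; the DAC and subtractor hand the fine converter only the leftover voltage within one coarse step.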


2007-09-13 18:25:32 · answer #2 · answered by tlbs101 7 · 0 0

I think not, since the accuracy limitation is in resolving 1 part in 256 (about +/-20 mV in 5 volts), not 1 part in 65536.

You could only cascade them if you had a reference of 256 volts and a full-scale output of 256 volts.
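The arithmetic behind the figures in this answer (assuming the 5 V full scale it quotes):

```python
# Step sizes for the two resolutions at a 5 V full scale.
lsb_8bit = 5.0 / 256      # ~19.5 mV, matching the "+/-20 mV" quoted above
lsb_16bit = 5.0 / 65536   # ~76 uV, the level a true 16-bit DAC must resolve
ratio = lsb_8bit / lsb_16bit  # 256x tighter requirement
```

An 8-bit part specified to roughly one 8-bit LSB of error is about 256 times too coarse to guarantee 16-bit accuracy, which is the same conclusion answer #1 reaches via the linearity spec.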

2007-09-13 19:19:32 · answer #3 · answered by Anonymous · 0 0
