You might find some help on Wikipedia.org...
2007-03-19 23:04:55 · answer #1 · answered by Anonymous · 3⤊ 0⤋
A CCD transports the charge across the chip and reads it at one corner of the array. An analog-to-digital converter (ADC) then turns each pixel's value into a digital value by measuring the amount of charge at each photosite and converting that measurement to binary form.
CMOS devices use several transistors at each pixel to amplify and move the charge over more traditional wires. Digitization is typically handled by circuitry built into the CMOS chip itself, so no separate off-chip ADC is needed.
(Image: photons hitting a photosite and releasing electrons)
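If it helps, the ADC step described above is basically just scaling a charge reading into a fixed range of binary codes. Here's a minimal Python sketch of that quantization; the full-well capacity and bit depth are made-up numbers for illustration, not any particular sensor's specs.

```python
# Rough sketch (not any vendor's actual pipeline) of the quantization step
# an ADC performs: mapping an analog charge reading to a binary pixel value.

FULL_WELL_ELECTRONS = 20000   # assumed full-well capacity of one photosite
BIT_DEPTH = 12                # assumed ADC resolution (12-bit -> codes 0..4095)

def adc_convert(charge_electrons: float) -> int:
    """Map a photosite's collected charge to a digital number (DN)."""
    # Clip to the sensor's full-well capacity, then scale to the ADC range.
    charge = max(0.0, min(charge_electrons, FULL_WELL_ELECTRONS))
    max_dn = (1 << BIT_DEPTH) - 1
    return round(charge / FULL_WELL_ELECTRONS * max_dn)

# Example: a half-full photosite reads out as roughly the mid-scale code.
print(adc_convert(10000))        # ~2048
print(bin(adc_convert(10000)))   # the binary form the camera actually stores
```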
Differences between the two types of sensors lead to a number of pros and cons:
CCD sensors create high-quality, low-noise images. CMOS sensors are generally more susceptible to noise.
Because each pixel on a CMOS sensor has several transistors located next to it, the light sensitivity of a CMOS chip is lower. Many of the photons hit the transistors instead of the photodiode.
CMOS sensors traditionally consume little power. CCDs, on the other hand, use a readout process that can consume as much as 100 times more power than an equivalent CMOS sensor.
CCD sensors have been mass produced for a longer period of time, so they are more mature. They tend to have higher quality pixels, and more of them.
2007-03-18 15:34:39 · answer #2 · answered by PhotoWhiz 1 · 0⤊ 0⤋
I shoot with cameras using both CMOS and CCD sensors...and both produce great images. You won't notice any difference in quality between current products.
I can remember hearing some negatives about CMOS sensors when they were first introduced, but I was still shooting film back then...waiting for digital to catch up.
2007-03-12 09:07:19 · answer #3 · answered by Greg S 5 · 0⤊ 0⤋
CMOS is much faster and less noisy.
2007-03-17 16:32:50 · answer #4 · answered by dand370 3 · 0⤊ 0⤋