For an analog signal, the information is carried directly in the variation of the signal itself. For a monitor, the video drive is the voltage of the signal; for an FM radio, the information is the frequency deviation from the carrier wave.
Analog signals are (in theory) infinitely variable. Returning to the monitor example, there is an infinite number of levels the signal can take between the 0 volt minimum and the 0.7 volt maximum. However, this makes the signal very susceptible to noise.
For digital signals, the information is encoded in the sequence of highs and lows of the signal. This means the signal has to change much faster than the analog one to carry the same information, and it needs extra processing to put it into digital form and to interpret it at the other end. However, because the levels of a digital signal are either high or low, it is very noise resistant.
Mathematically, analog is infinitely variable (the set of all possible numbers, including decimals), while digital is quantized (the set of all possible integers).
2007-10-02 05:52:12 · answer #1 · answered by Simon T 6 · 0⤊ 0⤋
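To make the quantization idea in the answer above concrete, here is a minimal Python sketch (added for illustration; the 0–0.7 V range comes from the monitor example, while the 3-bit resolution and the helper name `quantize` are assumptions):

```python
# Illustrative sketch: quantizing an "analog" voltage into discrete digital
# levels. The 0-0.7 V range comes from the monitor example; the choice of
# 3 bits (8 levels) is an assumption made for this example.

def quantize(voltage, v_min=0.0, v_max=0.7, bits=3):
    """Map a continuous voltage to the nearest of 2**bits discrete levels."""
    levels = 2 ** bits
    # Clamp to the valid range, then scale to an integer code 0 .. levels-1.
    clamped = max(v_min, min(v_max, voltage))
    code = round((clamped - v_min) / (v_max - v_min) * (levels - 1))
    # Reconstruct the stepped voltage that this code represents.
    reconstructed = v_min + code * (v_max - v_min) / (levels - 1)
    return code, reconstructed

# An analog input can sit anywhere between 0 and 0.7 V ...
print(quantize(0.3141))  # ... but the digital code can only be one of 8 values.
print(quantize(0.3145))  # Nearby analog values may collapse to the same code.
```

With only eight possible codes, two slightly different analog voltages can end up as the same digital value, which is the trade-off between infinite variability and noise resistance described above.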
In math it may mean a certain type of wave equation. In computer science, an analog signal is a continuous signal, like a wave, which is the opposite of discrete. For example, a decimal that never ends, such as 3.14159265..., is not discrete, and numbers like these pervade the set of real numbers. Numbers like 1 and 0, on the other hand, are discrete because they have definite values. 1 and 0 in particular are used to represent digital (discrete) signals in computers because they act like an electrical on/off switch (with 1 representing on and 0 representing off).
2007-10-01 11:21:10 · answer #2 · answered by Anonymous · 0⤊ 1⤋
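A small sketch of the discrete/continuous distinction from the answer above (added; the use of Python and the specific values are assumptions, not part of the original answer):

```python
import math

# Illustrative sketch: a discrete integer has an exact, finite binary
# representation, while a continuous value like pi does not.
n = 13
print(format(n, "b"))   # '1101' -- the integer 13 exactly, as on/off bits

print(math.pi)          # 3.141592653589793 -- already a rounded approximation
print(math.pi.hex())    # the finite bit pattern the computer actually stores
```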
A continuously changing value. A clock with a second hand is an analog device because it is capable of indicating every conceivable time of day. A digital clock cannot: it operates in finite increments, e.g., tenths of a second.
2007-10-01 11:22:27 · answer #3 · answered by Steve in NC 7 · 0⤊ 0⤋
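A one-line illustration of the finite increments mentioned above (added; the numbers are made up for the example):

```python
# Illustrative sketch: "analog" time is continuous, but a digital display
# showing tenths of a second can only show one of finitely many values.
t = 7.2389517            # seconds -- continuously variable in principle
digital = round(t, 1)    # what a tenths-of-a-second readout would show
print(t, "->", digital)  # 7.2389517 -> 7.2
```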
Analog signals are transmitted as variations in electrical values, e.g., 20 milliamps or 1.5 volts, while digital signals are represented as binary code, i.e., 1s and 0s that encode certain values.
2007-10-01 11:19:27 · answer #4 · answered by Bruce Almighty 4 · 0⤊ 1⤋
Analog: a continuously varying amount (as in a signal).
Digital: either ON or OFF (like a switch).
;-)=
2007-10-01 11:23:48 · answer #5 · answered by Jcontrols 6 · 0⤊ 1⤋
Basically, it's not digital. It's also a technology that has been in use for more than 50 years to transmit conventional radio, TV, and telephone signals.
What it does is encode information as an analog signal, that is, by varying the voltage and/or frequency of the signal.
2007-10-01 11:20:07 · answer #6 · answered by hel8itch 2 · 0⤊ 2⤋
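As a rough illustration of encoding information by varying voltage, here is a minimal amplitude-modulation sketch in Python (added; the carrier and message frequencies are arbitrary assumptions):

```python
import math

# Illustrative sketch of analog amplitude modulation: the information (a slow
# "message" wave) is encoded directly in the voltage of a fast carrier wave.
carrier_hz = 1000.0    # carrier frequency (assumed value)
message_hz = 50.0      # message frequency (assumed value)
sample_rate = 8000.0   # samples per second, just to print a few points

def am_sample(t):
    message = math.sin(2 * math.pi * message_hz * t)   # the information
    carrier = math.sin(2 * math.pi * carrier_hz * t)   # the carrier wave
    return (1.0 + 0.5 * message) * carrier             # voltage varies with the message

samples = [am_sample(i / sample_rate) for i in range(8)]
print(samples)   # a few points of the modulated (still analog) waveform
```

Varying the carrier's frequency instead of its amplitude would give the FM case mentioned in answer #1.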
Analog most likely means a variation in data related to a physical quantity.
2007-10-01 11:18:46 · answer #8 · answered by rod_dollente 5 · 0⤊ 2⤋
Means you're gonna get some lady and some wood and solve math on your computer.
2007-10-01 11:19:13 · answer #9 · answered by Anonymous · 0⤊ 3⤋