What is the actual difference between analog and digital signals? In the end, any physical signal put on a medium must be analog in its nature. Is the difference only that a "digital" signal is always completely regenerated by any intermediate devices between sender and receiver, while an "analog" signal is only (if at all) amplified, with all noise accumulating?
What is the relationship between signal frequency, bandwidth, symbol rate, and bitrate? How does analog/digital (see above) come into play here (if at all)?
A digital signal inhabits only discrete states rather than a continuous range of them.
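"Discrete states" can be sketched in a few lines: take any analog voltage and snap it to the nearest of a small set of allowed levels. (The level values below are made up for illustration, e.g. a 4-level, 2-bits-per-symbol line code.)

```python
# Toy illustration: a "digital" signal is restricted to discrete levels.
LEVELS = [-3.0, -1.0, 1.0, 3.0]  # hypothetical 4-level line code

def quantize(v: float) -> float:
    """Snap an arbitrary analog voltage to the nearest allowed level."""
    return min(LEVELS, key=lambda level: abs(level - v))

print(quantize(0.7))   # -> 1.0
print(quantize(-2.4))  # -> -3.0
```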
Baud rate = symbol rate

Bitrate (what people colloquially call "bandwidth") = bits/symbol × symbol rate
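Plugging in illustrative numbers (a hypothetical 2400-baud link with 16 distinct symbols, i.e. 4 bits per symbol):

```python
import math

symbol_rate = 2400          # symbols per second (baud)
num_symbols = 16            # distinct symbol values, e.g. 16-QAM
bits_per_symbol = int(math.log2(num_symbols))  # 4 bits per symbol

# bitrate = bits/symbol * symbol rate
bitrate = bits_per_symbol * symbol_rate
print(bitrate)  # -> 9600 bits per second
```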
Robert Ward
An analog signal can be any value in a range. Generally this means using less advanced processing and hoping the signal travels well, but you could process it if you wanted.
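The amplify-vs-regenerate distinction from the original question can be shown with a toy model (all numbers here are made up): an "analog" repeater just passes the noisy value along, while a "digital" repeater decides which bit it received and retransmits a clean level.

```python
import random

random.seed(0)
LEVELS = {0: -1.0, 1: 1.0}  # hypothetical two-level signaling

def hop(v: float, noise: float = 0.2) -> float:
    """One noisy link segment adds a small random perturbation."""
    return v + random.uniform(-noise, noise)

def analog_chain(bit: int, hops: int) -> float:
    v = LEVELS[bit]
    for _ in range(hops):
        v = hop(v)               # amplifier preserves whatever noise arrived
    return v

def digital_chain(bit: int, hops: int) -> float:
    v = LEVELS[bit]
    for _ in range(hops):
        v = hop(v)
        v = LEVELS[int(v > 0)]   # regenerate: decide the bit, resend clean
    return v

print(analog_chain(1, 50))   # drifts away from 1.0 as noise accumulates
print(digital_chain(1, 50))  # stays exactly at a clean level
```

With noise well below the decision threshold, the digital chain ends on an exact level after any number of hops, while the analog chain random-walks away from it.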
Justin Scott
>What is the actual difference between analog and digital signals?

The signals themselves? Nothing. It's all about how they're interpreted.
Wyatt Cox
This user killed it.
Also, someone more on the telecom side might correct me, but I don't think it makes sense to talk about "symbols" in an analog signal.
>A digital signal inhabits only discrete states rather than a continuous range of them.

That's conceptual only; any physical signal (even one carrying digital data) is analog.
Xavier Carter
An old serial modem signal is considered "analog" even though it carries digital data. How is it different from "digital" signals such as ISDN/xDSL/Ethernet etc.?
Anthony Rodriguez
>An old serial modem signal is considered "analog"

By whom?