Physical data transmission

What is the actual difference between analog and digital signals? In the end, any physical signal put on a medium must be analog in nature. Is the difference only that a "digital" signal is completely regenerated by any intermediate devices between sender and receiver, while an "analog" signal is only (if at all) amplified, with all the noise accumulating along the way?

What is the relationship between signal frequency, bandwidth, symbol rate, and bitrate? How does analog/digital (see above) come into play here (if at all)?

Attached: 187-Physical-layer-rapresenting signals.jpg (510x344, 13K)

Other urls found in this thread:

pcmag.com/encyclopedia/term/37751/analog-modem
en.wikipedia.org/wiki/Modulation

A digital signal inhabits only discrete states rather than a continuous range of them.
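To make that concrete, here's a toy sketch (mine, not from the thread) of the same waveform treated both ways: "analog" samples can take any value in the range, while the "digital" version snaps each one to a handful of discrete states. The 4-level quantizer is just an arbitrary illustration.

import math

def sample_wave(n_samples=16):
    """Continuous-valued ("analog") samples of a sine wave in [-1, 1]."""
    return [math.sin(2 * math.pi * i / n_samples) for i in range(n_samples)]

def quantize(value, levels=4):
    """Snap a value in [-1, 1] to one of `levels` evenly spaced discrete states."""
    step = 2.0 / (levels - 1)
    return round((value + 1.0) / step) * step - 1.0

analog = sample_wave()
digital = [quantize(v, levels=4) for v in analog]

for a, d in zip(analog, digital):
    print(f"analog {a:+.3f}  ->  digital {d:+.3f}")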

Baud rate = symbol rate
bitrate = bits/symbol * symbol rate
(People often call the bitrate "bandwidth", but strictly bandwidth is the width in Hz of the frequency band the signal occupies; the two are related through the modulation scheme, they aren't the same thing.)
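Rough numbers for that formula, with illustrative schemes and rates I'm assuming for the example (not from the thread):

import math

def bitrate(bits_per_symbol, symbol_rate_baud):
    """Bitrate in bit/s from bits carried per symbol and the symbol (baud) rate."""
    return bits_per_symbol * symbol_rate_baud

# 2 states per symbol (e.g. plain on/off keying): 1 bit per symbol
print(bitrate(1, 2400))                 # 2400 bit/s at 2400 baud

# 16 states per symbol (e.g. 16-QAM): log2(16) = 4 bits per symbol
print(bitrate(math.log2(16), 2400))     # 9600.0 bit/s at the same 2400 baud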

An analog signal can be any value in a range. Generally this means using less advanced processing and hoping the signal travels well, but you could process it if you wanted.

>What is the actual difference between analog and digital signals?
the signals themselves? nothing
it's all about how they're interpreted

This user killed it.

Also, someone more on the telecom side might correct me, but I don't think it makes sense to talk about "symbols" on an analog signal

analog: weeeeooeewwwwwweeeeewwwweeee
digital: beep boop boop boop beep beep boop

That should explain it.

>A digital signal inhabits only discrete states rather than a continuous range of them.
That's conceptual only; any physical signal, even one carrying digital data, is analog.
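Which is also the answer to OP's "refreshed vs amplified" question. The waveform on the wire is analog either way, but a repeater that knows the signal encodes discrete states can re-decide each bit and re-transmit it clean (regeneration), while a plain analog amplifier just boosts whatever noise has piled up. Minimal sketch of my own; the noise level and hop count are arbitrary assumptions:

import random

random.seed(0)
BITS = [1, 0, 1, 1, 0, 0, 1, 0]

def add_noise(levels, sigma=0.15):
    """One noisy hop over the medium."""
    return [v + random.gauss(0, sigma) for v in levels]

def regenerate(levels, threshold=0.5):
    """Digital repeater: decide 0/1 against a threshold, re-transmit clean levels."""
    return [1.0 if v > threshold else 0.0 for v in levels]

# Digital path: noise is wiped out at every hop by re-deciding the bits.
digital = [float(b) for b in BITS]
for _ in range(5):
    digital = regenerate(add_noise(digital))

# Analog path: the same levels are only passed along, so noise accumulates.
analog = [float(b) for b in BITS]
for _ in range(5):
    analog = add_noise(analog)

print("sent:       ", BITS)
print("regenerated:", [int(v) for v in digital])
print("analog out: ", [round(v, 2) for v in analog])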

An old serial modem signal is considered "analog" even though it carries digital data. How is it different from "digital" signals such as ISDN/xDSL/Ethernet etc.?

>An old serial modem signal is considered "analog"
by who?

pcmag.com/encyclopedia/term/37751/analog-modem
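The pcmag point in code form: a dial-up modem takes digital bits and produces a continuously varying audio waveform, e.g. by shifting the carrier frequency per bit (FSK). This is my own simplified sketch with Bell 103-style frequencies, not the actual V-series modem spec:

import math

SAMPLE_RATE = 8000            # samples per second (assumed)
BAUD = 300                    # symbols per second; here 1 bit per symbol
FREQ_0, FREQ_1 = 1070, 1270   # Hz for a 0 bit / 1 bit (Bell 103-style values)

def modulate(bits):
    """Map each bit to a burst of analog sine samples at the matching frequency."""
    samples_per_bit = SAMPLE_RATE // BAUD
    out = []
    for bit in bits:
        freq = FREQ_1 if bit else FREQ_0
        for n in range(samples_per_bit):
            t = n / SAMPLE_RATE
            out.append(math.sin(2 * math.pi * freq * t))
    return out

waveform = modulate([1, 0, 1, 1, 0])
print(len(waveform), "analog samples for 5 digital bits")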