Physical data transmission

What is the actual difference between analog and digital signals? In the end, any physical signal put on a medium must be analog in nature. Is the difference only that a "digital" signal is completely regenerated by any intermediate devices between the sender and receiver, while an "analog" signal is only (if at all) amplified, with all the noise accumulating?

What is the relationship between signal frequency, bandwidth, symbol rate, and bitrate? How does analog/digital (see above) come into play here (if at all)?

Attached: 187-Physical-layer-rapresenting signals.jpg (510x344, 13K)

A digital signal inhabits only discrete states rather than a continuous range of them.

Baud rate = symbol rate
Bitrate = bits per symbol * symbol rate
Bandwidth (in Hz) isn't literally the bitrate, even though people use the word that way: it limits how fast you can signal at all and, together with SNR, how many bits per second you can push through.
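
Quick Python sketch of how those quantities hang together (the bandwidth, constellation size and SNR below are made-up illustrative numbers, not any real standard):

# How symbol rate, bits/symbol, bitrate and channel bandwidth relate (illustrative values only).
import math

bandwidth_hz = 1e6                      # channel bandwidth B in Hz (assumed)
symbol_rate = 2 * bandwidth_hz          # Nyquist: at most 2B symbols/s through an ideal B-Hz channel
constellation_size = 16                 # e.g. 16-QAM -> 4 bits per symbol
bits_per_symbol = math.log2(constellation_size)

bitrate = symbol_rate * bits_per_symbol                    # bit/s actually carried
snr_linear = 100                                           # 20 dB SNR (assumed)
shannon_limit = bandwidth_hz * math.log2(1 + snr_linear)   # hard upper bound in bit/s

print(f"symbol rate: {symbol_rate:.0f} Bd")
print(f"bitrate:     {bitrate:.0f} bit/s")
print(f"Shannon cap: {shannon_limit:.0f} bit/s")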

An analog signal can be any value in a range. Generally this means using less advanced processing and hoping the signal travels well, but you could process it if you wanted.

>What is the actual difference between analog and digital signals?
the signals themselves? nothing
it's all about how they're interpreted

This user killed it.

Also, someone more on the telecom side might correct me, but I don't think it makes sense to talk about "symbols" on an analog signal

analog: weeeeooeewwwwwweeeeewwwweeee
digital: beep boop boop boop beep beep boop

That should explain it.

>A digital signal inhabits only discrete states rather than a continuous range of them.
That's conceptual only, any physical signal (even if it carries digital data) is analog.

An old serial modem signal is considered "analog" even though it carries digital data. How is it different from "digital" signals such as ISDN/xDSL/Ethernet etc.?

>An old serial modem signal is considered "analog"
by who?

pcmag.com/encyclopedia/term/37751/analog-modem

Computer data is always digital, yet a 56k modem signal travelling over a POTS phone line is considered analog even though it carries digital data. ISDN and xDSL also use the same phone cabling for the last mile, but are considered digital. What is the difference?

presumably they mean that it's designed to operate over a primarily-analog medium (POTS). "analog modem" is redundant though, since MODulator/DEModulator already means converting between a digital and an analog signal
>but then why does it need converting at all?
sending a digital signal over carefully-routed traces on a circuit board and sending it over 50 km of nasty old copper are two very different things; the latter needs various extra things like error correction to function reliably
note that ISDN/*DSL/Ethernet modems need to do the same sort of things as well, though
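
As a toy illustration of those "extra things" (this is only error *detection* via a single even-parity bit; real modems use much stronger coding like CRCs and FEC, so treat it as a sketch of the idea only):

# Toy even-parity check: detects any single flipped bit in a byte.
def add_parity(byte: int) -> tuple[int, int]:
    parity = bin(byte).count("1") % 2           # 1 if the byte has an odd number of 1-bits
    return byte, parity

def check_parity(byte: int, parity: int) -> bool:
    return bin(byte).count("1") % 2 == parity   # True if no (odd number of) bits flipped

byte, p = add_parity(0b10110010)
corrupted = byte ^ 0b00001000                   # simulate one bit flipped by line noise
print(check_parity(byte, p))        # True  -> accept
print(check_parity(corrupted, p))   # False -> ask for a retransmit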

>convert between an analog and digital signal
But what do you consider the "digital signal"? Just the raw bit pattern, abstracting away from any physical carrier?

actually just look up what modulation is, that basically explains the need for modems and what they're actually doing
en.wikipedia.org/wiki/Modulation

in this case, the "digital signal" i'm referring to is the TTL signal used by digital circuits

Attached: frequency-shift-key.png (414x287, 30K)
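
Rough Python sketch of what that picture shows: binary FSK keys each bit to one of two carrier frequencies (the sample rate, baud rate and tone frequencies below are just illustrative picks, loosely Bell-103-ish):

# Binary FSK sketch: each bit selects one of two tones.
import numpy as np

fs = 48_000             # sample rate, Hz (assumed)
baud = 300              # symbol rate, 1 bit per symbol here
f0, f1 = 1_070, 1_270   # "space" and "mark" tones, Hz (illustrative)
bits = [1, 0, 1, 1, 0]

samples_per_bit = fs // baud
t = np.arange(samples_per_bit) / fs
waveform = np.concatenate(
    [np.sin(2 * np.pi * (f1 if b else f0) * t) for b in bits]
)
# `waveform` is the analog voltage that would actually go onto the line.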

The only difference is the interpretation.

Digital signals have a predefined set of possible levels. Analogue signals carry the information in their amplitude/frequency/phase.

For example, if you were a digital receiver that expected its input to be driven to either 0V or 5V, anything below 2.5V would be considered logical level 0, while anything higher than 2.5V would be considered logical level 1.

The information is contained in the order of logical levels, not in their actual amplitude.

In an analogue system, 3V and 5V would represent completely different values, which is why it's a billion times easier for noise to degrade the signal.

Digital doesn't care about noise until it gets so bad that you can't tell your logical levels apart anymore.
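
Sketch of that slicing decision: the receiver only cares which side of the threshold each sample lands on, so moderate noise vanishes completely (the 5V logic and the noise level are just this example's assumptions):

# Threshold ("slicer") sketch: noise below the decision margin has zero effect on the recovered bits.
import random

V_HIGH, V_LOW, THRESHOLD = 5.0, 0.0, 2.5    # idealised 5V logic from the example above

tx_bits = [1, 0, 0, 1, 1, 0, 1]
tx_voltages = [V_HIGH if b else V_LOW for b in tx_bits]

# Add up to +/-1V of noise -- ugly for an analog signal, harmless for this digital one.
rx_voltages = [v + random.uniform(-1.0, 1.0) for v in tx_voltages]
rx_bits = [1 if v > THRESHOLD else 0 for v in rx_voltages]

assert rx_bits == tx_bits    # recovered exactly, despite the noise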

On any long-distance digital system, though, you'll likely be modulating the actual series of bits into symbols. QAM (modulating both amplitude and phase) is usually the default option, but simpler systems might choose PSK instead (phase only). A larger constellation means more bits per symbol, but the points sit closer together and are therefore harder to tell apart once you add noise and attenuation.
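
To make the constellation point concrete, here's a sketch of mapping bit groups onto square-QAM points; bits per symbol grow as log2 of the constellation size while the points pack closer together (no Gray coding or power normalisation, purely illustrative):

# Square-QAM sketch: log2(M) bits per symbol, points on a (sqrt(M) x sqrt(M)) grid.
import math

def qam_map(bits: list[int], m: int) -> list[complex]:
    # Map a bit list onto an M-QAM constellation (M an even power of two, e.g. 16, 64, 256).
    k = int(math.log2(m)) // 2                            # bits per axis (I and Q)
    levels = [2 * i - (2**k - 1) for i in range(2**k)]    # e.g. [-3, -1, 1, 3] for 16-QAM
    symbols = []
    for i in range(0, len(bits), 2 * k):
        chunk = bits[i:i + 2 * k]
        i_idx = int("".join(map(str, chunk[:k])), 2)
        q_idx = int("".join(map(str, chunk[k:])), 2)
        symbols.append(complex(levels[i_idx], levels[q_idx]))
    return symbols

print(qam_map([1, 0, 1, 1, 0, 0, 0, 1], m=16))   # 8 bits -> 2 symbols of 4 bits each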

based

Thanks, cool info

>anything below 2.5V would be considered logical level 0, while anything higher than 2.5V would be considered logical level 1.
Typically you can't rely on that. The standard defines the ranges in which a signal must be interpreted as a logical 1 or 0; in between you might be fucked by "undefined behaviour".

Typically you're not sending literal bit levels either; it'll at least be line-coded as NRZ or Manchester. But that's a good way of explaining it without getting too technical.
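
Sketch of the two line codes mentioned: NRZ puts the bit level straight on the wire, while (IEEE 802.3-style) Manchester splits each bit into two half-bit symbols, so the symbol rate on the wire is twice the bit rate:

# Line-coding sketch: NRZ vs. Manchester (802.3 convention: 0 -> high-then-low, 1 -> low-then-high).
def nrz(bits):
    return list(bits)                    # one symbol per bit: the level is the bit

def manchester(bits):
    out = []
    for b in bits:
        out += [0, 1] if b else [1, 0]   # two half-bit symbols per bit
    return out

bits = [1, 0, 1, 1, 0]
print(nrz(bits))          # [1, 0, 1, 1, 0]
print(manchester(bits))   # [0, 1, 1, 0, 0, 1, 0, 1, 1, 0] -- guaranteed transition every bit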

>Manchester
Why is it not used for FastEthernet? It has at least one symbol per cycle and one bit per symbol, so for a 100MHz cable (such as Cat5e) it should be able to do 100Mbit.

except it's never exact
>beep boop boop boop beep beep boop
more like
>beap boop boup boop beip beep boap

bump

So basically there are two analog physical signals: one is as close as possible to the ideal, would-be "digital" waveform, and the other is a fixed-frequency carrier that the first one gets impressed onto.
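
Roughly, yes; for amplitude-style keying it literally is the near-square baseband waveform impressed onto a fixed-frequency carrier. A sketch with arbitrary numbers (real modems use fancier modulation, see the FSK/QAM posts above):

# ASK/OOK sketch: square-ish baseband multiplied onto a fixed-frequency carrier (all numbers arbitrary).
import numpy as np

fs, baud, f_carrier = 48_000, 1_200, 8_000   # sample rate, symbol rate, carrier frequency
bits = [1, 0, 1, 1, 0, 0, 1]

spb = fs // baud
baseband = np.repeat(bits, spb).astype(float)    # the "would-be digital" waveform
t = np.arange(len(baseband)) / fs
carrier = np.sin(2 * np.pi * f_carrier * t)      # the fixed-frequency analog signal
line_signal = baseband * carrier                 # what actually goes onto the wire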

I like Manchester. I use it with FSK in RFICs. It's cool how, if you use a chip rate of 4800 baud and a BT of 0.5, you basically get a stream you can push through the audio filters of analogue repeaters.
Purely digital $5 RFICs can transmit said stream, and analogue FM receivers see it as MSK tones.