??? 10/13/06 15:18
#126425 - As simple as I can make it
Responding to: ???'s previous message
What Mr Shannon is saying is that any communications channel has a limit to the amount of information it can transmit, information in this case being the entropy of the data being sent. The limit grows with the channel bandwidth and with the signal-to-noise ratio of the channel.
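For reference, that limit is usually written as the Shannon-Hartley formula C = B * log2(1 + S/N), where B is the channel bandwidth in hertz and S/N is the linear (not dB) signal-to-noise ratio. A quick back-of-envelope sketch in C, using illustrative numbers for a 3 kHz telephone-grade line at 30 dB SNR (my numbers, not anything from the paper):

/* Sketch of the Shannon-Hartley capacity C = B * log2(1 + S/N).
 * The 3 kHz bandwidth and 30 dB SNR are illustrative values only. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    double bandwidth_hz = 3000.0;                    /* assumed channel bandwidth   */
    double snr_db       = 30.0;                      /* assumed signal-to-noise, dB */
    double snr_linear   = pow(10.0, snr_db / 10.0);  /* convert dB to a ratio       */
    double capacity_bps = bandwidth_hz * log2(1.0 + snr_linear);

    printf("Capacity = %.0f bit/s\n", capacity_bps); /* about 29.9 kbit/s */
    return 0;
}

That comes out at roughly 30 kbit/s, which is close to where analogue telephone-line modems topped out.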
The reason noise prevents us from transmitting data at the theoretical channel capacity is that the symbols transmitted over the channel to represent the data can be seen as a set of points in some phase space. The receiver maps each received symbol onto this same phase space and decides which symbol was sent by finding the closest member of the symbol set, closest being measured by some Euclidean distance metric. Receiver noise moves the received symbols away from their intended positions in the phase space; clearly, if the noise amplitude is greater than half the Euclidean distance between adjacent set members, the receiver will decode the received symbol incorrectly, giving a bit error. The bit error rate therefore depends on the Euclidean distance between the members of the symbol set: the closer together they sit, the more errors you get.

What Shannon's theory says is that although we cannot exceed the channel capacity by any possible coding method, we can get arbitrarily close to the theoretical limit. In practice that means maximizing the Euclidean distance between the members of the symbol set in the phase space, which is basically what we are trying to do when using phase/amplitude/quadrature modulation.

What Viterbi decoding does is find not the closest member of the symbol set but the one most likely to have been sent, given constraints on how the transmitted symbols can move through the phase space and the past behaviour of the communications channel. Convolutional coding constrains how the transmitted symbols can move from one symbol to the next in the receiver's phase space, thereby allowing the decoder to make informed guesses about the transmitted data.
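To make those "informed guesses" concrete, here is a rough, hard-decision Viterbi decoder in C for the common rate-1/2, constraint-length-3 convolutional code (generator polynomials 7 and 5 octal). This is my own illustrative example, not anything specific from Shannon or Viterbi: it uses Hamming distance between received and expected coded bits as the branch metric, where a real soft-decision receiver would use the Euclidean distances discussed above.

/* Hard-decision Viterbi decoder for the rate-1/2, K=3 convolutional
 * code with generator polynomials 7 and 5 (octal). Branch metrics are
 * Hamming distances; a real receiver would use soft (Euclidean) metrics. */
#include <stdio.h>
#include <string.h>

#define NSTATES 4                 /* 2^(K-1) trellis states            */
#define MAXLEN  64                /* message length limit, sketch only */
#define INF     1000000

/* Coded 2-bit symbol produced when 'bit' is shifted into state 's'. */
static int branch_output(int s, int bit)
{
    int s1 = (s >> 1) & 1, s0 = s & 1;
    int c0 = bit ^ s1 ^ s0;       /* generator 111 (octal 7) */
    int c1 = bit ^ s0;            /* generator 101 (octal 5) */
    return (c0 << 1) | c1;
}

/* Next trellis state when 'bit' is shifted into state 's'. */
static int next_state(int s, int bit)
{
    return ((bit << 1) | ((s >> 1) & 1)) & 3;
}

/* Convolutional encoder: one 2-bit symbol per input bit. */
static void encode(const int *bits, int nbits, int *syms)
{
    int s = 0, i;
    for (i = 0; i < nbits; i++) {
        syms[i] = branch_output(s, bits[i]);
        s = next_state(s, bits[i]);
    }
}

/* Viterbi decoder: keep, for every state, the path whose expected
 * symbols are closest (in Hamming distance) to what was received. */
static void viterbi_decode(const int *syms, int nsyms, int *bits)
{
    int metric[NSTATES], newmetric[NSTATES];
    int prev[MAXLEN][NSTATES], inbit[MAXLEN][NSTATES];
    int t, s, b;

    for (s = 0; s < NSTATES; s++)
        metric[s] = (s == 0) ? 0 : INF;   /* encoder starts in state 0 */

    for (t = 0; t < nsyms; t++) {
        for (s = 0; s < NSTATES; s++)
            newmetric[s] = INF;
        for (s = 0; s < NSTATES; s++) {
            if (metric[s] >= INF)
                continue;                 /* state not yet reachable */
            for (b = 0; b < 2; b++) {
                int ns  = next_state(s, b);
                int err = branch_output(s, b) ^ syms[t];
                int d   = ((err >> 1) & 1) + (err & 1);   /* Hamming distance */
                if (metric[s] + d < newmetric[ns]) {
                    newmetric[ns] = metric[s] + d;
                    prev[t][ns]   = s;
                    inbit[t][ns]  = b;
                }
            }
        }
        memcpy(metric, newmetric, sizeof(metric));
    }

    /* The tail of zero flush bits forces the encoder back to state 0,
     * so trace the surviving path back from there. */
    s = 0;
    for (t = nsyms - 1; t >= 0; t--) {
        bits[t] = inbit[t][s];
        s = prev[t][s];
    }
}

int main(void)
{
    int tx[8] = { 1, 0, 1, 1, 0, 0, 0, 0 };   /* 6 data bits + 2 flush zeros */
    int syms[8], decoded[8], i;

    encode(tx, 8, syms);
    syms[3] ^= 1;                              /* simulate one channel bit error */
    viterbi_decode(syms, 8, decoded);

    printf("sent    : ");
    for (i = 0; i < 8; i++) printf("%d", tx[i]);
    printf("\ndecoded : ");
    for (i = 0; i < 8; i++) printf("%d", decoded[i]);
    printf("\n");
    return 0;
}

Flipping one coded bit in main() simulates the noise moving a received symbol to the wrong side of the decision boundary; the traceback still recovers the transmitted bits because the constraint imposed by the encoder (free distance 5 for this code) lets the decoder reject the path that a simple nearest-symbol decision would have taken.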