Next: Problem 7 - Newton
Up: Information Science I
Previous: Problem 5 - Turing
Consider a binary symmetric channel such that, when 0 or 1 is sent, it is received correctly with probability $q$ ($0 \le q \le 1$), and incorrectly (1 for 0, or 0 for 1) with probability $1-q$. Answer the following questions.
- 1.
- Assume that the information source at the sender produces 0 with probability $p$ ($0 \le p \le 1$) and 1 with probability $1-p$. Compute the entropy $H$ of this information source.
- 2.
- Compute the probability $r$ that 0 is received.
- 3.
- When 0 is received at the receiver side, with what probabilities did the sender send 0 and 1? Compute the entropy $H_0$ of an information source which generates 0 and 1 with these probabilities. Similarly, consider the other case, in which 1 is received, and compute the corresponding entropy $H_1$.
- 4.
- Let $H'$ be the average of $H_0$ and $H_1$ weighted by the probabilities $r$ and $1-r$ of receiving 0 and 1 respectively:
$$H' = r H_0 + (1-r) H_1.$$
Then, show that $H' \le H$.
Answers:
- 1.
- The entropy of the source is
$$H = -p \log_2 p - (1-p) \log_2 (1-p).$$
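The binary entropy of a source can be checked numerically. A minimal sketch in Python, assuming the notation above (the function name `h` is illustrative):

```python
import math

def h(x):
    """Binary entropy in bits: h(x) = -x*log2(x) - (1-x)*log2(1-x)."""
    if x in (0.0, 1.0):
        return 0.0  # x*log2(x) -> 0 as x -> 0, so the entropy is 0 at the endpoints
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

# A fair source (p = 0.5) has maximal entropy, 1 bit per symbol:
print(h(0.5))   # 1.0
# A biased source (p = 0.25) carries less information per symbol:
print(h(0.25))  # ~0.8113
```

The endpoint check avoids `log2(0)`; by convention $0 \log_2 0 = 0$.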
- 2.
- We have the following transition probability matrix (rows indexed by the sent symbol, columns by the received symbol):
$$\begin{pmatrix} q & 1-q \\ 1-q & q \end{pmatrix}$$
And we have the relations $P(0 \text{ sent}) = p$ and $P(1 \text{ sent}) = 1-p$, so, by the law of total probability,
$$r = P(0 \text{ received}) = P(0 \text{ sent})\, P(0 \text{ recv} \mid 0 \text{ sent}) + P(1 \text{ sent})\, P(0 \text{ recv} \mid 1 \text{ sent}).$$
Thus:
$$r = pq + (1-p)(1-q).$$
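As a numeric sanity check of the total-probability computation (the values `p = 0.25` and `q = 0.9` are illustrative, in the notation assumed above):

```python
# Source: P(0 sent) = p; channel delivers the sent bit correctly with probability q.
p, q = 0.25, 0.9

# Probability that 0 is received: either 0 was sent and got through,
# or 1 was sent and was flipped by the channel.
r = p * q + (1 - p) * (1 - q)
print(r)  # 0.25*0.9 + 0.75*0.1 = 0.3, up to floating-point rounding
```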
- 3.
- We use Bayes' rule:
$$P(0 \text{ sent} \mid 0 \text{ received}) = \frac{pq}{pq + (1-p)(1-q)} = \frac{pq}{r}, \qquad P(1 \text{ sent} \mid 0 \text{ received}) = \frac{(1-p)(1-q)}{r}.$$
Writing $h(x) = -x \log_2 x - (1-x) \log_2 (1-x)$ for the binary entropy function, the entropy of a source with these probabilities is $H_0 = h(pq/r)$. Similarly, when 1 is received, $P(0 \text{ sent} \mid 1 \text{ received}) = p(1-q)/(1-r)$, and the corresponding entropy is $H_1 = h(p(1-q)/(1-r))$.
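The Bayes computation can be sketched numerically as follows (the values of `p` and `q` and the variable names are illustrative, in the notation assumed above):

```python
# Posterior of the sent symbol given that 0 was received, via Bayes' rule.
p, q = 0.25, 0.9
r = p * q + (1 - p) * (1 - q)        # P(0 received)

post_sent0 = p * q / r               # P(0 sent | 0 received)
post_sent1 = (1 - p) * (1 - q) / r   # P(1 sent | 0 received)
print(post_sent0, post_sent1)        # ~0.75 and ~0.25; the posteriors sum to 1
```

Note that receiving 0 raises the posterior of "0 was sent" well above the prior $p = 0.25$, as expected for a mostly reliable channel.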
- 4.
- Since the binary entropy function $h$ is concave, Jensen's inequality applied to the weights $r$ and $1-r$ gives
$$H' = r\, h\!\left(\frac{pq}{r}\right) + (1-r)\, h\!\left(\frac{p(1-q)}{1-r}\right) \le h\!\left(r \cdot \frac{pq}{r} + (1-r) \cdot \frac{p(1-q)}{1-r}\right) = h(pq + p(1-q)) = h(p) = H.$$
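The inequality $H' \le H$ can also be verified numerically over a grid of parameter values. A sketch, with all names illustrative and the notation as assumed above:

```python
import math

def h(x):
    """Binary entropy in bits."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def entropy_gap(p, q):
    """H - H', which the concavity argument shows is non-negative."""
    r = p * q + (1 - p) * (1 - q)          # P(0 received)
    H0 = h(p * q / r)                      # entropy given 0 received
    H1 = h(p * (1 - q) / (1 - r))          # entropy given 1 received
    H_prime = r * H0 + (1 - r) * H1
    return h(p) - H_prime

# Check H' <= H on an interior grid of (p, q) values
# (interior so that r and 1-r are both strictly positive).
ok = all(entropy_gap(i / 20, j / 20) >= -1e-12
         for i in range(1, 20) for j in range(1, 20))
print(ok)  # True
```

Intuitively, observing the channel output can only reduce (on average) the uncertainty about what was sent, which is exactly $H' \le H$.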
Reynald AFFELDT
2000-06-08