Review of related probability and statistics topics.
1st
Definition of random variable, definition of Alphabet, definition of
joint probability.
2nd
Conditional probabilities and Bayes' rule. Independence of two
random variables. Venn diagrams.
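The Bayes' rule topic admits a one-line worked example. A minimal Python sketch; the channel probabilities below are invented for illustration, not taken from the course material:

```python
# Hypothetical binary channel: P(X=1) = 0.4, P(Y=1|X=1) = 0.9, P(Y=1|X=0) = 0.2.
p_x1 = 0.4
p_y1_given_x1 = 0.9
p_y1_given_x0 = 0.2

# Total probability: P(Y=1) = P(Y=1|X=1)P(X=1) + P(Y=1|X=0)P(X=0)
p_y1 = p_y1_given_x1 * p_x1 + p_y1_given_x0 * (1 - p_x1)

# Bayes' rule: P(X=1|Y=1) = P(Y=1|X=1)P(X=1) / P(Y=1)
p_x1_given_y1 = p_y1_given_x1 * p_x1 / p_y1
```

With these numbers, observing Y = 1 raises the probability that X = 1 from 0.4 to 0.75.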
3rd
Model of an information transmission system. Common-sense
definition of information. Logarithmic measure of information.
Self-information.
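The logarithmic measure of information above can be sketched in a few lines of Python (the function name is my own):

```python
from math import log2

def self_information(p):
    """Self-information I(x) = -log2 p(x) of an event with probability p,
    measured in bits."""
    return -log2(p)
```

For example, a symbol emitted with probability 1/8 carries 3 bits of self-information, while a certain event (p = 1) carries none.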
4th
Definition of information for a noisy channel. A posteriori
probabilities.
Average mutual information for a noisy channel.
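Average mutual information can be computed directly from a joint probability table. A minimal sketch, assuming the joint pmf is given as a nested list `joint[x][y]` (my own representation):

```python
from math import log2

def mutual_information(joint):
    """I(X;Y) = sum over (x, y) of p(x,y) * log2( p(x,y) / (p(x) p(y)) ),
    in bits, for a joint pmf given as a nested list joint[x][y]."""
    px = [sum(row) for row in joint]                 # marginal of X
    py = [sum(col) for col in zip(*joint)]           # marginal of Y
    return sum(
        joint[x][y] * log2(joint[x][y] / (px[x] * py[y]))
        for x in range(len(joint))
        for y in range(len(joint[x]))
        if joint[x][y] > 0
    )
```

Independent variables give I(X;Y) = 0; a noiseless binary channel with equiprobable inputs gives 1 bit.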
5th
Shannon representation diagram of information source.
Parameters of discrete channel.
6th
Average information (entropy) of a discrete and continuous
source, maximum source entropy. Source efficiency.
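Entropy, maximum entropy, and source efficiency for a discrete source can be illustrated in a short Python sketch; the 4-symbol source below is hypothetical:

```python
from math import log2

def entropy(probs):
    """H(X) = -sum p log2 p of a discrete memoryless source, in bits/symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Hypothetical 4-symbol source; probabilities chosen for illustration.
probs = [0.5, 0.25, 0.125, 0.125]
H = entropy(probs)
H_max = log2(len(probs))   # maximum entropy: all symbols equiprobable
efficiency = H / H_max     # source efficiency
```

Here H = 1.75 bits/symbol against a maximum of 2 bits, so the source efficiency is 0.875.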
7th
Entropy for continuous uniform distribution source.
Entropy for continuous Gaussian distribution source.
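The two differential entropies listed for this week have closed forms: log2(b - a) for Uniform(a, b) and 0.5 * log2(2*pi*e*sigma^2) for a Gaussian (which maximizes differential entropy among densities of fixed variance). A minimal sketch:

```python
from math import e, log2, pi

def h_uniform(a, b):
    """Differential entropy of Uniform(a, b): log2(b - a) bits."""
    return log2(b - a)

def h_gaussian(variance):
    """Differential entropy of a Gaussian with the given variance:
    0.5 * log2(2 * pi * e * variance) bits."""
    return 0.5 * log2(2 * pi * e * variance)
```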
8th
Entropy for continuous Triangular distribution source.
Entropy for continuous Exponential distribution source.
9th
Transition probability matrix of a channel, discrete noiseless and
noisy channel models, uniform channel. Ternary symmetric
channel.
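The ternary symmetric channel's transition matrix can be written down directly; a short sketch (function name is my own):

```python
def ternary_symmetric(p):
    """3x3 transition matrix P(y|x) of a ternary symmetric channel:
    the transmitted symbol survives with probability 1 - p and is
    replaced by each of the two other symbols with probability p / 2."""
    return [[1 - p if i == j else p / 2 for j in range(3)] for i in range(3)]
```

Every row is a conditional distribution, so each row sums to 1 for any error probability p.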
10th
Information transmission over symmetric channel, noiseless
channel, binary symmetric channel, ternary symmetric channel.
11th
Memory and memoryless information channels. Binary erasure
channel (BEC).
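The BEC is a standard example: each input bit is either received correctly or erased. A minimal sketch of its transition matrix and its known capacity C = 1 - eps:

```python
def bec(eps):
    """Binary erasure channel: inputs {0, 1}, outputs (0, erasure, 1).
    Rows are the conditional distributions P(y|x)."""
    return [
        [1 - eps, eps, 0.0],  # sent 0: received 0 or erased
        [0.0, eps, 1 - eps],  # sent 1: received 1 or erased
    ]

def bec_capacity(eps):
    """Known closed form: C = 1 - eps bits per channel use."""
    return 1 - eps
```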
12th
Capacity of discrete channel, channel capacity for noiseless
channel. Channel efficiency and redundancy. Channel capacity
for symmetric channels.
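For a symmetric channel whose transition-matrix rows are permutations of one distribution, capacity reduces to C = log2(M) - H(row); the binary symmetric channel is the special case C = 1 - H(p). A minimal sketch:

```python
from math import log2

def symmetric_capacity(row):
    """Capacity of a symmetric discrete channel whose transition matrix
    rows are permutations of `row`: C = log2(M) - H(row), M outputs."""
    return log2(len(row)) + sum(p * log2(p) for p in row if p > 0)

def bsc_capacity(p):
    """Binary symmetric channel with crossover probability p: C = 1 - H(p)."""
    return symmetric_capacity([p, 1 - p])
```

A noiseless BSC (p = 0) attains 1 bit per use; at p = 0.5 the output is independent of the input and the capacity is zero.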
13th,14th
Channel capacity for nonsymmetric channels. Binary
nonsymmetric channel.
15th
Mutual information of continuous channel. Capacity of
continuous channels. Efficiency and redundancy of continuous
channel.
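The capacity of a band-limited continuous (AWGN) channel is given by the Shannon-Hartley formula C = B * log2(1 + S/N). A minimal sketch; the 3 kHz / 30 dB figures in the note below are illustrative, not from the course material:

```python
from math import log2

def awgn_capacity(bandwidth_hz, snr):
    """Shannon-Hartley capacity of a band-limited AWGN channel:
    C = B * log2(1 + S/N) bits per second (snr is the linear power ratio)."""
    return bandwidth_hz * log2(1 + snr)
```

For instance, a 3 kHz channel at 30 dB SNR (linear ratio 1000) supports roughly 29.9 kbit/s.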