Class Exercises : 1st March 2013. Telecommunications 6
Acknowledgements: Some problems are taken from Hwei Hsu, Analog and Digital Communications, Chapter 10, and modified for MATLAB experimentation.
You will need the files
BSC01.mdl
CasBSC.mdl
Find them on Moodle or some other location.
Question 1 ( Warm up exercise )
A communication system can transmit 10 different symbols. How many 4-symbol sequences can be formed (made) if
(a) Repetitions are allowed. (Ans: 10^4 = 10000)
(b) Repetitions are not allowed. (Ans: 5040)
(c) The last symbol must be the same as the second symbol and repetitions are otherwise not allowed. (Ans: 720)
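These counts can be checked by brute force in MATLAB; a minimal sketch (illustrative, not part of the original exercise) is:
% Brute-force count of 4-symbol sequences over a 10-symbol alphabet.
cntA = 0; cntB = 0; cntC = 0;
for a = 1:10
  for b = 1:10
    for c = 1:10
      for d = 1:10
        cntA = cntA + 1;                          % (a) any sequence counts
        if numel(unique([a b c d])) == 4          % (b) all four symbols distinct
          cntB = cntB + 1;
        end
        if d == b && numel(unique([a b c])) == 3  % (c) last = second, others distinct
          cntC = cntC + 1;
        end
      end
    end
  end
end
fprintf('(a) %d  (b) %d  (c) %d\n', cntA, cntB, cntC);  % expect 10000, 5040, 720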
Question 2.
1) Explain what the term discrete memoryless channel means.
2) Explain what the term entropy means and how it differs from the information content of a symbol.
3) A DMS has four symbols x1, x2, x3 and x4 with the following probabilities:
P(x1) = 0.4, P(x2) = 0.3, P(x3) = 0.2, P(x4) = 0.1
Show that the entropy is 1.846 bits. Verify this with a MATLAB program that you have met, or with any similar program (see the sketch after this question).
4) The following four-symbol sequence is transmitted:
S1 = x1 x2 x1 x3
Show that the probability of this sequence of 4 alphabet symbols is 0.009600 and that its information content is 6.702750 bits, and show your method. (Use MATLAB if you wish; remember that on a calculator log2(x) = 3.322 log10(x).)
5) The following four-symbol sequence is transmitted:
S2 = x4 x3 x3 x2
Show that the probability of this sequence of 4 alphabet symbols is 0.001200 and that its information content is 9.702750 bits.
6) What would you expect the average information of a four-symbol sequence to be? Compare this to the values given for S1 and S2 and comment on the difference.
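A minimal MATLAB sketch for checking the values in parts 3) to 6) might look like this (the variable names are illustrative):
% Entropy of the DMS and information content of the sequences S1 and S2.
p = [0.4 0.3 0.2 0.1];                 % P(x1)..P(x4)
H = -sum(p .* log2(p));                % entropy, approx 1.846 bits
I_S1 = -log2(prod(p([1 2 1 3])));      % S1 = x1 x2 x1 x3, approx 6.7027 bits
I_S2 = -log2(prod(p([4 3 3 2])));      % S2 = x4 x3 x3 x2, approx 9.7027 bits
avg4 = 4 * H;                          % average information of a 4-symbol sequence
fprintf('H = %.4f  I(S1) = %.4f  I(S2) = %.4f  4H = %.4f\n', H, I_S1, I_S2, avg4);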
Question 3.
Consider the channel shown below
Write out the channel matrix
Answer : Read the section of the notes called The Channel Matrix
P(Y|X) = [ P(y1|x1)  P(y2|x1) ] = [ 0.9  0.1 ]
         [ P(y1|x2)  P(y2|x2) ]   [ 0.2  0.8 ]
Write an equation that relates the probabilities at the output to those at the input.
P[Y]=P[X]×P[Y|X]
If the input probabilities are both 0.5, show that the output probabilities are
P(y1)=0.55, P(y2)=0.45
Write a short MATLAB script to check the calculations.
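One possible check (a minimal sketch) is:
PYgX = [0.9 0.1; 0.2 0.8];   % channel matrix P(Y|X)
PX   = [0.5 0.5];            % input probabilities as a row vector
PY   = PX * PYgX             % expect [0.55 0.45]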
Question 4.
Two binary symmetric channels are connected in cascade.
(1) Find the channel matrix of the resultant channel and fill in the values shown.
Use MATLAB to check the calculations.
Ans: from MATLAB you should get the following; now check it.
0.6200 0.3800
0.3800 0.6200
(2) If P(x1) = 0.6 and P(x2) = 0.4, find the values of the z outputs (a MATLAB sketch of this calculation appears at the end of this question).
Ans: P(z1) = 0.524, P(z2) = 0.476
(3) Now model the above system using the MATLAB model BSC01.mdl and check that it agrees with theory. The Bernoulli source has a p of the appropriate value and so does the BSC block. Run the simulations and check that the predicted error rates are in agreement with the cascaded matrix. Note the use of rate transition blocks. The configuration parameters are shown.
Question: If you replace the single BSC by two cascaded BSCs you will probably not get the same error rate as with a single BSC; try this and explain why.
Now replace the one BSC by two in cascade; the file is CasBSC.mdl (Cas stands for cascaded, i.e. connected together).
For the example shown above the error rate is around 11%, which seems different. Why does this cascaded system appear NOT to work, and how do you fix it?
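Referring back to parts (1) and (2), a minimal MATLAB sketch of the cascade calculation is given below. The individual crossover probabilities come from the diagram; the values e1 = 0.3 and e2 = 0.2 used here are assumptions chosen only because they reproduce the quoted answers, so substitute the values from your own diagram.
% Cascade of two binary symmetric channels (crossover values assumed for illustration).
e1 = 0.3; e2 = 0.2;                    % assumed crossover probabilities
P1 = [1-e1 e1; e1 1-e1];               % channel matrix of the first BSC
P2 = [1-e2 e2; e2 1-e2];               % channel matrix of the second BSC
Pcascade = P1 * P2                     % expect [0.62 0.38; 0.38 0.62]
PX = [0.6 0.4];                        % P(x1), P(x2)
PZ = PX * Pcascade                     % expect [0.524 0.476]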
Question 5.
Remember to learn the information theory proofs in the back of the notes. Show that the capacity of a channel with AWGN and an infinite bandwidth is given by
C∞ ≈ 1.44 S/N0
where N0/2 is the two-sided noise spectral density.
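A numerical check in MATLAB (not the proof itself; the values of S and N0 below are purely illustrative) shows the Shannon capacity B*log2(1 + S/(N0*B)) approaching this limit as the bandwidth B grows:
% Wide-band limit of the Shannon capacity C = B*log2(1 + S/(N0*B)).
S  = 1;                                % signal power (illustrative value)
N0 = 1e-3;                             % one-sided noise density (illustrative value)
B  = 10.^(3:8).';                      % increasing bandwidths
C  = B .* log2(1 + S ./ (N0 * B));     % capacity at each bandwidth
disp([B C]);                           % C climbs towards the limit
disp(1.44 * S / N0);                   % approximate limit; exactly S/(N0*log(2))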
Question 6.
Use the equation
0 ≤ H(X) ≤ log2(m),
where m is the size of the alphabet of X, together with an equation involving the mutual information I(X;Y), to prove that the channel capacity of a lossless channel (one for which H(X|Y) = 0) transmitting an alphabet of 128 symbols is 7 bits per symbol.
What name is usually given to H(X|Y)?
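As an outline of one route (not the full worked proof): for a lossless channel H(X|Y) = 0, so I(X;Y) = H(X) - H(X|Y) = H(X). The capacity is the maximum of I(X;Y) over the input distribution, i.e. max H(X) = log2(m) = log2(128) = 7 bits per symbol.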