Amount of Information, Average Information & Entropy - MCQs
Q1. The expected information contained in a message is called
a) Entropy
b) Efficiency
c) Coded signal
d) None of the above
ANSWER: a) Entropy
Q2. The information I contained in a message with probability of occurrence P is given by (k is a constant)
a) I = k log2(1/P)
b) I = k log2(P)
c) I = k log2(1/2P)
d) I = k log2(1/P^2)
ANSWER: a) I = k log2(1/P)
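A quick numeric sketch of the Q2 formula, assuming k = 1 and base-2 logarithms so information is measured in bits; the probability 0.25 is only an illustrative value.

```python
import math

def self_information(p: float) -> float:
    """Information content I = log2(1/p) in bits for an event of probability p."""
    return math.log2(1.0 / p)

# An event with probability 1/4 carries log2(4) = 2 bits of information.
print(self_information(0.25))  # 2.0
```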
Q3. The memoryless source refers to
a) No previous information
b) No message storage
c) Emitted message is independent of previous message
d) None of the above
ANSWER: c) Emitted message is independent of previous message
Q4. Entropy is
a) Average information per message
b) Information in a signal
c) Amplitude of signal
d) All of the above
ANSWER: a) Average information per message
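A short sketch of Q4's definition, treating entropy as the average information per message, H = sum of p_i * log2(1/p_i) bits; the example distributions are made up.

```python
import math

def entropy(probs):
    """Average information per message: H = sum of p_i * log2(1/p_i), in bits."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# A uniform four-symbol source carries 2 bits per message on average.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
# A skewed source carries less average information.
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75
```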
Q5. The relation between entropy and mutual information is
a) I(X;Y) = H(X) - H(X/Y)
b) I(X;Y) = H(X/Y) - H(Y/X)
c) I(X;Y) = H(X) - H(Y)
d) I(X;Y) = H(Y) - H(X)
ANSWER: a) I(X;Y) = H(X) - H(X/Y)
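A sketch checking Q5's relation I(X;Y) = H(X) - H(X/Y) numerically; the 2x2 joint distribution below is invented purely for illustration.

```python
import math

def H(probs):
    """Entropy in bits of a probability list."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) over two binary variables.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

px = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}
py = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (0, 1)}

# H(X/Y) = sum over y of p(y) * H(X given Y = y)
H_x_given_y = sum(
    py[y] * H([joint[(x, y)] / py[y] for x in (0, 1)]) for y in (0, 1)
)

I_xy = H(list(px.values())) - H_x_given_y
print(round(I_xy, 4))  # mutual information I(X;Y) in bits
```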
Q6. The mutual information
a) Is symmetric
b) Is always non-negative
c) Both a) and b) are correct
d) None of the above
ANSWER: c) Both a) and b) are correct
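A sketch illustrating Q6: computing mutual information directly as the sum of p(x,y) * log2(p(x,y) / (p(x) p(y))) over a made-up joint distribution gives a value that is non-negative and unchanged when X and Y are swapped.

```python
import math

def mutual_information(joint):
    """I(X;Y) = sum over (x, y) of p(x,y) * log2(p(x,y) / (p(x) * p(y))), in bits."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Hypothetical joint distribution p(x, y); values chosen only for illustration.
joint = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}
swapped = {(y, x): p for (x, y), p in joint.items()}  # joint distribution of (Y, X)

print(mutual_information(joint))    # I(X;Y), non-negative
print(mutual_information(swapped))  # I(Y;X) equals I(X;Y): mutual information is symmetric
```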