Relation between entropy & mutual information
Q. The relation between entropy and mutual information is:
a. I(X;Y) = H(X) - H(X/Y)
b. I(X;Y) = H(X/Y) - H(Y/X)
c. I(X;Y) = H(X) - H(Y)
d. I(X;Y) = H(Y) - H(X)
ANSWER: I(X;Y) = H(X) - H(X/Y)
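The identity says that mutual information is the reduction in uncertainty about X once Y is observed, i.e. H(X) minus the conditional entropy H(X/Y). As a quick check, the sketch below (not part of the original question; the 2x2 joint distribution is made up purely for illustration) computes I(X;Y) directly from its definition and compares it with H(X) - H(X/Y) for the same distribution.

```python
import numpy as np

# Hypothetical 2x2 joint distribution p(x, y), chosen only for illustration.
p_xy = np.array([[0.30, 0.20],
                 [0.10, 0.40]])

p_x = p_xy.sum(axis=1)   # marginal p(x)
p_y = p_xy.sum(axis=0)   # marginal p(y)

def entropy(p):
    """Shannon entropy in bits, skipping zero-probability terms."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# H(X) and the conditional entropy H(X/Y) = H(X,Y) - H(Y)
h_x = entropy(p_x)
h_x_given_y = entropy(p_xy.flatten()) - entropy(p_y)

# Mutual information computed directly from its definition:
# I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) )
i_xy = np.sum(p_xy * np.log2(p_xy / np.outer(p_x, p_y)))

print(f"H(X) - H(X/Y) = {h_x - h_x_given_y:.4f} bits")
print(f"I(X;Y)        = {i_xy:.4f} bits")   # the two values agree
```

For this example both quantities come out to about 0.12 bits, matching option a; the other options do not equal I(X;Y) in general.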