Conditional entropy

Creator: Seonglae Cho
Created: 2025 Mar 11 11:27
Edited: 2025 Mar 11 11:33
Quantifies the amount of information needed to describe the outcome of a random variable $Y$ given that the value of another random variable $X$ is known.

The entropy of $Y$ conditioned on $X$ is written as:

$$H(Y|X) = -\sum_{x \in \mathcal{X}} p(x) \sum_{y \in \mathcal{Y}} p(y|x) \log p(y|x)$$
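The definition above can be sketched as a small computation over a joint distribution; this is an illustrative helper (the function name and the dict-based representation of $p(x, y)$ are assumptions, not from the original note). Note that $\sum_x p(x) \sum_y p(y|x) \log p(y|x) = \sum_{x,y} p(x,y) \log p(y|x)$, which the code uses directly:

```python
import math

def conditional_entropy(joint, base=2):
    """H(Y|X) for a joint distribution given as {(x, y): p(x, y)}."""
    # Marginal p(x) = sum over y of p(x, y)
    p_x = {}
    for (x, _), p in joint.items():
        p_x[x] = p_x.get(x, 0.0) + p
    # H(Y|X) = -sum over (x, y) of p(x, y) * log p(y|x),
    # where p(y|x) = p(x, y) / p(x); zero-probability terms contribute nothing
    h = 0.0
    for (x, _), p in joint.items():
        if p > 0:
            h -= p * math.log(p / p_x[x], base)
    return h

# Y is a copy of X: knowing X leaves no uncertainty, so H(Y|X) = 0
print(conditional_entropy({(0, 0): 0.5, (1, 1): 0.5}))  # → 0.0

# X and Y independent fair bits: H(Y|X) = H(Y) = 1 bit
print(conditional_entropy({(x, y): 0.25 for x in (0, 1) for y in (0, 1)}))  # → 1.0
```

With `base=2` the result is in bits; passing `base=math.e` would give nats.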