Shannon Game

Creator: Seonglae Cho
Created: 2025 Mar 19 11:37
Edited: 2025 Mar 19 11:43
The information content of a message is a function of how predictable it is. The information content (number of bits) needed to encode an outcome $i$ with probability $p_i$ is $\log_2(1/p_i) = -\log_2 p_i$. So in Next Token Prediction, the predicted probability itself carries the information content of each token.
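
As a minimal sketch of this formula in plain Python (the function name `self_information` is illustrative, not from the note), the bits needed to encode an outcome follow directly from its probability:

```python
import math

def self_information(p: float) -> float:
    """Bits needed to encode an outcome of probability p: -log2(p)."""
    return -math.log2(p)

# A fair coin flip carries 1 bit; a 1-in-8 outcome carries 3 bits.
print(self_information(0.5))    # 1.0
print(self_information(1 / 8))  # 3.0
```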
The entropy of a message is the expected number of bits needed to encode it: $H(p) = -\sum_{i=1}^{n} p_i \log_2 p_i$ (Shannon entropy).
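
A corresponding sketch for the entropy of a full distribution (again in plain Python, with an illustrative `shannon_entropy` helper; zero-probability terms are skipped since they contribute nothing to the sum):

```python
import math

def shannon_entropy(probs: list[float]) -> float:
    """H(p) = -sum(p_i * log2(p_i)); zero-probability terms contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform over 4 outcomes: 2 bits, the maximum for 4 symbols.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
# A confident next-token distribution carries fewer bits on average.
print(shannon_entropy([0.9, 0.05, 0.05]))          # ~0.569
```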