Kolmogorov Complexity

Created
2023 Jul 5 17:19
Creator
Seonglae Cho
Edited
2026 Jan 15 19:43

Algorithmic complexity, algorithmic entropy

Also known as algorithmic complexity, Solomonoff–Kolmogorov–Chaitin complexity, program-size complexity, descriptive complexity, or algorithmic entropy. The Kolmogorov complexity of an object is the length of the shortest computer program that outputs it; equivalently, it measures the computational resources needed to describe the object.
This makes it a key indicator of the complexity of finite-length data sequences: it is the minimum length of a program that can reproduce the given data, much as
MDL
does.
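Kolmogorov complexity itself is uncomputable, but any lossless compressor gives a computable upper bound: the compressed size (plus the fixed size of the decompressor) is the length of one program that reproduces the data. A minimal sketch using Python's `zlib` as the stand-in compressor:

```python
import zlib

def complexity_upper_bound(data: bytes) -> int:
    # K(x) is uncomputable; the compressed length is the size of one
    # valid description of x, hence a computable upper bound (proxy).
    return len(zlib.compress(data, 9))

regular = complexity_upper_bound(b"ab" * 500)           # highly patterned input
varied = complexity_upper_bound(bytes(range(256)) * 4)  # far less regular input
# The patterned string admits a much shorter description than the varied one.
```

The gap between the two bounds illustrates the core idea: structure in the data translates into a shorter program that regenerates it.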

Interpretation

Kolmogorov complexity acts like a natural loss function for emergence: lower-complexity implementations appear first. Programs that would emerge later in time are therefore not selected in
Evolution
, simply because of their delayed appearance.
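The "lowest complexity emerges first" ordering can be sketched with a toy enumeration: candidate programs are tried in order of increasing length, so the shortest description of the target is the first one found. The "machine" here (a seed string repeated to the target length) is an assumed, purely illustrative model, not a real universal machine.

```python
from itertools import product

def run(program: str, length: int) -> str:
    # Toy machine (assumption for illustration): the program is a seed
    # that is repeated and truncated to the requested output length.
    return (program * length)[:length]

def shortest_program(target: str, alphabet: str = "ab") -> str:
    # Enumerate programs by increasing length: the lowest-complexity
    # description is reached (emerges) before any longer equivalent one.
    for n in range(1, len(target) + 1):
        for symbols in product(alphabet, repeat=n):
            program = "".join(symbols)
            if run(program, len(target)) == target:
                return program
    return target  # worst case: the string is its own shortest seed

shortest_program("ababab")  # → "ab"
```

Longer programs that would also reproduce the target (e.g. "abab") are never reached, mirroring the selection-by-earliness argument above.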

Epiplexity

Traditional information theory (Shannon entropy, Kolmogorov complexity) is defined under the assumption of unlimited computational resources, which creates a mismatch with the notion of information observed in practical machine learning: deterministic transformations do not increase information, and information is independent of data order.
Real-world models (LLMs, NNs), however, compute under limited resources, and in doing so they build 'meaningful structures' during learning. The information that matters in machine learning is not Shannon information but the structural information that is computationally accessible to the model. Epiplexity is proposed as a notion of information that can actually be learned under computational constraints.
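The order-independence claim is easy to check empirically: Shannon entropy depends only on symbol frequencies, so any deterministic reordering of a sequence leaves it unchanged, even though a learner with bounded compute may find the two orderings very differently structured. A minimal sketch:

```python
import math
from collections import Counter

def shannon_entropy(seq: str) -> float:
    # Empirical entropy in bits of the symbol distribution of seq.
    counts = Counter(seq)
    n = len(seq)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

data = "abcabc" * 10          # interleaved ordering
reordered = "".join(sorted(data))  # a deterministic reordering: "aaa...ccc"
# Same symbol counts, hence identical Shannon entropy (log2(3) bits here),
# despite the two orderings looking structurally very different.
```

This is exactly the mismatch the epiplexity proposal targets: classical measures assign both orderings the same information content, while a compute-bounded learner does not treat them alike.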