Page "Chaitin's constant" ¶ 40
from Wikipedia

Some Related Sentences

Kolmogorov and complexity
In algorithmic information theory (a subfield of computer science), the Kolmogorov complexity of an object, such as a piece of text, is a measure of the computational resources needed to specify the object.
Kolmogorov complexity is also known as "descriptive complexity" (not to be confused with descriptive complexity theory), Kolmogorov–Chaitin complexity, algorithmic entropy, or program-size complexity.
Thus, the Kolmogorov complexity of the raw file encoding this bitmap is much less than 1.62 million.
It can be shown that the Kolmogorov complexity of any string cannot be more than a few bytes larger than the length of the string itself.
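One way to see this bound (a minimal sketch, taking Python as the description language; the helper name literal_program is ours): any string can be described by a program that simply prints it verbatim, so the description exceeds the string's length only by the fixed size of the wrapper.

```python
def literal_program(s: str) -> str:
    """Return Python source whose output is exactly s (a trivial description of s)."""
    # repr keeps the overhead constant as long as s needs no escaping;
    # a length-prefixed encoding would make the bound fully general.
    return f"print({s!r}, end='')"

s = "abcdefgh" * 4                 # 32 characters
program = literal_program(s)
print(len(program), len(s))        # 49 32 -- a constant-size wrapper around s
```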
Strings whose Kolmogorov complexity is small relative to the string's size are not considered to be complex.
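As an everyday proxy (our choice of illustration, not part of the formal theory), an off-the-shelf compressor gives a computable upper bound on descriptive complexity: a highly regular string compresses far below its length, while typical random bytes essentially do not.

```python
import os
import zlib

# Compression gives a computable upper bound on descriptive complexity
# (only an upper bound: incompressibility by zlib does not prove high K).
regular = b"ab" * 500_000           # 1,000,000 bytes of obvious structure
random_ = os.urandom(1_000_000)     # 1,000,000 bytes, almost surely patternless

print(len(zlib.compress(regular, 9)))   # a few kilobytes: small relative to the input
print(len(zlib.compress(random_, 9)))   # about 1,000,000 bytes: no real savings
```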
The notion of the Kolmogorov complexity can be used to state and prove impossibility results akin to Gödel's incompleteness theorem and Turing's halting problem.
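The core of that impossibility argument can be sketched in a few lines (hypothetical code: the function kolmogorov_complexity below is assumed to exist only in order to reach a contradiction; it cannot actually be implemented).

```python
import itertools

def kolmogorov_complexity(s: str) -> int:
    """Hypothetical exact K(s), assumed computable only to reach a contradiction."""
    raise NotImplementedError

def first_complex_string(n: int) -> str:
    """Return the first binary string (shortest first, then lexicographic)
    whose Kolmogorov complexity exceeds n. Such a string always exists,
    because only finitely many descriptions have length at most n."""
    for length in itertools.count(0):
        for bits in itertools.product("01", repeat=length):
            s = "".join(bits)
            if kolmogorov_complexity(s) > n:
                return s

# If K were computable, then for a large n this search would output a string s
# with K(s) > n. But the search itself, together with the digits of n, is a
# description of s of length only O(log n) -- far less than n -- so K(s) < n,
# a contradiction. Hence no such kolmogorov_complexity function can exist.
```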
To define the Kolmogorov complexity, we must first specify a description language for strings.
If a description of s, d(s), is of minimal length (i.e., it uses the fewest characters), it is called a minimal description of s. Thus, the length of d(s) (i.e., the number of characters in the description) is the Kolmogorov complexity of s, written K(s).
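In the usual notation (a standard formalization; here U is a fixed universal description method and |d| is the length of a description d), this reads:

```latex
% Kolmogorov complexity of s relative to a fixed universal description method U
K_U(s) = \min\{\, |d| : U(d) = s \,\}

% Invariance theorem: for any other description method L there is a constant
% c_L, independent of s, such that
K_U(s) \le K_L(s) + c_L
```

The second line is the invariance theorem, which is why the subscript U is normally dropped and one simply writes K(s).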
Algorithmic information theory is the area of computer science that studies Kolmogorov complexity and other complexity measures on strings (or other data structures).
Kolmogorov used this theorem to define several functions of strings, including complexity, randomness, and information.
The general consensus in the scientific community, however, was to associate this type of complexity with Kolmogorov, who was concerned with the randomness of a sequence, while Algorithmic Probability became associated with Solomonoff, who focused on prediction using his invention of the universal a priori probability distribution.
There are several other variants of Kolmogorov complexity or algorithmic information.
An axiomatic approach to Kolmogorov complexity based on Blum axioms (Blum 1967) was introduced by Mark Burgin in the paper presented for publication by Andrey Kolmogorov (Burgin 1982).
Some consider that naming the concept "Kolmogorov complexity" is an example of the Matthew effect.
