Algorithmic information theory (AIT) is a branch of theoretical computer science that concerns itself with the relationship between computation and the information of computably generated objects (as opposed to stochastically generated ones), such as strings or any other data structure. In other words, it is shown within algorithmic information theory that computational incompressibility "mimics" (except for a constant that only depends on the chosen universal programming language) the relations or inequalities found in information theory.[1] According to Gregory Chaitin, it is "the result of putting Shannon's information theory and Turing's computability theory into a cocktail shaker and shaking vigorously."[2]
Besides formalizing a universal measure for the irreducible information content of computably generated objects, the main achievements of AIT were to show that: algorithmic complexity follows (in the self-delimited case) the same inequalities (except for a constant[3]) that entropy does in classical information theory;[1] randomness is incompressibility;[4] and, within the realm of randomly generated software, the probability that a given data structure occurs is of the order of 2 raised to the negative length of the shortest program that generates it when run on a universal machine.[5]
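Kolmogorov complexity itself is uncomputable, but the compressed length of a string under any real compressor is a computable upper bound on it, which makes the slogan "randomness is incompressibility" easy to demonstrate. The minimal sketch below uses Python's standard `zlib` module; the helper name `compressed_size` is ours, not part of AIT.

```python
import random
import zlib

def compressed_size(data: bytes) -> int:
    """Length of the zlib-compressed form: a computable upper-bound
    proxy for the (uncomputable) Kolmogorov complexity of `data`."""
    return len(zlib.compress(data, 9))

# A highly regular string compresses far below its length...
regular = b"ab" * 5000
# ...while random bytes are, with overwhelming probability, incompressible.
random.seed(0)
rand = bytes(random.getrandbits(8) for _ in range(10000))

print(compressed_size(regular))  # far smaller than the original 10000 bytes
print(compressed_size(rand))     # close to (or slightly above) 10000 bytes
```

The regular string shrinks to a few dozen bytes, while the seeded random bytes stay at roughly their original length plus format overhead, the expected behavior for a string with no exploitable structure.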
AIT principally studies measures of the irreducible information content of strings (or other data structures). Because most mathematical objects can be described in terms of strings, or as the limit of a sequence of strings, it can be used to study a wide variety of mathematical objects, including integers. One of the main motivations behind AIT is the study of the information carried by mathematical objects, as in the field of metamathematics, e.g., as shown by the incompleteness results mentioned below. Other motivations include surpassing the limitations of classical information theory for single, fixed objects; formalizing the concept of randomness; and finding meaningful probabilistic inference without prior knowledge of the probability distribution (e.g., whether it is independent and identically distributed, Markovian, or even stationary). AIT thus rests on three main mathematical concepts and the relations between them: algorithmic complexity, algorithmic randomness, and algorithmic probability.[6][4]
[1] Chaitin 1975.
[2] "Algorithmic Information Theory". Archived from the original on January 23, 2016. Retrieved May 3, 2010.
[3] Or, for the mutual algorithmic information, informing the algorithmic complexity of the input along with the input itself.
[4] Calude 2013.
[5] Downey, Rodney G.; Hirschfeldt, Denis R. (2010). Algorithmic Randomness and Complexity. Springer. ISBN 978-0-387-68441-3.
[6] Li & Vitányi 2013.