Algorithmic Information Theory

The experts below are selected from a list of 6,354 experts worldwide, ranked by the ideXlab platform.

Kohtaro Tadaki - One of the best experts on this subject based on the ideXlab platform.

  • Algorithmic Information Theory
    SpringerBriefs in Mathematical Physics, 2019
    Co-Authors: Kohtaro Tadaki
    Abstract:

    In this chapter, we review the basic framework of Algorithmic Information Theory to the extent necessary to read the rest of the book.

  • A Statistical Mechanical Interpretation of Algorithmic Information Theory
    2019
    Co-Authors: Kohtaro Tadaki
    Abstract:

    We develop a statistical mechanical interpretation of Algorithmic Information Theory by introducing thermodynamic quantities, such as free energy, energy, statistical mechanical entropy, and specific heat, into Algorithmic Information Theory. We investigate the properties of these quantities by means of program-size complexity, from the point of view of algorithmic randomness. It turns out that, in this interpretation, the temperature plays the role of the compression rate of the values of all these thermodynamic quantities, including the temperature itself. Reflecting this self-referential nature of the compression rate of the temperature, we obtain fixed point theorems on compression rate.

  • A Statistical Mechanical Interpretation of Algorithmic Information Theory III: Composite Systems and Fixed Points
    Mathematical Structures in Computer Science, 2012
    Co-Authors: Kohtaro Tadaki
    Abstract:

    The statistical mechanical interpretation of Algorithmic Information Theory (AIT for short) was introduced and developed in our previous papers Tadaki (2008; 2012), where we introduced into AIT the notion of thermodynamic quantities, such as the partition function Z(T), free energy F(T), energy E(T) and statistical mechanical entropy S(T). We then discovered that in the interpretation, the temperature T is equal to the partial randomness of the values of all these thermodynamic quantities, where the notion of partial randomness is a stronger representation of the compression rate by means of program-size complexity. Furthermore, we showed that this situation holds for the temperature itself as a thermodynamic quantity, namely, for each of the thermodynamic quantities above, the computability of its value at temperature T gives a sufficient condition for T ∈ (0, 1) to be a fixed point on partial randomness. In this paper, we develop the statistical mechanical interpretation of AIT further and pursue its formal correspondence to normal statistical mechanics. The thermodynamic quantities in AIT are defined on the basis of the halting set of an optimal prefix-free machine, which is a universal decoding algorithm used to define the notion of program-size complexity. We show that there are infinitely many optimal prefix-free machines that give completely different sufficient conditions for each of the thermodynamic quantities in AIT. We do this by introducing the notion of composition of prefix-free machines into AIT, which corresponds to the notion of the composition of systems in normal statistical mechanics.
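
    For orientation, here is a sketch of the definitions behind these abstracts. They are not reproduced in the abstracts themselves, but in Tadaki's papers they take essentially this form, with base-2 logarithms and the length |p| of a program p playing the role of energy; U is an optimal prefix-free machine with halting set dom U:

        Z(T) = \sum_{p \in \mathrm{dom}\, U} 2^{-|p|/T}
        F(T) = -T \log_2 Z(T)
        E(T) = \frac{1}{Z(T)} \sum_{p \in \mathrm{dom}\, U} |p| \, 2^{-|p|/T}
        S(T) = \frac{E(T) - F(T)}{T}

    At T = 1 the partition function Z(1) is Chaitin's halting probability Ω.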

  • Robustness of Statistical Mechanical Interpretation of Algorithmic Information Theory
    Information Theory Workshop, 2011
    Co-Authors: Kohtaro Tadaki
    Abstract:

    The statistical mechanical interpretation of Algorithmic Information Theory (AIT, for short) was introduced and developed in our former work [K. Tadaki, Local Proceedings of CiE 2008, pp. 425–434, 2008], where we introduced thermodynamic quantities into AIT. In this paper, we reveal a certain robustness of the statistical mechanical interpretation of AIT. The thermodynamic quantities in AIT are originally defined on the basis of the set of all programs, i.e., all halting inputs, for an optimal prefix-free machine, which is a universal decoding algorithm used to define the notion of program-size complexity. We show that the original properties of the thermodynamic quantities are recovered if we replace all programs by all minimal-size programs in their definitions. These results illustrate the generality and validity of the statistical mechanical interpretation of AIT.
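
    As a hedged sketch of the replacement described above, using the notation of the note after the previous abstract (H denotes program-size, i.e., prefix Kolmogorov, complexity; "minimal-size" means |p| = H(U(p)), so p is a shortest program for its output), the modified partition function restricts the sum to minimal-size programs:

        Z_{\min}(T) = \sum_{p \in \mathrm{dom}\, U,\ |p| = H(U(p))} 2^{-|p|/T}

    and the paper's claim is that this restricted quantity, and the thermodynamic quantities derived from it, retain the original properties.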

Alexandre Miranda Pinto - One of the best experts on this subject based on the ideXlab platform.

  • Algorithmic Information Theory for Obfuscation Security
    International Conference on Security and Cryptography, 2015
    Co-Authors: Rabih Mohsen, Alexandre Miranda Pinto
    Abstract:

    The main problem in designing effective code obfuscation is to guarantee security. State-of-the-art obfuscation techniques rely on an unproven concept of security and are therefore not regarded as provably secure. In this paper, we undertake a theoretical investigation of code obfuscation security based on Kolmogorov complexity and algorithmic mutual information. We introduce a new definition of code obfuscation that requires the algorithmic mutual information between a code and its obfuscated version to be minimal, allowing a controlled amount of information to be leaked to an adversary. We argue that our definition avoids the impossibility results of Barak et al. and is more advantageous than the indistinguishability definition of obfuscation, in the sense that it is more intuitive and is algorithmic rather than probabilistic.
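
    To make this concrete (the notation here is assumed, since the abstract fixes none): algorithmic mutual information is the Kolmogorov-complexity analogue of Shannon mutual information,

        I(x : y) = K(x) + K(y) - K(x, y),

    which, up to logarithmic additive terms, equals K(y) - K(y | x). The proposed security requirement is then, roughly, that I(P : O(P)), for a program P and its obfuscation O(P), stays below a chosen leakage budget, so the obfuscated code reveals only a controlled number of bits of algorithmic information about the original.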

Andrei Romashchenko - One of the best experts on this subject based on the ideXlab platform.

  • Communication Complexity of the Secret Key Agreement in Algorithmic Information Theory
    Mathematical Foundations of Computer Science, 2020
    Co-Authors: Emirhan Gürpınar, Andrei Romashchenko
    Abstract:

    It is known that the mutual information, in the sense of Kolmogorov complexity, of any pair of strings x and y is equal to the length of the longest shared secret key that two parties can establish via a probabilistic protocol with interaction on a public channel, assuming that the parties hold x and y respectively as their inputs. We determine the worst-case communication complexity of this problem for the setting where the parties can use private sources of random bits. We show that for some pairs x, y the communication complexity of the secret key agreement does not decrease even if the parties have to agree on a secret key whose size is much smaller than the mutual information between x and y. On the other hand, we discuss examples of x, y such that the communication complexity of the protocol declines gradually with the size of the resulting secret key. The proof of the main result uses spectral properties of appropriate graphs and the expander mixing lemma, as well as information-theoretic inequalities.
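
    The target quantity has a closed form: up to O(log n) additive terms for n-bit strings, the mutual information in the sense of Kolmogorov complexity is

        I(x : y) = K(x) + K(y) - K(x, y),

    and the characterization referenced above says that the longest key the two parties can agree on over a public channel has length I(x : y), again up to logarithmic precision.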

  • An Operational Characterization of Mutual Information in Algorithmic Information Theory
    Journal of the ACM, 2019
    Co-Authors: Andrei Romashchenko, Marius Zimand
    Abstract:

    We show that the mutual information, in the sense of Kolmogorov complexity, of any pair of strings x and y is equal, up to logarithmic precision, to the length of the longest shared secret key that two parties, one having x and the complexity profile of the pair and the other one having y and the complexity profile of the pair, can establish via a probabilistic protocol with interaction on a public channel. For ℓ > 2, the longest shared secret that can be established from a tuple of strings (x_1, …, x_ℓ) by ℓ parties, each one having one component of the tuple and the complexity profile of the tuple, is equal, up to logarithmic precision, to the complexity of the tuple minus the minimum communication necessary for distributing the tuple to all parties. We establish the communication complexity of secret key agreement protocols that produce a secret key of maximal length, for protocols with public randomness. We also show that if the communication complexity drops below the established threshold, then only very short secret keys can be obtained.
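
    Stated as a formula (the notation CO is introduced here for readability; the abstract states the result in words): for ℓ parties holding the components of a tuple (x_1, …, x_ℓ), the multi-party result reads, up to O(log) additive terms,

        \text{longest shared secret key} = K(x_1, \ldots, x_\ell) - \mathrm{CO}(x_1, \ldots, x_\ell),

    where CO(x_1, …, x_ℓ) is the minimum total communication needed for every party to learn the whole tuple.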

Alexander Shen - One of the best experts on this subject based on the ideXlab platform.

  • Algorithmic Information Theory
    2016
    Co-Authors: Alexander Shen
    Abstract:

    Algorithmic Information Theory uses the notion of algorithm to measure the amount of information in a finite object. The corresponding definition was suggested in the 1960s by Ray Solomonoff, Andrei Kolmogorov, Gregory Chaitin, and others: the amount of information in a finite object, or its complexity, was defined as the minimal length of a program that generates this object.
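
    In symbols (a standard rendering of the verbal definition above): fixing a universal machine U,

        C_U(x) = \min \{ |p| : U(p) = x \}.

    The invariance theorem makes the choice of U inessential: for every machine V there is a constant c_V such that C_U(x) \le C_V(x) + c_V for all x, so complexity is well defined up to an additive constant.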

  • K-trivial, K-low and MLR-low sequences: a tutorial
    Fields of Logic and Computation II, 2015
    Co-Authors: Laurent Bienvenu, Alexander Shen
    Abstract:

    A remarkable achievement in algorithmic randomness and Algorithmic Information Theory was the discovery of the notions of K-trivial, K-low and Martin-Löf-random-low sets: three different definitions turn out to be equivalent for very non-trivial reasons. This paper, based on the course taught by one of the authors (L.B.) at the Poncelet laboratory (CNRS, Moscow) in 2014, provides an exposition of the proof of this equivalence and some related results. We assume that the reader is familiar with the basic notions of Algorithmic Information Theory.
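
    For orientation, the three notions in question (standard definitions, not spelled out in the abstract; K is prefix complexity and A↾n is the length-n prefix of a sequence A):

        A is K-trivial:  K(A↾n) ≤ K(n) + O(1) for all n
        A is K-low (low for K):  K^A(x) ≥ K(x) - O(1) for all strings x
        A is MLR-low (low for Martin-Löf randomness):  every Martin-Löf random sequence remains random relative to the oracle A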

  • Around Kolmogorov Complexity: Basic Notions and Results
    arXiv: Information Theory, 2015
    Co-Authors: Alexander Shen
    Abstract:

    Algorithmic Information Theory studies description complexity and randomness and is now a well-known field of theoretical computer science and mathematical logic. There are several textbooks and monographs devoted to this theory (Calude, Information and Randomness: An Algorithmic Perspective, 2002; Downey and Hirschfeldt, Algorithmic Randomness and Complexity, 2010; Li and Vitányi, An Introduction to Kolmogorov Complexity and Its Applications, 2008; Nies, Computability and Randomness, 2009; Vereshchagin et al., Kolmogorov Complexity and Algorithmic Randomness, in Russian, 2013) where one can find a detailed exposition of many difficult results as well as historical references. However, a short survey of its basic notions and of the main results relating these notions to each other seems to be missing. This chapter attempts to fill this gap and covers the basic notions of Algorithmic Information Theory: Kolmogorov complexity (plain, conditional, prefix), Solomonoff's universal a priori probability, notions of randomness (Martin-Löf randomness, Mises–Church randomness), and effective Hausdorff dimension. We prove their basic properties (symmetry of information, the connection between a priori probability and prefix complexity, the randomness criterion in terms of complexity, the complexity characterization of effective dimension) and show some applications (the incompressibility method in computational complexity theory, incompleteness theorems). The chapter is based on the lecture notes of a course given by the author at Uppsala University (Shen, Algorithmic Information Theory and Kolmogorov Complexity, Technical Report, 2000).
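
    Two of the results listed above, in formula form (standard statements, added for orientation): the randomness criterion in terms of complexity (Levin–Schnorr) says that a sequence ω is Martin-Löf random if and only if

        \exists c \, \forall n : K(ω↾n) \ge n - c,

    and the complexity characterization of effective Hausdorff dimension reads

        \dim_H(ω) = \liminf_{n \to \infty} K(ω↾n) / n.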

  • Game arguments in computability theory and Algorithmic Information Theory
    arXiv: Logic, 2012
    Co-Authors: Andrej Muchnik, Alexander Shen, Mikhail Vyugin
    Abstract:

    We provide some examples showing how game-theoretic arguments can be used in computability theory and Algorithmic Information Theory: the unique numbering theorem (Friedberg), the gap between conditional complexity and total conditional complexity, the Epstein–Levin theorem, and some (as yet unpublished) results of Muchnik and Vyugin.
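
    For the second item (definitions assumed here, not given in the abstract): the conditional complexity C(x | y) is the length of a shortest program that maps y to x, while the total conditional complexity CT(x | y) takes the minimum only over programs that halt on every input, so C(x | y) ≤ CT(x | y) + O(1); the game argument shows that the gap between the two can be large.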

Mihai Datcu - One of the best experts on this subject based on the ideXlab platform.

  • Expanding the Algorithmic Information Theory Frame for Applications to Earth Observation
    Entropy, 2013
    Co-Authors: Daniele Cerra, Mihai Datcu
    Abstract:

    Recent years have witnessed an increased interest in compression-based methods and their applications to remote sensing, as these have a data-driven, parameter-free approach and can thus be successfully employed in several applications, especially in image information mining. This paper expands the Algorithmic Information Theory frame on which these methods are based. On the one hand, algorithms originally defined in the pattern matching domain are reformulated, allowing a better understanding of the compression-based tools available for remote sensing applications. On the other hand, the use of existing compression algorithms is proposed to store satellite images with added semantic value.
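
    The workhorse of this compression-based literature is the normalized compression distance of Cilibrasi and Vitányi, named here for concreteness (the abstract itself does not single out a formula). With C(·) denoting the length of a compressed representation of its argument,

        \mathrm{NCD}(x, y) = \frac{C(xy) - \min(C(x), C(y))}{\max(C(x), C(y))},

    a parameter-free dissimilarity: any off-the-shelf compressor stands in for the uncomputable Kolmogorov complexity. A runnable sketch appears after the next abstract.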

  • Algorithmic Information Theory-Based Analysis of Earth Observation Images: An Assessment
    IEEE Geoscience and Remote Sensing Letters, 2010
    Co-Authors: Daniele Cerra, A Mallet, Lionel Gueguen, Mihai Datcu
    Abstract:

    Earth observation image-understanding methodologies may be hindered by the assumed data models and estimated parameters on which they often heavily depend. First, the definition of the parameters may negatively affect the quality of the analysis: the parameters may fail to capture all relevant aspects of the data, and parameters that turn out to be superfluous or not accurately tuned may introduce noise into the data. Furthermore, the diversity of the data with regard to sensor type and spatial, spectral, and radiometric resolution, together with the variety and regularity of the observed scenes, makes it difficult to establish statistical models that are valid and robust enough to describe them. This letter proposes analysis based on Algorithmic Information Theory as a solution to overcome these limitations. We present different applications to satellite images, i.e., clustering, classification, artifact detection, and image time series mining, showing the generalization power of these parameter-free, data-driven methods based on computational complexity analysis.
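
    A minimal runnable sketch of such a parameter-free, data-driven method in Python (an assumption-laden toy: the letter does not prescribe zlib or these byte strings, and real experiments would compare image patches with a stronger compressor):

        import zlib

        def compressed_size(data: bytes) -> int:
            # Approximate the (uncomputable) Kolmogorov complexity of `data`
            # by its zlib-compressed size at maximum compression level.
            return len(zlib.compress(data, 9))

        def ncd(x: bytes, y: bytes) -> float:
            # Normalized compression distance:
            # NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)).
            # Near 0: x and y share structure; near 1: essentially unrelated.
            cx, cy = compressed_size(x), compressed_size(y)
            cxy = compressed_size(x + y)
            return (cxy - min(cx, cy)) / max(cx, cy)

        # Toy usage: repetitive "scenes" are close to each other and far
        # from data that shares no structure with them.
        scene_a = b"forest forest river road" * 50
        scene_b = b"forest forest river road" * 49 + b"field"
        unrelated = bytes(range(256)) * 5  # structured, but shares nothing with the scenes
        print(ncd(scene_a, scene_b))   # small value: shared structure
        print(ncd(scene_a, unrelated)) # close to 1: little shared structure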
