Algorithmic Information Theory - Explore the Science & Experts | ideXlab



Algorithmic Information Theory

The experts below are selected from a list of 6354 experts worldwide, ranked by the ideXlab platform


Kohtaro Tadaki – One of the best experts on this subject based on the ideXlab platform.

  • Algorithmic Information Theory
    SpringerBriefs in Mathematical Physics, 2019
    Co-Authors: Kohtaro Tadaki

    Abstract:

    In this chapter, we review the basic framework of Algorithmic Information Theory to the extent necessary to read the rest of the book.

  • A Statistical Mechanical Interpretation of Algorithmic Information Theory
    , 2019
    Co-Authors: Kohtaro Tadaki

    Abstract:

    We develop a statistical mechanical interpretation of Algorithmic Information Theory by introducing thermodynamic quantities, such as free energy, energy, statistical mechanical entropy, and specific heat, into Algorithmic Information Theory. We investigate the properties of these quantities by means of program-size complexity from the point of view of algorithmic randomness. We then discover that, in this interpretation, the temperature plays the role of the compression rate of the values of all these thermodynamic quantities, including the temperature itself. Reflecting this self-referential nature of the compression rate of the temperature, we obtain fixed point theorems on compression rate.
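
    For readers unfamiliar with Tadaki's construction, the quantities involved can be sketched as follows (the standard definitions from the statistical mechanical interpretation of AIT; technical conditions on convergence and machine optimality are elided here):

```latex
% Partition function at temperature T \in (0, 1], summed over the
% halting programs p of an optimal prefix-free machine U:
Z(T) = \sum_{p \in \operatorname{dom} U} 2^{-|p|/T}
% Free energy, energy, and statistical mechanical entropy:
F(T) = -T \log_2 Z(T), \qquad
E(T) = \frac{1}{Z(T)} \sum_{p \in \operatorname{dom} U} |p| \, 2^{-|p|/T}, \qquad
S(T) = \frac{E(T) - F(T)}{T}
```

    At T = 1 the partition function reduces to Chaitin's halting probability Ω.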

  • A statistical mechanical interpretation of Algorithmic Information Theory III: composite systems and fixed points
    Mathematical Structures in Computer Science, 2012
    Co-Authors: Kohtaro Tadaki

    Abstract:

    The statistical mechanical interpretation of Algorithmic Information Theory (AIT for short) was introduced and developed in our previous papers Tadaki (2008; 2012), where we introduced into AIT the notion of thermodynamic quantities, such as the partition function Z(T), free energy F(T), energy E(T) and statistical mechanical entropy S(T). We then discovered that in the interpretation, the temperature T is equal to the partial randomness of the values of all these thermodynamic quantities, where the notion of partial randomness is a stronger representation of the compression rate by means of program-size complexity. Furthermore, we showed that this situation holds for the temperature itself as a thermodynamic quantity, namely, for each of the thermodynamic quantities above, the computability of its value at temperature T gives a sufficient condition for T ∈ (0, 1) to be a fixed point on partial randomness. In this paper, we develop the statistical mechanical interpretation of AIT further and pursue its formal correspondence to normal statistical mechanics. The thermodynamic quantities in AIT are defined on the basis of the halting set of an optimal prefix-free machine, which is a universal decoding algorithm used to define the notion of program-size complexity. We show that there are infinitely many optimal prefix-free machines that give completely different sufficient conditions for each of the thermodynamic quantities in AIT. We do this by introducing the notion of composition of prefix-free machines into AIT, which corresponds to the notion of the composition of systems in normal statistical mechanics.
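
    The halting set of an optimal prefix-free machine is not computable, so Z(T) cannot actually be evaluated; still, a toy computation over a small hypothetical prefix-free set of program lengths illustrates how the partition sum behaves as the temperature varies:

```python
import math

# Hypothetical lengths |p| of a finite prefix-free set of programs;
# the Kraft sum 2^-1 + 2^-2 + 2^-3 + 2^-3 equals 1, as for a complete code.
PROGRAM_LENGTHS = [1, 2, 3, 3]

def partition_function(lengths, T):
    """Z(T) = sum over programs p of 2^(-|p|/T)."""
    return sum(2.0 ** (-l / T) for l in lengths)

def free_energy(lengths, T):
    """F(T) = -T * log2 Z(T)."""
    return -T * math.log2(partition_function(lengths, T))

# Lower temperature penalizes long programs more heavily and shrinks Z(T).
for T in (0.25, 0.5, 1.0):
    print(f"T={T}: Z={partition_function(PROGRAM_LENGTHS, T):.5f}, "
          f"F={free_energy(PROGRAM_LENGTHS, T):.5f}")
```

    In the actual theory the sum runs over the infinite halting set of an optimal prefix-free machine, which is what makes the values of these quantities algorithmically random rather than merely numbers.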

Alexandre Miranda Pinto – One of the best experts on this subject based on the ideXlab platform.

  • Algorithmic Information Theory for obfuscation security
    International Conference on Security and Cryptography, 2015
    Co-Authors: Rabih Mohsen, Alexandre Miranda Pinto

    Abstract:

    The main problem in designing effective code obfuscation is to guarantee security. State-of-the-art obfuscation techniques rely on an unproven concept of security and therefore are not regarded as provably secure. In this paper, we undertake a theoretical investigation of code obfuscation security based on Kolmogorov complexity and algorithmic mutual information. We introduce a new definition of code obfuscation that requires the algorithmic mutual information between a code and its obfuscated version to be minimal, allowing a controlled amount of information to be leaked to an adversary. We argue that our definition avoids the impossibility results of Barak et al. and is more advantageous than the indistinguishability obfuscation definition in the sense that it is more intuitive and is algorithmic rather than probabilistic.
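
    Kolmogorov complexity and algorithmic mutual information are uncomputable, but a common trick replaces them with compressed lengths from a real compressor. The sketch below (an illustration of the general idea, not the authors' construction; zlib is a crude stand-in for K) estimates I(x : y) ≈ C(x) + C(y) − C(x, y) to compare a leaky "obfuscation" against unrelated data:

```python
import zlib

def C(data: bytes) -> int:
    # Compressed length as a computable upper-bound proxy for
    # Kolmogorov complexity; real K(x) is uncomputable.
    return len(zlib.compress(data, 9))

def mutual_information(x: bytes, y: bytes) -> int:
    # Symmetric estimate of algorithmic mutual information:
    # I(x : y) ~ C(x) + C(y) - C(x, y).
    return C(x) + C(y) - C(x + y)

# Hypothetical "source code" and a trivially leaky transform (identifier
# renaming), versus bytes that share no structure with the source.
code = b"".join(b"def f%d(x):\n    return x * %d\n" % (i, i) for i in range(100))
renamed = code.replace(b"x", b"y")
unrelated = bytes((i * 37) % 256 for i in range(len(code)))

# A good obfuscator should drive the first value down toward the second.
print(mutual_information(code, renamed))
print(mutual_information(code, unrelated))
```

    Under the paper's definition, security asks that the obfuscated program carry only a permitted amount of information about the original; in this toy estimate, the renamed copy shows a large overlap with the source while the unrelated bytes show essentially none.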

  • SECRYPT – Algorithmic Information Theory for obfuscation security
    Proceedings of the 12th International Conference on Security and Cryptography, 2015
    Co-Authors: Rabih Mohsen, Alexandre Miranda Pinto

    Abstract: identical to the preceding entry.

Andrei Romashchenko – One of the best experts on this subject based on the ideXlab platform.

  • Communication Complexity of the Secret Key Agreement in Algorithmic Information Theory
    Mathematical Foundations of Computer Science, 2020
    Co-Authors: Emirhan Gürpınar, Andrei Romashchenko

    Abstract:

    It is known that the mutual information, in the sense of Kolmogorov complexity, of any pair of strings x and y is equal to the length of the longest shared secret key that two parties can establish via a probabilistic protocol with interaction on a public channel, assuming that the parties hold x and y respectively as their inputs. We determine the worst-case communication complexity of this problem for the setting where the parties can use private sources of random bits. We show that for some pairs x, y the communication complexity of the secret key agreement does not decrease even if the parties have to agree on a secret key whose size is much smaller than the mutual information between x and y. On the other hand, we discuss examples of x, y such that the communication complexity of the protocol declines gradually with the size of the resulting secret key. The proof of the main result uses spectral properties of appropriate graphs and the expander mixing lemma, as well as information-theoretic inequalities.
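
    For orientation, the characterization alluded to in the first sentence, and the mixing lemma used in the proof, can be stated as follows (standard formulations; additive logarithmic terms are suppressed):

```latex
% Algorithmic mutual information of strings x and y:
I(x : y) = C(x) + C(y) - C(x, y)
% Known characterization: the longest secret key that two parties
% holding x and y can agree on over a public channel has length
% I(x : y), up to O(\log n) additive precision for strings of
% length at most n.
% Expander mixing lemma: for a d-regular graph on n vertices whose
% second-largest eigenvalue in absolute value is \lambda, and for
% any vertex sets S and T,
\left|\, e(S, T) - \frac{d}{n}\,|S|\,|T| \,\right| \;\le\; \lambda \sqrt{|S|\,|T|}
```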

  • MFCS – Communication Complexity of the Secret Key Agreement in Algorithmic Information Theory
    , 2020
    Co-Authors: Emirhan Gürpınar, Andrei Romashchenko

    Abstract: identical to the preceding entry.

  • Communication Complexity of the Secret Key Agreement in Algorithmic Information Theory.
    arXiv: Information Theory, 2020
    Co-Authors: Emirhan Gürpınar, Andrei Romashchenko

    Abstract: identical, up to minor wording, to the two preceding entries.
