Information Theory

The experts below are selected from a list of 294 experts worldwide, ranked by the ideXlab platform.

Alexander Shen - One of the best experts on this subject based on the ideXlab platform.

  • Algorithmic Information Theory
    2016
    Co-Authors: Alexander Shen
    Abstract:

    Algorithmic Information Theory uses the notion of algorithm to measure the amount of Information in a finite object. The corresponding definition was suggested in the 1960s by Ray Solomonoff, Andrei Kolmogorov, Gregory Chaitin, and others: the amount of Information in a finite object, or its complexity, was defined as the minimal length of a program that generates this object.
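
    As a sketch in standard notation (not spelled out in the abstract): the complexity of a finite string x relative to a universal machine U is the length of its shortest generating program,

        C_U(x) = \min \{\, \ell(p) : U(p) = x \,\}.

    By the invariance theorem, switching to another universal machine changes this value by at most an additive constant, so the measure is well defined up to O(1).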

  • Algorithmic Information Theory and martingales
    arXiv: History and Overview, 2009
    Co-Authors: Laurent Bienvenu, Alexander Shen
    Abstract:

    The notion of an individual random sequence goes back to von Mises. We describe the evolution of this notion, especially the use of martingales (suggested by Ville), and the development of algorithmic Information Theory in the 1960s and 1970s (Solomonoff, Kolmogorov, Martin-Löf, Levin, Chaitin, Schnorr and others). We conclude with some remarks about the use of algorithmic Information Theory in the foundations of probability Theory.
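
    For context, a standard formulation not given in the abstract: a martingale is a betting strategy d on binary strings satisfying the fairness condition

        d(w) = \frac{d(w0) + d(w1)}{2},

    and d succeeds on an infinite sequence ω if its capital along the prefixes of ω is unbounded. Schnorr's theorem ties the two approaches together: a sequence is Martin-Löf random if and only if no lower semicomputable supermartingale succeeds on it.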

  • Multisource algorithmic Information Theory
    Lecture Notes in Computer Science, 2006
    Co-Authors: Alexander Shen
    Abstract:

    Multisource Information Theory in the Shannon setting is well known. In this article we try to develop its algorithmic Information Theory counterpart and use it as the general framework for many interesting questions about Kolmogorov complexity.

  • Multisource algorithmic Information Theory
    Dagstuhl Seminar Proceedings, 2006
    Co-Authors: Alexander Shen
    Abstract:

    Multisource Information Theory is well known in the Shannon setting. It studies the possibilities of Information transfer through a network with limited capacities. Similar questions can be studied for algorithmic Information Theory and provide a framework for several known results and interesting questions.
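
    One single-source identity that such a framework builds on (recalled here for context, not quoted from the abstract) is the Kolmogorov-Levin symmetry of Information,

        K(x, y) = K(x) + K(y \mid x) + O(\log K(x, y)),

    and the multisource questions then ask, roughly, whether quantities like K(x \mid y) can be routed through network links whose capacities are limited.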

George J. Klir - One of the best experts on this subject based on the ideXlab platform.

  • Generalized Information Theory
    Kybernetes, 1996
    Co-Authors: George J. Klir, David Harmanec
    Abstract:

    Provides an overview of major developments pertaining to generalized Information Theory during the lifetime of Kybernetes. Generalized Information Theory is viewed as a collection of concepts, theorems, principles, and methods for dealing with problems involving uncertainty‐based Information that are beyond the narrow scope of classical Information Theory. Introduces well‐justified measures of uncertainty in fuzzy set Theory, possibility Theory, and Dempster‐Shafer Theory. Shows how these measures are connected with the classical Hartley measure and Shannon entropy. Discusses basic issues regarding some principles of generalized uncertainty‐based Information.
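
    For reference, the two classical quantities the paper connects to are the Hartley measure of a finite nonempty set of alternatives A and the Shannon entropy of a probability distribution p:

        H(A) = \log_2 |A|, \qquad S(p) = -\sum_x p(x) \log_2 p(x).

    Roughly speaking, the generalized uncertainty measures surveyed here are designed to reduce to these classical measures in the corresponding special cases.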

  • Generalized Information Theory
    Fuzzy Sets and Systems, 1991
    Co-Authors: George J. Klir
    Abstract:

    Generalized Information Theory is viewed in this paper as an Information Theory that is liberated from the boundaries of probability Theory. After overviewing classical (probabilistic) Information Theory, the paper examines recent developments regarding nonprobabilistic measures and principles of uncertainty-based Information, which form a nucleus of the emerging generalized Information Theory.

Frédéric E. Theunissen - One of the best experts on this subject based on the ideXlab platform.

  • Information Theory and neural coding
    Nature Neuroscience, 1999
    Co-Authors: Alexander Borst, Frédéric E. Theunissen
    Abstract:

    Information Theory quantifies how much Information a neural response carries about the stimulus. This can be compared to the Information transferred in particular models of the stimulus–response function and to maximum possible Information transfer. Such comparisons are crucial because they validate assumptions present in any neurophysiological analysis. Here we review Information-Theory basics before demonstrating its use in neural coding. We show how to use Information Theory to validate simple stimulus–response models of neural coding of dynamic stimuli. Because these models require specification of spike timing precision, they can reveal which time scales contain Information in neural coding. This approach shows that dynamic stimuli can be encoded efficiently by single neurons and that each spike contributes to Information transmission. We argue, however, that the data obtained so far do not suggest a temporal code, in which the placement of spikes relative to each other yields additional Information.
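
    A minimal sketch of the central quantity (an illustration, not code from the paper): the mutual Information between a discretized stimulus and a spike-count response, estimated with a naive plug-in (histogram) estimator. All names are illustrative.

    import numpy as np

    def mutual_information_bits(stim_labels, spike_counts):
        """Plug-in estimate of the mutual information I(S;R) in bits."""
        stim_labels = np.asarray(stim_labels)
        spike_counts = np.asarray(spike_counts)
        mi = 0.0
        for s in np.unique(stim_labels):
            p_s = np.mean(stim_labels == s)            # stimulus probability
            for r in np.unique(spike_counts):
                p_r = np.mean(spike_counts == r)       # response probability
                p_sr = np.mean((stim_labels == s) & (spike_counts == r))
                if p_sr > 0.0:
                    mi += p_sr * np.log2(p_sr / (p_s * p_r))
        return mi

    # Two stimuli whose spike counts never overlap carry exactly 1 bit.
    stims = np.array([0, 0, 0, 0, 1, 1, 1, 1])
    counts = np.array([1, 1, 2, 1, 4, 5, 4, 4])
    print(mutual_information_bits(stims, counts))  # 1.0

    Such plug-in estimates are biased upward for small samples, and the result depends on how finely spike timing is discretized, which is the time-scale issue raised in the abstract.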

Ivan B. Djordjevic - One of the best experts on this subject based on the ideXlab platform.

  • Quantum Biological Information Theory
    2016
    Co-Authors: Ivan B. Djordjevic
    Abstract:

    This book is a self-contained, tutorial-based introduction to quantum Information Theory and quantum biology. It serves as a single-source reference to the topic for researchers in bioengineering, communications engineering, electrical engineering, applied mathematics, biology, computer science, and physics. The book provides all the essential principles of quantum biological Information Theory required to describe the quantum Information transfer from DNA to proteins, the sources of genetic noise and genetic errors, as well as their effects. It integrates quantum Information and quantum biology concepts; assumes only knowledge of basic vector algebra at the undergraduate level; provides a thorough introduction to the basic concepts of quantum Information processing, quantum Information Theory, and quantum biology; and includes in-depth discussion of quantum biological channel modelling, quantum biological channel capacity calculation, quantum models of aging, quantum models of evolution, quantum models of tumor and cancer development, quantum modeling of the bird navigation compass, quantum aspects of photosynthesis, and quantum biological error correction.

  • Quantum Biological Information Theory
    2015
    Co-Authors: Ivan B. Djordjevic
    Abstract:

    Contents: Introduction; Quantum Information Theory Fundamentals; Fundamentals of Biological Thermodynamics, Biomolecules, Cellular Genetics, and Bioenergetics; Quantum Information Theory and Quantum Mechanics Based Biological Modeling and Biological Channel Capacity Calculation; Quantum Mechanical Modeling of Mutations, Aging, Evolution, Tumor, and Cancer Development; Classical and Quantum Error Correction Coding in Genetics.

David Ellerman - One of the best experts on this subject based on the ideXlab platform.

  • Logical Information Theory: new logical foundations for Information Theory
    Logic Journal of the IGPL, 2017
    Co-Authors: David Ellerman
    Abstract:

    There is a new Theory of Information based on logic. The definition of Shannon entropy, as well as the notions of joint, conditional, and mutual entropy as defined by Shannon, can all be derived by a uniform transformation from the corresponding formulas of logical Information Theory. Information is first defined in terms of sets of distinctions without using any probability measure. When a probability measure is introduced, the logical entropies are simply the values of the (product) probability measure on the sets of distinctions. The compound notions of joint, conditional, and mutual entropies are obtained as the values of the measure, respectively, on the union, difference, and intersection of the sets of distinctions. These compound notions of logical entropy satisfy the usual Venn diagram relationships (e.g., inclusion-exclusion formulas) since they are values of a measure (in the sense of measure Theory). The uniform transformation into the formulas for Shannon entropy is linear, so it explains the long-noted fact that the Shannon formulas satisfy the Venn diagram relations--as an analogy or mnemonic--since Shannon entropy is not a measure (in the sense of measure Theory) on a given set. What is the logic that gives rise to logical Information Theory? Partitions are dual (in a category-theoretic sense) to subsets, and the logic of partitions was recently developed in a dual/parallel relationship to the Boolean logic of subsets (the latter usually being mis-specified as the special case of "propositional logic"). Boole developed logical probability Theory as the normalized counting measure on subsets. Similarly, the normalized counting measure on partitions is logical entropy--when the partitions are represented as the set of distinctions that is the complement to the equivalence relation for the partition. In this manner, logical Information Theory provides the set-theoretic and measure-theoretic foundations for Information Theory. The Shannon Theory is then derived by the transformation that replaces the counting of distinctions with the counting of the number of binary partitions (bits) it takes, on average, to make the same distinctions by uniquely encoding the distinct elements--which is why the Shannon Theory perfectly dovetails into coding and communications Theory.
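
    In symbols (a compact restatement of the definitions above): for a partition π of a finite set U whose blocks have probabilities p_1, ..., p_n, the logical entropy is the normalized count of distinctions,

        h(\pi) = \frac{|\mathrm{dit}(\pi)|}{|U|^2} = 1 - \sum_i p_i^2 = \sum_i p_i (1 - p_i),

    i.e., the probability that two independent draws from U land in distinct blocks. The uniform transformation mentioned above replaces each block term 1 - p_i by \log_2(1/p_i), turning this sum into the Shannon entropy H(p) = \sum_i p_i \log_2(1/p_i).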

  • New Logical Foundations for Quantum Information Theory: Introduction to Quantum Logical Information Theory
    arXiv: Quantum Physics, 2017
    Co-Authors: David Ellerman
    Abstract:

    Logical Information Theory is the quantitative version of the logic of partitions just as logical probability Theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of Information is about distinctions, differences, and distinguishability, and is formalized using the distinctions (`dits') of a partition (a pair of points distinguished by the partition). All the definitions of simple, joint, conditional, and mutual entropy of Shannon Information Theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this paper is to give the direct generalization to quantum logical Information Theory that similarly focuses on the pairs of eigenstates distinguished by an observable, i.e., qudits of an observable. The fundamental theorem for quantum logical entropy and measurement establishes a direct quantitative connection between the increase in quantum logical entropy due to a projective measurement and the eigenstates (cohered together in the pure superposition state being measured) that are distinguished by the measurement (decohered in the post-measurement mixed state). Both the classical and quantum versions of logical entropy have simple interpretations as "two-draw" probabilities for distinctions. The conclusion is that quantum logical entropy is the simple and natural notion of Information for quantum Information Theory focusing on the distinguishing of quantum states.
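
    For reference, the standard form of the quantity described above: the quantum logical entropy of a density matrix ρ is

        h(\rho) = 1 - \mathrm{tr}(\rho^2) = \mathrm{tr}[\rho\,(I - \rho)],

    which, in line with the two-draw reading above, is the probability that two independent measurements of ρ in its eigenbasis yield distinct eigenstates. It vanishes on pure states, and a projective measurement cannot decrease it, which is the direction quantified by the fundamental theorem mentioned in the abstract.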