Kraft Inequality

The experts below are selected from a list of 204 experts worldwide, ranked by the ideXlab platform.

Ertem Tuncel - One of the best experts on this subject based on the ideXlab platform.

  • Kraft Inequality and Zero-Error Source Coding with Decoder Side Information
    IEEE Transactions on Information Theory, 2007
    Co-Authors: Ertem Tuncel
    Abstract:

    For certain source coding problems, it is well known that the Kraft Inequality provides a simple sufficient condition for the existence of codes with given codeword lengths. Motivated by this fact, a sufficient condition based on the Kraft Inequality can also be sought for the problem of zero-error instantaneous coding with decoder side information. More specifically, it can be envisioned that a sufficient condition for the existence of such codes with codeword lengths {l_x} is that for some 0 … (The classical Kraft condition this builds on is sketched in code after these entries.)

  • Kraft Inequality and Zero-Error Source Coding with Decoder Side Information
    2007 IEEE International Symposium on Information Theory, 2007
    Co-Authors: Ertem Tuncel
    Abstract:

    This paper tackles the problem of zero-error instantaneous coding with decoder side information in light of the Kraft Inequality. Specifically, a bounded Kraft sum over all cliques in the characteristic graph of the source/side-information pair is envisioned to be a sufficient condition for the existence of a valid code with given codeword lengths. It is shown that (i) if such a sufficient condition exists for a class of graphs, it is possible to universally bound the rate redundancy in the class, (ii) there exist graph classes of interest for which such sufficient conditions can indeed be found, and finally (iii) no such condition can be found for the class of all graphs.
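
The abstracts above build on the classical binary Kraft Inequality: codeword lengths {l_x} can be realized by a prefix (instantaneous) code if and only if the Kraft sum over all symbols is at most one. Below is a minimal sketch of the constructive ("sufficiency") direction only; it is not taken from the papers, and the function names are ours.

from fractions import Fraction

def kraft_sum(lengths):
    """Exact binary Kraft sum of the given codeword lengths."""
    return sum(Fraction(1, 2 ** l) for l in lengths)

def build_prefix_code(lengths):
    """Construct a binary prefix code with the given codeword lengths.

    Standard construction behind the sufficiency of the Kraft Inequality:
    process lengths in increasing order and give each codeword the l-bit
    binary expansion of the running Kraft sum of the codewords assigned so far.
    """
    if kraft_sum(lengths) > 1:
        raise ValueError("Kraft Inequality violated: no prefix code exists")
    order = sorted(range(len(lengths)), key=lambda i: lengths[i])
    code = [None] * len(lengths)
    acc = Fraction(0)              # running Kraft sum of assigned codewords
    for i in order:
        l = lengths[i]
        value = int(acc * 2 ** l)  # integer, since acc has denominator <= 2**l
        code[i] = format(value, "0{}b".format(l))
        acc += Fraction(1, 2 ** l)
    return code

def is_prefix_free(code):
    """Sanity check: no codeword is a prefix of another."""
    return not any(i != j and code[j].startswith(code[i])
                   for i in range(len(code)) for j in range(len(code)))

# Example: lengths (1, 2, 3, 3) have Kraft sum exactly 1 and yield 0, 10, 110, 111.
codewords = build_prefix_code([1, 2, 3, 3])
print(codewords, is_prefix_free(codewords))

The zero-error setting with decoder side information studied above replaces this single Kraft sum by Kraft sums over cliques of the characteristic graph of the source/side-information pair; the abstracts report that such a sufficient condition can be found for some graph classes of interest, but not for the class of all graphs.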

Junshan Zhang - One of the best experts on this subject based on the ideXlab platform.

  • Arbitrary source models and Bayesian codebooks in rate-distortion theory
    IEEE Transactions on Information Theory, 2002
    Co-Authors: Ioannis Kontoyiannis, Junshan Zhang
    Abstract:

    We characterize the best achievable performance of lossy compression algorithms operating on arbitrary random sources, and with respect to general distortion measures. Direct and converse coding theorems are given for variable-rate codes operating at a fixed distortion level, emphasizing: (a) nonasymptotic results, (b) optimal or near-optimal redundancy bounds, and (c) results with probability one. This development is based in part on the observation that there is a precise correspondence between compression algorithms and probability measures on the reproduction alphabet. This is analogous to the Kraft Inequality in lossless data compression. In the case of stationary ergodic sources our results reduce to the classical coding theorems. As an application of these general results, we examine the performance of codes based on mixture codebooks for discrete memoryless sources. A mixture codebook (or Bayesian codebook) is a random codebook generated from a mixture over some class of reproduction distributions. We demonstrate the existence of universal mixture codebooks, and show that it is possible to universally encode memoryless sources with redundancy of approximately (d/2) log n bits, where d is the dimension of the simplex of probability distributions on the reproduction alphabet.
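
The "precise correspondence between compression algorithms and probability measures" invoked above is, in the lossless setting the abstract calls analogous, the standard two-way use of the Kraft Inequality. A sketch in our notation:

\[
\text{prefix code with lengths } \ell(x) \;\longmapsto\; Q(x) := 2^{-\ell(x)}, \qquad \sum_x Q(x) \le 1,
\]
\[
\text{probability measure } Q \;\longmapsto\; \ell(x) := \big\lceil -\log_2 Q(x) \big\rceil, \qquad \sum_x 2^{-\ell(x)} \le \sum_x Q(x) = 1.
\]

Every prefix code thus induces a sub-probability measure, and every probability measure induces admissible codeword lengths within one bit of -log_2 Q(x); the paper develops the analogous correspondence for lossy codes and probability measures on the reproduction alphabet.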

Amos Lapidoth - One of the best experts on this subject based on the ideXlab platform.

  • Codes for Tasks and Rényi Entropy Rate
    2014 IEEE International Symposium on Information Theory, 2014
    Co-Authors: Christoph Bunte, Amos Lapidoth
    Abstract:

    A task is randomly drawn from a finite set of tasks and is described using a fixed number of bits. All the tasks that share its description must be performed. Upper and lower bounds on the minimum ρ-th moment of the number of performed tasks are derived. The key is an analog of the Kraft Inequality for partitions of finite sets. When a sequence of tasks is produced by a source of a given Rényi entropy rate of order 1/(1+ρ) and n tasks are jointly described using nR bits, it is shown that for R larger than the Rényi entropy rate, the ρ-th moment of the ratio of performed tasks to n can be driven to one as n tends to infinity, and that for R less than the Rényi entropy rate it tends to infinity. This generalizes a recent result for IID sources by the same authors. A mismatched version of the direct part is also considered, where the code is designed according to the wrong law. The penalty incurred by the mismatch can be expressed in terms of a divergence measure that was shown by Sundaresan to play a similar role in the Massey-Arikan guessing problem.
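
For reference, the Rényi entropy rate of order 1/(1+ρ) in which the result above is stated can be written, in our notation and assuming the limit exists, as

\[
H_{\alpha}(X_1^n) = \frac{1}{1-\alpha} \log_2 \sum_{x_1^n} P_{X_1^n}(x_1^n)^{\alpha},
\qquad
H_{\alpha} = \lim_{n \to \infty} \frac{1}{n} H_{\alpha}(X_1^n),
\qquad
\alpha = \frac{1}{1+\rho}.
\]

In this notation the abstract's claim is that, when n tasks are jointly described with nR bits, the ρ-th moment of the ratio of performed tasks to n can be driven to one whenever R exceeds this rate, and tends to infinity whenever R is below it; as ρ tends to 0 the order 1/(1+ρ) tends to 1 and the Rényi entropy rate reduces to the ordinary Shannon entropy rate.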