Entropy - Explore the Science & Experts | ideXlab

Entropy

Stefan Berens – One of the best experts on this subject based on the ideXlab platform.

  • On the Conditional Rényi Entropy
    IEEE Transactions on Information Theory, 2014
    Co-Authors: Serge Fehr, Stefan Berens
    Abstract:

    The Rényi entropy of general order unifies the well-known Shannon entropy with several other entropy notions, like the min-entropy or collision entropy. In contrast to the Shannon entropy, there seems to be no commonly accepted definition for the conditional Rényi entropy: several versions have been proposed and used in the literature. In this paper, we reconsider the definition for the conditional Rényi entropy of general order as proposed by Arimoto in the seventies. We show that this particular notion satisfies several natural properties. In particular, it satisfies monotonicity under conditioning, meaning that conditioning can only reduce the entropy, and (a weak form of) the chain rule, which implies that the decrease in entropy due to conditioning is bounded by the number of bits one conditions on. None of the other suggestions for the conditional Rényi entropy satisfies both of these properties. Finally, we show a natural interpretation of the conditional Rényi entropy in terms of (unconditional) Rényi divergence, and we show consistency with a recently proposed notion of conditional Rényi entropy in the quantum setting.
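To make the unifying role of the order concrete, here is a minimal numerical sketch (our own illustration, not code from the paper; `renyi_entropy` is a hypothetical helper name) recovering the Shannon, collision, and min-entropies as the special cases α → 1, α = 2, and α → ∞:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (in bits) of a probability vector p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # zero-probability outcomes contribute nothing
    if alpha == 1:                    # limit alpha -> 1: Shannon entropy
        return float(-np.sum(p * np.log2(p)))
    if alpha == np.inf:               # limit alpha -> infinity: min-entropy
        return float(-np.log2(p.max()))
    return float(np.log2(np.sum(p ** alpha)) / (1.0 - alpha))

# A fair coin has exactly 1 bit of entropy at every order; for a biased coin
# the entropy is non-increasing in the order alpha.
biased = [0.75, 0.25]
for a in (1, 2, np.inf):              # Shannon, collision, min-entropy
    print(a, renyi_entropy(biased, a))
```

The monotone decrease in α visible here is the unconditional counterpart of the "conditioning can only reduce the entropy" property discussed in the abstract.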

Chein-i Chang

  • Survey and comparative analysis of entropy and relative entropy thresholding techniques
    IEE Proceedings – Vision Image and Signal Processing, 2006
    Co-Authors: Chein-i Chang, Jianwei Wang, Yingzi Du, Paul D. Thouin
    Abstract:

    Entropy-based image thresholding has received considerable interest in recent years. Two types of entropy are generally used as thresholding criteria: Shannon’s entropy and relative entropy, also known as the Kullback–Leibler information distance. The former measures uncertainty in an information source, with an optimal threshold obtained by maximising Shannon’s entropy, whereas the latter measures the information discrepancy between two different sources, with an optimal threshold obtained by minimising the relative entropy. Many thresholding methods have been developed for both criteria and reported in the literature. These two entropy-based thresholding criteria have been investigated, and the relationship among entropy and relative entropy thresholding methods has been explored. In particular, a survey and comparative analysis is conducted among several widely used methods, including Pun’s and Kapur’s maximum entropy, Kittler and Illingworth’s minimum error thresholding, Pal and Pal’s entropy thresholding, and Chang et al.’s relative entropy thresholding methods. In order to objectively assess these methods, two measures, uniformity and shape, are used for performance evaluation.
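To illustrate the maximum-entropy side of this survey, the following sketch implements a Kapur-style criterion on a gray-level histogram (a simplified illustration under assumed naming, not the surveyed authors' code): each candidate threshold splits the histogram in two, and the threshold maximising the sum of the two class entropies is returned.

```python
import numpy as np

def shannon(q):
    """Shannon entropy (in nats) of a normalised distribution q."""
    q = q[q > 0]
    return float(-np.sum(q * np.log(q)))

def kapur_threshold(hist):
    """Maximum-entropy threshold in the style of Kapur et al.

    Each candidate t splits the histogram into [0, t) and [t, L); both parts
    are renormalised, and the t maximising the sum of their entropies wins.
    """
    p = np.asarray(hist, dtype=float) / np.sum(hist)
    best_t, best_h = 1, -np.inf
    for t in range(1, len(p)):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 <= 0 or w1 <= 0:        # skip splits with an empty class
            continue
        h = shannon(p[:t] / w0) + shannon(p[t:] / w1)
        if h > best_h:
            best_t, best_h = t, h
    return best_t

# A bimodal gray-level histogram: the threshold lands between the two modes.
hist = np.array([0, 10, 20, 10, 0, 0, 0, 0, 0, 0, 10, 20, 10, 0, 0, 0])
print(kapur_threshold(hist))
```

The exhaustive search over thresholds is O(L²) as written; the surveyed methods differ mainly in the criterion evaluated inside this loop.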

  • A relative entropy-based approach to image thresholding
    Pattern Recognition, 1994
    Co-Authors: Chein-i Chang, Kebo Chen, Jianwei Wang, Mark L.g. Althouse
    Abstract:

    In this paper, we present a new image thresholding technique which uses the relative entropy (also known as the Kullback–Leibler discrimination distance function) as a criterion for thresholding an image. As a result, a gray level minimizing the relative entropy will be the desired threshold. The proposed relative entropy approach differs from two known entropy-based thresholding techniques, the local entropy and joint entropy methods developed by N. R. Pal and S. K. Pal, in the sense that the former focuses on the matching between two images while the latter only emphasize the entropy of the co-occurrence matrix of one image. The experimental results show that these three techniques are image dependent, and the local entropy and relative entropy seem to perform better than does the joint entropy. In addition, the relative entropy can complement the local entropy and joint entropy in terms of providing different details which the others cannot. As far as computational savings are concerned, the relative entropy approach also provides the least computational complexity.
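The paper's criterion is built on matching the image against its thresholded version; as a simplified, histogram-only illustration of the general idea (choose the gray level minimizing a relative entropy between the histogram and its two-class approximation, with a piecewise-uniform class model of our own choosing rather than the authors' co-occurrence formulation), consider:

```python
import numpy as np

def kl(p, q):
    """Relative entropy D(p || q) in nats; p and q share the same support."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def relative_entropy_threshold(hist):
    """Pick the gray level whose two-class approximation is closest to the
    histogram in relative entropy (piecewise-uniform class model)."""
    p = np.asarray(hist, dtype=float) / np.sum(hist)
    best_t, best_d = 1, np.inf
    for t in range(1, len(p)):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 <= 0 or w1 <= 0:        # skip splits with an empty class
            continue
        q = np.empty_like(p)          # each class keeps its total mass,
        q[:t] = w0 / t                # spread evenly over its own gray levels
        q[t:] = w1 / (len(p) - t)
        d = kl(p, q)
        if d < best_d:
            best_t, best_d = t, d
    return best_t

hist = np.array([0, 10, 20, 10, 0, 0, 0, 0, 0, 0, 10, 20, 10, 0, 0, 0])
print(relative_entropy_threshold(hist))
```

Minimising rather than maximising is the structural difference from the maximum-entropy criteria: here the threshold is the one whose binarised model loses the least information about the original distribution.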

Denes Petz

  • The proper formula for relative entropy and its asymptotics in quantum probability
    Communications in Mathematical Physics, 1991
    Co-Authors: Fumio Hiai, Denes Petz
    Abstract:

    Umegaki’s relative entropy S(ω, ϕ) = Tr D_ω(log D_ω − log D_ϕ) (of states ω and ϕ with density operators D_ω and D_ϕ, respectively) is shown to be an asymptotic exponent considered from the quantum hypothesis testing viewpoint. It is also proved that some other versions of the relative entropy give rise to the same asymptotics as Umegaki’s. As a byproduct, the inequality Tr A log(AB) ≥ Tr A(log A + log B) is obtained for positive definite matrices A and B.
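Both Umegaki's formula and the byproduct inequality can be checked numerically; the sketch below (our own illustration with assumed helper names, not the paper's machinery) computes S(ω, ϕ) via spectral decomposition of the density operators:

```python
import numpy as np

def herm_log(m):
    """Matrix logarithm of a positive definite Hermitian matrix."""
    w, v = np.linalg.eigh(m)
    return (v * np.log(w)) @ v.conj().T

def umegaki(rho, sigma):
    """Umegaki relative entropy S(rho, sigma) = Tr rho (log rho - log sigma)."""
    return float(np.trace(rho @ (herm_log(rho) - herm_log(sigma))).real)

rho = np.diag([0.75, 0.25])                    # a diagonal qubit state
sigma = np.array([[0.5, 0.25], [0.25, 0.5]])   # a correlated qubit state
print(umegaki(rho, sigma), umegaki(rho, rho))  # >= 0, and 0 for identical states

# Byproduct inequality, Tr A log(AB) >= Tr A (log A + log B), for positive
# definite A and B.  AB is similar to the positive matrix sqrt(A) B sqrt(A),
# so its eigenvalues are real and positive and log(AB) is well defined.
def gen_log(m):
    w, v = np.linalg.eig(m)
    return (v * np.log(w.astype(complex))) @ np.linalg.inv(v)

A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.array([[1.0, 0.3], [0.3, 2.0]])
lhs = float(np.trace(A @ gen_log(A @ B)).real)
rhs = float(np.trace(A @ (herm_log(A) + herm_log(B))).real)
print(lhs >= rhs)
```

The non-negativity checked in the first print is Klein's inequality, which is what makes S(ω, ϕ) usable as a discrimination exponent in the hypothesis-testing interpretation above.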