Relative Entropy

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies

The Experts below are selected from a list of 15609 Experts worldwide ranked by ideXlab platform

David Sutter - One of the best experts on this subject based on the ideXlab platform.

  • chain rule for the quantum Relative Entropy
    Physical Review Letters, 2020
    Co-Authors: Kun Fang, Omar Fawzi, Renato Renner, David Sutter
    Abstract:

    The chain rule for the classical Relative Entropy ensures that the Relative Entropy between probability distributions on multipartite systems can be decomposed into a sum of Relative entropies of suitably chosen conditional distributions on the individual systems. Here, we prove a chain rule inequality for the quantum Relative Entropy. The new chain rule allows us to solve an open problem in the context of asymptotic quantum channel discrimination: surprisingly, adaptive protocols cannot improve the error rate for asymmetric channel discrimination compared to nonadaptive strategies.
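The classical chain rule referred to above is an exact identity and is easy to check numerically. A minimal sketch with two hypothetical joint distributions on a 2 x 3 alphabet (natural logarithm throughout):

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical joint distributions P_XY and Q_XY.
P = np.array([[0.10, 0.20, 0.10],
              [0.25, 0.15, 0.20]])
Q = np.array([[0.15, 0.15, 0.10],
              [0.20, 0.20, 0.20]])

# Chain rule: D(P_XY || Q_XY) = D(P_X || Q_X)
#             + sum_x P_X(x) * D(P_{Y|X=x} || Q_{Y|X=x})
lhs = kl(P.ravel(), Q.ravel())
Px, Qx = P.sum(axis=1), Q.sum(axis=1)
rhs = kl(Px, Qx) + sum(
    Px[x] * kl(P[x] / Px[x], Q[x] / Qx[x]) for x in range(len(Px))
)
assert np.isclose(lhs, rhs)
```

In the classical case the decomposition holds with equality; the quantum result of the paper is the corresponding inequality.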

  • A chain rule for the quantum Relative Entropy
    arXiv: Quantum Physics, 2019
    Co-Authors: Kun Fang, Omar Fawzi, Renato Renner, David Sutter
    Abstract:

    The chain rule for the classical Relative Entropy ensures that the Relative Entropy between probability distributions on multipartite systems can be decomposed into a sum of Relative entropies of suitably chosen conditional distributions on the individual systems. Here, we prove a similar chain rule inequality for the quantum Relative Entropy in terms of channel Relative entropies. The new chain rule allows us to solve an open problem in the context of asymptotic quantum channel discrimination: surprisingly, adaptive protocols cannot improve the error rate for asymmetric channel discrimination compared to non-adaptive strategies. In addition, we give examples of quantum channels showing that the channel Relative Entropy is not additive under the tensor product.

  • strengthened monotonicity of Relative Entropy via pinched Petz recovery map
    International Symposium on Information Theory, 2016
    Co-Authors: David Sutter, Marco Tomamichel, Aram W Harrow
    Abstract:

    The quantum Relative Entropy between two states satisfies a monotonicity property, meaning that applying the same quantum channel to both states can never increase their Relative Entropy. It is known that this inequality is only tight when there is a “recovery map” that exactly reverses the effects of the quantum channel on both states. In this paper we strengthen this inequality by showing that the difference of Relative entropies is bounded below by the measured Relative Entropy between the first state and a recovered state from its processed version. The recovery map is a convex combination of rotated Petz recovery maps and perfectly reverses the quantum channel on the second state. As a special case we reproduce recent lower bounds on the conditional mutual information such as the one proved in [Fawzi and Renner, Commun. Math. Phys., 2015]. Our proof only relies on elementary properties of pinching maps and the operator logarithm.

  • strengthened monotonicity of Relative Entropy via pinched Petz recovery map
    IEEE Transactions on Information Theory, 2016
    Co-Authors: David Sutter, Marco Tomamichel, Aram W Harrow
    Abstract:

    The quantum Relative Entropy between two states satisfies a monotonicity property meaning that applying the same quantum channel to both states can never increase their Relative Entropy. It is known that this inequality is only tight when there is a recovery map that exactly reverses the effects of the quantum channel on both states. In this paper, we strengthen this inequality by showing that the difference of Relative entropies is bounded below by the measured Relative Entropy between the first state and a recovered state from its processed version. The recovery map is a convex combination of rotated Petz recovery maps and perfectly reverses the quantum channel on the second state. As a special case, we reproduce recent lower bounds on the conditional mutual information, such as the one proved by Fawzi and Renner. Our proof only relies on the elementary properties of pinching maps and the operator logarithm.
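The classical analogue of the monotonicity property discussed here is the data-processing inequality for the Kullback-Leibler divergence: pushing both distributions through the same stochastic channel can never increase their Relative Entropy. A minimal numerical sketch with randomly drawn distributions and channels (all values illustrative):

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

rng = np.random.default_rng(0)
for _ in range(100):
    p = rng.random(4); p /= p.sum()
    q = rng.random(4); q /= q.sum()
    # Column-stochastic channel W: W[j, i] = Pr(output j | input i).
    W = rng.random((3, 4)); W /= W.sum(axis=0)
    # Data-processing inequality: D(Wp || Wq) <= D(p || q).
    assert kl(W @ p, W @ q) <= kl(p, q) + 1e-12
```

The paper quantifies the gap in this inequality (in the quantum setting) via a recovery map; the sketch only illustrates the inequality itself.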

Aram W Harrow - One of the best experts on this subject based on the ideXlab platform.

  • strengthened monotonicity of Relative Entropy via pinched Petz recovery map
    International Symposium on Information Theory, 2016
    Co-Authors: David Sutter, Marco Tomamichel, Aram W Harrow
    Abstract:

    The quantum Relative Entropy between two states satisfies a monotonicity property, meaning that applying the same quantum channel to both states can never increase their Relative Entropy. It is known that this inequality is only tight when there is a “recovery map” that exactly reverses the effects of the quantum channel on both states. In this paper we strengthen this inequality by showing that the difference of Relative entropies is bounded below by the measured Relative Entropy between the first state and a recovered state from its processed version. The recovery map is a convex combination of rotated Petz recovery maps and perfectly reverses the quantum channel on the second state. As a special case we reproduce recent lower bounds on the conditional mutual information such as the one proved in [Fawzi and Renner, Commun. Math. Phys., 2015]. Our proof only relies on elementary properties of pinching maps and the operator logarithm.

  • strengthened monotonicity of Relative Entropy via pinched Petz recovery map
    IEEE Transactions on Information Theory, 2016
    Co-Authors: David Sutter, Marco Tomamichel, Aram W Harrow
    Abstract:

    The quantum Relative Entropy between two states satisfies a monotonicity property meaning that applying the same quantum channel to both states can never increase their Relative Entropy. It is known that this inequality is only tight when there is a recovery map that exactly reverses the effects of the quantum channel on both states. In this paper, we strengthen this inequality by showing that the difference of Relative entropies is bounded below by the measured Relative Entropy between the first state and a recovered state from its processed version. The recovery map is a convex combination of rotated Petz recovery maps and perfectly reverses the quantum channel on the second state. As a special case, we reproduce recent lower bounds on the conditional mutual information, such as the one proved by Fawzi and Renner. Our proof only relies on the elementary properties of pinching maps and the operator logarithm.

Chein-i Chang - One of the best experts on this subject based on the ideXlab platform.

  • survey and comparative analysis of Entropy and Relative Entropy thresholding techniques
    IEE Proceedings - Vision Image and Signal Processing, 2006
    Co-Authors: Chein-i Chang, Jianwei Wang, Yingzi Du, Paul D. Thouin
    Abstract:

    Entropy-based image thresholding has received considerable interest in recent years. Two types of Entropy are generally used as thresholding criteria: Shannon's Entropy and Relative Entropy, also known as the Kullback-Leibler information distance. The former measures the uncertainty in an information source, with an optimal threshold obtained by maximising Shannon's Entropy, whereas the latter measures the information discrepancy between two different sources, with an optimal threshold obtained by minimising the Relative Entropy. Many thresholding methods have been developed for both criteria and reported in the literature. These two Entropy-based thresholding criteria have been investigated and the relationship among Entropy and Relative Entropy thresholding methods has been explored. In particular, a survey and comparative analysis is conducted among several widely used methods, including Pun and Kapur's maximum Entropy, Kittler and Illingworth's minimum error thresholding, Pal and Pal's Entropy thresholding and Chang et al.'s Relative Entropy thresholding methods. In order to objectively assess these methods, two measures, uniformity and shape, are used for performance evaluation.
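As a concrete instance of the first criterion surveyed above (maximising Shannon's Entropy), the sketch below implements a Kapur-style maximum-Entropy threshold: the histogram is split at a candidate level t, and t is chosen to maximise the sum of the entropies of the two normalised class distributions. The synthetic bimodal histogram is illustrative, not taken from the paper:

```python
import numpy as np

def kapur_threshold(hist):
    """Kapur-style maximum-Entropy threshold for a gray-level histogram."""
    p = hist / hist.sum()
    c = np.cumsum(p)
    best_t, best_h = 1, -np.inf
    for t in range(1, len(p) - 1):
        w0, w1 = c[t], 1.0 - c[t]
        if w0 <= 0 or w1 <= 0:
            continue
        # Normalised class distributions and their Shannon entropies.
        p0, p1 = p[: t + 1] / w0, p[t + 1 :] / w1
        h = -sum(x * np.log(x) for x in p0 if x > 0) \
            - sum(x * np.log(x) for x in p1 if x > 0)
        if h > best_h:
            best_t, best_h = t, h
    return best_t

# Synthetic bimodal histogram: dark peak near level 60, bright peak near 190.
levels = np.arange(256)
hist = np.exp(-((levels - 60) ** 2) / 200.0) \
     + np.exp(-((levels - 190) ** 2) / 200.0)
t = kapur_threshold(hist)
```

For a well-separated bimodal histogram like this one, the maximum-Entropy threshold lands between the two modes.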

  • Relative Entropy-based methods for image thresholding
    2002 IEEE International Symposium on Circuits and Systems. Proceedings (Cat. No.02CH37353), 2002
    Co-Authors: Jianwei Wang, Chein-i Chang, Eliza Yingzi Du, Paul D. Thouin
    Abstract:

    A Relative entropic thresholding approach was recently developed by Chang et al. (see Pattern Recognition, vol. 27, no. 9, p. 1275-1289, 1994). This paper extends Chang et al.'s approach to two more Relative Entropy-based thresholding methods, called local Relative Entropy thresholding (LRE) and joint Relative Entropy thresholding (JRE). Since Relative Entropy-based methods are sensitive to sparse image histograms, a histogram compression and translation is suggested to compact the histogram. In order to achieve an objective assessment, uniformity and shape measures are introduced for performance evaluation. Experimental results show that, when image histograms are sparse, JRE and LRE with the proposed histogram compression and translation generally perform better than Chang et al.'s approach.

  • a Relative Entropy-based approach to image thresholding
    Pattern Recognition, 1994
    Co-Authors: Chein-i Chang, Jianwei Wang, Kebo Chen, Mark L.g. Althouse
    Abstract:

    In this paper, we present a new image thresholding technique which uses the Relative Entropy (also known as the Kullback-Leibler discrimination distance function) as a criterion for thresholding an image. As a result, a gray level minimizing the Relative Entropy will be the desired threshold. The proposed Relative Entropy approach differs from two known Entropy-based thresholding techniques, the local Entropy and joint Entropy methods developed by N. R. Pal and S. K. Pal, in the sense that the former focuses on the matching between two images while the latter only emphasizes the Entropy of the co-occurrence matrix of one image. The experimental results show that these three techniques are image dependent, and the local Entropy and Relative Entropy seem to perform better than the joint Entropy. In addition, the Relative Entropy can complement the local Entropy and joint Entropy by providing different details which the others cannot. As far as computational savings are concerned, the Relative Entropy approach also has the least computational complexity.
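Chang et al.'s criterion is defined on co-occurrence matrices, which is beyond a short sketch. As a simpler, self-contained illustration of thresholding by minimising a Relative-Entropy-style objective, the code below implements the closely related minimum cross-Entropy rule of Li and Lee, in which each class is summarised by its mean gray level; the synthetic histogram and all names are illustrative, and this is not Chang et al.'s exact formulation:

```python
import numpy as np

def min_cross_entropy_threshold(hist):
    """Li-Lee style threshold: minimise the cross-Entropy term
    -sum_{g in C0} g p(g) log(mu0) - sum_{g in C1} g p(g) log(mu1),
    where mu0, mu1 are the class mean gray levels."""
    p = hist / hist.sum()
    g = np.arange(1, len(p) + 1, dtype=float)  # 1-based levels avoid log(0)
    best_t, best_eta = 1, np.inf
    for t in range(1, len(p) - 1):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 <= 0 or w1 <= 0:
            continue
        m0 = (g[:t] * p[:t]).sum()   # equals w0 * mu0
        m1 = (g[t:] * p[t:]).sum()   # equals w1 * mu1
        eta = -m0 * np.log(m0 / w0) - m1 * np.log(m1 / w1)
        if eta < best_eta:
            best_t, best_eta = t, eta
    return best_t

# Synthetic bimodal histogram: dark peak near level 60, bright peak near 190.
levels = np.arange(256)
hist = np.exp(-((levels - 60) ** 2) / 200.0) \
     + np.exp(-((levels - 190) ** 2) / 200.0)
t = min_cross_entropy_threshold(hist)
```

As with the maximum-Entropy criterion, on a well-separated bimodal histogram the minimising threshold falls between the two modes.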

  • Vapor cloud detection using Relative Entropy thresholding
    Signal Processing Sensor Fusion and Target Recognition III, 1994
    Co-Authors: Chein-i Chang, Jianwei Wang, Mark L.g. Althouse
    Abstract:

    A thresholding technique using Relative Entropy is proposed for vapor cloud detection. The idea is to cast a detection problem as a thresholding problem in which the Relative Entropy is chosen as the detection criterion and the null and alternative hypotheses correspond to background and objects, respectively. Since the information content in an image can be characterized by its Entropy, the original image and the thresholded bilevel image can be viewed as two sources. As a result, the Relative Entropy becomes a natural measure of the mismatch between these two images: the smaller the Relative Entropy, the better the match. In this paper, we interpret detection problems as image thresholding problems, where the null hypothesis corresponds to noise only and the alternative hypothesis represents the presence of a target. Three methods based on Relative Entropy are presented for chemical vapor cloud detection. The experimental results show that the suggested Relative Entropy-based methods can detect a vapor cloud very effectively. The performance is also compared against two recently developed entropic thresholding techniques, the local Entropy and joint Entropy proposed by N. R. Pal and S. K. Pal, and shows that the Relative Entropy-based methods outperform Pal and Pal's methods.

Omar Fawzi - One of the best experts on this subject based on the ideXlab platform.

  • chain rule for the quantum Relative Entropy
    Physical Review Letters, 2020
    Co-Authors: Kun Fang, Omar Fawzi, Renato Renner, David Sutter
    Abstract:

    The chain rule for the classical Relative Entropy ensures that the Relative Entropy between probability distributions on multipartite systems can be decomposed into a sum of Relative entropies of suitably chosen conditional distributions on the individual systems. Here, we prove a chain rule inequality for the quantum Relative Entropy. The new chain rule allows us to solve an open problem in the context of asymptotic quantum channel discrimination: surprisingly, adaptive protocols cannot improve the error rate for asymmetric channel discrimination compared to nonadaptive strategies.

  • A chain rule for the quantum Relative Entropy
    arXiv: Quantum Physics, 2019
    Co-Authors: Kun Fang, Omar Fawzi, Renato Renner, David Sutter
    Abstract:

    The chain rule for the classical Relative Entropy ensures that the Relative Entropy between probability distributions on multipartite systems can be decomposed into a sum of Relative entropies of suitably chosen conditional distributions on the individual systems. Here, we prove a similar chain rule inequality for the quantum Relative Entropy in terms of channel Relative entropies. The new chain rule allows us to solve an open problem in the context of asymptotic quantum channel discrimination: surprisingly, adaptive protocols cannot improve the error rate for asymmetric channel discrimination compared to non-adaptive strategies. In addition, we give examples of quantum channels showing that the channel Relative Entropy is not additive under the tensor product.

  • efficient optimization of the quantum Relative Entropy
    Journal of Physics A, 2018
    Co-Authors: Hamza Fawzi, Omar Fawzi
    Abstract:

    Many quantum information measures can be written as an optimization of the quantum Relative Entropy between sets of states. For example, the Relative Entropy of entanglement of a state is the minimum Relative Entropy to the set of separable states. The various capacities of quantum channels can also be written in this way. We propose a unified framework to numerically compute these quantities using off-the-shelf semidefinite programming solvers, exploiting the approximation method proposed in Fawzi, Saunderson and Parrilo (2017, arXiv:1705.00812). As a notable application, this method allows us to provide numerical counterexamples for a proposed lower bound on the quantum conditional mutual information in terms of the Relative Entropy of recovery.
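Before any semidefinite approximation enters, the objective being optimised can be evaluated directly for fixed states via an eigendecomposition, since D(rho || sigma) = Tr[rho (log rho - log sigma)]. A minimal sketch with hypothetical qubit states (natural logarithm):

```python
import numpy as np

def mat_log(a):
    """Matrix logarithm of a positive definite Hermitian matrix."""
    w, v = np.linalg.eigh(a)
    return v @ np.diag(np.log(w)) @ v.conj().T

def quantum_relative_entropy(rho, sigma):
    """D(rho || sigma) = Tr[rho (log rho - log sigma)], in nats."""
    return float(np.real(np.trace(rho @ (mat_log(rho) - mat_log(sigma)))))

rho = np.array([[0.75, 0.25], [0.25, 0.25]])  # hypothetical qubit state
sigma = np.eye(2) / 2.0                        # maximally mixed state

d = quantum_relative_entropy(rho, sigma)
# Sanity check: D(rho || I/2) = log(2) - S(rho), S the von Neumann Entropy.
eigs = np.linalg.eigvalsh(rho)
s_rho = -float(sum(x * np.log(x) for x in eigs if x > 1e-12))
assert np.isclose(d, np.log(2) - s_rho)
```

The hard part addressed by the paper is not this pointwise evaluation but minimising D over a convex set of states, which the cited approximation reduces to standard semidefinite programming.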

Mark L.g. Althouse - One of the best experts on this subject based on the ideXlab platform.

  • a Relative Entropy-based approach to image thresholding
    Pattern Recognition, 1994
    Co-Authors: Chein-i Chang, Jianwei Wang, Kebo Chen, Mark L.g. Althouse
    Abstract:

    In this paper, we present a new image thresholding technique which uses the Relative Entropy (also known as the Kullback-Leibler discrimination distance function) as a criterion for thresholding an image. As a result, a gray level minimizing the Relative Entropy will be the desired threshold. The proposed Relative Entropy approach differs from two known Entropy-based thresholding techniques, the local Entropy and joint Entropy methods developed by N. R. Pal and S. K. Pal, in the sense that the former focuses on the matching between two images while the latter only emphasizes the Entropy of the co-occurrence matrix of one image. The experimental results show that these three techniques are image dependent, and the local Entropy and Relative Entropy seem to perform better than the joint Entropy. In addition, the Relative Entropy can complement the local Entropy and joint Entropy by providing different details which the others cannot. As far as computational savings are concerned, the Relative Entropy approach also has the least computational complexity.

  • Vapor cloud detection using Relative Entropy thresholding
    Signal Processing Sensor Fusion and Target Recognition III, 1994
    Co-Authors: Chein-i Chang, Jianwei Wang, Mark L.g. Althouse
    Abstract:

    A thresholding technique using Relative Entropy is proposed for vapor cloud detection. The idea is to cast a detection problem as a thresholding problem in which the Relative Entropy is chosen as the detection criterion and the null and alternative hypotheses correspond to background and objects, respectively. Since the information content in an image can be characterized by its Entropy, the original image and the thresholded bilevel image can be viewed as two sources. As a result, the Relative Entropy becomes a natural measure of the mismatch between these two images: the smaller the Relative Entropy, the better the match. In this paper, we interpret detection problems as image thresholding problems, where the null hypothesis corresponds to noise only and the alternative hypothesis represents the presence of a target. Three methods based on Relative Entropy are presented for chemical vapor cloud detection. The experimental results show that the suggested Relative Entropy-based methods can detect a vapor cloud very effectively. The performance is also compared against two recently developed entropic thresholding techniques, the local Entropy and joint Entropy proposed by N. R. Pal and S. K. Pal, and shows that the Relative Entropy-based methods outperform Pal and Pal's methods.