

Divergence

The Experts below are selected from a list of 567,750 Experts worldwide, ranked by the ideXlab platform.

David Sutter - One of the best experts on this subject based on the ideXlab platform.

• Pretty good measures in quantum information theory
2017 IEEE International Symposium on Information Theory (ISIT), 2017
Co-Authors: Raban Iten, Joseph M. Renes, David Sutter
Abstract:

Quantum generalizations of Rényi's entropies are a useful tool to describe a variety of operational tasks in quantum information processing. Two families of such generalizations turn out to be particularly useful: the Petz quantum Rényi divergence D̅α and the minimal quantum Rényi divergence D̃α. In this paper, we prove a reverse Araki-Lieb-Thirring inequality that implies a new relation between these two families of divergences, namely that αD̅α(ρ∥σ) ≤ D̃α(ρ∥σ) for α ∈ [0, 1], where ρ and σ are density operators. This bound suggests defining a "pretty good fidelity," whose relation to the usual fidelity implies the known relations between the optimal and pretty good measurement as well as the optimal and pretty good singlet fraction.

• Pretty Good Measures in Quantum Information Theory
IEEE Transactions on Information Theory, 2017
Co-Authors: Raban Iten, Joseph M. Renes, David Sutter
Abstract:

Quantum generalizations of Rényi's entropies are a useful tool to describe a variety of operational tasks in quantum information processing. Two families of such generalizations turn out to be particularly useful: the Petz quantum Rényi divergence D̅α and the minimal quantum Rényi divergence D̃α. In this paper, we prove a reverse Araki-Lieb-Thirring inequality that implies a new relation between these two families of divergences, namely that αD̅α(ρ∥σ) ≤ D̃α(ρ∥σ) for α ∈ [0, 1], where ρ and σ are density operators. This bound suggests defining a "pretty good fidelity," whose relation to the usual fidelity implies the known relations between the optimal and pretty good measurement as well as the optimal and pretty good singlet fraction. We also find a new necessary and sufficient condition for optimality of the pretty good measurement and singlet fraction.
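As a quick numerical sanity check (our own sketch, not code or material from the paper), both divergence families can be implemented directly from their standard definitions and the stated inequality verified on random states; all function names below are hypothetical.

```python
import numpy as np

def mpow(A, p):
    """Fractional power of a Hermitian PSD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * np.clip(w, 0, None) ** p) @ V.conj().T

def petz_renyi(rho, sigma, a):
    """Petz quantum Renyi divergence: log Tr(rho^a sigma^(1-a)) / (a - 1)."""
    return np.log(np.trace(mpow(rho, a) @ mpow(sigma, 1 - a)).real) / (a - 1)

def minimal_renyi(rho, sigma, a):
    """Minimal (sandwiched) quantum Renyi divergence."""
    s = mpow(sigma, (1 - a) / (2 * a))
    return np.log(np.trace(mpow(s @ rho @ s, a)).real) / (a - 1)

def random_density(d, rng):
    """A random full-rank density operator on C^d."""
    X = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
    rho = X @ X.conj().T
    return rho / np.trace(rho).real

rng = np.random.default_rng(0)
rho, sigma = random_density(3, rng), random_density(3, rng)
for a in (0.1, 0.5, 0.9):
    # the relation proved in the paper: alpha * Petz <= minimal on [0, 1]
    assert a * petz_renyi(rho, sigma, a) <= minimal_renyi(rho, sigma, a) + 1e-12
```

The known opposite ordering, D̃α ≤ D̅α from the (forward) Araki-Lieb-Thirring inequality, can be checked on the same random states.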

Andrew D Foote - One of the best experts on this subject based on the ideXlab platform.

• Ecological, morphological and genetic divergence of sympatric North Atlantic killer whale populations
Molecular Ecology, 2009
Co-Authors: Andrew D Foote, Eske Willerslev, James Newton, Stuart B. Piertney, Thomas M P Gilbert
Abstract:

Ecological divergence has a central role in speciation and is therefore an important source of biodiversity. Studying the micro-evolutionary processes of ecological diversification at its early stages provides an opportunity for investigating the causative mechanisms and ecological conditions promoting divergence. Here we use morphological traits, nitrogen stable isotope ratios and tooth wear to characterize two disparate types of North Atlantic killer whale. We find a highly specialist type, which reaches up to 8.5 m in length, and a generalist type, which reaches up to 6.6 m in length. There is a single fixed genetic difference in the mtDNA control region between these types, indicating integrity of groupings and a shallow divergence. Phylogenetic analysis indicates this divergence is independent of similar ecological divergences in the Pacific and Antarctic. Niche width in the generalist type is more strongly influenced by between-individual variation than by within-individual variation in the composition of the diet. This first step toward divergent specialization on different ecological resources provides a rare example of the ecological conditions at the early stages of adaptive radiation.
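The between- vs. within-individual comparison can be illustrated with the standard niche-variation decomposition (total niche width = within-individual component + between-individual component); the δ15N numbers below are invented for illustration, not data from the study:

```python
import numpy as np

# hypothetical delta-15N values: one row per individual, repeat samples per row
diets = np.array([
    [12.1, 12.4, 12.0],
    [14.8, 15.1, 14.9],
    [11.5, 11.8, 11.6],
    [15.5, 15.2, 15.6],
])

wic = diets.var(axis=1).mean()    # within-individual component (WIC)
bic = diets.mean(axis=1).var()    # between-individual component (BIC)
tnw = wic + bic                   # total niche width (TNW); equals diets.var() here
# a BIC/TNW ratio near 1 means niche width is driven by between-individual variation
```

With equal sample counts per individual, this is exactly the law of total variance applied to the pooled measurements.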

Michael I. Jordan - One of the best experts on this subject based on the ideXlab platform.

• Estimating divergence functionals and the likelihood ratio by convex risk minimization
IEEE Transactions on Information Theory, 2010
Co-Authors: Xuanlong Nguyen, Martin J. Wainwright, Michael I. Jordan
Abstract:

We develop and analyze M-estimation methods for divergence functionals and the likelihood ratios of two probability distributions. Our method is based on a nonasymptotic variational characterization of f-divergences, which allows the problem of estimating divergences to be tackled via convex empirical risk optimization. The resulting estimators are simple to implement, requiring only the solution of standard convex programs. We present an analysis of consistency and convergence for these estimators. Given conditions only on the ratios of densities, we show that our estimators can achieve optimal minimax rates for the likelihood ratio and the divergence functionals in certain regimes. We derive an efficient optimization algorithm for computing our estimates, and illustrate their convergence behavior and practical viability by simulations.
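A minimal sketch of the idea (ours, not the authors' code): for the KL divergence, one variational characterization is KL(P‖Q) = sup_g E_P[g] − E_Q[e^(g−1)], with the supremum attained at g = 1 + log(p/q). Restricting g to a linear function class and maximizing the concave empirical objective by gradient ascent yields both a divergence estimate and a likelihood-ratio estimate exp(g − 1):

```python
import numpy as np

rng = np.random.default_rng(1)
xp = rng.normal(0.0, 1.0, 20000)   # samples from P = N(0, 1)
xq = rng.normal(1.0, 1.0, 20000)   # samples from Q = N(1, 1)

def feats(x):
    """Linear function class g(x) = a + b*x (enough here: log p/q is linear)."""
    return np.stack([np.ones_like(x), x], axis=1)

theta = np.zeros(2)
for _ in range(2000):
    # gradient of the concave empirical objective E_P[g] - E_Q[exp(g - 1)]
    gq = feats(xq) @ theta
    grad = feats(xp).mean(axis=0) - (np.exp(gq - 1)[:, None] * feats(xq)).mean(axis=0)
    theta += 0.05 * grad

kl_hat = (feats(xp) @ theta).mean() - np.exp(feats(xq) @ theta - 1).mean()
ratio_hat = lambda x: np.exp(feats(x) @ theta - 1)   # estimated dP/dQ
# true KL(N(0,1) || N(1,1)) = 1/2, so kl_hat should land close to 0.5
```

Because the empirical objective is concave in the coefficients of g, this is exactly a convex risk minimization, solvable to global optimality by any standard solver.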

• Nonparametric estimation of the likelihood ratio and divergence functionals
2007 IEEE International Symposium on Information Theory, 2007
Co-Authors: Xuanlong Nguyen, Martin J. Wainwright, Michael I. Jordan
Abstract:

We develop and analyze a nonparametric method for estimating the class of f-divergence functionals and the density ratio of two probability distributions. Our method is based on a nonasymptotic variational characterization of the f-divergence, which allows us to cast the problem of estimating divergences in terms of risk minimization. We thus obtain an M-estimator for divergences, based on a convex and differentiable optimization problem that can be solved efficiently. We analyze the consistency and convergence rates for this M-estimator given conditions only on the ratio of densities.

Joseph B Slowinski - One of the best experts on this subject based on the ideXlab platform.

• Estimating divergence times from molecular data on phylogenetic and population genetic timescales
Annual Review of Ecology Evolution and Systematics, 2002
Co-Authors: Brian S Arbogast, Scott V Edwards, John Wakeley, Peter Beerli, Joseph B Slowinski
Abstract:

Molecular clocks have profoundly influenced modern views on the timing of important events in evolutionary history. We review recent advances in estimating divergence times from molecular data, emphasizing the continuum between processes at the phylogenetic and population genetic scales. On the phylogenetic scale, we address the complexities of DNA sequence evolution as they relate to estimating divergences, focusing on models of nucleotide substitution and problems associated with among-site and among-lineage rate variation. On the population genetic scale, we review advances in the incorporation of ancestral population processes into the estimation of divergence times between recently separated species. Throughout the review we emphasize new statistical methods and the importance of model testing during the process of divergence time estimation.
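The review covers far more realistic models (among-site rate variation, ancestral polymorphism), but the strict-clock calculation they all build on is easy to sketch. The rate and observed difference below are hypothetical numbers, and Jukes-Cantor is the simplest of the substitution models discussed:

```python
import math

def jukes_cantor_distance(p):
    """JC69 correction of the observed per-site difference p for multiple hits."""
    return -0.75 * math.log(1.0 - 4.0 * p / 3.0)

def strict_clock_divergence_time(p, rate):
    """t = d / (2*mu): both lineages accumulate substitutions since the split."""
    return jukes_cantor_distance(p) / (2.0 * rate)

# hypothetical inputs: 8% observed sequence difference, 1e-8 subs/site/year
t = strict_clock_divergence_time(0.08, 1e-8)   # roughly 4.2 million years
```

Violations of either assumption (a single clock rate, or the JC69 model) bias t, which is why the review stresses model testing before divergence times are reported.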

Shun-ichi Amari - One of the best experts on this subject based on the ideXlab platform.

• Log-Determinant Divergences Revisited: Alpha-Beta and Gamma Log-Det Divergences
Entropy, 2015
Co-Authors: Andrzej Cichocki, Sergio Cruces, Shun-ichi Amari
Abstract:

This work reviews and extends a family of log-determinant (log-det) divergences for symmetric positive definite (SPD) matrices and discusses their fundamental properties. We show how to use parameterized Alpha-Beta (AB) and Gamma log-det divergences to generate many well-known divergences; in particular, we consider Stein's loss, the S-divergence, also called the Jensen-Bregman LogDet (JBLD) divergence, the Logdet Zero (Bhattacharyya) divergence, the Affine Invariant Riemannian Metric (AIRM), and other divergences. Moreover, we establish links and correspondences between log-det divergences and visualise them on an alpha-beta plane for various sets of parameters. We use this unifying framework to interpret and extend existing similarity measures for semidefinite covariance matrices in finite-dimensional Reproducing Kernel Hilbert Spaces (RKHS). This paper also shows how the Alpha-Beta family of log-det divergences relates to the divergences of multivariate and multilinear normal distributions. Closed-form formulas are derived for Gamma divergences of two multivariate Gaussian densities; the special cases of the Kullback-Leibler, Bhattacharyya, Rényi, and Cauchy-Schwarz divergences are discussed. Symmetrized versions of log-det divergences are also considered and briefly reviewed. Finally, a class of divergences is extended to multiway divergences for separable covariance (or precision) matrices.
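To make a few of the named measures concrete, here is a small NumPy sketch (standard textbook formulas, not code from the paper) of Stein's loss, the S-divergence (JBLD), and the AIRM between SPD matrices:

```python
import numpy as np

def stein_loss(P, Q):
    """Stein's loss (Burg matrix divergence): tr(Q^-1 P) - log det(Q^-1 P) - n."""
    n = P.shape[0]
    M = np.linalg.solve(Q, P)
    return np.trace(M) - np.log(np.linalg.det(M)) - n

def s_divergence(P, Q):
    """S-divergence / JBLD: log det((P+Q)/2) - (1/2) log det(P Q)."""
    _, ld_mid = np.linalg.slogdet((P + Q) / 2)
    _, ld_p = np.linalg.slogdet(P)
    _, ld_q = np.linalg.slogdet(Q)
    return ld_mid - 0.5 * (ld_p + ld_q)

def airm(P, Q):
    """Affine Invariant Riemannian Metric: ||log(Q^-1/2 P Q^-1/2)||_F."""
    w, V = np.linalg.eigh(Q)
    Q_isqrt = (V / np.sqrt(w)) @ V.T
    w2, _ = np.linalg.eigh(Q_isqrt @ P @ Q_isqrt)
    return np.sqrt(np.sum(np.log(w2) ** 2))

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4)); P = A @ A.T + 4 * np.eye(4)
B = rng.standard_normal((4, 4)); Q = B @ B.T + 4 * np.eye(4)
```

All three vanish exactly when P = Q; the S-divergence and the AIRM are additionally symmetric in their arguments, while Stein's loss is not.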

• Log-Determinant Divergences Revisited: Alpha-Beta and Gamma Log-Det Divergences
arXiv: Computation, 2014
Co-Authors: Andrzej Cichocki, Sergio Cruces, Shun-ichi Amari
Abstract:

In this paper, we review and extend a family of log-det divergences for symmetric positive definite (SPD) matrices and discuss their fundamental properties. We show how to generate from parameterized Alpha-Beta (AB) and Gamma log-det divergences many well-known divergences, for example, Stein's loss, the S-divergence, also called the Jensen-Bregman LogDet (JBLD) divergence, the Logdet Zero (Bhattacharyya) divergence, and the Affine Invariant Riemannian Metric (AIRM), as well as some new divergences. Moreover, we establish links and correspondences among many log-det divergences and display them on the alpha-beta plane for various sets of parameters. Furthermore, this paper bridges these divergences and also shows their links to divergences of multivariate and multiway Gaussian distributions. Closed-form formulas are derived for Gamma divergences of two multivariate Gaussian densities, including as special cases the Kullback-Leibler, Bhattacharyya, Rényi and Cauchy-Schwarz divergences. Symmetrized versions of the log-det divergences are also discussed and reviewed. A class of divergences is extended to multiway divergences for separable covariance (precision) matrices.

• $\alpha$-Divergence Is Unique, Belonging to Both $f$-Divergence and Bregman Divergence Classes
IEEE Transactions on Information Theory, 2009
Co-Authors: Shun-ichi Amari
Abstract:

A divergence measure between two probability distributions or positive arrays (positive measures) is a useful tool for solving problems in optimization, signal processing, machine learning, and statistical inference. The Csiszár f-divergence is a unique class of divergences having information monotonicity, from which the dual alpha geometrical structure with the Fisher metric is derived. The Bregman divergence is another class of divergences that gives a dually flat geometrical structure, different in general from the alpha-structure. Csiszár gave an axiomatic characterization of divergences related to inference problems. The Kullback-Leibler divergence is proved to belong to both classes, and it is the only such divergence in the space of probability distributions. This paper proves that the alpha-divergences constitute a unique class belonging to both classes when the space of positive measures or positive arrays is considered. They are the canonical divergences derived from the dually flat geometrical structure of the space of positive measures.
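Concretely, for positive arrays the alpha-divergence has the closed form D_α(p‖q) = (1/(α(1−α))) Σᵢ [α pᵢ + (1−α) qᵢ − pᵢ^α qᵢ^(1−α)], recovering the generalized Kullback-Leibler divergence as α → 1. A small sketch (our own, using this standard Amari parameterization):

```python
import numpy as np

def alpha_divergence(p, q, a):
    """Alpha-divergence between positive arrays (need not sum to one)."""
    return np.sum(a * p + (1 - a) * q - p**a * q**(1 - a)) / (a * (1 - a))

def generalized_kl(p, q):
    """Extended KL divergence for positive measures (the alpha -> 1 limit)."""
    return np.sum(p * np.log(p / q) - p + q)

p = np.array([0.2, 0.5, 0.3, 0.4])   # positive arrays, not normalized
q = np.array([0.3, 0.3, 0.2, 0.1])

assert alpha_divergence(p, q, 0.5) > 0
assert abs(alpha_divergence(p, p, 0.5)) < 1e-12
# near alpha = 1, the value approaches the generalized KL divergence
assert abs(alpha_divergence(p, q, 0.999) - generalized_kl(p, q)) < 1e-2
```

On normalized distributions the extra −p + q terms cancel and the limit reduces to the ordinary KL divergence, matching the uniqueness statement above.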