Posterior Probability

The experts below are selected from a list of 52,113 experts worldwide, ranked by the ideXlab platform

Bruce Rannala - One of the best experts on this subject based on the ideXlab platform.

  • Branch-length prior influences Bayesian posterior probability of phylogeny
    Systematic Biology, 2005
    Co-Authors: Bruce Rannala
    Abstract:

    The Bayesian method for estimating species phylogenies from molecular sequence data provides an attractive alternative to maximum likelihood with nonparametric bootstrap, owing to the easy interpretation of posterior probabilities for trees and the availability of efficient computational algorithms. However, for many data sets it produces extremely high posterior probabilities, sometimes for apparently incorrect clades. Here we use both computer simulation and empirical data analysis to examine the effect of the prior model for internal branch lengths. We found that posterior probabilities for trees and clades are sensitive to the prior for internal branch lengths, and priors assuming long internal branches cause high posterior probabilities for trees. In particular, uniform priors with high upper bounds bias Bayesian clade probabilities in favor of extreme values. We discuss possible remedies to the problem, including empirical and full Bayesian methods and subjective procedures suggested in Bayesian hypothesis testing. Our results also suggest that the bootstrap proportion and the Bayesian posterior probability are different measures of accuracy, and that the bootstrap proportion, if interpreted as the probability that the clade is true, can be either too liberal or too conservative.
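
    The prior sensitivity described above can be sketched numerically. The toy below is not the paper's phylogenetic model: it uses two invented likelihood curves in a single internal branch length t and an exponential prior, and shows that changing the prior mean for t changes the posterior probability of a "tree".

```python
import numpy as np

def posterior_tree1(prior_mean, n=200_000, t_max=10.0):
    """P(tree1 | data) for two candidate trees whose likelihoods depend on
    an internal branch length t with an exponential(prior_mean) prior."""
    dt = t_max / n
    t = (np.arange(n) + 0.5) * dt                     # midpoint grid on [0, t_max]
    prior = np.exp(-t / prior_mean) / prior_mean      # exponential prior density
    lik1 = np.exp(-0.5 * ((t - 0.1) / 0.05) ** 2)     # invented: peaked, short branch
    lik2 = np.exp(-0.5 * ((t - 0.3) / 0.2) ** 2)      # invented: broader, longer branch
    m1 = np.sum(prior * lik1) * dt                    # marginal likelihood of tree 1
    m2 = np.sum(prior * lik2) * dt                    # marginal likelihood of tree 2
    return m1 / (m1 + m2)

for mu in (0.05, 1.0):
    print(f"prior mean {mu:4.2f}: P(tree1 | data) = {posterior_tree1(mu):.3f}")
```

    A prior favoring short internal branches supports tree 1 here, while a prior favoring long branches shifts posterior mass to tree 2, even though the data (the likelihood curves) are unchanged.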

  • Frequentist properties of Bayesian posterior probabilities of phylogenetic trees under simple and complex substitution models
    Systematic Biology, 2004
    Co-Authors: John P Huelsenbeck, Bruce Rannala
    Abstract:

    What does the posterior probability of a phylogenetic tree mean? This simulation study shows that Bayesian posterior probabilities have the meaning that is typically ascribed to them; the posterior probability of a tree is the probability that the tree is correct, assuming that the model is correct. At the same time, the Bayesian method can be sensitive to model misspecification, and the sensitivity of the Bayesian method appears to be greater than the sensitivity of the nonparametric bootstrap method (using maximum likelihood to estimate trees). Although the estimates of phylogeny obtained by maximum likelihood or by the Bayesian method are likely to be similar, the assessment of the uncertainty of inferred trees via either bootstrapping (for maximum likelihood estimates) or posterior probabilities (for Bayesian estimates) is not likely to be the same. We suggest that the Bayesian method be implemented with the most complex of the models currently available, as this should reduce the chance that the method will concentrate too much probability on too few trees. (Bayesian estimation; Markov chain Monte Carlo; posterior probability; prior probability.)
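
    The calibration property claimed above ("a posterior probability of p means the tree is correct with probability p, when the model is correct") can be checked in miniature with binary hypotheses instead of trees. All numbers below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two candidate generating models for a coin: H0 (bias 0.6) vs H1 (bias 0.4),
# with equal prior probability. Truth is sampled from the prior, so the
# inference model matches the generating model.
biases = np.array([0.6, 0.4])
n_rep, n_flips = 20_000, 20

truth = rng.integers(0, 2, n_rep)              # which hypothesis generated each run
heads = rng.binomial(n_flips, biases[truth])   # simulated data

# Posterior P(H0 | heads) by Bayes' rule with equal priors.
l0 = biases[0] ** heads * (1 - biases[0]) ** (n_flips - heads)
l1 = biases[1] ** heads * (1 - biases[1]) ** (n_flips - heads)
post0 = l0 / (l0 + l1)

# Bin runs by posterior and compare with the empirical frequency that H0 is true.
for lo in (0.6, 0.8):
    mask = (post0 >= lo) & (post0 < lo + 0.2)
    print(f"posterior in [{lo:.1f},{lo + 0.2:.1f}): "
          f"mean posterior {post0[mask].mean():.3f}, "
          f"freq H0 true {(truth[mask] == 0).mean():.3f}")
```

    The two columns agree closely, which is the frequentist property the paper verifies for trees; under model misspecification this agreement breaks down.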

  • Probability distribution of molecular evolutionary trees: a new method of phylogenetic inference
    Journal of Molecular Evolution, 1996
    Co-Authors: Bruce Rannala, Ziheng Yang
    Abstract:

    A new method is presented for inferring evolutionary trees using nucleotide sequence data. The birth-death process is used as a model of speciation and extinction to specify the prior distribution of phylogenies and branching times. Nucleotide substitution is modeled by a continuous-time Markov process. Parameters of the branching model and the substitution model are estimated by maximum likelihood. The posterior probabilities of different phylogenies are calculated, and the phylogeny with the highest posterior probability is chosen as the best estimate of the evolutionary relationship among species. We refer to this as the maximum posterior probability (MAP) tree. The posterior probability provides a natural measure of the reliability of the estimated phylogeny. Two example data sets are analyzed to infer the phylogenetic relationship of human, chimpanzee, gorilla, and orangutan. The best trees estimated by the new method are the same as those from the maximum likelihood analysis of separate topologies, but the posterior probabilities are quite different from the bootstrap proportions. The results of the method are found to be insensitive to changes in the rate parameter of the branching process.
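
    The MAP-tree idea reduces to Bayes' rule over a finite set of candidate topologies: posterior is prior times likelihood, renormalized, and the estimate is the topology with the highest posterior probability. The log-likelihoods and the (here uniform, rather than birth-death) prior below are placeholder numbers, not computed from sequence data.

```python
import numpy as np

log_lik = np.array([-1023.4, -1025.1, -1030.2])   # one per candidate topology (invented)
prior = np.array([1 / 3, 1 / 3, 1 / 3])            # prior over topologies, here uniform

log_post = np.log(prior) + log_lik
log_post -= log_post.max()                          # stabilize before exponentiating
post = np.exp(log_post) / np.exp(log_post).sum()    # posterior probabilities

map_tree = int(np.argmax(post))                     # the MAP tree
print("posterior probabilities:", np.round(post, 4))
print("MAP tree index:", map_tree)
```

    Working in log space before normalizing is essential in practice, since sequence log-likelihoods are far too negative to exponentiate directly.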

Tan Lee - One of the best experts on this subject based on the ideXlab platform.

  • Tone-enhanced generalized character posterior probability (GCPP) for Cantonese LVCSR
    Computer Speech & Language, 2008
    Co-Authors: Yao Qian, Frank K Soong, Tan Lee
    Abstract:

    Tone-enhanced generalized character posterior probability (GCPP), a generalized form of posterior probability at the subword (Chinese character) level, is proposed as a rescoring metric for improving Cantonese LVCSR performance. GCPP is computed from the tone score together with the corresponding acoustic and language model scores. The tone score is output by a supra-tone model, which characterizes not only the tone contour of a single syllable but also that of adjacent ones, and significantly outperforms other conventional tone models. The search network is constructed by converting the original word graph first to a restructured word graph, then to a character graph and, finally, to a character confusion network (CCN). Based upon tone-enhanced GCPP, the character error rate (CER) is minimized or the GCPP product is maximized over a chosen graph. Experimental results show that tone-enhanced GCPP can reduce the character error rate by up to 15.1% relative.

  • Tone-enhanced generalized character posterior probability (GCPP) for Cantonese LVCSR
    International Conference on Acoustics Speech and Signal Processing, 2006
    Co-Authors: Yao Qian, Frank K Soong, Tan Lee
    Abstract:

    Tone-enhanced, generalized character posterior probability (GCPP), a generalized form of posterior probability at the subword (Chinese character) level, is proposed as a rescoring metric for improving Cantonese LVCSR performance. The search network is constructed by converting the original word graph first to a restructured word graph, then to a character graph and, finally, to a character confusion network (CCN). Based upon GCPP enhanced with tone information, the character error rate (CER) is minimized or the GCPP product is maximized over a chosen graph. Experimental results show that tone-enhanced GCPP can reduce the character error rate by up to 15.1% relative.

Frank K Soong - One of the best experts on this subject based on the ideXlab platform.

  • Tone-enhanced generalized character posterior probability (GCPP) for Cantonese LVCSR
    Computer Speech & Language, 2008
    Co-Authors: Yao Qian, Frank K Soong, Tan Lee
    Abstract:

    Tone-enhanced generalized character posterior probability (GCPP), a generalized form of posterior probability at the subword (Chinese character) level, is proposed as a rescoring metric for improving Cantonese LVCSR performance. GCPP is computed from the tone score together with the corresponding acoustic and language model scores. The tone score is output by a supra-tone model, which characterizes not only the tone contour of a single syllable but also that of adjacent ones, and significantly outperforms other conventional tone models. The search network is constructed by converting the original word graph first to a restructured word graph, then to a character graph and, finally, to a character confusion network (CCN). Based upon tone-enhanced GCPP, the character error rate (CER) is minimized or the GCPP product is maximized over a chosen graph. Experimental results show that tone-enhanced GCPP can reduce the character error rate by up to 15.1% relative.

  • Generalized segment posterior probability for automatic Mandarin pronunciation evaluation
    International Conference on Acoustics Speech and Signal Processing, 2007
    Co-Authors: Jing Zheng, Chao Huang, Mi Chu, Frank K Soong
    Abstract:

    In this paper, we investigate an automatic pronunciation evaluation method for native Mandarin. A multi-space distribution (MSD) hidden Markov model (HMM) is adopted to train the gold-standard model. Machine scores derived from the generalized segment posterior probability at both the syllable and phone level are proposed and investigated to measure the goodness of pronunciation (GOP). They are evaluated on an internally collected database and show better performance than other well-known methods. In addition, detailed analyses of human scoring, such as inter/intra-rater agreement at the utterance/speaker level, are also given.
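
    A minimal sketch of a goodness-of-pronunciation score in the spirit of segment posterior probabilities: the score of the canonical phone is its acoustic log-likelihood for the segment, normalized over all competing phones. The phone set and log-likelihoods below are made up for illustration.

```python
import math

def gop(canonical, seg_loglik):
    """Log posterior of the canonical phone for one segment.
    seg_loglik: {phone: acoustic log-likelihood for this segment}."""
    m = max(seg_loglik.values())                          # for numerical stability
    z = sum(math.exp(v - m) for v in seg_loglik.values())
    return (seg_loglik[canonical] - m) - math.log(z)

seg = {"a": -42.0, "o": -44.5, "e": -47.0}    # hypothetical per-phone log-likelihoods
print(f"GOP('a') = {gop('a', seg):.3f}")       # near 0: segment matches canonical phone
print(f"GOP('o') = {gop('o', seg):.3f}")       # strongly negative: suspect pronunciation
```

    A GOP near zero means the canonical phone dominates its competitors; a strongly negative GOP flags the segment as possibly mispronounced.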

  • Tone-enhanced generalized character posterior probability (GCPP) for Cantonese LVCSR
    International Conference on Acoustics Speech and Signal Processing, 2006
    Co-Authors: Yao Qian, Frank K Soong, Tan Lee
    Abstract:

    Tone-enhanced, generalized character posterior probability (GCPP), a generalized form of posterior probability at the subword (Chinese character) level, is proposed as a rescoring metric for improving Cantonese LVCSR performance. The search network is constructed by converting the original word graph first to a restructured word graph, then to a character graph and, finally, to a character confusion network (CCN). Based upon GCPP enhanced with tone information, the character error rate (CER) is minimized or the GCPP product is maximized over a chosen graph. Experimental results show that tone-enhanced GCPP can reduce the character error rate by up to 15.1% relative.

  • Phonetic transcription verification with generalized posterior probability
    Conference of the International Speech Communication Association, 2005
    Co-Authors: Lijuan Wang, Frank K Soong, Yong Zhao, Min Chu, Zhigang Cao
    Abstract:

    Accurate phonetic transcription is critical to high-quality concatenation-based text-to-speech synthesis. In this paper, we propose to use the generalized syllable posterior probability (GSPP) as a statistical confidence measure to verify errors in the phonetic transcriptions of a TTS speech database, such as reading errors, inadequate pronunciation alternatives in the lexicon, letter-to-sound errors in transcribing out-of-vocabulary words, and idiosyncratic pronunciations. GSPP is computed from a syllable graph generated by a recognition decoder. Tested on two data sets, the proposed GSPP is shown to be effective in locating phonetic transcription errors: equal error rates (EERs) of 8.2% and 8.4% are obtained on the two test sets, respectively. It is also found that GSPP verification performance is fairly stable over a wide range around the optimal value of the acoustic model exponential weight used in computing GSPP.
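
    The equal error rate (EER) reported above is the operating point at which the false-accept and false-reject rates of a threshold-based verifier coincide. The sketch below computes it on synthetic confidence scores (the score distributions are invented, not the paper's data).

```python
import numpy as np

def eer(scores_correct, scores_error):
    """EER for accepting transcriptions whose confidence exceeds a threshold."""
    thresholds = np.sort(np.concatenate([scores_correct, scores_error]))
    far = np.array([(scores_error >= t).mean() for t in thresholds])   # accept bad
    frr = np.array([(scores_correct < t).mean() for t in thresholds])  # reject good
    i = np.argmin(np.abs(far - frr))          # threshold where the two rates cross
    return (far[i] + frr[i]) / 2

rng = np.random.default_rng(1)
good = rng.normal(0.8, 0.10, 1000)   # GSPP-like scores of correct transcriptions
bad = rng.normal(0.5, 0.15, 1000)    # scores of erroneous transcriptions
print(f"EER = {eer(good, bad):.3f}")
```

    The better the score separates correct from erroneous transcriptions, the lower the EER; a single number makes systems easy to compare without fixing a threshold.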

William D Penny - One of the best experts on this subject based on the ideXlab platform.

  • Efficient posterior probability mapping using Savage-Dickey ratios
    PLOS ONE, 2013
    Co-Authors: William D Penny, Gerard R Ridgway
    Abstract:

    Statistical parametric mapping (SPM) is the dominant paradigm for mass-univariate analysis of neuroimaging data. More recently, a Bayesian approach termed posterior probability mapping (PPM) has been proposed as an alternative. PPM offers two advantages: (i) inferences can be made about effect size, thus lending a precise physiological meaning to activated regions; (ii) regions can be declared inactive. This latter facility is most parsimoniously provided by PPMs based on Bayesian model comparisons. To date, these comparisons have been implemented by an independent model optimization (IMO) procedure, which separately fits null and alternative models. This paper proposes a more computationally efficient procedure based on Savage-Dickey approximations to the Bayes factor and Taylor-series approximations to the voxel-wise posterior covariance matrices. Simulations show the accuracy of this Savage-Dickey-Taylor (SDT) method to be comparable to that of IMO. Results on fMRI data show excellent agreement between SDT and IMO for second-level models, and reasonable agreement for first-level models. This Savage-Dickey test is a Bayesian analogue of the classical SPM-F and allows users to implement model comparison in a truly interactive manner.
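
    The Savage-Dickey ratio is easiest to see in a conjugate one-parameter case, not the paper's voxel-wise models: for nested hypotheses (H0: theta = 0 inside H1: theta ~ Normal(0, tau^2)) with a normal likelihood, the Bayes factor in favor of H0 is the posterior density at zero divided by the prior density at zero. All numbers below are illustrative.

```python
import math

def normal_pdf(x, mean, var):
    return math.exp(-0.5 * (x - mean) ** 2 / var) / math.sqrt(2 * math.pi * var)

def savage_dickey_bf01(ybar, n, sigma2, tau2):
    """BF01 for n observations with mean ybar, known variance sigma2,
    and prior theta ~ Normal(0, tau2)."""
    post_var = 1 / (n / sigma2 + 1 / tau2)      # conjugate normal update
    post_mean = post_var * (n * ybar / sigma2)
    # Savage-Dickey: posterior density at 0 over prior density at 0.
    return normal_pdf(0.0, post_mean, post_var) / normal_pdf(0.0, 0.0, tau2)

print(f"BF01 (weak effect)   = {savage_dickey_bf01(0.05, 20, 1.0, 1.0):.3f}")
print(f"BF01 (strong effect) = {savage_dickey_bf01(0.80, 20, 1.0, 1.0):.3f}")
```

    A single model fit yields the Bayes factor, which is the computational saving over fitting null and alternative models separately as in IMO.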

  • Posterior probability maps
    In: Statistical Parametric Mapping: The Analysis of Functional Brain Images (pp. 295-302), 2007
    Co-Authors: Karl J Friston, William D Penny
    Abstract:

    This chapter describes the construction of posterior probability maps that enable conditional or Bayesian inferences about regionally specific effects in neuroimaging. Posterior probability maps (PPMs) are images of the probability or confidence that an activation exceeds some specified threshold, given the data. PPMs represent a complementary alternative to statistical parametric maps (SPMs), which are used to make classical inferences. However, a key problem in Bayesian inference is the specification of appropriate priors. This problem can be finessed using empirical Bayes, in which prior variances are estimated from the data under some simple assumptions about their form. Empirical Bayes requires a hierarchical observation model, in which higher levels can be regarded as providing prior constraints on lower levels. In neuroimaging, observations of the same effect over voxels provide a natural two-level hierarchy that enables an empirical Bayesian approach. In this section, we present the motivation and the operational details of a simple empirical Bayesian method for computing posterior probability maps. We then compare Bayesian and classical inference through equivalent PPMs and SPMs testing for the same effect in the same data. The approach adopted here is a natural extension of the parametric empirical Bayes described in the previous chapter. The resulting model entails global shrinkage priors to inform the estimation of effects at each voxel or bin in the image. These global priors can be regarded as a special case of the spatial priors in the more general spatiotemporal models for functional magnetic resonance imaging (fMRI) introduced in Chapter 25. To date, inference in neuroimaging has been restricted largely to classical inferences based upon statistical parametric maps (SPMs). The alternative approach is to use Bayesian or conditional inference based upon the posterior distribution of the activation given the data (Holmes and Ford, 1993). This necessitates the specification of priors (i.e., the probability distribution of the activation). Bayesian inference requires the posterior distribution and therefore rests upon a posterior density analysis. A useful way to summarize this posterior density is to compute the probability that the activation exceeds some threshold; this computation represents a Bayesian inference about the effect in relation to the specified threshold. We now describe an approach to computing posterior probability maps for activation effects or, more generally, treatment effects in imaging data sequences. This approach represents the simplest and most computationally expedient way of constructing PPMs.
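
    The core PPM computation at a single voxel can be sketched directly: given a (here assumed normal) posterior for the effect size, the map value is the probability that the effect exceeds a chosen threshold gamma. The posterior means, standard deviation and threshold below are illustrative.

```python
import math

def ppm_value(post_mean, post_sd, gamma):
    """P(effect > gamma | data) under a Normal(post_mean, post_sd^2) posterior."""
    z = (gamma - post_mean) / post_sd
    return 0.5 * math.erfc(z / math.sqrt(2))   # 1 - Phi(z), the upper tail

print(f"{ppm_value(1.2, 0.4, 0.5):.3f}")   # high probability: likely activated voxel
print(f"{ppm_value(0.1, 0.4, 0.5):.3f}")   # low probability: can be declared inactive
```

    Unlike a classical p-value, a low PPM value is itself informative: it supports declaring the voxel inactive at the chosen effect-size threshold.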

  • Posterior probability maps and SPMs
    NeuroImage, 2003
    Co-Authors: Karl J Friston, William D Penny
    Abstract:

    This technical note describes the construction of posterior probability maps that enable conditional or Bayesian inferences about regionally specific effects in neuroimaging. Posterior probability maps (PPMs) are images of the probability or confidence that an activation exceeds some specified threshold, given the data. PPMs represent a complementary alternative to statistical parametric maps (SPMs), which are used to make classical inferences. However, a key problem in Bayesian inference is the specification of appropriate priors. This problem can be finessed using empirical Bayes, in which prior variances are estimated from the data under some simple assumptions about their form. Empirical Bayes requires a hierarchical observation model, in which higher levels can be regarded as providing prior constraints on lower levels. In neuroimaging, observations of the same effect over voxels provide a natural two-level hierarchy that enables an empirical Bayesian approach. In this note, we present a brief motivation and the operational details of a simple empirical Bayesian method for computing posterior probability maps. We then compare Bayesian and classical inference through equivalent PPMs and SPMs testing for the same effect in the same data.

Ziheng Yang - One of the best experts on this subject based on the ideXlab platform.

  • Probability distribution of molecular evolutionary trees: a new method of phylogenetic inference
    Journal of Molecular Evolution, 1996
    Co-Authors: Bruce Rannala, Ziheng Yang
    Abstract:

    A new method is presented for inferring evolutionary trees using nucleotide sequence data. The birth-death process is used as a model of speciation and extinction to specify the prior distribution of phylogenies and branching times. Nucleotide substitution is modeled by a continuous-time Markov process. Parameters of the branching model and the substitution model are estimated by maximum likelihood. The posterior probabilities of different phylogenies are calculated, and the phylogeny with the highest posterior probability is chosen as the best estimate of the evolutionary relationship among species. We refer to this as the maximum posterior probability (MAP) tree. The posterior probability provides a natural measure of the reliability of the estimated phylogeny. Two example data sets are analyzed to infer the phylogenetic relationship of human, chimpanzee, gorilla, and orangutan. The best trees estimated by the new method are the same as those from the maximum likelihood analysis of separate topologies, but the posterior probabilities are quite different from the bootstrap proportions. The results of the method are found to be insensitive to changes in the rate parameter of the branching process.