Bayesian Model Selection

The experts below are selected from a list of 40,527 experts worldwide, ranked by the ideXlab platform.

Alan J. Weinstein - One of the best experts on this subject based on the ideXlab platform.

  • Template-based Gravitational-Wave Echoes Search Using Bayesian Model Selection
    Physical Review D, 2019
    Co-Authors: Alan J. Weinstein
    Abstract:

    The ringdown of the gravitational-wave signal from a merger of two black holes has been suggested as a probe of the structure of the remnant compact object, which may be more exotic than a black hole. It has been pointed out that there will be a train of echoes in the late-time ringdown stage for different types of exotic compact objects. In this paper, we present a template-based search methodology using Bayesian statistics to search for echoes of gravitational waves. Evidence for the presence or absence of echoes in gravitational-wave events can be established by performing Bayesian model selection. The Occam factor in Bayesian model selection automatically penalizes the more complicated model, in which echoes are present in the gravitational-wave strain data, because of its greater freedom to fit the data. We find that the search methodology was able to identify gravitational-wave echoes with Abedi et al.'s echoes waveform model about 82.3% of the time in simulated Gaussian noise in the Advanced LIGO and Virgo network, and about 61.1% of the time in real noise from the first observing run of Advanced LIGO, with $\geq 5\sigma$ significance. Analyses using this method are performed on the data of Advanced LIGO's first observing run, and we find no statistically significant evidence for the detection of gravitational-wave echoes. In particular, we find $<1\sigma$ combined evidence for the three events in Advanced LIGO's first observing run. The analysis technique developed in this paper is independent of the waveform model used, and can be used with different parametrized echoes waveform models to provide more realistic evidence of the existence of echoes from exotic compact objects.
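The Occam-factor mechanism described in this abstract can be illustrated with a toy sketch (not the paper's pipeline): a "noise only" model is compared against a "noise + echo" model whose amplitude is marginalized over a uniform prior, so the prior width itself supplies the complexity penalty. The damped-cosine template and all parameter values below are invented for illustration.

```python
import math
import random

def loglik(y, s, amp):
    """Gaussian log-likelihood of data y given template s scaled by amp
    (unit noise variance assumed)."""
    return sum(-0.5 * (yi - amp * si) ** 2 - 0.5 * math.log(2 * math.pi)
               for yi, si in zip(y, s))

def log_evidence_echo(y, s, a_max=5.0, n_grid=500):
    """Log evidence for the 'echo present' model: the amplitude is
    marginalized over a Uniform(0, a_max) prior on a grid; the 1/a_max
    prior width is exactly the Occam penalty for the extra parameter."""
    step = a_max / (n_grid - 1)
    logs = [loglik(y, s, i * step) for i in range(n_grid)]
    m = max(logs)                                      # log-sum-exp trick
    integral = sum(math.exp(l - m) for l in logs) * step
    return m + math.log(integral) - math.log(a_max)

random.seed(0)
s = [math.exp(-0.3 * t) * math.cos(t) for t in range(20)]  # toy damped-cosine "echo"
noise = [random.gauss(0, 1) for _ in range(20)]

for label, y in [("noise only", noise),
                 ("noise + echo", [n + 2.0 * si for n, si in zip(noise, s)])]:
    log_bf = log_evidence_echo(y, s) - loglik(y, s, 0.0)
    print(label, "-> log Bayes factor (echo vs noise):", round(log_bf, 2))
```

When the data contain no echo, the marginalization over amplitude buys little extra fit but still pays the prior-width penalty, so the log Bayes factor drops relative to the injected-echo case.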

Luis R Pericchi - One of the best experts on this subject based on the ideXlab platform.

  • Intrinsic Priors for Objective Bayesian Model Selection
    Advances in Econometrics, 2014
    Co-Authors: Elías Moreno, Luis R Pericchi
    Abstract:

    We put forward the idea that intrinsic priors have become the center of a cluster of closely related methodologies for objective Bayesian model selection. The intrinsic method and its applications have been developed over the last two decades and have stimulated closely related methods. The intrinsic methodology can be thought of as the long-sought approach to objective Bayesian model selection and hypothesis testing. In this paper we review the foundations of intrinsic priors, their general properties, and some of their applications.

  • Training samples in objective Bayesian Model Selection
    The Annals of Statistics, 2004
    Co-Authors: James O. Berger, Luis R Pericchi
    Abstract:

    Central to several objective approaches to Bayesian model selection is the use of training samples (subsets of the data), so as to allow utilization of improper objective priors. The most common prescription for choosing training samples is to choose them to be as small as possible, subject to yielding proper posteriors; these are called minimal training samples. When data can vary widely in terms of either information content or impact on the improper priors, use of minimal training samples can be inadequate. Important examples include certain cases of discrete data, the presence of censored observations, and certain situations involving linear models and explanatory variables. Such situations require more sophisticated methods of choosing training samples. A variety of such methods are developed in this paper, and successfully applied in challenging situations.
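For a concrete feel for minimal training samples, here is a toy sketch in the textbook case only (assumptions: unit-variance normal data, flat improper prior on the mean, so a single observation is a minimal training sample), not the paper's general constructions. The arithmetic intrinsic Bayes factor multiplies the full-data Bayes factor by the average of the inverse training-sample Bayes factors, which cancels the arbitrary constant in the improper prior.

```python
import math

def ibf_normal_mean(x):
    """Arithmetic intrinsic Bayes factor for H1: mu free vs H0: mu = 0,
    with unit variance and improper flat prior on mu under H1.
    Minimal training samples are single observations, since one
    observation already makes the posterior for mu proper."""
    n = len(x)
    xbar = sum(x) / n
    # full-data Bayes factor B10 under the flat prior (constant set to 1):
    # B10 = sqrt(2*pi/n) * exp(n * xbar^2 / 2)
    log_b10_full = 0.5 * math.log(2 * math.pi / n) + 0.5 * n * xbar ** 2
    # correction: average over minimal training samples of
    # B01(x_l) = N(x_l; 0, 1), the inverse single-point Bayes factor
    avg_b01_train = sum(math.exp(-0.5 * xl ** 2) / math.sqrt(2 * math.pi)
                        for xl in x) / n
    return math.exp(log_b10_full) * avg_b01_train

print(ibf_normal_mean([0.1, -0.2, 0.3, 0.0]))   # near-zero mean
print(ibf_normal_mean([2.1, 1.8, 2.4, 2.0]))    # clear shift away from zero
```

Data scattered near zero leave the intrinsic Bayes factor below 1 (favoring H0), while a clearly shifted sample drives it well above 1.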

Jayashree Subrahmonia - One of the best experts on this subject based on the ideXlab platform.

  • A Bayesian Model Selection Criterion for HMM Topology Optimization
    International Conference on Acoustics Speech and Signal Processing, 2002
    Co-Authors: Alain Biem, Jin-young Ha, Jayashree Subrahmonia
    Abstract:

    This paper addresses the problem of estimating the optimal hidden Markov model (HMM) topology. The optimal topology is defined as the one that gives the smallest error rate with the minimal number of parameters. The paper introduces a Bayesian model selection criterion suitable for continuous HMM topology optimization. The criterion is derived from the Laplace approximation of the posterior of a model structure, and shares the algorithmic simplicity of conventional Bayesian selection criteria, such as Schwarz's Bayesian Information Criterion (BIC). Unlike BIC, which assumes a multivariate normal prior for all parameters of the model, the proposed HMM-oriented Bayesian Information Criterion (HBIC) models each parameter with a different distribution, one more appropriate for that parameter. Results on a handwriting recognition task show that HBIC yields a much smaller and more efficient system than one generated through BIC.
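The BIC-style trade-off that HBIC refines can be sketched on a simpler relative of HMM topology selection: choosing the order of a Markov chain by penalized maximum likelihood, BIC = log L - (k/2) log n. Everything here (the alphabet, the parameter count) is a generic illustration, not the paper's HBIC.

```python
import math
from collections import Counter

def bic_markov(seq, order):
    """BIC = max log-likelihood - (k/2) log n for a Markov chain of the
    given order over the observed alphabet, where k is the number of
    free transition probabilities."""
    n = len(seq) - order
    ctx, joint = Counter(), Counter()
    for i in range(order, len(seq)):
        c = tuple(seq[i - order:i])          # conditioning context
        ctx[c] += 1
        joint[c + (seq[i],)] += 1
    # plug-in MLE transition probabilities: count / context count
    loglik = sum(cnt * math.log(cnt / ctx[key[:-1]])
                 for key, cnt in joint.items())
    alphabet = len(set(seq))
    k = (alphabet ** order) * (alphabet - 1)  # free parameters
    return loglik - 0.5 * k * math.log(n)

seq = "ab" * 14   # strict alternation: memory of one symbol explains it
print("order 0 BIC:", round(bic_markov(seq, 0), 2))
print("order 1 BIC:", round(bic_markov(seq, 1), 2))
```

For the alternating sequence, the order-1 model fits perfectly with only one extra parameter, so its BIC wins despite the larger penalty; on memoryless data the ordering reverses.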

Elías Moreno - One of the best experts on this subject based on the ideXlab platform.

  • Intrinsic Priors for Objective Bayesian Model Selection
    Advances in Econometrics, 2014
    Co-Authors: Elías Moreno, Luis R Pericchi
    Abstract:

    We put forward the idea that intrinsic priors have become the center of a cluster of closely related methodologies for objective Bayesian model selection. The intrinsic method and its applications have been developed over the last two decades and have stimulated closely related methods. The intrinsic methodology can be thought of as the long-sought approach to objective Bayesian model selection and hypothesis testing. In this paper we review the foundations of intrinsic priors, their general properties, and some of their applications.

  • Bayesian Model Selection approach to analysis of variance under heteroscedasticity
    Journal of the Royal Statistical Society: Series D (The Statistician), 2000
    Co-Authors: Francesco Bertolino, Walter Racugno, Elías Moreno
    Abstract:

    The classical Bayesian approach to analysis of variance assumes homoscedasticity and uses conventional uniform priors on the location parameters and on the logarithm of the common scale. The problem has traditionally been developed as one of estimating location parameters. We argue that this does not lead to an appropriate Bayesian solution, and we propose a solution based on a Bayesian model selection procedure. Our development is in the general heteroscedastic setting, in which no exact frequentist test exists. The Bayes factors involved use intrinsic and fractional priors in place of the usual default prior distributions, for which the Bayes factor is not well defined. The behaviour of these Bayes factors is compared with the Bayesian information criterion of Schwarz and with the frequentist asymptotic approximations of Welch and of Brown and Forsythe.
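As a rough illustration of model selection under heteroscedasticity (not the paper's intrinsic or fractional Bayes factors, which require the priors discussed above), one can compare a common-mean model against a separate-means model, each with group-specific variances, using Schwarz's BIC, the benchmark the authors compare against. The two-group layout and the fixed-point iteration are assumptions for this sketch.

```python
import math

def gauss_loglik(x, mu, var):
    n = len(x)
    return -0.5 * n * math.log(2 * math.pi * var) \
           - sum((xi - mu) ** 2 for xi in x) / (2 * var)

def bic_equal_vs_unequal_means(g1, g2):
    """BIC comparison of M0 (common mean, group-specific variances, 3
    parameters) vs M1 (separate means and variances, 4 parameters):
    a heteroscedastic two-group 'ANOVA'. Returns (bic0, bic1)."""
    n = len(g1) + len(g2)
    # M1: each group gets its own MLE mean and variance
    def mle(x):
        m = sum(x) / len(x)
        return m, sum((xi - m) ** 2 for xi in x) / len(x)
    ll1 = sum(gauss_loglik(x, *mle(x)) for x in (g1, g2))
    bic1 = ll1 - 0.5 * 4 * math.log(n)
    # M0: common mean has no closed-form MLE under unequal variances,
    # so iterate the weighted-mean / variance updates to a fixed point
    mu = (sum(g1) + sum(g2)) / n
    for _ in range(100):
        v1 = sum((x - mu) ** 2 for x in g1) / len(g1)
        v2 = sum((x - mu) ** 2 for x in g2) / len(g2)
        w1, w2 = len(g1) / v1, len(g2) / v2
        mu = (w1 * sum(g1) / len(g1) + w2 * sum(g2) / len(g2)) / (w1 + w2)
    ll0 = gauss_loglik(g1, mu, v1) + gauss_loglik(g2, mu, v2)
    return ll0 - 0.5 * 3 * math.log(n), bic1

g1 = [4.1, 3.9, 4.0, 4.2, 3.8]            # tight group around 4
g2 = [6.0, 2.0, 4.5, 3.5, 5.0, 3.0]       # same center, larger spread
print(bic_equal_vs_unequal_means(g1, g2))
```

With equal group centers the common-mean model wins on its smaller penalty; shifting one group makes the separate-means model's likelihood gain dominate.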

Daniel Sabanés Bové - One of the best experts on this subject based on the ideXlab platform.

  • Objective Bayesian Model Selection for Cox regression.
    Statistics in medicine, 2016
    Co-Authors: Leonhard Held, Isaac Gravestock, Daniel Sabanés Bové
    Abstract:

    There is now a large literature on objective Bayesian model selection in the linear model based on the g-prior. The methodology has recently been extended to generalized linear models using test-based Bayes factors. In this paper, we show that test-based Bayes factors can also be applied to the Cox proportional hazards model. If the goal is to select a single model, then both the maximum a posteriori and the median probability model can be calculated. For clinical prediction of survival, we shrink the model-specific log hazard ratio estimates, with subsequent calculation of the Breslow estimate of the cumulative baseline hazard function. A Bayesian model average can also be employed. We illustrate the proposed methodology with the analysis of survival data on primary biliary cirrhosis patients and the development of a clinical prediction model for future cardiovascular events based on data from the Second Manifestations of ARTerial disease (SMART) cohort study. Cross-validation is applied to compare the predictive performance with alternative model selection approaches based on Harrell's c-index, the calibration slope, and the integrated Brier score. Finally, a novel application of Bayesian variable selection to optimal conditional prediction via landmarking is described.

  • Approximate Bayesian Model Selection with the Deviance Statistic
    Statistical Science, 2015
    Co-Authors: Leonhard Held, Daniel Sabanés Bové, Isaac Gravestock
    Abstract:

    Bayesian model selection poses two main challenges: the specification of parameter priors for all models, and the computation of the resulting Bayes factors between models. There is now a large literature on automatic and objective parameter priors in the linear model. One important class are g-priors, which were recently extended from linear to generalized linear models (GLMs). We show that the resulting Bayes factors can be approximated by test-based Bayes factors (Johnson [Scand. J. Stat. 35 (2008) 354–368]) using the deviance statistics of the models. To estimate the hyperparameter g, we propose empirical and fully Bayes approaches, and we link the former to minimum Bayes factors and shrinkage estimates from the literature. Furthermore, we describe how to approximate the corresponding posterior distribution of the regression coefficients based on the standard GLM output. We illustrate the approach with the development of a clinical prediction model for 30-day survival in the GUSTO-I trial using logistic regression.
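A minimal sketch of the test-based Bayes factor idea, assuming the g-prior form BF = (1+g)^(-d/2) exp(z g / (2(1+g))) with deviance statistic z and d extra parameters, and the local empirical Bayes choice g_hat = max(z/d - 1, 0). These expressions are reproduced from memory of the approach, so the paper should be consulted for the exact formulas.

```python
import math

def tbf(z, d, g):
    """Test-based Bayes factor (model vs null) approximated from the
    deviance statistic z, with d additional parameters and g-prior
    factor g. Assumed form: (1+g)^(-d/2) * exp(z/2 * g/(1+g))."""
    return (1 + g) ** (-d / 2) * math.exp(0.5 * z * g / (1 + g))

def tbf_empirical_bayes(z, d):
    # local empirical Bayes: the g maximizing the TBF above is
    # g_hat = max(z/d - 1, 0), obtained by setting d(log BF)/dg = 0
    g_hat = max(z / d - 1.0, 0.0)
    return tbf(z, d, g_hat)

# hypothetical example: adding 2 covariates drops the deviance by 12.3
print(tbf_empirical_bayes(12.3, 2))   # BF well above 1
```

Only the deviance statistic and its degrees of freedom are needed, which is what makes the approximation attractive with standard GLM output.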