Joint Posterior Distribution

The Experts below are selected from a list of 6,633 Experts worldwide, ranked by the ideXlab platform

Minghui Chen - One of the best experts on this subject based on the ideXlab platform.

  • A new Bayesian Joint model for longitudinal count data with many zeros, intermittent missingness, and dropout, with applications to HIV prevention trials
    Statistics in Medicine, 2019
    Co-Authors: Minghui Chen, Elizabeth D Schifano, Joseph G Ibrahim, Jeffrey D Fisher
    Abstract:

    In longitudinal clinical trials, it is common that subjects may permanently withdraw from the study (dropout), or return to the study after missing one or more visits (intermittent missingness). It is also routinely encountered in HIV prevention clinical trials that there is a large proportion of zeros in count response data. In this paper, a sequential multinomial model is adopted for dropout and subsequently a conditional model is constructed for intermittent missingness. The new model captures the complex structure of missingness and incorporates dropout and intermittent missingness simultaneously. The model also allows us to easily compute the predictive probabilities of different missing data patterns. A zero-inflated Poisson mixed-effects regression model is assumed for the longitudinal count response data. We also propose an approach to assess the overall treatment effects under the zero-inflated Poisson model. We further show that the Joint Posterior Distribution is improper if uniform priors are specified for the regression coefficients under the proposed model. Variations of the g-prior, Jeffreys prior, and maximally dispersed normal prior are thus established as remedies for the improper Posterior Distribution. An efficient Gibbs sampling algorithm is developed using a hierarchical centering technique. A modified logarithm of the pseudomarginal likelihood and a concordance based area under the curve criterion are used to compare the models under different missing data mechanisms. We then conduct an extensive simulation study to investigate the empirical performance of the proposed methods and further illustrate the methods using real data from an HIV prevention clinical trial.
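    For reference, the zero-inflated Poisson response Distribution used here has the standard form (generic ZIP notation assumed; the paper's exact covariate and random-effects specification is not reproduced):

    \[
    P(Y_{ij}=0) = \pi_{ij} + (1-\pi_{ij})\,e^{-\lambda_{ij}}, \qquad
    P(Y_{ij}=y) = (1-\pi_{ij})\,\frac{e^{-\lambda_{ij}}\lambda_{ij}^{\,y}}{y!}, \quad y = 1, 2, \ldots,
    \]

    with, for example, \(\log \lambda_{ij} = x_{ij}^{\top}\beta + b_i\) and \(\operatorname{logit}(\pi_{ij}) = z_{ij}^{\top}\alpha\) in a mixed-effects regression, where \(b_i\) is a subject-level random effect.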

  • Partition Weighted Approach For Estimating the Marginal Posterior Density With Applications
    2019
    Co-Authors: Yubo Wang, Minghui Chen, Lynn Kuo, Paul O. Lewis
    Abstract:

    The computation of marginal Posterior density in Bayesian analysis is essential in that it can provide complete information about parameters of interest. Furthermore, the marginal Posterior density can be used for computing Bayes factors, Posterior model probabilities, and diagnostic measures. The conditional marginal density estimator (CMDE) is theoretically the best for marginal density estimation but requires the closed-form expression of the conditional Posterior density, which is often not available in many applications. We develop the partition weighted marginal density estimator (PWMDE) to realize the CMDE. This unbiased estimator requires only a single Markov chain Monte Carlo output from the Joint Posterior Distribution and the known unnormalized Posterior density. The theoretical properties and various applications of the PWMDE are examined in detail. The PWMDE method is also extended to the estimation of conditional Posterior densities. We carry out simulation studies to investigate the empirical performance of the PWMDE and further demonstrate the desirable features of the proposed method with two real data sets from a study of dissociative identity disorder patients and a prostate cancer study, respectively. Supplementary materials for this article are available online.
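    For context, the CMDE that the PWMDE is designed to realize rests on the identity below (standard notation assumed, with \(\theta = (\theta_1, \theta_2)\) and data \(D\)): the marginal Posterior density of \(\theta_1\) is the Posterior expectation of its full conditional, so a single MCMC run from the Joint Posterior Distribution suffices whenever that conditional is available in closed form.

    \[
    \pi(\theta_1^{*} \mid D) = \int \pi(\theta_1^{*} \mid \theta_2, D)\,\pi(\theta_2 \mid D)\,d\theta_2
    \;\approx\; \frac{1}{m}\sum_{i=1}^{m} \pi\bigl(\theta_1^{*} \mid \theta_2^{(i)}, D\bigr),
    \qquad \theta_2^{(i)} \sim \pi(\theta_2 \mid D).
    \]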

  • Bayesian modeling and inference for nonignorably missing longitudinal binary response data with applications to HIV prevention trials
    Statistica Sinica, 2018
    Co-Authors: Joseph G Ibrahim, Minghui Chen, Elizabeth D Schifano, Jeffrey D Fisher
    Abstract:

    Missing data are frequently encountered in longitudinal clinical trials. To better monitor and understand the progress over time, one must handle the missing data appropriately and examine whether the missing data mechanism is ignorable or nonignorable. In this article, we develop a new probit model for longitudinal binary response data. It resolves a challenging issue for estimating the variance of the random effects, and substantially improves the convergence and mixing of the Gibbs sampling algorithm. We show that when improper uniform priors are specified for the regression coefficients of the Joint multinomial model via a sequence of one-dimensional conditional Distributions for the missing data indicators under nonignorable missingness, the Joint Posterior Distribution is improper. A variation of Jeffreys prior is thus established as a remedy for the improper Posterior Distribution. In addition, an efficient Gibbs sampling algorithm is developed using a collapsing technique. Two model assessment criteria, the deviance information criterion (DIC) and the logarithm of the pseudomarginal likelihood (LPML), are used to guide the choices of prior specifications and to compare the models under different missing data mechanisms. We report on extensive simulations conducted to investigate the empirical performance of the proposed methods. The proposed methodology is further illustrated using data from an HIV prevention clinical trial.
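    A generic longitudinal random-effects probit model of the kind discussed here can be written as follows (illustrative notation only; the paper's particular parameterization of the random-effects variance is not shown):

    \[
    y_{ij} = \mathbf{1}\{y_{ij}^{*} > 0\}, \qquad
    y_{ij}^{*} = x_{ij}^{\top}\beta + b_i + \varepsilon_{ij}, \qquad
    b_i \sim N(0, \sigma_b^{2}), \quad \varepsilon_{ij} \sim N(0, 1),
    \]

    so that \(P(y_{ij} = 1 \mid b_i) = \Phi(x_{ij}^{\top}\beta + b_i)\).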

  • Bayesian computation: from Posterior densities to Bayes factors, marginal likelihoods and Posterior model probabilities
    Handbook of Statistics, 2005
    Co-Authors: Minghui Chen
    Abstract:

    This chapter deals with Bayesian computation. In Bayesian inference, a Joint Posterior Distribution is available through the likelihood function and a prior Distribution. One way to summarize a Posterior Distribution is to calculate and display marginal Posterior densities, because the marginal Posterior densities provide complete information about parameters of interest. This chapter summarizes the current state of the art in estimating marginal and full Posterior densities and various applications of Posterior density estimation in computing Bayes factors, marginal likelihoods, and Posterior model probabilities. It provides an up-to-date overview of various Monte Carlo methods for computing marginal or full Posterior densities, including kernel density estimation, the conditional marginal density estimator (CMDE), the importance weighted marginal density estimator (IWMDE), the Gibbs stopper approach, and an approach based on the Metropolis–Hastings output. Finally, the development of an efficient and practically useful Monte Carlo method for this problem remains a very challenging and important future project.
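    As a concrete toy illustration of the CMDE mentioned above (not code from the chapter), the sketch below estimates the marginal Posterior density of a normal mean from Gibbs output in a conjugate normal model with the usual noninformative prior \(\pi(\mu, \sigma^2) \propto 1/\sigma^2\); the data, prior, and sample sizes are assumptions made purely for the example.

```python
import numpy as np

# Toy example: normal model with unknown mean mu and variance sigma2,
# prior pi(mu, sigma2) proportional to 1/sigma2.  Full conditionals are
#   mu     | sigma2, y ~ N(ybar, sigma2/n)
#   sigma2 | mu, y     ~ Inv-Gamma(n/2, sum((y - mu)^2)/2)
rng = np.random.default_rng(0)
y = rng.normal(1.0, 2.0, size=50)
n, ybar = len(y), y.mean()

draws_s2 = []
mu, s2 = ybar, y.var()
for _ in range(5000):                       # Gibbs sampler
    mu = rng.normal(ybar, np.sqrt(s2 / n))
    s2 = 1.0 / rng.gamma(n / 2.0, 2.0 / np.sum((y - mu) ** 2))
    draws_s2.append(s2)
draws_s2 = np.array(draws_s2)

def cmde_mu(grid):
    """Conditional marginal density estimator: average the known full
    conditional N(ybar, sigma2/n) density of mu over the sampled sigma2."""
    grid = np.asarray(grid)[:, None]
    sd = np.sqrt(draws_s2 / n)[None, :]
    dens = np.exp(-0.5 * ((grid - ybar) / sd) ** 2) / (np.sqrt(2 * np.pi) * sd)
    return dens.mean(axis=1)

grid = np.linspace(ybar - 2.0, ybar + 2.0, 101)
print(cmde_mu(grid).max())   # height of the estimated marginal density at its peak
```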

Francisco Louzada - One of the best experts on this subject based on the ideXlab platform.

  • Scale mixtures log-Birnbaum–Saunders regression models with censored data: a Bayesian approach
    Journal of Statistical Computation and Simulation, 2017
    Co-Authors: Victor H. Lachos, Vicente G. Cancho, Dipak K Dey, Francisco Louzada
    Abstract:

    The main objective of this paper is to develop a full Bayesian analysis for the Birnbaum–Saunders (BS) regression model based on scale mixtures of the normal (SMN) Distribution with right-censored survival data. The BS Distributions based on SMN models are a very general approach for analysing lifetime data, which has as special cases the Student-t-BS, slash-BS and the contaminated normal-BS Distributions, being a flexible alternative to the use of the corresponding BS Distribution or any other well-known compatible model, such as the log-normal Distribution. A Gibbs sampling algorithm with Metropolis–Hastings steps is used to obtain the Bayesian estimates of the parameters. Moreover, some discussions on the model selection to compare the fitted models are given and case-deletion influence diagnostics are developed for the Joint Posterior Distribution based on the Kullback–Leibler divergence. The newly developed procedures are illustrated on a real data set previously analysed under BS regression...
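    The scale-mixture-of-normals (SMN) construction referred to here can be written generically as follows (notation assumed; the specific log-BS formulation of the paper is not reproduced): the error term is normal given a latent positive mixing variable that rescales its variance,

    \[
    \varepsilon_i \mid u_i \sim N\bigl(0, \sigma^{2}/u_i\bigr), \qquad u_i \sim H(\cdot\,; \nu),
    \]

    where taking \(u_i \sim \mathrm{Gamma}(\nu/2, \nu/2)\) gives Student-\(t\) errors, \(u_i \sim \mathrm{Beta}(\nu, 1)\) gives the slash, and a two-point Distribution for \(u_i\) gives the contaminated normal.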

  • On the Bayesian estimation and influence diagnostics for the Weibull negative binomial regression model with cure rate under latent failure causes
    Communications in Statistics-theory and Methods, 2017
    Co-Authors: Bao Yiqi, Vicente G. Cancho, Francisco Louzada
    Abstract:

    The purpose of this paper is to develop a Bayesian approach for the Weibull-Negative-Binomial regression model with cure rate under latent failure causes and the presence of randomized activation mechanisms. We assume the number of competing causes of the event of interest follows a Negative Binomial (NB) Distribution while the latent lifetimes are assumed to follow a Weibull Distribution. Markov chain Monte Carlo (MCMC) methods are used to develop the Bayesian procedure. Model selection to compare the fitted models is discussed. Moreover, we develop case deletion influence diagnostics for the Joint Posterior Distribution based on the ψ-divergence, which has several divergence measures as particular cases. The developed procedures are illustrated with a real data set.
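    The case-deletion diagnostic referred to here is commonly defined as follows (standard form of the \(\psi\)-divergence between the full-data and case-deleted Posteriors; notation assumed):

    \[
    d_{\psi}\bigl(P, P_{(i)}\bigr) = \int \psi\!\left(\frac{\pi(\theta \mid D_{(i)})}{\pi(\theta \mid D)}\right) \pi(\theta \mid D)\,d\theta,
    \]

    where \(D_{(i)}\) denotes the data with the \(i\)th case deleted; \(\psi(z) = -\log z\) yields the Kullback–Leibler divergence, \(\psi(z) = (z-1)^2\) the \(\chi^2\) divergence, and \(\psi(z) = |z-1|\) the \(L_1\) distance.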

  • The Poisson Inverse-Gaussian regression model with cure rate: a Bayesian approach and its case influence diagnostics
    Statistical Papers, 2016
    Co-Authors: Adriano K Suzuki, Vicente G. Cancho, Francisco Louzada
    Abstract:

    This paper proposes a new survival model, called the Poisson Inverse-Gaussian regression cure rate model (PIGcr), which enables different underlying activation mechanisms that lead to the event of interest. The number of competing causes of the event of interest follows a Poisson Distribution and the time to the event follows an Inverse-Gaussian Distribution. The model takes into account the presence of censored data and covariates. For inferential purposes, a Bayesian approach via Markov chain Monte Carlo was considered. Discussions on the model selection criteria, as well as case deletion influence diagnostics, are addressed for the Joint Posterior Distribution based on the \(\psi\)-divergence, which has several divergence measures as particular cases, such as the Kullback–Leibler (K–L), \(J\)-distance, \(L_1\) norm and \(\chi^2\) divergence measures. The procedures are illustrated on artificial and real data.
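    For reference, under the classical first-activation (promotion-time) mechanism with Poisson competing causes, the population survival function and cure fraction take the form below (generic notation assumed; the paper also considers other activation mechanisms):

    \[
    S_{\mathrm{pop}}(t) = \sum_{n=0}^{\infty} P(N = n)\,\{1 - F(t)\}^{n} = \exp\{-\theta F(t)\}, \qquad
    p_0 = \lim_{t \to \infty} S_{\mathrm{pop}}(t) = e^{-\theta},
    \]

    where \(N \sim \mathrm{Poisson}(\theta)\) is the number of competing causes and \(F\) is the Distribution function of the latent event times.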

  • The zero-inflated Conway–Maxwell–Poisson Distribution: Bayesian inference, regression modeling and influence diagnostic
    Statistical Methodology, 2014
    Co-Authors: Gladys D C Barriga, Francisco Louzada
    Abstract:

    In this paper we propose the zero-inflated COM-Poisson Distribution. We develop a Bayesian analysis for our model via Markov chain Monte Carlo methods. We discuss regression modeling and model selection, as well as develop case deletion influence diagnostics for the Joint Posterior Distribution based on the ψ-divergence, which has several divergence measures as particular cases, such as the Kullback–Leibler (K–L), \(J\)-distance, \(L_1\) norm and \(\chi^2\) divergence measures. The performance of our approach is illustrated with an artificial dataset as well as with a real dataset from an apple cultivar experiment.
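    For reference, the COM-Poisson mass function being zero-inflated here is (standard form; the zero-inflated version below assumes the usual mixture-with-a-point-mass-at-zero construction):

    \[
    P(Y = y \mid \lambda, \nu) = \frac{\lambda^{y}}{(y!)^{\nu}\, Z(\lambda, \nu)}, \qquad
    Z(\lambda, \nu) = \sum_{j=0}^{\infty} \frac{\lambda^{j}}{(j!)^{\nu}}, \qquad y = 0, 1, 2, \ldots,
    \]

    which reduces to the Poisson Distribution when \(\nu = 1\); zero inflation then gives \(P(Y = 0) = p + (1-p)/Z(\lambda, \nu)\) and \(P(Y = y) = (1-p)\,\lambda^{y}/\{(y!)^{\nu} Z(\lambda, \nu)\}\) for \(y \ge 1\).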

Cheng Cheng - One of the best experts on this subject based on the ideXlab platform.

  • A variational marginalized particle filter for jump Markov nonlinear systems with unknown transition probabilities
    'Elsevier BV', 2021
    Co-Authors: Cheng Cheng, Tourneret Jean-yves
    Abstract:

    This paper studies a new variational marginalized particle filter for Jointly estimating the state and the system mode parameters of jump Markov nonlinear systems. Contrary to the Markovian assumption usually considered to model the evolution of the system modes, we introduce conjugate prior Distributions for the system mode parameters. The Joint Posterior Distribution of the state and system mode parameters is then marginalized with respect to the mode variables. The remaining state vector is sampled using a sequential Monte Carlo algorithm, and the mode parameters are sampled using variational Bayesian inference. In order to obtain analytical solutions for the different variational Distributions, we use an extended factorized approximation simplifying the variational Distributions. A comprehensive simulation study is conducted to compare the performance of the proposed approach with the state-of-the-art for a modified nonlinear benchmark model and maneuvering target tracking scenarios.
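    The sequential Monte Carlo backbone that the proposed filter builds on can be illustrated with a minimal bootstrap particle filter (sketch only: the mode-parameter marginalization and the variational Bayesian updates of the paper are not reproduced, and the benchmark-style dynamics, noise levels and particle count below are illustrative assumptions).

```python
import numpy as np

# Minimal bootstrap particle filter for a generic nonlinear state-space model.
# This is only the sequential Monte Carlo backbone; the paper's variational
# marginalized treatment of the mode parameters is not reproduced here.
rng = np.random.default_rng(1)

def transition(x, t):
    # benchmark-style nonlinear dynamics (assumed for illustration)
    return 0.5 * x + 25.0 * x / (1.0 + x**2) + 8.0 * np.cos(1.2 * t)

def observe(x):
    return x**2 / 20.0

T, N = 50, 500
q_std, r_std = 1.0, 1.0                      # process / measurement noise std
x_true, y = np.zeros(T), np.zeros(T)
for t in range(1, T):
    x_true[t] = transition(x_true[t - 1], t) + q_std * rng.standard_normal()
    y[t] = observe(x_true[t]) + r_std * rng.standard_normal()

particles = rng.standard_normal(N)
x_hat = np.zeros(T)
for t in range(1, T):
    # propagate particles through the state dynamics
    particles = transition(particles, t) + q_std * rng.standard_normal(N)
    # weight by the measurement likelihood and normalize
    log_w = -0.5 * ((y[t] - observe(particles)) / r_std) ** 2
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    x_hat[t] = np.dot(w, particles)          # posterior mean estimate
    # multinomial resampling
    particles = particles[rng.choice(N, size=N, p=w)]

print("filter RMSE:", np.sqrt(np.mean((x_hat[1:] - x_true[1:]) ** 2)))
```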

  • A marginalised particle filter with variational inference for non‐linear state‐space models with Gaussian mixture noise
    'Institution of Engineering and Technology (IET)', 2021
    Co-Authors: Cheng Cheng, Tourneret Jean-yves, Lu Xiaodong
    Abstract:

    This work proposes a marginalised particle filter with variational inference for non‐linear state‐space models (SSMs) with Gaussian mixture noise. A latent variable indicating the component of the Gaussian mixture considered at each time instant is introduced to specify the measurement mode of the SSM. The resulting Joint Posterior Distribution of the state vector, the mode variable and the parameters of the Gaussian mixture noise is marginalised with respect to the noise variables. The marginalised Posterior Distribution of the state and mode is then approximated by using an appropriate marginalised particle filter. The noise parameters conditionally on each particle system of the state and mode variable are finally updated by using variational Bayesian inference. A simulation study is conducted to compare the proposed method with state‐of‐the‐art approaches in the context of positioning in urban canyons using global navigation satellite systems.

  • A Rao-Blackwellized particle filter with variational inference for state estimation with measurement model uncertainties
    'Institute of Electrical and Electronics Engineers (IEEE)', 2020
    Co-Authors: Cheng Cheng, Tourneret Jean-yves, Lu Xiaodong
    Abstract:

    This paper develops a Rao-Blackwellized particle filter with variational inference for Jointly estimating state and time-varying parameters in non-linear state-space models (SSM) with non-Gaussian measurement noise. Depending on the availability of the conjugate prior for the unknown parameters, the Joint Posterior Distribution of the state and unknown parameters is approximated by using an auxiliary particle filter with a probabilistic changepoint model. The Distribution of the SSM parameters conditionally on each particle is then updated by using variational Bayesian inference. Experiments are first conducted on a modified nonlinear benchmark model to compare the performance of the proposed approach with other state-of-the-art approaches. Finally, in the context of GNSS multipath mitigation, the proposed approach is evaluated based on data obtained from a measurement campaign conducted in a street urban canyon.

  • A Rao-Blackwellized particle filter with variational inference for state estimation with measurement model uncertainties
    IEEE Access, 2020
    Co-Authors: Cheng Cheng, Jean-yves Tourneret
    Abstract:

    This paper develops a Rao-Blackwellized particle filter with variational inference for Jointly estimating state and time-varying parameters in non-linear state-space models (SSM) with non-Gaussian measurement noise. Depending on the availability of the conjugate prior for the unknown parameters, the Joint Posterior Distribution of the state and unknown parameters is approximated by using an auxiliary particle filter with a probabilistic changepoint model. The Distribution of the SSM parameters conditionally on each particle is then updated by using variational Bayesian inference. Experiments are first conducted on a modified nonlinear benchmark model to compare the performance of the proposed approach with other state-of-the-art approaches. Finally, in the context of GNSS multipath mitigation, the proposed approach is evaluated based on data obtained from a measurement campaign conducted in a street urban canyon.

Jean-yves Tourneret - One of the best experts on this subject based on the ideXlab platform.

  • A Rao-Blackwellized particle filter with variational inference for state estimation with measurement model uncertainties
    IEEE Access, 2020
    Co-Authors: Cheng Cheng, Jean-yves Tourneret
    Abstract:

    This paper develops a Rao-Blackwellized particle filter with variational inference for Jointly estimating state and time-varying parameters in non-linear state-space models (SSM) with non-Gaussian measurement noise. Depending on the availability of the conjugate prior for the unknown parameters, the Joint Posterior Distribution of the state and unknown parameters is approximated by using an auxiliary particle filter with a probabilistic changepoint model. The Distribution of the SSM parameters conditionally on each particle is then updated by using variational Bayesian inference. Experiments are first conducted on a modified nonlinear benchmark model to compare the performance of the proposed approach with other state-of-the-art approaches. Finally, in the context of GNSS multipath mitigation, the proposed approach is evaluated based on data obtained from a measurement campaign conducted in a street urban canyon.

  • Multiband Image Fusion Based on Spectral Unmixing
    IEEE Transactions on Geoscience and Remote Sensing, 2016
    Co-Authors: Qi Wei, Jean-yves Tourneret, Nicolas Dobigeon, José M. Bioucas-dias, Marcus Chen, Simon Godsill
    Abstract:

    This paper presents a multiband image fusion algorithm based on unsupervised spectral unmixing for combining a high-spatial–low-spectral-resolution image and a low-spatial–high-spectral-resolution image. The widely used linear observation model (with additive Gaussian noise) is combined with the linear spectral mixture model to form the likelihoods of the observations. The nonnegativity and sum-to-one constraints resulting from the intrinsic physical properties of the abundances are introduced as prior information to regularize this ill-posed problem. The Joint fusion and unmixing problem is then formulated as maximizing the Joint Posterior Distribution with respect to the endmember signatures and abundance maps. This optimization problem is attacked with an alternating optimization strategy. The two resulting subproblems are convex and are solved efficiently using the alternating direction method of multipliers. Experiments are conducted for both synthetic and semi-real data. Simulation results show that the proposed unmixing-based fusion scheme improves both the abundance and endmember estimation compared with the state-of-the-art Joint fusion and unmixing algorithms.
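    The formulation described in the abstract can be sketched as follows (notation assumed for illustration: \(\mathbf{E}\) holds the endmember signatures, \(\mathbf{A}\) the abundance maps, \(\mathbf{B}\) and \(\mathbf{S}\) spatial blurring and downsampling, and \(\mathbf{R}\) the spectral response of the low-spectral-resolution sensor):

    \[
    \mathbf{Y}_{\mathrm{h}} = \mathbf{E}\mathbf{A}\mathbf{B}\mathbf{S} + \mathbf{N}_{\mathrm{h}}, \qquad
    \mathbf{Y}_{\mathrm{m}} = \mathbf{R}\mathbf{E}\mathbf{A} + \mathbf{N}_{\mathrm{m}}, \qquad
    \mathbf{A} \ge 0, \quad \mathbf{1}^{\top}\mathbf{A} = \mathbf{1}^{\top},
    \]

    and the fusion/unmixing problem maximizes the Joint Posterior Distribution of \((\mathbf{E}, \mathbf{A})\) under these two Gaussian likelihoods, alternating between convex subproblems that are each solved with the alternating direction method of multipliers.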

  • Nonlinear unmixing of hyperspectral images using a generalized bilinear model
    IEEE Signal Processing Workshop on Statistical Signal Processing, 2011
    Co-Authors: Abderrahim Halimi, Yoann Altmann, Nicolas Dobigeon, Jean-yves Tourneret
    Abstract:

    This paper studies a generalized bilinear model and a hierarchical Bayesian algorithm for unmixing hyperspectral images. The proposed model is a generalization not only of the accepted linear mixing model but also of a bilinear model recently introduced in the literature. Appropriate priors are chosen for its parameters, in particular to satisfy the positivity and sum-to-one constraints for the abundances. The Joint Posterior Distribution of the unknown parameter vector is then derived. A Metropolis-within-Gibbs algorithm is proposed, which allows samples distributed according to the Posterior of interest to be generated and the unknown model parameters to be estimated. The performance of the resulting unmixing strategy is evaluated via simulations conducted on synthetic and real data.

  • Nonlinear unmixing of hyperspectral images using a generalized bilinear model
    IEEE Transactions on Geoscience and Remote Sensing, 2011
    Co-Authors: Abderrahim Halimi, Yoann Altmann, Nicolas Dobigeon, Jean-yves Tourneret
    Abstract:

    Nonlinear models have recently shown interesting properties for spectral unmixing. This paper studies a generalized bilinear model and a hierarchical Bayesian algorithm for unmixing hyperspectral images. The proposed model is a generalization not only of the accepted linear mixing model but also of a bilinear model that has been recently introduced in the literature. Appropriate priors are chosen for its parameters to satisfy the positivity and sum-to-one constraints for the abundances. The Joint Posterior Distribution of the unknown parameter vector is then derived. Unfortunately, this Posterior is too complex to obtain analytical expressions of the standard Bayesian estimators. As a consequence, a Metropolis-within-Gibbs algorithm is proposed, which allows samples distributed according to this Posterior to be generated and to estimate the unknown model parameters. The performance of the resulting unmixing strategy is evaluated via simulations conducted on synthetic and real data.
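    The generalized bilinear model (GBM) studied in these two papers is usually written as follows (standard form from this literature; here \(\mathbf{m}_i\) are the endmember spectra, \(a_i\) the abundances, and \(\odot\) the elementwise product):

    \[
    \mathbf{y} = \sum_{i=1}^{R} a_i \mathbf{m}_i
    + \sum_{i=1}^{R-1} \sum_{j=i+1}^{R} \gamma_{ij}\, a_i a_j\, \mathbf{m}_i \odot \mathbf{m}_j + \mathbf{n},
    \qquad a_i \ge 0, \quad \sum_{i=1}^{R} a_i = 1, \quad 0 \le \gamma_{ij} \le 1,
    \]

    which reduces to the linear mixing model when all \(\gamma_{ij} = 0\) and to the earlier bilinear model when all \(\gamma_{ij} = 1\).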

Vicente G. Cancho - One of the best experts on this subject based on the ideXlab platform.

  • Estimation and influence diagnostics for zero-inflated hyper-Poisson regression model: full Bayesian analysis
    Communications in Statistics-theory and Methods, 2018
    Co-Authors: Vicente G. Cancho, Bao Yiqi, Jose A Fiorucci, Gladys D C Barriga
    Abstract:

    The purpose of this paper is to develop a Bayesian analysis for the zero-inflated hyper-Poisson model. Markov chain Monte Carlo methods are used to develop a Bayesian procedure for the model and the Bayes estimators are compared by simulation with the maximum-likelihood estimators. Regression modeling and model selection are also discussed, and case deletion influence diagnostics are developed for the Joint Posterior Distribution based on the functional Bregman divergence, which includes the ψ-divergence and several other divergence measures, such as the Itakura–Saito, Kullback–Leibler, and \(\chi^2\) divergence measures, as particular cases. The performance of our approach is illustrated with artificial data and with real data from an apple cultivation experiment.
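    The (pointwise) Bregman divergence underlying the functional version used here is defined as follows (standard definition; the functional extension to Posterior densities is not reproduced):

    \[
    d_{\phi}(x, y) = \phi(x) - \phi(y) - \phi'(y)\,(x - y),
    \]

    for a convex, differentiable \(\phi\); for instance, \(\phi(x) = x\log x\) gives a Kullback–Leibler-type divergence, \(\phi(x) = -\log x\) gives the Itakura–Saito divergence, and \(\phi(x) = x^{2}\) gives the squared distance \((x - y)^{2}\).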

  • Scale mixtures log-Birnbaum–Saunders regression models with censored data: a Bayesian approach
    Journal of Statistical Computation and Simulation, 2017
    Co-Authors: Victor H. Lachos, Vicente G. Cancho, Dipak K Dey, Francisco Louzada
    Abstract:

    The main objective of this paper is to develop a full Bayesian analysis for the Birnbaum–Saunders (BS) regression model based on scale mixtures of the normal (SMN) Distribution with right-censored survival data. The BS Distributions based on SMN models are a very general approach for analysing lifetime data, which has as special cases the Student-t-BS, slash-BS and the contaminated normal-BS Distributions, being a flexible alternative to the use of the corresponding BS Distribution or any other well-known compatible model, such as the log-normal Distribution. A Gibbs sampling algorithm with Metropolis–Hastings steps is used to obtain the Bayesian estimates of the parameters. Moreover, some discussions on the model selection to compare the fitted models are given and case-deletion influence diagnostics are developed for the Joint Posterior Distribution based on the Kullback–Leibler divergence. The newly developed procedures are illustrated on a real data set previously analysed under BS regression...

  • On the Bayesian estimation and influence diagnostics for the Weibull negative binomial regression model with cure rate under latent failure causes
    Communications in Statistics-theory and Methods, 2017
    Co-Authors: Bao Yiqi, Vicente G. Cancho, Francisco Louzada
    Abstract:

    The purpose of this paper is to develop a Bayesian approach for the Weibull-Negative-Binomial regression model with cure rate under latent failure causes and the presence of randomized activation mechanisms. We assume the number of competing causes of the event of interest follows a Negative Binomial (NB) Distribution while the latent lifetimes are assumed to follow a Weibull Distribution. Markov chain Monte Carlo (MCMC) methods are used to develop the Bayesian procedure. Model selection to compare the fitted models is discussed. Moreover, we develop case deletion influence diagnostics for the Joint Posterior Distribution based on the ψ-divergence, which has several divergence measures as particular cases. The developed procedures are illustrated with a real data set.

  • The Poisson Inverse-Gaussian regression model with cure rate: a Bayesian approach and its case influence diagnostics
    Statistical Papers, 2016
    Co-Authors: Adriano K Suzuki, Vicente G. Cancho, Francisco Louzada
    Abstract:

    This paper proposes a new survival model, called the Poisson Inverse-Gaussian regression cure rate model (PIGcr), which enables different underlying activation mechanisms that lead to the event of interest. The number of competing causes of the event of interest follows a Poisson Distribution and the time to the event follows an Inverse-Gaussian Distribution. The model takes into account the presence of censored data and covariates. For inferential purposes, a Bayesian approach via Markov chain Monte Carlo was considered. Discussions on the model selection criteria, as well as case deletion influence diagnostics, are addressed for the Joint Posterior Distribution based on the \(\psi\)-divergence, which has several divergence measures as particular cases, such as the Kullback–Leibler (K–L), \(J\)-distance, \(L_1\) norm and \(\chi^2\) divergence measures. The procedures are illustrated on artificial and real data.

  • Bayesian nonlinear regression models with scale mixtures of skew-normal Distributions: estimation and case influence diagnostics
    Computational Statistics & Data Analysis, 2011
    Co-Authors: Vicente G. Cancho, Victor H. Lachos, Marinho G. Andrade
    Abstract:

    The purpose of this paper is to develop a Bayesian analysis for nonlinear regression models under scale mixtures of skew-normal Distributions. This novel class of models provides a useful generalization of the symmetrical nonlinear regression models, since the error Distributions cover both skewness and heavy-tailed Distributions such as the skew-t, skew-slash and skew-contaminated normal Distributions. The main advantage of this class of Distributions is that it has a nice hierarchical representation that allows the implementation of Markov chain Monte Carlo (MCMC) methods to simulate samples from the Joint Posterior Distribution. In order to examine the robustness of this flexible class against outlying and influential observations, we present Bayesian case deletion influence diagnostics based on the Kullback–Leibler divergence. Further, some discussions on the model selection criteria are given. The newly developed procedures are illustrated considering two simulation studies and a real data set previously analyzed under normal and skew-normal nonlinear regression models.
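    The hierarchical (stochastic) representation referred to in the abstract is, for the skew-normal building block, commonly written as follows (standard representation; the additional scale-mixing layer that produces the skew-t, skew-slash and skew-contaminated normal cases is analogous and omitted):

    \[
    Y = \mu + \sigma\,\delta\,|T_0| + \sigma\sqrt{1 - \delta^{2}}\,T_1, \qquad
    T_0, T_1 \overset{iid}{\sim} N(0, 1), \qquad \delta = \frac{\lambda}{\sqrt{1 + \lambda^{2}}},
    \]

    so that, conditionally on the latent half-normal variable \(|T_0|\), \(Y\) is normal, which is what makes Gibbs-type MCMC updates straightforward.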