The Experts below are selected from a list of 37725 Experts worldwide ranked by the ideXlab platform
Minghui Chen - One of the best experts on this subject based on the ideXlab platform.
-
Partition Weighted Approach for Estimating the Marginal Posterior Density with Applications
Journal of Computational and Graphical Statistics, 2019. Co-Authors: Yubo Wang, Minghui Chen, Lynn Kuo, Paul O. Lewis. Abstract: The computation of the marginal posterior density in Bayesian analysis is essential in that it provides complete information about the parameters of interest. Furthermore, the marginal posterior density can be used for computing Bayes factors, posterior model probabilities, and diagnostic measures. The conditional marginal density estimator (CMDE) is theoretically the best for marginal density estimation but requires a closed-form expression of the conditional posterior density, which is often not available in many applications. We develop the partition weighted marginal density estimator (PWMDE) to realize the CMDE. This unbiased estimator requires only a single Markov chain Monte Carlo output from the joint posterior distribution and the known unnormalized posterior density. The theoretical properties and various applications of the PWMDE are examined in detail. The PWMDE method is also extended to the estimation of conditional posterior densities. We carry out simulation studies to investigate the empirical performance of the PWMDE and further demonstrate the desirable features of the proposed method with two real data sets from a study of dissociative identity disorder patients and a prostate cancer study, respectively. Supplementary materials for this article are available online.
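The CMDE idea that the PWMDE builds on (average the conditional density of the parameter of interest over MCMC draws of the remaining parameters, i.e. Rao-Blackwellization) can be illustrated on a toy model where the conditional is available in closed form. This is only a hedged sketch of that general idea, not the paper's estimator; the bivariate-normal posterior and all names below are illustrative assumptions:

```python
import numpy as np
from scipy import stats

# Toy joint "posterior": (theta1, theta2) bivariate normal with correlation rho.
# The conditional theta1 | theta2 is normal, so the CMDE applies directly:
#   pi(theta1) ~= (1/n) * sum_i pi(theta1 | theta2_i)
rng = np.random.default_rng(0)
rho = 0.8
n = 5000
cov = np.array([[1.0, rho], [rho, 1.0]])
draws = rng.multivariate_normal([0.0, 0.0], cov, size=n)
theta2 = draws[:, 1]

grid = np.linspace(-4.0, 4.0, 201)
cond_sd = np.sqrt(1.0 - rho**2)  # sd of theta1 | theta2
# Average the conditional density over the MCMC draws of theta2.
cmde = stats.norm.pdf(grid[:, None], loc=rho * theta2[None, :], scale=cond_sd).mean(axis=1)

true_marginal = stats.norm.pdf(grid)  # theta1 is marginally N(0, 1) here
```

With 5000 draws the pointwise error against the known marginal is small, which is why the CMDE is regarded as the benchmark when the conditional is available.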
-
Estimating Marginal Posterior Densities
2000. Co-Authors: Minghui Chen, Qiman Shao, Joseph G. Ibrahim. Abstract: In Bayesian inference, a joint posterior distribution is available through the likelihood function and a prior distribution. One purpose of Bayesian inference is to calculate and display marginal posterior densities, because the marginal posterior densities provide complete information about the parameters of interest. As shown in Chapter 2, a Markov chain Monte Carlo (MCMC) sampling algorithm, such as the Gibbs sampler or a Metropolis-Hastings algorithm, can be used to draw MCMC samples from the posterior distribution. Chapter 3 also demonstrates how we can easily obtain posterior quantities such as posterior means, posterior standard deviations, and other posterior quantities from MCMC samples. However, when a Bayesian model becomes complicated, it may be difficult to obtain a reliable estimator of a marginal posterior density based on the MCMC sample. A traditional method for estimating marginal posterior densities is kernel density estimation. Since the kernel density estimator is nonparametric, it may not be efficient; moreover, it may not be applicable for some complicated Bayesian models. In the context of Bayesian inference, the joint posterior density is typically known up to a normalizing constant. Using the structure of a posterior density, a number of authors (e.g., Gelfand, Smith, and Lee 1992; Johnson 1992; Chen 1993, 1994; Chen and Shao 1997c; Chib 1995; Verdinelli and Wasserman 1995) have proposed parametric marginal posterior density estimators based on the MCMC sample. In this chapter, we present several available Monte Carlo (MC) methods for computing marginal posterior density estimators, and we also discuss how well marginal posterior density estimation works using the Kullback-Leibler (K-L) divergence as a performance measure.
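The "traditional" kernel method mentioned above is easy to state concretely: place a smoothing kernel at each MCMC draw of the parameter and average. A minimal sketch, assuming Gaussian kernels and Silverman's rule-of-thumb bandwidth (the synthetic draws stand in for real MCMC output):

```python
import numpy as np

def gaussian_kde_1d(samples, grid, bandwidth=None):
    """Plain Gaussian kernel density estimate of a marginal posterior from
    MCMC draws; Silverman's rule of thumb is used if no bandwidth is given."""
    samples = np.asarray(samples, dtype=float)
    if bandwidth is None:
        bandwidth = 1.06 * samples.std() * len(samples) ** (-1 / 5)
    z = (grid[:, None] - samples[None, :]) / bandwidth
    return np.exp(-0.5 * z**2).mean(axis=1) / (bandwidth * np.sqrt(2 * np.pi))

rng = np.random.default_rng(1)
draws = rng.normal(2.0, 0.5, size=4000)  # stand-in for MCMC draws of one parameter
grid = np.linspace(0.0, 4.0, 161)
dens = gaussian_kde_1d(draws, grid)
```

The chapter's point is visible even here: the estimate is driven entirely by the draws and the bandwidth, ignoring the known (unnormalized) posterior structure that the parametric estimators exploit.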
-
Performance Study of Marginal Posterior Density Estimation via Kullback-Leibler Divergence
Test, 1997. Co-Authors: Minghui Chen, Qiman Shao. Abstract: In this article, we introduce the Kullback-Leibler (K-L) divergence as a performance measure of marginal posterior density estimation. We show that the K-L divergence can be used to compare two density estimators as well as to assess the convergence of a marginal density estimator. We also examine the performance of the importance-weighted marginal density estimator (IWMDE) proposed by Chen (1994) under the K-L divergence, and we further extend the IWMDE to some more complex Bayesian models where the kernel method, which is widely used for estimating marginal densities from Markov chain Monte Carlo (MCMC) sampling outputs, is not applicable. Finally, we use a constrained linear multiple regression model as an example to illustrate our methodology.
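The K-L divergence used as the performance measure above is straightforward to approximate on a grid. A small self-contained check, using two normals for which the divergence is known in closed form (the grid and densities are illustrative, not taken from the paper):

```python
import numpy as np

# D(p || q) = integral of p * log(p / q); for N(0,1) vs N(0.5,1) the
# closed form is 0.5**2 / 2 = 0.125, which the grid sum should reproduce.
def normal_pdf(x, mean, sd):
    return np.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

def kl_divergence(p, q, dx):
    """K-L divergence between two gridded densities, via a Riemann sum."""
    return np.sum(p * np.log(p / q)) * dx

grid = np.linspace(-8.0, 8.0, 2001)
dx = grid[1] - grid[0]
p = normal_pdf(grid, 0.0, 1.0)
q = normal_pdf(grid, 0.5, 1.0)
print(kl_divergence(p, q, dx))  # close to 0.125
```

In the article's setting, `p` would be the true (or best-available) marginal posterior and `q` a candidate estimator such as the IWMDE or a kernel estimate.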
Rahul Mukerjee - One of the best experts on this subject based on the ideXlab platform.
-
Highest Posterior Density Regions with Approximate Frequentist Validity: The Role of Data-Dependent Priors
Statistics & Probability Letters, 2010. Co-Authors: In Hong Chang, Rahul Mukerjee. Abstract: For the general multiparameter case, we consider the problem of ensuring frequentist validity of highest posterior density regions with margin of error o(n⁻¹), where n is the sample size. The role of data-dependent priors is investigated, and it is seen that the resulting probability matching condition readily admits solutions, in contrast to what happens with data-free priors. Moreover, the use of data-dependent priors is seen to be helpful even for models, such as mixture models, where closed-form expressions for the expected information elements do not exist.
-
Data-Dependent Probability Matching Priors for Highest Posterior Density and Equal-Tailed Two-Sided Regions Based on Empirical-Type Likelihoods
Journal of Statistical Planning and Inference, 2010. Co-Authors: In Hong Chang, Rahul Mukerjee. Abstract: We consider a very general class of empirical-type likelihoods which includes the usual empirical likelihood and all its major variants proposed in the literature. It is known that none of these likelihoods admits a data-free probability matching prior for the highest posterior density region. We develop the necessary higher-order asymptotics to show that, at least for the usual empirical likelihood, this difficulty can be resolved if data-dependent priors are entertained. A related problem concerning the equal-tailed two-sided posterior credible region is also investigated. A simulation study lends support to the theoretical results.
-
On Perturbed Ellipsoidal and Highest Posterior Density Regions with Approximate Frequentist Validity
Journal of the Royal Statistical Society, Series B (Methodological), 1995. Co-Authors: Jayanta K. Ghosh, Rahul Mukerjee. Abstract: This paper considers, in the multiparameter case, perturbed ellipsoidal and highest posterior density regions with both Bayesian and frequentist validity up to o(n⁻¹).
-
Frequentist Validity of Highest Posterior Density Regions in the Multiparameter Case
Annals of the Institute of Statistical Mathematics, 1993. Co-Authors: Jayanta K. Ghosh, Rahul Mukerjee. Abstract: In a multiparameter set-up, this paper characterizes priors which ensure frequentist validity, up to o(n⁻¹), of confidence regions based on the highest posterior density. The role of Jeffreys' prior in this regard is also investigated.
Arturo J Fernandez - One of the best experts on this subject based on the ideXlab platform.
-
Highest Posterior Density Estimation from Multiply Censored Pareto Data
Statistical Papers, 2007. Co-Authors: Arturo J. Fernandez. Abstract: In statistical practice, it is quite common that some data are unknown or disregarded for various reasons. In the present paper, on the basis of a multiply censored sample from a Pareto population, the problem of finding the highest posterior density (HPD) estimates of the inequality and precision parameters is discussed, assuming a natural joint conjugate prior. HPD estimates are obtained in closed forms for complete or right-censored data. In the general multiple-censoring case, the existence and uniqueness of the estimates are established, and explicit lower and upper bounds are also provided. Due to the posterior unimodality, HPD credibility regions are simply connected sets. For illustration, two numerical examples are included.
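Where no closed form exists, HPD quantities are commonly approximated from posterior draws. This is a generic sample-based sketch, not the paper's closed-form Pareto results: for a unimodal posterior, the HPD credible interval is the shortest interval carrying the requested mass, found here by scanning sorted draws.

```python
import numpy as np

def hpd_interval(samples, cred=0.95):
    """Shortest interval containing `cred` posterior mass, from sorted draws.
    For a unimodal posterior this approximates the HPD credible interval."""
    x = np.sort(np.asarray(samples, dtype=float))
    n = len(x)
    k = int(np.floor(cred * n))      # number of draws the interval must span
    widths = x[k:] - x[: n - k]      # width of every candidate interval
    i = np.argmin(widths)            # the shortest one approximates the HPD set
    return x[i], x[i + k]

rng = np.random.default_rng(2)
draws = rng.gamma(shape=3.0, scale=1.0, size=20000)  # stand-in skewed posterior
lo, hi = hpd_interval(draws, cred=0.95)
```

For a skewed posterior like this gamma, the HPD interval is visibly shorter than the equal-tailed interval, which is the usual motivation for preferring it.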
-
On Estimating Exponential Parameters with General Type-II Progressive Censoring
Journal of Statistical Planning and Inference, 2004. Co-Authors: Arturo J. Fernandez. Abstract: This article deals with the problem of estimating exponential parameters, on the basis of a general progressive Type-II censored sample, from both classical and Bayesian viewpoints. A class of natural prior densities is considered in the Bayesian setting. Even though the maximum likelihood and highest posterior density estimators do not admit closed-form expressions, explicit sharp lower and upper bounds are provided in this paper. These estimators are also found to have large-sample properties as good as those of the best linear unbiased estimator. In the Bayesian framework, the posterior density and distribution functions are derived explicitly. Assuming squared-error loss functions, Bayes estimators are obtained in closed forms. Credibility intervals and Bayes estimators under linear loss functions can readily be computed iteratively. Finally, an illustrative example is included.
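The conjugate mechanics behind such closed-form Bayes estimators can be sketched in a deliberately simplified setting. This assumes complete, uncensored data, not the paper's general progressive Type-II scheme, and the prior hyperparameters are illustrative:

```python
import numpy as np

# For exponential lifetimes with rate lam and a Gamma(a, b) prior on lam,
# the posterior is Gamma(a + n, b + T), where T is the total time on test.
# Under squared-error loss the Bayes estimator is the posterior mean.
rng = np.random.default_rng(5)
true_rate = 2.0
data = rng.exponential(scale=1.0 / true_rate, size=200)

a, b = 1.0, 1.0                       # illustrative prior hyperparameters
n, T = len(data), data.sum()
bayes_estimate = (a + n) / (b + T)    # posterior mean of the rate
```

Under censoring, T becomes the total time on test accumulated by both failed and censored units, which is what breaks the simple closed forms for the ML and HPD estimators in the general scheme the paper studies.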
-
Bayesian Inference from Type-II Doubly Censored Rayleigh Data
Statistics & Probability Letters, 2000. Co-Authors: Arturo J. Fernandez. Abstract: In this paper we present a Bayesian approach to inference in reliability studies based on Type-II doubly censored data from a Rayleigh distribution. We also consider the problem of predicting an independent future sample from the same distribution in a Bayesian setting. The results can be used to predict the failure time of a k-out-of-m system. Bayes estimators are obtained in closed forms. Highest posterior density (HPD) and maximum likelihood (ML) estimators, as well as HPD intervals, can readily be computed using iterative methods.
Benjamin D Wandelt - One of the best experts on this subject based on the ideXlab platform.
-
Solving High-Dimensional Parameter Inference: Marginal Posterior Densities and Moment Networks
arXiv: Machine Learning, 2020. Co-Authors: Niall Jeffrey, Benjamin D. Wandelt. Abstract: High-dimensional probability density estimation for inference suffers from the "curse of dimensionality". For many physical inference problems, the full posterior distribution is unwieldy and seldom used in practice. Instead, we propose direct estimation of lower-dimensional marginal distributions, bypassing high-dimensional density estimation or high-dimensional Markov chain Monte Carlo (MCMC) sampling. By evaluating the two-dimensional marginal posteriors we can unveil the full-dimensional parameter covariance structure. We additionally propose constructing a simple hierarchy of fast neural regression models, called Moment Networks, that compute increasing moments of any desired lower-dimensional marginal posterior density; these reproduce exact results from analytic posteriors and those obtained from Masked Autoregressive Flows. We demonstrate marginal posterior density estimation using high-dimensional LIGO-like gravitational wave time series and describe applications for problems of fundamental cosmology.
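The Moment Network idea can be shown in miniature: regressing the parameter on simulated data learns E[theta | x] directly, with no density estimation. Here a linear least-squares fit stands in for the neural regressor, which happens to be exact for this illustrative conjugate-normal toy model (theta ~ N(0, 1), x | theta ~ N(theta, 0.5²), true posterior mean x / 1.25 = 0.8x); everything below is an assumption for illustration, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000
theta = rng.normal(0.0, 1.0, size=n)        # draws from the prior
x = theta + rng.normal(0.0, 0.5, size=n)    # simulated data given theta

# Regressing theta on x under squared error targets E[theta | x]; the
# least-squares fit plays the role of the trained moment network.
A = np.column_stack([np.ones(n), x])        # regression features [1, x]
coef, *_ = np.linalg.lstsq(A, theta, rcond=None)

x_obs = 1.0
post_mean_hat = coef[0] + coef[1] * x_obs   # estimate of E[theta | x_obs]
```

The hierarchy in the paper repeats this trick for higher moments (regress theta², theta³, ... on x), building up the marginal posterior moment by moment.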
-
Global, Exact Cosmic Microwave Background Data Analysis Using Gibbs Sampling
Physical Review D, 2004. Co-Authors: Benjamin D. Wandelt, D. L. Larson, Arun Lakshminarayanan. Abstract: We describe an efficient and exact method that enables global Bayesian analysis of cosmic microwave background (CMB) data. The method reveals the joint posterior density (or likelihood, for flat priors) of the power spectrum C_ℓ and the CMB signal. Foregrounds and instrumental parameters can be inferred from the data simultaneously. The method allows the specification of a wide range of foreground priors. We explicitly show how to propagate the non-Gaussian dependency structure of the C_ℓ posterior through to the posterior density of the parameters. If desired, the analysis can be coupled to theoretical (cosmological) priors and can yield the posterior density of cosmological parameter estimates directly from the time-ordered data. The method does not hinge on special assumptions about the survey geometry or noise properties. It is based on a Monte Carlo approach and hence parallelizes trivially. No trace or determinant evaluations are necessary. The feasibility of this approach rests on the ability to solve the systems of linear equations which arise; these are of the same size and computational complexity as the map-making equations. We describe a preconditioned conjugate gradient technique that solves this problem and demonstrate in a numerical example that the computational time required for each Monte Carlo sample scales as n_p^{3/2} with the number of pixels n_p. We use our method to analyze the data from the Differential Microwave Radiometer on the Cosmic Background Explorer and explore the non-Gaussian joint posterior density of the C_ℓ in several projections.
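The alternating conditional draws at the heart of this method (signal given spectrum, spectrum given signal) follow the standard two-block Gibbs pattern, which can be sketched on a toy target. This is purely an illustrative sketch, not the CMB sampler: the two "blocks" here are the coordinates of a bivariate normal with correlation rho, whose full conditionals are the known normals used below.

```python
import numpy as np

rng = np.random.default_rng(4)
rho = 0.9
cond_sd = np.sqrt(1.0 - rho**2)   # sd of each full conditional
n_iter, burn = 20_000, 1_000

xs = np.empty(n_iter)
ys = np.empty(n_iter)
x_cur = y_cur = 0.0
for t in range(n_iter):
    x_cur = rng.normal(rho * y_cur, cond_sd)  # draw x | y
    y_cur = rng.normal(rho * x_cur, cond_sd)  # draw y | x
    xs[t], ys[t] = x_cur, y_cur
```

After burn-in, the chain's draws reproduce the joint target (unit marginal variances, correlation rho); in the CMB application each conditional draw instead requires solving a map-making-sized linear system, which is where the preconditioned conjugate gradient solver comes in.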
Qiman Shao - One of the best experts on this subject based on the ideXlab platform.
-
Estimating Marginal Posterior Densities
2000. Co-Authors: Minghui Chen, Qiman Shao, Joseph G. Ibrahim. Abstract: In Bayesian inference, a joint posterior distribution is available through the likelihood function and a prior distribution. One purpose of Bayesian inference is to calculate and display marginal posterior densities, because the marginal posterior densities provide complete information about the parameters of interest. As shown in Chapter 2, a Markov chain Monte Carlo (MCMC) sampling algorithm, such as the Gibbs sampler or a Metropolis-Hastings algorithm, can be used to draw MCMC samples from the posterior distribution. Chapter 3 also demonstrates how we can easily obtain posterior quantities such as posterior means, posterior standard deviations, and other posterior quantities from MCMC samples. However, when a Bayesian model becomes complicated, it may be difficult to obtain a reliable estimator of a marginal posterior density based on the MCMC sample. A traditional method for estimating marginal posterior densities is kernel density estimation. Since the kernel density estimator is nonparametric, it may not be efficient; moreover, it may not be applicable for some complicated Bayesian models. In the context of Bayesian inference, the joint posterior density is typically known up to a normalizing constant. Using the structure of a posterior density, a number of authors (e.g., Gelfand, Smith, and Lee 1992; Johnson 1992; Chen 1993, 1994; Chen and Shao 1997c; Chib 1995; Verdinelli and Wasserman 1995) have proposed parametric marginal posterior density estimators based on the MCMC sample. In this chapter, we present several available Monte Carlo (MC) methods for computing marginal posterior density estimators, and we also discuss how well marginal posterior density estimation works using the Kullback-Leibler (K-L) divergence as a performance measure.
-
Performance Study of Marginal Posterior Density Estimation via Kullback-Leibler Divergence
Test, 1997. Co-Authors: Minghui Chen, Qiman Shao. Abstract: In this article, we introduce the Kullback-Leibler (K-L) divergence as a performance measure of marginal posterior density estimation. We show that the K-L divergence can be used to compare two density estimators as well as to assess the convergence of a marginal density estimator. We also examine the performance of the importance-weighted marginal density estimator (IWMDE) proposed by Chen (1994) under the K-L divergence, and we further extend the IWMDE to some more complex Bayesian models where the kernel method, which is widely used for estimating marginal densities from Markov chain Monte Carlo (MCMC) sampling outputs, is not applicable. Finally, we use a constrained linear multiple regression model as an example to illustrate our methodology.