Bayesian Inference


The experts below are selected from a list of 114,219 experts worldwide, ranked by the ideXlab platform.

Jan Drugowitsch - One of the best experts on this subject based on the ideXlab platform.

  • Variational Bayesian Inference for linear and logistic regression
    arXiv: Machine Learning, 2013
    Co-Authors: Jan Drugowitsch
    Abstract:

    The article describes the model, derivation, and implementation of variational Bayesian inference for linear and logistic regression, both with and without automatic relevance determination. It serves a dual purpose: it acts as a tutorial on deriving variational Bayesian inference for simple models, and it documents, with brief examples, the MATLAB/Octave functions that implement this inference. These functions are freely available online.
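As a rough illustration of the kind of updates such an implementation performs (a sketch, not Drugowitsch's actual MATLAB/Octave code), here is a minimal mean-field variational Bayes loop for linear regression with automatic relevance determination. The noise precision `beta` and the Gamma hyperparameters `a0`, `b0` are illustrative assumptions:

```python
import numpy as np

def vb_linear_ard(X, y, beta=25.0, a0=1e-2, b0=1e-4, n_iter=50):
    """Mean-field variational Bayes for linear regression with ARD.

    Model: y = X w + noise with (assumed known) noise precision beta,
    prior w_i ~ N(0, 1/alpha_i), alpha_i ~ Gamma(a0, b0).
    """
    n, d = X.shape
    E_alpha = np.ones(d)                    # E_q[alpha_i], one precision per weight
    for _ in range(n_iter):
        # q(w) = N(m, S): Gaussian update given expected prior precisions
        S = np.linalg.inv(beta * X.T @ X + np.diag(E_alpha))
        m = beta * S @ X.T @ y
        # q(alpha_i) = Gamma(a, b_i): update given second moments of w
        a = a0 + 0.5
        b = b0 + 0.5 * (m**2 + np.diag(S))
        E_alpha = a / b
    return m, S, E_alpha

# Irrelevant features acquire large expected precision, shrinking their weights
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = 2.0 * X[:, 0] + rng.normal(scale=0.2, size=200)   # only feature 0 matters
m, S, E_alpha = vb_linear_ard(X, y)
```

The ARD effect shows up in `E_alpha`: precisions for irrelevant features grow large, pruning those weights toward zero.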

Wallace B. Mann - One of the best experts on this subject based on the ideXlab platform.

  • Bayesian Inference in Model-Based Machine Vision
    arXiv: Artificial Intelligence, 2013
    Co-Authors: Thomas O. Binford, Tod S. Levitt, Wallace B. Mann
    Abstract:

    This paper presents a preliminary version of visual interpretation integrating multiple sensors in SUCCESSOR, an intelligent, model-based vision system. We pursue a thorough integration of hierarchical Bayesian inference with a comprehensive physical representation of objects and their relations, in a system for reasoning about geometry, surface materials, and sensor models in machine vision. Bayesian inference provides a framework for accruing probabilities to rank-order hypotheses.
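The evidence-accrual step can be sketched as repeated Bayesian updating over a fixed hypothesis set; the object classes, sensor readings, and likelihood values below are invented for illustration and are not from SUCCESSOR:

```python
import numpy as np

# Hypothetical object classes with a uniform prior (illustrative only)
hypotheses = ["cube", "cylinder", "sphere"]
prior = np.array([1/3, 1/3, 1/3])

# Each row: likelihood P(observation | hypothesis) for one sensor reading
likelihoods = np.array([
    [0.8, 0.3, 0.1],   # edge detector fires strongly
    [0.7, 0.4, 0.2],   # range sensor reports a flat surface
])

posterior = prior.copy()
for lik in likelihoods:               # accrue evidence one observation at a time
    posterior = posterior * lik       # multiply in the likelihood
    posterior /= posterior.sum()      # renormalize (Bayes' rule)

# Rank-order hypotheses by posterior probability
ranking = [hypotheses[i] for i in np.argsort(posterior)[::-1]]
```

Because both sensors favor the flat-surfaced hypothesis, "cube" ends up first in the ranking.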

Ryan P. Adams - One of the best experts on this subject based on the ideXlab platform.

  • Patterns of Scalable Bayesian Inference
    2016
    Co-Authors: Elaine Angelino, Matthew J. Johnson, Ryan P. Adams
    Abstract:

    Datasets are growing not just in size but in complexity, creating a demand for rich models and quantification of uncertainty. Bayesian methods are an excellent fit for this demand, but scaling Bayesian inference is a challenge. In response to this challenge, there has been considerable recent work based on varying assumptions about model structure, underlying computational resources, and the importance of asymptotic correctness. As a result, there is a zoo of ideas with a wide range of assumptions and applicability. Patterns of Scalable Bayesian Inference seeks to identify unifying principles, patterns, and intuitions for scaling Bayesian inference. It examines how these techniques can be scaled up to larger problems and scaled out across parallel computational resources. It reviews existing work on utilizing modern computing resources with both MCMC and variational approximation techniques. From this taxonomy of ideas, it characterizes the general principles that have proven successful for designing scalable inference procedures and addresses some of the significant open questions and challenges.
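One of the building blocks the monograph surveys is MCMC. A minimal serial random-walk Metropolis sampler, the baseline that scalable variants extend, might look like the following sketch; the target density, step size, and sample counts are illustrative choices, not from the monograph:

```python
import numpy as np

def metropolis(log_post, x0, n_samples=5000, step=1.0, seed=0):
    """Random-walk Metropolis: the sample distribution converges to
    the density proportional to exp(log_post)."""
    rng = np.random.default_rng(seed)
    x, lp = x0, log_post(x0)
    samples = np.empty(n_samples)
    for i in range(n_samples):
        prop = x + step * rng.normal()        # symmetric Gaussian proposal
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept with min(1, ratio)
            x, lp = prop, lp_prop
        samples[i] = x                        # record current state (even on reject)
    return samples

# Toy target: N(3, 1) posterior, log density up to an additive constant
samples = metropolis(lambda x: -0.5 * (x - 3.0) ** 2, x0=0.0)
```

After discarding burn-in, the sample mean and standard deviation approximate the target's. This serial loop is exactly what the "scaled up / scaled out" techniques in the text aim to parallelize or subsample.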

  • Patterns of Scalable Bayesian Inference
    arXiv: Machine Learning, 2016
    Co-Authors: Elaine Angelino, Matthew J. Johnson, Ryan P. Adams
    Abstract:

    Datasets are growing not just in size but in complexity, creating a demand for rich models and quantification of uncertainty. Bayesian methods are an excellent fit for this demand, but scaling Bayesian inference is a challenge. In response to this challenge, there has been considerable recent work based on varying assumptions about model structure, underlying computational resources, and the importance of asymptotic correctness. As a result, there is a zoo of ideas with few clear overarching principles. In this paper, we seek to identify unifying principles, patterns, and intuitions for scaling Bayesian inference. We review existing work on utilizing modern computing resources with both MCMC and variational approximation techniques. From this taxonomy of ideas, we characterize the general principles that have proven successful for designing scalable inference procedures and comment on the path forward.
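A representative pattern that combines minibatch computation with MCMC is stochastic-gradient Langevin dynamics. The sketch below applies it to a toy Gaussian-mean model; the model, prior, step size, and batch size are illustrative assumptions, not details from the paper:

```python
import numpy as np

def sgld_gaussian_mean(data, n_steps=4000, batch=32, eps=1e-3, seed=1):
    """Stochastic-gradient Langevin dynamics on a toy model.

    Model: data_i ~ N(theta, 1) with a broad prior theta ~ N(0, 100).
    Each step uses a minibatch gradient of the log posterior plus
    injected Gaussian noise, avoiding full-data passes per sample.
    """
    rng = np.random.default_rng(seed)
    n = len(data)
    theta = 0.0
    trace = np.empty(n_steps)
    for t in range(n_steps):
        idx = rng.integers(0, n, size=batch)
        # unbiased minibatch estimate of d/dtheta log p(theta | data)
        grad = -theta / 100.0 + (n / batch) * np.sum(data[idx] - theta)
        theta += 0.5 * eps * grad + np.sqrt(eps) * rng.normal()
        trace[t] = theta
    return trace

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=1000)
trace = sgld_gaussian_mean(data)
```

After burn-in, the trace fluctuates around the posterior mean (close to the data mean of 2.0) while touching only a 32-point minibatch per step.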

Peter Willett - One of the best experts on this subject based on the ideXlab platform.

  • Evaluation of a Bayesian Inference network for ligand-based virtual screening
    Journal of cheminformatics, 2009
    Co-Authors: Beining Chen, Christoph Mueller, Peter Willett
    Abstract:

    Background: Bayesian inference networks enable the computation of the probability that an event will occur. They have previously been used to rank textual documents in order of decreasing relevance to a user-defined query. Here, we modify the approach so that a Bayesian inference network can be used for chemical similarity searching, where a database is ranked in order of decreasing probability of bioactivity.
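In a retrieval-style inference network of this kind, each database molecule accrues evidence from the substructure fragments it shares with the query. The toy fingerprints and idf-style fragment weighting below are illustrative, not the weighting schemes evaluated in the paper:

```python
import numpy as np

# Toy binary fingerprints: rows = molecules, columns = substructure fragments
query = np.array([1, 1, 0, 1, 0, 0], dtype=float)

database = np.array([
    [1, 1, 0, 1, 0, 0],   # identical to the query
    [1, 0, 0, 1, 0, 1],   # partial fragment overlap
    [0, 0, 1, 0, 1, 1],   # no overlap with the query
], dtype=float)

# Rarer fragments carry more evidence (idf-style weighting, echoing the
# document-retrieval networks the approach was adapted from)
df = database.sum(axis=0) + 1.0                 # smoothed fragment frequency
weight = np.log((len(database) + 1.0) / df)

scores = database @ (query * weight)            # evidence accrued per molecule
ranking = np.argsort(scores)[::-1]              # decreasing probability of activity
```

The molecule identical to the query ranks first, the partial match second, and the disjoint molecule last, mirroring the similarity-searching use case described in the abstract.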

Wenxin Jiang - One of the best experts on this subject based on the ideXlab platform.

  • On Consistency of Bayesian Inference with Mixtures of Logistic Regression
    Neural Computation, 2006
    Co-Authors: Wenxin Jiang
    Abstract:

    This is a theoretical study of the consistency properties of Bayesian inference using mixtures of logistic regression models. When standard logistic regression models are combined in a mixtures-of-experts setup, a flexible model is formed to model the relationship between a binary (yes-no) response y and a vector of predictors x. Bayesian inference conditional on the observed data can then be used for regression and classification. This letter gives conditions on choosing the number of experts (i.e., number of mixing components) k, or choosing a prior distribution for k, so that Bayesian inference is consistent, in the sense of often approximating the underlying true relationship between y and x. The resulting classification rule is also consistent, in the sense of having near-optimal performance in classification. We show these desirable consistency properties with a nonstochastic k growing slowly with the sample size n of the observed data, or with a random k that takes large values with nonzero but small probabilities.
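The predictive side of such a mixtures-of-experts model can be sketched as a softmax gate mixing k logistic-regression experts; the weight matrices below are fixed illustrative values, not fitted or prior-sampled parameters:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def moe_predict(x, gate_W, expert_W):
    """Mixture-of-experts predictive probability P(y=1 | x):
    a softmax gate mixes the outputs of k logistic-regression experts."""
    g = gate_W @ x
    gate = np.exp(g - g.max())
    gate /= gate.sum()                      # softmax mixing weights
    p_experts = sigmoid(expert_W @ x)       # each expert's P(y=1 | x)
    return gate @ p_experts                 # gate-weighted average

# k = 2 experts on 2-D inputs (weights chosen for illustration)
gate_W = np.array([[ 2.0, 0.0],
                   [-2.0, 0.0]])            # gate splits on x[0]
expert_W = np.array([[0.0,  3.0],
                     [0.0, -3.0]])          # experts disagree on x[1]
p = moe_predict(np.array([3.0, 1.0]), gate_W, expert_W)
```

For this input, the gate puts nearly all its weight on the first expert, so the mixture's prediction essentially follows that expert's logistic output.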