Probabilistic Inference

The Experts below are selected from a list of 41,862 Experts worldwide, ranked by the ideXlab platform.

Ryan Martin - One of the best experts on this subject based on the ideXlab platform.

  • conditional inferential models: combining information for prior-free Probabilistic Inference
    2015
    Co-Authors: Ryan Martin, Chuanhai Liu
    Abstract:

    The inferential model (IM) framework provides valid prior-free Probabilistic Inference by focusing on predicting unobserved auxiliary variables. But efficient IM-based Inference can be challenging when the auxiliary variable is of higher dimension than the parameter. Here we show that features of the auxiliary variable are often fully observed and, in such cases, a simultaneous dimension reduction and information aggregation can be achieved by conditioning. The proposed conditioning strategy leads to efficient IM Inference and casts new light on Fisher's notions of sufficiency, conditioning and also Bayesian Inference. A differential-equation-driven selection of a conditional association is developed, and validity of the conditional IM is proved under some conditions. For problems that do not admit a conditional IM of the standard form, we propose a more flexible class of conditional IMs based on localization. Examples of local conditional IMs in a bivariate normal model and a normal variance components model are also given.
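
    To make the conditioning idea above concrete, here is a minimal sketch (not the authors' code) on the textbook normal-mean case: with X_1, ..., X_n ~ N(theta, 1) and association X_i = theta + Z_i, the contrasts Z_i - Zbar equal the observed X_i - Xbar, so conditioning on them collapses the n-dimensional auxiliary variable to the scalar Zbar. The plausibility formula below assumes the default symmetric predictive random set; the function name and data are illustrative.

```python
# Illustrative sketch: conditional IM for a normal mean.
# Model: X_1,...,X_n ~ N(theta, 1), association X_i = theta + Z_i.
# The contrasts Z_i - Zbar = X_i - Xbar are fully observed, so conditioning
# on them reduces the n-dimensional auxiliary variable to the scalar Zbar,
# giving the conditional association  Xbar = theta + Zbar,  Zbar ~ N(0, 1/n).
import numpy as np
from scipy.stats import norm

def conditional_im_plausibility(x, theta):
    """Plausibility of the singleton assertion {theta} under the default
    symmetric predictive random set for the scalar auxiliary variable."""
    n = len(x)
    w = np.sqrt(n) * (np.mean(x) - theta)      # standardized auxiliary value
    return 1.0 - abs(2.0 * norm.cdf(w) - 1.0)  # = 1 - |2*Phi(w) - 1|

rng = np.random.default_rng(0)
x = rng.normal(loc=1.5, scale=1.0, size=20)
for theta in (1.0, 1.5, 2.0):
    print(theta, round(conditional_im_plausibility(x, theta), 3))
```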

  • inferential models: a framework for prior-free posterior Probabilistic Inference
    2013
    Co-Authors: Ryan Martin
    Abstract:

    Posterior Probabilistic statistical Inference without priors is an important but so far elusive goal. Fisher’s fiducial Inference, Dempster–Shafer theory of belief functions, and Bayesian Inference with default priors are attempts to achieve this goal but, to date, none has given a completely satisfactory picture. This article presents a new framework for Probabilistic Inference, based on inferential models (IMs), which not only provides data-dependent Probabilistic measures of uncertainty about the unknown parameter, but also does so with an automatic long-run frequency-calibration property. The key to this new approach is the identification of an unobservable auxiliary variable associated with observable data and unknown parameter, and the prediction of this auxiliary variable with a random set before conditioning on data. Here we present a three-step IM construction, and prove a frequency-calibration property of the IM’s belief function under mild conditions. A corresponding optimality theory is developed, which helps to resolve the non-uniqueness issue, and several examples are presented to illustrate this new approach.

  • inferential models: a framework for prior-free posterior Probabilistic Inference
    2012
    Co-Authors: Ryan Martin, Chuanhai Liu
    Abstract:

    Posterior Probabilistic statistical Inference without priors is an important but so far elusive goal. Fisher's fiducial Inference, Dempster-Shafer theory of belief functions, and Bayesian Inference with default priors are attempts to achieve this goal but, to date, none has given a completely satisfactory picture. This paper presents a new framework for Probabilistic Inference, based on inferential models (IMs), which not only provides data-dependent Probabilistic measures of uncertainty about the unknown parameter, but does so with an automatic long-run frequency calibration property. The key to this new approach is the identification of an unobservable auxiliary variable associated with observable data and unknown parameter, and the prediction of this auxiliary variable with a random set before conditioning on data. Here we present a three-step IM construction, and prove a frequency-calibration property of the IM's belief function under mild conditions. A corresponding optimality theory is developed, which helps to resolve the non-uniqueness issue. Several examples are presented to illustrate this new approach.
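
    As a rough illustration of the three-step construction described above (a sketch under simplifying assumptions, not the authors' implementation), consider a single observation X ~ N(theta, 1) and the one-sided assertion A = {theta <= theta0}. The association, the default predictive random set, and Monte Carlo estimates of belief and plausibility are spelled out in the comments.

```python
# Illustrative sketch of the three-step IM construction for X ~ N(theta, 1)
# and the one-sided assertion A = {theta <= theta0}.
#   A-step: associate data and parameter via  X = theta + Z,  Z ~ N(0, 1).
#   P-step: predict the unobserved Z with the default random set
#           S = [-|Ztilde|, |Ztilde|],  Ztilde ~ N(0, 1).
#   C-step: combine with the data: Theta_x(S) = [x - |Ztilde|, x + |Ztilde|];
#           belief = P(Theta_x(S) inside A), plausibility = P(Theta_x(S) hits A).
import numpy as np

def im_belief_plausibility(x, theta0, n_draws=100_000, seed=0):
    rng = np.random.default_rng(seed)
    half_width = np.abs(rng.normal(size=n_draws))   # |Ztilde| for each draw
    lower, upper = x - half_width, x + half_width   # endpoints of Theta_x(S)
    belief = np.mean(upper <= theta0)               # random set fully inside A
    plausibility = np.mean(lower <= theta0)         # random set intersects A
    return belief, plausibility

print(im_belief_plausibility(x=0.3, theta0=1.0))
```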

Chuanhai Liu - One of the best experts on this subject based on the ideXlab platform.

  • conditional inferential models: combining information for prior-free Probabilistic Inference
    2015
    Co-Authors: Ryan Martin, Chuanhai Liu
    Abstract:

    The inferential model (IM) framework provides valid prior-free Probabilistic Inference by focusing on predicting unobserved auxiliary variables. But efficient IM-based Inference can be challenging when the auxiliary variable is of higher dimension than the parameter. Here we show that features of the auxiliary variable are often fully observed and, in such cases, a simultaneous dimension reduction and information aggregation can be achieved by conditioning. The proposed conditioning strategy leads to efficient IM Inference and casts new light on Fisher's notions of sufficiency, conditioning and also Bayesian Inference. A differential-equation-driven selection of a conditional association is developed, and validity of the conditional IM is proved under some conditions. For problems that do not admit a conditional IM of the standard form, we propose a more flexible class of conditional IMs based on localization. Examples of local conditional IMs in a bivariate normal model and a normal variance components model are also given.

  • inferential models: a framework for prior-free posterior Probabilistic Inference
    2012
    Co-Authors: Ryan Martin, Chuanhai Liu
    Abstract:

    Posterior Probabilistic statistical Inference without priors is an important but so far elusive goal. Fisher's fiducial Inference, Dempster-Shafer theory of belief functions, and Bayesian Inference with default priors are attempts to achieve this goal but, to date, none has given a completely satisfactory picture. This paper presents a new framework for Probabilistic Inference, based on inferential models (IMs), which not only provides data-dependent Probabilistic measures of uncertainty about the unknown parameter, but does so with an automatic long-run frequency calibration property. The key to this new approach is the identification of an unobservable auxiliary variable associated with observable data and unknown parameter, and the prediction of this auxiliary variable with a random set before conditioning on data. Here we present a three-step IM construction, and prove a frequency-calibration property of the IM's belief function under mild conditions. A corresponding optimality theory is developed, which helps to resolve the non-uniqueness issue. Several examples are presented to illustrate this new approach.

David Poole - One of the best experts on this subject based on the ideXlab platform.

  • exploiting contextual independence in Probabilistic Inference
    2011
    Co-Authors: David Poole, Nevin L Zhang
    Abstract:

    Bayesian belief networks have grown to prominence because they provide compact representations for many problems for which Probabilistic Inference is appropriate, and there are algorithms to exploit this compactness. The next step is to allow compact representations of the conditional probabilities of a variable given its parents. In this paper we present such a representation that exploits contextual independence in terms of parent contexts: which variables act as parents may depend on the value of other variables. The internal representation is in terms of contextual factors (confactors), each of which is simply a pair of a context and a table. The algorithm, contextual variable elimination, is based on the standard variable elimination algorithm that eliminates the non-query variables in turn, but when eliminating a variable, the tables that need to be multiplied can depend on the context. This algorithm reduces to standard variable elimination when there is no contextual independence structure to exploit. We show how this can be much more efficient than variable elimination when there is structure to exploit. We explain why this new method can exploit more structure than both previous methods for structured belief network Inference and an analogous algorithm that uses trees.
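
    A toy version of the confactor idea (hypothetical variables and probabilities, not the paper's implementation): suppose P(y | a, b, c) depends on b only when a is true and on c only when a is false. Two context-specific entries then replace the full eight-row table, and a lookup first selects the entry whose context matches.

```python
# Toy sketch of context-specific structure for P(y=True | a, b, c), where b
# matters only when a is true and c matters only when a is false.  Each entry
# below is a simplified confactor: a context, the one parent still relevant in
# that context, and a table for that parent.
confactors = [
    ({"a": True},  "b", {True: 0.9, False: 0.2}),
    ({"a": False}, "c", {True: 0.7, False: 0.1}),
]

def prob_y_true(assignment):
    """P(y=True | a, b, c): select the confactor whose context matches."""
    for context, parent, table in confactors:
        if all(assignment[var] == val for var, val in context.items()):
            return table[assignment[parent]]
    raise ValueError("no confactor covers this assignment")

print(prob_y_true({"a": True,  "b": False, "c": True}))   # 0.2: c is irrelevant here
print(prob_y_true({"a": False, "b": False, "c": True}))   # 0.7: b is irrelevant here
```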

  • first-order Probabilistic Inference
    2003
    Co-Authors: David Poole
    Abstract:

    There have been many proposals for first-order belief networks (i.e., where we quantify over individuals), but these typically only let us reason about the individuals that we know about. There are many instances where we have to quantify over all of the individuals in a population. When we do this, the population size often matters and we need to reason about all of the members of the population (but not necessarily individually). This paper presents an algorithm to reason about multiple individuals, where we may know particular facts about some of them, but want to treat the others as a group. Combining unification with variable elimination lets us reason about classes of individuals without needing to ground out the theory.
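
    A small illustration of why the population size matters and why grounding can be avoided (a made-up noisy-or style example, not the paper's algorithm): with n exchangeable individuals, the grounded computation enumerates 2^n joint assignments, while the lifted computation uses one closed-form expression in n.

```python
# Toy example: each of n exchangeable individuals independently triggers an
# alarm with probability p; we want P(alarm) = P(at least one trigger).
from itertools import product

def grounded(n, p):
    """Sum over all 2**n joint assignments: exponential in n."""
    total = 0.0
    for assignment in product([False, True], repeat=n):
        weight = 1.0
        for triggered in assignment:
            weight *= p if triggered else (1 - p)
        if any(assignment):
            total += weight
    return total

def lifted(n, p):
    """Exploit exchangeability: one formula, regardless of n."""
    return 1 - (1 - p) ** n

print(grounded(10, 0.05), lifted(10, 0.05))   # the two agree
print(lifted(10_000, 0.05))                   # grounding would be infeasible here
```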

  • Probabilistic partial evaluation: exploiting rule structure in Probabilistic Inference
    1997
    Co-Authors: David Poole
    Abstract:

    Bayesian belief networks have grown to prominence because they provide compact representations of many domains, and there are algorithms to exploit this compactness. The next step is to allow compact representations of the conditional probability tables of a variable given its parents. In this paper we present such a representation in terms of parent contexts and provide an algorithm that exploits this compactness. The representation is in terms of rules that provide conditional probabilities in different contexts. The algorithm is based on eliminating the variables not needed in an answer in turn. The operations for eliminating a variable correspond to a form of partial evaluation, where we are careful to maintain the Probabilistic dependencies necessary for correct Probabilistic Inference. We show how this new method can exploit more structure than previous methods for structured belief network Inference.
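
    The core elimination step can be pictured with a deliberately tiny example (hypothetical numbers, not the paper's implementation): given rule-style conditional probabilities P(c | b) and P(b | a), eliminating b sums it out and leaves new rules for P(c | a), the probabilistic analogue of partially evaluating b away.

```python
# Rule-style conditional probabilities and the sum-out step that eliminates b.
p_c_given_b = {True: 0.9, False: 0.3}   # P(c=True | b)
p_b_given_a = {True: 0.8, False: 0.1}   # P(b=True | a)

def eliminate_b(a):
    """New rule for P(c=True | a) after b has been summed out."""
    p_b = p_b_given_a[a]
    return p_c_given_b[True] * p_b + p_c_given_b[False] * (1 - p_b)

print({a: eliminate_b(a) for a in (True, False)})   # the derived rules for P(c | a)
```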

R Van Der Merwe - One of the best experts on this subject based on the ideXlab platform.

  • Gaussian mixture sigma-point particle filters for sequential Probabilistic Inference in dynamic state-space models
    2003
    Co-Authors: R Van Der Merwe
    Abstract:

    For sequential Probabilistic Inference in nonlinear non-Gaussian systems, approximate solutions must be used. We present a novel recursive Bayesian estimation algorithm that combines an importance sampling based measurement update step with a bank of sigma-point Kalman filters for the time-update and proposal distribution generation. The posterior state density is represented by a Gaussian mixture model that is recovered from the weighted particle set of the measurement update step by means of a weighted EM algorithm. This step replaces the resampling stage needed by most particle filters and mitigates the "sample depletion" problem. We show that this new approach has an improved estimation performance and reduced computational complexity compared to other related algorithms.
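
    The step the abstract singles out, recovering a Gaussian mixture from the weighted particle set by a weighted EM algorithm instead of resampling, can be sketched in isolation. The snippet below is a simplified one-dimensional, fixed-component-count illustration with made-up particle data, not the authors' filter.

```python
# Weighted EM for a 1-D Gaussian mixture fitted to a weighted particle set.
import numpy as np

def weighted_em_gmm(particles, weights, k=2, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    w = weights / weights.sum()                      # normalized importance weights
    means = rng.choice(particles, size=k, replace=False)
    variances = np.full(k, particles.var() + 1e-6)
    mix = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities, with each particle's importance weight folded in.
        dens = np.exp(-0.5 * (particles[:, None] - means) ** 2 / variances)
        dens /= np.sqrt(2 * np.pi * variances)
        resp = mix * dens
        resp /= resp.sum(axis=1, keepdims=True)
        resp *= w[:, None]
        # M-step: weighted updates of mixture weights, means and variances.
        nk = resp.sum(axis=0)
        mix = nk / nk.sum()
        means = (resp * particles[:, None]).sum(axis=0) / nk
        variances = (resp * (particles[:, None] - means) ** 2).sum(axis=0) / nk + 1e-6
    return mix, means, variances

rng = np.random.default_rng(1)
particles = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(3, 1.0, 500)])
weights = rng.uniform(0.1, 1.0, size=1000)           # stand-in importance weights
print(weighted_em_gmm(particles, weights))
```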

Guy Van Den Broeck - One of the best experts on this subject based on the ideXlab platform.

  • model checking finite-horizon Markov chains with Probabilistic Inference
    2021
    Co-Authors: Steven Holtzen, Sebastian Junges, Marcell Vazquezchanlatte, Todd Millstein, Sanjit A Seshia, Guy Van Den Broeck
    Abstract:

    We revisit the symbolic verification of Markov chains with respect to finite-horizon reachability properties. The prevalent approach iteratively computes step-bounded state reachability probabilities. By contrast, recent advances in Probabilistic Inference suggest symbolically representing all horizon-length paths through the Markov chain. We ask whether this perspective advances the state of the art in Probabilistic model checking. First, we formally describe both approaches in order to highlight their key differences. Then, using these insights, we develop Rubicon, a tool that transpiles Prism models to the Probabilistic Inference tool Dice. Finally, we demonstrate better scalability compared to Probabilistic model checkers on selected benchmarks. Altogether, our results suggest that Probabilistic Inference is a valuable addition to the Probabilistic model checking portfolio, with Rubicon as a first step towards integrating both perspectives.
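
    For contrast with the path-based view, the "prevalent approach" the abstract mentions can be written in a few lines: iterate step-bounded reachability probabilities over the horizon. The chain and target below are made up for illustration; this is the iterative baseline, not Rubicon or Dice.

```python
# Finite-horizon reachability by iterating step-bounded probabilities.
# States: 0, 1, 2 (state 2 is the target and is absorbing here).
import numpy as np

P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.4, 0.3],
              [0.0, 0.0, 1.0]])

def finite_horizon_reach(P, target, horizon, start):
    """P(reach `target` within `horizon` steps, starting from `start`)."""
    prob = np.zeros(P.shape[0])
    prob[target] = 1.0           # reachable in 0 steps only from the target itself
    for _ in range(horizon):
        prob = P @ prob          # one backward step of the recurrence
        prob[target] = 1.0       # once reached, stays reached
    return prob[start]

print(finite_horizon_reach(P, target=2, horizon=10, start=0))
```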

  • Probabilistic Inference in hybrid domains by weighted model integration
    2015
    Co-Authors: Vaishak Belle, Andrea Passerini, Guy Van Den Broeck
    Abstract:

    Weighted model counting (WMC) on a propositional knowledge base is an effective and general approach to Probabilistic Inference in a variety of formalisms, including Bayesian and Markov Networks. However, an inherent limitation of WMC is that it only admits the Inference of discrete probability distributions. In this paper, we introduce a strict generalization of WMC called weighted model integration that is based on annotating Boolean and arithmetic constraints, and combinations thereof. This methodology is shown to capture discrete, continuous and hybrid Markov networks. We then consider the task of parameter learning for a fragment of the language. An empirical evaluation demonstrates the applicability and promise of the proposal.
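
    A toy instance of weighted model integration (invented weights and constraints, only meant to unpack the definition): with one Boolean variable and one real variable, the WMI sums, over the Boolean assignments, the integral of the weight function over the region of the real variable that satisfies the formula.

```python
# Toy WMI: Boolean b and real x in [0, 3], formula (b and x < 1) or (not b and x > 2),
# per-world weight w(b, x) = 2 if b else 1 (an arbitrary choice).
from scipy.integrate import quad

def formula(b, x):
    return (b and x < 1) or ((not b) and x > 2)

def weight(b, x):
    return 2.0 if b else 1.0

wmi = 0.0
for b in (True, False):
    integral, _ = quad(lambda x: weight(b, x) if formula(b, x) else 0.0,
                       0.0, 3.0, points=(1.0, 2.0))
    wmi += integral
print(wmi)   # 2*1 + 1*1 = 3.0
```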

  • lifted Probabilistic Inference by first-order knowledge compilation
    2011
    Co-Authors: Guy Van Den Broeck, Nima Taghipour, Jesse Davis, Wannes Meert, Luc De Raedt
    Abstract:

    Probabilistic logical languages provide powerful formalisms for knowledge representation and learning. Yet performing Inference in these languages is extremely costly, especially if it is done at the propositional level. Lifted Inference algorithms, which avoid repeated computation by treating indistinguishable groups of objects as one, help mitigate this cost. Seeking inspiration from logical Inference, where lifted Inference (e.g., resolution) is commonly performed, we develop a model-theoretic approach to Probabilistic lifted Inference. Our algorithm compiles a first-order Probabilistic theory into a first-order deterministic decomposable negation normal form (d-DNNF) circuit. Compilation offers the advantage that Inference is polynomial in the size of the circuit. Furthermore, by borrowing techniques from the knowledge compilation literature, our algorithm effectively exploits the logical structure (e.g., context-specific independencies) within the first-order model, which allows more computation to be done at the lifted level. An empirical comparison demonstrates the utility of the proposed approach.
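
    The claim that Inference is polynomial in the size of the circuit is easiest to see on a tiny propositional analogue (a made-up formula and weights, not the paper's first-order circuits): once the formula is in smooth, deterministic, decomposable negation normal form, weighted model counting is one bottom-up pass in which or-nodes sum and and-nodes multiply.

```python
# Weighted model counting on a tiny smooth d-DNNF circuit for (A and B) or (~A and C).
import math

# Literal weights, e.g. the probability that each literal is satisfied.
weights = {"A": 0.3, "~A": 0.7, "B": 0.6, "~B": 0.4, "C": 0.5, "~C": 0.5}

# The two or-branches disagree on A (deterministic), each and-node's children
# share no variables (decomposable), and both branches mention A, B, C (smooth).
circuit = ("or",
           ("and", "A", "B", ("or", "C", "~C")),
           ("and", "~A", "C", ("or", "B", "~B")))

def wmc(node):
    """Bottom-up pass: or-nodes sum, and-nodes multiply, literals look up weights."""
    if isinstance(node, str):
        return weights[node]
    op, *children = node
    values = [wmc(child) for child in children]
    return sum(values) if op == "or" else math.prod(values)

print(wmc(circuit))   # 0.3*0.6 + 0.7*0.5 = 0.53
```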