Proposal Distribution

The Experts below are selected from a list of 360 Experts worldwide, ranked by the ideXlab platform.

Matthew J Payne - One of the best experts on this subject based on the ideXlab platform.

  • RUN DMC: An Efficient, Parallel Code for Analyzing Radial Velocity Observations Using N-body Integrations and Differential Evolution Markov Chain Monte Carlo
    Astrophysical Journal Supplement Series, 2013
    Co-Authors: Benjamin E Nelson, Eric B Ford, Matthew J Payne
    Abstract:

    In the 20+ years of Doppler observations of stars, scientists have uncovered a diverse population of extrasolar multi-planet systems. A common technique for characterizing the orbital elements of these planets is Markov chain Monte Carlo (MCMC), using a Keplerian model with random walk Proposals and paired with the Metropolis-Hastings algorithm. For roughly a couple of dozen planetary systems with Doppler observations, there are strong planet-planet interactions because the system is in or near a mean-motion resonance (MMR). An N-body model is often required to describe these systems accurately. Further computational difficulties arise from exploring a high-dimensional parameter space (~7 × number of planets) that can have complex parameter correlations, particularly for systems near a MMR. To surmount these challenges, we introduce a differential evolution MCMC (DEMCMC) algorithm applied to radial velocity data while incorporating self-consistent N-body integrations. Our Radial velocity Using N-body DEMCMC (RUN DMC) algorithm improves upon the random walk Proposal Distribution of traditional MCMC by using an ensemble of Markov chains to adaptively improve the Proposal Distribution. RUN DMC can sample more efficiently from high-dimensional parameter spaces that have strong correlations between model parameters. We describe the methodology behind the algorithm, along with results of tests for accuracy and performance. We find that most algorithm parameters have a modest effect on the rate of convergence; however, the size of the ensemble can have a strong effect on performance. We show that the optimal choice depends on the number of planets in a system, as well as the computer architecture used and the resulting extent of parallelization. While the exact choices of optimal algorithm parameters will inevitably vary with the details of individual planetary systems (e.g., number of planets, number of observations, orbital periods, and signal-to-noise ratio of each planet), we offer recommendations for choosing the DEMCMC's algorithmic parameters that result in excellent performance for a wide variety of planetary systems.
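
    The differential evolution proposal described above can be illustrated with a short, generic sketch. The Python snippet below is not the RUN DMC code: it shows a plain DE-MCMC update in which each chain proposes a move along the scaled difference of two other randomly chosen chains' states, so the proposal adapts itself to the scale and correlations of the posterior. The scale factor gamma, the jitter term, and the placeholder callable log_post are illustrative assumptions.

```python
import numpy as np

def demcmc_proposal(ensemble, i, gamma=None, jitter=1e-6, rng=np.random):
    """Differential-evolution proposal for chain i of an ensemble.

    ensemble : (n_chains, n_dim) array of current chain states.
    The proposed move is a scaled difference of two other chains' states,
    so the proposal follows the posterior's scale and correlations.
    """
    n_chains, n_dim = ensemble.shape
    if gamma is None:
        gamma = 2.38 / np.sqrt(2.0 * n_dim)  # common default scaling
    # choose two distinct chains, both different from chain i
    a, b = rng.choice([k for k in range(n_chains) if k != i], size=2, replace=False)
    return ensemble[i] + gamma * (ensemble[a] - ensemble[b]) \
        + jitter * rng.standard_normal(n_dim)

def demcmc_sweep(ensemble, log_post, rng=np.random):
    """One sweep: propose and Metropolis-accept/reject for every chain."""
    for i in range(ensemble.shape[0]):
        prop = demcmc_proposal(ensemble, i, rng=rng)
        if np.log(rng.uniform()) < log_post(prop) - log_post(ensemble[i]):
            ensemble[i] = prop
    return ensemble
```

    In RUN DMC the log-posterior evaluation would itself involve a self-consistent N-body integration of the radial velocity model; here it is left as an arbitrary callable.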

James J Little - One of the best experts on this subject based on the ideXlab platform.

  • σSLAM: Stereo Vision SLAM Using the Rao-Blackwellised Particle Filter and a Novel Mixture Proposal Distribution
    International Conference on Robotics and Automation, 2006
    Co-Authors: Pantelis Elinas, Robert Sim, James J Little
    Abstract:

    We consider the problem of simultaneous localization and mapping (SLAM) using the Rao-Blackwellised particle filter (RBPF) for the class of indoor mobile robots equipped only with stereo vision. Our goal is to construct dense metric maps of natural 3D point landmarks for large cyclic environments in the absence of accurate landmark position measurements and motion estimates. Our work differs from other approaches because landmark estimates are derived from stereo vision and motion estimates are based on sparse optical flow. We distinguish between landmarks using the scale invariant feature transform (SIFT). This is in contrast to current popular approaches that rely on reliable motion models derived from odometric hardware and accurate landmark measurements obtained with laser sensors. Since our approach depends on a particle filter whose main component is the Proposal Distribution, we develop and evaluate a novel mixture Proposal Distribution that allows us to robustly close large loops. We validate our approach experimentally for long camera trajectories, processing thousands of images at reasonable frame rates.
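
    To make the role of the mixture Proposal Distribution concrete, here is a minimal, hypothetical sketch rather than the authors' implementation: each particle is propagated either by a motion-based proposal (e.g. visual odometry from sparse optical flow) or, with some probability, by a global observation-driven proposal, which is the mechanism that lets the filter re-localize and close large loops. The callables motion_proposal and global_proposal and the mixing weight are placeholders.

```python
import numpy as np

def sample_mixture_proposal(particles, motion_proposal, global_proposal,
                            mix_weight=0.8, rng=np.random):
    """Draw new particle poses from a two-component mixture proposal.

    With probability mix_weight a particle follows the motion-based proposal;
    otherwise it is drawn from a global, observation-driven proposal, which is
    what allows the filter to recover from drift and close large loops.
    Both proposals are placeholder callables mapping a pose to a new sample.
    """
    new_particles = []
    for pose in particles:
        if rng.uniform() < mix_weight:
            new_particles.append(motion_proposal(pose))
        else:
            new_particles.append(global_proposal(pose))
    return new_particles
```

    In a full RBPF the importance weight of each sample would be computed against the complete mixture density; that bookkeeping is omitted in this sketch.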

  • A Boosted Particle Filter: Multitarget Detection and Tracking
    European Conference on Computer Vision, 2004
    Co-Authors: Kenji Okuma, Nando De Freitas, James J Little, Ali Taleghani, David G Lowe
    Abstract:

    The problem of tracking a varying number of non-rigid objects has two major difficulties. First, the observation models and target Distributions can be highly non-linear and non-Gaussian. Second, the presence of a large, varying number of objects creates complex interactions with overlap and ambiguities. To surmount these difficulties, we introduce a vision system that is capable of learning, detecting and tracking the objects of interest. The system is demonstrated in the context of tracking hockey players using video sequences. Our approach combines the strengths of two successful algorithms: mixture particle filters and Adaboost. The mixture particle filter [17] is ideally suited to multi-target tracking as it assigns a mixture component to each player. The crucial design issues in mixture particle filters are the choice of the Proposal Distribution and the treatment of objects leaving and entering the scene. Here, we construct the Proposal Distribution using a mixture model that incorporates information from the dynamic models of each player and the detection hypotheses generated by Adaboost. The learned Adaboost Proposal Distribution allows us to quickly detect players entering the scene, while the filtering process enables us to keep track of the individual players. The result of interleaving Adaboost with mixture particle filters is a simple, yet powerful and fully automatic multiple object tracking system.
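
    A minimal sketch of the mixture-proposal idea described above, assuming a simple 2-D position state and treating the Adaboost detector output as a list of candidate centres; the function name, parameter values, and the random-walk dynamic model are illustrative, not taken from the paper.

```python
import numpy as np

def boosted_proposal(x_prev, detections, alpha=0.5, dyn_std=5.0,
                     det_std=2.0, rng=np.random):
    """Sample one particle from a dynamics-plus-detections mixture proposal.

    x_prev     : previous particle state (2-D position, for illustration).
    detections : list of detector outputs (e.g. Adaboost bounding-box centres).
    alpha      : weight given to the data-driven (detection) component.

    With probability alpha the particle is drawn around the nearest detection,
    which lets the filter lock on quickly to targets entering the scene;
    otherwise it is propagated by a simple random-walk dynamic model.
    """
    x_prev = np.asarray(x_prev, dtype=float)
    if detections and rng.uniform() < alpha:
        dets = np.asarray(detections, dtype=float)
        nearest = dets[np.argmin(np.linalg.norm(dets - x_prev, axis=1))]
        return nearest + det_std * rng.standard_normal(x_prev.shape)
    return x_prev + dyn_std * rng.standard_normal(x_prev.shape)
```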

Bin Zhu - One of the best experts on this subject based on the ideXlab platform.

  • Optimal Proposal Distribution FastSLAM with Suitable Gaussian Weighted Integral Solutions
    International Conference on Neural Information Processing, 2013
    Co-Authors: Yu Song, Zengguang Hou, Bin Zhu
    Abstract:

    One of the key issues in Gaussian SLAM is calculating the nonlinear transition density of the Gaussian prior, i.e. evaluating a Gaussian Weighted Integral (GWI) whose integrand has the form nonlinear function × Gaussian prior density. Several GWI solutions have been applied in SLAM (e.g. linearization, the unscented transform, and the cubature rule), and different SLAM algorithms have been derived from them, but how to select a suitable GWI solution for SLAM still lacks theoretical analysis. In this paper, we propose an optimal Proposal FastSLAM algorithm with suitable GWI solutions. The main contributions of this work are: (1) a unified FastSLAM framework with an optimal Proposal Distribution is summarized; (2) a GWI solution selection criterion based on SLAM dimensionality is designed; (3) a new SLAM algorithm is proposed. The performance of the proposed SLAM algorithm is investigated and compared with FastSLAM 2.0 and UFastSLAM in simulations, and the results confirm our analysis.
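
    Two of the GWI solutions named in the abstract, the unscented transform and the third-degree spherical cubature rule, can be sketched as simple sigma-point quadratures. The snippet below is an illustrative approximation of E[f(x)] for Gaussian x, not the paper's algorithm; the parameter kappa and the method names are assumptions made for the example.

```python
import numpy as np

def gaussian_weighted_integral(f, mean, cov, method="cubature", kappa=0.0):
    """Approximate E[f(x)] for x ~ N(mean, cov) with sigma-point rules.

    Two of the GWI solutions mentioned in the abstract are sketched: the
    third-degree spherical cubature rule and a basic unscented transform.
    f maps an n-vector to a scalar or vector; kappa tunes the unscented spread.
    """
    mean = np.asarray(mean, dtype=float)
    n = mean.size
    L = np.linalg.cholesky(np.asarray(cov, dtype=float))

    if method == "cubature":
        # 2n equally weighted points at +/- sqrt(n) along the Cholesky columns
        pts = [mean + np.sqrt(n) * L[:, i] for i in range(n)] + \
              [mean - np.sqrt(n) * L[:, i] for i in range(n)]
        wts = np.full(2 * n, 1.0 / (2 * n))
    elif method == "unscented":
        # 2n+1 sigma points; the central point carries weight kappa / (n + kappa)
        scale = np.sqrt(n + kappa)
        pts = [mean] + \
              [mean + scale * L[:, i] for i in range(n)] + \
              [mean - scale * L[:, i] for i in range(n)]
        wts = np.concatenate(([kappa / (n + kappa)],
                              np.full(2 * n, 1.0 / (2 * (n + kappa)))))
    else:
        raise ValueError("unknown method: %s" % method)

    return sum(w * np.asarray(f(p), dtype=float) for w, p in zip(wts, pts))
```

    The cubature rule uses 2n equally weighted points, while the unscented transform adds a central point whose weight is controlled by kappa; which rule is preferable is a choice the paper ties to the SLAM dimensionality.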

Eric A Wan - One of the best experts on this subject based on the ideXlab platform.

  • The Unscented Particle Filter
    Neural Information Processing Systems, 2000
    Co-Authors: Rudolph Van Der Merwe, Arnaud Doucet, Nando De Freitas, Eric A Wan
    Abstract:

    In this paper, we propose a new particle filter based on sequential importance sampling. The algorithm uses a bank of unscented filters to obtain the importance Proposal Distribution. This Proposal has two very "nice" properties. Firstly, it makes efficient use of the latest available information and, secondly, it can have heavy tails. As a result, we find that the algorithm outperforms standard particle filtering and other nonlinear filtering methods very substantially. This experimental finding is in agreement with the theoretical convergence proof for the algorithm. The algorithm also includes resampling and (possibly) Markov chain Monte Carlo (MCMC) steps.
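
    A hedged sketch of a single unscented-particle-filter update is given below. It assumes a placeholder callable ukf_update that returns the Gaussian (mean, covariance) proposal produced by the unscented Kalman filter for each particle, together with placeholder likelihood and transition_pdf densities; resampling and the optional MCMC step mentioned in the abstract are omitted.

```python
import numpy as np
from scipy.stats import multivariate_normal

def upf_step(particles, weights, y, ukf_update, likelihood, transition_pdf,
             rng=np.random):
    """One unscented-particle-filter importance-sampling update (sketch).

    For each particle, ukf_update(x_prev, y) -> (mean, cov) supplies a Gaussian
    importance proposal that already conditions on the newest measurement y.
    likelihood(y, x) and transition_pdf(x, x_prev) are the measurement and
    dynamics densities used to form the importance weights.
    """
    new_particles, new_weights = [], []
    for x_prev, w in zip(particles, weights):
        mean, cov = ukf_update(x_prev, y)          # proposal from the UKF bank
        x = rng.multivariate_normal(mean, cov)     # sample the proposal
        q = multivariate_normal.pdf(x, mean, cov)  # proposal density
        new_particles.append(x)
        new_weights.append(w * likelihood(y, x) * transition_pdf(x, x_prev) / q)

    new_weights = np.asarray(new_weights)
    return new_particles, new_weights / new_weights.sum()
```

    Because the UKF proposal conditions on the latest measurement, the importance weights tend to be much better behaved than when sampling from the transition prior alone, which is the efficiency gain the abstract refers to.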

R Van Der Merwe - One of the best experts on this subject based on the ideXlab platform.

  • Gaussian Mixture Sigma-Point Particle Filters for Sequential Probabilistic Inference in Dynamic State-Space Models
    International Conference on Acoustics Speech and Signal Processing, 2003
    Co-Authors: R Van Der Merwe
    Abstract:

    For sequential probabilistic inference in nonlinear, non-Gaussian systems, approximate solutions must be used. We present a novel recursive Bayesian estimation algorithm that combines an importance-sampling-based measurement update step with a bank of sigma-point Kalman filters for the time update and Proposal Distribution generation. The posterior state density is represented by a Gaussian mixture model that is recovered from the weighted particle set of the measurement update step by means of a weighted EM algorithm. This step replaces the resampling stage needed by most particle filters and mitigates the "sample depletion" problem. We show that this new approach achieves improved estimation performance and reduced computational complexity compared to other related algorithms.
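
    The step that distinguishes this filter, recovering a Gaussian mixture posterior from the weighted particle set with a weighted EM algorithm, can be sketched as follows. This is an illustrative implementation under assumed defaults (component count, iteration count, regularization), not the author's code.

```python
import numpy as np
from scipy.stats import multivariate_normal

def weighted_em_gmm(particles, weights, n_components=3, n_iter=50, reg=1e-6,
                    rng=np.random):
    """Fit a Gaussian mixture to an importance-weighted particle set (sketch).

    particles : (n, d) array of particle states.
    weights   : (n,) importance weights from the measurement update.
    Returns mixture weights, means and covariances of the fitted GMM.
    """
    X = np.asarray(particles, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    n, d = X.shape

    # initialise components on randomly chosen particles
    idx = rng.choice(n, size=n_components, replace=False)
    means = X[idx].copy()
    covs = np.array([np.cov(X.T) + reg * np.eye(d) for _ in range(n_components)])
    mix = np.full(n_components, 1.0 / n_components)

    for _ in range(n_iter):
        # E-step: responsibilities, scaled by the importance weights
        resp = np.column_stack([
            mix[k] * multivariate_normal.pdf(X, means[k], covs[k])
            for k in range(n_components)])
        resp /= resp.sum(axis=1, keepdims=True)
        resp *= w[:, None]

        # M-step: weighted updates of mixture weights, means and covariances
        nk = resp.sum(axis=0)
        mix = nk / nk.sum()
        means = (resp.T @ X) / nk[:, None]
        for k in range(n_components):
            diff = X - means[k]
            covs[k] = (resp[:, k, None] * diff).T @ diff / nk[k] + reg * np.eye(d)

    return mix, means, covs
```

    The fitted mixture then serves as the smooth posterior approximation from which the next time update proceeds, in place of a resampling step.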