Recursive Estimation

14,000,000 Leading Edge Experts on the ideXlab platform

The Experts below are selected from a list of 22,347 Experts worldwide ranked by the ideXlab platform

Petr Tichavsky - One of the best experts on this subject based on the ideXlab platform.

  • multicomponent polynomial phase signal analysis using a tracking algorithm
    IEEE Transactions on Signal Processing, 1999
    Co-Authors: Petr Tichavsky
    Abstract:

    We describe an efficient technique for analyzing signals that comprise a number of polynomial phase components. The technique is based on a previously proposed "multiple frequency tracker", an algorithm for Recursive Estimation of the parameters of multiple sine waves in noise. It has a relatively low SNR threshold and moderate computational complexity.
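
The recursive idea behind such trackers can be sketched with a single-component toy version: a smoothed lag-one phase-difference estimator of one cisoid's frequency. This is an illustrative scheme, not the paper's multiple frequency tracker, and the smoothing factor `lam` is an assumed tuning value.

```python
import numpy as np

def recursive_frequency_tracker(z, lam=0.95):
    """Recursively track the frequency of a single complex sinusoid.

    Illustrative sketch: an exponentially weighted lag-one
    autocorrelation whose angle estimates the frequency. Not the
    multi-component algorithm of the paper.
    """
    r = 0.0 + 0.0j
    estimates = []
    for k in range(1, len(z)):
        # smoothed lag-1 autocorrelation; angle(r) -> frequency (rad/sample)
        r = lam * r + (1.0 - lam) * z[k] * np.conj(z[k - 1])
        estimates.append(np.angle(r))
    return np.array(estimates)

# usage: noiseless cisoid at omega = 0.5 rad/sample
omega = 0.5
n = np.arange(200)
z = np.exp(1j * omega * n)
est = recursive_frequency_tracker(z)
```

For a noiseless cisoid the smoothed autocorrelation points exactly at `exp(1j*omega)`, so the estimate locks on immediately; with noise, `lam` trades tracking speed against variance.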

  • posterior cramer rao bounds for discrete time nonlinear filtering
    IEEE Transactions on Signal Processing, 1998
    Co-Authors: Petr Tichavsky, Carlos H Muravchik, Arye Nehorai
    Abstract:

    A mean-square error lower bound for the discrete-time nonlinear filtering problem is derived based on the van Trees (1968) (posterior) version of the Cramer-Rao inequality. This lower bound is applicable to multidimensional nonlinear, possibly non-Gaussian, dynamical systems and is more general than the previous bounds in the literature. The case of a singular conditional distribution of the one-step-ahead state vector given the present state is considered. The bound is evaluated for three important examples: the Recursive Estimation of slowly varying parameters of an autoregressive process, tracking a slowly varying frequency of a single cisoid in noise, and tracking parameters of a sinusoidal frequency with sinusoidal phase modulation.
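
The bound is computed by an information-matrix recursion of the form J_{k+1} = D22 - D21 (J_k + D11)^{-1} D12. A minimal sketch for the linear-Gaussian special case follows; the model matrices and initial information used in the example are illustrative assumptions, not values from the paper.

```python
import numpy as np

def pcrb_information_recursion(F, Q, H, R, J0, steps):
    """Posterior CRB information recursion, linear-Gaussian case.

    For x_{k+1} = F x_k + w_k, z_k = H x_k + v_k with w ~ N(0,Q),
    v ~ N(0,R), the blocks of the recursion
        J_{k+1} = D22 - D21 (J_k + D11)^{-1} D12
    reduce to closed forms; the MSE bound is inv(J).
    """
    Qi = np.linalg.inv(Q)
    Ri = np.linalg.inv(R)
    D11 = F.T @ Qi @ F
    D12 = -F.T @ Qi          # D21 = D12.T in this model
    D22 = Qi + H.T @ Ri @ H
    J = J0.copy()
    for _ in range(steps):
        J = D22 - D12.T @ np.linalg.inv(J + D11) @ D12
    return J

# usage: scalar random-walk state observed in unit-variance noise
J = pcrb_information_recursion(np.array([[1.0]]), np.array([[0.1]]),
                               np.array([[1.0]]), np.array([[1.0]]),
                               np.array([[1.0]]), 200)
```

In this linear-Gaussian case the recursion coincides with the Kalman-filter information recursion, so inv(J) converges to the steady-state Kalman posterior variance.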

Arye Nehorai - One of the best experts on this subject based on the ideXlab platform.

  • posterior cramer rao bounds for discrete time nonlinear filtering
    IEEE Transactions on Signal Processing, 1998
    Co-Authors: Petr Tichavsky, Carlos H Muravchik, Arye Nehorai
    Abstract:

    A mean-square error lower bound for the discrete-time nonlinear filtering problem is derived based on the van Trees (1968) (posterior) version of the Cramer-Rao inequality. This lower bound is applicable to multidimensional nonlinear, possibly non-Gaussian, dynamical systems and is more general than the previous bounds in the literature. The case of a singular conditional distribution of the one-step-ahead state vector given the present state is considered. The bound is evaluated for three important examples: the Recursive Estimation of slowly varying parameters of an autoregressive process, tracking a slowly varying frequency of a single cisoid in noise, and tracking parameters of a sinusoidal frequency with sinusoidal phase modulation.

Norman R Swanson - One of the best experts on this subject based on the ideXlab platform.

  • forecasting financial and macroeconomic variables using data reduction methods new empirical evidence
    Journal of Econometrics, 2014
    Co-Authors: Hyun Hak Kim, Norman R Swanson
    Abstract:

    In this paper, we empirically assess the predictive accuracy of a large group of models based on the use of principal components and other shrinkage methods, including Bayesian model averaging and various bagging, boosting, LASSO and related methods. Our results suggest that model averaging does not dominate other well designed prediction model specification methods, and that using a combination of factor and other shrinkage methods often yields superior predictions. For example, when using Recursive Estimation windows, which dominate other “windowing” approaches in our experiments, prediction models constructed using pure principal component type models combined with shrinkage methods yield mean square forecast error “best” models around 70% of the time, when used to predict 11 key macroeconomic indicators at various forecast horizons. Baseline linear models (which “win” around 5% of the time) and model averaging methods (which win around 25% of the time) fare substantially worse than our sophisticated nonlinear models. Ancillary findings based on our forecasting experiments underscore the advantages of using Recursive Estimation strategies, and provide new evidence of the usefulness of yield and yield-spread variables in nonlinear prediction specification.
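
The recursive (expanding) versus rolling windowing schemes compared above can be sketched with a toy AR(1) forecasting exercise. This is a minimal illustration, not the authors' factor/shrinkage model set; the AR(1) specification, sample size, and window length are assumed values.

```python
import numpy as np

def one_step_msfe(y, first_origin, window=None):
    """One-step-ahead mean square forecast error for an OLS AR(1).

    window=None -> recursive (expanding) estimation window;
    window=m    -> rolling window of the last m observations.
    """
    errs = []
    for t in range(first_origin, len(y) - 1):
        lo = 0 if window is None else max(0, t - window)
        x, z = y[lo:t], y[lo + 1:t + 1]            # pairs (y_s, y_{s+1})
        X = np.column_stack([np.ones_like(x), x])
        beta, *_ = np.linalg.lstsq(X, z, rcond=None)
        yhat = beta[0] + beta[1] * y[t]            # forecast of y_{t+1}
        errs.append((y[t + 1] - yhat) ** 2)
    return float(np.mean(errs))

# usage: simulated AR(1) with coefficient 0.5 and unit-variance shocks
rng = np.random.default_rng(0)
e = rng.standard_normal(500)
y = np.empty(500)
y[0] = 0.0
for t in range(1, 500):
    y[t] = 0.5 * y[t - 1] + e[t]
msfe_rec = one_step_msfe(y, first_origin=100)            # recursive window
msfe_rol = one_step_msfe(y, first_origin=100, window=60)  # rolling window
```

Each forecast origin re-estimates the model with data through time t only, which is the out-of-sample discipline the paper's comparisons rest on.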

  • forecasting financial and macroeconomic variables using data reduction methods new empirical evidence
    2011
    Co-Authors: Hyun Hak Kim, Norman R Swanson
    Abstract:

    In this paper, we empirically assess the predictive accuracy of a large group of models based on the use of principal components and other shrinkage methods, including Bayesian model averaging and various bagging, boosting, LASSO and related methods. Our results suggest that model averaging does not dominate other well designed prediction model specification methods, and that using a combination of factor and other shrinkage methods often yields superior predictions. For example, when using Recursive Estimation windows, which dominate other “windowing” approaches in our experiments, prediction models constructed using pure principal component type models combined with shrinkage methods yield mean square forecast error “best” models around 70% of the time, when used to predict 11 key macroeconomic indicators at various forecast horizons. Baseline linear models (which “win” around 5% of the time) and model averaging methods (which win around 25% of the time) fare substantially worse than our sophisticated nonlinear models. Ancillary findings based on our forecasting experiments underscore the advantages of using Recursive Estimation strategies, and provide new evidence of the usefulness of yield and yield-spread variables in nonlinear prediction specification.

  • nonparametric bootstrap procedures for predictive inference based on Recursive Estimation schemes
    International Economic Review, 2007
    Co-Authors: Valentina Corradi, Norman R Swanson
    Abstract:

    Our objectives in this paper are twofold. First, we introduce block bootstrap techniques that are (first order) valid in Recursive Estimation frameworks. Thereafter, we present two examples where predictive accuracy tests are made operational using our new bootstrap procedures. In one application, we outline a consistent test for out-of-sample nonlinear Granger causality, and in the other we outline a test for selecting amongst multiple alternative forecasting models, all of which are possibly misspecified. More specifically, our examples extend the White (2000) reality check to the case of non-vanishing parameter Estimation error, and extend the integrated conditional moment tests of Bierens (1982, 1990) and Bierens and Ploberger (1997) to the case of out-of-sample prediction. In both examples, appropriate re-centering of the bootstrap score is required in order to ensure that the tests have asymptotically correct size, and the need for such re-centering is shown to arise quite naturally when testing hypotheses of predictive accuracy. In a Monte Carlo investigation, we compare the finite sample properties of our block bootstrap procedures with the parametric bootstrap due to Kilian (1999), all within the context of various encompassing and predictive accuracy tests. An empirical illustration is also discussed, in which it is found that unemployment appears to have nonlinear marginal predictive content for inflation.
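
The block resampling idea underlying these procedures can be sketched as a plain moving-block bootstrap. This is a generic sketch only: the re-centering of the bootstrap score that the paper requires is omitted, and the block length in the example is an assumed tuning value.

```python
import numpy as np

def moving_block_bootstrap(x, block_len, rng=None):
    """Draw one moving-block-bootstrap resample of a series.

    Blocks of consecutive observations are drawn with replacement
    and concatenated, preserving short-range dependence within blocks.
    """
    rng = np.random.default_rng(rng)
    n = len(x)
    n_blocks = int(np.ceil(n / block_len))
    # each block starts uniformly over all feasible positions
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    resample = np.concatenate([x[s:s + block_len] for s in starts])
    return resample[:n]  # trim to the original length

# usage: one resample of a length-100 series with blocks of 10
x = np.arange(100, dtype=float)
xb = moving_block_bootstrap(x, block_len=10, rng=0)
```

Repeating the draw many times and recomputing a test statistic on each resample yields the bootstrap distribution used for critical values.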

  • nonparametric bootstrap procedures for predictive inference based on Recursive Estimation schemes
    International Economic Review, 2007
    Co-Authors: Valentina Corradi, Norman R Swanson
    Abstract:

    We introduce block bootstrap techniques that are (first order) valid in Recursive Estimation frameworks. Thereafter, we present two examples where predictive accuracy tests are made operational using our new bootstrap procedures. In one application, we outline a consistent test for out-of-sample nonlinear Granger causality, and in the other we outline a test for selecting amongst multiple alternative forecasting models, all of which are possibly misspecified. In a Monte Carlo investigation, we compare the finite sample properties of our block bootstrap procedures with the parametric bootstrap due to Kilian (1999), within the context of encompassing and predictive accuracy tests. In the empirical illustration, it is found that unemployment has nonlinear marginal predictive content for inflation.

Francesco Ravazzolo - One of the best experts on this subject based on the ideXlab platform.

  • macroeconomic factors strike back a bayesian change point model of time varying risk exposures and premia in the u s cross section
    Social Science Research Network, 2014
    Co-Authors: Daniele Bianchi, Massimo Guidolin, Francesco Ravazzolo
    Abstract:

    This paper proposes a Bayesian Estimation framework for a typical multi-factor model with time-varying risk exposures to macroeconomic risk factors and corresponding premia to price U.S. stocks and bonds. The model assumes that risk exposures and idiosyncratic volatility follow a break-point latent process, allowing for changes at any point in time but not restricting them to change at all points. An empirical application to 40 years of U.S. data and 23 portfolios shows that the approach yields sensible results compared to previous two-step methods based on naive Recursive Estimation schemes, as well as a set of alternative model restrictions. A variance decomposition test shows that although most of the predictable variation comes from the market risk premium, a number of additional macroeconomic risks, including real output and inflation shocks, are significantly priced in the cross-section. A Bayes factor analysis decisively favors the proposed change-point model.
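
The break-point latent process assumed for the risk exposures can be sketched as a simple simulation: a factor loading that stays constant except at random break dates. The break probability and jump scale below are made-up illustrative values, not the paper's estimates.

```python
import numpy as np

def simulate_breakpoint_betas(T, p_break=0.05, sigma_jump=0.5, rng=None):
    """Simulate a loading following a break-point process.

    With probability p_break the beta jumps by a Gaussian increment;
    otherwise it is carried forward unchanged, so changes are allowed
    at any date but not forced at every date.
    """
    rng = np.random.default_rng(rng)
    beta = np.empty(T)
    beta[0] = 1.0
    for t in range(1, T):
        if rng.random() < p_break:
            beta[t] = beta[t - 1] + sigma_jump * rng.standard_normal()
        else:
            beta[t] = beta[t - 1]  # no break: exposure unchanged
    return beta

# usage: 500 periods of a break-point loading
beta = simulate_breakpoint_betas(500, rng=0)
```

The resulting path is piecewise constant, which is the key contrast with naive recursive schemes that force the exposure to drift every period.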

Roy Eagleson - One of the best experts on this subject based on the ideXlab platform.

  • Recursive Estimation of time varying motion and structure parameters
    Pattern Recognition, 1996
    Co-Authors: John L Barron, Roy Eagleson
    Abstract:

    We present a computational framework for recovering first-order motion parameters (observer direction of translation and observer rotation), second-order motion parameters (observer rotational acceleration), and relative depth maps from time-varying optical flow. We recover translation speed and acceleration in units scaled relative to the distance to the object. Our assumption is that the observer rotational motion is at most "second order": observer motion is either constant or has at most constant acceleration. We examine the effect of noise on the solution for the motion and structure parameters. This ensemble of unknowns comprises a solution to the classical "structure-and-motion from optic flow" problem. Our complete framework utilizes a method for interpreting the bilinear image velocity equation by solving simple systems of linear equations. Since our noise analysis yields uncertainty measures for each parameter, a Kalman filter is employed to incrementally integrate new measurements as each additional frame in the sequence is processed. We conclude by analysing this reduction of uncertainty over time as the system converges to a stable solution for both synthetic and real image sequences.
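
The incremental-integration step can be sketched as a textbook Kalman measurement update. This is a generic sketch, not the authors' full structure-from-motion pipeline, and the scalar example values are assumed for illustration.

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """Single Kalman measurement update.

    Fuses the current estimate (x, P) with a new measurement z under
    the model z = H x + v, v ~ N(0, R); each new frame's measurements
    shrink the posterior covariance P.
    """
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x + K @ (z - H @ x)         # corrected state estimate
    P_new = (np.eye(len(x)) - K @ H) @ P  # reduced uncertainty
    return x_new, P_new

# usage: scalar state with prior N(0, 1), measurement 1.0 with noise variance 1
x1, P1 = kalman_update(np.array([0.0]), np.array([[1.0]]),
                       np.array([1.0]), np.array([[1.0]]), np.array([[1.0]]))
```

With equal prior and measurement variances the update splits the difference, and the posterior variance halves, the same uncertainty-reduction-over-frames behaviour the abstract analyses.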