Ridge Regression

The Experts below are selected from a list of 249 Experts worldwide ranked by ideXlab platform

Rolf Sundberg - One of the best experts on this subject based on the ideXlab platform.

  • Continuum Regression and Ridge Regression
    Journal of the Royal Statistical Society: Series B (Methodological), 1993
    Co-Authors: Rolf Sundberg
    Abstract:

    We demonstrate the close relationship between first-factor continuum regression and standard ridge regression. The difference is that continuum regression inserts a scalar compensation factor for the part of the shrinkage in ridge regression that has no connection with tendencies towards collinearity. We interpret this to mean that first-factor continuum regression is preferable in principle to ridge regression if we want protection against near collinearity but do not admit shrinkage as a general principle. Furthermore, our experience indicates that with first-factor continuum regression we can obtain predictors that are at least as mean-squared-error efficient as with ridge regression, but with less sensitivity to the choice of ridge constant.
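The shrinkage that continuum regression's scalar compensation factor targets can be seen in a minimal plain-NumPy sketch of standard ridge regression; the data, seed and ridge constant below are all illustrative.

```python
import numpy as np

# Synthetic near-collinear design.
rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.standard_normal((n, p))
X[:, 2] = X[:, 1] + 0.01 * rng.standard_normal(n)  # two nearly collinear columns
beta_true = np.array([1.0, 2.0, -1.0])
y = X @ beta_true + 0.1 * rng.standard_normal(n)

def ridge(X, y, lam):
    """Closed-form ridge estimate: (X'X + lam*I)^{-1} X'y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

beta_ols = ridge(X, y, 0.0)     # ordinary least squares (lam = 0)
beta_ridge = ridge(X, y, 10.0)  # ridge with a positive ridge constant

# Ridge shrinks the whole coefficient vector, including directions with
# no connection to the collinearity; that uniform part of the shrinkage
# is what continuum regression compensates for.
print(np.linalg.norm(beta_ridge) < np.linalg.norm(beta_ols))  # True
```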

Michael W. Mahoney - One of the best experts on this subject based on the ideXlab platform.

  • Sketched Ridge Regression
    Journal of Machine Learning Research, 2017
    Co-Authors: Shusen Wang, Alex Gittens, Michael W. Mahoney
    Abstract:

    We address the statistical and optimization impacts of the classical sketch and Hessian sketch used to approximately solve the matrix ridge regression (MRR) problem. Prior research has quantified t...
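The "classical sketch" mentioned above can be illustrated with a minimal NumPy example: compress the n rows of (X, y) with a random sketching matrix S and solve the much smaller ridge problem on (SX, Sy). A Gaussian sketch is used here for simplicity; the sizes, seed and ridge constant are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, s = 2000, 10, 400  # s = sketch size, with n >> s >> p
X = rng.standard_normal((n, p))
y = X @ rng.standard_normal(p) + rng.standard_normal(n)
lam = 1.0

def ridge(X, y, lam):
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

# Classical sketch: a scaled Gaussian matrix with E[S'S] = I.
S = rng.standard_normal((s, n)) / np.sqrt(s)

beta_exact = ridge(X, y, lam)           # full n-row problem
beta_sketch = ridge(S @ X, S @ y, lam)  # s-row sketched problem

# The sketched solution approximates the exact one at a fraction of the cost.
rel_err = np.linalg.norm(beta_sketch - beta_exact) / np.linalg.norm(beta_exact)
```

The trade-off the paper analyzes is visible here: the sketched solve touches only s rows instead of n, at the price of a controlled approximation error in the solution.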

Magne Aldrin - One of the best experts on this subject based on the ideXlab platform.

  • Length modified Ridge Regression
    Computational Statistics & Data Analysis, 1997
    Co-Authors: Magne Aldrin
    Abstract:

    Biased regression methods may improve considerably on ordinary least squares regression with few or noisy data, or when the predictor variables are highly collinear. In the present work, I present a new biased method that modifies the ordinary least squares estimate by adjusting each element of the estimated coefficient vector. The adjusting factors are found by minimizing a measure of prediction error. However, the optimal adjusting factors depend on the unknown coefficient vector as well as the variance of the noise, so in practice these are replaced by preliminary estimates. The final estimate of the coefficient vector has the same direction as the preliminary estimate, but its length is modified. Ridge regression is used as the principal method to find the preliminary estimate, and the method is therefore called length modified ridge regression. In addition, length modified principal components regression is considered. The prediction performance of the methods is compared to that of other regression methods (ridge, James-Stein, partial least squares, principal components and variable subset selection) in a simulation study. Of all the methods considered, length modified ridge regression shows the best overall behaviour. The improvement over ridge regression is moderate but significant, especially when the data are few and noisy.
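The length-modification idea can be sketched in a few lines: take a preliminary ridge estimate, keep its direction, and rescale its length by a scalar chosen to reduce estimated prediction error. The plug-in factor below (a James-Stein-style shrinker built from the residual variance) is an illustrative stand-in for the paper's adjusting factors, not the author's exact estimator.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 40, 5
X = rng.standard_normal((n, p))
y = X @ rng.standard_normal(p) + rng.standard_normal(n)

# Preliminary estimate: ordinary ridge regression.
lam = 1.0
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Preliminary noise-variance estimate from the ridge residuals.
resid = y - X @ beta_ridge
sigma2_hat = resid @ resid / (n - p)

# Length-adjusting scalar (illustrative James-Stein-style plug-in).
yhat = X @ beta_ridge
c = max(0.0, 1.0 - p * sigma2_hat / (yhat @ yhat))

# Final estimate: same direction as the ridge estimate, modified length.
beta_lm = c * beta_ridge
```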

Dick Van Dijk - One of the best experts on this subject based on the ideXlab platform.

  • Nonlinear forecasting with many predictors using kernel Ridge Regression
    International Journal of Forecasting, 2016
    Co-Authors: Peter Exterkate, Patrick J. F. Groenen, Christiaan Heij, Dick Van Dijk
    Abstract:

    This paper puts forward kernel ridge regression as an approach for forecasting with many predictors that are related to the target variable nonlinearly. In kernel ridge regression, the observed predictor variables are mapped nonlinearly into a high-dimensional space, where estimation of the predictive regression model is based on a shrinkage estimator in order to avoid overfitting. We extend the kernel ridge regression methodology to enable its use for economic time series forecasting, by including lags of the dependent variable or other individual variables as predictors, as is typically desired in macroeconomic and financial applications. Both Monte Carlo simulations and an empirical application to various key measures of real economic activity confirm that kernel ridge regression can produce more accurate forecasts than traditional linear and nonlinear methods for dealing with many predictors based on principal components.
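The mechanics described above reduce to a small linear-algebra recipe: a Gaussian (RBF) kernel implicitly maps the predictors into a high-dimensional feature space, and shrinkage estimation there amounts to solving an n x n system. The sketch below is a generic from-scratch kernel ridge regression; the data, kernel width and ridge constant are illustrative and not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 100, 5
X = rng.standard_normal((n, p))
# Target depends nonlinearly on the predictors.
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.standard_normal(n)

def rbf_kernel(A, B, gamma=0.1):
    """Gaussian kernel: implicit nonlinear map into a high-dimensional space."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

# Shrinkage estimation in feature space: solve (K + lam*I) alpha = y.
lam = 1.0
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(n), y)

# Forecasts for new observations use kernel evaluations against the sample.
X_new = rng.standard_normal((10, p))
y_pred = rbf_kernel(X_new, X) @ alpha
```

For time-series use as in the paper, the rows of `X` would include lags of the dependent variable and other predictors.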

  • Nonlinear Forecasting With Many Predictors Using Kernel Ridge Regression
    2013
    Co-Authors: Peter Exterkate, Patrick J. F. Groenen, Christiaan Heij, Dick Van Dijk
    Abstract:

    This paper puts forward kernel ridge regression as an approach for forecasting with many predictors that are related nonlinearly to the target variable. In kernel ridge regression, the observed predictor variables are mapped nonlinearly into a high-dimensional space, where estimation of the predictive regression model is based on a shrinkage estimator to avoid overfitting. We extend the kernel ridge regression methodology to enable its use for economic time-series forecasting, by including lags of the dependent variable or other individual variables as predictors, as typically desired in macroeconomic and financial applications. Monte Carlo simulations as well as an empirical application to various key measures of real economic activity confirm that kernel ridge regression can produce more accurate forecasts than traditional linear and nonlinear methods for dealing with many predictors based on principal component regression.

  • Nonlinear Forecasting with Many Predictors Using Kernel Ridge Regression
    SSRN Electronic Journal, 2011
    Co-Authors: Peter Exterkate, Patrick J. F. Groenen, Christiaan Heij, Dick Van Dijk
    Abstract:

    This paper puts forward kernel ridge regression as an approach for forecasting with many predictors that are related nonlinearly to the target variable. In kernel ridge regression, the observed predictor variables are mapped nonlinearly into a high-dimensional space, where estimation of the predictive regression model is based on a shrinkage estimator to avoid overfitting. We extend the kernel ridge regression methodology to enable its use for economic time-series forecasting, by including lags of the dependent variable or other individual variables as predictors, as is typically desired in macroeconomic and financial applications. Monte Carlo simulations as well as an empirical application to various key measures of real economic activity confirm that kernel ridge regression can produce more accurate forecasts than traditional linear methods for dealing with many predictors based on principal component regression.
