Multivariate Regression


The experts below were selected from a list of 309,735 experts worldwide, ranked by the ideXlab platform.

Han Liu - One of the best experts on this subject based on the ideXlab platform.

  • Calibrated Multivariate Regression with application to neural semantic basis discovery
    Journal of Machine Learning Research (JMLR), 2015
    Co-Authors: Han Liu, Lie Wang, Tuo Zhao
    Abstract:

    We propose a calibrated multivariate regression method named CMR for fitting high dimensional multivariate regression models. Compared with existing methods, CMR calibrates regularization for each regression task with respect to its noise level, so that it simultaneously attains improved finite-sample performance and tuning insensitivity. Theoretically, we provide sufficient conditions under which CMR achieves the optimal rate of convergence in parameter estimation. Computationally, we propose an efficient smoothed proximal gradient algorithm with a worst-case numerical rate of convergence of $O(1/\epsilon)$, where $\epsilon$ is a pre-specified accuracy of the objective function value. We conduct thorough numerical simulations to illustrate that CMR consistently outperforms other high dimensional multivariate regression methods. We also apply CMR to solve a brain activity prediction problem and find that it is as competitive as a handcrafted model created by human experts. The R package camel implementing the proposed method is available on the Comprehensive R Archive Network at http://cran.r-project.org/web/packages/camel/.

  • NIPS - Multivariate Regression with Calibration
    Advances in neural information processing systems, 2014
    Co-Authors: Han Liu, Lie Wang, Tuo Zhao
    Abstract:

    We propose a new method named calibrated multivariate regression (CMR) for fitting high dimensional multivariate regression models. Compared to existing methods, CMR calibrates the regularization for each regression task with respect to its noise level, so that it is simultaneously tuning insensitive and achieves improved finite-sample performance. Computationally, we develop an efficient smoothed proximal gradient algorithm with a worst-case iteration complexity of $O(1/\epsilon)$, where $\epsilon$ is a pre-specified numerical accuracy. Theoretically, we prove that CMR achieves the optimal rate of convergence in parameter estimation. We illustrate the usefulness of CMR by thorough numerical simulations and show that CMR consistently outperforms other high dimensional multivariate regression methods. We also apply CMR to a brain activity prediction problem and find that CMR is as competitive as the handcrafted model created by human experts.

  • Multivariate Regression with Calibration
    arXiv: Machine Learning, 2013
    Co-Authors: Han Liu, Lie Wang, Tuo Zhao
    Abstract:

    We propose a new method named calibrated multivariate regression (CMR) for fitting high dimensional multivariate regression models. Compared to existing methods, CMR calibrates the regularization for each regression task with respect to its noise level, so that it is simultaneously tuning insensitive and achieves improved finite-sample performance. Computationally, we develop an efficient smoothed proximal gradient algorithm with a worst-case numerical rate of convergence of $O(1/\epsilon)$, where $\epsilon$ is a pre-specified accuracy. Theoretically, we prove that CMR achieves the optimal rate of convergence in parameter estimation. We illustrate the usefulness of CMR by thorough numerical simulations and show that CMR consistently outperforms existing multivariate regression methods. We also apply CMR to a brain activity prediction problem and find that CMR even outperforms the handcrafted models created by human experts.

  • Calibrated Multivariate Regression with Application to Neural Semantic Basis Discovery
    arXiv: Machine Learning, 2013
    Co-Authors: Han Liu, Lie Wang, Tuo Zhao
    Abstract:

    We propose a calibrated multivariate regression method named CMR for fitting high dimensional multivariate regression models. Compared with existing methods, CMR calibrates regularization for each regression task with respect to its noise level, so that it simultaneously attains improved finite-sample performance and tuning insensitivity. Theoretically, we provide sufficient conditions under which CMR achieves the optimal rate of convergence in parameter estimation. Computationally, we propose an efficient smoothed proximal gradient algorithm with a worst-case numerical rate of convergence of $O(1/\epsilon)$, where $\epsilon$ is a pre-specified accuracy of the objective function value. We conduct thorough numerical simulations to illustrate that CMR consistently outperforms other high dimensional multivariate regression methods. We also apply CMR to solve a brain activity prediction problem and find that it is as competitive as a handcrafted model created by human experts. The R package camel implementing the proposed method is available on the Comprehensive R Archive Network at http://cran.r-project.org/web/packages/camel/. (A formal sketch of the CMR objective follows this list.)
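
The abstracts above describe the CMR program only in words. As a minimal sketch, and assuming the standard notation $Y \in \mathbb{R}^{n \times K}$ (responses), $X \in \mathbb{R}^{n \times p}$ (design), $B \in \mathbb{R}^{p \times K}$ (coefficients), and tuning parameter $\lambda$ (none of which is quoted from the papers), the calibrated objective pairs an unsquared column-wise $\ell_2$ loss with a row-wise group penalty:

```latex
% Hedged sketch of the CMR objective; notation is assumed, not quoted.
\[
  \widehat{B}
  = \operatorname*{arg\,min}_{B \in \mathbb{R}^{p \times K}}
    \underbrace{\sum_{k=1}^{K} \bigl\lVert Y_{\cdot k} - X B_{\cdot k} \bigr\rVert_2}_{\text{calibrated loss } \lVert Y - XB \rVert_{2,1}}
  \;+\; \lambda\,
    \underbrace{\sum_{j=1}^{p} \bigl\lVert B_{j \cdot} \bigr\rVert_2}_{\text{group penalty } \lVert B \rVert_{1,2}}
\]
```

Because each task's residual enters through an unsquared $\ell_2$ norm, the subgradient of the loss normalizes every residual column by its magnitude, so the effective penalty level adapts to each task's noise; this is the calibration and tuning insensitivity the abstracts refer to.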

Tuo Zhao - One of the best experts on this subject based on the ideXlab platform.

  • NIPS - Multivariate Regression with Calibration
    Advances in neural information processing systems, 2014
    Co-Authors: Han Liu, Lie Wang, Tuo Zhao
    Abstract:

    We propose a new method named calibrated multivariate regression (CMR) for fitting high dimensional multivariate regression models. Compared to existing methods, CMR calibrates the regularization for each regression task with respect to its noise level, so that it is simultaneously tuning insensitive and achieves improved finite-sample performance. Computationally, we develop an efficient smoothed proximal gradient algorithm with a worst-case iteration complexity of $O(1/\epsilon)$, where $\epsilon$ is a pre-specified numerical accuracy. Theoretically, we prove that CMR achieves the optimal rate of convergence in parameter estimation. We illustrate the usefulness of CMR by thorough numerical simulations and show that CMR consistently outperforms other high dimensional multivariate regression methods. We also apply CMR to a brain activity prediction problem and find that CMR is as competitive as the handcrafted model created by human experts.

  • Multivariate Regression with Calibration
    arXiv: Machine Learning, 2013
    Co-Authors: Han Liu, Lie Wang, Tuo Zhao
    Abstract:

    We propose a new method named calibrated multivariate regression (CMR) for fitting high dimensional multivariate regression models. Compared to existing methods, CMR calibrates the regularization for each regression task with respect to its noise level, so that it is simultaneously tuning insensitive and achieves improved finite-sample performance. Computationally, we develop an efficient smoothed proximal gradient algorithm with a worst-case numerical rate of convergence of $O(1/\epsilon)$, where $\epsilon$ is a pre-specified accuracy. Theoretically, we prove that CMR achieves the optimal rate of convergence in parameter estimation. We illustrate the usefulness of CMR by thorough numerical simulations and show that CMR consistently outperforms existing multivariate regression methods. We also apply CMR to a brain activity prediction problem and find that CMR even outperforms the handcrafted models created by human experts.

  • Calibrated Multivariate Regression with Application to Neural Semantic Basis Discovery
    arXiv: Machine Learning, 2013
    Co-Authors: Han Liu, Lie Wang, Tuo Zhao
    Abstract:

    We propose a calibrated multivariate regression method named CMR for fitting high dimensional multivariate regression models. Compared with existing methods, CMR calibrates regularization for each regression task with respect to its noise level, so that it simultaneously attains improved finite-sample performance and tuning insensitivity. Theoretically, we provide sufficient conditions under which CMR achieves the optimal rate of convergence in parameter estimation. Computationally, we propose an efficient smoothed proximal gradient algorithm with a worst-case numerical rate of convergence of $O(1/\epsilon)$, where $\epsilon$ is a pre-specified accuracy of the objective function value. We conduct thorough numerical simulations to illustrate that CMR consistently outperforms other high dimensional multivariate regression methods. We also apply CMR to solve a brain activity prediction problem and find that it is as competitive as a handcrafted model created by human experts. The R package camel implementing the proposed method is available on the Comprehensive R Archive Network at http://cran.r-project.org/web/packages/camel/. (An algorithmic sketch of the smoothed proximal gradient method follows this list.)
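
The smoothed proximal gradient algorithm these abstracts mention is straightforward to sketch. The Python below is a hedged reconstruction, not the authors' implementation (that lives in the camel R package): the column-wise $\ell_2$ loss is smoothed with Nesterov's technique, the row-group penalty enters through its proximal operator, and Nesterov acceleration gives the quoted $O(1/\epsilon)$ behavior when the smoothing parameter is taken on the order of the target accuracy. The function name cmr_spg, the parameter mu, and all defaults are illustrative assumptions.

```python
import numpy as np

def cmr_spg(X, Y, lam, mu=1e-2, max_iter=500, tol=1e-6):
    """Smoothed (accelerated) proximal gradient sketch for CMR.

    Approximately minimizes ||Y - X B||_{2,1} + lam * ||B||_{1,2} by smoothing
    each column-wise L2 loss term with parameter mu. Illustrative only.
    """
    B = np.zeros((X.shape[1], Y.shape[1]))
    W = B.copy()                             # extrapolation point (acceleration)
    t = 1.0
    step = mu / np.linalg.norm(X, 2) ** 2    # 1/L, with L = ||X||_2^2 / mu
    for _ in range(max_iter):
        R = Y - X @ W
        # Gradient of the smoothed loss: each residual column is scaled by
        # 1 / max(||r_k||_2, mu); this per-task normalization is the calibration.
        grad = -X.T @ (R / np.maximum(np.linalg.norm(R, axis=0), mu))
        B_half = W - step * grad
        # Proximal step for lam * ||B||_{1,2}: group soft-threshold the rows.
        row_norms = np.linalg.norm(B_half, axis=1, keepdims=True)
        B_new = np.maximum(0.0, 1.0 - step * lam / np.maximum(row_norms, 1e-12)) * B_half
        # Nesterov momentum update.
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        W = B_new + ((t - 1.0) / t_new) * (B_new - B)
        if np.linalg.norm(B_new - B) <= tol * max(1.0, np.linalg.norm(B)):
            return B_new
        B, t = B_new, t_new
    return B
```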

Lie Wang - One of the best experts on this subject based on the ideXlab platform.

  • Calibrated Multivariate Regression with application to neural semantic basis discovery
    Journal of Machine Learning Research (JMLR), 2015
    Co-Authors: Han Liu, Lie Wang, Tuo Zhao
    Abstract:

    We propose a calibrated multivariate regression method named CMR for fitting high dimensional multivariate regression models. Compared with existing methods, CMR calibrates regularization for each regression task with respect to its noise level, so that it simultaneously attains improved finite-sample performance and tuning insensitivity. Theoretically, we provide sufficient conditions under which CMR achieves the optimal rate of convergence in parameter estimation. Computationally, we propose an efficient smoothed proximal gradient algorithm with a worst-case numerical rate of convergence of $O(1/\epsilon)$, where $\epsilon$ is a pre-specified accuracy of the objective function value. We conduct thorough numerical simulations to illustrate that CMR consistently outperforms other high dimensional multivariate regression methods. We also apply CMR to solve a brain activity prediction problem and find that it is as competitive as a handcrafted model created by human experts. The R package camel implementing the proposed method is available on the Comprehensive R Archive Network at http://cran.r-project.org/web/packages/camel/.

  • NIPS - Multivariate Regression with Calibration
    Advances in neural information processing systems, 2014
    Co-Authors: Han Liu, Lie Wang, Tuo Zhao
    Abstract:

    We propose a new method named calibrated multivariate regression (CMR) for fitting high dimensional multivariate regression models. Compared to existing methods, CMR calibrates the regularization for each regression task with respect to its noise level, so that it is simultaneously tuning insensitive and achieves improved finite-sample performance. Computationally, we develop an efficient smoothed proximal gradient algorithm with a worst-case iteration complexity of $O(1/\epsilon)$, where $\epsilon$ is a pre-specified numerical accuracy. Theoretically, we prove that CMR achieves the optimal rate of convergence in parameter estimation. We illustrate the usefulness of CMR by thorough numerical simulations and show that CMR consistently outperforms other high dimensional multivariate regression methods. We also apply CMR to a brain activity prediction problem and find that CMR is as competitive as the handcrafted model created by human experts.

  • Multivariate Regression with Calibration
    arXiv: Machine Learning, 2013
    Co-Authors: Han Liu, Lie Wang, Tuo Zhao
    Abstract:

    We propose a new method named calibrated multivariate regression (CMR) for fitting high dimensional multivariate regression models. Compared to existing methods, CMR calibrates the regularization for each regression task with respect to its noise level, so that it is simultaneously tuning insensitive and achieves improved finite-sample performance. Computationally, we develop an efficient smoothed proximal gradient algorithm with a worst-case numerical rate of convergence of $O(1/\epsilon)$, where $\epsilon$ is a pre-specified accuracy. Theoretically, we prove that CMR achieves the optimal rate of convergence in parameter estimation. We illustrate the usefulness of CMR by thorough numerical simulations and show that CMR consistently outperforms existing multivariate regression methods. We also apply CMR to a brain activity prediction problem and find that CMR even outperforms the handcrafted models created by human experts.

  • Calibrated Multivariate Regression with Application to Neural Semantic Basis Discovery
    arXiv: Machine Learning, 2013
    Co-Authors: Han Liu, Lie Wang, Tuo Zhao
    Abstract:

    We propose a calibrated multivariate regression method named CMR for fitting high dimensional multivariate regression models. Compared with existing methods, CMR calibrates regularization for each regression task with respect to its noise level, so that it simultaneously attains improved finite-sample performance and tuning insensitivity. Theoretically, we provide sufficient conditions under which CMR achieves the optimal rate of convergence in parameter estimation. Computationally, we propose an efficient smoothed proximal gradient algorithm with a worst-case numerical rate of convergence of $O(1/\epsilon)$, where $\epsilon$ is a pre-specified accuracy of the objective function value. We conduct thorough numerical simulations to illustrate that CMR consistently outperforms other high dimensional multivariate regression methods. We also apply CMR to solve a brain activity prediction problem and find that it is as competitive as a handcrafted model created by human experts. The R package camel implementing the proposed method is available on the Comprehensive R Archive Network at http://cran.r-project.org/web/packages/camel/.

Michael I Jordan - One of the best experts on this subject based on the ideXlab platform.

  • Union support recovery in high dimensional Multivariate Regression
    2008
    Co-Authors: Guillaume Obozinski, Martin J Wainwright, Michael I Jordan
    Abstract:

    In multivariate regression, a $K$-dimensional response vector is regressed upon a common set of $p$ covariates, with a matrix $B^*\in\mathbb{R}^{p\times K}$ of regression coefficients. We study the behavior of the multivariate group Lasso, in which block regularization based on the $\ell_1/\ell_2$ norm is used for support union recovery, or recovery of the set of $s$ rows for which $B^*$ is nonzero. Under high-dimensional scaling, we show that the multivariate group Lasso exhibits a threshold for the recovery of the exact row pattern with high probability over the random design and noise that is specified by the sample complexity parameter $\theta(n,p,s):=n/[2\psi(B^*)\log(p-s)]$. Here $n$ is the sample size, and $\psi(B^*)$ is a sparsity-overlap function measuring a combination of the sparsities and overlaps of the $K$ regression coefficient vectors that constitute the model. We prove that the multivariate group Lasso succeeds for problem sequences $(n,p,s)$ such that $\theta(n,p,s)$ exceeds a critical level $\theta_u$, and fails for sequences such that $\theta(n,p,s)$ lies below a critical level $\theta_{\ell}$. For the special case of the standard Gaussian ensemble, we show that $\theta_{\ell}=\theta_u$, so that the characterization is sharp. The sparsity-overlap function $\psi(B^*)$ reveals that, if the design is uncorrelated on the active rows, $\ell_1/\ell_2$ regularization for multivariate regression never harms performance relative to an ordinary Lasso approach and can yield substantial improvements in sample complexity (up to a factor of $K$) when the coefficient vectors are suitably orthogonal. For more general designs, it is possible for the ordinary Lasso to outperform the multivariate group Lasso. We complement our analysis with simulations that demonstrate the sharpness of our theoretical results, even for relatively small problems.

  • Support union recovery in high dimensional Multivariate Regression
    arXiv: Machine Learning, 2008
    Co-Authors: Guillaume Obozinski, Martin J Wainwright, Michael I Jordan
    Abstract:

    In multivariate regression, a $K$-dimensional response vector is regressed upon a common set of $p$ covariates, with a matrix $B^*\in\mathbb{R}^{p\times K}$ of regression coefficients. We study the behavior of the multivariate group Lasso, in which block regularization based on the $\ell_1/\ell_2$ norm is used for support union recovery, or recovery of the set of $s$ rows for which $B^*$ is nonzero. Under high-dimensional scaling, we show that the multivariate group Lasso exhibits a threshold for the recovery of the exact row pattern with high probability over the random design and noise that is specified by the sample complexity parameter $\theta(n,p,s):=n/[2\psi(B^*)\log(p-s)]$. Here $n$ is the sample size, and $\psi(B^*)$ is a sparsity-overlap function measuring a combination of the sparsities and overlaps of the $K$ regression coefficient vectors that constitute the model. We prove that the multivariate group Lasso succeeds for problem sequences $(n,p,s)$ such that $\theta(n,p,s)$ exceeds a critical level $\theta_u$, and fails for sequences such that $\theta(n,p,s)$ lies below a critical level $\theta_{\ell}$. For the special case of the standard Gaussian ensemble, we show that $\theta_{\ell}=\theta_u$, so that the characterization is sharp. The sparsity-overlap function $\psi(B^*)$ reveals that, if the design is uncorrelated on the active rows, $\ell_1/\ell_2$ regularization for multivariate regression never harms performance relative to an ordinary Lasso approach and can yield substantial improvements in sample complexity (up to a factor of $K$) when the coefficient vectors are suitably orthogonal. For more general designs, it is possible for the ordinary Lasso to outperform the multivariate group Lasso. We complement our analysis with simulations that demonstrate the sharpness of our theoretical results, even for relatively small problems.

  • Union support recovery in high dimensional Multivariate Regression
    Allerton Conference on Communication Control and Computing, 2008
    Co-Authors: Guillaume Obozinski, Martin J Wainwright, Michael I Jordan
    Abstract:

    In the problem of multivariate regression, a $K$-dimensional response vector is regressed upon a common set of $p$ covariates, with a matrix $B^*\in\mathbb{R}^{p\times K}$ of regression coefficients. We study the behavior of the group Lasso using $\ell_1/\ell_2$ regularization for the union support problem, meaning that the set of $s$ rows for which $B^*$ is non-zero is recovered exactly. Studying this problem under high-dimensional scaling, we show that the group Lasso recovers the exact row pattern with high probability over the random design and noise for scalings of $(n, p, s)$ such that the sample complexity parameter given by $\theta(n,p,s):=n/[2\psi(B^*)\log(p-s)]$ exceeds a critical threshold. Here $n$ is the sample size, $p$ is the ambient dimension of the regression model, $s$ is the number of non-zero rows, and $\psi(B^*)$ is a sparsity-overlap function that measures a combination of the sparsities and overlaps of the $K$ regression coefficient vectors that constitute the model. This sparsity-overlap function reveals that, if the design is uncorrelated on the active rows, block $\ell_1/\ell_2$ regularization for multivariate regression never harms performance relative to an ordinary Lasso approach, and can yield substantial improvements in sample complexity (up to a factor of $K$) when the regression vectors are suitably orthogonal. For more general designs, it is possible for the ordinary Lasso to outperform the group Lasso. (A proximal gradient sketch of this estimator follows this list.)
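
To make the analyzed estimator concrete, here is a minimal proximal gradient sketch of the multivariate group Lasso under the usual least-squares formulation; the function name, step size rule, and tolerances are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def mv_group_lasso(X, Y, lam, max_iter=1000, tol=1e-8):
    """Proximal gradient sketch for the multivariate group Lasso:

        min_B  (1 / (2 n)) * ||Y - X B||_F^2  +  lam * sum_j ||B[j, :]||_2

    The l1/l2 penalty's proximal operator soft-thresholds whole rows of B,
    so entire rows are zeroed out, which is exactly the row-support (support
    union) structure studied in the abstract. Illustrative only.
    """
    n = X.shape[0]
    B = np.zeros((X.shape[1], Y.shape[1]))
    step = n / np.linalg.norm(X, 2) ** 2    # 1/L for the smooth squared loss
    for _ in range(max_iter):
        grad = X.T @ (X @ B - Y) / n
        B_half = B - step * grad
        # Row-wise group soft-thresholding (prox of the l1/l2 penalty).
        row_norms = np.linalg.norm(B_half, axis=1, keepdims=True)
        B_new = np.maximum(0.0, 1.0 - step * lam / np.maximum(row_norms, 1e-12)) * B_half
        if np.linalg.norm(B_new - B) <= tol:
            return B_new
        B = B_new
    return B

# The estimated support union is the set of rows with nonzero norm:
# support = np.flatnonzero(np.linalg.norm(B_hat, axis=1) > 1e-6)
```

The recovery results above then ask when this set of rows coincides with the true $s$-row support of $B^*$, with the answer governed by the sample complexity parameter $\theta(n,p,s)$.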
