The Experts below are selected from a list of 6,609 Experts worldwide, ranked by the ideXlab platform.
R.P. Leland - One of the best experts on this subject based on the ideXlab platform.
-
Gradient of the log-likelihood ratio for infinite dimensional stochastic systems
IEEE Transactions on Automatic Control, 2000. Co-Authors: R.P. Leland. Abstract: Using a Covariance Operator approach, we derive an expression for the log-likelihood ratio gradient for system parameter estimation in continuous-time infinite-dimensional stochastic systems. The gradient formula includes the smoother estimates and derivatives of system Operators, with no derivatives of estimates or Covariance Operators. The unbounded Operators typically found in partial differential equations limit how much the gradient formula can be simplified. A random heat equation is considered.
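The random heat equation mentioned as the paper's example can be illustrated with a minimal finite-difference simulation. This is only a toy sketch: the grid sizes, diffusion parameter `theta` (playing the role of the unknown system parameter), and the simplistic noise term are hypothetical choices, not values or discretizations from the paper.

```python
import numpy as np

# Toy finite-difference simulation of a 1-D random (stochastic) heat equation
# u_t = theta * u_xx + noise on (0, 1) with zero boundary conditions.
rng = np.random.default_rng(0)
nx, nt = 50, 2000                # interior grid points, time steps (illustrative)
dx, dt = 1.0 / (nx + 1), 1e-4
theta = 0.5                      # hypothetical diffusion parameter to be estimated
u = np.zeros(nx)                 # interior values; u = 0 at both boundaries

for _ in range(nt):
    lap = np.roll(u, -1) - 2 * u + np.roll(u, 1)
    lap[0] = u[1] - 2 * u[0]     # enforce Dirichlet boundaries explicitly
    lap[-1] = u[-2] - 2 * u[-1]
    # Explicit Euler step plus additive Gaussian forcing (simplified noise model)
    u = u + dt * theta * lap / dx**2 + np.sqrt(dt) * rng.normal(size=nx)

print(u.shape)  # (50,)
```

The explicit scheme is stable here since dt * theta / dx**2 is well below 1/2; parameter estimation from such a simulated path is where the gradient formula would enter.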
-
The log-likelihood gradient for infinite dimensional stochastic systems
Proceedings of the 38th IEEE Conference on Decision and Control (Cat. No.99CH36304), 1999. Co-Authors: R.P. Leland. Abstract: Using a Covariance Operator approach, we derive an explicit expression for the log-likelihood ratio gradient for system parameter estimation in continuous-time infinite-dimensional stochastic systems. The gradient formula includes the smoother estimates and derivatives of system Operators, with no derivatives of estimates or Covariance Operators. The unbounded Operators typically found in partial differential equations place limitations on the expression for the gradient. An example of a random heat equation is considered.
Marco Shum - One of the best experts on this subject based on the ideXlab platform.
-
Inference for the Lagged Cross-Covariance Operator Between Functional Time Series
Journal of Time Series Analysis, 2019. Co-Authors: Gregory Rice, Marco Shum. Abstract: When considering two or more time series of functional data objects, for instance those derived from densely observed intraday stock price data of several companies, the empirical cross-Covariance Operator is of fundamental importance due to its role in functional lagged regression and exploratory data analysis. Despite its relevance, statistical procedures for measuring the significance of such estimators are currently undeveloped. We present methodology based on a functional central limit theorem for conducting statistical inference for the cross-Covariance Operator estimated between two stationary, weakly dependent functional time series. Specifically, we consider testing the null hypothesis that the two series possess a specified cross-Covariance structure at a given lag. Since this test assumes that the series are jointly stationary, we also develop a change-point detection procedure to validate this assumption, which is of independent interest. The most imposing technical hurdle in implementing the proposed tests involves estimating the spectrum of a high-dimensional spectral density Operator at frequency zero. We propose a simple dimension reduction procedure based on functional principal component analysis to achieve this, which is shown to perform well in a simulation study. We illustrate the proposed methodology with an application to densely observed intraday price data of stocks listed on the New York Stock Exchange.
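A minimal sketch of the empirical lagged cross-Covariance estimator the abstract builds on, for curves discretized on a common grid. The function name and synthetic data are illustrative, and this is only the plain estimator, not the paper's full inference procedure.

```python
import numpy as np

def lagged_cross_cov(X, Y, h):
    """Empirical lagged cross-Covariance Operator on a grid: X and Y are
    (N, n_grid) arrays of two functional time series evaluated on a common
    grid; the result is the (n_grid, n_grid) kernel estimating
    Cov(X_i(s), Y_{i+h}(t))."""
    N = X.shape[0]
    Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
    if h >= 0:
        return Xc[:N - h].T @ Yc[h:] / N
    return Xc[-h:].T @ Yc[:N + h] / N

# Illustrative check on synthetic curves: Y reproduces X shifted by one
# step, so the operator at the matching lag has a large trace.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 25))
Y = np.vstack([X[1:], rng.normal(size=(1, 25))])   # Y_i = X_{i+1}
print(np.trace(lagged_cross_cov(X, Y, h=-1)))      # large (around 25)
print(np.trace(lagged_cross_cov(X, Y, h=1)))       # near zero
```

Assessing the significance of such an estimate is exactly where the paper's functional central limit theorem and spectral density estimation come in.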
-
Inference for the cross-Covariance Operator of stationary functional time series
arXiv: Statistics Theory, 2017. Co-Authors: Gregory Rice, Marco Shum. Abstract: When considering two or more time series of functions or curves, for instance those derived from densely observed intraday stock price data of several companies, the empirical cross-Covariance Operator is of fundamental importance due to its role in functional lagged regression and exploratory data analysis. Despite its relevance, statistical procedures for measuring the significance of such estimators are undeveloped. We present methodology based on a functional central limit theorem for conducting statistical inference for the cross-Covariance Operator estimated between two stationary, weakly dependent functional time series. Specifically, we consider testing the null hypothesis that two series possess a specified cross-Covariance structure at a given lag. Since this test assumes that the series are jointly stationary, we also develop a change-point detection procedure to validate this assumption, which is of independent interest. The most imposing technical hurdle in implementing the proposed tests involves estimating the spectrum of a high-dimensional spectral density Operator at frequency zero. We propose a simple dimension reduction procedure based on functional PCA to achieve this, which is shown to perform well in a small simulation study. We illustrate the proposed methodology with an application to densely observed intraday price data of stocks listed on the NYSE.
Gregory Rice - One of the best experts on this subject based on the ideXlab platform.
-
Structural break analysis for spectrum and trace of Covariance Operators
Environmetrics, 2019. Co-Authors: Alexander Aue, Gregory Rice, Ozan Sönmez. Abstract: This paper deals with analyzing structural breaks in the Covariance Operator of sequentially observed functional data. For this purpose, procedures are developed to segment an observed stretch of curves into periods for which second-order stationarity may be reasonably assumed. The proposed methods are based on measuring the fluctuations of sample eigenvalues, either individually or jointly, and traces of the sample Covariance Operator computed from segments of the data. To implement the tests, new limit results are introduced that deal with the large-sample behavior of vector-valued processes built from partial sample eigenvalue estimates. These results in turn enable the calibration of the tests to a prescribed asymptotic level. Applications to Australian annual minimum temperature curves and sea surface temperature anomaly records confirm that the proposed methods work well in finite samples. The first application suggests that the variation in annual minimum temperature underwent a structural break in the 1950s, after which typical fluctuations from the generally increasing trend started to be significantly smaller.
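The idea of tracking fluctuations of the sample Covariance Operator's trace across candidate split points can be sketched as a simple CUSUM-type statistic. This is only a schematic of the approach described in the abstract, not the paper's calibrated test; all names, scalings, and parameters are illustrative.

```python
import numpy as np

def trace_cusum(curves):
    """CUSUM-type fluctuation statistic for the trace of the sample
    Covariance Operator of functional data given as an (N, n_grid) array.
    For each split point k, compare the partial-sample trace estimate
    with the full-sample one; a large maximum suggests a break in the
    Covariance structure."""
    N = curves.shape[0]
    centered = curves - curves.mean(axis=0)
    sq = (centered ** 2).sum(axis=1)   # per-curve squared norm; its mean estimates the trace
    total = sq.mean()
    ks = np.arange(1, N)
    partial = np.cumsum(sq)[:-1] / ks  # partial-sample trace estimates
    return ks / np.sqrt(N) * np.abs(partial - total)

# Synthetic example with a variance change half-way through the sample.
rng = np.random.default_rng(2)
curves = np.vstack([rng.normal(scale=1.0, size=(100, 20)),
                    rng.normal(scale=2.0, size=(100, 20))])
stat = trace_cusum(curves)
print(int(stat.argmax()) + 1)  # estimated break point, close to 100
```

The paper's actual tests replace this heuristic scaling with limit results for partial sample eigenvalue processes, calibrated to a prescribed asymptotic level.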
-
Inference for the Lagged Cross-Covariance Operator Between Functional Time Series
Journal of Time Series Analysis, 2019. Co-Authors: Gregory Rice, Marco Shum. Abstract: When considering two or more time series of functional data objects, for instance those derived from densely observed intraday stock price data of several companies, the empirical cross-Covariance Operator is of fundamental importance due to its role in functional lagged regression and exploratory data analysis. Despite its relevance, statistical procedures for measuring the significance of such estimators are currently undeveloped. We present methodology based on a functional central limit theorem for conducting statistical inference for the cross-Covariance Operator estimated between two stationary, weakly dependent functional time series. Specifically, we consider testing the null hypothesis that the two series possess a specified cross-Covariance structure at a given lag. Since this test assumes that the series are jointly stationary, we also develop a change-point detection procedure to validate this assumption, which is of independent interest. The most imposing technical hurdle in implementing the proposed tests involves estimating the spectrum of a high-dimensional spectral density Operator at frequency zero. We propose a simple dimension reduction procedure based on functional principal component analysis to achieve this, which is shown to perform well in a simulation study. We illustrate the proposed methodology with an application to densely observed intraday price data of stocks listed on the New York Stock Exchange.
-
Structural break analysis for spectrum and trace of Covariance Operators
arXiv: Methodology, 2018. Co-Authors: Alexander Aue, Gregory Rice, Ozan Sönmez. Abstract: This paper deals with analyzing structural breaks in the Covariance Operator of sequentially observed functional data. For this purpose, procedures are developed to segment an observed stretch of curves into periods for which second-order stationarity may be reasonably assumed. The proposed methods are based on measuring the fluctuations of sample eigenvalues, either individually or jointly, and traces of the sample Covariance Operator computed from segments of the data. To implement the tests, new limit results are introduced that deal with the large-sample behavior of vector-valued processes built from partial sample eigenvalue estimates. These results in turn enable the calibration of the tests to a prescribed asymptotic level. A simulation study and an application to Australian annual minimum temperature curves confirm that the proposed methods work well in finite samples. The application suggests that the variation in annual minimum temperature underwent a structural break in the 1950s, after which typical fluctuations from the generally increasing trend started to be significantly smaller.
-
Inference for the cross-Covariance Operator of stationary functional time series
arXiv: Statistics Theory, 2017. Co-Authors: Gregory Rice, Marco Shum. Abstract: When considering two or more time series of functions or curves, for instance those derived from densely observed intraday stock price data of several companies, the empirical cross-Covariance Operator is of fundamental importance due to its role in functional lagged regression and exploratory data analysis. Despite its relevance, statistical procedures for measuring the significance of such estimators are undeveloped. We present methodology based on a functional central limit theorem for conducting statistical inference for the cross-Covariance Operator estimated between two stationary, weakly dependent functional time series. Specifically, we consider testing the null hypothesis that two series possess a specified cross-Covariance structure at a given lag. Since this test assumes that the series are jointly stationary, we also develop a change-point detection procedure to validate this assumption, which is of independent interest. The most imposing technical hurdle in implementing the proposed tests involves estimating the spectrum of a high-dimensional spectral density Operator at frequency zero. We propose a simple dimension reduction procedure based on functional PCA to achieve this, which is shown to perform well in a small simulation study. We illustrate the proposed methodology with an application to densely observed intraday price data of stocks listed on the NYSE.
Vladimir Pavlovic - One of the best experts on this subject based on the ideXlab platform.
-
Covariance Operator Based Dimensionality Reduction with Extension to Semi-Supervised Settings
International Conference on Artificial Intelligence and Statistics, 2009. Co-Authors: Minyoung Kim, Vladimir Pavlovic. Abstract: We consider the task of dimensionality reduction for regression (DRR) informed by real-valued multivariate labels. The problem is often treated as a regression task where the goal is to find a low dimensional representation of the input data that preserves the statistical correlation with the targets. Recently, Covariance Operator Inverse Regression (COIR) was proposed as an effective solution that exploits the Covariance structures of both input and output. COIR addresses known limitations of recent DRR techniques and allows a closed-form solution without resorting to the explicit output space slicing often required by existing IR-based methods. In this work we provide a unifying view of COIR and other DRR techniques and relate them to popular supervised dimensionality reduction methods, including canonical correlation analysis (CCA) and linear discriminant analysis (LDA). We then show that COIR can be effectively extended to a semi-supervised learning setting where many of the input points lack their corresponding multivariate targets. A study of benefits of
-
Dimensionality Reduction Using Covariance Operator Inverse Regression
Computer Vision and Pattern Recognition, 2008. Co-Authors: Minyoung Kim, Vladimir Pavlovic. Abstract: We consider the task of dimensionality reduction for regression (DRR), whose goal is to find a low dimensional representation of input covariates while preserving the statistical correlation with output targets. DRR is particularly suited for visualization of high dimensional data as well as efficient regressor design with a reduced input dimension. In this paper we propose a novel nonlinear method for DRR that exploits the kernel Gram matrices of input and output. While most existing DRR techniques rely on inverse regression, our approach removes the need for explicit slicing of the output space using Covariance Operators in RKHS. This unique property makes DRR applicable to problem domains with high dimensional output data with potentially significant amounts of noise. Although recent kernel dimensionality reduction algorithms make use of RKHS Covariance Operators to quantify conditional dependency between the input and the targets via the dimension-reduced input, they are either limited to a transduction setting or linear input subspaces, and restricted to non-closed-form solutions. In contrast, our approach provides a closed-form solution to the nonlinear basis functions on which any new input point can be easily projected. We demonstrate the benefits of the proposed method in a comprehensive set of evaluations on several important regression problems that arise in computer vision.
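A hedged sketch of the shared first ingredient of COIR-style methods: the centered kernel Gram matrices of inputs and outputs, which are the finite-sample Gram-matrix representation of the empirical RKHS Covariance Operators. How COIR combines them into its closed-form solution is in the paper; the RBF kernel, the bandwidth, and the function names below are illustrative assumptions.

```python
import numpy as np

def rbf_gram(Z, gamma=1.0):
    """RBF kernel Gram matrix for the rows of Z (bandwidth gamma is an
    illustrative choice)."""
    sq = (Z ** 2).sum(axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T
    return np.exp(-gamma * np.maximum(d2, 0.0))  # clip tiny negative round-off

def centered_grams(X, Y, gamma=1.0):
    """Centered kernel Gram matrices of inputs X and outputs Y: the basic
    objects from which RKHS Covariance-Operator methods such as COIR are
    assembled."""
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return H @ rbf_gram(X, gamma) @ H, H @ rbf_gram(Y, gamma) @ H

rng = np.random.default_rng(3)
X = rng.normal(size=(30, 3))   # input covariates
Y = rng.normal(size=(30, 2))   # multivariate targets
Kx, Ky = centered_grams(X, Y)
```

Centering in feature space (the H K H sandwich) is what turns raw Gram matrices into empirical Covariance-Operator representations; the reduced directions would then come from a regularized eigenproblem involving both matrices, as specified in the paper.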
Mariela Sued - One of the best experts on this subject based on the ideXlab platform.
-
The spatial sign Covariance Operator: Asymptotic results and applications
Journal of Multivariate Analysis, 2019. Co-Authors: Graciela Boente, Daniela Rodriguez, Mariela Sued. Abstract: Due to increased recording capability, functional data analysis has become an important research topic. For functional data, the study of outlier detection and/or the development of robust statistical procedures started only recently. One robust alternative to the sample Covariance Operator is the sample spatial sign Covariance Operator. In this paper, we study the asymptotic behavior of the sample spatial sign Covariance Operator centered at an estimated location. Among possible applications of our results, we derive the asymptotic distribution of the principal directions obtained from the sample spatial sign Covariance Operator, and we develop a testing procedure to detect differences between the scatter Operators of two populations. The test performance is illustrated through a Monte Carlo study for small sample sizes.
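The sample spatial sign Covariance Operator itself is straightforward to compute on a grid: center each curve at an estimated location, normalize it to unit norm, and average the outer products of the resulting spatial signs. A minimal sketch; the function name is illustrative, and the sample mean stands in for the general estimated location studied in the paper.

```python
import numpy as np

def spatial_sign_cov(curves, center=None):
    """Sample spatial sign Covariance Operator of functional data given
    as an (N, n_grid) array, using the spatial sign s(x) = x / ||x||."""
    if center is None:
        center = curves.mean(axis=0)   # stand-in for a (possibly robust) location estimate
    Z = curves - center
    norms = np.linalg.norm(Z, axis=1, keepdims=True)
    Z = Z / np.where(norms > 0, norms, 1.0)   # guard against zero-norm curves
    return Z.T @ Z / curves.shape[0]

rng = np.random.default_rng(4)
curves = rng.normal(size=(50, 30))
S = spatial_sign_cov(curves)
print(round(np.trace(S), 6))  # 1.0 -- the operator always has unit trace
```

Since every spatial sign has unit norm, the operator has trace one by construction, which is why it is insensitive to the magnitude of outlying curves and serves as a robust alternative to the sample Covariance Operator.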
-
The spatial sign Covariance Operator: Asymptotic results and applications.
arXiv: Statistics Theory, 2018. Co-Authors: Graciela Boente, Daniela Rodriguez, Mariela Sued. Abstract: Due to increasing recording capability, functional data analysis has become an important research topic. For functional data, the study of outlier detection and/or the development of robust statistical procedures started only recently. One robust alternative to the sample Covariance Operator is the sample spatial sign Covariance Operator. In this paper, we study the asymptotic behaviour of the sample spatial sign Covariance Operator when the location is unknown. Among other possible applications of the obtained results, we derive the asymptotic distribution of the principal directions obtained from the sample spatial sign Covariance Operator, and we develop a test to detect differences between the scatter Operators of two populations. In particular, the test performance is illustrated through a Monte Carlo study for small sample sizes.