Resampling Method

The Experts below are selected from a list of 6513 Experts worldwide ranked by ideXlab platform

E. Maris - One of the best experts on this subject based on the ideXlab platform.

  • A Resampling Method for estimating the signal subspace of spatio-temporal EEG/MEG data
    IEEE Transactions on Biomedical Engineering, 2003
    Co-Authors: E. Maris
    Abstract:

    Source localization using spatio-temporal electroencephalography (EEG) and magnetoencephalography (MEG) data is usually performed by means of signal subspace methods. The first step of these methods is the estimation of a set of vectors that spans a subspace containing the signal of interest as well as possible. This estimation is usually performed by means of a singular value decomposition (SVD) of the data matrix: the rank of the signal subspace (denoted by r) is estimated from a plot in which the singular values are plotted against their rank order, and the signal subspace itself is estimated by the first r singular vectors. The main problem with this method is that it is strongly affected by spatial covariance in the noise. Therefore, two methods are proposed that are much less affected by this spatial covariance: an old and a new method. The old method involves prewhitening of the data matrix, making use of an estimate of the spatial noise covariance matrix. The new method is based on the matrix product of two average data matrices, resulting from a random partition of a set of stochastically independent replications of the spatio-temporal data matrix. The estimated signal subspace is obtained by first filtering out the asymmetric and negative definite components of this matrix product and then retaining the eigenvectors that correspond to the r largest eigenvalues of this filtered data matrix. The main advantages of the partition-based eigendecomposition over prewhitened SVD are that 1) it does not require an estimate of the spatial noise covariance matrix and 2) it allows one to make use of a resampling distribution (the so-called partitioning distribution) as a natural quantification of the uncertainty in the estimated rank. The performance of the three methods (SVD with and without prewhitening, and the partition-based method) is compared in a simulation study. From this study, it could be concluded that prewhitened SVD and the partition-based eigendecomposition perform equally well when the amplitude time series are constant, but that the partition-based method performs better when the amplitude time series are variable.

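The partition-based estimator described above can be sketched in a few lines of NumPy: randomly split the replications into two halves, average each half, take the product of the two averages, symmetrize it and truncate negative eigenvalues, then keep the top-r eigenvectors. This is an illustrative reading of the procedure, not the authors' code; the function name and array layout are assumptions.

```python
import numpy as np

def partition_subspace(replications, r, rng=None):
    """Sketch of the partition-based signal-subspace estimate.

    replications : array of shape (K, channels, time) holding K
        stochastically independent trials of the spatio-temporal data.
    r : assumed rank of the signal subspace.
    """
    rng = np.random.default_rng(rng)
    K = replications.shape[0]
    perm = rng.permutation(K)
    half = K // 2
    # Average data matrices of the two random halves of the partition.
    A = replications[perm[:half]].mean(axis=0)
    B = replications[perm[half:]].mean(axis=0)
    # Matrix product of the two averages: noise is independent across
    # the two halves, so the product concentrates on the signal.
    C = A @ B.T
    # Filter out the asymmetric component ...
    C = 0.5 * (C + C.T)
    # ... and the negative definite component.
    w, V = np.linalg.eigh(C)
    w = np.clip(w, 0.0, None)
    # Retain eigenvectors of the r largest remaining eigenvalues.
    order = np.argsort(w)[::-1][:r]
    return V[:, order]
```

Repeating this over many random partitions yields the partitioning distribution that the abstract proposes as a quantification of rank uncertainty.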

Shimon Wdowinski - One of the best experts on this subject based on the ideXlab platform.

  • An Efficient Polyphase Filter-Based Resampling Method for Unifying the PRFs in SAR Data
    IEEE Transactions on Geoscience and Remote Sensing, 2017
    Co-Authors: Yoangel Torres, Kamal Premaratne, Falk Amelung, Shimon Wdowinski
    Abstract:

    Variable and higher pulse repetition frequencies (PRFs) are increasingly being used to meet the stricter requirements and complexities of current airborne and spaceborne synthetic aperture radar (SAR) systems associated with higher resolution and wider area products. POLYPHASE, the proposed resampling scheme, downsamples and unifies variable PRFs within a single look complex (SLC) SAR acquisition, and across a repeat-pass sequence of acquisitions, down to an effective lower PRF. A sparsity condition on the received SAR data ensures that the uniformly resampled data approximate the spectral properties of a decimated, densely sampled version of the received SAR data. While experiments conducted with both synthetically generated and real airborne SAR data show that POLYPHASE retains image quality comparable to the state-of-the-art best linear unbiased interpolation (BLUI) scheme, a polyphase filter-based implementation of POLYPHASE offers significant computational savings for arbitrary (not necessarily periodic) input PRF variations, thus allowing fully on-board, in-place, and real-time implementation.

  • An Efficient Polyphase Filter-Based Resampling Method for Unifying the PRFs in SAR Data
    arXiv: Computational Engineering Finance and Science, 2015
    Co-Authors: Yoangel Torres, Kamal Premaratne, Falk Amelung, Shimon Wdowinski
    Abstract:

    Variable and higher pulse repetition frequencies (PRFs) are increasingly being used to meet the stricter requirements and complexities of current airborne and spaceborne synthetic aperture radar (SAR) systems associated with higher resolution and wider area products. POLYPHASE, the proposed resampling scheme, downsamples and unifies variable PRFs within a single look complex (SLC) SAR acquisition and across a repeat-pass sequence of acquisitions down to an effective lower PRF. A sparsity condition on the received SAR data ensures that the uniformly resampled data approximate the spectral properties of a decimated, densely sampled version of the received SAR data. While experiments conducted with both synthetically generated and real airborne SAR data show that POLYPHASE retains image quality comparable to the state-of-the-art best linear unbiased interpolation (BLUI) scheme, a polyphase filter-based implementation of POLYPHASE offers significant computational savings for arbitrary (not necessarily periodic) input PRF variations, thus allowing fully on-board, in-place, and real-time implementation.
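The core rational-rate step of unifying PRFs can be illustrated with SciPy's `resample_poly`, which implements an efficient polyphase FIR resampler. This sketch covers only the uniform-PRF case; the POLYPHASE scheme in the papers above also handles arbitrary (nonuniform) PRF variation. The function name `unify_prf` and the example PRF values are assumptions for illustration.

```python
from math import gcd

import numpy as np
from scipy.signal import resample_poly

def unify_prf(signal, prf_in, prf_out):
    """Resample one uniformly sampled azimuth line from prf_in down to
    a common prf_out via an up/down polyphase FIR stage."""
    g = gcd(int(prf_in), int(prf_out))
    up, down = int(prf_out) // g, int(prf_in) // g
    # resample_poly upsamples by `up`, low-pass filters with a
    # polyphase FIR, and downsamples by `down` in one pass.
    return resample_poly(signal, up, down)

# Two one-second acquisitions with different PRFs, unified to 1000 Hz.
t1 = np.arange(1500) / 1500.0
t2 = np.arange(1250) / 1250.0
x1 = np.cos(2 * np.pi * 60 * t1)
x2 = np.cos(2 * np.pi * 60 * t2)
y1 = unify_prf(x1, 1500, 1000)   # up=2, down=3
y2 = unify_prf(x2, 1250, 1000)   # up=4, down=5
```

After resampling, both acquisitions share a 1000 Hz grid, which is the precondition for stacking repeat-pass SLC data.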


Laura J Scott - One of the best experts on this subject based on the ideXlab platform.

  • An efficient Resampling Method for calibrating single and gene-based rare variant association analysis in case–control studies
    Biostatistics, 2016
    Co-Authors: Christian Fuchsberger, Laura J Scott
    Abstract:

    For aggregation tests of genes or regions, the set of included variants often has small total minor allele counts (MACs), and this is particularly true when the most deleterious sets of variants are considered. When the MAC is low, commonly used asymptotic tests are not well calibrated for binary phenotypes and can have conservative or anti-conservative results and potential power loss. Empirical p-values obtained via resampling methods are computationally costly for highly significant p-values, and the results can be conservative due to the discrete nature of resampling tests. Based on the observation that only the individuals carrying minor alleles contribute to the score statistics, we develop an efficient resampling method for single- and multiple-variant score-based tests that can adjust for covariates. Our method can improve computational efficiency more than 1000-fold over conventional resampling for low-MAC variant sets. We ameliorate the conservativeness of the results through the use of mid-p-values. Using the estimated minimum achievable p-value for each test, we calibrate QQ plots and provide an effective number of tests. In analysis of a case-control study with deep exome sequencing data, we demonstrate that our methods are both well calibrated and reduce computation time significantly compared with resampling methods.

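The observation that only minor-allele carriers contribute to the score statistic is what makes the resampling distribution cheap. As a hedged sketch of that idea (not the authors' implementation, which handles covariates and gene-based tests): for a single variant without covariates, the permutation distribution of the carriers-among-cases count is hypergeometric, so it can be enumerated exactly rather than resampled, and a mid-p-value falls out directly. The function name and interface are assumptions.

```python
from math import comb

import numpy as np

def carrier_midp(case_status, carrier):
    """One-sided mid-p-value for excess carriers among cases.

    case_status, carrier : boolean arrays over individuals.
    Only carriers contribute to the statistic, so the permutation
    distribution depends only on the carrier count and is
    hypergeometric -- no actual resampling is needed.
    """
    n = len(case_status)
    n_case = int(np.sum(case_status))
    m = int(np.sum(carrier))                    # total carriers (low-MAC setting)
    t_obs = int(np.sum(case_status & carrier))  # carriers among cases
    # Exact hypergeometric pmf over all achievable statistic values.
    denom = comb(n, n_case)
    pmf = np.array([
        comb(m, k) * comb(n - m, n_case - k) / denom
        if 0 <= n_case - k <= n - m else 0.0
        for k in range(m + 1)
    ])
    # Mid-p: full weight to strictly more extreme values,
    # half weight to the observed value, which reduces the
    # conservativeness of the discrete test.
    return pmf[t_obs + 1:].sum() + 0.5 * pmf[t_obs]
```

Because the pmf has only m + 1 support points, the minimum achievable p-value for each variant set is also available in closed form, which is what enables the calibrated QQ plots described above.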

Luis Alvarado - One of the best experts on this subject based on the ideXlab platform.

  • Analysis of small sample size studies using nonparametric bootstrap test with pooled Resampling Method
    Statistics in Medicine, 2017
    Co-Authors: Alok Dwivedi, Indika Mallawaarachchi, Luis Alvarado
    Abstract:

    Experimental studies in biomedical research frequently pose analytical problems related to small sample size. In such studies, there are conflicting findings regarding the choice between parametric and nonparametric analysis, especially with non-normal data. Some methodologists have questioned the validity of parametric tests in these settings and suggested nonparametric tests; others have found nonparametric tests to be too conservative and less powerful and have thus preferred parametric tests. Some researchers have recommended using a bootstrap test; however, this method also has limitations with small sample sizes. We used a pooled resampling approach in the nonparametric bootstrap test that may overcome the problems associated with small samples in hypothesis testing. The present study compared the nonparametric bootstrap test with pooled resampling to the corresponding parametric, nonparametric, and permutation tests through extensive simulations under various conditions and using real data examples. The nonparametric pooled bootstrap t-test provided equal or greater power for comparing two means than the unpaired t-test, Welch t-test, Wilcoxon rank sum test, and permutation test, while maintaining the type I error probability under all conditions except the Cauchy and extremely variable lognormal distributions. In such cases, we suggest using an exact Wilcoxon rank sum test. The nonparametric bootstrap paired t-test also performed better than the alternatives, and the nonparametric bootstrap test provided a benefit over the exact Kruskal–Wallis test. We suggest using the nonparametric bootstrap test with pooled resampling for comparing paired or unpaired means and for validating one-way analysis of variance results for non-normal data in small sample size studies. Copyright © 2017 John Wiley & Sons, Ltd.
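The pooled-resampling idea for two unpaired means can be sketched directly: pooling the two samples enforces the null hypothesis, bootstrap samples of the original group sizes are drawn with replacement from the pool, and the observed t-statistic is referred to the resulting bootstrap distribution. A minimal sketch of that scheme (names and details assumed, not the authors' code):

```python
import numpy as np

def pooled_bootstrap_ttest(x, y, n_boot=10000, rng=None):
    """Two-sided nonparametric bootstrap test with pooled resampling
    for comparing two unpaired means."""
    rng = np.random.default_rng(rng)
    x, y = np.asarray(x, float), np.asarray(y, float)

    def tstat(a, b):
        # Welch-type t-statistic (unequal variances).
        va = a.var(ddof=1) / len(a)
        vb = b.var(ddof=1) / len(b)
        return (a.mean() - b.mean()) / np.sqrt(va + vb)

    t_obs = tstat(x, y)
    pooled = np.concatenate([x, y])   # pooling enforces the null
    count = 0
    for _ in range(n_boot):
        bx = rng.choice(pooled, size=len(x), replace=True)
        by = rng.choice(pooled, size=len(y), replace=True)
        if abs(tstat(bx, by)) >= abs(t_obs):
            count += 1
    # Add-one correction avoids reporting an exact zero p-value.
    return (count + 1) / (n_boot + 1)
```

Because each bootstrap group is drawn from the full pool rather than from its own small sample, the resampling distribution is smoother than a within-group bootstrap, which is the property the abstract credits for the method's performance with small samples.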