Submatrices

The experts below are selected from a list of 309 experts worldwide, ranked by the ideXlab platform.

Yongge Tian - One of the best experts on this subject based on the ideXlab platform.

Yonghui Liu - One of the best experts on this subject based on the ideXlab platform.

Mohan Ravichandran - One of the best experts on this subject based on the ideXlab platform.

  • Principal Submatrices, Restricted Invertibility, and a Quantitative Gauss–Lucas Theorem
    International Mathematics Research Notices, 2018
    Co-Authors: Mohan Ravichandran
    Abstract:

    We apply the techniques developed by Marcus, Spielman, and Srivastava, working with principal submatrices in place of rank-$1$ decompositions, to give an alternate proof of their results on restricted invertibility. This approach recovers their results concerning the existence of well-conditioned column submatrices all the way up to the so-called modified stable rank. All constructions are algorithmic. The main novelty of this approach is that it leads to a new quantitative version of the classical Gauss–Lucas theorem on the critical points of complex polynomials. We show that for any degree-$n$ polynomial $p$ and any $c \geq 1/2$, the area of the convex hull of the roots of $p^{(\lfloor cn \rfloor)}$ is at most $4(c-c^2)$ times the area of the convex hull of the roots of $p$. (A numerical sketch of this bound appears after the publication list below.)

  • Principal Submatrices and Restricted Invertibility
    arXiv: Functional Analysis, 2016
    Co-Authors: Mohan Ravichandran
    Abstract:

    We slightly modify the techniques developed by Marcus, Spielman, and Srivastava, working with principal submatrices in place of rank-$1$ decompositions, to give an alternate proof of their results on restricted invertibility. We show that one can find well-conditioned column submatrices all the way up to the so-called modified stable rank. All constructions are algorithmic. (A greedy column-selection sketch appears after the publication list below.)

  • Principal Submatrices, restricted invertibility and a quantitative Gauss-Lucas theorem
    2016
    Co-Authors: Mohan Ravichandran
    Abstract:

    We apply the techniques developed by Marcus, Spielman, and Srivastava, working with principal submatrices in place of rank-$1$ decompositions, to give an alternate proof of their results on restricted invertibility. We show that one can find well-conditioned column submatrices all the way up to the so-called modified stable rank. All constructions are algorithmic. A byproduct of these results is an interesting quantitative version of the classical Gauss-Lucas theorem on the critical points of complex polynomials. We show that for any degree-$n$ polynomial $p$ and any $c \geq \frac{1}{2}$, the area of the convex hull of the roots of $p^{(\lfloor cn \rfloor)}$ is at most $4(c-c^2)$ times the area of the convex hull of the roots of $p$.
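
The quantitative Gauss–Lucas bound above is easy to probe numerically. The following minimal sketch, which is not code from the papers, draws a polynomial with random complex roots, takes its $\lfloor cn \rfloor$-th derivative, and compares the convex-hull areas of the two root sets against the $4(c-c^2)$ bound; it assumes NumPy and SciPy are available, and all names are illustrative.

    # Sanity check of the quantitative Gauss-Lucas bound: for a degree-n
    # polynomial p and c >= 1/2, the convex hull of the roots of the
    # floor(c*n)-th derivative should have area at most 4(c - c^2) times
    # the area of the hull of the roots of p. Illustrative sketch only.
    import numpy as np
    from scipy.spatial import ConvexHull

    rng = np.random.default_rng(0)
    n, c = 20, 0.6
    roots = rng.normal(size=n) + 1j * rng.normal(size=n)  # random roots of p

    p = np.polynomial.polynomial.polyfromroots(roots)  # coefficients of p
    k = int(np.floor(c * n))
    dk = np.polynomial.polynomial.polyder(p, k)        # k-th derivative
    droots = np.polynomial.polynomial.polyroots(dk)

    def hull_area(zs):
        # Area of the convex hull of points in the complex plane;
        # for a 2-D hull, ConvexHull.volume is the enclosed area.
        pts = np.column_stack([zs.real, zs.imag])
        return ConvexHull(pts).volume

    ratio = hull_area(droots) / hull_area(roots)
    print(f"area ratio = {ratio:.3f}, bound 4(c - c^2) = {4 * (c - c**2):.3f}")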
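
The restricted-invertibility statement can likewise be checked empirically. The sketch below is a simple greedy stand-in, not the deterministic construction from the papers: it computes the stable rank $\|A\|_F^2 / \|A\|_2^2$ of a random matrix and then selects that many columns one at a time, each time keeping the smallest singular value of the chosen submatrix as large as possible.

    # Greedy search for a well-conditioned column submatrix of size close
    # to the stable rank. Illustrative only; the papers give a
    # deterministic, algorithmic construction with explicit bounds.
    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.normal(size=(50, 80))

    stable_rank = np.linalg.norm(A, "fro") ** 2 / np.linalg.norm(A, 2) ** 2
    k = int(stable_rank)  # target number of columns

    chosen = []
    for _ in range(k):
        best_j, best_smin = None, -1.0
        for j in range(A.shape[1]):
            if j in chosen:
                continue
            # Smallest singular value if column j is added to the pick.
            smin = np.linalg.svd(A[:, chosen + [j]], compute_uv=False)[-1]
            if smin > best_smin:
                best_j, best_smin = j, smin
        chosen.append(best_j)

    smin = np.linalg.svd(A[:, chosen], compute_uv=False)[-1]
    print(f"stable rank ~ {stable_rank:.1f}, picked {k} columns, "
          f"smallest singular value = {smin:.3f}")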

Guang-jing Song - One of the best experts on this subject based on the ideXlab platform.

Andrew B. Nobel - One of the best experts on this subject based on the ideXlab platform.

  • On the maximal size of Large-Average and ANOVA-fit Submatrices in a Gaussian Random Matrix
    Bernoulli : official journal of the Bernoulli Society for Mathematical Statistics and Probability, 2013
    Co-Authors: Xing Sun, Andrew B. Nobel
    Abstract:

    We investigate the maximal size of distinguished submatrices of a Gaussian random matrix. Of interest are submatrices whose entries have an average greater than or equal to a positive constant, and submatrices whose entries are well fit by a two-way ANOVA model. We identify size thresholds and associated (asymptotic) probability bounds for both large-average and ANOVA-fit submatrices. Probability bounds are obtained when the matrix and submatrices of interest are square and, in rectangular cases, when the matrix and submatrices of interest have fixed aspect ratios. Our principal result is an almost sure interval concentration result for the size of large-average submatrices in the square case. (A small simulation in the spirit of these results appears after the publication list below.)

  • On the maximal size of Large-Average and ANOVA-fit Submatrices in a Gaussian Random Matrix
    arXiv: Statistics Theory, 2010
    Co-Authors: Xing Sun, Andrew B. Nobel
    Abstract:

    We investigate the maximal size of distinguished submatrices of a Gaussian random matrix. Of interest are submatrices whose entries have an average greater than or equal to a positive constant, and submatrices whose entries are well fit by a two-way ANOVA model. We identify size thresholds and associated (asymptotic) probability bounds for both large-average and ANOVA-fit submatrices. Results are obtained when the matrix and submatrices of interest are square and, in rectangular cases, when the matrix and submatrices of interest have fixed aspect ratios. In addition, we obtain a strong interval concentration result for the size of large-average submatrices in the square case. A simulation study shows good agreement between the observed and predicted sizes of large-average submatrices in matrices of moderate size.

  • On the size and recovery of Submatrices of ones in a random binary matrix
    Journal of Machine Learning Research, 2008
    Co-Authors: Xing Sun, Andrew B. Nobel
    Abstract:

    Binary matrices, and their associated submatrices of 1s, play a central role in the study of random bipartite graphs and in core data mining problems such as frequent itemset mining (FIM). Motivated by these connections, this paper addresses several statistical questions regarding submatrices of 1s in a random binary matrix with independent Bernoulli entries. We establish a three-point concentration result, and a related probability bound, for the size of the largest square submatrix of 1s in a square Bernoulli matrix, and extend these results to non-square matrices and submatrices with fixed aspect ratios. We then consider the noise sensitivity of frequent itemset mining under a simple binary additive noise model and show that, even at small noise levels, large blocks of 1s leave behind fragments of only logarithmic size. As a result, standard FIM algorithms, which search only for submatrices of 1s, cannot directly recover such blocks when noise is present. On the positive side, we show that an error-tolerant frequent itemset criterion can recover a submatrix of 1s against a background of 0s plus noise, even when the size of the submatrix of 1s is very small. (A small simulation of this noise-sensitivity effect appears below.)
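
For the large-average results, exhaustive search over $k \times k$ submatrices is infeasible, but a simple alternating row/column maximization gives a feel for the thresholds. This is an illustrative heuristic, not the papers' method: it only lower-bounds the maximal size, and the comparison value printed at the end comes from a back-of-the-envelope first-moment calculation in the spirit of the thresholds studied above, not from the papers' exact constants.

    # Heuristic search for the largest k x k submatrix of a Gaussian
    # matrix with average >= tau. Alternating maximization is a stand-in
    # for exhaustive search and typically undershoots the true maximum.
    import numpy as np

    rng = np.random.default_rng(2)
    n, tau = 200, 1.0
    X = rng.normal(size=(n, n))

    def best_average(k, iters=50):
        # Alternately pick the k best rows for the current columns and
        # the k best columns for the current rows.
        cols = rng.choice(n, size=k, replace=False)
        for _ in range(iters):
            rows = np.argsort(X[:, cols].sum(axis=1))[-k:]
            cols = np.argsort(X[rows, :].sum(axis=0))[-k:]
        return X[np.ix_(rows, cols)].mean()

    k = 1
    while best_average(k + 1) >= tau:
        k += 1
    print(f"largest k found with average >= {tau}: {k}")
    print(f"first-moment prediction for the true maximum: "
          f"{4 * np.log(n) / tau**2:.1f}")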
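
The noise-sensitivity phenomenon in the JMLR paper is also easy to see in simulation. The sketch below plants a block of 1s in a sparse Bernoulli background, flips each entry independently, and compares exact all-1s recovery of the planted rows with an error-tolerant criterion; the background density, noise level, and tolerance threshold are illustrative choices, not values from the paper.

    # Plant a b x b block of 1s, add independent bit-flip noise, and
    # compare exact (all-1s) row support with an error-tolerant count.
    import numpy as np

    rng = np.random.default_rng(3)
    n, b, eps = 500, 60, 0.05  # matrix size, planted block size, noise level

    X = (rng.random((n, n)) < 0.1).astype(int)  # sparse Bernoulli background
    X[:b, :b] = 1                               # plant a b x b block of 1s
    flips = rng.random((n, n)) < eps
    X = np.where(flips, 1 - X, X)               # flip each entry w.p. eps

    # Fraction of 1s per row, restricted to the planted columns.
    frac_ones = X[:, :b].mean(axis=1)
    exact = int(np.sum(frac_ones == 1.0))             # rows that stay all 1s
    tolerant = int(np.sum(frac_ones >= 1 - 2 * eps))  # error-tolerant rows

    print(f"planted rows: {b}")
    print(f"rows still all 1s on the planted columns (exact FIM): {exact}")
    print(f"rows passing the >= {1 - 2 * eps:.0%} tolerance criterion: {tolerant}")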