The Experts below are selected from a list of 324 Experts worldwide ranked by the ideXlab platform
Licheng Jiao - One of the best experts on this subject based on the ideXlab platform.
-
ISNN (1) - Image representation in visual cortex and high Nonlinear Approximation
Advances in Neural Networks — ISNN 2005, 2005. Co-Authors: Shan Tan, Xiangrong Zhang, Shuang Wang, Licheng Jiao. Abstract: We briefly review the "sparse coding" principle employed in the sensory information processing system of mammals and focus on the phenomenon that this principle is realized through an over-complete representation strategy in the primary sensory cortical area (V1). Since neuroscience lacks a quantitative analysis of how much gain in sparsity the over-complete representation strategy brings, in this paper we give such an analysis from the viewpoint of Nonlinear Approximation. The result shows that the over-complete strategy can provide a sparser representation than the complete strategy.
-
ISNN (1) - A review: relationship between response properties of visual neurons and advances in Nonlinear Approximation theory
Advances in Neural Networks — ISNN 2005, 2005. Co-Authors: Shan Tan, Xiangrong Zhang, Licheng Jiao. Abstract: In this review, we briefly introduce the 'sparse coding' strategy employed in the sensory information processing system of mammals, and reveal the relationship between that strategy and some recent advances in Nonlinear Approximation theory.
-
ESANN - New evidences for sparse coding strategy employed in visual neurons: from the image processing and Nonlinear Approximation viewpoint
2005. Co-Authors: Tan Shan, Licheng Jiao. Abstract: 'Sparse coding' is a ubiquitous strategy employed in the sensory information processing system of mammals. Some work has focused on validating this strategy by finding the sparse components of sensory input and then showing that the resulting basis functions, or the corresponding filter responses, have receptive fields visually similar to those found in the primary visual cortex (V1). In this review, we show that several newly proposed systems in the areas of image processing and Nonlinear Approximation provide new evidence for the 'sparse coding' strategy from the opposite direction. Inspired by the receptive-field properties of V1 neurons, the basis functions of these systems are constructed with special structure: they are band-pass, localized, and multi-oriented. Interestingly, these systems can sparsely represent the special class of images dominated by edges.
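As a toy illustration of the claim above (not code from the paper), the sketch below compares n-term representation in a complete basis against an over-complete dictionary using a greedy matching pursuit; the `omp` helper, the step signal, and the "edge" atom are all hypothetical choices made for this example:

```python
import numpy as np

def omp(D, s, n_terms):
    """Greedy orthogonal matching pursuit: pick the dictionary atom most
    correlated with the residual, then re-fit all selected coefficients
    by least squares."""
    idx, r = [], s.copy()
    for _ in range(n_terms):
        idx.append(int(np.argmax(np.abs(D.T @ r))))
        sel = D[:, idx]
        c, *_ = np.linalg.lstsq(sel, s, rcond=None)
        r = s - sel @ c
        if np.linalg.norm(r) < 1e-12:
            break
    return idx, r

# A step-edge signal in R^8.
s = np.array([1., 1., 1., 1., 0., 0., 0., 0.])

# Complete system: the standard (identity) basis needs 4 nonzero terms.
I = np.eye(8)

# Over-complete system: identity atoms plus one normalized "edge" atom.
edge = s / np.linalg.norm(s)
D = np.hstack([I, edge[:, None]])

# With the matched atom available, a single term reproduces s exactly.
idx, r = omp(D, s, n_terms=4)
```

Enlarging the dictionary with atoms matched to the signal class (here, an edge) is what lets the over-complete system reach a given error with fewer terms than the complete basis.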
Shan Tan - One of the best experts on this subject based on the ideXlab platform.
-
ISNN (1) - Image representation in visual cortex and high Nonlinear Approximation
Advances in Neural Networks — ISNN 2005, 2005. Co-Authors: Shan Tan, Xiangrong Zhang, Shuang Wang, Licheng Jiao. Abstract: We briefly review the "sparse coding" principle employed in the sensory information processing system of mammals and focus on the phenomenon that this principle is realized through an over-complete representation strategy in the primary sensory cortical area (V1). Since neuroscience lacks a quantitative analysis of how much gain in sparsity the over-complete representation strategy brings, in this paper we give such an analysis from the viewpoint of Nonlinear Approximation. The result shows that the over-complete strategy can provide a sparser representation than the complete strategy.
-
ISNN (1) - A review: relationship between response properties of visual neurons and advances in Nonlinear Approximation theory
Advances in Neural Networks — ISNN 2005, 2005. Co-Authors: Shan Tan, Xiangrong Zhang, Licheng Jiao. Abstract: In this review, we briefly introduce the 'sparse coding' strategy employed in the sensory information processing system of mammals, and reveal the relationship between that strategy and some recent advances in Nonlinear Approximation theory.
Reinhard Hochmuth - One of the best experts on this subject based on the ideXlab platform.
-
Anisotropic Wavelet Bases and Restricted Nonlinear Approximation
PAMM, 2003. Co-Authors: Reinhard Hochmuth. Abstract: Restricted Nonlinear Approximation is a generalization of n-term Approximation in which a weight function is used to control the terms of the approximant. Here, restricted Nonlinear Approximation is considered with respect to anisotropic wavelet bases. In particular, characterizations are presented of those functions that are approximated at a specified convergence rate by restricted Nonlinear Approximation.
-
Restricted Nonlinear Approximation and singular solutions of boundary integral equations
Analysis in Theory and Applications, 2002. Co-Authors: Reinhard Hochmuth. Abstract: This paper studies several problems that are potentially relevant for the construction of adaptive numerical schemes. First, biorthogonal spline wavelets on [0,1] are chosen as a starting point for characterizations of functions in the Besov spaces $B^\sigma_{r,r}(0,1)$ with 0
-
A Besov Space Mapping Property for the Double Layer Potential on Polygons
Applicable Analysis, 2002. Co-Authors: Reinhard Hochmuth. Abstract: A classical boundedness property for the double layer potential on polygons with respect to Sobolev spaces is extended to a scale of Besov spaces that is related to adaptive restricted Nonlinear Approximation schemes.
Ronald A Devore - One of the best experts on this subject based on the ideXlab platform.
-
Optimal Stable Nonlinear Approximation.
arXiv: Numerical Analysis, 2020. Co-Authors: Albert Cohen, Ronald A Devore, Guergana Petrova, Przemysław Wojtaszczyk. Abstract: While it is well known that Nonlinear methods of Approximation can often perform dramatically better than linear methods, there are still questions about how to measure the optimal performance possible for such methods. This paper studies Nonlinear methods of Approximation that are compatible with numerical implementation in that they are required to be numerically stable. A measure of optimal performance, called stable manifold widths, for approximating a model class $K$ in a Banach space $X$ by stable manifold methods is introduced. Fundamental inequalities between these stable manifold widths and the entropy of $K$ are established. The effects of requiring stability in the settings of deep learning and compressed sensing are discussed.
-
Nonlinear Approximation and (Deep) ReLU Networks.
arXiv: Learning, 2019. Co-Authors: Ingrid Daubechies, Ronald A Devore, Simon Foucart, Boris Hanin, Guergana Petrova. Abstract: This article is concerned with the Approximation and expressive powers of deep neural networks, an active research area currently producing many interesting papers. The results most commonly found in the literature prove that neural networks approximate functions with classical smoothness to the same accuracy as classical linear methods of Approximation, e.g., Approximation by polynomials or by piecewise polynomials on prescribed partitions. However, Approximation by neural networks depending on n parameters is a form of Nonlinear Approximation and as such should be compared with other Nonlinear methods such as variable-knot splines or n-term Approximation from dictionaries. The performance of neural networks in targeted applications such as machine learning indicates that they actually possess even greater Approximation power than these traditional methods of Nonlinear Approximation. The main results of this article prove that this is indeed the case, by exhibiting large classes of functions which can be efficiently captured by neural networks where classical Nonlinear methods fall short of the task. The present article purposely limits itself to studying the Approximation of univariate functions by ReLU networks. Many generalizations to functions of several variables and other activation functions can be envisioned. However, even in the simplest setting considered here, a theory that completely quantifies the Approximation power of neural networks is still lacking.
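To make concrete why a ReLU network is a form of Nonlinear Approximation akin to variable-knot splines, here is a minimal sketch (illustrative only; `relu_net` and the knot placement are assumptions for this example, not the article's construction). A one-hidden-layer network realizes a continuous piecewise-linear function whose breakpoints sit at the freely chosen biases, and three units already represent a hat function exactly:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def relu_net(x, knots, weights):
    """One-hidden-layer ReLU network: sum_i w_i * relu(x - b_i).
    The biases b_i act like the free knots of an adaptive spline:
    the network places its breakpoints wherever they help most."""
    return sum(w * relu(x - b) for b, w in zip(knots, weights))

# Three ReLU units exactly represent the hat function 0.5 - |x - 0.5| on [0, 1].
hat = lambda x: relu_net(x, knots=[0.0, 0.5, 1.0], weights=[1.0, -2.0, 1.0])

x = np.linspace(0.0, 1.0, 101)
y = hat(x)
```

Linear methods fix the partition in advance; the network's freedom to move its breakpoints is exactly the nonlinearity that invites comparison with free-knot splines.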
-
Nonlinear Approximation and its applications
Multiscale, Nonlinear and Adaptive Approximation, 2009. Co-Authors: Ronald A Devore. Abstract: I first met Wolfgang Dahmen in 1974 in Oberwolfach. He looked like a high school student to me, but he impressed everyone with his talk on whether polynomial operators could produce both polynomial and spectral orders of Approximation. We became the best of friends and frequent collaborators. While Wolfgang's mathematical contributions spread across many disciplines, a major thread in his work has been the exploitation of Nonlinear Approximation. This article reflects on Wolfgang's pervasive contributions to the development of Nonlinear Approximation and its applications. Since many of the contributions in this volume address specific application areas in some detail, my thoughts on these will be to a large extent anecdotal.
-
Restricted Nonlinear Approximation
Constructive Approximation, 2000. Co-Authors: Albert Cohen, Ronald A Devore, R. Hochmuth. Abstract: We introduce a new form of Nonlinear Approximation called restricted Approximation. It is a generalization of n-term wavelet Approximation in which a weight function is used to control the terms in the wavelet expansion of the approximant. This form of Approximation occurs in statistical estimation and in the characterization of interpolation spaces for certain pairs of $L_p$ and Besov spaces. We characterize, both in terms of their wavelet coefficients and also in terms of their smoothness, the functions which are approximated with a specified rate by restricted Approximation. We also show the relation of this form of Approximation to certain types of thresholding of wavelet coefficients.
-
Nonlinear Approximation and the Space BV(R2)
American Journal of Mathematics, 1999. Co-Authors: Albert Cohen, Ronald A Devore, Pencho Petrushev. Abstract: Given a function $f \in L_2(Q)$, $Q := (0,1)^2$, and a real number $t > 0$, let $U(f,t) := \inf_{g \in BV(Q)} \|f - g\|_{L_2(Q)}^2 + t\, V_Q(g)$, where the infimum is taken over all functions $g \in BV$ of bounded variation on $Q$. This and related extremal problems arise in several areas of mathematics such as interpolation of operators and statistical estimation, as well as in digital image processing. Techniques for finding minimizers $g$ for $U(f,t)$ based on variational calculus and Nonlinear partial differential equations have been put forward by several authors (DMS), (RO), (MS), (CL). The main disadvantage of these approaches is that they are numerically intensive. On the other hand, it is well known that more elementary methods based on wavelet shrinkage solve related extremal problems, for example, the above problem with $BV$ replaced by the Besov space $B^1_1(L_1(Q))$ (see e.g. (CDLL)). However, since $BV$ has no simple description in terms of wavelet coefficients, it is not clear that minimizers for $U(f,t)$ can be realized in this way. We shall show in this paper that simple methods based on Haar thresholding provide near minimizers for $U(f,t)$. Our analysis of this extremal problem brings forward many interesting relations between Haar decompositions and the space $BV$.
1. Introduction. Nonlinear Approximation has recently played an important role in several problems of image processing, including compression, noise removal, and feature extraction. We have in mind techniques such as wavelet compression (DJL), wavelet shrinkage or thresholding (DJKP1), wavelet packets (CW), and greedy algorithms (MZ), (DT). There has also been an impressive contribution of techniques based on variational calculus and Nonlinear partial differential equations (see e.g. (DMS), (RO), (MS), (CL)), especially to the problems of noise removal and image segmentation. The common point between these two approaches is their ability to adapt to the composite nature of images: edges, textures, and smooth regions should be treated adaptively, a requirement which is certainly not fulfilled by classical linear filtering techniques.
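The elementary Haar-thresholding step that the abstract contrasts with variational methods can be sketched in one dimension (a simplification of the paper's 2-D setting; the function names and the toy signal are illustrative, not from the paper):

```python
import numpy as np

def haar(x):
    """Orthonormal 1-D Haar transform (length must be a power of two).
    Returns the coarsest average and the detail bands, coarse to fine."""
    details, n = [], len(x)
    while n > 1:
        a = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # local averages
        d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # local differences
        details.append(d)
        x, n = a, n // 2
    return x, details[::-1]

def ihaar(a, details):
    """Invert the transform, finest detail band applied last."""
    for d in details:
        x = np.empty(2 * len(a))
        x[0::2] = (a + d) / np.sqrt(2.0)
        x[1::2] = (a - d) / np.sqrt(2.0)
        a = x
    return a

def haar_shrink(f, thresh):
    """Hard-threshold the Haar detail coefficients: the elementary
    shrinkage step, far cheaper than solving a variational PDE."""
    a, details = haar(f)
    details = [np.where(np.abs(d) > thresh, d, 0.0) for d in details]
    return ihaar(a, details)

# A noisy step edge: shrinkage keeps the one large edge coefficient
# and zeroes the small noise coefficients, preserving the jump.
f = np.array([1.0, 1.01, 0.99, 1.0, 0.0, 0.02, -0.01, 0.0])
g = haar_shrink(f, 0.1)
```

Because the edge concentrates its energy in a single coarse Haar coefficient, thresholding removes the small fluctuations without smearing the jump, which is the adaptivity to edges the paper exploits.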
Xiangrong Zhang - One of the best experts on this subject based on the ideXlab platform.
-
ISNN (1) - Image representation in visual cortex and high Nonlinear Approximation
Advances in Neural Networks — ISNN 2005, 2005. Co-Authors: Shan Tan, Xiangrong Zhang, Shuang Wang, Licheng Jiao. Abstract: We briefly review the "sparse coding" principle employed in the sensory information processing system of mammals and focus on the phenomenon that this principle is realized through an over-complete representation strategy in the primary sensory cortical area (V1). Since neuroscience lacks a quantitative analysis of how much gain in sparsity the over-complete representation strategy brings, in this paper we give such an analysis from the viewpoint of Nonlinear Approximation. The result shows that the over-complete strategy can provide a sparser representation than the complete strategy.
-
ISNN (1) - A review: relationship between response properties of visual neurons and advances in Nonlinear Approximation theory
Advances in Neural Networks — ISNN 2005, 2005. Co-Authors: Shan Tan, Xiangrong Zhang, Licheng Jiao. Abstract: In this review, we briefly introduce the 'sparse coding' strategy employed in the sensory information processing system of mammals, and reveal the relationship between that strategy and some recent advances in Nonlinear Approximation theory.