The experts below are selected from a list of 96 experts worldwide, ranked by the ideXlab platform.
Jianfeng Weng - One of the best experts on this subject based on the ideXlab platform.
-
A New One-Step Band-Limited Extrapolation Procedure Using Empirical Orthogonal Functions
2006 IEEE International Conference on Acoustics, Speech and Signal Processing Proceedings, 2006
Co-Authors: Jianfeng Weng
Abstract: A one-step band-limited extrapolation procedure is systematically developed under an a priori assumption of bandwidth. The rationale of the proposed scheme is to expand the known signal segment in a band-limited basis function set and then to generate a set of empirical orthogonal functions (EOFs) adaptively from the sample values of the band-limited function set. Simulation results indicate that, in addition to its attractive adaptive feature, the scheme also appears to guarantee a smooth result for inexact data, suggesting the robustness of the proposed procedure.
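As a rough illustration of the idea in this abstract, the sketch below expands the signal in a band-limited sinc basis, eigendecomposes the Gram matrix of that basis on the known samples to obtain empirical orthogonal functions, and extrapolates with a truncated expansion. The function name, the sinc basis choice, and the truncation rule are illustrative assumptions, not the author's exact procedure.

```python
import numpy as np

def eof_extrapolate(t_known, y, t_query, W, n_eof=None, tol=1e-8):
    """Band-limited extrapolation via empirical orthogonal functions (EOFs).

    Expands the signal in shifted sinc functions band-limited to [-W, W],
    forms the Gram matrix of that basis on the known segment, and keeps
    only the dominant eigenvectors (the EOFs) as regularization.
    """
    # Gram matrix of the band-limited sinc basis on the known samples
    K = 2 * W * np.sinc(2 * W * (t_known[:, None] - t_known[None, :]))
    lam, V = np.linalg.eigh(K)           # eigenvalues in ascending order
    lam, V = lam[::-1], V[:, ::-1]       # dominant EOFs first
    if n_eof is None:
        n_eof = int(np.sum(lam > tol * lam[0]))
    # Expansion coefficients from a truncated (regularized) inverse of K
    coeffs = V[:, :n_eof] @ ((V[:, :n_eof].T @ y) / lam[:n_eof])
    # Evaluate the band-limited expansion at the query points
    Phi = 2 * W * np.sinc(2 * W * (t_query[:, None] - t_known[None, :]))
    return Phi @ coeffs
```

Discarding eigenvectors with tiny eigenvalues is what makes the result smooth for inexact data: those directions correspond to band-limited functions that are nearly invisible in the known segment and would otherwise be amplified without bound.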
-
Reconstructing a Band-Limited Function Using Empirical Orthogonal Functions
2006 IEEE Instrumentation and Measurement Technology Conference Proceedings, 2006
Co-Authors: Jianfeng Weng
Abstract: A procedure for reconstructing a band-limited function from samples taken in a finite segment is systematically developed under an a priori assumption of bandwidth. The rationale of the proposed scheme is to expand the known signal segment in a band-limited basis function set and then to generate a set of empirical orthogonal functions (EOFs) adaptively from the sample values of the band-limited function set. Simulation results indicate that, in addition to its attractive adaptive feature, the scheme also appears to guarantee a smooth result for inexact data, suggesting the robustness of the proposed procedure.
Jesse M. Zhang - One of the best experts on this subject based on the ideXlab platform.
-
A Fourier-Based Approach to Generalization and Optimization in Deep Learning
IEEE Journal on Selected Areas in Information Theory, 2020
Co-Authors: Farzan Farnia, Jesse M. Zhang
Abstract: The success of deep neural networks stems from their ability to generalize well on real data; however, prior work has observed that neural networks can easily overfit randomly generated labels. This observation highlights the following question: why do gradient methods succeed in finding generalizable solutions for neural networks when solutions with poor generalization behavior also exist? In this work, we use a Fourier-based approach to study the generalization properties of gradient-based methods for two-layer neural networks with band-limited activation functions. Our results indicate that, in such settings, if the underlying data distribution enjoys nice Fourier properties, including band-limitedness and a bounded Fourier norm, then gradient descent can converge to local minima with nice generalization behavior. We also establish a Fourier-based generalization error bound for band-limited function spaces that applies to two-layer neural networks with general activation functions. This bound motivates a grouped version of path norms for measuring the complexity of two-layer neural networks with ReLU-type activation functions. We empirically demonstrate that regularizing the group path norms yields neural network solutions that fit the true labels without losing test accuracy, while not overfitting random labels.
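For concreteness, the standard path norm of a two-layer ReLU network sums |W1[j, i] * w2[j]| over all input-to-output paths. The abstract's grouped version is not defined here, so the second function below uses a hypothetical grouping by hidden unit (l2 within a unit's incoming weights, l1 across units); it is a sketch, not the paper's definition.

```python
import numpy as np

def path_norm(W1, w2):
    """Standard path norm of f(x) = w2 . relu(W1 @ x):
    the sum of |W1[j, i] * w2[j]| over all paths (i -> j -> output)."""
    return np.sum(np.abs(w2[:, None] * W1))

def group_path_norm(W1, w2):
    """A hypothetical grouped variant: group the paths through each
    hidden unit j, take the l2-norm of that unit's incoming weights
    scaled by |w2[j]|, then sum (l1) across units."""
    return np.sum(np.abs(w2) * np.linalg.norm(W1, axis=1))
```

Either quantity can be added to the training loss as a regularizer; the grouped form penalizes many active paths through a single hidden unit less severely than the plain path norm does.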
Farzan Farnia - One of the best experts on this subject based on the ideXlab platform.
-
A Fourier-Based Approach to Generalization and Optimization in Deep Learning
IEEE Journal on Selected Areas in Information Theory, 2020
Co-Authors: Farzan Farnia, Jesse M. Zhang
Abstract: The success of deep neural networks stems from their ability to generalize well on real data; however, prior work has observed that neural networks can easily overfit randomly generated labels. This observation highlights the following question: why do gradient methods succeed in finding generalizable solutions for neural networks when solutions with poor generalization behavior also exist? In this work, we use a Fourier-based approach to study the generalization properties of gradient-based methods for two-layer neural networks with band-limited activation functions. Our results indicate that, in such settings, if the underlying data distribution enjoys nice Fourier properties, including band-limitedness and a bounded Fourier norm, then gradient descent can converge to local minima with nice generalization behavior. We also establish a Fourier-based generalization error bound for band-limited function spaces that applies to two-layer neural networks with general activation functions. This bound motivates a grouped version of path norms for measuring the complexity of two-layer neural networks with ReLU-type activation functions. We empirically demonstrate that regularizing the group path norms yields neural network solutions that fit the true labels without losing test accuracy, while not overfitting random labels.
D.j. Wingham - One of the best experts on this subject based on the ideXlab platform.
-
The Reconstruction of a Band-Limited Function and Its Fourier Transform From a Finite Number of Samples at Arbitrary Locations by Singular Value Decomposition
IEEE Transactions on Signal Processing, 1992
Co-Authors: D. J. Wingham
Abstract: A method is given for the stable interpolation of a band-limited function, known at sample instants with arbitrary locations, in the presence of noise. Singular value decomposition is used to provide a series expansion that, in contrast to the method of sampling functions, permits simple identification of vectors in the minimum-norm space that are poorly represented in the sample values. Three methods for obtaining regularized reconstructions in the presence of noise are given: Miller regularization, least-squares estimation, and maximum a posteriori estimation. The singular value decomposition (SVD) is used to interrelate these methods. Examples illustrating the technique are given.
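A minimal sketch of the general approach described in this abstract, under stated assumptions: the function is modeled as a finite sum of Nyquist-spaced sinc functions, the samples are related to the coefficients through a matrix, and the matrix is inverted via a truncated SVD so that poorly represented directions are discarded rather than amplified. The function name and the simple relative-threshold truncation are illustrative; Miller regularization and MAP estimation, which the paper also covers, are omitted.

```python
import numpy as np

def svd_bandlimited_reconstruct(t_samp, y, t_out, W=0.5, n_terms=None, rcond=1e-6):
    """Reconstruct a band-limited function from arbitrarily located samples.

    Models f(t) as a finite sum of Nyquist-spaced sinc functions, relates
    the coefficients to the samples through a matrix, and inverts it with
    a truncated SVD: directions poorly represented in the sample values
    (tiny singular values) are dropped instead of amplifying noise.
    """
    T = 1.0 / (2.0 * W)                        # Nyquist spacing
    if n_terms is None:
        n_terms = int(np.ceil((t_samp.max() - t_samp.min()) / T)) + 1
    n0 = int(np.floor(t_samp.min() / T))
    grid = (n0 + np.arange(n_terms)) * T       # sinc centers
    A = np.sinc((t_samp[:, None] - grid[None, :]) / T)
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    keep = s > rcond * s[0]                    # drop poorly determined modes
    c = Vt[keep].T @ ((U[:, keep].T @ y) / s[keep])
    return np.sinc((t_out[:, None] - grid[None, :]) / T) @ c
```

Inspecting the singular values of `A` directly is the point of the method: a sharp drop in the spectrum identifies exactly which components of the minimum-norm solution the arbitrary sample locations fail to pin down.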
Qu Gang-rong - One of the best experts on this subject based on the ideXlab platform.
-
Extrapolation Algorithm of a Class of Band-Limited Function and Its Application
Science Technology and Engineering, 2020
Co-Authors: Qu Gang-rong
Abstract: The Gerchberg-Papoulis algorithm for band-limited functions is generalized in both the number of dimensions and the known region, and the convergence of the generalized algorithm is proved in the L2-norm. The algorithm is also applied to angle-limited image reconstruction. A numerical simulation in the one-dimensional case is given and shows the efficiency of the algorithm.
-
Extrapolation Algorithm on A Class of Band-Limited Function and Its Application to Angle-Limited Image Reconstruction
Journal of Beijing Jiaotong University, 2020
Co-Authors: Qu Gang-rong
Abstract: We generalize the Gerchberg-Papoulis algorithm for band-limited functions in both the number of dimensions and the known region, and prove the convergence of the generalized algorithm in the L2-norm. We also apply this algorithm to angle-limited image reconstruction. We give a numerical simulation in the one-dimensional case and show the efficiency of the algorithm.
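The classic one-dimensional Gerchberg-Papoulis iteration that these two papers generalize alternates between two projections: band-limiting in the DFT domain and restoring the known samples in the signal domain. The sketch below implements only this baseline discrete 1-D case, not the papers' generalization to higher dimensions and general known regions.

```python
import numpy as np

def gerchberg_papoulis(y_known, known_mask, n_band, n_iter=500):
    """Gerchberg-Papoulis extrapolation of a band-limited sequence.

    Alternates two projections: keep only the lowest n_band positive
    frequencies (and their conjugate-symmetric partners) in the DFT
    domain, then restore the known samples in the signal domain. The
    iterates converge to a band-limited extension of the known segment.
    """
    N = known_mask.size
    x = np.zeros(N)
    x[known_mask] = y_known              # initial guess: zero-filled
    band = np.zeros(N, dtype=bool)
    band[:n_band] = True                 # low positive frequencies
    band[N - n_band + 1:] = True         # matching negative frequencies
    for _ in range(n_iter):
        X = np.fft.fft(x)
        X[~band] = 0.0                   # project onto band-limited signals
        x = np.fft.ifft(X).real
        x[known_mask] = y_known          # project onto data-consistent signals
    return x
```

Convergence slows as the band widens or the known region shrinks, which is why the papers' L2-norm convergence proof for the generalized setting matters in practice.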