Sampled Function

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies

The Experts below are selected from a list of 50,394 Experts worldwide, ranked by the ideXlab platform

Stephane Canu - One of the best experts on this subject based on the ideXlab platform.

  • kernel basis pursuit
    Revue d'intelligence artificielle, 2006
    Co-Authors: Vincent Guigue, Alain Rakotomamonjy, Stephane Canu
    Abstract:

    Estimating a non-uniformly Sampled Function from a set of learning points is a classical regression problem. Kernel methods have been widely used in this context, but every problem leads to two major tasks: optimizing the kernel and setting the fitness-regularization compromise. This article presents a new method to estimate a Function from noisy learning points in the context of RKHS (Reproducing Kernel Hilbert Space). We introduce the Kernel Basis Pursuit algorithm, which enables us to build an L1-regularized multiple-kernel estimator. The general idea is to decompose the Function to learn on a sparse, optimal set of spanning Functions. Our implementation relies on the Least Absolute Shrinkage and Selection Operator (LASSO) formulation and on the Least Angle Regression (LARS) solver. The computation of the full regularization path through LARS enables us to propose new adaptive criteria for finding an optimal fitness-regularization compromise. Finally, we aim to propose a fast, parameter-free method for estimating non-uniformly Sampled Functions.
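The pipeline the abstract describes, a multiple-kernel dictionary fit with an L1 penalty, can be sketched as follows. The paper uses the LARS solver to trace the full regularization path; as a simple stand-in, this sketch solves the same LASSO objective by coordinate descent. The Gaussian bandwidths and the regularization weight `lam` are illustrative choices, not the authors' settings.

```python
import numpy as np

def lasso_cd(Phi, y, lam, n_sweeps=200):
    """Coordinate-descent LASSO: min_w 0.5*||y - Phi w||^2 + lam*||w||_1."""
    n, p = Phi.shape
    w = np.zeros(p)
    col_sq = (Phi ** 2).sum(axis=0)
    r = y.copy()                              # residual y - Phi w
    for _ in range(n_sweeps):
        for j in range(p):
            r += Phi[:, j] * w[j]             # add back column j's contribution
            rho = Phi[:, j] @ r
            w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r -= Phi[:, j] * w[j]
    return w

# Non-uniformly sampled, noisy learning points
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 60))
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(60)

# Multiple-kernel dictionary: Gaussian kernels of several bandwidths,
# each centered at the learning points
widths = [0.05, 0.1, 0.3]
Phi = np.hstack([np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * s ** 2))
                 for s in widths])

w_hat = lasso_cd(Phi, y, lam=1.0)
f_hat = Phi @ w_hat                           # sparse multiple-kernel estimate
```

The L1 penalty drives most coefficients to exactly zero, so the estimate uses only a small subset of the spanning Functions, which is the sparsity the abstract is after.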

  • kernel basis pursuit
    European conference on Machine Learning, 2005
    Co-Authors: Vincent Guigue, Alain Rakotomamonjy, Stephane Canu
    Abstract:

    Estimating a non-uniformly Sampled Function from a set of learning points is a classical regression problem. Kernel methods have been widely used in this context, but every problem leads to two major tasks: optimizing the kernel and setting the fitness-regularization compromise. This article presents a new method to estimate a Function from noisy learning points in the context of RKHS (Reproducing Kernel Hilbert Space). We introduce the Kernel Basis Pursuit algorithm, which enables us to build an l1-regularized multiple-kernel estimator. The general idea is to decompose the Function to learn on a sparse, optimal set of spanning Functions. Our implementation relies on the Least Absolute Shrinkage and Selection Operator (LASSO) formulation and on the Least Angle Regression (LARS) solver. The computation of the full regularization path through LARS enables us to propose new adaptive criteria for finding an optimal fitness-regularization compromise. Finally, we aim to propose a fast, parameter-free method for estimating non-uniformly Sampled Functions.

Vincent Guigue - One of the best experts on this subject based on the ideXlab platform.

  • kernel basis pursuit
    Revue d'intelligence artificielle, 2006
    Co-Authors: Vincent Guigue, Alain Rakotomamonjy, Stephane Canu
    Abstract:

    Estimating a non-uniformly Sampled Function from a set of learning points is a classical regression problem. Kernel methods have been widely used in this context, but every problem leads to two major tasks: optimizing the kernel and setting the fitness-regularization compromise. This article presents a new method to estimate a Function from noisy learning points in the context of RKHS (Reproducing Kernel Hilbert Space). We introduce the Kernel Basis Pursuit algorithm, which enables us to build an L1-regularized multiple-kernel estimator. The general idea is to decompose the Function to learn on a sparse, optimal set of spanning Functions. Our implementation relies on the Least Absolute Shrinkage and Selection Operator (LASSO) formulation and on the Least Angle Regression (LARS) solver. The computation of the full regularization path through LARS enables us to propose new adaptive criteria for finding an optimal fitness-regularization compromise. Finally, we aim to propose a fast, parameter-free method for estimating non-uniformly Sampled Functions.

  • kernel basis pursuit
    European conference on Machine Learning, 2005
    Co-Authors: Vincent Guigue, Alain Rakotomamonjy, Stephane Canu
    Abstract:

    Estimating a non-uniformly Sampled Function from a set of learning points is a classical regression problem. Kernel methods have been widely used in this context, but every problem leads to two major tasks: optimizing the kernel and setting the fitness-regularization compromise. This article presents a new method to estimate a Function from noisy learning points in the context of RKHS (Reproducing Kernel Hilbert Space). We introduce the Kernel Basis Pursuit algorithm, which enables us to build an l1-regularized multiple-kernel estimator. The general idea is to decompose the Function to learn on a sparse, optimal set of spanning Functions. Our implementation relies on the Least Absolute Shrinkage and Selection Operator (LASSO) formulation and on the Least Angle Regression (LARS) solver. The computation of the full regularization path through LARS enables us to propose new adaptive criteria for finding an optimal fitness-regularization compromise. Finally, we aim to propose a fast, parameter-free method for estimating non-uniformly Sampled Functions.

Alain Rakotomamonjy - One of the best experts on this subject based on the ideXlab platform.

  • kernel basis pursuit
    Revue d'intelligence artificielle, 2006
    Co-Authors: Vincent Guigue, Alain Rakotomamonjy, Stephane Canu
    Abstract:

    Estimating a non-uniformly Sampled Function from a set of learning points is a classical regression problem. Kernel methods have been widely used in this context, but every problem leads to two major tasks: optimizing the kernel and setting the fitness-regularization compromise. This article presents a new method to estimate a Function from noisy learning points in the context of RKHS (Reproducing Kernel Hilbert Space). We introduce the Kernel Basis Pursuit algorithm, which enables us to build an L1-regularized multiple-kernel estimator. The general idea is to decompose the Function to learn on a sparse, optimal set of spanning Functions. Our implementation relies on the Least Absolute Shrinkage and Selection Operator (LASSO) formulation and on the Least Angle Regression (LARS) solver. The computation of the full regularization path through LARS enables us to propose new adaptive criteria for finding an optimal fitness-regularization compromise. Finally, we aim to propose a fast, parameter-free method for estimating non-uniformly Sampled Functions.

  • kernel basis pursuit
    European conference on Machine Learning, 2005
    Co-Authors: Vincent Guigue, Alain Rakotomamonjy, Stephane Canu
    Abstract:

    Estimating a non-uniformly Sampled Function from a set of learning points is a classical regression problem. Kernel methods have been widely used in this context, but every problem leads to two major tasks: optimizing the kernel and setting the fitness-regularization compromise. This article presents a new method to estimate a Function from noisy learning points in the context of RKHS (Reproducing Kernel Hilbert Space). We introduce the Kernel Basis Pursuit algorithm, which enables us to build an l1-regularized multiple-kernel estimator. The general idea is to decompose the Function to learn on a sparse, optimal set of spanning Functions. Our implementation relies on the Least Absolute Shrinkage and Selection Operator (LASSO) formulation and on the Least Angle Regression (LARS) solver. The computation of the full regularization path through LARS enables us to propose new adaptive criteria for finding an optimal fitness-regularization compromise. Finally, we aim to propose a fast, parameter-free method for estimating non-uniformly Sampled Functions.

R.j.p. De Figueiredo - One of the best experts on this subject based on the ideXlab platform.

  • Sampled-Function weighted order filters
    IEEE Transactions on Circuits and Systems Ii: Analog and Digital Signal Processing, 2002
    Co-Authors: R. Oten, R.j.p. De Figueiredo
    Abstract:

    In this paper, a new type of L-filter, called a Sampled-Function weighted order (SFWO) filter, is proposed. The coefficients of this filter are samples of a bounded real-valued Function. This weighting Function is derived for any given noise distribution by examining the asymptotic behavior of the corresponding L-filter coefficients. SFWO filters designed in this way constitute a good compromise between alpha-trimmed mean filters, which are easy to design, and optimal L-filters, which are more flexible but difficult to design. In fact, for a given class of distributions, an SFWO filter can be designed in the form of a smoothly trimmed mean filter with one or more parameters and can perform as well as the optimal L-filter for that class. Design examples are given for Gaussian, Laplacian, Cauchy, triangular, parabolic, and uniform noise densities. Simulations show that SFWO filters are very promising for noise densities whose tail lengths vary from very short to very long.
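An L-filter of this type sorts the samples in a sliding window and weights the order statistics with samples of a continuous weighting function. The sketch below uses a triangular weighting, which yields a smoothly trimmed mean; this is an illustrative choice, not one of the paper's derived optimal weightings, and the function and parameter names are invented for the example.

```python
import numpy as np

def sfwo_filter(x, win, weight_fn):
    """Weighted order (L-)filter whose coefficients are samples of a
    bounded real-valued weighting function on (0, 1)."""
    t = np.arange(1, win + 1) / (win + 1)     # where the weighting function is sampled
    w = weight_fn(t)
    w = w / w.sum()                           # normalize so constant signals pass unchanged
    half = win // 2
    xp = np.pad(np.asarray(x, float), half, mode='edge')
    out = np.empty(len(x))
    for k in range(len(x)):
        out[k] = w @ np.sort(xp[k:k + win])   # weight the sorted window (order statistics)
    return out

# Triangular weighting: down-weights the extreme order statistics,
# giving a smoothly trimmed mean (illustrative, not an optimal design)
triangular = lambda t: 1.0 - np.abs(2.0 * t - 1.0)
```

With a flat weighting this reduces to the moving average, and concentrating all weight on the middle sample gives the median filter, so the weighting function interpolates between the familiar extremes the abstract compares against.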

Anders Hast - One of the best experts on this subject based on the ideXlab platform.

  • simple filter design for first and second order derivatives by a double filtering approach
    Pattern Recognition Letters, 2014
    Co-Authors: Anders Hast
    Abstract:

    Spline filters are usually implemented in two steps: in the first step, the basis coefficients are computed by deconvolving the Sampled Function with a factorized filter; in the second step, the Sampled Function is reconstructed. It is shown how separable spline filters using different splines can be constructed with fixed kernels, requiring no inverse filtering. In particular, it is discussed how first- and second-order derivatives can be computed correctly using cubic or trigonometric splines by a double filtering approach, giving filters of length 7.
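As a rough illustration of the double-filtering idea (not the paper's actual coefficients), one can compose a 3-tap cubic-B-spline smoothing kernel with a 5-tap central-difference derivative kernel: applying the two short filters in sequence is equivalent to a single fixed kernel of length 7, and no inverse filtering is involved.

```python
import numpy as np

# 3-tap cubic B-spline smoothing kernel and a 5-tap central-difference
# derivative kernel (coefficients oriented for use with np.convolve)
smooth = np.array([1.0, 4.0, 1.0]) / 6.0
diff5 = np.array([-1.0, 8.0, 0.0, -8.0, 1.0]) / 12.0

# Double filtering: convolving the two short kernels gives one fixed
# derivative filter of length 3 + 5 - 1 = 7
deriv7 = np.convolve(smooth, diff5)
assert len(deriv7) == 7

# Apply to a uniformly sampled sine; the estimate should track cos(x)
# away from the (zero-padded) boundaries
x = np.linspace(0.0, 2.0 * np.pi, 201)
dx = x[1] - x[0]
f = np.sin(x)
df = np.convolve(f, deriv7, mode='same') / dx
```

Because convolution is associative, filtering with `deriv7` once gives the same interior result as filtering with `smooth` and then `diff5`, which is the practical appeal of precomposing the double filter into a single short fixed kernel.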