Laplacian Smoothing

The Experts below are selected from a list of 267 Experts worldwide ranked by ideXlab platform

Stanley J Osher - One of the best experts on this subject based on the ideXlab platform.

  • Laplacian Smoothing Stochastic Gradient Markov Chain Monte Carlo
    SIAM Journal on Scientific Computing, 2021
    Co-Authors: Bao Wang, Difan Zou, Stanley J Osher
    Abstract:

    As an important Markov chain Monte Carlo (MCMC) method, the stochastic gradient Langevin dynamics (SGLD) algorithm has achieved great success in Bayesian learning and posterior sampling. However, S...

  • Laplacian Smoothing Stochastic Gradient Markov Chain Monte Carlo
    arXiv: Learning, 2019
    Co-Authors: Bao Wang, Difan Zou, Stanley J Osher
    Abstract:

    As an important Markov chain Monte Carlo (MCMC) method, the stochastic gradient Langevin dynamics (SGLD) algorithm has achieved great success in Bayesian learning and posterior sampling. However, SGLD typically suffers from a slow convergence rate due to the large variance introduced by its stochastic gradient. To alleviate this drawback, we leverage the recently developed Laplacian Smoothing (LS) technique and propose a Laplacian Smoothing stochastic gradient Langevin dynamics (LS-SGLD) algorithm. We prove that, for sampling from both log-concave and non-log-concave densities, LS-SGLD achieves a strictly smaller discretization error in the $2$-Wasserstein distance, although its mixing rate can be slightly slower. Experiments on both synthetic and real datasets verify our theoretical results and demonstrate the superior performance of LS-SGLD on machine learning tasks including posterior sampling, Bayesian logistic regression, and the training of Bayesian convolutional neural networks. The code is available at \url{this https URL}.
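
To make the construction above concrete, here is a minimal sketch of a smoothed Langevin update in NumPy. It assumes the drift is preconditioned by $A_\sigma^{-1} = (I - \sigma L)^{-1}$ (with $L$ the 1D periodic discrete Laplacian, applied via FFT) and the injected Gaussian noise by $A_\sigma^{-1/2}$ so that the target density is preserved; the paper's exact update and step-size schedule may differ, and the full-batch gradient here stands in for a stochastic one.

```python
import numpy as np

def ls_sgld_step(x, grad_u, lr, sigma, rng):
    # One Langevin step with a Laplacian-smoothed gradient: precondition
    # the drift by A^{-1} and the noise by A^{-1/2}, where A = I - sigma*L
    # and L is the 1D periodic discrete Laplacian; circulant matrices
    # diagonalize under the FFT, so both solves cost O(n log n).
    n = x.size
    v = np.zeros(n)
    v[0], v[1], v[-1] = -2.0, 1.0, 1.0            # first row of L (n >= 3)
    eig = 1.0 - sigma * np.fft.fft(v).real        # eigenvalues of A, all >= 1
    drift = np.fft.ifft(np.fft.fft(grad_u(x)) / eig).real
    z = rng.standard_normal(n)
    noise = np.fft.ifft(np.fft.fft(z) / np.sqrt(eig)).real
    return x - lr * drift + np.sqrt(2.0 * lr) * noise

# Sample from a standard Gaussian, U(x) = ||x||^2 / 2, so grad U(x) = x.
rng = np.random.default_rng(0)
x = np.zeros(4)
samples = []
for _ in range(2000):
    x = ls_sgld_step(x, lambda x: x, lr=0.1, sigma=1.0, rng=rng)
    samples.append(x.copy())
samples = np.array(samples)
```

Because the zero-frequency eigenvalue of $A_\sigma$ is exactly 1, the preconditioning damps high-frequency components of the gradient and the noise without rescaling their means.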

Bao Wang - One of the best experts on this subject based on the ideXlab platform.

  • Laplacian Smoothing Gradient Descent
    arXiv: Learning, 2018
    Co-Authors: Stanley Osher, Bao Wang, Penghang Yin, Xiyang Luo, Minh Pham, Alex Tong Lin
    Abstract:

    We propose a class of very simple modifications of gradient descent and stochastic gradient descent. We show that, when applied to a large variety of machine learning problems ranging from logistic regression to deep neural nets, the proposed surrogates can dramatically reduce the variance, allow a larger step size, and improve the generalization accuracy. The methods only involve multiplying the usual (stochastic) gradient by the inverse of a positive definite matrix with a low condition number (which can be computed efficiently by FFT), derived from a one-dimensional discrete Laplacian or its higher-order generalizations. The operation preserves the mean of the gradient while increasing its smallest component and decreasing its largest one. The theory of Hamilton-Jacobi partial differential equations shows that the implicit version of the new algorithm is almost the same as doing gradient descent on a new function which (i) has the same global minima as the original function and (ii) is "more convex". Moreover, we show that optimization algorithms with these surrogates converge uniformly in the discrete Sobolev $H_\sigma^p$ sense and reduce the optimality gap for convex optimization problems. The code is available at: \url{this https URL}
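
As a hedged illustration of the FFT trick the abstract describes (not the authors' released code), the surrogate gradient $A_\sigma^{-1}\nabla f$ with $A_\sigma = I - \sigma L$ and $L$ the 1D periodic discrete Laplacian can be computed in a few lines of NumPy; the periodic boundary condition and the dimension $n \ge 3$ are assumptions of this sketch.

```python
import numpy as np

def smooth_gradient(g, sigma=1.0):
    # Multiply g by (I - sigma * L)^{-1}, where L is the 1D periodic
    # discrete Laplacian; circulant matrices diagonalize under the FFT.
    n = g.size                                   # assumes n >= 3
    v = np.zeros(n)
    v[0], v[1], v[-1] = -2.0, 1.0, 1.0           # first row of L
    eig = 1.0 - sigma * np.fft.fft(v).real       # eigenvalues of A_sigma, >= 1
    return np.fft.ifft(np.fft.fft(g) / eig).real

def ls_gd(grad_f, x0, lr=0.1, sigma=1.0, steps=100):
    # Gradient descent using the Laplacian-smoothed surrogate gradient.
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(steps):
        x -= lr * smooth_gradient(grad_f(x), sigma)
    return x
```

On a simple quadratic ($\nabla f(x) = x$) the iteration contracts every Fourier mode, and because the zero-frequency eigenvalue of $A_\sigma$ is exactly 1, the smoothed gradient keeps the same mean as the raw one while shrinking its variance.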

Pheng-Ann Heng - One of the best experts on this subject based on the ideXlab platform.

  • Feature Preserving Optimization for Noisy Mesh Using Joint Bilateral Filter and Constrained Laplacian Smoothing
    Optics and Lasers in Engineering, 2013
    Co-Authors: Mingqiang Wei, Wuyao Shen, Jing Qin, Tien-Tsin Wong, Pheng-Ann Heng
    Abstract:

    Advanced 3D optical and laser scanners can generate mesh models with high-resolution details, while inevitably introducing noise from various sources and mesh irregularity due to inconsistent sampling. The noise and irregularity of a scanned model prohibit its use in practical applications where high-quality models are required. However, optimizing a noisy mesh while preserving its geometric features is a challenging task. We present a robust two-step approach to meet the challenges of noisy mesh optimization. In the first step, we propose a joint bilateral filter to remove noise from a mesh while maintaining its volume and preserving its features. In the second step, we develop a constrained Laplacian Smoothing scheme by adding two kinds of constraints to the original Laplacian equation. As most of the noise has been removed in the first step, we can easily detect feature edges in the model and add them as constraints in the Laplacian Smoothing. As a result, the constrained scheme can simultaneously preserve sharp features and avoid volume shrinkage during mesh Smoothing. By integrating these two steps, our approach can effectively remove noise, maintain features, and improve the regularity of a noisy mesh, while avoiding side effects such as volume shrinkage. Extensive qualitative and quantitative experiments have been performed on meshes with synthetic and raw noise to demonstrate the feasibility and effectiveness of our approach.
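
A toy version of the constrained step can be sketched by holding the detected feature (constraint) vertices fixed while every free vertex moves to the centroid of its neighbors; the paper's actual constraints and its volume-preservation terms are richer than this, so treat the function below as an illustration only.

```python
import numpy as np

def constrained_laplacian_smooth(coords, neighbors, constrained, iterations=10):
    # Toy constrained Laplacian smoothing: free vertices move to the
    # centroid of their neighbors; constrained (feature) vertices stay put.
    coords = np.asarray(coords, dtype=float).copy()
    for _ in range(iterations):
        updated = coords.copy()
        for i, nbrs in neighbors.items():
            if i not in constrained:
                updated[i] = coords[list(nbrs)].mean(axis=0)
        coords = updated
    return coords

# A displaced vertex inside a unit square relaxes to the square's center,
# while the four corner vertices (acting as constraints) do not move.
verts = [[0, 0], [1, 0], [1, 1], [0, 1], [0.2, 0.9]]
out = constrained_laplacian_smooth(verts, {4: [0, 1, 2, 3]}, constrained={0, 1, 2, 3})
```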

Srinivas Kodiyalam - One of the best experts on this subject based on the ideXlab platform.

  • A Constrained Optimization Approach to Finite Element Mesh Smoothing
    Finite Elements in Analysis and Design, 1991
    Co-Authors: V N Parthasarathy, Srinivas Kodiyalam
    Abstract:

    The quality of a finite element solution has been shown to be affected by the quality of the underlying mesh. A poor mesh may lead to unstable and/or inaccurate finite element approximations. Mesh quality is often characterized by the “smoothness” or “shape” of the elements (triangles in 2-D or tetrahedra in 3-D). Most automatic mesh generators produce an initial mesh in which the aspect ratios of the elements are unacceptably high. In this paper, a new approach to producing acceptable-quality meshes from a topologically valid initial mesh is presented. Given an initial mesh (nodal coordinates and element connectivity), a “smooth” final mesh is obtained by solving a constrained optimization problem. The variables for the iterative optimization procedure are the nodal coordinates (excluding the boundary nodes) of the finite element mesh, and appropriate bounds are imposed on these to prevent an unacceptable finite element mesh. Examples are given of the application of the method to 2- and 3-D meshes generated using quadtree/octree automatic mesh generators. Results indicate that the new method not only yields better-quality elements than traditional Laplacian Smoothing, but also guarantees a valid mesh, unlike the Laplacian method.
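
The contrast with plain Laplacian smoothing can be illustrated by clamping each node's per-iteration displacement, a crude stand-in for the bounds the paper places on the nodal coordinates (its real method optimizes an element-shape objective under those bounds); the clamped update below is an illustrative sketch, not the published algorithm.

```python
import numpy as np

def bounded_smooth(coords, neighbors, interior, max_step=0.05, iterations=20):
    # Laplacian-style update whose per-iteration displacement is clamped,
    # mimicking bound constraints that keep nodes from producing an
    # invalid (e.g. inverted-element) mesh.
    coords = np.asarray(coords, dtype=float).copy()
    for _ in range(iterations):
        for i in interior:
            step = coords[list(neighbors[i])].mean(axis=0) - coords[i]
            norm = np.linalg.norm(step)
            if norm > max_step:          # project the move onto the bound
                step *= max_step / norm
            coords[i] += step
    return coords
```

With a small `max_step`, a single iteration moves a node only slightly toward the centroid of its neighbors; with enough iterations the node still reaches the centroid, but never via a jump large enough to fold an element in one step.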

Yi Liang - One of the best experts on this subject based on the ideXlab platform.

  • An Improved Laplacian Smoothing Approach for Surface Meshes
    International Conference on Conceptual Structures, 2007
    Co-Authors: Ligang Chen, Yao Zheng, Jianjun Chen, Yi Liang
    Abstract:

    This paper presents an improved Laplacian Smoothing approach (ILSA) to optimize surface meshes while maintaining the essential characteristics of the discrete surfaces. The approach first detects the feature nodes of a mesh using a simple method, and then moves each adjustable, or free, node to a new position, which is found by first computing an optimal displacement for the node and then projecting it back onto the original discrete surface. The optimal displacement is initially computed by the ILSA and then adjusted iteratively by solving a constrained optimization problem with a quadratic penalty approach, in order to avoid inverted elements. Several examples are presented to illustrate the approach's capability of improving the quality of triangular surface meshes.