Backward Algorithm

Yann Guédon - One of the best experts on this subject based on the ideXlab platform.

  • Exploring the segmentation space for the assessment of multiple change-point models
    2008
    Co-Authors: Yann Guédon
    Abstract:

    This paper addresses the retrospective or off-line multiple change-point detection problem. Methods for exploring the space of possible segmentations of a sequence for a fixed number of change points may be divided into two categories: (i) enumeration of segmentations, (ii) summary of the possible segmentations in change-point or segment profiles. Concerning the first category, a forward dynamic programming algorithm for computing the top L most probable segmentations and a forward-backward algorithm for sampling segmentations are derived. Concerning the second category, a forward-backward dynamic programming algorithm and a smoothing-type forward-backward algorithm for computing two types of change-point and segment profiles are derived. The proposed methods are mainly useful for exploring the space of possible segmentations for successive numbers of change points and provide a set of assessment tools for multiple change-point models. We show using examples that the proposed methods may help to compare alternative multiple change-point models (e.g. Gaussian model with piecewise constant variances or global variance), predict supplementary change points, highlight overestimation of the number of change points and summarize the uncertainty concerning the location of change points.
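
The forward dynamic programming idea behind these methods can be illustrated on a simpler stand-in problem: optimal segmentation of a sequence into a fixed number of segments under a least-squares cost (the paper's model-based costs are more general). The function name and the quadratic cost below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def segment(x, n_seg):
    """Split sequence x into n_seg contiguous segments minimizing the
    within-segment sum of squared deviations (a common change-point cost,
    standing in for the model-based costs of the paper).
    Returns the change-point positions and the optimal cost."""
    n = len(x)
    # prefix sums give each segment's SSE in O(1)
    s = np.concatenate([[0.0], np.cumsum(x)])
    s2 = np.concatenate([[0.0], np.cumsum(np.asarray(x, float) ** 2)])
    def sse(i, j):  # SSE of x[i..j], inclusive indices
        m = j - i + 1
        return s2[j + 1] - s2[i] - (s[j + 1] - s[i]) ** 2 / m
    INF = float("inf")
    # D[k][j] = best cost of splitting x[0..j] into k segments
    D = np.full((n_seg + 1, n), INF)
    back = np.zeros((n_seg + 1, n), dtype=int)
    for j in range(n):
        D[1][j] = sse(0, j)
    for k in range(2, n_seg + 1):          # forward recursion over segments
        for j in range(k - 1, n):
            for i in range(k - 1, j + 1):  # last segment starts at i
                c = D[k - 1][i - 1] + sse(i, j)
                if c < D[k][j]:
                    D[k][j], back[k][j] = c, i
    cps, j = [], n - 1                     # trace back the change points
    for k in range(n_seg, 1, -1):
        i = back[k][j]
        cps.append(i)
        j = i - 1
    return sorted(cps), D[n_seg][n - 1]
```

The triple loop costs O(L n^2) for L segments, which is what makes exhaustive exploration of the segmentation space feasible for fixed numbers of change points.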

  • Exploring the state sequence space for hidden Markov and semi-Markov chains
    Computational Statistics & Data Analysis, 2007
    Co-Authors: Yann Guédon
    Abstract:

    The knowledge of the state sequences that explain a given observed sequence for a known hidden Markovian model is the basis of various methods that may be divided into three categories: (i) enumeration of state sequences; (ii) summary of the possible state sequences in state profiles; (iii) computation of a global measure of the state sequence uncertainty. Concerning the first category, the generalized Viterbi algorithm for computing the top L most probable state sequences and the forward-backward algorithm for sampling state sequences are derived for hidden semi-Markov chains and hidden hybrid models combining Markovian and semi-Markovian states. Concerning the second category, a new type of state (and state change) profiles is proposed. The Viterbi forward-backward algorithm for computing these state profiles is derived for hidden semi-Markov chains and hidden hybrid models combining Markovian and semi-Markovian states. Concerning the third category, an algorithm for computing the entropy of the state sequence that explains an observed sequence is proposed. The complementarity and properties of these methods for exploring the state sequence space (including the classical state profiles computed by the forward-backward algorithm) are investigated and illustrated with examples.
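
The forward-backward sampling mentioned in category (i) can be sketched for an ordinary hidden Markov chain (the paper's contribution is its extension to semi-Markov and hybrid models): a forward filtering pass followed by backward sampling of the states. This is a generic illustration, not the authors' code.

```python
import numpy as np

def sample_state_sequences(pi, A, B, obs, n_samples, seed=0):
    """Draw state sequences from P(states | obs) for a discrete HMM by
    forward filtering followed by backward sampling.
    pi: initial state distribution, A: transition matrix,
    B: emission matrix with B[j, k] = P(obs k | state j)."""
    rng = np.random.default_rng(seed)
    T, n = len(obs), len(pi)
    alpha = np.zeros((T, n))
    alpha[0] = pi * B[:, obs[0]]
    alpha[0] /= alpha[0].sum()             # normalize to avoid underflow
    for t in range(1, T):                  # forward (filtering) pass
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        alpha[t] /= alpha[t].sum()
    seqs = np.zeros((n_samples, T), dtype=int)
    for s in range(n_samples):
        # sample the last state from the filtering distribution, then go back
        seqs[s, T - 1] = rng.choice(n, p=alpha[T - 1])
        for t in range(T - 2, -1, -1):
            w = alpha[t] * A[:, seqs[s, t + 1]]
            seqs[s, t] = rng.choice(n, p=w / w.sum())
    return seqs
```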

  • Computational methods for hidden Markov tree models-an application to wavelet trees
    IEEE Transactions on Signal Processing, 2004
    Co-Authors: Jean-baptiste Durand, Paulo Gonçalves, Yann Guédon
    Abstract:

    Hidden Markov tree models were introduced by Crouse et al. in 1998 for modeling nonindependent, non-Gaussian wavelet transform coefficients. In their paper, they developed the equivalent of the forward-backward algorithm for hidden Markov tree models and called it the "upward-downward algorithm". This algorithm is subject to the same numerical limitations as the forward-backward algorithm for hidden Markov chains (HMCs). In this paper, adapting the ideas of Devijver from 1985, we propose a new "upward-downward" algorithm, which is a true smoothing algorithm and is immune to numerical underflow. Furthermore, we propose a Viterbi-like algorithm for global restoration of the hidden state tree. The contribution of these algorithms as diagnostic tools is illustrated through the modeling of statistical dependencies between wavelet coefficients, with a special emphasis on local regularity changes.
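
For the chain case, the Devijver-style smoothing recursion that the upward-downward algorithm generalizes to trees can be sketched as follows; normalizing the forward variables at each step keeps every stored quantity in [0, 1], which is what makes the recursion immune to underflow. The interface below is an illustrative assumption.

```python
import numpy as np

def forward_backward(pi, A, B, obs):
    """Scaled forward-backward smoothing for a discrete hidden Markov chain.
    Returns the state profiles gamma[t, j] = P(state_t = j | obs) and the
    observation log-likelihood. The per-step normalization constants c[t]
    bound all stored quantities, so the recursion cannot underflow."""
    T, n = len(obs), len(pi)
    alpha = np.zeros((T, n)); beta = np.zeros((T, n)); c = np.zeros(T)
    alpha[0] = pi * B[:, obs[0]]
    c[0] = alpha[0].sum(); alpha[0] /= c[0]
    for t in range(1, T):                      # forward (filtering) pass
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        c[t] = alpha[t].sum(); alpha[t] /= c[t]
    beta[T - 1] = 1.0
    for t in range(T - 2, -1, -1):             # backward pass, same scaling
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / c[t + 1]
    gamma = alpha * beta                       # rows already sum to one
    return gamma, np.log(c).sum()              # profiles and log P(obs)
```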

Audrey Repetti - One of the best experts on this subject based on the ideXlab platform.

  • ICASSP - A Forward-Backward Algorithm for Reweighted Procedures: Application to Radio-Astronomical Imaging
    ICASSP 2020 - 2020 IEEE International Conference on Acoustics Speech and Signal Processing (ICASSP), 2020
    Co-Authors: Audrey Repetti, Yves Wiaux
    Abstract:

    During the last decades, reweighted procedures have shown high efficiency in computational imaging. They aim to handle non-convex composite penalization functions by iteratively solving multiple approximated sub-problems. Although the asymptotic behaviour of these methods has recently been investigated in several works, they all require the sub-problems to be solved accurately, which can be sub-optimal in practice. In this work we present a reweighted forward-backward algorithm designed to handle non-convex composite functions. Unlike existing convergence studies in the literature, the weighting procedure is directly included within the iterations, avoiding the need to solve any sub-problem. We show that the obtained reweighted forward-backward algorithm converges to a critical point of the initial objective function. We illustrate the good behaviour of the proposed approach on a Fourier imaging example borrowed from radio-astronomical imaging.
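
A minimal sketch of the idea, with the weights folded into each forward-backward iteration rather than fixed over inner sub-problems; the log-sum penalty, the parameters `lam` and `eps`, and the weight formula are illustrative assumptions, not the authors' exact scheme.

```python
import numpy as np

def reweighted_fb(A, b, lam=0.1, eps=1e-2, n_iter=200):
    """Forward-backward iterations for the non-convex composite objective
    0.5 ||Ax - b||^2 + lam * sum_i log(1 + |x_i| / eps),
    with the reweighting included in every iteration: each step is a
    gradient (forward) step followed by a weighted soft-threshold
    (backward) step whose weights come from the current iterate."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L for the smooth term
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = A.T @ (A @ x - b)                # forward (gradient) step
        z = x - step * g
        w = 1.0 / (eps + np.abs(x))          # weights updated each iteration
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam * w, 0.0)
    return x
```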

  • Variable Metric Forward-Backward Algorithm for Composite Minimization Problems
    arXiv: Optimization and Control, 2019
    Co-Authors: Audrey Repetti, Yves Wiaux
    Abstract:

    We present a forward-backward-based algorithm to minimize a sum of a differentiable function and a nonsmooth function, both being possibly nonconvex. The main contribution of this work is to consider the challenging case where the nonsmooth function corresponds to a sum of non-convex functions, resulting from composition between a strictly increasing, concave, differentiable function and a convex nonsmooth function. The proposed variable metric Composite Function Forward-Backward Algorithm (C2FB) circumvents the explicit, and often challenging, computation of the proximity operator of the composite functions through a majorize-minimize approach. Precisely, each composite function is majorized using a linear approximation of the differentiable function, which allows one to apply the proximity step only to the sum of the nonsmooth functions. We prove the convergence of the algorithm iterates to a critical point of the objective function, leveraging the Kurdyka-Łojasiewicz inequality. The convergence is guaranteed even if the proximity operators are computed inexactly, considering relative errors. We show that the proposed approach is a generalization of reweighting methods, with convergence guarantees. In particular, applied to the log-sum function, our algorithm reduces to a generalized version of the celebrated reweighted $\ell_1$ method. Finally, we show through simulations on an image processing problem that the proposed C2FB algorithm requires fewer iterations to converge and leads to better critical points compared with traditional reweighting methods and classic forward-backward algorithms.
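
For the log-sum penalty, the majorize-minimize step underlying this construction can be made explicit. With a hypothetical smoothing parameter $\varepsilon > 0$ and current iterate $x^{(k)}$, concavity of $u \mapsto \log(1 + u/\varepsilon)$ gives the linear majorant

$$\log\Big(1+\frac{|x_i|}{\varepsilon}\Big) \;\le\; \log\Big(1+\frac{|x_i^{(k)}|}{\varepsilon}\Big) + \frac{|x_i| - |x_i^{(k)}|}{\varepsilon + |x_i^{(k)}|},$$

so minimizing the majorant amounts to a weighted $\ell_1$ proximal step with weights $w_i^{(k)} = 1/(\varepsilon + |x_i^{(k)}|)$, which is the classical reweighted $\ell_1$ update recovered as a special case in the paper.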

  • A block coordinate variable metric forward–Backward Algorithm
    Journal of Global Optimization, 2016
    Co-Authors: Emilie Chouzenoux, Jean-christophe Pesquet, Audrey Repetti
    Abstract:

    A number of recent works have emphasized the prominent role played by the Kurdyka-Łojasiewicz inequality for proving the convergence of iterative algorithms solving possibly nonsmooth/nonconvex optimization problems. In this work, we consider the minimization of an objective function satisfying this property, which is a sum of two terms: (i) a differentiable, but not necessarily convex, function and (ii) a function that is not necessarily convex, nor necessarily differentiable. The latter function is expressed as a separable sum of functions of blocks of variables. Such an optimization problem can be addressed with the forward-backward algorithm, which can be accelerated thanks to the use of variable metrics derived from the Majorize-Minimize principle. We propose to combine the latter acceleration technique with an alternating minimization strategy which relies upon a flexible update rule. We give conditions under which the sequence generated by the resulting Block Coordinate Variable Metric Forward-Backward Algorithm converges to a critical point of the objective function. An application example to a nonconvex phase retrieval problem encountered in signal/image processing shows the efficiency of the proposed optimization method.
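
A minimal sketch of the block-coordinate structure, on a two-block lasso-type problem, with a per-block Lipschitz step standing in for the paper's Majorize-Minimize variable metrics; the problem data and block split below are illustrative.

```python
import numpy as np

def block_fb(Q, b, lam, n_iter=100):
    """Block-coordinate forward-backward for
    0.5 x^T Q x - b^T x + lam * ||x||_1, cycling over two blocks of x.
    Each block uses its own step 1/L_j from the block's Lipschitz constant,
    a diagonal stand-in for the variable metrics of the paper."""
    n = len(b); half = n // 2
    blocks = [np.arange(half), np.arange(half, n)]
    L = [np.linalg.norm(Q[np.ix_(j, j)], 2) for j in blocks]
    x = np.zeros(n)
    for _ in range(n_iter):
        for j, Lj in zip(blocks, L):           # alternate over blocks
            g = Q[j] @ x - b[j]                # gradient w.r.t. block j
            z = x[j] - g / Lj                  # forward step on the block
            x[j] = np.sign(z) * np.maximum(np.abs(z) - lam / Lj, 0.0)
    return x
```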

Håkon Tjelmeland - One of the best experts on this subject based on the ideXlab platform.

  • Exact and Approximate Recursive Calculations for Binary Markov Random Fields Defined on Graphs
    Journal of Computational and Graphical Statistics, 2012
    Co-Authors: Håkon Tjelmeland, Haakon Michael Austad
    Abstract:

    In this article, we propose computationally feasible approximations to binary Markov random fields (MRFs). The basis of the approximation is the forward-backward algorithm. The exact forward-backward algorithm is computationally feasible only for fields defined on small lattices. The forward part of the algorithm computes a series of joint marginal distributions by summing out each variable in turn. We represent these joint marginal distributions by interaction parameters of different orders. The approximation is defined by approximating to zero all interaction parameters that are sufficiently close to zero. In addition, an interaction parameter is approximated to zero whenever all associated lower-level interactions are (approximated to) zero. If sufficiently many interaction parameters are set to zero, the algorithm is computationally feasible both in terms of computation time and memory requirements. The resulting approximate forward part of the forward-backward algorithm defines an approximation to th...
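
On a chain, the forward part (summing out each variable in turn) is exact and cheap; the sketch below computes the partition function of a small binary chain MRF this way. It only illustrates the recursion that the paper approximates on general graphs and lattices, where the eliminated tables grow; the parametrization is an assumption.

```python
import numpy as np

def chain_partition(h, J):
    """Partition function of a binary chain MRF,
    Z = sum_x exp(sum_i h[i]*x_i + sum_i J[i]*x_i*x_{i+1}), x_i in {0, 1},
    computed by summing out the variables one at a time (the exact forward
    recursion; on general graphs the summed-out tables must be approximated,
    as in the paper)."""
    n = len(h)
    m = np.ones(2)                 # message over the next variable's value
    for i in range(n - 1):
        new = np.zeros(2)
        for v_next in (0, 1):
            for v in (0, 1):       # sum out variable i
                new[v_next] += m[v] * np.exp(h[i] * v + J[i] * v * v_next)
        m = new
    return m[0] + m[1] * np.exp(h[-1])   # sum out the last variable
```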

  • Approximate forward-Backward Algorithm for a switching linear Gaussian model
    Computational Statistics & Data Analysis, 2011
    Co-Authors: Hugo Lewi Hammer, Håkon Tjelmeland
    Abstract:

    A hidden Markov model with two hidden layers is considered. The bottom layer is a Markov chain, and given this the variables in the second hidden layer are assumed conditionally independent and Gaussian distributed. The observation process is Gaussian with mean values that are linear functions of the second hidden layer. The forward-backward algorithm is not directly feasible for this model, as the recursions result in a mixture of Gaussian densities where the number of terms grows exponentially with the length of the Markov chain. By dropping the less important Gaussian terms, an approximate forward-backward algorithm is defined. Thereby one obtains a computationally feasible algorithm that generates samples from an approximation to the conditional distribution of the unobserved layers given the data. The approximate algorithm is also used as a proposal distribution in a Metropolis-Hastings setting, which gives high acceptance rates and good convergence and mixing properties. The model considered is related to what is known as switching linear dynamical systems. The proposed algorithm can in principle also be used for these models, so its potential use is large. In simulation examples the algorithm is applied to the problem of seismic inversion. The simulations demonstrate the effectiveness and quality of the proposed approximate algorithm.
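
The pruning idea can be sketched on a toy 1-D switching model: each forward step multiplies the number of Gaussian components by the number of regimes, and dropping the lowest-weight components keeps the filter tractable. The model, its parameters, and the pruning rule below are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def gauss(y, mu, var):
    """Gaussian density N(y; mu, var)."""
    return np.exp(-0.5 * (y - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def pruned_filter(obs, trans, drift, var_x, var_y, k_max=8):
    """Approximate forward filter for a toy 1-D switching linear Gaussian
    model: x_t = x_{t-1} + drift[r_t] + N(0, var_x), y_t = x_t + N(0, var_y),
    where the regime r_t is a Markov chain with transition matrix `trans`.
    The exact filtering density is a Gaussian mixture whose number of
    components is multiplied by the number of regimes at every step; keeping
    only the k_max largest-weight components makes the filter tractable.
    Returns the final list of (weight, mean, variance, regime) components."""
    n_reg = len(trans)
    # start from x_0 ~ N(0, 1) and a uniform regime distribution
    comps = [(1.0 / n_reg, 0.0, 1.0, r) for r in range(n_reg)]
    for y in obs:
        new = []
        for w, m, v, r in comps:
            for r2 in range(n_reg):                  # branch on next regime
                mp, vp = m + drift[r2], v + var_x    # predict
                lik = gauss(y, mp, vp + var_y)       # evidence term
                gain = vp / (vp + var_y)             # scalar Kalman gain
                new.append((w * trans[r][r2] * lik,
                            mp + gain * (y - mp),    # posterior mean
                            (1.0 - gain) * vp,       # posterior variance
                            r2))
        new.sort(key=lambda c: -c[0])
        comps = new[:k_max]                          # drop low-weight terms
        total = sum(c[0] for c in comps)
        comps = [(w / total, m, v, r) for w, m, v, r in comps]
    return comps
```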

Charles Dossal - One of the best experts on this subject based on the ideXlab platform.

  • Convergence rate of inertial Forward–Backward Algorithm beyond Nesterov’s rule
    Mathematical Programming, 2018
    Co-Authors: Vassilis Apidopoulos, Jean-françois Aujol, Charles Dossal
    Abstract:

    In this paper we study the convergence of an inertial forward-backward algorithm with a particular choice of over-relaxation term. In particular, we show that for a sequence of over-relaxation parameters that do not satisfy Nesterov's rule, one can still expect relatively fast convergence properties for the objective function. In addition, we complement this work by studying the convergence of the algorithm in the case where the proximal operator is computed inexactly, in the presence of errors, and we give sufficient conditions on these errors in order to obtain convergence properties for the objective function.
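
A sketch of such an inertial forward-backward scheme on a lasso problem, with extrapolation weight (k-1)/(k+a): a = 3 recovers a standard Nesterov-type rule, while other choices of a correspond to the kind of over-relaxation studied here. The problem and parameter values are illustrative only.

```python
import numpy as np

def inertial_fb(A, b, lam, a=3.0, n_iter=300):
    """Inertial forward-backward (FISTA-style) iterations for the lasso
    objective 0.5 ||Ax - b||^2 + lam * ||x||_1, with extrapolation
    coefficient (k - 1) / (k + a) at iteration k."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2     # 1/L for the smooth term
    x = x_prev = np.zeros(A.shape[1])
    for k in range(1, n_iter + 1):
        y = x + (k - 1) / (k + a) * (x - x_prev)   # inertial extrapolation
        z = y - step * (A.T @ (A @ y - b))         # forward (gradient) step
        x_prev = x
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # backward
    return x
```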

  • Stability of over-relaxations for the Forward-Backward Algorithm, application to FISTA
    SIAM Journal on Optimization, 2015
    Co-Authors: Jean-françois Aujol, Charles Dossal
    Abstract:

    This paper is concerned with the convergence of over-relaxations of the forward-backward algorithm (FB), in particular the fast iterative soft-thresholding algorithm (FISTA), in the case when proximal maps and/or gradients are computed with possible errors. We show that, provided these errors are small enough, the algorithm still converges to a minimizer of the functional, with a speed of convergence (in terms of values of the functional) that remains the same as in the noise-free case. We also show that larger errors can be allowed by using a lower over-relaxation than FISTA. This still leads to convergence of the iterates, with an ergodic convergence speed faster than that of the classical FB and FISTA.

Hedy Attouch - One of the best experts on this subject based on the ideXlab platform.

  • Convergence of a Relaxed Inertial Forward–Backward Algorithm for Structured Monotone Inclusions
    Applied Mathematics & Optimization, 2019
    Co-Authors: Hedy Attouch, Alexandre Cabot
    Abstract:

    In a Hilbert space $\mathcal{H}$, we study the convergence properties of a class of relaxed inertial forward-backward algorithms. They aim to solve structured monotone inclusions of the form $Ax + Bx \ni 0$, where $A:\mathcal{H}\rightarrow 2^{\mathcal{H}}$ is a maximally monotone operator and $B:\mathcal{H}\rightarrow \mathcal{H}$ is a cocoercive operator. We extend to this class of problems the acceleration techniques initially introduced by Nesterov, then developed by Beck and Teboulle in the case of structured convex minimization (FISTA). As an important element of our approach, we develop an inertial and parametric version of the Krasnoselskii-Mann theorem, where joint adjustment of the inertia and relaxation parameters plays a central role. This study comes as a natural extension of the techniques introduced by the authors for the study of relaxed inertial proximal algorithms. An illustration is given to the inertial Nash equilibration of a game combining non-cooperative and cooperative aspects.

  • Backward–forward Algorithms for structured monotone inclusions in Hilbert spaces
    Australian Journal of Mathematical Analysis and Applications, 2018
    Co-Authors: Hedy Attouch, Juan Peypouquet, Patrick Redont
    Abstract:

    In this paper, we study the backward-forward algorithm as a splitting method to solve structured monotone inclusions and convex minimization problems in Hilbert spaces. It has a natural link with the forward-backward algorithm and has the same computational complexity, since it involves the same basic blocks, but organized differently. Surprisingly enough, this kind of iteration arises when studying the time discretization of the regularized Newton method for maximally monotone operators. First, we show that these two methods enjoy remarkable involutive relations, which go far beyond the evident inversion of the order in which the forward and backward steps are applied. Next, we establish several convergence properties for both methods, some of which were unknown even for the forward-backward algorithm. This brings further insight into this well-known scheme. Finally, we specialize our results to structured convex minimization problems and the gradient-projection algorithms, and give a numerical illustration of theoretical interest.
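
The inversion of the order of the two steps can be made concrete on a one-dimensional toy problem. Note that for the backward-forward iteration the minimizer is recovered as the prox of the limit, not the limit itself. The problem data below are illustrative.

```python
import numpy as np

def soft(z, t):                                 # prox of t * |.|
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def forward_backward_step(x, grad, prox, step):
    return prox(x - step * grad(x), step)       # gradient step, then prox

def backward_forward_step(x, grad, prox, step):
    p = prox(x, step)                           # prox step, then gradient
    return p - step * grad(p)

def minimize(step_fn, step=1.0, n_iter=20):
    """Minimize 0.5 (x - 1)^2 + 0.2 |x| (minimizer x* = 0.8) with a given
    splitting step: same basic blocks, different order, as in the paper."""
    grad = lambda x: x - 1.0
    prox = lambda z, t: soft(z, 0.2 * t)
    x = 0.0
    for _ in range(n_iter):
        x = step_fn(x, grad, prox, step)
    return x
```

Forward-backward iterates converge to the minimizer 0.8 directly, while backward-forward iterates converge to a point whose prox is the minimizer.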

  • A Dynamical Approach to an Inertial Forward-Backward Algorithm for Convex Minimization
    SIAM Journal on Optimization, 2014
    Co-Authors: Hedy Attouch, Juan Peypouquet, Patrick Redont
    Abstract:

    We introduce a new class of forward-backward algorithms for structured convex minimization problems in Hilbert spaces. Our approach relies on the time discretization of a second-order differential system with two potentials and Hessian-driven damping, recently introduced in [H. Attouch, P.-E. Maingé, and P. Redont, Differ. Equ. Appl., 4 (2012), pp. 27--65]. This system can be equivalently written as a first-order system in time and space, each of the two constitutive equations involving one (and only one) of the two potentials. Its time discretization naturally leads to the introduction of forward-backward splitting algorithms with inertial features. Using a Liapunov analysis, we show the convergence of the algorithm under conditions enlarging the classical step-size limitation. Then we specialize our results to gradient-projection algorithms and give some illustrations of sparse signal recovery and feasibility problems.