Sylvester Equation
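
All of the entries below concern the Sylvester equation, i.e. the linear matrix equation A X + X B = C in the unknown matrix X. As a point of reference (not taken from any of the papers listed here), a minimal Python sketch of a direct solve for the time-invariant case, using SciPy's Bartels-Stewart solver:

    import numpy as np
    from scipy.linalg import solve_sylvester

    rng = np.random.default_rng(0)
    A = rng.standard_normal((3, 3))   # n x n
    B = rng.standard_normal((2, 2))   # m x m
    C = rng.standard_normal((3, 2))   # n x m

    # Solves A @ X + X @ B = C via the Bartels-Stewart algorithm.
    X = solve_sylvester(A, B, C)
    print(np.allclose(A @ X + X @ B, C))   # expected: True

The time-varying problems studied below replace A, B, C by matrix functions of time, which is why the papers turn to continuous-time (neurodynamic) solvers instead of a one-shot factorization.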

The Experts below are selected from a list of 2223 Experts worldwide ranked by ideXlab platform

Lin Xiao - One of the best experts on this subject based on the ideXlab platform.

  • Design, verification and robotic application of a novel recurrent neural network for computing dynamic Sylvester equation
    Neural Networks, 2018
    Co-Authors: Lin Xiao, Zhijun Zhang, Zili Zhang
    Abstract:

    To solve the dynamic Sylvester equation in the presence of additive noises, a novel recurrent neural network (NRNN) with finite-time convergence and excellent robustness is proposed and analyzed in this paper. Compared with the design process of the Zhang neural network (ZNN), the proposed NRNN is based on an ingenious integral design formula activated by nonlinear functions, which expedite convergence and suppress unknown additive noises during the solving process of the dynamic Sylvester equation. In addition, the global stability, finite-time convergence, and denoising property of the NRNN model are theoretically proved, and the upper bound of the finite convergence time of the NRNN model is estimated in theory. Simulation results further verify the efficiency of the NRNN model, as well as its superior robustness and finite-time performance compared with the conventional ZNN model for the dynamic Sylvester equation under additive noises. Finally, the proposed design method for establishing the NRNN model is successfully applied to the kinematic control of a robotic manipulator subject to additive noises.
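
    A minimal numerical sketch of the general idea (my own simplification, not the authors' NRNN implementation): drive the error E(t) = A(t)X(t) + X(t)B(t) - C(t) to zero with a design formula that also feeds back the integral of the error, dE/dt = -k1*E - k2*Integral(E), so that additive noise injected into the state update is largely rejected. The gains k1 and k2, the step size, the noise level, and the test matrices are illustrative assumptions.

      import numpy as np
      from scipy.linalg import solve_sylvester

      def A(t): return np.array([[3.0 + np.sin(t), 0.5], [0.2, 3.0 + np.cos(t)]])
      def B(t): return np.array([[2.0, np.cos(t)], [0.1, 2.0 + np.sin(t)]])
      def C(t): return np.array([[np.sin(t), np.cos(t)], [np.cos(t), np.sin(t)]])

      dt, k1, k2 = 1e-3, 10.0, 50.0
      X = np.zeros((2, 2))            # arbitrary initial state
      S = np.zeros((2, 2))            # running integral of the error
      rng = np.random.default_rng(1)

      for step in range(int(5.0 / dt)):
          t = step * dt
          E = A(t) @ X + X @ B(t) - C(t)
          # derivatives of the coefficient matrices by forward differences
          dA = (A(t + dt) - A(t)) / dt
          dB = (B(t + dt) - B(t)) / dt
          dC = (C(t + dt) - C(t)) / dt
          # impose dE/dt = -k1*E - k2*S, i.e. solve A dX + dX B = rhs for dX
          rhs = -k1 * E - k2 * S - dA @ X - X @ dB + dC
          dX = solve_sylvester(A(t), B(t), rhs)
          noise = 0.5 * rng.standard_normal((2, 2))   # additive noise on the update
          X = X + dt * (dX + noise)
          S = S + dt * E

      print(np.linalg.norm(A(t) @ X + X @ B(t) - C(t)))   # stays small despite the noise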

  • A finite-time recurrent neural network for solving online time-varying Sylvester matrix equation based on a new evolution formula
    Nonlinear Dynamics, 2017
    Co-Authors: Lin Xiao
    Abstract:

    The Sylvester equation is widely used to study the stability of nonlinear systems in the control field. In this paper, a finite-time Zhang neural network (FTZNN) is proposed and applied to the online solution of the time-varying Sylvester equation. Differing from conventional accelerating methods, the design of the proposed FTZNN model is based on a new evolution formula, which is presented and studied to accelerate the convergence speed of a recurrent neural network. Compared with the original Zhang neural network (ZNN) for the time-varying Sylvester equation, the FTZNN model converges to the theoretical time-varying solution within finite time, instead of converging exponentially with time. In addition, the upper bound of the finite convergence time of the FTZNN model can be derived in theory. Simulation results show that the proposed FTZNN model achieves better performance than the original ZNN model for solving the online time-varying Sylvester equation.
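
    The finite-time property can be seen on a scalar toy error. A sketch of the principle (the power-type activation used here is an illustrative stand-in, not the paper's specific evolution formula): with phi(e) = |e|^r * sign(e), 0 < r < 1, the dynamics de/dt = -gamma*phi(e) reach exactly zero at t* = |e(0)|^(1-r) / (gamma*(1-r)), whereas the linear design de/dt = -gamma*e only decays exponentially.

      import numpy as np

      gamma, r, dt = 1.0, 0.5, 1e-4
      e_pow, e_lin = 2.0, 2.0                                # same initial error
      t_star = abs(e_pow) ** (1 - r) / (gamma * (1 - r))     # predicted settling time (~2.83)

      t = 0.0
      while t < t_star:
          e_pow -= dt * gamma * np.sign(e_pow) * abs(e_pow) ** r   # finite-time design
          e_lin -= dt * gamma * e_lin                              # exponential design
          t += dt

      print(e_pow, e_lin)   # e_pow ~ 1e-8 (the Euler floor), e_lin still ~ 0.12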

Long Jin - One of the best experts on this subject based on the ideXlab platform.

  • Noise-tolerant gradient-oriented neurodynamic model for solving the Sylvester equation
    Applied Soft Computing, 2021
    Co-Authors: Bei Liu, Long Jin, Haoen Huang
    Abstract:

    Recurrent neural networks, i.e., neural networks with one or more feedback links in the network structure, are generally divided into dynamic neural networks and static neural networks. Owing to the complex network structure, problems such as poor approximation performance and poor convergence stability inevitably arise. The noise-tolerant gradient-oriented neurodynamic (NTGON) model proposed in this study is an improved model based on the traditional idea of the gradient neural network (GNN) model. The proposed NTGON model obtains accurate and efficient results under various noises when computing the Sylvester equation, and can therefore be applied to the noise-polluted problems frequently encountered in practical engineering. Compared with the original GNN model for the Sylvester equation, the NTGON model converges exponentially to the theoretical solution from any initial state. It is demonstrated that the noise-polluted NTGON model converges to the theoretical solution globally no matter how large the unknown matrix-form noise is. Furthermore, simulation results show that the proposed NTGON model achieves performance superior to that of the original GNN model for solving the Sylvester equation in the presence of noise.
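
    For reference, the classical gradient neural network (GNN) that the NTGON model improves on can be written out directly: with E = A X + X B - C and the energy (1/2)*||E||_F^2, gradient descent on X gives dX/dt = -gamma*(A^T E + E B^T). A minimal sketch for a constant-coefficient Sylvester equation (matrices, gain, and step size are illustrative; the paper's noise-tolerant modification is not reproduced here):

      import numpy as np

      rng = np.random.default_rng(0)
      A = rng.standard_normal((4, 4)); A = A @ A.T / 4 + np.eye(4)   # well-conditioned test data
      B = rng.standard_normal((4, 4)); B = B @ B.T / 4 + np.eye(4)
      C = rng.standard_normal((4, 4))

      gamma, dt = 1.0, 1e-3
      X = np.zeros((4, 4))
      for _ in range(20000):
          E = A @ X + X @ B - C
          X -= dt * gamma * (A.T @ E + E @ B.T)   # gradient descent on 0.5*||E||_F^2

      print(np.linalg.norm(A @ X + X @ B - C))    # residual is driven to (numerical) zero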

  • Discrete computational neural dynamics models for solving time-dependent Sylvester equation with applications to robotics and MIMO systems
    IEEE Transactions on Industrial Informatics, 2020
    Co-Authors: Long Jin, Mei Liu
    Abstract:

    In this article, a neural dynamics model is constructed and investigated for solving the time-dependent Sylvester equation, with matrix inversion involved in the solving process. To eliminate the matrix inversion, the quasi-Newton Broyden-Fletcher-Goldfarb-Shanno (BFGS) method is then leveraged to construct a second model. Moreover, the global convergence and effectiveness of the two discrete computational models are verified by theoretical analyses and numerical experiments with comparisons to existing solutions. Two applications to robotics and a multiple-input multiple-output (MIMO) system are given to illustrate the feasibility of the proposed models for solving the time-dependent Sylvester equation.
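
    The matrix inversion mentioned in the abstract can be made explicit through the vec/Kronecker identity vec(A X + X B) = (I kron A + B^T kron I) vec(X): a one-step discrete solver then has to invert (or factorize) that matrix at every time step, which is precisely what the BFGS-based variant is designed to avoid. A static toy illustration of the linearization (hypothetical sizes and data; not the paper's discrete models):

      import numpy as np

      rng = np.random.default_rng(0)
      n, m = 3, 2
      A = rng.standard_normal((n, n)) + 3 * np.eye(n)
      B = rng.standard_normal((m, m)) + 3 * np.eye(m)
      C = rng.standard_normal((n, m))

      # vec(A X + X B) = (I_m kron A + B^T kron I_n) vec(X), with column-major vec
      M = np.kron(np.eye(m), A) + np.kron(B.T, np.eye(n))
      x = np.linalg.solve(M, C.flatten(order="F"))      # the explicit "inversion" step
      X = x.reshape((n, m), order="F")

      print(np.allclose(A @ X + X @ B, C))              # expected: True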

  • RNN for solving time-variant generalized Sylvester equation with applications to robots and acoustic source localization
    IEEE Transactions on Industrial Informatics, 2020
    Co-Authors: Long Jin, Jingkun Yan, Xiuchun Xiao
    Abstract:

    The generalized Sylvester equation is a formulation that contains the Sylvester equation, the Lyapunov equation, and the Stein equation as special cases, and it is often encountered in various fields. However, the time-variant generalized Sylvester equation (TVGSE) is rarely investigated in the existing literature. In this article, we propose a noise-suppressing recurrent neural network (NSRNN) model activated by saturation-allowed functions to solve the TVGSE. For comparison, the existing zeroing neural network (ZNN) models and some improved ZNN models are introduced. Additionally, theoretical analyses of the convergence and robustness of the NSRNN model are given. Furthermore, computer simulations on illustrative examples and applications to robots and acoustic source localization are carried out. Results synthesized by the NSRNN model and the other ZNN models illustrate the NSRNN model's ability to solve the TVGSE while suppressing noises, and the inability of the other ZNN models to cope with noises.
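
    For reference, one common form of the generalized Sylvester equation is A X B + C X D = E (the exact time-variant form used in the paper may differ); via the vec/Kronecker identity it reduces to the linear system (B^T kron A + D^T kron C) vec(X) = vec(E). A static toy sketch of that reduction (the NSRNN itself is a continuous-time noise-suppressing model and is not reproduced here):

      import numpy as np

      rng = np.random.default_rng(0)
      n = 3
      A, B, C, D = (rng.standard_normal((n, n)) for _ in range(4))
      E = rng.standard_normal((n, n))

      # vec(A X B + C X D) = (B^T kron A + D^T kron C) vec(X)   (column-major vec)
      M = np.kron(B.T, A) + np.kron(D.T, C)
      X = np.linalg.solve(M, E.flatten(order="F")).reshape((n, n), order="F")

      print(np.allclose(A @ X @ B + C @ X @ D, E))   # expected: True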

Zhijun Zhang - One of the best experts on this subject based on the ideXlab platform.

  • A complex varying-parameter convergent-differential neural network for solving online time-varying complex Sylvester equation
    IEEE Transactions on Systems, Man, and Cybernetics, 2019
    Co-Authors: Zhijun Zhang, Lunan Zheng
    Abstract:

    A novel recurrent neural network, named the complex varying-parameter convergent-differential neural network (CVP-CDNN), is proposed in this paper for solving the time-varying complex Sylvester equation. Two kinds of CVP-CDNNs (CVP-CDNN Type I and Type II) are illustrated and proved to be effective. The proposed CVP-CDNNs achieve super-exponential performance if the linear activation function is used. Several activation functions are examined in search of better performance, and the finite-time convergence of the CVP-CDNN with the sign-bi-power activation function is verified. The convergence time of the CVP-CDNN with the sign-bi-power activation function is shorter than that of the complex fixed-parameter convergent-differential neural network (CFP-CDNN). Moreover, compared with the traditional CFP-CDNN, the better convergence performance of the novel CVP-CDNN is verified by computer simulation comparisons.
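
    The "super-exponential" claim can be illustrated on a scalar error: with a constant gain, de/dt = -g*e decays as exp(-g*t), whereas a gain that grows with time, e.g. g(t) = g0*exp(t) (an illustrative schedule, not necessarily the one used in the paper), gives |e(t)| = |e(0)|*exp(-g0*(exp(t) - 1)), which outpaces any fixed exponential. A toy comparison:

      import numpy as np

      dt, T, g0 = 1e-4, 3.0, 1.0
      e_fix, e_var = 1.0, 1.0
      for k in range(int(T / dt)):
          t = k * dt
          e_fix -= dt * g0 * e_fix                 # fixed-parameter design
          e_var -= dt * g0 * np.exp(t) * e_var     # varying-parameter design
      # analytic values at t = 3:  exp(-3) ~ 5e-2   vs   exp(-(e^3 - 1)) ~ 5e-9
      print(e_fix, e_var)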

  • Design and Analysis of a Novel Integral Recurrent Neural Network for Solving Time-Varying Sylvester Equation.
    IEEE Transactions on Cybernetics, 2019
    Co-Authors: Zhijun Zhang, Lunan Zheng, Hui Yang
    Abstract:

    To solve a general time-varying Sylvester equation, a novel integral recurrent neural network (IRNN) is designed and analyzed. This kind of recurrent neural network is based on an error-integral design equation and does not need training in advance. The IRNN achieves global convergence and strong robustness if odd, monotonically increasing activation functions [i.e., the linear, bipolar-sigmoid, power, or sigmoid-power activation functions (SP-AFs)] are applied. Specifically, if the linear or bipolar-sigmoid activation function is applied, the IRNN possesses exponential convergence, and with the power activation function it has the finite-time convergence property. To obtain faster convergence together with the finite-time convergence property, an SP-AF is designed. Furthermore, by using a discretization method, the discrete IRNN model and its convergence analysis are also presented. A practical application to a robot manipulator and computer simulation results using different activation functions and design parameters verify the effectiveness, stability, and reliability of the proposed IRNN.

  • Design, verification and robotic application of a novel recurrent neural network for computing dynamic Sylvester equation
    Neural Networks, 2018
    Co-Authors: Lin Xiao, Zhijun Zhang, Zili Zhang
    Abstract:

    To solve the dynamic Sylvester equation in the presence of additive noises, a novel recurrent neural network (NRNN) with finite-time convergence and excellent robustness is proposed and analyzed in this paper. Compared with the design process of the Zhang neural network (ZNN), the proposed NRNN is based on an ingenious integral design formula activated by nonlinear functions, which expedite convergence and suppress unknown additive noises during the solving process of the dynamic Sylvester equation. In addition, the global stability, finite-time convergence, and denoising property of the NRNN model are theoretically proved, and the upper bound of the finite convergence time of the NRNN model is estimated in theory. Simulation results further verify the efficiency of the NRNN model, as well as its superior robustness and finite-time performance compared with the conventional ZNN model for the dynamic Sylvester equation under additive noises. Finally, the proposed design method for establishing the NRNN model is successfully applied to the kinematic control of a robotic manipulator subject to additive noises.

  • A new varying-parameter recurrent neural network for online solution of time-varying Sylvester equation
    IEEE Transactions on Systems, Man, and Cybernetics, 2018
    Co-Authors: Zhijun Zhang, Lunan Zheng, Jian Weng, Yijun Mao, Lin Xiao
    Abstract:

    Solving the Sylvester equation is a common algebraic problem in mathematics and control theory. Different from traditional fixed-parameter recurrent neural networks, such as gradient-based recurrent neural networks or Zhang neural networks, a novel varying-parameter recurrent neural network, called the varying-parameter convergent-differential neural network (VP-CDNN), is proposed in this paper for obtaining the online solution of the time-varying Sylvester equation. As time passes, this new varying-parameter neural network achieves super-exponential convergence performance. Computer simulation comparisons between the fixed-parameter neural networks and the proposed VP-CDNN using different kinds of activation functions demonstrate that the proposed VP-CDNN has better convergence and robustness properties.

Lin Xiao - One of the best experts on this subject based on the ideXlab platform.

  • Design and analysis of two FTRNN models with application to time-varying Sylvester equation
    IEEE Access, 2019
    Co-Authors: Jie Jin, Lin Xiao
    Abstract:

    In this paper, to accelerate the convergence of the Zhang neural network (ZNN), two finite-time recurrent neural networks (FTRNNs) are presented by devising two novel design formulas. To verify the advantages of the proposed FTRNN models, they are applied to the solution of the time-varying Sylvester equation (TVSE). Compared with the conventional ZNN model, the new FTRNN models are theoretically proved to have better convergence performance and to be more effective for solving the TVSE online within finite time. Finally, the superiority and effectiveness of the new FTRNN models for solving the TVSE are verified by numerical simulations.

  • A new varying-parameter recurrent neural network for online solution of time-varying Sylvester equation
    IEEE Transactions on Systems, Man, and Cybernetics, 2018
    Co-Authors: Zhijun Zhang, Lunan Zheng, Jian Weng, Yijun Mao, Lin Xiao
    Abstract:

    Solving the Sylvester equation is a common algebraic problem in mathematics and control theory. Different from traditional fixed-parameter recurrent neural networks, such as gradient-based recurrent neural networks or Zhang neural networks, a novel varying-parameter recurrent neural network, called the varying-parameter convergent-differential neural network (VP-CDNN), is proposed in this paper for obtaining the online solution of the time-varying Sylvester equation. As time passes, this new varying-parameter neural network achieves super-exponential convergence performance. Computer simulation comparisons between the fixed-parameter neural networks and the proposed VP-CDNN using different kinds of activation functions demonstrate that the proposed VP-CDNN has better convergence and robustness properties.

Debraj Ghosh - One of the best experts on this subject based on the ideXlab platform.

  • Cost reduction of stochastic Galerkin method by adaptive identification of significant polynomial chaos bases for elliptic equations
    Computer Methods in Applied Mechanics and Engineering, 2018
    Co-Authors: Srikara Pranesh, Debraj Ghosh
    Abstract:

    One widely used and computationally efficient method for uncertainty quantification with spectral stochastic finite elements is the stochastic Galerkin method. Here the solution is represented by a polynomial chaos expansion, and the residual of the discretized governing equation is projected onto the polynomial chaos bases. This results in a system of deterministic algebraic equations with the polynomial chaos coefficients as unknowns. However, one impediment to its large-scale application is the curse of dimensionality, that is, the exponential growth of the number of polynomial chaos bases with the stochastic dimensionality and the degree of the expansion. Here, for a stochastic elliptic problem, an adaptive selection of polynomial chaos bases is proposed. Accordingly, during the first few iterations of the preconditioned conjugate gradient method for solving the system of linear algebraic equations, the chaos bases with the maximal contribution to the solution (in an appropriately defined metric) are first identified. Subsequently, only these bases are retained for further iterations until convergence is achieved. Numerical studies show a threefold cost saving over the existing method. Furthermore, to enhance the computational gain, the stochastic Galerkin method is reformulated as a generalized Sylvester equation. This step allows efficient use of the sparsity of the moments of products of polynomial chaos bases. Numerical studies on problems with large stochastic dimensionality show an additional cost saving of up to one order of magnitude (about twenty times). This amounts to a sixtyfold speedup over the existing method when the adaptive selection and the generalized Sylvester equation formulation are used together. The proposed methodology can easily be incorporated into an existing standard stochastic Galerkin solver for elliptic problems.
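
    The generalized Sylvester reformulation mentioned at the end can be sketched at toy scale: a stochastic Galerkin system has the Kronecker structure (sum_i G_i kron K_i) u = f, which is exactly the vec form of sum_i K_i U G_i^T = F, where the columns of U hold the polynomial chaos coefficients. A small illustration with random symmetric positive definite matrices standing in for the stiffness blocks K_i and the chaos Gram matrices G_i (all sizes and data are hypothetical, not from the paper):

      import numpy as np

      rng = np.random.default_rng(0)
      n, p, m = 6, 4, 3                  # spatial dofs, chaos bases, expansion terms
      def spd(k):                        # random symmetric positive definite matrix
          R = rng.standard_normal((k, k))
          return R @ R.T + k * np.eye(k)

      K = [spd(n) for _ in range(m)]     # stiffness-like blocks
      G = [spd(p) for _ in range(m)]     # chaos-Gram-like blocks
      F = rng.standard_normal((n, p))

      # Kronecker (vec) form of the generalized Sylvester equation sum_i K_i U G_i^T = F
      M = sum(np.kron(Gi, Ki) for Gi, Ki in zip(G, K))
      U = np.linalg.solve(M, F.flatten(order="F")).reshape((n, p), order="F")

      residual = sum(Ki @ U @ Gi.T for Ki, Gi in zip(K, G)) - F
      print(np.linalg.norm(residual))    # numerically zero: both formulations agree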