Smoothing Function

The Experts below are selected from a list of 47,220 Experts worldwide, ranked by the ideXlab platform.

Masri Ayob - One of the best experts on this subject based on the ideXlab platform.

  • Smoothing Secant Line Slope Using Aggregation Fischer Burmeister Function
    IEEE Access, 2020
    Co-Authors: Anahita Ghazvini, Siti Norul Huda Sheikh Abdullah, Masri Ayob
    Abstract:

    Because some objective Functions are piecewise, they are non-differentiable at specific points, which significantly affects the convergence rate and computational time of deep networks; this non-differentiability increases the computational time dramatically. The issue is addressed by reformulating the absolute value equation (AVE) as a parametrized single smooth equation. However, a single Smoothing Function is less effective at producing a good curve at the breaking points. Therefore, this work formulates a new Aggregation Fischer-Burmeister (AFB) Smoothing Function by amalgamating two popular Smoothing Functions: Aggregation (AGG) and Fischer-Burmeister (FB). Both Functions can estimate the minimum from either side of the canonical piecewise Function (CPF). If an amalgamation of Smoothing Functions can affect the differentiability of the piecewise objective Function, then amalgamating the AGG and FB Smoothing Functions should produce a smooth secant line slope on both sides with less computational time. To evaluate the proposed technique, we implement a Newton algorithm in MATLAB with random initial values. The new Smoothing Function is formulated by first converting the piecewise objective Function to a CPF, which is then applied in the Newton algorithm. Finally, to validate the AVE difficulty of the new piecewise Function, we perform one run for each initial value and 30 runs for time evaluation. The experimental analysis verified that the proposed technique outperformed the individual AGG and FB techniques on the natural logarithm, exponential, and square root Functions. Hence, this novel technique yields a promising smooth approximation for the AVE with less computational time.
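
    The abstract names the AGG and FB Smoothing Functions without reproducing their formulas. As a minimal Python sketch, the following compares two standard smoothings of |x| = max(x, -x), the non-smooth kernel of the AVE: the aggregation (log-sum-exp) Function and a square-root smoothing in the Fischer-Burmeister spirit. The convex combination afb_smooth_abs is a hypothetical stand-in for the paper's AFB amalgamation, whose exact form is not given in the abstract.

      import numpy as np

      def agg_smooth_abs(x, mu):
          # Aggregation (log-sum-exp) smoothing of |x| = max(x, -x):
          # mu * ln(exp(x/mu) + exp(-x/mu)), written in a numerically stable form.
          ax = np.abs(x)
          return ax + mu * np.log1p(np.exp(-2.0 * ax / mu))

      def fb_smooth_abs(x, mu):
          # Square-root smoothing in the Fischer-Burmeister spirit: sqrt(x^2 + mu^2).
          return np.sqrt(x * x + mu * mu)

      def afb_smooth_abs(x, mu, lam=0.5):
          # Hypothetical amalgamation (convex combination); the paper's exact AFB
          # formula is not reproduced in the abstract.
          return lam * agg_smooth_abs(x, mu) + (1.0 - lam) * fb_smooth_abs(x, mu)

      x = np.linspace(-1.0, 1.0, 5)
      print(np.abs(x))                    # the non-smooth kernel
      print(afb_smooth_abs(x, mu=0.1))    # smooth near the breaking point x = 0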

Liang Fang - One of the best experts on this subject based on the ideXlab platform.

  • A one-parametric class of Smoothing Functions for second-order cone programming
    Computational & Applied Mathematics, 2013
    Co-Authors: Jingyong Tang, Li Dong, Liang Fang, Jinchuan Zhou
    Abstract:

    We present a new one-parametric class of Smoothing Functions for second-order cone programming (denoted by SOCP). This class is fairly general and includes the well-known Fischer–Burmeister Smoothing Function and the Chi–Liu Smoothing Function (J Comput Appl Math 223:114–123, 2009) as special cases. Based on these Functions, a Smoothing-type method is proposed for solving SOCP. The proposed algorithm solves only one linear system of equations and performs only one line search at each iteration. This algorithm can start from an arbitrary point and it is locally superlinearly convergent under a mild assumption. Some numerical results are reported.
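
    For reference, the baseline member of such classes, the smoothed Fischer–Burmeister Function for the second-order cone (with Jordan product x \circ y and identity element e), can be written as

      \phi_{FB}(\mu, x, y) = x + y - \sqrt{x \circ x + y \circ y + 2\mu^2 e},

    which is smooth for \mu > 0 and recovers the non-smooth Fischer–Burmeister Function as \mu \to 0; the exact form of the paper's one-parametric family is not reproduced in the abstract.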

  • A Smoothing Newton method for second-order cone optimization based on a new Smoothing Function
    Applied Mathematics and Computation, 2011
    Co-Authors: Jingyong Tang, Guoping He, Li Dong, Liang Fang
    Abstract:

    A new Smoothing Function is given in this paper by Smoothing the symmetric perturbed Fischer–Burmeister Function. Based on this new Smoothing Function, we present a Smoothing Newton method for solving second-order cone optimization (SOCO). The method solves only one linear system of equations and performs only one line search at each iteration. Without requiring the strict complementarity assumption at the SOCO solution, the proposed algorithm is shown to be globally and locally quadratically convergent. Numerical results demonstrate that our algorithm is promising and comparable to interior-point methods.
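
    The "one linear system plus one line search per iteration" structure is easiest to see on a scalar toy problem. The Python sketch below runs a smoothing Newton iteration with the smoothed Fischer–Burmeister Function on a one-dimensional complementarity problem with mapping F(x) = x - 1 (an assumption chosen for illustration); it shows the generic pattern only, not the paper's SOCO algorithm or its parameter-update rules.

      import numpy as np

      def F(x):                 # toy mapping; the complementarity problem's solution is x* = 1
          return x - 1.0

      def phi(mu, a, b):        # smoothed Fischer-Burmeister function
          return a + b - np.sqrt(a * a + b * b + 2.0 * mu * mu)

      x, mu = 5.0, 1.0
      for k in range(50):
          h = phi(mu, x, F(x))
          if abs(h) < 1e-10 and mu < 1e-10:
              break
          r = np.sqrt(x * x + F(x) ** 2 + 2.0 * mu * mu)
          J = (1.0 - x / r) + (1.0 - F(x) / r) * 1.0   # chain rule, with F'(x) = 1
          d = -h / J                                   # the "one linear system" (scalar here)
          t, merit = 1.0, 0.5 * h * h                  # Armijo backtracking on 0.5*phi^2
          while 0.5 * phi(mu, x + t * d, F(x + t * d)) ** 2 > (1.0 - 1e-4 * t) * merit:
              t *= 0.5
          x += t * d
          mu *= 0.2                                    # drive the smoothing parameter to 0
      print(x)                                         # approx 1.0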

  • A Smoothing Newton-type method for second-order cone programming problems based on a new Smoothing Fischer-Burmeister Function
    Computational & Applied Mathematics, 2011
    Co-Authors: Liang Fang, Zengzhe Feng
    Abstract:

    A new Smoothing Function for the well-known Fischer-Burmeister Function is given. Based on this new Function, a Smoothing Newton-type method is proposed for solving second-order cone programming. At each iteration, the proposed algorithm solves only one system of linear equations and performs only one line search. The algorithm can start from an arbitrary point and is Q-quadratically convergent under a mild assumption. Numerical results demonstrate the effectiveness of the algorithm. Mathematical subject classification: 90C25, 90C30, 90C51, 65K05, 65Y20.
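
    For context, the well-known Fischer-Burmeister Function that the paper smooths has, in the scalar case, the defining complementarity property

      \phi_{FB}(a, b) = a + b - \sqrt{a^2 + b^2}, \qquad \phi_{FB}(a, b) = 0 \iff a \ge 0, \; b \ge 0, \; ab = 0,

    so zeroing \phi_{FB} componentwise encodes the complementarity conditions of the optimality system; \phi_{FB} fails to be differentiable only at (a, b) = (0, 0), which is exactly what a Smoothing Function repairs.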

  • A Smoothing-type Newton method for second-order cone programming problems based on a new smooth Function
    Journal of Applied Mathematics and Computing, 2010
    Co-Authors: Liang Fang
    Abstract:

    A new Smoothing Function similar to the well-known Fischer-Burmeister Function is given. Based on this new Function, a Smoothing-type Newton method is proposed for solving second-order cone programming. At each iteration, the proposed algorithm solves only one system of linear equations and performs only one line search. The algorithm can start from an arbitrary point and is Q-quadratically convergent under a mild assumption. Preliminary numerical results demonstrate the effectiveness of the method.
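
    Q-quadratic convergence, claimed here and in the previous entry, is the standard local rate: for iterates x^k converging to a solution x^*, there is a constant C > 0 such that

      \|x^{k+1} - x^*\| \le C \, \|x^k - x^*\|^2

    for all sufficiently large k, so the number of correct digits roughly doubles per iteration near the solution.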

  • A new one-step Smoothing Newton method for the nonlinear complementarity problem with P0 Function
    Applied Mathematics and Computation, 2010
    Co-Authors: Liang Fang
    Abstract:

    In this paper, the nonlinear complementarity problem with a P0-Function is studied. Based on a new Smoothing Function, the problem is approximated by a family of parameterized smooth equations, and we present a new one-step Smoothing Newton method to solve it. At each iteration, the proposed method needs to solve only one system of linear equations and perform one Armijo-type line search. The algorithm is proved to be globally and superlinearly convergent without requiring strict complementarity at the solution. Numerical experiments demonstrate the feasibility and efficiency of the new algorithm.
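
    For context, a mapping F : \mathbb{R}^n \to \mathbb{R}^n is a P0-Function if, for all x \ne y,

      \max_{i : x_i \ne y_i} (x_i - y_i)\,[F_i(x) - F_i(y)] \ge 0,

    a weakening of monotonicity under which Smoothing Newton methods of this kind are typically analyzed.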

Xianing Wu - One of the best experts on this subject based on the ideXlab platform.

  • An augmented Lagrangian multiplier method based on a CHKS Smoothing Function for solving nonlinear bilevel programming problems
    Knowledge Based Systems, 2014
    Co-Authors: Yan Jiang, Xuyong Li, Chongchao Huang, Xianing Wu
    Abstract:

    Bilevel programming techniques deal with decision processes involving two decision makers in a hierarchical structure. In this paper, an augmented Lagrangian multiplier method is proposed to solve nonlinear bilevel programming (NBLP) problems. An NBLP problem is first transformed into a single-level problem with complementarity constraints by replacing the lower-level problem with its Karush-Kuhn-Tucker optimality conditions; the resulting problem is then smoothed by a Chen-Harker-Kanzow-Smale (CHKS) Smoothing Function. An augmented Lagrangian multiplier method is applied to solve the smoothed nonlinear program and obtain an approximate optimal solution of the NBLP problem. The asymptotic properties of the augmented Lagrangian multiplier method are analyzed and the condition for solution optimality is derived. Numerical results showing the viability of the approach are reported.
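
    The Chen-Harker-Kanzow-Smale Smoothing Function referenced here is, in the scalar case,

      \phi_\mu(a, b) = a + b - \sqrt{(a - b)^2 + 4\mu^2},

    with the property that \phi_\mu(a, b) = 0 if and only if a > 0, b > 0 and ab = \mu^2, so replacing each complementarity condition 0 \le a \perp b \ge 0 of the lower-level KKT system with \phi_\mu(a, b) = 0 yields a smooth system that approaches the original one as \mu \to 0.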

  • Application of particle swarm optimization based on a CHKS Smoothing Function for solving the nonlinear bilevel programming problem
    Applied Mathematics and Computation, 2013
    Co-Authors: Yan Jiang, Xuyong Li, Chongchao Huang, Xianing Wu
    Abstract:

    Particle swarm optimization (PSO) is an optimization technique originating from artificial life and evolutionary computation. It carries out optimization by following the personal best solution of each particle and the global best value of the whole swarm. Because PSO can solve nonlinear programming problems for global optimal solutions efficiently, a novel PSO-based approach is proposed to solve the nonlinear bilevel programming problem (NBLP). In the proposed approach, applying the Karush-Kuhn-Tucker (KKT) conditions to the lower-level problem transforms the NBLP into a regular nonlinear program with complementarity constraints, which is then smoothed by the Chen-Harker-Kanzow-Smale (CHKS) Smoothing Function. PSO is applied to solve the smoothed nonlinear program and obtain an approximate optimal solution of the NBLP problem. Simulations on five benchmark problems and a practical example concerning a watershed water-trading decision-making problem demonstrate the effectiveness of the proposed method for solving NBLP.
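
    As a minimal Python sketch of the PSO update the abstract describes (each particle follows its personal best and the swarm's global best), the following implements the standard velocity and position rules on a stand-in objective; the smoothed NBLP objective, constraint handling, and the paper's parameter settings are not reproduced here.

      import numpy as np

      rng = np.random.default_rng(0)

      def f(x):                          # stand-in objective (sphere), not the smoothed NBLP
          return np.sum(x ** 2, axis=1)

      n, dim, iters = 30, 5, 200
      w, c1, c2 = 0.7, 1.5, 1.5          # inertia and acceleration coefficients (assumed)
      x = rng.uniform(-5.0, 5.0, (n, dim))
      v = np.zeros((n, dim))
      pbest, pval = x.copy(), f(x)       # personal bests
      g = pbest[np.argmin(pval)].copy()  # global best of the swarm

      for _ in range(iters):
          r1, r2 = rng.random((n, dim)), rng.random((n, dim))
          v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
          x = x + v
          val = f(x)
          better = val < pval
          pbest[better], pval[better] = x[better], val[better]
          g = pbest[np.argmin(pval)].copy()
      print(g)                           # near the origin for this toy objective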

Anahita Ghazvini - One of the best experts on this subject based on the ideXlab platform.

  • Smoothing Secant Line Slope Using Aggregation Fischer Burmeister Function
    IEEE Access, 2020
    Co-Authors: Anahita Ghazvini, Siti Norul Huda Sheikh Abdullah, Masri Ayob
    Abstract:

    Because some objective Functions are piecewise, they are non-differentiable at specific points, which significantly affects the convergence rate and computational time of deep networks; this non-differentiability increases the computational time dramatically. The issue is addressed by reformulating the absolute value equation (AVE) as a parametrized single smooth equation. However, a single Smoothing Function is less effective at producing a good curve at the breaking points. Therefore, this work formulates a new Aggregation Fischer-Burmeister (AFB) Smoothing Function by amalgamating two popular Smoothing Functions: Aggregation (AGG) and Fischer-Burmeister (FB). Both Functions can estimate the minimum from either side of the canonical piecewise Function (CPF). If an amalgamation of Smoothing Functions can affect the differentiability of the piecewise objective Function, then amalgamating the AGG and FB Smoothing Functions should produce a smooth secant line slope on both sides with less computational time. To evaluate the proposed technique, we implement a Newton algorithm in MATLAB with random initial values. The new Smoothing Function is formulated by first converting the piecewise objective Function to a CPF, which is then applied in the Newton algorithm. Finally, to validate the AVE difficulty of the new piecewise Function, we perform one run for each initial value and 30 runs for time evaluation. The experimental analysis verified that the proposed technique outperformed the individual AGG and FB techniques on the natural logarithm, exponential, and square root Functions. Hence, this novel technique yields a promising smooth approximation for the AVE with less computational time.

Yuping Wang - One of the best experts on this subject based on the ideXlab platform.

  • Smoothing and auxiliary Functions based cooperative coevolution for global optimization
    2013 IEEE Congress on Evolutionary Computation, 2013
    Co-Authors: Yuping Wang
    Abstract:

    In this paper, a novel evolutionary algorithm framework called Smoothing and auxiliary Functions based cooperative coevolution (SACC) is proposed for large-scale global optimization problems. In this framework, a Smoothing Function and an auxiliary Function are integrated with a cooperative coevolution algorithm, which may improve the coevolution algorithm's performance. In SACC, cooperative coevolution is responsible for searching multiple areas in parallel. An existing Smoothing Function is then used to eliminate all local optimal solutions that are no better than the best one obtained so far. Unfortunately, in doing so the Smoothing Function loses descent directions, which weakens the local search. The proposed auxiliary Function overcomes this drawback and helps to find a better local optimal solution, and a strategy based on the BFGS quasi-Newton method makes the local search more efficient. Simulations on the CEC'2013 standard benchmark suite indicate that the proposed SACC algorithm is effective and efficient.
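
    The abstract does not reproduce the Smoothing Function it uses. One minimal Python construction with exactly the stated eliminating property, assuming minimization, is to clip the objective at the best value found so far, so every local optimum no better than the incumbent disappears into a plateau; this is an assumed form, not the paper's Function.

      import numpy as np

      def flatten(f, f_best):
          # Assumed clipping form: all points with f(x) >= f_best collapse onto a
          # plateau at height f_best, so their local minima vanish; strictly better
          # points are left untouched. Note the plateau has no descent directions,
          # which is the drawback the auxiliary Function is said to overcome.
          return lambda x: np.minimum(f(x), f_best)

      f = lambda x: x ** 2 + 10.0 * np.sin(3.0 * x)     # toy multimodal objective
      S = flatten(f, f_best=-5.0)
      print(S(np.array([-0.52, 0.0, 2.0])))             # only the first point, already
                                                        # better than -5, keeps its value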

  • A UNIFORM ENHANCEMENT APPROACH FOR OPTIMIZATION ALGORITHMS: Smoothing Function METHOD
    International Journal of Pattern Recognition and Artificial Intelligence, 2010
    Co-Authors: Yuping Wang
    Abstract:

    In this paper, we propose a uniform enhancement approach called the Smoothing Function method, which can cooperate with any optimization algorithm and improve its performance. The method has two phases. In the first phase, a Smoothing Function is constructed from a properly truncated Fourier series. It preserves the overall shape of the original objective Function but eliminates many of its local optima, so it approximates the objective Function well. The optimal solution of the Smoothing Function is then sought by an optimization algorithm (e.g., a traditional algorithm or an evolutionary algorithm), for which the search is much easier. In the second phase, we switch to optimizing the original Function for some iterations, using the best solution(s) obtained in the first phase as the initial point (population). Thereafter, the Smoothing Function is updated to approximate the original Function more accurately. These two phases are repeated until the best solutions obtained over several successive second phases show no obvious improvement. In this manner, any optimization algorithm searches for the optimal solution much more easily. Finally, we use the proposed approach to enhance two typical optimization algorithms: the Powell direct algorithm and a simple genetic algorithm. Simulation results on ten challenging benchmarks indicate that the proposed approach can effectively improve the performance of these two algorithms.
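
    A minimal one-dimensional Python sketch of the phase-one construction: approximate the objective on a grid by a truncated Fourier series (computed here with the FFT), keeping only the lowest frequencies so that high-frequency local optima are removed while the overall shape is preserved. The grid size, truncation rule, and update scheme are assumptions; the paper's exact construction is not reproduced in the abstract.

      import numpy as np

      N, K = 256, 8                            # grid size and kept harmonics (assumed)
      t = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
      f = np.sin(t) + 0.1 * np.sin(25.0 * t)   # smooth trend plus high-frequency ripple

      c = np.fft.rfft(f)                       # Fourier coefficients of the samples
      c[K + 1:] = 0.0                          # truncate the series: drop high frequencies
      f_smooth = np.fft.irfft(c, n=N)          # the Smoothing-Function approximation

      # The ripple (frequency 25 > K) is gone; the overall shape sin(t) is preserved.
      print(np.max(np.abs(f_smooth - np.sin(t))))   # close to 0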

  • A Smoothing evolutionary algorithm based on square search and filled Function for global optimization
    2010 IEEE Fifth International Conference on Bio-Inspired Computing: Theories and Applications (BIC-TA), 2010
    Co-Authors: Yuping Wang, Ning Dong
    Abstract:

    Many effective algorithms have been proposed for the global optimization problems arising in various practical fields. However, some of these problems have many local optima, which may cause solution algorithms to converge prematurely. To avoid entrapment in local optima, a Smoothing Function and a square search method are used in the designed evolutionary algorithm. The Smoothing Function flattens the hilltops of the original Function and eliminates all local optimal solutions that are no better than the best one found so far. Based on the Smoothing Function, a square search scheme is presented that makes it easier to descend into a lower valley. A filled Function and local search are then used to update the best solution found so far. Simulation results on 9 high-dimensional standard benchmark problems indicate that the proposed evolutionary algorithm is effective and sound.

  • A Smoothing Evolutionary Algorithm with Circle Search for Global Optimization
    2010 Fourth International Conference on Network and System Security, 2010
    Co-Authors: Yuping Wang
    Abstract:

    Global optimization problems arise in various fields of application, and it is important to design effective algorithms for them. However, a key drawback of existing global optimization methods is that they do not easily escape from local optimal solutions and cannot find the global optimal solution quickly. To remedy this, first, a Smoothing Function is proposed that flattens the landscape of the original Function and eliminates all local optimal solutions that are no better than the best one found so far; this makes the search for the global optimal solution much easier. Second, to cooperate with the Smoothing Function, a tailor-made search scheme called circle search is presented, which can quickly jump out of the flattened landscape and descend into a lower region. Third, a solution better than the best one found so far can be obtained by local search. Fourth, a crossover operator is designed based on uniform design. Based on these components, a Smoothing evolutionary algorithm for global optimization is proposed. Finally, numerical simulations are conducted on eight high-dimensional and very challenging standard benchmark problems, and the performance of the proposed algorithm is compared with that of nine recently published evolutionary algorithms. The results indicate that the proposed algorithm is statistically sound and performs better on these test Functions.