Negentropy

The Experts below are selected from a list of 300 Experts worldwide, ranked by the ideXlab platform

Wang Gang - One of the best experts on this subject based on the ideXlab platform.

  • A Robust Algorithm of Voice Activity Detection Based on Negentropy
    Signal Processing, 2009
    Co-Authors: Wang Gang
    Abstract:

    In this paper, we introduce mathematical Negentropy on the basis of entropy and develop a voice activity detection algorithm employing a Negentropy feature. In the proposed algorithm, the feature is constructed as follows: first, the spectral statistics of the current frame are derived from nearby frames, exploiting the long-term stationarity of the noise; then, a Gaussian sequence with zero mean and unit variance, hidden in the spectrum, is extracted from the current frame based on a reasonable hypothesis; finally, the feature is constructed by applying an approximate Negentropy method to the sequence. Unlike entropy, the Negentropy feature approaches zero and is unrelated to the amplitude of the noise signal, so the threshold can be decided from prior information. As a result, the proposed algorithm works well in complex environments even when the type, amplitude, and SNR of the noise signal all vary, or when posterior information cannot be obtained reliably. Experimental results demonstrate the robustness of the proposed algorithm.
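
    As background, the Negentropy of a standardized sequence is commonly approximated with a nonlinear contrast in the style of Hyvärinen. The sketch below assumes that approximation plus an illustrative standardization and threshold (none of which are specified in the abstract) and shows how a near-zero Negentropy value can drive a per-frame VAD decision.

```python
# A minimal sketch of a Negentropy-based VAD decision, assuming
# Hyvarinen's log-cosh approximation of negentropy (the paper's exact
# approximation formula is not given in the abstract). The frame
# spectra, standardization step, and threshold are illustrative.
import numpy as np

def approx_negentropy(y, a=1.0):
    """Approximate negentropy of a standardized sequence y via
    J(y) ~ (E[G(y)] - E[G(nu)])^2 with G(u) = (1/a) log cosh(a*u)
    and nu ~ N(0, 1). For Gaussian y this value approaches zero."""
    G = lambda u: np.log(np.cosh(a * u)) / a
    # E[G(nu)] for a standard Gaussian, estimated by Monte Carlo.
    rng = np.random.default_rng(0)
    E_G_gauss = G(rng.standard_normal(100_000)).mean()
    return (G(y).mean() - E_G_gauss) ** 2

def vad_decision(frame_spectrum, noise_mean, noise_std, threshold=1e-3):
    """Standardize the current frame's spectrum with noise statistics
    estimated from nearby frames (long-term stationarity assumption);
    near-zero negentropy indicates noise only, a larger value speech."""
    y = (frame_spectrum - noise_mean) / (noise_std + 1e-12)
    return approx_negentropy(y) > threshold
```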

Shuxue Ding - One of the best experts on this subject based on the ideXlab platform.

  • Negentropy-Based Sparsity-Promoting Reconstruction with Fast Iterative Solution from Noisy Measurements
    Sensors (Basel, Switzerland), 2020
    Co-Authors: Yingxin Zhao, Yingjie Huang, Ming Zhang, Zhiyang Liu, Shuxue Ding
    Abstract:

    Compressed sensing provides an elegant framework for recovering sparse signals from compressed measurements. This paper addresses the problem of sparse signal reconstruction from compressed measurements that is robust to complex, especially non-Gaussian, noise, which arises in many applications. For this purpose, we present a method that exploits maximum Negentropy theory to promote adaptability to noise. The problem is formalized as a constrained minimization problem whose objective function is the Negentropy of the measurement error with a sparsity constraint in the form of the ℓ1-norm. Although several promising algorithms for this minimization have been proposed in the literature, they are computationally demanding and thus cannot be used in many practical situations. To improve on this, we propose an efficient algorithm based on the fast iterative shrinkage-thresholding algorithm that converges quickly. Both theoretical analysis and numerical experiments demonstrate the improved accuracy and convergence rate of the proposed method.
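
    The abstract names a fast iterative shrinkage-thresholding scheme as the solver. The sketch below shows the classical FISTA scaffold for the ℓ1-regularized least-squares problem; the paper itself replaces the quadratic data term with the Negentropy of the measurement error, whose gradient is not reproduced here, so this is background rather than the paper's algorithm.

```python
# A minimal FISTA sketch for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
# The paper swaps the quadratic data term for a negentropy measure of
# the measurement error Ax - b; all parameter choices are illustrative.
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista(A, b, lam, n_iter=200):
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    z, t = x.copy(), 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ z - b)          # gradient of the smooth term at z
        x_new = soft_threshold(z - grad / L, lam / L)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = x_new + ((t - 1.0) / t_new) * (x_new - x)  # Nesterov momentum
        x, t = x_new, t_new
    return x
```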

  • Sparse Coding Algorithm with Negentropy and Weighted ℓ1-Norm for Signal Reconstruction
    Entropy, 2017
    Co-Authors: Yingxin Zhao, Zhiyang Liu, Yuanyuan Wang, Shuxue Ding
    Abstract:

    Compressive sensing theory has attracted widespread attention in recent years, and sparse signal reconstruction has been widely used in signal processing and communication. This paper addresses the problem of sparse signal recovery, especially with non-Gaussian noise. The main contribution of this paper is an algorithm in which Negentropy and a reweighting scheme form the core of the approach. The signal reconstruction problem is formalized as a constrained minimization problem whose objective function is the sum of a term measuring the statistical characteristics of the measurement error, the Negentropy, and a sparse regularization term, the ℓp-norm with 0 < p < 1. The ℓp-norm, however, leads to a non-convex optimization problem that is difficult to solve efficiently. Herein we treat the ℓp-norm as a series of weighted ℓ1-norms so that the sub-problems become convex, and we propose an optimization algorithm based on forward-backward splitting. The algorithm is fast and succeeds in exactly recovering sparse signals under both Gaussian and non-Gaussian noise. Several numerical experiments and comparisons demonstrate the superiority of the proposed algorithm.
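
    Treating the ℓp-norm as a series of weighted ℓ1-norms is the standard iteratively reweighted ℓ1 idea. A minimal sketch follows, assuming Candès–Wakin–Boyd-style weights for the ℓp penalty and a least-squares data term solved by forward-backward splitting; the paper's data term is Negentropy-based and its exact weight rule may differ.

```python
# A minimal reweighted-l1 sketch: each weighted-l1 subproblem is convex
# and is solved by proximal gradient (forward-backward splitting) on a
# least-squares data term. Weights and parameters are illustrative.
import numpy as np

def weighted_prox_grad(A, b, w, n_iter=100):
    """Forward-backward splitting for 0.5||Ax-b||^2 + sum_i w_i |x_i|."""
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = A.T @ (A @ x - b)                        # forward (gradient) step
        v = x - g / L
        x = np.sign(v) * np.maximum(np.abs(v) - w / L, 0.0)  # backward (prox)
    return x

def reweighted_l1(A, b, p=0.5, lam=0.1, n_outer=10, eps=1e-3):
    w = lam * np.ones(A.shape[1])                    # start from plain l1
    x = np.zeros(A.shape[1])
    for _ in range(n_outer):
        x = weighted_prox_grad(A, b, w)
        # the updated weights majorize the nonconvex lp penalty
        # around the current iterate, so each subproblem stays convex
        w = lam * p / (np.abs(x) + eps) ** (1.0 - p)
    return x
```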

Deren Sheng - One of the best experts on this subject based on the ideXlab platform.

  • The improved distribution method of Negentropy and performance evaluation of CCPPs based on the structure theory of thermoeconomics
    Applied Thermal Engineering, 2016
    Co-Authors: Jianhong Chen, Deren Sheng
    Abstract:

    In order to evaluate the performance of components in a large gas-fired combined cycle power plant (CCPP), an improved thermoeconomic analysis method based on the structure theory of thermoeconomics is proposed. First, the fuel-product model is established, and the productive structure and the distribution method of Negentropy are modified; the Negentropy produced and consumed in the gas turbine cycle is also considered. The method is shown to be reasonable and practical, and the exergy cost it yields is higher than that obtained by the traditional method. Then, a thermoeconomic model based on structure theory is built using the improved distribution method of Negentropy. Compared with the matrix model of thermoeconomics, the accuracy and effectiveness of the model are verified: the relative error is less than 3%, which is within the permissible range for engineering. Afterwards, the relative cost difference and the exergoeconomic factor are calculated, and an improved relative cost difference is put forward by introducing a non-energy weighting factor. The results indicate that the heat recovery steam generator (HRSG) has great potential for improvement. The investment in the steam turbine (ST) and the irreversibility of the combustion chamber (CC) also deserve attention, since reducing them contributes greatly to reducing the thermoeconomic cost. The new evaluation index for CCPP components is thus reasonable and will support research on the thermoeconomic optimization of CCPPs.
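
    For orientation, the structural theory the abstract builds on treats Negentropy as a fictitious productive flow. A common formulation defines the negentropy flow through the ambient temperature and the entropy change, and writes an exergy cost balance per component; the LaTeX sketch below records those standard relations under that assumption, not the paper's modified distribution rule.

```latex
% Hedged sketch: standard negentropy-flow and cost-balance relations
% from the structural theory of thermoeconomics; the paper's improved
% distribution rule for Negentropy is not reproduced here.
% T_0: ambient temperature, s: specific entropy, \dot{m}: mass flow,
% F/P: fuel and product exergy flows, Z_k: capital cost rate of
% component k, c: unit exergy costs.
\begin{align}
  \dot{N}_{1\to 2} &= \dot{m}\, T_0\, (s_1 - s_2)
    \qquad \text{(negentropy flow as entropy falls from state 1 to 2)} \\
  c_{P,k}\, P_k &= \sum_{f} c_{F,f}\, F_{k,f} + Z_k
    \qquad \text{(exergy cost balance of component $k$)}
\end{align}
```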

Yingxin Zhao - One of the best experts on this subject based on the ideXlab platform.

  • Negentropy-Based Sparsity-Promoting Reconstruction with Fast Iterative Solution from Noisy Measurements
    Sensors (Basel, Switzerland), 2020
    Co-Authors: Yingxin Zhao, Yingjie Huang, Ming Zhang, Zhiyang Liu, Shuxue Ding
    Abstract:

    Compressed sensing provides an elegant framework for recovering sparse signals from compressed measurements. This paper addresses the problem of sparse signal reconstruction from compressed measurements that is robust to complex, especially non-Gaussian, noise, which arises in many applications. For this purpose, we present a method that exploits maximum Negentropy theory to promote adaptability to noise. The problem is formalized as a constrained minimization problem whose objective function is the Negentropy of the measurement error with a sparsity constraint in the form of the ℓ1-norm. Although several promising algorithms for this minimization have been proposed in the literature, they are computationally demanding and thus cannot be used in many practical situations. To improve on this, we propose an efficient algorithm based on the fast iterative shrinkage-thresholding algorithm that converges quickly. Both theoretical analysis and numerical experiments demonstrate the improved accuracy and convergence rate of the proposed method.

  • Sparse Coding Algorithm with Negentropy and Weighted ℓ1-Norm for Signal Reconstruction
    Entropy, 2017
    Co-Authors: Yingxin Zhao, Zhiyang Liu, Yuanyuan Wang, Shuxue Ding
    Abstract:

    Compressive sensing theory has attracted widespread attention in recent years, and sparse signal reconstruction has been widely used in signal processing and communication. This paper addresses the problem of sparse signal recovery, especially with non-Gaussian noise. The main contribution of this paper is an algorithm in which Negentropy and a reweighting scheme form the core of the approach. The signal reconstruction problem is formalized as a constrained minimization problem whose objective function is the sum of a term measuring the statistical characteristics of the measurement error, the Negentropy, and a sparse regularization term, the ℓp-norm with 0 < p < 1. The ℓp-norm, however, leads to a non-convex optimization problem that is difficult to solve efficiently. Herein we treat the ℓp-norm as a series of weighted ℓ1-norms so that the sub-problems become convex, and we propose an optimization algorithm based on forward-backward splitting. The algorithm is fast and succeeds in exactly recovering sparse signals under both Gaussian and non-Gaussian noise. Several numerical experiments and comparisons demonstrate the superiority of the proposed algorithm.

Wei Jian - One of the best experts on this subject based on the ideXlab platform.

  • Efficient Optimization of Reference-Based Negentropy for Noncircular Sources in Complex ICA
    Circuits, Systems, and Signal Processing, 2016
    Co-Authors: Wei Zhao, Yuehong Shen, Zhigang Yuan, Yimin Wei, Wei Jian
    Abstract:

    Bingham proposed a complex fast independent component analysis (c-FastICA) algorithm that approximates the Negentropy of circular sources using nonlinear functions. Novey extended Bingham's work using information from the pseudo-covariance matrix for noncircular sources, particularly for sub-Gaussian noncircular signals such as binary phase-shift keying signals. Building on this work, in the present paper we propose a new reference-based contrast function obtained by introducing reference signals into the Negentropy, from which an efficient FastICA optimization algorithm is derived for noncircular sources. The new approach is similar to Novey's nc-FastICA algorithm but differs in that it is much more efficient in terms of computational speed, which is especially notable with a large number of samples. In this study, the local stability of our reference-based Negentropy is analyzed and the derivation of the new algorithm is described in detail. Simulations conducted to demonstrate the performance and effectiveness of our method are also described.
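
    For context, the fixed-point update at the heart of Negentropy-based FastICA is compact enough to sketch. The version below is the classical real-valued one-unit update on pre-whitened data, for illustration only, since Bingham's c-FastICA and the paper's reference-based variant operate on complex, possibly noncircular data.

```python
# A minimal one-unit FastICA sketch with the classical negentropy-based
# fixed-point update (real-valued case, for illustration). X is assumed
# pre-whitened, with shape (dim, n_samples).
import numpy as np

def fastica_one_unit(X, n_iter=100, tol=1e-6):
    g = np.tanh                               # G'(u) for the log-cosh contrast
    g_prime = lambda u: 1.0 - np.tanh(u) ** 2
    rng = np.random.default_rng(0)
    w = rng.standard_normal(X.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        y = w @ X
        # fixed-point update: w <- E{x g(w^T x)} - E{g'(w^T x)} w
        w_new = (X * g(y)).mean(axis=1) - g_prime(y).mean() * w
        w_new /= np.linalg.norm(w_new)
        if abs(abs(w_new @ w) - 1.0) < tol:   # converged up to sign
            return w_new
        w = w_new
    return w
```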

  • An Efficient and Robust Algorithm for BSS by Maximizing Reference-Based Negentropy
    AEU - International Journal of Electronics and Communications, 2015
    Co-Authors: Wei Zhao, Yuehong Shen, Zhigang Yuan, Yimin Wei, Wei Jian
    Abstract:

    A family of contrast criteria referred to as "reference-based" has recently been proposed for blind source separation (BSS); these are essentially the cross-statistics or cross-cumulants between estimated outputs and reference signals. These contrast functions share an appealing feature: the corresponding optimization algorithms are quadratic with respect to the searched parameters. Inspired by this reference-based scheme, a similar contrast function is constructed by introducing reference signals into the Negentropy, based on which a novel fast fixed-point (FastICA) algorithm is proposed in this paper. The new method is similar in spirit to the classical Negentropy-based FastICA algorithm but is much more efficient in terms of computational speed, which is especially striking with a large number of samples. Moreover, the new algorithm is more robust against unexpected outliers than cumulant-based algorithms such as the kurtosis-based FastICA algorithm. The performance of the new method is validated through computer simulations.
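
    The quadratic structure the abstract highlights can be made concrete: with the reference z held fixed, a cross-statistic such as E{(w^T x)^2 z^2} is a quadratic form in w, so each update reduces to a dominant-eigenvector problem, after which the reference is refreshed with the current output. The sketch below illustrates only this generic alternating scheme with a second-order cross-moment; the paper's contrast instead injects the reference into a Negentropy approximation.

```python
# A minimal sketch of the generic "reference-based" alternating scheme:
# for fixed reference z, maximizing w^T E{x x^T z^2} w over ||w|| = 1 is
# a dominant-eigenvector problem; z is then refreshed with the current
# output. X is assumed pre-whitened, with shape (dim, n_samples).
import numpy as np

def reference_based_bss(X, n_iter=20):
    rng = np.random.default_rng(0)
    w = rng.standard_normal(X.shape[0])
    w /= np.linalg.norm(w)
    z = w @ X                               # initial reference signal
    for _ in range(n_iter):
        C = (X * z**2) @ X.T / X.shape[1]   # C = E{x x^T z^2}
        vals, vecs = np.linalg.eigh(C)
        w = vecs[:, -1]                     # maximizer of the quadratic form
        z = w @ X                           # refresh reference with output
    return w
```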