The Experts below are selected from a list of 134781 Experts worldwide ranked by ideXlab platform
Dongxiao Zhang - One of the best experts on this subject based on the ideXlab platform.
Ensemble Long Short-Term Memory (EnLSTM) network
Geophysical Research Letters, 2020. Co-Authors: Yuntian Chen, Dongxiao Zhang. Abstract: In this study, we propose an ensemble Long Short-Term Memory (EnLSTM) network, which can be trained on a small dataset and process sequential data. The EnLSTM combines the ensemble neural network (ENN) and the cascaded Long Short-Term Memory (C-LSTM) network to leverage their complementary strengths. To resolve the issues of over-convergence and disturbance compensation associated with training failure on small-data problems, model parameter perturbation and high-fidelity observation perturbation methods are introduced. The EnLSTM is compared with commonly used models on a published dataset and shown to achieve state-of-the-art performance in generating well logs, with a mean-square-error (MSE) reduction of 34%. In the case study, 12 well logs that cannot be measured while drilling are generated from logging-while-drilling (LWD) data. The EnLSTM is thus capable of reducing cost and saving time in practice.
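The abstract does not give pseudocode, but the core ensemble idea — clone a base LSTM, perturb each member's parameters with small Gaussian noise, and average the ensemble's predictions — can be sketched as follows. This is a minimal numpy illustration under assumed sizes; `TinyLSTM`, `perturb`, and the noise scale `sigma` are all illustrative stand-ins, not the paper's C-LSTM or training procedure.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyLSTM:
    """Minimal single-layer LSTM cell (illustrative stand-in for the C-LSTM)."""
    def __init__(self, n_in, n_hidden, rng):
        # one stacked weight matrix for the input, forget, output, and cell gates
        self.W = rng.normal(0, 0.1, (4 * n_hidden, n_in + n_hidden))
        self.b = np.zeros(4 * n_hidden)
        self.n_hidden = n_hidden

    def run(self, xs):
        """Run the cell over a sequence and return the final hidden state."""
        H = self.n_hidden
        h, c = np.zeros(H), np.zeros(H)
        for x in xs:
            z = self.W @ np.concatenate([x, h]) + self.b
            i, f, o = sigmoid(z[:H]), sigmoid(z[H:2*H]), sigmoid(z[2*H:3*H])
            g = np.tanh(z[3*H:])
            c = f * c + i * g
            h = o * np.tanh(c)
        return h

def perturb(model, sigma, rng):
    """Clone a model and add Gaussian noise to its parameters."""
    clone = TinyLSTM.__new__(TinyLSTM)
    clone.n_hidden = model.n_hidden
    clone.W = model.W + rng.normal(0, sigma, model.W.shape)
    clone.b = model.b + rng.normal(0, sigma, model.b.shape)
    return clone

rng = np.random.default_rng(0)
base = TinyLSTM(n_in=3, n_hidden=8, rng=rng)
ensemble = [perturb(base, sigma=0.05, rng=rng) for _ in range(10)]

xs = rng.normal(size=(20, 3))                 # one sequence of 20 timesteps
preds = np.stack([m.run(xs) for m in ensemble])
mean_pred = preds.mean(axis=0)                # ensemble average prediction
```

In the actual EnLSTM, the ensemble members are updated jointly with perturbed high-fidelity observations rather than simply averaged at inference time; the sketch only shows the parameter-perturbation half of that recipe.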
Stefan Wermter - One of the best experts on this subject based on the ideXlab platform.
An analysis of Convolutional Long Short-Term Memory Recurrent Neural Networks for gesture recognition
Neurocomputing, 2017. Co-Authors: Eleni Tsironi, Pablo Barros, Cornelius Weber, Stefan Wermter. Abstract: In this research, we analyze a Convolutional Long Short-Term Memory Recurrent Neural Network (CNNLSTM) in the context of gesture recognition. CNNLSTMs are able to successfully learn gestures of var...
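The CNNLSTM pattern — a convolutional stage extracting per-frame spatial features, feeding a recurrent stage that models the temporal sequence — can be sketched minimally as below. All sizes are assumed, the convolution uses a single hand-rolled layer, and a plain tanh RNN stands in for the LSTM to keep the sketch short; none of this reproduces the paper's actual architecture.

```python
import numpy as np

def conv2d_single(frame, kernel):
    """Valid 2-D correlation of one grayscale frame with one kernel."""
    kh, kw = kernel.shape
    H, W = frame.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(frame[i:i+kh, j:j+kw] * kernel)
    return out

rng = np.random.default_rng(1)
T, H, W = 16, 12, 12                      # assumed: 16 video frames of 12x12 pixels
video = rng.normal(size=(T, H, W))

n_filters, n_hidden, n_classes = 4, 6, 5  # assumed sizes
kernels = rng.normal(0, 0.3, size=(n_filters, 3, 3))
Wx = rng.normal(0, 0.3, size=(n_hidden, n_filters))
Wh = rng.normal(0, 0.3, size=(n_hidden, n_hidden))

h = np.zeros(n_hidden)
for frame in video:
    # CNN stage: one conv layer + ReLU + global average pooling per frame
    feats = np.array([np.maximum(conv2d_single(frame, k), 0).mean()
                      for k in kernels])
    # recurrent stage (plain tanh RNN standing in for the LSTM)
    h = np.tanh(Wx @ feats + Wh @ h)

# classify the final hidden state into one of n_classes gesture labels
logits = rng.normal(0, 0.3, size=(n_classes, n_hidden)) @ h
pred = int(np.argmax(logits))
```

The design point the paper exploits is this division of labor: convolutions handle spatial structure within a frame, while the recurrent layer handles how those features evolve across frames.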
Claire J Tomlin - One of the best experts on this subject based on the ideXlab platform.
Long Short-Term Memory neural network stability and stabilization using linear matrix inequalities
International Symposium on Circuits and Systems, 2019. Co-Authors: Shankar A Deka, Dusan M Stipanovic, Boris Murmann, Claire J Tomlin. Abstract: A global asymptotic stability condition for Long Short-Term Memory neural networks is presented in this paper. This global stability condition is expressed as a linear matrix inequality optimization problem. The linear matrix inequality formulation can also be viewed as a means of stabilizing Long Short-Term Memory neural networks, since the networks' weight matrices and biases can essentially be treated as control variables. The condition, and how to compute numerical values for the weight matrices and biases, are illustrated by examples.
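The paper's LSTM-specific LMI is not reproduced in the abstract, but the flavor of such a certificate can be illustrated with the classical discrete-time Lyapunov LMI: the system x_{k+1} = A x_k is globally asymptotically stable iff there exists P > 0 with A^T P A - P < 0. The sketch below builds such a P numerically from the convergent series P = sum_k (A^T)^k Q A^k for an assumed stable test matrix; the actual paper derives an analogous condition on the LSTM's gate weight matrices.

```python
import numpy as np

# Assumed stable test matrix (spectral radius ~0.47, well inside the unit disk).
A = np.array([[0.5,  0.2],
              [-0.1, 0.4]])
Q = np.eye(2)

# Build the Lyapunov matrix P as a partial sum of P = sum_k (A^T)^k Q A^k,
# which solves P - A^T P A = Q when the spectral radius of A is below 1.
P = np.zeros_like(A)
M = np.eye(2)
for _ in range(200):
    P += M.T @ Q @ M
    M = A @ M

# The LMI certificate: P must be positive definite and A^T P A - P
# negative definite (in the limit it equals exactly -Q).
lmi = A.T @ P @ A - P
eig_P = np.linalg.eigvalsh(P)
eig_lmi = np.linalg.eigvalsh(lmi)
```

Treating the weight matrices as decision variables in such an inequality is what turns the stability *analysis* into a *stabilization* (synthesis) problem, which is the viewpoint the paper takes for LSTMs.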
Wei Wei - One of the best experts on this subject based on the ideXlab platform.
Sleep staging by bidirectional Long Short-Term Memory convolution neural network
Future Generation Computer Systems, 2020. Co-Authors: Xueyan Chen, Wei Yan, Wei Wei. Abstract: Sleep is an indispensable physiological activity for human beings: it restores energy, enhances resistance, promotes normal growth and development, and gives the body sufficient rest. Research on sleep is therefore important for people's health, and sleep stage classification is the basis for studying sleep. Polysomnography (PSG), also known as sleep electroencephalography, is mainly used for sleep stage analysis. Both automatic and manual classification of sleep stages are commonly used; compared with manual classification, automatic classification has the advantages of higher efficiency and more objective results. In this paper, we combine a bidirectional Long Short-Term Memory recurrent neural network with a convolution neural network (CNN), named the bidirectional Long Short-Term Memory convolution neural network (Bi-LSTM-CNN), to perform automatic sleep classification on multichannel sleep data (electroencephalogram, electro-oculogram, and electromyography). Under cross-validation on 39 samples, the average accuracy is 89.4%, 84.8%, and 81.6% for Bi-LSTM-CNN, Bi-LSTM, and LSTM-RNN, respectively. Bi-LSTM-CNN is an effective signal processing method that efficiently improves the accuracy of sleep classification and has good application prospects.
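The bidirectional part of the model — running the recurrence forward and backward over the multichannel epoch and classifying from both final states — can be sketched as below. All sizes are assumed, a plain tanh RNN stands in for each LSTM direction, and the CNN stage is omitted; this is only an illustration of the bidirectional readout, not the paper's Bi-LSTM-CNN.

```python
import numpy as np

rng = np.random.default_rng(2)
T, C = 30, 3                 # assumed: 30 timesteps of 3 channels (EEG, EOG, EMG)
x = rng.normal(size=(T, C))

n_hidden, n_stages = 8, 5    # assumed: 5 sleep stages (e.g. Wake, N1, N2, N3, REM)
Wx = rng.normal(0, 0.3, size=(n_hidden, C))
Wh = rng.normal(0, 0.3, size=(n_hidden, n_hidden))

def rnn_pass(seq):
    """Plain tanh RNN standing in for one LSTM direction."""
    h = np.zeros(n_hidden)
    for v in seq:
        h = np.tanh(Wx @ v + Wh @ h)
    return h

# bidirectional readout: run forward and backward, concatenate final states
h_fw = rnn_pass(x)
h_bw = rnn_pass(x[::-1])
features = np.concatenate([h_fw, h_bw])

# softmax classifier over the sleep stages
W_out = rng.normal(0, 0.3, size=(n_stages, 2 * n_hidden))
logits = W_out @ features
probs = np.exp(logits - logits.max())
probs /= probs.sum()
stage = int(np.argmax(probs))
```

The motivation for the bidirectional pass is that a sleep stage at time t depends on signal context both before and after t within the epoch, which a one-directional recurrence cannot see.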