Improving the performance of developed activation function-based Deep Learning Long Short-Term Memory (DLLSTM) structures by employing robust loss functions like Mean Absolute Error (MAE) and Sum Squared Error (SSE)
Abstract:
In this paper, we suggest improving the performance of developed activation function-based Deep Learning Long Short-Term Memory (DLLSTM) structures by employing robust loss functions such as Mean Absolute Error (MAE) and Sum Squared Error (SSE) to create new classification layers. The classification layer is the last layer in any DLLSTM neural network structure, and it is where the loss function resides. The LSTM is an improved recurrent neural network that addresses the vanishing-gradient problem, among other issues. Fast convergence and optimal performance depend on the loss function. Three loss functions (the default Crossentropyex, MAE, and SSE), each computing the error between the actual and desired output, were used to examine the effectiveness of the suggested DLLSTM classifier on two distinct applications. The results show that one of the suggested classifiers' loss functions, SSE, outperforms the other loss functions. The suggested activation functions Softsign, Modified-Elliott, Root-sig, Bi-tanh1, Bi-tanh2, Sech, and Wave are more accurate than the tanh function.
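For readers unfamiliar with the two robust loss functions named above, the standard textbook definitions of MAE and SSE, plus the well-known Softsign activation mentioned as an alternative to tanh, can be sketched as follows. This is an illustrative sketch of the generic formulas only, not the paper's implementation; the function names `mae`, `sse`, and `softsign` and the sample arrays are illustrative choices, and the paper's Modified-Elliott, Root-sig, Bi-tanh, Sech, and Wave functions are not reproduced here since their exact forms are not given in the abstract.

```python
import numpy as np

def mae(y_true, y_pred):
    # Mean Absolute Error: average magnitude of the residuals.
    return np.mean(np.abs(y_true - y_pred))

def sse(y_true, y_pred):
    # Sum Squared Error: total of the squared residuals.
    return np.sum((y_true - y_pred) ** 2)

def softsign(x):
    # Softsign activation, x / (1 + |x|): bounded in (-1, 1) like tanh,
    # but saturating more gradually for large |x|.
    return x / (1.0 + np.abs(x))

# Toy targets vs. network outputs.
y_true = np.array([1.0, 0.0, 1.0])
y_pred = np.array([0.9, 0.2, 0.8])
print(round(mae(y_true, y_pred), 4))  # 0.1667
print(round(sse(y_true, y_pred), 4))  # 0.09
```

Both losses measure the error between actual and desired output; SSE squares the residuals before summing, so it penalizes large deviations more strongly than MAE does.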
Published in: IEEE Access (Volume 11)
- IEEE Keywords
- Index Terms
- Neural Network,
- Classification Layer,
- Robust Loss,
- Loss Function,
- Deep Learning,
- Short-term Memory,
- Mean Absolute Error,
- Long Short-term Memory,
- Recurrent Neural Network,
- Machine Learning,
- Activation Function,
- Least-squares,
- Learning Algorithms,
- Deep Neural Network,
- Sigmoid Function,
- Graphics Processing Unit,
- Multilayer Perceptron,
- Hyperbolic Tangent,
- Long Short-term Memory Network,
- Output Units,
- Long Short-term Memory Structure,
- Weather Reports,
- Hidden Neurons,
- Efficiency Range,
- Gating Function,
- Forget Gate,
- Action Labels,
- Conventional Classification
- Author Keywords