Sparse Bayesian Learning Based on Collaborative Neurodynamic Optimization


Abstract:

Regression in a sparse Bayesian learning (SBL) framework is usually formulated as a global optimization problem with a nonconvex objective function and solved in a majorization–minimization framework, where the solution quality and consistency depend heavily on the initial values supplied to the algorithm. In view of these shortcomings, this article presents an SBL algorithm based on collaborative neurodynamic optimization (CNO) for searching for globally optimal solutions to the formulated global optimization problem. The CNO system consists of a population of recurrent neural networks (RNNs), each of which converges to a local optimum of the global optimization problem. Reinitialized repetitively via particle swarm optimization with exchanged local-optima information, the RNNs iteratively improve their search performance until reaching global convergence. The proposed CNO-based SBL algorithm converges almost surely to a globally optimal solution of the formulated global optimization problem. Two applications, sparse signal reconstruction and partial differential equation identification, are presented with experimental results to substantiate the superiority and efficacy of the proposed method in terms of solution optimality and consistency.
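
As a rough illustration of the CNO procedure described in the abstract, the following is a minimal sketch (not the authors' implementation) of the outer loop: a population of local searches, standing in for the recurrent neural networks, each settles at a local minimum of a nonconvex objective, and particle swarm optimization rules reinitialize the population using the exchanged local optima. The objective function, the gradient-descent stand-in for the RNN dynamics, and all hyperparameters below are illustrative assumptions; in the paper's SBL setting the objective would be the nonconvex marginal-likelihood-based cost over the hyperparameters.

```python
# Minimal sketch of a CNO-style loop (assumptions throughout; not the
# authors' implementation). Each "RNN" is emulated by plain gradient
# descent to a nearby local minimum, and PSO updates reinitialize the
# population using the exchanged local-optima information.

import numpy as np

def f(x):
    # Placeholder nonconvex objective (Rastrigin); in SBL this would be the
    # negative marginal log-likelihood over the hyperparameters.
    return np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0)

def grad_f(x, eps=1e-6):
    # Central-difference gradient, standing in for the RNN dynamics.
    g = np.zeros_like(x)
    for i in range(x.size):
        d = np.zeros_like(x)
        d[i] = eps
        g[i] = (f(x + d) - f(x - d)) / (2.0 * eps)
    return g

def local_search(x0, lr=1e-3, steps=500):
    # Converge (approximately) to a local minimum near x0.
    x = x0.copy()
    for _ in range(steps):
        x -= lr * grad_f(x)
    return x

def cno(dim=5, pop=8, rounds=10, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5.0, 5.0, size=(pop, dim))   # population of initial states
    V = np.zeros_like(X)                          # PSO velocities
    pbest, pbest_val = X.copy(), np.full(pop, np.inf)
    gbest, gbest_val = X[0].copy(), np.inf

    for _ in range(rounds):
        # 1) Run every local search ("RNN") until it settles at a local optimum.
        X = np.array([local_search(x) for x in X])
        vals = np.array([f(x) for x in X])

        # 2) Exchange local-optima information: update personal and global bests.
        better = vals < pbest_val
        pbest[better], pbest_val[better] = X[better], vals[better]
        if vals.min() < gbest_val:
            gbest_val, gbest = vals.min(), X[vals.argmin()].copy()

        # 3) PSO rule reinitializes the population for the next round.
        r1, r2 = rng.random(X.shape), rng.random(X.shape)
        V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)
        X = X + V

    return gbest, gbest_val

if __name__ == "__main__":
    x_star, best = cno()
    print("best objective value found:", best)
```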
Published in: IEEE Transactions on Cybernetics ( Volume: 52, Issue: 12, December 2022)
Page(s): 13669 - 13683
Date of Publication: 14 July 2021

PubMed ID: 34260368
