
Distributed Sparse Optimization Based on Minimax Concave and Consensus Promoting Penalties: Towards Global Optimality


Abstract:

We propose a distributed optimization framework to generate accurate sparse estimates while allowing an algorithmic solution with guaranteed convergence to a global minimizer. To this end, the proposed problem formulation involves the minimax concave penalty together with an additional penalty, called the consensus promoting penalty (CPP), that induces convexity in the resulting optimization problem. This problem is solved with an exact first-order proximal gradient algorithm, which employs a pair of proximity operators and is referred to as the distributed proximal and debiasing-gradient (DPD) method. Numerical examples show that CPP not only convexifies the whole cost function but also accelerates convergence in terms of the system mismatch.
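For illustration only, the following is a minimal, centralized sketch of a proximal gradient iteration with the minimax concave (MC) penalty, whose proximity operator is the firm-thresholding operator. It is not the authors' distributed DPD method: the consensus promoting penalty, the debiasing step, and the distributed protocol are not modeled, and the least-squares data term, the parameter names (lam, gamma), and the regularization values are assumptions made for this sketch.

import numpy as np

def prox_mc(x, lam, gamma, tau):
    # Proximity operator of tau * (MC penalty with threshold lam and
    # concavity parameter gamma), i.e., the firm-thresholding operator.
    # Requires gamma > tau so that each scalar subproblem is strictly convex.
    x = np.asarray(x, dtype=float)
    a = np.abs(x)
    shrink = np.sign(x) * (a - tau * lam) / (1.0 - tau / gamma)
    return np.where(a <= tau * lam, 0.0, np.where(a <= gamma * lam, shrink, x))

def prox_gradient_mc(A, b, lam, gamma=3.0, n_iter=1000):
    # Generic (centralized) proximal gradient iteration for
    #   minimize_w  0.5 * ||A w - b||^2 + MC_penalty(w; lam, gamma),
    # given here only as a sketch of the prox-gradient mechanism.
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1 / Lipschitz constant of the gradient
    w = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ w - b)                # gradient of the least-squares term
        w = prox_mc(w - step * grad, lam, gamma, step)
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((80, 200))
    w_true = np.zeros(200)
    w_true[:5] = rng.standard_normal(5)
    b = A @ w_true + 0.01 * rng.standard_normal(80)
    w_hat = prox_gradient_mc(A, b, lam=0.3)
    print("nonzero coefficients in the estimate:",
          np.count_nonzero(np.abs(w_hat) > 1e-3))

The firm-thresholding operator behaves like soft thresholding near the threshold but leaves large-magnitude entries untouched, which is the debiasing property of the MC penalty that motivates its use over the l1 norm.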
Date of Conference: 29 August 2022 - 02 September 2022
Date Added to IEEE Xplore: 18 October 2022
Conference Location: Belgrade, Serbia

