Minkovskian Gradient for Sparse Optimization

Authors: S.-I. Amari (RIKEN Brain Science Institute, Wako, Japan) and M. Yukawa

Abstract:

Information geometry is used to elucidate convex optimization problems under an L1 constraint. A convex function induces a Riemannian metric and two dually coupled affine connections on the manifold of parameters of interest. A generalized Pythagorean theorem and a projection theorem hold in such a manifold. An extended LARS algorithm, applicable to both the under-determined and over-determined cases, is studied, and properties of its solution path are given. The algorithm is shown to be a Minkovskian gradient-descent method, which moves in the steepest direction of a target function under the Minkovskian L1 norm. Two dually coupled affine coordinate systems are useful for analyzing the solution path.
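For intuition about the steepest-descent claim, here is a minimal NumPy sketch (not the authors' implementation) of gradient descent under the L1 norm: per unit of L1 length, the direction that most decreases a smooth target is a single signed coordinate axis, namely the coordinate whose gradient component has the largest magnitude. The quadratic target, step size, and function names below are illustrative assumptions.

import numpy as np

def l1_steepest_descent(grad_f, x0, step=0.01, n_iters=500):
    # Sketch of a Minkovskian (L1-norm) gradient-descent step: the
    # steepest direction under the L1 norm is -sign(g_j) * e_j for the
    # coordinate j maximizing |g_j|, so each iteration updates a single
    # coordinate (cf. the LARS connection discussed in the paper).
    x = x0.astype(float)
    for _ in range(n_iters):
        g = grad_f(x)
        j = np.argmax(np.abs(g))      # coordinate of steepest L1 descent
        x[j] -= step * np.sign(g[j])  # move along the signed axis e_j
    return x

# Illustrative least-squares target (assumed, not from the paper):
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 5))
b = rng.normal(size=20)
grad = lambda x: A.T @ (A @ x - b)    # gradient of 0.5 * ||A x - b||^2
x_hat = l1_steepest_descent(grad, np.zeros(5))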

Published in:

IEEE Journal of Selected Topics in Signal Processing (Volume 7, Issue 4)