A Class of Complex ICA Algorithms Based on the Kurtosis Cost Function


2 Author(s)

Hualiang Li and T. Adali, Dept. of Computer Science and Electrical Engineering, University of Maryland Baltimore County (UMBC), Baltimore, MD, USA

In this paper, we introduce a novel way of performing real-valued optimization in the complex domain. This framework enables a direct complex optimization technique when the cost function satisfies Brandwood's independent analyticity condition. In particular, the technique is used to derive three algorithms, namely, kurtosis maximization using gradient update (KM-G), kurtosis maximization using fixed-point update (KM-F), and kurtosis maximization using Newton update (KM-N), which perform complex independent component analysis (ICA) by maximizing the complex kurtosis cost function. The derivation and related analysis of the three algorithms are carried out entirely in the complex domain, without resorting to any complex-to-real mapping for differentiation and optimization. A general complex Newton rule is also derived for developing the KM-N algorithm, and the real conjugate gradient algorithm is extended to the complex domain in a similar manner. Simulation results indicate that the fixed-point version (KM-F) and the gradient version (KM-G) are superior to other similar algorithms when the sources include both circular and noncircular distributions and the dimensionality is relatively high.
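To make the kind of update the abstract describes concrete, the following Python/NumPy sketch maximizes the magnitude of the complex kurtosis, kurt(y) = E[|y|^4] - 2(E[|y|^2])^2 - |E[y^2]|^2, of one extracted component y = w^H x from whitened complex mixtures, using the Wirtinger gradient with respect to conj(w). This is an illustrative sketch of gradient-based complex kurtosis maximization, not the paper's exact KM-G update (the paper's update equations are not reproduced here); the names km_g_sketch, mu, and n_iter are hypothetical.

import numpy as np

def complex_kurtosis(y):
    # Standard complex kurtosis of a zero-mean signal y:
    # kurt(y) = E[|y|^4] - 2 (E[|y|^2])^2 - |E[y^2]|^2 (real-valued).
    return (np.mean(np.abs(y) ** 4)
            - 2 * np.mean(np.abs(y) ** 2) ** 2
            - np.abs(np.mean(y ** 2)) ** 2)

def km_g_sketch(X, n_iter=200, mu=0.1, seed=0):
    # Illustrative gradient-ascent extraction of one component from
    # whitened complex mixtures X (shape: n_sensors x n_samples).
    # Maximizes |kurt(w^H x)| under a unit-norm constraint. Hypothetical
    # sketch, not the exact KM-G algorithm from the paper.
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    w = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        y = w.conj() @ X  # y_t = w^H x_t for all samples
        # Wirtinger gradient of kurt(y) w.r.t. conj(w),
        # up to an overall factor of 2 absorbed into the step size mu:
        g = (np.mean(np.abs(y) ** 2 * y.conj() * X, axis=1)
             - 2 * np.mean(np.abs(y) ** 2) * np.mean(y.conj() * X, axis=1)
             - np.conj(np.mean(y ** 2)) * np.mean(y * X, axis=1))
        # Ascend on |kurt|: follow the gradient scaled by sign(kurt).
        w = w + mu * np.sign(complex_kurtosis(y)) * g
        w /= np.linalg.norm(w)  # project back onto the unit sphere
    return w

In this sketch the unit-norm constraint is enforced by renormalizing after each step, and multiplying the gradient by sign(kurt) makes the iteration maximize |kurt| regardless of whether the source's kurtosis is positive or negative.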

Published in:

IEEE Transactions on Neural Networks (Volume 19, Issue 3)