Learning Deep Neural Networks for High Dimensional Output Problems

3 Author(s)
Labbe, B.; Herault, R.; Chatelain, C. (INSA de Rouen, St. Etienne du Rouvray, France)

State-of-the-art pattern recognition methods have difficulty with problems where the dimension of the output space is large. In this article, we propose a framework based on deep architectures (e.g., deep neural networks) to address this issue. Deep architectures have proven efficient for high-dimensional input problems such as image classification, owing to their ability to embed the input space. The main contribution of this article is the extension of the embedding procedure to both the input and output spaces, so that complex outputs can be handled easily. With this extension, inter-output dependencies can be modelled efficiently, providing an interesting alternative to probabilistic models such as HMMs and CRFs. Preliminary experiments on toy datasets and on USPS character reconstruction show promising results.
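The abstract's core idea, embedding both the input and the output space with a deep network, can be sketched as an encoder that maps a high-dimensional input to a low-dimensional code and a decoder that maps that code to a high-dimensional structured output. The layer sizes, initialisation, and activation below are illustrative assumptions, not the authors' actual architecture:

```python
import numpy as np

# Hedged sketch of an input-output deep architecture: an encoder embeds the
# high-dimensional input into a low-dimensional code, and a decoder maps the
# code to a high-dimensional output. All sizes here are assumptions.

rng = np.random.default_rng(0)

def layer(n_in, n_out):
    """Randomly initialised dense layer: (weights, bias)."""
    return rng.standard_normal((n_in, n_out)) * 0.1, np.zeros(n_out)

def forward(x, params):
    """tanh MLP forward pass through a list of (W, b) layers."""
    h = x
    for W, b in params:
        h = np.tanh(h @ W + b)
    return h

d_in, d_emb, d_out = 256, 32, 256            # high-dim input/output, low-dim embedding
encoder = [layer(d_in, 64), layer(64, d_emb)]  # input -> shared embedding
decoder = [layer(d_emb, 64), layer(64, d_out)] # embedding -> structured output

x = rng.standard_normal((4, d_in))  # a batch of 4 inputs
z = forward(x, encoder)             # low-dimensional joint code
y = forward(z, decoder)             # high-dimensional output
print(z.shape, y.shape)             # (4, 32) (4, 256)
```

Because the output is produced jointly from one shared code rather than one independent predictor per output dimension, dependencies between output components can be captured, which is the alternative to HMM/CRF modelling the abstract alludes to.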

Published in:

2009 International Conference on Machine Learning and Applications (ICMLA '09)

Date of Conference:

13-15 Dec. 2009