Abstract:
Neural networks are often used to approximate functions defined over high-dimensional data spaces (e.g., text data, genomic data, multi-sensor data). Such approximation tasks are usually difficult due to the curse of dimensionality, and improved methods are needed to handle them effectively and efficiently. Since the data generally resides on a lower-dimensional manifold, various methods have been proposed to first project the data into a lower dimension and then build the neural network approximation over this lower-dimensional projected data space. Here we follow this approach and combine it with the idea of weak learning through the use of random projections of the data. We show that random projection of the data works well and that the approximation errors are smaller than when approximating the functions in the original data space. We explore the random projections with the aim of optimizing this approach.
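The core preprocessing step described above, mapping high-dimensional data to a lower-dimensional space via a random linear projection, can be sketched as follows. This is a generic illustration using NumPy, not the authors' implementation; the dimensions, the Gaussian projection matrix, and the 1/sqrt(k) scaling are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not from the paper):
d, k, n = 1000, 50, 200          # original dim, projected dim, number of samples
X = rng.normal(size=(n, d))      # synthetic high-dimensional data

# Gaussian random projection matrix; the 1/sqrt(k) scaling approximately
# preserves pairwise distances (Johnson-Lindenstrauss lemma), so a network
# trained on X_proj sees a geometry similar to the original data space.
R = rng.normal(size=(d, k)) / np.sqrt(k)
X_proj = X @ R                   # data in the lower-dimensional projected space

print(X_proj.shape)              # (200, 50)
```

A neural network approximator would then be trained on `X_proj` instead of `X`; several independent projection matrices `R` can be drawn to obtain an ensemble of weak learners, as the abstract suggests.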
Date of Conference: 08-13 July 2018
Date Added to IEEE Xplore: 14 October 2018
Electronic ISSN: 2161-4407
- IEEE Keywords
- Index Terms
- Neural Network
- Neural Network Approximation
- Random Projection
- Random Neural Network
- Estimation Error
- Dimensional Space
- Dimensional Data
- Project Data
- Function Approximation
- Data Space
- Low-dimensional Space
- Lower Dimension
- Original Space
- Curse Of Dimensionality
- Weak Learners
- Use In Projects
- Dimensional Manifold
- Low-dimensional Data
- Low-dimensional Manifold
- Original Data Space
- Projection Matrix
- Projective Space
- Single Project
- Dimensional Vector
- Linear Projection
- Neurons In The Hidden Layer
- Locally Linear Embedding
- Data Manifold
- Simple Projection
- Self-organizing Map