Abstract:
Artificial neural networks (ANNs) are one of the most promising tools in the quest to develop general artificial intelligence. Their design was inspired by how neurons connect and process information in natural brains, the only other substrate known to harbor intelligence. Compared to natural brains, which are sparsely connected and form sparsely distributed representations, ANNs instead process information by connecting all nodes of one layer to all nodes of the next. In addition, modern ANNs are trained with backpropagation, while their natural counterparts have been optimized by natural evolution over eons. Here we measure the transfer entropy, that is, the information transferred from one node to another, to determine how information propagates through recurrent neural networks optimized either by backpropagation or by a genetic algorithm. Surprisingly, we find no difference in how they relay information, suggesting that it is not the optimization method, but instead their topological structure, that causes these ANNs to process information differently compared to the natural brains they seek to emulate.
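The transfer entropy mentioned above is, for a history length of 1, TE_{X→Y} = Σ p(y_{t+1}, y_t, x_t) log₂[ p(y_{t+1} | y_t, x_t) / p(y_{t+1} | y_t) ]. The following is a minimal sketch of a plug-in estimator for binary time series; it is an illustration of the general quantity, not the authors' actual measurement pipeline, and the function name and history-length choice are assumptions.

```python
# Hedged sketch: plug-in transfer-entropy estimator for binary series,
# assuming a history length of 1 (not necessarily the paper's setting).
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Estimate TE from series x to series y (bits), history length 1.

    TE_{X->Y} = sum over (y_next, y_now, x_now) of
        p(y_next, y_now, x_now) * log2( p(y_next|y_now, x_now) / p(y_next|y_now) )
    with all probabilities estimated by simple counting.
    """
    triples = Counter()   # counts of (y_next, y_now, x_now)
    pairs_yx = Counter()  # counts of (y_now, x_now)
    pairs_yy = Counter()  # counts of (y_next, y_now)
    singles_y = Counter() # counts of y_now
    n = len(y) - 1
    for t in range(n):
        triples[(y[t + 1], y[t], x[t])] += 1
        pairs_yx[(y[t], x[t])] += 1
        pairs_yy[(y[t + 1], y[t])] += 1
        singles_y[y[t]] += 1
    te = 0.0
    for (y_next, y_now, x_now), c in triples.items():
        p_joint = c / n                                     # p(y_next, y_now, x_now)
        p_cond_full = c / pairs_yx[(y_now, x_now)]          # p(y_next | y_now, x_now)
        p_cond_hist = pairs_yy[(y_next, y_now)] / singles_y[y_now]  # p(y_next | y_now)
        te += p_joint * log2(p_cond_full / p_cond_hist)
    return te
```

As a sanity check, a series y that simply copies x with a one-step delay yields positive transfer entropy, while a constant x transfers exactly zero bits.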
Date of Conference: 14-15 November 2020
Date Added to IEEE Xplore: 07 January 2021