A neural network is a massively parallel distributed processor made up of simple processing units known as neurons. These neurons are organized in layers, and every neuron in each layer is connected to every neuron in the adjacent layers. This connection architecture creates an enormous number of communication links between neurons, which is a major issue for a hardware implementation of a neural network, since communication links take up hardware space, and hardware space costs money. To overcome this space problem, incremental communication for multilayer neural networks has been proposed. Incremental communication works by communicating only the change in a value between neurons rather than the value's entire magnitude. These changes can be represented with fewer bits and can therefore be communicated over narrower links. To validate the idea of incremental communication, an incremental communication neural network was designed and implemented, and then compared to a traditional neural network. The implementation shows that even though the incremental communication neural network saves design space through reduced communication links, the additional resources needed to shape the data for transmission outweigh any design-space savings when targeting a modern FPGA.
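The delta-based scheme described above can be sketched in software. This is a minimal illustration, not the paper's implementation: the bit width, scale factor, and function names are assumptions chosen for the example. The sender quantizes and clamps each change to the narrow link width, and the receiver reconstructs the value by accumulating the received deltas.

```python
# Illustrative sketch of incremental communication (assumed parameters,
# not the hardware design described in the abstract).

DELTA_BITS = 4                            # assumed narrow link width
MAX_DELTA = 2 ** (DELTA_BITS - 1) - 1     # largest signed delta on the link
MIN_DELTA = -2 ** (DELTA_BITS - 1)        # smallest signed delta on the link

def send_incremental(values, prev=0.0, scale=0.1):
    """Encode a stream of activations as clamped, quantized deltas."""
    deltas = []
    for v in values:
        d = round((v - prev) / scale)
        d = max(MIN_DELTA, min(MAX_DELTA, d))  # clamp to link width
        deltas.append(d)
        prev += d * scale  # sender tracks what the receiver will reconstruct
    return deltas

def receive_incremental(deltas, prev=0.0, scale=0.1):
    """Rebuild the activation stream by accumulating the deltas."""
    out = []
    for d in deltas:
        prev += d * scale
        out.append(prev)
    return out

# Example: a slowly varying activation stream reconstructs to within
# one quantization step of the original values.
acts = [0.0, 0.12, 0.15, 0.5, 0.48]
deltas = send_incremental(acts)
recon = receive_incremental(deltas)
```

Note that because the sender tracks the receiver's reconstructed value rather than the raw previous input, quantization error does not accumulate; the abstract's point is that the extra logic this bookkeeping requires is exactly the kind of data-shaping overhead that erased the savings on a modern FPGA.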