Neurons are morphologically diverse, but the evolutionary advantage of this diversity is unclear. Neurons also spike, exploiting time in their computations, outputs and learning. Most work on artificial neural networks (ANNs), however, abstracts over these details and restricts learning and adaptation to the spatial parameters of weights and biases. Even when time is introduced, it enters through recurrence at a fixed time step (synchronous computation), and learning again remains restricted to weights and biases. Here we adapt weights, time constants and delays in an evolutionary context in an attempt to gain insight into why neurons are so diverse. We show that nature might have evolved a morphologically diverse set of neurons to i) map spatio-temporal spike trains and ii) ease the evolutionary search for high-performing solutions.
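The setup described above can be illustrated with a minimal sketch: a discrete-time leaky integrate-and-fire neuron whose genome holds a synaptic weight, a membrane time constant and a conduction delay, together with a Gaussian mutation operator over all three. This is an illustrative assumption, not the paper's actual model or parameters; the neuron model, encoding and evolutionary algorithm used in the work are detailed in the PDF.

```python
import random

def simulate_lif(input_spikes, weight, tau, delay, t_max=100, v_thresh=1.0):
    """Discrete-time (dt = 1) leaky integrate-and-fire neuron.

    Input spikes arrive `delay` steps after emission; the membrane
    potential leaks with time constant `tau` and resets on spiking.
    """
    arrivals = {t + delay for t in input_spikes}
    v, output_spikes = 0.0, []
    for t in range(t_max):
        v *= (1.0 - 1.0 / tau)      # membrane leak
        if t in arrivals:
            v += weight             # delayed synaptic input
        if v >= v_thresh:
            output_spikes.append(t)
            v = 0.0                 # reset after spike
    return output_spikes

def mutate(genome, sigma=0.1):
    """Perturb weight, time constant and delay alike -- the key point is
    that temporal parameters sit in the genome next to the weight."""
    weight, tau, delay = genome
    return (weight + random.gauss(0.0, sigma),
            max(1.0, tau + random.gauss(0.0, sigma)),   # keep tau valid
            max(0.0, delay + random.gauss(0.0, sigma))) # keep delay non-negative
```

Because `tau` and `delay` shift a neuron's output spikes in time, evolving them alongside `weight` lets a population search over spatio-temporal mappings rather than spatial ones alone.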
