In this paper we analyze simple computations with spiking neural networks (SNNs), laying the foundation for more sophisticated calculations. We consider both a deterministic and a stochastic computation framework for SNNs, using the Izhikevich neuron model in a series of simulated experiments. Within the deterministic framework, we design and implement fundamental mathematical operators such as addition, subtraction, multiplexing, and multiplication. We show that cross-inhibition among groups of neurons in a winner-takes-all (WTA) network configuration yields considerable computational power and produces selective behavior that can be exploited in various robotic control tasks. Within the stochastic framework, we discuss an alternative computation paradigm to the classic von Neumann architecture, one that supports information storage and decision making. This paradigm exploits the experimentally verified property of networks of randomly connected spiking neurons to store information as a stationary probability distribution in each sub-network of the SNN. We reproduce this property by simulating a toy network of randomly connected stochastic Izhikevich neurons.
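As a minimal illustration of the neuron model underlying both frameworks, the sketch below Euler-integrates a single Izhikevich neuron with the standard regular-spiking parameters (a=0.02, b=0.2, c=-65, d=8). The input current, step size, and simulation length are illustrative assumptions, not values taken from the paper's experiments.

```python
def izhikevich(I, T=200.0, dt=0.25, a=0.02, b=0.2, c=-65.0, d=8.0):
    """Simulate one Izhikevich neuron with constant input current I (forward Euler).

    Dynamics (Izhikevich, 2003):
        v' = 0.04*v**2 + 5*v + 140 - u + I
        u' = a*(b*v - u)
        spike: when v >= 30 mV, reset v <- c and u <- u + d
    Returns the list of spike times in ms.
    """
    v, u = c, b * c              # start at the resting state
    spikes = []
    t = 0.0
    while t < T:
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:            # spike detected
            spikes.append(t)
            v, u = c, u + d      # after-spike reset
        t += dt
    return spikes

spike_times = izhikevich(I=10.0)
print(f"{len(spike_times)} spikes in 200 ms")
```

With these regular-spiking parameters the neuron fires tonically under sustained input; switching the (a, b, c, d) tuple reproduces other firing classes (e.g. fast-spiking or bursting), which is what makes the model a convenient building block for the operator networks described above.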
In Proc. of IWANN'17, the 14th International Work-Conference on Artificial and Natural Neural Networks, Cadiz, Spain, June, 2017, Springer, LNCS.
*This work was partially supported by the NSF-Frontiers
Cyber-Physical Heart Award, FWF-NFN RiSE Award, FWF-DC LMCS Award,
FFG Harmonia Award, FFG Em2Apps Award, and the TUW CPPS-DK Award.