University of Heidelberg
Computing with physics: aspects of bio-inspired artificial intelligence
Tuesday, 19 November 2019, 1:15pm
Building 14.6U, room 241
Any form of information storage and processing requires a physical substrate, which renders computation ultimately beholden to the laws of motion governing the dynamics of its physical carrier. So much for the obvious.
What is certainly much less obvious is the level of abstraction at which one should consider the dynamics of a system in order to most aptly describe the computation it performs. For classical computers, for example, it is absolutely reasonable to forgo the quantum mechanics of semiconductors in favor of bitwise operations, regardless of what a "bit" actually boils down to in terms of electron flows and Fermi levels. This is not particularly surprising, because we have engineered these systems for the specific purpose of manipulating binary representations. Biological brains, on the other hand, are the much more complicated result of a lengthy trial-and-error process on evolutionary timescales, and consensus about how they implement computation is not yet in sight. Nevertheless, it is worth taking a stab at the question, considering the ultimate benefits of even incremental progress towards general AI. Even a far-from-complete answer can, for example, guide the design of novel computational devices, which would in turn benefit the study of biological and artificial intelligence alike.
In my talk, I will discuss some intriguing recent ideas about how complex cortical computation could emerge from rather simple mathematical principles. While the line of thought will certainly betray a physicist's bias towards parsimony, it is substantiated by links to both biological data and applications on artificial neuromorphic substrates. With respect to neuronal dynamics, I will address some computational advantages of having spikes in a Bayesian brain and how the associated information processing links to other cortical observables. With respect to synaptic plasticity, I will discuss how efficient learning can happen both at the level of single neurons and across cortical hierarchies. These insights help reconnect the booming field of artificial neural networks to its original, biological source of inspiration, and further nourish the development of efficient computation on synthetic, brain-inspired substrates.
Host: Prof. Dr. Markus Diesmann
Institute of Neuroscience and Medicine (INM-6)
Computational and Systems Neuroscience
Institute for Advanced Simulation (IAS-6)