Time frame: 2016 to 2021
Origin: Spontaneous Application
Funding: Engineering and Physical Sciences Research Council (EPSRC), UK
The “Brains on Board” project is an EPSRC-funded Programme Grant involving teams of biologists and computer scientists from the University of Sheffield, the University of Sussex, and Queen Mary University of London.
What if we could design an autonomous flying robot with the navigational and learning abilities of a honeybee? Such a computationally and energy-efficient autonomous robot would represent a step change in robotics technology, and building one is precisely what the 'Brains on Board' project aims to achieve.
Autonomous control of mobile robots requires robustness to environmental and sensory uncertainty, as well as the flexibility to deal with novel environments and scenarios. Animals solve these problems with flexible brains capable of unsupervised pattern detection and learning. Behavioural biologists and neuroscientists are increasingly realising that ‘small’-brained animals such as insects have extremely rich behavioural repertoires. The honeybee, an extremely well-studied animal with a brain of only one million neurons, exhibits sophisticated learning and navigation abilities through highly efficient neural processes. Bees can reliably navigate over several kilometres in three-dimensional space, learning the features that enable them to return to their nest.
They can optimise the distances travelled on routes from the nest site to multiple forage patches, almost certainly without possessing a mental map. Furthermore, bees’ brains can multi-task, adapt to completely novel scenarios, and learn extremely rapidly. This is in marked contrast to typical control engineering solutions and AI, including deep learning. Bee brains thus provide an excellent autonomous system to reverse-engineer: far more sophisticated in navigation and learning abilities than those of Drosophila and other flies, yet still of a size at which systematic investigation and modelling remain practical, in contrast to the much larger vertebrate brains of rats, cats, and primates.
The project will fuse computational and experimental neuroscience to develop a new class of highly efficient ‘brain on board’ robot controllers, able to exhibit adaptive behaviour while running on powerful yet lightweight general-purpose graphics processing unit (GPGPU) hardware. This will be demonstrated via autonomous and adaptive control of a flying robot, using an on-board computational simulation of the bee’s neural circuits: an unprecedented achievement in robotics technology.
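To illustrate the kind of model such a controller executes, here is a minimal sketch of a leaky integrate-and-fire neuron simulated with forward-Euler integration, the basic building block of the spiking-network simulations that the project accelerates on GPUs. This is not project code: the function name, parameters, and units are illustrative assumptions only.

```python
# Minimal leaky integrate-and-fire (LIF) neuron, forward-Euler integration.
# Illustrative sketch only: parameter values are assumptions, not project code.

def simulate_lif(i_input, dt=0.1, tau=20.0, v_rest=-65.0,
                 v_reset=-65.0, v_thresh=-50.0):
    """Simulate one LIF neuron.

    i_input  : list of input drives (in mV), one per time step of length dt (ms)
    returns  : list of spike times in ms
    """
    v = v_rest
    spikes = []
    for step, i_t in enumerate(i_input):
        # Membrane potential decays towards rest while integrating the input.
        v += dt / tau * (v_rest - v + i_t)
        if v >= v_thresh:            # threshold crossing: emit a spike
            spikes.append(step * dt)
            v = v_reset              # reset membrane potential after the spike
    return spikes

# Usage: a constant supra-threshold drive produces regular spiking,
# while zero drive leaves the neuron silent.
regular = simulate_lif([20.0] * 1000)   # 100 ms of stimulation
silent = simulate_lif([0.0] * 1000)
```

In a full network simulation, thousands of such update equations run in parallel each time step, which is why the per-neuron arithmetic maps so naturally onto GPU hardware.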
Collaboration with HBP
Brains on Board (BoB) and the Human Brain Project (HBP) complement each other in many ways. BoB focuses on insect brains and applications in autonomous robotics using physical robots, while HBP works on mouse and human brain data and a virtual neurorobotics platform. Closer to the core of this partnership, BoB uses GPU accelerator technology for neuromorphic simulation of brain models, while HBP is developing the BrainScaleS and SpiNNaker neuromorphic computing platforms. This creates synergies through the re-use of brain models across different platforms and a widening of the applications for the HBP neuromorphic platforms.
The essence of this partnership is that our work in BoB will produce brain models that are suitable as neuromorphic control algorithms for robots; within the partnership we plan to port these models to the HBP neuromorphic platforms. This will create showcases of the platforms being used for brain simulations in applications, and will also diversify the BoB work beyond its GPU technology focus. As a first practical contribution, Dr. James Knight (BoB) will work with Dr. Sebastian Hoeppner’s group at TU Dresden (HBP SP9) to port an existing brain model to the Santos (SpiNNaker II prototype) chip.
Publications
Blundell I., Brette R., Cleland T.A., Close T.G., Coca D., Davison A.P., Diaz Pier S., Fernandez Musoles C., Gleeson P., Goodman D.F.M., Hines M., Hopkins M.W., Kumbhar P., Lester D.R., Marin B., Morrison A., Müller E., Nowotny T., Peyser A., Plotnikov D., Richmond P., Rowley A., Rumpe B., Stimberg M., Stokes A.B., Tomkins A., Trensch G., Woodman M. and Eppler J.M. (2018) Code generation in computational neuroscience: a review of tools and techniques. Frontiers in Neuroinformatics, 12:68. doi: 10.3389/fninf.2018.00068
Cope A.J., Vasilaki E., Minors D., Sabo C., Marshall J.A.R. and Barron A.B. (2018) Abstract concept learning in a simple neural network inspired by the insect brain. PLoS Computational Biology, 14(9). doi: 10.1371/journal.pcbi.1006435
Knight J.C. and Nowotny T. (2018) GPUs Outperform Current HPC and Neuromorphic Solutions in Terms of Speed and Energy When Simulating a Highly-Connected Cortical Model. Frontiers in Neuroscience, 12:941. doi: 10.3389/fnins.2018.00941
Guiraud M., Roper M. and Chittka L. (2018) High-speed Videography Reveals How Honeybees Can Turn a Spatial Concept Learning Task Into a Simple Discrimination Task by Stereotyped Flight Movements. Frontiers in Psychology, 9:1347. doi: 10.3389/fpsyg.2018.01347
Woodgate J.L., Makinson J.C., Lim K.S., Reynolds A.M. and Chittka L. (2017) Continuous Radar Tracking Illustrates the Development of Multi-destination Routes of Bumblebees. Scientific Reports, 7:17323. doi: 10.1038/s41598-017-17553-1
Sabo C., Chisholm R., Petterson A. and Cope A. (2017) A lightweight, inexpensive robotic system for insect vision. Arthropod Structure & Development. doi: 10.1016/j.asd.2017.08.001
Cope A., Sabo C., Vasilaki E., Barron A.B. and Marshall J.A.R. (2017) A Computational Model of the Integration of Landmarks and Motion in the Insect Central Complex. PLoS ONE. doi: 10.1371/journal.pone.0172325
Sabo C., Yavuz E., Cope A., Gurney K., Vasilaki E., Nowotny T. and Marshall J.A.R. (2017) An Inexpensive Flying Robot Design for Embodied Robotics Research. 2017 International Joint Conference on Neural Networks (IJCNN), Anchorage, Alaska, May 14-19.