Project Website: http://brainsonboard.co.uk

The “Brains on Board” project is an EPSRC-funded Programme Grant involving teams of biologists and computer scientists from the University of Sheffield, the University of Sussex, and Queen Mary University of London.

What if we could design an autonomous flying robot with the navigational and learning abilities of a honeybee? Such a computationally and energy-efficient autonomous robot would represent a step-change in robotics technology, and is precisely what the 'Brains on Board' project aims to achieve. 

Autonomous control of mobile robots requires robustness to environmental and sensory uncertainty, as well as the flexibility to deal with novel environments and scenarios. Animals solve these problems by having flexible brains capable of unsupervised pattern detection and learning. Behavioural biologists and neuroscientists are increasingly realising that ‘small’-brained animals such as insects have extremely rich behavioural repertoires. The honeybee, an extremely well-studied animal with a brain of only 1 million neurons, exhibits sophisticated learning and navigation abilities through highly efficient neural processes. Bees can reliably navigate over several kilometres in 3-dimensional space, learning the features that will enable them to return to their nest.

They can optimise the distances travelled on routes from the nest site to multiple forage patches, almost certainly without possessing a mental map. Furthermore, bees’ brains can multi-task, are highly adaptable to completely novel scenarios, and exhibit extremely rapid learning. This is in marked contrast to typical control engineering solutions and AI, including deep learning. Bee brains thus provide an excellent autonomous system to reverse engineer, far more sophisticated in navigation and learning abilities than those of Drosophila and other flies. Yet their brains are still of a size at which systematic investigation and modelling remain practical, in contrast to the much larger vertebrate brains of rats, cats, and primates.

[Figure: Insect Visual Compass in the Central Complex]

The project will fuse computational and experimental neuroscience to develop a new class of highly efficient ‘brain on board’ robot controllers, able to exhibit adaptive behaviour while running on powerful yet lightweight general-purpose graphics processing unit (GPGPU) hardware. This will be demonstrated via autonomous and adaptive control of a flying robot, using an on-board computational simulation of the bee’s neural circuits: an unprecedented achievement in robotics technology.
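
To give a concrete flavour of the kind of computation such a controller performs, here is a minimal sketch (not the project’s actual code) of a leaky integrate-and-fire neuron population stepped forward in time with NumPy; the population size, time constants and input statistics are arbitrary illustrative values:

import numpy as np

# Minimal leaky integrate-and-fire (LIF) population, Euler-stepped.
# All constants are arbitrary illustrative values, not project parameters.
N = 1000            # number of neurons
dt = 1.0            # time step (ms)
tau_m = 20.0        # membrane time constant (ms)
v_rest = -65.0      # resting potential (mV)
v_thresh = -50.0    # spike threshold (mV)
v_reset = -65.0     # reset potential (mV)

v = np.full(N, v_rest)
rng = np.random.default_rng(42)

for step in range(1000):
    i_in = rng.normal(1.0, 0.5, N) * 20.0    # noisy input current (arbitrary units)
    v += dt / tau_m * (v_rest - v + i_in)    # leaky integration of the membrane potential
    spiked = v >= v_thresh                   # threshold crossing marks a spike
    v[spiked] = v_reset                      # reset neurons that spiked

Updating every neuron with a single vectorised expression is the same data-parallel pattern that maps naturally onto GPGPU hardware, which is what makes on-board, real-time simulation of large neural circuits feasible.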

Brains on Board (BoB) and the Human Brain Project (HBP) complement each other in many ways. BoB is focused on insect brains and applications in autonomous robotics using physical robots, while HBP is working on mouse and human brain data and a virtual neurorobotics platform. Closer to the core of this partnership, BoB uses GPU accelerator technology for neuromorphic simulation of brain models, while HBP is developing the BrainScaleS and SpiNNaker neuromorphic computing platforms. This creates synergies through the re-use of brain models across platforms and a widening of the applications for the HBP neuromorphic platforms.

The essence of this partnership is that our work in BoB will produce brain models that are suitable as neuromorphic control algorithms for robots; within the partnership, we plan to port these models to the HBP neuromorphic platforms. This will create showcases for using the platforms in applied brain simulations and will also diversify the BoB work beyond its GPU technology focus. As a first practical contribution, Dr. James Knight (BoB) will work with Dr. Sebastian Hoeppner’s group at TU Dresden (HBP SP9) to port an existing brain model to the Santos (SpiNNaker II prototype) chip.
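
To give a flavour of what such a port involves, the sketch below uses the PyNN API, the common interface to the HBP neuromorphic platforms; the toy network, its parameters and the choice of the sPyNNaker backend are illustrative assumptions, and the toolchain for the SpiNNaker II prototype may differ:

import pyNN.spiNNaker as sim   # sPyNNaker backend; swap in e.g. pyNN.nest to run on a conventional simulator

sim.setup(timestep=1.0)

# Illustrative toy network: Poisson spike sources driving a population of LIF neurons.
stim = sim.Population(100, sim.SpikeSourcePoisson(rate=10.0), label="input")
neurons = sim.Population(100, sim.IF_curr_exp(tau_m=20.0, v_thresh=-50.0), label="output")

sim.Projection(stim, neurons,
               sim.OneToOneConnector(),
               synapse_type=sim.StaticSynapse(weight=2.0, delay=1.0))

neurons.record("spikes")
sim.run(1000.0)                       # simulate 1 s of biological time

spikes = neurons.get_data("spikes")   # retrieve recorded spike trains (Neo format)
sim.end()

Because the same PyNN description can be pointed at different backends, a model developed and tested on a conventional simulator can, in principle, be moved to neuromorphic hardware with minimal changes.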