• Feature

Developing robots with brain-derived skills

13 July 2023


The HBP is using neuro-derived technologies to make machines smarter. This not only advances the field of robotics but also helps neuroscientists to better understand how the brain works.

Scuttling across the ground, the robot moves around, nose first, tendrils constantly waving back and forth to palpate the air. Researchers at the University of Amsterdam and the University of the West of England took inspiration from the rat when building it, and it is not hard to see the similarities in both appearance and behaviour. Named WhiskEye, this robot explores the world around it through two small camera eyes and a very large mechanical nose surrounded by 24 artificial whiskers arranged in a circle, much like a rat does.

Trained with new computational models developed using the EBRAINS Neurorobotics Platform, WhiskEye now resembles a rat “in mind” as well as in appearance. The models take inspiration from biological brains, which operate using impulses of electricity (spikes) instead of a continuous flux of information (Pearson et al. 2021). By implementing a more realistic, brain-like architecture, the neuronal model underlying WhiskEye’s behaviour is now able to perform object reconstruction – an easy task for us, but one that is considerably harder for robots.
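To give a flavour of what “spiking” means in practice, the sketch below implements a leaky integrate-and-fire neuron, the standard textbook spiking unit: it communicates only through discrete spike times rather than a continuous output. This is a minimal illustration of the principle, not the model behind WhiskEye, and all parameter values are arbitrary choices.

```python
import numpy as np

def lif_neuron(input_current, dt=1e-3, tau=0.02, v_rest=-65.0,
               v_reset=-65.0, v_thresh=-50.0, r_m=10.0):
    """Simulate a single leaky integrate-and-fire neuron.

    The neuron's output is not a continuous value but a list of
    spike times: information leaves it only as discrete impulses.
    """
    v = v_rest
    spike_times = []
    for t, i_in in enumerate(input_current):
        # Membrane potential leaks toward rest while integrating input.
        v += dt / tau * (v_rest - v + r_m * i_in)
        if v >= v_thresh:              # threshold crossed: emit a spike
            spike_times.append(t * dt)
            v = v_reset                # reset after firing
    return spike_times

# A constant drive strong enough to cross threshold produces
# a regular spike train, the "impulses of electricity" above.
print(lif_neuron(np.full(1000, 2.0)))
```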

The WhiskEye robot.

Some problems that humans and animals would consider trivial – such as recognising that a tree is the same tree when viewed from a different angle – remain big hurdles for artificial minds. Biological brains have been refined by millions of years of natural selection to interact with the world around them; standard computers, while powerful, were never meant to act as brains controlling behaviour in an environment, and are, in general, simply wired differently.

Robots with a traditional computational architecture still struggle with object manipulation, naturalistic movement and other tasks that would be intuitive for us – a testament to the intrinsic divide between computers and brains. This is why the intersection of neuroscience and robotics is considered so promising by experts in both fields: neuro-inspired technology that mimics the information flow of a biological brain through spiking neural networks could solve such tasks more effectively while improving efficiency at the same time.

Robotics as a tool to study brain function

While neuro-inspired technologies mimic the way the brain handles information processing, neuro-derived ones resemble the physical construction of brain architectures and connections between areas. And this is not only a new way of thinking for roboticists: the approach provides brain researchers with a way of testing how their neuroscientific models perform within a body – embodied cognition instead of “brain in a jar”. Advanced brain models can provide us with a lot of information but often exist in isolation from the physical world, in which our brains are immersed through perception and through our bodies. Giving embodiment to a digital brain brings us even closer to how brains operate in their environment, opening up new capabilities for interaction.

Digital brains differ not just in concept and function, but in design and wiring, too. HBP researchers, led by a team from Maastricht University, use the HBP’s atlas of the human brain on EBRAINS as the basis for deriving the architecture of their robotics platform. This is an example of how an AI-based deep reinforcement learning system can benefit from brain knowledge: the architecture and artificial connections between areas are modelled after a map of the actual areas and neuronal connections in the brain. The resulting brain-inspired cognitive architecture is used to perform robotics learning experiments in a simulated virtual environment, in this case with a virtual copy of the Shadowhand robot, an advanced robotic hand created by the London-based company Shadow Robot. Here, the brain-derived network is trained to learn dexterous manipulation of objects, another task that we humans find very easy, but which is still very complicated for robots. By training the brain-inspired architecture to control the robotic hand, the HBP researchers hope to shed light on how the human brain coordinates complex hand movements. This is possible because the researchers can inspect every detail of the simulated brain, such as the weights of the connections between simulated neurons after it has successfully learned to perform dexterous in-hand object manipulation.
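The wiring principle can be sketched in a few lines: a region-to-region connectivity matrix of the kind a brain atlas provides is used as a mask, so that the artificial network only carries weights where the biological brain actually has connections. The region labels and the connectivity matrix below are hypothetical placeholders for illustration, not the Maastricht team’s actual architecture.

```python
import numpy as np

# Hypothetical region-level connectome: conn[i, j] = 1 means the atlas
# reports a projection from region i to region j (labels illustrative).
regions = ["V1", "PPC", "PMC", "M1"]
conn = np.array([[0, 1, 0, 0],                # V1  -> PPC
                 [0, 0, 1, 0],                # PPC -> PMC
                 [0, 0, 0, 1],                # PMC -> M1
                 [0, 0, 0, 0]])

rng = np.random.default_rng(0)
# Masking zeroes every weight that has no anatomical counterpart.
weights = rng.normal(size=conn.shape) * conn

def step(activity):
    """Propagate activity one step through the brain-derived graph."""
    return np.tanh(activity @ weights)

visual_input = np.array([1.0, 0.0, 0.0, 0.0])
# Three steps carry activity from the "visual" to the "motor" region.
print(step(step(step(visual_input))))
```

During learning, weight updates would be multiplied by the same mask, so connections absent from the atlas stay at zero while the anatomically plausible ones are tuned by experience.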

Improving robots by mimicking the brain

Another team of HBP researchers, based at the University of Granada, has also achieved precise movements and coordination through neuro-derived AI systems. They linked a detailed artificial neural network that mimics the cerebellum (one of the evolutionarily older parts of the brain, which plays an important role in motor coordination) to a robotic arm (Abadía et al. 2021). Their system learned to perform precise movements and interact with humans in different circumstances, surpassing the performance of previous AI-based robotic control systems while also coping with unpredictable natural time delays. Researchers at the University of Pavia, who specialise in cerebellum modelling, are now experimenting with inserting digital mimics of the cerebellum into robotic controllers on the EBRAINS platform (Antonietti et al. 2022).
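A rough sense of how a cerebellum-style controller can learn despite delayed feedback is given by the toy sketch below: an adaptive feedforward element updates its weights with an LMS-like rule driven by a delayed error signal, loosely analogous to the cerebellum’s climbing-fibre teaching input. The plant dynamics, delay length and learning rate are all assumptions for illustration, not the Granada team’s published model.

```python
import numpy as np
from collections import deque

DELAY = 5                      # feedback delay in control steps (assumed)
w = np.zeros(3)                # adaptive "cerebellar" weights
sensed = deque([0.0] * DELAY, maxlen=DELAY)  # delayed sensory channel
pos = 0.0                      # toy one-joint arm position
lr = 0.01                      # learning rate

for t in range(2000):
    target = np.sin(0.02 * t)                 # desired trajectory
    basis = np.array([target, pos, 1.0])      # simple mossy-fibre-like inputs
    u = w @ basis                             # feedforward motor command
    pos += 0.1 * (u - pos)                    # first-order plant dynamics
    sensed.append(pos)
    error = target - sensed[0]                # error arrives DELAY steps late
    w += lr * error * basis                   # LMS-style weight update

print(f"final tracking error: {abs(target - pos):.4f}")
```

Because the correction is learned as a feedforward mapping rather than computed reactively, the controller keeps tracking the target even though its error signal is always several steps stale.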

Robot controlled by artificial cerebellum.

It’s not just about building machines that work and learn better on their own: human-robot interaction is expected to increase in the coming years, and with it the need for robots that can collaborate safely with us – a field of study also known as cobotics. HBP researchers address these issues directly using neuromorphic technology. In a collaborative initiative overseen by the Cognitive Neuroscience department at Maastricht University, HBP cobotics simulations in which robots learn to interact with humans more safely are being developed on EBRAINS.

Safer human-robot interaction

Imagine being a worker sharing a factory floor with a robotic arm – how does the robot know where you are? If you disappear momentarily behind a group of boxes, does it still know you are there? Would you trust the robot to hand you a delicate piece of equipment or a sharp tool without inadvertently hurting you? 

Traditional robotics might tackle these issues with brute force, through thousands of hours of deep learning, or by simply slowing the robot down at the cost of productivity. By using brain-derived neural networks instead, the HBP cobotics system is able to mimic the way our brains handle visual occlusion, maintaining a temporal dimension and object permanence. It also simulates the bone and muscle structure of the human collaborator, modelling the musculoskeletal dynamics and the relevant motor circuitry. In this way, it knows when to stop pushing or pulling and when to let go of objects, interacting with you without accidentally spraining a ligament.
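One simple way to keep a “temporal dimension” over an occluded co-worker is to let a tracker coast on its last motion estimate whenever measurements disappear. The sketch below uses a basic constant-velocity filter as a stand-in for this idea; the HBP system’s brain-derived mechanism is far more sophisticated, and the gains and numbers here are assumptions for illustration.

```python
def track(measurements, dt=0.1):
    """Constant-velocity tracker that coasts through occlusion.

    `measurements` holds observed 1-D positions of the person, with
    None while they are hidden; the estimate keeps advancing, so the
    robot still "knows" roughly where they are behind the boxes.
    """
    pos, vel = 0.0, 0.0
    estimates = []
    for z in measurements:
        pos += vel * dt                    # predict forward in time
        if z is not None:                  # person visible: correct estimate
            vel += 0.5 * (z - pos) / dt    # blend innovation into velocity
            pos += 0.8 * (z - pos)         # blend innovation into position
        estimates.append(pos)              # during occlusion: prediction only
    return estimates

# Worker walks at roughly 1 m/s, disappearing behind boxes for steps 6-9.
obs = [0.1, 0.2, 0.3, 0.4, 0.5, None, None, None, None, 1.0]
print([round(e, 2) for e in track(obs)])
```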

Despite their current limitations, robots will become a larger part of our lives in the coming years, and the way research and industry choose to tackle the main questions and issues of robotics will determine just how much smarter, safer and more efficient those robots will be. Neuromorphic and neuro-derived robotic cognitive architectures might just be the way to go.

This text was first published in the booklet ‘Human Brain Project – A closer look at scientific advances’, which includes feature articles, interviews with leading researchers and spotlights on latest research and innovation. Read the full booklet here.

References

Abadía I, Naveros F, Ros E, Carrillo RR, Luque NR (2021). A cerebellar-based solution to the nondeterministic time delay problem in robotic control. Sci. Robot. 6(58):eabf2756. doi: 10.1126/scirobotics.abf2756

Antonietti A, Geminiani A, Negri E, D'Angelo E, Casellato C, Pedrocchi A (2022). Brain-Inspired Spiking Neural Network Controller for a Neurorobotic Whisker System. Front. Neurorobot. 16:817948. doi: 10.3389/fnbot.2022.817948

Pearson MJ, Dora S, Struckmeier O, Knowles TC, Mitchinson B, Tiwari K, Kyrki V, Bohte S, Pennartz CMA (2021). Multimodal Representation Learning for Place Recognition Using Deep Hebbian Predictive Coding. Front. Robot. AI 8:732023. doi: 10.3389/frobt.2021.732023