The project RobotBodySchema (or "Robot self-calibration and safe physical human-robot interaction inspired by body representations in primate brains") aims to study how the brain represents the body. These mechanisms will be modelled on robots, and the resulting brain-inspired algorithms will then be applied to make robots more autonomous and safe.
Robots often rely on preprogrammed models and lack the capacity to adapt to unexpected changes in their bodies or environments. Furthermore, they often execute movements blindly, with no perception of contact. Humans, on the other hand, seamlessly control their complex bodies, adapt to growth or failures, and use tools. Exploiting multimodal sensory information plays a key part in these processes. However, the mechanisms by which the brain represents the body and the space around it are not fully understood.
The aim of this project is to use a humanoid robot equipped with a whole-body artificial skin array to, first, develop embodied computational models of the development and operation of body representations. Second, we target two application areas where a robot with sensitive skin provides a key enabling technology: (i) automatic self-calibration, and (ii) safe and intelligent physical human-robot interaction through whole-body "awareness" derived from visuo-tactile information.
Collaboration with HBP
The project is currently using the Neurorobotics Platform – in particular the iCub humanoid robot simulation connected to spiking neural networks – to test the learning of peripersonal space representations from visual and tactile inputs. CTU Prague has humanoid robots with electronic skin arrays at its disposal, so in a second step the work can be transferred to a real setting. More generally, this is in line with the Neurorobotics subproject (SP10), which facilitates the embedding of brain simulation into a closed sensorimotor loop – through instantiation in a humanoid robot with morphology and sensorimotor capacities similar to those of humans. In the long run, the RobotBodySchema project will also seek access to first-hand neurophysiological data related to the "body in the brain".
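The peripersonal space learning mentioned above can be framed as a visuo-tactile prediction task: from visual information about a nearby object, the network learns to predict whether it will touch the skin. The following is a minimal illustrative sketch of that idea only – it uses a plain logistic regressor on synthetic distance/velocity data, not the project's spiking networks or the iCub simulation, and all variable names and numeric values are assumptions chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "visual" input: distance of an object to the skin (m) and
# its approach velocity (m/s). The "tactile" label is 1 if the object
# would reach the skin within a short time horizon.
N = 2000
distance = rng.uniform(0.0, 0.5, N)
velocity = rng.uniform(0.0, 1.0, N)
horizon = 0.3  # prediction horizon in seconds (illustrative)
touch = (distance < velocity * horizon).astype(float)

# Learn to predict contact probability from the visual features
# with logistic regression trained by gradient descent.
X = np.column_stack([distance, velocity, np.ones(N)])
w = np.zeros(3)
lr = 0.5
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ w))       # predicted contact probability
    w -= lr * X.T @ (p - touch) / N        # gradient of the logistic loss

def contact_prob(d, v):
    """Predicted probability that an object at distance d,
    approaching at velocity v, will touch the skin."""
    return 1.0 / (1.0 + np.exp(-(w[0] * d + w[1] * v + w[2])))
```

After training, a near, fast-approaching object yields a higher predicted contact probability than a far, slow one – the qualitative behaviour expected of a peripersonal space representation that expands with stimulus speed.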
Matej Hoffmann (Project Coordinator)
Matej Hoffmann received the MSc degree in Computer Science (Artificial Intelligence) from the Faculty of Mathematics and Physics, Charles University in Prague, Czech Republic, in 2006. Between 2006 and 2013 he completed his PhD and then served as Senior Research Associate at the Artificial Intelligence Laboratory, University of Zurich, Switzerland (Prof. Rolf Pfeifer). In May 2013 he joined the iCub Facility of the Italian Institute of Technology (Prof. Giorgio Metta), supported by a Marie Curie Intra-European Fellowship, iCub Body Schema (2014-2016). In January 2017 he joined the Department of Cybernetics, Faculty of Electrical Engineering, Czech Technical University in Prague, where he leads a group focused on cognitive, neuro-, collaborative, and humanoid robotics. His research interests include synthetic approaches to embodied cognition in general and body and peripersonal space representations in particular.
Hoffmann, M.; Straka, Z.; Farkas, I.; Vavrecka, M. & Metta, G. (2018), 'Robotic homunculus: Learning of artificial skin representation in a humanoid robot motivated by primary somatosensory cortex', IEEE Transactions on Cognitive and Developmental Systems 10(2), 163-176.
Hoffmann, M. & Pfeifer, R. (2018), 'Robots as powerful allies for the study of embodied cognition from the bottom up', in A. Newen, L. de Bruin & S. Gallagher, eds, The Oxford Handbook of 4E Cognition, Oxford University Press, pp. 841-862.
Hoffmann, M.; Chinn, L. K.; Somogyi, E.; Heed, T.; Fagard, J.; Lockman, J. J. & O'Regan, J. K. (2017), 'Development of reaching to the body in early infancy: From experiments to robotic models', in Joint IEEE International Conference on Development and Learning and Epigenetic Robotics (ICDL-EpiRob), pp. 112-119.
Straka, Z. & Hoffmann, M. (2017), 'Learning a peripersonal space representation as a visuo-tactile prediction task', in A. Lintas, S. Rovetta, P. F. M. J. Verschure & A. E. P. Villa, eds, Artificial Neural Networks and Machine Learning – ICANN 2017, Springer International Publishing, Cham, pp. 101-109. ENNS Best Paper Award.
Time frame: 2017 to 2023
Origin: Spontaneous Application
Funding: Czech Science Foundation
Project website: RobotBodySchema
The little android with a sense of touch
The Neurorobotics Platform will help us to connect our brain model of the peripersonal space to the humanoid body