“We need to rethink the way we build computers”

    06 December 2021


    An interview with Giacomo Indiveri on how developers take inspiration from the human brain to make computers more energy-efficient and what the future of computing will look like

    Giacomo Indiveri is Professor of Neuroinformatics at ETH Zurich, where he studies both real and artificial neural processing systems and builds hardware that mimics the way our brain works. During his keynote at the HBP Summit, he emphasized that we need to fundamentally change the way we build computers. Otherwise, he warned, computing will consume 20 percent of the world’s electricity by 2025.

    In your keynote, you spoke about the necessity of a paradigm shift in computing. What needs to change?

    Indiveri: In order to do the same amount of computation with less power, we need to rethink the way that computing architectures are designed. Computing itself is not particularly energy-consuming, but standard computers use a lot of energy to transfer data from memory to the central processing unit, the CPU. That is because in conventional computing architectures, the central processing unit sits on one side and the data storage on the other, far away from where the computation actually happens.

    To put into perspective how far the information has to travel, consider this analogy, originally proposed by my colleague Piotr Dudek from the University of Manchester: Computing is based on flipping tiny bits – turning a bit from zero to one or the other way around. Imagine you were the CPU and the bits were about the size of an A4 page. After flipping a bit, you would need to put it back into memory, and the closest place for this, the short-term cache memory, is a few thousand times the length of a bit away. You would have to move quite a distance, investing considerable energy, just to put the bit back. If the cache is full, you will have to resort to main memory, which is even further away – that would be equivalent to going outside the building to store the bit. And if you actually needed to use long-term storage like a hard disk, that would be equivalent to going all the way to Jupiter – using huge amounts of energy.
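    To make this point concrete, here is a small back-of-the-envelope sketch in Python (an editorial illustration, not part of the interview). The relative costs are rough order-of-magnitude assumptions, chosen only to show how quickly data movement dominates once operands have to be fetched from off-chip memory:

        # Back-of-the-envelope sketch: why moving data dominates the energy bill.
        # The costs below are rough, relative orders of magnitude, not measurements.
        COST = {                       # energy per 32-bit operation, arbitrary units
            "add in the ALU":     1,
            "read from cache":   10,   # assumed roughly 10x an add
            "read from DRAM":  1000,   # assumed roughly 1000x an add
        }

        n = 1_000_000                  # toy workload: sum a million values held in DRAM
        compute_energy  = n * COST["add in the ALU"]
        movement_energy = n * COST["read from DRAM"]
        total = compute_energy + movement_energy

        print(f"compute:       {compute_energy / total:6.1%} of the energy")
        print(f"data movement: {movement_energy / total:6.1%} of the energy")
        # With these assumed ratios, about 99.9% of the energy goes into fetching
        # the data rather than into the additions themselves, which is why
        # co-locating memory and computation pays off.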

    How does the functioning of our brain compare to the way standard computers operate?

    Indiveri: Biological nervous systems operate in a completely different way. What happens at synapses is a lot more complicated than a simple operation of flipping a bit. We are talking about very complex computations. And the results are stored in the same location where the computation is carried out. In other words, computation and memory are co-localised. This makes neurons much more energy-efficient than standard computers.

    In the Human Brain Project, scientists are developing neuromorphic technologies that mimic some of these energy-saving characteristics of the brain. How has the project advanced the field?

    Indiveri: The HBP really stands out, because its neuromorphic architectures BrainScaleS in Heidelberg and SpiNNaker in Manchester have been scaled up to very large systems that allow neuroscientists to simulate millions of neurons in real time. Both systems have delivered really impressive technology demonstrations. They can be openly accessed by researchers through the HBP’s digital infrastructure EBRAINS.

    Also, by funding the HBP, the EU has broadly raised awareness of the importance of studying the brain to advance the technology sector. The HBP was the first initiative of this kind and had a very big influence in this direction. It has inspired the launch of similar projects like the BRAIN Initiative in the US and equivalents in other continents. 

    What do you see as the greatest achievements of the HBP besides brain-inspired technologies?

    Indiveri: To me, the brain mapping work is particularly impressive. In addition, I personally see a lot of potential in the HBP’s endeavour of federating the data banks and databases across European hospitals. This requires a huge effort: it involves not only technology and researchers but also agreements between hospitals and governments. There has already been a lot of progress, particularly in developing the required bioinformatics. This achievement of the HBP could have a massive impact on society.

    It is also a huge accomplishment how the HBP has linked different research communities from neuroscience and technology that would not have worked together otherwise. It has acted as a glue between them. And now, EBRAINS will provide the infrastructure and opportunities for these communities to carry the progress made in one subdomain over to the others.

    Like taking insights from brain research to advance technological developments?

    Indiveri: Exactly. We can learn a lot from the brain. Besides the co-localisation of memory and computation, there is another aspect that reduces energy consumption: even though it may look like brains are general-purpose, they are in fact very specialised. We cannot programme brains to do just any task, like predicting the weather. They have been evolutionarily optimised for specific tasks, which saves a lot of energy. To apply this concept to technology, we will need to let go of the idea of general-purpose computers that can do anything from Excel spreadsheets to AI. We will instead have to build different types of very specialised computers in order to reduce energy consumption.

    During your keynote at the HBP Summit, you mentioned computational time as a factor in this specialization.

    Indiveri: The solution that nature has found to save energy is to couple the time scales of computing with the time scales of the signals that need to be processed. Our brains have been tuned to only process a small set of signals that is of interest for our survival. We can process signals at the time scales of speech or gestures, but we cannot, for example, perceive glaciers shifting, because it happens so slowly. 

    In order to minimize power consumption, we need to build computing elements with processing time scales that match the signals we want them to process: if we want them to track racing cars, we have to build computers that are fast at recognising signals, and if a device is supposed to determine whether the fish in our refrigerator has gone bad, it needs to perceive odours changing over hours or maybe even days. If our processors are too fast and the signals very slow, we would need to store the signals, bringing us back to the storage and data-movement issue.
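    As a concrete (editorial) illustration of this time-scale matching, here is a minimal Python sketch, not taken from the interview and using made-up parameters: a leaky integrator, a basic building block of neuromorphic circuits, whose leak time constant determines which signal time scales it can follow.

        # Minimal sketch: a leaky integrator only "sees" signals whose time
        # scale matches its leak time constant tau. Parameters are illustrative.
        def leaky_integrator(signal, dt, tau):
            """Discretised dx/dt = (-x + u) / tau, returned sample by sample."""
            x, out = 0.0, []
            for u in signal:
                x += dt * (-x + u) / tau
                out.append(x)
            return out

        dt = 0.001                                   # 1 ms simulation step
        # A brief event: a 10 ms pulse, e.g. a flash of activity in a sensor.
        pulse = [1.0 if 0.100 <= k * dt < 0.110 else 0.0 for k in range(1000)]

        for tau in (0.02, 5.0):                      # 20 ms vs 5 s time constant
            peak = max(leaky_integrator(pulse, dt, tau))
            print(f"tau = {tau:>5} s -> peak response to a 10 ms pulse: {peak:.3f}")

        # The 20 ms integrator responds strongly (peak ~0.4); the 5 s one barely
        # notices the pulse (peak ~0.002). To catch fast events with the slow
        # device you would first have to buffer the signal, which is exactly
        # the storage and data-movement problem described above.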

    Is this what the future of computing will look like? More and very specialised devices?

    Indiveri: Yes. We are already experiencing this. Within the Human Brain Project, many developments in robotics and in neuromorphic computing are going in this direction. And just have a look at all the start-ups and established companies that are building so-called edge-computing devices. These are devices that directly process the data they collect without having to transfer it to a central server. Computing is carried out locally at the place of application or at the ‘edge’ of the network. Examples would be wearables that we put on our wrists. We usually don’t think of them as computers but rather just as appliances that count our steps. Another example would be small devices that detect if weight is placed on a chair. They are already sort of intelligent elements that have very specialised sensors and computing structures.

    This is the trend – there will be a broad range of “computers” with different types of dynamics, input sensors and output representations, and this is quite a dramatic change from what we have been seeing up to now. Companies that are used to building a single processor for all sorts of different applications will have to adapt and start building many types of processors for different applications. This is also an opportunity for start-ups, because there are a lot of niche application areas where the big companies do not have products yet.

    The interview was conducted by Lisa Vincenz-Donnelly.