New computing and storage systems installed at a major European supercomputing centre have been designed to meet the needs of neuroscience and other scientific fields. One major benefit is highly interactive, high-performance computing capable of extreme-scale calculations.
A consortium of five major European supercomputing centres aims to deliver cloud supercomputing services suited to scientific computing and data science. The centres are aligning their services to create an e-infrastructure called the Fenix Infrastructure, combining well-integrated data repositories with scalable supercomputing systems. As part of the European Human Brain Project (HBP), the EU-funded ICEI project is responsible for realising the e-infrastructure’s initial version.
As part of this effort, ICEI project partner the French Alternative Energies and Atomic Energy Commission has deployed new resources and services at its Very Large Computing Centre (TGCC) – one of the five supercomputing centres mentioned above. These include standard high-performance computing (HPC) resources as well as new systems for sovereign cloud computing, object storage and interactive computing. The new systems installed at the TGCC are geared to the particular needs of neuroscience and other scientific fields requiring highly interactive HPC services capable of extreme-scale calculations.
Four new systems
According to a press release posted on the project website, there are four new TGCC systems. The first is an interactive computing cluster for visualisation, AI, post-processing and other interactive workloads. “This cluster is equipped with the last generation of Intel processors, high performance Nvidia GPUs, and up to 3 Tera-bytes of memory per server. This system is particularly suited to the simulation of, and interacting with, large neural structures,” the press release states.
The second system is a cloud infrastructure through which researchers can develop web services, shared knowledge bases, open data platforms and other related community services. As reported in the press release, “[r]elying on the OpenStack open-source software suite, it [the infrastructure] can run up to 600 virtual servers.”
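A cloud built on OpenStack is typically reached through the standard OpenStack tooling. As an illustration only – the endpoint, project and user names below are hypothetical placeholders, not TGCC’s actual values – a researcher’s `clouds.yaml` configuration for the `openstack` command-line client might look like this:

```yaml
# Hypothetical clouds.yaml for the OpenStack CLI -- every value here is a
# placeholder, not a real TGCC/Fenix endpoint or credential.
clouds:
  fenix-demo:
    auth:
      auth_url: https://cloud.example.org:5000/v3   # Keystone identity endpoint
      username: researcher
      password: "<secret>"
      project_name: hbp-demo
      user_domain_name: Default
      project_domain_name: Default
    region_name: RegionOne
```

With such a file in place, a command like `openstack --os-cloud fenix-demo server create` would launch one of the virtual servers of the kind the press release describes.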
The third is a full-flash Lustre parallel file system of almost 1 petabyte running on state-of-the-art DataDirect Networks controllers (DDN SSF 18KXe). This high-performance storage system supports data-intensive workloads.
The fourth system is “[a]n object store of 7 Petabytes running the OpenIO open-source storage software.” As the press release describes, the “system allows users to safely archive their data, and to share them on the Internet with other members of their research community.”
Deployed as part of the ICEI (Interactive Computing E-Infrastructure for the Human Brain Project) project, the new systems will interoperate with the systems of the other four supercomputing centres – in Germany, Italy, Spain and Switzerland – that make up the Fenix Infrastructure. The first scientists with the opportunity to use this e-infrastructure are researchers from, or associated with, the HBP. One of these users, NeuroMod Institute Deputy Director Alexandre Muzy of the Université Côte d’Azur, had this to say about the new services: “Thanks to Fenix I was able to benefit from very powerful computational resources to achieve high-performance simulations and also to interact with world class researchers to improve the relevance of my mathematical modeling approach.”
Read the original article on the CORDIS website