To maximise the vast potential of exascale computers, two major obstacles will have to be overcome: software complexity and data volume.
This is one of the findings of a new semi-automated literature study on the state of exascale computing. The study, which we carried out with a team of research engineers at the Netherlands eScience Center using reusable software developed specifically for this purpose, was recently published in the journal ACM Computing Surveys.
The next generation of supercomputers will soon break the exascale barrier. These supercomputers will be capable of performing at least one quintillion (a billion billion) floating-point operations per second (10^18 FLOPS) and are expected to accelerate advancements in many scientific disciplines. At the moment, the race towards these exascale systems is reaching its conclusion, with the United States, the European Union and China all planning to build and launch their own exascale supercomputers within the next five years.
The state of current research
Mirroring the rapid rate of development in exascale computing, much has been written over the past decade about these systems: their tremendous computing power, the technical challenges to building and programming them, as well as possible solutions to overcoming these.
In their paper, the researchers provide an overview of the current body of knowledge on exascale computing and offer insights into the most important trends and research opportunities within this field. To do so, they use a three-stage approach: they discuss various exascale landmark studies, use data-driven techniques to analyse the large collection of related literature, and examine eight research areas in depth based on influential articles.
‘The quantitative analysis was done with open-source software developed by the eScience Center’, says Stijn Heldens, PhD candidate and lead author of the paper. ‘We formulated a search query and collected all the exascale-related papers we could find. We then used data mining and natural language processing techniques to automatically process all the material. In addition, we selected the most prominent articles and studied these to determine the most important trends that will help us to reach exascale.’
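To give a flavour of the kind of data-driven processing Heldens describes, the sketch below counts keyword frequencies across a toy corpus of abstracts. This is purely illustrative, using the Python standard library; it is not the authors' actual pipeline, and the abstracts are invented stand-ins, not data from the study.

```python
import re
from collections import Counter

# Toy corpus standing in for the abstracts of exascale-related papers.
# (Illustrative text only, not data from the actual study.)
abstracts = [
    "Energy efficiency remains a key barrier on the road to exascale systems.",
    "Fault tolerance via checkpoint-restart protocols for exascale computing.",
    "Programming models that reduce software complexity for exascale applications.",
]

# A tiny stopword list; a real pipeline would use a much larger one.
STOPWORDS = {"the", "to", "on", "of", "for", "that", "via", "remains"}

def keyword_counts(texts):
    """Lowercase, tokenise, drop stopwords, and count term frequencies."""
    counts = Counter()
    for text in texts:
        tokens = re.findall(r"[a-z][a-z-]+", text.lower())
        counts.update(t for t in tokens if t not in STOPWORDS)
    return counts

top = keyword_counts(abstracts).most_common(3)
print(top)  # 'exascale' appears in every abstract, so it ranks first
```

A real analysis at the scale of the study would add stemming, phrase detection and topic modelling on top of this kind of term counting, but the basic shape — collect, tokenise, aggregate — is the same.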
Progress and challenges
The team’s research shows that great progress has been made in tackling two of the major exascale barriers: energy efficiency and fault tolerance. ‘Energy efficiency has improved dramatically, by roughly 35x over the past decade, meaning we are nearing the point where an exascale system would be feasible in terms of energy consumption’, says Heldens. ‘Fault tolerance is another topic that has received substantial attention, with major developments in checkpoint-restart protocols, data corruption detection and fault understanding.’
Nevertheless, Heldens and his fellow researchers also foresee these two barriers slowly being overshadowed by two other challenges: software complexity and data volume. ‘With respect to software complexity, we see a clear lack of suitable programming models that simplify the development of scalable scientific applications. So even though we’ll soon have all this powerful hardware, the research community might not be able to fully harness it for lack of suitable software. Added to this is the problem of data volume, where we see a growing gap between the computing power of processors and data bandwidth. While we expect that computation will become cheaper over time, data movement (i.e. bytes per second) might not grow at the same pace and will thus become relatively more expensive.’ According to the authors, novel hardware technology promises solutions to the data movement challenge, but these conflict with the software complexity challenge, since they introduce additional complications when programming such systems.
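The gap between compute and data movement can be made concrete with a back-of-the-envelope "machine balance" calculation. The figures below are hypothetical, chosen only to illustrate the arithmetic, not taken from the study or any specific system:

```python
# Hypothetical figures to illustrate the compute/bandwidth gap.
peak_flops = 10e12      # 10 TFLOP/s of peak compute for one node
memory_bw = 1e12        # 1 TB/s of memory bandwidth for the same node

# Machine balance: bytes the memory system can deliver per floating-point op.
balance = memory_bw / peak_flops
print(f"{balance:.2f} bytes/flop")  # 0.10 bytes/flop

# A kernel that streams two 8-byte operands per flop needs 16 bytes/flop,
# so bandwidth limits it to a small fraction of peak compute:
required = 16.0
utilisation = balance / required
print(f"{utilisation:.1%} of peak")  # 0.6% of peak
```

If compute keeps getting cheaper while bandwidth lags, the balance figure shrinks further, which is exactly the trend the authors flag: more applications end up limited by data movement rather than by raw processing power.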
Heldens: ‘Exascale computing promises to radically open up new avenues for research and development. It is an enthralling prospect. But to harness its awesome potential, we’ll need to address these issues as soon as possible. I hope our paper provides the shot in the arm to start this process.’
Stijn Heldens, Pieter Hijma, Ben Van Werkhoven, Jason Maassen, Adam Belloum and Rob van Nieuwpoort, ‘The Landscape of Exascale Research: A Data-Driven Literature Analysis’ in ACM Computing Surveys (March 2020). DOI: https://doi.org/10.1145/3372390
* This paper was made possible by funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 777533 (PROCESS) and No 823988 (ESiWACE2), and the Netherlands eScience Center under file number 027.016.G06 (A methodology for many-core programming).