Speaker
Mr Shaun de Witt (Culham Centre for Fusion Energy)
Description
As data volumes in the science domain grow rapidly, the ability to process these data efficiently is of increasing interest. While many applications can process very large volumes efficiently with map/reduce algorithms (e.g., using frameworks such as Hadoop), this approach does not cover a large class of problems that are best run in an HPC environment. This class of problems calls for a new paradigm commonly referred to as Big Data and Extreme Computing (BDEC).
This talk explains why current HPC systems are not fully suited to BDEC-class problems, and how upcoming technologies such as advanced non-volatile memories could help to resolve some of these issues. We introduce a novel architecture that aims to address this problem, currently being developed within the SAGE project ([sagestorage.eu][1]) as part of the Horizon 2020 programme. We explain how this architecture has been co-designed by leading industrial partners and research organisations covering a wide spectrum of scientific disciplines, and outline the new programming models being developed to exploit this technology, including tools to optimise its use. Finally, we present details of testing of this new architecture and explain where such a system can overcome the limitations of traditional HPC systems.
[1]: http://www.sagestorage.eu/
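As background to the map/reduce pattern contrasted with BDEC workloads above, the following minimal Python sketch illustrates the idea on a toy word-count problem. It is illustrative only and not taken from the talk or the SAGE project; in a framework such as Hadoop the map and reduce phases would run distributed across many nodes, with a shuffle in between.

```python
# Toy word count in the map/reduce style (single-process sketch).
from collections import Counter
from functools import reduce

def map_phase(line: str) -> Counter:
    # Map: emit partial (word, count) tallies for one input record.
    return Counter(line.split())

def reduce_phase(a: Counter, b: Counter) -> Counter:
    # Reduce: merge partial tallies; the operation is associative,
    # which is what lets the framework parallelise it.
    return a + b

lines = ["big data extreme computing", "big data on hpc"]
total = reduce(reduce_phase, map(map_phase, lines), Counter())
print(total)
```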
Primary author
Mr Shaun de Witt (Culham Centre for Fusion Energy)
Co-authors
Dr David Bond (Diamond Light Source)
Dr Keeran Brabazon (Allinea Software Ltd.)
Dr Oliver Perks (Allinea Software Ltd.)
Dr Sai Narasimhamurthy (Seagate Technology)
Mr Salem El Sayed Mohamed (Jülich Supercomputing Centre)
Mr Sergio Rivas Gómez (KTH Royal Institute of Technology)
Prof. Stefano Markidis (KTH Royal Institute of Technology)