Conveners
Physics & Engineering Application
- Junichi Tanaka (University of Tokyo)
- Josep Flix (PIC / CIEMAT)
HEPS (High Energy Photon Source) is expected to generate massive volumes of diverse data, and the data IO bottleneck severely limits computational efficiency. To address this issue, we have designed and implemented an IO framework specifically for HEPS, serving as the data IO module of Daisy (Data Analysis Integrated Software System), a software framework developed...
While database management systems (DBMS) are among the most important concepts in IT, scientific supercomputing makes little use of them. Reasons for this range from a preference for direct file I/O without overheads to feasibility problems in reaching a DBMS from High-Performance Computing (HPC) cluster nodes. However, trends such as the increasing re-use of scientific output data...
As scientific progress increasingly meets the imperative of responsible resource utilization, innovative tools are essential. This paper explores the pivotal role of HEPScore, a new benchmark for High-Energy Physics (HEP), and the HEP Benchmark Suite in steering the HEP community toward sustainable computing practices. As exemplified by projects like the Large Hadron...
The Jiangmen Underground Neutrino Observatory (JUNO) is an underground 20 kton liquid scintillator detector being built in southern China and expected to start data taking in late 2024. The JUNO physics program focuses on exploring neutrino properties by means of electron anti-neutrinos emitted from two nuclear power complexes at a baseline of about 53 km. Targeting an unprecedented...
The discovery of physics Beyond the Standard Model (BSM) is a major goal of many experiments, such as the ATLAS and CMS experiments at the Large Hadron Collider, which has the world's highest centre-of-mass energy. Many types of BSM models have been proposed to solve the issues of the Standard Model, and many of them introduce several free model parameters, e.g. the Minimal Supersymmetric Standard Model,...
The computing cluster of the Institute of High Energy Physics has long provided computing services to a large community of users across high-energy physics experiments. As experiments grow in scale and the number of users increases, job queuing on the existing cluster is becoming increasingly severe.
To alleviate the shortage of...
The Large Hadron Collider at CERN in Geneva is preparing a transformative upgrade of both its accelerator and its particle detectors. This strategic initiative is driven by the tenfold increase in proton-proton collisions anticipated for the forthcoming high-luminosity phase, scheduled to start by 2029. The vital role played by the underlying computational infrastructure, the...
In the context of the Italian National Recovery and Resilience Plan (NRRP), the High-Performance Computing, Big Data and Quantum Computing Research Centre, created and managed by the ICSC Foundation, has recently been established as one of the five Italian “National Centres” addressing strategic sectors for the development of the country, including simulations, computing, and...