Welcome message from the Chair of the Program Committee, Prof. Ludek Matyska, and the Chair of the Organizing Committee, Prof. Song-Ming Wang
After years of anticipation, quantum computing is finally here as evidenced by many ongoing projects around the world. In June 2024, the Leibniz Supercomputing Centre (LRZ) publicly demonstrated how a job run on a supercomputer is assisted by a superconducting quantum accelerator. Although this was only the first public demonstration, it gives a clear indication of the potential of hybrid...
In computer science, monitoring and accounting involve tracking and managing the usage of system resources in IT environments by users, applications, or processes. These activities typically encompass monitoring CPU usage, memory allocation, disk space, network bandwidth, and other critical resources. The insights obtained through activity tracking and analysis serve several purposes. Resource...
Machine learning, particularly deep neural networks, has been widely used in high-energy physics, demonstrating remarkable results in various applications. Furthermore, the extension of machine learning to quantum computers has given rise to the emerging field of quantum machine learning. In this paper, we propose the Quantum Complete Graph Neural Network (QCGNN), which is a variational...
The National Institute for Nuclear Physics (INFN) has been managing and supporting Italy’s largest distributed research and academic infrastructure for decades. In March 2021, INFN introduced "INFN Cloud," a federated cloud infrastructure offering a customizable service portfolio designed to meet the needs of the scientific communities it serves. This portfolio includes standard IaaS solutions...
Quantum Machine Learning (QML) faces significant challenges, particularly in encoding classical data and the reliance on quantum hardware for inference, limiting its practical applications. Meanwhile, classical large language models (LLMs) demand immense computational resources and exhibit low training efficiency, leading to substantial cost and scalability concerns. This talk will introduce...
In 2021, the National Institute for Nuclear Physics (INFN) launched the INFN Cloud orchestrator system to support Italy’s largest research and academic distributed infrastructure. The INFN Cloud orchestration system is an open-source middleware designed to seamlessly federate heterogeneous computing environments, including public and private resource providers, container platforms, and more....
The ability to ingest, process, and analyze large datasets within minimal timeframes is a milestone of big data applications. In the realm of High Energy Physics (HEP) at CERN, this capability is especially critical as the upcoming high-luminosity phase of the LHC will generate vast amounts of data, reaching scales of approximately 100 PB/year. Recent advancements in resource management and...
High-Energy Physics (HEP) experiments involve a unique detector signature - in terms of detector efficiency, geometric acceptance, and software reconstruction - that distorts the original observable distributions with smearing and biasing stochastic terms. Unfolding is a statistical technique used to reconstruct these original distributions, bridging the gap between experimental data and...
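As a minimal illustration of the idea (using a toy response matrix, not this contribution's method), iterative unfolding can be sketched as:

```python
import numpy as np

# Toy response matrix: R[i, j] = P(measured in bin i | true value in bin j).
# Columns sum to 1, i.e. every true event lands in some measured bin.
R = np.array([
    [0.8, 0.1, 0.0],
    [0.2, 0.8, 0.2],
    [0.0, 0.1, 0.8],
])

truth = np.array([100.0, 200.0, 50.0])  # hypothetical true spectrum
measured = R @ truth                     # the smeared, detector-level spectrum

def unfold(measured, R, n_iter=100):
    """Iterative (d'Agostini / Richardson-Lucy style) unfolding."""
    estimate = np.full_like(measured, measured.sum() / measured.size)
    efficiency = R.sum(axis=0)  # here all ones
    for _ in range(n_iter):
        folded = R @ estimate
        # Bayes update: redistribute measured counts according to R
        estimate = estimate * (R.T @ (measured / folded)) / efficiency
    return estimate

result = unfold(measured, R)
print(result)  # approaches the true spectrum
```

With exact (noise-free) data the iteration converges towards the true spectrum; on real data regularisation (early stopping) controls the amplification of statistical fluctuations.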
The EU project OSCARS (Open Science Clusters’ Action for Research and Society) brings your research data to new audiences and targets new use-cases in a broad range of scientific clusters including Photon and Neutron Sciences (PaN). As recommended by a new White Paper (submitted to IUCrJ) from the user organisations, ESUO and ENSA, adherence to the FAIR principles (Findable, Accessible,...
Quantum computers promise transformative advancements in computation, yet their performance remains critically hindered by environmental noise. Qubits, the fundamental units of quantum information, are inherently fragile and highly sensitive to even minimal disturbances from their surroundings, such as electromagnetic interference. We introduce the Telemetry Project, an initiative...
DESY, a leading European synchrotron facility, has taken a significant step towards making research data publicly available by establishing a metadata catalogue and data analysis portal. This development is in line with the Open and FAIR data principles, which aim to make data easily discoverable, accessible, and reusable for the wider scientific community.
The metadata catalogue, SciCat,...
As 2025 marks the International Year of Quantum Science and Technology, Taiwan's quantum computing education resources are expanding rapidly. EntangleTech, as a leading organization dedicated to fostering quantum education among high school and university students, has been actively developing a structured learning ecosystem. Our mission is to consolidate Taiwan’s diverse quantum computing...
The Torch computing platform aims to provide a one-stop scientific data analysis platform for light-source discipline users, covering multiple computing service modes, supporting multiple access methods, integrating multiple data analysis methods, and addressing multiple application scenarios.
The platform covers multiple computing service modes, including two types of desktop computing and...
Mass spectrometry (MS) is a compound identification technique used frequently in life and environmental sciences. The specific setup of mass spectrometry based on electron ionization, coupled with gas chromatography (GC-EI-MS), is appealing due to the relative simplicity and stability of the setup, while it is challenging computationally -- the acquired data are strictly "flat", with no...
This study examines the challenges of maintaining emotional support for families in the changing social structure anticipated in 2040. New challenges are posed to traditional family roles in providing emotional connections. Changes in family structure, including an increase in single-person households and diverse family types, may increase the distance between members and complicate the...
In the field of Social Sciences, Arts and Humanities (SSAH), researchers have started to explore the possibilities of machine-learning techniques in several directions. With the current and imminent generations of open-source Large Language Models (LLMs) it seems already attainable for individual researchers to speed up onerous but necessary tasks on personal computers, while keeping control...
This talk will discuss the technical challenge of using virtual unwrapping as a technique to restore damaged film negatives from more than a century ago. The technique is applied to film going all the way back to the early photographic explorations of Étienne-Jules Marey and Muybridge, whose photographic work determined that a galloping horse has all its hooves off the ground at one time at a...
Implementing a Risk Management Process in a distributed infrastructure can be a tedious task. Usually one needs to agree on a certain risk-management methodology, get a clear picture of the scope and the governance, and from that assign the relevant roles and responsibilities. Clearly this is only possible with sufficient support from the governing body.
But even if the above-mentioned...
The WLCG Security Operations Centre Working Group has been working on establishing a common methodology for the use of threat intelligence within the academic research community. A central threat intelligence platform allows working group members to easily exchange indicators of compromise (IoCs) of ongoing security incidents and to use this information to secure their own infrastructures. This...
Network security operations depend on many kinds of network security tools to handle the monitoring of, detection of, and response to security incidents, threats, and vulnerabilities across the organization's infrastructure. However, despite the evolving power of these tools, they are relatively cumbersome to use and often require interaction through specific interfaces, which increases the...
As software systems become increasingly complex, defects in source code pose significant security risks, such as user data leakage and malicious intrusions, making their detection crucial. Current approaches based on Graph Neural Networks (GNNs) can partially reveal defect information; however, they suffer from heavy graph construction costs and underutilization of heterogeneous...
The Large Hadron Collider beauty (LHCb) experiment at CERN has successfully optimized the usage of its High-Level Trigger (HLT) farm by integrating the tools that make its usage transparent as a Worldwide LHC Computing Grid (WLCG) Tier-2D site, used opportunistically when the HLT needs are reduced. This innovative transformation leverages the power of XRootD for data access, HTCondor for...
Anomaly detection is a critical component of predictive maintenance in data centers, where early identification of abnormal patterns in system behavior can prevent failures and reduce operational costs. This work explores the application of Variational Autoencoders (VAEs) for unsupervised anomaly detection in data collected from data center infrastructure. VAEs are probabilistic generative...
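The scoring principle (learn the shape of normal telemetry, then flag samples that reconstruct poorly) can be sketched with a one-component linear autoencoder standing in for the trained VAE; the metrics, data, and threshold below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "normal" telemetry: two correlated metrics (e.g. CPU load
# and temperature), so normal points lie near a line in metric space.
t = rng.normal(size=(500, 1))
normal = np.hstack([t, 2 * t]) + 0.05 * rng.normal(size=(500, 2))

# Stand-in for a trained VAE: a 1-component linear autoencoder (PCA).
mean = normal.mean(axis=0)
_, _, vt = np.linalg.svd(normal - mean, full_matrices=False)
component = vt[0]  # principal direction of normal behaviour

def reconstruction_error(x):
    centred = x - mean
    reconstructed = np.outer(centred @ component, component)
    return np.linalg.norm(centred - reconstructed, axis=1)

# Flag anything reconstructing worse than 99% of the normal data.
threshold = np.quantile(reconstruction_error(normal), 0.99)

# A point that breaks the learned correlation scores far above threshold.
anomaly = np.array([[0.0, 5.0]])
print(reconstruction_error(anomaly) > threshold)
```

A VAE plays the same role with a nonlinear, probabilistic encoder/decoder, so the reconstruction error (or the evidence lower bound) serves as the anomaly score.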
The Worldwide Large Hadron Collider Computing Grid (WLCG) community’s deployment of dual-stack IPv4/IPv6 on its worldwide storage infrastructure is very successful and has been presented by us at earlier ISGC conferences. Dual-stack is not, however, a viable long-term solution; the HEPiX IPv6 Working Group has focused on studying where and why IPv4 is still being used, and how to change such...
The Worldwide LHC Computing Grid (WLCG) is a global collaboration that provides the computing infrastructure essential for the CERN Large Hadron Collider experiments. Spanning over 40 countries, it delivers approximately 3 exabytes of storage and 1.3 million CPU cores to support scientific research. Recently, WLCG launched a multi-year strategy to prepare for the next phase of the LHC’s...
Over a quarter century of distributed computing research and development has brought together a strong community that is willing to trust each other. Scaling well beyond the ‘human circle of trust’, which is on the order of a few hundred people, we have built computing infrastructures for scientific research
across continents and disciplines spanning hundreds of thousands of people – leveraging...
Europe hosts a vibrant cluster of Environmental Research Infrastructures (ENVRIs), serving a diverse community committed to tackling today’s most pressing environmental challenges. These infrastructures play a crucial role in advancing Europe’s strategic actions to achieve climate neutrality. The project ENVRI-Hub NEXT builds on years of collaboration to provide interoperable datasets,...
To enhance the discovery potential of the Large Hadron Collider (LHC) at CERN in Geneva and improve the precision of Standard Model measurements, the High Luminosity LHC (HL-LHC) Project was initiated in 2010 to extend its operation by another decade and increase its luminosity by approximately tenfold beyond the design value.
In this context, the scope of applications for Machine...
Qiskit-symb is a Python package designed to enable the symbolic evaluation of quantum states and quantum operators in Parameterized Quantum Circuits (PQCs) defined using Qiskit. This open-source project has been integrated into the official Qiskit Ecosystem platform, making it more accessible to the rapidly growing community of Qiskit users.
Given a PQC with free parameters, qiskit-symb can...
Models of physical systems simulated on HPC clusters often produce large amounts of valuable data that need to be efficiently managed both in the scope of research projects and continuously afterwards to derive the most benefit for scientific advancement. Database management systems (DBMS) and metadata management can facilitate transparency and reproducibility, but are rarely used in...
Quantum computing has gained significant attention in recent years, with numerous algorithms and applications under active development. Owing to the limitations of current quantum technology, quantum noise and readout error have become critical issues. Various methods have been proposed to address readout error through error mitigation techniques, typically involving post-processing of measurement data....
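The post-processing idea can be illustrated with a hypothetical single-qubit calibration matrix; this is a generic sketch of measurement-error mitigation, not the specific method proposed here:

```python
import numpy as np

# Hypothetical readout-calibration matrix for one qubit:
# A[i, j] = P(measuring outcome i | preparing basis state j),
# estimated in practice by running calibration circuits.
A = np.array([
    [0.97, 0.05],   # P(read 0 | prep 0), P(read 0 | prep 1)
    [0.03, 0.95],   # P(read 1 | prep 0), P(read 1 | prep 1)
])

# Noisy measured probabilities of an ideal 50/50 state:
ideal = np.array([0.5, 0.5])
measured = A @ ideal

# Post-processing: invert the calibration matrix (least squares in
# general, since A may be ill-conditioned for many qubits) to recover
# the error-mitigated distribution.
mitigated, *_ = np.linalg.lstsq(A, measured, rcond=None)
print(np.round(mitigated, 3))  # ≈ [0.5, 0.5]
```

Real mitigation schemes additionally constrain the result to be a valid probability distribution and must cope with the exponential size of the calibration matrix as qubit counts grow.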
The AI4EOSC (Artificial Intelligence for the European Open Science Cloud) project aims at contributing to the landscape of Artificial Intelligence (AI) research with a comprehensive and user-friendly suite of tools and services within the framework of the European Open Science Cloud (EOSC). This innovative platform is specifically designed to empower researchers by enabling the...
Quantum computing is emerging as a groundbreaking approach for solving complex optimization problems, offering new opportunities in fields requiring both computational efficiency and innovative solution discovery. Quantum annealing, a specialized quantum computing paradigm, leverages the quantum adiabatic theorem to efficiently find the global minimum of a problem's cost function, making it a...
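As a schematic example of the cost-function formulation used in annealing (not this contribution's actual problem), a small QUBO can be written down and, for illustration, minimised by brute force:

```python
import itertools
import numpy as np

# Toy QUBO: minimise E(x) = x^T Q x over binary x. This hypothetical Q
# encodes "choose exactly one of three options" via the penalty
# P * (x0 + x1 + x2 - 1)^2 (constant offset dropped), plus small
# per-option costs, with option 1 the cheapest.
P = 10.0
cost = np.array([0.2, 0.1, 0.3])
Q = P * (np.ones((3, 3)) - np.eye(3)) + np.diag(cost - P)

def energy(x):
    x = np.asarray(x, dtype=float)
    return float(x @ Q @ x)

# A quantum annealer searches this energy landscape physically; here we
# enumerate all 2^3 assignments just to show the formulation.
best = min(itertools.product([0, 1], repeat=3), key=energy)
print(best)  # (0, 1, 0): exactly one option chosen, the cheapest
```

The annealer's job is precisely this minimisation, but over problems far too large to enumerate.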
Educational needs in the future classroom focus on a combination of student engagement in learning, inquiry-based approaches, curiosity, imagination, and design thinking. Smart classrooms leverage advancements in the Internet of Things to create intelligent, interconnected learning environments that enhance students' quality of life and educational outcomes. With advancements...
Tracking imaging systems have progressed from manual examination to utilizing contemporary photodetectors, like SiPM arrays and CMOS cameras, to convert scintillation light into digital data and obtain physical information. This study presents RIPTIDE, a novel recoil-proton track imaging system designed for fast neutron detection, with an emphasis on the use of deep-learning methods. RIPTIDE...
Data Management Planning within the EOSC CZ - Czech National Data Infrastructure for Research Data
Author: Jiří Marek, Open Science manager at Masaryk University, Head of EOSC CZ Secretariat, Czech Republic
The rapid expansion of data availability is reshaping research methodologies across various disciplines. This surge, characterized by its Velocity, Variety, and Volume, is driven not...
Users may have difficulty finding answers in product documentation when many pages of documentation are spread across multiple web pages or email forums. We have developed and tested an AI-based tool which can help users find answers to their questions. Our product, called Docu-bot, uses a Retrieval-Augmented Generation solution to generate answers to various questions. It...
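A minimal sketch of the retrieval step in such a Retrieval-Augmented Generation pipeline (with made-up documentation chunks, not Docu-bot's actual implementation) might look like:

```python
import math
import re
from collections import Counter

# Toy documentation chunks (hypothetical; not the actual product docs).
chunks = [
    "To reset your password, open Settings and choose Security.",
    "Jobs are submitted to the batch system with the submit command.",
    "Log files are stored under /var/log and rotated daily.",
]

def tokens(text):
    return re.findall(r"[a-z]+", text.lower())

def cosine(a, b):
    """Bag-of-words cosine similarity between two token lists."""
    ca, cb = Counter(a), Counter(b)
    dot = sum(ca[w] * cb[w] for w in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question, k=1):
    """Rank chunks by similarity to the question; return the top k."""
    q = tokens(question)
    return sorted(chunks, key=lambda c: cosine(q, tokens(c)), reverse=True)[:k]

# The retrieved chunk is what a RAG pipeline pastes into the LLM prompt
# as context before generating the final answer.
context = retrieve("How do I reset my password?")
print(context[0])
```

Production systems replace the bag-of-words scoring with dense embeddings and a vector index, but the pipeline shape (retrieve, then generate with context) is the same.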
The convergence of Natural Language Processing (NLP) and cheminformatics represents a groundbreaking approach to drug development, particularly in the critical domain of toxicity prediction. Early identification of toxic compounds is paramount in pharmaceutical research, as late-stage toxicity discoveries lead to substantial financial setbacks and delay market approval. While traditional...
With the 2025 run of sPHENIX come higher data throughput and data volume requirements. The sustained data throughput required for sPHENIX in 2025 is 20 GB/sec. Once started in mid-April, this sustained data stream will be steadily constant, with no breaks, through December. The projected data volume is 200 PB. In order to meet these data throughput and volume requirements, we must rebuild...
A living systematic review is an approach that provides up-to-date evidence for a given research topic. It is extensively used in health domains due to its potential to enhance the efficiency of conventional systematic reviews. Furthermore, this approach is particularly suitable when the literature requires frequent updates, and the research needs continuous monitoring. Artificial Intelligence...
The 14 beamlines of phase I of the High Energy Photon Source (HEPS) will produce more than 300 PB/year of raw data. Efficiently storing, analyzing, and sharing this huge amount of data presents a significant challenge for HEPS.
The HEPS Computing and Communication System (HEPSCC), also called the HEPS Computing Center, is an essential work group responsible for the IT R&D and services for the facility,...
Recently, deepfake pornography in South Korea gained attention after unconfirmed lists of schools with victims spread online in August this year. Many girls and women have hastily removed photos and videos from their Instagram, Facebook and other social media accounts. Thousands of young women have staged protests demanding stronger steps against deepfake pornography. Politicians, academics and...
The Authentication and Authorisation for Research Collaboration (AARC) Blueprint Architecture has been a foundational framework for authentication and authorisation infrastructures (AAIs) in global research. It supports the European Open Science Cloud (EOSC), national research AAIs, and cross-regional e-infrastructures, offering a unified approach to federated identity management. As the scope...
Integrating Artificial Intelligence in digital humanities has created unprecedented opportunities for analyzing historical archives. Building upon established work with Learning-as-a-Service solutions for Maryland State Archives' Legacy of Slavery collections, specifically the Domestic Traffic Advertisements dataset, this research proposes an innovative approach combining Knowledge Graph-based...
CC-IN2P3, the French Tier-1 for WLCG, has recently equipped itself with a dedicated in-house documentation tool, DIPLO, based on an open-source web-based solution. The centralization of information and a single point of entry have been key in this cross-organizational approach. Starting from the initial situation report, we present the vision and the specifications that have led to the...
In IoT (Internet of Things) systems consisting of IoT devices, edges, and cloud servers, it is expected that various sensor data obtained from IoT devices will be collected, accumulated, and utilized to solve various social issues using Artificial Intelligence. However, due to the sophistication and intensification of cyber attacks, security measures for IoT systems consisting of a large...
The emergence and integration of AI tools like ChatGPT into educational contexts has ignited heated debates, especially concerning its dual role as both a powerful teaching assistant and a potential tool for dishonesty. On one side, it may hold potential as a pedagogical aid for instructors and as a source of efficiency for students (one may nickname it positively as “TeachGPT”). On the other...
The Account LInking SErvice ALISE implements the concept of site-local
account linking. For this, a user can log in with one local account and
with any number of supported external accounts (e.g. Helmholtz-ID and
Google). The local account is the one at an HPC centre, which also
comprises the Unix user name.
Federated services can use this information whenever they need to map a
federated...
Quantum technologies represent a transformative leap not only in science but also in business. In recent years, quantum computing has evolved from a niche research area to a highly competitive technological frontier, with exponential growth described by Dowling's and Neven's laws (the quantum analogue of Moore's Law) illustrating rapid advancements in processing power.
Significant global investments and efforts are...
UKRI-STFC's Scientific Computing Department operates Echo, a very large (~100 PiB) data storage cluster located at the Rutherford Appleton Laboratory and implemented using Ceph storage technology. Echo is one of the world's largest publicly-advertised Ceph clusters. Echo's primary use case is to provide a high-throughput data storage endpoint for the WLCG's high-throughput computing operations...
The LHCb experiment studies the outcome of particle collisions at the Large Hadron Collider at CERN. Since it began operations in 2010, the experiment has collected more than 100 PB of data resulting from proton or ion collisions. As the LHCb detector was upgraded between 2019 and 2022 and now records tens of PB of data per year, another 60 PB are expected before the end of LHC Run 3...
We present an innovative approach to optimizing the Dual Mixed Refrigerant (DMR) process for natural gas liquefaction. The DMR process, characterized by its use of two distinct mixed refrigerants, offers significant advantages over traditional single-mixed refrigerant systems. However, the complexity of the DMR system, due to multiple refrigerant circuits and intricate interactions among them,...