Quantum computers offer the potential to address complex problems that exceed the capabilities of current high-performance computers. Although fault-tolerant quantum computers have yet to be realized, numerous countries are actively engaged in their development. Among the various platforms under consideration, superconducting quantum computers utilizing Josephson junction qubits emerge as...
The Swiss National Supercomputing Centre (CSCS) is commissioning the future multi-region flagship infrastructure codenamed Alps, an HPE Cray EX system based on NVIDIA GH200 Grace-Hopper superchip. The Centre has been heavily investing in the concept of Infrastructure as Code, and it is embracing the multi-tenancy paradigm for its infrastructure. Alps will serve multiple partners, characterised...
Many years ago, the Joint WLCG/OSG/EGEE security policy group successfully developed a suite of Security Policies for use by WLCG, EGEE, EGI and others. These in turn formed the basis of the AARC Policy Development Kit, published in 2019. Many infrastructures have since used the template policies in the AARC PDK but found they had to modify them to meet their needs. The Policy Templates are...
An innovative, distributed, and elastic computing ecosystem, called UniNuvola, is being deployed at the University of Perugia, involving the Department of Chemistry, Biology and Biotechnologies, the Department of Physics and Geology, the Department of Mathematics and Informatics, and the Department of Engineering. The aim of the project is the creation of a federated and...
Protecting information assets has become a top priority for organizations in the ever-changing landscape of digital security.
INFN is deeply committed to security: as a major player in the research world, it operates distributed computing infrastructures across the entire Italian territory and is involved in numerous research projects that deal with health and sensitive data.
The Datacloud project...
China Spallation Neutron Source (CSNS), the fourth pulsed spallation neutron source in the world, built during China's 11th Five-Year Plan period (2006-2010), began operation after passing the national acceptance test in August 2018. By November 2023, 11 spectrometers had been built and put into operation, generating 3 TB of raw data every day. With the accelerator power upgraded from...
Security exercises can be seen as an experiment: one wants to investigate how well the expected computer security incident response activities of an organisation, as described in its procedures and policies, match the real (measured) activities in a created security incident situation that is as realistic as possible, yet contained.
The complexity of the created security situation...
We have presented previously on the strategic direction of the Security Operations Centre working group, focused on building reference designs for sites to deploy the capability to actively use threat intelligence with fine-grained network monitoring and other tools. This work continues in an environment where the cybersecurity risk faced by research and education, notably from ransomware...
China’s High-Energy Photon Source (HEPS), the first national high-energy synchrotron radiation light source, is under design and construction. HEPS computing center is the principal provider of high-performance computing and data resources and services for science experiments of HEPS. The mission of HEPS scientific computing platform is to accelerate the scientific discovery for the...
In scientific applications, integrating artificial intelligence (AI) and machine learning (ML) has revolutionized research methodologies and workflows. This study delves into an innovative application of cloud-based OpenAI's Large Language Models (LLMs) in developing a conversational AI chatbot, drawing exclusively from the culturally significant Legacy of Slavery (LoS) datasets...
The DESY Research Infrastructure historically supports a large variety of sciences, such as High Energy and Astroparticle Physics, Dark Matter research, Physics with Photons, and Structural Biology. Most of these domains generate large amounts of precious data, handled according to domain-specific policies and taking into account embargo periods and license restrictions. However, a significant...
For over two years the Italian National Institute for Nuclear Physics (INFN) has been actively operating a national cloud platform designed to streamline access to geographically distributed computing and storage resources. This initiative includes a catalog of high-level services that can be instantiated on demand through a web portal. Service descriptions are specified using TOSCA templates,...
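To give a sense of the flow, submitting such a TOSCA template to an INDIGO-style PaaS orchestrator could look roughly like the sketch below; the endpoint path, payload shape, and template file name are assumptions and placeholders, not the portal's documented interface.

```python
import requests

ORCHESTRATOR = "https://paas.example.org/orchestrator"   # placeholder URL

def deploy(access_token, tosca_template, parameters=None):
    # Submit the TOSCA template; the orchestrator selects a suitable
    # provider and returns a deployment identifier to poll for status.
    r = requests.post(f"{ORCHESTRATOR}/deployments",
                      headers={"Authorization": f"Bearer {access_token}"},
                      json={"template": tosca_template,
                            "parameters": parameters or {}})
    r.raise_for_status()
    return r.json()["uuid"]

# e.g. deploy(token, open("jupyterhub.yaml").read(), {"cpus": 4})
```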
This paper investigates the effect of pretraining and fine-tuning on a multi-modal dataset. The dataset used in this study was accumulated in a garbage disposal facility for facility control and consists of 25,000 sequential images and corresponding sensor values. The main task for this dataset is to classify the state of garbage incineration from an input image for the combustion state...
High Energy Photon Source (HEPS) is a crucial scientific research facility that necessitates efficient, reliable, and secure services to support a wide range of experiments and applications. However, traditional physical server-based deployment methods suffer from issues such as low resource utilization, limited scalability, and high maintenance costs. Therefore, the objective of this study is...
DIRAC has been a cornerstone in providing comprehensive solutions for user communities requiring access to distributed resources. Originally initiated by the LHCb collaboration at CERN in 2000, DIRAC underwent significant changes over the years. In 2008, a major refactoring resulted in the creation of the experiment-agnostic "core" DIRAC, allowing custom extensions such as...
With the widespread adoption of containers by various organizations and companies, Kubernetes (K8s), an open-source software dedicated to container management, has in recent years become the de facto standard for the deployment and operation of applications focused on this technological solution. K8s offers several advantages: workload balancing, dynamic resource allocation, automated rollout...
Recent advances in X-ray beamline technologies, including the advent of very high brilliance beamlines at next generation synchrotron sources and advanced detector instrumentation, have led to an exponential increase in the speed of data collection. As a result, there is a growing need for a data analysis platform that can refine and optimise data collection strategies on-line and effectively...
The efficient manifold ranking (EMR) algorithm has been widely applied in content-based image retrieval (CBIR). For this algorithm, each image is represented by low-level features, describing color, texture, and shape. The characteristics of low-level features include the ability to quickly detect differences in color, texture, and shape, and the invariance to rotations and...
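As background, the core of (non-accelerated) manifold ranking admits a closed-form solution, which EMR then approximates with an anchor graph to scale to large image databases. A minimal numpy sketch follows; the affinity matrix and query vector are illustrative toy data, not from the paper.

```python
import numpy as np

def manifold_ranking(W, y, alpha=0.99):
    """Closed-form manifold ranking: f* = (I - alpha*S)^{-1} y,
    where S is the symmetrically normalized affinity matrix."""
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    S = D_inv_sqrt @ W @ D_inv_sqrt
    n = W.shape[0]
    return np.linalg.solve(np.eye(n) - alpha * S, y)

# Toy example: 4 images, the first one is the query.
W = np.array([[0, .9, .1, 0],
              [.9, 0, .2, .1],
              [.1, .2, 0, .8],
              [0, .1, .8, 0]], dtype=float)
y = np.array([1.0, 0.0, 0.0, 0.0])  # query indicator vector
scores = manifold_ranking(W, y)     # higher score = more relevant
print(np.argsort(-scores))          # ranking of the database images
```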
As human-wildlife conflicts escalate in our area and around Japan, safeguarding crops and farmers from animal intrusions becomes paramount. This research introduces a deep learning approach to prototype a prevention system against monkey trespassing in sweet potato fields. The proposed system was motivated by the idea of developing wildlife identification and assisting local farmers in...
The National Institute for Nuclear Physics (INFN) has been operating and supporting Italy’s largest research and academic distributed infrastructure for several decades. In March 2021, INFN launched “INFN Cloud” which provides a federated cloud infrastructure and a customizable service portfolio tailored for the scientific communities supported by the institute. The portfolio assortment...
With a rich heritage in cutting-edge distributed IT technologies, ranging from initial small clusters to Grid and Cloud-based computing, INFN introduced "INFN Cloud" about three years ago. This user-friendly, distributed cloud infrastructure and services portfolio is designed for scientific communities, providing easy access and utilization.
Given the decentralized nature of the...
Researchers at INFN (National Institute for Nuclear Physics) face challenges from basic to hard science use cases (e.g., big-data latest generation experiments) in many areas: HEP (High Energy Physics), Astrophysics, Quantum Computing, Genomics, etc.
Machine Learning (ML) adoption is ubiquitous in these areas, requiring researchers to solve problems related to the specificity of...
The INDIGO PaaS orchestration system is an open-source middleware designed to seamlessly federate heterogeneous computing environments, including public and private clouds, container platforms, and more. Its primary function lies in orchestrating the deployment of virtual infrastructures, ranging from simple to intricate setups. These virtual infrastructures can implement high-level...
The EGI Federated Cloud (FedCloud) is a multinational cloud system that seamlessly integrates community, private, and public clouds into a scalable computing platform dedicated to research. Each cloud within this federated infrastructure configures its Cloud Management Framework (CMF) based on its preferences and constraints. The inherent heterogeneity among cloud sites can pose challenges...
This talk continues the story of virtual unwrapping with a victorious return to the stage at ISGC 2024. Last year we predicted an unprecedented breakthrough with the help of an international competition called the Vesuvius Challenge. This talk will fulfil that promise and describe how we have captured the imagination of a diverse, global audience, through the virtual unwrapping of one of the...
Typhoon-induced storm surge modeling involves forcings of 10-m wind and sea level pressure fields, typically determined by an adequate parametric typhoon model based on typhoon tracks, sizes, and intensities or a fully dynamical simulation by numerical weather prediction (NWP). Parametric models have been widely developed to simulate tropical cyclones. In conventional Holland-type parametric...
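For orientation, the classical Holland (1980) gradient wind profile that underlies such parametric models can be sketched as follows; all parameter values are illustrative, not taken from the abstract.

```python
import numpy as np

def holland_wind(r, r_max, dp, B, rho=1.15, f=5e-5):
    """Holland (1980) gradient wind speed (m/s) at radius r (m):
    V(r) = sqrt(B*dp/rho * (r_max/r)^B * exp(-(r_max/r)^B) + (r*f/2)^2) - r*f/2
    dp: central pressure deficit (Pa), B: shape parameter, f: Coriolis."""
    x = (r_max / r) ** B
    return np.sqrt(B * dp / rho * x * np.exp(-x) + (r * f / 2) ** 2) - r * f / 2

# Wind profile of a hypothetical typhoon: 50 hPa deficit, 40 km eye radius.
r = np.linspace(5e3, 300e3, 60)
v = holland_wind(r, r_max=40e3, dp=5000.0, B=1.5)
print(f"peak wind ~{v.max():.1f} m/s near r = {r[v.argmax()]/1e3:.0f} km")
```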
It has now been over 12 years since the HEPiX-IPv6 working group began investigating the migration of WLCG towards IPv6 in 2011.
The effort of the working group can be split into three phases. In the first phase, LHC software was classified as IPv6-ready, ready with caveats, or not ready at all. The aim "enable IPv6 access to all storage" (the second phase of the working group) was at the end...
The INFN CNAF data center provides a huge amount of heterogeneous data through dedicated monitoring systems. Since it must guarantee 24/7 availability, it has started to assess artificial intelligence solutions that detect anomalies in order to predict possible failures.
In this study, the main goal is to define an artificial intelligence framework able to classify and predict anomalies...
TeRABIT (Terabit Network for Research and Academic Big Data in ITaly) is a project funded within the initiative for realization of an integrated system of research and innovation infrastructures of the Italian National Recovery Plan. TeRABIT aims at creating a distributed, hyper-networked, hybrid Cloud-HPC computing environment offering tailored services to address the diverse requirements of...
Self-supervised learning speeds up representation learning in many computer vision tasks, and it also saves the time and labor of labelling datasets. Momentum Contrast (MoCo) is an efficient contrastive learning method, which has achieved positive results on different downstream vision tasks with self-supervised learning. However, its performance on extracting 3D local parts...
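For readers new to MoCo, its two defining operations, the momentum update of the key encoder and the FIFO queue of negative keys, can be sketched in a few lines of PyTorch; the linear "encoders", sizes, and temperature below are illustrative stand-ins, not the paper's setup.

```python
import torch

m, dim, K = 0.999, 128, 4096             # momentum, feature dim, queue size
encoder_q = torch.nn.Linear(512, dim)    # stand-ins for the real backbones
encoder_k = torch.nn.Linear(512, dim)
encoder_k.load_state_dict(encoder_q.state_dict())
queue = torch.randn(dim, K)              # queue of negative keys

@torch.no_grad()
def momentum_update():
    # Key encoder trails the query encoder: theta_k <- m*theta_k + (1-m)*theta_q
    for p_q, p_k in zip(encoder_q.parameters(), encoder_k.parameters()):
        p_k.mul_(m).add_(p_q, alpha=1 - m)

def contrastive_logits(x_q, x_k, T=0.07):
    q = torch.nn.functional.normalize(encoder_q(x_q), dim=1)
    with torch.no_grad():
        momentum_update()
        k = torch.nn.functional.normalize(encoder_k(x_k), dim=1)
    l_pos = (q * k).sum(dim=1, keepdim=True)     # positive pair similarity
    l_neg = q @ queue                            # similarities to queued negatives
    return torch.cat([l_pos, l_neg], dim=1) / T  # label 0 = the positive
```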
In the past, estimating tsunami characteristics induced by volcanic-related landslides often relied on approximations of total tsunami volume or using empirical formulas to estimate initial wave heights. In this study, we take the example of Guishan Island off the eastern coast of Taiwan and employ a numerical model, specifically the Discontinuous Bi-viscous Model (DBM), combined with a...
The INFN CNAF User Support unit serves as the first interface to the users of the data center, which provides computing resources to over 60 scientific communities in the fields of Particle, Nuclear and Astroparticle physics, Cosmology and Medicine.
While its duties span from repetitive tasks to supporting complex scientific-computing workflows, many of them can be automated or...
Since the global network continues to grow at a fast pace, interconnections become more and more complicated to support reliable transmission. Meanwhile, network application services keep expanding as well. This brings more concern and attention to using software-defined concepts to optimize and secure wide-area networks. However,...
Protein-protein recognition through the hydrogen-bonded chains of interfacial water is a less explored mechanism due to the technical challenges involved in the analyses, and the role of waters in forming a stable protein-protein complex is often elusive. It is still unclear whether and how the hydrogen-bonded interfacial-water chains contribute to triggering or participating in the...
The so-called trolley problem is one way of asking AI for an ethical solution. However, such a problem is not unique to AI. A similar ethical question has existed in China for many centuries: "Whom to save first when both your mother and your wife fall into the water?"
Indeed, the ethics associated with AI are discussed at various levels. UNESCO and OECD are the leading international...
This research investigates the impact of AI-generated scenario-based videos on the English-speaking learning experience and motivation, particularly within the context of Environment, Society, and Governance (ESG) initiatives. Recognizing the global significance of ESG initiatives for long-term corporate value and sustainability, the study emphasizes the need to integrate theoretical...
The Coherent Multirepresentation Problem (CMP) was recently proposed as a generalization of the well-known and widely studied Distance Geometry Problem (DGP). Given a simple directed graph G, the idea is to associate various types of representations to each arc of the graph, where the source vertices are supposed to play the role of reference for the corresponding destination vertices. Each...
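For readers less familiar with the background, the classical DGP that the CMP generalizes can be stated compactly; this is the standard textbook formulation, not taken from this abstract.

```latex
% Distance Geometry Problem (DGP): given a simple weighted graph G = (V, E)
% with edge weights d : E -> R_{>0} and a dimension K, find an embedding
% x : V -> R^K such that
\[
  \lVert x_u - x_v \rVert = d_{uv} \qquad \text{for every } \{u, v\} \in E .
\]
% The CMP replaces this single metric constraint per edge with several
% coherent representations attached to each arc of a directed graph.
```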
This study investigates the therapeutic potential of Artificial Intelligence Generated Music in addressing the impacts of societal pressures on mental well-being. After the COVID-19 pandemic began in 2020, industrial and economic uncertainty increased dramatically, and individuals' emotional and psychological states came under growing strain. The WHO recognizes that 70% to...
The continuous development of methods for protein structure prediction has taken advantage of the precious experimental information obtained by structural biology as well as by the sequencing of multiple organisms. Indeed, the generally adopted pipeline is based on determining conformations of protein fragments, and then using multiple sequence alignments to obtain long-range distances...
The intersection of participatory design and digital learning with artificial intelligence (AI) presents a transformative opportunity for costume art and craft education. This research explores the efficacy of combining AI-generated content (AIGC) with participatory design and digital learning platforms to enhance the educational experience in costume arts and crafts disciplines. Digital...
Modern technologies for DNA and RNA sequencing allow for fast, parallel reading of multiple DNA fragments. While sequencing the first genome took 32 years, today with Next Generation Sequencing technologies we are able to sequence 40 genomes in about 2 days, producing 4 TB of text data (a file of about 100 GB per genome). This ability poses a challenge to computing infrastructures, which need to...
The Vesuvius Challenge (scrollprize.org) is a current machine learning and data science competition built around the goal of making visible the evidence of ink captured in micro computed tomography scans of damaged Herculaneum scrolls. The prize pool for the competition is more than $1M. The website for the challenge states as its objective that it intends “...to make history by reading an...
Background & Motivation
3D Ultrasound Computer Tomography (3D USCT) is developed at the Karlsruhe Institute of Technology for early breast cancer detection. Unlike conventional ultrasound sonography methods with manually guided ultrasound (US) probes, the patient is placed on a patient bed with a stable and reproducible measurement configuration. A reproducible stable measurement...
Social media can play a crucial role in disseminating information about cultural heritage if a proper lexicon is available and able to identify valuable data for the management of crises that are caused by either natural or human-induced disasters. A literature review has been conducted, encompassing existing attempts to define terminology within the cultural heritage domain. A Review of the...
This study aims to investigate the potential of future public bicycle services in Taipei to achieve net-zero emission strategies and Sustainable Development Goals (SDGs) by using a speculative design. It proposes a sustainable energy model for future public bicycle service. Given the challenges of global environmental change, issues such as sustainable development and net-zero emission...
During the COVID-19 pandemic, there was rapid growth of the literature and of the need to access useful information quickly in order to understand disease mechanisms, define prevention techniques, and propose treatments; text classification has therefore become an essential and challenging activity. LitCovid represents an example of the extent to which a COVID-19 literature database can grow: it...
While database management systems (DBMS) are one of the most important IT concepts, scientific supercomputing makes little use of them. Reasons for this situation range from a preference towards direct file I/O without overheads to feasibility problems in reaching DBMS from High-Performance Computing (HPC) cluster nodes. However, trends such as the increasing re-usage of scientific output data...
In the contemporary era, where scientific progress meets the imperative of responsible resource utilization, the need for innovative tools is paramount. This paper explores the pivotal role of a new benchmark for High-Energy Physics (HEP), HEPScore, and the HEP Benchmark Suite in steering the HEP community toward sustainable computing practices. As exemplified by projects like the Large Hadron...
The Jiangmen Underground Neutrino Observatory (JUNO) is an underground 20 kton liquid scintillator detector being built in the south of China and expected to start data taking in late 2024. The JUNO physics program is focused on exploring neutrino properties by means of electron anti-neutrinos emitted from two nuclear power complexes at a baseline of about 53 km. Targeting an unprecedented...
The LHCOPN network, which links CERN to all the WLCG Tier 1s, and the LHCONE network, which connects WLCG Tier 1s and Tier 2s, have successfully provided the necessary bandwidth for the distribution of the data generated by the LHC experiments during the first two runs of the LHC accelerator. This talk gives an overview of the most remarkable achievements and the current state of the two networks....
NOTED is an intelligent network controller that aims to improve the throughput of large data transfers in FTS (File Transfer Service), the service used to exchange data between WLCG sites, in order to better exploit the available network resources. For a defined set of source and destination endpoints, NOTED retrieves the data from FTS to get the on-going data traffic and uses the...
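The control loop such a system implements can be illustrated with a toy sketch; the function bodies, threshold, and site names below are hypothetical placeholders rather than NOTED's real interfaces.

```python
import random, time

THRESHOLD_GBPS = 80.0          # hypothetical trigger level

def ongoing_traffic_gbps(src, dst):
    # Placeholder for the real query to FTS for on-going transfer volume.
    return random.uniform(0, 120)

def set_dynamic_circuit(src, dst, enable):
    # Placeholder for (de)provisioning extra capacity on the link.
    print(f"{src}->{dst}: dynamic circuit {'ON' if enable else 'OFF'}")

def noted_like_loop(src="site-A", dst="site-B", polls=10):
    boosted = False
    for _ in range(polls):
        gbps = ongoing_traffic_gbps(src, dst)
        if gbps > THRESHOLD_GBPS and not boosted:
            set_dynamic_circuit(src, dst, True); boosted = True
        elif gbps < THRESHOLD_GBPS / 2 and boosted:   # hysteresis on release
            set_dynamic_circuit(src, dst, False); boosted = False
        time.sleep(1)

noted_like_loop()
```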
StoRM WebDAV is a component of StoRM (Storage Resource Manager) which is designed to provide a scalable and efficient solution for managing data storage and access in Grid computing environments. StoRM WebDAV specifically focuses on enabling access to stored data through the WebDAV (Web Distributed Authoring and Versioning) protocol. WebDAV is an extension of the HTTP protocol that allows...
The Scientific Data and Computing Center (SDCC) at Brookhaven National Laboratory provides data storage, transfer, and computing resources to scientific experiments and communities at Brookhaven National Lab and worldwide.
A significant amount of scientific data is stored and retrieved daily from multiple tiered storage systems. My presentation covers the infrastructure of the...
In recent years, different R&D activities have been developed at CERN within the WLCG (World LHC Computing Grid) to exploit the network and provide new capabilities for future applications. An example is NOTED (Network Optimised Transfer of Experimental Data) to dynamically reconfigure network links to increase the effective bandwidth available for FTS-driven transfers by using dynamic circuit...
This talk will give an overview of the second phase of the CERN Quantum Technologies Initiative (QTI2), focusing on the Quantum Communications work package.
On Quantum Communications, CERN will focus on two main activities: 1) Quantum Key Distribution using White Rabbit for time synchronization and 2) very precise time and frequency distribution.
There is a rich body of textual and numerical archives about Taiwan indigenous peoples (TIPs) before 1940. Because detailed statistical archives and numerical data on TIPs were not available in the period 1940-2010, we have very limited knowledge about the developmental trajectory of TIPs. This situation has gradually made TIPs the so-called hard-to-reach population (HRP)...
Secret management stands as an important security service within the EGI Cloud federation. This service encompasses the management of various types of secrets, including tokens and certificates, and their secure delivery to the target cloud environment. Historically, accessing secrets from virtual machines (VMs) has relied on OIDC access tokens, a method that harbors potential security...
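As an illustration of the token-based pattern, and assuming a HashiCorp Vault backend (an assumption; the abstract does not name the implementation), a VM could exchange its OIDC access token for a short-lived secret as sketched below; the URL, role, and secret path are placeholders.

```python
import requests

VAULT = "https://secrets.example.org"     # hypothetical service URL

def read_secret_with_oidc(oidc_token, path="secrets/my-vm/db-password"):
    # Vault's JWT/OIDC auth method: exchange the token for a Vault token.
    r = requests.post(f"{VAULT}/v1/auth/jwt/login",
                      json={"jwt": oidc_token, "role": "vm-reader"})
    r.raise_for_status()
    vault_token = r.json()["auth"]["client_token"]
    # Then read the secret with the short-lived Vault token.
    r = requests.get(f"{VAULT}/v1/{path}",
                     headers={"X-Vault-Token": vault_token})
    r.raise_for_status()
    return r.json()["data"]
```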
The adoption of user-friendly solutions aimed at sharing data as well as software and related configuration files among heterogeneous and distributed resources has become a necessity for the scientific community. By adopting and using software products dedicated to this purpose, it is possible to facilitate the distribution of software, configurations, and files. To this end, the CernVM-File...
Summary: We propose a model to estimate and minimise full life cycle emissions of scientific computing centres based on server embodied carbon, PUE, projected next-generation performance-per-Watt improvements and actual/projected carbon intensity of the location.
In this paper we present a model for the assessment of the replacement cycle of a compute cluster as a function of the carbon...
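A minimal sketch of how such a replacement-cycle trade-off can be evaluated (all parameter values below are illustrative placeholders, not the paper's inputs): embodied carbon is paid at every refresh, operational carbon accrues every year, and newer generations deliver more performance per watt.

```python
def fleet_emissions(horizon=12, cycle=4, embodied_kg=1500.0, power_w=400.0,
                    pue=1.3, ci=250.0, perf_gain=0.15):
    """Total tCO2 and delivered compute (perf-years) over `horizon` years
    when one server slot is refreshed every `cycle` years; each new
    generation improves performance per watt by `perf_gain` per year
    (modelled here as constant power, rising performance)."""
    total_g, compute, perf = 0.0, 0.0, 1.0
    for year in range(horizon):
        if year % cycle == 0:                      # refresh: pay embodied carbon
            total_g += embodied_kg * 1000.0
            perf = (1 + perf_gain) ** year         # performance of new generation
        total_g += power_w / 1000.0 * 8760 * pue * ci   # operational gCO2 this year
        compute += perf
    return total_g / 1e6, compute                  # tonnes CO2, perf-years

for cycle in (3, 4, 6):
    t, c = fleet_emissions(cycle=cycle)
    print(f"cycle={cycle}y: {t:.1f} tCO2 for {c:.1f} perf-years "
          f"-> {1000 * t / c:.0f} kgCO2 per perf-year")
```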
INFN-CNAF is one of the Worldwide LHC Computing Grid (WLCG) Tier-1 data centers, providing support in terms of computing, networking, storage resources and services also to a wide variety of scientific collaborations, ranging from physics to bioinformatics and industrial engineering.
Recently, several collaborations working with our data center have developed computing and data management...
With the increase in the number of large-scale international collaborative experiments supported by the Institute of High Energy Physics (IHEP) of the Chinese Academy of Sciences, IHEP and collaborators face challenges in various application scenarios such as different data volumes, multiple data management needs, and various data authentication requirements. At the same time, new technologies...
X.509 certificates and VOMS proxies are widely used by the scientific community for authentication and authorization (authN/Z) in Grid Storage and Computing Elements. Although this has contributed to improving worldwide scientific collaboration, X.509 authN/Z comes with some downsides: mainly security issues and the extensive customization needed to integrate it with other services.
The GRID...
The 14 beamlines of phase I of the High Energy Photon Source (HEPS) will produce more than 300 PB of raw data per year. Efficiently storing, analyzing, and sharing this huge amount of data presents a significant challenge for HEPS.
The HEPS Computing and Communication System (HEPSCC), also called the HEPS Computing Center, is an essential working group responsible for IT R&D and services for the facility,...
INDIGO Identity and Access Management (IAM) is a comprehensive solution that enables organizations to manage and control access to their resources and systems effectively. It is a Spring Boot application, based on OAuth/OpenID Connect technologies and the MITREid Connect library. INDIGO IAM has been chosen as the AAI solution by the WLCG community and has been used for years by the INFN...
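Because IAM exposes standard OAuth 2.0 endpoints, a service can obtain an access token with, for example, the client-credentials grant; the issuer URL and client credentials in this sketch are placeholders.

```python
import requests

ISSUER = "https://iam.example.org"        # placeholder IAM instance

def get_access_token(client_id, client_secret, scope="openid profile"):
    # Standard OAuth 2.0 client-credentials grant against the token endpoint.
    r = requests.post(f"{ISSUER}/token",
                      data={"grant_type": "client_credentials", "scope": scope},
                      auth=(client_id, client_secret))
    r.raise_for_status()
    return r.json()["access_token"]

token = get_access_token("my-client", "my-secret")
print(token[:16], "...")
```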
Authentication proxy services are becoming increasingly important in existing ID infrastructure linkage. It is necessary to clarify how the service identifies and authenticates end entities and to strictly operate the service. In this paper, we discuss a credential policy and credential practice statement of Orthros, an authentication proxy service that has begun trial...
The Horizon Europe interTwin project is developing a highly generic Digital Twin Engine (DTE) to support interdisciplinary Digital Twins (DT). Comprising 31 high-profile scientific partner institutions, the project brings together infrastructure providers, technology providers and DT use cases from High Energy and AstroParticle Physics, Radio Astronomy, Climate Research and Environmental...
The Secure Shell Protocol (SSH) is the de-facto standard for accessing remote servers on the command line. Use cases include:
- remote system administration for unix administrators
- git via ssh for developers
- rsync via ssh for system backups
- HPC access for scientists.
Unfortunately, there is no globally accepted pattern for federated usage yet. The large variety...
The discovery of physics Beyond the Standard Model (BSM) is a major subject of many experiments, such as the ATLAS and CMS experiments at the Large Hadron Collider, which has the world's highest centre-of-mass energy. Many types of BSM models have been proposed to solve the issues of the Standard Model. Many of them have some or many model parameters, e.g. the Minimal Supersymmetric Standard Model,...
Keywords: cluster computing, account passport, secure shell (SSH), lightweight certificate, remote access, SSH tunnel
Advanced computing infrastructure such as high-performance clusters, supercomputers, and cloud computing platforms offer unparalleled computing capabilities and effectively support a multitude of computing requirements across diverse fields such as scientific research, big...
The computing cluster of the Institute of High Energy Physics has long provided computational services for high energy physics experiments, with a large number of experimental users. With the continuous expansion of the scale of experiments and the increase in the number of users, the queuing situation of the existing cluster is becoming increasingly severe.
To alleviate the shortage of...
The Large Hadron Collider at CERN in Geneva is poised for a transformative upgrade, preparing to enhance both its accelerator and particle detectors. This strategic initiative is driven by the tenfold increase in proton-proton collisions anticipated for the forthcoming high-luminosity phase scheduled to start by 2029. The vital role played by the underlying computational infrastructure, the...
The authentication and authorisation infrastructures (AAIs) for research worldwide have for years now based their architectures on the “AARC Blueprint Architecture” and the suite of accompanying guidelines. Developed by the “Authentication and Authorisation for Research Collaboration” (AARC) community, and fostered by the accompanying “engagement group for infrastructures” (AEGIS), the model...
The Federated Identity Management for Research (FIM4R) community is a forum where research communities convene to establish common requirements, combining their voices to convey a strong message to Federated Identity Management (FIM) stakeholders. FIM4R produced two whitepapers on the combined Authentication and Authorization Infrastructure (AAI) requirements for research communities in 2012...
In the context of the Italian National Recovery and Resilience Plan (NRRP), the High-Performance Computing, Big Data e Quantum Computing Research Centre, created and managed by the ICSC Foundation, has been recently established as one of the five Italian “National Centres” to address designated strategic sectors for the development of the country, including simulations, computing, and...
In response to the global issues surrounding the SDGs and rapid changes in the global and social environments, future generations need to develop skills for building consensus on solutions and strategies through global-level discussions and negotiations for their own benefit. Education for the future bears the mission of raising such citizens. On the other hand, the traditional...
To run workunits effectively, the BOINC server will, for some time, stop sending workunits to volunteer computers that make too many errors. However, during BOINC app development, workunit errors may arise from time to time. Therefore, so that development and debugging are not affected by these errors, a workunit scheduling system was designed for the HEP@home project. With this scheduling system,...
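The gist of such a rule can be sketched as follows; the field names, threshold, and exemption flag are hypothetical, and the real logic would live inside the BOINC server's scheduler.

```python
from dataclasses import dataclass

MAX_ERROR_RATE = 0.1         # illustrative back-off threshold

@dataclass
class Host:
    host_id: int
    errors: int
    results: int
    is_dev_host: bool        # development machines are exempt from back-off

def should_send_workunit(host: Host) -> bool:
    # Normal volunteers are backed off when their error rate is too high,
    # but development hosts keep receiving (possibly failing) test workunits.
    if host.is_dev_host:
        return True
    rate = host.errors / max(host.results, 1)
    return rate <= MAX_ERROR_RATE

print(should_send_workunit(Host(1, errors=5, results=10, is_dev_host=False)))  # False
print(should_send_workunit(Host(2, errors=5, results=10, is_dev_host=True)))   # True
```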
Forest fires in Indonesia, especially in the areas of Sumatra, Kalimantan and Sulawesi, have had many impacts on the environment. These fires are triggered in part by peatlands and the conversion of forests in Indonesia. Climate change also plays a role: in 2023, for example, the El Niño phenomenon had quite a significant impact.
One of the data sources that can be used is data...
The Institute of High Energy Physics data center stores a large amount of experimental data from major scientific instruments, including the Jiangmen Underground Neutrino Observatory (JUNO), which is transmitted and backed up among collaborating data centers. The data center of the Institute of High Energy Physics has upgraded the transmission link of LHCONE from the original 10Gbps to...
This article presents a Container Image Management Service designed for High Energy Physics research institutes. The service utilizes Harbor, CVMFS, and a self-developed image conversion tool, along with other related security components. It meets the specific requirements of high-energy physics image services and focuses on large-capacity and multi-version update management to ensure stable...
Log data, as the information that records the system running status, is an important part of the system. Anomalies occurring during the system running often need to be searched and rectified with the help of logs. With the increasing scale of large-scale scientific facilities and scientific data centers, the log data has exploded, and the difficulty of log anomaly detection has reached an...
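As a baseline illustration of the task (not necessarily the method of this study), log streams can be reduced to per-window event counts and outlying windows flagged, for instance with scikit-learn's IsolationForest; the data below are synthetic.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Each row = one time window, each column = count of one log-event template.
rng = np.random.default_rng(0)
normal = rng.poisson(lam=[20, 5, 1], size=(200, 3))     # typical traffic
anomaly = np.array([[200, 0, 50]])                      # burst of rare events
X = np.vstack([normal, anomaly])

clf = IsolationForest(contamination=0.01, random_state=0).fit(X)
flags = clf.predict(X)                 # -1 = anomalous window, 1 = normal
print("flagged windows:", np.where(flags == -1)[0])
```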
With the advancement of space science and the rapid development of satellite control and navigation technology, observation methods have become increasingly complex, generating larger-scale and more demanding data processing requirements. The observation data must undergo a series of specialized algorithms and processes to generate data products that can meet the diverse...
In the context of the Italian National Recovery and Resilience Plan (NRRP), the High-Performance Computing, Big Data e Quantum Computing Research Centre, created and managed by the ICSC Foundation, has been recently established as one of the five Italian “National Centres” to address designated strategic sectors for the development of the country, including simulations, computing, and...
Large language models, including ChatGPT, ERNIE, and Llama, have exhibited remarkable capabilities and diverse applications in the realm of natural language understanding. These models are frequently utilized for commonplace conversations and standard question resolution. Nevertheless, they fall short in providing high-quality responses to domain-specific inquiries, largely due to the absence...
With the development of supercomputing technology, the demand for computing power and its usage scale are continuously expanding. The construction of the Internet of Supercomputing (IoSC) has become the main development direction of the supercomputing industry in China. As a large-scale supercomputing infrastructure, the IoSC will integrate major domestic supercomputing nodes to...
With the rapid development of information technology, an increasing number of open-source software is widely used across various domains, significantly reducing development costs and enhancing production efficiency. However, as the number of open-source software continues to grow, the software supply chain becomes more complex, and the associated risks are also increasing. Therefore, obtaining...
The Czech WLCG Tier-2 center is hosted in the Computing Center of the Institute of Physics of the Czech Academy of Sciences (FZU) in Prague. Resources at the FZU are supplemented by disk servers at the Institute of Nuclear Physics (NPI) and by compute servers at the Faculty of Mathematics and Physics of Charles University. The available dedicated computing capacity for the supported LHC projects ALICE...
The Dynamic DNS service offered by IISAS plays a pivotal role in providing comprehensive, federation-wide Dynamic DNS support for virtual machines within the EGI Cloud infrastructure. This service allows users to register their preferred host names within designated domains (e.g., my-server.vo.fedcloud.eu) and associate them with public IP addresses of their servers.
The Dynamic DNS service...
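As an illustration, a VM could keep its registered name pointing at its current address with a DynDNS-style update request; the endpoint, credential scheme, and response format in this sketch are assumptions, not the service's documented API.

```python
import requests

# Hypothetical DynDNS-v2-style update endpoint and per-host credentials.
UPDATE_URL = "https://dyndns.example.org/nic/update"

def update_record(hostname, ip, user, password):
    r = requests.get(UPDATE_URL,
                     params={"hostname": hostname, "myip": ip},
                     auth=(user, password))
    r.raise_for_status()
    return r.text            # e.g. "good <ip>" on success in DynDNS v2

print(update_record("my-server.vo.fedcloud.eu", "203.0.113.7",
                    "my-server.vo.fedcloud.eu", "host-secret"))
```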
With the increasing digitization of energy infrastructure, the vulnerability of critical systems to cyber threats has become a paramount concern. This work explores the application of Capability Hardware Enhanced RISC Instructions (CHERI) architecture to fortify the security posture of Smart Grid systems. CHERI, an extension of the RISC-V instruction set architecture, provides a novel approach...
HEPS, the High Energy Photon Source, is one of the key national major scientific and technological infrastructures undertaken by the Institute of High Energy Physics (IHEP) of the Chinese Academy of Sciences during the 13th Five-Year Plan period. It stores an electron beam at an energy of 6 GeV, and the first phase of construction includes 14 user beamlines, providing high-energy, high-brightness, and...
The IRIS IAM serves the UK IRIS (eInfrastructure for Research and Innovation at STFC) Community. IRIS is a collaboration developed by STFC and partner infrastructure providers in order to integrate and augment the provision of computing capabilities made available to STFC’s Science Activities, the national facilities such as ISIS and CLF, and partners such as the Diamond Light Source and the...
LTER-LIFE is a large-scale research infrastructure in the making; it aims to provide a state-of-the-art e-infrastructure to study and predict how changes in climate and other human-induced pressures affect ecosystems and biodiversity. One of the grand challenges in ecology is to understand and predict how ecosystems are impacted by changes in environmental conditions, and external pressures....
Due to the rapid development of edge devices and existing infrastructure in the post-5G era, a significant flood of large and diverse data is streaming into cloud infrastructure via services on edge devices. Consequently, many cloud infrastructure providers must develop methods for efficient resource allocation and scheduling to support service deployment with high availability and...