Risk Management is a tool for organising your limited resources to provide efficient operational security services to your organisation.
In this session we will first give an introduction to Risk Management, and discuss different methods which could be applied. After discussing the concept of Risk Management we will then dive into its components, define risks for an example organisation and...
HADDOCK2.4 antibody-antigen docking tutorial: This tutorial demonstrates the use of HADDOCK2.4 for predicting the structure of an antibody-antigen complex using information about the hypervariable loops of the antibody and either the entire surface of the antigen or a loose definition of the epitope. This tutorial does not require any Linux expertise and only makes use of our web servers and...
The Security Service Challenge SSC-2023-03 is looking into the use of CMSes on the distributed compute infrastructure. In particular, we look into the joining elements of the multiple security teams involved. This particular exercise goes well beyond EGI's logical borders, which over time have become more fuzzy with the advent of new technologies and additional resources available to our user community. In...
HADDOCK3 antibody-antigen docking: This tutorial demonstrates the use of HADDOCK3 for predicting the structure of an antibody-antigen complex using information about the hypervariable loops of the antibody and either the entire surface of the antigen or a loose definition of the epitope. It illustrates the modularity of HADDOCK3 by introducing a new workflow not possible under the current...
The Distance Geometry Problem (DGP) asks whether a simple weighted undirected graph G=(V,E,d) can be realized in the K-dimensional Euclidean space so that the distance constraints implied by the weights on the graph edges are satisfied. This problem was proven to be NP-hard in the context of graph embeddability, and has several applications. In this talk, we will focus on various currently...
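To make the constraint-satisfaction formulation concrete, here is a minimal Python sketch (with illustrative names and data) of the easy direction of the DGP: verifying that a candidate realization in K-dimensional Euclidean space satisfies the edge distances. Finding such a realization, rather than checking one, is the NP-hard part.

```python
import math

def satisfies_distances(coords, edges, tol=1e-6):
    """Check whether a candidate K-dimensional realization satisfies
    the distance constraints of a weighted graph G = (V, E, d).

    coords: dict mapping vertex -> tuple of K coordinates
    edges:  dict mapping (u, v)  -> required distance d(u, v)
    """
    for (u, v), d in edges.items():
        actual = math.dist(coords[u], coords[v])  # Euclidean distance
        if abs(actual - d) > tol:
            return False
    return True

# A unit right triangle realized in K = 2 dimensions.
coords = {"a": (0.0, 0.0), "b": (1.0, 0.0), "c": (0.0, 1.0)}
edges = {("a", "b"): 1.0, ("a", "c"): 1.0, ("b", "c"): math.sqrt(2)}
print(satisfies_distances(coords, edges))  # True
```

The verifier runs in O(|E|·K) time, which is exactly why the decision problem sits in NP.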
In the current research and education environment, the threat from cybersecurity attacks is acute, having grown in recent years. We must collaborate as a community to defend and protect ourselves. This requires both detailed, timely and accurate threat intelligence and fine-grained monitoring.
In this session we explore aspects both of sharing appropriate intelligence and the...
Protein-protein and protein-ligand interactions are central to biological mechanisms. These interactions can be classified into thermodynamic and mechanistic pathways. Estimating accurate and reliable interaction energetics along the thermodynamic pathway is one of the ongoing challenges in computational biophysics. Umbrella sampling simulation-based potential of mean force calculations is...
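As a reminder of the quantity being computed (standard textbook expressions, not results specific to this work): the potential of mean force along a reaction coordinate $\xi$ follows from its probability distribution, and umbrella sampling adds a harmonic bias in each window $i$ to sample otherwise rarely visited regions:

```latex
W(\xi) = -k_B T \,\ln P(\xi) + C,
\qquad
w_i(\xi) = \frac{k_i}{2}\,\bigl(\xi - \xi_i\bigr)^2
```

The biased window distributions are then typically recombined, e.g. with the weighted histogram analysis method (WHAM), to recover the unbiased $W(\xi)$.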
It has been estimated, based on graph theory, that there are at least 10^60 organic molecules relevant for small-molecule drug discovery. Using machine learning to estimate binding free energies when screening large chemical libraries for tightly binding inhibitors would take a considerable amount of computational resources, yet it is not possible to explore the...
OPERAS is the Research Infrastructure for open scholarly communication in the field of Social Sciences and Humanities. This keynote will focus mainly on introducing 3 different acceleration mechanisms: 1. Discovery Acceleration by presenting the GoTriple platform, 2. Dissemination Acceleration of publications by presenting the DIAMOND action, 3. Quality Acceleration by presenting the OAeBu...
In accelerating time to market, time to service and time to science through computing, the challenge we encounter today may not lie in the speed of data processing. Streamlining huge amounts of data without blocking and queuing is crucial in AI/HPC infrastructure design.
In this speech, a case study about resource balancing in an AI-driven inspection and decision system from a world top...
Medical devices will almost always be driven by software components. Development for this field of work requires special considerations for patient safety and data privacy and is thus governed by rules alien to other deployment scenarios. The European Union is about to switch to a new regulation framework, the Medical Device Regulations (MDR), replacing the far less comprehensive Medical...
Tosh discusses the educational paradigm for New Education Normal. Topics such as “What are the future skills?” and “Ambiance for Authentic Learning: How to implement such skills in authentic learning?” are elaborated.
This workshop aims at identifying the essential issues involved in the fundamental components of authentic education, especially in the realm of authentic learning, in the New Education Normal and then demonstrates some experimental showcases at various tiers.
In this part of the session, innovative new educational practices are showcased. We intend to share some successful educational...
The use of big data in the field of omics and biomedical studies is the enabling factor for finding new insights with sufficient statistical confidence. When dealing with such data, several issues have to be addressed, related to the personal identifiable information (PII) often included in datasets and subject to the European General Data Protection Regulation (GDPR), which imposes particular...
The Scenario-Issue-Resolution (SIR) instructional model is introduced to nurture students' abilities to tackle complex problems grounded in scenario-based issues. It aims to spark students' foresight. SIR originates from issue-based inquiry and is grounded in socioscientific issues (SSI), which lie in resolving open and ill-structured real-world problems that are controversial with...
Dr. Takemata's team developed a STEAM curriculum to nurture K-12 students' computational thinking skills in the surrounding living environment. With the concepts of SDGs, the purpose of the learning is to think seriously about the future of the society where they will live as full-fledged members of the society. The proposed hands-on and heads-on workshop enhances active learning in PBL to...
In these pandemic times our group has coordinated large national and international consortia to understand, through cryo-electron microscopy (cryo-EM), both key issues in SARS-CoV-2 spike dynamics (Melero et al., IUCrJ, 2020) and specific properties of mutations that were prevalent in Europe at certain periods (Ginex et al., PLoS Pathog., 2022), as part of our work in the European Research...
English Writing classes were conducted between Kansai University and Chihlee University for several semesters in the virtual classroom, where all students worked in teams to interact and acquire writing skills in English. Based on their interests and curiosities, the students explored their societies and compared their cultural values and lifestyles. Through writing activities, they learned how...
From a Public Health perspective, the once-in-a-century SARS-CoV-2 (COVID-19) pandemic created unprecedented challenges for Higher Education Institutions (HEIs). HEIs in the USA had to respond rapidly to switch from a mostly in-person mode of instruction to a fully remote mode. These changes had to be immediate, to provide education with little to no impact on the health and safety of...
The mission of the “WISE” community is to enhance best practice in information security for IT infrastructures for research. WISE fosters a collaborative community of security experts and builds trust between those IT infrastructures. Through membership of working groups and attendance at workshops these experts participate in the joint development of policy frameworks, guidelines, and...
Collaborative Online International Learning (COIL) courses have been conducted jointly between the Department of Business and Management at Nanyang Polytechnic in Singapore and Kansai University/Kansai University of International Studies in Japan. Student learning and interaction are all conducted asynchronously in the virtual classroom. The realm of learning is to acquire business...
With the application of the model of design thinking, Dr. Nakazawa incorporates features of e-portfolios into reflective learning for authentic assessment. The self-awareness of the values from learning and the process of meta-cognitive learning activities are the keys to authentic assessment in New Education Normal.
Security operations center (SOC) frameworks standardize how SOCs approach their defense strategies. They help manage and minimize cybersecurity risks and continuously improve operations. However, most current SOC frameworks are designed in a centralized mode that serves a single organization. These frameworks can hardly satisfy the security operations scenarios that must...
Data security issues and legal and ethical requirements on the storage, handling and analysis of genetic and medical data are becoming increasingly stringent. Some regulatory obstacles may represent a barrier to data sharing and the application of Open Science and Open Access principles. In this perspective, Task 6.6 of the EOSC-Pillar (https://www.eosc-pillar.eu/) project aimed to analyze the...
Before the Pandemic, university students and corporate staff of the HR departments from IBM, ANA, and Fuji Xerox, among others, gathered in Tokyo once a year to have a communication and negotiation workshop to enhance their negotiation skills. The purpose is to develop and enrich their human resource skills through the interaction of multi-tiered age groups. During the Pandemic, such...
We must protect and defend our environment against cybersecurity threats to the research and education community, which have grown acute in recent years. In the face of determined and well-resourced attackers, we must actively collaborate in this effort across HEP and more broadly across Research and Education (R&E).
Parallel efforts are necessary to appropriately respond to...
Fugaku, one of the world's first 'exascale' supercomputers, has since the beginning of production been one of the most important R&D infrastructures for Japan, especially in producing groundbreaking results to realize Japan's Society 5.0. Such results have been obtained thanks to the immense power and versatility of Fugaku, allowing complex modern workloads involving not only classical physics...
In the Humanities and Social Sciences, Big Data and the technologies of the Digital Humanities have helped to substantiate academic work. For the portability of data, however, attributes and values require definitions to preserve their intended meaning through time and space. The standard practice of researchers relying on ontologies, even for the study of cultures and societies, however, is...
The increasingly pervasive and dominant role of machine learning (ML) and deep learning (DL) techniques in High Energy Physics is posing challenging requirements to effective computing infrastructures on which AI workflows are executed, as well as demanding requests in terms of training and upskilling new users and/or future developers of such technologies.
In particular, a growth in the...
Science is constantly encountering parametric optimization problems whose computer-aided solutions require enormous resources. At the same time, there is a trend towards developing increasingly powerful computer clusters. Geneva is currently one of the best available frameworks for distributed optimization of large-scale problems with highly nonlinear quality surfaces. It is a great tool to be...
Influencers Matchmaking for Startups B2B-Branding interactivity: Case Studies of LinkedIn Data Mining
Marketing in the fourth industrial revolution offers a global opportunity for the rise of micro-influencers. Influencer marketing allows any social media user with particular influence within the social media platform to help amplify and circulate the campaign of the brand to the...
Approaching the exascale era, complex challenges arise within the existing high performance computing (HPC) frameworks. On one side are highly optimized, heterogeneous hardware systems; on the other, HPC-inexperienced scientists with a continuously increasing demand for compute and data capacity. Bringing both together would enable a broad range of scientific domains to enhance the...
Spatial reasoning is the ability to think about objects in two and three dimensions. Spatial reasoning skills are critical in science, art, and math and can be improved with practice. This research’s main objective is to explore how virtual reality (VR) immersive experiences can enhance spatial reasoning capability. Past research revealed the vast differences between traditional user...
High Energy Photon Source (HEPS) will generate massive experimental data for diversified scientific analysis. The traditional approach, in which users download data and analyse it in a local computing environment, cannot meet the growing demands of the experiments. This paper proposes a virtual cloud desktop system for HEPS based on OpenStack, which is used for imaging and crystal scattering experiments in...
The prediction of the quaternary structure of biomolecular macromolecules is of paramount importance for fundamental understanding of cellular processes and drug design. In the era of integrative structural biology, one way of increasing the accuracy of modelling methods used to predict the structure of biomolecular complexes is to include as much experimental or predictive information as...
In this presentation, we divide a general EEG study into several steps and situate this study within EEG research through the issues involved in each step.
After that, we outline the feature selection method used in this study, which emphasizes the minority class in imbalanced datasets, and present the results on the imbalanced EEG and Emotion datasets and...
3D Bin Packing is the problem of finding the best way to pack several cargos into a container in order to maximize the container density. Moreover, some variants add constraints such as weight, stackability, fragility, and orientation of cargo pieces. Since the 3D Bin Packing problem is known to be NP-hard, an exact solution is hard to obtain in a reasonable time. Therefore, various...
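As a hedged illustration of the heuristic approach such work typically builds on, the sketch below applies first-fit decreasing to item volumes only, deliberately ignoring the geometric placement, weight, stackability and orientation constraints that make the full 3D problem hard; all names and numbers are illustrative.

```python
def first_fit_decreasing(volumes, capacity):
    """Toy first-fit-decreasing heuristic: assign item volumes to bins
    of fixed capacity, sorting items from largest to smallest and
    placing each in the first bin with room. Returns a list of bins
    (each a list of the volumes packed into it)."""
    bins = []
    for v in sorted(volumes, reverse=True):
        for b in bins:
            if sum(b) + v <= capacity:
                b.append(v)   # fits in an existing bin
                break
        else:
            bins.append([v])  # open a new bin
    return bins

# Six items packed into bins of capacity 10.
print(len(first_fit_decreasing([6, 5, 4, 3, 2, 1], capacity=10)))  # 3
```

Even this simplified relaxation is useful in practice as a quick lower-bound check before running a full geometric packer or metaheuristic.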
In recent years cloud computing has opened up interesting opportunities in many fields of scientific research. Cloud technologies allow applications to scale and adapt quickly and ease the adoption of new software development methods (e.g. DevOps), accelerating time to value.
However, the lack of integration of the existing infrastructures and the consequent fragmentation of the resources are...
River and lake water is a major resource for drinking water, food production and various industrial and agricultural purposes, and it hosts or feeds many sensitive ecosystems. Therefore, assuring the absence of potentially harmful chemicals is a vital issue for environmental and economic sustainability. Tens of thousands of different chemicals are present in fluctuating amounts in surface water...
Data centers house IT and physical infrastructures to support researchers in transmitting, processing and exchanging data and provide resources and services with a high level of reliability. Through the usage of infrastructure monitoring platforms, it is possible to access data that provide data center status, e.g. related to services that run on the machines or to the hardware itself, to...
Geant4 is an open-source software toolkit that has been in development since 1994 and is used to simulate the interactions between particles and matter. It is widely used in a variety of fields, including high energy physics, nuclear physics, accelerator physics, medical physics, and space science. The first paper on Geant4, published in Nuclear Instruments and Methods in Physics Research A in...
Building successful multi-national collaborations is challenging. The scientific communities in a range of physical sciences have been learning how to build collaborations that build upon regional capabilities and interests over decades, iteratively with each new generation of large scientific facilities required to advance their scientific knowledge. Much of this effort has naturally focused...
Enabling Communities - Building trust for research and collaboration
When exploring the world of Federated Identity, research communities can reap considerable benefit from using common best practices and adopting interoperable ways of working. EnCo, the Enabling Communities task of the GÉANT 4-3 and GÉANT 5-1 Trust and Identity Work Package, provides the link between those seeking to...
The Czech WLCG Tier-2 center for LHC experiments ATLAS and ALICE provides computing and storage services also for several other Virtual Organizations from high energy and astroparticle physics. The center deployed Disk Pool Manager (DPM) for almost all (only ALICE VO uses xrootd servers) supported VOs as a solution for storage until recently. The local capacity was extended by a separate...
GakuNin, an identity and access management federation in Japan, has so far provided a stable trust framework to academia in Japan. For common services used by all constituent members of a university or institution, such as e-journal services, the framework has worked well. There are many research communities: data science, material science, high energy physics, and research projects using high...
If new physics does exist at the scales investigated by the Large Hadron Collider (LHC) at CERN, it is more elusive than expected.
Finding interesting results may be challenging using conventional methods, usually based on model-dependent hypothesis testing, without substantially increasing the number of analyses.
Thus, standard signal-driven search strategies could fail in reaching new...
The Italian WLCG Tier-1 located in Bologna and managed by INFN CNAF provides computing and storage resources to several research communities in the fields of High-Energy Physics, Astroparticle Physics, Gravitational Waves, Nuclear Physics and others. Among them, the Jiangmen Underground Neutrino Observatory (JUNO), devoted to the construction and operation of a neutrino detector located...
High Energy Physics analysis workflows commonly used at LHC experiments do not scale to the data volumes expected from the HL-LHC. A rich program of research and development is ongoing to address this challenge, proposing new tools and techniques for user-friendly analysis. The IRIS-HEP Analysis Grand Challenge (AGC) provides an environment for prototyping, studying and improving workflows in...
This presentation reports on a series of exercises that checked the steps of the vetting process to gain VO membership for Check-in users. EGI Check-in accepts a range of identity providers at different trust levels, ranging from social media accounts, where the identity provider can only guarantee that someone was in control of a mobile phone number or an email address.
OIDC (OpenID Connect) is widely used for transforming our digital infrastructures (e-Infrastructures, HPC, Storage, Cloud, ...) into the token-based world. OIDC is an authentication protocol that allows users to be authenticated with an external, trusted identity provider. Although typically meant for web-based applications, there is an increasing need for integrating shell-based...
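For shell-based (browserless) clients, the usual pattern is the OAuth 2.0 Device Authorization Grant (RFC 8628). The sketch below only constructs the two HTTP requests such a client would send; the issuer URL and client ID are illustrative placeholders, and a real client would actually POST these and poll the token endpoint until the user approves in a browser.

```python
from urllib.parse import urlencode

def device_flow_requests(issuer, client_id, scope="openid profile"):
    """Sketch of the OAuth 2.0 Device Authorization Grant (RFC 8628),
    commonly used to log shell sessions in to an OIDC provider.
    Returns the two requests a client would POST, as (url, body) pairs."""
    # Step 1: ask the provider for a device code and a user verification URL.
    start = (f"{issuer}/device_authorization",
             urlencode({"client_id": client_id, "scope": scope}))
    # Step 2: after the user approves in a browser, poll the token endpoint.
    poll = (f"{issuer}/token",
            urlencode({"client_id": client_id,
                       "grant_type": "urn:ietf:params:oauth:grant-type:device_code",
                       "device_code": "<device_code from the first response>"}))
    return start, poll

start, poll = device_flow_requests("https://idp.example.org", "my-shell-client")
print(start[0])  # https://idp.example.org/device_authorization
```

The endpoint paths shown are conventional; a real client should discover them from the issuer's `/.well-known/openid-configuration` metadata rather than hard-coding them.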
InterTwin is an EU-funded project that started on the 1st of September 2022.
The project will work with domain experts from different scientific domains in building a technology to support the emerging field of digital twins.
Digital twins are modelled for predicting the behaviour and evolution of real-world systems and applications.
InterTwin will focus on employing machine-learning...
The DIRAC Interware is a framework for building distributed computing systems that allows various kinds of computing and storage resources to be integrated in a way that is transparent from the user’s perspective. Up until recently, the client communications with DIRAC were based on a custom protocol using X.509 PKI certificates. Following the recent move towards OIDC/OAuth2-based security...
The Square Kilometre Array (SKA) telescope’s computing platform is being developed through an agile process, with teams from across the SKA Regional Centres (SRCs) developing the SRCNetwork (SRCNet) infrastructure the SKA will need.
One such area of development is the SRCNet’s Authentication and Authorisation Infrastructure (AAI), which is currently led by an agile team, Purple Team, with...
A distributed grid data management system serving the BES, JUNO and CEPC experiments has been built at IHEP since 2014, based on the DIRAC File Catalog. Meanwhile, more experiments, such as HERD, or JUNO with its different data scales and complicated data management demands in data production, have prompted us to attempt to develop more flexible, experiment-scenario-oriented grid data...
The Institute of High Energy Physics, Chinese Academy of Sciences is a comprehensive research base engaged in high energy physics, advanced accelerator physics and technology, advanced ray technology and its application, and has built a series of large-scale scientific facilities in China, such as Beijing Electron Positron Collider (BEPC), China Spallation Neutron Source (CSNS), High Energy...
The Coffea-casa analysis facility prototype provides physicists with alternative mechanisms to access computing resources and explore new programming paradigms. Instead of the traditional command-line interface and asynchronous batch access, a notebook-based web interface and interactive large-scale computing is provided. The facility commissions an environment for end-users enabling...
Among the five central senses we use to perceive the world around us, nothing is more salient than our sense of hearing. Sounds play a very important role in how we understand, behave in and interact with the world around us. One can close their eyes, but never their ears. In this research study, we propose the design and development of a GIS-based maps application that would allow users to not only...
The Institute of High Energy Physics of the Chinese Academy of Sciences is a comprehensive research base in China engaged in high-energy physics research, advanced accelerator physics and technology research, development and utilization, and advanced ray technology and application.
The single sign-on (SSO) system of the institute has more than 22,000 users, the calculation...
CAGLIARI 2020 is a 25 million euro project funded within the framework of the National Operational Program for Research and Competitiveness – Smart Cities & Communities of the Italian Ministry of Education, University and Research.
The project started in 2017 and ended in 2022 developing a pilot system for monitoring traffic and air quality providing innovative and environmentally friendly...
The Worldwide Large Hadron Collider Computing Grid (WLCG) actively pursues the migration from the protocol IPv4 to IPv6. For this purpose, the HEPiX-IPv6 working group was founded during the fall HEPiX Conference in 2010. One of the first goals was to categorize the applications running in the WLCG into different groups: the first group was easy to define, because it comprised all...
Electromagnetic processes of charged-particle interactions with oriented crystals provide a wide variety of innovative applications in high-energy frontier physics, accelerator physics, detector physics, nuclear physics and radiation therapy. A small piece of crystal material could be used as
- an intense source of X- and gamma-ray radiation for nuclear physics and cancer treatment,
- a...
The transition of WLCG storage services to dual-stack IPv4/IPv6 is nearing completion after more than 5 years, thus enabling the use of IPv6-only CPU resources as agreed by the WLCG Management Board and presented by us at earlier ISGC conferences. Much of the data is transferred by the LHC experiments over IPv6. All Tier-1 storage and over 90% of Tier-2 storage is now IPv6-enabled, yet we...
This study uses numerical methods to explore the causes of meteotsunamis in the Atlantic Ocean, Caribbean Sea, and the Mediterranean Sea after the eruption of the Tonga Volcano on January 15, 2022. Topics are focused on the role of the Proudman resonant effect on the tsunami induced by the Lamb waves. The linear and nonlinear shallow water equation, fully-nonlinear and weakly-dispersive...
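For context, the standard theory behind the resonance this study examines (textbook background, not a result of the study): Proudman resonance occurs when the translation speed $U$ of the atmospheric pressure disturbance matches the long-wave phase speed, so the forcing continuously pumps energy into the ocean wave:

```latex
U \approx c = \sqrt{g h}
\quad\Longrightarrow\quad
h_{\mathrm{res}} = \frac{U^{2}}{g}
```

For a Lamb wave travelling at roughly $U \approx 310\ \mathrm{m/s}$, the resonant depth is $h_{\mathrm{res}} \approx 310^{2}/9.81 \approx 9.8\ \mathrm{km}$, i.e. only the deepest ocean basins come close, which is one reason the response differs so strongly between the Atlantic, Caribbean and Mediterranean.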
The European Open Science Cloud (EOSC) is a key framework through which the EC is fostering the collaboration and interoperability of scientific research across domains to make services and data easily accessible to a broader audience and benefit from synergies. The EC is establishing the EOSC by funding a series of projects, supporting either individual science domains to get up to...
As network techniques continue to flourish, current network attacks against large-scale scientific facilities and science data centers show an increasingly sophisticated trend. In order to evade traditional security detection systems, attackers adopt stealthier attack methods. The Domain Name System (DNS) protocol is one of the basic protocols used in the network environment of large-scale...
Forest fires in Indonesia started in 1998 and reached their peak in 2015, a year in which almost half of the world was affected by forest fires. One of the technologies used by NOAA is VIIRS (the Visible Infrared Imaging Radiometer Suite), which began operating in 2011. Most forest fires in Indonesia are set by humans whose interest is to expand land, especially...
Greenhouse gas (GHG) emissions have been recognized as accelerators of the Global climate change phenomenon and several human activities take part in it. In particular, the contribution of the computing sector is significant and deemed to grow. While on one side unprecedented discoveries have been obtained thanks to the increasing computational power available, on the other the heavy reliance...
Over the past year, the soaring cost of electricity in many parts of the world has brought the power-requirements of computing infrastructure sharply into focus, building on the existing environmental concerns around the issue of global warming. We report here on the investigations, and subsequent actions, in the UK to respond to this pressure. The issues we address are both the overall...
The Italian WLCG Tier-1 located in Bologna and managed by INFN, provides computing resources to several research communities in the fields of High-Energy Physics, Astroparticle Physics, Gravitational Waves, Nuclear Physics and others. The facility is hosted at CNAF. Although the LHC experiments at CERN represent the main users of the Tier-1 resources, an increasing number of communities and...
Use of ‘anycasting’ internet addresses (‘IP anycast’) in load balancing and high availability, and for traffic engineering reasons, is a widely deployed technique for content delivery networks to lower latency for access to frequently accessed content such as web pages and video. Using the properties of the Border Gateway Protocol (BGP) as a variable-length path-vector protocol for routing...
The dark sector, which includes dark matter and the dark photon, is one of the most important and challenging research subjects in particle physics. Dark matter is known to play a crucial role in the formation of cosmic structure, including stars, galaxies, and clusters of galaxies. However, we do not know the identity of dark matter to date. In order to find the identity of dark matter, various searches...
We introduce the results of a heavy-ion beam simulation using Geant4 version 11.0.2. There has been relatively little focus on accelerator-based studies of secondary particles in nuclear physics, biophysics, and dark matter research. In this study, we have compared our simulation results with experimental data on collisions between a liquid hydrogen target and uranium beams, which take the...
Internet of Things (IoT) platforms are widely deployed in both scientific and social e-infrastructures. Security is one of the crucial issues in assuring secure data collection and analysis on these platforms. For example, a vulnerability in the software running on an IoT device may give attackers the opportunity to insert malicious processes into the software (or code), enabling data leakage and...
Effective Manifold Ranking (EMR) is a technique widely used in Content-Based Image Retrieval (CBIR) to rank the images in a database by measuring and ranking the similarity between each image and a given query image, where the images are represented by different features.
In this paper, a combination of low-level and deep features is proposed to create an EMR graph....
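As a minimal illustration of the retrieval-ranking step (this is not the EMR graph construction itself, and all feature vectors here are made up), similarity-based ranking against a query image can be sketched as:

```python
import math

def rank_by_similarity(query, database):
    """Score each database feature vector against the query by cosine
    similarity and return indices sorted from most to least similar."""
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb)
    scores = [cos(query, f) for f in database]
    return sorted(range(len(database)), key=lambda i: -scores[i])

# Query vector vs. three toy image feature vectors.
print(rank_by_similarity([1.0, 0.0], [[0.0, 1.0], [1.0, 0.1], [0.5, 0.5]]))
# [1, 2, 0]
```

EMR improves on this naive pairwise scoring by propagating similarity over a graph of anchor points, which captures the manifold structure of the feature space.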
Based on a Kubernetes (k8s) cluster, the HEPS computing platform creates a container computing environment to provide analysis services for users. The platform provides a container data-analysis environment based on JupyterLab, with the JupyterHub web page as the entry point. The platform uses CVMFS to store the software library, and the container environment accesses CVMFS via CSI. Lustre is...
After the retirement of the Globus Toolkit, HTCondor has been widely adopted at WLCG sites. The reason for the popularity of HTCondor as a batch system could be its mature software functionality and its strong development and support team, which is getting better day by day. It is open source and fulfils most of the requirements of users or groups. Integration with other batch systems (i.e....
The paper presents an effective method to learn the distance measure between images based on two processing phases: filtering highly distinct images against the query image and combining the feature sets of the images. Our proposed algorithm calculates a non-linear distance measure between images using the fuzzy integral and based on relevance feedback. The experiments have demonstrated that the...
In this paper, we share how students and staff from a Singapore tertiary institution acquired and reinforced their learning of Cloud Computing & Grid technologies through participation in applied projects on real-life scenarios in Healthcare, Mobile, etc. The Centre for IT Innovation (CITI) at the School of Information Technology (SIT), Nanyang Polytechnic, is a key platform for staff and students...
Recent advances of X-ray beamline technologies, including the high brilliance beamlines at next-generation synchrotron sources and advanced detector instrumentation, have led to an exponential increase in the speed of data collection. As a consequence, there is an increasing need for a data analysis platform that can refine and optimize data collection strategies online and effectively analyze...