ATLAS Slides
Report number | ATL-SOFT-SLIDE-2018-398 |
Title | ATLAS utilisation of the Czech national HPC center |
Author(s) | Svatos, Michal (Academy of Sciences of the Czech Republic, Institute of Physics) ; Chudoba, Jiri (Academy of Sciences of the Czech Republic, Institute of Physics) ; Vokac, Petr (Czech Technical University in Prague) |
Corporate author(s) | The ATLAS collaboration |
Collaboration | ATLAS Collaboration |
Submitted to | 23rd International Conference on Computing in High Energy and Nuclear Physics, CHEP 2018, Sofia, Bulgaria, 9 - 13 Jul 2018 |
Submitted by | [email protected] on 22 Jun 2018 |
Subject category | Particle Physics - Experiment |
Accelerator/Facility, Experiment | CERN LHC ; ATLAS |
Abstract | The Czech national HPC center IT4Innovations, located in Ostrava, provides two HPC systems, Anselm and Salomon. Salomon has ranked among the hundred most powerful supercomputers in the world since its commissioning in 2015. Both clusters were tested for use by the ATLAS experiment for running simulation jobs. Several thousand core hours were allocated to the project for tests, but the main aim is to use free resources that are idle while waiting for large parallel jobs of other users. Multiple strategies for ATLAS job execution were tested on the Salomon and Anselm HPCs. The solution described herein builds on ATLAS experience with other HPC sites. An ARC Compute Element (ARC-CE) installed at the grid site in Prague is used for job submission to Salomon. The ATLAS production system submits jobs to the ARC-CE via the ARC Control Tower (aCT). The ARC-CE processes job requirements from aCT and creates a script for the batch system, which is then executed via SSH. SSHFS is used to share scripts and input files between the site and the HPC cluster. The software used to run the jobs is rsynced daily from the site's CVMFS installation to the HPC's scratch space to ensure that recent software is available. |
Related document | Conference Paper ATL-SOFT-PROC-2018-033 |
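The file-sharing and software-synchronization steps described in the abstract (SSHFS between the site and the HPC, plus a daily rsync of the CVMFS software to scratch space) can be sketched as a small shell script. This is a minimal illustration only: the hostnames, mount points, and repository paths below are hypothetical placeholders, not the actual Prague/Salomon configuration.

```shell
#!/bin/sh
# Sketch of the site-to-HPC setup from the abstract.
# All hosts and paths are hypothetical examples.

# 1) Mount the HPC scratch space at the grid site via SSHFS, so the
#    ARC-CE can place job scripts and input files there (example only):
#    sshfs user@hpc-login.example.org:/scratch/atlas /mnt/hpc-scratch

# 2) Daily software sync: copy the ATLAS software tree from the site's
#    CVMFS installation to the HPC scratch space, e.g. from a cron job.
sync_software() {
    src="$1"   # site CVMFS repository, e.g. /cvmfs/atlas.cern.ch/repo
    dst="$2"   # HPC scratch area, e.g. /mnt/hpc-scratch/sw
    mkdir -p "$dst"
    # -a preserves permissions and timestamps; --delete removes files
    # no longer present in the source, keeping the mirror consistent.
    rsync -a --delete "$src/" "$dst/"
}
```

A cron entry such as `0 3 * * * sync_software /cvmfs/atlas.cern.ch/repo /mnt/hpc-scratch/sw` (again, paths illustrative) would implement the daily refresh; the `--delete` flag is the key design choice, since it keeps the scratch mirror from accumulating stale releases.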