Report number
| FERMILAB-CONF-15-604-CD |
Title
| Pushing HTCondor and glideinWMS to 200K+ Jobs in a Global Pool for CMS before Run 2 |
Author(s)
|
Balcas, J (Vilnius U.) ; Belforte, S (INFN, Trieste ; Trieste U.) ; Bockelman, B (Nebraska U.) ; Gutsche, O (Fermilab) ; Khan, F (Quaid-i-Azam U.) ; Larson, K (Fermilab) ; Letts, J (UC, San Diego) ; Mascheroni, M (INFN, Milan Bicocca ; Milan Bicocca U.) ; Mason, D (Fermilab) ; McCrea, A (UC, San Diego) ; Saiz-Santos, M (UC, San Diego) ; Sfiligoi, I (UC, San Diego) |
Publication
| 2015 |
Number of pages
| 7 |
In:
| J. Phys.: Conf. Ser. 664 (2015) 062030 |
In:
| 21st International Conference on Computing in High Energy and Nuclear Physics, Okinawa, Japan, 13 - 17 Apr 2015, pp.062030 |
DOI
| 10.1088/1742-6596/664/6/062030 |
Subject category
| Computing and Computers |
Accelerator/Facility, Experiment
| CERN LHC ; CMS |
Abstract
| The CMS experiment at the LHC relies on HTCondor and glideinWMS as its primary batch system and pilot-based Grid provisioning system. So far we have been running several independent resource pools, but we are working on unifying them all to reduce the operational load and to share resources more effectively between the various activities in CMS. The major challenge of this unification activity is scale. The combined pool size is expected to reach 200K job slots, which is significantly bigger than any other multi-user HTCondor-based system currently in production. To get there we have studied scaling limitations in our existing pools, the biggest of which tops out at about 70K slots, providing valuable feedback to the development communities, who have responded by delivering improvements which have helped us reach higher and higher scales with more stability. We have also worked on improving the organization and support model for this critical service during Run 2 of the LHC. This contribution presents the results of the scale testing and experiences from the first months of running the Global Pool. |
Copyright/License
| publication: © 2015-2025 The Author(s) (License: CC-BY-3.0) |