002159063 001__ 2159063
002159063 003__ SzGeCERN
002159063 005__ 20170731221021.0
002159063 0247_ $$2DOI$$9IOP$$a10.1088/1742-6596/608/1/012040
002159063 0248_ $$aoai:inspirehep.net:1372988$$pcerncds:CERN:FULLTEXT$$pcerncds:FULLTEXT$$pcerncds:CERN$$qINSPIRE:HEP$$qForCDS
002159063 035__ $$9http://inspirehep.net/oai2d$$aoai:inspirehep.net:1372988$$d2016-06-07T14:56:44Z$$h2016-06-08T04:00:04Z$$mmarcxml
002159063 035__ $$9Inspire$$a1372988
002159063 041__ $$aeng
002159063 100__ $$aKlimentov, A$$uBrookhaven
002159063 245__ $$aNext Generation Workload Management System For Big Data on Heterogeneous Distributed Computing
002159063 260__ $$c2015
002159063 300__ $$a8 p
002159063 520__ $$9IOP$$aThe Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS and ALICE are the largest collaborations ever assembled in the sciences and are at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, both experiments rely on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System (WMS) for managing the workflow for all data processing on hundreds of data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. The scale is demonstrated by the following numbers: PanDA manages O(10^2) sites, O(10^5) cores, O(10^8) jobs per year, O(10^3) users, and ATLAS data volume is O(10^17) bytes. In 2013 we started an ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF). The project titled 'Next Generation Workload Management and Analysis System for Big Data' (BigPanDA) is funded by DOE ASCR and HEP. Extending PanDA to clouds and LCF presents new challenges in managing heterogeneity and supporting workflow. The BigPanDA project is underway to set up and tailor PanDA at the Oak Ridge Leadership Computing Facility (OLCF) and at the National Research Center 'Kurchatov Institute' together with ALICE distributed computing and ORNL computing professionals. Our approach to integration of HPC platforms at the OLCF and elsewhere is to reuse, as much as possible, existing components of the PanDA system. We will present our current accomplishments with running the PanDA WMS at OLCF and other supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications.
002159063 540__ $$acc-by$$uhttp://creativecommons.org/licenses/by/3.0/
002159063 65017 $$2SzGeCERN$$aComputing and Computers
002159063 690C_ $$aCERN
002159063 700__ $$aBuncic, P$$uCERN
002159063 700__ $$aDe, K$$uTexas U., Arlington
002159063 700__ $$aJha, S$$uRutgers U., Piscataway
002159063 700__ $$aMaeno, T$$uBrookhaven
002159063 700__ $$aMount, R$$uSLAC
002159063 700__ $$aNilsson, P$$uBrookhaven
002159063 700__ $$aOleynik, D$$uTexas U., Arlington
002159063 700__ $$aPanitkin, S$$uBrookhaven
002159063 700__ $$aPetrosyan, A$$uTexas U., Arlington
002159063 700__ $$aPorter, R J$$uLBL, Berkeley
002159063 700__ $$aRead, K F$$uOak Ridge
002159063 700__ $$aVaniachine, A$$uArgonne
002159063 700__ $$aWells, J C$$uOak Ridge
002159063 700__ $$aWenaus, T$$uBrookhaven
002159063 773__ $$c012040$$n1$$pJ. Phys.: Conf. Ser.$$v608$$wC14-09-01.1$$y2015
002159063 8564_ $$81195227$$s1473668$$uhttps://cds.cern.ch/record/2159063/files/10.1088_1742-6596_608_1_012040.pdf$$yFulltext
002159063 960__ $$a13
002159063 962__ $$b1557441$$k012040$$nINDICO.CERN.CH.258092
002159063 980__ $$aARTICLE
002159063 980__ $$aConferencePaper
002159063 999C6 $$a0-0-0-0-0-0-1$$t2015-05-23 23:08:47$$vInvenio/UNKNOWN refextract/1.5.44