Document details

Efficient monitoring of CRAB jobs at CMS


Description

Issue date: 2017-11-23

CRAB is a tool used for distributed analysis of CMS data. Users can submit sets of jobs with similar requirements (tasks) with a single request. CRAB uses a client-server architecture, in which a lightweight client, a server, and ancillary services work together and are maintained by CMS operators at CERN. As with most complex software, good monitoring tools are crucial for efficient use and long-term maintainability. This work gives an overview of the monitoring tools developed to ensure that the CRAB server and infrastructure are functional, to help operators debug user problems, and to minimize overhead and operating cost. It also illustrates the design choices made and reports on our experience with the tools we developed and the external ones we used.
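For context, a CRAB task (a set of jobs with similar requirements) is typically described by a small Python configuration file that the client submits to the server in a single request. The sketch below follows the usual CRAB3 client configuration layout; the request name, parameter-set file, dataset, and storage site are placeholders rather than values taken from this work, and the exact fields can vary between CRAB versions.

    # Minimal sketch of a CRAB3 task configuration (placeholder values throughout).
    # Typically submitted with:  crab submit -c crabConfig.py
    from CRABClient.UserUtilities import config
    config = config()

    # General: how the task is labelled and where the client keeps its local state.
    config.General.requestName = 'my_analysis_task'   # placeholder
    config.General.workArea = 'crab_projects'

    # JobType: what each job runs (a CMSSW parameter-set file).
    config.JobType.pluginName = 'Analysis'
    config.JobType.psetName = 'pset_analysis.py'      # placeholder

    # Data: the input dataset and how it is split into similar jobs.
    config.Data.inputDataset = '/Primary/Processed-v1/MINIAOD'  # placeholder
    config.Data.splitting = 'FileBased'
    config.Data.unitsPerJob = 10

    # Site: where the job outputs are stored.
    config.Site.storageSite = 'T2_XX_Example'         # placeholder

A single `crab submit` request like this creates one task on the server, which then handles job submission and tracking; the monitoring tools described in the abstract operate on the server side of this workflow.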

Universidade Estadual Paulista

California Institute of Technology

INFN Sezione di Trieste

INFN Sezione di Perugia

Fermi National Accelerator Laboratory

Vilnius University

University of Sofia St. Kliment Ohridski

CIEMAT


Document Type: Other
Language: English
