Journal of Grid Computing, Volume 8, Issue 2, pp 159–179

Distributed Analysis in CMS

  • Alessandra Fanfani
  • Anzar Afaq
  • Jose Afonso Sanches
  • Julia Andreeva
  • Giuseppe Bagliesi
  • Lothar Bauerdick
  • Stefano Belforte
  • Patricia Bittencourt Sampaio
  • Ken Bloom
  • Barry Blumenfeld
  • Daniele Bonacorsi
  • Chris Brew
  • Marco Calloni
  • Daniele Cesini
  • Mattia Cinquilli
  • Giuseppe Codispoti
  • Jorgen D’Hondt
  • Liang Dong
  • Danilo Dongiovanni
  • Giacinto Donvito
  • David Dykstra
  • Erik Edelmann
  • Ricky Egeland
  • Peter Elmer
  • Giulio Eulisse
  • Dave Evans
  • Federica Fanzago
  • Fabio Farina
  • Derek Feichtinger
  • Ian Fisk
  • Josep Flix
  • Claudio Grandi
  • Yuyi Guo
  • Kalle Happonen
  • José M. Hernàndez
  • Chih-Hao Huang
  • Kejing Kang
  • Edward Karavakis
  • Matthias Kasemann
  • Carlos Kavka
  • Akram Khan
  • Bockjoo Kim
  • Jukka Klem
  • Jesper Koivumäki
  • Thomas Kress
  • Peter Kreuzer
  • Tibor Kurca
  • Valentin Kuznetsov
  • Stefano Lacaprara
  • Kati Lassila-Perini
  • James Letts
  • Tomas Lindén
  • Lee Lueking
  • Joris Maes
  • Nicolò Magini
  • Gerhild Maier
  • Patricia Mcbride
  • Simon Metson
  • Vincenzo Miccio
  • Sanjay Padhi
  • Haifeng Pi
  • Hassen Riahi
  • Daniel Riley
  • Paul Rossman
  • Pablo Saiz
  • Andrea Sartirana
  • Andrea Sciabà
  • Vijay Sekhri
  • Daniele Spiga
  • Lassi Tuura
  • Eric Vaandering
  • Lukas Vanelderen
  • Petra Van Mulders
  • Aresh Vedaee
  • Ilaria Villella
  • Eric Wicklund
  • Tony Wildish
  • Christoph Wissing
  • Frank Würthwein

Abstract

The CMS experiment expects to manage several petabytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis, in order to support a wide community with thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations in preparation for CMS distributed analysis are presented, followed by the user experience in current analysis activities.

Keywords

LHC, CMS, Distributed analysis, Grid


Copyright information

© Springer Science+Business Media B.V. 2010

Authors and Affiliations

  • Alessandra Fanfani (1)
  • Anzar Afaq (2)
  • Jose Afonso Sanches (3)
  • Julia Andreeva (4)
  • Giuseppe Bagliesi (5)
  • Lothar Bauerdick (2)
  • Stefano Belforte (6)
  • Patricia Bittencourt Sampaio (3)
  • Ken Bloom (7)
  • Barry Blumenfeld (8)
  • Daniele Bonacorsi (1)
  • Chris Brew (9)
  • Marco Calloni (4)
  • Daniele Cesini (10)
  • Mattia Cinquilli (11)
  • Giuseppe Codispoti (1)
  • Jorgen D’Hondt (12)
  • Liang Dong (13)
  • Danilo Dongiovanni (10)
  • Giacinto Donvito (14)
  • David Dykstra (2)
  • Erik Edelmann (15)
  • Ricky Egeland (16)
  • Peter Elmer (17)
  • Giulio Eulisse (18)
  • Dave Evans (2)
  • Federica Fanzago (19)
  • Fabio Farina (20)
  • Derek Feichtinger (21)
  • Ian Fisk (2)
  • Josep Flix (22, 23)
  • Claudio Grandi (1)
  • Yuyi Guo (2)
  • Kalle Happonen (15)
  • José M. Hernàndez (22)
  • Chih-Hao Huang (2)
  • Kejing Kang (24)
  • Edward Karavakis (25)
  • Matthias Kasemann (26)
  • Carlos Kavka (6)
  • Akram Khan (25)
  • Bockjoo Kim (27)
  • Jukka Klem (15)
  • Jesper Koivumäki (15)
  • Thomas Kress (28)
  • Peter Kreuzer (28)
  • Tibor Kurca (29)
  • Valentin Kuznetsov (30)
  • Stefano Lacaprara (31)
  • Kati Lassila-Perini (15)
  • James Letts (32)
  • Tomas Lindén (15)
  • Lee Lueking (2)
  • Joris Maes (12)
  • Nicolò Magini (4, 10)
  • Gerhild Maier (33)
  • Patricia Mcbride (2)
  • Simon Metson (34)
  • Vincenzo Miccio (4, 10)
  • Sanjay Padhi (32)
  • Haifeng Pi (32)
  • Hassen Riahi (11)
  • Daniel Riley (30)
  • Paul Rossman (2)
  • Pablo Saiz (4)
  • Andrea Sartirana (35)
  • Andrea Sciabà (4)
  • Vijay Sekhri (2)
  • Daniele Spiga (4)
  • Lassi Tuura (18)
  • Eric Vaandering (2)
  • Lukas Vanelderen (36)
  • Petra Van Mulders (12)
  • Aresh Vedaee (11)
  • Ilaria Villella (12)
  • Eric Wicklund (2)
  • Tony Wildish (17)
  • Christoph Wissing (26)
  • Frank Würthwein (32)
  1. INFN and University of Bologna, Bologna, Italy
  2. Fermilab, Batavia, USA
  3. University of Rio De Janeiro UERJ, Rio De Janeiro, Brazil
  4. CERN, Geneva, Switzerland
  5. Pisa INFN, Pisa, Italy
  6. Trieste INFN, Trieste, Italy
  7. University of Nebraska, Lincoln, USA
  8. Johns Hopkins University, Baltimore, USA
  9. Rutherford Appleton Laboratory, Didcot, UK
  10. INFN-CNAF, Bologna, Italy
  11. Perugia INFN, Perugia, Italy
  12. Brussel University, Brussel, Belgium
  13. Institute of High Energy Physics, Chinese Academy of Sciences Academia Sinica, Beijing, China
  14. INFN and University of Bari, Bari, Italy
  15. Helsinki Institute of Physics, Helsinki, Finland
  16. University of Minnesota, Twin Cities, USA
  17. Princeton University, Princeton, USA
  18. University of Northeastern, Boston, USA
  19. Padova INFN, Padova, Italy
  20. Milano Bicocca INFN, Milan, Italy
  21. Paul Scherrer Institut (PSI), Villigen, Switzerland
  22. CIEMAT, Madrid, Spain
  23. PIC, Barcelona, Spain
  24. Peking University, Peking, China
  25. Brunel University, London, UK
  26. DESY, Hamburg, Germany
  27. University of Florida, Gainesville, USA
  28. RWTH, Aachen, Germany
  29. Institut de Physique Nucleaire de Lyon, Villeurbanne Cedex, France
  30. Cornell University, Ithaca, USA
  31. Legnaro INFN, Legnaro, Italy
  32. University of California San Diego, La Jolla, USA
  33. University of Linz, Linz, Austria
  34. Bristol University, Bristol, UK
  35. Ecole Polytechnique, Paris, France
  36. University of Gent, Gent, Belgium