Processing LHC data in the UK

D. Colling, D. Britton, J. Gordon, S. Lloyd, A. Doyle, P. Gronbech, J. Coles, A. Sansum, G. Patrick, R. Jones, R. Middleton, D. Kelsey, A. Cass, N. Geddes, P. Clark, L. Barnby

Research output: Contribution to conference (unpublished) · Paper · peer-review



The Large Hadron Collider (LHC) is one of the greatest scientific endeavours to date. The construction of the collider itself and of the experiments that collect data from it represents a huge investment, both financially and in human effort, in the hope of understanding how the Universe works at a deeper level. Yet the volumes of data produced are so large that they cannot be analysed at any single computing centre. Instead, the experiments have all adopted distributed computing models based on the LHC Computing Grid. Without the correct functioning of this grid infrastructure the experiments would not be able to understand the data that they have collected. Within the UK, the grid infrastructure needed by the experiments is provided by the GridPP project. We report on the operations, performance and contributions made to the experiments by the GridPP project during 2010 and 2011, the first two significant years of LHC running.
Original language: English
Publication status: Published - 28 Jan 2013


