13-18 March 2016
Academia Sinica
Asia/Taipei timezone

Options for the evolution of the LHCb Computing Model for LHC run 3

Mar 17, 2016, 4:00 PM
BHSS, Media Conf. Room (Academia Sinica)


Poster Presentation
Track: Physics (including HEP) and Engineering Applications
Session: Poster Session


Dr Christophe HAEN (CERN)


LHCb is one of the four high energy physics experiments currently in operation at the Large Hadron Collider at CERN, Switzerland. During the second Long Shutdown of the LHC (LS2), to take place from 2018 to 2020, LHCb will undergo major upgrades. These upgrades concern not only the detector itself, but also the computing model driving the physics analysis. The main incentive and driving constraint for the new computing model for Run 3 will be the increased amount of data recorded by the experiment. Current estimates lead to a data taking rate of 100 kHz, an order of magnitude higher than the Run 2 numbers. Such a rate forces us to review the basic management of the data in terms of files and bookkeeping, as well as the way we process it on the grid in order to extract relevant physics results. Because of storage space restrictions, keeping a fixed and large number of replicas for each file is no longer a viable solution. In particular, only files of interest to ongoing analyses need fast access, while others can tolerate reasonable delays. The data popularity system introduced for Run 2 can help achieve such a dynamic data placement strategy. Storage technologies and transfer protocols will have to evolve and follow the latest trends encouraged by the various grid sites; these include Ceph-based storage and protocols such as S3 or HTTP. Processing models relying on stripping/skimming are no longer sustainable, and the use of new concepts such as the LHCb TURBO stream must be extended. Opportunistic resources such as clouds and volunteer computing should also be accounted for and used efficiently. This paper presents the challenges the LHCb computing model will have to face, as well as some of the solutions considered to address them.
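To illustrate the kind of dynamic data placement the abstract alludes to, the sketch below maps a dataset's recent access count (its "popularity") to a target replica count. This is a minimal, hypothetical policy: the function name, thresholds, and data feed are illustrative assumptions, not the actual LHCb/DIRAC implementation.

```python
# Hypothetical popularity-driven replica policy: hot datasets get many
# replicas for fast analysis access, cold ones shrink toward a single
# archival copy. Thresholds are illustrative, not LHCb's real values.

def target_replicas(accesses_last_90d: int,
                    min_replicas: int = 1,
                    max_replicas: int = 4) -> int:
    """Map a dataset's recent access count to a desired replica count."""
    if accesses_last_90d == 0:
        return min_replicas      # cold: keep one safe archival copy
    if accesses_last_90d < 10:
        return 2                 # lukewarm: one disk copy + archive
    if accesses_last_90d < 100:
        return 3
    return max_replicas          # hot: replicate widely


# Example: compute a replication plan from current state + popularity.
current = {"LHC16-stream-A": 4, "LHC12-legacy": 3}     # replicas on disk
popularity = {"LHC16-stream-A": 250, "LHC12-legacy": 0}  # 90-day accesses

plan = {name: target_replicas(popularity[name]) - current[name]
        for name in current}
# Positive entries mean "add replicas", negative mean "remove copies".
```

The point of such a policy is that replica count becomes a function of measured demand rather than a fixed constant, which is what makes the storage budget sustainable at a tenfold data rate.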

Primary author: Dr Christophe HAEN (CERN)