Access Restriction Open
Author Bloom, Kenneth
Source CERN Document Server
Content type Text
File Format PDF
Date Created 2011-06-15
Language English
Subject Domain (in DDC) Natural sciences & mathematics ♦ Physics ♦ Modern physics ♦ Technology ♦ Engineering & allied operations ♦ Applied physics
Subject Keyword physics.ins-det ♦ General ♦ Detectors and Experimental Techniques
Abstract Each LHC experiment will produce datasets with sizes of order one petabyte per year. All of this data must be stored, processed, transferred, simulated and analyzed, which requires a computing system on a larger scale than ever mounted for any particle physics experiment, and possibly for any enterprise in the world. I discuss how CMS has chosen to address these challenges, focusing on recent tests of the system that demonstrate the experiment's readiness to produce physics results with the first LHC data.
Description Presented at: 2009 Meeting of the Division of Particles and Fields of the American Physical Society, Detroit, MI, USA, 26 - 31 Jul 2009
Collaboration CMS
Learning Resource Type Article
Publisher Date 2009-01-01
Rights License Preprint: CC-BY-4.0
Page Count 6