Description: The ATLAS Collaboration at CERN is preparing for data taking and analysis at the LHC, which will start in 2007. To that end, a series of Data Challenges was begun in 2002 whose goals are to validate the Computing Model, the complete software suite, and the data model, and to ensure the correctness of the technical choices to be made. A major feature of the first Data Challenge (DC1) was the preparation and deployment of the software required for the production of large event samples as a worldwide-distributed activity. It should be noted that running everything at CERN was not an option, even had we wanted to: the resources were not available at CERN to carry out the production on a reasonable time-scale. The great challenge of organizing and then carrying out this large-scale production at a significant number of sites around the world therefore had to be faced. The benefits of this are manifold, however: apart from providing the required computing resources, the exercise created worldwide momentum for ATLAS computing as a whole. This report describes in detail the main steps carried out in DC1 and what has been learned from them as a step towards a computing Grid for the LHC experiments.
Date: April 23, 2004
Creator: Sturrock, R.; Bischof, R.; Epp, B.; Ghete, V.M.; Kuhn, D.; Mello, A.G. et al.
Item Type: Article
Partner: UNT Libraries Government Documents Department