Page 1

News from the ATLAS SW Week, March 2004

ATLAS SW CZ Seminar, April 2004

Jiří Chudoba, FzÚ AV ČR

Page 2

Dario Barberis: Introduction & News

ATLAS Software & Computing Week - 1 Mar. 2004

ATLAS Computing Timeline

• POOL/SEAL release (done)

• ATLAS release 7 (with POOL persistency) (done)

• LCG-1 deployment (in progress)

• ATLAS complete Geant4 validation (done)

• ATLAS release 8

• DC2 Phase 1: simulation production

• DC2 Phase 2: intensive reconstruction (the real challenge!)

• Combined test beams (barrel wedge)

• Computing Model paper

• Computing Memorandum of Understanding (moved to end 2004)

• ATLAS Computing TDR and LCG TDR

• DC3: produce data for PRR and test LCG-n

• Physics Readiness Report

• Start commissioning run

• GO!

(Timeline graphic: the milestones above run from 2003 to 2007; "NOW" marks March 2004.)

Page 3

Dario Barberis: Introduction & News

ATLAS Software & Computing Week - 1 Mar. 2004

Near-term Software Release Plan

7.5.0: 14th Jan 2004

7.6.0: 4th Feb

7.7.0: 25th Feb <- SPMB Decision 3rd Feb

8.0.0: 17th Mar <- DC2 & CTB Simulation Release

8.1.0: 7th Apr

8.2.0: 28th Apr

8.3.0: 19th May

9.0.0: 9th Jun <- DC2 & CTB Reconstruction Release

9.1.0: 30th Jun

9.2.0: 21st Jul

9.3.0: 11th Aug

⇐ 15th Feb: LAr Technical run starts

⇐ 1st May: Baseline DC-2 Simulation starts

⇐ 10th May: Testbeam starts

⇐ 1st Jul: Baseline DC-2 Reconstruction starts

⇐ 14th Jul: Complete Testbeam

⇐ 15th Jul: Baseline DC-2 Physics Analysis starts

Page 4

Expected Major Milestones: Release 8

• GEANT4 simulation: DC2 (in validation); test beam (underway); pile-up and digitization in Athena (debugging)

• GeoModel: Inner Detector, Muon Spectrometer

• Conversion to CLHEP units (mm, MeV, [-pi, pi])

• POOL/SEAL persistency

• Bytestream converters

• Preliminary conditions capabilities

• Other stepping stones: move to InstallArea; jobOption.txt --> jobOption.py (see the sketch below); distribution kits
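One of the stepping stones above, the jobOption.txt --> jobOption.py migration, replaced Gaudi's text-based property files with Python scripts. A minimal sketch of the flavour of that change, runnable only inside an Athena session (theApp and Algorithm are provided by the framework; MyPackage, MyAlg and EnergyCut are hypothetical names, not taken from the release):

    # Old text-based jobOptions (jobOption.txt) set properties like:
    #   ApplicationMgr.DLLs   += { "MyPackage" };
    #   ApplicationMgr.TopAlg += { "MyAlg/MyAlg1" };
    #   MyAlg1.EnergyCut = 100.0;
    #
    # The Python equivalent (jobOption.py), in the pre-configurable style:
    theApp.Dlls   += ["MyPackage"]        # load the component library
    theApp.TopAlg += ["MyAlg/MyAlg1"]     # schedule the algorithm
    MyAlg1 = Algorithm("MyAlg1")          # handle for setting its properties
    MyAlg1.EnergyCut = 100.0              # in MeV, per the CLHEP unit convention above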

Page 5

ATLAS DC2

ATLAS Software Workshop, 2 March 2004

Gilbert Poulard

CERN PH-ATC

Page 6

DC2: goals

At this stage the goals include:

o Full use of Geant4, POOL and the LCG applications
o Pile-up and digitization in Athena
o Deployment of the complete Event Data Model and the detector description
o Simulation of the full ATLAS detector and of the 2004 combined test beam
o Tests of the calibration and alignment procedures
o Wide use of the GRID middleware and tools
o Large-scale physics analysis
o Computing model studies (document due end 2004)
o Running as much of the production as possible on LCG-2

Page 7

DC2 operation

Consider DC2 as a three-part operation:

o Part I: production of simulated data (May-June 2004)
  - needs Geant4, digitization and pile-up in Athena, POOL persistency
  - "minimal" reconstruction, just to validate the simulation suite
  - will run on any computing facilities we can get access to around the world

o Part II: test of Tier-0 operation (July 2004)
  - needs the full reconstruction software following the RTF report design, and the definition of AODs and TAGs
  - (calibration/alignment and) reconstruction will run on the Tier-0 prototype as if the data were coming from the online system (at 10% of the rate)
  - the output (ESD+AOD) will be distributed to Tier-1s in real time for analysis

o Part III: test of distributed analysis on the Grid (August-October 2004)
  - access to event and non-event data from anywhere in the world, in both organized and chaotic ways

o In parallel: run distributed reconstruction on simulated data

Page 8

DC2: Scenario & Time scale

September 03: Release 7
  - Put in place, understand & validate: Geant4; POOL; LCG applications; the Event Data Model
  - Digitization; pile-up; byte-stream; conversion of DC1 data to POOL; large-scale persistency tests and reconstruction

March 17th 04: Release 8 (production)
  - Testing and validation; run test-production; start final validation

May 3rd 04:
  - Start simulation; pile-up & digitization; event mixing; transfer of data to CERN

July 1st 04: "DC2"
  - Intensive reconstruction on "Tier-0"; distribution of ESD & AOD; calibration and alignment

August 1st 04:
  - Start physics analysis; reprocessing

Page 9

DC2 resources

Process                       Events  Months  CPU power  Volume  At CERN  Off-site
                                              (kSI2k)    (TB)    (TB)     (TB)
Phase I (May-June)
  Simulation                  10^7    2         600        25       5       20
  Pile-up (*) / Digitization  10^7    2         400        75      15       60
  Byte-stream                 10^7    2       (small)      20      20       16
  Total Phase I               10^7    2        1000       120      40       96
Phase II (July)
  Reconstruction Tier-0       10^7    0.5       600         5       5       10
  Reconstruction Tier-1       10^7    2         600         5       0        5
Total                         10^7                         130      45      111
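The totals in the table can be cross-checked with a quick arithmetic sketch (Python; the tuples simply restate the three TB columns above):

    # (volume, at CERN, off-site) in TB, restating the rows above
    phase1_total = (120, 40, 96)
    reco_tier0   = (5, 5, 10)
    reco_tier1   = (5, 0, 5)
    grand_total  = tuple(sum(col) for col in zip(phase1_total, reco_tier0, reco_tier1))
    print(grand_total)  # (130, 45, 111), matching the "Total" row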

Page 10

Atlas Production System schema

(Diagram.) A common job-description database holds tasks and datasets: a task definition carries the executable name, release version and physics signature; Task = [job]* and Dataset = [partition]*, each job producing one partition of a dataset. Four supervisors pull jobs from this database and pass them to flavour-specific executors: a US Grid (Chimera) executor, an LCG executor, an NG (NorduGrid) executor and an LSF executor for local batch, the Grid executors submitting through resource brokers (RB). Job/run information and location hints for tasks and jobs flow back through the Data Management System and the AMI catalog; human intervention enters at the supervisor level.
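The supervisor/executor split in the schema is easy to sketch in code. The following is a minimal illustration of the pattern only, not the actual ATLAS production system: class names, method names and job identifiers are invented for the example.

    # Sketch of the supervisor/executor pattern from the schema above.
    # Each executor hides one resource flavour (US Grid/Chimera, LCG, NG, LSF);
    # supervisors deal only with abstract jobs from the job-description database.

    class Executor:
        def submit(self, job):
            raise NotImplementedError

    class LCGExecutor(Executor):
        def submit(self, job):
            print(f"LCG resource broker <- {job}")   # would go via the RB

    class LSFExecutor(Executor):
        def submit(self, job):
            print(f"LSF local batch <- {job}")       # no Grid middleware

    class Supervisor:
        """Pulls jobs for its flavour from the common DB, drives one executor."""
        def __init__(self, executor, job_db):
            self.executor, self.job_db = executor, job_db

        def run(self):
            for job in self.job_db:        # Task = [job]*, Dataset = [partition]*
                self.executor.submit(job)  # run info / location hints flow back

    jobs = ["task.simul._00001", "task.simul._00002"]  # hypothetical partitions
    Supervisor(LCGExecutor(), jobs).run()
    Supervisor(LSFExecutor(), jobs).run()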

Page 11

Tiers in DC2: Tier-0

o 20% of the simulation will be done at CERN.
o All data in ByteStream format (~16 TB) will be copied to CERN.
o Reconstruction will be done at CERN (in ~10 days).
o Reconstruction output (ESD) will be exported in 2 copies from Tier-0 (2 x ~5 TB).
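A back-of-envelope check of what these numbers imply (the event count, the ~10 days and the export volumes are from the slides; the rate and bandwidth arithmetic is mine):

    # Implied Tier-0 rates for 10^7 events reconstructed in ~10 days
    events, days = 1e7, 10
    seconds = days * 86400
    print(f"{events / seconds:.1f} Hz")             # ~11.6 Hz sustained
    export_tb = 2 * 5                               # ESD in 2 copies of ~5 TB
    print(f"{export_tb * 1e6 / seconds:.1f} MB/s")  # ~11.6 MB/s average export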

Page 12

Tiers in DC2: Tier-1s and Tier-2s

Tier-1s will have to:

o Host the simulated data produced by them or coming from Tier-2s, plus the ESD (& AOD) coming from Tier-0.
o Run reconstruction in parallel to the Tier-0 exercise (~2 months); this will include links to MCTruth. Produce and host ESD and AOD.
o Provide access to the ATLAS V.O. members.

Tier-2s:

o Run simulation (and other components if they wish to).
o Copy (replicate) their data to a Tier-1.

ATLAS is committed to LCG. All information should be entered into the relevant database and catalog.

Page 13

Core sites and commitments

Site       Immediate    Later
CERN         200         1200
CNAF         200          500
FNAL          10            ?
FZK          100            ?
Nikhef       124          180
PIC          100          300
RAL           70          250
Taipei        60            ?
Russia        30           50
Prague        17           40
Budapest     100            ?
Totals       864 (+147)  >2600 (+>90)

The first eight sites are the initial LCG-2 core sites; Russia, Prague and Budapest are other firm commitments. The other 20 LCG-1 sites will be brought in as quickly as possible.
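The split in the Totals row can be verified from the "Immediate" column (a quick check; the core/other grouping follows the two labels above):

    # "Immediate" commitments: initial LCG-2 core sites vs other firm commitments
    core   = {"CERN": 200, "CNAF": 200, "FNAL": 10, "FZK": 100,
              "Nikhef": 124, "PIC": 100, "RAL": 70, "Taipei": 60}
    others = {"Russia": 30, "Prague": 17, "Budapest": 100}
    print(sum(core.values()), sum(others.values()))  # 864 147 -> "864(+147)"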

Page 14

Comments on schedule

The change of the schedule has been driven by:

o ATLAS side:
  - the readiness of the software (the combined test beam has the highest priority)
  - the availability of the production tools (the integration with the Grid is not always easy)

o Grid side:
  - the readiness of LCG (we would prefer to run on the Grid only!)
  - priorities are not defined by ATLAS alone

For the Tier-0 exercise, it will be difficult to define the starting date before we have a better idea of how the pile-up and event-mixing processes work.

Page 15

Armin Nairz: Status of the Software for Data Challenge 2 (ATLAS Software Workshop, CERN, March 1-5, 2004)

Conclusions

Event generation and GEANT4 simulation (as in release 7.5.0+) are already in a production-like state: stable and robust, with CPU times per event and event sizes within specifications.

Digitisation (as in release 7.6.0+) could not be tested for all sub-detectors (some are missing or not working); for the tested ones, digitisation is working and stable, which gives confidence in a working digitisation procedure for the whole detector in/after release 7.7.0.

Pile-up is not yet fully functional.

Documentation on pre-production activities is available from the DC webpage, http://atlas.web.cern.ch/Atlas/GROUPS/SOFTWARE/DC/DC2/preprod; it also contains how-to's (running event generation, simulation, digitisation).

Page 16

ATLAS SW CZ Seminar, 2 April 2004, [email protected]

Upcoming meetings

GRID Analysis Tools Distributed Analysis ... Atlantis Tutorial Athena Tutorial

