The data analysis tool

A detailed description of the data analysis tool (from 06.2015) is given in Thomas-analysis_tool-manual.pdf. Two versions of the code (executable and header files for Alibava) are in development: Thomas' version (the code Thomas used for the analysis in his thesis) and the git version (committed to GitHub, developed by Thomas and Eda; the version names were suggested by kislerdm).

The analysis is based on the Marlin (Modular Analysis and Reconstruction for the LINear collider) framework and uses EUTelescope code.

The job processors are described below.

Three data types are being used for analysis:

  • Alibava data file *.dat
  • Alibava pedestal run *.ped
  • telescope data file (data from the five Mimosa26 telescope planes, with or without the CMSPixRef plane) *.raw

Analysis structure:

The command to run an analysis job:

jobsub -c {configfile_name}.cfg -csv {runlist_name}.csv {jobname} {run_number}

{jobname} is one of the following analysis steps, grouped into analysis blocks:

  1. convert-ped
  2. pedestal
  3. commonmode
  4. pedestal2
    pedestalhisto (extra step)
  5. converter
  6. reco
  7. clustering-1
  8. clustering-2
    datahisto (extra step)
  9. telescope-converter
  10. telescope-clustering
  11. merge/merger
  12. hitmaker
  13. alignment-daf-{1-N} (iterate N times)
  14. tracking-1
  15. alignment-daf-{(N+1)-(N+N1)} (iterate N1 times)
  16. tracking-2
  17. alignment-daf-{(N+N1+1)-(N+N1+N2)} (iterate N2 times)
  18. tracking-3
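The chain above can be driven with a simple loop. Below is a minimal dry-run sketch for the pedestal block (steps 1-4): the config file, runlist, and run number are placeholders, and jobsub is stubbed out so the commands are only printed, not submitted.

```shell
#!/bin/sh
# Stub: print the command instead of actually submitting the job.
jobsub() { echo "jobsub $*"; }

CFG=myconfig.cfg      # placeholder for {configfile_name}.cfg
CSV=myrunlist.csv     # placeholder for {runlist_name}.csv
PED_RUN=42            # placeholder number of the *.ped run

# Steps 1-4: pedestal data analysis, run in order.
for job in convert-ped pedestal commonmode pedestal2; do
  jobsub -c "$CFG" -csv "$CSV" "$job" "$PED_RUN"
done
```

Removing the stub function turns the sketch into real submissions; the same loop pattern applies to the other blocks with the appropriate run number.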

The {run_number} argument depends on the analysis block:

  • Steps 1-4 (pedestal data analysis): {run_number} = number of the *.ped run
  • Steps 5-8 (alibava data analysis): {run_number} = number of the *.dat run
  • Steps 9-10 (telescope data analysis): {run_number} = number of the *.raw run
  • Steps 11+ (combined telescope+alibava data analysis): {run_number} = number of the *.dat run
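The mapping of step groups to run-number types can be sketched as a small shell helper. The function name is illustrative and the fall-through default stands in for the steps-11+ group; the groupings themselves come from the lists above.

```shell
#!/bin/sh
# run_type JOBNAME -> which run number the job expects: ped, dat, or raw.
run_type() {
  case "$1" in
    convert-ped|pedestal|commonmode|pedestal2|pedestalhisto)
      echo ped ;;   # steps 1-4: number of the *.ped run
    converter|reco|clustering-*|datahisto)
      echo dat ;;   # steps 5-8: number of the *.dat run
    telescope-*)
      echo raw ;;   # steps 9-10: number of the *.raw run
    *)
      echo dat ;;   # steps 11+ (merge, hitmaker, alignment-daf-*,
                    # tracking-*): number of the *.dat run
  esac
}
```

For example, `run_type telescope-clustering` prints `raw`, while `run_type hitmaker` prints `dat`.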
