Blog

I followed this recommendation

https://hypernews.cern.ch/HyperNews/CMS/get/susy/2291/1.html

to get pileup systematics.

 

As an example, I took a signal point with large statistics, C1N2_400_LSP25:

"Define the MC efficiency in each SR i for Nvtx<20 and Nvtx>20
as e1_i and e2_i.
Plot r_i=e2_i/e1_i as a function of SR for a handful of
signal points"

 

 




"Take r = <r> \pm delta, where delta is some reasonable number
based on the plot and <r> is some number very close to 1.0."


To avoid bins with low statistics, I only kept bins with more than 4 events for both Nvtx<20 and Nvtx>20, and from these I calculated <r> and delta (taken as the standard deviation of the r_i).

<r> = 0.998215

delta = 0.232258
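
For concreteness, a minimal sketch of this calculation (not the actual analysis code), assuming the per-SR MC yields for Nvtx<20 and Nvtx>20 are available as two lists n1 and n2, and that e1_i, e2_i are these yields divided by the corresponding totals (the denominator choice is my assumption):

import math

def r_mean_and_delta(n1, n2, min_events=4):
    """n1[i], n2[i]: assumed MC yields in SR i for Nvtx<20 and Nvtx>20."""
    tot1, tot2 = sum(n1), sum(n2)                      # efficiency denominators (assumption)
    # keep only SRs with more than min_events events on both sides
    pairs = [(a, b) for a, b in zip(n1, n2) if a > min_events and b > min_events]
    r = [(b / tot2) / (a / tot1) for a, b in pairs]    # r_i = e2_i / e1_i
    mean = sum(r) / len(r)
    delta = math.sqrt(sum((x - mean) ** 2 for x in r) / len(r))   # standard deviation
    return mean, delta

# r_mean, delta = r_mean_and_delta(yields_lowNvtx, yields_highNvtx)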

 
A line fit to a normalized efficiency (e(N1)=1.0) is
e(Nvtx) = 1 + (r-1)*(Nvtx-N1)/(N2-N1)
You then take a TH1D of Nvtx for data and one for fastsim MC.
(These TH1D can be the same for every model and every point).
Define
default = integral of the TH1D rescaled by e(Nvtx) with r = <r>
up      = integral of the TH1D rescaled by e(Nvtx) with r = <r> + delta
down    = integral of the TH1D rescaled by e(Nvtx) with r = <r> - delta
The (normalized) acceptance with the data pileup is
  default_data with (asymmetric) uncertainty (+ up_data - down_data)
Compare this with default_MC
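
My reading of this step as a minimal PyROOT-style sketch: h_nvtx is any ROOT TH1D of Nvtx, "normalized acceptance" is taken as the reweighted integral divided by the unweighted one, and N1, N2 are the reference Nvtx values of the two ranges (10 and 30 here are placeholders, not the values actually used):

def rescaled_integral(h_nvtx, r, N1=10.0, N2=30.0):
    """Integral of the Nvtx TH1D reweighted by e(Nvtx) = 1 + (r-1)*(Nvtx-N1)/(N2-N1),
    normalised to the unweighted integral so that r = 1 returns exactly 1.0."""
    total, weighted = 0.0, 0.0
    for b in range(1, h_nvtx.GetNbinsX() + 1):
        nvtx = h_nvtx.GetBinCenter(b)
        content = h_nvtx.GetBinContent(b)
        eff = 1.0 + (r - 1.0) * (nvtx - N1) / (N2 - N1)
        total += content
        weighted += content * eff
    return weighted / total

# with <r> and delta from above, e.g. for the data Nvtx histogram h_data:
# default_data = rescaled_integral(h_data, 0.998215)
# up_data      = rescaled_integral(h_data, 0.998215 + 0.232258)
# down_data    = rescaled_integral(h_data, 0.998215 - 0.232258)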

 

default_data  0.999567

up_data  1.01262

down_data  0.986516

default_MC  0.999517

 

Table and plots for more points (will be updated with more points)

+--------------+----------------+---------------+----------------+-----------------+--------------------+
|              | C1N2_400_LSP25 | C1N2_100_LSP1 | C1N2_200_LSP25 | C1N2_400_LSP200 | stau-stau_100_LSP1 |
+--------------+----------------+---------------+----------------+-----------------+--------------------+
| <r>          | 0.998215       | 1.07551       | 1.12668        | 1.18233         | 0.994051           |
| delta        | 0.232258       | 0.411257      | 0.505461       | 0.562178        | 0.482308           |
| default_data | 0.999567       | 1.01791       | 1.03315        | 1.04704         | 0.999043           |
| up_data      | 1.01262        | 1.05801       | 1.10001        | 1.12857         | 1.03639            |
| down_data    | 0.986516       | 0.977798      | 0.966289       | 0.965503        | 0.961694           |
| default_MC   | 0.999517       | 1.01991       | 1.03663        | 1.05224         | 1.01284            |
+--------------+----------------+---------------+----------------+-----------------+--------------------+
The (normalized) acceptance with the data pileup is

  default_data with (asymmetric) uncertainty (+ up_data - down_data)

Compare this with default_MC

Based on this comparison, assess a constant uncertainty, same over
all bins.  Make it 100% correlated across bins in combine.


default_data with its uncertainty is in agreement with default_MC.
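
For bookkeeping, this is how such a flat, fully correlated uncertainty would enter a combine datacard: one lnN nuisance with the same name in every bin. The fragment below is purely illustrative (toy bins, toy yields, and an assumed 2% value), not the analysis datacard.

imax 2  number of channels
jmax 1  number of backgrounds
kmax 1  number of nuisance parameters
------------
bin          SR1   SR2
observation  10    7
------------
bin          SR1   SR1   SR2   SR2
process      sig   bkg   sig   bkg
process      0     1     0     1
rate         3.0   8.0   2.0   6.0
------------
# same nuisance name "pileup" in every bin -> 100% correlated across bins
pileup  lnN  1.02  -     1.02  -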


More plots

 

 

 


I followed this recommendation

https://hypernews.cern.ch/HyperNews/CMS/get/susy/2291/1.html

to get pileup systematics.

 

As an example, I took a signal point with large statistics, C1N2_400_LSP25:

"Define the MC efficiency in each SR i for Nvtx<20 and Nvtx>20
as e1_i and e2_i.
Plot r_i=e2_i/e1_i as a function of SR for a handful of
signal points"

 

 

C1N2_400_LSP25efficiencies.pdf




"Take r = <r> \pm delta, where delta is some reasonable number
based on the plot and <r> is some number very close to 1.0."


To avoid bins with low statistics, I only kept bins with more than 5 events for both Nvtx<20 and Nvtx>20, and from these I calculated <r> and delta (taken as the standard deviation of the r_i).

<r> = 0.952882

delta = 0.216715

 
A line fit to a normalized efficiency (e(N1)=1.0) is
e(Nvtx) = 1 + (r-1)*(Nvtx-N1)/(N2-N1)
You then take a TH1D of Nvtx for data and one for fastsim MC.
(These TH1D can be the same for every model and every point).
Define
default = integral of the TH1D rescaled by e(Nvtx) with r = <r>
up      = integral of the TH1D rescaled by e(Nvtx) with r = <r> + delta
down    = integral of the TH1D rescaled by e(Nvtx) with r = <r> - delta
The (normalized) acceptance with the data pileup is
  default_data with (asymmetric) uncertainty (+ up_data - down_data)
Compare this with default_MC

 

default_data  0.987979

up_data  0.99996

down_data  0.975998

default_MC  0.986657

 

Table and plots for more points (will be updated with more points)

+--------------+----------------+---------------+----------------+-----------------+--------------------+
|              | C1N2_400_LSP25 | C1N2_100_LSP1 | C1N2_200_LSP25 | C1N2_400_LSP200 | stau-stau_100_LSP1 |
+--------------+----------------+---------------+----------------+-----------------+--------------------+
| <r>          | 0.952882       | 1.20336       | 0.983691       | 1.05812         | 1.58387            |
| delta        | 0.216715       | 0.118576      | 0.425872       | 0.397329        | 0.44983            |
| default_data | 0.987979       | 1.04994       | 0.995482       | 1.01576         | 1.13812            |
| up_data      | 0.99996        | 1.05339       | 1.04571        | 1.05856         | 1.18599            |
| down_data    | 0.975998       | 1.04649       | 0.945253       | 0.972952        | 1.09025            |
| default_MC   | 0.986657       | 1.0553        | 0.995044       | 1.01739         | 1.15391            |
+--------------+----------------+---------------+----------------+-----------------+--------------------+

 

 

 

 

Plots: stau-stau_100_LSP1efficiencies.pdf, C1N2_400_LSP200efficiencies.pdf, C1N2_200_LSP25efficiencies.pdf, C1N2_100_LSP1efficiencies.pdf

Trigger info update

The following results are based on the JSON: Cert_271036-284044_13TeV_PromptReco_Collisions16_JSON.txt


IsoMu24 : 36414 /pb

brilcalc lumi -b "STABLE BEAMS" --normtag=/afs/cern.ch/user/l/lumipro/public/normtag_file/normtag_DATACERT.json -i /afs/cern.ch/cms/CAF/CMSCOMM/COMM_DQM/certification/Collisions16/13TeV/Final/Cert_271036-284044_13TeV_PromptReco_Collisions16_JSON.txt -u /fb --hltpath="HLT_IsoMu24_*"

+-----------------------+-------+------+--------+-------------------+------------------+
| hltpath               | nfill | nrun | ncms   | totdelivered(/fb) | totrecorded(/fb) |
+-----------------------+-------+------+--------+-------------------+------------------+
| HLT_IsoMu24_v1        | 23    | 65   | 29176  | 2.914             | 2.795            |
| HLT_IsoMu24_v2        | 89    | 241  | 155143 | 25.941            | 24.857           |
| HLT_IsoMu24_v4        | 32    | 86   | 47447  | 9.101             | 8.762            |
+-----------------------+-------+------+--------+-------------------+------------------+


IsoMu22 : 28952 /pb
brilcalc lumi -b "STABLE BEAMS" --normtag=/afs/cern.ch/user/l/lumipro/public/normtag_file/normtag_DATACERT.json -i /afs/cern.ch/cms/CAF/CMSCOMM/COMM_DQM/certification/Collisions16/13TeV/Final/Cert_271036-284044_13TeV_PromptReco_Collisions16_JSON.txt -u /fb --hltpath="HLT_IsoMu22_*"


+-----------------------+-------+------+--------+-------------------+------------------+
| hltpath               | nfill | nrun | ncms   | totdelivered(/fb) | totrecorded(/fb) |
+-----------------------+-------+------+--------+-------------------+------------------+
| HLT_IsoMu22_v2        | 23    | 65   | 29176  | 2.914             | 2.795            |
| HLT_IsoMu22_v3        | 81    | 212  | 139128 | 22.010            | 21.117           |
| HLT_IsoMu22_v5        | 28    | 53   | 32533  | 5.220             | 5.040            |
+-----------------------+-------+------+--------+-------------------+------------------+




Ele25_eta2p1_WPTight_Gsf : 36414 /pb

brilcalc lumi -b "STABLE BEAMS" --normtag=/afs/cern.ch/user/l/lumipro/public/normtag_file/normtag_DATACERT.json -i /afs/cern.ch/cms/CAF/CMSCOMM/COMM_DQM/certification/Collisions16/13TeV/Final/Cert_271036-284044_13TeV_PromptReco_Collisions16_JSON.txt -u /fb --hltpath="HLT_Ele25_eta2p1_WPTight_Gsf_*"

+---------------------------------+-------+------+-------+-------------------+------------------+
| hltpath                         | nfill | nrun | ncms  | totdelivered(/fb) | totrecorded(/fb) |
+---------------------------------+-------+------+-------+-------------------+------------------+
| HLT_Ele25_eta2p1_WPTight_Gsf_v1 | 23    | 65   | 29176 | 2.914             | 2.795            |
| HLT_Ele25_eta2p1_WPTight_Gsf_v2 | 4     | 12   | 7544  | 1.012             | 0.968            |
| HLT_Ele25_eta2p1_WPTight_Gsf_v3 | 4     | 16   | 8540  | 1.195             | 1.148            |
| HLT_Ele25_eta2p1_WPTight_Gsf_v4 | 24    | 85   | 56872 | 8.796             | 8.454            |
| HLT_Ele25_eta2p1_WPTight_Gsf_v5 | 17    | 36   | 27939 | 4.876             | 4.643            |
| HLT_Ele25_eta2p1_WPTight_Gsf_v6 | 41    | 92   | 54248 | 10.062            | 9.644            |
| HLT_Ele25_eta2p1_WPTight_Gsf_v7 | 32    | 86   | 47447 | 9.101             | 8.762            |
+---------------------------------+-------+------+-------+-------------------+------------------+





HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL:

brilcalc lumi -b "STABLE BEAMS" --normtag=/afs/cern.ch/user/l/lumipro/public/normtag_file/normtag_DATACERT.json -i /afs/cern.ch/cms/CAF/CMSCOMM/COMM_DQM/certification/Collisions16/13TeV/Final/Cert_271036-284044_13TeV_PromptReco_Collisions16_JSON.txt -u /fb --hltpath="HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_*"


Only Run BCDE
+----------------------------------------------------+-------+------+-------+-------------------+------------------+
| hltpath                                            | nfill | nrun | ncms  | totdelivered(/fb) | totrecorded(/fb) |
+----------------------------------------------------+-------+------+-------+-------------------+------------------+
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_v3 | 23    | 65   | 29176 | 2.914             | 2.795            |
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_v4 | 4     | 12   | 7544  | 1.012             | 0.968            |
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_v5 | 4     | 16   | 8540  | 1.195             | 1.148            |
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_v6 | 36    | 111  | 78296 | 12.548            | 12.026           |
+----------------------------------------------------+-------+------+-------+-------------------+------------------+

All runs
+-------------------------------------------------------+-------+------+-------+-------------------+------------------+
| hltpath                                               | nfill | nrun | ncms  | totdelivered(/fb) | totrecorded(/fb) |
+-------------------------------------------------------+-------+------+-------+-------------------+------------------+
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_DZ_v1 | 41    | 92   | 54248 | 10.062            | 9.644            |
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_DZ_v4 | 32    | 86   | 47447 | 9.101             | 8.762            |
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_v3    | 23    | 65   | 29176 | 2.914             | 2.795            |
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_v4    | 4     | 12   | 7544  | 1.012             | 0.968            |
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_v5    | 4     | 16   | 8540  | 1.195             | 1.148            |
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_v6    | 41    | 121  | 84811 | 13.672            | 13.097           |
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_v7    | 41    | 92   | 54248 | 10.062            | 9.644            |
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_v9    | 32    | 86   | 47447 | 2.444             | 2.354            |
+-------------------------------------------------------+-------+------+-------+-------------------+------------------+




HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL:

brilcalc lumi -b "STABLE BEAMS" --normtag=/afs/cern.ch/user/l/lumipro/public/normtag_file/normtag_DATACERT.json -i /afs/cern.ch/cms/CAF/CMSCOMM/COMM_DQM/certification/Collisions16/13TeV/Final/Cert_271036-284044_13TeV_PromptReco_Collisions16_JSON.txt -u /fb --hltpath="HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_v*"

Only Run BCDE
+---------------------------------------------------+-------+------+-------+-------------------+------------------+
| hltpath                                           | nfill | nrun | ncms  | totdelivered(/fb) | totrecorded(/fb) |
+---------------------------------------------------+-------+------+-------+-------------------+------------------+
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_v3 | 23    | 65   | 29176 | 2.914             | 2.795            |
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_v4 | 4     | 12   | 7544  | 1.012             | 0.968            |
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_v5 | 4     | 16   | 8540  | 1.195             | 1.148            |
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_v6 | 36    | 111  | 78296 | 12.548            | 12.026           |
+---------------------------------------------------+-------+------+-------+-------------------+------------------+


All runs
+------------------------------------------------------+-------+------+-------+-------------------+------------------+
| hltpath                                              | nfill | nrun | ncms  | totdelivered(/fb) | totrecorded(/fb) |
+------------------------------------------------------+-------+------+-------+-------------------+------------------+
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_DZ_v1 | 41    | 92   | 54248 | 10.062            | 9.644            |
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_DZ_v4 | 32    | 86   | 47447 | 9.101             | 8.762            |
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_v3    | 23    | 65   | 29176 | 2.914             | 2.795            |
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_v4    | 4     | 12   | 7544  | 1.012             | 0.968            |
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_v5    | 4     | 16   | 8540  | 1.195             | 1.148            |
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_v6    | 41    | 121  | 84811 | 13.672            | 13.097           |
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_v7    | 41    | 92   | 54248 | 10.062            | 9.644            |
+------------------------------------------------------+-------+------+-------+-------------------+------------------+



+-----------------------------------------------------+-------+-------+-------+
| recorded lumi (/pb)                                  | BCDE  | FGH   | sum   |
+-----------------------------------------------------+-------+-------+-------+
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL       | 16937 | 10715 | 27652 |
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_DZ    | -     | 18406 | 35342 |
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL      | 16937 | 10641 | 30006 |
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_DZ   | -     | 18406 | 35342 |
+-----------------------------------------------------+-------+-------+-------+
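
As a cross-check, the per-era numbers in the first row can be reassembled from the brilcalc tables above (recorded lumi, in /fb; which runs each path version covers is my reading of the "Only Run BCDE" vs "All runs" tables):

# Mu8_Ele23 row: v3-v6 cover runs B-E, the rest of v6 plus v7 cover F-H,
# and the two DZ versions cover the later runs only.
bcde = 2.795 + 0.968 + 1.148 + 12.026      # v3 + v4 + v5 + v6 (BCDE part)
fgh  = (13.097 - 12.026) + 9.644           # remainder of v6 + v7
dz   = 9.644 + 8.762                       # DZ_v1 + DZ_v4
print(round(bcde, 3), round(fgh, 3), round(bcde + fgh, 3), round(dz, 3))
# -> 16.937 10.715 27.652 18.406   (16937, 10715, 27652 and 18406 /pb in the table)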

 

======================

Update based on the ReReco JSON (1 Jan 2017)

 

+-----------------------------------------------------+-------+-------+-------+
| recorded lumi (/pb)                                  | BCDE  | FGH   | sum   |
+-----------------------------------------------------+-------+-------+-------+
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL       | 20233 |       |       |
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_DZ    | -     | 16357 | 36590 |
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL      | 20233 |       |       |
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_DZ   | -     | 16357 | 36590 |
+-----------------------------------------------------+-------+-------+-------+




 

 

 

From preprocessing (variable ranking by separation):

--- IdTransformation         : Rank : Variable  : Separation
--- IdTransformation         : ----------------------------------
--- IdTransformation         :    1 : ta_pt     : 3.854e-01
--- IdTransformation         :    2 : MTsum     : 3.588e-01
--- IdTransformation         :    3 : MTmutau   : 2.913e-01
--- IdTransformation         :    4 : met_pt    : 2.799e-01
--- IdTransformation         :    5 : Minv      : 2.604e-01
--- IdTransformation         :    6 : Dzeta     : 1.750e-01
--- IdTransformation         :    7 : MT2lester : 1.488e-01
--- IdTransformation         :    8 : MCTb      : 1.348e-01
--- IdTransformation         :    9 : dR        : 1.017e-01
--- IdTransformation         :   10 : mu_pt     : 7.965e-02
--- IdTransformation         :   11 : njets     : 3.695e-02
--- IdTransformation         :   12 : nbtag     : 2.796e-02

 

From the BDT (variable importance):

 

--- BDT                      : Rank : Variable  : Variable Importance
--- BDT                      : -------------------------------------------
--- BDT                      :    1 : dR        : 1.102e-01
--- BDT                      :    2 : ta_pt     : 1.018e-01
--- BDT                      :    3 : MTmutau   : 9.153e-02
--- BDT                      :    4 : MT2lester : 9.106e-02
--- BDT                      :    5 : met_pt    : 9.054e-02
--- BDT                      :    6 : MCTb      : 8.911e-02
--- BDT                      :    7 : MTsum     : 8.211e-02
--- BDT                      :    8 : Minv      : 8.207e-02
--- BDT                      :    9 : mu_pt     : 7.565e-02
--- BDT                      :   10 : Dzeta     : 7.281e-02
--- BDT                      :   11 : njets     : 7.125e-02
--- BDT                      :   12 : nbtag     : 4.193e-02
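
For context, both rankings above are standard TMVA printouts (the method-unspecific separation ranking from the identity transformation, and the BDT's own variable importance). A minimal PyROOT setup that produces this kind of output would look like the sketch below; the file names, tree name and BDT options are placeholders, not the actual training configuration.

import ROOT

ROOT.TMVA.Tools.Instance()
out = ROOT.TFile("tmva_bdt.root", "RECREATE")
factory = ROOT.TMVA.Factory("TMVAClassification", out,
                            "!V:!Silent:Transformations=I:AnalysisType=Classification")
loader = ROOT.TMVA.DataLoader("dataset")

# the 12 input variables ranked above
for var in ["ta_pt", "MTsum", "MTmutau", "met_pt", "Minv", "Dzeta",
            "MT2lester", "MCTb", "dR", "mu_pt", "njets", "nbtag"]:
    loader.AddVariable(var)

# placeholder input files and tree name
sig = ROOT.TFile.Open("signal.root")
bkg = ROOT.TFile.Open("background.root")
loader.AddSignalTree(sig.Get("ntuple"), 1.0)
loader.AddBackgroundTree(bkg.Get("ntuple"), 1.0)
loader.PrepareTrainingAndTestTree(ROOT.TCut(""), "SplitMode=Random:NormMode=NumEvents")

# placeholder BDT settings
factory.BookMethod(loader, ROOT.TMVA.Types.kBDT, "BDT",
                   "NTrees=400:MaxDepth=3:BoostType=AdaBoost")
factory.TrainAllMethods()    # prints the separation and variable-importance rankings
factory.TestAllMethods()
factory.EvaluateAllMethods()
out.Close()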

 

 

 

 

 

The following results are based on the JSON Cert_271036-283059_13TeV_PromptReco_Collisions16_JSON_NoL1T.txt : 31.23/fb

Up to 16/fb : IsoMu22, Ele25_eta2p1_WPTight_Gsf, Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_v* & Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_v*

IsoMu22 : 26074 /pb

brilcalc lumi -b "STABLE BEAMS" --normtag=/afs/cern.ch/user/l/lumipro/public/normtag_file/normtag_DATACERT.json -i /afs/cern.ch/cms/CAF/CMSCOMM/COMM_DQM/certification/Collisions16/13TeV/Cert_271036-283059_13TeV_PromptReco_Collisions16_JSON_NoL1T.txt -u /fb --hltpath="HLT_IsoMu22_v*"

#Summary:

+----------------+-------+------+--------+-------------------+------------------+
| hltpath        | nfill | nrun | ncms   | totdelivered(/fb) | totrecorded(/fb) |
+----------------+-------+------+--------+-------------------+------------------+
| HLT_IsoMu22_v2 | 23    | 65   | 29182  | 2.914             | 2.796            |
| HLT_IsoMu22_v3 | 81    | 212  | 139183 | 22.021            | 21.128           |
| HLT_IsoMu22_v5 | 13    | 19   | 14324  | 2.232             | 2.150            |
+----------------+-------+------+--------+-------------------+------------------+

#Check JSON:

#(run,ls) in json but not in results: [(282917, 199), (282917, 200), (282917, 201), (282918, 57), (282918, 58), (282918, 59), (280363, 233), (283059, 452), (283059, 453), (283059, 454), (283059, 455), (283059, 456), (283059, 457), (283059, 458), (280385, 1397), (280385, 1409), (280385, 1427), (282731, 176), (282732, 73), (282734, 330), (280188, 69), (282730, 168), (282730, 169), (282730, 170), (282730, 171), (283050, 222), (283050, 223), (283050, 224), (283050, 225), (283050, 226), (283050, 227), (283052, 121), (283052, 122), (283052, 123), (283052, 124), (282800, 382), (283043, 247), (283043, 262), (282807, 329), (282807, 330)]

 

IsoMu24 : 31228 /pb

brilcalc lumi -b "STABLE BEAMS" --normtag=/afs/cern.ch/user/l/lumipro/public/normtag_file/normtag_DATACERT.json -i /afs/cern.ch/cms/CAF/CMSCOMM/COMM_DQM/certification/Collisions16/13TeV/Cert_271036-283059_13TeV_PromptReco_Collisions16_JSON_NoL1T.txt -u /fb --hltpath="HLT_IsoMu24_v*"

#Summary:

+----------------+-------+------+--------+-------------------+------------------+
| hltpath        | nfill | nrun | ncms   | totdelivered(/fb) | totrecorded(/fb) |
+----------------+-------+------+--------+-------------------+------------------+
| HLT_IsoMu24_v1 | 23    | 65   | 29182  | 2.914             | 2.796            |
| HLT_IsoMu24_v2 | 89    | 241  | 155143 | 25.941            | 24.857           |
| HLT_IsoMu24_v4 | 15    | 38   | 20307  | 3.745             | 3.575            |
+----------------+-------+------+--------+-------------------+------------------+

#Check JSON:

#(run,ls) in json but not in results: [(282917, 199), (282917, 200), (282917, 201), (282918, 57), (282918, 58), (282918, 59), (280363, 233), (283059, 452), (283059, 453), (283059, 454), (283059, 455), (283059, 456), (283059, 457), (283059, 458), (280385, 1397), (280385, 1409), (280385, 1427), (282731, 176), (282732, 73), (282734, 330), (280188, 69), (282730, 168), (282730, 169), (282730, 170), (282730, 171), (283050, 222), (283050, 223), (283050, 224), (283050, 225), (283050, 226), (283050, 227), (283052, 121), (283052, 122), (283052, 123), (283052, 124), (282800, 382), (283043, 247), (283043, 262), (282807, 329), (282807, 330)]

 

Ele25_eta2p1_WPTight_Gsf : 31228 /pb

brilcalc lumi -b "STABLE BEAMS" --normtag=/afs/cern.ch/user/l/lumipro/public/normtag_file/normtag_DATACERT.json -i /afs/cern.ch/cms/CAF/CMSCOMM/COMM_DQM/certification/Collisions16/13TeV/Cert_271036-283059_13TeV_PromptReco_Collisions16_JSON_NoL1T.txt -u /fb --hltpath="HLT_Ele25_eta2p1_WPTight_Gsf_*"

#Summary:

+---------------------------------+-------+------+-------+-------------------+------------------+
| hltpath                         | nfill | nrun | ncms  | totdelivered(/fb) | totrecorded(/fb) |
+---------------------------------+-------+------+-------+-------------------+------------------+
| HLT_Ele25_eta2p1_WPTight_Gsf_v1 | 23    | 65   | 29182 | 2.914             | 2.796            |
| HLT_Ele25_eta2p1_WPTight_Gsf_v2 | 4     | 12   | 7544  | 1.012             | 0.968            |
| HLT_Ele25_eta2p1_WPTight_Gsf_v3 | 4     | 16   | 8540  | 1.195             | 1.148            |
| HLT_Ele25_eta2p1_WPTight_Gsf_v4 | 24    | 85   | 56872 | 8.796             | 8.454            |
| HLT_Ele25_eta2p1_WPTight_Gsf_v5 | 17    | 36   | 27939 | 4.876             | 4.643            |
| HLT_Ele25_eta2p1_WPTight_Gsf_v6 | 41    | 92   | 54248 | 10.062            | 9.644            |
| HLT_Ele25_eta2p1_WPTight_Gsf_v7 | 15    | 38   | 20307 | 3.745             | 3.575            |
+---------------------------------+-------+------+-------+-------------------+------------------+

#Check JSON:

#(run,ls) in json but not in results: [(282917, 199), (282917, 200), (282917, 201), (282918, 57), (282918, 58), (282918, 59), (280363, 233), (283059, 452), (283059, 453), (283059, 454), (283059, 455), (283059, 456), (283059, 457), (283059, 458), (280385, 1397), (280385, 1409), (280385, 1427), (282731, 176), (282732, 73), (282734, 330), (280188, 69), (282730, 168), (282730, 169), (282730, 170), (282730, 171), (283050, 222), (283050, 223), (283050, 224), (283050, 225), (283050, 226), (283050, 227), (283052, 121), (283052, 122), (283052, 123), (283052, 124), (282800, 382), (283043, 247), (283043, 262), (282807, 329), (282807, 330)]

 

HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL : 28650 /pb

brilcalc lumi -b "STABLE BEAMS" --normtag=/afs/cern.ch/user/l/lumipro/public/normtag_file/normtag_DATACERT.json -i /afs/cern.ch/cms/CAF/CMSCOMM/COMM_DQM/certification/Collisions16/13TeV/Cert_271036-283059_13TeV_PromptReco_Collisions16_JSON_NoL1T.txt -u /fb --hltpath="HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_*"

#Summary:

+-------------------------------------------------------+-------+------+-------+-------------------+------------------+
| hltpath                                               | nfill | nrun | ncms  | totdelivered(/fb) | totrecorded(/fb) |
+-------------------------------------------------------+-------+------+-------+-------------------+------------------+
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_v3    | 23    | 65   | 29182 | 2.914             | 2.796            |
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_v4    | 4     | 12   | 7544  | 1.012             | 0.968            |
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_v5    | 4     | 16   | 8540  | 1.195             | 1.148            |
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_v6    | 41    | 121  | 84811 | 13.672            | 13.097           |
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_v7    | 41    | 92   | 54248 | 10.062            | 9.644            |
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_v9    | 15    | 38   | 20307 | 1.044             | 0.997            |
+-------------------------------------------------------+-------+------+-------+-------------------+------------------+

#Check JSON:

#(run,ls) in json but not in results: [(282917, 199), (282917, 200), (282917, 201), (282918, 57), (282918, 58), (282918, 59), (280363, 233), (283059, 452), (283059, 453), (283059, 454), (283059, 455), (283059, 456), (283059, 457), (283059, 458), (280385, 1397), (280385, 1409), (280385, 1427), (282731, 176), (282732, 73), (282734, 330), (280188, 69), (282730, 168), (282730, 169), (282730, 170), (282730, 171), (283050, 222), (283050, 223), (283050, 224), (283050, 225), (283050, 226), (283050, 227), (283052, 121), (283052, 122), (283052, 123), (283052, 124), (282800, 382), (283043, 247), (283043, 262), (282807, 329), (282807, 330)]

 

HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL : 27653 /pb

brilcalc lumi -b "STABLE BEAMS" --normtag=/afs/cern.ch/user/l/lumipro/public/normtag_file/normtag_DATACERT.json -i /afs/cern.ch/cms/CAF/CMSCOMM/COMM_DQM/certification/Collisions16/13TeV/Cert_271036-283059_13TeV_PromptReco_Collisions16_JSON_NoL1T.txt -u /fb --hltpath="HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_v*"

#Summary:

+---------------------------------------------------+-------+------+-------+-------------------+------------------+
| hltpath                                           | nfill | nrun | ncms  | totdelivered(/fb) | totrecorded(/fb) |
+---------------------------------------------------+-------+------+-------+-------------------+------------------+
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_v3 | 23    | 65   | 29182 | 2.914             | 2.796            |
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_v4 | 4     | 12   | 7544  | 1.012             | 0.968            |
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_v5 | 4     | 16   | 8540  | 1.195             | 1.148            |
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_v6 | 41    | 121  | 84811 | 13.672            | 13.097           |
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_v7 | 41    | 92   | 54248 | 10.062            | 9.644            |
+---------------------------------------------------+-------+------+-------+-------------------+------------------+

 

 

+-----------------------------------------------------+-------+-------+-------+
| recorded lumi (/pb)                                  | BCDE  | FG    | Sum   |
+-----------------------------------------------------+-------+-------+-------+
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL       | 16938 | 10715 | 27653 |
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_DZ    | -     | 13219 | 30157 |
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL      | 16938 | 10641 | 27579 |
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_DZ   | -     | 13219 | 30157 |
+-----------------------------------------------------+-------+-------+-------+

Notes: the total lumi needs to be rechecked, since some runs are still missing from validation. The strategy is as follows.

Trigger

  • IsoMu22 OR IsoMu24
  • Ele25 is good for all of 2016 Run II
  • Mu8_Ele23 & Mu23_Ele12 : OR-ed with the corresponding "DZ" paths for Runs F-G (see the sketch below)
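
A minimal sketch of this OR logic (my own illustration; the per-event flag names are hypothetical, and the exact era boundaries follow the note above):

def passes_2016_triggers(ev, era):
    """ev: dict of per-event HLT decision flags (hypothetical names); era: 'B'..'H'."""
    single_mu = ev["HLT_IsoMu22"] or ev["HLT_IsoMu24"]        # IsoMu22 OR IsoMu24
    single_el = ev["HLT_Ele25_eta2p1_WPTight_Gsf"]            # good for all of 2016
    mu_el = ev["HLT_Mu8_Ele23"] or ev["HLT_Mu23_Ele12"]
    if era in ("F", "G"):
        # later eras: also accept the DZ-filtered versions (OR decision)
        mu_el = mu_el or ev["HLT_Mu8_Ele23_DZ"] or ev["HLT_Mu23_Ele12_DZ"]
    return single_mu or single_el or mu_el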

 

  • Unblinding policy (as taken from the SUSY meetings)
    General policy: re-blind the signal regions.
    New limited sample for tests: Run2016G, runs 278820-279931, ~4.4/fb
    (the Run2016G part of last Friday's JSON Cert_271036-279931_13TeV_PromptReco_Collisions16_JSON_NoL1T.txt).

Triggers for 24.5/fb

+-----------------------+-------+------+--------+-------------------+------------------+
| hltpath               | nfill | nrun | ncms   | totdelivered(/fb) | totrecorded(/fb) |
+-----------------------+-------+------+--------+-------------------+------------------+
| HLT_IsoMu22_v2        | 23    | 65   | 29182  | 2.914             | 2.796            |
| HLT_IsoMu22_v3        | 73    | 192  | 125830 | 19.902            | 19.076           |
+-----------------------+-------+------+--------+-------------------+------------------+

total 21.9/fb

+----------------+-------+------+--------+-------------------+------------------+
| hltpath        | nfill | nrun | ncms   | totdelivered(/fb) | totrecorded(/fb) |
+----------------+-------+------+--------+-------------------+------------------+
| HLT_IsoMu24_v1 | 23    | 65   | 29182  | 2.914             | 2.796            |
| HLT_IsoMu24_v2 | 79    | 206  | 137128 | 22.667            | 21.696           |
+----------------+-------+------+--------+-------------------+------------------+

total 24.5/fb

+---------------------------------+-------+------+-------+-------------------+------------------+
| hltpath                         | nfill | nrun | ncms  | totdelivered(/fb) | totrecorded(/fb) |
+---------------------------------+-------+------+-------+-------------------+------------------+
| HLT_Ele25_eta2p1_WPTight_Gsf_v1 | 23    | 65   | 29182 | 2.914             | 2.796            |
| HLT_Ele25_eta2p1_WPTight_Gsf_v2 | 4     | 12   | 7544  | 1.012             | 0.968            |
| HLT_Ele25_eta2p1_WPTight_Gsf_v3 | 4     | 16   | 8540  | 1.195             | 1.148            |
| HLT_Ele25_eta2p1_WPTight_Gsf_v4 | 24    | 85   | 56872 | 8.796             | 8.454            |
| HLT_Ele25_eta2p1_WPTight_Gsf_v5 | 17    | 36   | 27939 | 4.876             | 4.643            |
| HLT_Ele25_eta2p1_WPTight_Gsf_v6 | 31    | 57   | 36233 | 6.788             | 6.483            |
+---------------------------------+-------+------+-------+-------------------+------------------+

total 24.5/fb

+-------------------------------------------------------+-------+------+-------+-------------------+------------------+
| hltpath                                               | nfill | nrun | ncms  | totdelivered(/fb) | totrecorded(/fb) |
+-------------------------------------------------------+-------+------+-------+-------------------+------------------+
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_v3    | 23    | 65   | 29182 | 2.914             | 2.796            |
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_v4    | 4     | 12   | 7544  | 1.012             | 0.968            |
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_v5    | 4     | 16   | 8540  | 1.195             | 1.148            |
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_v6    | 41    | 121  | 84811 | 13.672            | 13.097           |
| HLT_Mu23_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_v7    | 31    | 57   | 36233 | 6.788             | 6.483            |
+-------------------------------------------------------+-------+------+-------+-------------------+------------------+

total 24.5/fb

+------------------------------------------------------+-------+------+-------+-------------------+------------------+
| hltpath                                              | nfill | nrun | ncms  | totdelivered(/fb) | totrecorded(/fb) |
+------------------------------------------------------+-------+------+-------+-------------------+------------------+
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_v3    | 23    | 65   | 29182 | 2.914             | 2.796            |
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_v4    | 4     | 12   | 7544  | 1.012             | 0.968            |
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_v5    | 4     | 16   | 8540  | 1.195             | 1.148            |
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_v6    | 41    | 121  | 84811 | 13.672            | 13.097           |
| HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_v7    | 31    | 57   | 36233 | 6.788             | 6.483            |
+------------------------------------------------------+-------+------+-------+-------------------+------------------+

total 24.5/fb

 

 

 

  • Move to 80X (Alexis, Ilya)
    This includes making sure that the selection code works and making control plots after the baseline selection.
  • Triggers

    We need the scale factors (SF) for the following triggers (in bold: what we need now) for the JSON Cert_271036-275125_13TeV_PromptReco_Collisions16_JSON.txt (golden, 4.336/fb):

    +----------------------------------------------------+------+------+------+------+--------+------+
    | trigger path                                       | 1e34 | 8e33 | 7e33 | 5e33 | 3.5e33 | 2e33 |
    +----------------------------------------------------+------+------+------+------+--------+------+
    | HLT_IsoMu20_v                                      | 0    | 1    | 1    | 1    | 1      | 1    |
    | HLT_IsoMu22_v                                      | 1    | 1    | 1    | 1    | 1      | 1    |
    | HLT_Ele23_WPLoose_Gsf_v                            | 0    | 0    | 0    | 0    | 1      | 1    |
    | HLT_Ele27_eta2p1_WPLoose_Gsf_v                     | 1    | 1    | 1    | 1    | 1      | 1    |
    | HLT_Ele25_eta2p1_WPTight_Gsf_v                     | 1    | 1    | 1    | 1    | 1      | 1    |
    | HLT_Ele25_eta2p1_WPLoose_Gsf_v                     | 0    | 1    | 1    | 1    | 1      | 1    |
    | HLT_Mu23_TrkIsoVVL_Ele8_CaloIdL_TrackIdL_IsoVL_v   | 1    | 1    | 1    | 1    | 1      | 1    |
    | HLT_Mu8_TrkIsoVVL_Ele23_CaloIdL_TrackIdL_IsoVL_v   | 1    | 1    | 1    | 1    | 1      | 1    |
    | HLT_Mu17_TrkIsoVVL_Ele12_CaloIdL_TrackIdL_IsoVL_v  | 0    | 1    | 1    | 1    | 1      | 1    |
    | HLT_Mu8_TrkIsoVVL_Ele17_CaloIdL_TrackIdL_IsoVL_v   | 0    | 1    | 1    | 1    | 1      | 1    |
    +----------------------------------------------------+------+------+------+------+--------+------+

    Task (Illia Babounikau): estimate the lumi for the triggers we currently use with the latest JSON.

  • Estimate the TFR (Alexis Kalogeropoulos)

 

 

 

 

Plan: run analysisMacro on the single SMS ntuples.

 

 

QCD has a huge weight.

VV includes the WW, WZ, ZZ pythia samples.

StauA : stau = 200 GeV, LSP = 100 GeV

StauB : stau = 500 GeV, LSP = 200 GeV

 

Plotted distributions:

  • dR_muTau
  • dPhi_JMET (all jets)
  • MET
  • MT of the mu-tau pair
  • MT (mu only)
  • MT2 of the mu-tau pair
  • MeffMu
  • MeffMu/sqrt(MET)
  • MeffTau
  • MeffTau/sqrt(MET)
  • InvMassMuTau
  • nJets
  • HT/sqrt(MET)
  • HT4 (HT from the first 4 jets), HT3, HT2, HT
  • pTJ0, pTJ1, pTJ3
  • BJets
  • muPT
  • mu_eta
  • relIso (selected muon)
  • relIso (all muons)
  • miniISO (selected muon)
  • miniISO (all muons)