I followed this recommendation

https://hypernews.cern.ch/HyperNews/CMS/get/susy/2291/1.html

to get pileup systematics.

 

As an example, I took a signal point with large statistics, C1N2_400_LSP25.

"Define the MC efficiency in each SR i for Nvtx<20 and Nvtx>20
as e1_i and e2_i.
Plot r_i=e2_i/e1_i as a function of SR for a handful of
signal points"
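A minimal sketch (my own illustration, not the analysis code) of forming the per-SR ratio r_i = e2_i/e1_i from the two MC efficiencies; the efficiency values below are placeholders:

```python
# Per-SR MC efficiencies, one entry per signal region (placeholder numbers):
e1 = [0.40, 0.35, 0.20, 0.10]   # efficiency e1_i for Nvtx < 20
e2 = [0.38, 0.36, 0.18, 0.11]   # efficiency e2_i for Nvtx > 20

# Ratio r_i = e2_i / e1_i, plotted as a function of SR index
r = [e2_i / e1_i for e1_i, e2_i in zip(e1, e2)]
```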

 

 

C1N2_400_LSP25efficiencies.pdf




"Take r = <r> \pm delta, where delta is some reasonable number
based on the plot and <r> is some number very close to 1.0."


To avoid bins with low statistics, I kept only bins with more than 5 events in both the Nvtx<20 and Nvtx>20 samples, and from these I calculated <r> and delta (taken as the standard deviation):

<r> = 0.952882

delta = 0.216715
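The bin selection and the <r>/delta extraction can be sketched as follows (assumed implementation; the per-SR tuples are illustrative placeholders, and the sample standard deviation is one reasonable choice for delta):

```python
import statistics

# Per SR: (events with Nvtx<20, events with Nvtx>20, ratio r_i = e2_i/e1_i).
# Placeholder values, not analysis numbers.
bins = [(12, 9, 0.95), (30, 25, 1.05), (4, 7, 1.60), (8, 6, 0.88), (6, 2, 0.30)]

# Keep only bins with more than 5 events on both sides of the Nvtx cut
kept = [r for n1, n2, r in bins if n1 > 5 and n2 > 5]

r_mean = statistics.mean(kept)    # <r>
delta  = statistics.stdev(kept)   # delta as the (sample) standard deviation
```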

 
A line fit to a normalized efficiency (e(N1)=1.0) is
e(Nvtx) = 1 + (r-1)*(Nvtx-N1)/(N2-N1)
You then take a TH1D of Nvtx for data and one for fastsim MC.
(These TH1D can be the same for every model and every point).
Define
default = integral of the TH1D rescaled by e(Nvtx) with r = <r>
up      = integral of the TH1D rescaled by e(Nvtx) with r = <r> + delta
down    = integral of the TH1D rescaled by e(Nvtx) with r = <r> - delta
The (normalized) acceptance with the data pileup is
  default_data with (asymmetric) uncertainty (+ up_data - down_data)
Compare this with default_MC
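The default/up/down recipe above can be sketched in code. This is an assumed implementation using a plain dict in place of a ROOT TH1D; N1, N2, and the Nvtx distribution are placeholders:

```python
def eff(nvtx, r, n1=10.0, n2=30.0):
    """Linear efficiency normalized so that e(N1) = 1:
    e(Nvtx) = 1 + (r-1)*(Nvtx-N1)/(N2-N1)."""
    return 1.0 + (r - 1.0) * (nvtx - n1) / (n2 - n1)

def rescaled_integral(nvtx_hist, r):
    """Integral of the Nvtx histogram reweighted by e(Nvtx),
    normalized to the unweighted integral."""
    total = sum(nvtx_hist.values())
    return sum(count * eff(nvtx, r) for nvtx, count in nvtx_hist.items()) / total

# Placeholder Nvtx profile: {bin center: events}
data_nvtx = {5: 50, 15: 200, 25: 150, 35: 40}

r_mean, delta = 0.952882, 0.216715   # <r> and delta quoted above

default = rescaled_integral(data_nvtx, r_mean)
up      = rescaled_integral(data_nvtx, r_mean + delta)
down    = rescaled_integral(data_nvtx, r_mean - delta)
```

Running the same function over the fastsim-MC Nvtx histogram with r = <r> gives default_MC for the comparison.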

 

default_data  0.987979

up_data  0.99996

down_data  0.975998

default_MC  0.986657

 

Table and plots for more points (to be updated as further points are added):

              C1N2_400_LSP25  C1N2_100_LSP1  C1N2_200_LSP25  C1N2_400_LSP200  stau-stau_100_LSP1
<r>           0.952882        1.20336        0.983691        1.05812          1.58387
delta         0.216715        0.118576       0.425872        0.397329         0.44983
default_data  0.987979        1.04994        0.995482        1.01576          1.13812
up_data       0.99996         1.05339        1.04571         1.05856          1.18599
down_data     0.975998        1.04649        0.945253        0.972952         1.09025
default_MC    0.986657        1.0553         0.995044        1.01739          1.15391


stau-stau_100_LSP1efficiencies.pdf
C1N2_400_LSP200efficiencies.pdf
C1N2_200_LSP25efficiencies.pdf
C1N2_100_LSP1efficiencies.pdf
