I followed this recommendation

https://hypernews.cern.ch/HyperNews/CMS/get/susy/2291/1.html

to get pileup systematics.

As an example, I took the signal point C1N2_400_LSP25, which has large statistics.

"Define the MC efficiency in each SR i for Nvtx<20 and Nvtx>20 as e1_i and e2_i. Plot r_i=e2_i/e1_i as a function of SR for a handful of signal points"

"Take r = <r> \pm delta, where delta is some reasonable number based on the plot and <r> is some number very close to 1.0."

To avoid bins with low statistics, I kept only the bins with more than 5 events in both the Nvtx<20 and Nvtx>20 ranges, and from those bins I calculated <r> (the mean of the r_i) and delta (their standard deviation):

<r> = 0.952882

delta = 0.216715
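The steps above can be sketched as follows. The per-SR yields here are hypothetical placeholders (the real numbers come from the signal MC in each search region); only the procedure — build e1_i and e2_i, cut on >5 events in both Nvtx ranges, then take the mean and standard deviation of r_i = e2_i/e1_i — follows the recipe.

```python
import numpy as np

# Hypothetical per-SR event counts (illustration only).
n_lo = np.array([120.0, 45.0, 8.0, 3.0, 60.0])   # events with Nvtx < 20, per SR
n_hi = np.array([110.0, 50.0, 7.0, 2.0, 55.0])   # events with Nvtx > 20, per SR
denom = np.array([200.0, 80.0, 15.0, 10.0, 100.0])  # generated events per SR

e1 = n_lo / denom   # MC efficiency for Nvtx < 20
e2 = n_hi / denom   # MC efficiency for Nvtx > 20

# Keep only SRs with more than 5 events in both Nvtx ranges
mask = (n_lo > 5) & (n_hi > 5)
r = e2[mask] / e1[mask]

r_mean = r.mean()        # <r>
delta = r.std(ddof=0)    # standard deviation of the surviving r_i
print(r_mean, delta)
```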

A line fit to a normalized efficiency (e(N1) = 1.0) is

e(Nvtx) = 1 + (r-1)*(Nvtx-N1)/(N2-N1)

You then take a TH1D of Nvtx for data and one for fastsim MC (these TH1Ds can be the same for every model and every point). Define:

- default = integral of the TH1D rescaled by e(Nvtx) with r = <r>
- up = integral of the TH1D rescaled by e(Nvtx) with r = <r> + delta
- down = integral of the TH1D rescaled by e(Nvtx) with r = <r> - delta

The (normalized) acceptance with the data pileup is default_data, with (asymmetric) uncertainty (+ up_data, - down_data). Compare this with default_MC.
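A minimal sketch of the reweighting step, assuming N1 = 10 and N2 = 30 as the representative Nvtx values of the two ranges (these, and the Nvtx histogram shape, are placeholders; in practice the TH1D comes from data or fastsim MC):

```python
import numpy as np

N1, N2 = 10.0, 30.0
r_mean, delta = 0.952882, 0.216715   # values for C1N2_400_LSP25

# Stand-in for the Nvtx TH1D: bin centers and normalized contents.
nvtx = np.arange(1.0, 41.0)
counts = np.exp(-0.5 * ((nvtx - 18.0) / 6.0) ** 2)  # hypothetical Nvtx shape
counts /= counts.sum()                              # normalize to unit integral

def eff(nvtx, r):
    # Line fit to the normalized efficiency, with e(N1) = 1.0
    return 1.0 + (r - 1.0) * (nvtx - N1) / (N2 - N1)

# Integrals of the histogram rescaled by e(Nvtx)
default = np.sum(counts * eff(nvtx, r_mean))
up = np.sum(counts * eff(nvtx, r_mean + delta))
down = np.sum(counts * eff(nvtx, r_mean - delta))
print(default, up, down)
```

Running the same three integrals on the data and fastsim Nvtx histograms gives default_data (+ up_data, - down_data) and default_MC.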

default_data 0.987979

up_data 0.99996

down_data 0.975998

default_MC 0.986657

Table and plots for more points (to be updated as more points are added):

| | C1N2_400_LSP25 | C1N2_100_LSP1 | C1N2_200_LSP25 | C1N2_400_LSP200 | stau-stau_100_LSP1 |
|---|---|---|---|---|---|
| <r> | 0.952882 | 1.20336 | 0.983691 | 1.05812 | 1.58387 |
| delta | 0.216715 | 0.118576 | 0.425872 | 0.397329 | 0.44983 |
| default_data | 0.987979 | 1.04994 | 0.995482 | 1.01576 | 1.13812 |
| up_data | 0.99996 | 1.05339 | 1.04571 | 1.05856 | 1.18599 |
| down_data | 0.975998 | 1.04649 | 0.945253 | 0.972952 | 1.09025 |
| default_MC | 0.986657 | 1.0553 | 0.995044 | 1.01739 | 1.15391 |