I followed the recommendation below to obtain the pileup systematics.
As an example I took a signal point with large statistics, C1N2_400_LSP25:
"Define the MC efficiency in each SR i for Nvtx<20 and Nvtx>20 as e1_i and e2_i. Plot r_i=e2_i/e1_i as a function of SR for a handful of signal points"
"Take r = <r> \pm delta, where delta is some reasonable number based on the plot and <r> is some number very close to 1.0."
To avoid bins with low statistics I kept only bins with more than 4 events for both Nvtx<20 and Nvtx>20, and from these I calculated <r> and delta (as the standard deviation):
<r> = 0.998215
delta = 0.232258
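A minimal sketch of this step, assuming plain arrays in place of the per-SR histograms (argument names are illustrative, not from the original code):

```python
import numpy as np

def pileup_ratio_stats(e1, e2, n1, n2, min_events=4):
    """<r> and delta from per-SR efficiency ratios r_i = e2_i / e1_i.

    e1, e2 : MC efficiencies per SR for Nvtx<20 and Nvtx>20
    n1, n2 : selected-event counts per SR, used only for the statistics cut
    Bins with min_events or fewer events in either Nvtx category are
    dropped, as described in the note above.
    """
    e1, e2 = np.asarray(e1, float), np.asarray(e2, float)
    n1, n2 = np.asarray(n1), np.asarray(n2)
    keep = (n1 > min_events) & (n2 > min_events)  # statistics cut
    r = e2[keep] / e1[keep]
    return r.mean(), r.std()  # <r>, delta (population std. dev.)
```

With the real per-SR efficiencies this returns the two numbers quoted above.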
A line fit to a normalized efficiency (e(N1) = 1.0) is

e(Nvtx) = 1 + (r-1)*(Nvtx-N1)/(N2-N1)

You then take a TH1D of Nvtx for data and one for fastsim MC (these TH1D can be the same for every model and every point). Define:
- default = integral of the TH1D rescaled by e(Nvtx) with r = <r>
- up = integral of the TH1D rescaled by e(Nvtx) with r = <r> + delta
- down = integral of the TH1D rescaled by e(Nvtx) with r = <r> - delta

The (normalized) acceptance with the data pileup is default_data, with (asymmetric) uncertainty (+up_data, -down_data). Compare this with default_MC.
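The rescaling step above can be sketched as follows, with plain arrays standing in for the TH1D and with hypothetical anchor points N1, N2 (e.g. the mean Nvtx of the two halves):

```python
import numpy as np

def rescaled_integral(counts, centers, r, N1=10.0, N2=30.0):
    """Integral of an Nvtx histogram rescaled bin-by-bin by the line
    e(Nvtx) = 1 + (r-1)*(Nvtx-N1)/(N2-N1), normalized so e(N1) = 1.
    counts/centers stand in for a ROOT TH1D; N1, N2 are illustrative."""
    centers = np.asarray(centers, float)
    e = 1.0 + (r - 1.0) * (centers - N1) / (N2 - N1)
    return float(np.sum(np.asarray(counts, float) * e))

# default / up / down variations from <r> and delta (toy Nvtx histogram)
r_mean, delta = 0.998215, 0.232258
counts, centers = [120.0, 340.0, 90.0], [10.0, 20.0, 30.0]
default = rescaled_integral(counts, centers, r_mean)
up      = rescaled_integral(counts, centers, r_mean + delta)
down    = rescaled_integral(counts, centers, r_mean - delta)
```

Running this once on the data Nvtx TH1D and once on the fastsim MC one gives default_data and default_MC.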
Table and plots for more signal points (to be updated as more points are processed).
Based on this comparison, assess a constant uncertainty, the same over all bins, and make it 100% correlated across bins in combine.
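In a combine datacard this can be implemented as a single lnN nuisance applied to the signal column of every bin; using the same nuisance name in all bins makes it 100% correlated. The name and the 23% value below are illustrative (taken from the delta above), not from the actual datacard:

```
# shared nuisance name => fully correlated across bins in combine
pileup   lnN   1.23   1.23   1.23
```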
default_data with its uncertainty is in agreement with default_MC.