Merged
39 commits
55436a1
Use 20 quantiles for yll instead of 10
davidwalter2 Jan 15, 2026
150189d
Update wremnants-data
davidwalter2 Jan 16, 2026
90eb410
Merge branch 'main' of github.com:WMass/WRemnants into 260116_updBinning
davidwalter2 Jan 16, 2026
c947d3f
Use new quantile file
davidwalter2 Jan 16, 2026
50b474f
Synchronize sample paths; Treat DY low mass similar to Zmumu
davidwalter2 Jan 16, 2026
eff3a82
Some more harmonizations in datasets; common xsec definitions
davidwalter2 Jan 17, 2026
650b069
Use single file for angular coefficients shared across main analysis,…
davidwalter2 Jan 17, 2026
162950d
Fixes to previous commits
davidwalter2 Jan 17, 2026
e486a7c
Update wremnants-data with new angular coefficient file
davidwalter2 Jan 17, 2026
c6c2197
fix ci
davidwalter2 Jan 17, 2026
de9dc1a
Rename samples ending with 'PostVFP' to '_2016PostVFP' and introduce …
davidwalter2 Jan 19, 2026
db06e92
Further unify binning; adapt 'make_theory_corr' to use different eras
davidwalter2 Jan 19, 2026
a388aaf
Change logic to maintain backwards compatibility with previous byHeli…
davidwalter2 Jan 19, 2026
e0ece9d
Fix typos
davidwalter2 Jan 20, 2026
9558dcb
Fix CI
davidwalter2 Jan 20, 2026
56485e3
Revert correcting DYLowMass dataset
davidwalter2 Jan 20, 2026
17db1fa
Update wremnants-data with new theory corrections
davidwalter2 Jan 20, 2026
35cb7d0
Add fine binning in ptVgen and absYVGen option for theory corrections…
davidwalter2 Jan 20, 2026
f81bbb2
Add reference to new BSM sample
davidwalter2 Jan 20, 2026
e6750b9
Use more fine binning for byHelicity corrections and uncertainties
davidwalter2 Jan 21, 2026
97c6bd9
Adapt latest changes of iterative unfolding
davidwalter2 Jan 25, 2026
37e25ca
Add betavar templates
davidwalter2 Jan 25, 2026
1872570
Update rabbit version, pass metadata to rabbit from setupCombine
kdlong Jan 26, 2026
6730177
Fix accidental change of submodules
kdlong Jan 26, 2026
443c3ad
Update theory corrections by helicity and helper scripts
davidwalter2 Jan 27, 2026
e4d23c2
Merge branch 'rabbitFileMetadata' of https://github.com/kdlong/WRemna…
davidwalter2 Jan 27, 2026
2f2eb45
Adapt CI to new version of rabbit
davidwalter2 Jan 27, 2026
6fc0aef
Adapt CI to new version of rabbit
davidwalter2 Jan 27, 2026
c25296f
Set rabbit to the latest version
kdlong Jan 27, 2026
0d7d1ba
Harmonize dataset dicts
davidwalter2 Jan 27, 2026
0e95401
Update theory correction scripts
davidwalter2 Jan 27, 2026
ff7bfdb
Add option to constrain NOI variations
davidwalter2 Jan 27, 2026
1876c9b
update rabbit, exclude BB profiling from external postfit in CI
kdlong Jan 27, 2026
ff355ad
Merge branch 'rabbitFileMetadata' of https://github.com/kdlong/WRemna…
davidwalter2 Jan 27, 2026
782057f
Update submodules rabbit and wremnants-data
davidwalter2 Jan 27, 2026
6d32ed2
Fix extended dataset option for lowPU
davidwalter2 Jan 27, 2026
798d145
Change CI back to previous logic for externalPostfit
davidwalter2 Jan 28, 2026
a94b163
Fix lowPU
davidwalter2 Jan 28, 2026
9b07326
Fix low PU in CI and also fix plotting low PU unfolded distributions
davidwalter2 Jan 28, 2026
44 changes: 25 additions & 19 deletions .github/workflows/main.yml
@@ -29,7 +29,6 @@ env:
OUTFILE_LOWPU_EE: "mz_lowPU_ee.hdf5"
OUTFILE_LOWPU_MUMU: "mz_lowPU_mumu.hdf5"
DATAPATH: "/scratch/shared/NanoAOD/"
DATAPATH_LOWPU: "/scratch/shared/NanoAOD/LowPU/"
NOMINAL_FAKE_SMOOTHING: "hybrid"

# A workflow run is made up of one or more jobs that can run sequentially or in parallel
@@ -315,7 +314,7 @@ jobs:
- name: bsm rabbit setup
run: >-
scripts/ci/run_with_singularity.sh scripts/ci/setup_and_run_python.sh scripts/rabbit/setupRabbit.py
-i $HIST_FILE --lumiScale $LUMI_SCALE --addBSMMixing WtoNMu_5 0.01 --breitwignerWMassWeights
-i $HIST_FILE --lumiScale $LUMI_SCALE --addBSMMixing WtoNMuMass5 0.01 --breitwignerWMassWeights
--postfix bsm -o $WREMNANTS_OUTDIR

- name: bsm rabbit fit
@@ -330,19 +329,19 @@
run: >-
scripts/ci/run_with_singularity.sh scripts/ci/setup_and_run.sh rabbit_limit.py
$WREMNANTS_OUTDIR/WMass_eta_pt_charge_bsm/WMass.hdf5 -o $WREMNANTS_OUTDIR/WMass_eta_pt_charge_bsm/
-t -1 --asymptoticLimits WtoNMu_5_mixing --modes gaussian
-t -1 --asymptoticLimits WtoNMuMass5_mixing --modes gaussian

- name: bsm plot postfit variations
run: >-
scripts/ci/run_with_singularity.sh scripts/ci/setup_and_run.sh rabbit_plot_hists.py '$WREMNANTS_OUTDIR/WMass_eta_pt_charge_bsm/fitresults.hdf5'
-o $WEB_DIR/$PLOT_DIR/BSM -m Project -m Normalize --title CMS --subtitle Preliminary --titlePos 0 --yscale '1.3'
--result asimov --lowerLegCols 3 --rrange 0.98 1.02
--varNames massShiftW100MeV 'WtoNMu_5_mixing' --varColors red blue
--varNames massShiftW100MeV 'WtoNMuMass5_mixing' --varColors red blue

# - name: bsm plot parameter correlations
# run: >-
# scripts/ci/run_with_singularity.sh scripts/ci/setup_and_run.sh rabbit_plot_hists_cov.py '$WREMNANTS_OUTDIR/WMass_eta_pt_charge_bsm/fitresults.hdf5'
# -o $WEB_DIR/$PLOT_DIR/BSM --params 'WtoNMu_5' massShiftW100MeV
# -o $WEB_DIR/$PLOT_DIR/BSM --params 'WtoNMuMass5' massShiftW100MeV
# --title CMS --subtitle Preliminary --titlePos 0 --config 'utilities/styles/styles.py' --correlation --showNumbers

# - name: bsm plot pulls and impacts
@@ -354,7 +353,7 @@
# - name: bsm likelihood scan
# run: >-
# scripts/ci/run_with_singularity.sh scripts/ci/setup_and_run.sh rabbit_plot_likelihood_scan.py '$WREMNANTS_OUTDIR/WMass_eta_pt_charge_bsm/fitresults.hdf5'
# -o $WEB_DIR/$PLOT_DIR/BSM --params 'WtoNMu_5' --title CMS --subtitle Preliminary --titlePos 0 --config 'utilities/styles/styles.py'
# -o $WEB_DIR/$PLOT_DIR/BSM --params 'WtoNMuMass5' --title CMS --subtitle Preliminary --titlePos 0 --config 'utilities/styles/styles.py'

w-plotting:
# The type of runner that the job will run on
@@ -506,7 +505,7 @@ jobs:
scripts/ci/run_with_singularity.sh scripts/ci/setup_and_run.sh rabbit_plot_pulls_and_impacts.py
$WREMNANTS_OUTDIR/WMass_eta_pt_charge_poiAsNoi/WMass_absEtaGen_ptGen_qGen_theoryfit/fitresults.hdf5
--grouping max --config 'utilities/styles/styles.py' --postfix poiAsNoi
-o $WEB_DIR/$PLOT_DIR/unfolding_mw -n 50 --otherExtensions pdf png --showNumbers --oneSidedImpacts -s absimpact --scale 1.5
-o $WEB_DIR/$PLOT_DIR/unfolding_mw -n 50 --otherExtensions pdf png --showNumbers --oneSidedImpacts -s absimpact --scaleImpacts 1.5
Collaborator:

I don't think you have to implement it for this PR, but it should be possible to take this from the pdfInfo to avoid it having to be manually synchronised.

Collaborator (Author):

I think we want to keep the rabbit scripts analysis independent, so I don't see how the pdfInfo can be used here. Maybe through the specified config file. But I agree to do this at some later time.

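As a rough illustration of the config-file route mentioned above, the scale could be looked up in the analysis config that is already passed to the plotting script via `--config`, so the rabbit script itself stays analysis independent. This is only a sketch under stated assumptions: the `impact_scales` dictionary and the lookup helper are hypothetical, not existing WRemnants or rabbit API; only the `importlib` machinery is standard Python.

```python
# Hypothetical sketch: read the impact scale from the analysis config
# (e.g. utilities/styles/styles.py) instead of hard-coding --scaleImpacts 1.5
# in the CI workflow. "impact_scales" is an assumed attribute for illustration.
import importlib.util


def load_impact_scale(config_path, key, default=1.0):
    """Return a scale factor from a python config module, if it defines one."""
    spec = importlib.util.spec_from_file_location("analysis_config", config_path)
    cfg = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(cfg)
    # e.g. impact_scales = {"poiAsNoi": 1.5} could live next to the style dicts
    return getattr(cfg, "impact_scales", {}).get(key, default)


scale = load_impact_scale("utilities/styles/styles.py", "poiAsNoi")
print(f"--scaleImpacts {scale}")
```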

lowpu-w:
# The type of runner that the job will run on
@@ -537,7 +536,7 @@
- name: lowpu w mu analysis
run: >-
scripts/ci/run_with_singularity.sh scripts/ci/setup_and_run_python.sh scripts/histmakers/mw_lowPU.py
--dataPath $DATAPATH_LOWPU -o $WREMNANTS_OUTDIR -j $NTHREADS --forceDefaultName --unfolding --unfoldingLevels postfsr
--dataPath $DATAPATH -o $WREMNANTS_OUTDIR -j $NTHREADS --forceDefaultName --unfolding --unfoldingLevels postfsr

- name: lowpu w mu plot ptW
run: >-
@@ -567,7 +566,7 @@
- name: lowpu w e analysis
run: >-
scripts/ci/run_with_singularity.sh scripts/ci/setup_and_run_python.sh scripts/histmakers/mw_lowPU.py
--dataPath $DATAPATH_LOWPU -o $WREMNANTS_OUTDIR -j $NTHREADS --forceDefaultName --flavor e --unfolding --unfoldingLevels postfsr
--dataPath $DATAPATH -o $WREMNANTS_OUTDIR -j $NTHREADS --forceDefaultName --flavor e --unfolding --unfoldingLevels postfsr

- name: lowpu w e plot ptW
run: >-
@@ -639,7 +638,7 @@
- name: lowpu z mumu analysis
run: >-
scripts/ci/run_with_singularity.sh scripts/ci/setup_and_run_python.sh scripts/histmakers/mz_lowPU.py --unfolding --unfoldingLevels postfsr
--dataPath $DATAPATH_LOWPU -o $WREMNANTS_OUTDIR -j $NTHREADS --forceDefaultName
--dataPath $DATAPATH -o $WREMNANTS_OUTDIR -j $NTHREADS --forceDefaultName

- name: lowpu z mumu plot ptll yll
run: >-
@@ -659,7 +658,7 @@
- name: lowpu z ee analysis
run: >-
scripts/ci/run_with_singularity.sh scripts/ci/setup_and_run_python.sh scripts/histmakers/mz_lowPU.py --unfolding --unfoldingLevels postfsr
--dataPath $DATAPATH_LOWPU -o $WREMNANTS_OUTDIR -j $NTHREADS --forceDefaultName --flavor ee
--dataPath $DATAPATH -o $WREMNANTS_OUTDIR -j $NTHREADS --forceDefaultName --flavor ee

- name: lowpu z ee plot ptll yll
run: >-
@@ -711,13 +710,20 @@ jobs:
scripts/ci/run_with_singularity.sh scripts/ci/setup_and_run.sh rabbit_fit.py
$RABBIT_DIR/Combination_WMass_lowPUZMass_lowPU_ptll/Combination.hdf5 -t -1 -o $RABBIT_DIR/Combination_WMass_lowPUZMass_lowPU_ptll/
--doImpacts --globalImpacts
--saveHists --computeHistErrors --computeHistCov
--saveHists --computeHistErrors --computeHistCov -m Project ch0_masked ptVGen -m Project ch1_masked ptVGen -m Select ch2_masked -m Select ch3_masked

- name: lowpu rabbit unfolded xsec plot
- name: lowpu rabbit unfolded xsec plots W
run: >-
scripts/ci/run_with_singularity.sh scripts/ci/setup_and_run.sh rabbit_plot_hists.py --config utilities/styles/styles.py
$RABBIT_DIR/Combination_WMass_lowPUZMass_lowPU_ptll/fitresults.hdf5 -o $WEB_DIR/$PLOT_DIR/lowPU
--rrange 0.0 2.0 --unfoldedXsec --noUncertainty --chisq none --legCols 1 --noSciy --titlePos 0 --title CMS --subtitle Preliminary
--rrange 0.0 2.0 --unfoldedXsec --noUncertainty --chisq none --legCols 1 --noSciy --titlePos 0 --title CMS --subtitle Preliminary -m Project

- name: lowpu rabbit unfolded xsec plots Z
run: >-
scripts/ci/run_with_singularity.sh scripts/ci/setup_and_run.sh rabbit_plot_hists.py --config utilities/styles/styles.py
$RABBIT_DIR/Combination_WMass_lowPUZMass_lowPU_ptll/fitresults.hdf5 -o $WEB_DIR/$PLOT_DIR/lowPU
--rrange 0.0 2.0 --unfoldedXsec --noUncertainty --chisq none --legCols 1 --noSciy --titlePos 0 --title CMS --subtitle Preliminary -m Select


wlike:
# The type of runner that the job will run on
@@ -1011,7 +1017,7 @@ jobs:
scripts/ci/run_with_singularity.sh scripts/ci/setup_and_run.sh rabbit_plot_pulls_and_impacts.py
$WREMNANTS_OUTDIR/ZMassDilepton_ptll_yll_poiAsNoi/ZMassDilepton_ptVGen_absYVGen_theoryfit/fitresults.hdf5
--grouping max --config 'utilities/styles/styles.py' --postfix poiAsNoi
-o $WEB_DIR/$PLOT_DIR/unfolding_dilepton -n 50 --otherExtensions pdf png --showNumbers --oneSidedImpacts -s absimpact --scale 1.5
-o $WEB_DIR/$PLOT_DIR/unfolding_dilepton -n 50 --otherExtensions pdf png --showNumbers --oneSidedImpacts -s absimpact --scaleImpacts 1.5

dilepton-unfolding:
# The type of runner that the job will run on
@@ -1118,7 +1124,7 @@ jobs:
scripts/ci/run_with_singularity.sh scripts/ci/setup_and_run.sh rabbit_plot_pulls_and_impacts.py
$WREMNANTS_OUTDIR/ZMassDilepton_ptll_yll_unfolding/ZMassDilepton_ptVGen_absYVGen_theoryfit/fitresults.hdf5
--grouping max --config 'utilities/styles/styles.py' --postfix unfolding
-o $WEB_DIR/$PLOT_DIR/unfolding_dilepton -n 50 --otherExtensions pdf png --showNumbers --oneSidedImpacts -s absimpact --scale 1.5
-o $WEB_DIR/$PLOT_DIR/unfolding_dilepton -n 50 --otherExtensions pdf png --showNumbers --oneSidedImpacts -s absimpact --scaleImpacts 1.5


dilepton-plotting:
@@ -1165,7 +1171,7 @@ jobs:
- name: dilepton plotting yll
run: >-
scripts/ci/run_with_singularity.sh scripts/ci/setup_and_run_python.sh scripts/plotting/makeDataMCStackPlot.py
--yscale 1.3 --baseName nominal_yll --nominalRef nominal_yll --hists yll --fineGroups -o $WEB_DIR -f $PLOT_DIR -p z $HIST_FILE
--yscale 1.3 --baseName nominal_ptll_yll --nominalRef nominal_ptll_yll --hists yll --fineGroups -o $WEB_DIR -f $PLOT_DIR -p z $HIST_FILE

- name: dilepton plotting cosThetaStarll
run: >-
@@ -1187,8 +1193,8 @@
run: >-
scripts/ci/run_with_singularity.sh scripts/ci/setup_and_run.sh rabbit_fit.py
$WREMNANTS_OUTDIR/ZMassDilepton_ptll/ZMassDilepton.hdf5 --saveHists --saveHistsPerProcess --computeHistErrors
--externalPostfit $WREMNANTS_OUTDIR/ZMassWLike_eta_pt_charge/fitresults_uncorr.hdf5 --externalPostfitResult uncorr --pseudoData uncorr
-o $WREMNANTS_OUTDIR/ZMassDilepton_ptll/ --postfix from_ZMassWLike_eta_pt_charge
--externalPostfit $WREMNANTS_OUTDIR/ZMassWLike_eta_pt_charge/fitresults_uncorr.hdf5 --externalPostfitResult uncorr --noPostfitProfileBB
-o $WREMNANTS_OUTDIR/ZMassDilepton_ptll/ --postfix from_ZMassWLike_eta_pt_charge --pseudoData uncorr

- name: dilepton ptll from wlike postfit
run: >-
@@ -55,7 +55,11 @@ def main():

pdf = THEORY_PREDS[pred]["pdf"]

command = f"python {os.environ['WREM_BASE']}/scripts/histmakers/w_z_gen_dists.py --useCorrByHelicityBinning --theoryCorr {pred} -o {args.outdir} --maxFiles -1 -j 300 --filterProcs ZmumuPostVFP WplusmunuPostVFP WminusmunuPostVFP --addHelicityAxis --pdf {pdf}"
command = f"""
python {os.environ['WREM_BASE']}/scripts/histmakers/w_z_gen_dists.py --theoryCorr {pred} \
--filterProcs 'Zmumu_MiNNLO' 'Wplusmunu_MiNNLO' 'Wminusmunu_MiNNLO' --aggregateGroups Zmumu Wmunu \
-o {args.outdir} --maxFiles -1 -j 300 --addHelicityAxis --pdf {pdf}
"""
print(f"Running command: {command}")
os.system(command)
Collaborator:

os.system from a python script to call a python script is kind of criminal...

Collaborator (Author):

First of all, I didn't add this script. But I find it useful. If the histmaker scripts were written with a main function etc., it could be possible to import and run them ... I would leave this for future work.

Collaborator:

@kdlong, David speaks the truth: he bears no more guilt than you, it is I who is to be blamed for this terrible sin. Punish me as you see fit.

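For illustration of the less "criminal" alternative discussed above, the same histmaker invocation could be built as an explicit argument list and dispatched with `subprocess.run`, which avoids shell quoting and line continuations and fails loudly on a non-zero exit code. This is only a sketch, not the PR's code: the placeholder values stand in for the loop variables (`pred`, `args.outdir`, the PDF name) used in the scripts above, while the flags themselves are taken from the diff.

```python
# Sketch (assumption, not WRemnants code): replace os.system(command) with a
# subprocess call on an explicit argument list.
import os
import subprocess

# Placeholders standing in for the loop variables of the calling script.
pred = "<theory_pred>"
pdf = "<pdf_set>"
outdir = "<output_dir>"

cmd = [
    "python",
    f"{os.environ['WREM_BASE']}/scripts/histmakers/w_z_gen_dists.py",
    "--theoryCorr", pred,
    "--filterProcs", "Zmumu_MiNNLO", "Wplusmunu_MiNNLO", "Wminusmunu_MiNNLO",
    "--aggregateGroups", "Zmumu", "Wmunu",
    "-o", outdir, "--maxFiles", "-1", "-j", "300",
    "--addHelicityAxis", "--pdf", pdf,
]
print("Running command:", " ".join(cmd))
subprocess.run(cmd, check=True)  # raises CalledProcessError if the histmaker exits non-zero
```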

@@ -47,9 +47,13 @@ def main():

for pred in args.preds:

command = f"python {os.environ['WREM_BASE']}/scripts/histmakers/w_z_gen_dists.py --useCorrByHelicityBinning --theoryCorr {pred} -o {args.outdir} --maxFiles '-1' -j 300 --filterProcs ZmumuPostVFP WplusmunuPostVFP WminusmunuPostVFP --addHelicityAxis --pdf {THEORY_PREDS[pred]['pdf']}"
command = f"""
python {os.environ['WREM_BASE']}/scripts/histmakers/w_z_gen_dists.py --theoryCorr {pred} \
--filterProcs 'Zmumu_MiNNLO' 'Wplusmunu_MiNNLO' 'Wminusmunu_MiNNLO' --aggregateGroups Zmumu Wmunu \
-o {args.outdir} --addHelicityAxis --pdf {THEORY_PREDS[pred]['pdf']} --maxFiles '-1' -j 300
"""
print(f"Running command: {command}")
# os.system(command)
os.system(command)

if args.skim:
skim_command = f"python {os.environ['WREM_BASE']}/utilities/open_narf_h5py.py {args.outdir}/w_z_gen_dists_{pred + "_Corr"}_maxFiles_m1.hdf5 --filterHistsRegex '^(.*pdfvars_Corr.*|nominal_gen_pdf_uncorr)$' --outfile {args.outdir}/w_z_gen_dists_{pred + "_Corr"}_maxFiles_m1_skimmed.hdf5"
17 changes: 14 additions & 3 deletions scripts/corrections/corrs_by_helicity/make_pdf_gen_hists.py
@@ -40,7 +40,13 @@ def parse_arguments():
action="store_true",
help="If set, will run a skimming step to only keep the PDF histograms in the file, saving a new output file.",
)

parser.add_argument(
"-j",
"--njobs",
type=int,
default=300,
help="Number of parallel threads",
)
return parser.parse_args()


@@ -55,12 +61,17 @@ def main():

for pdf in args.pdf:

command = f"python {os.environ['WREM_BASE']}/scripts/histmakers/w_z_gen_dists.py --useCorrByHelicityBinning --pdf {pdf} -o {args.outdir} --maxFiles '-1' -j 300 --filterProcs ZmumuPostVFP WplusmunuPostVFP WminusmunuPostVFP --addHelicityAxis --postfix pdfByHelicity"
command = f"""
python {os.environ['WREM_BASE']}/scripts/histmakers/w_z_gen_dists.py --pdf {pdf} -o {args.outdir} --maxFiles '-1' -j {args.njobs} \
--filterProcs 'Zmumu_MiNNLO' 'Wplusmunu_MiNNLO' 'Wminusmunu_MiNNLO' --aggregateGroups Zmumu Wmunu \
--addHelicityAxis --postfix pdfByHelicity
"""
print(f"Running command: {command}")
os.system(command)

if args.skim:
skim_command = f"python {os.environ['WREM_BASE']}/utilities/open_narf_h5py.py {args.outdir}/w_z_gen_dists_maxFiles_m1_{pdf}_pdfByHelicity.hdf5 --filterHists nominal_gen_pdf --excludeHists alpha --outfile {args.outdir}/w_z_gen_dists_maxFiles_m1_{pdf}_pdfByHelicity_skimmed.hdf5"
pdf_replace = f"_{pdf}" if pdf != "ct18z" else ""
skim_command = f"python {os.environ['WREM_BASE']}/utilities/open_narf_h5py.py {args.outdir}/w_z_gen_dists_maxFiles_m1{pdf_replace}_pdfByHelicity.hdf5 --filterHists nominal_gen_pdf --excludeHists alpha --outfile {args.outdir}/w_z_gen_dists_maxFiles_m1_{pdf}_pdfByHelicity_skimmed.hdf5"
print(f"Running skimming command: {skim_command}")
os.system(skim_command)

12 changes: 6 additions & 6 deletions scripts/corrections/make_muon_response_maps.py
@@ -20,12 +20,12 @@
hist_response_smeared = None

procs = []
procs.append("ZmumuPostVFP")
procs.append("ZtautauPostVFP")
procs.append("WplusmunuPostVFP")
procs.append("WminusmunuPostVFP")
procs.append("WplustaunuPostVFP")
procs.append("WminustaunuPostVFP")
procs.append("Zmumu_2016PostVFP")
procs.append("Ztautau_2016PostVFP")
procs.append("Wplusmunu_2016PostVFP")
procs.append("Wminusmunu_2016PostVFP")
procs.append("Wplustaunu_2016PostVFP")
procs.append("Wminustaunu_2016PostVFP")


with h5py.File(infile, "r") as f:
2 changes: 1 addition & 1 deletion scripts/corrections/make_ptv_unfolding_corr.py
@@ -58,7 +58,7 @@

logger = logging.setup_logger("make_ptv_unfolding_corr", 4 if args.debug else 3)

genh = input_tools.read_and_scale(args.genFile, "ZmumuPostVFP", "nominal_gen")
genh = input_tools.read_and_scale(args.genFile, "Zmumu_2016PostVFP", "nominal_gen")

unfolded_res = pickle.load(open(args.unfoldingFile, "rb"))
unfolded_datah = unfolded_res["results"]["pmaskedexp"]["chan_13TeV"]["Z"][