TF error bands, bug fixes, minor improvements #45

Conversation
Thanks a lot for the errorband feature and fix 3 (that likely explains why I never saw any change to the fit results when specifying errors...). One question - on

> but when checking the behavior in some different cases, the uncertainty on the per-bin yields is already 1 as desired, so it seemed simpler just to remove it.

Do you mean...? I still use it in my fits, so I would be in favour of leaving the test as it is, but I haven't checked in a long time whether it's still necessary, so I leave it up to @nsmith- and others.
I discussed this with @nsmith- when I was confused by the output from the test. He explained it as follows: whether you scale `p` as `p*sigmascale` or `p/sigmascale` just depends on how you define `sigmascale`. But having `sigmascale` in the base of the expression is just wrong, as far as I can tell.
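For concreteness, my reading of the two forms being contrasted here (they correspond to the `oldfactor` and `newfactor` expressions in the reference script below) is:

```math
\text{old: } n_\text{exp} = n\left(1 + \tfrac{\sigma_\text{scale}}{\sqrt{n}}\right)^{p}
\qquad\qquad
\text{new: } n_\text{exp} = n\left(1 + \tfrac{1}{\sqrt{n}}\right)^{p\,\sigma_\text{scale}}
```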
Right, having `sigmascale` in the base of the expression was just a longstanding bug. I suppose to be consistent with the current usage we would want
For reference:

```python
import numpy as np
import matplotlib.pyplot as plt

param = np.linspace(-1, 1, 101)
n = 100
sigmascale = 10
# old: sigmascale in the base of the expression
oldfactor = n * (1 + sigmascale / np.sqrt(n)) ** param
# new: sigmascale in the exponent
newfactor = n * (1 + 1 / np.sqrt(n)) ** (param * sigmascale)

fig, ax = plt.subplots()
ax.plot(param, oldfactor, label="old")
ax.plot(param, newfactor, label="new")
# nominal yield and +/- one Poisson standard deviation
ax.axhline(n, color="grey", linestyle="--")
ax.axhline(n + np.sqrt(n), color="red", linestyle="--")
ax.axhline(n - np.sqrt(n), color="blue", linestyle="--")
ax.set_yscale("log")
ax.legend()
ax.set_title(f"{n=}, {sigmascale=}")
ax.set_xlabel("parameter")
ax.set_ylabel("n expected")
```
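To put numbers to what the plot shows (my own check, reusing the `n` and `sigmascale` values above): in the new form a parameter shift of `1/sigmascale` lands exactly on the `n + sqrt(n)` reference line, while in the old form a unit parameter shift rescales the yield by `(1 + sigmascale/sqrt(n))`, a factor of 2 for these numbers.

```python
import numpy as np

n, sigmascale = 100, 10

# new: param = 1/sigmascale = 0.1 gives exactly one Poisson sigma up
print(n * (1 + 1 / np.sqrt(n)) ** (0.1 * sigmascale))   # -> 110.0 (= n + sqrt(n))

# old: a unit shift of param doubles the yield for these numbers
print(n * (1 + sigmascale / np.sqrt(n)) ** 1.0)         # -> 200.0
```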
I can update the example to use `p * sigmascale` if desired. It wasn't clear to me that this really improved the condition of the example fit, though...

Yeah, for the purpose it serves (getting the error on

The pytest workflow failure seems to be upstream in conda.

Yeah, it's been quite flaky lately. Anyway, you'll probably want to rebase onto #46, which fixes the python 3.8 incompatibilities
…in parameter errors)

Rebased now (thanks for the py3.8 fixes!)

Found one more py3.8 item to fix, committed here.
nsmith- left a comment:
Thanks for adding these!

New feature: evaluating the transfer factor polynomial after performing a fit in RooFit can propagate the parameter uncertainties through the covariance matrix to provide an error band (confidence interval) on the resulting fit. This is enabled by the `errorband` boolean option when calling the polynomial (a rough sketch of this kind of error propagation is included after this description).

Bug fixes:

- Removed `sigmascale` from the fit parametrizations, as it was applied in the wrong part of the formula and moved the parameter error away from unity.
- A fix in the `_to_numpy()` utility function.
- When `TH1::SetDefaultSumw2()` is set to true, calling `SetBinContent()` (rather than `Fill()`) sets the bin error to 0. RooFit has a weird default where it sets the bin error to be equal to the bin content if the input TH1 bin error is 0: https://root.cern.ch/doc/v622/RooDataHist_8cxx_source.html#l01300 (see the short PyROOT illustration below).

Minor improvements:
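On the error band feature above: a minimal, generic sketch of propagating fit parameter uncertainties through the covariance matrix to a band on the fitted function. This is not rhalphalib's actual implementation or API; the polynomial, best-fit values, and covariance below are invented for illustration. Linear error propagation gives sigma_f^2(x) = J(x) C J(x)^T at each evaluation point:

```python
import numpy as np

# Hypothetical quadratic transfer factor f(x; theta) = theta0 + theta1*x + theta2*x^2
def tf(x, theta):
    return theta[0] + theta[1] * x + theta[2] * x**2

theta_hat = np.array([1.0, 0.5, -0.2])   # assumed best-fit parameters
cov = np.array([[0.010, 0.002, 0.000],   # assumed parameter covariance matrix
                [0.002, 0.020, 0.001],
                [0.000, 0.001, 0.005]])

x = np.linspace(0.0, 1.0, 50)
# Jacobian of f with respect to the parameters, evaluated at each x
jac = np.stack([np.ones_like(x), x, x**2], axis=1)        # shape (50, 3)
central = tf(x, theta_hat)
sigma = np.sqrt(np.einsum("ij,jk,ik->i", jac, cov, jac))  # propagated uncertainty
band = (central - sigma, central + sigma)                 # ~68% CL error band
```

Presumably the `errorband` option does the equivalent using the covariance matrix taken from the RooFit fit result.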
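On the `SetBinContent()` / `Sumw2` item: a small PyROOT illustration of the behavior described in that bullet (my own example; histogram and variable names are arbitrary):

```python
import ROOT

ROOT.TH1.SetDefaultSumw2(True)
h = ROOT.TH1D("h", "h", 1, 0.0, 1.0)
h.SetBinContent(1, 100.0)      # unlike Fill(), this leaves the Sumw2 entry at zero
print(h.GetBinError(1))        # 0.0, even though the bin holds 100 events

# When such a histogram is imported into RooFit, the zero bin error triggers
# RooDataHist's fallback of taking the bin content itself as the weight error
# (see the RooDataHist source link above).
x = ROOT.RooRealVar("x", "x", 0.0, 1.0)
dh = ROOT.RooDataHist("dh", "dh", ROOT.RooArgList(x), h)
```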