Improving the reporting of uncertainties in the calculation made by boaviztapi #214
Thanks for this proposition. Using min/max and a log10 function is a great way of handling the very different figures we have. Here are my comments:
```python
import math

# to_precision is an external helper that cuts a value to a given
# number of significant figures
def round_value(val, min_val, max_val, precision=config["default_precision"]):
    """
    Rounds the value based on the delta between the min and max value
    returned and a precision parameter
    """
    # value for precision% of the min/max delta
    approx = (max_val - min_val) / (100 / precision)
    significant = math.floor(math.log10(approx))
    return float(to_precision(val, significant))
```

I can make a PR with this implementation if you think it's ok.
I believe there is an issue with the `approx` variable: the larger the difference between min and max, the larger `approx` gets, and with it the number of significant figures. I believe we want the opposite. See some examples:
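To illustrate the point, here is a small sketch reproducing the `approx`/`significant` computation from the snippet above on two hypothetical (min, max) pairs (the pairs and the 1% precision are made up for illustration, not taken from the API):

```python
import math

# Hypothetical (min, max) pairs; precision fixed at 1% for illustration
precision = 1
sig_exponents = []
for min_val, max_val in [(90, 110), (0, 10_000)]:
    # value for precision% of the min/max delta, as in round_value
    approx = (max_val - min_val) / (100 / precision)
    # exponent used to cut the value
    sig_exponents.append(math.floor(math.log10(approx)))

# The wider the min/max range, the larger the exponent grows:
print(sig_exponents)  # -> [-1, 2]
```

So the exponent grows with the delta, which is the behavior questioned here.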
I think the issue comes from using the
The second option is obviously wrong: it is not even in the range between min and max! The confusion probably comes from the naming of my
I'll make a PR with a set of test cases; it will be easier to discuss the implementation.
This function should be used when the API returns a value with an associated min and max value. The rounding depends on the delta between min and max.
I've added a PR with a rounding function that handles corner cases: #220
I am currently implementing the function in the code. I think there is a problem when min == max == value: in that case we don't round at all, even though that gives a precision which is way too high compared to the uncertainty of the impact factors. Example:
Solutions
To account for the uncertainty of the impact factors, we could:
What are your thoughts about it?
Yes, as the rounding is based on the difference between min and max, when min == max it does nothing (and it's by design ;) ). I believe the "right way" to handle this would be to specify an uncertainty on the base impact factor and propagate it when calculating the overall impact (the uncertainties Python library could help here: https://pythonhosted.org/uncertainties/). This should probably be another issue and PR, for the next version. For now, I think we should fall back, in this specific case, to the current method where we simply cut after a fixed number of sig_fig.
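A minimal sketch of that fallback, under a few assumptions: `DEFAULT_SIG_FIG` is a hypothetical stand-in for `config["default_precision"]`, and Python's `%g` formatting and `round` are swapped in for the external `to_precision` helper, so this is an interpretation of the intended behavior rather than the actual implementation:

```python
import math

DEFAULT_SIG_FIG = 3  # hypothetical stand-in for config["default_precision"]

def round_to_sig_fig(val: float, sig_fig: int = DEFAULT_SIG_FIG) -> float:
    """Cut a value after a fixed number of significant figures."""
    return float(f"{val:.{sig_fig}g}")

def round_value(val, min_val, max_val, precision=1):
    """Round based on the min/max delta; fall back to a fixed number of
    significant figures when min == max and the delta carries no information."""
    if min_val == max_val:
        return round_to_sig_fig(val)
    # value for precision% of the min/max delta
    approx = (max_val - min_val) / (100 / precision)
    # round at the decimal place matching the delta's order of magnitude
    return round(val, -math.floor(math.log10(approx)))

print(round_value(636.11, 636.11, 636.11))  # min == max -> 636.0
print(round_value(636.11, 0.0, 10_000.0))   # wide range  -> 600.0
```

The `min == max` branch keeps a sane default precision, while the normal path stays driven by the min/max delta.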
#214 add rounding based on min and max
Problem
I believe we should improve the way we handle uncertainty in the calculation made in boaviztapi. This issue is mainly meant to start the discussion on how that could be improved.
When using the API to request the impact of a server, VM, etc., figures are given with a high number of digits, a `significant_figures` parameter, and a min and max value. For example:
Based on the discussion I had with @da-ekchajzer on Mattermost, `significant_figures` is not really calculated but mostly comes from a configuration file. In the example above, I think the value `636.11` does not really have 5 significant digits and, given the min and max the API returns for the impact, we should apply some rounding. We can probably say that the gwp impact for this server is around 600 kgCO2e, but 636.11 is clearly too precise ;)

Solution
Regarding the rounding, I suggest using something like the function below (rough code, needs polishing!): it rounds the value based on the delta between the min and max value returned and a `precision` parameter (which is a percentage); if there is a large difference between min and max, the rounding is more aggressive. For example:
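As an illustration of the `precision` parameter's effect, here is a sketch using hypothetical min/max values around the 636.11 kgCO2e figure (placeholders only, not values actually returned by the API), with `round` standing in for the external `to_precision` helper:

```python
import math

# Hypothetical min/max around the 636.11 kgCO2e figure; placeholders only
val, min_val, max_val = 636.11, 500.0, 800.0

rounded = {}
for precision in (1, 10):
    # value for precision% of the min/max delta
    approx = (max_val - min_val) / (100 / precision)
    # round at the decimal place given by approx's order of magnitude,
    # a simple stand-in for the external to_precision helper
    rounded[precision] = round(val, -math.floor(math.log10(approx)))

print(rounded)  # -> {1: 636.0, 10: 640.0}
```

A larger `precision` percentage widens `approx` and makes the rounding coarser, which is the knob described above.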
Alternatives
This approach helps solve the rounding issue; something else is needed for the uncertainties coming from the reference data used for the calculation.
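For the reference-data side, a hand-rolled sketch of the linear error propagation that the uncertainties library mentioned earlier would automate (all factor values and relative errors below are hypothetical, purely for illustration):

```python
def propagate_product(factors):
    """Linear error propagation for a product of independent factors:
    relative uncertainties add in quadrature.
    factors is a list of (value, relative_error) pairs."""
    value = 1.0
    rel_var = 0.0
    for v, rel in factors:
        value *= v
        rel_var += rel ** 2
    return value, value * rel_var ** 0.5

# Hypothetical impact factor (10% uncertain) times a usage figure (5% uncertain)
impact, sigma = propagate_product([(2.5, 0.10), (120.0, 0.05)])
print(f"{impact:.1f} ± {sigma:.1f} kgCO2e")  # -> 300.0 ± 33.5 kgCO2e
```

A propagated sigma like this could then feed the min/max range that the rounding function consumes.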