BUG: fix for evaluation of random_f and random_standard_cauchy in distributions.c #29598
base: main
Conversation
… can also be evaluated first, leading to the reciprocal of the actual desired result
Makes sense, as the C standard is clear that the order of evaluation is unspecified. In practice it appears to be consistent, but this PR corrects a possible defect.
To be clear: the potentially different compiler-dependent behaviors do not mean that the stream of random variates will be incorrect. Whether the numerator or denominator is evaluated first does not affect the correctness of the calculation. The behavior being compiler-dependent means that the streams of variates generated by code compiled with different compilers might be different. But the different streams are both "correct" (in as much as both are numerical approximations of the underlying random process).
Having said that, I agree that it is a good idea to fix the ambiguous order of evaluation. While in general we can't guarantee identical streams across different platforms and compilers, it seems reasonable to remove potential sources of difference when we can.
We already have legacy implementations for the f and standard_cauchy calculations, so this change only affects the Generator methods.
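For reference, a minimal sketch of the sequencing pattern this PR applies. The random_f and random_standard_cauchy signatures follow distributions.c as I understand it; treat this as illustrative rather than the exact patch:

```c
/* Drawing into named locals makes each initializer a full expression,
 * so the numerator is always drawn before the denominator,
 * regardless of compiler. */
double random_f(bitgen_t *bitgen_state, double dfnum, double dfden) {
  double num = random_chisquare(bitgen_state, dfnum) * dfden;
  double den = random_chisquare(bitgen_state, dfden) * dfnum;
  return num / den;
}

double random_standard_cauchy(bitgen_t *bitgen_state) {
  double num = random_standard_normal(bitgen_state);
  double den = random_standard_normal(bitgen_state);
  return num / den;
}
```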
The correctness of the calculation can be affected if there is an argument whose state changes from one call to the next within the same expression, like bitgen_state in random_f(). I suggest we fix legacy_f and legacy_standard_cauchy as well.
I have added another commit for legacy_f and legacy_standard_cauchy; this keeps both functions in sync and consistent.
The result is certainly affected, but in this case either order of evaluation generates a correct random variate.
Generally we don't touch the legacy code; it is considered "frozen". Please revert your changes to …
OK, I will revert it. My platform is NonStop OS, and these functions (legacy_f and legacy_standard_cauchy) also seem to affect the output of np.random.f in my case (giving non-standard output). I will go with a local fix for the legacy functions.
This reverts commit ad574f6.
We care that the output of …
committing suggestion for random_f
Co-authored-by: Joren Hammudoglu <jhammudoglu@gmail.com>

committing suggestion for random_standard_cauchy
Co-authored-by: Joren Hammudoglu <jhammudoglu@gmail.com>
C does not guarantee the order in which the operands of most operators are evaluated; as a result, if a function returns f1() / f2(), f2() may be evaluated first. Consider the following program:
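The program referenced here is not shown in the thread, so the following is a minimal sketch of what is being described. The stand-in generator next_draw and the draw values 8.0 and 2.0 are illustrative, chosen to match the numbers discussed below:

```c
#include <stdio.h>

/* Hypothetical stand-in for a stateful bit generator (like bitgen_state):
 * the first draw is 8.0 and the second is 2.0. */
static double draws[] = {8.0, 2.0};
static int state = 0;

static double next_draw(void) { return draws[state++]; }

/* f1() and f2() both consume the shared state, just as two calls to
 * random_standard_normal(bitgen_state) would. */
static double f1(void) { return next_draw(); }
static double f2(void) { return next_draw(); }

int main(void) {
  /* C leaves the evaluation order of the division operands unspecified:
   * this prints 4.000000 if f1() runs first, 0.250000 if f2() runs first. */
  printf("%f\n", f1() / f2());
  return 0;
}
```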
Here f1() and f2() play the role of the two calls to random_standard_normal, both of which consume the bitgen_state struct. Since C guarantees no order, the first call might be f2(), which would return 8.0 from f2() and 2.0 from f1(); the result returned is then 2.0/8.0, which is 1/4. However, if f1() is evaluated first, the result returned is 8.0/2.0, which is 4.