BUG: performance regression of polynomial evaluation #26843
Comments
Hmm. I suspect the dtype/casting changes here; the code itself hasn't changed. Does it make a difference if you use float coefficients instead of integers?
There is also a lot
Tried with both int and float coefficients; it does not make a difference.
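For concreteness, a check along these lines is presumably what is meant (the coefficients and timing harness here are only illustrative, not taken from the thread):

```python
import timeit
import numpy as np

# Compare scalar evaluation with integer vs. float coefficients.
p_int = np.polynomial.Polynomial([1, 2, 3])
p_float = np.polynomial.Polynomial([1.0, 2.0, 3.0])

for p in (p_int, p_float):
    t = timeit.timeit(lambda: p(0.5), number=100_000)
    print(p.coef.dtype, t)
```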
I am not quite sure I understand the first part of the sentence about the
@charris Actually, the code has changed. Compare https://github.com/numpy/numpy/blob/maintenance/1.26.x/numpy/polynomial/_polybase.py#L510 with https://github.com/numpy/numpy/blob/main/numpy/polynomial/_polybase.py#L525. I will look into whether this is the cause of the regression or not.
@PeterWurmsdobler First of all: numpy performs well on large matrices, but not so well on small arrays. So if you could vectorize your code, e.g. call the polynomial once with an array of arguments rather than many times with scalars, that would help. In your minimal example the call spends most of its time in per-call overhead rather than in the actual evaluation.
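As a rough illustration of the vectorization advice (the polynomial and number of points are made up for this example):

```python
import numpy as np

p = np.polynomial.Polynomial([1.0, 2.0, 3.0])   # 1 + 2*x + 3*x**2
xs = np.linspace(0.0, 1.0, 10_000)

# Slow: one call per scalar, so the fixed per-call overhead dominates.
ys_loop = np.array([p(x) for x in xs])

# Fast: a single call on the whole array amortizes that overhead.
ys_vec = p(xs)

assert np.allclose(ys_loop, ys_vec)
```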
I think this explains the performance regression you are seeing. The changes that have been made are to resolve #17949. We could partly restore performance by replacing the general conversion path with a special case for scalar arguments (preferably implemented in C; I am not sure whether such a routine already exists in numpy). I did not check whether such a change would pass the unit tests. @charris Would adding special casing such as in the example above be acceptable?
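The code example referred to above is not shown in this transcript; the following is only a sketch of what such special casing for scalar arguments could look like (`_scalar_horner` and `polyval_with_fast_path` are hypothetical names, not numpy API):

```python
import numbers
import numpy as np

def _scalar_horner(coef, x):
    # Hypothetical helper: plain Horner evaluation for a single scalar,
    # skipping the array conversion done by the general code path.
    result = coef[-1]
    for c in coef[-2::-1]:
        result = result * x + c
    return result

def polyval_with_fast_path(x, coef):
    # Sketch only: dispatch scalars to the cheap loop, everything else
    # to the existing vectorized implementation.
    if isinstance(x, numbers.Number) and not isinstance(x, bool):
        return _scalar_horner(coef, x)
    return np.polynomial.polynomial.polyval(x, coef)
```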
Thanks for the suggestion; unfortunately the evaluation of the polynomial is part of an optimisation loop where the arguments are not known a priori. Also thanks for your detailed analysis of the internal execution performance. As it happens, I tried implementing a simple version of a 2nd-order polynomial with plain additions and multiplications; the performance was on par with the polynomial evaluation.
Adding the fast path for scalar evaluation improves the benchmarks (results were posted for main and for the PR).
With #26550 merged I think most of the performance regression has been recovered. @PeterWurmsdobler One way to improve performance even more might be to use
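The concrete suggestion is cut off above; one plausible reading (an assumption, not necessarily what was proposed) is to call the functional `polyval` routine directly and skip the per-call overhead of `Polynomial.__call__`:

```python
import numpy as np
from numpy.polynomial import polynomial as npoly

coef = np.array([1.0, 2.0, 3.0])    # example coefficients, lowest degree first

# Convenience class: extra work per call (argument conversion, domain mapping).
p = np.polynomial.Polynomial(coef)
y1 = p(2.0)

# Low-level routine: evaluates the same polynomial with less overhead,
# as long as no non-trivial domain/window mapping is needed.
y2 = npoly.polyval(2.0, coef)

assert np.isclose(y1, y2)
```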
If you actually use
Thank you very much, @eendebakpt and @charris! When do you think the next numpy release will be cut? |
Going to close this; there doesn't seem to be a very concrete thing to improve here.
Not sure exactly, but there will already be an rc in a few weeks.
Describe the issue:
The evaluation of a 2nd-order polynomial takes about 50% more time with numpy 2.0.0 than with numpy 1.26.4, which was revealed by running asv.

Reproduce the code example:
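A minimal reproduction in the spirit of the description (the exact coefficients, argument, and timing harness are assumptions, since the original snippet is not shown here) could look like:

```python
import timeit
import numpy as np

# 2nd-order polynomial evaluated repeatedly at a scalar argument,
# as in the optimisation loop described in the context below.
p = np.polynomial.Polynomial([1.0, 2.0, 3.0])

t = timeit.timeit(lambda: p(0.5), number=100_000)
print("numpy", np.__version__, "->", t, "s")
```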
Error message:
Python and NumPy Versions:
NumPy 2.0.0
Python 3.11.8 (main, Apr 29 2024, 16:52:56) [GCC 11.4.0]
Runtime Environment:
Python 3.11.8 (main, Apr 29 2024, 16:52:56) [GCC 11.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
Context for the issue:
The polynomial evaluation is part of a larger project that uses cubic splines as well as scipy numerical integration and root finding, where the polynomial is evaluated repeatedly. The overall run time of the project increases by about 25%.