ulab array corruption with slices #4753
Comments
@v923z may be interested in this.
This is correct, but perhaps the first question is whether this has anything to do with `ulab` at all. I would first try:

```python
import math

_data = [1.0] * 10000
data = _data[6:-6]
sum(data)

a = []
for idx in range(1_000_000):
    a.append("AYMABTU")
del a

sum(data)
```

If this still works, I would then do

```python
_data = np.array([1.0] * 10000)
data = _data[6:-6]
np.sum(data)

a = []
for idx in range(1_000_000):
    a.append("AYMABTU")
del a

np.sum(data)
```

Do you get similar results, if you try to initialise

Also, when you have this

```
ulab.numerical.sum(filtered_data)  ### no longer 9992.0 !!
9882.01
```

can you find out where the data are corrupted? At the beginning, or the end of the array?
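For context on why this two-step test is useful, here is a hypothetical desktop sketch of the underlying distinction, using CPython and NumPy as stand-ins for MicroPython and `ulab` (the names and behaviour shown are standard NumPy, not taken from this thread): slicing a plain list copies the elements, while slicing an `ndarray` returns a view that shares the parent's buffer. A list-based slice should therefore be immune to corruption of the parent's storage, which is what the first snippet checks.

```python
import numpy as np

# Slicing a list copies: mutating the parent does not affect the slice.
lst = [1.0] * 10
lst_slice = lst[2:-2]
lst[3] = 99.0
print(lst_slice[1])  # still 1.0 -- the slice is an independent copy

# Slicing an ndarray creates a view: it shares the parent's buffer.
arr = np.array([1.0] * 10)
arr_view = arr[2:-2]
arr[3] = 99.0
print(arr_view[1])  # 99.0 -- the view sees the parent's mutation
```

If the list version survives the allocation storm but the `ndarray` version does not, the problem is in the array/view machinery rather than in the interpreter's memory management generally.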
First part:
Although perhaps this is closer to my example:
I added a
@kevinjwalters Thanks for bringing up the issue; it has been fixed in v923z/micropython-ulab#388, and the fix backported to the

@jepler Thanks for digging to the root of the problem!
Fixed as reported above.
I noticed this on CircuitPython 6.2.0 on a CLUE (nRF52840):

I'd guess it's related to premature GC of the `ndarray` returned by `ulab.filter.convolve()`. That would be inappropriate, as the slice is a view onto that data, as I understand it.
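The invariant at stake can be sketched with desktop NumPy as a hypothetical illustration (ulab's internals differ, and this snippet is not from the thread): a NumPy view holds a reference to its parent through the `.base` attribute, so as long as the view is reachable the parent's buffer cannot be collected. The failure described here behaves as if that back-reference were missing, letting the GC reclaim the convolution result while a slice into it was still alive.

```python
import gc
import numpy as np

def make_view():
    parent = np.arange(10.0)
    # Return only the slice; the parent has no other references.
    return parent[2:-2]

view = make_view()
gc.collect()  # the parent survives: the view roots it via .base

print(view.base is not None)  # True -- the view keeps its parent alive
print(view.sum())             # 2+3+4+5+6+7 = 27.0, data still intact
```

In a correct implementation the second print always yields 27.0; an implementation that fails to root the parent from the view could return garbage here after a collection, which matches the symptom reported above.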