docstring on gradient, confusing #6847
Comments
I agree that this example works but does not correspond to the possibilities described earlier in the docstring. That the example works at all is a coincidence anyway, as it only works properly for evenly spaced arrays x, where dx consists of the same number repeated over and over again. I would propose removing this example from the docstring, as it is really misleading about what gradient is capable of doing.
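A minimal sketch of that point, assuming NumPy 1.13 or later (where coordinate arrays are accepted as spacing arguments): for evenly spaced samples a scalar spacing and the coordinate array give the same answer, while for unevenly spaced samples only the coordinate form is meaningful.

```python
import numpy as np

# Evenly spaced samples: a scalar spacing and the full coordinate array agree,
# which is the only reason the old example appeared to work.
x = np.array([0., 1., 2., 3., 4.])
y = x**2
print(np.gradient(y, 1.0))   # scalar sample distance
print(np.gradient(y, x))     # coordinate array (NumPy >= 1.13); same result here

# Unevenly spaced samples: only the coordinate form gives meaningful derivatives.
x_uneven = np.array([0., 1., 1.5, 3.5, 4.])
y_uneven = x_uneven**2
print(np.gradient(y_uneven, x_uneven))
```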
Fixups to docstring, and disallow non-scalars as the distance args to np.gradient. Fixes numpy#7548, fixes numpy#6847
This effectively reverts numpy#7618 and solves numpy#6847 and numpy#7548 by implementing support for unevenly spaced data. The behaviour is now similar to that of the MATLAB/Octave gradient function. As arguments it can take:
1. A single scalar to specify a sample distance for all dimensions.
2. N scalars to specify a constant sample distance for each dimension, i.e. `dx`, `dy`, `dz`, ...
3. N arrays to specify the coordinates of the values along each dimension of F. The length of each array must match the size of the corresponding dimension.
4. Any combination of N scalars/arrays with the meaning of 2. and 3.
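A hedged sketch of the four forms listed above, assuming a NumPy release that includes this change (the coordinate-array forms shipped in NumPy 1.13); the array values are made up for illustration:

```python
import numpy as np

F = np.array([[1., 2., 6.],
              [3., 4., 5.]])

# 1. A single scalar: the same sample distance for every dimension.
g1 = np.gradient(F, 2.0)

# 2. N scalars: one constant sample distance per dimension
#    (dx along axis 0, dy along axis 1).
g2 = np.gradient(F, 2.0, 1.0)

# 3. N arrays: the coordinates of the values along each dimension of F;
#    each array's length must match the size of the corresponding dimension.
rows = np.array([0.0, 2.0])
cols = np.array([0.0, 1.0, 3.5])     # unevenly spaced coordinates
g3 = np.gradient(F, rows, cols)

# 4. Any combination of scalars and coordinate arrays.
g4 = np.gradient(F, 2.0, cols)
```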
I may be misunderstanding the docstring on this function. The gradient docstring describes the varargs argument as the sample distance in each direction, but the docstring example then passes a dx that looks like the distance between each pair of samples. That example seems to indicate that varargs is more accurately described as dx1, dx2, ..., dxN (one spacing per sample interval), not dx, dy, dz, ... (one spacing per direction). The earlier interpretation appears to align more closely with some test case results as well.
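For what it's worth, both readings can be expressed with the current API (a sketch assuming NumPy 1.13 or later; the spacing values are hypothetical):

```python
import numpy as np

y = np.array([0., 1., 4., 9., 16.])

# "dx, dy, dz, ..." reading: one scalar spacing per axis (1-D here, so a single dx).
per_axis = np.gradient(y, 1.0)

# "dx1, dx2, ..., dxN" reading: per-sample spacings. In current NumPy these are
# expressed by passing the sample coordinates rather than the spacings themselves.
spacings = np.array([1., 1., 2., 1.])                 # made-up uneven gaps
coords = np.concatenate(([0.], np.cumsum(spacings)))  # -> [0., 1., 2., 4., 5.]
per_sample = np.gradient(y, coords)
```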