Add GPU support for NumPy operations with CuPy #9146
Comments
I would add that it would require rewriting any Cython-based code, which is most of our code. |
I think it's also worth mentioning that we would love to see/reference a sklearn-contrib project where specific algorithms that can utilize GPUs do so. |
Part of the problem, @ClimbsRocks, is that for many non-trivial operations
on numpy arrays, we have low-level (Cython) CPU implementations. I presume
this means that cupy could rarely be a drop-in replacement for numpy as
used within scikit-learn.
|
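To make that point concrete, here is a rough sketch (plain Python standing in for compiled Cython, with made-up names and shapes) of why a Cython-backed routine cannot consume a CuPy array directly:

```python
# Rough illustration only -- not scikit-learn code. Compiled Cython loops
# operate on typed memoryviews over host (CPU) memory, so a GPU-resident
# CuPy array has to be copied back to a NumPy array before it can be used.
import numpy as np
import cupy as cp  # assumes CuPy and a CUDA-capable GPU are available


def cython_like_kernel(X):
    """Stand-in for a compiled Cython routine that needs a host buffer."""
    X = np.ascontiguousarray(X, dtype=np.float64)
    return X.sum(axis=0)


X_gpu = cp.random.rand(1000, 10)

# The GPU array cannot be handed to the compiled code as-is; it has to
# round-trip through host memory, which erases most of the GPU benefit.
result = cython_like_kernel(cp.asnumpy(X_gpu))
```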
CuPy is said to support some Cython-level operations, but I haven't tried it out yet: cupy/cupy#130
|
Thanks for the discussion points, everybody! @jmschrei, keep us posted if you do spend more time on the Cython-level operations in CuPy. |
What kind of operations do you want to call at the Cython level? Right now, low-level functions such as wrappers for cuBLAS are exported at the Cython level. |
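For anyone following along, a Python-level illustration of the kind of cuBLAS-backed work CuPy can take over; the Cython-level wrappers discussed in cupy/cupy#130 are meant to expose similar functionality to compiled code, and the array sizes here are arbitrary:

```python
# Python-level sketch only; the Cython-level cuBLAS wrappers mentioned
# above would be consumed from compiled code instead.
# Assumes CuPy and a CUDA device are available.
import cupy as cp

A = cp.random.rand(512, 256, dtype=cp.float64)
B = cp.random.rand(256, 128, dtype=cp.float64)

# Matrix multiplication executes on the GPU; CuPy routes it through cuBLAS.
C = A @ B

cp.cuda.Stream.null.synchronize()  # wait for the GPU kernel to finish
print(C.shape, C.dtype)            # (512, 128) float64
```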
I'll keep you in the loop when I play around with it, but as Andreas mentioned, I wouldn't get my hopes up for it being added to sklearn anytime soon. |
Any updates on integrating CuPy into scikit-learn? Maybe starting with more computationally expensive operations like t-SNE first, and then gradually building up? |
t-SNE is precisely one such algorithm that is currently written in Cython. But other estimators, such as preprocessors that only use pure NumPy functions, could potentially use NEP-18 / NEP-35 / NEP-37 (or a combination of them) to accept non-NumPy inputs (e.g. CuPy arrays) and avoid forcing a conversion to a NumPy array. See #16196 and #16574 for instance. |
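As a rough sketch of what the NEP-18 route looks like in practice (requires NumPy >= 1.17 and CuPy; the array contents are arbitrary):

```python
# Minimal sketch of the __array_function__ (NEP-18) dispatch mentioned
# above: NumPy functions called on CuPy arrays dispatch back to CuPy,
# so estimator code written against plain NumPy can stay device-agnostic.
import numpy as np
import cupy as cp

X = cp.random.rand(100, 5)

# np.mean dispatches to cupy.mean via __array_function__, so both the
# computation and the result stay on the GPU.
col_means = np.mean(X, axis=0)
print(type(col_means))  # <class 'cupy.ndarray'>
```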
Description
In their own words, "CuPy is an implementation of NumPy-compatible multi-dimensional array on CUDA. CuPy consists of the core multi-dimensional array class, cupy.ndarray, and many functions on it. It supports a subset of numpy.ndarray interface."
I recognize that building both CPU and GPU compatible versions is a significant undertaking, and might ultimately be out of the scope of scikit-learn.
I wanted to start a conversation about adding GPU support (either with CuPy or any other library), and see if people were interested in taking on this endeavor.
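For readers unfamiliar with CuPy, a small illustration of that "subset of the numpy.ndarray interface" claim (assumes CuPy and a CUDA device are installed; the data is arbitrary):

```python
# Not scikit-learn code -- just a taste of the NumPy-style API on the GPU.
import numpy as np
import cupy as cp

X_cpu = np.arange(12, dtype=np.float64).reshape(4, 3)
X_gpu = cp.asarray(X_cpu)              # copy host -> device

# Familiar NumPy-style operations, executed on the GPU.
centered = X_gpu - X_gpu.mean(axis=0)
norms = cp.linalg.norm(centered, axis=1)

print(cp.asnumpy(norms))               # copy device -> host for printing
```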