[WIP] Allow SGDClassifier to support np.float32 without upcasting to float64 #9084

Closed
ncordier wants to merge 5 commits

Conversation

@ncordier commented Jun 9, 2017

Reference Issue

Works on #8769 for Stochastic Gradient Descent.

What does this implement/fix? Explain your changes.

The goal is to avoid aggressively casting input data to np.float64 when np.float32 is supplied to the fit() method of SGDClassifier. A unit test is added to DenseSGDClassifierTestCase to check whether the dtype of the output of fit() is consistent with the input.
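
A minimal sketch of such a dtype-preservation check, assuming a pytest-style test; the test name, data, and exact assertion are illustrative, not the actual code added in this PR:

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

def test_sgd_fit_preserves_float32():
    # Hypothetical test: fit on float32 input and verify that the
    # learned coefficients keep the input dtype rather than float64.
    rng = np.random.RandomState(42)
    X = rng.rand(20, 5).astype(np.float32)
    y = np.array([0, 1] * 10)
    clf = SGDClassifier(random_state=42)
    clf.fit(X, y)
    assert clf.coef_.dtype == np.float32
```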

Any other comments?

@ncordier (Author) commented Jun 9, 2017

I'm stuck here (cc @Henley13 @massich):

  File "scikit-learn\sklearn\linear_model\base.py", line 64, in make_dataset
    dataset = ArrayDataset(X, y, sample_weight, seed=seed)
  File "sklearn\utils\seq_dataset.pyx", line 158, in sklearn.utils.seq_dataset.ArrayDataset.__cinit__ (sklearn\utils\seq_dataset.c:2875)
ValueError: Buffer dtype mismatch, expected 'double' but got 'float'

@Henley13 (Contributor) commented Jun 9, 2017

@ncordier make_dataset should be consistent with fused types after the merge of #9040!
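
For context: the ValueError above arises because ArrayDataset's buffers are compiled for double only, so once fit() stops upcasting, a float32 array no longer matches the expected buffer dtype. Cython fused types let one definition compile specializations for both float and double. A minimal sketch of the pattern, with illustrative names rather than the actual seq_dataset.pyx code:

```cython
# Illustrative fused-type example, not the actual sklearn code.
ctypedef fused floating:
    float
    double

cdef floating dot(floating[:] x, floating[:] y) nogil:
    # Compiles one specialization per dtype, so float32 and float64
    # memoryviews are both accepted without upcasting.
    cdef int i
    cdef floating s = 0
    for i in range(x.shape[0]):
        s += x[i] * y[i]
    return s
```

Cython also ships a built-in `cython.floating` fused type covering the same float/double pair.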

Base automatically changed from master to main January 22, 2021 10:49
@cmarmo added the Superseded label (PR has been replaced by a newer PR) and removed the Waiting for Reviewer label on Feb 14, 2022
@thomasjpfan (Member) commented
Thank you for working on this PR. I'm going to close this PR since it has been superseded.

Labels: module:linear_model, Superseded