
[MRG+1] FIX LDA(solver="lsqr"): make sure the right error is raised on transform #4427


Merged: 1 commit from amueller:lda_lsqr_predict into scikit-learn:master on Mar 23, 2015

Conversation

@amueller (Member)

Fixes part B of #4415.
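
A minimal sketch of the behavior this fix enforces, written against current scikit-learn (at the time of this PR the class lived at sklearn.lda.LDA); the toy data is made up for illustration:

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    X = np.array([[-2., -1.], [-1., -1.], [1., 1.], [2., 1.]])
    y = np.array([0, 0, 1, 1])

    # 'lsqr' supports fitting and prediction ...
    lda = LinearDiscriminantAnalysis(solver='lsqr').fit(X, y)
    print(lda.predict(X))

    # ... but it never computes scalings_, so transform must fail loudly.
    try:
        lda.transform(X)
    except NotImplementedError as exc:
        print(exc)  # points the user to the 'svd' or 'eigen' solvers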

@amueller added the Bug label (Mar 20, 2015)
@amueller added this to the 0.16 milestone (Mar 20, 2015)
@landscape-bot

Code Health
Code quality remained the same when pulling c02cc3d on amueller:lda_lsqr_predict into 90736ef on scikit-learn:master.

@coveralls

Coverage Status

Coverage increased (+0.0%) to 95.09% when pulling c02cc3d on amueller:lda_lsqr_predict into 90736ef on scikit-learn:master.

@ogrisel (Member) commented Mar 23, 2015

LGTM. @mblondel @eickenberg any second review for this quick fix?

@ogrisel changed the title from [MRG] to [MRG+1] FIX LDA(solver="lsqr"): make sure the right error is raised on transform (Mar 23, 2015)
    if self.solver == 'lsqr':
        raise NotImplementedError("transform not implemented for 'lsqr' "
                                  "solver (use 'svd' or 'eigen').")
    elif self.solver == 'svd':
        check_is_fitted(self, ['xbar_', 'scalings_'], all_or_any=any)
@eickenberg (Contributor) commented on the diff
Forgive me for being slow, but this line checks whether at least one of ['xbar_', 'scalings_'] is fitted, but lower down, in X_new = np.dot(X - self.xbar_, self.scalings_) both of them are needed. Shouldn't any --> all?

@eickenberg (Contributor)
And further down X_new = np.dot(X, self.scalings_) makes me conclude that at least scalings_ is indispensable?

@amueller (Member, Author)
I didn't write these lines, but I guess that if the solver is 'eigen' you only have scalings_, and if solver='svd' you have both. I could remove xbar_ from the check, but I don't feel that would make it better or worse. I could also branch the check on the solver parameter, but I'm not sure how helpful that would be.
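
To make the any/all semantics concrete, here is a sketch against current scikit-learn with a hypothetical toy estimator (not the code under review): with all_or_any=any the check passes as soon as one listed attribute exists, while the default all requires every one.

    from sklearn.base import BaseEstimator
    from sklearn.exceptions import NotFittedError
    from sklearn.utils.validation import check_is_fitted

    class Toy(BaseEstimator):
        """Hypothetical estimator, only for demonstrating check_is_fitted."""
        def fit(self, X, y=None):
            return self

    est = Toy()
    est.scalings_ = [[1.0]]  # simulate a model where only scalings_ was set

    # all_or_any=any: passes because at least one listed attribute exists.
    check_is_fitted(est, ['xbar_', 'scalings_'], all_or_any=any)

    # all_or_any=all (the default): raises because xbar_ is missing.
    try:
        check_is_fitted(est, ['xbar_', 'scalings_'], all_or_any=all)
    except NotFittedError as exc:
        print(exc)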

@eickenberg (Contributor)
Well, my concern was that check_is_fitted is fine with only xbar_ fitted, which is useful nowhere. So I would at least remove xbar_ from that check, for logical consistency.

For completeness, a check_is_fitted(self, 'xbar_') could then go in the appropriate decision branch. But I don't insist on that one, because from what I see check_is_fitted is about throwing graceful errors when the attribute concerned hasn't been fitted. For xbar_ not to be fitted when solver == 'svd', the user must have called fit with a different solver and then changed the solver by hand, which I consider advanced (and weird), so such a user can also deal with the ensuing error message.

This basically also invalidates what I said above. I will post this message to be able to remember the reasoning, but I agree with you that nothing needs to be changed.
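
For the record, a sketch of the stricter alternative floated above, branching the fitted-check on the solver. This is an illustration, not the code that was merged, and it omits the truncation to n_components that the real method performs:

    import numpy as np
    from sklearn.utils.validation import check_is_fitted

    def transform(self, X):
        if self.solver == 'lsqr':
            raise NotImplementedError("transform not implemented for 'lsqr' "
                                      "solver (use 'svd' or 'eigen').")
        elif self.solver == 'svd':
            # 'svd' needs both the centering vector and the scalings.
            check_is_fitted(self, ['xbar_', 'scalings_'], all_or_any=all)
            X_new = np.dot(X - self.xbar_, self.scalings_)
        else:  # 'eigen' only ever sets scalings_
            check_is_fitted(self, 'scalings_')
            X_new = np.dot(X, self.scalings_)
        return X_new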

@eickenberg (Contributor)

LGTM

@ogrisel (Member) commented Mar 23, 2015

Alright, merging now and doing the backport to 0.16.X.

ogrisel added a commit that referenced this pull request Mar 23, 2015
[MRG+1] FIX LDA(solver="lsqr"): make sure the right error is raised on transform
@ogrisel merged commit 689388a into scikit-learn:master (Mar 23, 2015)
@amueller (Member, Author)

@eickenberg feel free to make it more logically consistent in a PR ;)
