
[MRG] Inherit LinearModels from _LearntSelectorMixin #4241


Closed

Conversation

MechCoder
Member

Fixes #4180

  1. Make the default threshold 1e-5 times the maximum coefficient, instead of a flat 1e-5.
  2. Document clearly that for L1-based models this is the threshold that is used.
  3. Inherit LinearModels from _LearntSelectorMixin, effectively adding transform to all LinearModels (sketched below).
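
For context, a minimal sketch of the kind of coefficient thresholding this transform performs, assuming a fitted 1-D coef_; the function name and the relative-threshold rule here are illustrative, not the exact scikit-learn code:

```python
import numpy as np


def select_by_max_coef(X, coef, rel_threshold=1e-5):
    """Keep the columns of X whose |coef| exceeds rel_threshold * max(|coef|).

    Illustrative only: the real mixin also supports "mean"/"median" style
    thresholds and handles multi-output coefficients.
    """
    scores = np.abs(np.ravel(coef))
    threshold = rel_threshold * scores.max()  # 1e-5 times the largest coefficient
    mask = scores > threshold
    return X[:, mask]

# e.g. X_reduced = select_by_max_coef(X, fitted_lasso.coef_)
```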

@MechCoder
Member Author

ping @agramfort @amueller

Btw, why does _LearntSelectorMixin inherit from TransformerMixin? This causes test errors in the common tests (related to transformers) when I inherit LinearModels from it.

@amueller
Member

Because it introduces a transform method?

@amueller
Member

I don't see a failure in the common tests.

@jnothman
Member

Please see also #2160

if threshold is None:
    # Lasso has a l1 penalty but no penalty param.
    if (hasattr(self, "penalty") and self.penalty == "l1" or
            'Lasso' in self.__class__.__name__):
Member

How about ElasticNet?

Member Author

Should it default to median for ENet?
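
For context, a rough sketch of how a string default such as "median" could be resolved into a numeric cutoff over the fitted coefficients (a simplified, hypothetical helper, not the actual scikit-learn logic):

```python
import numpy as np


def resolve_threshold(coef, threshold):
    """Turn a threshold spec into a number to compare |coef| against.

    Simplified: string thresholds are reduced over the absolute coefficients;
    anything else is treated as a plain number.
    """
    scores = np.abs(np.ravel(coef))
    if threshold == "median":  # the default being discussed for ElasticNet
        return np.median(scores)
    if threshold == "mean":
        return np.mean(scores)
    return float(threshold)
```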

Member

I meant that even non-transformers can have transform methods.

IMO, that's a contradiction, effectively. By implementing fit() and transform(), they implement the Transformer API, and are thus transformers. The common tests should be explicitly non-mutually-exclusive between predictors and transformers. What sorts of tests were failing?

Having said that, I'm not sure I agree with this design and might rather see this style of feature selection come through a wrapping meta-estimator than a mixin.

On 12 February 2015 at 17:40, Manoj Kumar notifications@github.com wrote:

In sklearn/feature_selection/from_model.py, #4241 (comment):

@@ -68,11 +71,16 @@ def transform(self, X, threshold=None):

         # Retrieve threshold
         if threshold is None:
-            if hasattr(self, "penalty") and self.penalty == "l1":
+            threshold = getattr(self, "threshold", None)
+        if threshold is None:
+            # Lasso has a l1 penalty but no penalty param.
+            if (hasattr(self, "penalty") and self.penalty == "l1" or
+                    'Lasso' in self.__class__.__name__):

Should it default to median for ENet?


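
For illustration, the wrapping meta-estimator style of feature selection suggested above might look roughly like this (a hypothetical class, loosely resembling what later became SelectFromModel; the name, defaults, and scoring rule are assumptions):

```python
import numpy as np
from sklearn.base import BaseEstimator, TransformerMixin, clone


class CoefThresholdSelector(BaseEstimator, TransformerMixin):
    """Hypothetical wrapper: fit any linear model, then select features by |coef_|."""

    def __init__(self, estimator, threshold=1e-5):
        self.estimator = estimator
        self.threshold = threshold

    def fit(self, X, y=None):
        self.estimator_ = clone(self.estimator).fit(X, y)
        coef = self.estimator_.coef_
        # Reduce multi-output coefficients to one score per feature.
        scores = np.abs(coef) if coef.ndim == 1 else np.abs(coef).max(axis=0)
        self.mask_ = scores > self.threshold
        return self

    def transform(self, X):
        return X[:, self.mask_]
```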

Member Author

Hmm. Yes that seems a better idea.

However, sorry if this question seems dumb, but is it worth writing a separate meta-estimator just to mask the columns of X whose coefficients fall below a certain threshold?

Member Author

Oh, I see work has already been done on deprecating _LearntSelectorMixin and adding a SelectFromModel estimator. How far is it from being merged?
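
For reference, the SelectFromModel meta-estimator referred to here did eventually land in scikit-learn; a minimal usage sketch against the API as it ships in later releases (the dataset and parameter values are just illustrative):

```python
from sklearn.datasets import make_regression
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import Lasso

X, y = make_regression(n_samples=100, n_features=20, noise=0.1, random_state=0)

# The meta-estimator wraps the L1 model instead of the model exposing transform().
selector = SelectFromModel(Lasso(alpha=0.1), threshold=1e-5)
X_reduced = selector.fit_transform(X, y)
print(X_reduced.shape)  # fewer columns than X, depending on the fitted coefficients
```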

Member

I didn't mean to comment on this line of code. @maheshakya tried implementing a meta-estimator version at #3011. But there are arguments in both directions.

@MechCoder
Member Author

Because it introduces a transform method?

I meant that even non-transformers can have transform methods. So if I inherit, say, ElasticNet from _LearntSelectorMixin, it technically becomes a Transformer as well. It passed the common tests because I removed the inheritance.
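
To illustrate the duck-typing point, a toy sketch of how mixing a selector class into ElasticNet would add a transform method and make it behave like a transformer (a hypothetical mixin, not the real _LearntSelectorMixin):

```python
import numpy as np
from sklearn.linear_model import ElasticNet


class ToySelectorMixin:
    """Anything with a fitted coef_ that mixes this in gains a transform method."""

    def transform(self, X, threshold=1e-5):
        scores = np.abs(np.ravel(self.coef_))
        return X[:, scores > threshold]


class SelectingElasticNet(ToySelectorMixin, ElasticNet):
    """ElasticNet that, by duck typing, can also act as a transformer."""

# SelectingElasticNet(alpha=0.1).fit(X, y).transform(X) now works like any transformer.
```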

@MechCoder
Member Author

Closed in favor of #4242

@MechCoder MechCoder closed this Feb 12, 2015
@MechCoder MechCoder deleted the transform_linearmodels branch February 12, 2015 10:05
Successfully merging this pull request may close these issues.

L1 feature selection docs confusing