
Conversation

@thomasjpfan (Member)
Reference Issues/PRs

Towards #21308

What does this implement/fix? Explain your changes.

IsotonicRegression only supports 1d arrays and 2d arrays with a single feature. In either case, I think we can have get_feature_names_out always return "isotonicregression0".

Any other comments?

Note that the common tests do not run on IsotonicRegression because it has the {"X_types": ["1darray"]} tag.

@jeremiedbb (Member) commented Jan 28, 2022

I think we can have get_feature_names_out always return "isotonicregression0".

Seems fine.

Is there a reason why you don't use the _ClassNamePrefixFeaturesOutMixin machinery and set self._n_features_out = 1 ?

@thomasjpfan (Member, Author)

Is there a reason why you don't use the _ClassNamePrefixFeaturesOutMixin machinery and set self._n_features_out = 1 ?

I wanted to adjust the docstring for IsotonicRegression.get_feature_names_out to show that it ignores input_features. IsotonicRegression does not define feature_names_in_, which means input_features are ignored.

The alternative is to add support for feature_names_in_ and n_features_in_ for IsotonicRegression and then use the mixin:

  1. For the 2d case, feature_names_in_ and n_features_in_ are defined as usual. This means get_feature_names_out will actually validate input_features.
  2. For the 1d case, the semantics are not clear. Maybe n_features_in_ = 1 and feature_names_in_ is undefined or None. get_feature_names_out will not validate input_features.
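For context, the mixin-based alternative boils down to the following naming scheme. This is a hypothetical mimic of the private _ClassNamePrefixFeaturesOutMixin, not scikit-learn's actual code: names are the lowercased class name suffixed by an index up to _n_features_out, and input_features is ignored when feature_names_in_ is absent, mirroring the 1d case:

```python
import numpy as np


class ClassNamePrefixMixinSketch:
    """Hypothetical mimic of the mixin naming scheme discussed above."""

    def get_feature_names_out(self, input_features=None):
        # input_features is accepted but unused here, mirroring the
        # 1d case where feature_names_in_ is undefined.
        prefix = type(self).__name__.lower()
        return np.asarray(
            [f"{prefix}{i}" for i in range(self._n_features_out)], dtype=object
        )


class IsotonicRegression(ClassNamePrefixMixinSketch):
    _n_features_out = 1  # isotonic regression always emits one output column


print(IsotonicRegression().get_feature_names_out())
# -> ['isotonicregression0']
```

Setting _n_features_out = 1 on the real estimator would thus produce the same "isotonicregression0" name as the hand-written override.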

@jeremiedbb (Member)

Thanks for the clarification. I don't really know what we should expect in the 1d case either. Maybe it's better to stick to your current workaround for now and come back to implementing n_features_in_ when we have a clear expectation.

@jeremiedbb (Member) left a review comment

LGTM. Maybe add a comment on top of get_feature_names_out to quickly explain the need for re-implementing it here.

@ogrisel (Member) left a review comment

Thanks!

@ogrisel ogrisel merged commit 8991c3d into scikit-learn:main Feb 1, 2022
glemaitre pushed a commit to glemaitre/scikit-learn that referenced this pull request Feb 9, 2022

3 participants