
luxedo (Contributor) commented Oct 12, 2021

What does this implement/fix? Explain your changes.

Removes "mae", "mse", "least_squares", "squared_loss", "lad", "least_absolute_deviation" and "absolute_loss" from docstrings.

Since these error names were standardized in 1.0, they can probably be removed from the documentation, leaving only the deprecation text.

If PR #21306 is accepted, it only fixes the docstring of RandomForestRegressor; it would be natural to fix the documentation of the other classes as well.

According to the v1.0 changelog:

|API| The option for using the squared error via loss and criterion parameters was made more consistent. The preferred way is by setting the value to "squared_error". Old option names are still valid, produce the same models, but are deprecated and will be removed in version 1.2. :pr:19310 by :user:Christian Lorentzen <lorentzenchr>.

  • For :class:ensemble.ExtraTreesRegressor, criterion="mse" is deprecated, use "squared_error" instead which is now the default.
  • For :class:ensemble.GradientBoostingRegressor, loss="ls" is deprecated, use "squared_error" instead which is now the default.
  • For :class:ensemble.RandomForestRegressor, criterion="mse" is deprecated, use "squared_error" instead which is now the default.
  • For :class:ensemble.HistGradientBoostingRegressor, loss="least_squares" is deprecated, use "squared_error" instead which is now the default.
  • For :class:linear_model.RANSACRegressor, loss="squared_loss" is deprecated, use "squared_error" instead.
  • For :class:linear_model.SGDRegressor, loss="squared_loss" is deprecated, use "squared_error" instead which is now the default.
  • For :class:tree.DecisionTreeRegressor, criterion="mse" is deprecated, use "squared_error" instead which is now the default.
  • For :class:tree.ExtraTreeRegressor, criterion="mse" is deprecated, use "squared_error" instead which is now the default.

|API| The option for using the absolute error via loss and criterion parameters was made more consistent. The preferred way is by setting the value to "absolute_error". Old option names are still valid, produce the same models, but are deprecated and will be removed in version 1.2. :pr:19733 by :user:Christian Lorentzen <lorentzenchr>.

  • For :class:ensemble.ExtraTreesRegressor, criterion="mae" is deprecated, use "absolute_error" instead.
  • For :class:ensemble.GradientBoostingRegressor, loss="lad" is deprecated, use "absolute_error" instead.
  • For :class:ensemble.RandomForestRegressor, criterion="mae" is deprecated, use "absolute_error" instead.
  • For :class:ensemble.HistGradientBoostingRegressor, loss="least_absolute_deviation" is deprecated, use "absolute_error" instead.
  • For :class:linear_model.RANSACRegressor, loss="absolute_loss" is deprecated, use "absolute_error" instead which is now the default.
  • For :class:tree.DecisionTreeRegressor, criterion="mae" is deprecated, use "absolute_error" instead.
  • For :class:tree.ExtraTreeRegressor, criterion="mae" is deprecated, use "absolute_error" instead.
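
For illustration, a minimal sketch of the preferred spellings after the 1.0 rename (assumes scikit-learn >= 1.0; the estimators and toy data below are arbitrary and not part of this PR):

```python
# Minimal sketch (assumes scikit-learn >= 1.0): the preferred spellings
# after the rename, shown on a few of the estimators listed above.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=100, n_features=4, random_state=0)

# was criterion="mse"
RandomForestRegressor(criterion="squared_error", n_estimators=10).fit(X, y)
# was loss="ls"
GradientBoostingRegressor(loss="squared_error").fit(X, y)
# was criterion="mae"
DecisionTreeRegressor(criterion="absolute_error").fit(X, y)
```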

So this PR also checks the following classes (a quick sanity check of the old spellings follows the table):

class                                     error
ensemble.ExtraTreesRegressor              "mse"
ensemble.GradientBoostingRegressor        "ls"
ensemble.RandomForestRegressor            "mse" (see #21306)
ensemble.HistGradientBoostingRegressor    "least_squares"
ensemble.HistGradientBoostingRegressor    "least_absolute_deviation"
linear_model.RANSACRegressor              "squared_loss"
linear_model.SGDRegressor                 "squared_loss"
tree.DecisionTreeRegressor                "mse"
tree.ExtraTreeRegressor                   "mse"
ensemble.ExtraTreesRegressor              "mae"
ensemble.GradientBoostingRegressor        "lad"
ensemble.RandomForestRegressor            "mae"
linear_model.RANSACRegressor              "absolute_loss"
tree.DecisionTreeRegressor                "mae"
tree.ExtraTreeRegressor                   "mae"
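
As a rough sanity check (not part of this PR, which only touches docstrings), the old spellings in the table above should still fit the same model under 1.0 but emit a deprecation warning pointing at the new name; a hedged sketch:

```python
# Rough sketch (assumes scikit-learn 1.0/1.1): the deprecated spelling still
# works but should trigger a deprecation warning; the exact warning category
# and message wording are an assumption here.
import warnings

from sklearn.datasets import make_regression
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=50, n_features=3, random_state=0)

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    DecisionTreeRegressor(criterion="mse").fit(X, y)  # deprecated spelling

# Expect a message recommending criterion="squared_error" instead.
print([str(w.message) for w in caught])
```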


rth (Member) left a comment:

It's indeed much clearer which ones should be used this way. Thanks!
