Ambiguity in brier score doc fixed #10969
Conversation
Please refer to the log https://travis-ci.org/scikit-learn/scikit-learn/jobs/365980305 to resolve the flake8 errors
sklearn/metrics/classification.py
Outdated
@@ -1929,7 +1929,7 @@ def brier_score_loss(y_true, y_prob, sample_weight=None, pos_label=None):
     takes on a value between zero and one, since this is the largest
     possible difference between a predicted probability (which must be
     between zero and one) and the actual outcome (which can take on values
-    of only 0 and 1).
+    of only 0 and 1). The Brier loss is decomposed of refinement loss and calibration loss.
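For context, the docstring being edited belongs to `brier_score_loss`. A minimal sketch of how the metric is typically called (the example labels and probabilities are made up):

```python
import numpy as np
from sklearn.metrics import brier_score_loss

# Hypothetical example data: true binary labels and predicted probabilities.
y_true = np.array([0, 1, 1, 0])
y_prob = np.array([0.1, 0.9, 0.8, 0.3])

# The Brier score is the mean squared difference between the predicted
# probability and the actual outcome, so it always lies between 0 and 1.
loss = brier_score_loss(y_true, y_prob)
print(loss)  # 0.0375
```

The printed value matches `np.mean((y_prob - y_true) ** 2)`, which is the definition the docstring describes.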
E501 line too long (91 > 79 characters)
doc/modules/calibration.rst
is evaluated with Brier score :func:`brier_score_loss`, reported in the legend
(the smaller the better).
both isotonic calibration and sigmoid calibration.
The Brier score is a metric which is a combination of calibration loss and refinement loss,
This is no longer grammatical. Try "Performance is evaluated with ...", then add a new sentence summarising the intention of the metric.
You're not explaining calibration and refinement loss, right?
sklearn/metrics/classification.py
Outdated
@@ -1929,7 +1929,8 @@ def brier_score_loss(y_true, y_prob, sample_weight=None, pos_label=None):
     takes on a value between zero and one, since this is the largest
     possible difference between a predicted probability (which must be
     between zero and one) and the actual outcome (which can take on values
-    of only 0 and 1).
+    of only 0 and 1). The Brier loss is decomposed of refinement loss
"composed" not "decomposed"
doc/modules/calibration.rst
Outdated
The Brier score is a metric which is a combination of calibration loss and refinement loss,
:func:`brier_score_loss`, reported in the legend (the smaller the better).
Calibration loss is defined as the mean squared deviation from empirical probabilities
derived from slope of ROC segments. Refinement loss can be defined as the expected
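The `calibration.rst` passage under review compares classifiers by their Brier score before and after calibration. A rough sketch of that comparison, using toy data from `make_classification` (the dataset, estimator choice, and `cv=3` are illustrative assumptions, not the documentation's exact setup):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.calibration import CalibratedClassifierCV
from sklearn.metrics import brier_score_loss

# Toy binary classification problem (illustrative only).
X, y = make_classification(n_samples=2000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Uncalibrated classifier vs. its sigmoid-calibrated counterpart.
raw = GaussianNB().fit(X_train, y_train)
calibrated = CalibratedClassifierCV(GaussianNB(), method="sigmoid", cv=3)
calibrated.fit(X_train, y_train)

# Report the Brier score for each (the smaller the better).
for name, clf in [("raw", raw), ("sigmoid-calibrated", calibrated)]:
    prob = clf.predict_proba(X_test)[:, 1]
    print(name, brier_score_loss(y_test, prob))
```

Isotonic calibration works the same way with `method="isotonic"`; the legend values mentioned in the doc are exactly these per-model Brier scores.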
Slope -> the slope
Fix grammar as suggested by jnothman.
Reference Issues/PRs
Fix #10883
What does this implement/fix? Explain your changes.
Ambiguity in brier score doc fixed. Details about calibration loss added.
Any other comments?