In all of these cases the output should be 0, since `y_pred` correctly predicts `y_true`.
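For reference, a minimal reproduction along these lines (the exact arrays are my own illustration, not copied from the original cases):

```python
import numpy as np
from sklearn.metrics import brier_score_loss

# All samples belong to the positive class and are predicted
# with full confidence, so the Brier score should be 0.0.
y_true = np.array([1, 1, 1, 1])
y_prob = np.array([1.0, 1.0, 1.0, 1.0])

print(brier_score_loss(y_true, y_prob))
# Affected versions print 1.0 instead of the expected 0.0.
```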
The call chain is `brier_score_loss` -> `_check_binary_probabilistic_predictions` -> `label_binarize`, with the issue starting in `label_binarize`.
`brier_score_loss` has a `pos_label` parameter, but setting it to 1 does not fix the issue. The function first overwrites `y_true` with True/False based on `pos_label`, then calls `_check_binary_probabilistic_predictions` to check the `y_true` and `y_pred` values, which returns the output of `label_binarize(y_true, labels)`, where `labels` are the unique values in `y_true`. Manually setting `labels` to `[0, 1]` for the case where `y_true` is all 1s does not fix the issue either.
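A quick sketch of the `pos_label` attempt described above (paraphrasing the report; the arrays are my own example):

```python
import numpy as np
from sklearn.metrics import brier_score_loss

y_true = np.array([1, 1, 1, 1])
y_prob = np.array([1.0, 1.0, 1.0, 1.0])

# Passing pos_label explicitly does not help on affected versions:
# y_true is still collapsed to a single class before binarization.
print(brier_score_loss(y_true, y_prob, pos_label=1))  # still 1.0, not 0.0
```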
In `label_binarize`, if there is only one class, the returned array is all zeros plus the negative label.
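This single-class behaviour can be seen directly with `label_binarize` (an illustrative sketch; the all-ones input is my own example):

```python
from sklearn.preprocessing import label_binarize

# With only one class present, every sample is mapped to
# neg_label (0 by default) rather than to the positive class.
print(label_binarize([1, 1, 1, 1], classes=[1]))
# [[0]
#  [0]
#  [0]
#  [0]]
```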
The final computation in `brier_score_loss`, `np.average((y_true - y_prob) ** 2, weights=sample_weight)`, then sees `y_true` values of all 0s instead of 1s, hence the incorrect score.
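Spelled out with concrete numbers (my own illustration of the arithmetic described above):

```python
import numpy as np

y_prob = np.array([1.0, 1.0, 1.0, 1.0])

# What brier_score_loss effectively computes after the bad binarization:
y_true_binarized = np.array([0, 0, 0, 0])             # should have been all 1s
print(np.average((y_true_binarized - y_prob) ** 2))   # 1.0

# What it should compute:
y_true_correct = np.array([1, 1, 1, 1])
print(np.average((y_true_correct - y_prob) ** 2))     # 0.0
```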
Operating system: Ubuntu 17.04 64-bit; the problem is present in master, 0.18.1, and 0.18.2.
gnsiva changed the title to "brier_score_loss returns incorrect value when all y_true values are True/1" on Jul 8, 2017
I'm still experiencing this issue in 0.19.1. PR #9980 seems to address this, but #9300 and #9980 do not reference each other. This comment should fix that.