Wrapping up the discussions in #9828 and #9567, along with some opinions of my own:
sklearn/metrics/tests/test_common might need some improvements:
- (1) Move roc_auc scores out of METRIC_UNDEFINED_BINARY (Fixed in [MRG+1] TST Move roc_auc_score from METRIC_UNDEFINED_BINARY to METRIC_UNDEFINED_MULTICLASS #9786)
- (2) Move average_precision scores out of METRIC_UNDEFINED_BINARY
- (3) Get rid of the awkward list here (See [MRG+1] Completely support binary y_true in roc_auc_score #9828)
- (4) In order to achieve (1) and (3), fully support binary y_true for roc_auc scores (See [MRG+1] Completely support binary y_true in roc_auc_score #9828)
- (5) In order to achieve (2) and (3), fully support binary y_true for average_precision scores
- (6) Remove some redundant metric entries (e.g., roc_auc_score and macro_roc_auc, as well as average_precision_score and macro_average_precision_score, are actually the same; see the sketch after this list)
- (7) This comment ("and label") may no longer be appropriate; possibly related to the TODO here
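
To make items (4)-(6) concrete, here is a minimal sketch (my own illustration, not code from test_common). It only assumes the public roc_auc_score / average_precision_score signatures; the `macro_roc_auc` alias is a hypothetical name standing in for the partial-based entries in test_common.

```python
# Minimal sketch, assuming only the public metric APIs; `macro_roc_auc` below
# is an illustrative alias, not an identifier taken from test_common.
from functools import partial

import numpy as np
from sklearn.metrics import average_precision_score, roc_auc_score

y_true = np.array([0, 1, 1, 0, 1])              # binary ground truth
y_score = np.array([0.1, 0.8, 0.6, 0.4, 0.9])   # scores for the positive class

# Items (4) and (5): binary y_true with 1-d scores should work directly.
print(roc_auc_score(y_true, y_score))            # 1.0 for this toy example
print(average_precision_score(y_true, y_score))  # 1.0 for this toy example

# Item (6): "macro" is already the default `average`, so a separate macro_*
# alias built with partial computes exactly the same value.
macro_roc_auc = partial(roc_auc_score, average="macro")
assert np.isclose(roc_auc_score(y_true, y_score), macro_roc_auc(y_true, y_score))
```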
cc @jnothman. Feel free to edit it :)