[MRG] Adding pos_label parameter to roc_auc_score (#6873) #6874
Conversation
@@ -250,7 +254,8 @@ def _binary_roc_auc_score(y_true, y_score, sample_weight=None):
        raise ValueError("Only one class present in y_true. ROC AUC score "
                         "is not defined in that case.")

    fpr, tpr, tresholds = roc_curve(y_true, y_score,
    # use closure for pos_label parameter
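For context, here is a minimal sketch of the closure approach the diff comment refers to; the structure and names around the wrapped function are assumptions based on this excerpt, not the exact PR code:

# Sketch only: a closure captures pos_label so the wrapped function still
# matches the binary_metric(y_true, y_score, sample_weight=...) signature
# that _average_binary_score expects.
from sklearn.metrics import auc, roc_curve


def make_binary_roc_auc_score(pos_label=None):
    def _binary_roc_auc_score(y_true, y_score, sample_weight=None):
        # pos_label is taken from the enclosing scope (the closure)
        fpr, tpr, _ = roc_curve(y_true, y_score, pos_label=pos_label,
                                sample_weight=sample_weight)
        return auc(fpr, tpr)
    return _binary_roc_auc_score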
It would be better to make it explicit and not use a closure. Just add the pos_label arg to _binary_roc_auc_score and pass it explicitly...
There is a reason why I used a closure. _binary_roc_auc_score will be called from _average_binary_score as binary_metric(y_true, y_score, sample_weight=sample_weight). And it doesn't look like _average_binary_score should know anything about pos_label.
Ah I see. Then you could do partial(_binary_roc_auc_score, pos_label=pos_label) and pass that as the binary_metric.
LGTM pending comments.
There is no test. Set the PR to MRG when ready.
@agramfort, where is the proper place for the tests? Is it test_ranking.py?
Ok, will do.
Is there something else I should do within this issue?
(Appveyor failure unrelated)
Please add, or at least try adding, this metric to
I'd like to clarify the semantics. Let's say we're using this measure with a binary problem with labels [1, 2] and 1 is the positive class. AFAIK, the
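As a concrete illustration of the semantics question, here is a sketch using only the public roc_curve and auc functions (not the proposed roc_auc_score signature); the data is the toy example from the roc_curve docs:

# Sketch only: with labels [1, 2], roc_curve needs pos_label to know which
# class the scores refer to; the AUC is then computed from the curve.
import numpy as np
from sklearn.metrics import auc, roc_curve

y_true = np.array([1, 1, 2, 2])
y_score = np.array([0.1, 0.4, 0.35, 0.8])  # interpreted as scores for class 2

fpr, tpr, _ = roc_curve(y_true, y_score, pos_label=2)
print(auc(fpr, tpr))  # 0.75 for this toy example

# Treating 1 as the positive class instead means the scores must be
# reinterpreted as scores for class 1, e.g. 1 - y_score:
fpr, tpr, _ = roc_curve(y_true, 1 - y_score, pos_label=1)
print(auc(fpr, tpr))  # also 0.75, since the ranking is simply reversed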
@tswr are you still interested in working on this?
Tagging as "needs contributor" as there was no reply from @tswr.
Hello, I am quite new to scikit-learn and would like to contribute.
@aradhyamathur I think this is not a good issue for a beginner; it might be tricky to get right.
@amueller ok, I understand, thanks for your insight :)
This work is continued in #9567. Could you please have a look? Thanks.
Closing given #2723 (comment)
Reference Issue
Fixes #6873
What does this implement/fix? Explain your changes.
This PR implements the pos_label parameter in the roc_auc_score function. All it does is forward this parameter to the roc_curve function call.
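A hypothetical usage sketch of the proposed parameter; stock roc_auc_score does not accept pos_label, so the commented call only works on this branch:

# Hypothetical usage of the parameter proposed in this PR.
import numpy as np
from sklearn.metrics import roc_auc_score

y_true = np.array([1, 1, 2, 2])
y_score = np.array([0.1, 0.4, 0.35, 0.8])

# With the patched branch, pos_label would be forwarded to roc_curve:
# score = roc_auc_score(y_true, y_score, pos_label=2)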
Any other comments?
Added tests for pos_label param in roc_curve and roc_auc_score.
Results of make: all tests ok.