[MRG] FIX Revert the addition of ndcg_score and dcg_score #9932
Conversation
This reverts commit 56a21ea.
My only qualm with this is that dcg_score might actually work. But who'd know, as it's not tested, nor is its correct behaviour documented.
I think dcg_score is correct, although the docstring isn't. Also, if y_true is binary, I don't see what
achieves
doc/whats_new.rst
Outdated
@@ -20,3 +20,5224 @@ Previous Releases
Version 0.14 <whats_new/v0.14.rst>
Version 0.13 <whats_new/v0.13.rst>
Older Versions <whats_new/older_versions.rst>
No need of the old whats_new
Indeed!
@TomDLT, opinion on the reversion?
personally +1 for the PR :)
sklearn/metrics/ranking.py
Outdated
from ..utils.multiclass import type_of_target
from ..utils.extmath import stable_cumsum
from ..utils.sparsefuncs import count_nonzero
from ..exceptions import UndefinedMetricWarning
from ..preprocessing import label_binarize
Seems that we need label_binarize here; it's now used in roc_auc_score to support binary y_true other than {0, 1} / {-1, 1}.
I agree it's simpler to revert it. We probably need to document the reversion, for example by adding a descriptive error for users who try to import these metrics.
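For context, the role label_binarize plays here is mapping an arbitrary pair of binary label values onto {0, 1}. A minimal pure-Python stand-in (the helper name `binarize_labels` is hypothetical, not the scikit-learn implementation):

```python
def binarize_labels(y, pos_label):
    """Map arbitrary binary labels to {0, 1}, with `pos_label` -> 1.

    Simplified sketch of what roc_auc_score needs from label_binarize
    so that binary y_true need not already be {0, 1} or {-1, 1}.
    """
    return [1 if label == pos_label else 0 for label in y]


# e.g. string labels work just as well as numeric ones:
binarize_labels(["spam", "ham", "spam"], pos_label="spam")  # -> [1, 0, 1]
```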
Nah. I don't think we'll get complaints about this ImportError. No one can have used ndcg_score. Better off not having the name in the module __dict__.
If we do not provide a message for the revert, do we need to remove the what's new entry in 0.19?
We will note the removal in what's new for 0.19.1.
Actually, we have sufficient consensus. Merging.
Can you explain that? I'll go ahead with making release preparations, but this smells funny to me. I haven't kept track of the issues as you have, though...
Tbh, dcg_score could possibly have been used. ndcg only worked if the score matrix was square, which is pretty useless in practice. It also implemented a weird definition of ndcg that was neither normalised nor discounted.
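For reference, the standard normalised, discounted definitions being contrasted here can be sketched in a few lines. This is an illustrative sketch of the textbook formulas, not the reverted scikit-learn code:

```python
import math


def dcg_score(relevance):
    """Discounted cumulative gain: sum of rel_i / log2(i + 1) over 1-indexed
    ranks, where `relevance` lists true relevance values in predicted order.
    """
    return sum(rel / math.log2(rank + 1)
               for rank, rel in enumerate(relevance, start=1))


def ndcg_score(relevance):
    """NDCG: DCG normalised by the ideal DCG, i.e. the DCG of the same
    relevance values sorted in descending order. Ranges in [0, 1]."""
    ideal = dcg_score(sorted(relevance, reverse=True))
    return dcg_score(relevance) / ideal if ideal > 0 else 0.0
```

A perfectly ordered ranking gives ndcg_score == 1.0; any misordering of unequal relevances gives a value strictly below 1.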
This is an alternative to trying to fix up an unusable, undocumented implementation...
Closes #9921
Closes #9928
Closes #9929
Closes #9930
Closes #9931