
[WIP] implement balanced_accuracy_score #4300


Closed

Conversation

xuewei4d (Contributor)

Added balanced_accuracy_score. #3506

My implementation is very simple. The definition of the balanced accuracy score is

0.5 * true positives / (true positives + false negatives) + 0.5 * true negatives / (true negatives + false positives)

i.e. the average of the recall on the positive label and the recall on the negative label. It only supports the binary label case for now.

There are two other relevant PRs: #3929 and #3511.
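As an illustration of the formula above, here is a minimal sketch of the binary computation (not the code in this PR; the function name and the pos_label argument are assumptions made for the example):

    import numpy as np

    def balanced_accuracy_binary(y_true, y_pred, pos_label=1):
        # Illustrative sketch: mean of the recall on the positive label
        # (sensitivity) and the recall on the negative label (specificity).
        y_true = np.asarray(y_true)
        y_pred = np.asarray(y_pred)
        pos = y_true == pos_label
        neg = ~pos
        sensitivity = np.mean(y_pred[pos] == y_true[pos])  # TP / (TP + FN)
        specificity = np.mean(y_pred[neg] == y_true[neg])  # TN / (TN + FP)
        return 0.5 * sensitivity + 0.5 * specificity

For binary labels this is the same quantity as the macro-averaged recall, i.e. recall_score(y_true, y_pred, average='macro') from sklearn.metrics.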

xuewei4d (Contributor, Author)

Any comments, @arjoly @adam-m-mcelhinney @jnothman @larsmans @ogrisel?

@xuewei4d xuewei4d changed the title [WIP]implement balanced_accuracy_score [MRG]implement balanced_accuracy_score Mar 4, 2015
@xuewei4d xuewei4d force-pushed the balanced_accuracy_score branch from 8b579ca to fdda785 Compare March 23, 2015 14:42
Returns
-------
C : array, shape = [n_classes, n_classes]
    Confusion matrix
Member (review comment on the docstring above)

This is not what is returned by this function.
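A plausible correction, assuming the function returns a single float as the formula in the PR description suggests (this wording is a suggestion, not the text that ended up in the branch):

    Returns
    -------
    balanced_accuracy : float
        The average of the recall obtained on the positive class and the
        recall obtained on the negative class.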

xuewei4d (Contributor, Author)

This is my very first PR. I will rethink the code. Thanks! @ogrisel

ogrisel (Member) commented Mar 24, 2015

This function needs to be referenced in the documentation where appropriate.

@ogrisel ogrisel changed the title [MRG]implement balanced_accuracy_score [MRG] implement balanced_accuracy_score Mar 24, 2015
@xuewei4d xuewei4d changed the title [MRG] implement balanced_accuracy_score [WIP] implement balanced_accuracy_score Mar 25, 2015
@xuewei4d xuewei4d force-pushed the balanced_accuracy_score branch from fdda785 to 0cb25f8 Compare March 26, 2015 15:15
xuewei4d (Contributor, Author)

Thanks @ogrisel! Fixed the problems you commented on, but I don't think it is necessary to change y_true and y_pred to be like np.array(['0', '1', '1', '1', '1', '1', '1', '1', '1', '1']) in the test cases, since the third test case already uses string class labels.
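For reference, a hypothetical test with string class labels might look like the following (the values and the import path are illustrative assumptions; sklearn.metrics.balanced_accuracy_score is the function this PR proposes to add, not an existing API at the time):

    import numpy as np
    from numpy.testing import assert_almost_equal
    from sklearn.metrics import balanced_accuracy_score  # proposed by this PR

    def test_balanced_accuracy_score_string_labels():
        # class labels given as strings rather than integers
        y_true = np.array(['cat', 'dog', 'dog', 'cat', 'dog', 'dog'])
        y_pred = np.array(['cat', 'dog', 'cat', 'cat', 'dog', 'dog'])
        # recall('cat') = 2/2 = 1.0, recall('dog') = 3/4 = 0.75
        assert_almost_equal(balanced_accuracy_score(y_true, y_pred), 0.875)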

@xuewei4d xuewei4d force-pushed the balanced_accuracy_score branch from 0cb25f8 to de84a06 Compare March 28, 2015 13:53
@xuewei4d xuewei4d changed the title [WIP] implement balanced_accuracy_score [MRG] implement balanced_accuracy_score Apr 6, 2015
@xuewei4d xuewei4d changed the title [MRG] implement balanced_accuracy_score [WIP] implement balanced_accuracy_score Apr 6, 2015
arjoly (Member) commented Oct 24, 2015

closed in favor of #5588

@arjoly arjoly closed this Oct 24, 2015