Leave one out cross validation with CalibratedClassifierCV and LinearSVC #7796

Closed
@macaodha

Description

Hi there,

This could be a usage problem, so I apologize in advance. I'm trying to use LeaveOneOut with CalibratedClassifierCV on a two-class problem. After I fit the model and call predict_proba(), I get a matrix whose rows sum to more than one. This code reproduces the problem:

from sklearn.calibration import CalibratedClassifierCV
from sklearn import datasets
from sklearn.svm import LinearSVC
from sklearn.model_selection import LeaveOneOut

num_classes = 2
X, y = datasets.make_classification(n_samples=100, n_features=20,
                                    n_informative=18, n_redundant=2,
                                    n_classes=num_classes)
clf = LinearSVC(C=1.0)
clf_prob = CalibratedClassifierCV(clf, method="sigmoid", cv=LeaveOneOut())
clf_prob.fit(X, y)

probs_1 = clf_prob.predict_proba(X)
print(probs_1.sum(1))  # here the probabilities for each example sum to 2

If I instead try to fit three classes, I get this error:
index 1 is out of bounds for axis 1 with size 1.
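
For reference, here is a sketch of the three-class variant (assuming the same imports as above; the exact call that raises the IndexError may differ):

num_classes = 3
X3, y3 = datasets.make_classification(n_samples=100, n_features=20,
                                      n_informative=18, n_redundant=2,
                                      n_classes=num_classes)
clf_prob_3 = CalibratedClassifierCV(LinearSVC(C=1.0), method="sigmoid",
                                    cv=LeaveOneOut())
clf_prob_3.fit(X3, y3)
probs_3 = clf_prob_3.predict_proba(X3)  # one of these calls raises the IndexError above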

If I replace cv=LeaveOneOut() with cv=KFold(), everything works fine.
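
A minimal sketch of that workaround (assuming the same two-class X, y as above and KFold's default number of splits):

from sklearn.model_selection import KFold

clf_prob_kf = CalibratedClassifierCV(LinearSVC(C=1.0), method="sigmoid",
                                     cv=KFold())
clf_prob_kf.fit(X, y)
probs_kf = clf_prob_kf.predict_proba(X)
print(probs_kf.sum(1))  # each row sums to 1, as expected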

Thanks
