
BUG: cross_val_score ignores scoring when estimator is a GridSearchCV object #3848

@fabianp

Description

When I call cross_val_score with a GridSearchCV object (you can see this as nested cross-validation), the scoring parameter is systematically ignored. Take, for example, the following code:

import numpy as np
from sklearn import cross_validation, grid_search, linear_model

X = np.random.randn(100, 10)
w = np.random.randn(10)
y = X.dot(w)

alphas = np.logspace(-3, 3, 10)
clf = grid_search.GridSearchCV(linear_model.ElasticNet(), {'alpha': alphas})

scores = cross_validation.cross_val_score(clf, X, y, scoring='mean_absolute_error', n_jobs=-1)
print(scores)

Since the scoring function is 'mean_absolute_error', the values should be negative. However, what I get is [ 0.99999806 0.99999758 0.99999911], which corresponds to the default scoring function, r2.
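For comparison, here is a minimal sketch (reusing X and y from the snippet above) of the same call with a plain estimator instead of a GridSearchCV; in that case the scoring argument is honoured and the reported values are negative, as expected for 'mean_absolute_error':

# Same data, same scoring, but a bare estimator: the scorer is respected.
scores_plain = cross_validation.cross_val_score(
    linear_model.ElasticNet(), X, y, scoring='mean_absolute_error')
print(scores_plain)  # all values <= 0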

This happens because cross_val_score is unable to find the 'predict' method on the GridSearchCV object; that is, the hasattr(clf, 'predict') test in check_scoring fails, so the requested scorer is silently replaced by the estimator's default. The interesting thing is that GridSearchCV does define a predict method in BaseSearchCV, only as a property (presumably evaluating it on an unfitted search raises, which hasattr reports as the attribute being missing), and thus the hasattr check fails.
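A minimal sketch of the failing check, assuming the BaseSearchCV behaviour described above:

from sklearn import grid_search, linear_model

# predict is exposed on BaseSearchCV as a property; on an unfitted GridSearchCV
# looking it up fails, so hasattr reports the method as missing and
# check_scoring falls back to the estimator's default scorer (r2).
clf = grid_search.GridSearchCV(linear_model.ElasticNet(), {'alpha': [0.1, 1.0]})
print(hasattr(clf, 'predict'))  # False -> scoring='mean_absolute_error' is ignored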
