Bug in Calibration Curve Documentation #25571

@JonathanJohann

Description

Describe the bug

https://scikit-learn.org/stable/auto_examples/calibration/plot_calibration_curve.html

In the calibration curve example, a `scores_df` table is generated to showcase supporting model evaluation metrics alongside the calibration curves.

I noticed that my ROC AUC score was unusually low, and found that `roc_auc_score` is being called with `y_pred` (the hard class predictions) instead of `y_prob` (the predicted probabilities). This is incorrect and may confuse users in the future.
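To illustrate the difference, here is a minimal sketch (using a synthetic dataset and logistic regression, not the exact models from the linked example): passing hard 0/1 labels to `roc_auc_score` collapses the ROC curve to a single operating point, which typically deflates the reported AUC compared with passing probabilities.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical setup for illustration; the docs example uses its own data/models.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

y_pred = clf.predict(X_test)              # hard 0/1 class labels
y_prob = clf.predict_proba(X_test)[:, 1]  # probability of the positive class

auc_from_labels = roc_auc_score(y_test, y_pred)  # what the example computes (buggy)
auc_from_probs = roc_auc_score(y_test, y_prob)   # what it should compute

print(f"AUC from hard labels:   {auc_from_labels:.3f}")
print(f"AUC from probabilities: {auc_from_probs:.3f}")
```

With hard labels, the score reduces to the average of sensitivity and specificity at the 0.5 threshold, so it understates the ranking quality the metric is meant to capture.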

Steps/Code to Reproduce

Not applicable.

Expected Results

Expected a ROC AUC of roughly 0.65-0.75 for my application.

Actual Results

Observed a ROC AUC of roughly 0.50-0.55 instead.

Versions

scikit-learn 1.2.1 (not really relevant, since this is a documentation bug rather than a library bug).
