
Allow GeneralizedLinearRegressor to use per-parameter regularization instead of a single scalar alpha for all parameters #22350

Closed
@xiaowei1234

Description


Describe the workflow you want to enable

Currently the GeneralizedLinearRegressor estimator only accepts a scalar alpha regularization parameter, which is applied uniformly to all coefficients. However, a modeler sometimes wants to apply different amounts of regularization to certain parameters, either for interpretability or to encode prior knowledge.

Describe your proposed solution

The alpha parameter could accept either a scalar or an iterable whose length equals the number of coefficients, i.e. the number of columns of the input matrix.
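A hypothetical usage sketch of the proposed API (using the public PoissonRegressor subclass, which shares the same alpha parameter; array-valued alpha is not accepted by current releases and is shown only to illustrate the request):

import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = rng.poisson(lam=np.exp(X @ np.array([0.3, 0.0, -0.2])))

# Current behavior: one scalar alpha shared by every coefficient.
PoissonRegressor(alpha=1.0).fit(X, y)

# Proposed behavior: one penalty weight per coefficient, e.g. leave the first
# feature unpenalized and shrink the third one more strongly.  Current
# releases reject non-scalar alpha, hence the try/except.
per_feature_alpha = np.array([0.0, 1.0, 10.0])
try:
    PoissonRegressor(alpha=per_feature_alpha).fit(X, y)
except (TypeError, ValueError):
    print("array-valued alpha is not supported yet")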

The optimization algorithm would then proceed as normal, with the line below in glm.py performing an element-wise vector multiplication (via NumPy broadcasting) instead of a scalar-vector multiplication.

coef_scaled = alpha * coef[offset:]
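As a minimal sketch of what that means for the L2 penalty, assuming the penalized objective has the form 0.5 * deviance + 0.5 * sum_j alpha_j * coef_j**2 (the helper below is illustrative only, not the actual glm.py code):

import numpy as np

def l2_penalty_and_grad(coef, alpha, fit_intercept=True):
    # Illustrative only: the L2 penalty term and its gradient when alpha is
    # either a scalar or a per-coefficient array.
    offset = 1 if fit_intercept else 0   # the intercept is not penalized
    w = coef[offset:]
    coef_scaled = alpha * w                    # element-wise when alpha is an array
    penalty = 0.5 * np.sum(coef_scaled * w)    # 0.5 * sum_j alpha_j * w_j**2
    grad = np.zeros_like(coef)
    grad[offset:] = coef_scaled                # d penalty / d w_j = alpha_j * w_j
    return penalty, grad

coef = np.array([0.1, 2.0, -1.0, 0.5])         # [intercept, w1, w2, w3]
print(l2_penalty_and_grad(coef, 1.0))                           # scalar alpha
print(l2_penalty_and_grad(coef, np.array([0.0, 1.0, 10.0])))    # per-coefficient alpha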

I believe the alpha parameter passed to scipy.optimize.minimize in the args parameter isn't being used anywhere.
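For context, entries of args are simply forwarded to the objective (and Jacobian) as extra positional arguments, so an entry the objective never reads has no effect on the optimization. A toy illustration, unrelated to glm.py itself:

import numpy as np
from scipy.optimize import minimize

def objective(x, scale, unused_alpha):
    # unused_alpha is received but never read, mirroring the situation
    # described above.
    return scale * np.sum(x ** 2)

res = minimize(objective, x0=np.ones(3), args=(2.0, 123.0), method="L-BFGS-B")
print(res.x)  # converges to ~0 regardless of the unused argument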

For QA, I plan to compare against regression results from R or statsmodels for the same data and model specification.
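A rough sketch of such a check with statsmodels, whose GLM.fit_regularized already accepts a per-parameter alpha (penalty scaling conventions differ between the two libraries, so treat this as a qualitative comparison rather than an exact equivalence):

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = rng.poisson(lam=np.exp(X @ np.array([0.3, 0.0, -0.2])))

# statsmodels accepts a scalar or a per-parameter array for alpha; L1_wt=0.0
# selects a pure ridge (L2) penalty, matching the penalty type used by
# GeneralizedLinearRegressor.  The intercept column gets its own alpha entry,
# set to 0 here so it stays unpenalized.
X_sm = sm.add_constant(X)
per_param_alpha = np.array([0.0, 0.0, 1.0, 10.0])  # [const, x1, x2, x3]
res = sm.GLM(y, X_sm, family=sm.families.Poisson()).fit_regularized(
    alpha=per_param_alpha, L1_wt=0.0
)
print(res.params)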

Describe alternatives you've considered, if relevant

No response

Additional context

I implemented this as part of my day job and am also doing it as part of my programming work at my university. This is my first PR, so please let me know if I'm missing anything.
