@@ -194,17 +194,19 @@ def _logistic_regression_path(
         only supported by the 'saga' solver.
 
     intercept_scaling : float, default=1.
-        Useful only when the solver 'liblinear' is used
-        and self.fit_intercept is set to True. In this case, x becomes
-        [x, self.intercept_scaling],
+        Useful only when the solver `liblinear` is used
+        and `self.fit_intercept` is set to `True`. In this case, `x` becomes
+        `[x, self.intercept_scaling]`,
         i.e. a "synthetic" feature with constant value equal to
-        intercept_scaling is appended to the instance vector.
-        The intercept becomes ``intercept_scaling * synthetic_feature_weight``.
+        `intercept_scaling` is appended to the instance vector.
+        The intercept becomes
+        ``intercept_scaling * synthetic_feature_weight``.
 
-        Note! the synthetic feature weight is subject to l1/l2 regularization
-        as all other features.
-        To lessen the effect of regularization on synthetic feature weight
-        (and therefore on the intercept) intercept_scaling has to be increased.
+        .. note::
+            The synthetic feature weight is subject to L1 or L2
+            regularization as all other features.
+            To lessen the effect of regularization on synthetic feature weight
+            (and therefore on the intercept) `intercept_scaling` has to be increased.
 
     multi_class : {'ovr', 'multinomial', 'auto'}, default='auto'
         If the option chosen is 'ovr', then a binary problem is fit for each
@@ -692,16 +694,19 @@ def _log_reg_scoring_path(
         n_samples > n_features.
 
     intercept_scaling : float
-        Useful only when the solver 'liblinear' is used
-        and self.fit_intercept is set to True. In this case, x becomes
-        [x, self.intercept_scaling],
-        i.e. a "synthetic" feature with constant value equals to
-        intercept_scaling is appended to the instance vector.
-        The intercept becomes intercept_scaling * synthetic feature weight
-        Note! the synthetic feature weight is subject to l1/l2 regularization
-        as all other features.
-        To lessen the effect of regularization on synthetic feature weight
-        (and therefore on the intercept) intercept_scaling has to be increased.
+        Useful only when the solver `liblinear` is used
+        and `self.fit_intercept` is set to `True`. In this case, `x` becomes
+        `[x, self.intercept_scaling]`,
+        i.e. a "synthetic" feature with constant value equal to
+        `intercept_scaling` is appended to the instance vector.
+        The intercept becomes
+        ``intercept_scaling * synthetic_feature_weight``.
+
+        .. note::
+            The synthetic feature weight is subject to L1 or L2
+            regularization as all other features.
+            To lessen the effect of regularization on synthetic feature weight
+            (and therefore on the intercept) `intercept_scaling` has to be increased.
 
     multi_class : {'auto', 'ovr', 'multinomial'}
         If the option chosen is 'ovr', then a binary problem is fit for each
@@ -881,17 +886,19 @@ class LogisticRegression(LinearClassifierMixin, SparseCoefMixin, BaseEstimator):
         added to the decision function.
 
     intercept_scaling : float, default=1
-        Useful only when the solver 'liblinear' is used
-        and self.fit_intercept is set to True. In this case, x becomes
-        [x, self.intercept_scaling],
+        Useful only when the solver `liblinear` is used
+        and `self.fit_intercept` is set to `True`. In this case, `x` becomes
+        `[x, self.intercept_scaling]`,
         i.e. a "synthetic" feature with constant value equal to
-        intercept_scaling is appended to the instance vector.
-        The intercept becomes ``intercept_scaling * synthetic_feature_weight``.
+        `intercept_scaling` is appended to the instance vector.
+        The intercept becomes
+        ``intercept_scaling * synthetic_feature_weight``.
 
-        Note! the synthetic feature weight is subject to l1/l2 regularization
-        as all other features.
-        To lessen the effect of regularization on synthetic feature weight
-        (and therefore on the intercept) intercept_scaling has to be increased.
+        .. note::
+            The synthetic feature weight is subject to L1 or L2
+            regularization as all other features.
+            To lessen the effect of regularization on synthetic feature weight
+            (and therefore on the intercept) `intercept_scaling` has to be increased.
 
     class_weight : dict or 'balanced', default=None
         Weights associated with classes in the form ``{class_label: weight}``.
@@ -1643,17 +1650,19 @@ class LogisticRegressionCV(LogisticRegression, LinearClassifierMixin, BaseEstima
         best scores across folds are averaged.
 
     intercept_scaling : float, default=1
-        Useful only when the solver 'liblinear' is used
-        and self.fit_intercept is set to True. In this case, x becomes
-        [x, self.intercept_scaling],
+        Useful only when the solver `liblinear` is used
+        and `self.fit_intercept` is set to `True`. In this case, `x` becomes
+        `[x, self.intercept_scaling]`,
         i.e. a "synthetic" feature with constant value equal to
-        intercept_scaling is appended to the instance vector.
-        The intercept becomes ``intercept_scaling * synthetic_feature_weight``.
+        `intercept_scaling` is appended to the instance vector.
+        The intercept becomes
+        ``intercept_scaling * synthetic_feature_weight``.
 
-        Note! the synthetic feature weight is subject to l1/l2 regularization
-        as all other features.
-        To lessen the effect of regularization on synthetic feature weight
-        (and therefore on the intercept) intercept_scaling has to be increased.
+        .. note::
+            The synthetic feature weight is subject to L1 or L2
+            regularization as all other features.
+            To lessen the effect of regularization on synthetic feature weight
+            (and therefore on the intercept) `intercept_scaling` has to be increased.
 
     multi_class : {'auto, 'ovr', 'multinomial'}, default='auto'
         If the option chosen is 'ovr', then a binary problem is fit for each
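The behaviour these docstrings describe can be seen directly. A minimal sketch (not part of the diff, assuming scikit-learn and NumPy are installed): with `solver='liblinear'` and strong regularization, increasing `intercept_scaling` lessens the shrinkage of the fitted intercept, because the same intercept then corresponds to a smaller, less-penalized synthetic feature weight.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy data whose separating boundary is offset from the origin,
# so a sizeable intercept is needed for a good fit.
rng = np.random.RandomState(0)
X = rng.randn(200, 2)
y = (X[:, 0] + X[:, 1] > 1.5).astype(int)

# liblinear appends a constant synthetic feature with value
# `intercept_scaling`; its weight is regularized like any other, and
# intercept_ = intercept_scaling * synthetic_feature_weight.
# With strong regularization (small C), a larger intercept_scaling
# leaves the intercept less shrunk toward zero.
small = LogisticRegression(
    solver="liblinear", C=0.01, intercept_scaling=1.0
).fit(X, y)
large = LogisticRegression(
    solver="liblinear", C=0.01, intercept_scaling=100.0
).fit(X, y)

print(abs(small.intercept_[0]), abs(large.intercept_[0]))
```

On data like this, the second model's intercept magnitude is larger, which is exactly the effect the note in the diff warns about.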