Description
There is a consistency issue with `StandardScaler` when `with_mean=False` and `with_std=False`, between the sparse and dense cases.
- Does it make sense to support this case at all? It just returns the identity transformation (the input unchanged), which is not the use case for `StandardScaler`. If we want a transformer that does that, one should use `FunctionTransformer` instead, I assume.
- If we consider this behaviour normal, we need to:
  - In the dense case, force `self.mean_` to be `None` after each iteration of `partial_fit`.
  - In the sparse case, compute the non-NaN values and update `self.n_samples_seen_`, which is currently not computed. This leads to an error when calling `fit` twice (i.e. `del self.n_samples_seen_` will fail). A reproduction sketch follows this list.
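A minimal reproduction sketch of the two cases above. The behaviour in the comments is the one described in this report; exact errors may differ across scikit-learn versions.

```python
import numpy as np
import scipy.sparse as sp
from sklearn.preprocessing import StandardScaler

X = np.array([[1.0, 2.0], [3.0, 4.0]])

# Dense case: the transform is a no-op, and per the report mean_ is not
# forced to None after partial_fit.
scaler = StandardScaler(with_mean=False, with_std=False)
scaler.partial_fit(X)
print(scaler.mean_)  # proposed above to be None

# Sparse case: per the report n_samples_seen_ is never computed, so a second
# call to fit (whose reset step does `del self.n_samples_seen_`) fails.
scaler = StandardScaler(with_mean=False, with_std=False)
scaler.fit(sp.csr_matrix(X))
scaler.fit(sp.csr_matrix(X))  # reported to raise because n_samples_seen_ is missing
```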
IMO, we should add a check at `fit` time and raise an error.
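A rough sketch of what such a check could look like. The helper name and error message are placeholders, not scikit-learn's actual implementation.

```python
# Hypothetical sketch of the proposed fit-time check.
def _check_scaling_flags(with_mean, with_std):
    if not with_mean and not with_std:
        raise ValueError(
            "with_mean=False and with_std=False make StandardScaler a no-op; "
            "use FunctionTransformer for an identity transform instead."
        )

# fit/partial_fit would call this before computing any statistics, e.g.
# _check_scaling_flags(self.with_mean, self.with_std)  # hypothetical hook
```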