sklearn.preprocessing.power_transform(X, method='box-cox', standardize=True, copy=True)
Apply a power transform featurewise to make data more Gaussian-like.
Power transforms are a family of parametric, monotonic transformations that are applied to make data more Gaussian-like. This is useful for modeling issues related to heteroscedasticity (non-constant variance), or other situations where normality is desired.
Currently, power_transform() supports the Box-Cox transform. Box-Cox requires input data to be strictly positive. The optimal parameter for stabilizing variance and minimizing skewness is estimated through maximum likelihood.
By default, zero-mean, unit-variance normalization is applied to the transformed data.
Read more in the User Guide.
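As a quick illustration of these defaults, the following sketch applies the transform to synthetic right-skewed data (the log-normal sample and the tolerance checks are illustrative assumptions, not part of the API):

>>> import numpy as np
>>> from sklearn.preprocessing import power_transform
>>> rng = np.random.RandomState(0)
>>> X = rng.lognormal(size=(100, 2))  # strictly positive, right-skewed
>>> Xt = power_transform(X)           # Box-Cox with standardize=True (default)
>>> np.allclose(Xt.mean(axis=0), 0.0), np.allclose(Xt.std(axis=0), 1.0)
(True, True)
>>> Xt_raw = power_transform(X, standardize=False)  # raw Box-Cox output
>>> np.allclose(Xt_raw.mean(axis=0), 0.0)           # not centered
False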
Parameters:

X : array-like, shape (n_samples, n_features)
    The data to be transformed using a power transformation.
method : str, (default='box-cox')
    The power transform method. Currently, 'box-cox' (Box-Cox transform) is the only option.
standardize : boolean, default=True
    Set to True to apply zero-mean, unit-variance normalization to the transformed output.
copy : boolean, optional, (default=True)
    Set to False to perform inplace computation during transformation.

Returns:

X_trans : array-like, shape (n_samples, n_features)
    The transformed data.
See also

PowerTransformer
    Equivalent transformation with the Transformer API (as part of a preprocessing sklearn.pipeline.Pipeline).
quantile_transform
    Maps data to a standard normal distribution with the parameter output_distribution='normal'.

Notes

NaNs are treated as missing values: disregarded to compute the statistics, and maintained during the data transformation.
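A small sketch of this NaN behavior (the input values are illustrative, and this assumes a scikit-learn version in which power_transform accepts NaN, as stated above):

>>> import numpy as np
>>> from sklearn.preprocessing import power_transform
>>> X = np.array([[1.0, 2.0],
...               [3.0, np.nan],
...               [4.0, 5.0]])
>>> np.isnan(power_transform(X))  # the NaN stays exactly where it was
array([[False, False],
       [False,  True],
       [False, False]])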
For a comparison of the different scalers, transformers, and normalizers, see examples/preprocessing/plot_all_scaling.py.
References

G.E.P. Box and D.R. Cox, “An Analysis of Transformations”, Journal of the Royal Statistical Society B, 26, 211-252 (1964).
Examples

>>> import numpy as np
>>> from sklearn.preprocessing import power_transform
>>> data = [[1, 2], [3, 2], [4, 5]]
>>> print(power_transform(data))
[[-1.332... -0.707...]
 [ 0.256... -0.707...]
 [ 1.076... 1.414...]]
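As noted under See also, the same transformation is available through the PowerTransformer estimator for use inside a Pipeline. A minimal sketch (the Ridge step and the toy targets are illustrative assumptions):

>>> from sklearn.pipeline import Pipeline
>>> from sklearn.preprocessing import PowerTransformer
>>> from sklearn.linear_model import Ridge
>>> pipe = Pipeline([('boxcox', PowerTransformer(method='box-cox')),
...                  ('ridge', Ridge())])
>>> _ = pipe.fit([[1, 2], [3, 2], [4, 5]], [0.0, 1.0, 2.0])
>>> pipe.named_steps['boxcox'].lambdas_.shape  # one estimated lambda per feature
(2,)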