Sparse coding.
Each row of the result is the solution to a sparse coding problem. The goal is to find a sparse array code such that:
X ~= code * dictionary
Read more in the User Guide.
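As a quick illustration of the relationship above (a minimal sketch reusing the toy data from the example below; with a near-zero penalty the reconstruction is essentially exact):

```python
import numpy as np
from sklearn.decomposition import sparse_encode

# Toy data: 2 samples with 3 features; dictionary of 5 atoms (rows).
X = np.array([[-1.0, -1.0, -1.0], [0.0, 0.0, 3.0]])
dictionary = np.array([[0.0, 1.0, 0.0],
                       [-1.0, -1.0, 2.0],
                       [1.0, 1.0, 1.0],
                       [0.0, 1.0, 1.0],
                       [0.0, 2.0, 1.0]])

code = sparse_encode(X, dictionary, alpha=1e-10)
# X ~= code * dictionary: check the reconstruction numerically.
print(np.allclose(code @ dictionary, X, atol=1e-6))
```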
X : Data matrix.
dictionary : The dictionary matrix against which to solve the sparse coding of the data. Some of the algorithms assume normalized rows for meaningful output.
gram : Precomputed Gram matrix, dictionary * dictionary'.
cov : Precomputed covariance, dictionary' * X.
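When encoding several batches against the same dictionary, these matrices can be computed once and reused; a sketch (toy data assumed):

```python
import numpy as np
from sklearn.decomposition import sparse_encode

X = np.array([[-1.0, -1.0, -1.0], [0.0, 0.0, 3.0]])
dictionary = np.array([[0.0, 1.0, 0.0],
                       [-1.0, -1.0, 2.0],
                       [1.0, 1.0, 1.0],
                       [0.0, 1.0, 1.0],
                       [0.0, 2.0, 1.0]])

# Precompute once, reuse across calls with the same dictionary.
gram = dictionary @ dictionary.T   # (n_components, n_components)
cov = dictionary @ X.T             # (n_components, n_samples)
code = sparse_encode(X, dictionary, gram=gram, cov=cov, alpha=1e-10)
print(np.allclose(code @ dictionary, X, atol=1e-6))
```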
algorithm : The algorithm used:

- 'lars': uses the least angle regression method (linear_model.lars_path);
- 'lasso_lars': uses Lars to compute the Lasso solution;
- 'lasso_cd': uses the coordinate descent method to compute the Lasso solution (linear_model.Lasso). 'lasso_lars' will be faster if the estimated components are sparse;
- 'omp': uses orthogonal matching pursuit to estimate the sparse solution;
- 'threshold': squashes to zero all coefficients less than alpha from the projection dictionary * data'.

n_nonzero_coefs : Number of nonzero coefficients to target in each column of the solution. This is only used by algorithm='lars' and algorithm='omp' and is overridden by alpha in the 'omp' case. If None, then n_nonzero_coefs = int(n_features / 10).
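For instance, a sketch of capping the number of atoms used per sample with orthogonal matching pursuit (toy data assumed; note the dictionary rows here are not normalized, which OMP nominally expects):

```python
import numpy as np
from sklearn.decomposition import sparse_encode

X = np.array([[-1.0, -1.0, -1.0], [0.0, 0.0, 3.0]])
dictionary = np.array([[0.0, 1.0, 0.0],
                       [-1.0, -1.0, 2.0],
                       [1.0, 1.0, 1.0],
                       [0.0, 1.0, 1.0],
                       [0.0, 2.0, 1.0]])

# Allow at most one dictionary atom per sample.
code = sparse_encode(X, dictionary, algorithm='omp', n_nonzero_coefs=1)
print((code != 0).sum(axis=1))  # at most 1 nonzero coefficient per row
```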
alpha : If algorithm='lasso_lars' or algorithm='lasso_cd', alpha is the penalty applied to the L1 norm. If algorithm='threshold', alpha is the absolute value of the threshold below which coefficients will be squashed to zero. If algorithm='omp', alpha is the tolerance parameter: the value of the reconstruction error targeted. In this case, it overrides n_nonzero_coefs. If None, defaults to 1.
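A sketch of the thresholding behaviour (toy data assumed): coefficients of the projection dictionary * data' whose magnitude is at most alpha come out exactly zero.

```python
import numpy as np
from sklearn.decomposition import sparse_encode

X = np.array([[-1.0, -1.0, -1.0], [0.0, 0.0, 3.0]])
dictionary = np.array([[0.0, 1.0, 0.0],
                       [-1.0, -1.0, 2.0],
                       [1.0, 1.0, 1.0],
                       [0.0, 1.0, 1.0],
                       [0.0, 2.0, 1.0]])

proj = X @ dictionary.T  # the projection dictionary * data'
code = sparse_encode(X, dictionary, algorithm='threshold', alpha=2.0)
# Every projection coefficient with |value| <= alpha is squashed to zero.
print(np.all(code[np.abs(proj) <= 2.0] == 0))
```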
copy_cov : Whether to copy the precomputed covariance matrix; if False, it may be overwritten.
init : Initialization value of the sparse codes. Only used if algorithm='lasso_cd'.
max_iter : Maximum number of iterations to perform if algorithm='lasso_cd' or 'lasso_lars'.
n_jobs : Number of parallel jobs to run. None means 1 unless in a joblib.parallel_backend context; -1 means using all processors. See Glossary for more details.
check_input : If False, the input arrays X and dictionary will not be checked.
verbose : Controls the verbosity; the higher, the more messages.
positive : Whether to enforce positivity when finding the encoding.
Added in version 0.20.
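A sketch of a non-negative encoding (toy data assumed; positivity is supported by the Lasso- and threshold-based algorithms, not by 'lars' or 'omp'):

```python
import numpy as np
from sklearn.decomposition import sparse_encode

X = np.array([[-1.0, -1.0, -1.0], [0.0, 0.0, 3.0]])
dictionary = np.array([[0.0, 1.0, 0.0],
                       [-1.0, -1.0, 2.0],
                       [1.0, 1.0, 1.0],
                       [0.0, 1.0, 1.0],
                       [0.0, 2.0, 1.0]])

# Constrain all coefficients to be non-negative.
code = sparse_encode(X, dictionary, algorithm='lasso_cd',
                     alpha=0.1, positive=True)
print((code >= 0).all())
```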
Returns: The sparse codes.
See also
sklearn.linear_model.lars_path : Compute Least Angle Regression or Lasso path using the LARS algorithm.
sklearn.linear_model.orthogonal_mp : Solves Orthogonal Matching Pursuit problems.
sklearn.linear_model.Lasso : Train Linear Model with L1 prior as regularizer.
SparseCoder : Find a sparse representation of data from a fixed, precomputed dictionary.
Examples
>>> import numpy as np
>>> from sklearn.decomposition import sparse_encode
>>> X = np.array([[-1, -1, -1], [0, 0, 3]])
>>> dictionary = np.array(
... [[0, 1, 0],
... [-1, -1, 2],
... [1, 1, 1],
... [0, 1, 1],
... [0, 2, 1]],
... dtype=np.float64
... )
>>> sparse_encode(X, dictionary, alpha=1e-10)
array([[ 0.,  0., -1.,  0.,  0.],
       [ 0.,  1.,  1.,  0.,  0.]])
© 2007–2025 The scikit-learn developers
Licensed under the 3-clause BSD License.
https://scikit-learn.org/1.6/modules/generated/sklearn.decomposition.sparse_encode.html