# K-SVD and its non-negative variant for dictionary design

```bibtex
@inproceedings{Aharon2005KSVDAI,
  title     = {K-SVD and its non-negative variant for dictionary design},
  author    = {Michal Aharon and Michael Elad and Alfred Marcel Bruckstein},
  booktitle = {SPIE Optics + Photonics},
  year      = {2005}
}
```

In recent years there has been growing interest in the study of sparse representations of signals. Using an overcomplete dictionary that contains prototype signal-atoms, signals are described as sparse linear combinations of these atoms. Recent activity in this field has concentrated mainly on the study of pursuit algorithms that decompose signals with respect to a given dictionary. Designing dictionaries to better fit the above model can be done by either selecting pre-specified transforms, or by …
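The sparse-representation model the abstract describes can be illustrated with a minimal NumPy sketch. This is illustrative only; the dimensions and active-atom choices below are arbitrary, not taken from the paper:

```python
import numpy as np

# A signal y described as a sparse linear combination of atoms from an
# overcomplete dictionary D (more atoms m than signal dimension n).
rng = np.random.default_rng(0)
n, m = 16, 40                      # m > n makes D overcomplete
D = rng.standard_normal((n, m))
D /= np.linalg.norm(D, axis=0)     # prototype signal-atoms, unit-norm
x = np.zeros(m)
x[[2, 11, 30]] = [1.0, -0.5, 2.0]  # only three atoms are active
y = D @ x                          # the observed signal
```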

#### 176 Citations

A strategy of classification via sparse dictionary learned by non-negative K-SVD

- Computer Science
- 2009 IEEE 12th International Conference on Computer Vision Workshops, ICCV Workshops
- 2009

This model first applies the non-negative K-SVD algorithm to learn discriminative dictionaries from very few training samples, and then represents a test image as a linear combination of atoms from these learned dictionaries based on the non-negative variant of Basis Pursuit.

Nonnegative Low-rank Sparse Component Analysis

- Computer Science
- ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
- 2019

Two algorithms to train K-NMF are proposed: one based on alternating optimization and exact sparse coding, the other based on a nonnegative variant of K-subspace, equivalent to k-sparse nonnegative matrix factorization.

Dictionary Learning Based on Nonnegative Matrix Factorization Using Parallel Coordinate Descent

- Mathematics
- 2013

Sparse representation of signals via an overcomplete dictionary has recently received much attention as it has produced promising results in various applications. Since the nonnegativities of the …

Learning overcomplete dictionaries with ℓ0-sparse Non-negative Matrix Factorisation

- Mathematics, Computer Science
- 2013 IEEE Global Conference on Signal and Information Processing
- 2013

Dictionary recovery experiments using overcomplete dictionaries show that the use of an ℓ0-norm penalty for NMF outperforms both NMF and a state-of-the-art S-NMF method, in particular when the dictionary to be learnt is dense.

Denoising predictive sparse decomposition

- Computer Science
- 2014 International Conference on Big Data and Smart Computing (BIGCOMP)
- 2014

A novel model, called denoising PSD (DPSD), for robust sparse coding is proposed and experiments on real visual object recognition tasks show that DPSD can dramatically outperform PSD in real applications.

Provable Dictionary Learning via Column Signatures

- 2014

In dictionary learning, also known as sparse coding, we are given samples of the form y = Ax, where x ∈ ℝ^m is an unknown random sparse vector and A is an unknown dictionary matrix in ℝ^{n×m} (usually m > …

Nonnegative Sparse Representation of Signals With a Determinant-Type Sparsity Measure Based on the DC Programming

- Computer Science
- IEEE Access
- 2018

This paper designs an effective, tailored algorithm for the sparse representation of nonnegative signals using the so-called determinant sparsity measure, formed from the determinant of the Gram matrix of the sparse coefficients, and formulates the nonnegative dictionary learning problem as optimization of a non-convex function.

Dictionary learning based on dip patch selection training for random noise attenuation

- Computer Science
- GEOPHYSICS
- 2019

This work has developed a dip-oriented dictionary learning method, which incorporates an estimation of the dip field into the selection procedure of training patches, and applies a curvelet-transform noise reduction method to remove some fine-scale components that presumably contain mostly random noise.

Family of iterative LS-based dictionary learning algorithms, ILS-DLA, for sparse signal representation

- Computer Science
- Digital Signal Processing
- 2007

This work presents a family of iterative least squares based dictionary learning algorithms (ILS-DLA), including algorithms for design of signal dependent block based dictionaries and overlapping dictionaries, as generalizations of transforms and filter banks, respectively.

Learning Big (Image) Data via Coresets for Dictionaries

- Computer Science
- Journal of Mathematical Imaging and Vision
- 2013

This paper suggests a new computational approach to the problem of dictionary learning, known in computational geometry as coresets, which are a small smart non-uniform sample from the input signals such that the quality of any given dictionary with respect to the input can be approximated via the coreset.

#### References

Showing 1–10 of 27 references.

K-SVD: An Algorithm for Designing of Overcomplete Dictionaries for Sparse Representation

- 2005

In recent years there has been a growing interest in the study of sparse representation of signals. Using an overcomplete dictionary that contains prototype signal-atoms, signals are described by …

K-SVD: An Algorithm for Designing Overcomplete Dictionaries for Sparse Representation

- Mathematics, Computer Science
- IEEE Transactions on Signal Processing
- 2006

A novel algorithm for adapting dictionaries to achieve sparse signal representations: the K-SVD algorithm is an iterative method that alternates between sparse coding of the examples based on the current dictionary and a process of updating the dictionary atoms to better fit the data.
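The dictionary-update half of the alternation described above can be sketched in NumPy: each atom, together with the nonzero coefficients that use it, is replaced by the best rank-1 fit (via SVD) of the residual restricted to the signals that use that atom. This is a simplified sketch of the usual K-SVD formulation, not the authors' reference implementation; the function name and toy data are my own:

```python
import numpy as np

def ksvd_atom_update(D, X, Y, j):
    """One K-SVD dictionary-update step for atom j.

    Y: (n, N) training signals, D: (n, m) dictionary, X: (m, N) sparse codes.
    Atom d_j and the nonzero entries of row X[j] are replaced by the best
    rank-1 (SVD) fit of the residual restricted to signals that use atom j.
    """
    omega = np.flatnonzero(X[j])              # signals whose code uses atom j
    if omega.size == 0:
        return D, X                           # unused atom: leave untouched
    # Residual on omega with atom j's own contribution removed.
    E = Y[:, omega] - D @ X[:, omega] + np.outer(D[:, j], X[j, omega])
    U, s, Vt = np.linalg.svd(E, full_matrices=False)
    D, X = D.copy(), X.copy()
    D[:, j] = U[:, 0]                         # new unit-norm atom
    X[j, omega] = s[0] * Vt[0]                # re-fitted coefficients
    return D, X

# Toy run: updating one atom never increases the overall fit error,
# because the SVD gives the optimal rank-1 replacement.
rng = np.random.default_rng(1)
n, m, N = 8, 12, 30
Y = rng.standard_normal((n, N))
D = rng.standard_normal((n, m))
D /= np.linalg.norm(D, axis=0)
X = rng.standard_normal((m, N)) * (rng.random((m, N)) < 0.2)
err_before = np.linalg.norm(Y - D @ X)
D2, X2 = ksvd_atom_update(D, X, Y, 0)
err_after = np.linalg.norm(Y - D2 @ X2)
```

In the full algorithm this update sweeps over all atoms after each sparse-coding pass.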

Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ1 minimization

- Computer Science, Medicine
- Proceedings of the National Academy of Sciences of the United States of America
- 2003

This article obtains parallel results in a more general setting, where the dictionary D can arise from two or several bases, frames, or even less structured systems, and sketches three applications: separating linear features from planar ones in 3D data, noncooperative multiuser encoding, and identification of over-complete independent component models.

Dictionary Learning Algorithms for Sparse Representation

- Mathematics, Computer Science
- Neural Computation
- 2003

Algorithms for data-driven learning of domain-specific overcomplete dictionaries are developed to obtain maximum likelihood and maximum a posteriori dictionary estimates based on the use of Bayesian models with concave/Schur-concave negative log priors, showing improved performance over other independent component analysis methods.

Atomic Decomposition by Basis Pursuit

- Mathematics, Computer Science
- SIAM J. Sci. Comput.
- 1998

Basis Pursuit (BP) is a principle for decomposing a signal into an "optimal" superposition of dictionary elements, where optimal means having the smallest ℓ1 norm of coefficients among all such decompositions.
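The BP principle reduces to a linear program via the standard split x = u − v with u, v ≥ 0. A minimal sketch using SciPy's generic LP solver follows; this is an illustrative formulation, not the paper's own solver, and the function name and toy data are my own:

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(D, y):
    """Solve min ||x||_1 subject to D x = y, cast as a linear program.

    Split x = u - v with u, v >= 0; the objective becomes sum(u) + sum(v).
    """
    n, m = D.shape
    c = np.ones(2 * m)
    A_eq = np.hstack([D, -D])
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * m))
    u, v = res.x[:m], res.x[m:]
    return u - v

# Toy run: a 2-sparse signal in a random overcomplete dictionary.
rng = np.random.default_rng(3)
D = rng.standard_normal((10, 30))
D /= np.linalg.norm(D, axis=0)
x_true = np.zeros(30)
x_true[[4, 21]] = [2.0, -1.0]
y = D @ x_true
x_hat = basis_pursuit(D, y)
```

Since x_true is feasible, the LP solution always has ℓ1 norm at most that of x_true.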

Learning Overcomplete Representations

- Mathematics, Medicine
- Neural Computation
- 2000

It is shown that overcomplete bases can yield a better approximation of the underlying statistical distribution of the data and can thus lead to greater coding efficiency and provide a method for Bayesian reconstruction of signals in the presence of noise and for blind source separation when there are more sources than mixtures.

Sparse image coding using learned overcomplete dictionaries

- Computer Science
- Proceedings of the 2004 14th IEEE Signal Processing Society Workshop on Machine Learning for Signal Processing
- 2004

The authors' overcomplete dictionary learning algorithm (FOCUSS-CNDL) is compared with overcomplete independent component analysis (ICA) and a modified version of the FOCUSS algorithm is presented that can find a non-negative sparse coding in some cases.

Greed is good: algorithmic results for sparse approximation

- Mathematics, Computer Science
- IEEE Transactions on Information Theory
- 2004

This article presents new results on using a greedy algorithm, orthogonal matching pursuit (OMP), to solve the sparse approximation problem over redundant dictionaries and develops a sufficient condition under which OMP can identify atoms from an optimal approximation of a nonsparse signal.
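The greedy OMP procedure that snippet refers to is compact enough to sketch directly: repeatedly pick the atom most correlated with the residual, then re-fit all chosen coefficients by least squares. A minimal sketch, not tied to any particular library implementation:

```python
import numpy as np

def omp(D, y, k):
    """Orthogonal matching pursuit: greedily select k atoms of D for signal y."""
    residual = y.copy()
    support = []
    x = np.zeros(D.shape[1])
    for _ in range(k):
        # Select the atom most correlated with the current residual.
        j = int(np.argmax(np.abs(D.T @ residual)))
        support.append(j)
        # Re-fit all chosen coefficients by least squares (the "orthogonal" step).
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x[support] = coef
    return x

# Toy run: a signal built from two atoms of a random unit-norm dictionary.
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)
x_true = np.zeros(50)
x_true[[3, 17]] = [1.5, -2.0]
y = D @ x_true
x_hat = omp(D, y, 2)
```

The least-squares re-fit makes the residual orthogonal to all atoms chosen so far, which is what distinguishes OMP from plain matching pursuit.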

Learning the parts of objects by non-negative matrix factorization

- Computer Science, Medicine
- Nature
- 1999

An algorithm for non-negative matrix factorization is demonstrated that is able to learn parts of faces and semantic features of text and is in contrast to other methods that learn holistic, not parts-based, representations.

Non-negative Matrix Factorization with Sparseness Constraints

- Computer Science, Mathematics
- Journal of Machine Learning Research
- 2004

This paper shows how explicitly incorporating the notion of 'sparseness' improves the found decompositions, and provides complete MATLAB code both for standard NMF and for an extension of this technique.
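The 'sparseness' notion used there is a scalar measure based on the ratio of the ℓ1 and ℓ2 norms, equal to 0 for a uniform vector and 1 for a vector with a single nonzero entry. A small sketch of that measure (the function name is my own):

```python
import numpy as np

def hoyer_sparseness(x):
    """Hoyer's sparseness: 0 for a uniform vector, 1 for a 1-sparse vector."""
    n = x.size
    l1 = np.sum(np.abs(x))
    l2 = np.linalg.norm(x)
    return (np.sqrt(n) - l1 / l2) / (np.sqrt(n) - 1)
```

Constraining this quantity during the NMF updates is what steers the factors toward sparse, parts-like decompositions.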