Software

Analysis K-SVD

The work reported in this paper describes a dictionary learning algorithm for the analysis model, an alternative viewpoint to sparse and redundant representations. This model assumes that multiplying the signal by an appropriate analysis operator (dictionary) leads to a sparse outcome. Specifically, the signal lies in a low-dimensional subspace determined by the dictionary atoms (rows of the operator) indexed in the signal's co-support (the indices of the zeros in the sparse representation). To learn the analysis dictionary, we take an approach that parallels the one adopted by the K-SVD algorithm, which serves the corresponding problem in the synthesis model. The effectiveness of the proposed dictionary learning algorithm is demonstrated both on synthetic data and on real images, showing a successful and meaningful recovery of the analysis dictionary.

The following freely available package contains all our Matlab code to reproduce the results of the above-mentioned paper.
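To make the analysis model described above concrete, here is a minimal NumPy sketch, independent of the Matlab package; the names `Omega`, `Lambda`, and the chosen dimensions are purely illustrative assumptions. It builds a signal lying in the subspace orthogonal to the rows indexed by a co-support, and checks that applying the analysis operator indeed yields a representation that is zero exactly on that co-support.

```python
# Minimal sketch of the analysis sparse model (illustrative only, not the
# package's Matlab code).
import numpy as np

rng = np.random.default_rng(0)

p, d = 30, 20   # p analysis atoms (rows of Omega), signals of dimension d
ell = 12        # co-support size: number of rows orthogonal to the signal

# Random analysis dictionary with unit-norm rows (atoms)
Omega = rng.standard_normal((p, d))
Omega /= np.linalg.norm(Omega, axis=1, keepdims=True)

# Pick a co-support Lambda and build a signal in the null space of Omega[Lambda],
# i.e. in the (d - ell)-dimensional subspace determined by those rows
Lambda = rng.choice(p, size=ell, replace=False)
_, _, Vt = np.linalg.svd(Omega[Lambda], full_matrices=True)
null_basis = Vt[ell:].T                     # orthonormal basis of the null space
x = null_basis @ rng.standard_normal(d - ell)

# Multiplying by the analysis operator gives a sparse outcome:
# zeros exactly on the co-support indices
z = Omega @ x
print("zeros on the co-support:", np.allclose(z[Lambda], 0.0))
print("number of (near-)zero entries:", int(np.sum(np.abs(z) < 1e-10)))
```

In the sketch, sparsity of `Omega @ x` is a consequence of where the signal lives (its co-support subspace) rather than of a synthesis-style combination of atoms, which is the viewpoint the paper's dictionary learning algorithm exploits.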