Sparse Overcomplete Denoising: Aggregation Versus Global Optimization
Carrera, Diego; Boracchi, Giacomo; Foi, Alessandro; Wohlberg, Brendt (2017-10-01)
The permanent address of the publication is
https://urn.fi/URN:NBN:fi:tty-201710051986
Description
Peer reviewed
Abstract
Denoising is often addressed via sparse coding with respect to an overcomplete dictionary. There are two main approaches when the dictionary is composed of translates of an orthonormal basis. The first, traditionally employed by techniques such as wavelet cycle spinning, separately seeks sparsity with respect to each translate of the orthonormal basis, solving multiple partial optimizations and obtaining a collection of sparse approximations of the noise-free image, which are then aggregated into a final estimate. The second approach, recently employed by convolutional sparse representations, instead seeks sparsity over the entire dictionary via a global optimization. It is tempting to view the former approach as providing a suboptimal solution of the latter. In this letter, we analyze whether global sparsity is a desirable property, and under what conditions the global optimization provides a better solution to the denoising problem. In particular, our experimental analysis shows that the two approaches attain comparable performance in the case of natural images, and that the global optimization outperforms the simpler aggregation of partial estimates only when the image admits an extremely sparse representation. We explain this phenomenon by separately studying the bias and variance of these solutions, and by noting that the variance of the global solution increases very rapidly as the original signal becomes less and less sparse.
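The following is a minimal illustrative sketch of the first (aggregation) approach described in the abstract, not code from the paper: a 1-D signal is denoised with respect to each circular translate of a single-level orthonormal Haar basis by soft thresholding, and the partial estimates are averaged, in the spirit of wavelet cycle spinning. The signal, threshold, and Haar basis are assumptions chosen only to keep the example self-contained.

```python
import numpy as np

def haar_analysis(x):
    """Single-level orthonormal Haar analysis of an even-length signal."""
    even, odd = x[0::2], x[1::2]
    approx = (even + odd) / np.sqrt(2)
    detail = (even - odd) / np.sqrt(2)
    return approx, detail

def haar_synthesis(approx, detail):
    """Inverse of the single-level orthonormal Haar transform."""
    x = np.empty(2 * approx.size)
    x[0::2] = (approx + detail) / np.sqrt(2)
    x[1::2] = (approx - detail) / np.sqrt(2)
    return x

def soft(c, thr):
    """Soft thresholding: the sparsity-promoting shrinkage of the coefficients."""
    return np.sign(c) * np.maximum(np.abs(c) - thr, 0.0)

def denoise_one_translate(y, shift, thr):
    """Partial estimate: sparsify w.r.t. one translate of the orthonormal basis."""
    ys = np.roll(y, shift)
    a, d = haar_analysis(ys)
    xs = haar_synthesis(a, soft(d, thr))   # keep approximation, shrink detail
    return np.roll(xs, -shift)

def cycle_spin_denoise(y, thr, shifts=(0, 1)):
    """Aggregation approach: average the partial estimates over all translates."""
    return np.mean([denoise_one_translate(y, s, thr) for s in shifts], axis=0)

# Toy usage: piecewise-constant signal corrupted by Gaussian noise.
rng = np.random.default_rng(0)
clean = np.repeat([0.0, 1.0, -0.5, 0.5], 64)
noisy = clean + 0.1 * rng.standard_normal(clean.size)
estimate = cycle_spin_denoise(noisy, thr=0.2)
print("noisy MSE   :", np.mean((noisy - clean) ** 2))
print("denoised MSE:", np.mean((estimate - clean) ** 2))
```

The second (global) approach discussed in the abstract would instead solve a single sparse coding problem over the union of all translates, e.g. via a convolutional sparse coding solver, rather than averaging independent per-translate estimates.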
Collections
- TUNICRIS-julkaisut [19188]