Abstract
Image deblurring is the problem of recovering a sharp image from its blurry observation. It arises frequently in imaging sciences and technologies, including optical, medical, and astronomical applications, and is crucial for detecting important features and patterns such as those of a distant planet or a microscopic tissue sample. Mathematically, image deblurring is intimately connected to backward diffusion processes (e.g., inverting the heat equation), which are notoriously unstable. As inverse-problem solvers, deblurring models therefore depend crucially on proper regularizers or conditioners that secure stability, often at the necessary cost of losing certain high-frequency details of the original images. Such regularization techniques can ensure the existence, uniqueness, or stability of deblurred images. The present work closely follows the general framework described in our recent monograph [18], but also contains updated views and approaches to image deblurring, including, for example, further discussion of stochastic signals, the Bayesian/Tikhonov approach to Wiener filtering, and the iterated-shrinkage algorithm of Daubechies et al. [30, 31] for wavelet-based deblurring. The work thus contributes to the development of generic, systematic, and unified frameworks in contemporary image processing.
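The Tikhonov-regularized deconvolution that the abstract alludes to can be illustrated with a minimal frequency-domain sketch. This is not the monograph's own code; the Gaussian point-spread function, the regularization weight `alpha`, and the function names are illustrative assumptions. Minimizing ||K u - f||^2 + alpha ||u||^2 over images u has the closed-form frequency-domain solution u_hat = conj(K_hat) f_hat / (|K_hat|^2 + alpha), a basic Wiener-type filter:

```python
import numpy as np

def gaussian_psf(size, sigma):
    # Normalized 2-D Gaussian point-spread function (illustrative blur kernel).
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return psf / psf.sum()

def tikhonov_deblur(blurred, psf, alpha=1e-2):
    # Solve min_u ||K u - f||^2 + alpha ||u||^2 in the Fourier domain:
    #   u_hat = conj(K_hat) * f_hat / (|K_hat|^2 + alpha)
    # alpha > 0 stabilizes the division where |K_hat| is tiny, at the
    # cost of damping high frequencies -- the trade-off noted above.
    K = np.fft.fft2(np.fft.ifftshift(psf), s=blurred.shape)
    F = np.fft.fft2(blurred)
    U = np.conj(K) * F / (np.abs(K) ** 2 + alpha)
    return np.real(np.fft.ifft2(U))

# Toy usage: blur a square, then restore it.
img = np.zeros((32, 32))
img[12:20, 12:20] = 1.0
psf = gaussian_psf(32, 2.0)
K = np.fft.fft2(np.fft.ifftshift(psf), s=img.shape)
blurred = np.real(np.fft.ifft2(K * np.fft.fft2(img)))
restored = tikhonov_deblur(blurred, psf, alpha=1e-3)
```

With no noise and a small `alpha`, the restoration error should be well below the blur error; in practice `alpha` is tuned against the noise level, since too small a value amplifies noise and too large a value over-smooths.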
| Field | Value |
|---|---|
| Original language | English |
| Title of host publication | Mathematics and Computation in Imaging Science and Information Processing |
| Publisher | World Scientific Publishing Co. |
| Pages | 93–130 |
| Number of pages | 38 |
| ISBN (Electronic) | 9789812709066 |
| Publication status | Published - 1 Jan 2007 |
| Externally published | Yes |
Bibliographical note
Publisher Copyright: © 2007 by World Scientific Publishing Co. Pte. Ltd.