Edge Guided Reconstruction for Compressive Imaging

Weihong Guo and Wotao Yin

Published in SIAM J. Imaging Sciences

Overview

This work proposes EdgeCS, an edge guided compressive sensing reconstruction approach, to recover images of higher quality from fewer measurements than current state-of-the-art methods.

Edges are important image features that are used in various ways in image recovery, analysis, and understanding. In compressive sensing, the sparsity of image edges has been widely utilized to recover images; however, edge detectors have not been used on compressive sensing measurements to improve edge recovery and, in turn, image recovery. This motivates EdgeCS, which alternately performs edge detection and image reconstruction in a mutually beneficial way. The edge detector of EdgeCS is designed to faithfully return partial edges from intermediate image reconstructions even though these reconstructions may still contain noise and artifacts. For complex-valued images, it incorporates joint sparsity between the real and imaginary components.
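Below is a minimal sketch of this alternating scheme in Python/NumPy. It is illustrative, not the paper's actual solver: the weighted-TV reconstruction step is plain smoothed-gradient descent on partial Fourier data, the thresholded-gradient detector is a simplified stand-in for the paper's edge detector, and all function names and parameter values are hypothetical.

    import numpy as np

    def grad(u):
        """Forward differences with periodic boundaries."""
        return np.roll(u, -1, axis=1) - u, np.roll(u, -1, axis=0) - u

    def div(px, py):
        """Discrete divergence, the negative adjoint of grad."""
        return (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))

    def detect_edges(u, keep=0.1):
        """Thresholded gradient magnitude as a stand-in edge detector.

        For complex u, np.abs(gx)**2 = Re(gx)**2 + Im(gx)**2, so this
        magnitude jointly couples the real and imaginary components.
        """
        gx, gy = grad(u)
        mag = np.sqrt(np.abs(gx) ** 2 + np.abs(gy) ** 2)
        return mag > keep * mag.max()

    def edgecs(b, mask, outer=5, inner=300, step=0.2, lam=5e-3, eps=1e-3):
        """Alternate weighted-TV reconstruction and edge detection.

        b    : observed k-space samples (zero off the sampling set)
        mask : boolean Fourier sampling pattern, same shape as b
        Detected edge pixels get TV weight 0, so the TV penalty no
        longer smooths across them in the next reconstruction pass.
        """
        u = np.fft.ifft2(b, norm="ortho").real   # zero-filled initial guess
        w = np.ones_like(u)                      # all gradients penalized
        for _ in range(outer):
            for _ in range(inner):               # gradient descent on
                r = mask * (np.fft.fft2(u, norm="ortho") - b)
                g_fid = np.fft.ifft2(r, norm="ortho").real
                gx, gy = grad(u)
                n = np.sqrt(gx ** 2 + gy ** 2 + eps)   # smoothed isotropic TV
                g_tv = -div(w * gx / n, w * gy / n)
                u = u - step * (g_fid + lam * g_tv)
            w = np.where(detect_edges(u), 0.0, 1.0)    # drop penalty on edges
        return u

The key coupling is the weight update at the end of each outer pass: gradients at detected edge pixels receive zero weight, so the next reconstruction preserves those edges rather than smoothing across them.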

EdgeCS has been implemented with both isotropic and anisotropic discretizations of total variation (both are sketched in code after the timing note below) and tested on incomplete k-space (Fourier) samples, though it applies to other types of measurements as well. Experimental results on large-scale real- and complex-valued phantom and magnetic resonance (MR) images show that EdgeCS is fast and returns high-quality images:

  • It exactly recovers the 256-by-256 Shepp-Logan phantom from merely 7 radial lines (3.03% of k-space), which is impossible for most existing algorithms.

  • It accurately reconstructs a 512-by-512 MR image with 0.05 white noise from 20.87% radial samples.

  • On complex-valued MR images, it obtains recoveries with faithful phases, which are important in many medical applications.

Each of these tests took around 30 seconds on a standard PC. Finally, the algorithm is GPU-friendly.
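For reference, here is an equally illustrative NumPy sketch (hypothetical names again) of the two total-variation discretizations mentioned above, together with a simple radial-line k-space sampling mask of the kind used in these experiments.

    import numpy as np

    def tv(u, isotropic=True):
        """Discrete total variation with forward differences.

        Isotropic:   sum of sqrt(gx**2 + gy**2) over pixels
        Anisotropic: sum of |gx| + |gy| over pixels
        """
        gx = np.roll(u, -1, axis=1) - u
        gy = np.roll(u, -1, axis=0) - u
        if isotropic:
            return np.sum(np.sqrt(np.abs(gx) ** 2 + np.abs(gy) ** 2))
        return np.sum(np.abs(gx) + np.abs(gy))

    def radial_mask(n, n_lines):
        """Boolean n-by-n k-space mask with n_lines equiangular spokes."""
        mask = np.zeros((n, n), dtype=bool)
        c = n // 2
        t = np.arange(-c, c)
        for a in np.linspace(0.0, np.pi, n_lines, endpoint=False):
            x = np.clip(np.round(c + t * np.cos(a)).astype(int), 0, n - 1)
            y = np.clip(np.round(c + t * np.sin(a)).astype(int), 0, n - 1)
            mask[y, x] = True
        return np.fft.ifftshift(mask)  # centered mask -> FFT ordering

As a rough check, radial_mask(256, 7) keeps a few percent of k-space, comparable to the 3.03% sampling rate quoted for the Shepp-Logan test above.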

Citation

W. Guo and W. Yin, Edge guided reconstruction for compressive imaging, SIAM Journal on Imaging Sciences 5(3), 809-834, 2012.

