Conventional imaging techniques, such as digital photography, capture images in only three wavelength bands of light – blue, green and red. Hyperspectral imaging creates images across dozens or hundreds of wavelengths. These images can be used to determine which materials are present in the imaged scene – sort of like spectroscopy done at a distance.
But the technique does face some challenges.
For example, in a conventional imaging system, if an image has millions of pixels across three wavelengths, the image file might be one megabyte. But in hyperspectral imaging, the image file could be at least an order of magnitude larger. This can create problems for storing and transmitting data.
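As a rough, back-of-the-envelope illustration of that scaling (assuming 8-bit samples and no compression – the exact figures above depend on the file format used):

```python
# Sketch of how raw image size scales with the number of spectral bands.
# Assumes 8-bit samples and no compression; real file sizes vary with format.

def raw_size_mb(width, height, bands, bytes_per_sample=1):
    """Uncompressed size of an image cube, in megabytes."""
    return width * height * bands * bytes_per_sample / 1e6

rgb = raw_size_mb(1000, 1000, 3)      # conventional three-band image
hyper = raw_size_mb(1000, 1000, 100)  # 100-band hyperspectral cube

print(f"RGB: {rgb:.1f} MB, hyperspectral: {hyper:.1f} MB")
```

The hyperspectral cube grows in direct proportion to the number of bands – here more than 30 times larger for the same pixel count.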
In addition, capturing hyperspectral images across dozens of wavelengths can be time-consuming – with the conventional imaging technology taking a series of images, each capturing a different suite of wavelengths, and then combining them.
In recent years, researchers have developed new hyperspectral imaging hardware that can acquire the necessary images more quickly and store the images using significantly less memory. The hardware takes advantage of “compressive measurements,” which mix spatial and wavelength data in a format that can be used later to reconstruct the complete hyperspectral image.
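The core idea behind compressive measurements can be sketched in a few lines. The toy example below is not the authors' algorithm – it recovers a sparse signal from a short vector of randomly mixed measurements using plain iterative soft-thresholding (ISTA), a simpler relative of the approximate message passing approach the paper describes; all names and parameters are illustrative.

```python
import numpy as np

# Toy compressive-sensing sketch: far fewer measurements than unknowns,
# recovered by exploiting sparsity. Illustrative only; not the paper's
# AMP algorithm.

rng = np.random.default_rng(0)
n, m, k = 256, 96, 8               # signal length, measurements, nonzeros

x = np.zeros(n)                    # sparse stand-in for image/spectral data
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

A = rng.standard_normal((m, n)) / np.sqrt(m)  # random sensing (mixing) matrix
y = A @ x                          # compressive measurements: m << n

# ISTA: gradient step toward the measurements, then soft-threshold
# to push the estimate toward sparsity.
step = 1.0 / np.linalg.norm(A, 2) ** 2
lam = 0.05
x_hat = np.zeros(n)
for _ in range(500):
    r = x_hat + step * A.T @ (y - A @ x_hat)
    x_hat = np.sign(r) * np.maximum(np.abs(r) - step * lam, 0.0)

err = np.linalg.norm(x_hat - x) / np.linalg.norm(x)
print(f"relative reconstruction error: {err:.3f}")
```

Even with only 96 measurements of a 256-sample signal, the sparse structure lets the solver recover the original closely – the same principle, scaled up, lets compressive hardware record far less data than a full hyperspectral cube.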
But in order for the new hardware to work effectively, it needs an algorithm that can reconstruct the image accurately and quickly. And that’s what researchers at NC State and the University of Delaware have developed.
In model testing, the new algorithm significantly outperformed existing algorithms at every wavelength.
“We were able to reconstruct image quality in 100 seconds of computation that other algorithms couldn’t match in 450 seconds,” Baron says. “And we’re confident that we can bring that computational time down even further.”
The higher quality of the image reconstruction means that fewer measurements need to be acquired and processed by the hardware, speeding up the imaging time. And fewer measurements mean less data that needs to be stored and transmitted.
“Our next step is to run the algorithm in a real-world system to gain insights into how the algorithm functions and identify potential room for improvement,” Baron says. “We’re also considering how we could modify both the algorithm and the hardware to better complement each other.”
The paper, “Compressive Hyperspectral Imaging via Approximate Message Passing,” is published online in the IEEE Journal of Selected Topics in Signal Processing. Lead author of the paper is Jin Tan, a former Ph.D. student at NC State. The paper was co-authored by Yanting Ma of NC State and Hoover Rueda and Gonzalo Arce of the University of Delaware. The work was supported by the National Science Foundation under grant number CCF-1217749 and by the U.S. Army Research Office under grants W911NF-14-1-0314 and W911NF-12-1-0380.