When the U.S. military gets into a fight, it wants to see everything that's going on, so it relies on a plethora of optical sensors.
Cameras on UAVs are increasingly numerous. So are cameras on vehicles and cameras on soldiers' helmets. And cameras on satellites have been around for a long time.
But traditional cameras have a drawback: they're bulky and relatively heavy.
Until now, not much could be done about that, because the quality of the images a camera produces has depended on the size of its lens.
But a Texas electrical engineering professor is taking a different approach to producing photo and video images. Instead of relying on a single lens, he proposes using a hundred tiny lenses and electronically merging the digital images they produce.
Even 100 tiny plastic lenses like those used in cell phone cameras - at 1/16 of an inch or so in diameter - would have negligible weight. Just as important, they would lack the bulk of a big lens and its long focal length.
An array of tiny lenses would be essentially flat, so it could be embedded in the underside of the wing of a micro UAV, or attached to the front of a soldier's helmet, without adding noticeable weight or bulk, said Marc Christensen, chairman of the electrical engineering department at Southern Methodist University (SMU).
Built into the walls of buildings, arrays of tiny cameras could unobtrusively monitor workers and visitors.
Individually, each tiny lens produces a low-quality image.
"They're a little bit grainy and noisy because of the small aperture" of the lenses, Christensen said.
But the multiple low-resolution images the lens array gathers can be "digitally processed to extract high-resolution detail," Christensen explained in a research paper.
With colleagues at SMU in Dallas and Santa Clara University in California, Christensen is developing what he calls "a thin, agile, multi-resolution computational imaging sensor architecture." He has received $225,000 in funding from the U.S. Defense Advanced Research Projects Agency (DARPA).
The U.S. military is trying to provide flying, crawling and wearable optical sensors that are small enough, light enough and cheap enough that almost every soldier would have them, Christensen said.
While that's not practical with traditional cameras, in recent years "the necessary technologies have emerged to fundamentally change the way we collect images," Christensen said.
One of the technologies is the computing power to electronically combine multiple low-resolution images produced by tiny cell phone-size lenses into a single, high-resolution image. This produces a quality image without the weight and bulk of quality optics.
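The fusion step Christensen describes can be illustrated with a classic "shift-and-add" super-resolution sketch: several low-resolution frames, each offset by a known sub-pixel shift, are interleaved onto one finer grid. (This is an illustrative technique only; the actual Panoptes processing algorithms are not described in the article, and all names below are made up for the example.)

```python
import numpy as np

def shift_and_add(low_res_frames, shifts, scale):
    """Fuse shifted low-resolution frames onto one high-resolution grid.

    low_res_frames : list of 2-D arrays, all the same shape
    shifts         : one (dy, dx) offset per frame, measured in
                     high-resolution pixels
    scale          : upsampling factor (2 = output grid twice as fine)
    """
    h, w = low_res_frames[0].shape
    acc = np.zeros((h * scale, w * scale))
    count = np.zeros_like(acc)
    for frame, (dy, dx) in zip(low_res_frames, shifts):
        # Each low-res pixel lands at its shifted spot on the fine grid.
        ys = np.arange(h) * scale + dy
        xs = np.arange(w) * scale + dx
        acc[np.ix_(ys, xs)] += frame
        count[np.ix_(ys, xs)] += 1
    # Average wherever samples landed; cells no frame covered stay zero
    # (a real system would interpolate or regularize them).
    filled = count > 0
    acc[filled] /= count[filled]
    return acc
```

With four frames shifted by (0,0), (0,1), (1,0) and (1,1) at scale 2, every cell of the fine grid receives a sample, so detail beyond any single frame's resolution is recovered - the essence of the computational step that substitutes for heavy optics.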
Another development that makes Christensen's sensor possible is MEMS (microelectromechanical systems) micro mirrors.
These are tiny, movable mirrors that are used to focus each lens on whatever is of interest in its field of view.
Typically, only 10 percent to 15 percent of what a lens captures is of interest for intelligence and reconnaissance purposes, Christensen said. A shot from a UAV flying overhead, for example, might show a large empty field and a few buildings. Only the buildings are of interest.
The mirrors can be activated by a "man in the loop" viewing a screen that shows what the camera array sees, or the system can do it automatically.
Christensen and the scientists working with him have developed "novel adaptive algorithms" that enable a computer to calculate what is important in the camera's field of view. The computer then activates the mirrors to focus on that.
When the lenses focus only on the areas of interest in their field of vision, the images the system produces can approach, or even surpass, the performance of a traditional high-resolution camera, Christensen said.
The lens array is called Panoptes. To technophiles, that's Processing Arrays of Nyquist-limited Observations to Produce a Thin Electro-optic Sensor. (Nyquist refers to the sampling limit used in signal processing.) To classicists, Panoptes refers to Argus Panoptes, a hundred-eyed giant in Greek mythology who served as an excellent watchman because only a few of his eyes slept at a time. (Ultimately, however, Panoptes was slain by Hermes, who first put all of his eyes to sleep by telling him boring stories.)
Christensen is not the first scientist to build an optical sensor out of many small camera lenses. Japanese researchers assembled an array of miniature lenses. But the Japanese sensor "stares across the entire field of view," Christensen said. It does not have the ability to focus on areas of interest.
That wastes bandwidth as useless data are transmitted from the sensor, and it wastes computing power to process the parts of the image that aren't of interest, he said.
So far, the Panoptes sensor remains in Christensen's laboratory, but field testing is expected to begin next year, he said.
For DARPA, Panoptes is part of a broader program that aims "to develop new imaging sensors with radically different form, fit and function compared to existing systems."
DARPA's goal is to break the "optics paradigm" and "replace the large, long-lens cameras of today with thin, lightweight cameras with exceptional performance."
The idea is "making cameras that fit the user rather than constraining the user to fit the camera," DARPA spokeswoman Jan Walker said.