Drones Ideal for Spectral Imagery Sensors – Here’s Why

by Tim Haynie, Spectrabotics, Business Development Manager, thaynie(at)spectrabotics.com

Spectral imagery collection and exploitation from a small Unmanned Aerial System (sUAS) was recently demonstrated at the Naval Postgraduate School's Joint Interagency Field Experiment (JIFX) and the Secretary of Defense's Rapid Reaction Technology Office (RRTO) Thunderstorm 15-3, using both multispectral and hyperspectral sensor systems.  While spectral imagery collection and exploitation from high-altitude and satellite platforms are well established and well documented, recent advancements make the sUAS an ideal platform for spectral data collection: it offers finer resolutions, dynamic flight profiles, and a new dimension in how data collection is planned.


At the JIFX, the sUAS platform carrying the Pixelteq SpectroCam™ multispectral camera was an eight-rotor multicopter controlled by a 3DR open-source Pixhawk flight control computer, as found in many commercial systems.  The system consumed roughly 18,000 watts of power for flight control and sensor operations and demonstrated a 25-minute flight time.  At the Thunderstorm 15-3 demonstration, the smaller, lighter Headwall Photonics Nano-Hyperspec™ required only a six-rotor multicopter flown with the same flight control computer (3DR Pixhawk) and attained a 15-minute flight time.

These spectral sensors record reflected light energy across the Visible and Near Infrared (VNIR) region of the electromagnetic spectrum (400-1100 nanometers).  The Pixelteq SpectroCam™ uses eight filters to record the light energy in multispectral bands, while the Headwall Photonics Nano-Hyperspec™ records 270 bands of hyperspectral data using a diffraction grating to split the incoming light into measurable bands.  Post-flight analysis of the spectral data identified physical and chemical features within the scene for material identification using a reference spectrum (the library signature of a target material).
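
As a rough sketch of how a reference spectrum (library signature) can be matched against collected pixels, the example below uses the spectral angle mapper approach, a common technique for this kind of material identification. The array shapes, threshold, and synthetic data are illustrative assumptions, not details of the processing chain actually used at these events.

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Angle (radians) between a pixel spectrum and a reference spectrum.
    Smaller angles mean a closer spectral match."""
    cos_theta = np.dot(pixel, reference) / (
        np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

def detect_material(cube, reference, threshold=0.10):
    """Flag pixels whose spectral angle to the reference is below a threshold.

    cube      : (rows, cols, bands) reflectance array, e.g. 270 VNIR bands
    reference : (bands,) library signature of the target material
    threshold : maximum angle in radians to accept as a detection (assumed)
    """
    rows, cols, bands = cube.shape
    flat = cube.reshape(-1, bands)
    angles = np.array([spectral_angle(p, reference) for p in flat])
    return (angles < threshold).reshape(rows, cols)

# Illustrative run on synthetic data (270 bands, as in a hyperspectral cube)
cube = np.random.rand(100, 100, 270)
reference = np.random.rand(270)
detections = detect_material(cube, reference)
print(f"{detections.sum()} candidate pixels flagged")
```

In practice the threshold would be tuned against known targets and background materials rather than fixed in advance.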

For the JIFX, the team demonstrated the versatility of the multispectral sensor and platform by not only recording overhead imagery of target materials but also lowering the platform below the tree line to collect imagery off-nadir and beneath the canopy.  The sUAS collection platform was stable enough to collect data, and subsequent analysis successfully detected the presence of the target material (military uniforms) despite it being in shadow and masked by foliage.

The dynamic flight characteristics of the multirotor sUAS were essential to the hyperspectral data collection at Thunderstorm 15-3 because of the need for a high-precision flight path.  Scanning an urban area for the presence of a chemical hazard (a methyl salicylate simulant), the Nano-Hyperspec™ required overhead collection at a very specific speed and altitude to maintain the exposure and frame rate needed for proper sensor operation.  This can only be accomplished with autonomous flight under the control of the sUAS's 3DR Pixhawk flight management system.  The hyperspectral imager detected the chemical simulant hidden within an urban environment after the aircraft flew an autonomous "lawnmower pattern" over the target area.
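
A "lawnmower" (boustrophedon) survey is straightforward to generate once the sensor's ground swath and required sidelap fix the spacing between passes. The sketch below, in local metres with an assumed fixed altitude, speed-independent waypoints, and an assumed swath, only illustrates the pattern; it is not the actual mission flown on the Pixhawk at Thunderstorm 15-3.

```python
def lawnmower_waypoints(width_m, height_m, swath_m, overlap=0.3, altitude_m=30.0):
    """Generate back-and-forth survey waypoints over a rectangular area.

    width_m / height_m : dimensions of the survey rectangle (local metres)
    swath_m            : ground swath covered by one pass of the sensor
    overlap            : fractional sidelap between adjacent passes (assumed)
    altitude_m         : fixed collection altitude (assumed)
    Returns a list of (x, y, z) waypoints; x runs along each pass.
    """
    spacing = swath_m * (1.0 - overlap)          # distance between passes
    waypoints = []
    y, left_to_right = 0.0, True
    while y <= height_m:
        xs = (0.0, width_m) if left_to_right else (width_m, 0.0)
        waypoints.append((xs[0], y, altitude_m))  # start of this pass
        waypoints.append((xs[1], y, altitude_m))  # end of this pass
        left_to_right = not left_to_right
        y += spacing
    return waypoints

# Example: 200 m x 120 m area, 25 m swath, 30% sidelap
for wp in lawnmower_waypoints(200, 120, 25):
    print(wp)
```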

The use of the sUAS platform for these sensors directly improved the temporal and spatial resolutions of the spectral imagery beyond those attained by satellite and high-flying manned/unmanned systems, even those hosting larger, more capable sensors.  Temporal resolution was significantly enhanced: data collection was completed within an hour of notification of the target area, encompassing mission planning, flight and data collection, and system recovery.  The low-level flight of the sUAS captured data at a 3-inch spatial resolution, which helped analysts by collecting data with up to 100% pixel saturation of the target material.  It is also important to note that despite cloud cover (which would have prevented high-altitude collection) and filtered sunlight, the sensors collected sufficient data for analysis to detect the target simulant material.
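
The link between altitude and spatial resolution is simple pinhole-camera arithmetic: ground sample distance (GSD) equals altitude times pixel pitch divided by focal length. The sensor parameters below are assumed, representative values chosen only to show how a low-flying sUAS reaches a roughly 3-inch (about 7.5 cm) pixel footprint; they are not the specifications of the SpectroCam or the Nano-Hyperspec.

```python
def ground_sample_distance(altitude_m, pixel_pitch_um, focal_length_mm):
    """Ground footprint of one pixel (metres) for a nadir-looking camera."""
    return altitude_m * (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3)

# Assumed, illustrative values: 80 m altitude, 7.4 um pixel pitch, 8 mm lens
gsd = ground_sample_distance(80.0, 7.4, 8.0)
print(f"GSD = {gsd * 100:.1f} cm per pixel")   # ~7.4 cm, roughly 3 inches
```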

The combination of these resolution enhancements, coupled with the flexibility of a sUAS to alter its flight profile to match an individual sensor's collection requirements, makes the sUAS a viable platform not only for hosting other sensor systems but also for exploring flight parameters that expand the data potential of those systems.  Data collection from aerial systems has traditionally been performed from a two-dimensional plane (a fixed orbit at a set operating altitude); the sUAS, and multicopters in particular, enables collection throughout a three-dimensional space, and its ease of deployment and operation increases the frequency of use, yielding more on-demand data.

Our next level of effort is to infuse this "sUAS data layer" into the overall intelligence data cloud architecture and begin opening the data to advanced analytics by other users.  sUAS platforms will eventually host other types of sensors beyond imaging cameras (vapor sensors, signals detectors, and laser rangefinders for 3D modeling, to name a few), and the true benefits of sUAS technology are its potential to increase the number of sensors deployed and to broaden access to places unattainable by conventional platforms.

Participating in these two events and comprising "Team Peregrine" were Spectrabotics, Autonomous Avionics, Pixelteq, and Exogenesis Solution from Colorado, and Headwall Photonics from Massachusetts.
