
Real-Time Sensor Data Processing and Compression Performed On-board Unmanned Aerial Systems (UAS)

Description:

OBJECTIVE: Develop an onboard sensor processing capability for electro-optical (EO) and/or infrared (IR) data that can support real-time identification, tracking, and selection of regions of interest (ROIs) based on environment, target sets, and mission priorities.

DESCRIPTION: The purpose of this research project is to develop practical concepts for effective, adaptive processing architectures and real-time processors that allow identification and selection of ROIs based on environmental conditions, user mission priorities, and collateral data sources, similar to the saliency process the human brain uses to identify ROIs. The primary objectives of this research are annotation and extraction of ROI data in real time, preliminary classification/recognition of ROI data content, and reduced bandwidth requirements for critical data transmission to the ground system. Performance metrics include accurate determination of ROIs, annotation of data streams, and selection of windows of data for transmission at a pixel input rate of 10^9 pixels per second, a processing latency of less than 2 seconds, and a power expenditure for processor operation of less than 150 watts when operating at a rate of 20 teraops, with a maximum processor weight of less than 5 kilograms.

Advances in military sensor technologies, especially in the area of focal plane arrays, have resulted in an increased potential for significantly improved resolution and area search rate. Improvements in resolution allow detection, tracking, and persistent observation of dismount activities that are difficult at lower resolutions. The new wide field-of-view (WFOV), high-resolution persistent surveillance video sensors have increased the amount of data by several orders of magnitude. These very large increases in sensor data are overwhelming the bandwidth capacities of Unmanned Aircraft Systems (UAS) to transmit the data and are overloading the capabilities of analyst teams to support timely assessment and interpretation. Additionally, the user needs a corresponding capability that will assist imagery analysts in exploiting this new data in real time. DoD task forces have estimated that future assets in theater will provide a 5000X increase in the already unmanageable amount of sensor data produced. Data loads being produced by development systems, such as the DARPA-developed Autonomous Real-Time Ground Ubiquitous Surveillance Imaging System (ARGUS-IS), approach that of the human vision system, which sends up to 72 gigabytes of information per second to the brain.

Saliency is the perceptual quality that makes some items in the world stand out from their neighbors and immediately grab our attention. The human visual system receives and processes the eye's wide field-of-view imagery and determines regions that require "attention". This "saliency" processing uses size, shape, texture, motion, and color to determine regions of interest. The eye uses the high-definition fovea in the center of vision to examine the ROI, and this high-resolution data is processed for recognition and interpretation in the visual cortex.

PHASE I: Develop an architecture that demonstrates the saliency and classification/recognition functions, emulating the processes of the human brain. Design a processor incorporating innovative processing elements upon which the processing architecture can be instantiated. ISR data containing detectable dismounts shall be processed, and probability of detection (Pd) and probability of false alarm (Pfa) shall be estimated from processing in an emulation environment.
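The stated throughput and power metrics imply a budget of roughly 20,000 operations per incoming pixel (20 teraops against 10^9 pixels per second) at about 7.5 picojoules per operation (150 watts at 20 teraops). The sketch below is not the required flight processor; it is a minimal, single-frame illustration of saliency-driven ROI selection in the spirit of the Itti-Koch model (reference 4), using an intensity-only center-surround feature. The frame size, window size, ROI count, and Gaussian scales are illustrative assumptions.

```python
# Minimal sketch of saliency-driven ROI selection on one EO/IR frame.
# Intensity-only center-surround feature; all parameters are assumptions.
import numpy as np
from scipy import ndimage

def saliency_map(frame, center_sigma=2.0, surround_sigma=16.0):
    """Center-surround difference of Gaussians as a crude saliency measure."""
    f = frame.astype(np.float64)
    center = ndimage.gaussian_filter(f, center_sigma)
    surround = ndimage.gaussian_filter(f, surround_sigma)
    return np.abs(center - surround)

def select_rois(frame, max_rois=8, win=128):
    """Return (row, col, height, width) windows around the most salient peaks."""
    sal = saliency_map(frame)
    rois = []
    for _ in range(max_rois):
        r, c = np.unravel_index(np.argmax(sal), sal.shape)
        if sal[r, c] <= 0:
            break
        r0 = min(max(0, r - win // 2), frame.shape[0] - win)
        c0 = min(max(0, c - win // 2), frame.shape[1] - win)
        rois.append((r0, c0, win, win))
        sal[r0:r0 + win, c0:c0 + win] = 0      # inhibition of return
    return rois

# Example: a synthetic 1024x1024 frame with one bright, dismount-sized blob.
frame = np.zeros((1024, 1024))
frame[500:510, 700:708] = 255.0
print(select_rois(frame, max_rois=2))          # first ROI covers the blob
```

On a real WFOV stream, only the extracted windows (here 128x128 chips out of a 1024x1024 frame) would be annotated and transmitted, which is where the bandwidth reduction comes from.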
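For the Phase I scoring requirement, Pd and Pfa could be estimated by matching declared ROIs against truthed dismount locations in the emulation environment. The sketch below is one illustrative scoring convention, not a mandated method; the match radius and the count of false-alarm opportunities are assumptions.

```python
# Illustrative Pd/Pfa scoring: match declared ROI centers to truth locations.
def score_detections(declared, truth, match_radius=32.0, opportunities=1000):
    """declared, truth: lists of (row, col) centers; returns (Pd, Pfa)."""
    matched = set()
    false_alarms = 0
    for dr, dc in declared:
        hit = None
        for i, (tr, tc) in enumerate(truth):
            if i not in matched and ((dr - tr)**2 + (dc - tc)**2)**0.5 <= match_radius:
                hit = i
                break
        if hit is None:
            false_alarms += 1
        else:
            matched.add(hit)
    pd = len(matched) / len(truth) if truth else 0.0
    pfa = false_alarms / opportunities
    return pd, pfa

print(score_detections([(505, 704), (10, 10)], [(505, 704)]))  # (1.0, 0.001)
```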
PHASE II: The Phase I sensor processing architectural concept and designs shall be incorporated into a more detailed processor design, and performance shall be demonstrated. The emulation environment shall be modified to represent the processor. Processor performance shall be estimated across the spectrum of conditions. An Engineering Development Plan shall be developed.

PHASE III: Military applications include unmanned vehicles such as aircraft, ships, and tactical land vehicles. Missions in the civil and commercial markets include physical security for border, maritime, and port surveillance applications, search and rescue, and natural and man-caused disaster support.

REFERENCES:
1. J. G. Elias, H. H. Chu, and S. Meshreki, "A Neuromorphic Impulse Circuit for Processing Dynamic Signals," IEEE International Conference on Circuits and Systems, pp. 2208-2211, IEEE Press, 1992.
2. M. Simoni, G. Cymbalyuk, M. Sorenson, R. Calabrese, and S. DeWeerth, "A Multi-Conductance Silicon Neuron with Biologically Matched Dynamics," IEEE Trans. Biomed. Eng., pp. 342-354, IEEE Press, 2004.
3. J. Lin, P. Merolla, J. Arthur, and K. Boahen, "Programmable Connections in Neuromorphic Grids," 49th IEEE Midwest Symposium on Circuits and Systems, pp. 80-84, IEEE Press, 2006.
4. L. Itti and C. Koch, "Computational Models of Visual Attention," Nature Reviews Neuroscience, Vol. 2, pp. 194, 2001.
5. T. Serre, L. Wolf, S. Bileschi, M. Riesenhuber, and T. Poggio, "Robust Object Recognition with Cortex-like Mechanisms," IEEE Trans. Pattern Analysis and Machine Intelligence, Vol. 29, No. 2, pp. 441, 2007.