APEX: Adaptive and Peripheral surveillance fusion Engine
Title: Principal Scientist
Phone: (512) 342-0010
Email: sgreenblatt@21technologies.com
Title: COO
Phone: (512) 342-0010
Email: dtaylorz@21technologies.com
Contact: Moises Sudit
Address:
Phone: (716) 645-2357
Type: Nonprofit College or University
It is computationally infeasible to implement a complete Level 2/3 fusion solution that maintains detailed knowledge of all parts of an area of interest at all times. A staged approach is needed: the first stage performs real-time "peripheral vision" sensor analysis to identify asymmetric threats, and the second stage performs "foveal vision" sensor analysis that identifies, links, and validates the asymmetric threats flagged by the first stage. In this Phase I effort, named APEX (Adaptive and Peripheral surveillance fusion Engine), The State University of New York at Buffalo and 21st Century Technologies Inc. propose this two-stage approach to the adaptive/persistent surveillance problem.
• SUNY Buffalo's persistent surveillance fusion technology, INFERD (Information Fusion Engine for Real-Time Decision Making), provides real-time fusion capabilities that flag asymmetric activities of interest by geo-spatial grid cell. INFERD thus acts as the "peripheral vision" capability for the Level 2/3 fusion problem. Because INFERD operates at the millisecond level, it performs no validation step.
• When INFERD detects potentially interesting events, 21st Century Technologies' TMODS (Terrorist Modus Operandi Detection System) validates and links the activities of interest across multiple geo-spatial coordinates and, if appropriate, forwards the validated activity to an I&W (indications and warning) database.
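For illustration only, the sketch below shows how such a two-stage pipeline might be wired together. It is a minimal Python sketch, not the actual INFERD or TMODS software; every name in it (Activity, PeripheralDetector, FovealValidator, the threshold, and the linking rule) is a hypothetical assumption introduced here.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Activity:
    grid_cell: tuple   # (row, col) index into the geo-spatial grid
    timestamp: float
    kind: str          # e.g. "vehicle_stop", "perimeter_probe"

class PeripheralDetector:
    """Stage 1 ("peripheral vision"): flag a grid cell as soon as its
    activity count crosses a threshold; no validation is attempted."""
    def __init__(self, threshold=2):
        self.threshold = threshold
        self.cells = defaultdict(list)   # grid_cell -> activities seen there

    def observe(self, activity):
        self.cells[activity.grid_cell].append(activity)
        return len(self.cells[activity.grid_cell]) >= self.threshold

class FovealValidator:
    """Stage 2 ("foveal vision"): link flagged activities across
    coordinates and forward validated threats to an I&W store."""
    def __init__(self, iw_database):
        self.iw_database = iw_database   # stands in for the I&W database

    def validate_and_link(self, activities):
        # Toy linking rule: two or more activities of the same kind are
        # treated as one linked, validated threat.
        by_kind = defaultdict(list)
        for act in activities:
            by_kind[act.kind].append(act)
        for kind, group in by_kind.items():
            if len(group) >= 2:
                self.iw_database.append({"kind": kind, "activities": group})

# Wiring the stages together: stage 1 flags, stage 2 validates and forwards.
iw_db = []
peripheral = PeripheralDetector(threshold=2)
foveal = FovealValidator(iw_db)

for act in [Activity((4, 7), 0.0, "vehicle_stop"),
            Activity((4, 7), 1.5, "vehicle_stop")]:
    if peripheral.observe(act):
        foveal.validate_and_link(peripheral.cells[act.grid_cell])

print(iw_db)   # one linked threat forwarded to the I&W store
```

The point of the split is the same as in the proposal: the cheap, always-on first stage touches every grid cell in real time, while the more expensive linking and validation logic runs only on the cells the first stage flags.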
* Information listed above is at the time of submission. *