DHS/S&T SBIR DHS SBIR-2011.2
Available Funding Topics
- 001: Low Power Tri-axial Acoustic Sensor
- 002: Improved Wipes for Surface Sampling of Chemical Agents on Porous Materials
- 003: Mobile Device Forensics
- 004: Short Standoff Checkpoint Detection System for Explosives
- 005: Iris Image Quality Tool Suite for Biometric Recognition
- 006: Intelligent “Object” Symbology
001: Low Power Tri-axial Acoustic Sensor

The U.S. Customs and Border Protection (CBP) uses unattended ground sensor (UGS) units to detect personnel, vehicles, and aircraft engaged in illegal activity at the U.S. border. A UGS unit consists of: sensor(s) for detecting activity; a buried housing containing a processing unit that interprets the signals received from the sensor(s) and performs administrative and control tasks; a radio for communicating alarms back to a CBP Command Center; and a power supply. A UGS unit normally employs a microphone to detect acoustic energy generated by the target. Processing of the received acoustic signal can provide information on the time the target passed the UGS, and the target's speed and range. Line-of-bearing and track information in near real time is desirable for the Border Patrol (BP) to determine where a target engaged in illegal activity may be headed. With the current deployment of single-channel microphone UGS units, tracking information would have to be derived from multiple UGS units and correlation of their individual reports. Deploying multiple UGS units entails an operational impact, so it is desirable to develop a track solution from a single UGS.
A low-cost, low-power acoustic sensor that can provide directional information in both heading and altitude (for aircraft) and that can be integrated with existing UGS units is desired. It should be noted that the acoustic signature of some types of targets is narrowband in nature, such that correlation processing of multiple acoustic sensors at the UGS site (e.g., the Army's Boomerang sensor unit) is not deemed viable. A tri-axial acoustic sensor would provide the desired functionality for operating near the border. For a tri-axial acoustic sensor to be reasonable for deployment, its cost cannot greatly increase the overall cost of a deployed UGS unit. Commercially available UGS units range in cost from $2K to $10K depending upon the vendor. A tri-axial acoustic sensor also should not markedly increase the power consumption of the UGS: sensor power consumption should be on the order of three times that of a microphone sensor (ignoring increased processing requirements) when all three channels are used. The bandwidth of the directional acoustic sensor should be great enough to capture the acoustic signatures of the different types of targets.
Directional acoustic sensors proposed for this solicitation need to show that their sensitivity and self noise do not markedly decrease the detection range of a target in comparison to a single-channel microphone. Proposals also need to address the sensitivity of the sensor to wind noise and/or the ability to shield the sensor from wind noise. Finally, the sensor will require an alignment capability or an alignment procedure such that heading measurements can be related to absolute coordinates. It should be noted that UGS are often deployed at night in a covert fashion, and an alignment process that compromises the covertness of the deployment would not be acceptable.
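The line-of-bearing computation implied above can be illustrated with a time-averaged acoustic-intensity estimate. This is a minimal sketch, not a vendor algorithm: it assumes a pressure channel (e.g., the UGS's existing microphone) plus three orthogonal particle-velocity channels and a plane-wave source; real sensor designs and signal conditioning will differ.

```python
import math

def estimate_bearing(p, vx, vy, vz):
    """Estimate azimuth/elevation (degrees) of an acoustic source from a
    pressure channel p and three orthogonal particle-velocity channels.
    Uses the time-averaged acoustic intensity I_k = <p * v_k>, whose sign
    resolves the front/back ambiguity a pure energy measure would have."""
    n = len(p)
    ix = sum(a * b for a, b in zip(p, vx)) / n
    iy = sum(a * b for a, b in zip(p, vy)) / n
    iz = sum(a * b for a, b in zip(p, vz)) / n
    azimuth = math.degrees(math.atan2(iy, ix)) % 360.0
    elevation = math.degrees(math.atan2(iz, math.hypot(ix, iy)))
    return azimuth, elevation

# Synthetic plane wave arriving from azimuth 40 deg, elevation 10 deg:
az0, el0 = math.radians(40), math.radians(10)
ux, uy, uz = (math.cos(el0) * math.cos(az0),
              math.cos(el0) * math.sin(az0),
              math.sin(el0))
p = [math.sin(2 * math.pi * 50 * t / 1000) for t in range(1000)]
vx = [ux * s for s in p]
vy = [uy * s for s in p]
vz = [uz * s for s in p]
az, el = estimate_bearing(p, vx, vy, vz)  # recovers approximately (40.0, 10.0)
```

Note that the power-saving strategy suggested in Phase I maps naturally onto this formulation: detection can run on a single axis, with the remaining channels energized only when a track is needed.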
PHASE I: Provide detailed analysis of sensitivity, self noise, wind noise abatement, directional capability, power, packaging, alignment, and cost for the proposed sensor. Consider and analyze methods for reducing power consumption (e.g., initially detecting and classifying a target on a single axis and using multiple axes only for tracking). Verify the analysis with measured data obtained from both laboratory and field testing.
PHASE II: Work with a UGS manufacturer (approved by DHS S&T) to interface the directional acoustic sensor to its UGS units. The interface will involve both hardware and software modifications that will need to be performed by the UGS manufacturer. Laboratory test the prototype unit for self noise, directional accuracy, power draw, overall UGS power change from a single-channel microphone configuration, and alignment accuracy. Field test the prototype unit with the directional acoustic sensor against targets of interest and verify detection range and tracking capability.
PHASE III: COMMERCIAL APPLICATIONS: Refine sensor packaging from the Phase II field trials. Market and transition the directional sensor capability to UGS vendors used by DHS and also UGS vendors used by the U.S. Army and Marine Corps.
002: Improved Wipes for Surface Sampling of Chemical Agents on Porous Materials

The Department of Homeland Security (DHS) has a need for a novel surface wipe material that more efficiently removes low-volatility chemical agent contamination from porous and absorptive surfaces (e.g., uncoated and coated concrete, painted wallboard, unglazed ceramic tile) than current cellulosic-based, gauze-type wipe materials. The novel wipe material must further demonstrate the ability to quantitatively release agent, using conventional solvent-based extraction techniques, for analytical identification and quantification of agent contamination on a porous surface.
PHASE I: Development and laboratory proof-of-concept demonstration of: a) a novel surface wipe material system that, when compared to current wipe systems, reproducibly removes at least 45-50% more low-volatility chemical agent surrogate from the surface of porous materials (e.g., uncoated and coated concrete, painted wallboard, unglazed ceramic tile), and b) the ability to recover a near-quantitative amount of the low-volatility chemical agent surrogate from the novel wipe material system using conventional analytical extraction techniques. The demonstration should be conducted in a manner such that the comparison of results is statistically valid.
PHASE II: Laboratory demonstration of a novel wipe material system using operationally relevant substrates (e.g., concrete, brick, painted wallboard) that have been contaminated to varying levels and contact times with a low-volatility chemical agent. The same requirements apply for efficiency of agent removal and recovery of the compound of interest from the novel wipe material by traditional analytical extraction procedures. The successful offeror will be required to work with a laboratory capable of chemical agent operations.
PHASE III: COMMERCIAL APPLICATIONS: Transition of the technology to DHS S&T projects for response and recovery of critical infrastructure contaminated with a low volatility chemical.
003: Mobile Device Forensics

Within the area of mobile device forensics, the Department of Homeland Security (DHS) Science and Technology (S&T) Directorate is currently interested in three distinct facets of this complex problem area. Proposers can respond to any of the three sub-topics listed below (i.e., proposers may submit up to three different sub-topic proposals in response to this mobile device forensics topic).
Sub-topic 1. NAND/NOR Chip Forensics – Flash memory is now present in a variety of devices including: mobile phones, iPads, eReaders, thumb drives, picture frames, and laptops. Investigators require technology to effectively obtain information from flash memory (both NAND and NOR) chips in a forensically sound manner.
There are three issues for law enforcement in this area:
1. Reading the data stored on the chip
2. Reverse engineering of the wear-leveling algorithm
3. Mounting the file system
The developed capability is envisioned to be a lab tool that addresses all three of the above issues. This is not intended to be extended for field use at this time.
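As a concrete illustration of issue 2 above, rebuilding logical page order from a raw dump is often possible when the flash controller stores a logical page number and a write counter in each page's spare (out-of-band, OOB) area. The OOB layout below is hypothetical; real controllers use proprietary, frequently undocumented layouts, which is precisely why reverse engineering is called out as a challenge.

```python
def rebuild_logical_image(phys_pages):
    """phys_pages: list of (data, oob) pairs read in physical order.
    Assumes a hypothetical OOB layout: bytes 0-3 hold a little-endian
    logical page number, byte 4 a write/version counter.  Because of
    wear leveling, a logical page may appear at several physical
    locations; the copy with the highest version counter is the live one."""
    latest = {}  # logical page number -> (version, data)
    for data, oob in phys_pages:
        logical = int.from_bytes(oob[0:4], "little")
        version = oob[4]
        if logical not in latest or version > latest[logical][0]:
            latest[logical] = (version, data)
    # Emit pages in logical order to reconstruct a mountable image (issue 3).
    return b"".join(latest[k][1] for k in sorted(latest))

# Example dump: logical page 0 was rewritten (version 2 supersedes version 1).
dump = [
    (b"OLD0", bytes([0, 0, 0, 0, 1])),
    (b"PG1.", bytes([1, 0, 0, 0, 1])),
    (b"NEW0", bytes([0, 0, 0, 0, 2])),
]
image = rebuild_logical_image(dump)  # b"NEW0PG1."
```

Forensic soundness here comes from never modifying the dump: superseded copies (b"OLD0" above) remain available in the original acquisition and may themselves be of evidentiary value.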
Sub-topic 2. Bypassing PIN/PUK Codes – GSM, iDEN, world phones, and satellite phones use removable Subscriber Identity Module (SIM) and Micro-SIM cards to communicate on a cellular network. Data on the SIM typically includes: contact lists, call history, SMS messages, and subscriber information. SIM cards can be locked with a 4-digit Personal Identity Number (PIN) and an 8-digit Personal Unlocking Key (PUK) that disable direct access to, and examination of, data stored on the SIM; without the PIN, an investigator cannot directly access data stored on a locked card.
Law enforcement investigators require a tool to extract PIN and PUK codes from locked SIM cards.
Sub-topic 3. Disposable Cell Phone Analysis – Disposable phones (“throw-away,” “burner,” or “fast” phones) are frequently used by criminals because they are inexpensive and do not require a contract, credit card, or personal information. Most disposable phones are GSM based, but CDMA phones are also available; these handsets either lack external port access for retrieving information or prohibit access in some other fashion. Law enforcement requires a tool to extract information from disposable phones.
This sub-topic will focus on the development and demonstration of methods and tools that will allow an investigator to acquire all call logs, contacts, pictures, videos, and text messages stored within disposable cell phones. The goals of this effort are:
1. Demonstrate and implement the capability to acquire the full physical memory of the devices in a designated population of disposable cell phones in a forensically sound manner.
2. Demonstrate and implement the capability to efficiently examine (parse) acquired data from a designated population of disposable cell phones in a forensically sound manner.
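A first step toward goal 2, parsing an acquired image, is often simple carving of recognizable records from the raw dump. The pattern below (ASCII-encoded phone numbers) is purely illustrative of the carving approach; real handset storage (packed BCD fields, PDU-encoded SMS) requires format-specific decoders.

```python
import re

def carve_phone_numbers(raw):
    """Carve ASCII-encoded North American phone numbers out of a raw
    memory image (bytes).  Operates read-only on the acquisition, which
    is essential for forensic soundness."""
    pattern = rb"\b\d{3}[-.]\d{3}[-.]\d{4}\b"
    return [m.group().decode() for m in re.finditer(pattern, raw)]

# Synthetic dump fragment with two embedded numbers amid binary noise:
image = b"\x00\x01call log: 202-555-0172\xff\xfe notes 703.555.0199\x00"
numbers = carve_phone_numbers(image)  # ['202-555-0172', '703.555.0199']
```

A production parser would layer structured decoders (call-log records, SMS PDUs, filesystem metadata) over the same read-only acquisition, but the carve-then-decode workflow shown here is the common skeleton.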
PHASE I:
Sub-topic 1. NAND/NOR Chip Forensics – Design a method for comprehensive chip reader and memory parser for NAND and NOR flash memory chips.
Sub-topic 2. Bypassing PIN/PUK Codes – Design a method for a forensically sound tool that will successfully unlock SIM cards by recovering PIN and PUK codes from locked SIM cards.
Sub-topic 3. Disposable Cell Phone Analysis - Design a method to acquire physical memory from a designated population of disposable cell phones in a forensically sound manner.
PHASE II:
Sub-topic 1. NAND/NOR Chip Forensics – Demonstrate and implement hardware and software applications for development of a comprehensive chip reading and memory parsing tool for NAND and NOR flash memory chips. The tool should be developed for law enforcement and forensic examiner use and, where possible, should be delivered as open source technology.
Sub-topic 2. Bypassing PIN/PUK Codes – Demonstrate and implement hardware and software applications for development of a forensically sound tool that will successfully unlock SIM cards by recovering PIN and PUK codes from locked SIM cards. The tool should be developed for law enforcement and forensic examiner use and, where possible, should be delivered as open source technology.
Sub-topic 3. Disposable Cell Phone Analysis – Demonstrate and implement hardware and software tools required to acquire and efficiently examine physical memory data from a designated population of disposable cell phones in a forensically sound manner. The tool should be developed for law enforcement and forensic examiner use and, where possible, should be delivered as open source technology.
PHASE III: COMMERCIAL APPLICATIONS: All sub-topics - The final developed tools will be marketable to a wide variety of Federal, State, and local law enforcement agencies. It is anticipated that those tools delivered as open source technology will require support, custom extensions, and additional applications as new mobile device technologies are commercially introduced.
004: Short Standoff Checkpoint Detection System for Explosives

Checkpoint security incorporates a wide variety of screening technologies and processes to detect person-borne threats and illicit objects, including weapons and explosives. Individuals attempting to circumvent checkpoint security have resorted to a variety of techniques to avoid detection, including hiding threat or illicit objects, but minute quantities of trace explosives may remain on their person or baggage.
Techniques for the non-contact trace detection of explosive particles on a person’s body, clothing or baggage are being sought for deployment at security checkpoints. The development of novel methods for this short standoff (<1 m) detection without the collection of explosive particles or vapors is encouraged.
The Transportation Security Administration (TSA) screens passengers and their carry-on baggage at airport checkpoints prior to departure gate access. Checked baggage and 100 percent of air cargo carried on passenger aircraft are also screened for explosives. U.S. airlines set an annual record by carrying 769.4 million scheduled domestic and international passengers on their systems in 2007. A TSA checkpoint screens about 200 passengers per hour. TSA is deploying millimeter wave and backscatter x-ray Advanced Imaging Technologies (AITs) to enable Transportation Security Officers to detect non-metallic anomalies located under clothing. In order to minimize the potential footprint and throughput impact of a short standoff explosives particle detector system, it is envisioned that it would be integrated into the AIT system and would operate concurrently with the AIT, requiring no more than 20 seconds to scan a full body or piece of baggage, analyze, and alarm on any surface explosive particles. The system must be eye-safe for both operator and subject.
Emerging technologies in the area of optical (vibrational, rotational, and electronic) spectroscopy offer the potential for a new type of non-contact trace detection system for explosive particles that could be readily integrated into an aviation security passenger checkpoint. While proven technologies such as Reflectance Infrared Spectroscopy and Raman Spectroscopy may provide the desired capabilities, other innovative approaches will also be considered for a short standoff checkpoint explosives detection system.
Such a capability would be of interest to various DHS components and other security forces.
PHASE I: Design a non-contact trace detection system for explosive particles on a person’s body, clothing, or baggage and construct “breadboard” components as necessary to characterize potential system performance. Describe the minimum detectable trace residue (particle or vapor) at a range of 50 cm against a user-defined set of military, commercial, and homemade explosives (e.g., C4, TNT, NG, TATP, RDX, and PETN). Evaluate the effect of different surfaces and the environment on the ability to detect distinct signatures. Deliver a detailed report on the design, including acquisition and operating costs for a short standoff explosives detection system capable of being integrated into an AIT system.
PHASE II: Fabricate and demonstrate a prototype of the Phase I short standoff checkpoint explosives detection system for sensitivity and selectivity. Verification and validation in the prototype demonstration shall be achieved through empirical analysis, simulations, and/or other quantitative means. This analysis shall include, but not be limited to, the probability of detection and false detection rate (characterized by a Receiver Operating Characteristic, or ROC, curve) for a user-defined set of military, commercial, and homemade explosives in an operational environment. Deliver a detailed report of this effort and its results.
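The ROC characterization called for in Phase II can be computed directly from raw detector scores. A minimal sketch follows, using synthetic scores only; the threshold-sweep logic is the standard construction, not a prescribed evaluation protocol.

```python
def roc_points(threat_scores, benign_scores):
    """Sweep a decision threshold over all observed detector scores and
    return (false_alarm_rate, probability_of_detection) pairs.  An alarm
    is declared when a score meets or exceeds the threshold; lowering the
    threshold raises both Pd and the false alarm rate."""
    thresholds = sorted(set(threat_scores) | set(benign_scores), reverse=True)
    points = [(0.0, 0.0)]  # infinitely strict threshold: no alarms at all
    for t in thresholds:
        pd = sum(s >= t for s in threat_scores) / len(threat_scores)
        far = sum(s >= t for s in benign_scores) / len(benign_scores)
        points.append((far, pd))
    return points

# Well-separated score distributions trace an ideal curve through (0.0, 1.0):
pts = roc_points(threat_scores=[0.9, 0.8, 0.95], benign_scores=[0.1, 0.2, 0.3])
```

In an operational evaluation the score lists would come from instrumented trials against the user-defined explosives set, with enough samples per class to put confidence bounds on each (far, Pd) point.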
PHASE III: COMMERCIAL APPLICATIONS: Incorporate lessons learned in Phase II and develop a mature short standoff checkpoint explosives detection system for independent evaluation. Adoption of this short standoff checkpoint explosives detection system will depend upon performance and cost. Checkpoint screening is ubiquitous, and the potential market for a robust system is considerably greater than that offered by DHS components.
005: Iris Image Quality Tool Suite for Biometric Recognition

Biometric system performance depends on the quality of the acquired input samples. If sample quality can be improved, whether by sensor design, user interface design, or standards compliance, better performance can be realized. For those aspects of quality that cannot be designed-in, an ability to analyze the image and identify recognition-related defects and problems is needed. The ability to quickly acquire knowledge about sample quality can aid in deciding whether to initiate reacquisition from a subject, in real-time selection of the best sample from a set of samples, and in the selective invocation of different processing methods. Automated quality assessment is also useful for other purposes. In an enterprise context, image quality assessments can reveal trends, or time and location specific instances of poor image capture practices. For example, if operational collections consistently produce poor iris quality samples, it can indicate that additional training is needed or that screening may be better served by using another biometric modality.
Quality analysis is a technical challenge because it is most helpful when measures reflect the performance sensitivities of one or more target biometric matchers. To help facilitate universal interoperability of iris data across cameras and matchers, the National Institute of Standards and Technology (NIST), supported by the Department of Homeland Security, completed the Iris Exchange (IREX) II Iris Quality Calibration and Evaluation (IQCE). This work identifies iris image properties that influence different vendors’ recognition accuracy and quantifies their effects. It is expected that this work will inform the development of the international standard ISO/IEC 29794 Biometric sample quality - Part 6: Iris image data.
This topic area seeks to enable innovative research that will contribute to efforts to develop, evaluate, and publish precise computation methods for quality metrics and increase the availability of commercial grade iris image quality software tools capable of operating on embedded, desktop, and server platforms.
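One of the simpler image properties examined in IQCE-style evaluations is focus/sharpness. The gradient-energy measure below is a generic stand-in for illustration only; it is not the standardized computation this topic asks offerors to develop, and it operates on a plain nested-list grayscale representation rather than a real iris capture.

```python
def sharpness(image):
    """Mean squared horizontal + vertical finite-difference gradient of a
    grayscale image (list of equal-length rows of pixel intensities).
    Defocus suppresses high spatial frequencies, so blurrier captures
    score lower on this metric."""
    h, w = len(image), len(image[0])
    total = 0
    for r in range(h - 1):
        for c in range(w - 1):
            dx = image[r][c + 1] - image[r][c]
            dy = image[r + 1][c] - image[r][c]
            total += dx * dx + dy * dy
    return total / ((h - 1) * (w - 1))

# A textured (in-focus) patch scores higher than a featureless one:
sharp = [[(r * c) % 255 for c in range(32)] for r in range(32)]
flat = [[128] * 32 for _ in range(32)]
```

A deployable tool suite would combine many such metrics (occlusion, gaze, dilation, usable iris area, etc.), each calibrated against matcher error rates, and expose them through a common API across embedded, desktop, and server builds.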
PHASE I: Develop, evaluate, and document precise computation methods for quality metrics that are identified by IQCE to influence recognition performance.
PHASE II: Develop a suite of software tools designed to function on a variety of platforms to provide image quality assessment information implementing the quality metrics established in Phase I.
PHASE III: COMMERCIAL APPLICATIONS: Optimize and validate computation methods that yield quantitative scores that predict cross-camera and cross-matcher recognition outcomes on a variety of hardware platforms. Develop, market, and license software packages tailored to operate on different platforms that will perform this function.
006: Intelligent “Object” Symbology

The Department of Homeland Security (DHS) is committed to using cutting-edge technologies and scientific talent in its quest to make America safer. ANSI INCITS 415-2006, Homeland Security Mapping Standard - Point Symbology for Emergency Management, published by the InterNational Committee for Information Technology Standards (INCITS), establishes point symbols focused exclusively on the emergency management and emergency responder communities. DHS includes component organizations such as FEMA, CBP, USCG, and NPPD, whose missions encompass areas outside emergency management, such as law enforcement and intelligence analysis. These missions require symbology standards and techniques above and beyond what is provided by ANSI INCITS 415-2006.
The Science and Technology (S&T) Directorate is working on the next generation of geospatial web mapping and symbology technologies to identify challenges and develop recommendations on mission-relevant applied research to address them. DHS has a wide “spectrum” of missions. As a result, a one-size-fits-all symbology standard for the entire department is not practical in an environment with such broad-ranging practices, methodologies, and applications across the components. The current practice requires users to identify data, extract information, create or reuse existing symbology, and manually apply these symbols within geospatial applications. Additionally, end users have to manually link information or metadata to symbology in order for it to be interpretable and interoperable. This current practice is time consuming and creates room for error.
There is a need for an automated, dynamic application that ingests, processes, and analyzes data from a combination of sources and presents interpreted data in a meaningful way, providing the end user with valuable and accurate information relevant to their specific needs or functions. To address these challenges, DHS is looking at the concept of Intelligent Symbology: an active, dynamic object or agent that is able to interpret information and automatically provide the appropriate symbol to the end user. In other words, this application technology will search and assess data (structured/unstructured text) from pre-defined data sources such as law enforcement and public safety RSS feeds, bulletins, and other applications for key words and geographic coordinates (e.g., “fire at 500 Elm Street”) and identify the appropriate symbology to accurately represent the feature and event on a viewer or geospatial application.
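The keyword-to-symbol step described above can be sketched as a simple rule-based extractor. The symbol identifiers, keyword table, and address pattern here are illustrative assumptions, not drawn from ANSI INCITS 415-2006; a real implementation would resolve identifiers against a vetted, cataloged symbol dictionary as the topic assumes.

```python
import re

# Hypothetical mapping from event keywords to symbol identifiers.  Real IDs
# would come from the ANSI INCITS 415-2006 catalog or a component extension.
SYMBOL_DICTIONARY = {
    "fire": "INCIDENT/FIRE",
    "flood": "INCIDENT/FLOOD",
    "shooting": "INCIDENT/LAW_ENFORCEMENT",
}

def symbolize(text):
    """Scan free text for a known event keyword and a street address,
    returning (symbol_id, location) or None when nothing matches."""
    for keyword, symbol in SYMBOL_DICTIONARY.items():
        if keyword in text.lower():
            m = re.search(
                r"\bat\s+(\d+\s+[A-Za-z][\w\s]*?(?:Street|St|Ave|Road|Rd))\b",
                text)
            location = m.group(1) if m else None
            return symbol, location
    return None

result = symbolize("Fire at 500 Elm Street, units responding")
# result -> ("INCIDENT/FIRE", "500 Elm Street")
```

The dynamic-attribution behavior described in the next paragraph would sit on top of this: re-running the extractor as feeds update and changing the symbol's state attributes rather than emitting a new symbol each time.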
This effort goes beyond just analyzing and tagging “free” text and geographic coordinates; it uses dynamic symbology and attribution. For example, the state of a symbol is dynamically updated to reflect relevant changes to the underlying data within a specified time window. Using the same fire example mentioned above, the application technology would be able to analyze data sources to determine the current status of the fire incident and its effect on occupants or surrounding areas, and find the appropriate fire symbol and dynamic attribution to accurately represent this information. Secondly, the application technology should be able to determine data “pedigree,” which would allow the end user to extract additional information from a symbol at the application level. That is, at the application level, the user should be able to view all metadata and information related to an incident, its location, and the data sources, thereby allowing the user to validate the integrity of the data and determine how to safely and effectively respond to the incident.
Essentially, this Intelligent “Object” would actively query data sources for specified types of data that can be identified, symbolized and geo-coded to support business intelligence, mashups and data analytics on the content. The results will provide: 1) meaningful information to the end user, 2) the appropriate symbology automatically portrayed to represent that data (it is assumed that the symbols are vetted and cataloged in the data dictionary), and 3) the ability for the end user to drill down on a symbol to access more information pertaining to the incident or event.
PHASE I: Research and assess the concept proposed, and work in close collaboration with DHS to develop various approaches and methodologies for ingesting, integrating, analyzing, and representing data within the context of geospatial intelligence and information sharing. Use the ANSI INCITS 415-2006 as the base standard. Develop and demonstrate two or more proof of concept applications or widgets that support the chosen methodologies. Propose relevant performance metrics to be used in assessing the accuracy of the proposed approach.
PHASE II: Develop and deliver a working prototype application that is compatible with popular browsers and geospatial applications from major vendors such as ESRI, Microsoft, and Google. Provide and test the prototype application and related source code, documentation, data, and other necessary artifacts as required for migration of the prototype application into the DHS Geospatial Information Infrastructure (GII). Implement agreed-upon performance metrics to be used in assessing the accuracy of the proposed approach.
PHASE III: COMMERCIAL APPLICATIONS: Further develop the prototype into a working product and provide it as a plug-in service or widget to major geospatial vendors such as ESRI, Microsoft, and Google.