AN/ALE-47(V) Software Test Environment Automated Scenario and Mission Data File Test Generator Software

Award Information
Agency: Department of Defense
Branch: Navy
Contract: N68335-18-C-0513
Agency Tracking Number: N181-025-0735
Amount: $124,757.00
Phase: Phase I
Program: SBIR
Solicitation Topic Code: N181-025
Solicitation Number: 18.1
Timeline
Solicitation Year: 2018
Award Year: 2018
Award Start Date (Proposal Award Date): 2018-05-24
Award End Date (Contract End Date): 2019-10-09
Small Business Information
4401 Wilson Boulevard, Suite 810, Arlington, VA, 22203
DUNS: 606926769
HUBZone Owned: N
Woman Owned: N
Socially and Economically Disadvantaged: N
Principal Investigator
Frank Monte
Sr. Software Engineer I
Phone: (703) 807-0055
Email: fmonte@idtus.com
Business Contact
Teddy Kidd
Phone: (703) 522-4032
Email: tkidd@idtus.com
Research Institution
N/A
Abstract
The ALE-47 Countermeasures Dispenser Set (CMDS) is installed on almost all Navy aircraft and is a vital component of the Aircraft Survivability Equipment (ASE) Suite. Each Type/Model/Series of aircraft uses a Mission Data File (MDF), mission-critical software that must undergo extensive testing prior to fielding. Each MDF must be tested against a different set of system inputs across different scenarios and conditions. The current manual testing process, which includes platform-specific script generation, manual operation of the Software Test Environment (STE), and manual validation that the system meets performance specifications, can benefit greatly from automation to reduce cost and schedule and to increase test coverage. This proposal addresses these challenges through the unique application of end-to-end automation across the entire lifecycle of MDF testing. The automated testing solution is centered on a system model that defines the overall behavior of the ALE-47 system. This model, along with recorded test data, is used to generate automated test artifacts, including: automated STE scenario scripts; test execution flows that automatically operate the STE during test events; analysis cases for requirements verification; a System Requirement Verification Matrix (SRVM) that provides test validation; and automatically generated Objective Quality Evidence (OQE) reports.
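The abstract describes model-driven generation of test artifacts but does not specify an implementation. As a minimal illustrative sketch only, the Python fragment below shows one way a behavior model could drive scenario generation, STE script emission, and SRVM-style verification. The model contents, script syntax, and requirement identifiers are all hypothetical assumptions, not the actual ALE-47 MDF or STE formats.

from dataclasses import dataclass

# Hypothetical behavior model: maps (threat class, dispense mode) to the
# expected countermeasure sequence. Contents are illustrative only and do
# not reflect the real ALE-47 MDF format.
EXPECTED_RESPONSE = {
    ("radar", "auto"):   ["chaff", "chaff"],
    ("radar", "manual"): ["chaff"],
    ("ir", "auto"):      ["flare", "flare", "flare"],
    ("ir", "manual"):    ["flare"],
}

@dataclass
class Scenario:
    threat: str
    mode: str
    expected: list

def generate_scenarios(model):
    """Enumerate one test scenario per (threat, mode) entry in the model."""
    return [Scenario(t, m, seq) for (t, m), seq in model.items()]

def emit_ste_script(scn):
    """Render a scenario as a hypothetical STE scenario-script stanza."""
    return (f"SET MODE {scn.mode.upper()}\n"
            f"INJECT THREAT {scn.threat.upper()}\n"
            f"EXPECT DISPENSE {' '.join(scn.expected).upper()}")

def srvm_row(scn, recorded):
    """Compare recorded dispense events to the model's expectation and
    return an SRVM-style (requirement id, pass/fail) row."""
    return (f"REQ-{scn.threat}-{scn.mode}", recorded == scn.expected)

if __name__ == "__main__":
    for scn in generate_scenarios(EXPECTED_RESPONSE):
        print(emit_ste_script(scn))
        # Recorded data would normally come from an instrumented STE run;
        # the expected sequence stands in here so the example is runnable.
        print(srvm_row(scn, scn.expected), "\n")

In the proposed end-to-end flow, the same model would also drive automated test execution and OQE report generation; this sketch covers only the script-generation and verification steps.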

* Information listed above is at the time of submission. *
