DoD 2013.2 SBIR Solicitation

Agency: Department of Defense
Branch: Defense Health Program
Program/Year: SBIR / 2013
Solicitation Number: 2013.2
Release Date: April 24, 2013
Open Date: May 24, 2013
Close Date: June 26, 2013
DHP13-001: Humeral Head Intraosseous Training System
Description: OBJECTIVE: To develop a simulation-based training system to support teaching and training in the use of intraosseous (IO) devices in the humeral head to administer fluid to patients at the point of injury. DESCRIPTION: Over the past few years, the British Medical Emergency Response Team (MERT) and the US Air Force Search and Rescue Unit (a.k.a. PEDRO) have been administering fluids to patients at the point of injury and en route through the use of intraosseous (IO) devices in the humeral head. The MERT includes an Emergency Medicine residency-trained physician. The PEDRO includes pararescue-trained medical providers who are afforded the opportunity to train on cadavers prior to deployment. The US Army Center for Predeployment Medicine (CPDM) at Fort Sam Houston, TX, provides predeployment medical training to providers of all levels as mandated by Surgeon General Executive Order 096-09. CPDM currently does not have an adequate shoulder training model for the humeral head intraosseous device. Consequently, the shoulder IO device has not been placed in the US Army Medical Equipment Sets (MES), to avoid issuing a life-saving medical device without proper training. It is cost-prohibitive to use cadavers for the number of students attending CPDM each year, and it is not logistically feasible to train all providers in Emergency Medicine. Without an adequate training model for deploying providers, the US Army is currently behind the power curve in treating combat-wounded service members on the modern battlefield with regard to use of the humeral head IO. Research conducted under this effort should focus on the development and evaluation of a low-cost, simulation-based shoulder training capability for placement of an intraosseous device in the humeral head. The system should: - Support established training objectives. - Provide a capability to judge proficiency. - Support practice of both cognitive and psychomotor skills. 
- Simulate the entire torso from sternal notch to just below the umbilicus, with complete bilateral upper extremities. - Allow for proper positioning of the affected upper extremity prior to insertion of the device. - Afford training on both left and right humeral heads. - Have flexible upper extremities, including elbows, to allow positioning of the hand over the umbilicus to ensure proper position for device insertion. - Allow for improper positioning of the extremity by the student, affording an opportunity for remedial training. - Include palpable anatomical landmarks to determine the proper insertion site of the device. - Include injectable simulated bone to confirm placement and flow. - Have replaceable humeral head simulated bones. PHASE I: Conduct a six-month effort to analyze the scientific, technical, and commercial merit and feasibility of using a low-cost medical simulator for training Army medical personnel of all levels in Army Combat Training Schools. Proposed work will include research into the feasibility of developing the capability and a description of the overall concept. We seek innovative and novel ideas for exploration of concepts to provide this training environment. Identify the innovative technologies being considered, technical risks of the approach, costs, benefits, and a notional schedule associated with development and demonstration of the prototype. The simulator solution must be hands-on, low-cost, and realistic for use in the current training Program of Instruction (POI) at the US Army Center for Pre-Deployment Medicine (CPDM) at Fort Sam Houston, Texas. PHASE II: Develop and demonstrate a prototype system based on the solution recommended in Phase I. Provide realistic and meaningful interaction for hands-on treatment. The prototype should provide immediate student feedback without the aid of an on-site instructor. Evaluation of the proposed system is required. Data from these studies must be provided, analyzed, and presented in a final report. 
PHASE III DUAL USE APPLICATIONS: This system could be used in a broad range of military and civilian medical training applications. Demonstrate the application of this system to civilian hospitals, paramedics, and other military medical personnel. REFERENCES: 1. Carness, J.M., Russell, J.L., Lima, R.M., Navarro, L.H., & Kramer, G.C. (2012). Fluid resuscitation using the intraosseous route: Infusion with lactated Ringer's and hetastarch. Military Medicine, 177(2), 222-228. 2. Harcke, H.T., Crawley, G., Mabry, R., & Mazuchowski, E. (2011). Placement of tibial infusion devices. Military Medicine, 177(7), 824-827. 3. Hock, M.E., Chan, Y.H., Oh, J.J., & Ngo, A.S. (2009). An observational, prospective study comparing tibial and humeral intraosseous access using the EZ-IO. American Journal of Emergency Medicine, 27, 8-15. 4. Paxton, J.H., Knuth, T.E., & Klausner, H.A. (2009). Proximal humerus intraosseous infusion: A preferred emergency venous access. The Journal of Trauma: Injury, Infection, and Critical Care, 67(3), 606-611. 5. Sarkar, D., & Philbeck, T. (2009). The use of multiple intraosseous catheters in combat casualty resuscitation. Military Medicine, 174(2), 106-108.
DHP13-002: Automated Non-Invasive Cognitive Load Assessment for Medical Training Effectiveness and Safety
Description: OBJECTIVE: Effective team performance is critical during medical emergencies and combat trauma situations. The goal is to make medical team training exercises more useful to participants and more readily interpretable by instructors. The desired result is improved capability to measure -- automatically & noninvasively -- team performance, team dynamics, individual performance, individual cognitive load and team cognitive load balance. Improved medical team training assessment will likely result in more effective and efficient training, better performance and improved lifesaving capabilities. DESCRIPTION: The US Navy has successfully used noninvasive physiology data during submarine training simulations to assess individual performance, team function and team balance using technologies such as electroencephalogram (EEG), heart rate, galvanic skin response and other indicators. This topic seeks proposals that employ such an approach, or similar approaches, to support better instructor understanding of medical team training performance without increasing the task load on the instructors. Deliverables will include design and construction of a basic prototype, user testing, construction of an advanced prototype and a validation study approved by second-level DoD Institutional Review Boards (IRB). Approaches that integrate with current real-time instructor evaluation systems are encouraged but not required. It is desired that the advanced training assessment system perform the following functions automatically and noninvasively: - Measure individual cognitive load (and other factors) during simulations - Measure team cognitive load balance (and other factors) during simulations - Identify team members exhibiting low workload and engagement - Obtain correlation data between noninvasive measurements (EEG, instructor evaluation, heart rate, etc.) and vigilance, attention and performance tasks. 
- Accept input from instructors during the event to mark important events or performance - Provide automated interpretations and useful graphical outputs of assessment information Additionally, research should determine: - The utility of this approach and specifics on what such a system can and cannot measure - Whether less engaged team members are less effective - Whether / how we can identify team members who are struggling - How we can distinguish a well-integrated, well-performing team from a less effective one using these assessments - How assessment data correlates with the instructors' assessments - The best technologies for obtaining these measurements - The most effective ways this technology can be commercialized PHASE I: Design and construct a basic team training assessment system prototype. Conduct informal user testing with a small number of Subject Matter Experts (SMEs) to internally validate the approach and receive relevant feedback. No trials/studies requiring Institutional Review Board (IRB) approval will be accepted during Phase I. Design a well-thought-out research study (for Phase II, if invited to submit a Phase II proposal) to test your hypothesis. Report findings. A face-to-face review will be held around the 5th or 6th month at Fort Detrick, Maryland, to present and demonstrate the proof of concept developed and the results obtained to that point. Submitters should budget for that. PHASE II: Develop an advanced and functional medical team training assessment system prototype. Complete the validation research study design. Submit the research protocol to IRBs for approval. Perform the validation study and publish results. Report findings. PHASE III: This capability is expected to result in a vision for, and described end-state of, a system with improved capability to measure -- automatically & noninvasively -- team performance, team dynamics, individual performance, individual cognitive load and team cognitive load balance. The capability could apply to military and civilian applications. 
The proposal should identify one or more Phase III military applications or acquisition programs and the likely path(s) for transition from research to operational capability. It should identify EITHER (a) one or more potential commercial applications OR (b) one or more commercial technologies that could potentially be inserted into defense systems as a result of this SBIR project. References: 1. Kohn LT, Corrigan JM, Donaldson MS, eds. To Err is Human: Building a Safer Health System. Washington DC: National Academy Press; 2000. 2. Berka, C., Levendowski, D.L., Lumicao, M.N., Yau, A., Davis, G., Craven, P. (2007). EEG correlates of task engagement and mental workload in vigilance, learning, and memory tasks. Aviation, Space, and Environmental Medicine, 78(5), Section II, B231-244. 3. Stevens, R., Galloway, T., Wang, P., Berka, C. (2012). Cognitive neurophysiologic synchronies: What can they contribute to the study of teamwork? Human Factors: The Journal of the Human Factors and Ergonomics Society, 54(4), 489-502. 4. Behneman, A., Berka, C., Stevens, R., Raphael, G. (2012). Neurotechnology to accelerate learning. IEEE Pulse, February 2012, 60-63.
DHP13-003: Long-lasting Disposable Insecticidal / Repellent Fabric Barrier for Personal or Area Protection Against Biting Arthropods
Description: OBJECTIVE: Develop a fabric barrier with a long-lasting repellent and/or insecticide for protecting deployed personnel against biting arthropods. The product must have potential for EPA registration and use compounds with low mammalian toxicity. DESCRIPTION: Protection of deployed ground forces from disease-carrying insects requires the immediate and safe use of insecticides, repellents and bednets. Vector-borne diseases transmitted by insects, such as malaria, dengue and the leishmaniases, are on the increase world-wide and are more of a threat to our military forces today than they were 30 years ago. Unfortunately, insects that transmit militarily important diseases are becoming resistant to an increasing number of public health insecticides used for indoor residual sprays and space spray (aerosol) applications, with little global R&D underway to search for replacements. Moreover, topical repellents and bednets approved for military use have a low compliance rate. One potential solution to improving compliance is to supplement the military's current personal protective system, found in the AFPMB's Technical Guide 36, "Personal Protective Measures Against Insects and Other Arthropods of Military Significance," with an effective personal or area repellent. Such an area repellent or repellent/toxicant combination would reduce the need for a bed net and for the use of topical repellents and permethrin-treated uniforms. Bed nets are commonly deemed too restrictive by military members, and those with a mesh designed for mosquito exclusion are useless against tiny phlebotomine sand flies, which vector leishmaniasis and can easily crawl through the standard bed net mesh. 
Backed by a recent publication (Ogoma et al., 2012) showing excellent efficacy, one potential solution with global implications would be the use of a narrow strip of natural fiber cloth, cord or tarp/blanket treated with a volatile insecticidal repellent, which successfully protected users from insect bites for up to 6 months. This could be combined with insecticidal camouflage netting (Britch et al., 2011) to provide a barrier treatment for protection of personnel in forward deployment locations. While the U.S. does have some commercially available spatial repellent products, such as those containing allethrin (e.g. ThermaCELL), metofluthrin (e.g. OFF! Clip-On) and potentially transfluthrin, they have limited applications because they are not EPA-registered for indoor use and require frequent (~every 4 hours) replacement of the toxicant/repellent strips, battery or butane cartridge used for heat. Plug-in and passive devices are available globally for protection against biting insects, but none are currently registered for use in the United States and are therefore unavailable to the military. Additionally, there is little independent efficacy data showing that the globally available spatial repellent devices protect humans from biting insects. Therefore, the military needs a commercially available, EPA-registered spatial repellent that prevents insect bites from potential pathogen-carrying vectors for several months. The DoD Armed Forces Pest Management Board (AFPMB) has identified the need for additional and improved spatial and area repellents as a key strategy in the prevention of vector-borne disease transmission to deployed U.S. forces and listed this requirement as one of its top twenty research priorities for military entomology (AFPMB 2011). New spatial repellent barriers will provide deployed military ground forces a valuable additional tool to control insects that impact military operations. 
A cost-effective, disposable, narrow treated cloth, cord, tarp or mat that prevents insect bites, yet does not obscure vision of incoming threats or add significant weight or volume to a deployed military member's field kit, will fill a major gap in the military's ability to stop or prevent nuisance biting or disease vectors during military operations. PHASE I: This phase of the SBIR should focus on laboratory development of a safe, effective barrier cloth, cord, mat, or camouflage netting containing spatial repellents/toxicants that protects a sleeping or stationary individual user from biting insects such as mosquitoes or phlebotomine sand flies. The item must have the potential for EPA registration, have a shelf life of several years and provide protection for a minimum of 3 weeks under typical military conditions. PHASE II: During the Phase II portion of this SBIR, the awardee should develop the prototype spatial repellent/toxicant barrier formulation for semi-field and field testing against biting insects such as mosquitoes and sand flies. To the maximum extent possible, the product must be entered into the EPA registration process. PHASE III: The proposed SBIR has commercial applications outside of the military. This sort of novel vector control device could be used in global public health to protect vulnerable individuals from biting insects and the pathogens they carry. At the completion of a successful Phase II, the company should either develop its own manufacturing operation or seek funding from a private company for product commercialization. The product could also be supported through advanced development funding. The product resulting from this SBIR should be considered for NSN assignment so that it may be readily purchased by military and other US governmental organizations. In addition, the product should be considered for inclusion in a military member's deployment kit. REFERENCES: 1. Armed Forces Pest Management Board. 2011. 
Department of Defense Research Requirements for Pest Management for FYs 2011-2012, signed 24 May 2011. 2. Armed Forces Pest Management Board. 2009. Technical Guide 36 - Personal Protective Measures Against Insects and Other Arthropods of Military Significance (http://www.afpmb.org/sites/default/files/pubs/techguides/tg36.pdf). 3. Achee, N.L., M.R. Sardelis, I. Dusfour, K.R. Chauhan, and J.P. Grieco, 2009. Characterization of spatial repellent, contact irritant, and toxicant chemical actions of standard vector control compounds. J. Am. Mosq. Contr. Assoc. 25: 156-167. 4. Argueta, T.B.O., H. Kawada, and M. Takagi. 2004b. Spatial repellency of metofluthrin-impregnated multilayer paper strip against Aedes albopictus under outdoor conditions, Nagasaki, Japan. Med. Entomol. Zool. 55: 211-216. 5. Britch, S.C., K.J. Linthicum, W.W. Wynn, et al., 2011. Longevity and efficacy of bifenthrin treatment on desert-pattern US military camouflage netting against mosquitoes in a hot-arid environment. J. Am. Mosq. Control Assoc., 27: 272-279. 6. Burkett, D., S.E. Cope, D.A. Strickman, G.B. White, 2011. The Deployed War-Fighter Protection Program: New Public Health Pesticides, Application Technology, and Repellent Systems, in W.J. Sames, D.E. Bowles, R.G. Robbins, S.E. Cope, eds: DoD Entomology: Global, Diverse, and Improving Public Health. Proceedings of the DoD Symposium at the 54th Annual Meeting of the Entomological Society of America, 12-16 December 2010, San Diego, CA. Armed Forces Pest Management Board, pp. 11-30. 7. Kawada, H., Y. Maekawa, S. Tsuda, M. Takagi, 2004. Trial of spatial repellency of metofluthrin-impregnated paper strips in shelters without walls in Lombok Island in Indonesia. J. Am. Mosq. Control Assoc. 20: 434-437. 8. Kawada, H., Y. Maekawa, M. Takagi, 2005. Field trial on the spatial repellency of metofluthrin-impregnated plastic strips for mosquitoes in shelters without walls (beruga) in Lombok, Indonesia. J. Vec. Ecol. 30: 181-185. 9. 
Kawada H, Maekawa Y, Tsuda Y, Takagi M. 2004. Laboratory and field evaluation of spatial repellency with metofluthrin-impregnated paper strip against mosquitoes in Lombok Island, Indonesia. J Am Mosq Control Assoc. 20:292-298. 10. Ogoma SB, H Ngonyani, ET Simfukwe, A Mseka, J Moore, and GF Killeen. 2012. Spatial repellency of transfluthrin-treated hessian strips against laboratory-reared Anopheles arabiensis mosquitoes in a semi-field tunnel cage. Parasites & Vectors 2012, 5:54. 11. Orshan L & Zollner G, 2011. Evaluation of a metofluthrin fan vaporizer device against phlebotomine sand flies (Diptera: Psychodidae) in a cutaneous leishmaniasis focus in the Judean Desert, Israel. J. Vector Ecology, 36, S157-S165. 12. Pates HV, Line JD, Keto AJ, Miller JE: 2002. Personal protection against mosquitoes in Dar es Salaam, Tanzania, by using a kerosene oil lamp to vaporize transfluthrin. Med Vet Entomol. 16:277-284.
DHP13-004: Militarized Formulation and EPA Registerable Attractive Targeted Sugar Bait for Insect Vector Control
Description: OBJECTIVE: Develop a sugar-based vector control bait product, formulated and packaged for military use, with potential for EPA registration. The product must use insecticides effective for killing target vectors, but have low mammalian toxicity and minimal impacts on non-targets. DESCRIPTION: Protection of deployed ground forces from disease-carrying insects requires the immediate and safe use of insecticides. Alternative methods such as biological control are too slow and not practical under battlefield conditions. Vector-borne diseases transmitted by insects, such as malaria, dengue and the leishmaniases, are increasing world-wide and are more of a threat to our military forces today than they were 30 years ago. Unfortunately, insects that transmit militarily important diseases are becoming resistant to an increasing number of public health insecticides used for indoor residual sprays and space spray (aerosol) applications, with little global R&D underway to search for replacements. Backed by a series of recent publications showing excellent efficacy, one potential solution with global implications involves the development and commercialization of an EPA-registered attractive toxic sugar bait for controlling mosquitoes and other biting flies, formulated for use by the military. In addition to a blood meal, where pathogen transmission occurs, most human vector species also require sugar meals as a normal dietary requirement. Sugar meals are generally obtained from floral and extra-floral nectaries, plant juices, and honeydew (excretions from plant-sucking bugs) and are commonly found throughout a typical ecosystem. The use of attractive sugar bait barriers or bait stations containing a toxicant that selectively attracts and kills vectors is a novel and potentially useful vector control technique with great utility in many military situations. 
The military needs an attractive sugar bait formulation containing an insecticide selective for killing vector species, but having little attractiveness to, or impact on, non-targets such as honeybees and other beneficial arthropods. The DoD Armed Forces Pest Management Board (AFPMB) has identified the need for additional pesticides for controlling malaria vectors and new EPA-registered pesticides for public health use as one of its top six research priorities for military entomology (AFPMB 2011). New pesticides and pesticide application techniques will provide military entomologists and vector control specialists a valuable additional tool to control insects that impact military operations. A powdered or other easily transferred sugar bait toxicant for vector control will maximize utility by providing a relatively safe, effective and easily transported product that will fill a major gap in our ability to stop or prevent nuisance biting or disease vectors during military operations. The purpose of this project is to develop an EPA-registerable attractive toxic sugar bait product for vector control that is safe, effective, easily applied and has a minimal impact on non-target insects. The product must be formulated for safe, easy transport, be capable of being applied using standard military application equipment and have a long shelf life. PHASE I: This phase of the SBIR should focus on laboratory development of an attractive sugar bait toxicant and formulation that is effective against target vectors (mosquitoes, sand flies, filth flies) and has minimal impact on non-target insects. This product must meet requirements for EPA registration and have a shelf life and formulations (wettable powder or liquid concentrate) suitable for use by the military. PHASE II: During the Phase II portion of this SBIR, the awardee will develop the prototype formulation and toxicant and run field tests showing efficacy against target vectors and impacts on non-targets. 
The product must be entered into the EPA registration process. PHASE III: The proposed SBIR has commercial applications outside of the military. This sort of novel vector control product could be used by global public health vector control organizations (both governmental and non-governmental). At the completion of a successful Phase II, the company should seek final EPA approval and funding, either from a private company for commercialization of the product or through advanced development funding. The product resulting from this SBIR should be considered for NSN assignment so that it may be readily purchased by military and other US governmental organizations. REFERENCES: 1. Armed Forces Pest Management Board. 2011. Department of Defense Research Requirements for Pest Management for FYs 2011-2012, signed 24 May 2011. 2. Allan, S.A. 2011. Susceptibility of adult mosquitoes to insecticides in aqueous sucrose baits. J. Vector Ecol., 36: 59-67. 3. Avant, S. 2012. DWFP: A battle plan to protect U.S. troops from harmful insects. Agricultural Research, 60(10): 4-14. 4. Beier, J.C., G.C. Müller, W. Gu, K.L. Arheart, & Y. Schlein, 2012. Attractive toxic sugar bait (ATSB) methods decimate populations of Anopheles malaria vectors in arid environments regardless of the local availability of favored sugar-source blossoms. Malar. J., 11: 31. 5. Gu, W., G.C. Müller, Y. Schlein, R. Novak & J.C. Beier, 2011. Natural plant sugar sources strongly impact malaria transmission potential of Anopheles mosquitoes. PLoS One 6: e15996. 6. Junnila, A., G.C. Müller & Y. Schlein, 2011. Attraction of Phlebotomus papatasi to common fruit in the field. J. Vector Ecol., 36: 206-211. 7. Müller, G.C., J.C. Beier, S.F. Traore et al., 2010. Successful field trial of attractive toxic sugar bait (ATSB) plant-spraying methods against malaria vectors in the Anopheles gambiae complex in Mali, West Africa. Malar. J., 9: 210. 8. Müller, G.C., J.C. Beier, S.F. Traore et al., 2010. Field experiments of Anopheles gambiae s.l. 
attractiveness to local fruits/seed pods and flowering plants in Mali to optimize strategies for malaria vector control in Africa using attractive toxic sugar bait (ATSB) methods. Malar. J., 9: 262. 9. Müller, G.C., A. Junnila, W. Qualls et al., 2010. Control of Culex quinquefasciatus in a storm drain system in Florida with attractive toxic sugar baits (ATSB). Med. Vet. Ent., 24: 346-351. 10. Müller, G.C., A. Junnila & Y. Schlein, 2010. Effective control of adult Culex pipiens by spraying an attractive toxic sugar bait solution in the vegetation near larval developmental sites. J. Med. Ent., 47: 63-66. 11. Müller, G.C., V.D. Kravchenko & Y. Schlein, 2008. Decline of Anopheles sergentii and Aedes caspius populations following presentation of attractive, toxic (Spinosad) sugar bait stations in an oasis. J. Am. Mosq. Control Assoc., 24: 147-149. 12. Müller, G.C., E.E. Revay & Y. Schlein, 2011. Relative attraction of the sand fly Phlebotomus papatasi to local flowering plants in the Dead Sea region. J. Vector Ecol., 36: 187-194. 13. Müller, G.C. & Y. Schlein, 2006. Sugar questing mosquitoes in arid areas gather on scarce blossoms that can be used for control. Int. J. Parasit., 36: 1077-1080. 14. Müller, G.C. & Y. Schlein, 2008. Efficacy of toxic sugar baits against adult cistern-dwelling Anopheles claviger. Trans. Roy. Soc. Trop. Med. Hyg., 102: 480-484. 15. Müller, G.C. & Y. Schlein, 2011. Different methods of using attractive toxic sugar baits (ATSB) for the control of Phlebotomus papatasi. J. Vector Ecol., 36: 64-70. 16. Müller, G.C., R.D. Xue & Y. Schlein, 2011. Differential attraction of Aedes albopictus in the field to flowers, fruits and honeydew. Acta Trop., 118: 45-49. 17. Qualls, A.W., R.D. Xue, E.E. Revay, S. Allan & G.C. Müller, 2012. Implications for control of mosquitoes breeding and resting in cisterns and wells in St. Augustine, Florida by attractive toxic sugar baits (ATSB). Acta Trop., 124: 158-161. 18. Schlein, Y. & G.C. Müller, 2008. 
An approach to mosquito control: Using the dominant attraction of flowering Tamarix jordanis trees against Culex pipiens. J. Med. Ent., 45: 384-390. 19. Schlein, Y. & G.C. Müller, 2010. Experimental control of Phlebotomus papatasi by spraying attractive toxic sugar bait (ATSB) on vegetation. Trans. Roy. Soc. Trop. Med. Hyg., 104: 766-771. 20. Schlein, Y. & G.C. Müller, 2012. Diurnal resting behavior of adult Culex pipiens in an arid habitat in Israel and resulting possible control measurements with attractive toxic sugar baits (ATSB). Acta Trop., 124: 48-53. 21. Xue, R.D., G.C. Müller, D.L. Kline, D.R. Barnard, 2011. Effect of application rate and persistence of boric acid sugar baits applied to plants for control of Aedes albopictus. J. Am. Mosq. Control Assoc., 27: 56-60.
DHP13-005: Rapid ID of Microbial Pathogens From Food, Water and Environmental Samples
Description: OBJECTIVE: To develop a field-ready kit for the rapid (maximum 8 hours) identification, quantification, and viability determination of microbial pathogens (bacterial, viral, and eukaryotic) from food matrices, water, and environmental samples. Direct or indirect detection of biological toxins is also desired. The developed kit will emphasize ease of use by technicians who are relatively lab-inexperienced, and an agnostic/semi-agnostic approach to sample setup and testing. DESCRIPTION: The United States Air Force public health and laboratory career fields have few field-capable diagnostic tools to identify microbial pathogens from food samples suspected of causing illness. Current capability relies on non-differential and non-selective Petrifilm™ plates (3M™) that are incubated overnight to produce a total aerobic plate count; separate steps that incorporate a crystal violet dye or a heating step also allow the separate enumeration of Gram-negative and spore-forming bacteria, respectively. However, the resulting data are inadequate for deciding the appropriate course of action in determining the source and/or preventing the spread or recurrence of a Food-Borne Illness (FBI) event, as this kit does not provide the data required for the unambiguous identification of a microbial pathogen. The initial organisms of interest are the top ten pathogens responsible for the vast majority (>95%) of food-borne illness events in the United States as identified by the Centers for Disease Control and Prevention (Scallan et al. 2011). They are: 1. Norovirus, 2. Salmonella spp., nontyphoidal, 3. Clostridium perfringens, 4. Campylobacter spp., 5. Streptococcus spp. Group A, 6. Shigella spp., 7. Escherichia coli O157 and non-O157 spp., 8. Yersinia enterocolitica, 9. Toxoplasma gondii, 10. Giardia intestinalis. 
The primary users of the developed kit will be USAF public health officers and technicians with little or no laboratory training, operating in potentially extreme environments without access to ideal laboratory conditions and facilities. The kit must demonstrate cost-effective and user-friendly assays under field conditions with same-day (eight hours maximum) results at the genus or species level of identification from food samples. The kit must have agnostic (i.e., one assay can identify all ten initial organisms) or semi-agnostic (assays are broken into classes such as Gram-positive/Gram-negative/virus/parasite) sample preparation in order to reduce the number of tests per sample, save time, and keep costs low. PHASE I: Key deliverables: 1. A technology that can detect and identify microbial pathogens with same-day results (~eight hours maximum) from food, water and environmental samples. 2. Quantitative or semi-quantitative results. 3. Agnostic or semi-agnostic sample preparation and assay set-up. 4. Ease of use; must not require extensive laboratory experience or training. 5. A process that includes all steps from initial treatment of the sample to readout of the result. PHASE II: Key deliverables: 1. The technology from Phase I, ruggedized for field use. 2. Cost-effective equipment; inexpensive and shelf-stable consumables/reagents. 3. Small footprint (
DHP13-006: Sporozoite Vaccine Administration Method
Description: OBJECTIVE: To develop an innovative method for administering a malaria sporozoite vaccine that provides efficient access by the sporozoites to the intravascular space, thereby mimicking direct intravenous (IV) delivery. This innovative method should contrast with traditional intramuscular (IM), subcutaneous (SC) or intradermal (ID) methods, which deliver sporozoites primarily to the interstitial space. Proof-of-concept should be established in animal models using cryopreserved malaria sporozoites as the Phase I objective, with pre-clinical development, clinical testing and FDA licensure of a P. falciparum sporozoite vaccine (administered by this novel method) projected for later development phases. The sporozoite is the infectious stage of the malaria parasite, transmitted to humans by the female Anopheles mosquito. The sporozoites used for proof-of-principle in this SBIR should be purified and cryopreserved. The cryopreserved sporozoites can be either fully infectious or radiation-attenuated. Ideally, proof-of-principle for IV equivalency should be established using both types. The most important objective will be to demonstrate protection equivalent to IV administration using radiation-attenuated sporozoites. See the Phase I project description for more details. The method may incorporate novel administration devices, locations, volumes, formulations or other innovative approaches, and should be equivalent to the current gold standard, direct intravenous inoculation, as measured by sporozoite infectivity, vaccine immunogenicity or vaccine efficacy on subsequent malaria challenge. The method should be amenable to routine use in vaccination clinics at military medical treatment facilities, primarily for adult recipients but ideally suitable also for pediatric applications. While the primary target is US Army / DoD military medical operations, a secondary target is medical operations for other government agencies and the general public. 
DESCRIPTION: Malaria has been identified as the most significant infectious disease threat during deployments to tropical and subtropical regions. The first formal determination of malaria's importance was made by the Infectious Diseases Investment Decision Evaluation Algorithm (ID-IDEAL) in 2003, which ranked malaria #1 out of 40 infectious diseases considered (Burnette, Hoke et al. 2008). The same conclusion was reached in 2012 by the Infectious Disease Threat Prioritization Panel, which ranked malaria #1 out of 38 infectious diseases (Infectious Disease Threats to the US Military Prioritization Panel Results; Memorandum for Record, Fort Sam Houston, TX, 2010). This prioritization reflects malaria's historical role as the leading cause of person-days lost from active duty in conflicts taking place in tropical regions, including approximately 12 million person-days lost for Navy and Marine Corps personnel during WWII and 1 million person-days lost during the Vietnam Conflict (Beadle & Hoffman, 1993). In October 2003, malaria inflicted a 44% casualty rate on a 157-person Marine Expeditionary Unit deployed to Roberts International Airport in Monrovia, Liberia, aborting the peace-keeping mission after 12 days on the ground (Whitman, Coyne et al. 2010). These casualties occurred despite the availability of effective malaria drug prophylaxis and personal protective measures, indicating a major shortfall in the ability to sustain the performance of US military forces in the tropics. The evacuations from Liberia and subsequent hospitalizations cost the military approximately $1.2M (Roberts, in preparation). This shortfall would be entirely eliminated by an effective vaccine. For this reason, the DoD formalized the requirement for a malaria vaccine in the document "Operational Requirements Document for Plasmodium Falciparum Malaria Vaccine" issued by the Army Training and Doctrine Command and approved on 13 March 1997 by the Deputy Chief of Staff for Combat Developments. 
The requirement has been updated by the document "Capability Development Document For Plasmodium Falciparum Malaria Vaccine" issued by the Army Medical Research and Materiel Command and approved 01 April 2010 by the US Army Deputy Chief of Staff. Since 2000, the DoD has invested more than $100M in malaria vaccine development, focusing on three different promising vaccine platforms: (1) recombinant malaria proteins formulated in adjuvant (e.g., RTS,S/AS01B Vaccine); (2) malaria genes delivered as DNA plasmids or by non-replicating viral vectors (e.g., NMRC-M3V-D/Ad-PfCA Vaccine); and (3) metabolically active, non-replicating (attenuated), whole malaria sporozoites. This past fall (2012), whole malaria sporozoite vaccines emerged as the most protective approach to date. In a dose escalation trial of purified, cryopreserved, radiation-attenuated sporozoites (PfSPZ Vaccine, Sanaria Inc., Rockville, MD), 6/6 research subjects (100%) were fully protected against Plasmodium falciparum sporozoite challenge. The human challenge model exposes research subjects to malaria via infectious mosquito bite; it is also called controlled human malaria infection or CHMI. In the second highest dose group, 6/9 research subjects (67%) were fully protected. This was only the second time that the PfSPZ Vaccine had been tested in a clinical trial. These protection results are the best achieved by a candidate malaria vaccine administered by any route other than by mosquito bite. The route of delivery used in the recent trial described above was direct intravenous inoculation, since the first clinical study of the same vaccine delivered SC or ID showed protection of less than 10% (Epstein, Tewari et al. 2011). The IV route bypasses the initial tissue stage of infection: during normal malaria transmission, most sporozoites are deposited by the probing mosquito within the epidermis or dermis, or potentially subcutaneously depending on the depth of proboscis penetration. 
The mosquito searching for a blood meal probes the skin multiple times, depositing sporozoites in a single stream with each probing. Jin and colleagues at New York University demonstrated in the murine model that during mosquito probing, sporozoites are released at a relatively slow rate (approximately 1 to 2.5 per second) from the mosquito proboscis (Jin 2007). In the earlier trial with SC or ID administration, the purified cryopreserved parasites were deposited "en masse" in a bolus of fluid, as occurs with traditional SC or ID administration. This is very different from what occurs with mosquito probing. Since sporozoites do not "swim," instead requiring the interstitial substrate to achieve the gliding motility that normally enables locating and entering the vascular system and subsequent travel to the liver, it could be that deposition within a lacuna of fluid inhibits entry into the vasculature. In addition, it is possible that, due to the shock of cryopreservation and subsequent thawing, radiation-attenuated parasites such as those comprising the PfSPZ Vaccine are less effective at moving through the interstitial space, penetrating the vascular wall and gaining access to the vascular system than non-cryopreserved radiation-attenuated sporozoites freshly injected by a mosquito. Intravenous administration bypasses this initial skin phase of infection, thereby circumventing both of these disadvantages. Intravenous administration thus revealed the potency of the PfSPZ Vaccine for providing sterile protection against malaria, achieving 100% protection in the high dose group. Although direct intravenous inoculation appeared to be safe and well tolerated in the recent study and could be administered in the controlled environment of DoD personnel prior to deployment to endemic areas, it would nevertheless be useful to identify an alternative method of delivery that mimics the ability of IV administration to elicit high grade protection. 
In developing this concept, improved methods of vaccine delivery could be demonstrated in animal models by (1) administering intact cryopreserved sporozoites and using infection as the outcome variable, or by (2) administering radiation-attenuated cryopreserved sporozoites and using protection as the outcome variable. With either the intact sporozoites/infection model or the attenuated sporozoites/protection model, there is a roughly 5- to 25-fold difference in dose comparing IV to ID or SC administration to achieve equivalent infection or protection (Hoffman, personal communication; Ploemen, Chakravarty et al. 2012). This SBIR solicitation is based on the assumption that a non-IV method can be developed that eliminates this multifold difference, permitting infectivity and/or protection that is equivalent to that achieved by direct IV administration. Applicants are encouraged to think "outside the box" by proposing highly innovative approaches and then assessing these approaches in animal models. Of note, novel vaccine delivery methods applicable to malaria sporozoites may prove equally applicable to other infectious agents for which we currently lack effective vaccines. Product requirements: a vaccination method for delivering cryopreserved malaria sporozoites that: - is safe and well tolerated (military personnel may not be placed in down status). - achieves equivalent potency relative to direct single needle IV administration, meeting DoD criteria of >80% protection (preferred: >90% protection) in humans. - can be administered in routine fashion at military medical treatment facilities. - is suitable for adult application (preferred: additionally suitable for pediatric including infant application). - is able to receive FDA approval for administration of a sporozoite vaccine. - can be adequately scaled to meet military needs. - is cost-effective. 
PHASE I: Perform experiments in animal models to demonstrate that the proposed method of sporozoite delivery is equivalent or superior to gold standard IV administration. Purified, cryopreserved or purified, cryopreserved, radiation-attenuated sporozoites should be used. The endpoint should be infectivity and/or protective efficacy, as measured by parasite liver load, proportion of animals positive, or other suitable outcome measure. The proof-of-principle, if conducted in a rodent malaria model, should be readily translatable to P. falciparum whole sporozoite vaccines, such as the PfSPZ Vaccine, and should have a clear pathway for pre-clinical and clinical development, licensure and deployment by the DoD. Please identify technical risks of the approach, as well as the costs, benefits and schedule associated with development for clinical use. Map out a well-constructed clinical development plan. Identify minimum system requirements for deployment. PHASE II: Perform pre-clinical development of the novel method for delivering the P. falciparum whole sporozoite vaccine through IND allowance. Manufacture sufficient material for pre-clinical and clinical development. Include a plan for scale-up manufacturing, licensure, marketing and effective implementation in the DoD. Include estimates of resources required to fully implement the method's use in 100,000 military personnel. Include a business plan that addresses required partnerships and issues of intellectual property such that unrestricted government use at reasonable cost is assured. PHASE III: Assess the safety, tolerability, immunogenicity and efficacy of the novel method for delivering the P. falciparum whole sporozoite vaccine and demonstrate equivalence or superiority to direct, single needle IV administration. Fully develop the data set required for FDA licensure. Detail product and associated consumables costs. 
Provide full specifications for manufacturing materials and processes and product storage and shipment. Provide a detailed plan for licensure and transition to DoD acquisition programs. REFERENCES: 1. Beadle C & Hoffman SL. (1993). "History of malaria in the United States Naval Forces at war: World War I through the Vietnam conflict." Clin Infect Dis 16(2):320-9. 2. Burnette WN, Hoke CH, et al. (2008). "Infectious diseases investment decision evaluation algorithm: a quantitative algorithm for prioritization of naturally occurring infectious disease threats to the U.S. military." Mil Med 173(2):174-81. 3. Epstein JE, Tewari K, et al. (2011). "Live attenuated malaria vaccine designed to protect through hepatic CD8(+) T cell immunity." Science 334(6055):475-80. 4. Jin Y, Kebaier C & Vanderberg J. (2007). "Direct microscopic quantification of dynamics of Plasmodium berghei sporozoite transmission from mosquitoes to mice." Infect Immun 75(11):5532-9. 5. Ploemen IH, Chakravarty S, et al. (2012). "Plasmodium liver load following parenteral sporozoite administration in rodents." Vaccine. 6. Whitman TJ, Coyne PE, et al. (2010). "An outbreak of Plasmodium falciparum malaria in U.S. Marines deployed to Liberia." Am J Trop Med Hyg 83(2):258-65.
DHP13-007: Development of a Vector Arthropod (Tick and Flea) Pitfall or Sticky Trap with CO2 Attractant
Description: OBJECTIVE: The development of a tick and flea sticky or pitfall style trap to be used for field surveillance which employs a field-deployable source of CO2. DESCRIPTION: Current methods for trapping ticks and fleas used by DoD personnel are not as effective as they should be, given the peer-reviewed literature documenting what serves to attract and trap off-host tick and flea species known to carry disease causing pathogens (Gaaboub et al, 1972; Borchert et al, 2012; Miles, 1968; Hokama, 1977; Garcia, 1962; Nevill, 1964). Among the methods currently employed by DoD entomologists, public health and pest management personnel for monitoring ticks and fleas of medical and veterinary importance are tick drags and burrow swabs, supplemented when available with dry ice as a source of CO2, but these are either manpower intensive, not conducive to current deployment packages, or ineffectual. A tick drag is a piece of white sheet or fleece with a dowel sewn into one end and pulled by a string tied to the ends of the dowel, and burrow swabs are cotton balls on the end of a dowel rod or sewer snake, which are inserted into animal burrows. The dry ice is laid on the tick drag and left for some period of time. However, the ticks will more often than not crawl off before the attendant is able to return to the drag, or the attendant is left in the area watching the drag for 4 to 6 hours. What is sought here is a pitfall- or sticky-style trap that provides a continuous source of CO2 for ~8 hours of use, weighs less than 3 pounds, and is dimensionally suitable for field use in grid and burrow trapping. The ideal product will be well suited for preventive medicine deployment packages and equally effective as current collection methods. The trap must be low cost and durable enough to allow for multiple reuses and long term sustainability. 
The trap would allow for survival of collected arthropods for five days under similar temperature and humidity levels; may be of similar dimensions as discussed by Miles (1968), 12cm (long) x 5cm (diameter); allow easy removal of specimens; and self-contain additional parts. PHASE I: The first Phase I deliverable will be to design/develop a prototype(s) along with providing limited testing data (lab or field) demonstrating the effectiveness of each prototype in collecting and maintaining fleas and/or ticks from locations of known populations. The population data will be based on previously collected population data for each field site. Field sites should provide supporting data for hard and soft tick species and flea species of medical and/or veterinary importance. Prototype data will also include potential for reusability, weight, dimensions, durability, and potential cost for each prototype. PHASE II: Phase II deliverables include producing hardware prototype(s) from Phase I work, and development of proper field testing metrics that will address the following requirements: trapping effectiveness for fleas and ticks in a variety of climates and conditions (i.e., rain, extreme heat, areas of high humidity, etc.), durability, reusability, preventive medicine deployment package weight and dimensions, and cost-per-unit/unit-maintenance analysis. Small businesses will work closely with the Armed Forces Pest Management Board and relevant board committees to assure that their prototype-testing metrics (i.e., field collection and lab analysis) are in line with DoD standards. A final report of findings for each prototype will be provided, along with comparisons of strengths and weaknesses of multiple prototypes (when applicable). 
PHASE III DUAL USE APPLICATIONS: Phase III expectations are that the small businesses will work with the Armed Forces Pest Management Board, and relevant board committees, to develop the hardware prototype(s) into products which can be submitted for inclusion in service specific deployment packages. This will require the product to be granted a national stock number, which will be coordinated with the Armed Forces Pest Management Board. REFERENCES: References are available through the Armed Forces Pest Management Board's "Literature Retrieval System" at http://www.afpmb.org/content/welcome-literature-retrieval-system 1. Borchert JN, et al. 2012. Evaluation and Modification of Off-Host Flea Collection Techniques Used in Northwest Uganda: Laboratory and Field Studies. J. Med. Entomol. 49(1):210-214. 2. Gaaboub IA, Mansour NA & Kamel FM. 1971. A Field Trap for Collecting Adult Fleas. WHO/VBC/71.289, Z. angew. Entom. 68(4):432-438. 3. Garcia R. 1962. Carbon-dioxide as an attractant for certain ticks (Acarina: Argasidae and Ixodidae). Ann. Ent. Soc. Amer. 55(5):605. 4. Miles VI. 1968. A Carbon Dioxide Bait Trap for Collecting Ticks and Fleas from Animal Burrows. J. Med. Ent. 5(4):491-495. 5. Nevill EM. 1964. The Role of Carbon Dioxide as Stimulant and Attractant to the sand tampan, Ornithodoros savignyi (Audouin). Onderstepoort J. Vet. Res. 31(1):59-68.
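As a rough feasibility check on the ~8-hour continuous-release and sub-3-pound requirements above, the CO2 mass budget can be sketched as follows. The 250 mL/min release rate is a hypothetical assumption chosen for illustration only; an attractive release rate for target tick and flea species would have to be determined empirically.

```python
# Rough CO2 mass budget for the ~8-hour trap requirement.
# The 250 mL/min release rate is a hypothetical assumption,
# not a validated attractive rate.

CO2_DENSITY_G_PER_L = 1.80  # approx. density of CO2 gas at 25 C, 1 atm
GRAMS_PER_POUND = 453.6

def co2_mass_needed(release_ml_per_min: float, duration_hr: float) -> float:
    """Return grams of CO2 consumed at a constant release rate."""
    liters = release_ml_per_min * duration_hr * 60 / 1000.0
    return liters * CO2_DENSITY_G_PER_L

grams = co2_mass_needed(release_ml_per_min=250, duration_hr=8)
print(f"{grams:.0f} g ({grams / GRAMS_PER_POUND:.2f} lb)")  # 216 g (0.48 lb)
```

At this assumed rate the CO2 charge itself consumes only about half a pound of the 3-pound weight allowance, leaving the remainder for the trap body, regulator or sublimation chamber, and adhesive or pitfall components.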
DHP13-008: A software tool to assess injury risk and maximum allowable exertions for repetitive, forceful one hand and two hand shoulder push/pull motions
Description: OBJECTIVE: Develop injury criteria, an assessment methodology, a risk analysis software tool and design criteria for repetitive, forceful one and two hand shoulder push/pull motions performed for variable (brief to long) durations while operating military equipment. The injury criteria, assessment methodology and analysis software will be used to evaluate injury risk from man-machine interactions performed during routine use of military systems. The design criteria will estimate both the maximum strength capacities and the maximum allowable exertion limits that should be imposed on Soldier operators' shoulders. DESCRIPTION: Military jobs impose heavy demands upon shoulders and contribute to musculoskeletal injury. For example, a RAND Corporation study (Kirin, 1992) revealed that 43% of Army Military Occupational Specialties were classified as "very heavy," requiring Soldiers to occasionally lift more than 100 pounds and constantly lift more than 50 pounds. Besides heavy exertion, Soldiers often perform highly repetitive jobs that involve stressful postures. Postural stresses are an important, underappreciated risk factor for shoulder injury. Generally speaking, when the arm moves up and away from the body, pressure increases inside the shoulder joint and soft tissues are sometimes traumatized as they are pinched between the humerus and acromion. These traumatic events are not associated with pain initially. But with repetition and over time, episodic inflammatory reactions cause rotator cuff and articular cartilage degeneration. Severe degeneration produces profound disability. The best prevention for severe adverse health effects is to reduce ergonomic risk factors through proper equipment and job design. Two strategies can be used to achieve that end: provide System Developers with design criteria that reduce biomechanical shoulder stress and conduct health hazard assessments of equipment that identify and mitigate hazardous designs. 
An assessment tool is needed to characterize the hazards posed by the biomechanical forces evident inside the shoulder while pushing and pulling in different arcs of trajectory using different intensities of exertion for a range of durations. From these injury criteria a tool will be built to characterize the exertional hazards to which Soldiers are exposed while performing tasks and interacting with equipment. Design guidance is also needed that prescribes maximum allowable exertions for pushing and pulling with one or two hands, by either male or female Soldiers. The weakness in current guidance is illustrated by the advice that MIL-STD-1472G gives to System Developers for hatch design. That instruction advises that no more than 50 pounds of force be required to open or close an overhead hatch. This guidance fails to recognize that performing repetitive exertions at this intensity may be hazardous, particularly when performed in some postures (especially end range shoulder abduction) or by small-framed individuals. Another issue with the MIL-STD-1472G hatch guidance is that it does not specify reach distance or hand placement. Most Soldiers are not capable of generating 50 pounds of force while reaching far or when hatch location requires that operators assume an awkward shoulder posture. Besides the potential hazard for musculoskeletal injury, the current design standard may allow equipment to be fielded that is difficult to egress. It is imperative that problematic design standards such as this one are revised as the military services open more jobs to women to take full advantage of their capabilities. PHASE I: Develop injury criteria based upon the relationship between changes in intra-articular biomechanical stress related to shoulder posture and intensity of push-pull exertions. 
Use these criteria and the exposure schedule (hours per day, days per week, weeks per year) to determine hazard severities and probabilities for key adverse outcomes: load intolerance from shoulder muscle fatigue and increased risk (relative to a comparable normal population) of developing near term acute rotator cuff disorders (such as supraspinatus tendonitis and subacromial bursitis) and long term chronic degenerative tissue diseases of the shoulder (such as rotator cuff tears and degenerative joint disease). Build and demonstrate a proof-of-concept exposure assessment model that estimates and displays a biomechanical measure of adverse mechanical stress from a set of data that are either readily measurable or obtainable from the System Developer's specification (joint angle of the shoulder during exertion and force of exertion) or from the use scenario. This model will determine injury risk relative to the full range of male and female anthropometries and display risk using hazard severity and hazard probabilities as defined in AR-40-10. This model shall be limited to a simple vertical overhead push. PHASE II: Build a software application that integrates all of the features specified in Phase I. Additional exposure assessment criteria will be developed and integrated into the software that will estimate biomechanical forces and predict injury risk for pushing at other locations in the 3D work envelope described by the range of motion of the shoulder joint. Additional data will be collected to develop an analogous exposure assessment model for pulling exertions. The software will allow the user to enter key task parameters such as shoulder joint angle and push or pull force and exposure data such as rate or number of exertions and other frequency and duration data. 
The model will estimate the sum of the biomechanical energy to which musculoskeletal tissues are exposed based upon the task parameters and exposure data entered, determine hazard severity and hazard probability in accordance with AR-40-10 definitions and display these data on the graphical user interface. Develop a design standard that specifies maximum allowable one and two hand push and pull forces for various locations throughout the 3D work envelope based upon muscular strength capacity (fatigue modeling) and separate data for the same 3D work envelope locations that predict maximum push and pull forces based upon tolerance to adverse biomechanical force. PHASE III DUAL USE APPLICATIONS: The contractor will provide a working software application. The US Army Public Health Command (USAPHC) Ergonomics Program will develop at least five hypothetical scenarios that include equipment and user scenarios that require physically demanding pushing or pulling in different directions within a 3D work envelope. The USAPHC Ergonomics Program will use the assessment methodology to conduct analyses, write reports and submit the results through the USAPHC Health Hazard Assessment Program's report review process. USAPHC will collect comments and appraisals from Ergonomics and Health Hazard Assessment Personnel and analyze them. Results will be discussed with the US Army Medical Research and Materiel Command to identify deficiencies and develop a plan for reconciliation. The injury criteria, assessment software and design criteria will serve as a sentinel reference for ergonomists to describe strength capacity and upper extremity injury risk for dynamic exertions performed at various reach distances throughout the full range of the work envelope. This will replace the very meager compilation of static strength reference values that have very limited applicability to equipment design. 
The normative values from this study will be vital to human resources professionals who need to write accurate job descriptions and occupational health professionals prescribing post-injury return-to-work recommendations. Vehicle and heavy equipment designers will find these data useful for determining maximum exertion levels for manipulating vehicle controls within a driver's compartment. Ship and aircraft designers will also be able to use the data to estimate force requirements for hatches and doors. The injury criteria will find extensive application by all branches of the armed services as well as others who must perform physically demanding tasks that require pushing or pulling and individuals who design or work in work stations that require frequent reaching or forceful exertion. REFERENCES: 1. Chow A and Dickerson CR. Shoulder strength of females while sitting and standing as a function of hand location and force direction. Applied Ergonomics. 40:3003-8, 2009. 2. Gold GE, Pappas GP, Blemker SS, Whalen ST, Campbell G, McAdams TA and Beaulieu CF. Abduction and external rotation in shoulder impingement: an open MRI study on healthy volunteers, initial experience. Radiology. 244(3):815-822, 2007. 3. Hoffman SG. Whole-body postures during standing hand-force exertions: development of a 3D biomechanical posture prediction model. Doctoral dissertation, University of Michigan, 2008. 4. Hoozemans MJ, Van der Beek AJ, Frings-Dresen MH, Van Dijk FJ and Van der Woude LH. Pushing and pulling in relation to musculoskeletal disorders: a review of risk factors. Ergonomics. 41(6):757-81, 1998. 5. Kirin SJ and Winkler JD. The Army Military Occupational Specialty Database. Report No. N-3527-A. RAND Corporation, Santa Monica, CA, 1992.
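The exposure-assessment model described in this topic combines shoulder posture, exertion force, and exposure schedule into a hazard rating. A minimal sketch of that structure is shown below; all factors and cut points are illustrative placeholders, not validated injury criteria, and a real tool would derive them from biomechanical studies and map results to the hazard severity/probability categories defined in AR 40-10.

```python
# Minimal sketch of a push/pull exposure-assessment model.
# All weighting factors and thresholds are illustrative placeholders,
# NOT validated injury criteria.

from dataclasses import dataclass

@dataclass
class PushPullTask:
    shoulder_angle_deg: float   # arm elevation from neutral posture
    force_lb: float             # hand force exerted per push/pull
    exertions_per_hour: float
    hours_per_day: float

def hazard_score(task: PushPullTask) -> float:
    """Combine posture, force, and repetition into a unitless score.
    Elevation beyond 60 degrees increasingly penalizes the posture."""
    posture_factor = 1.0 + max(0.0, task.shoulder_angle_deg - 60) / 60
    dose = task.force_lb * task.exertions_per_hour * task.hours_per_day
    return dose * posture_factor

def hazard_severity(score: float) -> str:
    """Map the score onto placeholder categories standing in for
    the AR 40-10 severity scale."""
    if score < 500:
        return "negligible"
    if score < 2000:
        return "marginal"
    if score < 8000:
        return "critical"
    return "catastrophic"

# Example: a hatch requiring 50 lb of force, operated overhead.
task = PushPullTask(shoulder_angle_deg=120, force_lb=50,
                    exertions_per_hour=10, hours_per_day=2)
print(hazard_severity(hazard_score(task)))  # prints "critical"
```

The example illustrates the MIL-STD-1472G concern raised above: a 50-pound exertion that looks acceptable as a static limit can rate as hazardous once posture and repetition enter the model.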
DHP13-009: A Software Tool to Assess Injury Risk Associated with Mechanical Exposures From Wearing Head Supported Mass
Description: OBJECTIVE: Develop injury criteria, methodology, and a software tool to assess the risk of neck injury from loads sustained while wearing head supported mass. The software will characterize the hazards endemic to the ground combat environment and will be used to evaluate products and recommend less hazardous designs and usage scenarios. DESCRIPTION: It is imperative that equipment issued to Soldiers is designed to function properly and to minimize the risks of developing disabling chronic musculoskeletal disorders. Experiences from recent military conflicts clearly demonstrate that Soldiers are issued head borne devices that impose loads that exceed functional tolerances and elevate risks of developing disabling neuromuscular disorders. The Personnel Armor System for Ground Troops (PASGT) comes in five sizes that range in weight from 3.1 to 4.2 pounds, to which may be attached other devices that add mass or create asymmetrical loads. The PASGT face shield is a common addition. The reactive contraction of posterior cervical muscles to counter the torque imposed by the 3.4 pound PASGT ballistic face shield caused headaches and prompted a request from a field medical officer to locate a substitute product. Although symptoms were noted soon after the face shield was fielded, the medical sequelae from long term use and increased risk of severe, acute conditions such as disc herniation and of future degenerative conditions could not be estimated because of the lack of injury criteria for chronic load effects and an assessment tool that could be applied during design to identify potential problems and alert System Developers to the need for design modifications. Since the Army fielded more than 800,000 PASGTs, tremendous opportunities to avoid costs associated with lost work time, provider salaries and medical treatments were lost by not having the capacity to identify problems early and integrate design changes prior to fielding. 
Although headgear is aggressively scrutinized, tests traditionally focus on how the physical properties of a helmet will protect the Soldier from insult originating from the environment or how well an attached device will perform its intended function. Aside from fitting, less attention is directed to estimating the adverse medical effects manifested in musculoskeletal tissues subjected to unaccustomed levels of mechanical force over long durations. The reason for this oversight is that the injuries and costs related to impacts from motor vehicle accidents and sports have stimulated medical research to develop models for high velocity collisions, but similar models have not been developed for lower energy, long duration loading. PHASE I: Develop injury criteria that describe the relationship between the physical characteristics of head borne load (mass, symmetry of weight distribution, and location relative to the center of gravity of the head) and the exposure schedule (hours per day, days per week, weeks per year) to determine hazard severities and probabilities for key adverse outcomes: load intolerance from cervical muscle fatigue and increased risk (relative to a comparable normal population) of developing near term acute cervical conditions (such as disc bulges, prolapses and herniations) and long term chronic degenerative tissue diseases of the cervical spine (such as degenerative disc disease and degenerative joint diseases). Build and demonstrate a proof-of-concept exposure assessment model that estimates and displays a biomechanical measure of adverse mechanical stress from a set of data that are either readily measurable or obtainable from the System Developer's specification (such as the item's mass and item center of gravity) or the use scenario. Load conditions shall demonstrate the range of weights of head borne gear (helmets and attached accessories). 
It shall determine injury risk relative to the full range of male and female anthropometries and display risk using hazard severity and hazard probabilities as defined in AR-40-10. This model shall be limited to static biomechanical calculations that assume that the Soldier is stationary. PHASE II: Build a software application that integrates all of the features specified in Phase I. Additional exposure assessment criteria shall be developed and integrated into the software that allow simulation of attachments on headgear. The software shall provide a means for simulating a minimum of two attachments at a variety of locations relative to the head piece: a face shield and one other attachment. The software shall be able to resolve the forces from the masses of the main head piece and attachments and generate injury risk assessments using AR-40-10 as described in the previous phase. The exposure assessment model and software shall be expanded to enable injury prediction for the following dynamic activities: marching, running, jumping, controlled parachute fall landing and diving to prone. The software shall accomplish this by allowing the software operator to enter an integer corresponding to time spent either marching or running and integers for the other activities to designate the number of times each activity is performed in a 24 hour period. The model shall estimate the sum of the biomechanical energy to which musculoskeletal tissues are exposed based upon the physical characteristics of the activities performed and the duration of exposure, determine hazard severity and hazard probability in accordance with AR-40-10 definitions and display these data on the graphical user interface. PHASE III DUAL USE APPLICATIONS: The contractor will provide a working software application. 
The US Army Public Health Command (USAPHC) Ergonomics Program will develop at least five hypothetical scenarios that include equipment and user scenarios that expose Soldiers to head supported mass. The USAPHC Ergonomics Program will use the assessment methodology to conduct analyses, write reports and submit the results through the USAPHC Health Hazard Assessment Program's report review process. USAPHC will collect comments and appraisals from Ergonomics and Health Hazard Assessment Personnel and analyze them. Results will be discussed with the US Army Medical Research and Materiel Command to identify deficiencies and develop a plan for reconciliation. The injury criteria and assessment software should attract great worldwide commercial interest by all branches of the armed services, law enforcement agencies, and other parties who design headgear for occupational and sports applications. Generally, helmets are designed to protect against insult from rapid change in acceleration (such as used for horseback riding, hang gliding, roller and ice skating, skiing, skateboarding, automotive racing and motocross, bull riding, canoeing, kayaking and bicycle riding) and repeated trauma (such as used for hockey and football). Recent awareness of vulnerability to traumatic brain injury from high force and repeated low force trauma has been favoring a shift to constructing helmets from heavier materials. Exposure to chronic loading from these heavier helmets and other specialized helmets used by selected groups (such as policemen, firemen, welders, miners and autistic children) increases the risk of developing degenerative neck disorders. This head supported mass model would help commercial headgear designers more objectively evaluate the tradeoff between selecting certain design features (i.e., heavy materials or asymmetrical weight distribution) and injury risk. This model would impact a very broad market. 
Although no statistics were available that summarized the total market, the Bicycle Helmet Safety Institute estimated that between 12 and 15 million bicycle helmets are sold in the United States each year. REFERENCES: 1. Butler BP and Allen NM. Long-Duration Exposure Criteria for Head-Supported Mass. Army Aeromedical Research Lab, Fort Rucker, AL. USAARL-97-34, 1997. 2. Ivancevic V and Beagley N. Determining the Acceptable Limits of Head Mounted Loads. Land Operations Division, Systems Sciences Laboratory. DSTO-TR-1577. 2003. 3. LaFiandra M, Harman E, Cornelius N, Frykman P, Gutekunst D, and Nelson G. The Effects of the Personal Armor System for Ground Troops (PASGT) and the Advanced Combat Helmet (ACH) With and Without PVS-14 Night Vision Goggles (NVG) on Neck Biomechanics During Dismounted Soldier Movements. US Army Research Institute of Environmental Medicine, Military Performance Division. T07-09 2007. 4. Manoogian SJ, Kennedy EA and Duma SM. A literature review of musculoskeletal injuries to the human neck and the effects of head-supported mass worn by Soldiers. USAARL Contract Report No. CR-2006-01, 2006. 5. Merkle AC, Kleinberger M, and Uy OM. The Effects of Head-Supported Mass on the Risk of Neck Injury in Army Personnel. Johns Hopkins APL Technical Digest: 26(1):75-83. 2005.
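The Phase II exposure-assessment flow described above (the operator enters a duration for marching or running and repetition counts for the other activities, the model sums the biomechanical energy, and the total is mapped to a hazard severity per AR-40-10) can be sketched as follows. This is a minimal illustration only: the energy coefficients, severity bands, and function names are hypothetical placeholders, not values drawn from AR-40-10 or from validated biomechanical data.

```python
# Illustrative sketch of the exposure-assessment calculation.
# All coefficients and thresholds below are hypothetical placeholders;
# a real implementation would use AR-40-10 definitions and validated data.

# Assumed biomechanical energy cost (joules per unit) for each activity.
ENERGY_PER_UNIT = {
    "marching": 5.0,            # per minute
    "running": 12.0,            # per minute
    "jumping": 40.0,            # per repetition
    "parachute_landing": 900.0, # per repetition
    "diving_to_prone": 250.0,   # per repetition
}

# Hypothetical severity bands over total daily energy exposure (joules).
SEVERITY_BANDS = [
    (2_000, "negligible"),
    (10_000, "marginal"),
    (50_000, "critical"),
    (float("inf"), "catastrophic"),
]

def assess_exposure(activity_counts):
    """Sum biomechanical energy for a 24-hour activity profile and
    map the total to a hazard severity category."""
    total = sum(ENERGY_PER_UNIT[a] * n for a, n in activity_counts.items())
    for upper_limit, severity in SEVERITY_BANDS:
        if total < upper_limit:
            return total, severity

# Example profile: 60 minutes of marching and 10 jumps in a 24-hour period.
total_joules, severity = assess_exposure({"marching": 60, "jumping": 10})
```

A full application would additionally carry a hazard probability alongside severity and resolve attachment masses into head/neck loads, per the Phase II requirements above.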
DHP13-010: A Human Body Model for Computational Assessment of Blast Injury and Protection
Description: OBJECTIVE: Formulate, develop and demonstrate an anatomically consistent, articulated human body model for computational assessment of explosion blast injury loads, body responses and casualty estimation and for analysis of personal protective equipment. DESCRIPTION: Blasts from improvised explosive devices (IEDs) are the most common cause of wounded-in-action injuries and death in recent military operations [De Palma 2005]. US and coalition military personnel are engaged in unconventional warfare with continuously evolving terrorist threats (IEDs, roadside bombs, car-borne bombs, suicide bombers and others). The primary goal of the Department of Defense (DoD) Blast Injury Research Program is to understand and predict threat scenarios and injury potentials and to develop improved protective measures. Current body armor has been designed primarily to protect against ballistic and fragment impact threats and to a lesser degree to protect against explosion blasts, partly because of limited understanding of blast injury pathways. Moreover, experimental evaluation of protective armor against ballistic and impact loads, which have a localized focus, can be conducted using well established human body surrogates [Roberts 2007]. Explosion blast loads engulf the entire human body, are much more complex and involve loads induced by shock waves, debris, shrapnel, thermal flash and toxic gases. The biomechanical injury may be caused by the direct effects of pressures penetrating the body, flying debris, body translocation in air and impact on hard objects [Elsayed and Atkins 2008]. The types of injuries caused by blasts also depend on whether the blast occurs in an open field or within a building. In the last few years the DoD has made substantial research investments in understanding blast wave traumatic brain injury (TBI). 
The majority of these efforts use an experimental approach and shock tube tests using human physical surrogates (dummies), cadavers and animal models [Chavko 2007], which are useful and necessary but slow, expensive, and lacking in injury scaling and prediction capability. An anatomically consistent, articulated human body model and computational tools for modeling blast physics coupled to body biodynamics and biomechanics will help researchers better understand blast injury threats, interpret experimental data, and develop improved protective armor and medical treatment procedures. Existing articulated human body models and associated computational tools have been developed for studying impact injury [Cheng 1998], but they lack the required anatomical biofidelity and do not allow coupled blast-biodynamics-injury biomechanics simulation capability. Several teams have recently developed anatomical geometry models of a human head/brain to study TBI mechanisms [Moore 2009] but they lack the capability of modeling head/neck/body biodynamics, whole body responses, and other injury mechanisms. The goal of this project is to 1) establish a prototype database of body tissue material properties relevant to human response to blast environment using open literature data and identify missing but required data, 2) develop a human body anatomic model including skin, skeletal, head/neck and internal organs susceptible to blast injury (brain, spine, lungs, etc.), 3) develop geometry and computational meshes to enable simulation of blast wave loads, body biodynamic responses and the biomechanics of pressure waves within the body, 4) conduct parametric simulations using selected computational tools of blast wave physics and body biomechanics and model validation against existing human/cadaver test data, and 5) demonstrate the capability to simulate the performance of personal protective equipment (PPE) in various explosion blast scenarios. 
PHASE I: Conduct a thorough review of existing anatomical models of a human body and evaluate their capabilities and limitations for simulations of blast injury. Formulate specifications of a modeling framework integrating the anatomical data (skin, skeletal, major organs), geometric modeling tools for human body articulation (size, shape, posture), blast scene generation, and material property data needed for biodynamic and biomechanical simulations. Evaluate and select existing software tools for modeling explosion blast events, blast-body interaction, and assessment of injury/casualty estimation. Formulate a plan for model validation to be executed in Phases II and III. Develop prototype components of such a framework and demonstrate their capability to generate anatomical geometries of articulated human bodies and to simulate blast-body interaction including: blast loading, body biodynamics, and biomechanics and initial analysis of injury pathways to a selected organ (e.g., brain, lung, spine, groin, extremity). The results of Phase I should be documented in a final report describing details of the proposed simulation framework, availability of existing data, results of relevant demonstration/validation simulations and rationale for further model development and validation. PHASE II: Implement the software tools for generation of anatomic geometry models of a human body and for generation of computational models for both high fidelity and reduced order simulations. Establish a database of human body models (e.g., based on body scans), develop software tools for articulation of a human body (e.g., standing upright, sitting, leaning) and for generation of internal organs/tissues including skeletal, muscular, brain, lung, vascular, abdominal, etc. Develop a simulation framework integrating blast wave physics and human body biodynamics/biomechanics using existing software tools and evaluate its capabilities and limitations in modeling blast wave injury events. 
It is desirable that such a framework should enable both high fidelity and reduced order (fast running) simulations. Conduct model validation against existing published data on human (cadaver, dummy) body response to inertial, impact, shock tube and blast wave loading. Demonstrate the capability to generate human body models wearing personal protective armor (helmet, vest, boots) and equipment. Conduct computational analysis of the role of PPE in protection against blast injury. Using the simulation results and available experimental data, develop and demonstrate the capability to generate blast injury assessments and criteria for selected organs including: brain, lung, neck, groin, vascular and others. PHASE III DUAL USE APPLICATIONS: The data, software tools and results of this project will have immense potential application in military and civilian medicine. The US military will use such tools for development and evaluation of protective armor and equipment, for forensic analysis of blast events, development of blast dosimeters, diagnostics, and treatment of blast injury casualties. It will also be applicable for ergonomic evaluation of military equipment (cockpits, seats, vehicle safety, egress, etc.) and as a personal aid in soldier training. The technology developed in this SBIR project could also support various commercial applications such as automotive safety, sports medicine, rehabilitation after injury or surgery, and others. REFERENCES: 1. DePalma R., Burris DG., Champion HR., Hodgson MJ., (2005), Blast Injuries, N. Engl. J. Med, 352(13), 1335-1342, 2. Roberts JC., Merkle AC., Biermann PJ., Ward EE., Carkuff BG., Cain RP., O'Connor JV., (2007), Computational and experimental models of the human torso for non-penetrating ballistic impact, J Biomechanics, v40, 1, 125-136, 3. Elsayed N.M., and Atkins J.L., (2008), Explosion and Blast-Related Injuries: Effects of Explosion and Blast from Military Operations and Acts of Terrorism, Elsevier Academic Press 2008 4. 
Chavko M., Koller WA., Prusaczyk WK., McCarron RM., (2007), Measurement of blast wave by a miniature fiber optic pressure transducer in the rat brain, J. Neurosci. Meth., 159(2):277-81 5. Cheng, H., Rizer, A. L., and Obergefell, L. A., (1998), Articulated Total Body Model Version V: User's Manual, Human Effectiveness Directorate, Crew Survivability and Logistics Division, Wright-Patterson AFB, Dayton, OH, Report No. AFRL-HE-WP-TR-1998-0015. 6. Moore DF., Jerusalem A., Nyein M., Noels L., Jaffee MS., Radovitzky RA., (2009), Computational biology modeling of primary blast effects on the central nervous system. NeuroImage v47(S2):T1020, 2009.
DHP13-011: Visual Evoked Potentials for TBI Diagnosis
Description: OBJECTIVE: Investigate and validate the capability of using visual evoked potentials as a method to aid in the diagnosis of mild traumatic brain injury. DESCRIPTION: In Iraq and Afghanistan, 12% of all warriors with battle injuries suffer from traumatic brain injury (TBI). The military's need to diagnose and triage TBI casualties is described in the Theater Combat Casualty Care Initial Capability Document dated October 2007 and the Joint Force Health Protection Joint Casualty Management Joint Capabilities Document dated 11 June 2008. There is widespread agreement among subject matter experts that there is no clinically-validated predicate device for the diagnosis of mild TBI (mTBI). In addition, there is a need for the screening and management of treatment for acute and chronic moderate to severe TBI. Evoked potentials measure the brain's electrical activity in response to stimulation and are generally used for studying higher cortical functions. Although they show promise, they are not routinely used in clinical neurology, mainly due to advances in imaging technology, especially magnetic resonance imaging (MRI). An MRI typically gives more accurate information with regards to structural abnormalities but is not easily used in an austere environment, is more expensive, and does not have the capability to provide a rapid assessment of TBI. Visual evoked potentials (VEP) test the function of the visual pathway from the retina to the occipital cortex and are used clinically for conditions ranging from glaucoma to classic and common migraines. The latency of visual evoked potentials has also been shown to have a positive correlation to intracranial pressure (ICP). An increase in ICP is a common secondary complication of TBI. Therefore, there is potential for an easily portable device that can aid in the diagnosis of TBI through visual evoked potentials. 
It is the goal of this topic to explore the feasibility of developing a device that can meet the military's need for a TBI diagnostic through the use of visual evoked potentials. The device should take into account the potential environment of use and be rugged enough to withstand extremes in temperature, shock and vibration, and humidity. The device should be sensitive and specific enough to provide a clear determination of TBI as characterized by a Glasgow Coma Scale of 9-15 and be easily used (i.e., a clear output indicator such as yes/no, stoplight, etc.). PHASE I: This Phase will demonstrate the feasibility of producing a device capable of aiding in the diagnosis of TBI by using VEP. This phase should include a plan for development, clinical validation, regulatory strategy, concept of the proposed device, and a literature search to support feasibility. PHASE II: Develop a working prototype based on Phase I work suitable for FDA clinical trials. Identify clinical sites for validation and primary investigators and have preliminary talks with FDA regarding the regulatory path (at least pre-IDE, preferably IDE). Finalize the pivotal trial protocol. PHASE III DUAL USE APPLICATIONS: There are clear commercial opportunities for a device that can correlate an easily measured signal to brain injury. The major military application is a device to assess physiological impairment after exposure to blast or injury. The major civilian application is to assess physiological impairment after exposure to head injury during sporting events or other areas where concussions may be prevalent. Phase III of this project will look to conduct pivotal trials in these populations to show efficacy of the device. REFERENCES: 1. Gatz, Michael, "The Neurophysiology of Brain Injury," Clinical Neurophysiology 115, 4-18 (January 2004) 2. Evans, Andrew B and Boggs, Jane G, "Clinical Utility of Evoked Potentials," Medscape Reference. 
(February 2012), http://emedicine.medscape.com/article/1137451-overview#a1 3. Padula, WV; Argyris, S; and Ray, J, "Visual evoked potentials (VEP) evaluating treatment for post-trauma vision syndrome (PTVS) in patients with traumatic brain injury (TBI)," Brain Inj. 8(4), 125-133 (1994)
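The signal-processing core such a device would need (average stimulus-locked epochs, locate the dominant positive peak, and compare its latency to a normative cutoff to drive the required clear yes/no output) can be sketched as below. The 115 ms cutoff, the 80-140 ms search window, and the function names are illustrative assumptions only, not clinically validated values from this topic.

```python
# Hedged sketch of VEP latency screening: average epochs, find the dominant
# positive peak (analogous to the P100 component), and threshold its latency.
# All numeric parameters are illustrative placeholders.

def peak_latency_ms(epochs, fs_hz=1000, window_ms=(80, 140)):
    """Average stimulus-locked epochs (equal-length lists of samples,
    stimulus at t=0) and return the latency in ms of the largest
    positive deflection inside the search window."""
    n = len(epochs[0])
    avg = [sum(e[i] for e in epochs) / len(epochs) for i in range(n)]
    lo = int(window_ms[0] * fs_hz / 1000)
    hi = int(window_ms[1] * fs_hz / 1000)
    peak_index = max(range(lo, hi), key=lambda i: avg[i])
    return peak_index * 1000 / fs_hz

def screen(epochs, cutoff_ms=115.0):
    """Reduce the latency measurement to the clear output indicator the
    topic calls for: a simple refer/normal flag."""
    return "REFER" if peak_latency_ms(epochs) > cutoff_ms else "NORMAL"
```

A fielded device would of course add artifact rejection, normative latency tables, and the ruggedization requirements described above; this sketch only shows why a latency threshold maps naturally onto a stoplight-style indicator.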
DHP13-012: Immediate Application Cranioplasty During Decompressive Craniectomy for Head Injuries
Description: OBJECTIVE: To develop a Cranioplasty construct for immediate application during Decompressive Craniectomy for relief of increased intracranial pressure refractory to medical management. The construct spares the costs of a delayed Cranioplasty. DESCRIPTION: Decompressive Craniectomies (DC), the neurosurgical emergency procedures that remove part of the skull to relieve brain pressure from traumatic injuries, have been widely employed during the OIF/OEF period to ensure wartime head injury patient survival for long transport to tertiary medical treatment facilities (1, 2). The skull defect that results while awaiting a second-staged procedure to cover this defect (Cranioplasty) has given rise to an intervening disorder, the Syndrome of the Trephined (ST), which causes behavioral and motor deficits in patients and compromises their ability to carry out rehabilitative treatment, thus prolonging their hospital stay (3-5). The current surgical practice of waiting for an average of 6 months before subjecting the craniectomized patient to Cranioplasty was influenced by Rish and co-workers (6), who, in a 1979 publication on a large series of open head injuries, recommended a minimum waiting period of 1 year to avoid infections associated with Cranioplasties. During the 2002-08 period, the Walter Reed Army Medical Center (WRAMC)-Neurosurgery compared the craniectomy-cranioplasty interval with the incidence of CNS infection to validate the conclusions made in the late '70s by Rish, et al. that cranioplasties done earlier than 6 months were prone to infection. Of the 188 craniectomies done by WRAMC-Neurosurgery during the 2002-08 period, 144 cases qualified as having reliable clinical data to determine their respective craniectomy-cranioplasty interval periods, which ranged from 11 days to 36 months (extent shortened to 19 months in the depicted graph). CNS infection cases (n=25) all had documented positive cultures taken from the craniectomy site. 
The data (7) showed a period of lesser infection incidence at the early end of the interval period range (<3-month interval). Of the 23 patients who underwent cranioplasty <3 months from craniectomy in our series, only one CNS infected case was detected. The isolated infection case, showing Streptococcus viridans growth in the CSF and occurring within the <3-month interval period (2 months post-Craniectomy), was that of an allied NATO soldier referred from a non-US military health facility. Although smaller than the 1030-patient review cited by Rish, et al. for penetrating head injuries, results of the Craniectomy-Cranioplasty interval analysis for our series include both penetrating and closed head injuries and favor a paradigm shift from the current practice of staging late (>6 months - 1 year) cranioplasties for craniectomized head injuries. Previously cited publications (8-14) support important predictors for CNS infections other than timing of the Cranioplasty procedure and, more importantly, signify that it is not the waiting period for the Cranioplasty procedure per se, but the interruption of wound healing, as seen in staged or operative revisions, that increases the infection risk. Since the least wound healing interruption is accomplished during the initial craniectomy, the development of a cranioplasty construct suited for simultaneous application during Decompressive Craniectomy spares the patient from a second surgery. It is to be noted that the current commercially available cranioplasty constructs are all designed for a second-staged surgical procedure, since they do not address the brain swelling found with Decompressive Craniectomies of acute head injuries. A semi-rigid cranioplasty construct which conforms to the expansion of the brain and its outer layer (dura) during brain edema and adapts to its normal contour upon its resolution will fill the technical gap needed to make this one-staged procedure viable. 
PHASE I: In Phase I, the performer will demonstrate the feasibility of the Cranioplasty construct design by computer simulation and 3D model testing. First, by noting the different time points of brain swelling on neuroimaging, from its initiation to its cessation, from MRI images of head injury patients who have undergone Decompressive Craniectomy, a temporal profile of the brain swelling event will be created. Time gaps found between the available MRI images will be filled through a computer process called rendering. Secondly, the cranioplasty design, which would be based on actual post-craniectomy skull defects, will be simulated as well, as it interacts with the event of brain swelling. This can be done by utilizing available software (MIMICS) used for configuring neuroimaging (MRI, CT) into viewable and movable 3D images. These digital images will also be transformed, through a process known as stereolithography, into 3D plastic/rubber models of a phantom skull and brain which can be mechanically observed as they interact with the event of brain swelling. A brain phantom setup will be created using a gelatin-filled elastic capsule containing two inner tubes with inflatable tips, each constructed to the known 12% capacities of blood vessels and of CSF relative to the total brain volume, fitting the intracranial cavity of the skull models (15). Based on 100 simulation trials to be carried out interplaying the cerebral edema event with the cranioplasty design both on the virtual and the 3D model levels, the outcome will iterate to an alpha prototype system, the Cranioplasty Prototype Construct (CPC), which will be deliverable by the end of Phase I. Deliverables will also include a device development and testing plan for Phase II, to include drafts of research protocols required for animal and possible human testing. 
PHASE II: Phase II will focus on the implementation of animal studies to answer the question: What is the safety and efficacy profile of the Cranioplasty Construct Prototype among mammals? A related question, whether tissue ingrowth of an embedded cranioplasty implant will impede its migration during the process of cerebral edema resolution, will also be addressed in this Phase. The performer will implement a TBI animal survival model to approximate the known 3-month period of healing bone and will document bone growth in the post-Cranioplasty test animal survivor through serial CT with 3D bone-reconstructed imaging. Versions of currently used Cranioplasty materials (polymethyl methacrylate, titanium) using the prototype design will be compared and analyzed. The performer will also interact with FDA to develop a regulatory compliance plan. The performer will also develop a rational commercialization plan for addressing both the military and civilian markets. The performer will also deliver a refined beta prototype of the Cranioplasty Construct and system characterization data to military subject matter experts by the conclusion of Phase II. The performer will also apply for Humanitarian Device Exemption (HDE) status from the FDA and submit a study protocol to the Walter Reed National Military Medical Center (WRNMMC) IRB for selected Head Injury subjects requiring Decompressive Craniectomy whose condition will not tolerate a second-staged Cranioplasty. Human subjects approval will also be required from the US Army Medical Research and Materiel Command's Office of Research Protections. PHASE III DUAL USE APPLICATIONS: The further development of the Phase II-developed beta Cranioplasty prototype will be extended to clinical testing in Phase III. 
The performer, working with industrial partner(s) and/or industry-related funding, will seek out collaborations with neurosurgical institutions in both the civilian and military sectors to consolidate outcome studies using the Cranioplasty Prototype in statistically powerful numbers of patients. The preliminary outcomes for prior patient recruitment under the FDA and IRB-approved HDE status will be documented and published. The performer will deliver a plan on how FDA approval will be achieved utilizing current Good Manufacturing Practices (cGMP). Quality Management and device applications will be completed and executed. In the military sphere, the prototype will be made available for Role 2/3 (Forward Surgical Team/Combat Support Hospital) and higher Medical Treatment Facilities. Deliverables will also include clear, concise and standardized brochures and device guidelines that can be reliably followed by at least a minimally trained healthcare provider at the level of medic/operating room technician and above. REFERENCES: 1. Bell RS, Mossop CM, Dirks MS, Stephens FL, Mulligan L, Ecker R, Neal CJ, Kumar A, Tigno T, Armonda RA, "Early Decompressive Craniectomy for Severe Penetrating and Closed Head Injury During Wartime", Neurosurgical Focus 28:5, May 2010 2. Bell R, Armonda RA, "Severe Traumatic Brain Injury: Evolution and Current Surgical Management", ePublication, Medscape, June 2008 3. Grantham E, Landis H, "Cranioplasty and the post-traumatic syndrome," J Neurosurg 5:19-22, 1947 4. Yamaura A, Makino H, "Neurological Deficits in the Presence of the Sinking Flap following Decompressive Craniectomy," Neurol Med Chir 17:43-53, 1977 5. Dujovny M, Agner C, Aviles A, "Syndrome of the Trephined: Theory and Facts", Crit Rev Neurosurg 9:271-278, 1999. 6. Rish BL, Dillon JD, Meirowsky AM, Caveness WF, Mohr JP, Kistler JP, et al., "Cranioplasty: A Review of 1030 Cases of Penetrating Head Injury", Neurosurgery 4:381-385, 1979 7. 
Tigno TA, Armonda RA, "Scientific Reports and Accomplishments for Cranioplasty for the Syndrome of the Trephined", Part I, 2012: https://extranet.aro.army.mil/progress reports/ 8. Cheng YK, Weng HH, Yang JT, Lee MH, Wang TC, Chang CN, "Factors Affecting Graft Infection After Cranioplasty", J Clin Neurosci 15:1115-1119, 2008 9. Carvi Y, Nievas MN, Hollerhage HG, "Early Combined Cranioplasty and Programmable Shunt in Patients with Skull Bone Defects and CSF Circulation Disorders", Neurol Res 28:139-144, 2006 10. Liang W, Xiaofeng Y, Weiguo L, Gang S, Xuesheng Z, Fei C, et al., "Cranioplasty of Large Cranial Defect at an Early Stage After Decompressive Craniectomy Performed for Severe Head Trauma", J Craniofac Surg 18:526-532, 2007 11. Gooch MR, Gin GE, Kenning TJ, "Complications of Cranioplasty Following Decompressive Craniectomy: Analysis of 62 Cases", Neurosurg Focus 26(6):E9, 2009 12. Korinek AM, "Risk Factors for Neurosurgical Site Infections After Craniotomy: A Prospective Multicenter Study of 1944 Patients. The French Study Group of Neurosurgical Infections, the SEHP and the C-CLIN Paris-Nord. Service Epidemiologie Hygiene et Prevention," Neurosurgery 41:1073-1079, 1997 13. Stephens FL, Mossop CM, Bell RS, Tigno TA, Rosner MK, Kumar A, Moores L, Armonda RA, "Cranioplasty Complications following Wartime Decompressive Craniectomy", Neurosurgical Focus/Journal of Neurosurgery, Volume 28, Number 5, May 2010 14. Matsuno A, Tanaka H, et al., "Analyses of the factors influencing bone graft infection after delayed cranioplasty," Acta Neurochir (Wien) 148(5): 535-40; discussion 540, 2006 15. Mokri B, "The Monro-Kellie Hypothesis: Applications in CSF Volume Depletion," Neurology 56(12): 1746-8, 2001
DHP13-013: A Point-of-Care Device for Diagnosis of Platelet Injury in Trauma Patients
Description: OBJECTIVE: Develop a portable, point-of-care device that directly measures the platelet contribution to clot characteristics. DESCRIPTION: Hemorrhage associated with trauma is one of the leading causes of preventable death on the modern battlefield. Posttraumatic hemostasis is often impaired by the rapid onset of coagulopathy, which has been observed in up to 36% of trauma patients. Trauma-associated coagulopathy is indiscriminate and widely recognized to increase mortality and morbidity. A decrease in clot strength as measured by thrombelastography (TEG) has been identified as the earliest and most sensitive predictor of blood transfusion requirement and mortality in trauma patients. Clot strength, as defined by maximal amplitude by TEG and maximal clot firmness by rotational thromboelastometry, is determined by the integrity of the fibrin network and the contractile force applied to this network by platelets. Unfortunately, TEG devices are neither rapid nor portable, limiting their usefulness in reversing the rapidly evolving downward spiral that coagulopathic trauma patients face. This time lag to begin treatment of patients with severe trauma can mean the difference between life and death. Studies indicate that platelets become inhibited early during trauma, as identified by decreases in both light aggregation and clot strength by TEG platelet mapping. These processes are governed by the interaction between the platelet glycoprotein (GP) IIb/IIIa receptor and its specific binding to fibrin during clot formation and consolidation. However, the specific contribution of platelet GPIIb/IIIa-fibrin interactions to hemostasis during trauma is an important unanswered question. PHASE I: Establish the feasibility and the technical and scientific merit of the platelet diagnostic technology. 
The feasibility studies are to establish platelet microforce measurement as a new tool for the monitoring of antiplatelet therapy and as an approach to define bleeding risk in trauma. These studies will also begin to define individual platelet receptor contributions to hemostasis during the critical early phase of traumatic injury and fluid resuscitation. PHASE II: Based on the Phase I feasibility study, develop, demonstrate and validate a laboratory prototype that enables direct and independent measurement of platelet microforces, thus providing more rapid and clinically actionable information. PHASE III DUAL USE APPLICATIONS: The goal of the Phase III effort is transition to field operational use. Capability will be further matured to support/augment military operational medicine as well as transition to general, non-military emergency medicine. REFERENCES: 1. Hess et al. (2008), Department of Pathology, University of Maryland Medical Center, Baltimore, Maryland 21201, USA. 2. Arinsburg et al. (2012), "Determination of human platelet antigen typing by molecular methods: Importance in diagnosis and early treatment." 3. Fries D and Martini WZ (2010), "Role of fibrinogen in trauma-induced coagulopathy," Department of General and Surgical Critical Care Medicine, Innsbruck Medical University, Anichstrasse 35, Innsbruck, Austria. 4. Michael M. Sawyer, "Trauma and Thrombelastography", Denver Health and Hospital, 777 Bannock St, Denver, CO 80204, USA
DHP13-014: Tailored Wound Dressing for the Treatment of Burns
Description: OBJECTIVE: Develop a tailored wound dressing for the treatment of burn or severely damaged wounds which uses a unique protective coating. This dressing should be composed of a nano-thin layer of carbon deposited on a highly permeable silicone film. This wound dressing must allow the body to naturally grow new cells at the wound site and, at the same time, prevent the growth of bacteria. DESCRIPTION: Burns are among the most painful and debilitating battlefield wounds and often turn deadly if infection sets in. Since 2003, more than 8,000 U.S. Soldiers, Sailors, Airmen, and Marines have sustained injuries from hostile action, of whom approximately 700 sustained burns, inhalation injury, and associated trauma in Iraq or Afghanistan severe enough to warrant burn center care. Each of these patients was provided initial care at U.S. military field hospitals and then transported by air from the theater of operations to Landstuhl Regional Medical Center in Germany for further assessment and stabilization. Combat burn casualties were then flown more than 5,300 miles to San Antonio for treatment of their burns and other injuries. The greatest advantage of the desired wound dressing would be protecting a wounded combatant from further deterioration of the wound site. It would essentially allow the dressing to function as a temporary skin replacement. Initially stabilizing the patient as early as possible in the theater of battle would increase the survivability rate of persons with severe burns and allow for a much more comfortable transport to a medical facility. This technology, if successful, would also provide a speedier recovery from the injuries. PHASE I: Conduct a feasibility study of the effectiveness of nanomaterial-coated silicone film for the treatment of burn and severely damaged wounds. 
The feasibility studies are to determine if the nanomaterial is bacteria-proof and provides controlled oxygen transport through the dressing to assist in cell growth. PHASE II: Based on the Phase I feasibility study, develop, demonstrate and validate a laboratory prototype that enables direct and independent assessment of the effectiveness of nanomaterial-coated silicone film for the treatment of burn and severely damaged wounds. With proper approval, during this phase, clinical trials will be conducted to demonstrate that the nanomaterial is bacteria-proof and in fact does provide controlled oxygen transport through the dressing. Additionally, these trials should demonstrate that water vapor permeability is controlled to prevent loss of moisture from the body through the wound. PHASE III DUAL USE APPLICATIONS: The goal of the Phase III effort is transition to field operational use. Capability will be further matured to support/augment military expeditionary medicine as well as transition to general, non-military emergency medicine. REFERENCES: 1. Renz van M., "Care of the Military Burn Trauma Casualty", San Antonio Medicine.Com, May 2008. 2. Sen, Soman MD; Greenhalgh, David MD; Palmieri, Tina MD., Journal of Burn Care & Research, November/December 2010 - Volume 31 - Issue 6 - pp 836-848.
DHP13-015: A Universal Device for Performing Cricothyrotomies
Description: OBJECTIVE: To develop an all-in-one universal device for performing cricothyrotomies to more effectively manage airway trauma on the battlefield. DESCRIPTION: A cricothyrotomy (or cricothyroidotomy) is an emergency procedure to establish an airway in a patient when intubation attempts are unsuccessful due to acute injury to the head and/or neck. Establishing an airway and restoring oxygen flow to the brain is essential and is a time-sensitive process. Cricothyrotomy convenience kits are often assembled for use in pre-hospital situations, and contain the necessary instruments to perform the procedure. However, these kits are not standardized and vary from company to company. In November 2010, the Military Health System's Committee on Tactical Combat Casualty Care (CoTCCC) published a list of preferred features for a surgical airway kit. Numerous publications have demonstrated that battlefield cricothyrotomies have been largely unsuccessful in Operation Enduring Freedom and Operation Iraqi Freedom. Due to the severity of injuries sustained in these environments, the procedures are generally performed by medics at the point of injury, though some were performed by physicians or physician assistants. TCCC recommends early consideration of cricothyrotomies because many medics are not experienced enough to perform successful intubation procedures. However, when looking at injuries and deaths in Iraq and Afghanistan, the procedures were more successful when performed by a physician or physician assistant than when performed by a medic, but even then two-thirds of the patients died. Though it is hard to directly link a patient's death with a cricothyrotomy failure, the statistics reveal procedure, training, and device problems. Because airway compromise is the third leading cause of preventable death on the battlefield, it is imperative to identify a solution for successful airway management.
Training inconsistencies and lack of continuous practice suggest that an all-in-one device would greatly improve the outcome of the procedure and reduce the number of instruments that need to be packaged. The universal device would address the issues found in battlefield cricothyrotomies and be safe and intuitive to use. PHASE I: Phase I would consist of designing schematics and diagrams for a universal cricothyrotomy device, and providing a working prototype. A literature search would demonstrate that the device designed would address the complications of battlefield procedures and provide feasibility data. This phase would also address a potential regulatory path for gaining FDA approval or clearance. PHASE II: Phase II would consist of developing, demonstrating and validating the prototype and implementing the plan for FDA approval or clearance. This would include performing pivotal trials and device testing. PHASE III DUAL USE APPLICATIONS: Phase III would consist of developing training methods and protocols for the new device and performing Army-relevant testing, such as environmental testing and user studies. REFERENCES: 1. Mabry, RL. "An Analysis of battlefield cricothyrotomy in Iraq and Afghanistan." J Spec Oper Med. 2012;12:17-23. 2. Bennet, Brad, et al. "Cricothyroidotomy Bottom-Up Training Review: Battlefield Lessons Learned." Military Medicine. 2011;11:1311.
DHP13-016: Development of Technologies that Address the Complex Architecture of the Face During the Treatment of Severe Facial Burn Injury
Description: OBJECTIVE: The objective of this effort is to develop new innovative technologies that address the complex architecture of the face to facilitate the treatment, effectiveness, recovery and outcomes from treatment for severe facial burn injury. DESCRIPTION: Approximately 450,000 burn injuries requiring medical treatment occur in the U.S. each year. Approximately 55% of the 45,000 acute hospitalization cases require admission to specialized burn units for treatment. Burn injuries also complicate approximately 5% to 10% of contemporary combat casualties not returned to duty within 72 hours. Approximately 77% of treated burn injuries sustained from combat explosions during the current armed conflicts involved the face. Such cases often exhibit spontaneous epithelialization that results in the formation of scars, which may subsequently lead to contracture. A contracture scar is a permanent tightening of the skin in which normal elastic connective tissue is replaced with inelastic fibrous tissue. The current unmet need when treating such severe burn injuries lies in scar prevention and improving current grafting technologies on the complex topography of the face. PHASE I: Conceptualize and design an innovative solution to facilitate wound healing and restore facial aesthetics following severe facial burn injuries. Such technologies should improve the application, retention, or performance of autografts, allografts, dermal equivalents or skin equivalents when applied to deep partial-thickness to full-thickness burns of the facial region, leading to improvement of the functional and cosmetic outcomes. The required Phase I deliverables will include: 1) a research design for engineering the proposed technology and 2) a preliminary prototype with limited testing to demonstrate in vitro proof-of-concept evidence (to be executed in Phase I).
Other supportive data resulting from in vivo proof-of-feasibility studies may also be provided during this 6-month, $100K (max) Phase I effort. PHASE II: The researcher shall design, develop, test, finalize and validate the practical implementation of the prototype technology that implements the Phase I methodology to prevent scar formation and facilitate wound healing at a burn injury site over this 2-year, $1.0M (max) effort. The researcher shall also describe in detail the transition plan for the Phase III effort. PHASE III DUAL USE APPLICATIONS: Plans for commercialization/technology transition and the regulatory pathway should be executed here and lead to FDA clearance/approval. They include: 1) identifying a relevant patient population for clinical testing to evaluate safety and efficacy and 2) GMP manufacturing of sufficient materials for evaluation. The small business should also provide a strategy to secure additional funding from non-SBIR government sources and/or the private sector to support these efforts. Military application: The desired therapy would give military practitioners an improved treatment option for severe facial burn injury. Commercial application: Healthcare professionals worldwide could utilize this product as a therapy meant to improve the standard of care presently available to burn patients. REFERENCES: 1. "Burn Incidents Fact Sheet," American Burn Association, 2011. http://www.ameriburn.org/resources_factsheet.php 2. "Burn Injuries," JAMA, October 28, 2009, Vol. 302, No. 16, p. 1828. http://jama.ama-assn.org/content/302/16/1828.full.pdf 3. "Burns," MedLinePlus. http://
DHP13-017: Assistive Technology Sensor Platform
Description: OBJECTIVE: Develop advanced sensor technologies that allow the prosthesis socket and/or prosthetic components to respond to signals from the residual limb based on sensing from within the socket at the residual limb interface. Develop the ability to place sensors comfortably, safely and unobtrusively within the intimate confines of the socket-limb interface. Design and build ruggedized, low-cost, lightweight, non-invasive, unobtrusive sensors and a nonproprietary (open) platform-based system architecture for component-based, device-agnostic sensing that will eventually collect and share data with any number of current and future prosthetic devices. While this topic may appear technology-centric, the ultimate objective is a comfortable, high-definition, high-fidelity, high-capability socket that gives the user a more comfortable fit while also enabling a much higher level of functionality and responsiveness from the prosthetic system. DESCRIPTION: Major limb amputations are among the most debilitating wounds sustained by those who survive a combat injury [1]. Recent military operations in Iraq and Afghanistan have resulted in a significant increase in the number of traumatic amputees receiving care in the military medical system [2,3]. Additionally, based on current population estimates for the United States (US), there are over 1.5 million people living with limb loss [4], an incidence of 1 in 200 persons. Given our aging population and increases in lifestyle diseases, it has been projected that the number of people living with limb loss will more than double by the year 2050 to 3.6 million, primarily as a result of amputation secondary to dysvascular disease [5]. People with limb loss or limb deficiency use prosthetic and orthotic devices to regain function and mobility and restore appearance.
Prostheses and orthoses are externally applied devices used to replace wholly, or in part, an absent or deficient limb segment (ISO 8549-1, 1989). Prosthetic sockets form the interface between the residual limb and the prosthesis and are important for the transmission of forces and distribution of pressure in persons with amputation [6]. Conformity of the socket to the residual limb is extremely important, leaving negligible room for sensors to be inserted [6]. Existing sensor technologies are often too bulky and rigid to be accommodated comfortably and safely within the intimate fit of the prosthetic socket. Additionally, they are often limited to sensing only one variable. The challenge is to develop a sensor platform that is flexible, unobtrusive, and can be configured with almost any existing sensor. Advanced prosthetic technologies that respond to physiologic information from the residual limb require improved sensor platforms to become realistic, everyday solutions for the functional restoration of persons with amputation. An improved sensor platform could also be used to create monitoring systems for improved residual limb health and increase knowledge of the interface environment, which should spur further developments in prosthetic systems. An improved and unobtrusive sensor platform could also provide exteroceptive data, enabling the device or system to respond accordingly to external conditions. The Assistive Technology Sensor Platform will apply existing and/or innovative sensing technology for measurement of modalities such as pressure, shear, temperature, moisture, electromyography (EMG), limb volume, etc. that would be required for extended daily usage. The prosthetic sensing platform will enable integration with new sensors and new prosthetic components without undue redesign of either. The envisioned prosthesis-agnostic nonproprietary (open) sensing platform should comprise at least the following components: 
- Sensors: Development of sensors suitable for guiding operation of a prosthesis, such as pressure, strain and temperature sensors. Sensors must operate effectively in the hot, moist environment of a prosthetic socket, in close proximity to human tissue, for extended periods of time. 
- Open Integration Platform: Development of a nonproprietary (open) architecture for sensor and device integration, data processing and pattern recognition for prosthetic devices, such that third-party prosthesis and sensor developers can more easily integrate and test products for compliance with the platform. 
The continuous capture, storage and transmission of sensor data will be critical to the operation of advanced prosthetic technologies. Requirements of the sensor platform system include: 
- Unobtrusiveness, comfort and durability for extended use in a prosthetic socket in close proximity to tissue; 
- Ability to integrate component sensors to measure the properties of interest in prosthetic applications (pressure, shear, temperature, moisture, blood flow, tissue oxygenation, EMG, limb volume, etc.); 
- Ability to model data input characteristics of sensors and sensor arrays; 
- Fidelity, accuracy and spatial/temporal resolution; 
- Durability (especially within the socket environment); 
- A means of managing and delivering power to the sensor array. It may be assumed that the prosthesis will have its own power source that can be used by the sensor system (e.g., coupled with scavenger technology or inductive charging through liners and sockets); 
- A means of buffering and communicating data with the prosthesis at rates and bandwidths sufficient to support dynamic control algorithms; 
- Evidence of a scalable path to affordability and reasonable replacement costs for long-term use.
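To illustrate the kind of device-agnostic integration the open platform calls for, the following minimal Python sketch (all class and method names are hypothetical, not part of the solicitation) shows one way sensors of arbitrary modality might register with a common hub that publishes timestamped, calibrated readings to any subscribing prosthetic controller:

```python
import time
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class Reading:
    modality: str    # e.g. "pressure", "shear", "temperature"
    channel: int     # position within a spatially distributed array
    value: float     # calibrated value in SI units
    timestamp: float # seconds since the epoch


class Sensor(ABC):
    """Minimal contract a third-party sensor must implement to join the platform."""

    modality: str

    @abstractmethod
    def read(self) -> list:
        """Return current calibrated readings for all channels."""


class MockPressureSensor(Sensor):
    """Stand-in for a real intra-socket pressure array (values are fabricated)."""

    modality = "pressure"

    def __init__(self, channels: int = 4):
        self.channels = channels

    def read(self) -> list:
        now = time.time()
        # A real implementation would sample hardware and apply calibration here.
        return [Reading("pressure", ch, 12.5, now) for ch in range(self.channels)]


class SensorPlatform:
    """Device-agnostic hub: any Sensor can register; any consumer can subscribe."""

    def __init__(self):
        self.sensors = []
        self.subscribers = []

    def register(self, sensor: Sensor) -> None:
        self.sensors.append(sensor)

    def subscribe(self, callback) -> None:
        # A prosthetic control loop or limb-health monitor would subscribe here.
        self.subscribers.append(callback)

    def poll(self) -> None:
        # Fan each reading out to every subscriber, regardless of sensor vendor.
        for sensor in self.sensors:
            for reading in sensor.read():
                for cb in self.subscribers:
                    cb(reading)


platform = SensorPlatform()
platform.register(MockPressureSensor(channels=2))
received = []
platform.subscribe(received.append)
platform.poll()
print(len(received), received[0].modality)  # 2 pressure
```

The point of the sketch is that neither the hub nor the subscribers depend on any particular sensor vendor: a new modality joins the platform by implementing the small `Sensor` contract, which mirrors the solicitation's goal of third-party integration without undue redesign.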
PHASE I: Develop a sensor prototype and an overall platform-based system architecture design that includes specification of sensor- and prosthesis-agnostic data management, calibration, signal processing, data storage, monitoring, network connections and power management. Demonstrate operation and data communication from candidate sensors under field-relevant laboratory conditions in three or more independent sensing modalities (pressure, shear, temperature, moisture, EMG, limb volume, etc.) needed to supply data for a prosthetic control system. PHASE II: Develop and demonstrate a prototype sensing system for human testing with prototype or commercially available test prostheses. Demonstrate operation of closed-loop sensor management under field-relevant conditions in six or more independent sensing modalities (pressure, shear, temperature, moisture, blood flow, tissue oxygenation, EMG, limb volume, etc.) in a spatially-distributed array, needed to supply data for a prosthetic control system. Demonstrate rapid integration of one or more third-party or research sensors. Conduct testing to demonstrate feasibility over extended operating conditions. Demonstrate a plan for FDA approval. Across similar capabilities, demonstrate that this technology offers a greater level of comfort than traditional intra-socket sensing technologies. PHASE III DUAL USE APPLICATIONS: Demonstrate a path toward scalability and transition of a nonproprietary platform-based sensing architecture for integration with current and future prosthetic devices. Propose a methodology for application, use, calibration, maintenance and replacement of sensing devices for continuous operation of prosthetic systems over long periods of time, with estimations of lifecycle costs. Develop hardware- and software-based development kits suitable for customizing third-party products, while maintaining and prioritizing patient comfort and functionality. REFERENCES: [1] L. Stansbury, et al., "Amputations in U.S.
Military Personnel in the Current Conflicts in Afghanistan and Iraq," J Orthop Trauma, 22(1), 43-46 (2008). [2] P. Pasquina, K. Fitzpatrick, "The Walter Reed Experience: Current Issues in the Care of the Traumatic Amputee," Journal of Prosthetics and Orthotics, 18(Proceedings 6), 119-122 (2006). [3] H. Fischer, "U.S. Military Casualty Statistics: Operation New Dawn, Operation Iraqi Freedom, and Operation Enduring Freedom," Congressional Research Service Report for Congress, September 28, 2010. http://www.fas.org/sgp/crs/natsec/RS22452.pdf [4] P. Adams, G. Hendershot, and M. Marano, "Current estimates from the National Health Interview Survey," National Center for Health Statistics (1996 and 1999). [5] K. Ziegler-Graham, E.J. MacKenzie, P.L. Ephraim, T.G. Travison, R. Brookmeyer, "Estimating the prevalence of limb loss in the United States: 2005 to 2050," Archives of Physical Medicine and Rehabilitation, 89, 422-9 (2008). [6] A.F. Mak, M. Zhang, and D.A. Boone, "State-of-the-art research in lower-limb prosthetic biomechanics-socket interface: a review," Journal of Rehabilitation Research and Development, 38(2), 161-74 (2001).