
Automated Encounter Documentation and Data-Driven Decision Support Systems



OBJECTIVE: The objective of this topic is to identify and prototype an algorithm for combat casualty care scene interpretation using sensing modalities such as computer vision, with the goals of automating medical treatment documentation and providing inputs to data-driven decision support systems that offer diagnosis and treatment recommendations to combat medics during prolonged field care. The algorithm shall be platform agnostic and capable of recognizing interventions performed by the medic, both to automate patient documentation and to allow decision support systems to make inferences regarding the course of treatment. To facilitate immediate access in denied, intermittent, and low-bandwidth (DIL) communications environments, capabilities should be resident on a local device and able to operate offline. The system shall therefore be compatible with current and future program of record systems such as Nett Warrior and the Integrated Visual Augmentation System (IVAS). Systems that can ‘perceive’ medic-patient interactions will enable truly autonomous data collection with minimal interaction from the end user, and would further support the development of robotic autonomous medical systems. The resulting reduction in task and cognitive burden will allow medical personnel to focus on the operational mission in support of Soldier Lethality.
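As a non-authoritative illustration of the concept above, an offline, on-device perception pipeline might map model detections of medic interventions to structured documentation records. Everything in this sketch is hypothetical: the intervention labels, the confidence threshold, and the `classify_frame` model stub stand in for whatever sensing and inference approach an offeror proposes.

```python
import json
import time
from dataclasses import dataclass, asdict

# Hypothetical intervention labels, loosely drawn from the 68W tasks
# named later in this topic.
INTERVENTIONS = ["apply_tourniquet", "bandage_open_wound", "needle_chest_decompression"]

@dataclass
class EncounterEvent:
    """One automatically documented medic-patient interaction."""
    timestamp: float
    intervention: str
    confidence: float

def classify_frame(frame):
    """Stub for an on-device vision model (e.g., a quantized action
    recognizer). Returns (intervention_label, confidence). Hypothetical:
    a real system would run offline inference here."""
    return "apply_tourniquet", 0.92

def document_encounter(frames, threshold=0.85):
    """Run sensor frames through the perception model and emit
    structured, hands-free documentation suitable for local storage
    and later forwarding to the patient's medical record."""
    events = []
    for frame in frames:
        intervention, conf = classify_frame(frame)
        if conf >= threshold and intervention in INTERVENTIONS:
            events.append(EncounterEvent(time.time(), intervention, conf))
    return json.dumps([asdict(e) for e in events])

record = document_encounter(frames=[object()])
print(record)
```

The key property the topic asks for is visible in the shape of the sketch: the medic never touches the device; documentation falls out of perception.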

DESCRIPTION: Future combat will likely involve greater dispersion and near isolation over great distances, necessitating units that are more self-sufficient and less dependent on logistical and other support units. The potential for delayed medical evacuation due to anti-access and area-denial challenges poses a difficult dilemma for combat commanders with wounded, sick, or otherwise incapacitated personnel, and will likely result in periods of prolonged field care (PFC) near the sites of injury pending evacuation windows of opportunity. Such scenarios may require a few organic or attached combat medics to deal with many casualties, minimizing the time spent with each casualty. The DoD and civilian organizations are looking at intelligent systems to provide decision support systems (DSS), closed-loop care, and autonomous care capabilities to act as PFC force multipliers, enabling medics to handle more patients at the same time. However, these systems are heavily data driven. Current data capture systems require hand entry of data, which is both time consuming and distracts the medic from providing direct care; while recording patient encounter data (observations, interventions performed, and patient disposition) can wait, treatment cannot. Significant research efforts are therefore underway to provide hands-free data entry during PFC. Much of that work is aimed at speech input, which itself can distract from direct patient care, is limited to what the medic dictates, and is not conducive to noisy environments. The Army Medical Combat Developer has suggested that medical encounter data be captured by perception systems and interpreted with emerging AI techniques to enable real-time generation of input to on-site medical decision support systems, as well as to capture patient encounter data for posting to the soldier’s medical record and forwarding to the next role of care. For such a capability to work, the Automated Encounter Documentation capture would have to be automatic, completely hands free, and capable of working in the dark. Additionally, redundant data storage and analysis on both a local device and in a medical data cloud would be needed to facilitate reliable operation and near-real-time response from DSSs and autonomous care systems during DIL communications, as well as to ensure data is not lost if the medic’s End User Device (EUD) is lost or destroyed. The use of novel and multi-modal sensing methodologies is encouraged to automate the capture of a wider range of data elements and to increase data capture accuracy and reliability.
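The redundant local-plus-cloud storage requirement above amounts to a local-first, store-and-forward pattern. A minimal sketch, under assumed details (the JSONL file layout and the stubbed `upload` transport are hypothetical, not specified by this topic):

```python
import json
import os
import tempfile

class EncounterStore:
    """Local-first store: every record is written to on-device storage
    first, then opportunistically forwarded when a link is available,
    so no data is lost during DIL communications outages or if the
    EUD's copy must later be reconciled with the cloud."""

    def __init__(self, path):
        self.path = path
        self.pending = []  # records not yet forwarded to the cloud

    def record(self, event: dict):
        # Append to local storage first; this survives loss of connectivity.
        with open(self.path, "a") as f:
            f.write(json.dumps(event) + "\n")
        self.pending.append(event)

    def sync(self, link_up: bool, upload):
        """Forward pending records via `upload` when the link is up;
        keep them queued otherwise. Returns the number forwarded."""
        if not link_up:
            return 0
        sent = 0
        while self.pending:
            upload(self.pending.pop(0))
            sent += 1
        return sent

path = os.path.join(tempfile.mkdtemp(), "encounters.jsonl")
store = EncounterStore(path)
store.record({"intervention": "apply_tourniquet"})
store.sync(link_up=False, upload=print)  # DIL: record stays queued locally
store.sync(link_up=True, upload=print)   # link restored: record forwarded
```

The design choice worth noting is that the local write always happens before the record enters the forwarding queue, which is what makes the on-device copy the authoritative one during communications denial.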

PHASE I: Based on proposed solutions, develop designs to prototype, integrate, and demonstrate a proof-of-concept, lightweight perception system that can generate input into mobile medical information systems and decision support systems aimed at providing diagnosis and treatment recommendations to combat medics during prolonged field care, or whose output can be stored and potentially forwarded to a government-operated medical data cloud for interpretation and upload to the patient’s medical record. The system shall be demonstrated using representative COTS sensors and hardware, with the eventual goal of integration into a system that can be carried by a single dismounted combatant along with other combat equipment. Identify potential datasets to be used for machine learning strategies. Produce a system design, including analyses of alternatives for components to be used for prototyping and demonstration during Phase II. Initiate interoperability and integration plans for future hardware implementations using DoD programs such as the Army’s Integrated Visual Augmentation System (IVAS).

PHASE II: From Phase I work, develop and demonstrate a perception system that can generate input into mobile medical information systems and decision support systems aimed at providing diagnosis and treatment recommendations to combat medics during prolonged field care, or whose output can be stored and potentially forwarded to a government-operated medical data cloud for interpretation and upload to the patient’s medical record. To accommodate initial prototype software evaluations with soldiers in the field and/or for fielding consideration, the final system must be capable of being implemented on highly ruggedized, lightweight military End User Device (EUD) hardware. Develop integration plans for DoD programs such as the government-owned mobile medical information system. Demonstrate an operational prototype in a field exercise with medics/corpsmen as coordinated by US Army TATRC. The offeror shall define and document the regulatory strategy and provide a clear plan for how FDA clearance will be obtained. The offeror should plan for Phase III integration of the prototype capabilities with fielded Army or Joint systems.
Further develop the commercialization plans from the Phase I proposal for execution during Phase III, which may include exploring commercialization potential with civilian emergency medical service systems development and manufacturing companies. Seek partnerships within government and private industry for transition and commercialization of the production version of the product. Other important considerations for the system concept include:

1) If a separate battery is used, it should be easy and quick to replace in the field.

2) No new or proprietary display devices should be proposed; if a display is needed for the initial human-in-the-loop attended or tele-operated prototyping phases, it should be designed to use a standard military-issued Android End User Device (EUD) such as the Army Nett Warrior or SOCOM Android Tactical Assault Kit.

3) The system shall be designed with respect to existing and emerging medical device interoperability standards.

4) If intra-device communications are involved in the proposed prototype capability, Ultra-Wideband (UWB) communications technology (Ref 15-17) is the desired communications protocol in Phase II for connecting component technologies together and/or to tactical radios for remote teleoperations, since UWB is being actively pursued as a secure wireless technology with minimal electronic signature for Open Body Area Networks (OBAN) in combat environments. Other innovative solutions for providing secure short-range wireless communications in a tactical environment will also be considered for system designs that require wireless intra-device communications.

5) The system should adhere to existing military standards based upon the research approach, such as compliance with existing IVAS standards if exploring a vision-based perception system.

6) This research is not designed to address the development of wireless capabilities; it is focused on the development of a perception system.

7) Speech-to-text capabilities will not be considered.

8) Perception systems include, but are not limited to, vision-based systems.

9) During the Phase II field exercise coordinated by TATRC, the perception system shall be employed by multiple medics in medical scenarios (sample size n=32) to validate the accuracy of the perception system. The medical scenarios will consist of medical procedures within a 68W Health Care Specialist’s Scope of Practice, as identified in Soldier Training Publication STP 8-68W13-SM-TG, dated 03 May 2013. The initial demonstration tasks are: 081-833-0065 Apply a Combat Application Tourniquet (C-A-T); 081-833-0068 Bandage an Open Wound; 081-833-0212 Apply a Pressure Dressing to an Open Wound; 081-833-0075 Perform a Needle Chest Decompression; 081-833-0033 Initiate an Intravenous Infusion; 081-833-0168 Insert a Chest Tube; and 081-833-0301 Administer an Intramuscular Injection, with notation of the medication given.

10) This SBIR topic is not intended to develop new mobile medical applications, but rather a capability that can be integrated into existing mobile military medical applications such as JOMIS, BATDOK, or MEDHUB. The vendor is not responsible for integration into existing mobile military medical applications. Proposals providing an approach that supports integration performed by, or in conjunction with, the appropriate government organization in follow-on Phase III spiral development activities are preferred.
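One way the Phase II accuracy validation could be scored is per-task recognition accuracy across the medic trials. This is a sketch of the bookkeeping only, not a government-specified scoring method; the trial-tuple representation and the example pass counts are hypothetical, while the task identifiers come from the STP 8-68W13-SM-TG list above.

```python
from collections import defaultdict

# The seven STP 8-68W13-SM-TG demonstration tasks named in this topic.
TASKS = [
    "081-833-0065",  # Apply a Combat Application Tourniquet (C-A-T)
    "081-833-0068",  # Bandage an Open Wound
    "081-833-0212",  # Apply a Pressure Dressing to an Open Wound
    "081-833-0075",  # Perform a Needle Chest Decompression
    "081-833-0033",  # Initiate an Intravenous Infusion
    "081-833-0168",  # Insert a Chest Tube
    "081-833-0301",  # Administer an Intramuscular Injection
]

def per_task_accuracy(trials):
    """trials: list of (task_id, recognized_correctly) tuples, e.g.
    one entry per medic per scenario. Returns accuracy per task."""
    hits = defaultdict(int)
    counts = defaultdict(int)
    for task, correct in trials:
        counts[task] += 1
        hits[task] += int(correct)
    return {t: hits[t] / counts[t] for t in counts}

# Hypothetical example: 32 medics each perform the tourniquet task,
# and the perception system recognizes 30 of the performances.
trials = [("081-833-0065", i < 30) for i in range(32)]
print(per_task_accuracy(trials))  # {'081-833-0065': 0.9375}
```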

PHASE III: Refine and execute the commercialization plan included in the Phase II proposal. The Phase III plan shall incorporate military service specifications from the U.S. Army, U.S. Air Force, U.S. Navy, and U.S. Marine Corps as they evolve, in order to meet their requirements for fielding; specifications will be provided in Phase II as they become available. The prototype system component may be integrated into a system-of-systems design and evaluated in an operational field environment such as a Marine Corps Limited Objective Experiment (LOE) or an Army Network Integration Exercise (NIE), depending on operational commitments. Present the product-ready capability as a candidate for spiral development fielding (even without completion of the entire system-of-systems objective) to applicable Department of Defense (Army, Navy/Marine Corps, and Air Force) Program Managers for Combat Casualty Care systems, along with government and civilian program managers for emergency, remote, and wilderness medicine within state and civilian health care organizations. Execute further commercialization and manufacturing through collaborative relationships with partners identified in Phase II.

KEYWORDS: Perception Systems, Artificial Intelligence, Image Processing, Automated Documentation, Prolonged Care, Electronic Medical Documentation, Electronic Health Record, Medical Robotics, Medical Autonomous Systems, Combat Casualty Care, Autonomous Enroute Care, HL7, UCS, Ultra-Wideband (UWB) Communications, DoD Genesis Health Record System, MCP (Medical Computing Capability)



REFERENCES:
1. Tomorrow’s Tech: The Automated Critical Care System. Naval Science and Technology Future Force Staff, 2014.
2. Müller, H., Michoux, N., Bandon, D., & Geissbuhler, A. (2004). A review of content-based image retrieval systems in medical applications—clinical benefits and future directions. International Journal of Medical Informatics, 73(1), 1-23.
3. Autonomous Patient Care Fact Sheet. Office of Naval Research, 2017.
4. Longheu, A., Carchiolo, V., & Malgeri, M. (2015). Medical Data Integration with SNOMED-CT and HL7. In: Rocha, A., Correia, A., Costanzo, S., & Reis, L. (eds.), New Contributions in Information Systems and Technologies. Advances in Intelligent Systems and Computing, vol. 353. Springer, Cham.
5. Nett Warrior (NW). US Army Acquisition Support Center.
6. Integrated Visual Augmentation System (IVAS).
