TECHNOLOGY AREA(S): Air Platform, Info Systems, Bio Medical
OBJECTIVE: Develop a decision support tool that translates and synthesizes cognitive and learning science, contained primarily in the academic literature, into information that is useful and usable by senior management when they make decisions about training and performance investments and acquisitions.
DESCRIPTION: Design a tool that provides investment and acquisition decision support to high-level management personnel. The design should handle the management and visualization of a complex, sophisticated, and changing knowledge base related to the impact of technologies and instructional strategies on human learning and performance, and should address:
- The differential impact of high- versus low-fidelity simulation-based training on decision making and on perceptual skill proficiency and retention
- The impact of real-time instructor process feedback versus computer-based outcome feedback on proficiency acquisition
- The impact of new technology and increased complexity in the operational platform on the practice time required to maintain proficiency
- The impact of changes to the deployment preparation timeline (e.g., a longer-than-expected platform maintenance period) on proficiency (e.g., of pilots, maintenance personnel, sonar specialists, etc.)
- The effect of reduced training flight hours during deployment on proficiency levels
A great deal of work, including past SBIR/STTR work, has been performed to develop methods and tools for finding patterns of information in the vast web of data collected and held by the military and intelligence communities (e.g., Boury-Brisset, A., 2004). The results of that work, and in particular the methodologies that underlie the resulting tools, may be generalizable to the challenge posed by this STTR topic: searching through tens to hundreds of thousands of research reports to find the most recent cognitive and learning science related to a given funding or acquisition decision. Further challenges are to extract meaningful patterns of results, assess each pattern's credibility, and determine how to treat conflicting results. These challenges may also benefit from that prior work for the military and intelligence communities.
The research base mined by the tool should include relevant research results from the educational and cognitive psychology literatures, including topics such as transfer-appropriate processing, levels of processing, automaticity, skill decay, the development of cognitive efficiencies as expertise is acquired, and the development of the metacognitive skill needed to use those efficiencies appropriately. Design a means by which the tool can give its users insight into chief findings; uncertainty and variety in the relevant research literature; research gaps; and impacts on the acquisition decision (e.g., learning benefits weighed against short-term cost savings). Design an adaptable architecture and user-interaction framework that link the different elements of science to the types of questions outlined above. Decision makers should be given the means and support to update the tool's knowledge base at an appropriate frequency and to choose from a variety of information organization and visualization schemes. Because the research base changes over time, a decision support tool that draws from it will need to be adaptable by the decision makers who use it, not just by the tool's developers, and not only during the proposed development period but also beyond it. It should be designed so that it can be adapted to interface with new information sources and new bodies of relevant research, and to search for new types of research patterns. The tool should furthermore be extensible to support decisions about human-systems acquisition programs in general, and decisions involving the introduction and use of new technologies within complex cognitive work domains where proficiency and performance may be affected. The developer will retain rights to proprietary techniques and/or mechanisms built into the system that allow the system to be evolved.
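The pattern-extraction and conflict-handling challenges described above can be illustrated with a minimal sketch. The `Finding` structure, the 0-to-1 credibility weights, and the weighted scoring rule below are illustrative assumptions for discussion, not requirements of this topic:

```python
from dataclasses import dataclass


@dataclass
class Finding:
    topic: str          # research question, e.g. "simulation fidelity"
    effect: int         # +1 supports the intervention, -1 contradicts, 0 null
    credibility: float  # assumed 0..1 weight, e.g. from sample size and venue


def synthesize(findings):
    """Group findings by topic, compute a credibility-weighted consensus
    score in [-1, 1], and flag topics whose results conflict."""
    by_topic = {}
    for f in findings:
        by_topic.setdefault(f.topic, []).append(f)

    summary = {}
    for topic, fs in by_topic.items():
        total_weight = sum(f.credibility for f in fs)
        score = sum(f.effect * f.credibility for f in fs) / total_weight
        # Conflict is preserved as an explicit flag rather than
        # hidden inside the average.
        conflicting = (any(f.effect > 0 for f in fs)
                       and any(f.effect < 0 for f in fs))
        summary[topic] = {"score": round(score, 2),
                          "n": len(fs),
                          "conflicting": conflicting}
    return summary
```

Weighting by credibility lets one strong study outweigh several weak ones, while the conflict flag surfaces disagreement in the literature to the decision maker instead of averaging it away.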
PHASE I: Develop a proof of concept and demonstrate the proposed tool's ability to synthesize relevant data using representative and special use cases, and provide basic concepts of how information will be visually presented. Obtain feedback on the concept's usefulness, understandability, and usability from senior personnel responsible for investment and acquisition decisions.
PHASE II: Based on the Phase I effort, iteratively develop and evaluate the tool's architecture, user interfaces, functionality, and products. Ensure training science is represented accurately and in ways that take into account stakeholder time constraints, goals, and priorities. Provide naturalistic evaluation results that characterize the tool in terms of its usefulness, usability, and understandability, and demonstrate that stakeholders can use and adapt the tool to meet key needs independently of a software developer.
PHASE III: Further refine the tool based on testing and early user experience, and improve as necessary. Transition the Complex-Knowledge Visualization Tool to acquisition community users via PMA-205. Private Sector Commercial Potential: The decision support tool is applicable to decision making in any complex domain in which a large and growing body of research exists and should be consulted to achieve smarter decisions about funding and policy. Example private-sector domains include the environment, education, economics, healthcare, and workplace safety. Private companies could use a tool such as the one proposed in this topic to support decisions about which new technologies and research to pursue. The tool could also be used to communicate with stakeholders about complex information related to a wide range of decisions.
1. Ericsson, K. A., Charness, N., Feltovich, P. J., & Hoffman, R. R. (Eds.). (2006). The Cambridge handbook of expertise and expert performance (pp. 683-703). New York, NY: Cambridge University Press. Retrieved from http://www.cambridge.org/nu/academic/subjects/psychology/cognition/cambridge-handbook-expertise-and-expert-performance
2. Boury-Brisset, A. (2004). Ontological approach to military knowledge modeling and management. Paper presented at the RTO IST Symposium on Military Data and Information Fusion, held in Prague, Czech Republic, 20-22 October 2003, and published in RTO-MP-IST-040.
3. Kirchhoff, C. J., Lemos, M. C., & Dessai, S. (2013). Actionable knowledge for environmental decision making: Broadening the usability of climate science. Annual Review of Environment and Resources, 38(1), 393-414. Retrieved from http://eprints.whiterose.ac.uk/77662/
4. Ruppert, T., Dambruch, J., Krämer, M., Balke, T., Gavanelli, M., Bragaglia, S., Chesani, F., Milano, M., & Kohlhammer, J. (2015). Visual decision support for policy making: Advancing policy analysis with visualization. In Policy Practice and Digital Science (pp. 321-353). New York, NY: Springer. Retrieved from http://scholar.google.com/citations?view_op=view_citation&hl=de&user=cZcfjuQAAAAJ&citation_for_view=cZcfjuQAAAAJ:8k81kl-MbHgC
5. Succar, B. (2009). Building information modelling framework: A research and delivery foundation for industry stakeholders. Automation in Construction, 18(3), 357-375. Retrieved from http://www.academia.edu/170356/Building_Information_Modelling_framewor
6. Welp, M., de la Vega-Leinert, A., Stoll-Kleemann, S., & Jaeger, C. C. (2006). Science-based stakeholder dialogues: Theories and tools. Global Environmental Change, 16(2), 170-181. Retrieved from http://www.hnee.de/_obj/F7DE8626-C861-4DEE-A801-6BE1023023E5/outline/03.pdf
KEYWORDS: Decision Support; Science-based Decision Making; Facilitation of Stakeholder Understanding; Scientific Knowledge; Visualization; Knowledge Management