
Next Generation Research Tools for Understanding Human Social Systems

PROPOSALS ACCEPTED: Phase I and DP2. Please see the 16.2 DoD Program Solicitation and the DARPA 16.2 Direct to Phase II Instructions for DP2 requirements and proposal instructions.

TECHNOLOGY AREA(S): Human Systems, Information Systems

OBJECTIVE: Develop tools that support innovation in advancing best-practice research methods and capabilities for the social, behavioral, and economic (SBE) sciences, including, but not limited to: analysis software, workflow systems, statistical packages, and experimental platforms.

DESCRIPTION: There is a critical DoD need for accurate, robust, and reliable social, behavioral, and economic (SBE) models, which are increasingly important for planning and conducting effective military operations, including humanitarian aid, disaster relief, and stability support missions. The SBE sciences provide essential theories and frameworks that shape understanding of a wide range of human social behavior and systems relevant to national security. The validity and reliability of SBE theories and concepts are fundamental to sound tactical, operational, strategic, and policy-level decision-making across the Department of Defense.

In light of several widely recognized “crises” of reproducibility in a number of disciplines, there is increased appreciation for the importance – and difficulty – of experimentally validating the results and claims of theories and the predictions of models. The academic community has responded by identifying a wide range of biases in the published literature, as well as their sources in experimental, statistical, and institutional structures and practices. Fortunately, a number of best practices and innovative methods have been developed to mitigate some of these challenges – but there remain opportunities for further development and dissemination of tools that, if matured and adopted, could have significant positive impact on a wide range of research questions and communities in SBE.

Accordingly, this topic solicits proposals for innovative tools that could demonstrate this positive impact. Examples might include proposals that provide credible approaches to improve the speed, efficiency, cost, and/or adoption of one or more of the following:

  • methods for pre-registration of experimental protocols;
  • tools for transparent, modular, dynamic, and portable informed consent;
  • Bayesian network tools for tracking contingent evidentiary support structures within complex data or experimental designs;
  • statistical tools to help identify and mitigate different biases in published or unpublished research;
  • meta-analytic tools for exploring the robustness and generalizability of empirical findings;
  • extensible packages for the analysis of text or geocoded data;
  • assimilation methods for tuning computational models using real-time observations;
  • licensing models for ethical data sharing that protect Personally Identifiable Information (PII);
  • platforms for joint collaboration on and design of experimental protocols to increase scientific value prior to data collection;
  • methods to obtain institutional pre-approval of widely used experimental platforms such as online surveys or games; and
  • platforms that ethically and cost-effectively recruit large numbers of experimental subjects across a wide range of cultural and demographic variables.
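To make the "statistical tools to help identify and mitigate different biases" category concrete: one well-established technique in this space is Egger's regression test for funnel-plot asymmetry, which flags possible small-study (publication) bias in a set of effect estimates. The sketch below is purely illustrative of the kind of tool this topic envisions, not a required or endorsed approach; the function name and its inputs (per-study effect sizes and standard errors) are assumptions for the example.

```python
import numpy as np

def egger_test(effects, ses):
    """Egger's regression test for funnel-plot asymmetry.

    Regresses each study's standardized effect (effect / SE) on its
    precision (1 / SE); an intercept far from zero suggests small-study
    (publication) bias. Returns (intercept, t_statistic).
    """
    effects = np.asarray(effects, dtype=float)
    ses = np.asarray(ses, dtype=float)
    z = effects / ses                      # standardized effects
    precision = 1.0 / ses
    X = np.column_stack([np.ones_like(precision), precision])
    beta, *_ = np.linalg.lstsq(X, z, rcond=None)
    resid = z - X @ beta
    sigma2 = resid @ resid / (len(z) - 2)  # residual variance (2 params fit)
    cov = sigma2 * np.linalg.inv(X.T @ X)  # OLS coefficient covariance
    return beta[0], beta[0] / np.sqrt(cov[0, 0])
```

For an unbiased, symmetric set of studies the intercept should be near zero; a large |t| flags asymmetry worth investigating. A production tool would of course build on established meta-analysis packages rather than this minimal sketch.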

This topic generally does not seek to fund approaches that are tightly tied to narrow experimental protocols or sensor systems, that rely on restricted or excessively costly software and/or data sets, or that consist of visualization tools not explicitly tied to reproducible analytic techniques. Hardware and sensor approaches should leverage widely available existing platforms, and any proposed development effort must focus on breadth of application, ease of use, and low barriers to entry for adoption of the tool or tools by academic, government, and commercial SBE researchers.

PHASE I: Identify the target research practice, protocol, or method that the tool will improve, and justify the approach via a detailed specification of the degree of improvement over current practice or a description of the new capabilities afforded. Demonstrate the key technical principles behind the proposed solution, and identify mitigations for any barriers to scale. The demonstrations should show wide applicability, relevance, and potential benefit for common methodological approaches or challenges in the SBE sciences. Phase I deliverables are a notional prototype that achieves the core functionality of the complete product, an extensive commercialization/propagation plan for achieving widespread use, and a final report.

PHASE II: Demonstrate the scale and usability of the proposed approach. The demonstration should validate the predicted improvements and/or new capabilities against the current state of practice, as well as the engineering and design work required to scale easily. This includes integration into existing systems and the development of institutional partnerships. Phase II deliverables include the prototype system and a final report that includes the demonstration system design and test results.

PHASE III DUAL USE APPLICATIONS: Commercial applications may include product development, collaboration and workforce productivity tools, privacy enhancement, business intelligence, and data management. Military applications may include rapid ethnographic assessment, mission planning and logistics, crisis response, and disaster relief.

REFERENCES:

  • Nature special issue on "Challenges in irreproducible research" - online at http://www.nature.com/news/reproducibility-1.17552
  • Ruths, D., and Pfeffer, J. "Social media for large studies of behavior." Science, Vol. 346, Issue 6213 (2014): 1063-1064
  • Pashler, Harold, and Eric-Jan Wagenmakers. "Editors' Introduction to the Special Section on Replicability in Psychological Science: A Crisis of Confidence?" Perspectives on Psychological Science 7.6 (2012): 528-530.
  • Ioannidis, John PA, et al. "Publication and other reporting biases in cognitive sciences: detection, prevalence, and prevention." Trends in cognitive sciences 18.5 (2014): 235-241.
  • Haeussler, Carolin, et al. "Specific and general information sharing among competing academic researchers." Research Policy 43.3 (2014): 465-475.
  • Schrodt, Philip A. "Seven deadly sins of contemporary quantitative political analysis." Journal of Peace Research 51.2 (2014): 287-300.
  • King, Gary. "Restructuring the Social Sciences: Reflections from Harvard's Institute for Quantitative Social Science." PS: Political Science & Politics 47.01 (2014): 165-172.

KEYWORDS: social sciences, statistics, analysis, research practice, psychology, economics, behavioral science, data security
