
SBIR Phase II: An augmented learning platform for mobile devices

Award Information
Agency: National Science Foundation
Branch: N/A
Contract: 1632721
Agency Tracking Number: 1632721
Amount: $750,000.00
Phase: Phase II
Program: SBIR
Solicitation Topic Code: EA
Solicitation Number: N/A
Solicitation Year: 2016
Award Year: 2016
Award Start Date (Proposal Award Date): 2016-08-01
Award End Date (Contract End Date): 2018-07-31
Small Business Information
5208 Lodestar Way
Elk Grove, CA 95758
United States
DUNS: 078737978
HUBZone Owned: No
Woman Owned: No
Socially and Economically Disadvantaged: No
Principal Investigator
 Tarun Pondicherry
 (732) 705-1198
Business Contact
 Tarun Pondicherry
Phone: (732) 705-1198
Research Institution

This Phase II project is to develop the first personalized learning platform for hands-on education that works with the billions of mobile devices already in people's hands worldwide. Because the technology works on any mobile device and requires little instructor facilitation, it will be commercially successful in home settings where parents are busy and in school settings where budget restrictions limit the number of facilitators. Beyond commercial impact, this project will help address the nation's need to prepare citizens for the 21st century economy, improve science literacy, and help provide equal opportunities to underrepresented minorities in science, technology, engineering and mathematics (STEM).

This project is an effective teaching tool because it employs personalized learning, where the content and pace of learning are optimized for the individual learner. Personalized learning has already been shown to result in higher learner engagement and increased retention of concepts in educational software. Through the use of proprietary augmented reality and adaptive learning technology, this project brings personalized learning to hands-on STEM education, enabling learners to develop problem solving skills in a more effective, engaging and lower cost way.

The key innovation in this project is a single software framework combining advanced augmented reality (AR) and adaptive learning (AL) techniques to capture learners' interactions with real world objects (for example, circuit blocks, fraction bricks or chemistry models) via a mobile device camera, analyze the significance of the interactions, and automatically provide personalized guidance to each learner. The system will be the first to provide a high level API above the complexity of AR and AL, allowing designers of learning experience modules to focus solely on content and user experience instead of hundreds of thousands of lines of complex code associated with AR and AL.
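To make the abstraction concrete, here is a minimal sketch of what a high-level API above the AR and AL layers might look like. All names and the mastery-update rule are hypothetical illustrations; the project's proprietary framework is not public, so this only shows the idea of content designers working with recognized interactions and mastery estimates instead of raw AR/AL code.

```python
# Hypothetical sketch: a module-designer-facing API above AR/AL layers.
# Every identifier here is illustrative, not from the actual framework.

from dataclasses import dataclass, field

@dataclass
class Interaction:
    """A real-world action the AR layer recognized via the camera,
    e.g. a learner snapping two circuit blocks together."""
    object_id: str   # which physical piece was recognized
    action: str      # e.g. "connect", "remove", "rotate"

@dataclass
class LearnerModel:
    """Minimal adaptive-learning state: per-skill mastery estimates."""
    mastery: dict = field(default_factory=dict)

    def update(self, skill: str, correct: bool, rate: float = 0.3) -> None:
        # Simple exponential update toward 1.0 (correct) or 0.0 (incorrect);
        # a stand-in for whatever estimator the real AL engine uses.
        prior = self.mastery.get(skill, 0.5)
        target = 1.0 if correct else 0.0
        self.mastery[skill] = prior + rate * (target - prior)

def guidance(model: LearnerModel, skill: str, threshold: float = 0.7) -> str:
    """Module designers branch on mastery, not on raw camera or AL data."""
    level = model.mastery.get(skill, 0.5)
    if level >= threshold:
        return f"Great! Try a harder {skill} challenge."
    return f"Let's review {skill} with a step-by-step hint."

# Example: a learner wires two blocks correctly, then makes one mistake.
learner = LearnerModel()
learner.update("series-circuits", correct=True)
learner.update("series-circuits", correct=True)
learner.update("series-circuits", correct=False)
print(guidance(learner, "series-circuits"))
```

The point of the sketch is the division of labor the abstract describes: the AR layer produces `Interaction` events, the AL layer maintains the `LearnerModel`, and a learning-experience module only writes content logic like `guidance`.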
The methods employed will be the research, design and development of the augmented reality and adaptive learning engine; its integration with three learning experience modules (circuits, fractions, and basic geometry); and an evaluation of those modules. To guide development, pilot studies will be conducted periodically, using A/B testing of competing approaches, pre- and post-assessments, and behavioral analysis of users interacting with the system.
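One common way to summarize pre/post assessments in an A/B pilot like the one described is to compare normalized learning gains between conditions. The sketch below is purely illustrative: the scores and condition labels are placeholders, not project data, and the gain metric (Hake's normalized gain) is an assumed choice, not one stated in the award.

```python
# Illustrative analysis for an A/B pilot with pre/post assessments.
# Scores below are made-up placeholders, not results from this project.

def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    """Hake's normalized gain: fraction of possible improvement achieved."""
    if max_score == pre:
        return 0.0
    return (post - pre) / (max_score - pre)

def mean_gain(pairs) -> float:
    """Average normalized gain over (pre, post) score pairs."""
    gains = [normalized_gain(pre, post) for pre, post in pairs]
    return sum(gains) / len(gains)

# Placeholder pre/post scores for two hypothetical pilot conditions.
condition_a = [(40, 70), (55, 80), (60, 85)]   # e.g. adaptive guidance on
condition_b = [(45, 60), (50, 65), (58, 70)]   # e.g. static instructions

print(f"A mean normalized gain: {mean_gain(condition_a):.2f}")
print(f"B mean normalized gain: {mean_gain(condition_b):.2f}")
```

In a real pilot this comparison would be paired with the behavioral analysis mentioned above and an appropriate significance test, which this sketch omits.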

* Information listed above is at the time of submission. *
