A Probabilistic Pose Estimation Algorithm for 3D Motion Capture Data
Department of Health and Human Services
Agency Tracking Number:
Solicitation Topic Code:
Small Business Information
20030 Century Blvd, Ste 104A, GERMANTOWN, MD, 20874-1112
Socially and Economically Disadvantaged:
Abstract

DESCRIPTION (provided by applicant): A major challenge facing rehabilitation research is to measure relationships between impairments, functional limitations, and disabilities. Biomechanical analyses are a key tool for establishing these relationships by providing quantitative, objective measures of patient status and treatment outcomes. At the heart of many biomechanical analyses is estimation of the pose (position and orientation) of a multi-segment model based on recordings of 3D motion data from sensors (optical, electromagnetic, or inertial). Visual3D, the most advanced clinical biomechanics analysis software available commercially for 3D motion capture data, contains solutions for the estimation of pose from 3D sensor data that have been tested in laboratories throughout the world and are used on a daily basis for clinical assessment. Researchers have come to rely on Visual3D's capabilities. C-Motion is proposing a collaborative research and development effort to get new pose estimation techniques into the hands of researchers. The algorithms from Phase I and the enhancements in Phase II will be included in Visual3D. At the core of Visual3D's functionality are flexible algorithms for identifying a mapping from 3D motion capture sensors to the 3D pose of a segmented skeletal model. The principal assumption of the Visual3D pose estimation algorithms (and of other commercial biomechanics software) is that sensors move rigidly with the body segments to which they are attached. It is accepted, however, that sensors attached to the skin move relative to the underlying skeleton, and that this Soft Tissue Artifact is challenging to quantify or model because it is often systematic yet varies on a case-by-case basis. This artifact is a serious challenge to the relevance of non-invasive clinical motion analyses. The current pose estimation algorithms were not designed to incorporate models of soft tissue artifact.

Uncertainty in the data (e.g., sensor noise and artifact) cannot be addressed directly using current discriminative methods, but it may be addressed by casting the pose estimation problem in the general framework of probabilistic inference (Todorov, 2007). In this framework, the pose and any prior knowledge about the pose are encoded probabilistically, and the artifacts and noise are captured by a generative model, which defines the conditional probability of the data given the pose. In Phase I we will implement and test a kinematics-based probabilistic algorithm for computing the pose (position and orientation) of a subject using Bayesian inference, as proposed by Dr. Todorov. The results will be compared to a set of biplanar cinefluoroscopy data and 3D motion capture data recorded simultaneously by our collaborator Dr. Scott Tashman (Biodynamics Laboratory at the University of Pittsburgh), which we will treat as our gold standard for bone motion. The overall project is very ambitious, so in Phase I we are attempting an important subset of the overall algorithm to demonstrate the feasibility of this approach and to provide evidence that we are capable of tackling the even more ambitious Phase II project.

PUBLIC HEALTH RELEVANCE: There is a tremendous need for improved rehabilitation research and clinical services to lower individual health care costs and improve productivity and quality of life. Biomechanical analysis is a key tool for understanding the relationships between impairments, functional limitations, and disabilities by providing quantitative, objective measures of patient status and treatment outcomes. This project is designed to apply probabilistic algorithms developed in the field of machine vision to make a new generation of biomechanical techniques available commercially, which will enable researchers to improve movement analysis dramatically and ultimately to improve patient outcomes.
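To make the probabilistic framing concrete, the following is a minimal illustrative sketch (not the proposed Visual3D algorithm, whose details are not specified here): a single planar segment's orientation is inferred from noisy marker positions by combining a Gaussian prior over the pose with a Gaussian likelihood from a generative model of the markers, and taking the maximum a posteriori (MAP) estimate over a grid of candidate angles. All marker coordinates, noise levels, and prior parameters are hypothetical.

```python
import numpy as np

# Hypothetical local marker coordinates (meters) in the segment's own frame.
local = np.array([[0.10, 0.00], [0.00, 0.05], [-0.08, 0.02]])

def rot(theta):
    """2D rotation matrix for angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# Simulate observed data: true pose plus Gaussian noise standing in for
# sensor noise and soft-tissue artifact (an assumed, simplified noise model).
true_theta = 0.4
sigma = 0.005  # assumed marker noise std (m)
rng = np.random.default_rng(0)
observed = local @ rot(true_theta).T + rng.normal(0.0, sigma, local.shape)

# Bayesian inference on a grid: log p(theta | data) up to a constant is
# log p(data | theta) + log p(theta), with a weak Gaussian prior on theta
# encoding prior knowledge about the pose.
thetas = np.linspace(-np.pi, np.pi, 2001)
prior_mean, prior_std = 0.0, 1.0
log_post = -0.5 * ((thetas - prior_mean) / prior_std) ** 2
for i, th in enumerate(thetas):
    resid = observed - local @ rot(th).T          # generative model residual
    log_post[i] += -0.5 * np.sum(resid ** 2) / sigma ** 2

map_theta = thetas[np.argmax(log_post)]           # MAP pose estimate
print(f"MAP angle: {map_theta:.3f} rad (true: {true_theta} rad)")
```

The same structure extends, in principle, to full 6-DOF multi-segment models: the grid search is replaced by gradient-based or sampling-based inference, and the isotropic noise term is replaced by a richer generative model of soft tissue artifact.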
* Information listed above is as of the time of submission.