Image Analysis Tools for mpMRI Prostate Cancer Diagnosis Using PI-RADS

Award Information
Agency: Department of Health and Human Services
Branch: National Institutes of Health
Contract: 2R42CA224888-03A1
Agency Tracking Number: R42CA224888
Amount: $1,503,918.00
Phase: Phase II
Program: STTR
Solicitation Topic Code: 102
Solicitation Number: PA19-270
Timeline
Solicitation Year: 2019
Award Year: 2020
Award Start Date (Proposal Award Date): 2020-09-01
Award End Date (Contract End Date): 2022-08-31
Small Business Information
13366 GRASS VALLEY AVE STE A
Grass Valley, CA 95945-9549
United States
DUNS: 116971838
HUBZone Owned: No
Woman Owned: No
Socially and Economically Disadvantaged: No
Principal Investigator
 JOHN ONOFREY
 (203) 785-3055
 john.onofrey@yale.edu
Business Contact
 MAHTAB DAMDA
Phone: (530) 274-1240
Email: mahtab.damda@eigen.com
Research Institution
 YALE UNIVERSITY
 
OFFICE OF SPONSORED PROJECTS
PO BOX 208327
NEW HAVEN, CT 06520-8327
United States

 Nonprofit College or University
Abstract

Project Summary
Prostate cancer is one of the most commonly occurring forms of cancer, accounting for 21% of all cancers in men.
The Prostate Imaging Reporting and Data System (PI-RADS) aims to standardize reporting of prostate cancer
using multi-parametric magnetic resonance imaging (mpMRI). However, the in-depth analysis, as demanded
by PI-RADS, remains challenging due to the complexity and heterogeneity of the disease, and it is a clinically
burdensome task subject to both significant intra- and inter-reader variability. Auxiliary tools based on machine
learning methods such as deep learning can reduce diagnostic variability and increase workload efficiency by
automatically performing tasks and presenting results to a radiologist for the purpose of decision support. In
particular, automated identification and classification of lesion candidates using imaging data can be performed
with respect to PI-RADS scoring. In Phase I of this project, we developed two automated methods to reduce the
intra- and inter-observer variability while interpreting mpMRI images using the PI-RADS protocol: (i) a method
to co-register mpMRI data, and (ii) a method to geometrically segment the prostate gland into the PI-RADS
protocol sector map. The overarching goal of this Phase II project is to develop machine learning algorithms that
incorporate both co-registered multi-modal imaging biomarkers and PI-RADS sector map information into an
automated clinical diagnostic aid. The innovation in this project lies in the use of deep learning to automatically
predict PI-RADS classification. This project is significant in that it has the potential to improve clinical efficiency
and reduce diagnostic variation in prostate cancer diagnosis. In Aim 1 of this project, we will develop a deep
learning approach to localize and classify lesions in mpMRI. In Aim 2, we will integrate this diagnostic tool into the
ProFuseCAD system and perform rigorous multi-site validation to quantify PI-RADS classification performance.
Both aims will utilize a database of over 1,000 existing mpMRI images from multiple clinical sites to develop and
validate the algorithms. Ultimately, enhancements from this project will create a novel feature for Eigen's (the
applicant company's) FDA 510(k)-cleared imaging product, ProFuseCAD, in order to improve the diagnosis and
reporting of prostate cancer.

Project Narrative
Radiological interpretation of multimodal prostate imaging data is challenging and subject to high levels of
variability. To address this problem, auxiliary tools based on machine learning methods such as deep learning can
increase workload efficiency by automatically performing tasks and presenting results to a radiologist for the
purpose of decision support. In particular, automated identification of lesion candidates and assessment of
potentially benign or malignant lesions with respect to specific PI-RADS categories from clinical imaging data can
improve prostate cancer reporting and reduce variation in radiological interpretation.

* Information listed above is at the time of submission. *
