Bounding generalization risk for Deep Neural Networks

Award Information
Agency: Department of Defense
Branch: National Geospatial-Intelligence Agency
Contract: HM047620C0084
Agency Tracking Number: M20A-001-0012
Amount: $99,475.00
Phase: Phase I
Program: STTR
Solicitation Topic Code: NGA20A-001
Solicitation Number: 20.A
Timeline
Solicitation Year: 2020
Award Year: 2020
Award Start Date (Proposal Award Date): 2020-09-30
Award End Date (Contract End Date): 2021-07-04
Small Business Information
2014 CALLE BUENA VENTURA
OCEANSIDE, CA 92056-3217
United States
DUNS: 081267253
HUBZone Owned: No
Woman Owned: No
Socially and Economically Disadvantaged: No
Principal Investigator
 Gabriel Perdue
Phone: (630) 840-6499
Email: perdue@fnal.gov
Business Contact
 George Subrebost
Phone: (310) 869-1967
Email: george@euler-sci.com
Research Institution
 Fermi National Accelerator Laboratory
 Mary Jo Lyke
 Pine St
 Batavia, IL 60510-0000
 United States
 (630) 840-8976
 Nonprofit college or university
Abstract

Deep Neural Networks have become ubiquitous in the modern analysis of voluminous datasets with geometric symmetries. In Particle Physics, experiments such as DUNE require the detection of particle signatures interacting within the detector, with analyses of over a billion 3D event images per channel each year in typical setups containing over 150,000 different channels. In an analogously data-intensive field, satellites continually produce datasets requiring the detection of millions of objects per 1,000 sq km over the full surface of Earth. Understanding the uncertainty induced by the underlying Machine Learning algorithm is important to such analyses. This error has not been incorporated into analyses in a fundamental way and is currently accounted for exclusively through sophisticated and costly empirical studies. We will develop theoretical bounds on this error utilizing Fourier analysis (Xu, Zhang, Luo, Xiao, & Ma, 2019) and will build upon the a priori generalization bound established for shallow networks (Xu, Zhang, Zhang, & Zhao, 2019) by considering deep Rectified Linear Unit (ReLU) neural networks of minimal width and disparate test and train domains. We will then work on extending our bounds to simple Deep Convolutional Neural Networks, to simple empirical studies on disparate test and train domains, and to empirical studies for object detection.
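To give a flavor of the Fourier-analysis viewpoint referenced above: such analyses reason about how a model's approximation error distributes across frequencies, with the "frequency principle" observing that ReLU networks tend to fit low-frequency content first, leaving error concentrated at high frequencies. The sketch below is purely illustrative and is not the project's method: `band_error` is a hypothetical helper that splits a residual's power spectrum into bands, and the "model" is a crude low-pass truncation of the target standing in for an under-trained network.

```python
import numpy as np

def band_error(residual, n_bands=4):
    """Split the power spectrum of a residual signal into frequency bands.

    Returns the fraction of squared error carried by each band, lowest
    frequencies first -- the kind of per-frequency error profile that
    Fourier-based generalization analyses reason about.
    (Hypothetical helper for illustration only.)
    """
    spec = np.abs(np.fft.rfft(residual)) ** 2
    bands = np.array_split(spec, n_bands)
    total = spec.sum()
    return np.array([b.sum() / total for b in bands])

# Target with one low-frequency and one high-frequency component,
# sampled on exact Fourier bins.
x = np.linspace(0.0, 1.0, 512, endpoint=False)
target = np.sin(2 * np.pi * 2 * x) + 0.3 * np.sin(2 * np.pi * 200 * x)

# Stand-in "model": keep only the lowest 10 Fourier modes of the target,
# mimicking a network that has fit low frequencies first.
coeffs = np.fft.rfft(target)
coeffs[10:] = 0.0
model = np.fft.irfft(coeffs, n=len(x))

# The residual error is carried almost entirely by the highest band.
fractions = band_error(target - model, n_bands=4)
```

Here the low-frequency component (2 cycles) is reproduced exactly by the truncated model, so `fractions` is near zero in the lowest band and near one in the highest; a frequency-resolved bound would control exactly this high-frequency tail.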

* Information listed above is at the time of submission. *
