Arctic Small-Unmanned Aerial System Automatic Ground Control Point Processing for Terrain Modeling


OUSD (R&E) CRITICAL TECHNOLOGY AREA(S): Advanced Computing and Software


OBJECTIVE: Develop and validate an automatic Ground Control Point (Auto-GCP) methodology for orthomosaic generation, and advance object identification and mapping (e.g., KAZE algorithms [2]) for multiscale feature detection.


DESCRIPTION: Small UAS platforms have redefined squad-level ISR collection, providing an overmatch capability that aids soldier lethality and maneuver for both dismounted and mounted off-road mobility platforms. In addition to the challenges of operating small UAS in Arctic environmental conditions, post-processing of imagery into orthomosaics and DSMs (photogrammetric surface models derived from Structure from Motion) is complicated by the poor availability of identifiable ground features and the low contrast of a highly reflective surface. Traditional photogrammetry methods rely on GCPs and distinct terrain features to align and process imagery. However, Arctic environments are characterized by flat, snow-covered terrain lacking these critical features, rendering traditional methods less accurate and often incalculable. Recent computer vision and remote sensing advancements in object identification and mapping (e.g., KAZE algorithms [2]) for multiscale feature detection in nonlinear scale space will enhance photogrammetric accuracy and provide a basis for feature-matching algorithms that support localization in the absence of GNSS.


The goal of this topic is to further define photogrammetric processes unique to polar environments for small UAS imagery collections [3, 4]. Those processes will result in automated, near-real-time product generation that aids ground maneuverability, UAS maneuver (obstacle avoidance), and visual terrain referencing for operations in GNSS-denied environments.


PHASE I: Integrate and advance the photogrammetric process of automatically defining and matching ground control points (pre-bundle adjustment) for accurate terrain modeling and imagery creation. Existing Auto-GCP algorithms will be integrated into the photogrammetric post-processing of collected imagery to assess model performance on commercial hardware. The identified ground control points will be assessed for accuracy for inclusion in visual terrain referencing in future resection algorithms for localization in GNSS-denied environments. A detailed site survey will be produced with known ground control point horizontal and vertical accuracies, comparing identified control points against benchmarked control points.
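The comparison of identified versus benchmarked control points might be summarized as horizontal and vertical RMSE, as in the minimal sketch below. The point IDs and coordinates are hypothetical, and the helper is illustrative rather than a survey standard.

```python
import math

def gcp_accuracy(identified, benchmark):
    """Compare auto-identified GCPs against surveyed benchmark points.

    `identified` and `benchmark` map a point ID to (easting, northing,
    elevation) in metres. Returns horizontal and vertical RMSE over the
    IDs common to both sets.
    """
    common = identified.keys() & benchmark.keys()
    if not common:
        raise ValueError("no common control points")
    h_sq = v_sq = 0.0
    for pid in common:
        e1, n1, z1 = identified[pid]
        e2, n2, z2 = benchmark[pid]
        h_sq += (e1 - e2) ** 2 + (n1 - n2) ** 2
        v_sq += (z1 - z2) ** 2
    n = len(common)
    return math.sqrt(h_sq / n), math.sqrt(v_sq / n)

# Hypothetical auto-detected vs. surveyed coordinates (metres).
auto = {"GCP1": (432100.42, 7654321.10, 101.95),
        "GCP2": (432250.07, 7654410.88, 102.60)}
survey = {"GCP1": (432100.50, 7654321.00, 102.00),
          "GCP2": (432250.00, 7654411.00, 102.50)}
h_rmse, v_rmse = gcp_accuracy(auto, survey)
print(f"horizontal RMSE {h_rmse:.3f} m, vertical RMSE {v_rmse:.3f} m")
```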


Determine atmospheric conditions in an Arctic environment that will support the desired final products at a minimum resolution of 10 centimeters, with absolute geolocation accuracy of <5.0 m CE90 and vertical accuracy of <10 m LE90. An initial summary of results should include existing and derived findings on weather effects on sUAS operations in Arctic environments. An approach will be established, from surrogate or derived sUAS imagery or Full Motion Video, to demonstrate the efficacy of advanced high-resolution terrain models and photogrammetric processes suitable for tactical-level integration in military applications.
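CE90 and LE90 are, respectively, the 90th-percentile horizontal (circular) and vertical (linear) errors over a set of check points. A minimal sketch of computing them from hypothetical check-point residuals:

```python
import math

def percentile_90(values):
    """90th percentile by linear interpolation between sorted samples."""
    s = sorted(values)
    k = 0.9 * (len(s) - 1)
    lo = int(k)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (k - lo) * (s[hi] - s[lo])

def ce90_le90(dx, dy, dz):
    """CE90: 90th percentile of horizontal (radial) error magnitude.
    LE90: 90th percentile of absolute vertical error."""
    radial = [math.hypot(x, y) for x, y in zip(dx, dy)]
    vertical = [abs(z) for z in dz]
    return percentile_90(radial), percentile_90(vertical)

# Hypothetical check-point residuals (metres) from a product assessment.
dx = [0.4, -1.1, 0.8, 2.0, -0.6, 1.3, -0.2, 0.9, -1.6, 0.5]
dy = [-0.3, 0.7, -1.2, 0.4, 1.0, -0.8, 0.6, -0.5, 1.1, -0.9]
dz = [0.9, -1.4, 2.1, -0.7, 1.8, -2.6, 0.4, 1.2, -3.1, 0.6]
ce90, le90 = ce90_le90(dx, dy, dz)
print(f"CE90 {ce90:.2f} m (spec < 5.0), LE90 {le90:.2f} m (spec < 10)")
```

Formal accuracy standards specify estimators and sample sizes in more detail; the percentile calculation above is only meant to show the shape of the check.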


PHASE II: Develop a near-real-time computational process, either on-board or via a direct downlink to an End User Device, that generates an orthomosaic of a pre-defined area and a photogrammetrically derived DSM at a minimum resolution of 5 centimeters, with absolute geolocation accuracy of <2.0 m CE90 (matching ArcticDEM products) and vertical accuracy of <5 m LE90 [1]. These 3D models will incorporate existing Structure from Motion and computer vision techniques employed by commercial and Army systems to derive ultra-high-resolution 3D models. In addition to these products, the near-real-time processing will also need to identify features in the environment that would be hazardous to UAS operations and flight mission planning.
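One simple way to flag flight hazards from a derived DSM is to mark cells whose elevation rises sharply above their local neighbourhood. The sketch below does this on a toy grid; the relief threshold is an illustrative assumption, not a validated safety criterion.

```python
def flag_flight_hazards(dsm, relief_threshold_m=2.0):
    """Flag DSM cells that protrude above their 8-neighbourhood by more
    than a relief threshold, as candidate hazards for low-altitude
    flight planning. `dsm` is a row-major grid of elevations in metres.
    """
    rows, cols = len(dsm), len(dsm[0])
    hazards = []
    for r in range(rows):
        for c in range(cols):
            neigh = [dsm[rr][cc]
                     for rr in range(max(0, r - 1), min(rows, r + 2))
                     for cc in range(max(0, c - 1), min(cols, c + 2))
                     if (rr, cc) != (r, c)]
            if dsm[r][c] - min(neigh) > relief_threshold_m:
                hazards.append((r, c))
    return hazards

# Toy DSM: flat snow with one abrupt obstacle (e.g., an ice block).
dsm = [[100.0] * 5 for _ in range(5)]
dsm[2][2] = 103.5
print(flag_flight_hazards(dsm))  # the obstacle cell is flagged
```

A fielded system would work on gridded raster tiles and account for sensor noise and cell size, but the neighbourhood-relief idea carries over directly.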


The Auto-GCP algorithms will be fully integrated into the photogrammetric process and run in real time during the collection phase. Existing algorithms will be incorporated into the photogrammetric processing, including recent advancements in feature detection and alignment at multiple scales (KAZE features [9]) available from existing automated geo-registration techniques [10].


PHASE III DUAL USE APPLICATIONS: This research will not only pave the way for accurate high-resolution mapping at the squad level in featureless terrain but will also provide methodologies for observing the rapidly changing Arctic environment. This application would also aid climate change studies and environmental monitoring, and assist ground (mounted/dismounted) and low-altitude sUAS maneuvers and flight operations where GNSS is limited or non-existent.


Commercial sUAS offerings in Arctic environments are limited because operation above 60° latitude requires a high-resolution Digital Elevation Model for terrain-following launch and recovery. Commercial applications of this product will benefit from utilizing new data sets (the ArcticDEM Project) as well as advanced elevation models for flight planning and operations.


The primary development and integration effort for this phase will establish a near-real-time localization algorithm and process for sUAS localization in the absence of GNSS post-initialization. A vision-based navigation or visual terrain referencing software system, encompassing the Phase II Auto-GCP software, will be established to use organically collected imagery and photogrammetrically derived DSMs for feature or horizon matching to determine the aircraft's position. The resulting localization will require an absolute position sufficient to carry out flight operations for a minimum of 50% of the entire flight time.
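The core of such a localization step can be reduced to: given terrain features matched against a georeferenced map or DSM, recover the platform position that best explains the observations. The sketch below uses a deliberately simplified model (nadir-pointing, heading-aligned sensor, so each match implies a position by subtraction); a full system would solve a photogrammetric resection. All coordinates are hypothetical.

```python
def estimate_position(matches):
    """Estimate platform horizontal position from terrain-feature matches.

    Each match pairs a feature's known world coordinate (easting,
    northing, metres) with its ground offset from the platform as
    measured in the sensor frame. Under the simplifying assumptions
    above, the least-squares translation is the mean of (world - offset).
    """
    if not matches:
        raise ValueError("no feature matches")
    es, ns = [], []
    for (we, wn), (oe, on) in matches:
        es.append(we - oe)
        ns.append(wn - on)
    n = len(matches)
    return sum(es) / n, sum(ns) / n

# Hypothetical matches: DSM-derived landmarks vs. measured offsets.
matches = [((500100.0, 7600050.0), (100.2, 49.8)),
           ((500040.0, 7599980.0), (39.9, -20.1)),
           ((499960.0, 7600020.0), (-40.1, 19.7))]
e, n = estimate_position(matches)
print(f"estimated position: {e:.1f} E, {n:.1f} N")
```

Averaging over multiple matches suppresses individual matching errors, which is why the residual spread across matches also gives a rough confidence measure for the fix.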

A real-time object identification and avoidance model, derived from on-board optical camera systems, will be developed to aid low-altitude collections and reconnaissance missions. This effort will also aid vision- and terrain-based navigation with ground units for determining sUAS position and ground-force localization during denied/degraded GNSS events.


The culmination of Phase III will integrate the Android Team Awareness Kit (ATAK) [11] platform to render the resulting orthomosaics and DSMs locally on device within 30 minutes of post-flight operations. Final products will be delivered in a data format already supported within the ATAK software suite and will be properly aligned in a supported geographic data model (WGS 84 / Web Mercator).
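Aligning products to Web Mercator (EPSG:3857) amounts to the standard spherical Mercator conversion from WGS 84 longitude/latitude, sketched below. The example coordinates are hypothetical.

```python
import math

R = 6378137.0  # WGS 84 semi-major axis (metres), used as the Web Mercator sphere radius

def wgs84_to_web_mercator(lon_deg, lat_deg):
    """Convert WGS 84 longitude/latitude (degrees) to Web Mercator
    (EPSG:3857) easting/northing in metres. Spherical formula; valid
    for latitudes within about +/-85.05 degrees.
    """
    x = math.radians(lon_deg) * R
    y = math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2)) * R
    return x, y

# Example: a hypothetical point above 60 degrees north.
x, y = wgs84_to_web_mercator(-147.72, 64.84)
print(f"{x:.1f} E, {y:.1f} N")
```

Note that Web Mercator's scale distortion grows rapidly at high latitudes, so products rendered in this grid at Arctic latitudes should preserve their source-resolution metadata rather than relying on on-screen scale.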



REFERENCES:
  1. Polar Geospatial Center. (2023, August 17). ArcticDEM mosaic version 4.1 release. ArcGIS StoryMaps.
  2. Bhardwaj, A., Sam, L., Akanksha, & Martín-Torres, F. J. (2016). UAVs as remote sensing platform in glaciology: Present applications and future prospects. Remote Sensing of Environment, 175, 196-204.
  3. Nolan, M., Larsen, C., & Sturm, M. (2015). Mapping snow depth from manned aircraft on landscape scales at centimeter resolution using structure-from-motion photogrammetry. The Cryosphere, 9(6), 2089-2102.
  4. Tonkin, T. N., & Midgley, N. G. (2016). Ground-based photogrammetry and digital elevation model accuracy in an Arctic setting. Geomorphology, 269, 16-27.
  5. Fact.MR. (2023, May 8). Drone Surveying Market Size is Expected to Achieve a Valuation of US$ 8,061.5 Million by 2033. GlobeNewswire News Room. Accessed November 12, 2023.
  6. Fact.MR. Drone Surveying Market Analysis, By Survey Type (Land Survey, Property Survey, Rail Survey, Infrastructure Survey), By End Use Industry (Energy, Construction, Transportation & Warehouse, Agriculture, Mining, Others), and Region: Global Market Insights 2023 to 2033.
  7. Drones Market Insights, 2028.
  8. Allied Market Research. Commercial Satellite Imaging Market Size, Share, Trends - 2031.
  9. Alcantarilla, P. F., Bartoli, A., & Davison, A. J. (2012). KAZE features. In European Conference on Computer Vision. Springer, Berlin, Heidelberg.
  10. Zhuo, X., Koch, T., Kurz, F., Fraundorfer, F., & Reinartz, P. (2017). Automatic UAV Image Geo-Registration by Matching UAV Images to Georeferenced Image Data. Remote Sensing, 9, 376.
  11. Android Team Awareness Kit (ATAK). (n.d.).


KEYWORDS: Auto-GCP, Small-Unmanned Aircraft Systems (SUAS), geolocation, Polar/Arctic environments
