DOC NOAA SBIR 2017 Solicitation
NOTE: The solicitations and topics listed on this site are copies from the various SBIR agency solicitations and are not necessarily the most up-to-date versions. For this reason, you should use the agency link listed below, which will take you directly to the appropriate agency server, where you can read the official version of this solicitation and download the appropriate forms and rules.
The official link for this solicitation is: http://go.usa.gov/xkvAH
Application Due Date:
Available Funding Topics
- 8.2.1: Mobile coastal monitor
- 8.2.2: Flow cytometry for aquatic single-particle optical properties
- 8.2.3: Dart Improvement for Tagging Cetaceans
- 8.2.4: Single-Mast High-Frequency Radar Antenna (Long Range)
- 8.2.5: Seabed settling detection and measurement technique
- 8.2.6: Smartphone App for Marine Weather Observations
- 8.2.7: Advanced analysis software for new-generation gas chromatographs and mass spectrometers
- 8.2.8: Maritime and Arctic Observations (MAS) with Unmanned Aircraft System (UAS)
Create or develop ways to add improved resolution, usability, or functionality to new or existing models, and/or develop “modules” for existing models, specific to established aquaculture species, that predict the impacts of aquaculture on the ocean environment. These applications should be directed at site selection and at evaluation of the environmental assessments that are part of the permitting process for new aquaculture operations. The target market is regulators, who need tools that provide actionable, science-based information to make yes-or-no decisions on all aspects of marine aquaculture. Tools should be: 1) science-based and verified to be credible and defensible, 2) simple enough to be used by non-scientists, and 3) transparent enough to be trusted.
Currently, a number of very powerful and accurate models exist to help evaluate potential sites for new or expanding marine aquaculture installations. Whether an individual or company is expanding an existing operation or seeking a permit to install a new one, they are always required to provide very specific information on the potential impacts and conflicts that may arise from the proposed work. While the models that generate this information work well, they are complicated to run, requiring highly trained specialists to gather data, input it into the model, and interpret the results once the model has been run. If modules could be developed that pre-program certain subsets of “background data,” the time and labor required to complete a model run could be substantially decreased. For example, modules could be designed for specific regions or species, allowing modeling technicians to focus on other variables. Modelers would be free to take on more clients, and the clients themselves would be charged less for the service.
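As a sketch of the “module” idea, pre-programmed background data for a region/species combination could be packaged so that a technician supplies only site-specific inputs. Everything below (names, fields, numbers, and the placeholder scoring formula) is hypothetical and not part of the solicitation:

```python
from dataclasses import dataclass

# Hypothetical sketch: pre-packaged "background data" modules for a
# site-assessment tool, so technicians supply only site-specific inputs.

@dataclass
class BackgroundModule:
    """Pre-programmed defaults for one region/species combination."""
    region: str
    species: str
    mean_current_speed_m_s: float   # typical ambient current (illustrative)
    ambient_nutrient_mg_l: float    # background dissolved nutrients (illustrative)

# Registry of ready-made modules; values are invented for illustration.
MODULES = {
    ("gulf_of_maine", "atlantic_salmon"): BackgroundModule(
        "gulf_of_maine", "atlantic_salmon", 0.15, 0.25),
}

def run_assessment(region: str, species: str, farm_biomass_t: float) -> dict:
    """Combine a background module with site-specific inputs.

    The 'impact score' below is a placeholder, not a real aquaculture model:
    nutrient loading scales with biomass, dilution with current speed.
    """
    mod = MODULES[(region, species)]
    score = farm_biomass_t * mod.ambient_nutrient_mg_l / mod.mean_current_speed_m_s
    return {"region": region, "species": species, "impact_score": score}

result = run_assessment("gulf_of_maine", "atlantic_salmon", 100.0)
print(result["impact_score"])
```

The point of the pattern is that the registry, not the technician, carries the pre-verified background data, so each run only needs the site-specific variables.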
Clouds and aerosols contribute the largest uncertainty to estimates and interpretations of the Earth’s changing energy budget. Atmospheric aerosol particles may be caused by natural or anthropogenic sources and may form either from emissions of primary particulates or through formation of secondary particulates from gaseous precursors. Knowledge of the composition of aerosol particles is an important factor in understanding their impact on cloud formation or the radiation budget, and ultimate climate effects. Long-term measurements of aerosol composition, in concert with measurement of other key atmospheric, aerosol, and cloud parameters, are critical to improving the understanding and numerical modeling of aerosol and cloud formation processes. Major aerosol components of interest include secondary inorganic sulfate and nitrate compounds, mineral dust and sea spray, and carbonaceous material, which is typically a complex mix of oxidized organic material with a minor component of refractory “black carbon”. The latter material is so named because it is highly optically absorbing and, therefore, the major aerosol component with a tendency toward atmospheric heating.
Aerosol chemical composition has historically been measured using a variety of techniques, with more than one analytical method typically required to measure all components efficiently. While many research-grade instruments are used for short-term studies involving intensive field campaigns, issues of cost, manpower, and durability limit the methods available for long-term continuous measurements of aerosol composition. Most long-term networks use filter-based samplers to collect time-integrated samples (typically 24 hr), followed by offline chemical analysis. This strategy allows relatively simple and robust operation suitable for unattended sites, while permitting state-of-the-art analytical techniques even when the samples are acquired at remote locations. It also minimizes the field portion of operating costs, but requires considerable expense to maintain the analytical infrastructure. A further tradeoff is that bulk samples typically allow low detection limits at the expense of time resolution.
While existing technologies can measure atmospheric properties of interest under ideal conditions, technological innovations and improvements are required to develop instrumentation that is more robust and automated for long-term deployment at field sites and that has lower weight and power requirements for deployment at remote field sites with limited power.
New measurement technology is sought, either developing new instruments or improving present instruments based on current measurement principles, in order to allow long-term, continuous measurements of the key atmospheric aerosol components listed above [black carbon (BC), non-refractory organics (commonly called OA), nitrate, sulfate, ammonium, mineral dust, and, at certain sites, sea salt].
Measurements need to be autonomous or semi-autonomous with operations, calibrations and maintenance routinely conducted by general instrument technicians rather than experts in aerosol instrumentation. While prototype systems need not be immediately able to operate in autonomous mode, such operation should be anticipated or at least compatible with the system design and operating principle. Other desirable characteristics of robust field deployable systems include:
- Size: standard rack-mountable instrument or smaller.
- Power: ideally able to be powered from a variety of different international power supplies, e.g. 50 or 60 Hz, 110 or 220 V.
- Shipping: custom shipping container or recommended procedure for protecting the instrument when shipping internationally.
- Routine operations:
  - Low level of daily maintenance required to ensure high data quality.
  - Low level of training required for continuous measurements at remote field site locations.
- Calibrations: as automated as possible, so that a high level of scientific expertise is not required at the field site to maintain the instrument and ensure high-quality data. Some examples include, but are not limited to:
  - Automated valve switching for routine sampling of a calibrant, requiring no manual switching of ambient sampling lines.
  - Calibrations that a general technician can be trained to perform routinely and that do not require scientific expertise.
  - If solutions are required, they should be prepared autonomously and should not require wet-chemistry skills or labor in the field, e.g. the preparation of high-precision chemical standards.
- Maintenance and serviceability: spare parts and consumables should be easy (not time-consuming) to procure, store, and ship to remote locations. For example, limited use of spares that require special offline upkeep, storage, or shipping (e.g., hazardous chemicals and/or radioactive materials).
Applicants should specify the proposed analytes and document anticipated performance characteristics that represent improvements over other available techniques. Applicants should specifically address expected requirements for maintenance and calibration. Detection limits should be appropriate to the analytes and adequate to measure expected fractions of ambient aerosols at relatively clean (< 5 μg/m3 average) locations. Applicants must provide convincing documentation (experimental data, calculations, and simulations as appropriate) to show that the sensing method is sensitive (i.e., low detection limit), precise, and highly selective to the target analyte(s) (i.e., free of anticipated physical/chemical/biological interferences). Approaches that leave significant doubt regarding sensor functionality for realistic multi-component aerosol samples under realistic field conditions will not be considered.
Chemical cycling of halogens, particularly chlorine, has a profound effect on Earth’s atmosphere. Chlorine cycles are responsible for stratospheric ozone depletion and the ozone hole, and they have a potentially important but poorly quantified effect on tropospheric ozone. Recent research has shown that chlorine cycling is widespread in the lower atmosphere and may be tightly coupled to anthropogenic nitrogen oxide (NOx) emissions and other components of air pollution. These cycles are ubiquitous, but especially prevalent in coastal cities such as Los Angeles, Houston, New York, and the megacities of Europe and Asia due to the interaction of air pollution with chloride from sea salt. These cycles impact regional air quality and global climate through their influence on tropospheric ozone and oxidative processes.
Hydrogen chloride, HCl, is the major reservoir for atmospheric reactive chlorine and a key indicator for these atmospheric chemical cycles, yet reliable measurements of this compound remain extremely limited. It can be measured by wet chemistry, mass spectrometry, and optical spectroscopy, and each of these methods has advantages and disadvantages. Wet chemical sampling is a very well characterized approach, but it can suffer from slow time response and cumbersome, offline analysis methods. Recent advances in mass spectrometry have made it a fast-response, high-precision option for HCl, but such instrumentation tends to be high in cost and can have significant weight, power, and consumable gas requirements. Optical spectroscopy is robust and lightweight, but typically not as sensitive as the other methods, such that it may lack the precision required to characterize ambient HCl levels.
The goal of this solicitation is to develop an accurate, high-precision, fast-response, lightweight, low-power gas-phase HCl sensor suitable for use in intensive field investigations in atmospheric chemistry, including deployment on research aircraft. Desired specifications include a limit of detection of 50 parts per trillion or better with an integration time of 1 second, and an inlet response time faster than 5 seconds for this sticky trace gas. The desired measurement accuracy is 10% or better, though higher accuracy is likely achievable. The instrument should be small enough to be packaged into a standard 19” rack-mountable footprint, with a weight of no more than 50 kg. Power consumption should not exceed 500 watts; lower power consumption is desirable. The instrument should be capable of deployment both for intensive field campaigns and for longer-term, ground- or ship-based measurements during which it would be largely autonomous for periods of days to weeks.
NOAA has been developing operational forecast systems for the U.S. coastal and Great Lakes regions since the 1990s. The latest coastal operational forecast systems apply three-dimensional, high-resolution community ocean models to provide short-term forecast guidance on water levels, currents, temperature, and salinity. A critical challenge for the development and implementation of any 3-D predictive environmental model is the intelligent integration of observational assets.
In-situ measurements of water levels, currents, and temperatures have routinely been used for model initialization, validation, and verification, but are rarely assimilated into model simulations to reduce errors and improve short-term forecasts. Data assimilation is a necessary aspect of any operational forecast system, whereby observations are combined with a background forecast and their respective error statistics to provide an improved analysis. Three-dimensional variational assimilation (3DVAR) is recognized as a reliable, computationally efficient data assimilation algorithm with the flexibility to assimilate both in situ and remotely sensed observations.
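The 3DVAR analysis described above minimizes a cost function that weighs the background and the observations by their error covariances; for a linear observation operator the minimizer has a closed form. The sketch below uses toy matrices chosen purely for illustration:

```python
import numpy as np

# Minimal 3DVAR sketch with a linear observation operator H.
# The analysis x_a minimizes
#   J(x) = 1/2 (x - x_b)^T B^-1 (x - x_b) + 1/2 (y - H x)^T R^-1 (y - H x)
# and, for linear H, has the closed form
#   x_a = x_b + B H^T (H B H^T + R)^-1 (y - H x_b).

def analysis_3dvar(x_b, B, H, y, R):
    """Return the 3DVAR analysis for background x_b and observations y."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)  # gain matrix
    return x_b + K @ (y - H @ x_b)

# Toy example: 3-point model state, 2 observations (values illustrative).
x_b = np.array([10.0, 12.0, 11.0])   # background water levels
B = 0.5 * np.eye(3)                  # background error covariance
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])      # observe grid points 1 and 3
y = np.array([10.6, 10.4])           # observed water levels
R = 0.1 * np.eye(2)                  # observation error covariance

x_a = analysis_3dvar(x_b, B, H, y, R)
print(x_a)  # analysis is pulled toward the observations at points 1 and 3
```

An operational implementation would of course solve the minimization iteratively rather than inverting matrices, since coastal model states are far too large for explicit covariance matrices.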
This subtopic solicits proposals to develop an innovative data assimilation tool that is capable of handling model backgrounds on the model grid as well as a variety of observation types to improve the accuracy of NOAA OFS forecasts and reanalyses. In particular, the data assimilation tool should 1) incorporate various observation datasets into existing NOAA operational forecast systems, 2) be computationally efficient with a user-friendly interface, and 3) demonstrate improvement over unassimilated simulations.
Improved water level, current, and temperature forecasts from NOAA coastal operational forecast systems (OFS).
There is a plethora of autonomous marine vehicles, primarily measuring physical and chemical parameters of ocean basins (e.g. salinity, temperature, currents) and providing subsurface data such as vertical profiles. We are seeking development of an autonomous mobile surface coastal monitor that uses “green” power source(s), has a modular, plug-and-play component capability to address multiple applications, such as the missions of NOAA’s five line offices and other agencies, and provides calibration/validation for NOAA and other optical satellite sensors.
The coastal zone, where terrestrial and oceanic interactions occur, extending offshore to approximately the 12-mile jurisdictional limit, has become a focus area for many of the concerns of NOAA and other environmental and security agencies, academia, and industry. This broad region covers both littoral and pelagic applications for benthos and fisheries. Weather and climate issues within coastal regions are also of concern. Data are inadequate for many coastal regions: oceanographic endeavors are conducted farther out to sea, while much of the sampling and measurement, e.g. health monitoring, is conducted at the shoreline. Thus there is a gap between “blue water” and the shoreline that can largely be filled by mobile coastal monitoring systems.
Having a routine monitoring system that can be quickly configured for specific applications will provide much-needed data for management decision makers. For example, a chemical spill could be mapped by adding a chemical sniffer sensor module to the mobile coastal monitor’s subsystem compartment(s), especially where unsafe conditions exist. Some examples of “plug and play” modular component applications for NOAA:
- NMFS may want to employ an acoustic/camera subsystem for certain fish stock assessments using a quiet, small platform.
- NWS may want more exact wind and air-sea data that a mobile coastal monitor could provide.
- NOS may want chemical sniffers or detailed current measurements as other modular subsystems on the platform.
- An example of a routine subsystem would be calibration/validation for the growing armada of spaceborne sensors increasingly used for coastal processes, critical for NESDIS product development.
- The modular design for subsystems would give OAR the ability to design new research capabilities.
Developing “plug and play” modular application subsystems is not part of this SBIR; however, scoping examples for the platform are desired, and a subsystem for calibration/validation of optical remote sensing assets is to be designed and demonstrated. Additional possible sensor or modular components include turbidity, nutrient, radioactive, and seismic measurements. Optical measurements, of both water and sky, are the calibration/validation category of measurements that need to be designed as a component of the mobile platform. Water column profiles and horizontal profiles would be other options available on the mobile device. Additionally, it should have a docking station which allows for a “buoy-like” capability, where propulsion energy could be diverted to batteries, and which allows for ease of access by boat.
Diurnal operations should be possible, as well as vertical profiles of optional measurements and/or sampling. A tethered system should be an option for the mobile coastal monitor.
The platform should have data communications, collision avoidance, a beacon, and GPS, and wherever possible should be designed for ease of maintenance and retrieval.
Knowledge of hydrological and atmospheric optical properties provides the foundation for interpreting satellite ocean color data. The light absorption and scattering coefficients of suspended particles are essential optical properties of natural waters. So far, almost all technologies for determining these properties were designed to measure particulate samples in bulk. No method or instrument provides the capability to characterize the optical properties of individual particles other than the one introduced by Iturriaga et al. (1988). However, that method requires fixing particles onto a microscope slide and manual operations during measurement, which are tedious and impossible to automate. The concept of flow cytometry allows automated examination of individual particles one at a time, and there have been many successful applications of this approach in oceanographic research. However, no flow cytometric instrument has been able to quantify the light absorption and scattering cross sections of single particles. This technological gap has limited our ability to characterize the optical properties of natural waters, often leading to oversimplifications in the inverse optical models that are critical for analyzing satellite data.
Optical characterization of natural waters using satellite imagery is necessary for accurate marine and fresh water quality forecasts. NOAA has an unmet need in field optical observations of aquatic suspended particles which would support modeling efforts to interpret ocean color satellite data.
NOAA is requesting proposals for a novel instrument that combines flow cytometric and microphotometric methods for determining single-particle optical properties. As a minimum requirement, the instrument should be able to determine single-particle light absorption and backscattering cross sections at multiple visible wavelengths for particles commonly found in natural waters, such as phytoplankton cells, organic detritus, and minerals. Particles should be measured in the most undisturbed state possible, so no pretreatment should be applied to them before conducting optical measurements. The following features are also desirable:
- Hyperspectral resolution;
- Wide spectral range from near UV through near IR;
- A wide size range, ~0.5 – 100 μm;
- Address elongated particles such as Pseudo-nitzschia cells;
- Fluorescence-activated cell sorting and collecting for subsequent analysis.
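As a sketch of the underlying measurement principle (not a specification of the requested instrument), an absorption cross section relates the fraction of beam power a particle removes to the illuminated area. The function and numbers below are illustrative and neglect scattering losses, which a real instrument would have to separate out:

```python
# Illustrative conversion of a single-particle transmission measurement to an
# absorption cross section. Assumes the particle sits fully inside a uniform
# beam and that scattered light is accounted for separately (a simplification).

def absorption_cross_section_um2(p_incident, p_transmitted, beam_area_um2):
    """sigma_a = (fraction of beam power removed) * illuminated beam area."""
    fraction_absorbed = (p_incident - p_transmitted) / p_incident
    return fraction_absorbed * beam_area_um2

# Hypothetical numbers: 2% of the power in a 100 um^2 beam is absorbed.
sigma_a = absorption_cross_section_um2(1.00, 0.98, 100.0)
print(sigma_a)  # ~2 um^2
```

Repeating this per particle, per wavelength, at flow-cytometry rates is precisely the capability the subtopic describes as missing.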
In order to effectively manage protected species such as endangered or depleted cetacean populations, we require detailed knowledge of their broad-scale habitat use, movements, and migration patterns, as well as how they are affected by environmental factors and anthropogenic activities. To address this problem, researchers are increasingly turning to electronic tagging technology to track animals and provide data needed in stock assessments (Sheridan et al., 2007). Until the last decade, medium-sized cetaceans, including many of the toothed whales, could not be tagged because they were either too large to capture safely for direct application of electronic tags, or considered too small to tolerate the recent generation of implantable satellite tags that penetrate more than 20 cm into tissue (e.g. Mate et al. 2007). Darts for tagging cetaceans have been developed and used during the last decade; however, many tags are lost through dart detachment. Thus there is a need to redesign and commercialize a dart attachment system for improved tag retention and longer attachment duration, while also minimizing potential impacts on the study species.
There is a need to design, test, and make commercially available a dart that improves the retention of satellite tags used for tracking cetacean movements, decreasing the chance of dart breakage while retaining or improving existing performance.
IOOS operates the nation’s only high-frequency radar network that provides real-time information on the speed and direction of surface currents. This system supports search and rescue operations, response to oil spills, port navigation, monitoring and tracking harmful algal blooms, and understanding oceanographic phenomena such as the warm water mass off the West Coast. Research is underway to explore other uses of this technology, including applications for national security, tsunami detection, and monitoring significant wave heights. The system currently consists of about 140 radars in nearly every coastal state as well as Puerto Rico.
Approximately 1/3 of the radars in the network are considered “long-range (LR)”, that is, achieving usable surface current data from distances of at least 150 km from the radar’s location. Most of these radars operate in the 4-5 MHz band but four of them operate in the 8-9 MHz band.
Because of the 2012 ITU findings for oceanographic radar spectrum usage, the 8-9 MHz band will not be available for operational use when the USA fully implements the ITU spectrum recommendations. Hence, our need is for radars in the 4-5 MHz band only.
While the LR stations provide the greatest coverage and farthest measurement range, they are also more difficult to site because the required separation between transmit and receive antennas at the lower HF band is 50+ meters. This precludes mounting antennas on most storm-hardened structures near the coast and limits antenna installations to ground mounts only. A reduced antenna footprint would allow placement on existing resilient stations like NOAA Sentinels, where available, or on other hardened, fixed structures like concrete buildings and parking structures. As real estate at the coast becomes scarcer, a smaller-footprint antenna would also expand the number of potential sites for deployment of HF radars in the IOOS network. Sentinels, water level observing stations operated by the NOS Center for Operational Oceanographic Products and Services (CO-OPS), have been strengthened to deliver real-time storm tide data during severe coastal events. These stations are single-pile structures with square platforms 10 feet on a side, mounted 25 – 32 feet above mean water level, and designed to withstand category 4 hurricanes. With such limited space on Sentinels and other coastal/offshore platforms, it is not currently possible to install a LR HF radar system requiring separated transmit and receive antennas. A single-mast transmit and receive antenna, mounted on a Sentinel or on top of a concrete structure, could provide surface currents throughout the approach and landfall of a storm with significantly more protection from damage or loss.
Design and develop a prototype of a combined transmit and receive HF radar antenna for ocean surface current mapping that could be mounted on fixed structures including Sentinel stations and other nearshore, or onshore, platforms. Considerations for mounting the antenna on Sentinels stations and other small footprint platforms would include: impact on existing equipment already installed (for the Sentinel platform), and power requirements for the small-footprint system.
Deployment of oceanographic sensor packages to the seabed using a variety of platforms (e.g., landers, bottom mounts, benthic frames) and methods (e.g., free falling, ROV placement, winch deposit) is common practice for a variety of measurement applications. Currently, all commonly used seabed platforms lack the capability to detect and measure a time series of sinking and settling rates while on the seabed. Whatever the reason for placing the instrument on the seabed (a box core study of nematodes, a geochemist using electrodes to measure diagenesis, or oceanographers using pressure sensors to detect water level changes), the disposition of the platform (settling, sinking, or neither) is usually guesswork, even though the vertical position of the platform relative to a land-based reference frame is often of critical importance. In the past we have used methods such as resistivity sensors to detect changes in the sediment-water interface, small capillary core tubes to estimate depth of penetration, or marking the side of the bottom mount to determine sinking depth. However, these techniques alone do not provide the needed precision, accuracy, or rate of settling. This proposed project is to develop an automated sensor to detect, measure, and record time series of settling rates for seabed platforms over the duration of their deployment.
There is currently no automated measurement system that we are aware of that can achieve what we have proposed. This project will look at various methodologies and technologies to develop a commercially available sensor for measuring sinking and settling rate time series for seabed platforms.
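One simple way such a sensor could reduce its raw data, sketched here under the assumption of a vertical-position time series (e.g., from a high-precision pressure sensor), is a least-squares settling-rate estimate. The numbers below are synthetic and purely illustrative:

```python
# Sketch: estimate a seabed platform's settling rate from a time series of
# vertical position, using the least-squares slope of depth vs. time.
# A real sensor would report windowed rates over the whole deployment.

def settling_rate_mm_per_day(t_days, depth_mm):
    """Least-squares slope of depth vs. time (positive = sinking)."""
    n = len(t_days)
    t_mean = sum(t_days) / n
    d_mean = sum(depth_mm) / n
    num = sum((t - t_mean) * (d - d_mean) for t, d in zip(t_days, depth_mm))
    den = sum((t - t_mean) ** 2 for t in t_days)
    return num / den

# Synthetic record: platform sinking ~1.5 mm/day with measurement noise.
t = [0, 1, 2, 3, 4, 5]
depth = [0.0, 1.6, 2.9, 4.6, 5.9, 7.5]
rate = settling_rate_mm_per_day(t, depth)
print(rate)  # close to 1.5 mm/day
```

Running this over sliding windows would yield the settling-rate time series the subtopic asks for, with the regression averaging out sensor noise that point-to-point differencing would amplify.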
Social media is increasingly used to report weather information in real time, particularly over land areas. Unfortunately, at this time, information of this nature from the marine community remains rare. While some sailors log weather information while on a cruise, those logs are not shared with forecasters, model developers, or fellow sailors. The general lack of marine weather observations makes it challenging for NOAA marine weather forecasters to maintain situational awareness or to validate forecasts and conditions. It also makes it difficult for developers to validate the quality of the wind, wave, freezing spray, and ice guidance that they make available to mariners and forecasters.
Smartphone technology makes it possible to report geo-located and time-stamped marine weather information (including such parameters as wind, wave, ice accretion, visibility, and present weather) along with pictures, to populate databases and to display them on social media. The information could feasibly be shared in real time while mariners are within cell coverage, and after a mariner returns to port from a cruise or sail. Information shared with forecasters and developers could be provided in the form of a database that is updated in real time.
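As an illustration of what a single geo-located, time-stamped report might contain, here is a hypothetical record structure; the field names, units, and values are invented for illustration and do not represent any established NOAA schema:

```python
import json
from datetime import datetime, timezone

# Hypothetical crowd-sourced marine weather report; fields are illustrative.
report = {
    "timestamp_utc": datetime(2017, 6, 1, 14, 30, tzinfo=timezone.utc).isoformat(),
    "lat": 41.52,               # decimal degrees
    "lon": -70.67,
    "wind_speed_kt": 18.0,
    "wind_dir_deg": 220,
    "wave_height_m": 1.2,
    "ice_accretion": False,
    "visibility_nm": 8.0,
    "present_weather": "light rain",
    "photo_ids": ["img_0042"],  # photos uploaded separately when in coverage
}

payload = json.dumps(report)    # serialized for upload, or queued locally
print(payload[:60])
```

Queuing serialized records locally and uploading them when cell coverage returns would support both the real-time and the post-cruise sharing modes described above.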
Real-time or post-cruise marine weather observations (geolocated and time-stamped, along with pictures) would address several needs of NOAA marine weather forecasters, NOAA ocean and weather model and technique developers, and nearby mariners. These include:
- Increasing the situational awareness of NOAA marine weather forecasters responsible for coastal and inland waters zones.
- Providing near-real-time weather information to nearby mariners who are monitoring social media.
- Providing feedback to NOAA forecasters and model developers on the quality of their forecasts and guidance.
Gas chromatography (GC) and mass spectrometry (MS) are two of the most common environmental analytical techniques used by research laboratories around the world. NOAA deploys an array of GC- and MS-based instruments to monitor the composition of Earth’s atmosphere for long-lived, climate-relevant gases and shorter-lived gases that may degrade local air quality. Many new mass spectrometric detectors, both custom-built and commercially available, are used to accurately identify and quantify an ever-growing suite of environmental contaminants. Recent advances in GC and MS hardware development, especially the advent of high-mass-resolution time-of-flight (TOF) detectors, have outpaced the capabilities of the analysis software needed to turn the resulting wealth of multidimensional TOF data into useful information. The majority of commercially available software is proprietary to specific instrument manufacturers and does not allow the flexibility required to work with an array of mass spectrometers. Standard chromatographic software typically relies on basic peak-integration techniques (e.g., dropping baselines); peak-fitting (i.e., mathematically solving for individual peak shapes and areas), by contrast, provides largely untapped information that aids peak identification and improves quality control. New peak-fitting software is critical for both well-established and novel GC and MS techniques and is required to fully exploit the flood of data generated by these state-of-the-art instruments. Currently, the lack of sophisticated software is the largest “bottleneck” for researchers handicapped by analysis tools developed for an earlier generation of instruments.
There is a pressing need for a stand-alone, highly automated, and flexible software tool to fully analyze, digest, and interpret the vast amount of data collected by modern GC and MS instruments as quickly and accurately as possible. The proposed software should specifically address the following criteria:
- The new analysis software must be compatible with an array of mass spectrometers, including but not limited to high mass resolution TOF-MS detectors, in a customizable and user-friendly interface. The peak-fitting software must be compatible with multiple file types, including *.hdf from TOF-MS instruments and *.cdf from quadrupole-MS instruments run in both total ion mode (full mass scans) and selected ion mode (discontinuous mass signals). The ability to ingest files natively is highly desired. The program must also be compatible with previously collected data allowing for older datasets to be re-analyzed.
- The software must utilize (i) peak-fitting algorithms for a variety of peak shapes, including asymmetric peaks, as well as (ii) standard peak-integration protocols. All fit parameters and variables for individual peaks should be (i) customizable, (ii) readily accessible for quality control purposes, (iii) exportable by the user for more detailed analysis, and (iv) easily archived or accessed at a later date by a different user. The software should also use standard computing power efficiently, such that it can run on a standard laptop (Mac and PC platforms).
- The software must be able to efficiently and accurately analyze large quantities of highly complex and variable environmental samples collected by field-deployed instruments operated in non-ideal (e.g., not climate-controlled) settings. The analysis program must be able to accurately handle retention-time shifts (± 20 seconds), sloping baselines, and multiple co-eluting peaks, and to identify “new” peaks of which the operator may be unaware. An idealized workflow includes the ability to (i) fit a single peak in a large batch (e.g., 1000’s of files), (ii) fit all peaks (e.g., 100+ peaks) in a single file, (iii) analyze subsets of much larger datasets by sample type (e.g., calibrations, zeros, blanks, ambient samples), (iv) scan chromatograms for new peaks and identify these species, and (v) produce final concentration data for instruments with linear and non-linear sensitivities.
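To illustrate the distinction between baseline-drop integration and peak fitting, the sketch below fits a Gaussian peak plus a sloping baseline to a synthetic chromatogram segment. Production software would use asymmetric peak shapes (e.g., exponentially modified Gaussians) and proper nonlinear optimization; this grid-plus-linear-least-squares version is a simplified illustration only, and all names and numbers are invented:

```python
import numpy as np

# Sketch of peak *fitting*: model y = A*gauss(mu, s) + b0 + b1*x.
# For each candidate (center, width) on a grid, the linear parameters
# (amplitude, baseline intercept, baseline slope) are solved by least
# squares; the best-residual fit gives the peak center and area.

def fit_gaussian_peak(x, y, centers, widths):
    best = None
    for mu in centers:
        for s in widths:
            g = np.exp(-0.5 * ((x - mu) / s) ** 2)
            # Design matrix: Gaussian shape, constant baseline, baseline slope.
            M = np.column_stack([g, np.ones_like(x), x])
            coef, *_ = np.linalg.lstsq(M, y, rcond=None)
            resid = np.sum((y - M @ coef) ** 2)
            if best is None or resid < best[0]:
                best = (resid, mu, s, coef)
    _, mu, s, (amp, b0, b1) = best
    area = amp * s * np.sqrt(2 * np.pi)   # area under the fitted Gaussian
    return {"center": mu, "width": s, "area": area}

# Synthetic chromatogram: peak at t = 10 s, width 0.5 s, sloping baseline.
t = np.linspace(5, 15, 201)
y = 100 * np.exp(-0.5 * ((t - 10) / 0.5) ** 2) + 2 + 0.3 * t

fit = fit_gaussian_peak(t, y,
                        centers=np.arange(9.0, 11.05, 0.1),
                        widths=[0.3, 0.4, 0.5, 0.6])
print(fit["center"], fit["area"])
```

Because the baseline is part of the model rather than dropped by eye, the fitted area is insensitive to the slope, which is the quality-control advantage peak fitting offers over simple integration.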
The Office of Oceanic & Atmospheric Research (OAR), in support of NOAA’s science mission across the line offices, has a federal mandate to understand and predict changes in climate, weather, oceans, and coasts, and to conserve and manage coastal marine ecosystems and resources. The NOAA UAS Program has examined the use of state-of-the-art unmanned systems technologies to survey the most remote and dangerous parts of the world, including oceanic and Arctic environments.
Recently, the Papahanaumokuakea Marine National Monument was more than quadrupled in size, to 582,578 square miles of land and sea in the Northwestern Hawaiian Islands, increasing the scale and scope of management and monitoring activities. Initial analysis indicates that only 10% of vessels transiting the Monument submit reports in compliance with the International Maritime Organization’s Particularly Sensitive Sea Area (PSSA) regulations governing the area. Additionally, climate change is making Arctic ecosystems more vulnerable while enabling maritime and industrial activities to expand. Both of these areas highlight the need for additional, wide-area, long-endurance environmental intelligence to fill these data voids. Unmanned systems may safely, efficiently, effectively and economically fill these voids in an environmentally friendly way while collecting valuable in situ atmospheric and oceanic data.
Specific NOAA/OAR/Line Office goals include:
- In support of cross-NOAA line office requirements: research, design, develop and prototype the next-generation long-endurance, shipboard, unmanned systems tools and technologies for maritime surveys supporting marine domain awareness, Arctic and ice surveys, ecosystem assessments, weather, climate and other NOAA maritime missions. These multi-mission, transdisciplinary platforms will simultaneously fulfill several NOAA missions while directly supporting our operational partners at the U.S. Coast Guard and U.S. Navy, and will build on past transition to operations, applications, commercialization, or other use (R2X) successes.
- Develop, integrate and test new low Cost – Size, Weight and Power (C-SWAP) calibrated payloads onto autonomous unmanned systems for atmospheric boundary layer data capture while conducting maritime and Arctic missions.
- Leverage this data and information for climatology, ecosystem monitoring and for satellite support including calibration, validation, and verification (CAL/VAL/VER) of satellite sensors.
As Unmanned Aircraft Systems (UAS) mature in flight capabilities, operational readiness, and affordability, they provide a feasible alternative for maritime and Arctic observations. As such, there are three viable objectives with MAS, all of which can be efficiently achieved through a single system of shipboard (and land-based) long-endurance UASs with a high level of autonomy. The primary objective of MAS operations is to obtain routine shipboard, electro-optical and infrared (EO/IR) and full motion video (FMV) surveys while monitoring the Automatic Identification System (AIS), an automatic tracking system used on ships. Additionally, MAS operations will conduct high-fidelity meteorological observations of the lower atmosphere, with particular emphasis on the planetary boundary layer (PBL). Acquiring these data will serve two purposes:
- Provide the NOS, NMFS and associated maritime partners with a cost-efficient, operationally-feasible means of deploying UAS for maritime and Arctic surveys for marine monitoring, wildlife monitoring and ice monitoring. While capable of performing routine missions, the autonomous and mobile UAS platforms will also have the capability to be deployed for non-routine rapid-response maritime and Arctic operations for oil spills, marine debris, and search and rescue. Fulfillment of this objective will save on time and resources while gaining access to difficult to access remote and dangerous locations possibly saving lives.
- As an extension of the primary objective, the second MAS objective is to provide various atmospheric measurements within and near the PBL; this can be easily fulfilled through many of the same operational maneuvers and aboard the very same UAS platforms used for the first objective. The low C-SWAP of these sensors has been studied.
Key Driving Requirements for Low Altitude Maritime Survey + Planetary Boundary Layer Sampling
- Primarily EO/IR and Full Motion Video (FMV) + Atmospheric Sensing
- Wind Speed
- Wind Direction
- Air Quality: O3, CO2, NO2, actinic flux (sunlight)
- Accuracy of Data for Potential Sensors
- EO/IR: A resolution of approximately 2.5 cm x 6 cm would be necessary to read marine mammal tags, and 30 cm x 30 cm to read marine mammal bleach marks
- FMV: “Reasonable” quality video (at least 5 MP) for opportunistic preliminary damage assessments (HD resolution 720p or 1080i) -- AIS standard feed
- Temperature: +/- 0.2 C
- Humidity: +/- 5% RH
- Wind Speed: +/- 0.5 m/s
- Wind Direction: +/- 5 degrees azimuth
- Pressure: +/- 1.0 hPa
- Air Quality: +/- 5.0 ppbv O3, +/- 0.5 ppbv NO2; 2% actinic flux
- Sensor Response Time
- EO/IR, FMV, AIS - Real-time
- Atmospheric and Air Quality - Less than 5 seconds
- Altitude range
- Surface to 6,000 m
- Vertical Data Resolution (Atmospheric)
- 10 m for air quality measurement operations
- 25 m for meteorological measurement operations
- Horizontal range
- Threshold: 10 nm, Objective: 1000 nm
- Max Frequency of Deployments/Flights
- Constant deployment of one system (orbit) Threshold: 1 week, Objective: 5 months
- Operating Condition Ranges
- Wind Speed: 45 m/s (in-flight), Take-off/Landing: Threshold: 25 Knots, Objective: 35 Knots
- Temperature: –30 to +40 degrees C
- Humidity: 0-100% RH
- Ongoing precipitation / Types: All weather
- Endurance: Threshold: 24 hours, Objective: 36 hours
- Ascent Rate
- Ranging from 1 to 5 m/s, but ultimately dependent on sensor response / hysteresis
- Shipboard Footprint & Launch/Recovery Area
- On-load of system without crane services
- Launch/Recovery Area: Threshold: 20’x20’, Objective: 10’x10’
- Beyond Visual Line of Sight Equipment
- The UAS must be equipped with current beyond visual line of sight operations equipment.
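Two of the requirements above can be cross-checked with back-of-envelope calculations: the EO/IR ground sample distance (GSD) needed to read a 2.5 cm marine mammal tag, and the fastest ascent rate compatible with 10 m vertical resolution given the 5 s sensor response time. The camera parameters below are assumptions for illustration; the solicitation does not specify them.

```python
def ground_sample_distance(altitude_m, pixel_pitch_m, focal_length_m):
    """Size of one pixel projected onto the ground (pinhole camera model)."""
    return altitude_m * pixel_pitch_m / focal_length_m

def max_ascent_rate(vertical_resolution_m, sensor_response_s):
    """Fastest climb that still yields one fully settled sample
    per vertical-resolution bin."""
    return vertical_resolution_m / sensor_response_s

# Assumed camera: 3.45 um pixels behind a 50 mm lens, flown at 100 m altitude.
gsd = ground_sample_distance(100.0, 3.45e-6, 0.050)   # ~0.0069 m per pixel

# Reading a 2.5 cm tag needs roughly two pixels across it (GSD <= 1.25 cm).
readable = gsd <= 0.025 / 2

# A 10 m air-quality bin with a 5 s sensor response caps the climb at 2 m/s,
# which sits inside the 1-5 m/s ascent-rate range listed above.
rate = max_ascent_rate(10.0, 5.0)
```

These checks suggest the tag-reading requirement constrains the optics and survey altitude together, while the vertical-resolution requirement constrains the flight profile.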
NOTE: Even though a prototype may be required to be delivered for the project, it is important to note that this prototype remains the property of the offeror. NOAA would only conduct field or laboratory testing on that product to assess its feasibility in a production (or development) environment.
Customers are requesting improved products for assessing positioning errors for satellite navigation systems such as the Global Positioning System (GPS). GPS has become widely used for position, navigation, and timing. Single-frequency GPS navigation systems are part of nearly every automobile or smartphone sold today. Dual- and even triple-frequency GPS systems are used for more precise position and timing information and are widely used in maritime and aviation navigation, surveying, agriculture, oil and mineral exploration, and banking. Even the NWS weather models incorporate GPS data in their data assimilation schemes. Nearly everyone in the industrialized nations of the world has become reliant on satellite navigation systems.
The density variations of the ionosphere modify the path and speed of the signal from the GPS satellite to the ground receiver and thus introduce errors. The NOAA Space Weather Prediction Center (SWPC) currently provides products that help customers identify when and where these sorts of disturbances occur. But customers have indicated that the space weather impacts on their systems are not well captured by the current products, which do not provide adequate information on user impacts. For instance, the primary product from SWPC is a North American map of Total Electron Content (TEC) as observed from a number of ground-based GPS receivers across the US, Canada, and Mexico. This parameter provides a broad indication of when and where ionospheric disturbances might affect GPS, but it does not provide accurate information on the magnitude of the positioning errors for the different types of systems.
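The gap between a TEC map and a per-system error estimate can be illustrated with the standard first-order ionospheric group delay, approximately 40.3 · TEC / f² meters: the same TEC value produces different range errors at different signal frequencies, which is why single- and dual-frequency users experience disturbances differently. The TEC value below is illustrative.

```python
def iono_range_delay_m(tec_el_per_m2, freq_hz):
    """First-order ionospheric group delay in meters: 40.3 * TEC / f^2."""
    return 40.3 * tec_el_per_m2 / freq_hz**2

TECU = 1.0e16        # electrons per m^2 in one TEC unit
GPS_L1 = 1575.42e6   # Hz
GPS_L2 = 1227.60e6   # Hz

# An illustrative disturbed ionosphere of 50 TECU:
delay_l1 = iono_range_delay_m(50 * TECU, GPS_L1)   # ~8.1 m range error at L1
delay_l2 = iono_range_delay_m(50 * TECU, GPS_L2)   # ~13.4 m range error at L2
```

The frequency dependence is also what lets dual-frequency receivers cancel most of the delay, so a product translating TEC into user-relevant positioning error must distinguish receiver classes.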
As GPS technology has advanced, so has our understanding of how best to measure and understand the impacts of space weather on these systems. There are new ways of processing the existing ground GPS data to provide more accurate estimates of positioning errors that are more applicable to the needs of the end user. Given the large number of commercial and private sector users of GPS and the new analysis and data processing techniques, it is highly likely that new products and services could be developed, using existing data, that would have the potential for commercial services.
The specific goal of this project would be to develop a better way of processing the existing, publicly available, ground-based GPS data to create a new product or products that would give users of GPS position, navigation, and timing a better indication of when, where, and how space weather could be impacting their systems. This product should provide a real-time assessment of the environmental impacts on the accuracy and errors of the position and timing information generated by GPS devices and, if possible, provide a short-term forecast of how these impacts may change. The ultimate goal of this project would be to provide the numerous users of GPS satellite navigation with better, more accurate information on how their systems are performing.
Measurements of atmospheric aerosols, gases and meteorological parameters are critical components of NOAA’s climate and air quality studies (e.g., NEAQS, TexAQS, CalNEX, FIREX, CICCI). Unmanned Aerial Systems (UAS) provide a means to obtain these measurements from ships and land-based regions not easily accessible by manned aircraft. Several upcoming NOAA experiments (FIREX, SOCRATES, Gulf of Mexico-BOEM collaboration) have called out a need for UAS measurements. At this time there is no UAS that 1) has a maximum take-off weight under 55 lb, per FAA regulations, 2) can be deployed and recovered from a ship, 3) has a payload capacity of at least 15 lb (needed to carry the aerosol measurement payload), and 4) has a pusher engine so as not to interfere with gas and aerosol inlets on the front of the plane. Vertical take-off and landing (VTOL) fixed-wing UAS could answer this need; however, there is currently no such UAS on the market that meets the above requirements.
Ship-deployable UAS will fill many needs within NOAA, including marine mammal surveys, coastal ecosystem health monitoring, climate and air quality studies, and surveillance missions. The VTOL technology allows the UAS to be launched and recovered within a small deck footprint. The fixed-wing capability gives the UAS endurance for longer flights. At this time, VTOL fixed-wing UAS have not been optimized for use aboard ships. Shipboard operations present additional challenges not experienced in land-based operations. The magnetometer used on the UAS will not work on a metal ship. The turbulence around the ship due to the ship superstructure will require a more powerful VTOL motor than needed on land. Our mission requirement of a take-off weight of less than 55 lb, the ability to carry a 15 lb payload, and a pusher engine will require a trade-off with fuel weight and thus endurance. The funded project will need to take these challenges and requirements into account to optimize the UAS design.
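The payload/fuel/endurance trade-off described above can be sketched as a simple weight budget. Only the 55 lb maximum take-off weight and 15 lb payload come from the text; the empty weight and fuel burn rate below are assumed values for illustration.

```python
def fuel_budget_lb(mtow_lb, empty_weight_lb, payload_lb):
    """Fuel weight available after airframe and payload are accounted for."""
    return mtow_lb - empty_weight_lb - payload_lb

def endurance_hr(fuel_lb, burn_rate_lb_per_hr):
    """Crude endurance estimate assuming a constant cruise fuel burn."""
    return fuel_lb / burn_rate_lb_per_hr

MTOW = 55.0      # lb, FAA small-UAS limit (from the text)
PAYLOAD = 15.0   # lb, aerosol instrument payload (from the text)

# Assumed: 28 lb empty weight and 1.5 lb/hr cruise burn rate.
fuel = fuel_budget_lb(MTOW, 28.0, PAYLOAD)    # 12.0 lb of fuel
hours = endurance_hr(fuel, 1.5)               # 8.0 hr endurance
```

The sketch makes the design tension concrete: every pound of structure added for VTOL motors or shipboard hardening comes directly out of fuel, and therefore endurance.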
Public participation in scientific research through citizen science and crowdsourcing programs produces scientific data that is used to help address fundamental environmental questions. Data and other information generated through citizen science and crowdsourcing programs have been shown to be reliable and accurate, and the in-kind contributions to research have an economic value of up to $2.5 billion per year. In September 2015, a Federal Memorandum recommended agencies build capacity for citizen science and crowdsourcing.
Citizen science volunteers partake in environment-related studies and can sample wide geographic areas, creating large amounts of spatial and temporal data that can be used to obtain valuable insights into scientific questions. A major challenge volunteers face is a lack of low-cost, innovative equipment, as well as resources for better communication and information sharing. Citizen science programs typically utilize equipment that produces low-cost measurements (several dollars to approximately $50 per analysis), and financial limitations often prevent the use of modern instrumentation for laboratory and field data collection.
New, low-cost instrumentation is needed to increase the capabilities of citizen science data monitoring projects. Proposals are requested for laboratory and field instrumentation and equipment that produce robust, verifiable, scientific-quality monitoring data at an economical cost. Volunteers from a diverse age group participate in citizen science programs, so equipment needs to be rugged and have a simple, user-friendly design. Budget constraints often limit programs from purchasing state-of-the-art scientific equipment. The proposed equipment must be affordable to programs that only spend several dollars to approximately $50 per unit. In addition to field and laboratory equipment, proposals to create new applications for citizen science data portals are encouraged. Citizen science monitoring programs need open-source application platforms that enable volunteers to upload verifiable data into a user-friendly interface.
Machine vision and/or artificial intelligence tools to detect and respond to various classes of risk associated with aquaculture operations or the environment. For example, interactions of offshore marine aquaculture systems with marine mammals and turtles, or the detection of escapes, predators, disease and/or mortality in remote aquaculture operations. Creation and deployment of a system that can anticipate and/or alert aquaculture operators when there is an increased likelihood of an event or conflict (e.g. entanglement). This could be real time detection/monitoring or a combination of real time detection and modeling. The next step would be to act on the information provided by that tool; systems which could respond (or at least have a pathway to effect a response) to risks with mitigation measures are preferred.
These types of products address two key needs in the marine aquaculture industry: protection of livestock (as well as the local environment) and safety of employees. Currently, most of the crises that could occur at an offshore aquaculture installation would remain unknown to the operators without a human site visit or surveillance. Paying salaries and ensuring the safety of 24-hour staffing to monitor a farm site is prohibitively expensive, and operators are forced to gamble by estimating how much risk they are actually carrying. If technology could be developed and deployed that could detect a particular threat and alert the owners, they could take a very specific, targeted action in response (as opposed to visiting the site, discovering a problem and having to make a return trip to the site to resolve it). The types of risks that could be reduced or avoided by this technology include predation, damage to equipment, escape events, changes in water quality, the presence of harmful algal blooms, etc. For some threats, this technology could be designed to be capable of resolving the threat without the intervention of human operators (e.g., active deterrence of predators, or changing the position of a submerged cage to better protect it from dangerous weather).
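The detect-then-alert pattern described above can be sketched minimally as a rolling z-score detector on a sensor stream; the window size, threshold, and the dissolved-oxygen example readings are all illustrative assumptions, not part of the solicitation.

```python
from collections import deque
from statistics import mean, stdev

class AnomalyAlerter:
    """Flags readings that deviate sharply from the recent baseline."""
    def __init__(self, window=20, z_threshold=3.0):
        self.window = deque(maxlen=window)
        self.z_threshold = z_threshold

    def observe(self, value):
        """Return True if `value` is anomalous relative to recent history."""
        alert = False
        if len(self.window) >= 5:  # need a minimal baseline first
            mu, sd = mean(self.window), stdev(self.window)
            if sd > 0 and abs(value - mu) / sd > self.z_threshold:
                alert = True
        self.window.append(value)
        return alert

# Illustrative dissolved-oxygen trace (mg/L): stable, then a sudden drop.
alerter = AnomalyAlerter()
readings = [8.0, 8.1, 7.9, 8.0, 8.2, 8.0, 8.1, 7.9, 8.0, 3.5]
alerts = [alerter.observe(r) for r in readings]  # only the final drop alerts
```

A production system would layer machine-vision detections and event models on top of such sensor triggers, and route alerts to operators or automated mitigation hardware.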
Consumption of seafood contaminated with the marine toxin saxitoxin (STX) can lead to severe and debilitating illness in humans, and STX also harms shellfish, fish, seabirds and marine mammals. STX intoxication leads to paralytic shellfish poisoning, and symptoms include nausea, vomiting, diarrhea, abdominal pain, tingling sensations, shortness of breath and confusion. It is a major concern in New England and along the entire US West coast, as well as in many other regions worldwide. STX can cause respiratory failure and death within hours if respiratory support is not provided. There are very few rapid tests that can be used to detect STX quickly in order to protect the seafood supply by responding quickly to contamination events. One rapid test (the Jellett Rapid Test) is currently available, which is based on antibody detection. However, this test can produce false positive or false negative results of toxicity due to issues with antibody cross-reactivity to extraneous compounds in solution. The Jellett Rapid Test for STX is expensive ($75 per single-sample test), only gives a qualitative yes/no result, and has low sensitivity to some STX analogs or metabolites (e.g., neoSTX), which has made widespread use impractical. One technique that offers rapid testing in conjunction with sensitivity for the various toxins is the receptor binding assay (RBA). Currently, these assays are based on radioactively labeling the toxin and then mixing the labeled toxin with a sample extract and the target receptor on the cell surface to which the toxin binds. In the case of STX, the receptors added are isolated sodium channel receptors. If there is no unlabeled STX in the sample extract, only radioactive STX binds the sodium channel receptors, and high counts are recorded when the assay is read. In contrast, the more STX in the sample, the less radioactive STX is bound. This inverse relationship allows for a sensitive and accurate determination of the toxin content in the sample.
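The inverse relationship behind the competition assay can be illustrated with a simple one-site competitive binding model; the IC50 value and concentrations below are illustrative, not assay specifications.

```python
def fraction_tracer_bound(competitor_conc, ic50):
    """One-site competition model: fraction of labeled toxin still bound
    to the sodium-channel receptors when unlabeled toxin competes at
    `competitor_conc` (same units as `ic50`)."""
    return 1.0 / (1.0 + competitor_conc / ic50)

IC50_NM = 2.0  # illustrative half-maximal inhibitory concentration, nM

# No STX in the extract: the labeled tracer fully occupies the receptors
# and the assay reads high counts.
print(fraction_tracer_bound(0.0, IC50_NM))    # 1.0

# Unlabeled STX at the IC50 displaces half the tracer.
print(fraction_tracer_bound(2.0, IC50_NM))    # 0.5

# A heavily contaminated sample leaves little tracer bound: low counts.
print(fraction_tracer_bound(20.0, IC50_NM))   # ~0.09
```

Calibrating this curve against STX standards is what converts the measured signal (counts for the rRBA, fluorescence for a prospective fRBA) into a quantitative toxin concentration.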
Despite this accuracy, the radioactive receptor binding assay (rRBA) sensor system is problematic because of variability in the stability of the radioactive STX and the concurrent cost, training, and specialized facilities/equipment required. Obtaining stable batches of the radioactively labeled STX has proven problematic over the past few years, leading to periods when screening was impossible. A potential solution to these issues is to develop a fluorescent RBA (fRBA)-based sensor system. Such a system would have greater stability than the current radioactively labeled STX sensor system, equal or superior sensitivity, and lower cost due to the reduced need for waste handling, specialized equipment and facilities, and stringent training and certification requirements.
The current radioactive receptor binding assay (rRBA) sensor system for detecting the saxitoxins (STXs), which cause paralytic shellfish poisoning, is problematic. Because this reagent is difficult to manufacture and often unstable, there are periods when it is not available, which limits our ability to test for this important seafood toxin. The rRBA is also expensive to run due to costs associated with waste disposal, maintenance of a radioactive license, and the concurrent training and specialized facilities/equipment required. Consequently, there is a need for an alternative STX sensor system that is more stable and cost-effective. A fluorescence-based sensor system has the potential to address this need because it offers comparable sensitivity and is more stable and easier to use. Such fluorescent sensor systems, however, have not yet been applied extensively for the detection of seafood toxins because the coupling chemistry needed to make the required labeled toxin is challenging. If a solution to this coupling problem can be found, it would allow construction of fluorescent RBA (fRBA)-based sensor systems for STX and other marine toxins. Given this potential, the funded project would be for development of an fRBA which (1) quantitatively detects toxic analogs/congeners of saxitoxin (e.g., STX, neoSTX, GTX1/4) without detecting the non-toxic/less toxic analogs (e.g., C toxins and analogs), (2) yields low false positive and false negative results, (3) provides comparable sensitivity to the radioactive assay, and (4) can be successfully implemented in a 96-well plate format that can be read by fluorescent plate readers commonly available in many laboratories. The benefit of this approach would be a faster, more cost-effective method for measuring one of the major seafood toxins of concern in the US and worldwide. There is both a national and an international market for an fRBA sensor system.
If successful, the product could be easily commercialized and potentially be expanded in phase II to include development of detection assays for other marine toxins.
NOAA has invested significant resources in the development of DNA-based molecular assays for monitoring harmful bacteria and microalgae which adversely affect human and animal health and cause significant economic loss to the nation. These assays work well, and resource managers and public health officials have begun to employ them in some locations for routine monitoring aimed at identifying when significant environmental threats are present. These molecular assays are a preferred monitoring method because they can distinguish toxic species from co-occurring beneficial species that are morphologically identical under traditional light microscopy or other counting methods. There are also efforts underway to implement the assays in field-portable devices. The transfer of these methods to a broader user base, however, is currently being hampered by the lack of a commercial source of the standards required to calibrate the assays. NOAA laboratories are currently providing these standards to user groups on a case-by-case basis, but this is becoming a significant burden and does not represent a viable long-term strategy for implementing and commercializing these technologies. This grant would be for development of a cost-effective method of producing standards for molecular-based environmental monitoring assays. If successful, there are numerous standards from NOAA and other sources that could be profitably commercialized.
There is a significant monitoring need for molecular assays to detect toxic and pathogenic organisms in the environment. Numerous effective molecular detection assays for this purpose have been developed and published by NOAA, other Federal agencies, and research institutions. The widespread application of these assays, however, has been hampered by the lack of commercially available DNA standards needed for calibrating them. This grant is for developing a cost-effective method for producing the DNA standards used in molecular environmental assays based on quantitative PCR and other methods. These assays typically require approximately 100 ng of standard template DNA per standard curve. The successful applicant would be required to produce and accurately quantify µg quantities of DNA standard that can be partitioned into individual tubes, where each tube can be used for constructing a standard curve. The ideal cost point is $10-$20 per standard curve. The quantification of the DNA concentration must be precisely determined by digital PCR or some other method, and stability must be evaluated over a multiple-month period. NOAA will provide the DNA templates to serve as the test subject for producing the standard curve DNA. The DNA standards produced within this scope of work will not have to meet the rigorous quality control measures required for human disease testing. If a cost-effective way of producing standards can be successfully developed, there is a wide range of customers, including natural resource managers, public health officials and academic researchers, that would be interested in purchasing or ordering custom standards.
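The arithmetic linking a quantified DNA mass to a qPCR standard curve can be sketched with the standard copy-number conversion for double-stranded DNA; the 1 ng / 1000 bp example is illustrative, not a solicitation requirement.

```python
AVOGADRO = 6.022e23
BP_MASS_G_PER_MOL = 660.0  # average molar mass of one double-stranded base pair

def dsdna_copies(mass_ng, length_bp):
    """Copy number of a double-stranded DNA standard from mass and length:
    copies = mass[g] * N_A / (length[bp] * 660 g/mol/bp)."""
    mass_g = mass_ng * 1e-9
    return mass_g * AVOGADRO / (length_bp * BP_MASS_G_PER_MOL)

def standard_curve_dilutions(top_copies, points=5, factor=10.0):
    """Ten-fold serial dilution series for a qPCR standard curve."""
    return [top_copies / factor**i for i in range(points)]

# e.g., 1 ng of an illustrative 1000 bp template:
top = dsdna_copies(1.0, 1000)            # ~9.1e8 copies
curve = standard_curve_dilutions(top)    # five-point ten-fold series
```

A commercial standard would be quantified (e.g., by digital PCR, as the text notes) and then partitioned so each tube carries enough material for one such dilution series.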
The Patent Pending NOAA NOy-Cavity Ring-Down Spectrometer is a sensitive, compact detector that measures total reactive nitrogen (NOy), as well as NO2, NO and O3 using cavity ring-down spectroscopy (CRDS). This product is unique in that the optical cage system holds four optical cavities (with associated sample cells) and a laser together, allowing a measurement of all four trace gases simultaneously and with a robust calibration in a small package. The NOAA CRDS is compact and has lower power, size, weight, and vacuum requirements than chemiluminescence-based instruments while approaching equivalent sensitivity, precision and time response.
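The CRDS measurement principle can be illustrated with the standard ring-down relation: the absorber number density follows from the ring-down times with and without the absorber, N = (1/(c·σ)) · (1/τ − 1/τ₀). The cross-section and ring-down times below are assumed, illustrative values, not specifications of the NOAA instrument.

```python
C_CM_PER_S = 2.998e10  # speed of light in cm/s

def number_density_cm3(tau_s, tau0_s, sigma_cm2):
    """Absorber number density from ring-down times with (tau) and
    without (tau0) the absorber: N = (1/(c*sigma)) * (1/tau - 1/tau0)."""
    return (1.0 / tau_s - 1.0 / tau0_s) / (C_CM_PER_S * sigma_cm2)

# Assumed values: NO2 cross-section ~6e-19 cm^2 near 405 nm, and
# ring-down times of 30 us (empty cavity) vs 25 us (with sample).
n = number_density_cm3(25e-6, 30e-6, 6.0e-19)  # molecules per cm^3

# Convert to a mixing ratio at a typical surface air density (~2.46e19 cm^-3).
ppbv = n / 2.46e19 * 1e9   # roughly 15 ppbv for these assumed numbers
```

Because the measurement reduces to a ratio of decay times rather than an absolute intensity, the technique is largely self-calibrating, which is part of what makes the four-cavity design robust in a small package.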
Climate science and air quality monitoring provide ongoing applications for instrumentation to accurately measure atmospheric trace gases. The precision and accuracy of this instrument make it a versatile alternative to standard chemiluminescence-based NOy instruments currently on the market.
The markets for scientific instruments in the U.S. and abroad are well-established and supported by a number of known scientific instrument manufacturers, including at least three domestic and three international commercial manufacturers of cavity ring-down NO2 instruments. Given the compact and efficient performance and other unique features of this instrument for measuring ambient air across a range of environments and measurement platforms, it is an excellent licensing opportunity for the scientific instrument manufacturing sector.
The NOAA NOy CRDS was developed for the Earth System Research Laboratory in Boulder, CO, in order to support the lab’s research activities. There is one prototype in existence, which is in regular use by the lab. The goal of NOAA’s Technology Transfer program is to encourage the broader use of NOAA’s patented or patent-pending technologies in commercial markets and/or to encourage the development of new uses for our technologies. The project goal for this SBIR Technology Transfer solicitation, therefore, is to receive proposals from companies that are interested in and able to develop a more compact and commercially viable version of the NOAA NOy CRDS for sale.
In order to accomplish this goal, companies submitting proposals against this SBIR Technology Transfer topic would be required to sign a one-year, no-cost research and technology agreement, which may be renewed under Phase II should the Phase I activities be deemed successful.