DOC SBIR NOAA 2012-1
NOTE: The Solicitations and topics listed on this site are copies from the various SBIR agency solicitations and are not necessarily the latest and most up-to-date. For this reason, you should use the agency link listed below which will take you directly to the appropriate agency server where you can read the official version of this solicitation and download the appropriate forms and rules.
The official link for this solicitation is: https://www.fbo.gov/index?s=opportunity&mode=form&tab=core&id=fada14ff631c75636708234c986f3c3b&_cview=0
Available Funding Topics
8.1: Resilient Coastal Communities and Economies
- 8.1.1R: Unmanned Aerial System-Borne Gravimeter
- 8.1.2SG: Development of Ocean and Coastal Renewable Energy Related Technologies
- 8.1.3SG: Innovative Approaches to Facilitating Coastal and Marine Spatial Planning Processes
- 8.1.4F: Quantification of Greenhouse Gas Fluxes in Coastal Ecosystems
- 8.1.5N: Self Reporting GPS Tracked Bench Mark for Sensor Vertical Position
- 8.1.6N: Enhanced Electrochemical Detection of Toxins in Water, Shellfish and Fish Samples
- 8.1.7N: Robust HF Radar Tsunami Detection Software Development
8.2: Healthy Oceans
- 8.2.1R: Zero-Reaction Manipulator-Handled Submersible Drill Rig
- 8.2.2R: High-Sensitivity/High-Precision Measurements of Calcium Concentrations in Seawater
- 8.2.3F: Improving Environmental Sustainability and Competitiveness of US Marine Aquaculture
- 8.2.4F: Automated Image Analysis for Fisheries Applications
- 8.2.5F: ME70 Multibeam Processing Efficiency Improvements
- 8.2.6N: Development of Sustainable Coral Cell and Tissue-Culture Lines
8.3: Climate Adaptation and Mitigation
- 8.3.1C: Development of a Long-Term Lagrangian pH and pCO2 Drifter
- 8.3.2C: Assessing the Economic Value of Climate Predictions
- 8.3.3C: The Local Three Month Temperature and Precipitation Outlooks (L3MTO and L3MPO)
- 8.3.4C: Integrated Water Resources Adaptation and Mitigation Approaches in the Coastal Zone
- 8.3.5D: Environmental Baselines for Coral Reefs: SST, PAR and UV
8.4: Weather-Ready Nation
- 8.4.1W: Comprehensive Analysis of Lower Atmosphere for Support to Firefighting
- 8.4.2W: Probabilistic Tool for Improving Weather Decision Services
- 8.4.3W: Standardized Rip Current Forecasting and Dissemination
- 8.4.4W: Reducing Impact of Severe Space Weather on Global Positioning System (GPS) Satellite Signal Users
- 8.4.5D: Development of a Prototype Hyperspectral Microwave Sensor
- 8.4.6D: Low-Cost High Frequency Passive Microwave Radiometer for Ground Measurements
- 8.4.7D: Enhanced Geospatial Query Support for Oceanic Data Discovery
The National Geodetic Survey (NGS) within NOS has a federal mandate to provide accurate positioning, including heights, to all federal non-military mapping activities in the USA. The NOAA NGS leads the GRAV-D Project (Gravity for the Redefinition of the American Vertical Datum) with a specific goal to model and monitor Earth’s geoid (a surface of the gravity field, very closely related to global mean sea level) to serve as a zero reference surface for all heights in the nation. Accurate heights are critical information needed for better understanding of threats to low-lying communities and coastal ecosystems from inundation by storms, flooding, and/or sea level rise. The GRAV-D Project has successfully utilized airborne gravimetry observations to collect highly precise gravity measurements throughout CONUS, Alaska, and their littoral regions. However, more than 85% of the targeted surface area still needs to be economically surveyed, including portions of Alaska, the Aleutian Islands, Hawaii, the US Pacific Island holdings, and most of interior CONUS.
The ocean and coastal zones of the United States contain reserves of potential energy that have not yet been tapped to meet the increasing demands of an energy-hungry nation. Successfully tapping this energy will rely on more than just new energy-harvesting technologies; it will also rely on the ability to site such projects in an environmentally sound way and to assess the environmental impacts of such installations in a logical, efficient manner.
The ocean and coastal zones of the United States are called upon to serve a variety of human purposes. As the number and complexity of human uses grow, conflicts arise. The Ocean Policy Task Force defines coastal and marine spatial planning as a comprehensive, adaptive, integrated, ecosystem-based, and transparent spatial planning process, based on sound science, for analyzing current and anticipated uses of ocean, coastal, and Great Lakes areas. Coastal and marine spatial planning identifies areas most suitable for various types or classes of activities in order to reduce conflicts among uses, reduce environmental impacts, facilitate compatible uses, and preserve critical ecosystem services to meet economic, environmental, security, and social objectives.
Coastal wetlands, mangroves, and sea grasses sequester vast amounts of carbon in their plant material and sediments. These carbon sequestration and storage capabilities are important ecosystem services that, if incorporated into management and planning, can increase the protection and restoration of these habitats and allow for their inclusion in carbon markets. Key first steps to leverage carbon services to increase habitat conservation are to better understand exactly how much carbon is being sequestered by or emitted from these ecosystems, as well as how much is stored in the sediments from historical accumulation. Accurate data on these carbon services (and on the areal extent of the habitats) are critical to support the development of carbon sequestration/storage protocols for coastal wetlands and to support efforts to incorporate the carbon services of these habitats into Federal decision-making. The collection of more accurate data on carbon services will be facilitated by the development of an easy-to-use, in-the-field instrument or software tool that can quantify net carbon flux. There are additional potential beneficiaries of such an instrument or software, including the agricultural industry, the energy sector, private capital investment firms, and developing countries. Accurate and rapid measurements of carbon sequestration, emissions, and storage will provide the science support for federal and state habitat conservation, consultations, and possible regulatory efforts. Improving the science behind quantifying carbon services can also facilitate the development of voluntary carbon markets, which in turn can provide private sector funding that supports habitat conservation goals. The cost of monitoring carbon is currently a hurdle for coastal carbon projects. Development of cost-effective, rapid measurement instrumentation or technology can help drive down costs and increase the feasibility of incorporating coastal carbon into planning and conservation. A tool that could be easily used in the field, or a remote sensing tool, would be particularly useful.
With continuing advances in GPS technology (accuracy, power consumption, and unit size), the concept of a self-reporting bench mark could provide substantial cost savings and other efficiencies related to monitoring vertical stability. The bench mark could be stand-alone or incorporated into a sensor. With proper timing, acquired knowledge of its surroundings (proximity to buildings, bridges, towers, obstructions, etc.), and relatively long occupation times, it should be possible to approach the accuracy of a manual GPS leveling survey conducted with accepted receivers and procedures. Shifts following earthquakes or hurricanes could be tracked without dispatching a field crew. This information could support decisions to shut off or continue dissemination of critical data at the site of a disaster (e.g., the 2011 Pacific tsunami that hit Japan and affected much of the Pacific, including the U.S. West Coast). One of the biggest benefits would be the direct incorporation of this bench mark into sensors that monitor various vertical movements, such as water level. To produce meaningful data, these sensors have to be leveled in and routinely checked. While some of the accuracy levels required might take hours or even days of observation to accumulate, this approach would eliminate or greatly reduce the expense of deploying a field crew.
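The sketch below gives a rough feel for the "lots of time" tradeoff. All numbers are hypothetical assumptions (single-epoch noise, target uncertainty, logging interval), and it uses the idealized rule that averaging N independent epochs reduces noise by 1/sqrt(N); real GPS height errors are correlated by multipath and atmospheric effects, so actual occupation times would be considerably longer, consistent with the hours-to-days estimate above.

```python
import math

sigma_cm = 5.0         # assumed single-epoch GPS vertical noise (cm); illustrative only
target_cm = 0.5        # assumed target uncertainty, comparable to a leveling tie (cm)
epoch_interval_s = 30  # assumed receiver logging interval (s)

# Idealized model: independent epoch errors average down as sigma / sqrt(N),
# so reaching the target requires N >= (sigma / target)^2 epochs.
epochs_needed = math.ceil((sigma_cm / target_cm) ** 2)
hours = epochs_needed * epoch_interval_s / 3600.0
print(f"~{epochs_needed} epochs, roughly {hours:.1f} hours of observation (best case)")
# Correlated errors in practice push this toward many hours or days.
```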
Toxins found in water, shellfish, and fish threaten human health and result in significant economic losses. This is particularly true of the toxins produced by harmful algal blooms, which are becoming more frequent. Consequently, there is a need for rapid, easy-to-use, and inexpensive detection methodologies that allow resource managers, public health officials, aquaculturists, and commercial and recreational fishers to detect these toxins. One approach to developing such technologies that has shown great promise is electrochemical detection. A major advantage of these technologies is that they provide an unambiguous digital readout. This contrasts with current field test technologies, which depend largely on interpretation of a color change to estimate toxin concentrations. The primary impediment to broadly applying this technology to water, fish, and shellfish toxin analysis is that the typical electrode design depends on incubating a small amount of sample on top of a fixed electrode surface. Having to use a small volume of sample often limits the sensitivity of the assay such that only toxin levels at or above the regulatory limit can be reliably measured. To overcome these limitations, we are seeking proposals for the development of a flow-through, membrane-based electrode technology that can concentrate the toxin and allows interfering compounds to be rapidly rinsed away prior to detecting the toxin.
Although predicted 32 years ago, the unique tsunami signature expected in HF radar data was observed and confirmed for the first time only recently, with the March 2011 Japanese event. It was observed by several radars, both in Japan and on the U.S. West Coast. First-generation detection software based on highly idealized assumptions and simulations had been developed and offered commercially before the Japan event. With the advent of actual data, this software should be improved and optimized, leading to a more useful, robust, and reliable product. Detecting the pattern of a tsunami in HF radar radial velocity data in a timely fashion depends on many tradeoffs. Foremost is a good representation of this radial pattern as influenced entirely by the local bathymetry offshore from the radar.
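As a rough illustration of why the local bathymetry controls the expected radial pattern (this is not a sketch of any existing commercial detection software), the snippet below applies the standard shallow-water long-wave relations, phase speed c = sqrt(g*h) and depth-averaged current u ≈ η*sqrt(g/h), over a hypothetical depth profile and tsunami amplitude. It ignores shoaling amplification (Green's law) and all radar-specific geometry and noise.

```python
import math

G = 9.81                                    # gravitational acceleration (m/s^2)
depths_m = [2000, 1000, 500, 200, 100, 50]  # hypothetical depths, offshore -> nearshore
cell_width_km = 10.0                        # hypothetical radar range-cell width
eta_m = 0.2                                 # hypothetical open-ocean tsunami amplitude (m)

travel_min = 0.0
for h in depths_m:
    c = math.sqrt(G * h)           # long-wave phase speed (m/s)
    u = eta_m * math.sqrt(G / h)   # current signal the radar would sense (m/s)
    travel_min += (cell_width_km * 1000 / c) / 60
    print(f"h={h:5d} m  speed={c:6.1f} m/s  current={u * 100:5.1f} cm/s  "
          f"cumulative travel={travel_min:5.1f} min")
# Shallower range cells show slower propagation but stronger radial currents,
# which is why the detectable pattern is dictated by the offshore bathymetry.
```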
A recurring request from principal investigators using human-occupied (HOV) and remotely operated (ROV) submersible vehicles is the ability to take a number of short core samples from rock outcrops. These cores would then be subjected to a number of analyses. Currently awarded projects researching drowned fossil coral reefs would benefit from core samples for dating and related analyses to determine the subsidence and sea level history of the Hawaiian Ridge, which has application to paleoceanography and global climate change studies. Additional applications and benefits include expanded support for studies of paleomagnetism, radiometric dating, petrology, paleontology, and archeology, as well as engineering/corrosion assessments of submerged cultural resources, renewable energy installations, and ocean observatory components. Submersible hydraulic drills are not the issue; several commercial units are available that are small enough to handle with a manipulator. Several requirements make this difficult and call for innovation: the system must 1) be able to eject the drill bit or the entire drill (probably both), 2) take multiple cores during one dive, and 3) keep the torque of the drill from disorienting an essentially neutrally buoyant vehicle. Size and weight limitation: <100 lbs dry. The drill must fit in and be operable by the claw hand of a standard manipulator (e.g., Schilling Titan 4), be hydraulically powered, and operate off a system producing ~3 gal/min at ~2,000 psi. It should produce cores ~1 inch in diameter and up to ~6 inches in length, and be able to take 6-8 cores during one dive. Higher RPMs and/or a counter-rotating mechanism may reduce the need for drastic operational techniques to maintain vehicle orientation and advance the bit.
The decrease in seawater pH (ocean acidification) caused by absorption of anthropogenic atmospheric CO2 leads to a reduction in the aragonite saturation state (ΩAR). Studies indicate that calcifying marine organisms respond to reduced ΩAR with decreased calcification rates. In particular, calcification rates in reef-building corals may have slowed by 10% over the last 150 years, with predictions of another 15-30% slowdown by the end of the century. We cannot fully evaluate or address this threat to ocean biota without a highly precise, sensitive, and direct method for measuring calcification rates. Existing methods for measuring calcification rates are not ideal. The most common method measures changes in total alkalinity by titration, then uses assumptions and equations to calculate the change in [Ca+2]. Measuring skeletal incorporation of radioactive Ca+2 is hazardous and difficult to use for field/community studies. Direct measurement of [Ca+2] is possible with ion-selective electrodes (ISEs). Calcium ISEs are common in clinical applications but are not widely used in seawater because of calibration difficulties, drift, and low sensitivity. Measurements of [Ca+2] by complexometric titration are laborious and lack the needed sensitivity and precision. To advance, the field needs a quick and simple method to measure [Ca+2] with a precision of ±5 µM. Major issues to address are: the high background [Ca+2] in seawater (~10.2 mM); interference from other ions (e.g., Mg); and the relatively small changes in [Ca+2] during laboratory incubations or diel cycles due to calcification (~0.050 mM). Achieving this goal will require a novel technology or analytical method.
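A quick arithmetic check, using only the values stated above, shows how demanding the measurement is: the target precision corresponds to roughly 0.05% of the seawater background, and the calcification signal itself is only about 0.5% of that background.

```python
# Values taken from the text above; this is only a back-of-the-envelope check.
background_mM = 10.2   # typical seawater [Ca+2]
signal_mM = 0.050      # calcification-driven change over an incubation or diel cycle
precision_mM = 0.005   # target precision of +/-5 uM

print(f"precision relative to background: {precision_mM / background_mM:.3%}")  # ~0.05%
print(f"signal relative to background:    {signal_mM / background_mM:.2%}")     # ~0.5%
print(f"signal-to-precision ratio:        {signal_mM / precision_mM:.0f}x")     # ~10x
```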
The purpose of this subtopic is to develop innovative products and services to support the development of an environmentally, socially, and economically sustainable marine aquaculture industry in the US in a way that is compatible with healthy marine ecosystems and other users of coastal and ocean resources. As marine aquaculture technology moves from research to operations, aquaculture producers need affordable and reliable techniques, products, and services to support the growth and economic viability of sustainable aquaculture operations. There is also a need for reliable and affordable equipment, instruments, tools, and techniques to assess the potential risks and benefits of marine aquaculture facilities and to monitor any impacts of marine aquaculture operations on marine ecosystems. NOAA's 2011 Annual Guidance Memorandum states that "Eliminating overfishing, rebuilding overfished stocks, and enabling sustainable marine aquaculture are essential for achieving fish populations that can produce maximum sustainable yields, ensuring the long-term sustainability of commercial and recreational harvests, and maximizing the economic and social benefits of sustainable fisheries and safe seafood." Enabling the development of sustainable marine aquaculture figures prominently in NOAA's Next Generation Strategic Plan and in NOAA's new Policy for Marine Aquaculture (currently in draft form and awaiting final release after public comment). The three areas of focus for SBIR grants in aquaculture this year closely align with these guiding principles: (1) alternative feeds, (2) improved health management, and (3) novel production technologies and techniques.
Video and image recording systems are increasingly being used by NMFS for a multitude of applications. Underwater systems are deployed on or near the seafloor, or towed above it, to record images of fish that are later analyzed to estimate numbers, sizes, and species composition in an area. Underwater systems are also installed in trawls to collect images that can be used to determine the numbers, sizes, and species of fish caught in the trawl, with the end goal of developing non-destructive trawls that collect all necessary information without actually catching the fish. Other systems are installed on commercial fishing vessels to monitor what fish are caught, kept, and discarded during fishing operations. The effort required to analyze data from these systems is time consuming and expensive. Computer-automated analysis to identify fish species contained in an image sequence or video segment has been moderately successful under very controlled photographic conditions when the potential number of species in the images is limited to just a few. Fish lengths are successfully measured using stereo camera systems, but this requires significant manual input by an analyst. There is a need for innovative approaches to automated recognition and counting of fish species and estimation of fish length in the images collected by these various systems.
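For context on the stereo-camera length measurements mentioned above, a minimal sketch of the underlying geometry follows. It is not an existing NMFS tool: the calibration values and matched pixel coordinates are hypothetical, and a rectified pair with square pixels (fx = fy) is assumed.

```python
import math

def pixel_to_xyz(x_px, y_px, disparity_px, fx, cx, cy, baseline_m):
    """Back-project a pixel with known disparity to camera coordinates (rectified pair)."""
    z = fx * baseline_m / disparity_px  # depth from disparity
    x = (x_px - cx) * z / fx
    y = (y_px - cy) * z / fx            # assumes square pixels, fx == fy
    return (x, y, z)

# Hypothetical calibration: focal length 1400 px, principal point (960, 540), 30 cm baseline.
fx, cx, cy, baseline = 1400.0, 960.0, 540.0, 0.30

# Matched snout and tail points: (x_left, y_left, disparity = x_left - x_right), in pixels.
snout = pixel_to_xyz(1010, 520, 210, fx, cx, cy, baseline)
tail = pixel_to_xyz(1180, 610, 190, fx, cx, cy, baseline)

length_m = math.dist(snout, tail)  # Euclidean distance between the two 3-D points
print(f"estimated fish length: {length_m * 100:.1f} cm")
```

Automating this well requires reliably finding the snout and tail correspondences, which is exactly the manual step the solicitation seeks to reduce.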
The NOAA Northeast Fisheries Science Center (NEFSC) conducts annual spring and autumn bottom trawl surveys from Cape Hatteras, North Carolina into Canadian waters. The surveys use a stratified random design in which the trawl locations are selected a priori based on stratification and allocation of effort. Approximately 360-400 bottom trawl hauls are made during each survey. During a survey, the majority of trawls are damaged, often beyond repair. The damage occurs primarily in the northern half of the survey area, but also in other rocky areas. Currently, a single-beam echo sounder is used to "scout" an area for a trawl path that will not damage the trawl. This method is time consuming and has not been effective, given the propensity for damaging the bottom trawls. In addition to the need for bathymetry, there is also a need to detect and enumerate the many living marine resources residing above the seabed and in the water column. Acoustic hardware (e.g., multi- or single-beam sonars) and software are now able to collect bathymetry and water-column data, but are often specialized for mapping the seafloor or detecting targets in the water column, not both. A need clearly exists for a more efficient and effective method to evaluate bathymetry for towing a bottom trawl while simultaneously collecting water column data.
This solicitation seeks a technology that enables sustainable culturing and cryopreservation of scleractinian coral cell and/or tissue lines for in vitro propagation and experimentation. Generating immortalized vertebrate cell lines has revolutionized the fields of medicine, agriculture, and toxicology; however, there are no permanent marine invertebrate cell lines in existence. To date, the availability of coral cell cultures is limited to cells isolated from wild-caught specimens as primary cultures that stop dividing within 24-72 hours and remain viable for only weeks. Marine invertebrates have been recalcitrant to in vitro cell culture, illustrating the need for innovative solutions. Overcoming this barrier will help identify risk factors for threatened coral species by providing a toxicological tool for screening environmental toxicants in marine waters. Coral cell lines will open research in cell biology, virology, genetics, biochemistry, and physiology for a clearer understanding of natural processes and pathologies. Providing an alternative to wild-harvested corals is a significant contribution to the conservation and management of these marine resources. Industry can also benefit from the ability to screen, select, and produce novel compounds. Given the need to identify causes of global coral reef decline, such a technological breakthrough would provide an invaluable tool to elucidate causes and devise interventions. Because corals are basal metazoans with genes more closely related to humans than those of other invertebrate model organisms (e.g., Drosophila and C. elegans), genetically distinct immortalized coral cell lines would broaden the commercial market by providing an alternative to the classical invertebrate models.
The oceans are a major sink for atmospheric CO2 and have mitigated a large fraction of anthropogenic emissions since the industrial revolution. Despite the large amount of ocean CO2 data, there is still significant uncertainty in estimates of the ocean uptake of anthropogenic CO2. For example, measurement-based estimates of air-sea CO2 fluxes still have >50% uncertainty (Takahashi et al., 2009). Much of this uncertainty can be attributed to the difficulty of sampling the global ocean with sufficient spatial and temporal coverage. To further the understanding of air-sea CO2 fluxes, ocean acidification, and inorganic carbon dynamics, intensive, year-round monitoring is required. Autonomous sensors for pH and pCO2 have been developed over the past decade and are becoming more widely used for oceanographic research. However, an integrated autonomous pH and pCO2 platform is not currently commercially available. The most important and immediate need is to obtain pCO2 and pH measurements in the surface ocean. The surface float technology for integration of these sensors is well advanced. Availability of a surface pH-pCO2 float would make possible the development of an Argo-like CO2 monitoring network, which does not currently exist.
To measure the economic value of climate information, a methodology is under development to estimate the use of climate information by the Southeast agricultural sector. When completed, this methodology may be transferable to other sectors, or it may be necessary to develop other sector-specific methods of assessing the use of climate information and its value to each sector. Methodologies, whether the one currently in development or those to be developed, should also be able to assess the economic value of improved predictive capability, given that the information provided will be incorporated into sectors' decision-making processes. New methodologies need measures that are specific to different sectors of the national economy yet general enough to transfer from one sector to another. Key sectors of the national economy may include, but are not limited to, agriculture, management of water resources, health, conventional and renewable energy, and transportation. An assessment of sectoral needs for climate information and sensitivity to the quality of the information should be achieved as part of this project. Project outcomes include a measure of the economic value of climate predictions, which will provide an assessment of:
- Dollars saved by a sector of the economy as a result of the use of climate predictions; or
- Loss avoidance in the economic sector as a result of the use of climate predictions; and
- Projected savings with percentage improvement in the skill of climate predictions.
NOAA climate data and forecast products respond to decision needs at national to local scales. Few products, however, convey both a view of the future and a picture of the past. Putting projections into the user's context of institutional and operational memory allows them to better respond to changing climate conditions. One product that does so, the Local 3-Month Temperature Outlook (L3MTO), introduced in 2007, facilitates decision-making at a local scale. The 2009 Customer Satisfaction Survey of NOAA NWS Climate Products indicated that 86% of respondents, after viewing the L3MTO, wanted NOAA to issue a similar product for precipitation, a Local 3-Month Precipitation Outlook (L3MPO). Since then, periodic customer satisfaction surveys have indicated a need for L3MTO product improvements, especially in how the information is communicated to the wide range of NOAA climate users. This subtopic seeks to improve the L3MTO product, develop an L3MPO, and deliver decision support tools that include visualizations of the forecast products and concurrent past weather conditions. This work will allow more users to better respond to changing climate conditions by putting the projected 3-month forecasts into the context of past local climate conditions. Present users of national temperature and precipitation outlook products include agriculture, construction, energy, reclamation, recreation and tourism, retail, water resources, and wildlife management. Future use of the L3MTO and L3MPO would expand beyond the technically literate to many other sectors. Decision-makers, business, and industry will benefit most from improved methods of communicating this highly technical forecast information.
In recent years, water resource managers have become increasingly concerned about the amount of energy used, and therefore the greenhouse gas (GHG) emissions generated, to provide water services (i.e., drinking water and wastewater) to their consumers. At the same time, many utilities face potential long-term climate change impacts on water quality, availability, and the built infrastructure for water delivery and management. Potential water quality and quantity impacts include saline intrusion into coastal aquifers and loss of freshwater flows due to inland droughts and increasing demand. Infrastructure threats arise from potential damage due to sea level rise and the increased frequency and severity of storms and flooding. Finally, an important issue to consider is the possibility that collection, distribution, and treatment systems located along the coast will be constantly flooded, or will lie completely below the new water table and have to be relocated. Increasing climate-related risk is forcing adaptation discussions and action. Given new public awareness; a desire by local, state, and federal entities to reduce GHG emissions; a need to develop new tools and approaches for adapting to a changing sea level and climate; and the reality of the cost of adaptation and mitigation to utilities, the results of this project would benefit a number of utilities, particularly those in coastal areas that are already (or soon will be) addressing these pressing issues. The primary objective of this project is to provide utilities with the tools and technologies they need to effectively and economically adopt adaptation strategies that reduce energy use, protect the quality and quantity of water supplies (especially from saline intrusion into coastal drinking water aquifers), and reduce the potential damage to infrastructure from climate-related risks. While a limited number of existing tools address individual aspects of adaptation or mitigation, decision makers are seeking tools and technologies to help them make optimal decisions that meet multiple goals, including how to build resilience to multiple threats without increasing energy use. This need was articulated at a recent workshop hosted by NOAA, EPA, NASA, the Water Research Foundation, and the Water Environment Research Foundation, titled "The Future of Research on Climate Change Impacts on Water: A Workshop Focusing on Adaptation Strategies and Information Needs," held in August 2010. For more information see: www.waterrf.org/projectsreports/publicreportlibrary/4340.pdf. The research products should address some combination of water quality and quantity, or mitigation and adaptation, simultaneously, and could include methods and technologies to: optimize carbon and water footprinting of adaptation approaches, reduce the energy intensity of water treatment and movement, generate energy at the treatment facility and develop means of becoming a net-zero energy user, create an ecological/environmental footprint metric, and/or develop and use non-conventional water sources.
Coral reef ecosystems provide a number of essential ecological services that underpin industries such as tourism and fisheries, and are important for stabilizing and protecting coastlines and human infrastructure from wave energy. In the U.S., reefs generate over $18 billion in tourism and fishing. Climate change represents the gravest threat to coral reefs. Increasing stress from rapidly warming seas is increasing the frequency and severity of coral bleaching and mortality. Since 1997, NOAA Coral Reef Watch (CRW) has been producing coral bleaching forecast products using specialized SST climatologies. Currently these are based on AVHRR Pathfinder data; however, with the increased spatial resolution of operational SSTs and the need for longer, more accurate data sets that include PAR and UV, the Pathfinder methodology cannot provide the required data. This project seeks to provide CRW with the data needed to derive climatologies suitable for use with current and future operational SST, PAR, and UV satellite products. This requires the development of methodologies and processing systems that utilize relevant polar and geostationary data to produce 0.05-degree resolution SST data from 1981 onward that are compatible with current operational NOAA GOES/POES blended SST products. Methodologies and processing systems are also needed to provide PAR and UV products that match the 0.05-degree resolution SST product and are compatible with current operational NOAA PAR and UV products. The SST, PAR, and UV products need to be of high accuracy and should be internally consistent through space and time.
Wildfire-suppression costs are estimated at $3B per year, with additional costs for damage to property, infrastructure, health, and natural resources. More importantly, many firefighters and homeowners lose their lives during evacuations and when fires make unpredictable movements. Many researchers have focused on the surface conditions that affect fires, but there is increasing recognition that the three-dimensional atmosphere, especially in the planetary boundary layer, plays a key role. Gaps in lower-atmosphere data, in the assimilation of those data into analysis schemes and high-resolution models, and in the forecasts themselves all result in a very poor diagnosis and prediction of the fire environment. The 2008 NOAA SAB report, "Fire Weather Research: A Burning Agenda for NOAA," strongly advocates that "high spatial and temporal resolution (surface) observations and (upper air) soundings…are needed in the immediate vicinity of the wildland fire for both nowcasting and initialization of numerical models…data passed with minimal latency to the forecaster…". The National Research Council's Board on Atmospheric Sciences and Climate led the 2010 publication of "When Weather Matters," which states, "temperature, humidity, and dry lightning can play a role in wildfire initiation, development and spread, while winds and terrain typically play key roles in spreading major wildfires." Many researchers are examining the potential use of Unmanned Aircraft Systems (UAS) to predict conditions that can dramatically impact fire behavior, but UAS are at this point considered cost-prohibitive. Conventional aerial surveys are costly and difficult to arrange; pibals and balloonsondes do not provide adequate spatial coverage and cannot be guided; and aerostats are not mobile and cannot follow the evolution of the fire. What is needed is a comprehensive examination of deployable profilers, UAS, and even dropsondes from aircraft that could improve the spatial and temporal resolution of atmospheric conditions that impact the dynamics of volatile fire lines, fires in rough topography, and areas where fuels can change fire behavior as influenced by weather conditions.
Goal #1 of the NWS Strategic Plan (2011) is to "Improve weather decision services for events that threaten lives and livelihoods." Meeting it will involve effective application of probabilistic forecast information to benefit many areas of government and industry, including firefighting, emergency management, commerce, energy planning, and agriculture (NRC 2006). One key to meeting this goal is the introduction of the necessary tools, as described in the NWS Strategic Plan:
- Forecaster Tools: Develop and implement, with the research community and other partners, forecaster tools that support data mining, enhanced visualization, smart decision assistance, and forecaster coordination and collaboration.
- Decision Support Tools: Develop and implement, with users and partners, tools to apply weather, water, and climate information, including forecast uncertainty, to user decision processes and systems.
- Social Science: Integrate social science research, methods, and capabilities into science service areas, forecaster tools, and decision support systems.
The objective of this subtopic is to fulfill this visionary advancement with a focus on the "forecaster tool" that enables the forecaster to support optimal decision making by primary customers (e.g., emergency managers, FAA, etc.). The tool will fully incorporate probabilistic weather information and principles of risk analysis and decision theory. Part of that functionality must include the ability to objectively define the customer's risk tolerance, particularly for sequential and dynamic decision contexts. To address aspects of social science and human cognition (NWS 2010), the tool will also promote effective communication of the optimal decision (e.g., the reasoning on forecast uncertainty and the decision recommendation) to the customer in an interactive forum. Lastly, the tool must be able to validate the benefit(s) to a customer who follows the optimized decision recommendations. While the main focus is on the "forecaster tool," consideration will also be given to constructing a user-specific "decision support tool." Tailoring the application of probabilistic weather forecasts to the often complex aspects of a user's decision context is critical to fully realizing optimal performance, as demonstrated in studies such as Small et al. (2011).
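To make the decision-theory element concrete, the sketch below shows the classic static cost-loss model, one simple way a tool could encode a customer's risk tolerance and validate the expected benefit of a recommendation. It is only an illustration; the customer, costs, and probability values are hypothetical, and the solicitation itself calls for richer, sequential, and dynamic decision contexts.

```python
def optimal_action(p_event: float, cost_protect: float, loss_unprotected: float) -> str:
    """Recommend an action given the forecast probability of an adverse event.

    Cost-loss reasoning: protecting always costs C, while not protecting risks a
    loss L with probability p. Protecting is optimal when p >= C / L, i.e., when
    the expected loss exceeds the cost of protection. C / L encodes risk tolerance.
    """
    threshold = cost_protect / loss_unprotected
    return "protect" if p_event >= threshold else "do not protect"

def expected_expense(p_event: float, cost_protect: float, loss_unprotected: float) -> float:
    """Expected expense of following the recommendation (usable to validate benefit)."""
    if optimal_action(p_event, cost_protect, loss_unprotected) == "protect":
        return cost_protect
    return p_event * loss_unprotected

if __name__ == "__main__":
    # Hypothetical customer: protecting outdoor assets costs $2k; an unprotected hit
    # from the hazard costs $25k, so the decision threshold is p = 0.08.
    p = 0.15  # probabilistic forecast of the hazardous event
    print(optimal_action(p, 2_000, 25_000))    # -> "protect"
    print(expected_expense(p, 2_000, 25_000))  # -> 2000.0
```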
According to the United States Lifesaving Association, rip currents cause approximately 100 fatalities per year in the U.S. The NWS has enhanced its rip current program over the last several years by:
- incorporating rip current science and forecasting into training
- developing a rip current awareness website (http://www.ripcurrents.noaa.gov)
- designating a yearly rip current awareness week (first week of June)
- having outlooks highlighted on the watch, warning, and advisory page (http://www.weather.gov)
- developing a rip current monitoring program in which lifeguards are trained to observe and report surf zone conditions conducive to rip current development
- increasing its rip current education and outreach program
- providing more detailed rip current risk outlooks via surf zone forecasts and other coastal statements
- investigating the value of rip current watches, warnings, or advisories.
Despite these enhancements, there are no standard national rip current forecasting methods used across NWS coastal offices. As a result, NWS rip current outlooks vary in product issuance format, threshold, terminology, and dissemination, and are often not clearly understood by users, especially visitors and tourists. A disproportionate number of rip current fatalities are visitors or tourists. Most rip current fatalities are males (80-85%), and the majority of those males are young, between the ages of 10 and 29. There are various social reasons for this gender bias, and the NWS is beginning to target this gender and age group, but much more effort is necessary.
The Nation's critical infrastructure and economy are increasingly dependent on high-accuracy GPS positioning, navigation, and timing services. Severe space weather can degrade or disrupt the GPS signal, which in turn can prevent dual-frequency GPS receivers from locking onto the GPS satellite signal and from determining position at all (denial of service). High latitudes, such as the Alaska Region, are especially susceptible. As our Nation's dependence on reliable satellite navigation (GPS) increases, any denial of service will have significant life, safety, and economic impacts. Specification and forecast products are needed to support the broad GPS user community. Precision GPS systems are now integral to many commercial enterprises, including air transportation, oil exploration, road building, agriculture, surveying, shipping, and transportation. Many new applications of GPS have been deployed in the last five years, during which time there were few major space weather storms because the sun was at the lowest point of its eleven-year solar cycle. Thus, there are numerous customers for GPS products who do not yet know they are customers. The NOAA Space Weather Prediction Center currently has no operational product for specifying or forecasting ionospheric scintillation and the resulting denial of service. A network of ground-based GPS receivers in North America makes it possible to characterize ionospheric scintillation and potential denial of service in real time. These new data could be assimilated into an empirical or even a physics-based model to provide specification and forecast capabilities for GPS denial of service.
Modern passive microwave space-borne sensors have only a limited number of channels available, totaling anywhere between 5 and 30. This limited number of channels has been shown to be insufficient to resolve the ill-posed inversion of the geophysical state from space-borne measurements. This is especially true when cloud, rain, and/or ice are present in the atmosphere; in these cases a large uncertainty exists due to the lack of knowledge about particle density, shape, size distribution, vertical structure, temperature dependence, etc. A larger number of channels will help resolve the inherent ambiguities in these cases. It will also provide higher vertical resolution for temperature and humidity sounding, better distinction between surface and atmospheric signals, better surface typing due to the different spectral signatures of different surface parameter mixtures, etc. While sensors operating in the infrared and near-infrared have seen an ever-increasing number of channels and bands with the new hyperspectral sensors (such as IASI, CrIS, and AIRS), microwave sensors, despite their large benefits to weather prediction and their ability to penetrate cloud and sense within and below cloudy and rainy layers, have not seen their number of channels increase. This type of sensor would be expected to have significant positive impacts on the forecast skill of numerical weather prediction models, especially if deployed in space with large spatial and temporal coverage (for hurricane conditions especially). Besides the large number of channels sought (between hundreds and thousands) in the range between 3 GHz and 300 GHz, possibly going up to 600 GHz and potentially higher (the sub-millimeter spectral region), it is emphasized that the noise level should be as low as possible and at least as low as that of current state-of-the-art sensors.
Passive microwave sensors are key payloads on many operational satellites, including those operated by NOAA and EUMETSAT, which carry the Advanced Microwave Sounding Unit (AMSU) and the Microwave Humidity Sounder (MHS). Over the past decade, satellite-based high-frequency measurements at and above 150 GHz (including those near the 183 GHz water vapor absorption band) have become extremely useful for the retrieval of several parameters, including precipitation rate and snowpack properties. To advance our understanding of the relationship between these parameters and the emitted microwave energy (and to advance radiative transfer model development), a sensor that can be used on the ground (pointing either upward or downward) and takes measurements at these high frequencies needs to be developed; presently, such ground-based sensors typically make measurements at 90 GHz or lower.
This subtopic focuses on the development of a web service to transform a rich textual description of a geographic area into a geospatial object such as a polygon or set of polygons. This new capability will greatly enhance ease of use and improve people's success in locating geospatial data. The initial domain is oceanic geospatial data at the NOAA National Data Centers (the Oceanographic, Geophysical, and Climatic Data Centers). The new techniques are expected to have broader applicability to terrestrial data as well as to other federal (NASA Distributed Active Archive Centers, USGS, and others) and commercial geospatial data archives and portals. The nation's volume of geospatial data is rapidly increasing. For the nation to receive the full benefits of these data through widespread usage in research and decision support, it is essential to develop state-of-the-art data discovery systems. Current geospatial search systems are far from ideal, especially for novice users, who encounter a steep learning curve. A simple, easy-to-use interface is characteristic of popular Internet search engines. These open-domain search engines, however, are created primarily for largely unstructured data (web pages) and rely primarily on keyword matching, which frequently results in low precision even for well-posed questions. Geospatial data, by contrast, are largely structured: significant effort is spent standardizing data formats and developing rich metadata suitable for the designated user community. With recent developments in natural language query processing and semantic web technologies, high-precision natural language query systems could be developed on top of such largely structured data.
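The sketch below illustrates the core transformation in miniature: a textual area description resolved against a tiny gazetteer, with a directional modifier, returned as a GeoJSON Polygon. Everything here is a hypothetical assumption for illustration (the gazetteer entries, bounding boxes, and modifier grammar); a real service would need full natural language parsing, a rich gazetteer, true region geometries rather than bounding boxes, and handling of cases such as regions crossing the antimeridian.

```python
GAZETTEER = {  # name -> (min_lon, min_lat, max_lon, max_lat); approximate, illustrative only
    "gulf of mexico": (-98.0, 18.0, -80.5, 31.0),
    "gulf of maine": (-71.0, 41.0, -65.0, 45.5),
}

MODIFIERS = {  # directional qualifiers keep one half of the bounding box
    "northern": lambda w, s, e, n: (w, (s + n) / 2, e, n),
    "southern": lambda w, s, e, n: (w, s, e, (s + n) / 2),
    "eastern": lambda w, s, e, n: ((w + e) / 2, s, e, n),
    "western": lambda w, s, e, n: (w, s, (w + e) / 2, n),
}

def describe_to_polygon(text: str) -> dict:
    """Resolve a place name plus optional directional modifier to a GeoJSON Polygon."""
    query = text.lower()
    name = next((k for k in GAZETTEER if k in query), None)
    if name is None:
        raise ValueError(f"no gazetteer entry found in: {text!r}")
    box = GAZETTEER[name]
    for word in query.split():
        if word in MODIFIERS:
            box = MODIFIERS[word](*box)
    w, s, e, n = box
    return {"type": "Polygon",
            "coordinates": [[[w, s], [e, s], [e, n], [w, n], [w, s]]]}

print(describe_to_polygon("northern Gulf of Mexico"))
```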