
Shop Floor Human Detection Using Low-Cost Equipment


Funding Agency

DOD

USAF

Year: 2025

Topic Number: AF254-D0823

Solicitation Number: 25.4

Tagged as:

SBIR

BOTH

Solicitation Status: Open

NOTE: The Solicitations and topics listed on this site are copies from the various SBIR agency solicitations and are not necessarily the latest and most up-to-date. For this reason, you should use the agency link listed below which will take you directly to the appropriate agency server where you can read the official version of this solicitation and download the appropriate forms and rules.

View Official Solicitation

Release Schedule

  1. Release Date
    May 7, 2025

  2. Open Date
    May 7, 2025

  3. Due Date(s)

  4. Close Date
    June 25, 2025

Description

OUSD (R&E) CRITICAL TECHNOLOGY AREA(S): Advanced Infrastructure & Advanced Manufacturing; Sustainment & Logistics

The technology within this topic is restricted under the International Traffic in Arms Regulations (ITAR), 22 CFR Parts 120-130, which control the export and import of defense-related material and services, including export of sensitive technical data, or the Export Administration Regulations (EAR), 15 CFR Parts 730-774, which control dual-use items. Offerors must disclose any proposed use of foreign nationals (FNs), their country(ies) of origin, the type of visa or work permit possessed, and the statement of work (SOW) tasks intended for accomplishment by the FN(s) in accordance with the Announcement. Offerors are advised that foreign nationals proposed to perform on this topic may be restricted due to the technical data under U.S. export control laws.

OBJECTIVE: Develop a system, comprising software controls and three-dimensional time-of-flight (ToF) sensors for data capture, that can achieve a third-party safety rating. This system will enable existing non-collaborative mobile heavy industrial robots to operate safely in human-populated settings, eliminating the need for static safety measures such as fencing, door interlocks, and light/laser curtains. To ensure a comprehensive field of view, the system will identify and process implied presence zones to account for blind spots in real time. The system must be compliant with international robotic safety standards and be certified by a third party as PLd, Cat. 3 or SIL 2. This certification is key: systems that cannot obtain it will be considered unsuccessful.

DESCRIPTION: The industrial landscape is evolving, with increasing demands for automation in environments that are inherently dynamic and populated by human workers. Traditional safety measures, such as static fencing and interlocks, are inadequate in these settings, particularly when mobile robots are involved.
These environments pose significant challenges to ensuring safety while maintaining operational efficiency. In many industrial applications, robotic systems must adapt to varying conditions without compromising safety. The challenge is to create a system that can accurately detect and respond to human presence and other unexpected obstacles in real time, ensuring that robots can function effectively without the need for external safety systems. Such a system must function for industrial robots mounted on mobile robots as well as for industrial robots mounted on rails, gantries, or pedestals. This project seeks to develop a system using ToF sensor technologies combined with a custom control system that can map the three-dimensional environment in real time and be integrated into an existing industrial mobile robot at an Air Logistics Complex, enabling it to operate collaboratively and safely. The goal is to create a solution that not only addresses safety concerns but also enhances the operational capabilities of these robots in human-populated environments.

PHASE I: This is a Direct-to-Phase II initiative, and companies must demonstrate, from the outset, a prototype system with the capability to interface with ToF sensors and robotic control hardware. They must demonstrate practical experience detecting unforeseen obstacles and humans using point cloud technology and accurately recognizing the position of objects within a global coordinate system. Additionally, companies need to identify regions that are obscured or not visible to each camera in real time and have an established methodology for incorporating blind spot information into the robotic control system to ensure comprehensive environmental awareness.
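The blind-spot handling described above (fusing per-camera visibility and treating unseen regions as implied presence zones) could be sketched as follows. This is a minimal illustration only: the grid abstraction, function names, and the fail-safe policy of treating unseen cells as potentially occupied are our assumptions, not requirements stated in the topic.

```python
# Hypothetical sketch: fuse per-sensor visibility into a shared occupancy
# map in the global frame. Cells no ToF camera can see are "implied
# presence zones" and are treated as if a person may be standing there.

FREE, OCCUPIED, UNSEEN = 0, 1, 2

def fuse_visibility(grid_cells, sensor_views):
    """grid_cells: iterable of (x, y) cells in the global frame.
    sensor_views: one dict per ToF camera mapping (x, y) -> FREE or
    OCCUPIED, already transformed into the global coordinate system."""
    fused = {}
    for cell in grid_cells:
        states = [view[cell] for view in sensor_views if cell in view]
        if not states:
            fused[cell] = UNSEEN      # blind spot: assume possible presence
        elif OCCUPIED in states:
            fused[cell] = OCCUPIED    # any detection wins (fail-safe)
        else:
            fused[cell] = FREE
    return fused

def must_stop(fused, hazard_zone):
    """Any hazard-zone cell that is occupied OR unseen forces a stop."""
    return any(fused[c] != FREE for c in hazard_zone if c in fused)
```

A certifiable implementation would of course need validated sensor models, redundancy, and diagnostics to meet PLd, Cat. 3 or SIL 2; the point here is only the fail-safe treatment of blind spots.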
This system would send safety signals to the robotic system, similar to traditional safety devices (e.g., pressure mats, light curtains). The system must communicate the detection of unexpected obstacles and humans inside the environment and within different zones (e.g., warning/slow, hazard/stop) in order to trigger appropriate responses from the robot. The prototype should allow for safe operation without the need for additional external safety measures. Furthermore, implement real-time mapping of the dynamic environment to improve the robot's operational flexibility, minimize false positives, and enhance overall system efficiency. Optimize the hardware and software to maximize performance, leading to a robust, reliable solution suitable for integration into an Air Logistics Complex's robotic systems. The system will be compliant with international robotic safety standards and be certified by a third party as PLd, Cat. 3 or SIL 2.

PHASE III DUAL USE APPLICATIONS: If Phase II is successful in developing the technology, the government would like to pursue a Phase III to further develop autonomous decision-making algorithms that allow the robotic system to predict and adapt to environmental changes proactively by integrating advanced artificial intelligence-driven predictive analytics, enabling the robot to anticipate potential hazards before they arise. Moreover, Phase III efforts would be leveraged to coordinate the technology between multiple robotic systems, allowing them to communicate and collaborate in real time. Other Phase III activities would include scaling the solution to other robotic systems in a multitude of facilities and refining the prototype into a market-ready product that meets the requirements of federal agencies, such as the Air Force, while also being suitable for private industry applications.
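The Phase II warning/slow and hazard/stop zones described above resemble the speed-and-separation-monitoring pattern (see reference 3). A minimal sketch of that zone logic follows; the radii, names, and signal strings are illustrative assumptions, not values from the solicitation.

```python
import math

# Hypothetical sketch: classify the nearest detected human into a safety
# zone and emit the corresponding signal, analogous to a pressure mat
# (stop) or an outer light curtain (slow). Thresholds are placeholders.

HAZARD_RADIUS_M = 1.0    # inside this distance: hazard zone, stop
WARNING_RADIUS_M = 2.5   # inside this distance: warning zone, slow

def safety_signal(robot_xy, detections):
    """detections: list of (x, y) human positions in the global frame."""
    if not detections:
        return "clear"
    nearest = min(math.dist(robot_xy, p) for p in detections)
    if nearest <= HAZARD_RADIUS_M:
        return "stop"
    if nearest <= WARNING_RADIUS_M:
        return "slow"
    return "clear"
```

In a real system this signal would be delivered over the safety-rated interface to the robot controller; here it is simply a return value for illustration.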
This solution should be customizable for deployment across various sectors, providing a scalable, reliable enhancement to robotic systems operating in diverse environments.

REFERENCES:

  1. Liu et al., "A real-time hierarchical control method for safe human–robot coexistence," Robotics and Computer-Integrated Manufacturing, 2024. https://doi.org/10.1016/j.rcim.2023.102666
  2. Zanchettin, "Human tracking from quantised sensors: An application to safe human–robot collaboration," Control Engineering Practice, 2023. https://doi.org/10.1016/j.conengprac.2023.105727
  3. Marvel and Norcross, "Implementing speed and separation monitoring in collaborative robot workcells," Robotics and Computer-Integrated Manufacturing, 2017. https://doi.org/10.1016/j.rcim.2016.08.001
  4. Robla et al., "Visual sensor fusion for active security in robotic industrial environments," EURASIP Journal on Advances in Signal Processing, 2014. https://doi.org/10.1186/1687-6180-2014-88
  5. Haddadin, Albu-Schäffer, and Hirzinger, "Requirements for safe robots: Measurements, analysis and new insights," The International Journal of Robotics Research, November 2009. https://doi.org/10.1177/0278364909343970

KEYWORDS: Industrial Safety; Human Detection