In groundbreaking research conducted for the Defense Advanced Research Projects Agency (DARPA) Information Innovation Office (I2O), Scientific Systems Company, Inc. (SSCI) has demonstrated a team composed of a fully autonomous unmanned aerial vehicle (UAV) and a search and rescue canine collaborating, without intervention from a human handler, to conduct a simulated search and rescue mission.
SSCI’s newly developed Teammate Aware Autonomy (TAA) system was used together with its Collaborative Mission Autonomy (CMA) and Finding Objects thru Closed Loop Understanding of the Scene (FOCUS) software in this key demonstration. TAA develops predictive models of non-authoritative teammates, in this case a canine, that respond to only limited types of commands, enabling novel teams comprising both autonomous agents and sentient teammates to work together effectively and efficiently. TAA-enabled teams can combine the advantages and capabilities of autonomous systems, like UAVs, with the unique strengths of sentient teammates for which there is no technological substitute. Working together, the autonomous system and the canine dramatically increase the team’s mission effectiveness. In this case the UAV, with a bird’s-eye view of the landscape and powered by artificial intelligence for locating objects in its scene, can guide the sentient canine teammate, which has an unmatched sense of smell.
In July 2020, working with Missouri Task Force One (MO-TF1), one of FEMA’s elite Urban Search and Rescue Teams, SSCI demonstrated this unique mammal-machine team. While conventional search and rescue canines require a human handler to provide instructions, the TAA-enabled team used the UAV to command the canine directly, without the need for intervention or direction from the canine’s handler. In other words, the UAV was not acting as a mere relay of commands from the handler to the canine. Rather, the UAV was taking safety-informed objectives from the handler and translating them into direct, actionable commands based on the scene the UAV observed. During this demonstration, the TAA system was successfully used by the FOCUS UAV to navigate the canine, command it to begin a search, and identify the canine’s alert behavior, cueing the FOCUS UAV to autonomously investigate the alert location. The search and rescue canine was equipped with an electronic TAA vest, used both to broadcast audio commands from the UAV to the canine and to provide the UAV with real-time data from the canine. The canine’s handler played a supervisory role during the demonstration, ensuring that the search and rescue mission was conducted efficiently and safely by the canine-UAV team. This capability is not meant to replace the human handler. Instead, this technology could, in the future, allow a human handler to control a much larger pack of canine and UAV teams, ensuring improved and more responsive search and rescue results.
In this effort, technical research support was provided to SSCI by the Massachusetts Institute of Technology (MIT) Computer Science and Artificial Intelligence Laboratory (CSAIL) Interactive Robotics Group, as well as the Oregon State University Autonomous Agents and Distributed Intelligence (AADI) Lab.