
Making search and rescue drone swarms a reality

A quarter of the people recovered by wilderness search and rescue operations are injured by the time they’re found. Researcher Ryan Williams wants to shave that statistic down by bringing multi-robot systems into searches.

Wilderness searches for lost people hinge on space and time, Ryan Williams believes: the goal is to cover as much ground as possible with every passing minute, and to do it with skill and coordination. To make the most of these operations in complex wilderness terrain, Williams and his collaborators in the College of Engineering aim to form a new kind of search team, giving skilled human searchers support from the skies with drone swarms.

Teaming human searchers with groups of unmanned aerial vehicles would advance the use of drones past the remotely piloted, single robots seen in today's search and rescue operations, Williams said. According to the researchers, such a team must be attuned to the realities of traversing the wilderness and to the need to build trust between humans and robots.

“It’s meant to be a collaborative effort between the robots and the people,” said Williams, an assistant professor in the Bradley Department of Electrical and Computer Engineering.

Autonomy’s balancing act

Currently, drones are used in search and rescue to map areas, search for victims, observe targets, make deliveries, and relay communication, but during these activities, they’re usually manually operated. Williams wants to introduce autonomy into their use by search and rescue personnel. But there has to be a balance: though autonomous drones can reduce human effort, the algorithms behind them need to ensure that they don’t get in the way of their human users.

Williams’s team is programming drones to choose search trajectories in coordination with human searchers, anticipating how they’ll move through the landscape based on topography. 

“They’re not just blindly searching — they incorporate what the humans are planning to do,” Williams said. “If it’s really hilly, for example, they anticipate how the human searchers will move, and adapt accordingly. If they anticipate that a human may encounter an area that’s difficult to traverse, they will prioritize searching it, as the human may not be able to get to it.”
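The planning idea Williams describes can be sketched in a few lines. This is an illustrative toy, not the team's actual algorithm: it assumes the search area is divided into cells, each with an invented ground-traversal difficulty and a probability that the lost person is there, and it sends the drone to hard terrain the human searchers aren't covering.

```python
# Toy sketch of human-aware drone search planning (not the team's code).
# Each cell has an id, a traversal difficulty in [0, 1], and a probability
# that the lost person is there. All values here are invented.

def plan_drone_cells(cells, human_cells):
    """Pick cells for the drone: skip cells the humans already plan to
    cover, then rank the rest so hard-to-traverse terrain comes first."""
    candidates = [c for c in cells if c["id"] not in human_cells]
    # Hardest terrain first; break ties by probability of containment.
    return sorted(candidates,
                  key=lambda c: (c["difficulty"], c["prob"]),
                  reverse=True)

cells = [
    {"id": "A", "difficulty": 0.2, "prob": 0.3},  # flat meadow, easy on foot
    {"id": "B", "difficulty": 0.9, "prob": 0.1},  # steep ravine
    {"id": "C", "difficulty": 0.8, "prob": 0.4},  # dense brush
]
plan = plan_drone_cells(cells, human_cells={"A"})
print([c["id"] for c in plan])  # ravine and brush first: ['B', 'C']
```

A real planner would also reason about flight time, battery, and the humans' predicted paths through the topography, but the core trade the quote describes, deferring easy ground to people and claiming the hard ground, is the sort shown here.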

The team's algorithms also factor in real lost person behavior. Nicole Abaid, an associate professor in the College of Science's mathematics department, has been using historical data from more than 50,000 documented lost person scenarios to build models that inform drone searches.
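To see how historical data like this can steer a search, consider a rough sketch: bin the distances at which past subjects were found from their last known point, and turn the bins into a probability map that tells drones which rings to sweep first. The distances below are invented for illustration and are not drawn from the 50,000-case dataset.

```python
# Hypothetical sketch of data-informed search (invented numbers, not the
# project's models): estimate where to look from historical find distances.
from collections import Counter

def distance_ring_probabilities(find_distances_km, ring_width_km=1.0):
    """Bin historical find distances into rings around the last known
    point and return each ring's share of cases, likeliest ring first."""
    rings = Counter(int(d // ring_width_km) for d in find_distances_km)
    total = sum(rings.values())
    return sorted(((ring, count / total) for ring, count in rings.items()),
                  key=lambda rp: rp[1], reverse=True)

historical = [0.4, 0.8, 1.2, 1.6, 2.5, 0.3, 1.1, 0.9]  # invented data, km
for ring, p in distance_ring_probabilities(historical):
    print(f"{ring}-{ring + 1} km ring: {p:.0%} of cases")
```

Real lost person models condition on far more than distance, such as subject category and terrain, but the principle is the same: let documented behavior, not guesswork, set the search priorities.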

Photo: Larkin Heintzman, fifth-year electrical and computer engineering doctoral student.

Drones that work together

Williams's expertise extends to multi-robot systems in a range of contexts. He's studying their use not only in search and rescue but in fields like agriculture, and has found the general state of interactions between robots in these systems to be fragile: real-world field deployments of the swarms show there's a lot of progress to be made before they work together smoothly.

That’s especially the case in wilderness searches, where information needs to move between vehicles spread out across large spatial scales, Williams explained. The team is working to enable drones to do things they haven’t yet come close to mastering, like entering and exiting long operations gracefully and repeatedly, disconnecting and reconnecting with each other when their batteries die, and acting intelligently on what they observe “on the fly,” Williams said.

Photo: Nathan Lau, associate professor in industrial and systems engineering, and Nicole Abaid, associate professor in mathematics.

Through the canopy

A key feature of wilderness-savvy search and rescue drones will be their skills in perception. The researchers are outfitting their drones with thermal cameras to see past thick vegetation and other difficult natural features as they survey the area. 

“You can’t just break the laws of physics and see directly through the trees, but what we’ve seen is that with the right algorithms and thermal vision, you can see through the canopy, as long as it’s not too dense,” Williams said. “If people are lost in the woods, we can fly aerial vehicles over the canopy and really help human searchers. We can look for warm bodies, essentially.”
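The "warm bodies" detection Williams describes can be illustrated with a minimal sketch. This is not the project's vision pipeline; it assumes a thermal frame is a grid of temperature readings in degrees Celsius and simply flags pixels that stand out sharply against the canopy's average temperature, the way a person might leak heat through a gap in the foliage.

```python
# Illustrative sketch only (invented frame and threshold, not the team's
# perception code): flag thermal pixels much warmer than the background.

def warm_spots(thermal_image, margin_c=8.0):
    """Return (row, col) positions of pixels that exceed the frame's mean
    temperature by more than margin_c degrees Celsius."""
    flat = [t for row in thermal_image for t in row]
    background = sum(flat) / len(flat)
    return [(r, c)
            for r, row in enumerate(thermal_image)
            for c, t in enumerate(row)
            if t - background > margin_c]

frame = [
    [12.0, 13.5, 12.8],
    [13.1, 31.0, 12.5],   # a ~31 C reading through a canopy gap
    [12.9, 13.0, 12.7],
]
print(warm_spots(frame))  # [(1, 1)]
```

Dense canopy defeats this kind of detector, as Williams notes, because too few warm pixels make it through for any thresholding to latch onto.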


Lightening the load

Searching autonomously for long stretches requires a lot of computation. “We take these things for granted — humans are really good at computation,” Williams said. “For robots, it’s hard. We need quality computers that can support the decision-making, the planning, and the perception.”

The researchers have developed a field-deployable backpack that allows search and rescue operations to have high-powered computation follow their drone team around as it searches. The backpack, outfitted with several small, WiFi-enabled computers and batteries, enables the drones to extend their flight time, process large amounts of data, and make agile decisions. 

Photo: Nathan Lau.

“Basically, we add a little supercomputer in a backpack so we can offload any kind of drone computations to select artificial intelligence algorithms, path planning algorithms, that kind of stuff,” said James McClure, a member of the research team and a computational scientist for Advanced Research Computing at Virginia Tech.
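One way to picture the offloading McClure describes: the drones keep flying and sensing while heavy jobs go to whichever backpack computer currently has capacity. The sketch below is a hypothetical illustration of that dispatch idea, not the team's system; the node names and load numbers are invented.

```python
# Hypothetical sketch of offloading drone jobs to backpack computers
# (invented node names and loads, not the project's scheduler).

def pick_offload_node(nodes):
    """Choose the least-loaded compute node for the next job."""
    return min(nodes, key=lambda name: nodes[name])

def offload(nodes, job_cost):
    """Assign a job to the best node and record the load it adds."""
    node = pick_offload_node(nodes)
    nodes[node] += job_cost
    return node

backpack = {"nuc-1": 0.2, "nuc-2": 0.6, "nuc-3": 0.4}  # fractional CPU load
print(offload(backpack, 0.3))  # "nuc-1" -- it starts least loaded
print(offload(backpack, 0.3))  # "nuc-3" -- nuc-1 is now busier
```

In practice the dispatcher would also have to handle WiFi dropouts and jobs of unknown cost, but the payoff is the one McClure names: the drone itself stays light while the planning and perception run elsewhere.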

Photo: Kevin Smith, a first-year master's student and graduate research assistant for Advanced Research Computing.
Photo: James McClure (left), computational scientist with Advanced Research Computing, and Nathan Lau.

Video by Ray Meese, photos by Peter Means.

If you want to have an impact on students and faculty like those featured in this magazine, you can give to support the College of Engineering. For more information, call (540) 231-3628.