Search-and-rescue operations are delicate, time-sensitive, and intense. That's why researchers are always looking for new ways to unload some of the dirty work onto robots: machines that will help rescuers get to the bottom of the rubble—or the top of the mountain—faster and far more efficiently.
Last week, at the SSRR conference in Sweden, researchers from all over the world described the latest concepts in "rescue robotics," including everything from computational theory to specific hardware proposals. In a few cases, researchers expounded on larger, applied visions for future research.
Here are just a few disaster scenarios and how robots might help get your keister out of them alive.
If you're buried by a mountain of snow, rescuers have 15 minutes to dig you out before you die of asphyxiation—hypothermia doesn't even have time to set in. In that time, they've got to locate victims in an area that averages the size of 100 football fields, and navigate treacherous, unstable terrain without getting themselves killed. Methods for tracking avalanche beacons and mobile phones already exist, but rescuers need to be able to move faster.
Launched earlier this year, the SHERPA project is a collaborative effort among seven universities to develop a "robotic platform" to aid rescuers in Alpine search-and-rescue. The "platform" is basically a coordinated system that uses different types of bots.
How it works
The SHERPA platform (PDF) outlines different roles for a team of robots, working together with human rescuers to assess an avalanche and locate possible survivors as quickly as possible. We mortal humans are categorized simply as "busy geniuses"—the brains—who can think and act very effectively, but will usually be distracted by specific rescue tasks while the machines autonomously serve their roles. These, of course, have their own fun names:
- The patrolling hawk: A high-altitude vehicle, like a helicopter, that hovers above the scene providing over-arching support for the rescue. The hawks are outfitted with a LIDAR sensor for rapid mapping of the rescue area and also serve as a communications hub.
- Trained wasp: A small UAV-type vehicle that can move quickly from place to place, examining areas of interest in greater detail, as well as exploring holes in the high-altitude map. The wasps are outfitted with lightweight laser scanners for terrain mapping.
- Intelligent donkey: A ground rover with a multi-function robotic arm. It's lightweight, so it can be carried in a box on the back of a human rescuer. The donkey serves as a local communications hub, as well as a recharging station for the wasps.
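The division of labor above is essentially a capability-matching problem: each task goes to whichever robot can handle it. Here's a minimal sketch of that idea in Python—class names, capability strings, and the dispatch logic are all illustrative assumptions, not part of the actual SHERPA design.

```python
from dataclasses import dataclass

# Hypothetical sketch of the SHERPA role taxonomy described above.
# Capability names are invented for illustration.

@dataclass
class Agent:
    name: str
    capabilities: set

HAWK = Agent("patrolling hawk", {"lidar_mapping", "comms_hub"})
WASP = Agent("trained wasp", {"detail_scan", "terrain_mapping"})
DONKEY = Agent("intelligent donkey", {"manipulation", "local_comms", "recharge_wasps"})

def assign(task, team):
    """Return the first agent whose capabilities cover the task, or None."""
    for agent in team:
        if task in agent.capabilities:
            return agent
    return None

team = [HAWK, WASP, DONKEY]
print(assign("detail_scan", team).name)      # trained wasp
print(assign("recharge_wasps", team).name)   # intelligent donkey
```

In the real platform the human "busy geniuses" would stay in the loop, with the robots picking up tasks like these autonomously.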
The SHERPA platform is specifically designed for use in mountainous backcountry, but researchers think the overall methodology can be applied to broader scenarios. The plan is to build hardware specific to each of the robot roles as well as to develop AI for the bots' autonomous tasks.
You've got to survey large flooded areas and disseminate critical supplies to victims spread widely across the target region. A simple air drop won't do.
A team at Carnegie Mellon University is working on the Cooperative Robotic Watercraft, a low-cost airboat that can be deployed in small platoons to serve the joint task of discovering and reporting what's going on, as well as delivering critical supplies to isolated victims.
How it works
An army of cheap, easy-to-assemble fan boats is set loose upon a flooded region, surveying environmental conditions and delivering aid. So far, researchers have developed (PDF) several models of a plastic boat that's propelled by a large fan, and connected to the outside world with a commercial smartphone, which relays GPS coordinates and data captured from any on-board sensors. The phone connects to everything else through a simple Arduino interface.
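To make the relay step concrete, here's a rough sketch of the kind of telemetry packet a boat's smartphone might bundle up and send back. The field names and message format are assumptions for illustration, not the CMU team's actual protocol.

```python
import json
import time

# Illustrative telemetry packet: a GPS fix plus whatever sensor readings
# arrive over the Arduino link, serialized as JSON for transmission.

def build_telemetry(boat_id, lat, lon, sensors):
    """Bundle a GPS fix and on-board sensor readings into one message."""
    return json.dumps({
        "boat": boat_id,
        "timestamp": time.time(),
        "gps": {"lat": lat, "lon": lon},
        "sensors": sensors,
    })

# Example reading from a hypothetical boat #7 near Pittsburgh:
msg = build_telemetry(7, 40.4433, -79.9436,
                      {"water_temp_c": 18.2, "depth_m": 1.4})
```

A scheme like this keeps the boat itself dumb and cheap: the phone does the positioning and networking, and the Arduino just forwards raw sensor values.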
The stock plan calls for boats that are 2 feet by 1 foot in size, but they can scale up—among the projects, for example, is a 9.5-foot ocean-ready boat loaded with a mass spectrometer. The simple modular design costs between $800 and $1,200 in parts. Figure in six hours of labor and you're looking at a $2,000 price tag.
How many boats will you need? According to Professor Paul Scerri, about one for every four acres. But, he points out, the size of the fleet is only of secondary importance.
"Part of our argument, however, is that we don't think this is an interesting point," he writes. "Robots will become so cheap that human time is the far more important thing, so go crazy with the number of boats—we don't think this time is far off."
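Scerri's one-boat-per-four-acres figure makes fleet sizing a back-of-the-envelope calculation. A minimal sketch, assuming that ratio holds and using an invented helper name:

```python
import math

# Fleet sizing from the "one boat per four acres" figure quoted above.
ACRES_PER_BOAT = 4

def boats_needed(flooded_acres):
    """Round up: even a small leftover area needs its own boat."""
    return math.ceil(flooded_acres / ACRES_PER_BOAT)

# A one-square-mile flood zone is 640 acres:
print(boats_needed(640))  # 160
```

At roughly $2,000 per boat, that square mile of coverage would run about $320,000—which is the point of Scerri's argument: the hardware cost curve, not the fleet count, is what matters.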
Of course, you can't simply push a boat out into the water and hope everything goes as planned—so the researchers have also developed systems that help the boats do everything from avoiding obstacles in shallow water to sampling water quality.
According to Scerri, the applications extend far beyond mere disaster relief. In early 2014, the Carnegie Mellon team will be traveling to Kenya with a fleet of boats to help map hippo pools. "It's a great robotic project because it's so dangerous for humans."
Just because a building has collapsed doesn't mean everybody inside is dead. Rescuers with limited time and resources need a faster way to find their way into rubble so they can pull out survivors.
Using UAVs outfitted with RGB-D cameras—color and depth cameras like the one found in a Kinect—researchers at Ryerson University have developed a system for identifying potential access holes in large areas of rubble.
How it works
In a proof-of-concept demonstration, researchers used a Microsoft Kinect sensor strapped to a UAV to find holes—potential entryways—in mountains of rubble. The system analyzes the camera's twin 640 x 480 images—one displays color, one displays depth—and identifies candidate holes for rescuers to investigate. Potential access portals are scored based on a number of criteria, including depth disparity with the surrounding pixel areas, size, and relative brightness.
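The scoring idea can be sketched in a few lines of NumPy. This is a toy version under stated assumptions—the thresholds, weights, and the specific scoring formula are invented for illustration and are not the Ryerson team's actual criteria.

```python
import numpy as np

# Toy scorer for candidate holes in a 640x480 depth + grayscale frame:
# holes read as farther away than their surroundings, and tend to be darker.

def score_hole(depth, gray, region):
    """Score a candidate region, given as a (row slice, col slice) pair."""
    rs, cs = region
    disparity = depth[rs, cs].mean() - depth.mean()  # depth vs. surroundings
    size = depth[rs, cs].size                        # bigger entry is better
    darkness = 255 - gray[rs, cs].mean()             # openings appear dark
    return 0.5 * disparity + 0.3 * (size / 1000) + 0.2 * darkness

# Synthetic frame: flat rubble at 2 m, uniformly bright...
depth = np.full((480, 640), 2.0)
gray = np.full((480, 640), 200.0)
# ...with one deep, dark opening:
depth[100:140, 200:260] = 6.0
gray[100:140, 200:260] = 30.0

hole = (slice(100, 140), slice(200, 260))
flat = (slice(300, 340), slice(400, 460))
print(score_hole(depth, gray, hole) > score_hole(depth, gray, flat))  # True
```

The real system would generate many candidate regions per frame and rank them, leaving human rescuers to check only the highest-scoring portals.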
So far, the concept has been tested on UAVs over training areas for Urban Search & Rescue teams. The next step is to expand the different criteria used to identify potential entryways, as well as to improve the data processing so it can be done in real-time as a UAV flies over a scene.
Opening image courtesy U.S. Library of Congress.