Future drones will see as the human eye does, helping to navigate dark rooms.
Camera-armed multi-rotor drones take great pictures…during the day. But if they are to guide themselves through darkened rooms and buildings, they need to see in low-light conditions while moving quickly. A new type of visual sensor developed by Swiss researchers will allow drones to see as human eyes do. That will enable small drones to be more useful in both “civilian and military applications,” such as finding people trapped in rubble or during complex urban warfare scenarios, the research team’s head said.
Conventional cameras work by collecting lots of information about light (specifically, its intensity). They treat all the data equally, which is fine for taking single pictures. But when the light is low, or when the camera is moving, as it would on a drone, that technique produces blurred pictures that don’t convey any useful information.
The Dynamic Vision Sensor, or DVS, works differently. “Instead of wastefully sending entire images at fixed frame rates, only the local pixel-level changes caused by movement in a scene are transmitted – at the time they occur,” explained Davide Scaramuzza, one of the paper’s authors and director of the Robotics and Perception Group at the University of Zurich.
The DVS works much more the way an eye does, looking for areas of the field where there is some change in the light intensity. Focusing on such changes, which Scaramuzza called “events,” can help the entire system to see both movement and edges in low light much more efficiently.
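The event-generation principle described above can be sketched in a few lines. This is a toy simulation of how a DVS-style pixel fires, not the sensor's actual circuitry: each pixel emits an event only when its log-intensity has changed by more than a threshold since its last event, carrying a timestamp and a polarity (brighter or darker). The function name and threshold value are illustrative assumptions.

```python
import numpy as np

def generate_events(ref_log, frame, t, threshold=0.2):
    """Toy DVS model: emit one event per pixel whose log-intensity
    changed by more than `threshold` since that pixel last fired.
    Returns (events, updated reference image); each event is
    (timestamp, x, y, polarity)."""
    log_frame = np.log(frame.astype(np.float64) + 1e-6)
    diff = log_frame - ref_log
    ys, xs = np.nonzero(np.abs(diff) >= threshold)
    events = [(t, int(x), int(y), 1 if diff[y, x] > 0 else -1)
              for y, x in zip(ys, xs)]
    # Reset the reference only at pixels that fired, as DVS pixels do;
    # static pixels produce no data at all.
    ref_log[ys, xs] = log_frame[ys, xs]
    return events, ref_log
```

Because unchanged pixels transmit nothing, a mostly static scene produces almost no data, while a fast-moving edge produces a dense stream of microsecond-timestamped events — which is why the sensor copes with both low light and rapid motion.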
In their paper, Scaramuzza’s team describes a hybrid approach that uses the DVS event camera to transmit data about changes in brightness as well as collecting standard frames. They conducted two experiments, including the first autonomous flight of a drone using an event camera. In one experiment, they switched the light on and off as the drone flew around the room, forcing it to focus on the events, coupled with measurements it gathered while the light was on. In the second one, the drone flew autonomously, in darkness, in ever-faster circles to see how well the technique worked when the drone was moving quickly.
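The hybrid idea — dead-reckoning on the always-available event stream and correcting with an absolute fix whenever a usable standard frame arrives — can be illustrated with a toy one-dimensional tracker. This is my own simplification for intuition, not the paper's estimator; the function name, gain, and data format are assumptions.

```python
def hybrid_track(steps, gain=0.5):
    """Toy 1-D hybrid tracker. `steps` is a list of
    (event_increment, frame_measurement_or_None) tuples.
    Returns the position estimate after processing every step."""
    x = 0.0
    for delta, measurement in steps:
        x += delta                        # integrate event-based motion (always on)
        if measurement is not None:       # a frame is usable only in good light
            x += gain * (measurement - x) # pull the estimate toward the frame fix
    return x
```

The lights-on/lights-off experiment maps onto this directly: with the light off, `measurement` is `None` and the drone coasts on events alone; when the light comes back on, frame measurements rein in any accumulated drift.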
They found that their DVS-equipped drone performed 130 percent better than a similar system relying exclusively on event data and 85 percent better than a system relying only on standard frames.
The University of Zurich researchers worked with the Swiss research consortium NCCR Robotics in the effort, which was partially funded by the Defense Advanced Research Projects Agency, or DARPA. The U.S. military is increasingly concerned with how to wage war in dense urban environments, particularly where GPS signals are blocked or are hard to acquire.