When GPS isn’t available, drones can have a hard time reaching their destination. For situations where the GPS signal is weak or nonexistent, Dr. Jiefei Wang, a researcher from the UNSW Canberra Trusted Autonomy Group, has developed a new set of drone “eyes” and the algorithms to go with them.
The eyes come in the form of an Xbox Kinect sensor, according to Fresh Science, which acts as an input camera. The video footage is processed frame-by-frame through algorithms that compare each pixel of one frame with the corresponding pixel in the previous frame, looking for differences. Spotting changes across 2D images enables the aircraft to discern its motion, speed, and flight-path obstacles in 3D space, allowing it to fly autonomously.
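The article doesn’t publish the researcher’s actual algorithm, but the core idea it describes, comparing each pixel of one frame against the previous frame and flagging differences, can be sketched as simple frame differencing. Everything below (array sizes, the threshold value, the toy “object”) is illustrative, not taken from the research:

```python
import numpy as np

def frame_diff_motion(prev_frame, curr_frame, threshold=20):
    """Compare each pixel of the current frame with the same pixel in
    the previous frame; return a mask of pixels whose intensity changed
    by more than `threshold` -- a crude cue for motion in the scene.
    Frames are 2D numpy arrays of grayscale intensities (0-255)."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold

# Toy example: a bright 2x2 "object" shifts one pixel to the right
# between frames, so its old left edge and new right edge change.
prev = np.zeros((5, 5), dtype=np.uint8)
curr = np.zeros((5, 5), dtype=np.uint8)
prev[1:3, 1:3] = 200
curr[1:3, 2:4] = 200
mask = frame_diff_motion(prev, curr)
print(mask.sum())  # 4 pixels flagged as changed
```

A real system would go further, tracking how those changed regions move over time to estimate the drone’s own speed and heading, but the per-pixel comparison is the starting point.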
“Depth information is crucial for locating objects,” says Jiefei. “Human beings can use one eye to see the world but need two eyes to locate. For example, try closing one eye, then point your index fingers toward each other and bring them together. Most people will find this difficult.”
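The two-eye analogy is the principle behind stereo triangulation: two views of the same point, separated by a known baseline, let you recover its depth from the shift (disparity) between the images. A minimal sketch of that relationship, with illustrative numbers rather than Kinect specifications:

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth of a point from a rectified stereo pair: Z = f * B / d.
    focal_px    -- focal length in pixels
    baseline_m  -- distance between the two cameras, in metres
    disparity_px -- horizontal shift of the same point between the
                    left and right images, in pixels"""
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 600 px focal length, 7.5 cm baseline.
# A 30 px disparity puts the point 1.5 m away; nearer objects
# shift more between the two views, so disparity grows as depth shrinks.
print(stereo_depth(600, 0.075, 30))  # 1.5
```

RGB-D sensors such as the Kinect measure depth directly rather than from two visible-light cameras, but the geometry above is why a single image, like a single eye, cannot locate a point on its own.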
Jiefei says RGB-D cameras, such as the Kinect sensor, are still a young technology, which means there are a few areas that need improvement, including operational range and resolution.
“So, in the future, the impact of more accurate RGB-D cameras could be explored. If I have the chance, I would like to come back to this. I cannot stop the wars and disasters, but as an engineer, I believe in the power of technology, and we are experiencing how it changes our lives right now,” Jiefei says.
Jiefei predicts the drones may also prove useful in the mining industry, earthquake rescue efforts, underground exploration, and more.