
Neuromorphic Camera Helps Drones Navigate Without GPS

High-end positioning tech comes to low-cost UAVs

3 min read

Edd Gent is a Contributing Editor for IEEE Spectrum.

[Photo: An engineer splicing fiber-optic cables used in inertial navigation systems.]

Researchers are testing new hybrid imaging and inertial-guidance tech (pictured) that could enable drones to navigate even in GPS-denied environments. Credit: Advanced Navigation

Satellite-based navigation is the bedrock of most modern positioning systems, but it can’t always be relied on. Two companies are now joining forces to create a GPS-free navigation system for drones by fusing neuromorphic sensing technology with an inertial navigation system (INS).

GPS relies on receivers that pick up signals from a network of satellites to pinpoint the user’s location with remarkable precision. But those signals are vulnerable to interference from large buildings, dense foliage, or extreme weather, and they can also be deliberately jammed or spoofed with fake radio signals.

This has prompted the design of alternative navigation approaches that can take over when GPS fails, but they have limitations. An INS uses sensors such as accelerometers and gyroscopes to track a vehicle’s movement from a known starting point, but small measurement errors accumulate over time, causing its position estimate to gradually drift. Visual navigation systems instead use cameras to scan the terrain below an aircraft and work out where it is, but that takes considerable computing and data resources, putting it out of reach of smaller, less expensive vehicles.
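To get a feel for how quickly inertial drift builds up, here is a minimal, hypothetical sketch in Python (with made-up numbers, not anything from Advanced Navigation) of one-dimensional dead reckoning: a tiny constant accelerometer bias, integrated twice, becomes a position error that grows with the square of time.

```python
# Hypothetical illustration of inertial drift: a constant accelerometer bias,
# integrated twice (acceleration -> velocity -> position), produces a position
# error that grows roughly as 0.5 * bias * t^2.

bias = 0.001 * 9.81      # assumed accelerometer bias of 1 milli-g, in m/s^2
dt = 0.01                # 100 Hz inertial update rate
steps = 120_000          # 20 minutes of flight

velocity_error = 0.0
position_error = 0.0
for _ in range(steps):
    velocity_error += bias * dt            # the bias integrates into a velocity error...
    position_error += velocity_error * dt  # ...which integrates into a position error

print(f"Position error after 20 minutes: {position_error / 1000:.1f} km")
# Roughly 0.5 * 0.0098 * (1,200 s)^2, or about 7 km of drift from a tiny, uncorrected bias.
```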


A pair of navigation technology companies has now teamed up to merge the two approaches and get the best of both worlds. NILEQ, a subsidiary of the British missile maker MBDA based in Bristol, UK, makes a low-power visual navigation system that relies on neuromorphic cameras. It will now be integrated with a fiber-optic INS developed by Advanced Navigation, in Sydney, Australia, to create a positioning system that lets low-cost drones navigate reliably without GPS.

“The two things together really neatly solve navigating in a challenging, GPS-denied environment,” says Advanced Navigation’s CEO Chris Shaw. “You can travel really long distances over a really long time.”

When choosing a navigation system for a vehicle, there is always a price-performance trade-off, says Shaw. It typically doesn’t make sense to install an expensive, high-accuracy INS on a low-cost platform like a drone, but smaller, cheaper units are more prone to positioning drift. “Sometimes it could be just 10, 20 minutes before you start to get such a big error growth that the position accuracy is not good enough,” says Shaw.

Ditching GPS for Cameras

A visual navigation system can provide a workaround by giving the INS high-accuracy position updates at regular intervals, which it can use to recalibrate its location. But the high-resolution cameras used in these systems generate huge amounts of data, which has to be compared against a massive database of satellite imagery using computationally expensive algorithms. Fitting that kind of computational power onto a small, power-constrained vehicle like a drone is typically not feasible.

NILEQ’s system significantly reduces the resources required for visual navigation by using a neuromorphic camera. Inspired by the way the human retina works, these devices don’t capture a series of full images; instead, each pixel independently reports changes in brightness as they occur. The result is far less data and much faster operation than a conventional camera.
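As a rough illustration of that data advantage, the sketch below compares the raw output rate of a conventional frame camera with an event stream. All of the figures (resolution, event size, event rate) are assumptions made for the sake of the comparison, not NILEQ specifications.

```python
# Rough, illustrative comparison with made-up numbers: a frame camera ships every
# pixel every frame, while an event camera only reports pixels whose brightness
# changed by more than a threshold.

frame_w, frame_h, fps, bytes_per_px = 1280, 720, 30, 1
frame_camera_rate = frame_w * frame_h * fps * bytes_per_px   # bytes per second

# An event is typically (x, y, timestamp, polarity); assume ~8 bytes per event
# and a scene that triggers about 200,000 events per second.
events_per_s, bytes_per_event = 200_000, 8
event_camera_rate = events_per_s * bytes_per_event           # bytes per second

print(f"Frame camera: {frame_camera_rate / 1e6:.1f} MB/s")
print(f"Event camera: {event_camera_rate / 1e6:.1f} MB/s (scene dependent)")
```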


The company says its proprietary algorithms process the camera output in real time to create a terrain fingerprint for the particular patch of land the vehicle is passing over. This is then compared against a database of terrain fingerprints generated from satellite imagery, which is stored on the vehicle. The process of creating these fingerprints compresses the data, according to Phil Houghton, head of future concepting at MBDA. “This means that the size of the database loaded onto the host platform is trivial and searching it in real time requires minimal computation,” he adds.
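NILEQ hasn’t disclosed how its fingerprints are computed, so the sketch below stands in with a crude hash-style signature and a brute-force Hamming-distance search. It is only meant to illustrate the general idea of matching compact signatures against a small onboard database rather than comparing raw imagery; the function names and parameters are invented for the example.

```python
# Hypothetical stand-in for terrain fingerprinting: reduce a terrain patch to a
# compact binary signature, then find the closest match in an onboard database
# by Hamming distance. NILEQ's actual algorithm is proprietary.
import numpy as np

def fingerprint(patch: np.ndarray, size: int = 8) -> np.ndarray:
    """Block-average the patch down to size x size and threshold at its mean (a 64-bit signature)."""
    h, w = patch.shape
    cropped = patch[: h - h % size, : w - w % size]
    blocks = cropped.reshape(size, h // size, size, w // size).mean(axis=(1, 3))
    return (blocks > blocks.mean()).flatten()

def best_match(query: np.ndarray, database: dict) -> str:
    """Return the database key whose fingerprint differs from the query in the fewest bits."""
    return min(database, key=lambda name: np.count_nonzero(database[name] != query))

# Toy usage: fingerprints precomputed from "satellite" tiles, matched against a noisy live view.
rng = np.random.default_rng(0)
tiles = {f"tile_{i}": rng.random((64, 64)) for i in range(100)}
database = {name: fingerprint(tile) for name, tile in tiles.items()}

observed = tiles["tile_42"] + rng.normal(0.0, 0.05, (64, 64))  # the patch the drone is flying over
print(best_match(fingerprint(observed), database))             # expected to print "tile_42"
```

Because each signature is tiny and the comparison is just a bit count, even a large map area fits comfortably on a small flight computer, which is the property Houghton is describing.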

One current limitation is that neuromorphic cameras can’t yet operate in the infrared, says Houghton, which rules out nighttime operations for now. But infrared neuromorphic cameras are under development and should be available within the next few years, he says.

Neuromorphic cameras are more expensive than conventional ones, often costing in the region of US $1,000, says Shaw. But that is balanced out by the fact that they can be paired with a much cheaper INS. “Some really high-end navigation systems might run into the hundreds of thousands of dollars,” he says. “This approach of using the neuromorphic camera alongside low-cost, inexpensive inertial sensors, there’s a big cost and size benefit.”

Beyond providing the INS, Advanced Navigation will also use its AI-powered sensor fusion software to combine the outputs of the two technologies and provide a single, reliable location reading that can be used by a drone’s navigation system in much the same way as a GPS signal. “A lot of customers in this space want something they can just basically plug in and there’s no big learning curve,” says Shaw. “They don’t want any of the details.”
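The companies haven’t published details of that fusion software, but a minimal one-dimensional Kalman-style filter, sketched below with assumed noise values, captures the basic idea: high-rate inertial predictions carry the position estimate between the camera’s occasional absolute fixes, and each fix pulls the accumulated drift back toward the true position.

```python
# Minimal 1-D Kalman-style fusion sketch (assumed noise values, not the companies'
# actual software): inertial dead reckoning predicts position at a high rate, and
# occasional visual "terrain fixes" correct the accumulated drift.

def predict(x, p, velocity, dt, q=0.5):
    """Propagate the position estimate using the inertial velocity; uncertainty grows."""
    return x + velocity * dt, p + q * dt

def update(x, p, fix, r=4.0):
    """Blend in an absolute visual fix, weighted by the two uncertainties."""
    k = p / (p + r)                   # Kalman gain: trust the fix more when p is large
    return x + k * (fix - x), (1 - k) * p

x, p = 0.0, 1.0                       # position estimate (m) and its variance
for t in range(1, 601):               # 600 one-second steps
    x, p = predict(x, p, velocity=10.2, dt=1.0)  # biased inertial velocity (true speed is 10.0 m/s)
    if t % 60 == 0:                              # a visual fix arrives once per minute
        x, p = update(x, p, fix=10.0 * t)        # the fix lands near the true position
        print(f"t={t:4d} s   fused position {x:8.1f} m   (truth {10.0 * t:.1f} m)")
```

Each printed line shows the drift accumulated over a minute of dead reckoning being largely cancelled by the visual fix, while the filter still outputs one continuous position stream for the autopilot.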

The companies plan to start flight trials of the combined navigation system later this year, adds Shaw, with the goal of getting the product into customers’ hands by the middle of 2025.
