Posted by admin

Precision Navigation Using Fused Flow™ Algorithm

06 June 2016

This paper describes the mechanisms used for the Fused Flow™ algorithm in Luftronix’s precision navigation system. It also covers how visual information about displacement can be combined with inertial and absolute reference sources to increase precision or provide navigational autonomy.


Precision in navigation matters to different degrees depending on the use case. High-altitude airspace with little traffic needs far less precision than indoor or congested airspace scenarios. Precise navigation is an essential step towards advanced scenarios of Unmanned Aerial Vehicle (UAV) operation such as unattended take-off and landing and fully autonomous flight. In all scenarios, precision is expected to become an increasingly important factor as the number of airspace participants grows.

The Luftronix Fused Flow™ navigation system achieves navigational precision by fusing input from all available sources and allowing correction from absolute references whenever they are available. Through this fusion of diverse input sources, Luftronix navigation systems are able to obtain sub-centimeter precision for the flight path of UAVs.

Technical Fundamentals

Luftronix precision navigation units consist of three major components: the flight control unit for managing the flight path; the Luftronix Fused Flow™ navigator; and the data acquisition layer to interface with the payload.

Fused Flow™ Navigation

To obtain precise navigation capabilities, flight control units rely on real-time decisions based on the input from low-latency sources. Luftronix Fused Flow™ is able to take into account a number of types of sources:

Distance Sensor. Fused Flow™ works with barometric altimeters, laser range finders, 3D cameras or sonar systems to determine distance from objects and altitude above ground.

Gyroscopes to measure the vehicle's rotation and its angle relative to the surface.

Electro-optical input to capture successive snapshots of contrast markers on the surface and calculate the displacement vector between them.

3D Cameras to map the shape of the surface.

Short-wave infrared sensors to measure distance and to generate contrast markers in cases where they don’t naturally exist.

Long-wave thermal sensors to detect patterns in the infrared spectrum that would be inaccessible to electro-optical methods.

In addition to taking input from sensors that allow the deduction of a relative path, Fused Flow™ also accepts corrective information from fixed reference points when available. Examples include passing in and out of GPS coverage, local positioning systems, and infrared beacons or similar known reference points used to correct the relative flight path.
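The idea of correcting a dead-reckoned path with sparse absolute fixes can be illustrated with a minimal sketch. This is not Luftronix's actual filter; the function name, the simple blending gain, and the 2-D path representation are all assumptions for illustration:

```python
import numpy as np

def correct_path(relative_path, fixes, gain=0.8):
    """Blend a dead-reckoned 2-D path with sparse absolute fixes.

    relative_path: (N, 2) sequence of integrated relative positions.
    fixes: dict mapping sample index -> absolute (x, y) position,
           e.g. from a GPS fix or an infrared beacon sighting.
    gain: weight given to the absolute fix at each correction point.
    """
    corrected = np.array(relative_path, dtype=float)
    for i, fix in sorted(fixes.items()):
        # Offset between where dead reckoning says we are and the
        # absolute reference observed at that instant.
        offset = gain * (np.asarray(fix) - corrected[i])
        # Accumulated drift affects every later sample too, so the
        # correction is propagated forward along the path.
        corrected[i:] += offset
    return corrected
```

A production system would weight corrections by the estimated covariance of each source (as a Kalman filter does) rather than by a fixed gain, but the structure is the same: relative integration between fixes, discrete correction at each fix.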

Data Acquisition Layer

The primary purpose of flying a UAV today is to gather data through sensors aimed at the surface or at an object. Luftronix supports a data acquisition layer that is linked to the precision navigation instruments for location tagging in the metadata. Luftronix-equipped UAVs support the full range of sensors for data acquisition, including electro-optical, 3D-camera, infrared, thermal, sound, chemical, radio, radiation and multi-spectral sensors. UAV payloads carry a choice of sensors depending on mission requirements.
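Location tagging of payload data can be sketched as attaching the current navigation solution to each raw sensor reading. The record layout and field names below are assumptions, not Luftronix's actual metadata format:

```python
from dataclasses import dataclass, field
import time

@dataclass
class TaggedSample:
    """A payload sensor reading annotated with navigation metadata."""
    sensor: str        # e.g. "thermal" or "multi-spectral"
    payload: bytes     # raw sensor reading
    position: tuple    # (x, y, z) from the navigation solution
    heading: float     # heading in degrees
    timestamp: float = field(default_factory=time.time)

def tag(sensor, payload, nav_state):
    """Attach the current navigation solution to a raw reading."""
    return TaggedSample(sensor, payload,
                        nav_state["position"], nav_state["heading"])
```

Because the tag is written at acquisition time from the same navigation state used for flight control, every sample inherits the precision of the navigation solution.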

Optical Flow and Fused Flow™

Optical Flow is based on James Gibson's observation of how humans and animals interpret relative motion as they travel through the world. It has since been extended into a mathematical framework for interpreting discrete image displacements, and the principal algorithms were analyzed by Barron, Fleet and Beauchemin in 1994. Today it is a well-established method for using electro-optical input sources to determine the rate of displacement over time.
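The core operation of such a method, estimating the displacement between two frames, can be sketched with brute-force block matching, one of the simplest optical-flow techniques. This is an illustrative stand-in, not the algorithm the paper describes; the returned value is the shift that re-aligns the second frame with the first:

```python
import numpy as np

def estimate_shift(frame_a, frame_b, max_shift=5):
    """Find the integer (dy, dx) that best aligns frame_b with
    frame_a by minimizing the sum of squared differences (SSD)."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(frame_b, dy, axis=0), dx, axis=1)
            # Compare only the interior to avoid wrap-around artifacts.
            core = np.s_[max_shift:-max_shift, max_shift:-max_shift]
            err = np.sum((frame_a[core] - shifted[core]) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best
```

Real-time implementations use pyramidal or gradient-based methods (e.g. Lucas-Kanade) instead of exhaustive search, and refine to sub-pixel accuracy, but the principle is the same: find the displacement that makes two successive images agree.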

By itself, Optical Flow suffers from larger-than-acceptable error accumulation and is therefore only useful over short sections of a mission. Luftronix complements the displacement information with additional input sources in a Fused Flow™ model that accounts for relevant factors such as the angle of the camera to the surface, elevation, heading and possible temporary view obstruction, yielding a complete model of motion. In essence, the result is a more precise optical flow algorithm that is still able to operate in real time.
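Why altitude and camera angle matter is easy to see with a first-order pinhole-camera model: the same pixel displacement corresponds to a larger ground displacement at higher altitude or greater tilt. The sketch below is a simplified illustration under that assumption, not the Fused Flow™ model itself:

```python
import math

def pixel_to_ground(displacement_px, altitude_m, focal_px, tilt_rad=0.0):
    """Convert an image-plane displacement (pixels) into an approximate
    ground displacement (metres) with a pinhole camera model.

    focal_px: focal length expressed in pixels.
    tilt_rad: camera tilt away from nadir; the slant range grows as
    altitude / cos(tilt), stretching the ground footprint per pixel.
    """
    ground_per_px = (altitude_m / math.cos(tilt_rad)) / focal_px
    return displacement_px * ground_per_px
```

At 10 m altitude with a 1000-pixel focal length, a 100-pixel displacement corresponds to roughly 1 m of ground motion at nadir, and more as the camera tilts. This is exactly why uncorrected optical flow drifts: small errors in altitude or attitude scale directly into displacement error, which a fused model can cancel using the gyroscope and range-sensor inputs.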

Luftronix Fused Flow™ uses a sample consensus method to implement a robust algorithm: it compares markers in two subsequently captured images and determines the movement vector between the two samples of the surface, after accounting for factors like altitude, angle and rotation, all of which may have changed between the samples.
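A sample consensus method of this kind can be sketched, in its simplest RANSAC-style form, for pure 2-D translation between two sets of matched markers. The details below (tolerance, iteration count, translation-only model) are illustrative assumptions; the published Fused Flow™ algorithm also handles the altitude, angle and rotation changes noted above:

```python
import random

def ransac_translation(src, dst, iters=200, tol=0.5, seed=0):
    """Estimate the 2-D translation mapping src markers onto dst
    markers, robust to mismatched (outlier) correspondences:
    repeatedly hypothesize a translation from one random pair and
    keep the hypothesis that the most marker pairs agree with."""
    rng = random.Random(seed)
    best_t, best_inliers = (0.0, 0.0), -1
    for _ in range(iters):
        i = rng.randrange(len(src))
        tx, ty = dst[i][0] - src[i][0], dst[i][1] - src[i][1]
        # Count how many correspondences this hypothesis explains.
        inliers = sum(
            1 for (sx, sy), (dx_, dy_) in zip(src, dst)
            if abs(sx + tx - dx_) < tol and abs(sy + ty - dy_) < tol
        )
        if inliers > best_inliers:
            best_t, best_inliers = (tx, ty), inliers
    return best_t
```

The consensus step is what makes the estimate robust: a few spurious marker matches, caused by glare, occlusion or repetitive texture, are simply outvoted by the majority that agree on the true motion vector.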