Results - Details

Identifier 000417051
Title Visual UAV navigation
Alternative Title Οπτική πλοήγηση μη επανδρωμένου εναέριου οχήματος
Author Τιμοθεάτος, Σταύρος
Thesis advisor Τραχανιάς, Παναγιώτης
Reviewer Αργυρός, Αντώνιος
Reviewer Τσακίρης, Δημήτρης
Abstract Unmanned Aerial Vehicles (UAVs), particularly multi-rotor UAVs, have gained significant popularity in the autonomous robotics research field. The small size and agility of such aircraft make them ideal candidates for use in restricted environments. With the ability to navigate freely in 3D space, UAVs offer the possibility to reach places that are otherwise inaccessible to humans. As such, UAVs find numerous applications in commercial and research tracks, such as inspection, surveillance, search and rescue (SaR) and aerial mapping. However, their remote operation is challenging for non-highly trained personnel, due to the fast dynamics of UAVs and the inherent difficulty of estimating distances to far-away objects. In order for an autonomous UAV to safely and reliably navigate within a given environment, the control system must be able to determine the state of the UAV at any given moment. The state consists of a number of extrinsic variables, such as the position, velocity and attitude of the UAV. These are commonly provided in outdoor operations via the Global Positioning System (GPS). While GPS has been widely used for long-range navigation in open environments, its performance degrades significantly in constrained environments and is unusable indoors. Accordingly, state estimation for UAVs in such environments is a popular, contemporary research area. Many successful solutions have been developed using laser range-finder sensors. These sensors provide very accurate measurements at the cost of increased power and weight requirements. Visual sensors constitute an attractive alternative for state estimation; they offer high information content per image coupled with light weight and low power consumption. As a result, many recent works have focused on state estimation for UAVs where a camera is the only exteroceptive sensor.
In this thesis we aim at advancing the navigation capabilities of UAV systems through the development of a framework that facilitates autonomous flights in previously unknown indoor and outdoor workspaces, using dual on-board cameras as the primary sensors. These sensors introduce interesting challenges that must be considered in the design of each part of the navigation framework. Therefore, the work presented in this thesis focuses on the problem of vision-based navigation for UAVs. For the sake of robustness in autonomous flights, visual information is fused with measurements provided by an Inertial Measurement Unit (IMU), available on every UAV. The complementary nature of visual and inertial data renders this setup a particularly powerful sensor combination, albeit one posing challenges in synchronization and calibration. With this minimal sensory setup, in the current dissertation we have studied, implemented and evaluated methods for UAV control, state estimation, and autonomous navigation. The first contribution of this thesis regards state estimation and control. We have shown that both parts can strongly benefit from a good model that provides valuable information about possible motions. Including the model in the state estimator, combined with a pressure sensor, renders the velocity and the two inclination angles observable. We present a versatile framework, using an Extended Kalman Filter (EKF), to fuse visual information with inertial measurements from the on-board sensor suite. Given that UAVs should be able to estimate their state from visual observations and also share acquired environment information with other members of a team, a common frame of reference for both positioning and observations is also derived. For UAV control, we study the necessary dynamics and differential flatness of multi-rotor systems. We discuss potential abstractions in order to employ position and trajectory controllers on different types of multi-rotor platforms.
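The EKF-based visual-inertial fusion described above can be illustrated with a minimal sketch. This is not the thesis's actual filter: the state is reduced to a 1-D position and velocity, the IMU acceleration drives the prediction, a visual position fix drives the update, and all matrices and noise levels are hypothetical.

```python
import numpy as np

# Minimal 1-D constant-velocity EKF sketch: state x = [position, velocity].
# IMU acceleration drives the prediction; a visual position fix drives
# the update. All matrices and noise values here are illustrative only.

dt = 0.01
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
B = np.array([0.5 * dt**2, dt])         # acceleration input
Hm = np.array([[1.0, 0.0]])             # we observe position only
Q = 1e-4 * np.eye(2)                    # process noise covariance
R = np.array([[1e-2]])                  # measurement noise covariance

def ekf_step(x, P, accel, z):
    # Predict with the IMU acceleration.
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    # Update with the visual position measurement z.
    y = z - Hm @ x                      # innovation
    S = Hm @ P @ Hm.T + R
    K = P @ Hm.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ Hm) @ P
    return x, P

# With a constant visual fix at position 1.0 and zero acceleration,
# the estimate converges to that fix.
x, P = np.zeros(2), np.eye(2)
for _ in range(200):
    x, P = ekf_step(x, P, accel=0.0, z=np.array([1.0]))
```

In the real system the state additionally contains attitude and sensor biases, and the pressure sensor enters as a further measurement; the predict/update structure, however, is the same.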
Based on our investigation, we propose a control approach based on dynamic inversion, which avoids the commonly used attitude control loop and significantly reduces the necessary mathematical operations, rendering the approach appropriate for constrained on-board hardware. Furthermore, we present a novel framework for horizon line (HL) detection that can be effectively used for UAV navigation. Our scheme is based on a Canny edge detector and a Hough detector, along with an optimization step performed by a Particle Swarm Optimization (PSO) algorithm. The PSO's objective function is based on a variation of the Bag of Words (BoW) method to effectively consider multiple image descriptors and facilitate computational efficiency. More specifically, the image descriptors employed are L*a*b color features, texture features and PHOW-SIFT features. Finally, a formulation of a visual navigation technique that enables the Asctec Firefly UAV to localize and navigate in outdoor environments is outlined. This is achieved through the use of a general, scalable tracking and mapping (SLAM) system. The overall goal of this dissertation has been to combine and integrate the above partial contributions and enable UAV navigation in rather large, unknown outdoor environments. The latter implies that the UAV flies stably on its own, while an operator at a ground station only gives high-level commands, such as simple waypoints, desired velocities and dynamic trajectories. These outdoor tests prove the validity of the presented real-time, on-board navigation system. They also show the capability of the entire sensor-fusion framework to convert an aerial vehicle into a power-on-and-go system for real-world, large-scale and long-term operations. The results in this thesis showed that vision-only navigation is possible as long as we are only concerned with the local consistency of the environment.
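The PSO refinement step of the HL-detection pipeline can be sketched in a toy form. Here a synthetic bright-sky/dark-ground image and a simple sky/ground contrast score stand in for real imagery and the BoW descriptor objective, and the line is reduced to a horizontal offset; all parameter values are illustrative, not the thesis's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 64x64 image: bright "sky" above row 24, dark "ground" below.
H, W = 64, 64
img = np.zeros((H, W))
img[:24, :] = 1.0

def objective(offset):
    # Toy stand-in for the BoW-based descriptor score: contrast between
    # the mean intensity above and below a candidate horizon row.
    r = int(np.clip(offset, 1, H - 1))
    return img[:r].mean() - img[r:].mean()

# Minimal particle swarm over the horizon offset (1-D for clarity; the
# real scheme optimizes full line parameters seeded by Hough candidates).
n, iters = 20, 50
pos = rng.uniform(1, H - 1, n)          # particle positions
vel = np.zeros(n)                       # particle velocities
pbest = pos.copy()                      # per-particle best positions
pbest_val = np.array([objective(p) for p in pos])
gbest = pbest[pbest_val.argmax()]       # swarm-wide best position

for _ in range(iters):
    r1, r2 = rng.random(n), rng.random(n)
    # Inertia + cognitive + social velocity update.
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 1, H - 1)
    vals = np.array([objective(p) for p in pos])
    improved = vals > pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmax()]
```

The swarm converges toward the true boundary at row 24, where the contrast score peaks; in the full pipeline the same loop would evaluate the multi-descriptor BoW objective on the actual image.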
The advantages of multi-sensor fusion systems were demonstrated by switching between single and dual visual sensor setups and testing the different configurations under state noise and erroneous measurements. The SLAM framework was able to initialize and produce results for real-time, on-board navigation even in self-similar environments, where features like grass in rural areas and asphalt in urban areas can cause the tracker to lose the current pose and the mapper to fail at re-localization. Furthermore, the navigation system was shown to be able to operate even with low-resolution and distorted image sequences from the visual sensors, in contrast to other visual SLAM approaches. To summarize, in the current dissertation we studied and developed an integrated framework to facilitate autonomous UAV navigation in large, unknown outdoor environments. Accordingly, robust and stable UAV flight has been accomplished, while an operator at a ground station only gives high-level commands, such as simple waypoints, desired velocities and dynamic trajectories.
Language English
Subject EKF
Extended Kalman Filter
Horizon line
IMU
Optical
RT-PTAM
Issue date 2018-03-23
Collection   School/Department--School of Sciences and Engineering--Department of Computer Science--Post-graduate theses
  Type of Work--Post-graduate theses
Permanent Link https://elocus.lib.uoc.gr//dlib/b/a/2/metadata-dlib-1531209278-901930-3721.tkl