GPS-Denied Vision Control of UAV
From March 2017 to December 2018, I worked on "Project Micrathene" together with a Ph.D. student who focused on controls. The objective of this project was to develop an autonomous launch, tracking, and landing system for an Unmanned Aerial Vehicle (UAV). The tracking and landing target was a moving Unmanned Ground Vehicle (UGV) platform, and everything had to be done without GPS.
In this project, vision was used to estimate the pose, i.e., the relative translation and rotation, between the UGV and the UAV. This pose was used for flight control during the different flight states of the UAV.
The hardware platform consists of:
- DJI Matrice 100 flight platform
- Point Grey Chameleon 3 camera
- Intel NUC onboard computer
- Omnidirectional mobile platform (the UGV)
A fiducial marker was designed to obtain the relative pose between the UAV and the UGV platform. There are four coordinate systems: image, camera, marker, and UAV. The transformation matrices between these coordinate systems are shown in Figure 2. In order to control the UAV, the pose of the marker with respect to the camera, $[R_2 \quad t_2]$, has to be computed with the PnP (Perspective-n-Point) method. The pose between the UAV and the moving platform can then be obtained through an additional coordinate transformation.
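A minimal sketch of this step is shown below, assuming OpenCV's `solvePnP` is applied to the four detected marker corners. The marker size, corner ordering, and the `T_uav_cam` camera-to-UAV extrinsic transform are illustrative assumptions, not the project's actual values:

```python
import numpy as np
import cv2

# Hypothetical marker geometry: a square marker of side 0.30 m, with corners
# expressed in the marker frame (z = 0 on the marker plane).
MARKER_SIZE = 0.30
OBJECT_POINTS = np.array([
    [-MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
    [-MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
], dtype=np.float64)


def marker_pose_in_camera(corners_px, camera_matrix, dist_coeffs):
    """Solve PnP for [R2 | t2]: the marker pose in the camera frame.

    corners_px: 4x2 array of pixel coordinates of the detected marker
    corners, in the same order as OBJECT_POINTS. Pass np.zeros(5) as
    dist_coeffs if the image has already been undistorted.
    """
    ok, rvec, tvec = cv2.solvePnP(
        OBJECT_POINTS, corners_px.astype(np.float64),
        camera_matrix, dist_coeffs, flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        return None
    R2, _ = cv2.Rodrigues(rvec)   # 3x3 rotation, marker frame -> camera frame
    t2 = tvec.reshape(3, 1)       # marker origin expressed in the camera frame
    return R2, t2


def platform_pose_in_uav_frame(R2, t2, T_uav_cam):
    """Chain the camera-to-UAV extrinsics (T_uav_cam, a 4x4 homogeneous
    transform) with [R2 | t2] to obtain the marker (platform) pose in the
    UAV body frame."""
    T_cam_marker = np.eye(4)
    T_cam_marker[:3, :3] = R2
    T_cam_marker[:3, 3:] = t2
    return T_uav_cam @ T_cam_marker
```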
The software design of this project is shown in Figure 3:
- Obtain undistorted images through image processing (see the undistortion sketch after this list);
- Send these images to the marker detection process to get the camera pose with respect to the marker;
- In the main process, run a state machine that processes all the data and commands, including the pose between the moving platform and the UAV, sensor data from the DJI SDK, and the commands from the off-board computer (see the state machine sketch after this list).
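A minimal sketch of the undistortion step referenced in the first item, assuming the intrinsics come from an offline OpenCV calibration; the names `camera_matrix` and `dist_coeffs` are assumptions, not the project's identifiers:

```python
import cv2
import numpy as np


def undistort_frame(frame, camera_matrix, dist_coeffs):
    """Remove lens distortion from a raw camera frame.

    camera_matrix (3x3) and dist_coeffs (1x5) are assumed to come from an
    offline calibration, e.g. cv2.calibrateCamera with a checkerboard.
    """
    h, w = frame.shape[:2]
    # Refine the camera matrix so the undistorted image keeps only valid pixels.
    new_matrix, roi = cv2.getOptimalNewCameraMatrix(
        camera_matrix, dist_coeffs, (w, h), 0)
    undistorted = cv2.undistort(frame, camera_matrix, dist_coeffs, None, new_matrix)
    x, y, rw, rh = roi
    return undistorted[y:y + rh, x:x + rw]
```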
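And a minimal sketch of the state machine in the main process; the specific states and transition thresholds below are assumptions chosen for illustration rather than the project's actual logic:

```python
from enum import Enum, auto


class FlightState(Enum):
    # Illustrative state set; the project's actual states may differ.
    LAUNCH = auto()
    SEARCH = auto()   # marker not yet in view
    TRACK = auto()    # follow the moving platform
    LAND = auto()     # descend onto the platform
    LANDED = auto()


def next_state(state, marker_visible, height_m, land_cmd):
    """One tick of a simplified launch/track/land state machine.

    Inputs mirror the data fused in the main process: the relative pose
    (reduced here to marker_visible and height above the platform),
    onboard sensor data, and a command from the off-board computer.
    """
    if state is FlightState.LAUNCH and height_m > 2.0:
        return FlightState.SEARCH
    if state is FlightState.SEARCH and marker_visible:
        return FlightState.TRACK
    if state is FlightState.TRACK and not marker_visible:
        return FlightState.SEARCH
    if state is FlightState.TRACK and land_cmd:
        return FlightState.LAND
    if state is FlightState.LAND and height_m < 0.1:
        return FlightState.LANDED
    return state
```

In a system like this, the main loop would call `next_state` once per control cycle and select the corresponding controller (takeoff, platform tracking, descent) based on the returned state.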