Autonomous Drone with Object Pickup Capabilities
The objective of this project was to create an autonomous drone capable of picking up and delivering an object without any human interference. This is a continuation of the work done during the pre-project. The objectives for this project were:

1. Improve the orientation controller
2. Implement a position controller with the necessary sensors
3. Implement a method for the drone to recognize objects to be picked up, and lock on to the object
4. Implement a method for picking up an object
5. Do an overall test of all the objectives mentioned above
6. If there is enough time, implement a collision detection system

Before the orientation controller could be improved, it was necessary to revise the orientation data processing algorithms. After implementing a calibration routine, motor throttle compensation, rollover compensation, and an acceptance test for the compass, as well as a tilt compensation function for the gyroscope, the results were satisfactory. An LQR controller and a cascade controller were tested in order to find the best alternative for controlling the drone's orientation. Both controllers were tested and tuned against the drone model in Simulink before being implemented on the drone's MCU for further testing and tuning. The cascade configuration was precise and responsive, and proved to be the superior controller design for this use. With this, the first objective was met.

In order to equip the drone with a position controller, the following new hardware was installed:

1. GNSS receiver, for measuring position and velocity
2. Pressure sensor, for accurately measuring altitude
3. Vision sensor, for detecting objects
4. IR sensor, for detecting the ground

To give visual feedback on the drone's current status, LEDs were installed. Appropriate drivers were developed for the new peripheral devices. The new hardware required a new PCB design; the new design made the electronics more concealed and the drone more compact.
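The cascade configuration for orientation control can be sketched as two nested loops: an outer loop that turns angle error into a desired angular rate, and an inner loop that turns rate error into a motor command. The structure below is a minimal illustration only; the class names, loop forms, and gains are assumptions, not the tuned values from the thesis.

```python
class PID:
    """Simple PID controller with a fixed time step dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


class CascadeAttitudeController:
    """Outer loop: angle error -> desired angular rate.
       Inner loop: rate error -> motor torque command."""
    def __init__(self, dt):
        # Gains are illustrative placeholders, not the thesis values.
        self.angle_loop = PID(kp=4.0, ki=0.0, kd=0.0, dt=dt)
        self.rate_loop = PID(kp=0.8, ki=0.2, kd=0.01, dt=dt)

    def update(self, angle_setpoint, angle_meas, rate_meas):
        rate_setpoint = self.angle_loop.update(angle_setpoint, angle_meas)
        return self.rate_loop.update(rate_setpoint, rate_meas)
```

One practical advantage of this structure, and a likely reason it tuned well on the drone, is that the fast inner rate loop can be tuned first in isolation before closing the slower angle loop around it.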
A hook connected to a servo was suggested as a method for picking up objects, but was not implemented due to controller issues. To make the visual representation of the drone's position and velocity easy to comprehend, the GNSS data was converted to ENU coordinates, and the pressure sensor data was recalculated to altitude. Sensor fusion with Kalman filtering was implemented to filter the velocity data, and a second-order low-pass filter was used to filter the altitude data; this gave noise-reduced measurements. The controller design used for positioning was a cascade controller, tuned and tested in the same manner as the orientation controller. The tuning of the position controller had to be done outdoors, but due to harsh weather conditions not enough time was spent on tuning, and the parameters were therefore not optimal. The overall controller design and implementation worked as planned; with this, the second objective was met. The raw data from the vision sensor, the object's $x$- and $y$-position and object width, was recalculated to a three-dimensional position vector so that the object position data could be used in the position controller. To make testing easier, the vision sensor was also modeled in Simulink. The flight controller combines all the controller and data processing modules, plus the logic necessary for the modules to cooperate. It was designed in Simulink because of the possibility to simulate against the drone model. Four flight modes were implemented in the flight controller:

1. Manual mode
2. Manual GNSS mode
3. Autonomous mode
4. Autonomous object mode

A battery monitoring system was also implemented to warn the user of low battery via the LEDs. A real-time operating system was implemented due to the need to perform several operations at the same time, and because some modules have a strict execution deadline. A total of seven tasks and one interrupt routine were implemented. The operating system worked flawlessly.
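The idea of Kalman filtering a noisy velocity measurement can be illustrated with a minimal scalar (1-D) filter. This is a sketch only: the thesis fuses several sensors, while the version below tracks a single random-walk state, and the noise parameters `q` and `r` are illustrative assumptions.

```python
class ScalarKalman:
    """Minimal 1-D Kalman filter: predict with process noise q,
       then correct with a measurement of noise variance r."""
    def __init__(self, q, r, x0=0.0, p0=1.0):
        self.q, self.r = q, r   # process / measurement noise variances
        self.x, self.p = x0, p0  # state estimate and its variance

    def update(self, z):
        # Predict: random-walk model, so the state carries over
        # while its uncertainty grows by q.
        self.p += self.q
        # Correct: blend prediction and measurement z by the Kalman gain.
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x
```

Fed a noisy velocity signal, the estimate settles near the underlying value while individual measurement jumps are heavily attenuated, which is the noise reduction the abstract refers to.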
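The pressure-to-altitude recalculation and the second-order low-pass filtering of the altitude signal can be sketched as follows. The barometric constants are standard-atmosphere values; the filter form (two cascaded first-order stages), cutoff frequency, and sample rate are assumptions for illustration, not taken from the thesis.

```python
import math

P0 = 101325.0  # standard sea-level pressure, Pa

def pressure_to_altitude(pressure_pa, p0=P0):
    """International barometric formula (standard atmosphere), metres."""
    return 44330.0 * (1.0 - (pressure_pa / p0) ** (1.0 / 5.255))

class SecondOrderLowPass:
    """Second-order low-pass built from two cascaded first-order
       stages; alpha follows from cutoff fc (Hz) and sample period dt (s)."""
    def __init__(self, fc, dt):
        rc = 1.0 / (2.0 * math.pi * fc)
        self.alpha = dt / (dt + rc)
        self.s1 = None  # first-stage state
        self.s2 = None  # second-stage state (filter output)

    def update(self, x):
        if self.s1 is None:            # seed states with the first sample
            self.s1 = self.s2 = x
        self.s1 += self.alpha * (x - self.s1)
        self.s2 += self.alpha * (self.s1 - self.s2)
        return self.s2
```

The second-order roll-off (−40 dB/decade) suppresses pressure-sensor noise much harder than a single first-order stage would, at the cost of extra phase lag in the altitude estimate.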
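The recalculation from the vision sensor's raw output (object $x$- and $y$-position in the image plus apparent width) to a three-dimensional position vector can be sketched with a pinhole-camera model. The focal length, principal point, and real object width below are hypothetical parameters; the thesis does not state its exact formula here.

```python
def object_position_3d(px, py, pixel_width,
                       real_width_m=0.20,   # assumed physical object width
                       focal_px=320.0,      # assumed focal length in pixels
                       cx=160.0, cy=120.0): # assumed image centre
    """Estimate object position in the camera frame from pixel data.
       Depth follows from similar triangles: Z = f * W / w_pixels."""
    z = focal_px * real_width_m / pixel_width  # depth along optical axis
    x = (px - cx) * z / focal_px               # lateral offset, metres
    y = (py - cy) * z / focal_px               # vertical offset, metres
    return (x, y, z)
```

With these assumed parameters, an object centred in the image with a 64-pixel apparent width would be estimated at roughly 1 m straight ahead; a vector like this is what the position controller can consume as a setpoint offset.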
All modes except "autonomous object mode" were tested on the physical drone. Although all three tested modes functioned as expected, there was definitely room for improvement. When testing "manual mode", an issue arose: given a roll or pitch input, the drone was not able to hold its attitude over time, and would flatten out as it picked up speed. This limited the drone's top speed, which was noticeable in the performance of "autonomous mode" and "manual GNSS mode", affecting the drone's ability to hold its geographic position and to move to another position. The root cause of this issue was thought to be poor controller parameters. Because the position controller was not precise enough, there was no point in implementing "autonomous object mode" on the drone; the mode was instead simulated in Simulink, and the result was satisfying enough to suggest that it could be implemented at a later time. With this, the third and fourth objectives were not met, although much of the work has been done. A test of all the implemented objectives was done, so the fifth objective was met. There was no time to complete objective six.