Show simple item record

dc.contributor.advisor: Stavdahl, Øyvind
dc.contributor.advisor: Sanfilippo, Filippo
dc.contributor.author: Kjørholt, Harald Gard Halvorssønn
dc.date.accessioned: 2018-09-06T14:02:18Z
dc.date.available: 2018-09-06T14:02:18Z
dc.date.created: 2018-06-04
dc.date.issued: 2018
dc.identifier: ntnudaim:18607
dc.identifier.uri: http://hdl.handle.net/11250/2561331
dc.description.abstract: Developing snake-like robots can be advantageous for numerous reasons. Biological snakes are capable of traversing all types of terrain in an energy-efficient and effective way. They achieve this variety of locomotion tasks by exploiting the terrain's roughness, pushing against obstacles they encounter to propel themselves forward. Mimicking such biological obstacle-aided locomotion (OAL) would let a robotic snake carry out reconnaissance quickly, and this trait is appealing for a variety of applications. Search and rescue missions, firefighting, and the traversal of dangerous or unreachable areas in general are some of the fields in which robotic snakes are being investigated as alternatives to wheeled or legged mobile robots.

Some form of sensing is necessary for a robot to achieve such locomotion effectively; without any means of gathering data from the outside world, the approach would be virtually blind and hence ineffective. This project combines tactile data from the snake's force sensory system with data from an external visual sensor system to achieve Perception-Driven Obstacle-Aided Locomotion (POAL). A sensor fusion approach is adopted to combine the tactile and visual data effectively; cross-referencing the two data sources makes it possible to confirm the reliability of the system's behavior. This thesis focuses on the visual perception aspect and on the implementation of sensor fusion. The most important contributions to the project are:

- implementing the visual perception module to collect and visualize all data necessary for POAL (i.e. snake position, collision points, collision forces, collision directions, etc.);
- integrating it with the tactile perception module (i.e. sensor fusion);
- performing real experiments with POAL, replicating the simulation results previously achieved in [1];
- maintaining and documenting the source code with the aim of releasing the framework as an open-source project.

The experiments in this thesis show that obstacle-aided locomotion by sensor fusion is feasible; this is demonstrated in the attached videos (see Appendix). The system and the experiments conducted are, as of now, strictly for laboratory purposes and do not necessarily represent real-life scenarios. The main development platform is the Robot Operating System (ROS), which is used for robot control. The snake is controlled through a robot interface, and tactile data is exchanged over connections between ROS and that interface, while a visual sensor interface gathers visual data about the snake. The system is visualized in simulation by taking in both the visual and the tactile data.
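The cross-referencing idea described in the abstract, confirming system behavior by checking tactile and visual data against each other, can be sketched as follows. This is a minimal illustrative example only: the data types, the per-link matching, and the force threshold are all assumptions for the sketch, not the thesis's actual implementation or its ROS interfaces.

```python
# Hypothetical sketch of the cross-referencing step in tactile/visual
# sensor fusion: a visually detected collision point is confirmed only
# when a tactile force reading above a threshold occurs on the same
# snake link. All names and values here are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class TactileReading:
    link_id: int
    force_n: float          # measured contact force magnitude [N]

@dataclass
class VisualContact:
    link_id: int
    position: tuple         # (x, y) collision point in the world frame

def fuse_contacts(tactile, visual, force_threshold=0.5):
    """Return the visual contacts confirmed by a tactile reading on the same link."""
    # Links whose force sensors report a significant contact.
    forced_links = {t.link_id for t in tactile if t.force_n >= force_threshold}
    return [v for v in visual if v.link_id in forced_links]

# Example: vision reports contacts on links 2 and 5, but only link 2
# has tactile support, so only that contact is treated as reliable.
tactile = [TactileReading(link_id=0, force_n=0.1),
           TactileReading(link_id=2, force_n=1.4)]
visual = [VisualContact(link_id=2, position=(0.30, 0.12)),
          VisualContact(link_id=5, position=(0.55, 0.40))]
confirmed = fuse_contacts(tactile, visual)
```

In a ROS-based system such as the one described above, each list would arrive as messages on its own topic and the fusion step would run in a node subscribing to both; the filtering logic itself stays the same.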
dc.language: eng
dc.publisher: NTNU
dc.subject: Kybernetikk og robotikk, Robotsystemer
dc.title: A Sensor Fusion Approach with Focus on Visual Sensing for Perception-Driven Obstacle-Aided Snake Robot Locomotion
dc.type: Master thesis

