Abstract
This dissertation addresses the problem of improving the teleoperation experience of an over-constrained multi-terrain mobile robot. Localising a robot and assessing its condition is cumbersome in a teleoperation scenario. Although robots are becoming more sophisticated, they are still unable to perform efficiently without human intervention, and current state-of-the-art localisation technology cannot operate without failure. Odometry is robust, though it is error-prone and complicated to apply to over-constrained robots. Ideally, every actuator drives the robot towards the same position; in practice, terrain roughness hinders the motion control system, which fails to guarantee perfect kinematic geometries. As a result, each wheel contributes differently to the position estimate of the robot. The proposed odometry model considers the input provided by each wheel and supplies an error model for further sensor fusion. The operator's link to the remote environment is often limited to a single video feed, which reduces the operator's ability to assess the robot's condition, contributes to disorientation, and complicates comprehension of the remote environment. This dissertation tackles the problem of improving situation awareness in teleoperation by offering the user multiple alternatives to control a pan-tilt camera. Results obtained in real and simulated environments demonstrate the capabilities of the presented systems.
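To illustrate the idea of combining per-wheel measurements with an error model for later sensor fusion, the sketch below shows a minimal, hypothetical version for a skid-steer platform. It is not the dissertation's model: the function name, the inverse-variance weighting, and all parameters (`track`, the per-wheel variances) are illustrative assumptions only.

```python
import numpy as np

def fuse_wheel_odometry(left_deltas, right_deltas, left_vars, right_vars, track=0.5):
    """Fuse per-wheel travel measurements of a skid-steer platform into a
    single (dx, dtheta) increment plus its covariance, so a downstream
    sensor-fusion filter (e.g. an EKF) can weight the odometry accordingly.

    *_deltas : wheel travel over one step [m], one entry per wheel on that side
    *_vars   : assumed error variance per wheel (larger when slip is suspected)
    track    : lateral distance between the left and right wheel sets [m]
    """
    def weighted_side(deltas, variances):
        d = np.asarray(deltas, dtype=float)
        v = np.asarray(variances, dtype=float)
        w = 1.0 / v                       # inverse-variance weights
        mean = np.sum(w * d) / np.sum(w)  # wheels believed to slip more count less
        var = 1.0 / np.sum(w)             # variance of the weighted mean
        return mean, var

    dl, vl = weighted_side(left_deltas, left_vars)
    dr, vr = weighted_side(right_deltas, right_vars)

    dx = 0.5 * (dl + dr)        # forward displacement of the body frame
    dtheta = (dr - dl) / track  # heading change

    # First-order error propagation yields the covariance handed to the filter.
    var_dx = 0.25 * (vl + vr)
    var_dtheta = (vl + vr) / track**2
    return np.array([dx, dtheta]), np.diag([var_dx, var_dtheta])
```

In this toy version, wheels flagged as unreliable (large assumed variance) contribute less to the pose increment, and the returned covariance shrinks only when the wheels agree, which is the property a fusion stage needs from an odometry error model.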
Original language | English
---|---
Awarding Institution |
Supervisors/Advisors |
Award date | 1 Jan 2008
Place of Publication | Caparica
Publisher |
DOIs |
Publication status | Published - 1 Jan 2008