Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14279/2436
Title: A visual-servoing scheme for semi-autonomous operation of an underwater robotic vehicle using an IMU and a laser vision system
Authors: Karras, George C. 
Kyriakopoulos, Kostas J. 
Loizou, Savvas 
Keywords: Degree of freedom;Human-machine systems;Robot vision
Issue Date: 2010
Source: IEEE International Conference on Robotics and Automation (ICRA), 2010, Anchorage, Alaska
Abstract: This paper presents a visual servoing control scheme that is applied to an underwater robotic vehicle. The objective of the proposed control methodology is to provide a human operator with the capability to move the vehicle without losing the target from the vision system's field of view. On-line estimation of the vehicle states is achieved by fusing data from a Laser Vision System (LVS) and an Inertial Measurement Unit (IMU) using an asynchronous Unscented Kalman Filter (UKF). A controller designed at the kinematic level is backstepped into the dynamics of the system, maintaining its analytical stability guarantees. It is shown that the under-actuated degree of freedom is input-to-state stable, and an energy-based shaping of the user input with stability guarantees is implemented. The resulting control scheme has analytically guaranteed stability and convergence properties, while its applicability and performance are experimentally verified using a small Remotely Operated Vehicle (ROV) in a test tank.
URI: https://hdl.handle.net/20.500.14279/2436
ISSN: 1050-4729
DOI: 10.1109/ROBOT.2010.5509259
Rights: © 2010 IEEE
Type: Conference Papers
Affiliation: Frederick University 
Appears in Collections: Conference papers, posters, or presentations

Items in KTISIS are protected by copyright, with all rights reserved, unless otherwise indicated.