Title: Detecting square markers in underwater environments
Authors: Čejka, Jan 
Bruno, Fabio 
Skarlatos, Dimitrios 
Liarokapis, Fotis 
Keywords: Augmented reality;Cultural heritage;Generating synthetic images;Marker-based tracking;Real time
Category: Computer and Information Sciences
Field: Natural Sciences
Issue Date: Feb-2019
Source: Remote Sensing, 2019, Volume 11, Issue 4, Article number 459
Journal: Remote Sensing 
Abstract: Augmented reality can be deployed in various application domains, such as enhancing human vision, manufacturing, medicine, military, entertainment, and archaeology. One of the least explored areas is the underwater environment. The main benefit of augmented reality in these environments is that it can help divers navigate to points of interest or present interesting information about archaeological and touristic sites (e.g., ruins of buildings, shipwrecks). However, the harsh sea environment affects computer vision algorithms and complicates the detection of objects, which is essential for augmented reality. This paper presents a new algorithm for the detection of fiducial markers that is tailored to underwater environments. It also proposes a method that generates synthetic images with such markers in these environments. This new detector is compared with existing solutions using synthetic images and images taken in the real world, showing that it performs better than other detectors: it finds more markers than faster algorithms and runs faster than robust algorithms that detect the same number of markers.
ISSN: 2072-4292
DOI: 10.3390/rs11040459
Rights: © 2019 by the authors.
Type: Article
Appears in Collections: Άρθρα/Articles

Files in This Item:
File: remotesensing-11-00459.pdf
Description: Fulltext
Size: 4.18 MB
Format: Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.