Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14279/32725
DC Field | Value | Language
dc.contributor.author | Vlachos, Marinos | -
dc.contributor.author | Skarlatos, Dimitrios | -
dc.date.accessioned | 2024-07-24T09:52:29Z | -
dc.date.available | 2024-07-24T09:52:29Z | -
dc.date.issued | 2024-04-01 | -
dc.identifier.citation | Remote Sensing, 2024, vol. 16, no. 7 | en_US
dc.identifier.issn | 2072-4292 | -
dc.identifier.uri | https://hdl.handle.net/20.500.14279/32725 | -
dc.description.abstract | Colour restoration of datasets acquired in deep water with simple equipment, such as a camera with strobes, is not easy. This is because much of the relevant information is missing: the water environmental conditions, the geometric setup of the strobes and the camera, and, in general, a precisely calibrated setup. For these reasons, this study proposes a self-adaptive colour calibration method for underwater (UW) images captured in deep water with a simple camera-and-strobe setup. The proposed methodology uses the scene's 3D geometry, in the form of Structure from Motion and MultiView Stereo (SfM-MVS)-generated depth maps, together with the well-lit areas of certain images and a Feedforward Neural Network (FNN), to predict and restore the actual colours of the scene in a UW image dataset. | en_US
dc.description.sponsorship | Cyprus University of Technology. The APC was funded by the Cyprus University of Technology. | en_US
dc.format | pdf | en_US
dc.language.iso | en | en_US
dc.relation.ispartof | Remote Sensing | en_US
dc.rights | Attribution-NonCommercial-NoDerivatives 4.0 International | en_US
dc.rights.uri | http://creativecommons.org/licenses/by-nc-nd/4.0/ | *
dc.subject | underwater colour restoration | en_US
dc.subject | feedforward neural networks | en_US
dc.subject | multiview stereo | en_US
dc.subject | structure from motion | en_US
dc.title | Self-Adaptive Colour Calibration of Deep Underwater Images Using FNN and SfM-MVS-Generated Depth Maps | en_US
dc.type | Article | en_US
dc.collaboration | Cyprus University of Technology | en_US
dc.subject.category | Civil Engineering | en_US
dc.journals | Open Access | en_US
dc.country | Cyprus | en_US
dc.country | Cyprus | en_US
dc.subject.field | Engineering and Technology | en_US
dc.publication | Peer Reviewed | en_US
dc.identifier.doi | 10.3390/rs16071279 | en_US
dc.identifier.scopus | 2-s2.0-85190311135 | -
dc.identifier.url | https://api.elsevier.com/content/abstract/scopus_id/85190311135 | -
dc.relation.issue | 7 | en_US
dc.relation.volume | 16 | en_US
cut.common.academicyear | 2024-2025 | en_US
item.grantfulltext | open | -
item.openairecristype | http://purl.org/coar/resource_type/c_6501 | -
item.fulltext | With Fulltext | -
item.languageiso639-1 | en | -
item.cerifentitytype | Publications | -
item.openairetype | article | -
crisitem.journal.journalissn | 2072-4292 | -
crisitem.journal.publisher | MDPI | -
crisitem.author.dept | Department of Civil Engineering and Geomatics | -
crisitem.author.dept | Department of Civil Engineering and Geomatics | -
crisitem.author.faculty | Faculty of Engineering and Technology | -
crisitem.author.faculty | Faculty of Engineering and Technology | -
crisitem.author.orcid | 0000-0002-2732-4780 | -
crisitem.author.parentorg | Faculty of Engineering and Technology | -
crisitem.author.parentorg | Faculty of Engineering and Technology | -
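The abstract describes mapping observed underwater colours plus SfM-MVS depth to restored colours with a feedforward neural network. The sketch below is a minimal, hypothetical illustration of that idea — it is not the authors' code, and the network size, the exponential attenuation model used to simulate training data, and all names are assumptions for demonstration only:

```python
# Hypothetical FNN colour-restoration sketch (numpy only, no real image data).
# Input per pixel: (R, G, B, depth); output: restored (R, G, B).
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

class ColourFNN:
    """Tiny two-layer feedforward net predicting a residual colour correction."""
    def __init__(self, hidden=16):
        self.W1 = rng.normal(0.0, 0.1, (4, hidden))
        self.b1 = np.zeros(hidden)
        self.W2 = rng.normal(0.0, 0.1, (hidden, 3))
        self.b2 = np.zeros(3)

    def forward(self, x):
        h = relu(x @ self.W1 + self.b1)
        # Residual formulation: start from the observed RGB and learn a correction.
        return x[:, :3] + h @ self.W2 + self.b2

    def train_step(self, x, y, lr=0.05):
        # One gradient-descent step on mean squared error, gradients by hand.
        h_pre = x @ self.W1 + self.b1
        h = relu(h_pre)
        pred = x[:, :3] + h @ self.W2 + self.b2
        err = pred - y                       # (N, 3)
        n = len(x)
        gW2 = h.T @ err / n
        gb2 = err.mean(axis=0)
        gh = (err @ self.W2.T) * (h_pre > 0)  # ReLU derivative
        gW1 = x.T @ gh / n
        gb1 = gh.mean(axis=0)
        self.W2 -= lr * gW2; self.b2 -= lr * gb2
        self.W1 -= lr * gW1; self.b1 -= lr * gb1
        return float((err ** 2).mean())

# Toy training pairs standing in for "well-lit reference" vs. observed colours.
depth = rng.uniform(0.0, 1.0, (256, 1))
true_rgb = rng.uniform(0.0, 1.0, (256, 3))
# Assumed attenuation model: red fades fastest with depth, as is typical underwater.
atten = np.exp(-np.array([2.0, 0.8, 0.3]) * depth)
obs = np.hstack([true_rgb * atten, depth])

net = ColourFNN()
losses = [net.train_step(obs, true_rgb) for _ in range(500)]
print(f"MSE: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

In the paper's setting, the reference colours would come from the well-lit areas of certain images rather than simulation, and depth would come from the SfM-MVS depth maps; the sketch only shows the shape of the learning problem.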
Appears in Collections: Άρθρα/Articles

Files in This Item:
File | Description | Size | Format
remotesensing-16-01279-v2.pdf | | 8.33 MB | Adobe PDF
This item is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.