Please use this identifier to cite or link to this item:
https://hdl.handle.net/20.500.14279/30822
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Melillos, George | - |
dc.contributor.author | Kalogirou, Eleftheria | - |
dc.contributor.author | Makri, Despina | - |
dc.contributor.author | Hadjimitsis, Diofantos G. | - |
dc.date.accessioned | 2023-11-20T11:09:32Z | - |
dc.date.available | 2023-11-20T11:09:32Z | - |
dc.date.issued | 2023-09-21 | - |
dc.identifier.uri | https://hdl.handle.net/20.500.14279/30822 | - |
dc.description.abstract | This paper proposes an automatic ship detection approach for Synthetic Aperture Radar (SAR) images using the YOLO deep learning framework. YOLO (You Only Look Once) is a single-stage object detection algorithm; object detection algorithms based on region proposals include RCNN, Fast RCNN, and Faster RCNN. The Region-based Convolutional Neural Network (RCNN) algorithm generates a set of candidate boxes for an image and then analyses each box to determine whether it contains a target, using selective search to extract those regions from the image. YOLO can be used to assist in safety checks for ships and mariners. In this paper we train the YOLO model on our dataset so that the detector learns to detect objects such as ships in SAR images. Preliminary YOLO test results showed an increase in the accuracy of ship detection off the coast of Cyprus and indicate that the approach can be applied in the field of ship detection. | en_US |
dc.description.sponsorship | ERATOSTHENES Centre of Excellence | en_US |
dc.format | | en_US |
dc.language.iso | en | en_US |
dc.relation | EXCELSIOR: ERATOSTHENES Centre of Excellence for Earth Surveillance and Space-Based Monitoring of the Environment : Teaming Phase1 GA 763643 | en_US |
dc.rights.uri | http://creativecommons.org/publicdomain/zero/1.0/ | * |
dc.subject | YOLO | en_US |
dc.subject | Remote Sensing | en_US |
dc.subject | Sentinel-1 | en_US |
dc.subject | Ship detection | en_US |
dc.title | Ship detection using SAR images based on YOLO (you only look once) | en_US |
dc.type | Conference Papers | en_US |
dc.collaboration | ERATOSTHENES Centre of Excellence | en_US |
dc.collaboration | Cyprus University of Technology | en_US |
dc.subject.category | Computer and Information Sciences | en_US |
dc.subject.category | ENGINEERING AND TECHNOLOGY | en_US |
dc.journals | Subscription | en_US |
dc.country | Cyprus | en_US |
dc.subject.field | Engineering and Technology | en_US |
dc.publication | Non Peer Reviewed | en_US |
dc.relation.conference | Ninth International Conference on Remote Sensing and Geoinformation of the Environment (RSCy2023), 2023, Ayia Napa, Cyprus | en_US |
dc.identifier.doi | https://doi.org/10.1117/12.2681665 | en_US |
cut.common.academicyear | 2022-2023 | en_US |
item.openairetype | conferenceObject | - |
item.cerifentitytype | Publications | - |
item.fulltext | No Fulltext | - |
item.grantfulltext | none | - |
item.openairecristype | http://purl.org/coar/resource_type/c_c94f | - |
item.languageiso639-1 | en | - |
crisitem.project.funder | EC | - |
crisitem.project.grantno | H2020-WIDESPREAD-04-2017 | - |
crisitem.project.fundingProgram | H2020 | - |
crisitem.project.openAire | info:eu-repo/grantAgreement/EC/H2020/763643 | - |
crisitem.author.dept | Department of Civil Engineering and Geomatics | - |
crisitem.author.dept | Department of Civil Engineering and Geomatics | - |
crisitem.author.faculty | Faculty of Engineering and Technology | - |
crisitem.author.faculty | Faculty of Engineering and Technology | - |
crisitem.author.orcid | 0000-0002-8292-1836 | - |
crisitem.author.orcid | 0009-0005-0188-0200 | - |
crisitem.author.orcid | 0009-0002-6217-9328 | - |
crisitem.author.orcid | 0000-0002-2684-547X | - |
crisitem.author.parentorg | Faculty of Engineering and Technology | - |
crisitem.author.parentorg | Faculty of Engineering and Technology | - |
Appears in Collections: | EXCELSIOR H2020 Teaming Project Publications |
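
The abstract above describes fine-tuning a YOLO detector on a dataset of SAR image chips so that it learns to detect ships. The snippet below is a minimal sketch of that kind of workflow, not the authors' actual pipeline: it assumes the open-source `ultralytics` YOLO package, COCO-pretrained starting weights, and Sentinel-1 chips already converted to ordinary image files with YOLO-format bounding-box labels; the file names `ships.yaml` and `sar_chip.png` are hypothetical placeholders.

```python
# Minimal sketch: fine-tuning a YOLO model on SAR ship chips and running detection.
# Assumes the `ultralytics` package (pip install ultralytics) and a YOLO-format
# dataset of Sentinel-1 chips with ship bounding boxes; all paths are placeholders.
from ultralytics import YOLO

# Start from COCO-pretrained weights and fine-tune on the ship dataset.
model = YOLO("yolov8n.pt")
model.train(
    data="ships.yaml",   # dataset config: train/val image paths and class names
    epochs=100,
    imgsz=640,
)

# Run inference on a single SAR chip and print the detected ship boxes.
results = model.predict("sar_chip.png", conf=0.25)
for r in results:
    for box in r.boxes:
        x1, y1, x2, y2 = box.xyxy[0].tolist()
        print(f"ship at ({x1:.0f}, {y1:.0f}) -> ({x2:.0f}, {y2:.0f}), "
              f"confidence {float(box.conf):.2f}")
```

In practice the choice of YOLO variant, input chip size, and confidence threshold would need to be tuned to the SAR data; the values above are illustrative defaults only.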
This item is licensed under a Creative Commons License