Please use this identifier to cite or link to this item:
https://hdl.handle.net/20.500.14279/23720
Title: Characterizing Abhorrent, Misinformative, and Mistargeted Content on YouTube
Authors: Papadamou, Kostantinos
Keywords: YouTube's recommendation algorithm; Reddit; disturbing content; involuntary celibates; misinformation; pseudoscience
Advisor: Sirivianos, Michael
Issue Date: Apr-2021
Department: Department of Electrical Engineering, Computer Engineering and Informatics
Faculty: Faculty of Engineering and Technology
Abstract: YouTube has revolutionized the way people discover and consume video content. Although YouTube facilitates easy access to hundreds of well-produced educational, entertaining, and trustworthy news videos, abhorrent, misinformative, and mistargeted content is also common. The platform is plagued by various types of inappropriate content, including: 1) disturbing videos targeting young children; 2) hateful and misogynistic content; and 3) pseudoscientific and conspiratorial content. While YouTube's recommendation algorithm plays a vital role in increasing user engagement and YouTube's monetization, its role in unwittingly promoting problematic content is not entirely understood. In this thesis, we shed light on the degree of abhorrent, misinformative, and mistargeted content on YouTube and on the role of the recommendation algorithm in the discovery and dissemination of such content. Following a data-driven quantitative approach, we analyze thousands of videos posted on YouTube. Specifically, we devise various methodologies to detect problematic content, and we use them to simulate the behavior of users casually browsing YouTube in order to shed light on: 1) the risks of YouTube media consumption by young children; 2) the role of YouTube's recommendation algorithm in the dissemination of hateful and misogynistic content, focusing on the Involuntary Celibates (Incels) community; and 3) user exposure to pseudoscientific misinformation on various parts of the platform and how this exposure changes based on the user's watch history. In a nutshell, our analysis reveals that young children are likely to encounter disturbing content when they randomly browse the platform starting from benign videos relevant to their interests, and that YouTube's currently deployed countermeasures are ineffective at detecting such videos in a timely manner. By analyzing the Incel community on YouTube, we find that not only is Incel activity increasing over time, but platforms may also play an active role in steering users towards extreme content. Finally, when studying pseudoscientific misinformation, we find, among other things, that YouTube suggests more pseudoscientific content for traditional pseudoscientific topics (e.g., flat earth) than for emerging ones (such as COVID-19), and that these recommendations are more common on the search results page than on a user's homepage or in the video recommendations (up-next) section.
URI: https://hdl.handle.net/20.500.14279/23720
Rights: Publication or reproduction, electronic or otherwise, without the written consent of the creator and copyright holder is prohibited. Attribution-NonCommercial-NoDerivatives 4.0 International
Type: PhD Thesis
Affiliation: Cyprus University of Technology
Appears in Collections: PhD Theses
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
Kostantinos Papadamou - PhD Thesis.pdf | | 14.15 MB | Adobe PDF | View/Open |
This item is licensed under a Creative Commons License.