Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14279/31186
DC Field | Value | Language
dc.contributor.author | Konstantinou, Loukas | -
dc.contributor.author | Panos, Dionysis | -
dc.contributor.author | Karapanos, Evangelos | -
dc.date.accessioned | 2024-02-12T12:20:39Z | -
dc.date.available | 2024-02-12T12:20:39Z | -
dc.date.issued | 2024-01-17 | -
dc.identifier.citation | International Journal of Human–Computer Interaction, 2024 | en_US
dc.identifier.issn | 1532-7590 | -
dc.identifier.uri | https://hdl.handle.net/20.500.14279/31186 | -
dc.description.abstract | Interest in technological solutions for combating online misinformation has grown rapidly over the last decade, yet most proposed tools do not ground their design in behavioral theory. Nor do they consider how individuals might interact with these tools, or how variations in the design of the interventions may affect end-users’ decision-making and behavioral responses. In this paper, we explore the potential of nudging to inform the design of technological tools that aim to mitigate online misinformation through behavior change. We report on a design workshop in which 29 participants were asked to conceive technology-mediated nudges supporting individuals’ decision-making in the production, dissemination, or consumption of misinformative content. In producing novel solutions, participants used the “Nudge Deck,” a design support tool that makes nudge theory, and in particular a framework of 23 interaction design mechanisms for nudging, accessible during time-constrained design meetings. We present the outputs of the session and discuss them in light of prior literature with respect to ethics and potential effectiveness. | en_US
dc.format | pdf | en_US
dc.language.iso | en | en_US
dc.relation.ispartof | International Journal of Human-Computer Interaction | en_US
dc.rights | © Taylor & Francis Group | en_US
dc.rights | Attribution-NonCommercial-NoDerivatives 4.0 International | *
dc.rights.uri | http://creativecommons.org/licenses/by-nc-nd/4.0/ | *
dc.subject | Nudging | en_US
dc.subject | Online misinformation | en_US
dc.subject | Behavior change | en_US
dc.subject | Cognitive biases | en_US
dc.title | Exploring the design of technology-mediated nudges for online misinformation | en_US
dc.type | Article | en_US
dc.collaboration | Cyprus University of Technology | en_US
dc.subject.category | Computer and Information Sciences | en_US
dc.journals | Subscription | en_US
dc.country | Cyprus | en_US
dc.subject.field | Natural Sciences | en_US
dc.publication | Peer Reviewed | en_US
dc.identifier.doi | 10.1080/10447318.2023.2301265 | en_US
cut.common.academicyear | 2023-2024 | en_US
item.fulltext | No Fulltext | -
item.openairecristype | http://purl.org/coar/resource_type/c_6501 | -
item.openairetype | article | -
item.grantfulltext | none | -
item.languageiso639-1 | en | -
item.cerifentitytype | Publications | -
crisitem.journal.journalissn | 1532-7590 | -
crisitem.journal.publisher | Taylor & Francis | -
crisitem.author.dept | Department of Communication and Internet Studies | -
crisitem.author.dept | Department of Communication and Internet Studies | -
crisitem.author.faculty | Faculty of Communication and Media Studies | -
crisitem.author.faculty | Faculty of Communication and Media Studies | -
crisitem.author.orcid | 0000-0002-3493-3015 | -
crisitem.author.orcid | 0000-0001-5910-4996 | -
crisitem.author.parentorg | Faculty of Communication and Media Studies | -
crisitem.author.parentorg | Faculty of Communication and Media Studies | -
Appears in Collections: Άρθρα/Articles
This item is licensed under a Creative Commons License (Attribution-NonCommercial-NoDerivatives 4.0 International).