Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14279/13290
DC Field: Value [Language]
dc.contributor.author: Niforatos, Evangelos [-]
dc.contributor.author: Karapanos, Evangelos [-]
dc.date.accessioned: 2019-02-12T06:16:52Z [-]
dc.date.available: 2019-02-12T06:16:52Z [-]
dc.date.issued: 2015-02 [-]
dc.identifier.citation: Personal and Ubiquitous Computing, 2015, vol. 19, no. 2, pp. 425–444. [en_US]
dc.identifier.issn: 1617-4917 [-]
dc.description.abstract: We introduce EmoSnaps, a mobile application that unobtrusively captures pictures of one's facial expressions throughout the day and uses them for later recall of one's momentary emotions. We describe two field studies that employ EmoSnaps to investigate if and how individuals and their relevant others infer emotions from self-face and familiar-face pictures, respectively. Study 1 contrasted users' recalled emotions, as inferred from EmoSnaps' self-face pictures, with ground-truth data derived from Experience Sampling. Contrary to our expectations, we found that people are better able to infer their past emotions from a self-face picture the more time has elapsed since capture. Study 2 assessed EmoSnaps' ability to capture users' experiences while interacting with different mobile apps. The study revealed systematic variations in users' emotions while interacting with different categories of mobile apps (such as productivity and entertainment), social networking services, and direct social communications through phone calls and instant messaging, as well as diurnal and weekly patterns of happiness inferred from EmoSnaps' self-face pictures. All in all, the results of both studies gave us confidence in the validity of self-face pictures captured through EmoSnaps as memory cues for emotion recall, and in the effectiveness of the EmoSnaps tool for measuring users' momentary experiences. [en_US]
dc.format: pdf [en_US]
dc.language.iso: en [en_US]
dc.relation.ispartof: Personal and Ubiquitous Computing [en_US]
dc.rights: © Springer [en_US]
dc.rights: Attribution-NonCommercial-NoDerivs 3.0 United States [*]
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/3.0/us/ [*]
dc.subject: Day Reconstruction Method (DRM) [en_US]
dc.subject: Emotion self-reporting [en_US]
dc.subject: Experience reconstruction [en_US]
dc.subject: Experience Sampling Method (ESM) [en_US]
dc.subject: User experience (UX) evaluation [en_US]
dc.title: EmoSnaps: a mobile application for emotion recall from facial expressions [en_US]
dc.type: Article [en_US]
dc.collaboration: University of Lugano [en_US]
dc.collaboration: Madeira Interactive Technologies Institute [en_US]
dc.subject.category: Computer and Information Sciences [en_US]
dc.journals: Open Access [en_US]
dc.country: Switzerland [en_US]
dc.country: Portugal [en_US]
dc.subject.field: Natural Sciences [en_US]
dc.publication: Peer Reviewed [en_US]
dc.identifier.doi: 10.1007/s00779-014-0777-0 [en_US]
dc.relation.issue: 2 [en_US]
dc.relation.volume: 19 [en_US]
cut.common.academicyear: 2014-2015 [en_US]
dc.identifier.spage: 425 [en_US]
dc.identifier.epage: 444 [en_US]
item.fulltext: No Fulltext [-]
item.cerifentitytype: Publications [-]
item.grantfulltext: none [-]
item.openairecristype: http://purl.org/coar/resource_type/c_6501 [-]
item.openairetype: article [-]
item.languageiso639-1: en [-]
crisitem.journal.journalissn: 1617-4917 [-]
crisitem.journal.publisher: Springer Nature [-]
crisitem.author.dept: Department of Communication and Internet Studies [-]
crisitem.author.faculty: Faculty of Communication and Media Studies [-]
crisitem.author.orcid: 0000-0001-5910-4996 [-]
crisitem.author.parentorg: Faculty of Communication and Media Studies [-]
Appears in Collections:Άρθρα/Articles

Scopus citations: 26 (checked on Nov 6, 2023)
Web of Science citations: 19 (last week: 0, last month: 1; checked on Oct 29, 2023)
Page view(s): 633 (last week: 1, last month: 8; checked on May 12, 2024)

This item is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 United States License.