Please use this identifier to cite or link to this item:
Title: A semi-automated approach to the content analysis of experience narratives
Authors: Karapanos, Evangelos 
Keywords: Content Analysis;Semantic Similarity;Latent Semantic Analysis;Latent Concept;Automated Approach
Category: Computer and Information Sciences
Field: Natural Sciences
Issue Date: 2013
Publisher: Springer
Journal: Studies in Computational Intelligence, 2013, Pages 115-136
Abstract: iScale will typically result in a wealth of experience narratives relating to different stages of a product's adoption. The qualitative analysis of these narratives is a labor-intensive activity that is prone to researcher bias. This chapter proposes a semi-automated technique that aims to support the researcher in the content analysis of experience narratives. The technique combines traditional qualitative coding procedures (Strauss and Corbin, 1998) with computational approaches to assessing the semantic similarity between documents (Salton et al., 1975). The result is an iterative process of qualitative coding and visualization of insights that enables the researcher to move quickly between high-level generalized knowledge and concrete, idiosyncratic insights. The proposed approach was compared against a traditional vector-space approach to assessing the semantic similarity between documents, Latent Semantic Analysis (LSA), using the dataset of the study in Chapter 4. Overall, the proposed approach was shown to perform substantially better than traditional LSA. Interestingly, however, this advantage was rooted mainly in the explicit modeling of relations between concepts and individual terms, and not in the restriction of the list of terms to those concerning particular phenomena of interest.
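The abstract's baseline, the vector-space model of Salton et al. (1975), represents each narrative as a weighted term vector and measures semantic similarity as the cosine of the angle between vectors. The following is a minimal stdlib-only sketch of that computation; the function names and the toy narratives are illustrative assumptions, not taken from the chapter (full LSA would additionally apply a truncated SVD to the term-document matrix).

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Represent each document as a sparse dict of TF-IDF weights."""
    tokenized = [d.lower().split() for d in docs]
    # Document frequency: in how many documents does each term occur?
    df = Counter()
    for toks in tokenized:
        df.update(set(toks))
    n = len(docs)
    vecs = []
    for toks in tokenized:
        tf = Counter(toks)
        # Weight = term frequency * inverse document frequency
        vecs.append({t: (tf[t] / len(toks)) * math.log(n / df[t]) for t in tf})
    return vecs

def cosine(a, b):
    """Cosine similarity between two sparse TF-IDF vectors."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Illustrative experience narratives (invented for this sketch):
docs = ["battery life was poor",
        "the battery drained fast",
        "screen is bright"]
vecs = tfidf_vectors(docs)
# The first two narratives share the term "battery", so their cosine
# similarity is positive; the first and third share no terms.
```

Because plain TF-IDF only matches surface terms, two narratives about the same concept phrased differently score zero, which is the gap that LSA and the chapter's explicit concept-term modeling aim to close.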
ISBN: 978-3-642-31000-3
Rights: © 2013 Springer-Verlag Berlin Heidelberg.
Type: Book Chapter
Appears in Collections: Κεφάλαια βιβλίων/Book chapters

