Title: Enhancing Nonverbal Human Computer Interaction with Expression Recognition
Authors: Karpouzis, Kostas 
Tsapatsoulis, Nicolas 
Raouzaiou, Amaryllis 
Moshovitis, George
Kollias, Stefanos D. 
Keywords: Human-computer interaction
Issue Date: 2000
Publisher: ACM
Source: ACM SIGCAPH: Computers and the Physically Handicapped, no. 67, 2000, pp.1-9
Abstract: This paper describes an integrated system for human emotion recognition, which is used to provide feedback about the relevance or impact of the information that is presented to the user. Other techniques in this field extract explicit motion fields from the areas of interest and classify them with the help of templates or training sets; the proposed system, however, compares indications of muscle activation from the human face to data taken from similar actions of a 3-d head model. This comparison takes place at curve level, with each curve being drawn from detected feature points in an image sequence or from selected vertices of the polygonal model. The result of this process is the identification of the muscles that contribute to the detected motion; this conclusion can then be used in conjunction with the Mimic Language, a table structure that maps groups of muscles to emotions. This method can be applied to either frontal or rotated views, as the calculated curves are easier to rotate in 3-d space than motion vector fields. The notion of describing motion with specific points is also supported in MPEG-4, and the relevant encoded data can be used in the same context, eliminating the need for machine vision techniques.
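The abstract's pipeline (match observed feature-point curves against per-muscle model curves, then map the activated muscle group to an emotion through a Mimic-Language-style table) can be sketched minimally as follows. All names, curve samples, and table entries here are illustrative assumptions, not data from the paper:

```python
# Minimal sketch of the curve-matching idea from the abstract: a detected
# feature-point trajectory is compared, at curve level, against reference
# curves produced by individual muscles of a 3-d head model; the set of
# best-matching muscles is then mapped to an emotion via a lookup table.
# All muscle names, curve values, and table entries are hypothetical.
import math

# Reference curves: per-muscle displacement of a tracked feature point over
# time, sampled from the head model (values illustrative only).
MODEL_CURVES = {
    "zygomatic_major": [0.0, 0.2, 0.5, 0.8, 1.0],      # mouth-corner raise
    "corrugator":      [0.0, -0.1, -0.3, -0.5, -0.6],  # brow lowering
}

# Mimic-Language-style table: group of muscles -> emotion label.
MIMIC_TABLE = {
    frozenset(["zygomatic_major"]): "joy",
    frozenset(["corrugator"]): "anger",
}

def curve_distance(a, b):
    """Euclidean distance between two equally sampled curves."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def best_muscle(observed):
    """Return the model muscle whose curve is closest to the observed one."""
    return min(MODEL_CURVES, key=lambda m: curve_distance(observed, MODEL_CURVES[m]))

def classify(observed_curves):
    """Map the set of best-matching muscles to an emotion, if tabled."""
    active = frozenset(best_muscle(c) for c in observed_curves)
    return MIMIC_TABLE.get(active, "unknown")

# One tracked feature-point curve, close to the zygomatic-major model curve.
detected = [[0.0, 0.25, 0.45, 0.75, 1.05]]
print(classify(detected))  # -> joy
```

Because the comparison is done on curves rather than dense motion fields, the observed curves could be rotated into the model's frame before matching, which is what the abstract credits for handling non-frontal views.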
ISSN: 0163-5727
Rights: ©2003 - 2009 Association for Computing Machinery
Type: Article
Appears in Collections: Articles
