Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14279/4060
Title: Multi-Modal Contact-less Human Computer Interaction
Authors: Frangeskides, Frangiskos 
Lanitis, Andreas 
Other Contributors: Λανίτης, Ανδρέας (Lanitis, Andreas)
Major Field of Science: Humanities
Field Category: Arts
Keywords: User Interfaces and Human Computer Interaction; Artificial Intelligence (incl. Robotics); Business and Economics
Issue Date: 2007
Source: Enterprise Information Systems: 8th International Conference, ICEIS 2006, Paphos, Cyprus, May 23-27, 2006, Revised Selected Papers, pp. 405-419
Abstract: We describe a contact-less Human Computer Interaction (HCI) system that aims to give paraplegics the opportunity to use computers without the need for additional invasive hardware. The proposed system is multi-modal, combining visual and speech input. Visual input is provided through a standard web camera that captures face images of the computer user. Image processing techniques track head movements, so that head motion can be used to interact with the computer. Speech input is used to activate commonly used tasks that are normally triggered with the mouse or the keyboard. The performance of the proposed system was evaluated using a number of specially designed test applications. According to the quantitative results, most HCI tasks can be performed with the same ease and accuracy as with the touch pad of a portable computer.
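The visual modality described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation (which is not included in this record); it assumes OpenCV's Haar cascade face detector and the PyAutoGUI library, and maps the displacement of the detected face between webcam frames to relative cursor movement. The GAIN constant is a hypothetical tuning parameter.

    # Minimal illustrative sketch (assumption, not the paper's method):
    # track the face with OpenCV and move the cursor with PyAutoGUI.
    # Requires: opencv-python, pyautogui.
    import cv2
    import pyautogui

    GAIN = 2.0  # hypothetical gain from pixel displacement to cursor movement

    # Haar cascade face detector shipped with opencv-python.
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(0)  # standard web camera
    prev_center = None

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) > 0:
            x, y, w, h = faces[0]              # take the first detected face
            center = (x + w // 2, y + h // 2)  # face centre in image coordinates
            if prev_center is not None:
                dx = center[0] - prev_center[0]
                dy = center[1] - prev_center[1]
                # Move the cursor proportionally to the head displacement.
                pyautogui.moveRel(GAIN * dx, GAIN * dy)
            prev_center = center
        cv2.imshow("head tracking", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
            break

    cap.release()
    cv2.destroyAllWindows()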
URI: https://hdl.handle.net/20.500.14279/4060
ISBN: 978-3-540-77581-2 (Online)
DOI: 10.1007/978-3-540-77581-2_28
Rights: Springer Berlin Heidelberg
Type: Book Chapter
Affiliation: Cyprus College; Cyprus University of Technology
Appears in Collections: Book chapters

Items in KTISIS are protected by copyright, with all rights reserved, unless otherwise indicated.