Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14279/3555
Title: Human action analysis, annotation and modeling in video streams based on implicit user interaction
Authors: Tsapatsoulis, Nicolas 
Ntalianis, Klimis S. 
Doulamis, Anastasios D. 
Contributor (other): Τσαπατσούλης, Νικόλας
Major Field of Science: Social Sciences
Field Category: Media and Communications
Keywords: Computer science;Multimedia systems;Semantics
Issue Date: 2008
Source: AREA '08: Proceedings of the 1st ACM Workshop on Analysis and Retrieval of Events/Actions and Workflows in Video Streams, pages 65-72
Abstract: This paper proposes an integrated framework for analyzing human actions in video streams. In contrast to most current approaches, which rely solely on automatic spatiotemporal analysis of sequences, the proposed method introduces the implicit user-in-the-loop concept for dynamically mining semantics and annotating video streams. This work sets a new and ambitious goal: to recognize, model and properly use the "average user's" selections, preferences and perception for dynamically extracting content semantics. The proposed approach is expected to add significant value to the hundreds of billions of non-annotated or inadequately annotated video streams existing on the Web, file servers, databases, etc. Furthermore, expert annotators can gain important knowledge about user preferences, selections, styles of searching and perception.
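
The abstract's central idea, mining annotations from implicit user interaction rather than from automatic spatiotemporal analysis alone, can be illustrated with a minimal Python sketch. The class below simply accumulates evidence that a query term describes a video segment whenever a user selects that segment after searching with the term. The class name, the click-count weighting and the support threshold are assumptions made for illustration only and do not reproduce the model described in the paper.

    from collections import defaultdict

    class ImplicitAnnotator:
        """Accumulates soft annotations for video segments from implicit user feedback.

        Hypothetical sketch: segment IDs, query terms and the click-count weighting
        are illustrative assumptions, not the weighting scheme used in the paper.
        """

        def __init__(self):
            # term -> segment_id -> accumulated evidence from user selections
            self.evidence = defaultdict(lambda: defaultdict(float))

        def record_selection(self, query_terms, segment_id, weight=1.0):
            """A user selected `segment_id` after searching with `query_terms`."""
            for term in query_terms:
                self.evidence[term.lower()][segment_id] += weight

        def annotations(self, segment_id, min_support=2.0):
            """Return terms whose accumulated evidence for this segment exceeds a threshold."""
            return sorted(
                term for term, segments in self.evidence.items()
                if segments.get(segment_id, 0.0) >= min_support
            )

    if __name__ == "__main__":
        annotator = ImplicitAnnotator()
        # Simulated interactions: several users search "running outdoor" and pick segment "clip_042".
        for _ in range(3):
            annotator.record_selection(["running", "outdoor"], "clip_042")
        print(annotator.annotations("clip_042"))  # ['outdoor', 'running']
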
URI: https://hdl.handle.net/20.500.14279/3555
ISBN: 978-1-60558-318-1
DOI: 10.1145/1463542.1463554
Rights: Copyright 2008 ACM
Type: Book Chapter
Affiliation: Technical University of Crete
Cyprus University of Technology
National Technical University of Athens
Appears in Collections: Κεφάλαια βιβλίων / Book chapters
