Title: Model-based generation of personalized full-body 3D avatars from uncalibrated multi-view photographs

Authors: Michael, Nicholas

Keywords: 3D body shape modeling; Animation-ready avatars; Collaborative virtual reality; Multi-view active shape model; Personalized avatars

Category: Computer and Information Sciences

Field: Natural Sciences

Issue Date: Jun-2017

Publisher: Springer New York LLC

Source: Multimedia Tools and Applications, 2017, vol. 76, no. 12, pp. 14169-14195

Journal: Multimedia Tools and Applications

Abstract: According to a number of studies, the use of personalized avatars in virtual environments can enhance the immersion experience of users and the effectiveness of communication between different players. The benefits of using personalized avatars, in conjunction with recent technological developments in Virtual Reality (VR), have prompted an increasing demand for low-cost systems capable of fast and easy generation of personal avatars for use in VR applications. In this paper we present a novel model-based technique that generates personalized full-body 3D avatars from orthogonal photographs. The proposed method utilizes a statistical model of human 3D shape and a multi-view statistical 2D shape model of its corresponding silhouettes. Our technique is automatic, requires minimal user intervention, and does not need a calibrated camera. Each component of the proposed technique is extensively evaluated and validated to test the geometric accuracy and identifiability of the generated avatars. Furthermore, we demonstrate the use of the proposed method for generating and importing animation-ready avatars in collaborative VR environments.

ISSN: 1380-7501

DOI: 10.1007/s11042-016-3808-1

Collaboration: Cyprus University of Technology

Rights: © Springer

Type: Article

Appears in Collections: Άρθρα/Articles
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.