Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14279/13445
Title: A Recurrent Latent Variable Model for Supervised Modeling of High-Dimensional Sequential Data
Authors: Christodoulou, Panayiotis 
Chatzis, Sotirios P. 
Andreou, Andreas S. 
Other Contributors: Χατζής, Σωτήριος Π.
Ανδρέου, Ανδρέας Σ.
Χριστοδούλου, Παναγιώτης
Major Field of Science: Engineering and Technology
Field Category: Computer and Information Sciences;Electrical Engineering - Electronic Engineering - Information Engineering
Keywords: Amortized variational inference;High-dimensional sequences;Predictive modeling;Recurrent latent variable
Issue Date: Jul-2018
Source: IEEE International Conference on Innovations in Intelligent Systems and Applications, 2018, 3-5 July, Thessaloniki, Greece
Conference: IEEE (SMC) International Conference on Innovations in Intelligent Systems and Applications, INISTA 2018 
Abstract: In this work, we attempt to ameliorate the impact of data sparsity in the context of supervised modeling applications dealing with high-dimensional sequential data. Specifically, we seek to devise a machine learning mechanism capable of extracting subtle and complex underlying temporal dynamics in the observed sequential data, so as to inform the predictive algorithm. To this end, we improve upon systems that utilize deep learning techniques with recurrently connected units; we do so by adopting concepts from the field of Bayesian statistics, namely variational inference. Our proposed approach consists of treating the network recurrent units as stochastic latent variables with a prior distribution imposed over them. On this basis, we proceed to infer the corresponding posteriors; these can be used for prediction generation in a way that accounts for the uncertainty in the available sparse training data. To allow our approach to scale easily to large real-world datasets, we perform inference under an approximate amortized variational inference (AVI) setup, whereby the learned posteriors are parameterized via (conventional) neural networks. We perform an extensive experimental evaluation of our approach using challenging benchmark datasets, and illustrate its superiority over existing state-of-the-art techniques.
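The abstract's core mechanism — recurrent units treated as stochastic latent variables with an imposed prior, and posteriors parameterized by a neural network under an amortized variational inference setup — can be sketched as follows. This is an illustrative toy forward pass only, not the paper's actual architecture: all weight names, dimensions, the standard-normal prior, and the single-linear-layer inference network are assumptions made for the sketch.

```python
import numpy as np

# Hypothetical sketch: the recurrent state consumes a latent z_t ~ q(z_t | x_t, h_{t-1}),
# where q is a diagonal Gaussian whose parameters are amortized (produced by a
# shared network) rather than optimized per time step.

rng = np.random.default_rng(0)
x_dim, h_dim, z_dim = 8, 16, 4

# Amortized inference network: linear maps to posterior mean and log-variance.
W_mu = rng.normal(scale=0.1, size=(z_dim, x_dim + h_dim))
W_lv = rng.normal(scale=0.1, size=(z_dim, x_dim + h_dim))
# Deterministic recurrence that consumes the sampled latent.
W_h = rng.normal(scale=0.1, size=(h_dim, x_dim + z_dim + h_dim))

def step(x_t, h_prev):
    """One recurrent step: infer q(z_t), sample via the reparameterization
    trick, update the hidden state, and return the KL(q || N(0, I)) term
    that would enter a variational training objective."""
    inp = np.concatenate([x_t, h_prev])
    mu, logvar = W_mu @ inp, W_lv @ inp
    eps = rng.standard_normal(z_dim)
    z_t = mu + np.exp(0.5 * logvar) * eps          # reparameterization trick
    h_t = np.tanh(W_h @ np.concatenate([x_t, z_t, h_prev]))
    # Closed-form KL between a diagonal Gaussian and the N(0, I) prior.
    kl = 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar)
    return h_t, z_t, kl

h = np.zeros(h_dim)
total_kl = 0.0
for t in range(10):                                # toy length-10 sequence
    x_t = rng.standard_normal(x_dim)
    h, z, kl = step(x_t, h)
    total_kl += kl
```

At training time, the per-step KL terms would be summed with a reconstruction or prediction likelihood to form the variational lower bound; at prediction time, sampling z_t propagates posterior uncertainty into the output, which is what lets the model account for sparse training data.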
URI: https://hdl.handle.net/20.500.14279/13445
ISBN: 978-1-5386-5150-6
DOI: 10.1109/INISTA.2018.8466296
Rights: © 2018 IEEE
Type: Conference Papers
Affiliation: Cyprus University of Technology
Publication Type: Peer Reviewed
Appears in Collections: Δημοσιεύσεις σε συνέδρια / Conference papers or posters or presentations

Items in KTISIS are protected by copyright, with all rights reserved, unless otherwise indicated.