Poster Sessions


4 September 2019
POSTER SESSION DAY 1: Doctoral Consortium and Track 1
| Authors | Title | ID | Category |
| --- | --- | --- | --- |
| Rebecca Stower | The Role of Trust and Social Behaviours in Children’s Learning from Social Robots | D1-1-DC | DC |
| Maria Elena Lechuga Redondo | Comfortability Detection for Adaptive Human-Robot Interactions | D1-2-DC | DC |
| Wail Elbani | Social touch: stimuli-imitation protocol and automated recognition | D1-3-DC | DC |
| Jonny O’Dwyer | Speech, Head, and Eye-based Cues for Continuous Affect Prediction | D1-4-DC | DC |
| Chongyang Wang | Automatic Detection of Protective Movement Behavior with MoCap and sEMG Data for Chronic Pain Rehabilitation | D1-5-DC | DC |
| Tao Bi | Wearable Sensing Technology for Recognizing and Sharing Emotional Experiences of Running – from a Phenomenological Computing Perspective | D1-6-DC | DC |
| Le Yang | Multi-Modal Depression Detection and Estimation | D1-7-DC | DC |
| Deniece Nazareth | Emotion Recognition in Dementia: Advancing Technology for Multimodal Analysis of Emotion Expression in Everyday Life | D1-8-DC | DC |
| Angela Chan | “I Need a Hug Right Now”: Affective Support Through Remote Touch Technology | D1-9-DC | DC |
| Everlyne Kimani | A Sensor-based Framework for Real-time Detection and Alleviation of Public Speaking Anxiety | D1-10-DC | DC |
| Saurabh Hinduja | Mitigating the Bias in Empathy Detection | D1-11-DC | DC |
| Germán Ruiz Marcos | An Investigation on the Automatic Generation of Music and Its Application into Video Games | D1-12-DC | DC |
| Noé Tits | A Methodology for Controlling the Emotional Expressiveness in Synthetic Speech – a Deep Learning Approach | D1-13-DC | DC |
| Ricardo Rodrigues | Enriching Discrete Actions with Impactful Emotions | D1-14-DC | DC |
| Georgiana Cristina Dobre | Using Machine Learning to Generate Engaging Behaviours in Immersive Virtual Environments | D1-15-DC | DC |
| Judith Ley-Flores, Frederic Bevilacqua, Nadia Bianchi-Berthouze and Ana Tajadura-Jiménez | Altering Body Perception and Emotion in Physically Inactive People through Movement Sonification | D1-16-T1 | TBBD’19 |
| Batuhan Sayis, Ciera Crowell, Juan Benitez, Rafael Ramirez and Narcis Pares | Computational Modeling of Psycho-physiological Arousal and Social Initiation of Children with Autism in Interventions through Full-Body Interaction | D1-17-T1 | TBBD’19 |
| Marine Taffou, Clara Suied and Isabelle Viaud-Delmon | Emotion Induced by Auditory Roughness Modulates Multisensory Integration in Relation with the Body | D1-18-T1 | TBBD’19 |
| Pardis Miri, Katherine Isbister, James J. Gross and Keith Marzullo | Affect Regulation Using Technology: Lessons Learned by Taking a Multidisciplinary Perspective | D1-19-T1 | TBBD’19 |
| Heleen Boers and Emmanuel Rios Velazquez | Multisensory Wearable Device to Measure Effect of Human Interaction on Stress Level | D1-21-T1 | TBBD’19 |
| Jelena Rosic, Ilkka Kosunen and Pia Tikka | From Lived Experience to Empirical Data – Steps towards the Science of Embodied Cognition | D1-23-T1 | TBBD’19 |
| Regine Zopf, Kelsie Boulton, Patrick Nalepka and Anina Rich | A Whole New Ball Game: What Bodily Interactions with Virtual Objects Can Tell Us about Body Size Representations and Distortions | D1-24-T1 | TBBD’19 |
| Iseline Peyre, Frédéric Bevilacqua, Véronique Marchand Pauvert, Agnès Roby Brami and Pascale Pradat Diehl | Exploring Gesture-Sound Coupling for Motor Recovery | D1-26-T1 | TBBD’19 |
| Marina Scattolin, Maria Serena Panasiti and Salvatore Maria Aglioti | The Role of Bodily Self Consciousness in Morality and (Dis)honest Behaviors | D1-27-T1 | TBBD’19 |


5 September 2019
POSTER SESSION DAY 2
| Authors | Title | ID | Category |
| --- | --- | --- | --- |
| Asma Ghandeharioun, Daniel McDuff, Mary Czerwinski and Kael Rowan | Towards Understanding Emotional Intelligence for Conducting Experience Sampling | D2-1 | Dataset |
| Hao-Chun Yang and Chi-Chun Lee | Annotation Matters: A Comprehensive Study on Recognizing Intended, Self-reported, and Observed Emotion Labels using Physiology | D2-2 | Dataset |
| Nicolas Beaudoin-Gagnon, Alexis Fortin-Côté, Cindy Chamberland, Ludovic Lefebvre, Jérémy Bergeron-Boucher, Alexandre Campeau-Lecours, Sébastien Tremblay and Philip L. Jackson | The FUNii Database: A Physiological, Behavioral, Demographic and Subjective Video Game Database for Affective Gaming and Player Experience Research | D2-3 | Dataset |
| Judy Hanwen Shen, Agata Lapedriza and Rosalind W. Picard | Unintentional Affective Priming during Labeling May Bias Labels | D2-4 | Dataset |
| Ajjen Joshi, Youssef Attia and Taniya Mishra | Protocol for Eliciting Driver Frustration in an In-vehicle Environment | D2-5 | Dataset |
| Jiahui Hu, Bing Yu, Yun Yang and Bailan Feng | Towards Facial De-Expression and Expression Recognition in the Wild | D2-6 | Facial Exp |
| Dominik Seuß, Anja Dieckmann, Teena Chakkalayil Hassan, Jens-Uwe Garbas, Johann Heinrich Ellgring, Marcello Mortillaro and Klaus Scherer | Emotion Expression from Different Angles: A Video Database of Facial Expressions of Actors Shot by a Camera Array | D2-7 | Facial Exp |
| Min Peng, Chongyang Wang, Tao Bi, Tong Chen, Xiangdong Zhou and Yu Shi | A Novel Apex-Time Network for Cross-Dataset Micro-Expression Recognition | D2-8 | Facial Exp |
| Mani Kumar Tellamekala and Michel Valstar | Temporally Coherent Visual Representations for Dimensional Affect Recognition | D2-9 | Facial Exp |
| Alireza Sepas-Moghaddam, Paulo Lobato Correia, Fernando Pereira and Ali Etemad | A Deep Framework for Facial Emotion Recognition using Light Field Images | D2-10 | Facial Exp |
| Jeffrey Girard, Gayatri Shandar, Zhun Liu, Jeffrey Cohn, Lijun Yin and Louis-Philippe Morency | Reconsidering the Duchenne Smile: An Observational Investigation of the Prototypical Expression of Positive Emotion | D2-11 | Facial Exp |
| Jian Huang, Jianhua Tao, Bin Liu, Zheng Lian and Mingyue Niu | Efficient Modeling of Long Temporal Contexts for Continuous Emotion Recognition | D2-12 | Multimodal |
| Md Kamrul Hasan, Taylan Sen, Yiming Yang, Raiyan Abdul Baten, Kurtis Glenn Haut and Mohammed Ehsan Hoque | LIWC into the Eyes: Using Facial Features to Contextualize Linguistic Analysis in Multimodal Communication | D2-13 | Multimodal |
| Meishu Song, Zijiang Yang, Alice Baird, Emilia Parada-Cabaleiro, Zixing Zhang, Ziping Zhao and Björn Schuller | Audiovisual Analysis for Recognising Frustration during Game-Play: Introducing the Multimodal Game Frustration Database | D2-14 | Multimodal |
| Nathan Henderson, Andrew Emerson, Jonathan Rowe and James Lester | Improving Sensor-based Affect Detection with Multimodal Data Imputation | D2-15 | Multimodal |
| Eddie Huang, Hannah Valdiviejas and Nigel Bosch | I’m Sure! Automatic Detection of Metacognition in Online Course Discussion Forums | D2-16 | Multimodal |
| Benjamin Ma, Timothy Greer, Matthew Sachs, Assal Habibi, Jonas Kaplan and Shrikanth Narayanan | Predicting Human-Reported Enjoyment Responses in Happy and Sad Music | D2-17 | Music |
| Léo Hemamou, Ghazi Felhi, Jean-Claude Martin and Chloé Clavel | Slices of Attention in Asynchronous Video Job Interviews | D2-18 | Upskilling-Workforce |
| Raiyan Abdul Baten, Famous Clark and Mohammed Ehsan Hoque | Upskilling Together: How Peer-interaction Influences Speaking-skills Development Online | D2-19 | Upskilling-Workforce |
| Katie Seaborn, Nina Lee, Marla Narazani and Atsushi Hiyama | Intergenerational Shared Action Games for Promoting Empathy between Japanese Youth and Elders | D2-20 | Health |
| Shogo Okada, Ken Inoue, Toru Imai, Mami Moguchi and Kaiko Kuwamura | Classification of Dementia Scale Based on Ubiquitous Daily Activity and Interaction Sensing | D2-21 | Health |
| Belén Saldías F. and Rosalind W. Picard | Tweet Moodifier: Giving Emotional Awareness to Twitter Users | D2-22 | Health |
| Pablo Barros, Nikhil Churamani, Angelica Lim and Stefan Wermter | The OMG-Empathy Dataset: Evaluating the Impact of Affective Behavior in Storytelling | D2-23 | Health |
| Sara Evensen, Yoshihiko Suhara, Alon Halevy, Vivian Li, Wang-Chiew Tan and Saran Mumick | Happiness Entailment: Automating Suggestions for Well-Being | D2-24 | Health |
| Florian Grond, Ariel Cascio, Rossio Motta Ochoa, Tamar Tembeck and Stefanie Blain-Moraes | Participatory Design of Biomusic with Users on the Autism Spectrum | D2-25 | Health |
| Mohammad Rafayet Ali, Taylan Sen, Viet-Duy Nguyen, Reza Rawassizadeh, Paul Duberstein, Ronald Epstein and M Ehsan Hoque | What Computers Can Teach Us About Doctor-Patient Communication: Leveraging Gender Differences in Cancer Care | D2-26 | Health |
| Nadine Aburumman, Antonia Hamilton and Marco Gillies | Being with a Virtual Character: Nonverbal Communication in Virtual Reality | D2-27-T2 | NPMA’19 |
| Jin Hyun Cheong, Zainab Molani, Sushmita Sadhukha and Luke Chang | Synchronized Emotions during Shared Experiences Increase Social Connection | D2-28-T2 | NPMA’19 |


6 September 2019
POSTER SESSION DAY 3
| Authors | Title | ID | Category |
| --- | --- | --- | --- |
| Erica Volta, Radoslaw Niewiadomski, Temitayo Olugbade, Carla Gilio, Elena Cocchi, Nadia Berthouze, Monica Gori and Gualtiero Volpe | Analysis of Cognitive States during Bodily Exploration of Mathematical Concepts in Visually Impaired Children | D3-1 | Movement |
| Takashi Yamauchi, Anton Leontyev and Moein Razavi | Assessing Emotion by Mouse-cursor Tracking: Theoretical and Empirical Rationales | D3-2 | Movement |
| Jonny O’Dwyer, Ronan Flynn and Niall Murray | Eye-based Continuous Affect Prediction | D3-3 | Movement |
| Jian Shen, Xiaowei Zhang, Junlei Li, Yuanxi Li, Lei Feng, Changqing Hu, Zhijie Ding, Gangping Wang and Bin Hu | Affective Auditory Stimuli Induced Depression Detection from Electroencephalogram Signals | D3-4 | Physiological |
| Monica Perusquía-Hernández, Saho Ayabe-Kanamura and Kenji Suzuki | Posed and Spontaneous Smile Assessment with Wearable Skin Conductance Measured from the Neck | D3-5 | Physiological |
| Daniel McDuff and Jeffrey Girard | Democratizing Psychological Insights from Analysis of Nonverbal Behavior | D3-6 | Physiological |
| Nattapong Thammasan, Ivo Stuldreher, Dagmar Wismeijer, Mannes Poel, Jan van Erp and Anne-Marie Brouwer | A Novel, Simple and Objective Method to Detect Movement Artefacts in Electrodermal Activity | D3-7 | Physiological |
| Ross Harper and Joshua Southern | End-To-End Prediction of Emotion From Heartbeat Data Collected by a Consumer Fitness Tracker | D3-8 | Physiological |
| Youngjun Cho, Nadia Bianchi-Berthouze, Manuel Oliveira, Catherine Holloway and Simon Julier | Nose Heat: Exploring Stress-induced Nasal Thermal Variability through Mobile Thermal Imaging | D3-9 | Physiological |
| Grace Leslie, Asma Ghandeharioun, Diane Zhou and Rosalind W. Picard | Engineering Music to Slow Breathing and Invite Relaxed Physiology | D3-10 | Physiological |
| Kushal Chawla, Sopan Khosla and Niyati Chhaya | Pre-trained Affective Word Representations | D3-11 | Speech |
| Mia Atcheson, Vidhyasaharan Sethu and Julien Epps | Using Gaussian Processes with LSTM Neural Networks to Predict Continuous-Time, Dimensional Emotion in Ambiguous Speech | D3-12 | Speech |
| Deboshree Bose, Ting Dang, Vidhyasaharan Sethu, Eliathamby Ambikairajah and Sarith Fernando | A Novel Bag-of-Optimised-Clusters Front-End for Speech based Continuous Emotion Prediction | D3-13 | Speech |
| Angela Chan, Niloofar Zarei, Takashi Yamauchi, Francis Quek and Jinsil Seo | Touch Media: Investigating the Effects of Remote Touch on Music-based Emotion Elicitation | D3-14 | Touch |
| Héctor López-Carral, Diogo Santos-Pata, Riccardo Zucca and Paul Verschure | How You Type Is What You Type: Keystroke Dynamics Correlate with Affective Content | D3-15 | Touch |
| Surjya Ghosh, Shivam Goenka, Niloy Ganguly, Bivas Mitra and Pradipta De | Representation Learning for Emotion Recognition from Smartphone Keyboard Interactions | D3-16 | Touch |
| Tipporn Laohakangvalvit, Michiko Ohkura and Tiranee Achalakul | Comparison on Evaluation of Kawaiiness of Cosmetic Bottles between Japanese and Thai People | D3-17 | Culture |
| Everlyne Kimani, Kael Rowan, Daniel McDuff, Mary Czerwinski and Gloria Mark | A Conversational Agent in Support of Productivity and Wellbeing at Work | D3-18 | Agent |
| Asma Ghandeharioun, Daniel McDuff, Mary Czerwinski and Kael Rowan | EMMA: An Emotion-Aware Wellbeing Chatbot | D3-19 | Agent |
| Tanja Schneeberger, Mirella Scholtes, Bernhard Hilpert, Markus Langer and Patrick Gebhard | Can Social Agents Elicit Shame as Humans Do? | D3-20 | Agent |
| Samiha Samrose, Wenyi Chu, Carolina He, Yuebai Gao, Syeda Sarah Shahrin, Zhen Bai and Mohammed Ehsan Hoque | Visual Cues for Disrespectful Conversation Analysis | D3-21 | Ordinal session |
| Rifca Peters, Joost Broekens, Mark A. Neerincx and Kangqi Li | Robots Expressing Dominance: Effects of Behaviours and Modulation | D3-22 | Robot session |
| Samuel Spaulding and Cynthia Breazeal | Frustratingly Easy Personalization for Real-time Affect Interpretation of Facial Expression | D3-23 | Robot session |
| Frank Kaptein, Joost Broekens, Koen Hindriks and Mark Neerincx | Evaluating Cognitive and Affective Intelligent Agent Explanations in a Long-Term Health-Support Application for Children with Diabetes Type 1 | D3-24 | Robot session |
| Morten Roed Frederiksen and Kasper Stoy | Augmenting the Affective Audio Capabilities of a Non-affective Robot | D3-25 | Robot session |
| Jimin Rhim, Anthony Cheung, David Pham, Subin Bae, Zhitian Zhang, Trista Townsend and Angelica Lim | Investigating Positive Psychology Principles in Affective Robotics | D3-26 | Robot session |
| Matthew Lewis and Lola Cañamero | A Robot Model of Stress-Induced Compulsive Behavior | D3-27 | Robot session |