PerFRDiff: Personalised weight editing for multiple appropriate facial reaction generation
Zhu, H; Kong, X; Xie, W; Huang, X; Shen, L; Liu, L; Gunes, H; Song, S
Date: 28 October 2024
Conference paper
Publisher
Association for Computing Machinery
Abstract
Human facial reactions play crucial roles in dyadic human-human interactions, where individuals (i.e., listeners) with varying cognitive styles may display different but appropriate facial reactions in response to an identical behaviour expressed by their conversational partners. While several existing facial reaction generation approaches are capable of generating multiple appropriate facial reactions (AFRs) in response to each given human behaviour, they fail to take the human's personalised cognitive process into account when generating AFRs. In this paper, we propose the first online personalised multiple appropriate facial reaction generation (MAFRG) approach, which learns a unique personalised cognitive style from the target human listener's previous facial behaviours and represents it as a set of network weight shifts. These personalised weight shifts are then applied to edit the weights of a pre-trained generic MAFRG model, allowing the resulting personalised model to naturally mimic the target human listener's cognitive process in its reasoning for multiple AFR generation. Experimental results show that our approach not only substantially outperforms all existing approaches in generating more appropriate and diverse generic AFRs, but also serves as the first reliable personalised MAFRG solution.
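The core personalisation mechanism described in the abstract — applying listener-specific weight shifts to a pre-trained generic model — can be sketched as follows. This is a hypothetical illustration of weight editing in general, not the authors' implementation; the function name, the dictionary-of-weights representation, and the `scale` parameter are all assumptions.

```python
def apply_weight_shifts(generic_weights, weight_shifts, scale=1.0):
    """Produce personalised weights = generic weights + scaled shifts.

    generic_weights: dict mapping layer name -> list of weight values
    weight_shifts:   dict mapping layer name -> list of learned shifts
                     (layers without a shift keep their generic weights)
    """
    return {
        name: [
            w + scale * s
            for w, s in zip(weights, weight_shifts.get(name, [0.0] * len(weights)))
        ]
        for name, weights in generic_weights.items()
    }


# Toy example: shifts learned from one listener's past facial behaviour
# edit only "layer1"; "layer2" is left generic.
generic = {"layer1": [0.5, -0.25], "layer2": [1.0, 0.3]}
shifts = {"layer1": [0.25, 0.25]}
personalised = apply_weight_shifts(generic, shifts)
```

In practice such shifts would be predicted per listener by a separate network and added to the weight tensors of the generic MAFRG model, leaving the generic model itself untouched so that a fresh personalised copy can be produced for each listener.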
Computer Science
Faculty of Environment, Science and Economy
Except where otherwise noted, this item's licence is described as ©2024 Copyright held by the owner/author(s). Publication rights licensed to ACM.