Multi-Task Learning for Jointly Detecting Depression and Emotion

Depression is a common mood disorder that causes a persistent feeling of sadness and a loss of interest and pleasure. Emotion is therefore closely entangled with depression, in that understanding one helps the understanding of the other. Jointly detecting depression and emotion has emerged as a new research task, and its central challenges are multi-modal interaction and multi-task correlation. Existing approaches treat depression detection and emotion recognition as two separate tasks and fail to model the relationship between them. In this paper, we propose an attentive multi-modal multi-task learning framework, called AMM, to address these issues in a generic way. Its core modules are two attention mechanisms, viz. the inter-modal (I_e) and inter-task (I_t) attentions. The I_e attention learns a fused multi-modal representation, whereas the I_t attention learns the relationship between depression detection and emotion recognition. Extensive experiments are conducted on two large-scale datasets, i.e., DAIC and the multi-modal Getty Image depression (MGID) dataset. The results show the effectiveness of the proposed AMM framework and also show that AMM obtains better performance on the main task, depression detection, with the help of the secondary emotion recognition task.
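The abstract does not give implementation details, so the sketch below is only one plausible reading of the described architecture: per-modality features fused by an inter-modal attention, and two task heads (depression detection as the main task, emotion recognition as the secondary task) coupled by an inter-task attention. The class name, feature dimensions, and the use of torch.nn.MultiheadAttention are illustrative assumptions, not the authors' AMM implementation.

```python
# Hypothetical sketch (not the authors' AMM code): inter-modal attention fuses
# modality features; inter-task attention lets the two task representations
# interact before the depression and emotion heads.
import torch
import torch.nn as nn


class AttentiveMultiTaskSketch(nn.Module):
    def __init__(self, modality_dims=(128, 128), hidden=256, n_emotions=6):
        super().__init__()
        # Project each modality (e.g. audio, text) into a shared space.
        self.proj = nn.ModuleList([nn.Linear(d, hidden) for d in modality_dims])
        # Inter-modal attention: score each modality, take a weighted sum.
        self.modal_score = nn.Linear(hidden, 1)
        # One representation per task, coupled by inter-task attention.
        self.task_proj = nn.ModuleList([nn.Linear(hidden, hidden) for _ in range(2)])
        self.task_attn = nn.MultiheadAttention(hidden, num_heads=4, batch_first=True)
        # Task heads: depression detection (main), emotion recognition (secondary).
        self.depression_head = nn.Linear(hidden, 2)
        self.emotion_head = nn.Linear(hidden, n_emotions)

    def forward(self, modalities):
        # modalities: list of (batch, dim_i) tensors, one per modality.
        h = torch.stack([p(x) for p, x in zip(self.proj, modalities)], dim=1)  # (B, M, H)
        w = torch.softmax(self.modal_score(torch.tanh(h)), dim=1)              # (B, M, 1)
        fused = (w * h).sum(dim=1)                                             # (B, H)
        # Task-specific representations derived from the fused features.
        tasks = torch.stack([p(fused) for p in self.task_proj], dim=1)         # (B, 2, H)
        # Inter-task attention: each task representation attends to both tasks.
        attended, _ = self.task_attn(tasks, tasks, tasks)                      # (B, 2, H)
        return self.depression_head(attended[:, 0]), self.emotion_head(attended[:, 1])


if __name__ == "__main__":
    model = AttentiveMultiTaskSketch()
    audio, text = torch.randn(4, 128), torch.randn(4, 128)
    dep_logits, emo_logits = model([audio, text])
    # Joint training would sum a depression loss and an emotion loss.
    print(dep_logits.shape, emo_logits.shape)  # torch.Size([4, 2]) torch.Size([4, 6])
```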
