Identifying Emotions Aroused from Paintings

Xin Lu
Adobe Systems Inc.

Neela Sawant
Amazon.com, Inc.

Michelle G. Newman, Reginald B. Adams, Jr., James Z. Wang, Jia Li
The Pennsylvania State University

Abstract:

Understanding the emotional appeal of paintings is a significant research problem related to affective image classification. The problem is challenging in part because manually labeled paintings are scarce. We propose to apply statistical models trained on photographs to infer the emotional appeal of paintings. Directly applying models learned from photographs to paintings does not yield accurate classifications, because visual features extracted from paintings and from natural photographs have different characteristics. This work presents an adaptive learning algorithm that leverages labeled photographs and unlabeled paintings to infer the emotional appeal of paintings. In particular, we iteratively adapt the feature distribution of the photographs to fit the paintings, maximizing the joint likelihood of the labeled and unlabeled data. We evaluate our approach on two emotion classification tasks: distinguishing positive from negative emotions, and differentiating reactive from non-reactive emotions. Experimental results show the potential of our approach.
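The paper's actual model is not reproduced on this page. As a rough illustration of the iterative adaptation idea described in the abstract, the Python sketch below fits class-conditional feature models on labeled photographs and then refines them with EM-style soft labels on unlabeled paintings, so that the joint likelihood of both sets is locally maximized. The diagonal-Gaussian class model, the function names, and all other details are illustrative assumptions, not the authors' implementation.

    # Hypothetical sketch (not the paper's algorithm): adapt class-conditional
    # Gaussian feature models from labeled photographs toward the unlabeled
    # painting distribution via EM, maximizing the joint likelihood.
    import numpy as np

    def fit_gaussians(X, resp):
        """Weighted diagonal-Gaussian fit per class. resp: (n, k) weights."""
        k = resp.shape[1]
        priors = resp.sum(axis=0) / resp.sum()
        means, variances = [], []
        for c in range(k):
            w = resp[:, c][:, None]
            mu = (w * X).sum(axis=0) / w.sum()
            var = (w * (X - mu) ** 2).sum(axis=0) / w.sum() + 1e-6
            means.append(mu)
            variances.append(var)
        return priors, np.array(means), np.array(variances)

    def log_gauss(X, mu, var):
        """Log-density of a diagonal Gaussian, summed over feature dims."""
        return -0.5 * (np.log(2 * np.pi * var) + (X - mu) ** 2 / var).sum(axis=1)

    def adapt(X_photo, y_photo, X_paint, n_iter=20):
        k = y_photo.max() + 1
        resp_l = np.eye(k)[y_photo]                       # hard labels, photos
        priors, mu, var = fit_gaussians(X_photo, resp_l)  # init on photographs
        for _ in range(n_iter):
            # E-step: posterior class responsibilities for unlabeled paintings
            ll = np.stack([np.log(priors[c]) + log_gauss(X_paint, mu[c], var[c])
                           for c in range(k)], axis=1)
            resp_u = np.exp(ll - ll.max(axis=1, keepdims=True))
            resp_u /= resp_u.sum(axis=1, keepdims=True)
            # M-step: refit on photos plus softly-labeled paintings, shifting
            # the class models toward the painting feature distribution
            X_all = np.vstack([X_photo, X_paint])
            resp_all = np.vstack([resp_l, resp_u])
            priors, mu, var = fit_gaussians(X_all, resp_all)
        return priors, mu, var, resp_u

With binary labels (e.g., positive vs. negative emotion), the returned resp_u gives soft class predictions for each painting; thresholding at 0.5 yields the classification.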


Full-color PDF file (25 MB)

Full-color PDF file (low resolution, 3.3 MB)


Citation: Xin Lu, Neela Sawant, Michelle G. Newman, Reginald B. Adams, Jr., James Z. Wang and Jia Li, "Identifying Emotions Aroused from Paintings," Lecture Notes in Computer Science, vol. 9913, G. Hua and H. Jegou (eds.), Proceedings of the Workshop on Visual Analysis of Sketches, in conjunction with the European Conference on Computer Vision, pp. 48-63, Springer, 2016.

Copyright 2016 Springer-Verlag. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes, for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from Springer-Verlag.

Last Modified: July 29, 2016
© 2016