Conferences in Research and Practice in Information Technology
  

Online Version - Last Updated - 20 Jan 2012

 

 

Visualization of Music Impression in Facial Expression to Represent Emotion

Nakanishi, T. and Kitagawa, T.

    In this paper, we propose a method for visualizing the impression of music as a facial expression that represents emotion. We use facial expression because it can convey complicated, mixed emotions in which several basic emotions are combined. The method generates a facial expression corresponding to the impression of a piece of music by measuring the relationships between the basic emotions underlying facial expression and the impressions extracted from the music data. The key feature of this method is the effective integration of music data with facial expressions that convey a variety of emotions. An important issue in this area is the realization of communication media that correspond to human Kansei while imposing little burden on the user. If existing media data can be integrated with facial expression in this way, visualization corresponding to human Kansei is realized with little difficulty for the user.
Cite as: Nakanishi, T. and Kitagawa, T. (2006). Visualization of Music Impression in Facial Expression to Represent Emotion. In Proc. Third Asia-Pacific Conference on Conceptual Modelling (APCCM2006), Hobart, Australia. CRPIT, 53. Stumptner, M., Hartmann, S. and Kiyoki, Y., Eds. ACS. 55-64.
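The abstract describes mapping impressions extracted from music onto weights over basic emotions, which are then rendered as a mixed facial expression. The paper itself does not publish code; the following is a minimal illustrative sketch of that mapping idea, in which the impression words, emotion labels, and all matrix values are assumptions for demonstration only, not figures from the paper.

```python
# Hypothetical sketch: project a music-impression vector onto basic-emotion
# axes via an assumed impression/emotion relationship matrix, then normalise
# the result into weights for a mixed facial expression.

BASIC_EMOTIONS = ["happiness", "sadness", "anger", "surprise"]
IMPRESSIONS = ["bright", "calm", "dark", "intense"]

# Rows: impression words; columns: basic emotions.
# Each entry is an assumed strength of association in [0, 1].
RELATION = [
    [0.9, 0.0, 0.0, 0.3],  # bright
    [0.4, 0.3, 0.0, 0.0],  # calm
    [0.0, 0.8, 0.2, 0.0],  # dark
    [0.1, 0.0, 0.7, 0.6],  # intense
]

def emotion_weights(impression_vector):
    """Project an impression vector onto the basic-emotion axes and
    normalise so the weights sum to 1 (a mixed expression)."""
    raw = [
        sum(impression_vector[i] * RELATION[i][j]
            for i in range(len(IMPRESSIONS)))
        for j in range(len(BASIC_EMOTIONS))
    ]
    total = sum(raw) or 1.0
    return {e: w / total for e, w in zip(BASIC_EMOTIONS, raw)}

# Example: a piece rated mostly "bright" with a little "intense".
weights = emotion_weights([0.8, 0.2, 0.0, 0.3])
print(max(weights, key=weights.get))  # dominant basic emotion
```

A real system would derive the relationship matrix from data (the paper measures these relationships) rather than hard-coding it, and the resulting weight vector would drive a facial-expression renderer.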
 

 

© Copyright Australian Computer Society Inc. 2001-2014.
Comments should be sent to the webmaster at crpit@scem.uws.edu.au.
This page last updated 16 Nov 2007