CAO: A Fully Automatic Emoticon Analysis System Based on Theory of Kinesics

5 Author(s): Ptaszynski, M. (Language Media Lab., Hokkaido Univ., Sapporo, Japan); Maciejewski, J.; Dybala, P.; Rzepka, R.; et al.

This paper presents CAO, a system for affect analysis of emoticons in Japanese online communication. Emoticons are strings of symbols widely used in text-based online communication to convey user emotions. The presented system extracts emoticons from input and determines the specific emotion types they express using a three-step procedure. First, it matches the extracted emoticons against a predetermined raw emoticon database. The database contains over 10,000 emoticon samples extracted from the Web and annotated automatically. Emoticons for which emotion types could not be determined using only this database are automatically divided into semantic areas representing “mouths” or “eyes,” based on the idea of kinemes from the theory of kinesics. The areas are automatically annotated according to their co-occurrence in the database. The annotation is based first on the eye-mouth-eye triplet; if no such triplet is found, all semantic areas are estimated separately. This provides hints about the potential groups of expressed emotions, giving the system coverage exceeding 3 million possibilities. The evaluation, performed on both training and test sets, confirmed the system's ability to detect and extract any emoticon, analyze its semantic structure, and estimate the potential emotion types expressed. The system achieved nearly ideal scores, outperforming existing emoticon analysis systems.
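To make the three-step procedure described above concrete, the following is a minimal sketch of the lookup order it implies: raw-database match, kineme-based division into semantic areas, then triplet or per-area co-occurrence annotation. The tiny databases, the eye-mouth-eye segmentation rule, and all names (RAW_DB, TRIPLET_DB, AREA_DB, analyze_emoticon) are hypothetical placeholders for illustration, not the paper's actual resources or code.

```python
# Hypothetical sketch of a CAO-style three-step emotion-type lookup.
# Databases and segmentation rule are toy stand-ins, not the paper's data.

import re

RAW_DB = {"(^_^)": ["joy"], "(T_T)": ["sadness"]}                       # full emoticon -> emotion types
TRIPLET_DB = {("^", "_", "^"): ["joy"], ("T", "_", "T"): ["sadness"]}   # (eye, mouth, eye) co-occurrence
AREA_DB = {"^": ["joy"], "T": ["sadness", "fear"], "_": ["neutral"]}    # per-area statistics

def analyze_emoticon(emoticon: str) -> list[str]:
    # Step 1: exact match against the raw emoticon database.
    if emoticon in RAW_DB:
        return RAW_DB[emoticon]

    # Step 2: divide the emoticon into semantic areas ("eyes" and "mouth");
    # a toy eye-mouth-eye pattern stands in for the kineme-based segmentation.
    m = re.match(r"^\(?(.)(.+?)(.)\)?$", emoticon)
    if not m:
        return []
    eye_l, mouth, eye_r = m.groups()

    # Step 3a: annotate via eye-mouth-eye triplet co-occurrence, if available.
    triplet = (eye_l, mouth, eye_r)
    if triplet in TRIPLET_DB:
        return TRIPLET_DB[triplet]

    # Step 3b: otherwise estimate each semantic area separately and merge the hints.
    hints: list[str] = []
    for area in triplet:
        hints.extend(AREA_DB.get(area, []))
    return sorted(set(hints))

if __name__ == "__main__":
    print(analyze_emoticon("(^_^)"))   # step 1 hit -> ['joy']
    print(analyze_emoticon("(^o^)"))   # unknown triplet -> per-area hints -> ['joy']
```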

Published in: IEEE Transactions on Affective Computing (Volume: 1, Issue: 1)