Tuesday, April 5, 2011

Paper Reading #20 - Data-Centric Physiology

Comments:
Comment 1
Comment 2

References:
Title: Addressing the Problems of Data-Centric Physiology-Affect Relations Modeling
Authors: Roberto Legaspi, Ken-ichi Fukui, Koichi Moriyama, Satoshi Kurihara, Masayuki Numao, and Merlin Suarez
Venue: IUI 2010, February 7-10, 2010

Summary:
In this paper, the authors describe a new method of analyzing people's emotions -- or affect -- from physiological data. They describe some of the problems with current emotion modeling approaches, such as the time it takes to analyze the data, and explain how they believe these can be improved by changing how the data is analyzed.

They argue that analyzing the physiological data continuously, over its entire range, will produce better results than the current method of classifying emotions into a few discrete categories. To test this, they analyzed the emotional changes of two subjects by attaching the sensors shown in the paper's figures and playing music that affected them emotionally.

Then, they describe in detail the algorithms behind their continuous analysis, and show that it is as fast as discrete analysis and should provide better results in certain situations.
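As a rough illustration of the discrete-vs-continuous distinction, here is my own toy sketch in Python -- it is not the authors' actual algorithm, and the simulated signal, the window size, and the "positive/negative" labels are all made-up assumptions:

    # A toy contrast between discrete and continuous affect estimation over a
    # simulated physiological signal. Not the paper's algorithm; every name
    # and threshold here is an illustrative assumption.
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated feature stream (e.g., skin conductance): 60 s sampled at 10 Hz.
    t = np.linspace(0, 60, 600)
    signal = np.sin(t / 10) + 0.1 * rng.standard_normal(t.size)

    def discrete_affect(sig, window=50):
        # Classify each fixed window into one coarse emotion class.
        labels = []
        for start in range(0, sig.size, window):
            mean = sig[start:start + window].mean()
            labels.append("positive" if mean > 0 else "negative")
        return labels  # one label per 5 s window -- coarse, step-like output

    def continuous_affect(sig, window=50):
        # Estimate an affect value at every sample with a moving average.
        kernel = np.ones(window) / window
        return np.convolve(sig, kernel, mode="same")  # one value per sample

    print(discrete_affect(signal))        # ['positive', 'positive', ..., 'negative']
    print(continuous_affect(signal)[:5])  # smooth, sample-level estimates

The discrete version throws away everything between window boundaries, while the continuous version keeps an estimate at every sample -- which is roughly the trade-off I understood the authors to be arguing about.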

Discussion:
To be perfectly honest, this paper was so difficult to read that I'm not sure I took the correct analysis away from it. It took me half the paper to figure out what they were trying to do with the emotion readings, and I am still not entirely sure what the point was.

Additionally, I am curious what the benefits of sensing users' emotions are, especially if it requires the elaborate equipment described in the paper. I have seen some cool little games that used the user's emotions to modify gameplay, but no other real applications.


4 comments:

  1. I can see where this paper would be super hard to read, but the concept is very cool if they can pull it off. I can think of tons of ways computers could use emotion sensing, and yes games are one of them, but what about things like gauging the ease of use of your UI? If you could measure emotion, you could measure how frustrated or happy someone gets by using your interface.

  2. I'm with Stephen. There are a lot of ways I could see computers using reliable emotion reading, the most obvious being more realistic imitation of human-to-human interaction - but maybe that's the A.I. class I took talking.

  3. I too had a ridiculously hard time reading this. I even tried twice. I'm glad I'm not the only one who couldn't get it. I really don't get what this paper is aimed at solving, on a high level.

  4. Wow, everyone had a hard time with this paper. Basically, if people are able to detect emotions from facial expressions and voice, then a computer should be able to as well. They just want to refine the results by using brainwaves to narrow down the ranges in recognition.
