Emotion plays an important role in human interactions. Our perceptions, decisions and interactions are all heavily influenced by our emotional state; strip the emotional part away from a human being and their behaviour seems flat and predictable.

Emotion therefore cannot be overlooked when developing believable intelligent virtual agents.

There is some existing work in this area, but most of it uses a heavily simplified model of emotion; compared to real human emotion, these models are too crude for an acceptable result to be expected. In an article published in The International Journal of Virtual Reality in 2008, the author proposed the following model for the emotion layer. The “Reflective Layer”, which is responsible for emotion state changes in this model, considers only three emotions: Shame, Love and Shock. These are not diverse enough to represent a believable emotional agent.

image from The International Journal of Virtual Reality, 2008

In this work I studied several philosophical and psychological topics. I then developed a multi-agent framework to simulate virtual agents and to test my new methods and models. For interactions I used the following adaptive behavioural model, proposed by Li and MacDonnell.

image from The International Journal of Virtual Reality, 2008

This model uses the well-known Five-Factor Model (FFM) of personality to represent an agent's characteristics, and in my opinion it is more than adequate for this purpose. I also added an agent construction module, which creates new agents with a unique set of FFM parameters based on the society they are born into. This makes the agents more communicative: because they share more common personality traits, they tend to form local social groups.
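To make the construction module concrete, here is a minimal sketch of the idea in Python (the framework's actual layers are Lua scripts). The sampling scheme, the `spread` parameter and the clamping are my own illustration of "unique parameters based on the society", not the module's real implementation:

```python
import random

# The five FFM factors: openness, conscientiousness, extraversion,
# agreeableness, neuroticism, each kept in [0, 1].
FFM_TRAITS = ["O", "C", "E", "A", "N"]

def make_agent(society_mean, spread=0.1, rng=random):
    """Sample a newborn agent's FFM parameters around its society's means."""
    traits = {}
    for t in FFM_TRAITS:
        value = rng.gauss(society_mean[t], spread)
        traits[t] = min(1.0, max(0.0, value))  # clamp to the valid range
    return traits

# A society whose members tend to be extraverted and agreeable: agents
# born into it inherit similar tendencies, with individual variation.
society = {"O": 0.5, "C": 0.5, "E": 0.8, "A": 0.7, "N": 0.3}
agent = make_agent(society)
```

Because every newborn agent is drawn from the same society-level distribution, nearby agents end up with similar trait profiles, which is what encourages the local social groups mentioned above.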
From an implementation point of view, the framework runs on PS3 hardware because of the multithreading capabilities of the Cell processor. The design is data-driven: each layer is a set of Lua scripts transferred to the PS3 over TCP.
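The script-transfer step can be sketched as follows. This is a loopback Python demo, not the framework's actual protocol: the wire format (raw Lua source over a single TCP connection) and the toy receiver standing in for the PS3-side runtime are both my assumptions:

```python
import socket
import threading

# Toy stand-in for the PS3-side runtime: accept one connection and read
# a Lua layer script sent as plain bytes.
received = []

def runtime(server):
    conn, _ = server.accept()
    with conn:
        chunks = []
        while data := conn.recv(4096):
            chunks.append(data)
        received.append(b"".join(chunks).decode())

server = socket.create_server(("127.0.0.1", 0))  # ephemeral port for the demo
thread = threading.Thread(target=runtime, args=(server,))
thread.start()

# Host side: push a (hypothetical) layer script to the agent runtime.
lua_script = 'function update(agent) agent.mood = "calm" end'
with socket.create_connection(server.getsockname()) as sock:
    sock.sendall(lua_script.encode())

thread.join()
server.close()
```

The appeal of this data-driven setup is that a layer's behaviour can be changed by shipping new Lua source to the console, without rebuilding or redeploying the engine itself.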

My main work was on the perception layer. It began with a simple question: what makes an agent like or hate something? That question led me to some interesting areas. Love is not created by simple sensing, with signals from our eyes and ears passed to our brain; it is widely accepted that love emerges through repeated, poorly understood communication between two people. A simple perception model of the kind widely used in multi-agent simulations therefore cannot produce believable emotion-based interaction.
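One way to capture this is to let affinity accumulate over a history of interactions instead of being set by a single percept. The update rule below and its parameters are entirely my own illustration, not the perception layer I was building:

```python
# Affinity ("like"/"dislike") builds up over repeated interactions
# rather than from one sensory event.

def update_affinity(affinity, interaction_quality, rate=0.2):
    """Move affinity toward the quality of the latest interaction.

    Both affinity and interaction_quality lie in [-1, 1]:
    -1 = strong dislike, +1 = strong liking.
    """
    return affinity + rate * (interaction_quality - affinity)

# Many small positive interactions slowly turn a neutral
# relationship into a warm one.
affinity = 0.0
for _ in range(10):
    affinity = update_affinity(affinity, 0.8)
```

The point of the sketch is the shape of the process: no single interaction creates liking, but a consistent history of them does, which is closer to how love forms than a one-shot perception rule.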

Unfortunately, this work is currently put on hold for some personal reasons.
