Models of Social Intelligence

Part of Project Lirec: Notes (and bits pasted) from Deliverable 5.1

An understanding of social intelligence is needed for creating agents capable of long term companionship with humans.

What is social intelligence?

  • Begins in childhood with attachment and love for caregivers
  • Later extends to family, friends and co-workers
  • Relationships grow and fade as time passes

A long term artificial companion needs to apply the same strategies for creating and maintaining relationships if it's to be effective.

Emotions and personality need to be expressed for people to form an empathic relationship with an agent.

  • Disney animation has been used as an example in research.
  • The theory is that actions lead to feelings, which lead to personality and perceived emotional state

Companions need to have a consistent personality if they are to maintain long term relationships.

See also: Theory of Mind in Robotics

A theory of mind in a robot is a model of a user's (or another agent's) emotional state, gained by measuring affective state from sensor input - such as facial expression or tone of voice. In this way an agent may attempt to build an empathic relationship with a human. Such a model must include a user's:

  • Emotional state
  • Goals
  • Intentions
  • Beliefs
  • Personality

A companion needs to adapt and evolve based on past experiences. The companion must adapt to a user's personality in order to match it better as time passes.
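As a minimal sketch, the theory-of-mind components listed above could be held in a single user-model structure that is updated as new affect readings arrive. All names here are illustrative assumptions, not taken from LIREC:

```python
from dataclasses import dataclass, field

# Hypothetical user model holding the five components listed above.
@dataclass
class UserModel:
    emotional_state: dict = field(default_factory=dict)  # e.g. {"joy": 0.4}
    goals: list = field(default_factory=list)
    intentions: list = field(default_factory=list)
    beliefs: dict = field(default_factory=dict)
    personality: dict = field(default_factory=dict)       # e.g. trait scores

    def update_affect(self, emotion, evidence, rate=0.3):
        """Blend a new sensor reading (facial expression, tone of voice)
        into the stored estimate, so the model adapts over time."""
        old = self.emotional_state.get(emotion, 0.0)
        self.emotional_state[emotion] = (1 - rate) * old + rate * evidence

model = UserModel()
model.update_affect("joy", 1.0)
model.update_affect("joy", 1.0)
print(round(model.emotional_state["joy"], 2))  # 0.51
```

The blending rate is a stand-in for whatever adaptation scheme a real companion would use; the point is only that the estimate moves gradually rather than jumping to each new reading.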

Socially intelligent agents

Human intelligence includes both:

  • Efficient problem solving
  • Social and emotional intelligence

Socially intelligent agents need to present the appearance of social understanding, and must also recognise other agents or humans in order to establish relationships with them.

This social information is part of a robot's environment, and sensed through its understanding of the outside world.

*The expectation of social interaction with an agent depends on its form: a human form is more natural, but also creates very high expectations, which cannot currently be met.*

  1. Goal oriented agents: the action the agent takes in the world (in other words, its behaviour) is aimed at producing some result.
  2. Interference and dependence: implies that there must be interference among the actions and goals of the agents, i.e., the effects of the action of one agent are relevant for the goals of another.
  3. Mindreading: representation of both beliefs and goals of the minds of other agents. This concept is frequently mentioned as Theory of Mind.
  4. Coordination: when two agents coordinate their behaviour without influencing each other and without any explicit communication.
  5. Delegation: an agent x needs or likes an action from agent y and includes it in its own plan. Delegation (or relying on another) presupposes trust among the agents.
  6. Goals about others' actions/goals: besides having beliefs about other agents' goals, agents can also have goals about the minds of others. For instance, one agent might have the goal of changing something in another agent's mind.
  7. Social goal adoption: when the mind of an agent x changes because of a goal of another agent y; in other words, when agent y succeeds in the task of changing agent x's mind.
  8. Joint action: when two or more agents (socially) commit to each other and adopt the same goal.
  9. Social structures and organisation: when a group of agents, with different goals and limited abilities and resources, interact in the same environment, a dependence structure emerges. Such emergence is what makes social goals evolve or be derived.
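Point 9 can be illustrated with a toy computation: given each agent's goals and abilities, a dependence structure falls out of who wants what they cannot do themselves. The agents and goals here are invented for the example:

```python
# Toy world: which agents depend on which others, and for what?
agents = {
    "x": {"goals": {"open_door"}, "abilities": {"push"}},
    "y": {"goals": {"get_food"},  "abilities": {"open_door"}},
    "z": {"goals": set(),         "abilities": {"get_food"}},
}

def dependence_structure(agents):
    """Agent a depends on agent b for goal g when a wants g,
    cannot achieve it itself, but b has the ability to."""
    deps = []
    for a, spec in agents.items():
        for g in spec["goals"] - spec["abilities"]:
            for b, other in agents.items():
                if b != a and g in other["abilities"]:
                    deps.append((a, b, g))
    return deps

print(dependence_structure(agents))
# [('x', 'y', 'open_door'), ('y', 'z', 'get_food')]
```

This is only the emergence step; deriving social goals from the resulting structure (delegation, adoption, joint action) is where the earlier points come in.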

Establishing relationships might be easy - for instance, it's quite possible for people to have relationships of a sort with inanimate objects - but maintaining and developing a relationship is harder.

Bickmore & Picard - relationship maintenance strategies, based on human-human relationship maintenance:

  • Performing new activities together, as long as both parties agree to the new activity.
  • Meta-relational communication, i.e., talk about the relationship.
  • Empathy, defined as the process of attending to, understanding and responding to another person's emotive expressions.
  • Reciprocal self-disclosure, which may increase intimacy, trust, closeness and liking.
  • Use of humour.
  • Talk about the past and the future together, and reference mutual knowledge in conversations.
  • Continuity behaviours to properly bridge the time people are apart, such as greetings and farewells.
  • Emphasize commonalities and de-emphasize differences, to increase solidarity and rapport.
Social power - the influence of a social agent on a person:

  • The social agent could be another person, a social role, norm or group
  • Social power can be resisted
  • The amount of influence of a social power can be controlled

Affective ties of one person or agent with others:

  • Can be positive or negative
  • Not necessarily reciprocal
  • But they are often balanced

The theory states that people tend to avoid unstable cognitive configurations. For instance, if agent A knows and likes agent B, and they both are aware of, and have positive feelings towards, object C, then there is balance; similarly if they both have negative feelings towards object C. If they disagree then there is imbalance, which can be resolved from agent A's perspective by one of three steps:

  • Agent A switches to dislike of agent B
  • Agent A decides to change its mind and agree with agent B about object C
  • Agent A attempts to change agent B's mind, in order to make it agree about object C

The last option takes more work than the other two, and so there is also a concept of cost for maintaining social balance.
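Heider's balance rule for the A-B-C triad above reduces to a sign check: the triad is balanced when the product of the three attitude signs is positive. A minimal sketch:

```python
# Balance rule for a triad: +1 = likes/positive, -1 = dislikes/negative.
def balanced(a_likes_b, a_likes_c, b_likes_c):
    return a_likes_b * a_likes_c * b_likes_c > 0

# A likes B, A likes C, but B dislikes C: imbalance.
print(balanced(+1, +1, -1))  # False

# The three resolution options, each restoring balance:
print(balanced(-1, +1, -1))  # True - A switches to disliking B
print(balanced(+1, -1, -1))  # True - A changes its mind about C
print(balanced(+1, +1, +1))  # True - A persuades B about C (costliest)
```

A cost model for maintaining balance would then weight these options, with persuasion (the third) carrying the highest cost.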

Examples of Socially Intelligent Agents

Embodied conversational agents - these use face-to-face conversation in an attempt to simplify human/computer communication.

REA: The automated real estate agent: attempts to sell people houses, keeps track of task talk and small talk, and can interleave them.

Laura & FitTrack: An exercise advisor designed to explore long term relationships. Tested with 100 people; users given the relationship-building features were “more likely” to continue using the system.

Avatar Arena: Multiple characters interacting with each other on the user's behalf.

2SGD Model: Agents for interacting with groups.

Sociable physical robots

Kismet: Animatronic head which responds to people's actions, and is capable of “proto-conversation” through babbling.

Valerie the roboceptionist.

Emotions and personality in Social Agents

Theoretical foundations start with Darwin's study of the expression of emotions in humans and animals. The James-Lange theory of emotion maps emotions to physiological responses.

Appraisal theories state that emotions are the result of evaluating an event or situation:

  1. Primary appraisal: Is this relevant to me?
  2. Secondary appraisal: How should I cope with it?
Timescales:

  Emotions            Seconds/Minutes/Hours
  Moods               Hours/Days/Weeks/Months
  Personality traits  Years/Lifetime
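The two appraisal stages above can be sketched as a single function; the event fields and coping labels here are illustrative assumptions rather than any standard implementation:

```python
# Two-stage appraisal: relevance check, then coping choice.
def appraise(event, my_goals):
    # Primary appraisal: is this relevant to me?
    relevant = any(g in event["affects"] for g in my_goals)
    if not relevant:
        return "ignore"
    # Secondary appraisal: how should I cope with it?
    # (problem-focused if the situation can be changed, emotion-focused otherwise)
    return "problem-focused" if event["controllable"] else "emotion-focused"

event = {"affects": {"stay_healthy"}, "controllable": False}
print(appraise(event, {"stay_healthy", "finish_report"}))  # emotion-focused
```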

(Moffat, 1997) states, “personality is the name we give to (an agent’s) reaction tendencies that are consistent over situations and time”.

Examples of computational models of emotions


FAtiMA: Used in FearNot! (is this part of ION?)

  • Events → Appraisal → Emotions
  • Appraisal based on
    • Goals
    • Standards
    • Attitudes
  • Emotions have intensity - attenuated in time
  • Mood - overall state of emotions
  • Mood also affects emotions

Personality is a combination of:

  • Emotional threshold - a character's resistance to an emotion type
  • Decay rate - how fast an emotion disappears

Processing flow: event or action → knowledge base update → emotion → autobiographic memory
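The intensity, threshold and decay mechanics above can be sketched as follows; exponential decay is one common choice, and FAtiMA's exact formula may differ:

```python
import math

# Hedged sketch of emotion intensity with per-character threshold and decay.
class Emotion:
    def __init__(self, kind, intensity, threshold, decay_rate):
        self.kind = kind
        self.intensity = intensity
        self.threshold = threshold    # character's resistance to this emotion type
        self.decay_rate = decay_rate  # how fast the emotion disappears

    def step(self, dt):
        # Attenuate intensity over time.
        self.intensity *= math.exp(-self.decay_rate * dt)

    def active(self):
        # The emotion only shows once intensity exceeds the threshold.
        return self.intensity > self.threshold

fear = Emotion("fear", intensity=0.8, threshold=0.3, decay_rate=0.5)
print(fear.active())  # True
for _ in range(3):
    fear.step(dt=1.0)
print(fear.active())  # False
```

Personality then emerges from the per-character threshold and decay settings: a timid character would have a low fear threshold and a slow fear decay rate.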



Basic needs:

  • food
  • water
  • physical integrity
  • sexuality
  • affiliation (social need)
  • certainty and competence

Needs are given weights. Behaviour operates at a reactive level and at a deliberative level, which is more goal oriented.
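One way to read "needs given weights" is that the deliberative level selects goals by weighted urgency; the weights and deficit values below are invented for illustration:

```python
# Invented per-need weights (importance) and current deficits (how unmet).
needs = {
    "food": 0.8, "water": 1.0, "physical integrity": 1.0,
    "sexuality": 0.3, "affiliation": 0.6, "certainty and competence": 0.5,
}
deficits = {
    "food": 0.7, "water": 0.2, "physical integrity": 0.0,
    "sexuality": 0.1, "affiliation": 0.9, "certainty and competence": 0.4,
}

# Urgency = weight * deficit; the deliberative level would pick a goal
# serving the most urgent need.
urgency = {n: needs[n] * deficits[n] for n in needs}
print(max(urgency, key=urgency.get))  # food
```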

More info:

  • models_of_social_intelligence.txt
  • Last modified: 2009-01-19 11:59
  • by davegriffiths