Avatars as communicators of emotions

10.07.2008
Current interactive systems let users communicate with computers in many ways, but they generally fail to take emotional communication into account. A PhD thesis presented at the University of the Basque Country proposes the use of avatars, or virtual Internet characters, as an efficient form of non-verbal communication, focusing principally on emotional aspects.

Scientists have been working for decades to make interaction between people and computers more natural and intuitive. Indeed, a great part of the success or failure of a computer application depends on its user interface.

The way in which we communicate with the operating system, for example, has progressed considerably from the days when complicated command lines had to be typed on a black-and-white screen to today's far more intuitive windowed interfaces. There are now systems that combine three-dimensional graphics, computer vision and speech technologies, known as multimodal interaction systems. Among these, the most common are speech synthesisers and recognisers, which let the user communicate with the machine using spoken language.

Nevertheless, considering how people interact in everyday life, this technology should also take non-verbal communication into account, i.e. facial expressions and body gestures. The successful reception of a face-to-face message depends 7% on the words used, 38% on how the voice is used (tone and volume) and 55% on gestures and facial expressions. This is why Ms Amalia Ortiz Nicolás considers it fundamental to include modules in multimodal interfaces that can interpret and generate non-verbal, and specifically emotional, communication.

In her PhD thesis, Avatars for emotional interaction, the use of avatars (virtual characters) is proposed as one of the best ways for computer systems to convey non-verbal information. Ms Amalia Ortiz holds a PhD in computer science and currently works at the VICOMTech-IK4 technology centre. Her research was directed by Dr. Néstor Garay-Vitoria and Dr. Maria Teresa Linaza of the Computer Sciences and Artificial Intelligence Department of the Computer Science Faculty of the University of the Basque Country (UPV/EHU).

Tools for creating emotions

Ms Ortiz's thesis focuses on the emotional and affective aspects of non-verbal communication. Her basic aim was to determine whether avatars are capable of communicating emotions. As Dr. Ortiz points out in her thesis, an avatar is a virtual person that gives a system an appearance (face, eyes, body, voice, and so on) and a behaviour that emulates interaction between people.

After studying existing computer architectures and observing that all of them focus on specific cases of interaction, Dr. Ortiz designed a generic architecture capable of supporting any type of emotional interaction using avatars. She also designed the tools required for both the user and the creator of the interactive system to generate and express emotions easily and intuitively: a tool for generating avatars without any prior knowledge, a system based on cognitive models of emotion, and an interpreting module that translates commands into facial expressions. Once the different emotions had been generated, Dr. Ortiz implemented an animation model that enabled the avatar to display each emotion together with its corresponding intensity.
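As a purely illustrative sketch of what such an animation model might look like, the Python snippet below maps a discrete emotion and an intensity value onto facial blend-shape weights. The emotion set, the blend-shape names and the AvatarFace class are hypothetical assumptions made for this example; they are not taken from the thesis itself.

# Illustrative sketch only: a minimal emotion-to-facial-expression mapping,
# loosely inspired by the animation model described above. All names
# (Emotion, BLEND_SHAPE_TARGETS, AvatarFace) are hypothetical.
from dataclasses import dataclass
from enum import Enum


class Emotion(Enum):
    JOY = "joy"
    SADNESS = "sadness"
    ANGER = "anger"
    SURPRISE = "surprise"


# Hypothetical target values for a few facial blend shapes at full intensity.
BLEND_SHAPE_TARGETS = {
    Emotion.JOY:      {"mouth_smile": 1.0, "eye_open": 0.6, "brow_raise": 0.3},
    Emotion.SADNESS:  {"mouth_frown": 0.8, "eye_open": 0.4, "brow_inner_up": 0.7},
    Emotion.ANGER:    {"brow_lower": 0.9, "mouth_press": 0.7, "eye_open": 0.8},
    Emotion.SURPRISE: {"brow_raise": 1.0, "eye_open": 1.0, "jaw_open": 0.6},
}


@dataclass
class AvatarFace:
    """Current facial state as blend-shape weights in the range [0, 1]."""
    weights: dict

    def show_emotion(self, emotion: Emotion, intensity: float) -> None:
        """Scale the emotion's full expression by its intensity (0 = neutral, 1 = full)."""
        intensity = max(0.0, min(1.0, intensity))
        for shape, full_value in BLEND_SHAPE_TARGETS[emotion].items():
            self.weights[shape] = full_value * intensity


face = AvatarFace(weights={})
face.show_emotion(Emotion.JOY, 0.5)   # a moderate smile
print(face.weights)                   # {'mouth_smile': 0.5, 'eye_open': 0.3, 'brow_raise': 0.15}

Scaling the full-intensity expression by a value between 0 and 1 is one simple way to realise the idea of showing each emotion together with its corresponding intensity.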

Testing emotional interaction

With the aim of applying her studies to a real case, Ms Ortiz validated the generic architecture by creating several applications that incorporate all the tools developed for generating and expressing emotions: the IGARTUBEITI, AVACHAT, SASTEC and ELEIN programmes, designed at VICOMTech-IK4.

The IGARTUBEITI system offers a virtual journey through history by means of digital storytelling: history is experienced, and transmitted through emotions, via three-dimensional graphical reconstructions and a virtual guide who explains the historical context. AVACHAT is a chat system in which users communicate with each other not only through text but also through three-dimensional avatars, both verbally and non-verbally. The SASTEC system, in turn, aims to provide a series of support mechanisms for users with cognitive disabilities.

To this end, a series of memory exercises has been designed for these users in a way that encourages and enhances their autonomy; the system's user interfaces are built around emotional avatars. Finally, the ELEIN programme, aimed at e-learning environments, seeks to provide a novel way of delivering educational content over the Internet. It offers a three-dimensional didactic agent able to speak in real time and to present the complete contents of courses.

After these applications were evaluated by end users, Dr. Ortiz concluded that users prefer interaction using avatars because they consider it more pleasant, user-friendly and entertaining. She also observed that users were able to recognise most facial emotions and expressions, and that they felt information was better explained when an emotional avatar provided it. According to the data gathered, users of the online ELEIN course answered 13% more questions correctly when the concepts had previously been explained by an avatar, and the number of correct responses increased by a further 10% when the concepts had been explained by an emotional avatar.

Dr. Ortiz believes it would be worthwhile to add a module with an emotional voice synthesiser, since users' evaluations would be more positive if the avatar's tone of voice were in harmony with its facial expression. This PhD thesis, defended at the UPV/EHU, thus opens up new possibilities for voice synthesisers capable of simulating emotions and for body animation modules that express emotions through gestures.

Alaitz Ochoa de Eribe | alfa
Further information:
http://www.elhuyar.com
http://www.basqueresearch.com/berria_irakurri.asp?Berri_Kod=1822&hizk=I
