Emotional Avatars & Characters in Virtual Reality Environments

*Sponsored Content

Video game characters have become increasingly realistic. Modern AIs can do many interesting things, but their behaviour still differs markedly from how real people act.

A Recent Study on Artificial Intelligence

Academics from the Technical University of Munich have started a project that seeks to place neural agents in a virtual reality environment. These avatars will be able to perform everyday tasks such as moving and jumping, and will also exhibit facial expressions.

Their virtual world has a basic physics engine similar to those found in most video games. The distinguishing feature of this work is that artificial brains control the avatars: a neuro-simulation cluster emulates each synthetic mind, running on NVIDIA GPGPU cards with a neural simulation package named ANNarchy.
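To give a flavour of what such a simulator integrates, here is a minimal sketch of a rate-coded population of leaky neurons in plain Python. This is an illustrative model in the same spirit as what ANNarchy computes, not the actual ANNarchy API; the function name and parameters are my own.

```python
import numpy as np

# Illustrative sketch (not the real ANNarchy API): a rate-coded
# population of leaky neurons obeying
#   tau * dr/dt = -r + relu(W @ r + input)
# integrated with simple Euler steps.
def simulate_population(W, ext_input, steps=100, tau=10.0, dt=1.0):
    """Integrate the firing rates of a recurrently connected population."""
    r = np.zeros(W.shape[0])
    for _ in range(steps):
        dr = (-r + np.maximum(W @ r + ext_input, 0.0)) / tau
        r = r + dt * dr
    return r

# Tiny example: three neurons with weak recurrent excitation.
W = np.full((3, 3), 0.1)
np.fill_diagonal(W, 0.0)  # no self-connections
rates = simulate_population(W, ext_input=np.array([1.0, 0.5, 0.0]))
```

Even this toy network settles into a steady state where externally driven neurons pull their neighbours above zero through the recurrent weights, which is the kind of dynamics a GPU cluster lets you scale up to many thousands of units.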

This program emulates biologically plausible neural networks. There will also be an immersive way for human users to interact with the digital creations. Placing motion sensors on a person's face and body could help the machine detect changes in behaviour, which should allow the avatars to refine how they react to a subject's emotions and other situations over time.
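The adaptation loop described above can be sketched very simply: sensor readings nudge the avatar's internal emotional estimate, which in turn selects an expression. Everything here, including the function names and the scalar emotion model, is a hypothetical illustration rather than anything from the actual project.

```python
# Hypothetical sketch of the feedback loop: motion-sensor readings
# drive an avatar's internal emotion estimate, which adapts over time.

def update_emotion(state, sensor_value, learning_rate=0.2):
    """Nudge the emotion estimate toward the latest sensor reading
    (an exponential moving average)."""
    return state + learning_rate * (sensor_value - state)

def choose_expression(state):
    """Map the scalar emotion estimate to a facial expression."""
    if state > 0.5:
        return "smile"
    if state < -0.5:
        return "frown"
    return "neutral"

# Feed a stream of readings that trend positive.
state = 0.0
for reading in [0.2, 0.6, 0.9, 1.0, 1.0]:
    state = update_emotion(state, reading)

print(choose_expression(state))  # prints "smile"
```

A real system would replace the scalar with a learned model over many sensor channels, but the structure, estimate then react then update, stays the same.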

Potential Outcomes & Consequences of Such Studies

Empathic computers in augmented reality are a potential outcome, and perhaps cloud servers could stream such capabilities to portable devices in the future. Emotional processes are sometimes devalued, but they are an important aspect of intelligence: dysregulation of these perceptions can cause serious problems for an organism, and feelings are a method the mind uses to guide behaviour efficiently.

It appears that the scientists are combining data from experiments on visual perception with fMRI scans to create higher-resolution computational models of how regions in the brain process information. The academics are studying both conscious and unconscious aspects of sight. The EYESHOTS programme is a related undertaking that focuses on the connections between vision and motor control, which should enable flexible movement in unstructured environments.

Nextos has recently been praised for its outstanding virtual assistant for the Microsoft Windows and Apple Mac OS operating systems. It is well worth looking into if you are interested in the different ways AI is being used commercially.

The Main Goal & Future Obstacles

The main goal of this work is to understand group dynamics. It is challenging to simulate aggregate human behaviour accurately enough to predict events; the economic crisis showed what a monumental task it can be to get a handle on markets. While forecasting those sorts of conditions with high fidelity is not going to happen any time soon, there are numerous applications for these virtual agents.

It probably won’t be easy to coax useful routines out of these synthetic beings, and there has been plenty of hype around neural networks in the past. Perhaps additions such as modelling various neurotransmitters could make these artificial brains more realistic. Unravelling the neurological correlates of different emotions would go a long way towards creating superior artificial intelligence.

Had your tech fill and need a video game break? Why not check out our list of the best upcoming Xbox One games of 2017? It’s looking to be an exciting year when it comes to new releases.

This article was a guest post from Sandeep Khan. Sandeep enjoys writing about anything related to technology, computer programming, and artificial intelligence. When not working full-time as a web developer, Sandeep can be found tinkering with the newest gadgets hitting our stores every year.

Gung-ho! Gaming wants to connect with you, yes, we mean you! Like OfficalGHG on Facebook and follow @RealGH_Gaming on Twitter for regular updates on new and exciting content.
