Sharing problems with another human, be that a friend, family member, or colleague, is something we do frequently. Whether it’s a moan about your commute to work or a gripe about a noisy neighbour, it’s common to share these dissatisfactions without hesitation. However, when problems become more emotionally sensitive and serious, many people struggle to open up to others. As AI continues to advance, emotionally intelligent AI is starting to make an appearance to help overcome these issues, from assisting with customer service grumbles to the deeper level of providing mental health support.

Andrew McStay, Professor of Digital Life at Bangor University, is currently working on emotional intelligence in AI and exploring both the benefits and concerns of emotional life becoming machine-readable. In his own studies, Andrew interviewed around 100 leading companies and organisations interested in the use of affective computing, to think through the implications for emotional life and what responsible innovation in this sector might consist of. He asked, ‘Who is actually using emotion-sensitive technology?’ The answer was a wide range of companies and researchers, including advertisers, artists and a very diverse set of sectors keenly interested in emotional life.

At the AI Assistant Summit in London this September, Andrew shared his current research, and you can watch his presentation here:

Andrew looked back at studies on Eliza, an early AI from the 1960s designed to play the role of a Rogerian psychotherapist. Its developer, Joseph Weizenbaum, found that people formed remarkably strong emotional connections with Eliza, and was shocked by the extent to which users anthropomorphised the program. A recent report by the startup Cognia found that people are actually willing to speak to an AI assistant for longer than to a human, and that they volunteer far more personal, intimate secrets to AI assistants.

Andrew believes our own behaviour primes these systems to respond to us in an appropriate way. This has real significance for marketing and advertising, and he foresees much more investment in AI from marketers, investors and retailers in the future.

These systems introduce new social and economic questions: ‘Is it desirable that machines are able to sense and respond to human emotional life? What are the implications of this?’ We are talking about making people and emotional life machine-readable.

His recent survey showed that 53.42% of the population are not comfortable with voice-based emotion detection, and that older people were less inclined to use it than younger people.

Andrew believes developers need to listen to citizens, who are really not keen on having personally identifiable information connected with their emotions. He suggests consulting them and bringing them in before the technology is designed, and also wants developers to consider notions of privacy by design.

Andrew will be joining RE•WORK again at the AI Assistant Summit in San Francisco this January 25 & 26, where he will further discuss his work and share his most recent research findings. Register now to guarantee your place at the summit.