My computer tells me you’re bored - should you care?

Category: AI
Sector: Media & Technology
Type: Opinion
The use of facial and emotional recognition still splits opinion, but with transparency it could bring positive possibilities and insights to brands.

Imagine being in a Zoom meeting and being bored. It’s not a huge stretch. Now, what would happen if the person giving the presentation stopped, called out your name and asked you to pay attention – because they have proof your mind is elsewhere?

Facial recognition software is now at the stage where expressions that align with a drifting mind can be identified. This might be great for self-driving cars that need human oversight but not so great when it comes to civil liberties and companies tracking our every emotion.

Yet emotional facial recognition technology is already widely used, with real influence on how we work, pitch ideas, meet and teach. In customer service and call centres, tracking how a conversation or a phone call develops is nothing new. Today, staff can view real-time feedback on both their own and the customer’s tone, interest and boredom. Indeed, Harvard Business Review reports that emotionally connected customers are more than twice as valuable as highly satisfied customers, while the market for emotional facial recognition is anticipated to be worth £42bn ($56bn, €47bn) by 2024, according to Markets & Markets.

If our conversations are already being steered by machines, should we embrace this when communicating over live video, to facilitate a more emotionally connected experience? Those of us working remotely could certainly benefit from automated feedback; most of us need body language and other non-verbal cues to up our game when addressing a group. Consider the students who designed and prototyped Mood for Zoom – a real-time platform that helps presenters read the virtual room more easily.

But here is where the predicament lies. Telling people you are monitoring their reaction can alter their response, while not telling them raises red flags. It may take many years for us to relax under recorded conditions, though it’s possible that habit or convenience will influence this over time, with simple buttons allowing us to agree or decline.

Published: 26 November 2020

Author: Marius Bartsch

Image: Design Yourself: Evasive Techniques by York Collective, using AR filters to investigate facial recognition technology.



I Never Found Her by Erica Scourti. Scourti’s work explored Optical Character Recognition software, interrogating how identity and human understanding are influenced by these now-everyday filters.

Further, in an increasingly mobile society, it’s not surprising that our devices are getting to know us, too. According to Gartner, by 2022 our phones may know more about our emotional state than our family members. This could be pretty useful for voice tech and virtual personal assistants, learning when to prompt, interrupt and help while we are on the go. But again, to work efficiently, we must grant AI and machine learning systems permissions to record and analyse our behaviours, moods and communications.

In this vein, Digitas has been working on a proprietary emotion detection tool called Emotion XD. It’s currently limited to helping clients extract insight from voice and text interactions to improve customer experience. The exciting thing, as this technology develops, is seeing where the initial boundaries of acceptance lie. We are also exploring the ecosystem of ethical data use – for example, if and where recordings are stored, for how long, what they are used for and how they may benefit end users as well as businesses. The aim is always to improve experiences for customers, with their consent and with transparency. So far, we are seeing valuable feedback.

Getting permission and finding out how people feel about emotion tracking is complex and interesting work. Above all else, we need to think about how people are informed. They must have the ability to opt out and we must ensure they understand what the data is being used for and who can access it – the basic tenets of our digital lives, even as evolution continues apace.

Marius Bartsch is head of customer engagement at Digitas, a global marketing and technology agency that transforms businesses for the digital age.

‘For virtual personal assistants, emotional recognition means they can learn when to prompt, interrupt and help while we are on the go’
Marius Bartsch, head of customer engagement, Digitas
