Brain-machine interfaces and emotional AI are real enough to be undergoing trials. Petra Stock considers what they mean for the future – and whether they’re needed at all.
What if technology could tap into a raw data feed of your feelings and manifest them in a kind of digital aura? If anger and annoyance agitated as red asymmetric fractals; or peace and calm floated by in blue, infinite loops?
That possibility is here in a device that looks like a 1960s swimming cap worn with ski goggles. Called Neo-Noumena, it’s designed to “read” its wearer’s emotions, representing them as a digital halo of spirographic shapes.
“Neo-Noumena was an attempt to expand how much information people can pass to each other in communicating about their emotions,” says its creator, neurotechnologist Dr Nathan Semertzidis.
Since he was little, Semertzidis has been enthralled by bionics – the use of technologies to extend the human body and mind. His interest is partly informed by personal experience – he has a genetic hearing loss and wears hearing aids – along with a childhood fascination for science fiction.
“I wanted to work on technologies that interface the human brain with computers, to exchange information between the two, and to further that dream of becoming a cyborg that I had when I was a kid,” he says.
Music makers, dreamers of dreams
Semertzidis configured Neo-Noumena at the Wonka-esque Exertion Games Lab, where he works as a research fellow. Ensconced in Monash University’s Clayton campus in Melbourne’s south-east, the lab’s bright and airy warehouse contains all manner of contraptions and gadgets. There’s a bouldering wall that can tilt based on a climber’s breathing patterns, banana-yellow high jump mats, a crate of pool noodles and a floatation tank, along with a series of outlandish-looking electrode hats and augmented and virtual reality (VR) glasses.
Play and experimentation underpin the research philosophy here, as a dozen or so researchers tinker with futuristic technologies that can interface directly with the human brain or body. The goal they’re working towards is a singular human-computer entity.
Semertzidis’ retro-futuristic headset is one of a set of brain-computer interfaces he’s designed. There’s also Inter-Dream, a VR system to facilitate sleepiness; and PsiNet, a kind of ‘hive mind’ designed to amplify shared brain activity in couples or groups.
Neo-Noumena’s high-tech headgear is made from three pieces of kit.
First, a mobile electroencephalogram (EEG) device monitors brain activity via a series of black electrodes polka-dotted over the white mesh skull cap. The headset tracks electrical activity in the brain, data which previous studies have correlated with emotional states.
EEG is not a new technology. It’s been used as a diagnostic tool in neuroscience and psychiatry for decades. But recent years have seen the hardware become more affordable and widely accessible through initiatives like OpenBCI. These days anyone can pick up a neurotechnologist’s starter kit for the price of an iPhone and even run it on open-source software.
An artificial intelligence (AI) system sorts the filtered EEG data into four distinct emotional states, based on whether an emotion is positive or negative (classified as high or low valence) and the degree of energy or arousal.
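The study doesn’t publish its classifier code, but the four-quadrant scheme it describes boils down to a pair of threshold checks. As a rough illustration (the function name, labels and the assumption that valence and arousal are normalised to a −1..1 range are all mine, not the researchers’):

```python
def classify_emotion(valence: float, arousal: float) -> str:
    """Bucket a (valence, arousal) reading into one of four quadrants.

    Assumes both values are normalised to -1..1, with 0 as the
    neutral midpoint; the quadrant labels are illustrative.
    """
    if valence >= 0:
        # Positive emotion: split by energy level
        return "happy/excited" if arousal >= 0 else "calm/content"
    # Negative emotion: split by energy level
    return "angry/stressed" if arousal >= 0 else "bored/sad"

print(classify_emotion(0.7, 0.8))   # high valence, high arousal
print(classify_emotion(-0.6, -0.5)) # low valence, low arousal
```

The real pipeline sits on top of filtered EEG features rather than two clean numbers, but the final step is this kind of quadrant lookup.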
Augmented reality (AR) technology is what makes these states visible, making emotions appear as swarms of geometric shapes, fractals circling within 1.2 metres of the wearer when viewed through the HoloLens goggles.
Shapes are displayed as symmetrical and curved for a positive emotion, or bent out of shape for a negative one. They might move fast or slow depending on the level of energy behind the emotion. Happy or excited feelings – high arousal, high valence – might show up as speedy green spirals, while bored or sad states appear as slow, misshapen yellow objects.
“It’s kind of like having a swarm of bees or a flock of birds flying around you,” Semertzidis says.
Out of the lab and into the wild
Semertzidis’ research involves prototyping, testing and then analysing the use of these brain-computer interfaces “in the wild”. That means taking Neo-Noumena outside the lab and into people’s homes and everyday settings.
Ten participants – five pairs made up of four couples and a mother-son team – took the devices home to try them out.
They donned the cap and goggles in a variety of situations – when they got home from work, completed assignments, watched comedy, played card games, listened to music or relaxed. They observed and diarised the ways the apparatus embodied their own and their partner’s emotions through physical space and time.
Semertzidis’ 2020 paper about Neo-Noumena records the participants’ responses to qualitative interviews about the experience.
One describes the experience as akin to having a pet: “It lands on your table, kind of like a cat […] but instead of a cat, it’s literally a piece of your emotion.”
Many others became more acutely aware of their own emotional states and their effect on others. Some found it reassuring to watch their emotional states change and flow.
“It was like a constant visual reminder to consider someone’s mood […] and just appreciate that other people have emotions as well,” one says.
Semertzidis says if two people are experiencing the same emotion, the fractals might even swarm together and effectively “hang out” in the middle of the room. At those times, the device seemed to intensify the pleasure of a shared emotional experience, like when two participants spent an evening drinking and listening to music.
“The whole night we were singing and dancing along with the music, and we were generating some pretty positive emotions,” one recounts. “For me that made me even happier to see that she was happy. It felt like it was feeding back in on itself, like a nice big loop of happiness.”
Communicating emotional experiences is core to being human, “yet also notoriously difficult”, Semertzidis says. This is what Neo-Noumena seeks to address.
The question is: do we need technology’s help to encode and decode human emotions?
Mixed feelings
Humans are “exquisitely good at both communicating emotions facially and reading them in others”, says Nick Haslam, professor of psychology at the University of Melbourne.
“There’s a very intricate orchestration of 40-odd muscles which create facial expressions of emotion. And humans are just incredibly good at decoding those things.”
People can also physically express their emotions through voice tone and intonation, body movements (like slumped or expansive poses), and heart rate. And, of course, using words.
Haslam says emotions are “a connected pattern of thoughts, feelings and actions oriented towards events in the world”.
“Emotions aren’t just patterns of brain activation over the scalp,” he says – they are inherently complex, made up of subjective feelings, thoughts, behaviours and physiology.
Haslam adds that sharing emotions has an important social role. For instance, fear communicates to others the presence of a threat to be avoided, whereas “anger can have a function of expressing authority, expressing moral disapproval of someone else”.
Indeed, a study by Belgian psychologist Professor Bernard Rimé found that emotions are mainly social, rather than private and individual: up to 96% of emotional experiences are shared with others.
To complicate matters, a person’s internal emotional experience can sometimes contradict the emotion they choose to express.
“The jargon in psychology is emotion regulation, when you try to express an emotion different from the one that you’re initially feeling,” Haslam says. This might mean dampening down sadness or self-pity in the face of someone else’s success, showing gratitude for an unwanted gift, or even overcoming an inappropriate giggle at a funeral. “To some extent the feeling part is private, unless you communicate it,” he says.
Haslam says that technologies – from text messages to Zoom calls – often get in the way of communicating emotion; they act as a barrier by removing crucial face-to-face, or in-person contact. The result is continuing efforts to create workarounds.
“The rise of emojis in texts, for instance, is largely a response – not just because they’re kind of cute – it’s because it often is difficult to communicate subtleties of emotions, or things like sarcasm over pure text without the usual back and fro of face-to-face contact,” Haslam says.
The Neo-Noumena study suggests the system might augment people’s ability to read their partner’s emotions, even the private ones. One of the participant pairs played a card game while wearing their headset, and used the swirling fractals as an additional tactic for deciphering their opponent’s poker face. The fractals were “mellow if it was a good hand”, and “red towards the end of a round where it’s determined who is going to win or lose”.
In another example, a participant was working on a university assignment while their partner was playing music. Semertzidis describes the encounter: “The one playing music was like, ‘Oh, is this annoying you?’. The one doing the assignment said, ‘No, it’s fine.’ But they saw the fractals were obviously annoyed.”
“You can imagine a dystopia where a machine can detect your deepest, darkest thoughts just from looking at you,” Haslam says. “But my sense is that that is so distant as a possibility, and it’s so far beyond what we can currently do, that I don’t worry about it too much.”
Emotional AI: friend or foe?
“Some of what’s going on here is a neuro-fetishism,” says Monash professor Robert Sparrow. “This idea that if you can read it in the brain it’s somehow more real than if you could just look at someone and see that they were sad.”
Sparrow, a philosopher specialising in technologies and applied ethics, is concerned about the pervasive use of systems such as facial recognition programmed to gauge the public’s emotional state. His particular worries include people’s lack of choice about being monitored and the technology’s potential for manipulation.
“Advertisers have been quite interested in the possibility that they might be able to spot people walking past a billboard, find their emotion, and maybe tweak their pitch, in accordance with people’s emotion,” he says.
“Notoriously people, as I understand it, consume more when they’re feeling slightly sad. So if you can identify people when they’re vulnerable and pitch your products at them, that would be a way of increasing sales.”
Emotion detection could even feed into dynamic pricing systems, Sparrow suggests, with prices adjusted to a shopper’s emotional state for maximum commercial advantage.
The interest of enterprises has already been piqued by emotional AI technology’s potential to dig into consumers’ inner thoughts and feelings.
Cosmetics giant L’Oréal is partnering with technology company Emotiv, using EEG devices to detect emotional responses to different scents and to “help consumers make accurate and personalised choices around their fragrance”, according to a company press release. The BBC reports that music platform Spotify has patented technology for analysing a user’s emotional state through speech recognition, to further personalise content recommendations.
Meanwhile, consumer brands are using biometric and facial recognition technologies to collect and analyse data for targeted advertising or product development. Confectionery maker Mars partnered with technology company Affectiva to study the facial expressions and emotional responses of more than 1,500 European participants to its advertising, in the largest such study yet.
These systems work differently but all rely on collecting personal data, tracking everything from gazes and facial movements, to walking gait, online sentiment, brain signals, heartbeats and skin moisture.
Sparrow says there are privacy concerns too. People are used to curating their digital selves in various ways, he says. But most would find it challenging to moderate their facial expressions and other emotional indicators at every point of interaction with a laptop, phone or security camera system. He wonders about the accuracy of emotion-reading brain-machine interfaces. “If someone looks happy, and they say they’re happy but yet your brain reader says that they’re sad […] how do you know that the machine is not just making a mistake?”
Semertzidis agrees that at this stage EEG brain-reading systems can be inaccurate. They are based on generalised models that don’t fully capture the individuality of a person’s brain.
Neo-Noumena’s classification accuracy was around 58%, according to the 2020 paper. To some extent the system might be as arbitrary as a mood ring, with “a bit more information that goes into it and a bit more processing”, Semertzidis says.
But study participants assumed the system’s output to be the objective truth. They even trusted the device at times when it diverged from their own experience or gut instinct, Semertzidis says.
“If it didn’t align with how they were feeling, they felt that perhaps they were wrong, and the machine was right,” he says. This highlights a danger that “people could probably put their faith in the machine over themselves”.
Yays and nays
Through toying with these emerging technologies, Exertion Games Lab hopes to open people’s minds to technologies that are just over the horizon and ways they could be used.
“This can inspire a utopian vision of the future – and also give you an idea of where things can go wrong,” Semertzidis says.
A project called ‘Machine in the Middle’ explored the possible unintended consequences, or ‘dark patterns’, that can arise when brain-computer interfaces classify emotions beyond a user’s agency or control. In Machine in the Middle, an EEG headset is fitted with electrodes capable of delivering electrical pulses that stimulate the wearer’s facial muscles. The headset then makes their face conform to certain expressions – stressed, happy, sad, relaxed – in keeping with their internal emotional state.
Privacy and human rights watchdogs are already taking note.
The UK’s Information Commissioner recently warned of the privacy and accuracy risks involved with collecting and holding vast amounts of personal biometric data.
Emotional AI is a high-risk technology that “may reveal highly sensitive data via subconscious behaviours and responses, interpreted through highly contested forms of analysis”, according to a report from the commissioner’s office. The report highlights specific risks relating to the use of children’s data, or uses such as workforce or public surveillance.
Deputy information commissioner Stephen Bonner says biometrics and emotional AI technologies are immature and may not work. “We are concerned that incorrect analysis of data could result in assumptions and judgements about a person that are incorrect and lead to discrimination.”
Leading neuroscientists are calling for ‘neuro-rights’ to be added to the UN’s Universal Declaration of Human Rights. They are seeking to enshrine the right to identity, free will and mental privacy and guard against abuse and manipulation by emotion and brain-reading technologies. Chile’s parliament has already taken steps to amend its constitution accordingly, the first country to do so.
At this stage, wearing Neo-Noumena is very much an opt-in arrangement, and one limited by the discomfort of the hardware. Participants in Semertzidis’ study struggled to wear their headsets and goggles beyond the minimum hour-a-day agreed to. The cap was heavy; the goggles sometimes pinched their noses. The gel that had to be injected into the electrodes messed up their hair.
Brain-computer interfaces are very much in their infancy. But, as systems improve, they will drastically change what it means to be a human, or maybe even move beyond that, Semertzidis says.
“In the future, when we get better at connecting brains together, we might start to see things that only exist in science fiction. Like human hive minds – where our thoughts and experiences are shared between thousands of people instantaneously – and we have access to the collective information of the entire human race.”
Today’s emotional AI technologies are but baby steps. The interest lies in where they’re leading us next.
Originally published by Cosmos as Seeing emotions
Petra Stock
Petra Stock is a journalist and engineer. She has previously worked in climate change, renewable energy, environmental planning and Aboriginal heritage policy.