Emotional Artificial Intelligence (EAI) is no longer science fiction. From facial expression analysis in classrooms to voice-based mood detection in call centres, EAI systems are increasingly being used to monitor, predict, and respond to human emotion. Marketed as tools of empathy and personalisation, these systems claim to detect how we feel — and adjust their outputs accordingly.
But this apparent sensitivity masks a much deeper problem. EAI systems do not understand emotions; they interpret surface-level cues — a furrowed brow, a raised voice, a change in typing speed — and label them as evidence of internal emotional states. In doing so, they risk misreading us, disciplining us, and ultimately reshaping how we feel and express emotion.
Emotions are not simple reflexes. They are complex, contextual, and culturally shaped experiences. A tear can mean grief, joy, manipulation, or even boredom — depending on when and where it falls. Yet EAI systems are built on the assumption that emotional states can be reliably inferred from a limited set of measurable signals. This is a flawed and dangerous idea.
The false science of emotional legibility
Emotional AI typically relies on what Stark and Hoey term proxy data — measurable physiological or behavioural indicators such as facial microexpressions, vocal tone, or skin conductivity — as stand-ins for interior states. Most commercial systems are built upon Basic Emotion Theory (BET), which posits a small set of universal emotions expressed through fixed, biologically determined signals. This model, popularised by Paul Ekman, has found computational traction because it renders emotions discrete and machine-readable.
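To see how reductive this mapping is, consider a deliberately simplified sketch. The feature names, thresholds, and label set below are invented for illustration; they do not reproduce any vendor's actual pipeline, but they capture the structural logic of a BET-style classifier: continuous proxy signals go in, one word from a closed list comes out.

```python
# Toy illustration of a BET-style emotion classifier. All feature names,
# thresholds, and labels are hypothetical; the point is the structure,
# not the numbers.

from dataclasses import dataclass

@dataclass
class ProxySignals:
    brow_furrow: float        # 0.0-1.0, e.g. from a face-landmark model
    vocal_pitch_delta: float  # semitones above the speaker's baseline
    skin_conductance: float   # microsiemens above resting level

# The closed label set such systems must choose from.
BASIC_EMOTIONS = ["anger", "sadness", "fear", "joy", "neutral"]

def classify(sig: ProxySignals) -> str:
    """Collapse proxy measurements into a single discrete label.

    Note what is discarded: context, culture, and the possibility
    that the same signals mean different things in different lives.
    """
    if sig.brow_furrow > 0.7 and sig.vocal_pitch_delta > 2.0:
        return "anger"
    if sig.brow_furrow > 0.7:
        return "sadness"
    if sig.skin_conductance > 5.0:
        return "fear"
    if sig.vocal_pitch_delta > 2.0:
        return "joy"
    return "neutral"

# A furrowed brow at a funeral and at a chessboard yield the same label.
label = classify(ProxySignals(brow_furrow=0.8,
                              vocal_pitch_delta=0.5,
                              skin_conductance=1.2))
assert label in BASIC_EMOTIONS
print(label)  # -> "sadness"
```

Even this caricature makes the conceptual problem visible: whatever the inputs, the output must be one of a handful of words, and a person concentrating is indistinguishable from a person grieving.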
However, as Lisa Feldman Barrett and others have shown, there is no scientific consensus that emotions are universally recognisable from facial or physiological signals. On the contrary, emotions are highly context-dependent, culturally inflected, and often ambiguous. The assumption that a frown signals sadness or a raised voice indicates anger ignores the nuances of human affective life.
This is not a trivial limitation. It means that systems designed to “read” emotion are often making speculative inferences from decontextualised data. In domains such as hiring, education, or policing, such misreadings can have material consequences. An algorithm that tags a job applicant as “anxious”, a student as “disengaged”, or a protestor as “agitated” does not merely read emotion; it ascribes meaning to it, shaping that person’s prospects in ways that are opaque, unchallengeable, and normatively unjustified.
Emotional surveillance and the discipline of feeling
What makes EAI particularly insidious is that it shifts surveillance from what we do to how we feel. Unlike traditional forms of monitoring, which track behaviour, EAI invades the affective domain. It treats the body not just as a source of movement or presence, but as a signal of internal states — available for scrutiny, interpretation, and intervention.
The result is a chilling form of emotional surveillance. In workplaces, it tracks employee engagement or loyalty. In schools, it monitors attentiveness. In retail environments, it analyses our reactions to products and adverts. Over time, this kind of surveillance may pressure us to perform emotions that conform to machine expectations: smiling when we don’t want to, suppressing frustration, or “correcting” our expressions to avoid being flagged.
The danger here is not just that machines fail to understand us. It’s that they may begin to discipline us — nudging our expressions, altering our behaviour, and shaping our emotional lives in invisible ways.
Even worse, once emotional expressions become a form of data to be measured and ranked, people may begin to perform emotions in strategic ways. Employees may feign positivity. Students may mask confusion. Customers may smile for discounts. This is not emotional expression — it is emotional choreography.
Such systems reward emotional transparency and punish ambiguity. They treat the most personal, context-sensitive aspects of human experience as input-output problems to be optimised. The subtlety of emotional life is flattened into data points. The consequence is not emotional intelligence, but affective control.
Emotional privacy: a new kind of right
To address the harms caused by EAI, we need more than technical fixes or improved accuracy. We need a normative framework that centres emotional life as something worth protecting, not just from extraction or manipulation, but from conceptual distortion.
I call this framework emotional privacy. By emotional privacy, I mean the right to have one’s emotional life — including emotional expressions, experiences, and responses — protected from unwanted observation, interpretation, or interference. Emotional privacy is not just about keeping our emotions secret or under control. It is about protecting the personal and social conditions that allow people to feel, express, and understand emotions in their own way.
This includes the right to remain emotionally opaque, the freedom to express emotion without it being recorded or categorised, and the ability to inhabit emotional spaces — such as grief, confusion, or excitement — without those states being instrumentalised.
Conclusion
EAI is not just a technological innovation. It is a shift in how emotion is understood, monitored, and governed. By turning emotional expressions into data, it risks eroding the conditions under which emotion can be lived and shared meaningfully.
We must recognise that the harms of EAI are not just technical failures. They are conceptual and normative intrusions. They reframe emotional life as something to be measured, optimised, and corrected — and in doing so, they threaten our capacity to feel freely and to live emotionally rich lives.
Emotional privacy offers a framework for resisting this trajectory. It reminds us that emotions are not problems to be solved, but ways of being in the world. To protect emotional privacy is not only to resist invasive technology — it is to defend the possibility of “emotional freedom” itself.