Should the Brakes be put on Emotion Detection Technology?

Emotion detection technology is on fire, with an expected market value of over $36 billion by 2021.  But at least one organization is calling for it to be banned from important decisions that can have a significant impact on people's lives and their access to opportunities.  According to the December 2019 AI Now Report, "AI systems continue to be deployed rapidly across domains of considerable social significance—in healthcare, education, employment, criminal justice, and many others—without appropriate safeguards or accountability structures in place."

Here is the basic problem with this technology: it assumes that everyone displays the seven universal expressions of anger, contempt, disgust, fear, joy, sadness, and surprise (plus embarrassment, pain, interest, and shame) in the same manner, and that these outward cues are reliably connected to a person's inner state and behavior.  Both assumptions are faulty.
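To make that assumption concrete, here is a toy sketch of the logic such a system relies on. The cue names and mapping below are hypothetical; real products use trained models rather than lookup tables, but the one-cue-means-one-emotion assumption being criticized is the same.

```python
# Hypothetical sketch: a naive classifier that assumes a one-to-one,
# universal mapping from facial cues to inner emotion.
UNIVERSAL_MAPPING = {
    "pursed_lips_flared_nostrils": "anger",
    "raised_lip_corners": "joy",
    "wrinkled_nose": "disgust",
    "widened_eyes": "fear",
    "downturned_mouth": "sadness",
    "raised_brows_open_mouth": "surprise",
    "one_sided_lip_raise": "contempt",
}

def classify_emotion(facial_cue: str) -> str:
    """Return the emotion the system *assumes* the cue signals.

    The flaw: the same cue (or its absence) can mean different things
    across cultures and medical conditions, so the label may say
    nothing about what the person actually feels.
    """
    return UNIVERSAL_MAPPING.get(facial_cue, "neutral")

print(classify_emotion("pursed_lips_flared_nostrils"))  # → anger
# A person with flat affect shows no cue, so the system reads
# "neutral" regardless of their true emotion.
print(classify_emotion("no_visible_expression"))  # → neutral
```

The second call shows exactly the failure mode in the prison example: an inmate whose condition suppresses facial expression registers as "neutral" no matter how angry they are.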

Let's take a look at the criminal justice system and imagine that emotion detection is being used to determine whether an inmate is getting angry and may pose a threat to a guard.  One would think this technology would be a great resource, since anger supposedly produces the same outward expression in everyone: pursed lips and flared nostrils.  But what if that inmate has schizophrenia?  Science has shown that people with schizophrenia, and other groups with certain psychological conditions, often display reduced facial expression, so relying on this technology could put a guard at risk.

How about the employment field?  In this scenario an HR person cannot decide who should get the top job.  It is between a candidate born and raised in the United States, one from Japan, and one from Thailand.  All three are equally qualified, so HR decides to base the decision on each candidate's reaction to being told "the job includes a free fish lunch every week."  The HR person thinks this is a great perk and will award points to whichever candidate the emotion detection technology reads as happiest about the news.  The American candidate smiles because he is happy; behavioral scientists have shown that in the United States smiling is a common way to express happiness.  The Japanese candidate is quite happy with the news too but doesn't smile, because openly displaying emotion is less accepted in her culture.  And the Thai candidate has the largest smile of all, but in her culture that smile signals fear or embarrassment.  The emotion detection technology does not read cultural differences, however, so it gives the extra points to the candidate from Thailand, who is secretly thinking of withdrawing her name because she has a fish phobia.
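The hiring scenario above can be reduced to a few lines of culture-blind scoring logic. The names and intensity values below are invented for illustration only; the point is that ranking by smile intensity alone rewards the candidate whose smile means something other than happiness.

```python
# Hypothetical sketch of the hiring scenario: a system that scores
# candidates purely on smile intensity, ignoring that the same smile
# can signal different emotions in different cultures.
candidates = [
    {"name": "US candidate", "smile_intensity": 0.7, "true_feeling": "happy"},
    {"name": "Japanese candidate", "smile_intensity": 0.1, "true_feeling": "happy"},
    {"name": "Thai candidate", "smile_intensity": 0.9, "true_feeling": "fearful"},
]

def pick_happiest(cands):
    # The culture-blind assumption: biggest smile == happiest person.
    return max(cands, key=lambda c: c["smile_intensity"])

winner = pick_happiest(candidates)
# Prints: Thai candidate - fearful
# The largest smile wins even though it actually signals fear.
print(winner["name"], "-", winner["true_feeling"])
```

Note that the Japanese candidate, who is genuinely happy, scores lowest of the three, which is the mirror image of the same error.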

And then there is healthcare, where misreading an expression can mean the difference between life and death.  Two people come into the emergency room: one is a hypochondriac and the other has Parkinson's disease.  Both are having symptoms of a heart attack.  The emotion detection system reads the hypochondriac's body language: he is doubled over in pain, his face is grimacing, and tears are streaming from his eyes.  The person with Parkinson's disease is in significant pain, but her neurological condition prevents her from expressing it.  With limited staff, the doctors are cued to direct their attention first to the hypochondriac, who, it turns out, had nothing wrong with him.  Meanwhile the overlooked Parkinson's patient is having a real heart attack, and every second spent away from her puts her life further in jeopardy.

All of these examples show the problems with emotion detection technology, but perhaps more importantly they show the problem with how humans interact with technology.  Technology is not meant to operate in a vacuum, devoid of human judgment.  Rather, it is meant to be a tool that aids humans in their decision-making process.  One can argue that this technology has merit, and although it needs refinement, it does not deserve to be banned.  The human brain is exposed to an estimated 34 gigabytes of information each day.  That is a tremendous amount, and without question we are unable to absorb it all.  This is why the brain selectively filters what it deems important from what it believes will have no impact.  But some of the filtered-out information may be critical.  And at times, because of language barriers, all a doctor has to go on is outward emotional signs of distress, pain, and confusion.  So in the emergency room example, an emotion detection system may not be able to single out the hypochondriac, but it can signal that someone's pain level has increased and that they need immediate attention when a doctor's focus is elsewhere.