
Emotion Recognition: Facial Recognition Software Based Upon Valid Science or Malarkey?

The American Civil Liberties Union (ACLU) reported:

"Emotion recognition is a hot new area, with numerous companies peddling products that claim to be able to read people’s internal emotional states, and artificial intelligence (A.I.) researchers looking to improve computers’ ability to do so. This is done through voice analysis, body language analysis, gait analysis, eye tracking, and remote measurement of physiological signs like pulse and breathing rates. Most of all, though, it’s done through analysis of facial expressions.

"A new study, however, strongly suggests that these products are built on a bed of intellectual quicksand... after reviewing over 1,000 scientific papers in the psychological literature, these experts came to a unanimous conclusion: there is no scientific support for the common assumption “that a person’s emotional state can be readily inferred from his or her facial movements.” The scientists conclude that there are three specific misunderstandings “about how emotions are expressed and perceived in facial movements.” The link between facial expressions and emotions is not reliable (i.e., the same emotions are not always expressed in the same way), specific (the same facial expressions do not reliably indicate the same emotions), or generalizable (the effects of different cultures and contexts has not been sufficiently documented)."
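
The flawed assumption is easy to make concrete. The sketch below is purely illustrative Python, not any vendor's actual code; the EXPRESSION_TO_EMOTION table and infer_emotion function are hypothetical names. But they capture the one-to-one mapping from facial movements to internal states that these products assume, and that the researchers found unsupported.

```python
# Purely illustrative sketch: not any vendor's actual code. It hard-codes
# the assumption the researchers found unsupported: that a detected facial
# expression maps one-to-one onto an internal emotional state.

# A typical product pipeline: detect a face, classify the visible
# expression, then look the expression up in a fixed table like this one.
EXPRESSION_TO_EMOTION = {
    "smile": "happiness",  # not reliable: people also smile when nervous
    "scowl": "anger",      # not specific: a scowl can signal concentration
    "frown": "sadness",    # not generalizable: meaning varies across cultures
}

def infer_emotion(detected_expression: str) -> str:
    """Return the emotion the fixed table claims the expression reveals."""
    return EXPRESSION_TO_EMOTION.get(detected_expression, "unknown")

if __name__ == "__main__":
    # The lookup confidently labels a nervous smile as "happiness".
    print(infer_emotion("smile"))  # -> happiness
```

However sophisticated the face detector feeding it, any pipeline that ends in a fixed expression-to-emotion lookup like this inherits all three weaknesses the study identified.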

The ACLU also cited another reason why this matters:

"... an entire industry of automated purported emotion-reading technologies is quickly emerging. As we wrote in our recent paper on “Robot Surveillance,” the market for emotion recognition software is forecast to reach at least $3.8 billion by 2025. Emotion recognition (aka “affect recognition” or “affective computing”) is already being incorporated into products for purposes such as marketing, robotics, driver safety, and audio “aggression detectors.”

Regular readers of this blog are familiar with aggression detectors and the variety of industries where the technology has already been deployed. And at least one police body-cam maker has said it won't deploy facial recognition in its products due to reliability problems with the technology.

Yes, reliability matters, especially when the technology is used for surveillance. Nobody wants law enforcement making decisions about people based upon unreliable or fake science masquerading as valid science. Nobody wants school and education officials making decisions about students using unreliable software. Nobody wants hospital administrators and physicians making decisions about patients based upon unreliable software.

What are your opinions?
