


A Neural Network Is Being Designed To Identify "Criminal Faces"

At best, this is a misguided attempt at science. At worst, it's the first step towards dangerous digital phrenology.
Watch Dogs 2. Screenshot courtesy of Ubisoft

Earlier this week I wrote a short piece about Quick, Draw!, an interesting little AI experiment that Google released, and I ended it by discussing some of the reservations I have about using crowd labor from the internet to enhance neural networks. When we play with these toys, we're doing research labor. We are training the artificial intelligence, making it better at its job of recognizing shapes, patterns, and objects. I closed by writing that "it's easy to imagine a dystopian project of, say, identifying what a criminal looks like."


Little did I know that this exact project was already underway. Last week, two researchers from Shanghai Jiao Tong University posted a paper to arXiv, the Cornell-hosted preprint database, titled "Automated Inference on Criminality using Face Images." In it, the authors outline the creation of an artificial intelligence that recognizes criminals based on physical traits like "lip curvature, eye inner corner distance, and the so-called nose-mouth angle." With a dataset of 1,856 faces and their respective criminal records, the authors claim to have discovered a "law of normality for faces of noncriminals."
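It's worth seeing how mundane the machinery behind a claim like this is. Here is a minimal sketch of the kind of pipeline the paper describes: a binary classifier fit to geometric face measurements paired with "criminal"/"noncriminal" labels. This is not the researchers' code; the synthetic data and the off-the-shelf logistic regression below are stand-ins I'm assuming purely for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-ins for the kinds of measurements the paper names
# (lip curvature, eye inner-corner distance, nose-mouth angle);
# these numbers are made up for illustration, not drawn from the study.
n_faces = 1856
X = rng.normal(size=(n_faces, 3))        # fake geometric features, one row per face
y = rng.integers(0, 2, size=n_faces)     # fake 0/1 "noncriminal"/"criminal" labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A generic off-the-shelf classifier stands in here for whatever models the
# researchers actually trained; the shape of the pipeline is the same either way.
clf = LogisticRegression().fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
# With random labels this hovers near chance (~0.5): the machinery will
# happily "learn" whatever labels it is handed, meaningful or not.
```

The classifier has no concept of guilt or innocence; it only finds whatever statistical regularities happen to separate the two piles of labels it's given.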

You need to read that again to let it sink in. This is the worst hellscape possible, the worst dystopian vision of the future. It's the kind of stuff that Watch Dogs 2 and other games of its ilk recognize as an unequalled violence that must be opposed at any cost.

"For a neural network, images are images, and the data set they're trained on is all that matters."

From the late 1800s through the early 1900s, the eugenics movement in the United States and Europe believed that the physical body held the key to the soul. Francis Galton, a key figure in that movement, created elaborate composite photographs and lineage charts to figure out where disability, lack of character, and criminality could lie in the shape of the face and the body. He thought in the same way that these researchers do: that he could find your destiny literally written on your face.

Here's why it matters: We live in a world where Taser is considering arming the police with an autonomous drone that can electrocute someone at will. The president-elect is, at best, waffling over whether he expects the Muslim population of the United States to register themselves in a database. The UK is close to implementing a historically unparalleled law that eliminates fundamental privacy rights. These are operations of surveillance. This is governing people by their data, and turning that data into ways of criminalizing people without due process or access to basic human rights.

The horrifying thing is that it's all connected. My playing Quick, Draw! for a little bit of fun directly contributes to better learning algorithms, which can then be tuned toward recognizing "criminal" nose shapes. For a neural network, images are images, and the data set they're trained on is all that matters.
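That last sentence is worth unpacking with a toy example. The training loop below is a sketch in PyTorch (my assumption; nothing here is tied to any particular framework or to the paper), and it never asks what its labels mean. Doodles from Quick, Draw!, handwritten digits, or faces tagged "criminal" are, to this code, just tensors of pixels paired with integers.

```python
import torch
import torch.nn as nn

# A tiny classifier: the same architecture works for any labeled image set.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 64),
    nn.ReLU(),
    nn.Linear(64, 2),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Random stand-in "images" and labels; any labeled image dataset slots in here,
# and the loop has no idea whether the labels mean "duck doodle" or "criminal".
images = torch.rand(256, 1, 28, 28)
labels = torch.randint(0, 2, (256,))

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```

Swap the dataset and the same loop learns whatever you feed it, which is exactly why the labor of labeling, ours included, is where the politics live.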

It's a mistake to think we can go back, of course. To suggest that we shut it all down and return to some ideal state is pointless, because it can't happen. But what we do need is an awareness of how these things work, which systems are speaking to which other systems, and some wariness about games that use our labor to support systems that can be turned back against our basic human rights.