China Using Emotion-Recognition AI Capable Of ‘Predicting’ Future Actions To Have People Arrested

CCTV installed on the streets of China. © Open Doors

China's iron fist has tightened its hold on the necks of its own citizens, as news has emerged of an "emotion recognition technology" being used on people. Authorities use the emotion-recognition AI, said to be capable of "predicting" future actions, to determine whether an individual should be arrested.

According to Breitbart, the new technology is being applied in scenarios such as a driver suspected of carrying drugs in his car. The emotion-recognition AI tells police whether the person appears "more nervous than the average person at a checkpoint," which they then use as justification for searching the vehicle.

A report from the state-run Global Times revealed that communist China is using this emotion-recognition AI in "various fields including health, anti-terrorism, and urban security." However, it poses an alarming threat to even the most innocent Chinese citizens, as it effectively criminalizes a person's feelings.

Even more concerning is the fact that China has inked a deal with Huawei to develop technology that identifies ethnic groups, specifically Uyghur people. The news came via a Huawei patent whose Chinese-language text was found to reference Uyghurs, and which is now the subject of a request to the China National Intellectual Property Administration (CNIPA) to remove the word "Uyghurs" from the document.

"One technical requirement of the Chinese Ministry of Public Security's video-surveillance networks is the detection of ethnicity - particularly of Uighurs," Human Rights Watch representative Maya Wang told BBC.

The patent was originally filed in 2018 with the Chinese Academy of Sciences and describes how deep-learning artificial-intelligence techniques are used to identify features of pedestrians captured by street surveillance cameras. Now, the Chinese Communist Party is taking things a step further by implementing emotion-recognition AI capable of "predicting" future actions to have people arrested.

While there are no confirmed reports that emotion-recognition AI is being used in the Uyghurs' native region of Xinjiang, it is highly likely that it is already deployed in the network of concentration camps there. Meanwhile, Breitbart reports that experimentation with emotion-recognition AI is often conducted on inmates in Chinese prisons, with at least six prisons openly admitting to using such technology to predict violent outbursts.

"Back in China, emotion recognition has contributed to the risk assessment of prisoners in a few regional prisons," Global Times reported.

Emotion-recognition technology helps prison officers determine whether a prisoner has mental problems and violent or suicidal tendencies, and helps them gauge the likelihood that the prisoner will reoffend if released, Ma Ai, director of the criminal psychology research center under the China University of Political Science and Law, told Global Times.

How it works

China's move to criminalize feelings through emotion-recognition AI is a step further down the hole of human rights abuses. According to Not the Bee, the country is using such technology to "scan a person's emotional state to infer criminal guilt and provide justification for search and seizure."

The emotion-recognition AI works by scanning a person's face for three to four seconds, during which the system will analyze "seven main physiological indexes including body temperature, eye movement, and heart rate, and convert them into psychological signs showing whether the prisoner is calm, depressed, angry," or a different emotion.
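To make the reported pipeline concrete, here is a minimal, purely illustrative Python sketch of what such a system might do: take a few physiological indexes extracted from a short facial scan and map them to a coarse emotional label. Every name, threshold, and category below is an assumption for illustration; the actual system's model and internals are not public.

```python
from dataclasses import dataclass


@dataclass
class PhysiologicalReading:
    """Indexes the report says are extracted from a 3-4 second facial scan."""
    body_temp_c: float        # skin temperature, e.g. from a thermal camera
    eye_movement_rate: float  # gaze shifts per second from eye tracking
    heart_rate_bpm: float     # heart rate estimated remotely from the video


def classify_emotion(reading: PhysiologicalReading) -> str:
    """Convert raw indexes into a coarse psychological label.

    The thresholds and labels here are invented for illustration only;
    the real system reportedly distinguishes states such as calm,
    depressed, and angry.
    """
    elevated = 0
    if reading.body_temp_c > 37.2:
        elevated += 1
    if reading.eye_movement_rate > 3.0:
        elevated += 1
    if reading.heart_rate_bpm > 95:
        elevated += 1

    if elevated >= 2:
        return "agitated"  # would be flagged as "more nervous than average"
    if elevated == 1:
        return "uneasy"
    return "calm"


if __name__ == "__main__":
    scan = PhysiologicalReading(body_temp_c=37.4,
                                eye_movement_rate=3.5,
                                heart_rate_bpm=102)
    print(classify_emotion(scan))  # prints "agitated"
```

Even in this toy form, the design problem is visible: the system infers an inner state from a handful of noisy body signals, then treats that inference as grounds for action against the person.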

This type of emotion-recognition AI technology is most likely used on the communist country's political prisoners, who have been arrested under charges of "subversion of state power" or "picking quarrels and provoking trouble."

Per Breitbart, the technology could be used on those who criticize the CCP, or who hold religious views that the CCP deems illegal, such as Christians who have not registered with the state-sanctioned churches.