AI emotion-detection software tested on Uyghurs


A camera system that uses AI and facial recognition to detect states of emotion has been tested on Uyghurs in Xinjiang, the BBC has been told.

A software engineer claimed to have installed such systems in police stations in the province.

A human rights advocate who was shown the evidence described it as shocking.

The Chinese embassy in London has not responded directly to the claims but says the political and social rights of all ethnic groups are guaranteed.

Xinjiang is home to 12 million ethnic minority Uyghurs, most of whom are Muslim.

Citizens in the province are under daily surveillance. The area is also home to highly controversial “re-education centres”, called high-security detention camps by human rights groups, where it is estimated that more than a million people have been held.

Beijing has always argued that surveillance is necessary in the region, saying separatists who want to set up their own state have killed hundreds of people in terror attacks.

The software engineer agreed to talk to the BBC’s Panorama programme on condition of anonymity, because he fears for his safety. The company he worked for is also not being revealed.

But he showed Panorama five photographs of Uyghur detainees who he claimed had had the emotion recognition system tested on them.

“The Chinese government use Uyghurs as test subjects for various experiments just like rats are used in laboratories,” he said.

And he outlined his role in installing the cameras in police stations in the province: “We placed the emotion detection camera 3m from the subject. It is similar to a lie detector but far more advanced technology.”

He said officers used “restraint chairs”, which are widely installed in police stations across China.

“Your wrists are locked in place by metal restraints, and [the] same applies to your ankles.”

He provided evidence of how the AI system is trained to detect and analyse even minute changes in facial expressions and skin pores.

According to his claims, the software creates a pie chart, with the red segment representing a negative or anxious state of mind.

He claimed the software was intended for “pre-judgement without any credible evidence”.

The Chinese embassy in London did not respond to questions about the use of emotional recognition software in the province but said: “The political, economic, and social rights and freedom of religious belief in all ethnic groups in Xinjiang are fully guaranteed.

“People live in harmony regardless of their ethnic backgrounds and enjoy a stable and peaceful life with no restriction to personal freedom.”

The evidence was shown to Sophie Richardson, China director of Human Rights Watch.

“It is shocking material. It’s not just that people are being reduced to a pie chart, it’s people who are in highly coercive circumstances, under enormous pressure, being understandably nervous, and that’s taken as an indication of guilt, and I think that’s deeply problematic.”

Agencies