Australia’s Curtin University on Wednesday requested the retraction of a 2018 research study that helped the Chinese Communist Party (CCP) refine the facial-recognition software it employs to keep the oppressed Uyghur Muslims of China’s Xinjiang region under surveillance. The study was partly funded by the Chinese government and conducted by a former Curtin faculty member.
Curtin University said the study, conducted by its former employee Wanquan Liu, was unethical and carried out without the knowledge or approval of university administrators. It asked Wiley Online Library, the publisher of the study, to fully retract it.
Wiley Online Library told Reuters on Wednesday it added a “publisher’s note and expression of concern” to the study in September 2020 after investigating ethical complaints.
“We take every concern seriously and are reviewing the matter again taking into account the new information provided by Curtin University,” Wiley said Wednesday.
The key problem with the study, as summarized by Reuters, was that Uyghur university students did not give their consent when they were added to a “dataset of facial images” compiled from hundreds of students at Dalian Minzu University in China by Liu and his co-authors.
Tibetan and Korean students were also included in the Chinese state-funded study, which concluded that facial recognition software has “great application potential in border control, customs check, and public security.”
Much of that potential has been realized since 2018. In December 2020, Chinese electronics giant Alibaba admitted its role in developing technology that can racially profile Uyghurs using images of their faces.
Alibaba executives claimed they were unaware the system was under development and insisted it had not been used outside of “testing environments,” but their denial was not very convincing, as U.S. researchers discovered Alibaba websites showing prospective clients exactly how they could use the facial-recognition system to target Uyghurs.
Another major Chinese company, Huawei, developed a “Uyghur alarm” system that could notify the police whenever China’s innumerable security cameras spotted a Uyghur. Huawei also claimed the product was merely in a “testing” stage, while outside researchers said at least a dozen Chinese police departments were already using it.
Reuters noted that Liu spent two decades working for Curtin on an Australian Research Council grant, but left in May to work for Sun Yat-sen University in Shenzhen, China. Curtin said the objectionable research into Uyghur facial recognition was conducted “informally” by Liu without university support, but the published study identified Liu as an employee of the Curtin computer studies department.
Australia’s ABC News reported on letters showing Curtin University has successfully lobbied several other publishers to retract subsequent studies that were based on Liu’s research, and has made several previous attempts to convince Wiley to pull the Liu study.
“It is alarming to think an Australian university was involved with research that can so clearly be used for profoundly unethical purposes,” Australian Sen. James Paterson told ABC.