The Xinjiang Data Police

Credits
Darren Byler is a postdoctoral researcher at the University of Colorado Boulder and the author of the forthcoming books “Terror Capitalism” and “Technologies of Reeducation.”

Preparing For War

Baimurat saw the advertisement for the new job sometime around December 2016. He had come back to China from Kazakhstan a few years before in order to find better medical care for his second child. But despite his college degree from a Chinese university, he had not been able to find steady work. Like more than 80% of Kazakh and Uighur college graduates, he was chronically underemployed.

Baimurat had grown up in Xinjiang, the northwest border region of China where Turkic Muslim Kazakhs and Uighurs make up the majority of the population. He understood that Muslim-majority countries like Kazakhstan had much more to offer him in terms of financial opportunities. But since his family was back in China and the medical system was better there, he made the difficult choice to return.

In late 2016, the local Public Security Bureau in his home county of Qitai began recruiting people to become “assistant police” — a type of citizen policing role that the authorities described as a kind of supermarket or mall security guard position. “Since I graduated from Xinjiang Normal University, I was considered very well qualified,” Baimurat remembered. “I had an interview, and I got the job. I was among the first of the people they hired from all over the region.”

At first, it seemed too good to be true. Because of his qualifications and his Chinese language skills, Baimurat was given the highest possible salary of nearly 6,000 yuan, or $1,000, per month — much higher than the minimum wage of around 2,000 yuan. But even in the beginning, there were signs that it would not be a typical mall security job.

The local authorities put the new recruits through several weeks of boot camp. “We were given police uniforms,” he said, “and then we started doing different kinds of training. It was really strict, as if we were planning for a war.”

Over the next year, that war would unfold on TV monitors, on smartphone screens, in databases of personal information, at police checkpoints, behind razor wire and concrete walls. Baimurat, who now looks back from a distance with deep regret at the role he played in that war, would see things that he would never forget. But the war goes on today without him, with thousands upon thousands of his people — both data police and detainees — laboring under the watchful eyes of perhaps the most sophisticated surveillance system in the world.

The People’s War On Terror

In China in the 1990s and 2000s, millions of ethnic Han settlers ventured west into the ancestral homelands of the Uighurs and Kazakhs who had long lived in the wide steppes and desert regions of what was once called East Turkestan. They sought oil, natural gas and land for industrial agriculture. Many Uighurs responded to what ensued — land seizures, the rising cost of living, endemic job discrimination and, eventually, brutal policing — with protests and, on rare occasions, with violence. Many Kazakhs, meanwhile, began to seek escape routes to Kazakhstan, where the government subsidized their immigration.

3G internet service arrived in Xinjiang in 2011, and Uighurs, Kazakhs and other Muslim ethnic groups began to soak in contemporary Muslim culture and faith traditions from across the Muslim world, such as those inspired by the Tablighi Jamaat, a non-political Sunni movement with tens of thousands of members in the U.K. The atmosphere of protest and increasing adherence to halal standards, such as abstaining from alcohol, alarmed Han settlers, who feared a largely abstract, threatening stereotype of Islamic violence.

Over the same period, thousands of Uighurs fled to Turkey as refugees, where some allegedly received the support of the Turkish government to travel on to Syria and fight both the Assad regime and ISIS in the civil war. As a proportion of the overall Uighur population, this group of foreign fighters was smaller than the contingent of Muslims from the U.K. who fought in the same war, many of them on ISIS’s side.

Back in China, a new Islamophobic rhetoric of “extremism” and “terrorism” had arrived via global media, and a process of dehumanizing Muslims took hold. In echoes of Cultural Revolution-era rhetoric that depicted counterrevolutionaries as vermin, state media began to represent Uighurs and Kazakhs as venomous snakes and disease-carrying insects that needed to be exterminated, or as demons ridden with the metastasizing cancer of extremism.

As early as 2014, the Chinese government began to identify “signs” of Islamic extremism, including possessing digital files with Islamic content and using virtual private networks, as pre-criminal markers of terrorism “intent.” Policing relied in part on netizens who were willing to report other internet users for cybercrimes. Local authorities offered rewards for reporting “extremist, terrorist or separatist” online behavior that resulted in detentions.

“State media began to represent Uighurs and Kazakhs as venomous snakes and disease-carrying insects that needed to be exterminated.”

By 2017, according to ongoing research that I and others are conducting, the Chinese tech giant Meiya Pico and other companies had built new digital forensics tools. These tools fed data about the digital histories of individuals into a region-wide system called the Integrated Joint Operations Platform (IJOP), which identified suspects for local police departments based on patterns of suspicious behavior and data detected through its surveillance systems.

By combining this data with information collected through interrogations, the system helped determine which Muslims were “untrustworthy” and in need of further investigation. In just two years, nearly 350,000 people in the region were convicted of criminal offenses and given prison terms. Between 900,000 and 1.5 million more were sent to a newly organized network of reeducation camps to be cleansed of the ideological “viruses” of extremist thought.

Over 90,000 police contractors were hired to build and staff this network. Many were deputized young men like Baimurat who came from Uighur and Kazakh Muslim populations, the same groups that were targeted by the system. Their job was to watch cameras, perform spot checks of Muslim young people and demand that they provide their state-issued ID and their phones for inspection via spyware apps and scanning devices. These young, low-wage workers were also responsible for monitoring face-scanning machines and metal detectors at fixed checkpoints.

And thus a dataset grew. Algorithms that “identified” extremism were honed. And a digital enclosure tightened over everyday life.

As the scholar Lilly Irani has noted, cutting-edge algorithms built by tech conglomerates around the world are often managed and monitored by low-wage technicians. Much of this work is done through platforms like Amazon’s contractor marketplace, Mechanical Turk. These “data janitors,” as Irani refers to them, train software to recognize and digitize material objects, behaviors and people.

In northwest China, data janitors became data police.

No Blank Spots

Soon after Baimurat was hired, local authorities started building “convenience police stations,” surveillance hubs set up every several hundred yards at intersections and at the entrances to parks, banks, shopping malls and other high-traffic areas. Baimurat and the other contractors were divided up and posted to these stations. Across the region, at least 7,700 were built.

Inside these blocky structures were banks of TV monitors showing dozens of camera feeds. Some of the cameras could be maneuvered with a small joystick, allowing the contractors to zoom in on faces. Baimurat said that, at first, their assignment was simply to watch the monitors. They themselves were being watched by multiple cameras linked to a centralized command center. “If we stopped looking, we would be punished,” he said.

Over time, the kind of surveillance they did began to shift. First, the contractors were sorted based on their Chinese language ability and knowledge of counterterrorism laws. “They made us do other exercises like reciting rules about participating in the camp system,” Baimurat recalled. “We had to learn these by heart.”

Then, around the middle of 2017, they were tasked with actively fine-tuning the programming of the system using assessment tools called “counterterrorism swords.” The handheld devices and software systems scanned through smartphones and other electronic devices in less than two minutes, attempting to match materials to a base dataset of as many as 53,000 flagged audio, video, picture and text files that had been deemed related to religious extremism or terrorism. The systems were similar to the Clearview and Palantir systems used by U.S. police departments, but rather than searching only through publicly available social media data, the Meiya Pico systems cracked the codes of private social media, email and instant messaging applications to assess the phone owner’s digital history and social network.

These systems, which cost between $2,000 and $3,000 per unit, purported to have a 90% success rate in detecting installations of banned apps such as WhatsApp or Twitter, even if the apps had since been deleted. In 2017, in Qitai, where Baimurat found his job, the local authorities spent at least $65,000 on such systems; the following year, they spent another $22,000.

“The Meiya Pico systems cracked the codes of private social media, email and instant messaging applications to assess the phone owner’s digital history and social network.”

The devices could access the histories of popular Chinese social media apps such as WeChat, as well as files sent and received through instant messaging. Like common malware-scanning software on a computer, they would sound an alarm if they captured flagged files and would quarantine them for closer examination.

The devices also scanned through the route history of pedestrians and car drivers. And they searched through people’s contacts, scraping images of faces and other personal data, adding them to a police database. “We got information about whether or not the person had worn an Islamic veil, had installed WhatsApp or had traveled to Kazakhstan,” Baimurat said. “All sorts of things like that.”

Baimurat was ordered to stop people on the street who appeared suspicious. He and his coworkers understood that this meant they should target Muslims in particular. “We could stop every car on the street and check them,” he said. “When we stopped them, we asked the people inside to show their phones and ID cards. If there was something suspicious, we needed to inform our bosses.”

The system became smarter with every new suspicious face or name logged in the database, every piece of information time-stamped, geotagged and synchronized with the surveillance camera network. For people who were deemed to have crossed a line — like traveling outside permitted jurisdictions, driving someone else’s car, failing to carry a smartphone and many more infractions — points would be deducted from their “trustworthiness” score.

Unfolding Terror

At first, Baimurat felt that despite the tedium of staring at monitors all day, being a police contractor was a good job. He was paid well and “just had to sit there.”

But several months into the job, Baimurat learned that one of the middle schools in town was being turned into what he understood to be a prison surrounded by an iron gate, a high electric fence and watchtowers. And then, he found out that the people he was detaining through device checks were being sent there.

One day, he was assigned to transfer detainees from the county jail to the camp. What he saw when he arrived shocked him. There were nearly 600 people waiting to be loaded into rows of buses. His job was to put shackles on their hands and feet and a black hood over their heads. “We had so many manacles,” Baimurat remembered. “I saw very young women, very old women and men with white beards among the detainees.”

“I will never forget her screams.”

All of the detainees, except perhaps one or two, were Muslim. Many of the detainees were so weak or disoriented that they couldn’t hold onto the blankets that Baimurat draped over their hands. One of the old women could not walk due to a leg injury, so police contractors dragged her to the bus. “I will never forget her screams,” Baimurat said. “I felt very bad about being a part of the system. There were so many people who made very tiny mistakes and ended up in the camp.”

The camps began to overflow with detainees. Baimurat’s own cousin was detained because a phone he had bought secondhand had once been used to download a video of a Uighur protest from several years earlier, something he had no way of knowing. When one of his coworkers gave a cigarette to a Uighur detainee in view of the surveillance cameras, an officer showed up and took him away. “We never saw him again,” Baimurat said.

The contractors worried that they themselves would be taken away if they failed to show “no mercy.” They were told they could not quit. The rule was to “round up everyone who should be rounded up.”

“We were terrified,” Baimurat said.

Data Policing

As the Italian chemist and writer Primo Levi put it in his reflections on surviving Auschwitz, besides being in good health and knowing the language of the police, getting out of internment camp systems is often a result of “sheer luck.” Occasionally, there are routes of escape.

Baimurat could read the writing on the wall. In mid-2018, he secretly went to the Kazakhstan visa office in Ürümqi and begged the workers there to help him return to Kazakhstan. Since he had established Kazakhstani citizenship before returning to China, they were able to issue him a new travel document and help him secure safe passage across the border.

But Baimurat was just one member of western China’s new data police force. Tens of thousands of his former colleagues are still in Xinjiang, scanning phones and faces, monitoring WeChat groups and downloads of suspicious content, and putting black hoods over people’s heads.

In 2018, the Xinjiang Uighur Autonomous Region spent close to $100 million on policing equipment such as scanning tools. Although this is just a small share of the region’s nearly $10 billion information and security technology industry, it is one of the most crucial elements of the reeducation camp system. Because the army of police contractors actively profiles the people it targets with scans, it ensures that the majority of Kazakhs and Uighurs in the region are assessed. Their task is to ensure that the scans are conducted on a regular basis; for targeted populations, they are sometimes done every day. Internal police documents make clear that, over time, the list of flagged WeChat groups and other “micro-clues” of religious extremism continues to expand through the data collected via scans.

“They are watching in real-time how the system they helped build aids the dehumanization and detention of people in their own communities.”

Although the camp system may appear to be unique to Xinjiang, the hand-held phone scanning tools are no longer limited to the Uighur region. In other areas of China with sizeable minority populations, such as the Yi in Yunnan or the Hui in Ningxia, these systems are now being trialed, though not yet at the scale seen in Xinjiang. They are also being used during border customs inspections in places like Shandong. Meiya Pico has been placed on a no-trade list by the U.S. Department of Commerce, but according to a 2017 company report, it aims to expand its reach to policing systems “in 100 developing countries.”

The work of the data police affected not only the people it targeted but the data police themselves. Like Facebook content moderators stuck in office buildings watching horrifically violent videos and assisting algorithms in taking down hate speech, Xinjiang data police like Baimurat suffer from traumatic stress disorders.

In this case, though, it is not just a secondary effect of viewing violence or even a post-trauma experience — it’s worse. They are watching in real-time how the system they helped build aids the dehumanization and detention of people in their own communities. Ultimately, they see that they are building a system that takes their own lives away, and that there is no real protection from it unless they are able to escape.

Unlike justice for the data janitors who train the algorithms of American tech companies, justice for data police in China is not simply a question of better working conditions or the freedom to quit. Their plight raises a more fundamental question: whether people are the authors of their own lives. Data police like Baimurat are witnesses to the unfolding of a new and banal power over life itself.