Chinese police predict crime by tracking people they believe will become future offenders

(ORDO NEWS) — The New York Times published an article claiming that China, using a sophisticated surveillance system, is monitoring people who are not criminals but who, in the authorities' view, may become them. Who is on the list?

Artificial intelligence is used to surveil not only China's own citizens but also people of other nationalities whom the country's authorities consider a threat.

Matters have gone so far that the police no longer merely respond to threats that have already arisen, but also try to predict future crimes and protests, much like in a movie.

The algorithms are supported by a wide network of surveillance cameras deployed throughout the country.

Who is under suspicion

More than 1.4 billion people are under close surveillance by the Chinese Communist Party today, according to a New York Times report.

They are “captured by police cameras that are installed everywhere, from street corners and subway ceilings to hotel and apartment building lobbies.” Their phones are tracked, their purchases are monitored, and their online chats are censored.

However, what most concerned the investigative journalists was that the government, known for its human rights violations, pays special attention to vulnerable groups such as ethnic minorities, labor migrants, people with mental illness, and those diagnosed with HIV.

  • It all started two years ago when China used AI and facial recognition to track down citizens infected with the coronavirus.
  • At the time, the public raised concerns about infringements on privacy and consent.
  • However, this did little to deter the government, which, according to the New York Times, began to purchase “technology that uses extensive surveillance data to predict crimes and protests before they happen.”
  • The technology could alert police if a person found to be using drugs makes too many calls to the same number, or alert officers whenever a person with a history of mental illness approaches a school.

The source also mentions a case in which, in 2020, authorities in southern China allegedly denied a woman’s request to move to Hong Kong to be with her husband after software flagged the marriage as a likely sham.

The investigation found that the couple “did not often go to the same place at the same time and did not spend the Spring Festivals together.” On that basis, the police concluded that the marriage was a sham entered into to obtain a migration permit.

It is not yet clear whether any measures are taken against potential “violators” when no violation has actually occurred and there is only the artificial intelligence’s prediction.

The publication notes that such an approach to law enforcement can “automate systemic discrimination and political repression.”
