What is it like when police go rogue in a liberal democracy? Look to Britain. – The Washington Post


Facial recognition software. (Ian Waldie/Getty Images)

Stephanie Hare is an independent researcher and author of a forthcoming book on technology ethics.

To see what it looks like when law enforcement goes rogue in a liberal democracy, look to Britain.

Last week, police officials in South Wales told the Guardian newspaper that the force will test facial-recognition apps on its officers’ smartphones, allowing officers to check a scan of a person’s face against a police database.

At first glance, this may not seem alarming. After all, we already have closed-circuit TV cameras everywhere, and we are told that if we have nothing to hide, we have nothing to fear, and that it is worth sacrificing our privacy if it helps the police to catch criminals.

Only people who have never lost their civil liberties could treat them so casually.

To understand the immense power of facial-recognition technology, compare it to other biometrics that law-enforcement officials can and do use to catch criminals, such as DNA and fingerprints.

We know when the police take our DNA and fingerprints because they have to arrest us and book us at the police station. They must take this data from us individually, which prevents them from taking it en masse. They must abide by laws that govern the conditions under which they can use this data, and how long they can keep it. If we are released without charge, or if we are acquitted, they must delete our DNA and fingerprint data within a defined period of time.

Not so with facial recognition.

The police can use facial-recognition software on us even without our knowledge. They can take it en masse. There are no laws governing what they can do with this data, or how long they can keep it. If we are released without charge, or if we are acquitted, they can keep our facial data indefinitely.

Police databases in Britain contain the facial data of innocent people alongside mugshots of suspected and convicted criminals. The High Court ruled in 2012 that they must not do this, but the Home Office (which controls the police) ignored the ruling until 2017, when it devised a workaround: If we think our face is wrongfully in a police database, we can request to have it deleted. But most of us don’t know we have this right, there is little evidence that citizens are taking advantage of it, and the police can refuse our request.

This means that when the police scan our faces, they may find us in their database even if we’ve done nothing wrong. This requires us to prove our innocence, which violates one of the core tenets of a liberal democracy: that we are innocent until proven guilty.

Nor does facial-recognition technology identify us all with equal accuracy: It performs worse on darker-skinned people, women and children — groups that together make up well over half the population. That’s why Axon, the largest provider of police body cameras in the United States, decided not to use facial recognition, citing ethical concerns.

Yet even if facial-recognition technology someday becomes accurate to a high degree of confidence (it will never be 100 percent), this only solves the technical problem. The social problem remains.

Facial-recognition technology threatens the right to protest and the right to free speech, core freedoms in a liberal democracy. If people fear their faces will end up in a police database, they may be reluctant to exercise their rights to protest and speak their views.

The South Wales Police knows this; it is the defendant in a landmark legal case, whose verdict is still pending, precisely because it scanned the face of a British citizen who was attending a protest.

It also knows this because there is a growing backlash against facial-recognition technology in Britain. It’s coming not only from researchers and civil liberties organizations but also from the three British data regulators (the Surveillance Camera Commissioner, the Biometrics Commissioner and the Information Commissioner), who have warned that the government’s biometrics strategy is “not fit for purpose and needs to be done again,” and from members of the House of Commons, whose Science and Technology Committee recently called for a moratorium on the use of facial-recognition technology.

All are recognizing not just the inaccuracy of facial-recognition technology but also the risks it poses to our privacy and civil liberties. All are acknowledging the worst-case scenario: A technology that makes it possible to identify people anywhere, in real time, without their knowledge, is a tyrant’s dream come true, opening up new nightmares of persecution based on our ethnicity, religion, gender, sexuality, immigration status or political beliefs.

Liberal democracies have always had to balance freedom and security. Anything that threatens that balance, such as facial-recognition technology, is not a matter for the police to decide. It should be decided by Parliament, which represents the people.
