The unlimited and unaccountable powers we’ve given police

Jul 6, 2022
In Australia, police have more or less consciously decided to use the new technology mostly for intelligence purposes. Image: Wikimedia Commons

The US Supreme Court’s decision to overrule the Roe v Wade principle that the right to an abortion is a privacy right guaranteed by the American constitution has magnified the fears of pro-choice citizens. They are worried that in the red states that have already criminalised abortion, right-to-life zealots, including those in law enforcement, will be anxious to search out and punish those breaching these state-based laws. The text of some of these laws, and the rhetoric of some of their proponents, suggest that the concern is by no means simply paranoid.

In any event, Twitter, social media, and even what right-wing critics would call the “liberal” mainstream media have been reminding each other of the amazing capacity of police and other groups to track people, including retrospectively, from signals coming from mobile telephones and computers, as well as from the evidence of credit card transactions. Likewise, surveillance technology from cameras in the streets, face-recognition software, and extraordinary police and intelligence access to government and private data bases, together with the use of algorithms and artificial intelligence, can identify suspects, determine those with whom they associate, and even generate computer-based predictions of those likely to breach the laws of some states. With the possibility of “people like us” being targets, there is a level of concern never shown over police actions among the underclasses.

I do not know how far some US state jurisdictions will go to prevent women having access to safe, legal abortion. But we have seen enough lawless cops and legal figures, many corrupted by the US Guantanamo system and by practical Trumpism, to fear the worst. Most of what is written about the technical capacity of modern police, law enforcement and intelligence systems, as well as the hardware and software in use by some private businesses, is accurate enough. It’s not theoretical, and is in present use without enough in the way of controls.

I spoke recently at a protest outside the US Embassy seeking the return to Australia of Julian Assange. About 10 AFP officers were present – about one for every two protesters, whose average age was about 65.

At some stage a drone rose in the air about 100m from the protesters and began filming what happened.

Protesters assumed that the drone, which seemed to have originated on Embassy premises, was for American rather than AFP purposes. I am not sure of that, but am quite certain that there was nothing in the behaviour of the small group capable of inspiring such a police presence, of creating concern about any menace to police or American life or limb, or of justifying the fabulous display of arms, armour, Tasers and batarangs they wore. The deployment might have explained, all by itself, why police are now too busy to attend or investigate burglaries or minor traffic accidents.

Someone remarked that the images would most likely be downloaded into American immigration data bases. It would probably result in our being denied visas if we applied for them – on the basis that anyone who gathers in opposition to US policy must be an enemy of the US. It is hard to dismiss such suggestions when agencies claim they cannot discuss their methods for fear of tipping off master criminals. We use the same excuses to prevent scrutiny of how we stop boat people.

Most powers now being routinely used by police, security agencies and a wide array of government bodies were granted under the theory that they were sadly necessary if we were to catch and stop terrorists. But there has been massive drift in the use of such powers by law enforcement for purposes for which they were never originally intended, and which parliament would not, originally, have allowed.

There are some still within the system who have never encountered a capacity they did not want, nor willingly accepted in practice any control or limitation placed upon its use. Controls are regarded as attempts to improperly hamstring investigators.

From time to time external inspectors discover widespread systemic breaches and require agencies to ‘fess up. This they do cheerfully, as if it were an inadvertent oversight, and as if they intend heads to roll as a result.

Some of the equipment now in routine use is dangerously unreliable, incorporating, as police forces in overseas jurisdictions admit, serious bias. The use of artificial intelligence, machine learning and large data bases is compounded by the fact that most operators – and even people on the procurement side – are unable to assess the accuracy of the equipment (because manufacturers claim trade secrets) and are, in any event, inadequately trained in its use and its limitations.

A recent report on the advent of new technologies in the justice system by a House of Lords committee, tabled on Thursday, is scathing about what it describes as “a new wild west, in which new technologies are developing at a pace that public awareness, government and legislation have not kept up with”. It quoted, approvingly, a witness who remarked that “we are not building criminal risk assessment tools to identify insider trading or who is going to commit the next kind of corporate fraud … we are looking at high volume data that is mostly about poor people.” Some of it is a bit like Robo-debt, and no official ever suffers for abuse, excess of jurisdiction, mistakes of judgment or misunderstanding of information. Except, of course, the poor people.

In Britain, at least, police are conscious of many of the problems, and there are lots of committees – probably too many – focused on standards, ethics and controls. These committees mean the public has some capacity to know what police are doing, and under what circumstances.

Talk about controls here, and some police will retort that they face the ultimate accountability of having to bring cases before the courts, where they face intense legal scrutiny. But their dodgy data hardly ever comes before the courts. First, much of it is predictive, even if it informs operations. Second, in Australia, police have more or less consciously decided to use the new technology mostly for intelligence purposes, without disclosing to courts or defendants how they discovered vital forensic facts. Thus, if they have used mobile phones, face and number-plate recognition technology, and information obtained from banks – mostly procured informally rather than by documented procedures – they can tell a suspect they have information placing them near the scene of a crime. As often as not, the broad-sweep way of finding suspects through AI allows them to find others at the same location, leaving suspects to infer they were dobbed in, or identified by witnesses.

When police are called on to justify their use of such powers, one can always expect someone to point to the “stranger-danger” crusade against paedophiles abusing children abroad, or accessing abuse material. These crimes are, by definition, so wicked that we are invited to think any investigative technique justifiable, even if the crusade leaves entirely neglected the 99 per cent of child sexual abuse victims abused much closer to home, usually by relatives or authority figures. That it is much harder to inspire a moral panic about leaks of government information is one reason why police do not tell you that there are often 100 such trawls of the movements and communications of journalists and public servants for every paedophile hunt.
