The corporate-government power nexus

Jun 15, 2023

Mass surveillance and manipulation should not be allowed to become the new normal.

The Robodebt controversy directly affected more than 470,000 people, some of whom reportedly took their own lives as a result of being hounded to repay debts to the Australian Government that they had never incurred. The scheme, first implemented while Christian Porter (later Attorney-General) was Minister for Social Services, was found to be unlawful by the Federal Court in late 2019. Despite this ruling, which ultimately saw the Australian Government repay $1.2 billion to affected citizens, no formal apology was ever forthcoming from the Turnbull or Morrison Governments, or from the then Minister for Government Services, Stuart Robert. In her book Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor (2018), political scientist Virginia Eubanks documented how similar practices were implemented by conservative state governments in the United States, and how easily they could have been avoided.

Nineteenth-century critics of rapid technological change like Mary Shelley and Johann Wolfgang von Goethe wrote fictional allegories warning of the unintended negative consequences that could arise from developing technology that is not fully understood (Frankenstein's monster) and from making deals with the Devil to gain superhuman knowledge (the Faustian bargain). In the twentieth century an additional concern arose: modern technologies have become so powerful and widespread that they determine the shape of human lives, rather than the reverse (autonomous technology).

Contrary to the widely held view of most politicians and political parties today, these criticisms suggest that technological development does not necessarily lead to social progress. In the case of unintended negative consequences, the harmful outcomes are not the result of purposeful action. They are either an unexpected detrimental by-product of an action intended to have a positive outcome, or a perverse effect contrary to the one originally intended. For example, anti-terror legislation introduced to protect the public ends up being used to monitor and prosecute political dissidents. Another example is successive governments' failure to regulate social media and Big Tech, which has handed monopolistic control of important communication and commercial platforms to private for-profit interests.

In the case of the Faustian bargain, the more technical control that is exercised over society and nature, the more it results in social alienation and environmental degradation. For example, the constant surveillance and monitoring of workers’ performance leads to chronic stress and anxiety, while unrestrained coastal development increases the risk and consequences of storm damage to private property and ecosystems.

In the case of autonomous technology, as certain technologies become larger and more widespread and their proponents more politically and economically powerful, they become resistant to change and preoccupied with controlling their environments and how people interact with them. For example, the international fossil fuel industry is so highly capitalised and integrated into all aspects of our societies that it has effectively delayed and disrupted action on human-induced climate change, while the international finance industry has so effectively colonised the policy and regulatory functions of government that the gap between rich and poor has never been greater.

Over the last several years, there has been a huge increase in data-matching programs by governments and corporations as a tool to automatically assess the applications and submissions of welfare beneficiaries, clients and customers. These automated programs are used by a growing number of government agencies responsible for health, welfare, tax, visas, policing and veterans’ affairs. Such expert systems, colloquially known as ‘Artificial Intelligence’ (AI), are being used to conduct complex interpretive tasks that had previously been undertaken exclusively by human beings. These systems are almost invariably developed by private companies and provided as contract services to governments, despite legitimate ethical and legal concerns about government agencies handing over personal information to commercial actors.

Neoliberal and neoconservative governments and corporations appear to find these techniques particularly attractive, primarily because they save them time and money. However, the ability of machines using algorithms to assess people's eligibility for a government benefit, or their compliance with the law, is extraordinarily limited. All of these systems rely on complex ways of drawing inferences from available information. In other words, they rely on forms of logical induction not just to make generalisations, but to draw conclusions about individual cases.
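The Robodebt scheme shows how badly such inference can go wrong. Its core method was to take a person's annual income as reported to the tax office and average it evenly across 26 fortnights, treating that average as what the person "really" earned in each fortnight they received benefits. The sketch below illustrates the flaw with hypothetical figures; the income-free threshold and taper rate are invented for illustration and are not the actual Centrelink parameters.

```python
FORTNIGHTS_PER_YEAR = 26
INCOME_FREE_AREA = 150.0  # hypothetical fortnightly threshold
TAPER_RATE = 0.5          # hypothetical: 50c of benefit withdrawn per $ earned over the threshold


def reduction(fortnightly_income: float) -> float:
    """Benefit withdrawn in one fortnight, given income earned that fortnight."""
    return max(0.0, fortnightly_income - INCOME_FREE_AREA) * TAPER_RATE


def overpayment(declared: float, assumed: float) -> float:
    """Alleged overpayment in one fortnight if the person 'really' earned
    `assumed` rather than the `declared` amount."""
    return reduction(assumed) - reduction(declared)


# A casual worker earns $12,000 across 6 fortnights of work, then is
# unemployed and on benefits for 20 fortnights, truthfully declaring $0.
annual_income = 12_000.0
benefit_fortnights = 20
declared_per_fortnight = 0.0

# Correct, fortnight-by-fortnight assessment: income in each benefit
# fortnight really was zero, so there is no debt at all.
true_debt = sum(overpayment(declared_per_fortnight, 0.0)
                for _ in range(benefit_fortnights))

# Robodebt-style assessment: smear the annual figure evenly across the
# year and treat the average as the income earned in every fortnight.
averaged = annual_income / FORTNIGHTS_PER_YEAR  # ~$461.54
alleged_debt = sum(overpayment(declared_per_fortnight, averaged)
                   for _ in range(benefit_fortnights))

print(f"true debt: ${true_debt:.2f}, alleged debt: ${alleged_debt:.2f}")
```

Under these assumed parameters the averaging method manufactures a debt of several thousand dollars for a person who declared every dollar correctly. The inference fails precisely for the people the system assesses most often: those whose earnings are irregular.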

Philosophers have long understood the problem of induction: the accumulation of instances of a particular event or process under certain conditions does not guarantee that it will always proceed in the same manner in future. This is a particularly pertinent issue in the physical sciences, where inferential reasoning from masses of experimental or observational data is an essential component of scientific research. But all good researchers know that human judgement and interpretation are also essential in making sense of data, in deciding what is important and what contextual factors must be taken into account, and in determining the circumstances in which it is legitimate to draw conclusions or revise approaches.

Recent revelations about the extent to which mass surveillance and political manipulation are routinely exercised by nation states and transnational corporations have demonstrated that the concerns raised by pessimistic prophets of technology like George Orwell and Aldous Huxley were completely justified. For neoliberals like Friedrich Hayek and Milton Friedman and their many followers in government, business and academia, however, the liberty to make a buck with no ethical constraints makes democratic oversight the enemy of capital. In an era of mass surveillance and consumerist docility, it's no wonder that TV series like 'Big Brother', 'Survivor' and 'Alone' are so popular in so many countries.

Nazi, fascist and communist regimes were condemned by conservatives, liberals and progressives in the twentieth century for spying on their citizens, limiting their freedoms and punishing their political enemies. The lessons of twentieth century politics are that authoritarian measures undermine democratic norms and foster intolerance and hatred. But this is precisely the kind of society that we are becoming as a result of unchecked executive power within our governments and corporate capture of our public institutions, including our universities.

The technocratic biases of contemporary governments and corporations have led them to the erroneous belief that expert systems are a legitimate substitute for human judgement, because they are supposedly ‘intelligent’. However, intelligence is far more than the ability to make inferences from masses of data. Nevertheless, this conceit seems to be impervious to such knowledge-based criticisms. Consequently, governments in the United States, Australia, and elsewhere have embraced this technology, with often devastating consequences for the citizens affected.

One positive outcome of the Robodebt debacle was that it revealed the dangers of relying on automated decision-making in cases where human judgement is clearly required. Implementing a law similar to the EU’s General Data Protection Regulation in Australia would go some way to preventing government decisions of any kind from being automated in this way. However, the use of technology by elites to promote a constant state of fear and anxiety amongst the working population to ensure its obedience and compliance will require far more than changes in the law to arrest and reverse.


Subscribe to John Menadue's Newsletter
