
Big Tech’s threat to democracy

Can the US government tolerate the existence of a rival within its territory?

Is he transparent enough? Photo: Andrew Harrer/Bloomberg via Getty Images


June 29, 2021   4 mins

The convenience of the smart home may be worth the price; that’s for each of us to decide. But to do so with open eyes, one has to understand what the price is. After all, you don’t pay a monthly fee for Alexa, or Google Home. The cost, then, is a subtle one: a slight psychological adjustment in which we are tipped a bit further into passivity and dependence. 

The Sleep Number Bed is typical of smart home devices, as Harvard Business School professor Shoshana Zuboff describes in The Age of Surveillance Capitalism. It comes with an app, of course, which you’ll need to install to get the full benefits. Benefits for whom? Well, to know that you would need to spend some time with the 16-page privacy policy that comes with the bed. There you’ll read about third-party sharing, analytics partners, targeted advertising, and much else. Meanwhile, the User Agreement specifies that the company can share or exploit your personal information even “after you deactivate or cancel … your Sleep Number account.” You are unilaterally informed that the firm does not honor “Do Not Track” notifications. By the way, the bed also transmits the audio signals in your bedroom. (I am not making this up.)

The business rationale for the smart home is to bring the intimate patterns of life into the fold of the surveillance economy, which has a one-way mirror quality. Increasingly, every aspect of our lives — our voices, our facial expressions, our political affiliations and intellectual predilections — is laid bare as data to be collected by companies which, for their own part, guard with military-grade secrecy the algorithms by which this information is used to determine the world that is presented to us, for example when we enter a search term, or in our news feeds. They are also in a position to determine our standing in the reputational economy. The credit rating agencies and insurance companies would like to know us more intimately; I suppose Alexa can help with that.

Allow me to offer a point of reference that comes from outside the tech debates, but can be brought to bear on them. Conservative legal scholars have long criticized a shift of power from Congress to the administrative state, which seeks to bypass legislation and rule by executive fiat, through administrative rulings. The appeal of this move is that it saves one the effort of persuading others, that is, the inconvenience of democratic politics. 

All of the arguments that conservatives make about the administrative state apply as well to this new thing, call it algorithmic governance, that operates through artificial intelligence developed in the private sector. It too is a form of power that is not required to give an account of itself, and is therefore insulated from democratic pressures. 

In machine learning, an array of variables is fed into deeply layered “neural nets” that simulate the binary, fire/don’t-fire synaptic connections of an animal brain. Vast amounts of data are used in a massively iterated (and, in some versions, unsupervised) training regimen. Because the strength of connections between logical nodes is highly plastic, just like that of neural pathways, the machine is trained by trial and error and arrives at something resembling knowledge of the world. The logic by which an AI reaches its conclusions is impossible to reconstruct even for those who built the underlying algorithms. We need to consider the significance of this in the light of our political traditions.
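The training loop sketched in that paragraph can be made concrete with a toy example. What follows is an illustration only, not the industrial-scale systems under discussion: a tiny two-layer network whose connection weights start out random and are adjusted by repeated trial and error until it learns the XOR function, which no single-layer network can represent. Every name and parameter here is invented for the sketch.

```python
import numpy as np

# Toy two-layer network learning XOR by trial and error.
# Connection strengths (weights) start random and are nudged on
# every pass to reduce the error, loosely mirroring the "plastic"
# synaptic connections described above.

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(5000):
    # Forward pass: inputs flow through the layered connections.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: each weight is adjusted in proportion to how
    # much it contributed to the error (cross-entropy gradient).
    grad_out = out - y
    grad_h = (grad_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ grad_out
    b2 -= lr * grad_out.sum(axis=0)
    W1 -= lr * X.T @ grad_h
    b1 -= lr * grad_h.sum(axis=0)

predictions = (out > 0.5).astype(int).ravel()
print(predictions.tolist())
```

Even in this toy, the trained weights in W1 and W2 carry no human-readable rationale: one can verify that the outputs are correct, but not read off a reasoned justification for them. That opacity, scaled up enormously, is the point at issue.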

When a court issues a decision, the judge writes an opinion in which he explains his reasoning. He grounds the decision in law, precedent, common sense, and principles that he feels obliged to articulate and defend. This is what transforms the decision from mere fiat into something that is politically legitimate, capable of securing the assent of a free people. It makes the difference between simple power and authority. One distinguishing feature of a modern, liberal society is that authority is supposed to have this rational quality to it — rather than appealing to, say, a special talent for priestly divination. This is our Enlightenment inheritance. It appears to be in a fragile state. With the inscrutable arcana of data science, a new priesthood peers into a hidden layer of reality that is revealed only by a self-taught AI program — the logic of which is beyond human knowing.

The feeling that one is ruled by a class of experts who cannot be addressed, who cannot be held to account, has surely contributed to populist anger. From the perspective of ordinary citizens, the usual distinction between government and “the private sector” starts to sound like a joke, given how the tech firms order our lives in far-reaching ways.

Google, Facebook, Twitter and Amazon have established portals that people feel they have to pass through to conduct the business of life, and to participate in the common life of the nation. Such bottlenecks are a natural consequence of “the network effect.” It was early innovations that allowed these firms to take up their positions. But it is not innovation, it is these established positions, and the ongoing control of the data they allow these firms to gather, that account for the unprecedented rents they are able to collect, as in a classic infrastructure monopoly. If those profits measure anything at all, it is the reach of a grid of surveillance that continues to spread and deepen. It is this grid’s basic lack of intelligibility that renders it politically unaccountable. Yet accountability is the very essence of representative government.

Mark Zuckerberg has said frankly that “In a lot of ways Facebook is more like a government than a traditional company.” If we take the man at his word, it would seem to raise the question: can the United States government tolerate the existence of a rival government within its territory? 

In 1776, Americans answered that question with a resounding “No!” and then fought a revolutionary war to make it so. The slogan of that war was “don’t tread on me.” This spirited insistence on self-rule expresses the psychic core of republicanism. As Senator Amy Klobuchar points out in her book Antitrust, the slogan was directed in particular at the British Crown’s grant of monopoly charters to corporations that controlled trade with the colonies.

Today, the platform firms appear to many as an imperial power. The fundamental question “who rules?” is pressed upon America once again.

This is an edited version of Matthew B. Crawford’s testimony before the US Senate Judiciary Committee on June 15th 2021.


Matthew B Crawford writes the substack Archedelia


Geoffrey Wilson
3 years ago

Excellent article. I wish democratic governments well in resisting the growing power of big tech. Historically this has succeeded by following the money: identifying how big tech makes its money and, where that rests on a public good like vast databases on individuals, taxing it, regulating it, nationalising it, or splitting it up. For a start, compel publication of algorithms to a regulatory authority.

Graham Stull
3 years ago

The EU has proposed a new law governing the use of AI, which I believe is the first in the world:
EUR-Lex – 52021PC0206 – EN – EUR-Lex (europa.eu)

Galeti Tavas
3 years ago

And?? So what should we do?

I think it is time for Constitutional Amendment Number 28!

One which says that as this is a nation Of, For, and By humans, no digitally created laws may be binding.

The article above says the entire AI/social media/tech complex is overlapping with government, and THUS we need laws ensuring that any law, in the broadest sense, MUST be created and enforced by duly appointed, or elected, human officials. By digital laws I include algorithms which generate and manage any content, or metadata, which is private by law or custom, and which control whatever influences human behaviour or choices, manipulates citizens, or otherwise causes them to behave in ways they would not have without these artificial algorithms – all of which should be limited.

Humans controlling Humans should be the law.

Kristof K
3 years ago
Reply to  Galeti Tavas

Great to find you back, Mr Artzen. I usually disagree with you, but not on this occasion!

Christian Filli
3 years ago

The combination of “algorithmic governance” and “surveillance capitalism” seems truly terrifying. But I’m not sure if “rivalry” between government and big tech is the real problem we are dealing with. I would argue that the deeper, more troubling question is what class of politician will be in bed with what class of technocrat. In a recent episode of The Economist Asks, I heard Amy Klobuchar make a big speech about breaking up these mega-companies, but when asked very directly what she thought about Twitter banning Trump from its platform, she unequivocally endorsed the move. So it’s all about what’s convenient and who gets to benefit. And I really wonder if citizens will have any choice but to play along (privacy policies be damned) …

Tom Krehbiel
3 years ago

Yes, indeed. To put it in Biblical terms, it all depends on whose ox is being gored, with no thought of a general prohibition on the goring of oxen.

Julian Rigg
3 years ago

Smartphone providers, together with social media companies, have the majority of the NOT so smart people under their spell. Addiction is now complete.
I have no idea where this is heading, but at ground level it’s irritating and sometimes dangerous, while on a freedom level it’s disturbing.

Peter LR
3 years ago

Would the repeal of Section 230, making platforms into publishers, make any difference to this situation? I know Trump was keen on this, but maybe political expediency is in action here, as Big Tech definitely seems blue in its leanings. And of course Facebook has just seen the antitrust case which was attempting to harness its reach dismissed.

Christopher Barclay
3 years ago

Big Tech is not a rival to the US government. It is a partner, which showed its worth when hiding the exposure of Biden corruption and pedophilia.

Karl Juhnke
3 years ago

Love Crawford’s no-nonsense appraisals of modern life. His background in mechanics and a hands-on mentality shines through. I read Surveillance Society by Foucault some years back, and though Foucault was a nutter and quite unlikable, his thoughts on the subject have been pretty spot on. The Swamp keeps evolving and shifting. There is a fightback from independent thinkers underway right now, with Bret Weinstein at the forefront. David and Goliath all over again.

Tom Krehbiel
3 years ago
Reply to  Karl Juhnke

Let’s hope it IS a fight between David and Goliath, considering who won the last time.

Hardee Hodges
3 years ago

In asking why these giants collect your data, we find it has value to advertisers who want to target ads, and to other organisations who can now sell a telephone number for a personal contact. We can fight back to a degree by not purchasing any product we see in web adverts. The appeal to advertisers is better targeting of consumers, and the web advert money proves effectiveness. Sadly, given a finite ability to advertise, that has led to a decline in the printed adverts which once sustained local newspapers.
Perhaps the hazard of the data lies in the ability to influence behavior by tailoring messages targeted to specific groups. This manipulation of emotion likely has huge political consequences in terms of greater fragmentation of society. Propaganda has always been effective, particularly on a less informed public. The answer is a well-informed public, which is not likely anytime soon.

Alan Thorpe
3 years ago

I wonder what George Washington would have thought of this. He didn’t agree with political parties, with good reason. This is what he said in his farewell address, and it is even more valid today:
“The alternate domination of one faction over another, sharpened by the spirit of revenge, natural to party dissension, which in different ages and countries has perpetrated the most horrid enormities, is itself a frightful despotism. But this leads at length to a more formal and permanent despotism. The disorders and miseries which result gradually incline the minds of men to seek security and repose in the absolute power of an individual; and sooner or later the chief of some prevailing faction, more able or more fortunate than his competitors, turns this disposition to the purposes of his own elevation, on the ruins of public liberty.”

pdrodolf
3 years ago

I noticed Apple was missing from his list of bad tech companies. Was this an oversight, or do they get a pass?

Prashant Kotak
3 years ago

“…The logic by which an AI reaches its conclusions is impossible to reconstruct even for those who built the underlying algorithms…”

An aside from the politics of the discussion:

Neural networks have a peculiar parallel to human expert decision making and knowledge, which is little discussed – certainly no non-techies I have discussed this with have ever fully understood the point; they will look at you blankly if you raise it. However, techies who think about the nature of knowledge, about algorithmic vs human decision making, and about the question of whether all human processing is ultimately algorithmic, will know instantly what I am talking about. Like neural nets, top-end human expertise equally cannot tell you exactly what it does to reach the conclusions it does. An illustration: there are a small number of traders who will make money in all circumstances and all kinds of markets. However, when their professed methods are turned into algorithms, the algorithms significantly underperform the trader. This happens even if the trader instigated the process of turning the methodologies into trading system algorithms. Investigation of the point at which the trader and the algorithm differ always yields an extra layer of “rules” – ‘oh yeah, if a is happening and b or c is looking like the case, then in that circumstance you should look at x and do y and z instead’ – and there is *always* another layer; it’s like pulling on a string that keeps coming.

This says several interesting things. One of them is that human experts seemingly cannot fully articulate the deeper levels of their expertise; they are in effect confabulating stories about what they do, even to themselves.

Neural nets appear to be a kind of multi-dimensional function approximator, bypassing mass-scale number crunching and instead arriving at sophisticated heuristic answers – in this at least they appear to be doing something similar to what human expertise does.

Prashant Kotak
3 years ago

Algorithmic technologies are not human scale and lawmakers without intimate knowledge of tech haven’t got a hope of drafting law quick enough in reaction – and by the time they react the landscape has altered and the tech has moved on, so they are mechanically set to remain behind the curve.

The problem stems from the unspoken human-scale assumptions that have underpinned all human governance prior to the rise of ubiquitous computation; these assumptions are being undermined, ever faster, by algorithmic technologies which don’t face the same biological and physical limits as humans and human geography – speed of processing, unlimited scaling potential (both up and down), perfect replication, high-speed transmission, unbreakable encryption. Traditional legislative mechanisms (short of coercion) cannot hope to compete, because algorithmic engineering solutions to get round or even subvert any frameworks of governance legislators come up with can always be created.

The first step towards any successful model of governance of the digital world is to recognise and acknowledge this. Thereafter, you are faced with the much bigger truth, unpalatable as it may be to this moment of humanity: technology will not alter to accommodate human societies; human societies must decide either to alter themselves in reaction to fit around technological advance, or junk modernity and revert back to antiquity. It’s a binary, there isn’t really a middle ground, no matter how much you may wish for it.

Paul LoSchiavo
2 years ago
Reply to  Prashant Kotak

I suspect it is this inevitable fork in the road that inspired author Frank Herbert to imagine the adoption of a feudal system reflective of historical structures, and that establishes the rationale for the rejection of “thinking machines” made in the image of the human mind. In the “Dune” universe, humanity chooses to expand the potential of the human mind to fill the void resulting from its refusal to use machine intelligence. Failing to pursue, or suppressing, the co-evolution of human thinking alongside its machine equivalent will possibly lead to a similar inflection point in humanity’s future.