
The myth of online misinformation

Our moral panic plays into the hands of Big Tech

He's telling you to worry about fake news, so that he can control you. Credit: MANDEL NGAN/AFP via Getty Images

He's telling you to worry about fake news, so that he can control you. Credit: MANDEL NGAN/AFP via Getty Images


March 17, 2022

Just before the Russian invasion of Ukraine consumed the media, New York Times columnist Jay Caspian Kang and Substacker Matthew Yglesias published near-simultaneous critiques of the notions of “disinformation” and “misinformation”. This convergence among prominent liberals was significant. These and related concepts like “fake news” have shaped press coverage of a range of issues since the presidential contest of 2016 and have legitimised a new, censorious speech regime on tech platforms. But they usually go unquestioned on the Left.

Kang and Yglesias both consider the possibility that “misinformation” and “disinformation” are misleading frameworks for making sense of the world today. Indeed, Yglesias argues that the “misinformation panic could, over time, make discerning the actual truth harder”. This is because “misinformation” talk seems to lead inexorably to the suppression and censoring of dissent.

But Yglesias’s title — “The ‘misinformation problem’ seems like misinformation” — hints at a more paradoxical possibility: what if these concepts are the result of a deliberate and coordinated attempt to mislead the public?

In an earlier critique of the “misinformation” and “disinformation” framework, cited by Kang, tech journalist Joe Bernstein argued that the broad acceptance of these ideas reflects the rising influence of what he calls “Big Disinfo”: “a new field of knowledge production that emerged during the Trump years at the juncture of media, academia and policy research.” Its ostensibly neutral agenda offers ideological cover for centrist and liberal politicians by aligning them with values like objectivity, science, and truth, while defaming their opponents as conspiracy theorists.

Bernstein argues that Big Disinfo covertly serves the interests of the major tech platforms themselves, whose profit model relies on digital ads. This might seem counterintuitive, since the misinformation panic helped generate the “techlash” that tarnished Silicon Valley’s previously benign reputation among liberals and centrists. But the notion that online content is highly effective at changing people’s views is crucial to the sales pitch companies such as Meta (formerly Facebook) make to advertisers and investors. Hence, for Bernstein, the tech industry’s embrace of Big Disinfo’s claims is “a superficial strategy to avoid deeper questions” — and also valorises tech platforms as guardians of the information ecosystem.

Alongside journalists like Bernstein, Yglesias, and Kang, some academics are beginning to question the prevalent account of misinformation. Social Engineering, a new book by communications scholars Robert Gehl and Sean Lawson, helpfully reorients the discussion about these issues by offering deeper historical context and a new conceptual framework.

Terms like “misinformation”, “disinformation”, and “fake news”, Gehl and Lawson argue, fail “to grasp the complexity of manipulative communication” because they “reduce everything to a stark true/false binary” and thus disregard “the blurry lines between fact and falsehood”. Moreover, these terms imply a radical discontinuity between pre and post-internet worlds: they cast the former as a halcyon realm of clear, accurate, truthful communications, overseen by benevolent eminences like Walter Cronkite, while depicting the latter as a cesspit of lies and delusions. Bernstein parodies this view: “In the beginning, there were ABC, NBC, and CBS, and they were good”. This short-sighted perspective disregards the widespread concerns about propaganda that prevailed when network TV was at the height of its influence, which recent anxieties often echo.

The alternative terminology Gehl and Lawson propose is “social engineering”, a term that, as they show, has a two-stage history. The first widespread use of this phrase began in the early 20th century. Progressive reformers began to envision the possibility of employing mass communications technologies to reshape thought and behaviour on a vast scale. Their vision informed the coevolution of state propaganda and private-sector public relations, advertising, and marketing. Initially an optimistic project of benevolent technocratic rule, mass social engineering fell into intellectual disrepute by the late 20th century, although industries such as advertising and PR never abandoned its basic premises.

In the Seventies and Eighties — the same era when the older, top-down project of social engineering was being discredited as elitist and paternalistic — a new, bottom-up understanding of the same concept took hold among a loose cadre of hackers and “phone phreaks”. As Gehl and Lawson document, these communications outlaws developed an array of personalised techniques, such as impersonating clients and obtaining data under false pretences, to gain illicit access to technological systems belonging to phone companies, banks, government agencies, and other entities.

Applying the term “social engineering” to these sorts of tricks may seem grandiose, but Gehl and Lawson argue that they are continuous with the older technocratic enterprise: both types of social engineers “hide their true purposes, use sociotechnical knowledge to control others, and seek to manipulate others into doing things”.

The “hacker social engineers” of the past few decades have an easier time proving the efficacy of their techniques than mass social engineers, not least because their aims are typically more modest and practical. Consider an infamous incident from the 2016 election, part of the larger sequence of events that prompted the misinformation panic. The phishing scheme targeting Hillary Clinton’s campaign chairman, John Podesta — a classic act of hacker social engineering — was a success in that it achieved the limited practical goal of gaining access to his email account. Conversely, the attempts by the Trump campaign and its allies at mass social engineering (including via the publication of Podesta’s hacked emails) had no clear effect on the outcome of the 2016 election. There were too many other causal factors at work.

It’s not surprising, given the demonstrable successes of hacker social engineers at manipulating thought and behaviour, that larger entities have attempted to scale up their personalised techniques. This is how Gehl and Lawson recontextualise two of the most notorious alleged cases of “misinformation” from the 2016 period: the political consulting firm Cambridge Analytica and Russia’s Internet Research Agency. The first claimed, dubiously, to be able to perform “psychographic microtargeting” of voters based on data obtained under false pretences; the second deployed hacker techniques (like phishing) as well as paid trolls and fake accounts. Both “demonstrated the same ambitions of the older mass social engineers, but… also used the more targeted interpersonal techniques developed by hackers”.

Gehl and Lawson coin a term for these contemporary efforts that fuse mass social engineering with more personalised hacker methods: “masspersonal social engineering”. Although the aim of their book is to document the emergence of masspersonal social engineering, they concede that it’s unclear whether it has influenced thought and behaviour enough to, for instance, alter the results of elections. However, they caution that “future masspersonal social engineering may better implement the successful model of interpersonal hacker social engineering on a large scale”.

But they follow this warning with a more intriguing observation: many “sociotechnical developments that did not have particularly great immediate effects… are now recognised as having been vitally important despite (or perhaps because of) their failures.” One way to interpret this is that while the direct impacts of Russian hackers, Cambridge Analytica, and “fake news” have been modest, they have had a major indirect effect. They furnished the pretext for the misinformation panic, which offered embattled corporate media outlets, centrist politicians, and tech platforms a way of restoring their reputations and asserting more direct control over the information sphere.

As Gehl and Lawson note, “those in power are the ones in a position to wield the… capacities of social engineers”. This is why, historically, “publicity, propaganda, public relations, and consent engineering all have deep ties to the national security state”. In the face of inchoate populist tumult, declaring war against “misinformation” has enabled establishment institutions to shift legitimate anxieties about the manipulative use of media technologies towards upstart entities that, however unscrupulous, cannot claim anything like the influence of state-aligned media organisations or, for that matter, the tech platforms themselves.

Behind all talk of “misinformation” and “disinformation” is the tacit assumption that these terms designate exceptions to an otherwise pristine media landscape. Gehl and Lawson’s alternative framework points to the messier reality: a contest between “social engineers” (many of them only aspirational) operating at a variety of different scales and targeting populations, individuals, and everything in between. The misinformation panic, which has obscured this reality over the past half decade, is itself one of the most effective social engineering campaigns of our time.


Geoff Shullenberger is managing editor of Compact.


Martin Bollis
2 years ago

I think the article conflates several quite separate issues.

There is misinformation – lies. The type of pure nonsense Russia appears to be feeding its population. That may need a censorship response.

There is, for want of a better term, nudge reporting. “The riots are mostly peaceful.” You are led in a direction of support for something by what is emphasised and what is downplayed.

I was abroad at the time of the Mueller hearings and watched back to back CNN and Fox coverage in hotels. I don’t think either told any actual lies, but the emphasis and downplay gave an impression of two completely different hearings.

Then there is social engineering, like TV adverts featuring couples that are always mixed race or gay. Not lying, not offensive … but clearly not representative and therefore leading to the “what am I being fed,” niggle.

It’s much more complex and nuanced than suggested in this article.

Hardee Hodges
2 years ago
Reply to  Martin Bollis

I can recall watching CSPAN testimony and later watching the reporting on the testimony. The omissions and selective editing changed the testimony directly. Just as in the reporting of Trump’s “both sides” comments about the Charlottesville riot, where the omission was critical in asserting he supported one group. Had you not heard the words yourself, you were being purposely misled. I can understand that is OK for partisans, but not for news reporters. Such things result in mistrust, rightly so. It became even worse over the pandemic. Asking questions could get you kicked from a platform. What was misinformation at one point in time became truth later. All in the name of protecting the public.

Matt Hindman
2 years ago

I think the misinformation panic is going to start to crash in the near future. You cannot keep shamelessly lying over and over and never have people lose faith in you. Look at media trust polling and viewership. They are in free fall. Major newspapers are having trouble making ends meet and places like CNN and MSNBC are hemorrhaging viewers. Almost everyone I know no longer trusts the news and just a couple of years ago the opposite was true. At this point I think they know the “misinformation” line is starting to fall apart, so they are doubling down in panic. Why else do you think they have such a hard on for censorship right now?

Rasmus Fogh
2 years ago
Reply to  Matt Hindman

Loss of media trust is not a reaction against a ‘misinformation panic’ but a consequence of misinformation. Foreign intelligence agencies and various subversives do not need to convince people of their line. It is more than enough to sow dissension and convince people that nothing can be trusted. Many Russians apparently believe that the Russian army is only attacking military bases, and all the horrific images come from the Ukrainians shelling their own cities. That is not because they are stupid, but because they are used to an environment so full of misinformation that this explanation is no stranger than the alternatives and there is no way of getting to the truth.

Just a question: if ‘almost everyone you know’ no longer trusts the news, what do they trust? Do you believe that your side is reliable and misinformation-free and only the others are lying?

Johann Strauss
2 years ago
Reply to  Rasmus Fogh

Seems to me you have fallen hook, line and sinker for misinformation of the official kind promulgated by the BBC and other MSM outlets. Indeed, in every one of our exchanges you have automatically adopted the “narrative” du jour and followed “the Experts™”.
All censorship is bad. We are all adults, and the only way to figure out the truth or get anywhere near it is to read all sides of an issue. Indeed, a Ukrainian who works for me told me that the only way she can figure out what’s going on is to listen to both Ukrainian and Russian news, because the truth is generally somewhere in the middle.

Warren T
2 years ago
Reply to  Johann Strauss

Not sure that works either. If one side is telling you one completely false narrative, and the other side is telling you the complete opposite false narrative, then how can one arrive at a truth?

Johann Strauss
2 years ago
Reply to  Warren T

Well I would say that there is generally a degree of truth from all sides. But it requires critical thinking to figure things out, as well as a willingness to look at actual data rather than just ignoring it.
As an example, consider the COVID vaccines. The current UKHSA data indicate that the vaccine becomes negatively effective around 6 weeks after the third booster. So if one party insists that one has to be boosted, yet the data show a minimal benefit for a short period of time, it would seems smart to think about whether or not the risk of getting a booster is worth the temporary benefit. Likewise, if it’s becoming abundantly clear from databases such as the US VAERS and the UK Yellow book that the number of serious adverse events reported following COVID vaccination is orders of magnitude higher than other vaccines combined, it should at least give one pause to think and take the effort to investigate oneself rather than just relying on the official narrative du jours. Similarly when officials call the vaccines safe and effective, and refer to myocarditis with hospitalization as mild. Mild relative to what? Death? It’s evident that any medical admission for a week or more is by definition severe and not mild.
The same thing can be said about the tragic situation in Ukraine. On the one hand we have Biden declaring that Putin is war criminal (which is not exactly conducive to some sort of negotiate settlement), yet the UN reported on Monday that there were a total of 536 civilian deaths up to that time (how they can be so accurate is beyond me but still), and at the same time in the same articles it’s reported that over 6000 Russian troops have been killed. Clearly this information does not appear to be entirely self-consistent. We then have the MSM reporting how badly the Russians are doing and how they are stalled, etc. etc…, and yet a daily look at the map of advancing Russian forces would suggest that slowly but surely they have almost completely encircled the 60,000 odd Ukrainian army. In other words, it is evident that both Ukrainian and Russian propaganda, with the former being swallowed whole and regurgitated by the western press and politicians, while the latter is completely dismissed as misinformation, represents a good deal of cognitive dissonance. Recall, the White House and Pentagon are still denying the existence of Ukrainian biolabs working on dangerous pathogens and funded by the US, dismissing it as Russian misinformation, even though the Deputy Secretary of State under oath testified only last week at a Senate hearing, to the existence of such labs and that the US was very worried that the materials would fall into Russian hands. Clearly something is off here, and in this instance the official narrative that we are being sold Russian misinformation and delving into conspiracy theories is entirely false, given that on top of everything else all the relevant information is available in the US in the public domain.

Zirrus VanDevere
2 years ago
Reply to  Johann Strauss

Thank you for your rational take on this, it’s amazing to me how often the comment section here is more thoughtful and nuanced than some of the articles.

Rasmus Fogh
2 years ago
Reply to  Johann Strauss

My critical thinking and looking at the data tell me that the side effects of vaccination are vastly smaller than the side effects of getting COVID unvaccinated, and that vaccination is very good at saving you from dying of COVID. If you agree on that, we can discuss the vaccination side effects and the limits of what they can do. If you do not agree, we should deal with those questions before we start analysing VAERS.

As for Ukraine there is a lot of nonsense floating around, and a lot of unproved claims that you would be foolish to trust at face value no matter where it is printed. I’d say it is proven that the Russians are shelling populated cities and causing a lot of damage. It is an open question whether they are deliberately targeting hospitals and refuges (as they did in Syria) – which would be a war crime – or this is just what happens in war. But you give the impression of being very much in favour of the invasion and against the efforts to counter it. What are you claiming, exactly, so we can discuss something more important than unknowable Russian losses?

Last edited 2 years ago by Rasmus Fogh
Warren T
2 years ago
Reply to  Rasmus Fogh

That is the great question and the reason for this situation in the first place. Who knows who they can trust anymore? And when no one can trust any source of information, then chaos and anarchy prevail. Only during chaotic and confused times can governments take total control.

Laura Creighton
2 years ago

Moreover, these terms imply a radical discontinuity between pre and post-internet worlds: they cast the former as a halcyon realm of clear, accurate, truthful communications, overseen by benevolent eminences like Walter Cronkite, while depicting the latter as a cesspit of lies and delusions. Bernstein parodies this view: “In the beginning, there were ABC, NBC, and CBS, and they were good”. This short-sighted perspective disregards the widespread concerns about propaganda that prevailed when network TV was at the height of its influence, which recent anxieties often echo.

But that is the whole point. In the pre-internet world, we had journalists. They had a mostly-shared work-culture with standards and ethics, and new members of the profession were socialised into the culture. Everybody carried the image of somebody like Walter Cronkite as ‘how our trade is done when it is being done well’. And everybody was trained in the evils of the profession – what is slander, what is libel – and also things like ‘what we cannot write about because it will make our advertisers angry’ and ‘what we cannot show on tv’ and so on and so forth. It wasn’t halcyon, but the concerns about propaganda were coming from a class that didn’t want to be used by the powerful to spread lies and falsehoods, and which tried to police itself from those sell-outs who were perfectly happy to spread lies for cash, status and access. Indeed, while these concerns were raised, what I remember most is the frustration about the stories that could never be told, because the advertisers wouldn’t stand for it. ‘Lying by omission’ was believed to be a greater problem than telling lies, because the told lies could be refuted.
Fast forward to the internet age. A large number of people finally got a chance to say those things that could not be said before. And while this started out among a journalist class that still held to the old ethics and valued the truth, it wasn’t too long before people started pushing the boundaries, and found that they weren’t there. The journalism class no longer could police itself, nor distinguish itself from the new profession, that of ‘influencer’. And influencers had no professional ethics at all.
They puffed products they claimed to use, but never did. They had no editors to please acting as gatekeepers on what they wrote or did. Their colleagues were sharing tips on how to photoshop images and edit videos in order to more attractively present untruths, and were lauded for their tech savvy. Moreover, journalistic bias — which never can be eliminated completely — was confused with the planting of deliberate falsehoods. If you cannot be perfectly unbiased, then why try to be unbiased at all?
When people took them to task for this – ‘Aren’t you just a shill?’ – the influencers pled a higher calling. They were holy activists. Somehow, even the influencers who were strictly about product placement managed to earn a halo.
The small town newspapers were finished. The small town advertisers that used to advertise in such papers had no voice at all. Brands advertise directly to consumers, and bigger is better. For a while, people thought that the solution to this problem was going to be subscription based journalism, but alas it turned out that you can make more money selling ‘lies our readership likes’ to your subscribers than actually telling them the truth.
But there is a real hunger for truth out there. The journalism trade — and I suspect it will be a trade, and not a profession — may make a comeback. Alas, it looks like it is going to do so on the ashes of ‘professionalism’ as it exposes just how corrupted everybody else’s professional ethics has become.
But in order to get this to work, we need to change focus from ‘misinformation’ – a focus on statements, to be evaluated on whether they are true or false – to slander, libel, and violations of the various truth-in-advertising acts: legally defined criminal activity. (Some of which may need some adjusting for the internet age.) The problem isn’t in the mistaken things that we all believe. The problem is in the lying.

Last edited 2 years ago by Laura Creighton
Hardee Hodges
2 years ago

I get my best misinformation from platforms like this one and an excess of Substack writers. The latter often produce considerable analyses of data along with that data. Comments often produce new insights into fairly complex topics. Peer review in near real time. I applaud these new journalists.

Laura Creighton
2 years ago
Reply to  Hardee Hodges

Yes indeed, Substack seems to be where those who would be journalists and not influencers are finding each other.

Andrew McDonald
2 years ago

And guess what? Info and opinion that you have to pay for turns out to be better (more reliable, more interesting) than the rubbish you get for free. Who knew?

Saul D
2 years ago

Social manipulation is pervasive historically. The Black Legend about Spain, ‘Let them eat cake’, South Sea Bubble, tulip mania, patent medicines, ‘you have nothing to lose but your chains’, Scientology, boiler-room sales, WMD, Ponzi schemes, the Steele Dossier. Even appealing to science as fact goes wrong – miasma theory, eugenics, lobotomies, stress ulcers, diesel pollution, CJD.
In a world of mass information where individuals get targeted personally by Nigerian princes, phishing emails, fake charity appeals and behavioural nudges, it’s unlikely that ‘controlling the information sphere’ is possible or would work at all. For instance, despite being blocked by the media, it only took a month for about 50% of the US population to become aware of the phrase “Let’s go Brandon” – a phenomenal rate of spread that took place outside MSM.
Instead of panicking about misinformation (some of which ends up as correct), teaching skepticism, while not blocking any speech, might be a better long term treatment to encourage caution and doubt about what might be true to displace misplaced certainties.

Jon Game
2 years ago

Governments across the world have successfully weaponised the cries of “misinformation”, usually through their state broadcaster via regulators, or simply the size of their advertising budgets. They have been so successful that, despite evidence to the contrary (see the article in the Spectator today comparing Scotland and England), most people still believe what they have been told about mask wearing and social distancing. Misinformation as a weapon is also seen clearly in the pharmaceutical industry, where the massive statin industry squashes any evidence that shows they have no effect on all-cause mortality. This track record is being continued in their drummed-up compulsion to vaccinate those who don’t need to be, including world-class tennis players.

Rasmus Fogh
2 years ago
Reply to  Jon Game

most people still believe what they have been told about mask wearing and social distancing

Yes, I am one of them. As it looks from my side, it is you and your friends who have fallen lock, stock and barrel for a misinformation campaign.

This is not to say ‘you too!’ – just to point out that if we are to get anywhere on how to handle misinformation, we cannot each start from the premise that our own information is obviously correct and it is all the others who are lying.

Johann Strauss
2 years ago
Reply to  Rasmus Fogh

That is precisely why one has to look at all incoming data with a critical mind and always be skeptical. Something you have demonstrated time and time again that you are not. As for masks you have been brainwashed because you have failed to think critically about their use, how best to use them, when they should be used, in which environment/situation are they most effective and in which are they ineffective, how long do they last for, do they prevent egress better than ingress, and in the context of egress, how good are they when they leak like a sieve on the sides, tops and bottom, etc. etc. etc.

Rasmus Fogh
2 years ago
Reply to  Johann Strauss

You are making my point for me. I have demonstrated time and again that I do not agree with your judgements or your assumptions. That is not the same as being uncritical.

Johann Strauss
2 years ago
Reply to  Rasmus Fogh

You are dead wrong and unfortunately are so brainwashed you can’t see it. I have no issue with anybody disagreeing with me. I do have a problem when all they can do is parrot “The Narrative” without ever bothering to look at the actual data in the real world. Sorry but you are intellectually lazy.

Hardee Hodges
2 years ago
Reply to  Johann Strauss

I think Ian Miller has established via evidence that masks are mostly useless in any protective sense.

Linda Hutchinson
2 years ago
Reply to  Johann Strauss

It seems to me, reading this little discussion, that you are assuming that, because Mr Fogh disagrees with you, he is brain-washed; this is an insulting comment and implies that Mr Fogh has not thought about this. I am sure that he has looked into the evidence and the data and from that made a rational decision, not based on anti-establishment and anti-expert ideology. Because one comes down on the side of the accepted “narrative”, it does not mean that one has not looked into the detail; a knee-jerk opposition to this “narrative” is as bad as accepting it without investigating (as far as one can).

Johann Strauss
2 years ago

Under normal circumstances you would be right. But in this instance I have had so many exchanges on matters COVID (and more recently Ukraine) with Mr Fogh that I would have to disagree with you. His line of argument is always that you haven’t cited any references or links, or that I’m living in some alternate universe (never mind that many of the articles and interviews on Unherd, as well as readers’ comments, happen to agree with what I’m saying). One then gives him the links to official data (whether UKHSA, VAERS, yellow book, etc.) and he comes back that it’s too much. So yes, every response of his just parrots the narrative du jour, and when that narrative changes, he changes. Only yesterday or the day before he denied the existence of the biolabs in Ukraine dealing with very nasty pathogens. How can one so blindly dismiss something as conspiracy theories and Russian propaganda, when the Deputy Secretary of State, Victoria Nuland, testified last week under oath at a Senate hearing that yes indeed there were biolabs in Ukraine dealing with nasty pathogens and funded by the US. Marco Rubio, who was questioning her, was not expecting that answer – he was expecting her to say it was all Russian disinformation. But I guess when you are testifying under oath, under penalty of perjury and a hefty jail sentence if caught lying, one might just opt for the truth.
That is precisely why it’s so important that there be no censorship of any kind because it is all too easy for the powerful few to censor information that they don’t like or doesn’t further their narrative. And that’s true whether it involves matters COVID, Ukraine, climate change, energy production, etc….

Elaine Giedrys-Leeper
2 years ago
Reply to  Johann Strauss

” … there were biolabs in Ukraine dealing with nasty pathogens and funded by the US. ”

Yes there are bio labs in the Ukraine, just as there are in Russia, China, the UK, the USA and many other nations all over the planet.
Allegedly the US “has invested approximately $200 million in Ukraine since 2005, supporting 46 Ukrainian laboratories, health facilities, and diagnostic sites” – Defense Department fact sheet – part of the Biological Threat Reduction Program (viewable here: https://www.state.gov/wp-content/uploads/2019/02/05-829-Ukraine-Weapons.pdf ), designed to provide technical support to the Ukrainian Ministry of Health since 2005 to improve public health laboratories whose mission is analogous to the U.S. Centers for Disease Control and Prevention.
Examples of their research (presented at the Chemical and Biological Defence Science and Technology Conference in 2017)

  • laboratory efforts at improving the diagnosis, surveillance and prevention of ASF in wild boar populations
  • a program to monitor certain soft ticks, which transmit ASF to pigs
  • methods to trace tularemia and anthrax in animals such as wild boars.

I suppose if you were a Russian pig farmer all this ASF research would look highly suspect. If you were a Ukrainian pig farmer you would be over the moon that the USA was funding such research to reduce the risk of you having to cull all your pigs next year.
I have never been involved in a proper audit of these labs and I would guess you haven’t been either so what research has actually been going on in them is pure speculation.

One fact we do know for sure is that Putin and his minions have access to a number of different chemical agents and have no compunction about using them on foreign soil to kill people, regardless of any collateral damage.

Rasmus Fogh
2 years ago
Reply to  Johann Strauss

If you want to convince anyone, you need to change your style. Of course you may not want to. What you are presenting is so steeped in the assumption that you are obviously right that it is useless as an argument. Either people already agree with you, or they look at your argument, find the first three unsupported assumptions you are making that they do not agree with, and dismiss the whole thing. To convince, you need to show clearly what information you are proposing, why they should believe it, and how it supports your argument. People like Saul D can do it, but you do not seem to be even trying.

Item: links to official data. It is not enough to say ‘there is a number on this web site that proves I am right’. To be convinced I need to know which number, what it actually means, and how it was calculated, so I can check for possible pitfalls. You refuse to tell me even which page to look at. Why, then, should I listen to you?

Item: Biolabs. The Russian case is that the US supports biological weapons research in Ukraine, which is illegal, aggressive, and extremely dangerous. If that were true it would be pretty serious, but I am convinced the Russians are lying. It is openly admitted that the US has supported some kind of biological research labs in Ukraine that involve dangerous pathogens. But then, every hospital in the world has a lab like that. If you want to argue that this could justify an invasion, you need to argue that this is indeed dangerous weapons research, and tell us something about what goes on there and why we should believe you. Instead you just say "See! There are biolabs! Victoria Nuland admitted there are biolabs! That proves I am right!" – without even getting to the actual point, which is what supposedly happens in those labs.

To you this may well be convincing – but that is because you are so convinced already that you do not really need proof. If you are already certain of the answer, any old piece of data can be used to confirm it. But for people who do not believe your answer, you need to come with something strong and solid enough to change their mind. And not just scold them because they are too lazy to see that you are obviously right.

Last edited 2 years ago by Rasmus Fogh
Rasmus Fogh
2 years ago
Reply to  Johann Strauss

Well, Johann, if you are game, this is a good test example for where we both get our data from.

The Russian accusations of Ukraine bioweapon development can be seen for example here. They claim a deliberate program of developing diseases to spread into Russia. Of course they have every reason to lie, but that does not prove it is not true.

The US version, as spread through the MSM, congressional papers, etc., is that these are pathogen surveillance programs. They say that when Ukraine became independent it had several bioweapons labs. Since neither the US nor Ukraine wanted Ukraine to have a bioweapons program, and neither thought it was a good idea to have a lot of unemployed and embittered bioweapons experts floating around, the US financed a number of labs to do disease research and keep the experts gainfully occupied. I remember hearing about such things years ago. They might be lying, but it is at least plausible.

Now, as E G-L points out, nobody has been doing an audit of what was happening in those labs, so nobody actually knows anything, except the US and Ukraine. Me, I would say this accusation is too serious to believe without some solid evidence, and I do not see any – just Russian war propaganda. Also I do not think it is plausible. An active bioweapons program (as opposed to gathering information for possible future use) is not something I think the US would be doing at this point. Too much reputational risk, and too hard to control the weapons. I mean, why would the US want to start an epidemic of drug-resistant TB in Russia? If they were doing it, I absolutely do not think they would do highly dangerous top-secret mass-destruction research in Ukraine. Too many potential Russian spies around, and the entire country is only one election, coup, or policy change away from going over to Russia with all the secrets entrusted to it. If they were doing it, they would do it in Maryland, or some other place where they could rely on their secrets being kept in the future.

Now you, Johann. You have no direct information either. Which sources that you trust are telling you that there is a US bioweapons program in Ukraine? How do you think they know? Why do you think it is plausible? We might never convince each other, but maybe we could find out which sources we each choose to rely on.

Last edited 2 years ago by Rasmus Fogh
Jon Game
2 years ago
Reply to  Rasmus Fogh

A better example of misinformation can surely not be found anywhere in the world, now or throughout history, than mask wearing to stop COVID. If population A fared better than population B because they carried a rabbit's foot, or wore a green hat, then I would be interested and investigate further. As there is no such evidence in real life (i.e. outside a lab), I continue to smirk at people who have fallen for it.

Johann Strauss
2 years ago
Reply to  Jon Game

Well said. What's so interesting, though, is how many people have been completely brainwashed by this. In the medical research institution where I work, they have only just lifted the mask mandates outside of clinical settings, and the mask mandate was removed at least several weeks ago in the county I live in, which itself was well behind the state as a whole. Yet the vast majority (and I mean >90%) continue to wear masks even when they might be the only person sitting in their labs – and these are scientists who are supposed to be able to think critically but clearly are incapable of it. The same is true in the stores, but at least most of those people are not scientific researchers, and hence might hang on to and feel more comfortable with any number of superstitions.

Last edited 2 years ago by Johann Strauss
Martin Logan
2 years ago

This is actually pretty simple.
Both the Kremlin (and certain academics in the West) argue that "all discourses are valid." This, of course, violates the core of Western thought since the 12th C. To paraphrase Ockham, the hypothesis supported by more evidence is more likely to be true. At any given time, only ONE discourse is the most valid.
As a practical matter, that Putin’s offensive in Ukraine has failed is pretty clear. Even the Kremlin acknowledges that it has taken almost no new ground for more than a week. Meanwhile, a force much larger in numbers, the Ukrainian army, is receiving more weapons and more training on a daily basis.
That Moscow can change the situation militarily is also remote, since most of Russia’s ground forces are already committed. Bringing in units from Kamchatka and Armenia won’t change the situation. Short of nuclear war, Russia is stalemated, and may well see regime change.
All discourses are NOT valid–and quite often the ones that are valid can be found on the MSM.

Martin Bollis
2 years ago
Reply to  Martin Logan

Deleted, meant for Laura

Last edited 2 years ago by Martin Bollis
Johann Strauss
2 years ago
Reply to  Martin Logan

Perhaps let's wait and see what the situation is in another 3-6 weeks. Is it not possible you have fallen for Ukrainian propaganda, especially as there is no doubt that Ukraine is winning the propaganda war? Whether they are actually winning on the ground (which is what ultimately counts) is anybody's guess, given that information coming out of Ukraine is limited and heavily distorted. The Ghost of Kiev, anybody?

Hardee Hodges
2 years ago
Reply to  Johann Strauss

I found Graham's Newsletter (Graham Seibert, on Substack) useful – the perspective of an expat living near the action.

Saul D
2 years ago
Reply to  Martin Logan

Frame it that “all discourses have a likelihood”. Then the hypothesis that best fits the evidence is most likely to be true – highest likelihood (noting that a single piece of evidence can render an entire hypothesis untrue). Ockham’s razor says we should aim for the simplest explanation that maximises likelihood across the evidence available.
However, with incomplete evidence, there will often be several discourses that are valid each with similar likelihood – so multiple maxima, not just one. It’s why intelligent people can still disagree even with the same data.
Choosing between discourses with similar likelihood means finding additional distinguishing evidence (not always possible). So individuals making different judgements about discourse likelihoods need to trade and test the data they use. We all weigh evidence differently based on different learning and experiences. Lab leak or meat market? Stalled or ISW maps still showing slow advances? Masks or no masks?
And it gets worse when you look to the future. There are no facts about the future, only estimates and guesses from where we are now. Likelihoods shift with every new event. Yesterday’s valid discourse is today’s busted flush. Sharing opinions isn’t just about whether those opinions are right, but also understanding what you might be missing, so as to improve your guess likelihood.
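Saul D's framing – score each discourse by how well it fits the available evidence, and accept that sparse evidence can leave several hypotheses tied – can be sketched in a few lines of Python. All hypothesis names and probabilities below are invented purely for illustration:

```python
# Toy likelihood comparison: multiply P(evidence | hypothesis) across the
# pieces of evidence in hand. All numbers are invented for illustration.
from math import prod

# Assumed per-observation likelihoods for two hypothetical rival discourses.
evidence_given_h = {
    "hypothesis A": [0.6, 0.5, 0.4],
    "hypothesis B": [0.5, 0.6, 0.4],
}

likelihood = {h: prod(ps) for h, ps in evidence_given_h.items()}
# Both come out at 0.12: with this little evidence the two discourses are
# indistinguishable, and only new, distinguishing evidence can separate them.
# Note that a single impossible observation (probability 0) would zero a
# hypothesis out entirely, matching the point about one piece of evidence
# rendering a whole hypothesis untrue.
print(likelihood)
```

A sketch under stated assumptions, not a claim about any of the real controversies mentioned above; the point is only that equal products mean multiple maxima.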

Last edited 2 years ago by Saul D
Laura Creighton
2 years ago
Reply to  Saul D

Up against likelihood we also have to check if somebody with deep pockets could have paid for a lie.

Rasmus Fogh
2 years ago
Reply to  Saul D

Absolutely correct!
Can I add that (according to Bayes' theorem, if you want to get fancy) the likelihood of a proposition depends not only on the evidence, but also on how likely the proposition is in the first place. Which is unavoidably subjective, so 'similar likelihood' is in the eye of the beholder. What saves science is that, as enough evidence accumulates, the people with completely off ideas about what is intrinsically likely eventually get overwhelmed by the evidence.
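The point about evidence eventually overwhelming priors can be shown with a toy Bayesian update in Python. All numbers are invented; the observers, priors, and likelihoods are purely illustrative:

```python
# Toy Bayesian updating (all numbers invented): two observers with very
# different priors update on the same stream of evidence. Enough evidence
# eventually overwhelms the difference in their starting points.

def update(prior: float, p_e_h: float, p_e_not_h: float) -> float:
    """One Bayes-rule update: P(H|E) from P(H), P(E|H) and P(E|~H)."""
    num = p_e_h * prior
    return num / (num + p_e_not_h * (1.0 - prior))

skeptic, believer = 0.01, 0.90           # wildly different starting beliefs
for _ in range(20):                      # 20 observations, each 3x likelier under H
    skeptic = update(skeptic, 0.6, 0.2)
    believer = update(believer, 0.6, 0.2)

print(round(skeptic, 4), round(believer, 4))  # both converge towards 1.0
```

With few observations the two posteriors stay far apart ('similar likelihood' in the eye of the beholder); it is the accumulation that forces convergence.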

Last edited 2 years ago by Rasmus Fogh
Rasmus Fogh
2 years ago
Reply to  Saul D

A short answer of mine has evaporated here. I wonder what triggered it – the name of a well-known dead clergyman who gave his name to a central part of probability theory??
Ah well, it came back – just minus the ‘Like’ button. Just what is going on here?
And gone again.

Last edited 2 years ago by Rasmus Fogh
Elaine Giedrys-Leeper
2 years ago
Reply to  Saul D

Many thanks for this – one of the most lucid posts I have ever read on UnHerd

laurence scaduto
2 years ago

I detect a generational shift here.
First of all, back in the day we all understood that “truth” was a muddled and sort of soupy concept. Like “sea level”, it never holds still long enough to actually “know” it.
So the thing to do, whether the topic is the price of bananas or who killed JFK, is gather the most reliable info you can from as many different sources as is feasible and decide for yourself. While keeping in mind that truth is an illusion and most people are lying a little anyway. And that most "truths" are unhelpful, unbelievably boring, and probably none of your business.
But the most important thing is to NEVER rely on “received wisdom”; especially from corporations, governments, politicians and parties (obviously), anyone who is trying to get money from you (as in “DONATE now”), and anyone who looks or sounds angry and/or overly enthusiastic (most of the internet).
And always count your change before you leave the store.

Martin Bollis
2 years ago

I’ve had that problem several times. I notice Galeti, James and SULPICIA seem to have disappeared completely.

It seems some must remain unheard, even on Unherd. I no longer believe it’s just an IT glitch, which is very sad.

Billy Bob
2 years ago
Reply to  Martin Bollis

It could simply be they haven’t renewed their subscriptions

Andy Aitch
2 years ago
Reply to  Martin Bollis

One of the reasons I remain a member here is the thoughtful quality of most reply posts. Galeti whatisname was one of the least readable. In fact I suspect it was an updated pseudonym for an earlier unlikely Scrabble-hand of a name who regularly sprayed Dave Spart-ish comments across the site, so not really missed!
If an unseen (unherd?) hand is editing, by and large all sides seem to get through – at least thus far.
But I’m still not seeing the ‘longer piece’ by Laura Creighton…

Last edited 2 years ago by Andy Aitch
Adam Bartlett
2 years ago

Fantastic article even by UnHerd standards. Especially the last para. It's probably fair to say a quite high proportion of even us BTL commentators on UnHerd and other news sites are small-scale "social engineers", trying to persuade others to our point of view. Or at least we wear that hat for some of our posts.
On 'Big Disinfo' covertly serving the big platform operators, this strikes me as true, though it may be more a result of adaptation than planning. The way I remember it, there were increasing calls by prominent SJWs for Facebook and others to increase online censorship starting from 2016. Zuck largely resisted them up to maybe late 2020, giving way especially after Jan 6th. It kind of seemed like they were forced into playing censors by SJWs.
I might be a little less dubious about the effectiveness of psychographic micro-targeting, though I seem to recall studies finding CA's attempt had virtually no discernible impact on the 2016 election. I remember micro-targeting being a hot topic in 2017; many actors started up or expanded their social media analytics divisions. To some extent the work they did for clients could be classed as micro-targeting, and I understand from a friend who until 2019 was high up in one of the big PR firms that there was good evidence it was reasonably effective, even if it may have had no extra effect on most individuals. That said, I wouldn't doubt big tech have an incentive to over-promote its effectiveness in the hope of gaining more ad revenue.

Hardee Hodges
2 years ago
Reply to  Adam Bartlett

Google has been quite effective in hiding various things, as a search with any other engine reveals. Apparently they have the capability to target subsets of people based on their internal data and shape opinion. Whether this can affect voters is up for grabs. But I no longer use their engine.

Laura Creighton
2 years ago

And then it all went away, a second time, and then back again ….
Something is truly wonky with this site.

Last edited 2 years ago by Laura Creighton
Saul D
2 years ago

Innocuous stuff goes missing. I was wondering if it might be connected to systems they might use for load-balancing or content networks – more technical issues at play than just failures in the moderation system?

Rasmus Fogh
2 years ago

Your post has no ‘like’ button. So let me say explicitly that I really like it.

Laura Creighton
2 years ago
Reply to  Rasmus Fogh

Thank you.

Andrew Dalton
2 years ago
Reply to  Rasmus Fogh

Messages that go missing and come back often don’t have the like and dislike buttons. Sometimes they reappear, as in this case.

Laura Creighton
2 years ago

And now my long piece is back, but Rasmus Fogh’s note saying there is no like button under my piece is gone.

Martin Bollis
2 years ago

As is my comment noting that it’s happened to me a few times and that a number of well known commenters on here seem to have disappeared

Andy Aitch
2 years ago
Reply to  Martin Bollis

Disappeared too!! My earlier comments about Martin’s vanished commentators AND Laura’s long piece, which I can now see.
Very good it is too, but the disappearance of comments (that do not break politeness rules) is a bigger – and very disturbing – trend. If we cannot follow threads in their entirety, mistakes and all, then what is the point of having comments?
If moderators feel the need to delete something, then leave a trace of that deletion. Breadcrumbs at the very least!

Rasmus Fogh
2 years ago
Reply to  Andy Aitch

Absolutely!

Andrew Dalton
2 years ago
Reply to  Andy Aitch

My comment has disappeared again. All I said was that the like and dislike button often disappear when such comments reappear.
Unless I’m breaking some rule about how often the word appear may appear in a single paragraph, I’ve no idea what’s going on.
I’m not renewing my subscription unless it is clearly demonstrated this problem is resolved. I’m certainly not putting effort into comments that just warp in and out of existence on the whim of a trigger happy mod or awful software.

Allison Barrows
2 years ago
Reply to  Andy Aitch

I made an innocuous comment yesterday and it went into moderation, then disappeared. Look, we pay for the right to post here, so if this keeps up, I won't re-subscribe.

Jeremy Bray
2 years ago
Reply to  Martin Bollis

Unlike James Joyce – an interesting commenter here who no longer seems to comment, voluntarily or otherwise – I am not a believer in our right to make any comment we wish here. This is UnHerd's platform and they are entitled to moderate in the style they wish to.
However, I do think it is a courtesy to readers and potential commentators to spell out rather more clearly than the Guidelines do where they want to draw the line, and I have suggested by email to Freddie that either he or one of his editorial team write an article to discuss moderation on the site.
It is irritating to post a comment that disappears temporarily into moderation because the automatic moderation algorithm incorrectly identifies some word as a form of abuse, or that disappears for good because it has breached provision 12 of the guidelines – deemed likely to upset someone – even if it is factually correct or merely questions facts in an article.

Christopher Chantrill
2 years ago

This is not that hard.
Everyone with power is tempted to use that power to control the conversation and to eliminate ideas and "facts" that threaten their power.
Journalists are particularly vulnerable to corruption because, since time began, their jobs have depended on successfully sucking up to the powerful.
Here in America we have this crazy thing called the First Amendment. It says, first, no state church, meaning the gubmint is not allowed to call on God to back up its lies.
Then it warbles away about freedom of speech, and the press and petitioning the gubmint for redress of grievances. Really darling, how 18th century!

Alan Groff
2 years ago

I'm pasting an excerpt from the end of Foucault's book Power/Knowledge because it provides a philosophical framework for the sort of manipulation the author seems to fear. Foucault looks like a hopeless derivative of the worst aspects of Hegel and Kant. Still, he has been the most cited academic across many areas of the humanities for the last forty years.
Here is the philosophical ground for manipulation and the great enemy to Western values of justice and liberty.

The university emerges as the center of power because of the multiplication and reinforcement of the power effects of ensembles of intellectuals who pass through and relate themselves to the academic system. We saw some emergence of this power as early as the Second World War with Oppenheimer, who acted as the point of transition between universal and specific intellectual. The nuclear threat affected the whole human race and the fate of the world; thus, his discourse could, at the same time, be the discourse of the universal.  

The universal intellectual of the nineteenth and twentieth century was the man of justice, the man of law, who counterposes to power the universality of justice and equity. The universal intellectual derived from the jurist found his fullest manifestation as the writer of values and significations in which all can recognize themselves. The importance of the great writer has all but disappeared. But the specific intellectual is quite another figure – that of savant or expert. I just said that it’s with atomic scientists that this latter figure comes to the forefront. Biology and physics were the zones of formation of this new personage. The extension of the technico-scientific structures in the economic and strategic domain gave him real importance. The figure of prestige is no longer writer of genius, but that of the absolute savant, no longer he who bears the values of all, opposes unjust sovereigns, but he who with a handful of others has powers which can benefit or irrevocably destroy life and control the truth.  

The intellectual must accept political responsibility, which he is obliged to do as a computer expert, a pharmacologist, etc. We must not discount him politically in his specific relation to local forms of power because of the fundamental point: the effects proper to true discourses.

The important thing is that truth isn’t outside power. Contrary to myth, truth isn’t the reward of those who succeed in liberating themselves.  Truth is a thing of this world and induces regular effects of power. Each society has its regime of truth, its politics of truth: the discourse which it accepts as true; the mechanisms that distinguish true and false statements, the means by which each is sanctioned; the status of those charged with saying what counts as true.  Truth is centered on the discourse of the institutions that produce it. It is subject to the immense diffusion and circulating apparatuses of education and information produced and transmitted under the exclusive control of the political apparatuses of the university and media.

The battle for truth – understood again as the power-apparatus that separates the true and false – is not a battle “on behalf” of truth, but the battle about the status of truth and its economic role. It is necessary to think of intellectuals’ political problems, not in terms of “scientific truth” and “ideology” but in terms of power.  

We, therefore, make the following propositions:

  • Truth is to be understood as a system of ordered procedures for the production, regulation, distribution, circulation, and operation of statements.
  • Truth is linked in a circular relation with systems of power that produce and sustain it, and to the effects of power, which induce it and extend it — A regime of Truth.  
  • This regime is not merely ideological or superstructural; it’s the same regime that, subject to certain modifications, operates in the socialist countries.  
  • The intellectual’s essential political problem is not to criticize the ideological contents supposedly linked to science or to ensure that a correct methodology accompanies his own scientific practice, but that the ascertaining of the possibility of constituting a new Politics of Truth. The problem is not changing people’s consciousness – or what’s in their heads – but the political, economic, institutional regime of the production of Truth.
  • The political question, to sum up, is not error, illusion, alienated consciousness, or ideology; it is Truth itself. Hence the importance of Nietzsche. – End of Book –
J S
2 years ago

The fairness doctrine was a great thing (look it up). Greedy and shortsighted people disposed of it, to great harm.

Karl Juhnke
2 years ago

Before the invention of the printing press, the status quo was that only elitist and state ideas were sanctioned and widely promoted. The printing press suddenly allowed dissenting voices to challenge this. Slowly, through elitist and state educational funding and the resulting career pathway$, the status quo returned. Then along came the dissenting voices of the internet. Again the money flowed, and again the status quo is fighting to be reinstated. Hence 'misinformation' means 'unsanctioned', as was the case with Hillary and Hunter. Two examples from thousands out there.

Ian Wray
2 years ago

Concerning the 'Trusted News Summit' of 2019:
“Major news and tech organisations will work together to protect their audiences and users from disinformation, particularly around moments of jeopardy, including elections.
Earlier this summer the BBC convened a Trusted News Summit, bringing together senior figures from major global technology firms and publishing. Recent events such as the Indian elections have highlighted the dangers of disinformation and the risks it poses to democracy, and have underlined the importance of working together around shared principles.
The BBC’s partners who attended the summit are The European Broadcasting Union (EBU), Facebook, Financial Times, First Draft, Google, The Hindu, and The Wall Street Journal. Other partners are AFP, CBC/Radio-Canada, Microsoft, Reuters, and The Reuters Institute for the Study of Journalism, and we are also consulting Twitter on areas of potential collaboration.
Tony Hall, Director-General of the BBC and EBU President, says: “Disinformation and so-called fake news is a threat to us all. At its worst, it can present a serious threat to democracy and even to people’s lives.
“This summit has shown a determination to take collective action to fight this problem and we have agreed some crucial steps towards this.”
The summit agreed to work collectively, where appropriate, to agree collaborative actions on various initiatives.”
See: https://archive.is/HBEuR

Hardee Hodges
2 years ago
Reply to  Ian Wray

As noted by Dr Prasad and elsewhere, the Trusted News Initiative has affected responsible scientists who disagree with some of the COVID policies. That debate is worth understanding and has not been allowed "in the public interest".