Are you on Team A or B? (Photo by Brandon Bell/Getty Images)


February 16, 2021

When was the last time you read an article, an opinion piece, that you felt was trying to persuade you of something? To argue a position that you don’t hold, and make you believe it?

I suspect such experiences are rare. It is easier to write things for people who already agree with you: to make them cheer or feel clever, or to remind them how dreadful the other lot are. It’s also more fun. 

I’m not talking about reading a column that disagrees with you. I’m sure you read them regularly, or at least the headlines: pieces get hate-shared all the time among people who disagree with them. But they are not written to persuade, and readers are not persuaded. The intention, I think, is to provoke a reaction, to elicit cheers and boos. Not, primarily, to change minds.

On Saturday, a long-awaited New York Times article was published, about the blog Slate Star Codex (SSC). To get you up to speed: SSC is a blog by Scott Alexander, a pseudonymous Californian psychiatrist, part of a community of Bay Area nerds and weirdos, widely known as the rationalists. They care about human biases, artificial intelligence and doing good with charity. (I’ve written about them in my book.)

In June, the NYT’s tech reporter Cade Metz contacted Alexander and said he was going to write a piece about SSC, and particularly about how remarkably good the rationalist community was at predicting the course of the Covid pandemic. I spoke to Metz, and reassured Alexander and the rationalists, as best I could, that I thought the piece would be written in good faith rather than as a hit job.

But then Metz and the NYT said they would reveal Alexander’s real name in the piece. Alexander thought this would endanger his relationships with his patients, and took down his blog. He has since quit as a psychiatrist, re-ordered his life, and set up a new website. Now, half a year later, the NYT piece is out.

I don’t want to get into whether or not it is a hit job; others have done that. I will say that it comes perilously close to outright misrepresentation. For example, Metz says that “in one post, [Alexander] aligned himself with Charles Murray, who proposed a link between race and IQ in The Bell Curve.” But the line in which he “aligns himself” with Murray is on whether there is a genetic component to poverty (which surely there must be), not race: race is not mentioned in the post at all. It is, in essence, guilt by association.

What interests me, though, is that SSC, and the rationalists, are seen as gateways to hard-right thinking: to “race realism”, to men’s rights activism. And I think that persuasion is a key part of the story.

Because, on the face of it, the idea that the rationalists are secret fascists is strange. A 2019 survey of SSC’s readers found that self-described “conservatives” were outnumbered 8:1 by self-described “liberals” and “social democrats”; there were rather more “libertarians”, but still far fewer than liberals, and weirder subcultures like “alt-right” and “neoreactionary” existed in only slightly larger numbers than “Marxists”. Readers were also far more anti-Trump than the American population at large.

But the NYT piece is far from the first article to suggest that, nonetheless, the rationalist community is “an on-ramp to radical views” that allows “extremist views to trickle into the tech world”.

Partly, that’s because the rationalist community is explicitly a place for reasoned, polite debate, and almost any views are welcome as long as they are expressed respectfully and can be backed up with evidence or reasoning. Inevitably, that means precisely those views which cannot find expression elsewhere tend to gravitate to it. But also, I think, it’s because SSC tries to persuade people.

Read something of his, on some controversial subject. Billionaire philanthropy, for example, is not always popular: long articles have been written about why it is actually a bad thing, because it whitewashes billionaire reputations, allows them to control society, and is unaccountable to democratic institutions.

All of which is reasonable. Scott Alexander, though, thinks that on balance, billionaire philanthropy does more good than harm, and that the movement against it will hurt the world. 

It’s easy to imagine a newspaper article that attacks “billionaire-bashers”, that lists all the great things that Bill Gates or Jeff Bezos have done with their philanthropy, and makes fun of the idiots who think that stopping them doing that will improve things. Alexander, on the other hand, talks directly to people who disagree with him, who think billionaire philanthropy should be curbed: “I’m against this. I understand concern about the growing power of the very rich. But I worry the movement against billionaire charity is on track to damage charity a whole lot more than it damages billionaires.”

It seems a small thing, a single phrase, “I understand concern” — but it is not. It demonstrates that the piece is intended to change minds. It says to those anxious about billionaire philanthropy that their worries about inequality and democratic unaccountability are real — I’m on your side! — but look, there might be these other things that you’ve not thought about. Whether or not you end up agreeing with Alexander on the particular case, he’s trying to win you over.

Another example. “Free speech” has become a left-right battleground issue, and the instances we read about are always of right-wing speech being limited by left-wing activists. So, inevitably, left-wing people think it’s a partisan attack on them, or a smokescreen for people who just want to say unpleasant things (which, let’s be clear, it often is). But Alexander takes a different tack. In one post, for example, he calls attention to a woman fired for “having a Kerry-Edwards bumper sticker on her car” by her George W Bush-supporting boss. The point, or at least the effect on me, was to drag the issue away from partisan sniping. It wasn’t firing shots in the culture war; it was talking to liberals and left-wingers, trying to persuade.

I should, nervously, admit that I was persuaded on one topic that is much more highly charged: the gender imbalance in various professions, notably tech. Alexander argues that straightforward discrimination can’t be the only factor behind the male dominance of some fields: he points out, for instance, that sexist attitudes kept almost all women out of almost all professions until relatively recently. Law, medicine, academia, journalism, you name it.

Now, though, he says, lots of professions are female-dominated: “men make up … only 25% of new psychologists, about 25% of new paediatricians, about 26% of forensic scientists, about 28% of medical managers, and 42% of new biologists.” Women make up half of new medical students, half of new law students, the large majority of new journalism students and psychology students. Most of these jobs are comparable in pay and status to computer programming. “Yet for some reason, engineering remains only about 20% female.” 

He argues convincingly that there is no detectable difference between the sexes in ability at maths, or computer science, or engineering. But, he says, women are on average more likely to be interested in careers where you deal with people, rather than with systems or things.

And this distinction explains why, for instance, women make up the large majority of gynaecologists, paediatricians, psychiatrists and family doctors (American GPs), while men make up the large majority of radiologists, anaesthetists and surgeons. Either we have to posit that radiologists are much more sexist than psychiatrists, or we have to say there’s some other, major factor going on.

Alexander suggests that it’s about interests: that there are large and systematic differences in what men and women are interested in, and that translates into systematic differences in their choice of profession. And it does seem to me that anaesthetists and surgeons can treat patients as “systems” or “things” to a much greater degree than GPs or paediatricians. Of course this is just a statistical difference, and individual men and women vary widely — but, he says, it is probably part of the story at a population level. 

That piece, and others by him on the topic, persuaded me that sexist discrimination alone is not enough to explain the gender difference in tech or many other fields. (Do read the post, rather than arguing with my short synopsis of it, if you disagree.)

This is why, I think, he and the rationalists are seen as a gateway to the hard right. If you are on Team A in the big internet fight, and you want to beat Team B, then someone who comes along and talks, in Team A language, to Team A people, to make them believe things associated with Team B, is worse than the most fire-breathing Team B zealot. He’s not a foreigner, he’s a traitor. He’s not a combatant, he’s a spy. He’s a fifth columnist.

Of course, Alexander would say that he’s not trying to win people over to Team B. He’s a member of Team A; he just wants to understand things! But, of course, this is exactly what a traitorous spying fifth-columnist would say. He comes talking the language of inclusion and diversity and liberalism, but he actually tries to convince people that sexist discrimination in tech is less of a problem than they think.

And the worst thing is: it works. People do change their minds. I did; I am less sure about a lot of things than I was before I read SSC, and I think that’s what caused it. (I’ve changed my mind the other way, too, towards more stereotypically liberal positions: he has convinced me that trigger warnings are good.)

That is scary. Particularly if you’re a Team A partisan, and you see other Team A partisans losing their will to fight, as they become less certain that Team A actually has all the right answers. Or if your identity is heavily tied up with your political beliefs, and changing them would feel like changing who you are.

Maybe radicalisation is a real problem; maybe these controversial debates are fine in some tiny gated community of nerdy weirdos, but when they spill out into the wider world, some people take them too far and end up in some strange corner of the internet. Rationalism is indeed a gateway to dangerous beliefs, says Scott Aaronson: “insofar as once you teach people that they can think for themselves about issues of consequence, some of them might think bad things. It’s just that many of us judge the benefit worth the risk!”

But I don’t think that’s the real problem. I don’t think that’s really why rationalist writers are seen as dangerous. I think it’s because if you think all of this is a big fight — if debate is war, and arguments are soldiers — then someone coming along and killing your soldiers behind your lines is simply the enemy, even if they’re wearing your uniform. And at the end of the day, traitors and spies get the harshest punishments of all. 

 


Tom Chivers is a science writer. His second book, How to Read Numbers, is out now.
