Truth is harder to dispose of than some people think. To say that there is no truth is already to have stated one, or at least what you take to be one. Yet the idea of truth is a lot less in fashion than it used to be, and there are social and political reasons for this unpopularity.
One of them is individualism. In a society which lacks any strong bonds between its members, everyone is likely to have their own interpretation of the world, just as everyone is likely to have their own toothbrush. You wouldn’t want to borrow someone else’s version of the truth, any more than you would want to borrow their toothbrush. Truth becomes privatised. It’s a matter of my personal experience, and surely no one can challenge that.
In fact, personal experience is as open to debate and dissent as the existence of God. There’s nothing absolute about it. One of the key insights of the late modern period, before the birth of postmodernism, is the recognition that human beings are constitutively opaque to themselves. “Constitutively”, because this lack of self-transparency is built into the kind of animals we are, not just some lamentable self-blindness which we could put right with a little more self-reflection.
When Oedipus in Sophocles’s drama finally comes to know who he is, he recognises that he is a stranger to himself. It has taken incest, parricide, pollution, self-blinding and self-exile to arrive at the conclusion that he has no certain grasp of his own identity, and that this is the condition of us all. There must be easier ways of learning this lesson.
There was a time when we thought that we were transparent to ourselves and that it was others who were indecipherable to us. Nowadays, we tend to accept that we aren’t always in entire possession of our own experience, and that others can sometimes know us better than we know ourselves. Among other things, this is because other people are frequently well-placed to see what we do, and what we do is a surer guide to who we are than what we say or think we are. If someone insists that he’s a passionate lover of animals and spends his spare time dissecting live frogs, then he is self-deluded. It’s the beliefs implicit in our behaviour, not those recorded in our memoirs, that really count. If truth were simply what we feel, there could be no place for self-deception in human affairs. It simply wouldn’t be possible for a prime minister to declare himself a pretty straight sort of guy, and probably believe it, yet seek to deceive the public about the invasion of Iraq.
So I am no infallible guide to the meaning of my own experience. I thought at the time I was furious, but looking back I realise that I was afraid. It’s true that my experience is beyond doubt in the sense that I really am in agony, and no mistake; but to know that what I’m feeling is agony rather than ecstasy, I must have the concepts of agony and ecstasy, and this isn’t something I can achieve all by myself. I can know this only by belonging to a community whose language includes these notions.
In this sense, even the most private of experiences is public. Toddlers don’t spontaneously come up with the concept of unfairness. Rather, they take part in a form of social life in which the idea of unfairness has a function, they watch how their elders behave and listen to how words are used as forces in these contexts. If they have the misfortune to be brought up in Conservative Party headquarters, they will never get the hang of the idea.
Another insight of late modernity is that experience and reality are in some ways out of sync. It looks as though the sun circles the Earth, but this is because the Earth rotates on its axis. (“The sun’s coming up — or as the new theory has it, the Earth’s going down,” as a 17th-century character remarks in Tom Stoppard’s play Rosencrantz and Guildenstern Are Dead). One purpose of science is to close the gap between reality and our experience of it, as well as explaining how the gap comes about.
Like all concepts, truth is inherently social. There must be public criteria by which we can determine what counts as true or false, and these criteria are nobody’s private possession. The philosopher Wittgenstein imagines someone exclaiming “But I know how tall I am!” and placing his hand on top of his head. He hasn’t grasped the fact that tallness is a comparative notion, one that involves socially agreed modes of measurement. There must also be public criteria for conflicts over truth. Disagreement is only possible if you can agree on what you are disagreeing over. We aren’t conflicting over the value of the pound if you are thinking of sterling and I am thinking of the American poet of that name.
A certain degree of consensus must underlie the most vehement of altercations. People who cry “Well, it’s true for me” (the #MeToo generation is also the “Me-Me-Me” generation) haven’t grasped this point, either. Rather than making the coldly impersonal idea of truth more warmly experiential, they end up abolishing it altogether. If truth is just what anyone happens to think is true, then the word drops out of use. It becomes impossible to distinguish between what I feel and the way the world actually is. There are people who insist that slavery is one of the most outrageous episodes in Western history, yet put the word “truth” in scare quotes. They tend to do the same with the word “fact”, while maintaining that it’s a fact that women are equal to men.
Does all this mean that truth is beyond dispute? Certainly not — least of all for science. The fact that truth isn’t just subjective doesn’t mean that it must be absolute. The one is just the flipside of the other. Relativists and subjectivists fear that unless they are right, truth becomes dogmatic, infallible and authoritarian. But this is to be afraid of a bogeyman. To say that a proposition is scientific is to say among other things that it could be wrong. It’s the kind of thing that someone might always come along and falsify. It differs in this way from “Put that tiger down!” or “Sorry, make that two tuna sandwiches”.
Science is a never-ending argument, in which one’s conclusions are perpetually open to revision, and argument is fundamentally a political affair. You must make sure that everyone gets a chance to participate in the debate, that they do so on equal terms, that nobody is allowed to impose their own partisan interests on the discussion and so on. None of this guarantees the emergence of truth, but it’s an essential pre-condition of it. People who claim “It’s true for me” tend not to be interested in any of this. What is there to argue about? It is they who are the absolutists, since there would seem no way of disproving what they assert. Anyway, someone can always protest that the “it’s true for me” claim isn’t true for them.
“It’s true for me” springs from a false egalitarianism. If everyone is to be included in political society, then everyone’s views must be respected as well. So what about the view that everyone isn’t to be included? Is that to be respected as well? And what of the so-called law of contradiction? If you think that reading Agatha Christie gives you prostate cancer and I disagree, then one of us has to be wrong. But postmodernists are reluctant to admit that anyone is wrong. It offends against the law of inclusivity, as well as making some people sound inferior to others. Anyway, from what Olympian vantage point can one make such a judgement? How can I possibly judge that there are some sexist police officers in Britain? Isn’t any such claim a sign of moral superiority or intellectual elitism on my part?
Such is the addled thinking that passes for genuine argument in certain circles. A fear of sounding authoritarian, when all you’re trying to do is tell the truth, haunts contemporary culture. This is why it’s fashionable to qualify what one says with the word “necessarily” — as in “It’s not necessarily that people are passing themselves off as Tom Cruise, it’s just that there’s a crisis of identity”, where what you mean is that people aren’t passing themselves off as Tom Cruise at all. But it sounds offensively dogmatic to say so. Similarly, to say “It’s nine o’clock” sounds unpleasantly absolute, so it’s advisable to throw in a “like”. “It’s like nine o’clock” sounds far more agreeably uncertain. A pervasive sense of uncertainty is one reason why truth has come into disrepute. But there are other reasons, which I’ll return to in my next essay.