Wednesday, September 30, 2015

An End to Pedantry: THAT vs. WHICH

I've used this blog on a number of occasions to debunk rules that are widely believed to represent 'proper' educated usage but which have very little basis in reality. I have, for instance, argued that data is not in fact a plural, that tomatoes aren't really fruit outside of botanical circles, and that there is no good reason to use an rather than a before words beginning with h. I'm far from alone in writing about these imaginary rules, but I occasionally feel I can provide a slightly different perspective on them. After encountering some of my posts on these issues, one of my readers asked me to elaborate on what I meant by this:
The only standard by which we can judge the use of a word to be correct or not is whether it conforms to the conventions used by members of the speech community in the particular context in which it is being used. [On falling apples and whether tomatoes really are fruit, Oct 16, 2013]
In particular, this reader was interested in what I thought about the alleged rule for when to use that and which, and whether I think it actually matters whether we follow such rules if our meaning is likely to be understood in any case. What follows is a slightly edited version of the response I gave this reader by email, which I hope clarifies my position.

First, it's important to realize that there are in fact two alleged rules: one about which word should be used for 'restrictive' relative clauses (also called 'integrated' relative clauses), and another about which word should be used for 'non-restrictive' relative clauses (also called 'supplementary' relative clauses). Like many people, I was always taught (1) that you should use that rather than which in restrictive relative clauses, and (2) that you should use which rather than that in non-restrictive relative clauses.

The reason I divide this into two rules is that the latter rule, unlike the former, is actually followed by all native speakers of English essentially 100% of the time. I should note that this is only true in modern English - the word that is also readily found in non-restrictive relative clauses in English texts dating back a couple of hundred years or so. In modern times, though, we all (to a very good first approximation) use which to introduce non-restrictive relative clauses. The trouble (if you want to call it that) is only with which word should introduce restrictive relative clauses, and here, usage is divided between that and which.

This should make pedantic prescriptivist types slightly uneasy given that their usual way of explaining the lack of adherence to an alleged rule is to blame it on the laziness of speakers, or declining standards of education. In this case, they are forced to argue that despite the similarity in the form of these two rules, the educational system is very good at teaching one of them and very poor at teaching the other, and mysteriously, even to 'students' who have never been to school! Either that or speakers are unaccountably lazy about one of these rules and not the other! Just to drive home the absurdity of this, note that to use either rule effectively, the user has to understand (on some level) the distinction between restrictive and non-restrictive relative clauses. We all demonstrably understand this distinction enough to use only which with non-restrictive relative clauses, but the prescriptivist has to argue that speakers are too lazy or feeble-minded to use that exclusively with restrictive ones.

The obvious alternative to this kind of explanation is that there is something fishy about the prescriptivists' rule for restrictive relatives, along with their assumptions about what forces might be staving off humanity's descent into a grammatical free-for-all. I'm reminded of Charlie Dog earnestly attempting to hold up a tower that is threatening to topple over as his reluctant master pretends to run for help. His master is of course exploiting Charlie's ignorance of the Leaning Tower of Pisa along with his tendency towards the sort of self-belief that could allow him to think he is strong enough to support it with his paws. I have no doubt prescriptivists think their usage guides play some causal role in maintaining the integrity of our language in essentially the same way that Charlie Dog thought he was holding up the Leaning Tower of Pisa. But they aren't, and it's quite safe to let go. They have mistaken language for a thing shaped by deliberate acts of domestication, but wherever you look, language in the wild has all the structure and complexity that the self-proclaimed guardians of language, from antiquity onwards, have feared will be lost to declining standards.

Charlie Dog 'holding up' The Leaning Tower of Pisa in A Hound for Trouble (1951)

So what's wrong with the version of the rule that applies to restrictive relatives? Well, apart from the fact that it's never been something that a large proportion of English speakers or notable writers in the language have ever shown any great respect for, it's not hard to show to everyone's satisfaction that at least some restrictive relative clauses need a which rather than a that. This is true, for example, when the relative pronoun is preceded by a preposition. We would all prefer the house in which I was born... to the house in that I was born... You can say the house that I was born in though if you don't mind dangling prepositions. You could of course reformulate the rule to permit such exceptions along with others we may find, but what we'd be doing in the end is looking to habits of usage as the ultimate authority. Certain opinionated language lovers are happy enough to extrapolate rules based on usage when it's their own but fail to see that alternative conventions that are in widespread use can equally be legitimized by appeals to usage.

Languages absolutely do have rules, but they are to be discovered from patterns of usage rather than imposed on it. This is the difference between descriptivism and prescriptivism.

So where do prescriptivists get their rules, and especially these funny ones that they insist ought to be followed but have never applied to large groups of people at any time in history (strictures against split infinitives, dangling prepositions, etc.)? There are probably as many answers to that as there are cases of it. Many undoubtedly arose through various confusions, others by imposing the rules of Latin on English, but for the distinction between that and which, the answer is well known. The rule is the whim of Henry W. Fowler, who proposed it in the early 20th century, not as a rule that he believed came from an authority that predated him, but as a rule he consciously devised himself to reform one aspect of the language he personally disliked.

The question of how these rules emerge is perhaps less interesting than the question of how they gain traction. Just as astronomy is shadowed by astrology, evolutionary biology by creationism, psychology by the self-help industry, and evidence-based medicine by alternative medicine, linguistics is shadowed by a kind of hocus pocus of its own, driven by individuals who have very strong opinions about language and who write these opinions down in usage guides like Fowler's, and in more widely known volumes such as The Elements of Style by Strunk and White. These opinions are presented as revealed truths, with nothing approaching rigorous evidence to back them up. Naturally, such views tend to flourish in discourse that occurs outside the sort of rigorous modern framing that would expose their weaknesses.

I describe astrology and the others as 'shadow' disciplines, but in each of the examples I've listed, the hocus pocus variety predated its more rigorous counterpart, and this is equally true of linguistics. Indeed, linguistics in its modern form didn't exist when The Elements of Style was first published in 1918, and had only just been born when White revised Strunk's guide in 1959. There has not been enough time for the modern understanding in linguistics to filter through the culture and supplant old views the way astronomy has supplanted astrology, and I suspect linguistics will be stuck with its shadow for a very long time, because everyone seems to think they are an expert when it comes to language, especially people who are good at using it. No one would assume that you need to understand how a car works in great detail to be a good driver, but for whatever reason the distinction between being a skilled language user and understanding how language works isn't as clear to many people, and many skilled writers, editors and self-professed language lovers happen to be carrying most of the inertia that linguists are struggling against.

As for what we should think of a rule that doesn't seem to contribute to understanding - given that the meaning of a sentence is clear to everyone whether people follow it or not - well, a rule doesn't have to have that function to be real. It will be news to many, but communicative success doesn't tell you anything about what rules may or may not exist, because the rules of grammar are not always closely linked to communicative functions. The assumption that they are certainly seems plausible until you look at the evidence. Consider a language that marks all its nouns for gender (including nouns for inanimate things). How is communication served by requiring that an adjective carry the same gender marking as the noun it modifies? If a non-native speaker gets the form of the adjective wrong, the natives will notice that a rule was violated but still understand perfectly well what was meant. We don't need language at all to communicate successfully in many contexts. Non-native speakers and small children can succeed in getting their intentions across with a grammatical jumble that gives us nothing more than vague clues about their intentions, with context filling in the rest. And just as speakers can succeed without following the rules, they can fail even when following them. Listeners can fail to understand perfectly grammatical sentences either because they are ambiguous ("I saw the man with the telescope" - did I use the telescope, or did he have it?) or because they are nonsense ("Colorless green ideas sleep furiously"). In the literature, this independence of grammar and meaning is called the "autonomy of syntax".

Whether a particular grammatical form is okay to use in a particular context is a practical question with both grammatical and social dimensions, so I don't think there's a simple answer. If you're applying for a job at a newspaper where you know the editor thinks the that/which rule for restrictive relatives represents educated usage, then you have to be sensitive to it as a matter of personal survival. It's not really functioning any differently from the choice of wearing a suit and tie to an interview. Both conventions are essentially shibboleths, and being aware of these things will help a person navigate through life.
