the usage authority (mad hatter) speaks

Some random commentary and other updates follow. Rather than cram all new updates and commentary into a single post, I’ll spread the latest updates over a few posts. I began preparing this post back in early January, but I then set it aside until now; as a result, some items may seem dated, as may items in succeeding posts. Items in this post all relate in some way to English usage or terminology. They include the following:

Same Decade, Different Day: Welcome to this Decade’s Final Year
Punctuational Equilibrium: Ending Bias Against Old Paths of Punctuation
Terminological Illness: Purely Plural “They” Now Mandatory for Christians
Negations on a Sentence by William Barr

These are followed by some closing remarks and end notes.

Same Decade, Different Day: Welcome to this Decade’s Final Year ^

Early in January 2020 (when I started preparing this post, before setting it aside to attend to other matters), I ran across a lot of materials offering reflections on a decade said to have ended on 31 December 2019. Though a decade is any ten-year period, meaning one could focus on the decade from 01 January 2006 through 31 December 2015 if one wished, the fact that there was never a year 0 A.D. (nor B.C.) makes it most proper to consider 2020 the tenth year of the decade that began in 2011. That way, you have a succession of decades from 1 to 10 A.D., 11 to 20 A.D., and so on, until you finally reach our current decade running 2011 to 2020. As 1 A.D. was the first year of our Lord, and as 2001 A.D. was the first year of our present millennium (hence 2001: A Space Odyssey), so 2021 will be the first year of our next decade.
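
For readers who like to see the arithmetic spelled out, here is a minimal sketch in Python (the language and the function name decade_of are my own illustrative choices, not anything drawn from a source) that computes which ordinal decade of the A.D. era a given year falls in, assuming, as above, that decades run 1–10, 11–20, and so on, with no year 0:

```python
def decade_of(year):
    """Return (ordinal decade of the A.D. era, first year, last year),
    assuming decades run 1-10, 11-20, ... and there is no year 0."""
    if year < 1:
        raise ValueError("The A.D. era begins with year 1; there is no year 0.")
    ordinal = (year - 1) // 10 + 1          # e.g., 2020 falls in the 202nd decade
    first_year = 10 * (ordinal - 1) + 1
    last_year = 10 * ordinal
    return ordinal, first_year, last_year

print(decade_of(2020))   # (202, 2011, 2020) -- 2020 closes the decade begun in 2011
print(decade_of(2021))   # (203, 2021, 2030) -- 2021 opens the next decade
```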

Of course, since everyone will want to call the next decade the “2020s,” the temptation to consider 2020 part of it will prove irresistible to almost everyone, my quick contrarian note notwithstanding. On the subject of being contrarian….

Punctuational Equilibrium: Ending Bias Against Old Paths of Punctuation ^

“Thus saith the LORD, Stand ye in the ways, and see, and ask for the old paths, where is the good way, and walk therein, and ye shall find rest for your souls. But they said, We will not walk therein.” (Jeremiah 6:16)

In a day when more and more people are turning over stylistic decisions to software developers and artificial intelligence (I’m sure you’ve seen ads for the sort of writing software I have in mind), debating usage may be more pointless now than it has ever been. Nevertheless, I thought I’d take a moment to rail against an opinion that seems to have become dominant among people who are considered, or who consider themselves, “authorities” on English usage. One such person writes this:

He opened the book, not to read it, but to seem occupied is, in my opinion, overpunctuated. The comma after book would rarely be heard in speech; the comma after it might not be heard either.

Most grammar books do prescribe commas around a negative element such as not to read it. I don’t know why they do. The not to read it is not a parenthetical element…. The commas have no necessary function at all, either as signals of spoken delivery or as indications of grammatical structure.

So Edward D. Johnson claims in The Handbook of Good English, a 1982 text presaging more recent style guides and bearing blurbs of praise from such luminaries as William Safire.[1] I have to admit, however, that I have always read these commas as signaling natural pauses, always heard the pauses when playing written text in my head (yes, a few of us still do this sometimes), and always paused appropriately when reading such text out loud (which I rarely, but occasionally, do). Though “He opened the book not” is an awkward statement, listeners would recognize it as meaning “He did not open the book,” so the possibility of missing the correct sense of the sentence for a moment seems real when the comma before “not” is omitted. I suspect that the widespread opinion that this comma and the comma after “it” serve no purpose owes to the dominance of rapid reading over both careful reading and reading out loud. If there is some rule out there stating that only parentheticals may be set off by commas, I’m unaware of it. Whatever the source of the opinion, it seems clearly a matter of subjective judgment, not an objective matter on which Johnson can be said to be “correct.”

Nevertheless, Johnson does not seem to see his stylistic judgments as matters of subjective preference and personal aesthetics. Note the following:

We had thought that, considering your position, we might buy you out is a similar case [to another sentence not discussed here], with an unnecessary comma between that and the parenthetical participial phrase considering your position….

I must admit that I very frequently see these unnecessary and illogical commas in carefully edited books and magazines. They are a hangover from past centuries, when commas were used much more heavily; they violate the overriding general principle of modern punctuation—to use punctuation lightly and omit it when it is unnecessary.[2]

Several things make this passage interesting. Note how Johnson trusts his own judgment over that of all the editors of “carefully edited books and magazines.” Also note that, whereas he earlier objected to commas around a negative element because it wasn’t a parenthetical, he in this case objects to commas around what he grants is a parenthetical. When, exactly, did this alleged “overriding general principle of modern punctuation” become authoritative and the old paths of “past centuries” unacceptable? Why should we accept the modern aesthetic over the older one?

Naturally enough, Johnson also prefers to omit commas around “not only…but also” constructions. He writes:

He opened the book, not only to read it, but to seem occupied is a misapplication of the rule about setting off negative elements [which rule Johnson rejects anyway]. The sentence has no negative element; not only to read it is not negative in the way that not to read it is. [True enough. But, here, again, the commas Johnson would omit do signal natural pauses as I read the sentence.] There should be no commas. [Again, Johnson makes his personal preference a prescription: one “should” do as Johnson would do.]

Not only did he hope to seem occupied, but he wanted to read the book is a different situation: the comma is correctly used to separate two independent clauses [connected by a conjunction], though it would not be incorrect [says Johnson] to omit it in this example….[3]

A more recent style guide, one that also bears William Safire’s praise on its cover, is Garner’s Modern American Usage, which says of “not only…but also” constructions that “No comma is usually needed between the not only and but also elements and—as the last citation above shows—to put one in merely introduces an awkward break.”[4] The pause Garner finds awkward appears in a quotation from a journal article that reads as follows: “Feminist methods and insights [must] be adopted[,] not only by female scholars, but also by males…” (italics removed, comma signaling first natural pause added by me; “must” inserted by Garner; ellipses mine to eliminate redundancy).[5] If Garner had said, “either use commas both before and after ‘not only by female scholars’ or not at all,” I would see no need to object. To me, the natural pauses before “not” and after “scholars” seem equal, so someone seeking principled consistency in his writing should use either two commas or no commas in such instances. In my judgment, however, there is no objective reason to prefer comma omission over comma inclusion. When both commas are used, no awkwardness or imbalance results. Trends of the masses, even when embraced and promoted by “authorities,” should have no prescriptive force.

Of course, my natural bent is that of a philosopher, someone who challenges and critiques mass trends and the unquestioned presuppositions of large groups. This is quite other than the natural bent of most English usage authorities, whose willingness to resist mass trends seldom reaches beyond resisting a few vogue practices of their youngest competitors. Adopting the subjective biases of their favorite teachers and advisers as their own, they thereafter trust these biases as simply sound judgment. Suggesting that something they consider the “overriding general principle of modern punctuation” might be just a subjective aesthetic preference one may take or leave as one likes—to them, that’s just crazy talk.

Johnson demonstrates his trust in his own bias in an additional instance worth noting. He writes:

Toward the hazy cape, the weary whalers rowed, with subject and verb in the usual order but modifying phrase and verb still in inverted order, is also faulty, though less obviously so; in fact, older grammar books advise setting off such adverbial phrases with a comma.

So, though older grammar books advise using the comma, Johnson is perfectly certain that one is at fault if one includes it. I would say, “Well, at least he’s consistent” in wanting to get rid of punctuation his forebears preferred to use—but he isn’t. On the subject of the serial comma before and, he advises, in the words of one of his sectional headings, “Use a comma before and, or, or nor preceding the last of a series of three or more words or phrases.”[6] I wholeheartedly agree with Johnson’s advice here, but I must note that it goes against both the trend I’ve seen rising to dominance in online writing, where leaving the final comma out seems to have become the norm, and Johnson’s own “overriding general principle.” Were Johnson consistent in applying that principle, he would have to join journalists and the masses in leaving this comma out most of the time, since it is only rarely needed to make the basic meaning clear. It does signal a natural pause and does add a bit more assurance that one’s sentence will not be misread, but if these purposes alone could justify using the comma, they would also justify using it in other cases where Johnson condemns its use. Johnson’s inconsistency is a sure indicator of subjective bias.

The odd obsession with eliminating “unnecessary” punctuation, dominant but inconsistent in Johnson’s text, can be found in some works even older than Johnson’s, such as John Opdycke’s 1941 revision of Get It Right!:

No comma is required in such compound sentences as these:
We hurried home after the party[,] but they had already left.
John was determined to go[,] and Mary was equally determined to accompany him.
The old grammars prescribed a comma before but in the first example and before and in the second [I’ve added these in brackets], and this punctuation is still observed to some extent. The rules were [are]: Use a comma between clauses of equal rank connected by and, but, or, nor; use a comma between clauses that are set in contrast to one another; use a comma between coordinate clauses that have different subjects.[7]

The basic assumption of people with this punctuation-averse bias seems to be that punctuation within sentences is somehow distasteful and should be expunged as much as possible. Thus this author brags: “Note that thruout [sic] this book punctuation is used as sparingly as possible without sacrificing anything (it is hoped) to clarity.”[8] And he asserts that, rather than “lean upon punctuation,” writers should “rewrite and rewrite until they get their sentences so closely knit in construction that no punctuation (or at least very little) is required.”[9] This, however, is just Opdycke’s personal aesthetic bias, a bias that, among other things, unquestionably limits the complexity of thought one may incorporate into a sentence. It also prevents signaling pauses that would naturally occur if one spoke a sentence as its writer would speak it. Surely the two sample sentences have a slightly different flavor and emphasis when one places a comma before “but” and “and,” and surely it is possible that a writer may prefer that flavor and emphasis over the one produced by leaving the commas out.

The same is true of Johnson’s preference and of Garner’s. I find it hard to believe that anyone fails to sense a natural pause before the “not only” in “not only…but also” constructions, a pause often stronger than the one before “but also,” yet these gentlemen have rejected both commas, and the dominant trend of American usage has rejected the first but not the second (a weird inconsistency presumably due to the high prevalence of commas before “but” in situations other than “not only…but also” constructions). This clearly indicates that subjective aesthetics are at work. This is a matter of stylistic taste where the “old paths” condemned by the likes of Opdycke, Johnson, and Garner have no less claim to legitimacy than their punctuation-averse pronouncements. The older rules made explicit much that the punctuation-averse prefer to leave implicit. Is one “wrong” to prefer explicit over implicit communication? Is another “right” to prefer the implicit over the explicit when punctuating? Why start from their assumption that any punctuation that isn’t definitely required for basic clarity should be left out? Why not instead start from the assumption that any punctuation favored by the traditional rules should be retained if it doesn’t reduce clarity, particularly when it serves such useful functions as signaling natural pauses and making grammatical relationships more explicit? (To my eye, both these useful functions improve clarity.)

One benefit of retaining, or reestablishing, traditional rules is that these rules make mastering English as a second language easier. While native writers may successfully communicate though they approach sentence construction in an ad hoc manner based on subjective aesthetics (I’ve actually seen would-be freelance editors advertise themselves as “intuitive editors,” which I take to mean that they edit based on subjective aesthetics alone), non-native writers generally do best if they rely on rules. Such, at least, is my impression, an impression based upon several years’ experience revising non-native writing.

Of course, we live in a culture that is daily more obsessed with doing things quickly, even when that means doing them less well. Self-help books emphasizing how to think more quickly do seem to outnumber those emphasizing how to think well. (After all, even in the much slower days of 1991, popular culture recognized that “you miss too much these days if you stop to think.”) The willingness to read slowly and critically, though that means reading less, seems far rarer than the urge to read rapidly and uncritically, since that means being able to read more (and to spend more time streaming the latest degrading content from Netflix and similar services). Less punctuation means fewer characters to process, and speed readers don’t like being made to pause. Really, it’s a wonder they still tolerate periods at the end of sentences.

Though Opdycke’s use of “thruout” in place of “throughout” shows an inclination to embrace trends destined not to catch on (in this case, simplified spelling), his inclination to leave out punctuation his forebears thought desirable seems to align with a trend that would soon become dominant. Warfel, Mathews, and Bushman, writing in 1949 (8 years after Opdycke’s text), speak of “The present tendency to open punctuation, that is to punctuation used only when clarity demands it,” asserting that the new tendency “is a recognition of the fact that some interrupters do not confuse the reader.”[10] (No doubt that is why they inserted no comma after “that is” in this sentence, though they use it elsewhere.[11]) Concerning a rule that adverbial clauses and verbal phrases preceding main clauses should be set off by commas, they state that

When the preceding clause is short and a misreading is improbable, the comma may [not “should”?] be omitted:
ACCEPTABLE [not “preferred”?]: If he believes that story[,] he is a perfect gull.
ACCEPTABLE: When I first came here[,] I was only twenty.[12]

I’ve noted in brackets where the classical rules would add commas. I’ve also noted in brackets that these authors don’t go so far as to assert that leaving these commas out makes one’s style superior. If they meant to assert this, they would say one “should” omit the comma in these cases; they would also call the comma-less usage “preferred” rather than “acceptable.”

Overall, these authors seem much less eager than Opdycke to embrace the “open punctuation” trend. Concerning a rule that items in a series of coordinates should be separated by commas “unless the individual members contain enough commas to require the semicolon,” they observe that “Some writers omit a comma before the and which joins the last two members of such a series; [but, they add,] to use one is not only acceptable but wise.”[13] And concerning the use of commas to separate coordinates more generally, they write:

Coordinate elements in a sentence, if they are main clauses or items in a series, might be confused if they are not separated by commas. Even where the danger of confusion is slight, the presence of the commas makes for easier reading.[14]

Their advice to use punctuation even when “the danger of confusion is slight” seems in tension with their willingness to tolerate open punctuation in other contexts. I suspect this tension becomes unavoidable whenever one makes the standard for punctuation one’s subjective sense of what will or won’t confuse a hypothetical reader. Readers vary in their propensity to misread a sentence, and writers invariably perceive their own writing as clearer and more univocal than will any of their readers. One’s sense of what makes writing perspicuous, like one’s sense of what makes writing aesthetically appealing, is highly subjective. To what extent one’s sense of perspicuity or aesthetic appeal should determine one’s punctuation seems open to question. Might not stricter adherence to formal rules compensate for these questionable guides?

Terminological Illness: Purely Plural “They” Now Mandatory for Christians ^

Although usage of the third-person plural pronoun (they, them, their) as a third-person singular generic has a long history in informal English, its evident illogic notwithstanding, the recent enlistment of this usage into the service of transgender ideology makes Christian rejection of all but the plural use of “they” mandatory. Though using “he” for both known-to-be-male subjects and generic subjects seems to me to best combine the logic of using proper singulars with the stylistic avoidance of complex prolixity, “he or she,” though wordy and awkward, now has the advantage that, in addition to being precise, it asserts the reality of humans’ physical nature as sexually binary, the reality that humans are innately either male or female by design and that this is a physical reality to which one’s thoughts and emotions must be made to conform, not a mental or cultural construct that alternative constructs may override at will.

While it may be granted that a few individuals, due to rare chromosomal abnormalities resulting from nature’s fallen state, may actually fall outside God’s binary design physically, one could argue that anyone possessing a Y chromosome, even if abnormally combined with two X chromosomes rather than one, should be considered physically male. In such cases, surgical correction could legitimately be pursued. Surgical alteration of normal XX females to make them appear male, and of normal XY males to make them appear female, however, cannot be justified. “Doctors” who perform such surgeries are mutilators and criminals preying on people who need help learning to accept the biological gender God has given them. This doesn’t mean such people need to embrace all the gender stereotypes their culture may wish to impose on them; culture may always be challenged when its requirements are not grounded in Scripture. It does mean, however, that, no matter how much a female hates being female, no matter how much a male hates being male, no matter how strongly a male believes himself female or a female believes herself male, sex or gender is not “assigned” at birth—it is observed: one’s male or female gender is an objective reality, not a cultural construct.

Though it is possible I’m just noticing this more as the issue becomes a matter of political and cultural conflict, I’m getting the impression from popular culture that advocates of social liberalism are making a special point of using third-person plurals as singular generics. For instance, the season of PBS’s Endeavour most recently released to Amazon Prime has an (I believe) Oxford don, in a context where he is speaking rather formally, say “to each their own” rather than “to each his own.” Was this really the typical usage of Oxford dons in the late 1960s, or does the Endeavour scriptwriter have an agenda? I suspect the latter, though I grant that the general public in the 1960s, like the general public for a very long time before that, might have used “their” quite naturally. I also suspect that some readers, particularly those who like to think highly of themselves for watching “more refined” PBS fare, will object to my identifying a PBS Mystery show as “popular culture.” Such readers should get over themselves.

Anyone inclined to agree with me and embrace logic over linguistic drift should be prepared to ultimately lose this conflict. As Science Officer Spock would affirm, humans’ mental processes are not fundamentally logical. Informal usage, particularly in spoken communication, seems always to have found using the third-person plural as a singular generic more natural than using a rigidly logical construction like “he or she.” This suggests that logic here runs afoul of the rules of language hard-wired into the human brain, which hard-wired rules always win out in practice in the end. Then again, when humans just follow their natural impulses, moral decadence always wins out over rectitude and slothful inaction always wins out over hard work, so there is something to be said for resisting natural impulses.

Negations on a Sentence by William Barr ^

At one point in remarks he prepared for a speech at Notre Dame, Attorney General Barr says that “From the nature of things we can, through reason, experience, discern standards of right and wrong that exist independent of human will.”[15] His inclusion of “experience” here shows how irresistible today’s dominant empiricism can be. As I noted in a prior post, the external world that we often call “nature” provides no consistent testimony as to morally proper human conduct, except perhaps “whatever you’re strong enough to get away with is okay.” The Bible’s Book of Ecclesiastes shows how only by falling back upon revelation can one make sense of a fallen natural order that might otherwise turn one nihilistic. Barr’s inclusion of “experience” is therefore unfortunate, introducing a bit of confusion into an otherwise excellent speech.

When he penned our Declaration of Independence on behalf of himself and other founders, Thomas Jefferson made a point of calling the truth that every individual possesses certain inalienable, God-given rights “self-evident.” This is not the language of empirical inference dependent on experience. It is the language of intuitionism or apriorism. When one asserts that something is “self-evident,” one asserts that no evidence or experience is required, that the truth of some proposition is directly perceived by the mind and indubitable. Truths of this sort cannot be shown false by any empirical inference—whereas empirical inferences are merely probable, self-evident truths are undeniable—and cannot be refuted by honest and valid reasoning. If someone denies a truth that is self-evident, it isn’t because he needs to see more evidence or have more experiences; it’s because he’s not being honest with himself about what he knows to be true.

To our founders, human nature was self-evidently such that every individual necessarily possesses certain rights. Philosophers do refer to self-evident, foundational truths, what mathematicians call axioms, as “deliverances of reason,” but they are deliverances of reason as a truth-perceiving faculty rather than as a problem-solving tool. Reason as a problem-solving tool depends on these truths to start its work; they are necessarily presupposed in order for reason to engage in any problem solving at all. If you apply reason to what you experience in nature and draw the conclusion that individuals have rights, you understand rights to be probable inferences from empirical observation: it seems to you, based on your experience, that all individuals must have certain rights. Rights you believe in on this basis are not self-evident, and new evidence and experience might incline you to reject some of them in the future.

Now, were you a scripturalist disciple of Gordon H. Clark, you would hold that the propositions of Scripture, and those propositions alone, are self-evident and indubitable. To justifiably believe in individual rights, you would have to find them included among or deduce them from the axioms of Scripture. Were you instead a presuppositionalist disciple of Cornelius Van Til, you might see the gracious and sovereign triune God as the sole self-evident reality. From the nature of this God as wholly truthful, you would see Scripture as wholly true and would infer truths from it in much the way scripturalists do. With the help of Scripture and awareness of the Fall that Scripture reveals, you would also see inferences from the natural world, properly understood in a Trinitarian and scriptural context, as a useful source of knowledge. You might find a basis for belief in innate individual rights in God’s triune nature, in the truths revealed in Scripture, or through empirical inferences of some sort. Neither a scripturalist who deduced rights from scriptural axioms, nor a presuppositionalist who deduced them from God’s nature or Scripture, nor one who inferred them as probabilities from evidence and experience would rightly call individual rights “self-evident.”

Closing Remarks ^

So ends this first post of the final year of the present decade, wherein dominant preferences in punctuation have been questioned, third-person plural pronouns have been proclaimed exclusively plural, and a prominent personage’s understanding of the “self-evident” has been criticized. Thanks for reading.

Notes, Elaborations, & Asides ^

[1] Edward D. Johnson, The Handbook of Good English (New York: Facts on File, 1982), 76.

[2] Ibid., 79.

[3] Ibid., 77.

[4] Bryan A. Garner, Garner’s Modern American Usage, 3rd ed. (Oxford and New York: Oxford University Press, 2009), 574.

[5] Ibid., quoting from J. M. Balkin, “Turandot’s Victory,” 2 Yale J. Law & Humanities 299, 302 (1990). The ellipses I’ve added remove an “as well” that the preceding “also” makes redundant. Garner prefers to keep the “as well,” indicating that “also should have been omitted.”

[6] Johnson, The Handbook of Good English, 73.

[7] John Baker Opdycke, Get It Right! A Cyclopedia of Correct English Usage, revised edition (New York: Funk & Wagnalls, 1941), 491.

[8] Ibid., 477.

[9] Ibid.

[10] Harry R. Warfel, Ernst G. Mathews, and John C. Bushman, American College English: A Handbook of Usage and Composition (New York: American Book Company, 1949), 217.

[11] For example, they write the following: “Commas are used to set off non-restrictive modifiers, that is, those which modify so loosely that to omit them will [would] not change the essential meaning of the sentence.” (Ibid., 217.)

[12] Ibid., 220.

[13] Ibid., 215. One wonders if they would also think it acceptable to re-punctuate their final clause to include the commas Johnson and Garner oppose: “to use one is, not only acceptable, but wise.” This very clearly has a different flavor and emphasis than the openly punctuated version these authors use. Would they, Johnson, and Garner assert that this flavor and emphasis must always be ruled out in favor of open punctuation? And what might they think of stylists who’d prefer they use “that” rather than “which” in this context? This last relates to a “rule” that seems not quite to have taken hold in American English when Warfel and associates wrote their text. Advocates of this rule assert that “that” should always be used with restrictive subordinate clauses (essential identifying information not set off by commas) and “which” with nonrestrictive (optional additional information set off by commas). Though people who like to consider themselves usage authorities will usually identify this as a rule that is now obligatory in American English, and though my love of order makes me want to follow the rule whenever it doesn’t create an unpleasant sound (in some restrictive contexts “which” can sound much better than “that” due to the sound quality of surrounding words), a survey of real-world usage suggests that this particular American innovation still falls a bit short of true “rule” status. At any rate, differing national and regional conventions in English usage and punctuation (compare how American, British, and World English differ in their choice of single versus double quotation marks and in their placement of these marks in relation to punctuation of the sentences in which quotations appear) seem likely to break down over time. (I’ve already seen at least one book published in America, Murach’s JavaScript and jQuery, rev. ed., by Mary Delamater and Zak Ruvalcaba [Fresno, CA: 2017], use the World English convention for quotations rather than the American. I believe I’ve also seen this convention in use on Wikipedia.)

[14] Ibid., 213.