Orwell and Socrates Want You to Make Sense, Not Nonsense

In 1946, Orwell published an essay called “Politics and the English Language,” in which he bemoaned the “mental vices” that had put the language into decline. He said that the English language “becomes ugly and inaccurate because our thoughts are foolish, but the slovenliness of our language makes it easier for us to have foolish thoughts.”

 

That is why deep learning is important no matter what method of acquiring information we use. The Socratic method tends to force deep learning on the student, because the effort of memorization ensures that everything resides within, while the use of debate ensures a critical questioning of everything that we think we know. A person with deep learning won’t accept just anything. He learns to recognize malicious nonsense for what it is. Expert performance in writing requires curiosity and a critical mind.

 

But because Socrates was not a practitioner of reading and writing, he couldn’t see the subtle ways in which it might be used and useful, even for the kind of learning that he was after. Lev Vygotsky, born in 1896 in Belarus, founded a branch of psychology known as cultural-historical psychology. He viewed writing as every bit as dynamic as speech, perhaps more so, and fully capable of creating the same sort of dialog that Socrates used. It can create that dialog inside the writer. The effort of writing and rewriting, over and over, gradually refines the text so that it can become as sharp and deep as Socratic dialog. When approached in that way, with great effort, writing can lead to what Socrates called “virtue,” the handmaiden of truth. But both the Socratic dialog and expert writing are hard work.

 

Maryanne Wolf is the director of the Center for Reading and Language Research at Tufts University. She wrote that “Reading is… enriched as much by the unpredictable indirections of a reader’s inferences and thoughts, as by the direct message to the eye from the text.” Her concerns are not unlike those of Socrates and of the Sanskrit scholars of the fifth century BCE, who also opposed writing. We are at a similar turning point as we make the transition from a world of rich voluminous texts to a world of digital imagery, video and audio clips, and fragmentary messages that have no dramatic structure. (Dramatic structure imbues a text with the emotional valence that makes it durable in memory.) Most of those new channels through which we receive information serve to shatter attention, not to drive a coherent narrative. This shattering of attention began with radio and television (or perhaps much earlier if we include ancient cave art as a contrast to reality). In those broadcasts the narrative is constantly interrupted by advertisements. The effect has grown dramatically with the use of the Internet. And as in the time of Socrates, there are those who believe that this new technology is going to make us stupid.

 

It is now well known among neuroscientists that certain parts of the brain can rapidly recast their networks of neurons in response to whatever we practice doing. Learning to read, for example, changes the visual cortex, among other areas. Instead of interpreting the lines and angles of the letters, we develop new networks of neurons that are devoted to recognizing whole words and instantly connecting them with their meaning, their mental models and emotional labels. Reading also forces us to practice paying deliberate attention. To comprehend what we read requires disengaging from whatever else we are doing and diving into the text. And reading with deep concentration and comprehension activates the brain broadly, across many different areas, in a complex recursive process that not only allows us to understand what is on the page but that calls up the whole universe of our own knowledge, experience, inference, and imagination. When reading a good book, you may say, “I felt as if I were there.” If you felt that way, you were there.

 

Surfing the Internet, sending text messages, watching videos, or downloading music does none of that. I enjoy those activities and find them useful. But they are also brief and fragmentary exercises. We’re adept at learning what we practice. With enough practice, we begin to learn a new way of approaching the process of acquiring and using information, one that skims the surface without the need for understanding or even emotion. With our brains thus altered, we begin to do everything a little bit differently. We learn that we can have answers instantly, and we gradually give up analytical thought. This can have catastrophic effects when we attempt to choose our leaders in this frame of mind.

 

Because the Internet allows us to do many things quickly and simultaneously, we learn both to hurry and to distribute our attention over many unrelated matters. Rather than regarding the world with a deep curiosity and a critical eye, we begin to accept what is presented unquestioningly. If we drive a car, we are not walking. And to walk is to be human. It is our most ancient legacy. To drive is, well… a dead end.

 

Already the influence of the Internet can be seen in television, which scrolls news across the screen and uses pop-up ads. Magazines have fragmented their text and graphic design into bite-sized bits that at their best disrupt deliberate attention and at their worst make no sense at all. Newspapers now feature capsule summaries so that people don’t have to read the articles. The headlines in the New York Times have lost all meaning as far as news is concerned. And signs and instructions everywhere have been stripped of words in favor of icons. (I saw a street sign pointing to the library. It had no words on it, only a silhouette of a man reading a book. Presumably, this was intended to direct illiterate people to the library.)

 

Obviously, there are clear benefits of having instant access to a universe of information. As a writer, I find that in an hour I can do research that would have taken me a week or a month to do just a few years ago. But every new advantage comes with unintended consequences. Just as a sedentary life will erode our strength, the shattering of our attention by brief bursts of unrelated information will influence the way we think. With the Internet at my fingertips, I begin to feel as if I know everything. But I have to be careful to separate what I know from what I can look up, lest I, as Socrates put it, “appear to be omniscient and… generally know nothing.”

 

   Technology is already changing the way people read and learn. A five-year survey was conducted at University College London to find out how people use two popular research sites, one operated by the British Library and one by a British educational consortium. The study showed that researchers are skimming, not reading deeply. They quickly move from source to source, reading a page or two. They don’t return for an in-depth look at the work. The report said, “It is clear that users are not reading online in the traditional sense; indeed there are signs that new forms of ‘reading’ are emerging as users ‘power browse’ horizontally through titles, contents pages and abstracts going for quick wins. It almost seems that they go online to avoid reading in the traditional sense.”

 

Wolf argued that the Internet is weakening our ability to think deeply, interpret text, and make the rich mental connections that lead to real learning and new insights. She wrote that the beauty of reading is that it allows us to “reach beyond the specific content of what we read to form new thoughts.” She worries, probably with reason, that the Internet is going to rot the minds of her children.

 

Every new technology comes with a cost. Even something as simple as the clock changed the way people use their minds. As Joseph Weizenbaum, a computer scientist from the Massachusetts Institute of Technology, observed in his book, Computer Power and Human Reason, clocks led people to reject their direct experience of the world. For example, we eat at meal times rather than when we’re hungry. 

 

    In the face of these inventions, the quest for inefficiency becomes a noble aspiration. Looking up a word in a dictionary is different from looking it up on the Internet, because the way a book is structured forces you to browse, exposing you to all sorts of words that you had no intention of finding. The same is true of browsing the stacks in a library. You discover books that you didn’t know existed and remember ones that you had forgotten. You take detours, find dead ends; and all the while your brain is working. At what, you cannot tell. But that very process of exploration in pursuit of a goal is an essential part of developing a rich interior life and a critical faculty that will keep you from accepting nonsense as the truth, even as the effort informs and enriches your writing.

miSFIts

Real writers are miSFIts. If you are a real writer, you are an eternal outsider. You make it your life’s work to get inside and describe what you see. But the very intention to describe sets you apart. None of those people who really belong intend to describe where they belong. They are living it. We are peering in through the peephole of our craft. 

 

So real writers are destined to be alone, belonging nowhere, because they–we–can never be true insiders. The real world is always at a distance, on the end of a stick, and the stick is the pencil with which we write our descriptions. 

 

I spell the word miSFIts with those funny capital letters, because I’ve been writing a book about the Santa Fe Institute, and it is a home for misfit scientists who could never be insiders at conventional academic institutions. 

The Art of Writing

Still, the beautiful can be distinguished

   from the common,

   the good from the mediocre.

 

Only through writing and then revising

   and revising

   may one gain the necessary insight.

 

… verbosity indicates lack of virtue.

 

from Wen Fu by Lu Chi (261-303)

Socrates Was Right—And So Was Plato

The way the brain handles language brings up an interesting question that I’ve heard students ask in the age of Google and Wikipedia: Why should I memorize or even read something when I can look it up in four seconds? The answer is this: Memorizing stores knowledge permanently inside of you. What you remember is knowledge. What you have to look up is not. Knowledge is an essential tool of the writer.

 

New knowledge immediately begins to integrate with all your other knowledge, effortlessly and unconsciously. That is a process that you can think of as “simmering,” a term many writers have used. To steal from Auden, this takes place while you’re eating or opening a window or simply walking dully along.

 

Great currents of knowledge, all laced over and under with emotions and sensory feelings, glide together and make new associations that can produce unheard-of ways of thinking or fabulous combinations of words. Everything that you’ve stored in memory will inform and enrich your writing in a way that simply can’t happen if you store everything outside of your brain. The more you put into your brain, the more you get out of it.

 

Memorization is an essential task if you want to write. For one thing, you need to know the meanings of a lot of words. For another, remembering things gives the brain something to work on during the process of simmering. You need to know what you wrote on page one when you’re writing page 100. The struggle for coherence and meaning, the quest for a narrative line, is a continuous task of holding more and more in your head.

 

It involves writing something, then going back to read it so that you can go forward. And then you have to go back and re-read everything once more so that you can proceed a bit further. By the time you’ve finished 100 or 400 pages, you will have read it many times, but you will also have had the physical experience of creating the words with your hands many times as well. Through this process, you will come to possess much of the text internally. It’s no longer just on the page. It’s inside of you, if not word for word, then at least in a fairly detailed form. Because brain and body work together so intimately, you then know your work both by heart and mind. The Chinese word for this is hsin.

 

That doesn’t mean that you literally have to memorize great chunks of text, although that is a good idea. But it does mean, at the very least, that you have to read a great deal and remember much of what you read. And it also means that you have to read and write and then read more and rewrite many, many times. Your brain will do the rest beyond the reach of conscious effort.

 

Socrates argued against writing as a means of storing and conveying knowledge. He was a hard-ass. He called writing “dead discourse,” because it lacked the dynamics of speech. The text, once written, could not answer back when questioned. Moreover, he felt that writing would “create forgetfulness in the learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves.”

 

He said that writing was “an aid not to memory, but to reminiscence,” and did not convey “truth, but only the semblance of truth.” He believed that writing would turn people into “hearers of many things [but they] will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.” He could easily have been talking about the Internet. Google can make the generally ignorant look omniscient.

 

Without prodigious memory and active debate, Socrates argued, without the deep and complex structure of learning that the oral tradition engenders, we would come to skim over texts and receive a false sense of knowledge. Writing, he feared, would create a culture of people who not only couldn’t remember but who thought that they were educated when in reality they were ignorant. Their knowledge would be superficial, and in the end, we’d lose all control over knowledge. Yet without Plato, we would know none of this. He was a student of Socrates. He rebelled against his teacher and wrote it all down.

 

Socrates was right. Writing did decrease the need for memory. And as he predicted, the written word has been the tool of much mischief and misinformation. Socrates could see that writing would profoundly change the way the human brain works. But in exchange for our memory and the inevitable spread of pernicious nonsense through writing, we gained the ability to consult books on every conceivable subject. We have a much broader range of information available to us than even the best human memory can hold. Indeed, what we can’t remember, we can always look up. 

 

The trouble with being able to look everything up is that less and less resides inside of us. The brain is constantly working on what it knows, whether that knowledge is conscious or unconscious. As we put more and more diverse knowledge into our brains, we have more and more insights and original ideas. That process won’t work if we never learn anything deeply and instead rely on looking everything up. We are less likely to make imaginative leaps. Our words will not combine in those rich and lustrous ways that can make writing such a pleasure to read. People who want Ph.D.s still have to take oral exams. Like the students of Socrates, they must know their subject “by heart.”

Like a Bug!

“Keller and others accuse the impacters of trying to squash deliberation before alternate ideas can get a fair hearing.”

 

– The Atlantic, “The Nastiest Feud in Science,” Bianca Bosker, September 2018.

 

Journalists often misunderstand the word “quash.” It’s a legal term that means to overturn or to suppress, as you would “quash a move to dismiss the charges” or “quash a revolt.” The word “squash” means to compress something by squeezing, such as “I squashed the grape.” You can quash a motion in a court of law. You cannot squash it, as it has no physical body.

 

Gerta Keller, a scientist, would never put up with squashing other scientists. See her here:

 

https://gkeller.princeton.edu/

 

The offending sentence, which was presumably read by the illustrious editors of The Atlantic and perhaps even by its author herself, contains the additional ugly error of using the word “alternate” where the word “alternative” is needed. “Alternate” ideas are ones that switch back and forth. “Alternative” is meant to describe a situation in which more than one choice exists.

 

As for the word “impacters,” your guess is as good as mine. Norman Mailer once said that letting journalists have access to the printed word was like giving a loaded gun to a three-year-old.

Don’t Make Me Gag

Andrea Wulf wrote a brilliant and moving book called The Invention of Nature about the science of Alexander von Humboldt. I started to read it and could not stop. 

 

However, she had a few tics that a keen editor should have caught. She could not seem to place the word “only” in its proper place.

 

And she wrote this sentence, describing the difficult ascent of a high river valley by von Humboldt and his partner, Aimé Bonpland: 

 

“Bonpland was struggling with thin air–feeling nauseous and feverish.”

 

The word she’s looking for is “nauseated.” The word “nauseous” describes the substance that is the cause of nausea. Hence: “The filthy toilet was nauseous.” Or: “I was nauseated upon seeing how filthy the toilet was.” Or: “The sight of the dead man’s brains on the sidewalk made me nauseated.” 

 

The brains were nauseous. The narrator was nauseated. As was Bonpland.

 

For the use of “only” see the entry on “only.”

Don’t Tell Me What to Think

“One of the most remarkable aspects of T cells and B cells is that once an innate cell presents them with antigen, they can remember it for the rest of your life.”

David R. Montgomery and Anne Biklé, The Hidden Half of Nature, W.W. Norton, 2016.

 

The authors are telling me that I must regard this fact as remarkable. If the fact is remarkable, they ought to convey it in such a way that I find it remarkable on my own. 

 

Also, the construction “aspect… is that…” is clumsy and serves no purpose. The word “aspect” is less than satisfactory for what is meant. “Aspect” is more appropriate for architecture. “Characteristics” would be better here. But not even that word is needed.

 

Better: “Innate cells present antigens to T and B cells. Then the T and B cells remember those antigens for the rest of your life.”

 

Is that remarkable? You tell me.

Junk Words

“The research is hazy, if not totally silent, about exactly where the amber finds come from.”

“Blood Amber”

New Scientist, Graham Lawton, 4 May 2019

 

Don’t use words that you don’t need. How is “totally silent” different from “silent”? If something is silent, it makes no noise. Totally making no noise does not increase its silence. Likewise, “exactly where” is no different from “where.” 

 

In many cases, you may find that adverbs can be removed from your sentences without doing harm.

 

This sentence also suffers from the crime of using a noun as an adjective and a verb as a noun. This leads to a confusing construction. The phrase “where the amber finds” leads the brain to expect the word “amber” to be a subject and “finds” to be a verb. Encountering the phrase “finds come” or “come from” sets up a dissonance that causes us to double back to see what the writer intended. 

 

Better: “The research is hazy, if not silent, about where the amber comes from.”

Don’t Use Nouns as Adjectives

In December 2019, the New York Times Magazine published an article by Heidi Julavits called “What I Learned in Avalanche School.” In it, Heidi repeatedly used awkward constructions that could easily have been fixed by the use of prepositions or the elimination of a word. She is not only an associate professor of writing at Columbia University; she has also published several novels and won a PEN New England Fiction Award and a Guggenheim Fellowship.

 

One sentence she wrote said, “In many nonavalanche-terrain scenarios, if a person falls into a heuristic trap, the outcome isn’t death.” It is permissible to put the prefix “non” before any word, but “nonavalanche” stretches the point and is awkward. And creating an adjective by adding a hyphen and the word “terrain” produces an unpleasant effect that cries out for revision. The vogue word “scenario” has no place in good writing.

 

Further on: “We noted the resistance variation between the layers.” Better: “variation in resistance between…” But we don’t really know if the original sentence meant that they were noting the resistance or noting the variation. She could also say, “We noted the difference in resistance between the layers.” 

 

All of this can be avoided by not trying to use nouns as adjectives.

 

She also wrote, “survival strategy,” “many-feet-deep snow,” “avalanche autopsy,” and “tree bough.”