Imaginative, not imaginary

I have been thinking about the place that a poem makes in the world, the place that a poem is in the world. My recent reading on C.S. Lewis’ Narnia series (see the tail end of this previous post) has led me back to a few of his essays. He felt that good stories–whether fantasy, mythology, allegory, science fiction, or epic narrative–take the reader to threshold spaces that are imaginative, not imaginary.

I think that poetry offers what Plato calls psychagogia–“an enlargement of the soul” in C.S. Lewis’ definition, or, as John Joseph Jasso’s dissertation chronicles it, “the idea that rhetoric can lead souls to their own betterment; that is, guide them in an ascent along a metaphysical hierarchy through beauty, goodness, and truth to a fuller participation in being.” Poetry provides such enlargement by permitting the reader to imaginatively undergo transformation via the images and places the poem offers, to experience the turn in the poem’s rhetoric, to feel ‘along with’ the poem’s nature. The poem is a threshold at which the reader stands and makes the choice of whether or not to enter.

Granted, that seems a rather allegorical way to think of poetry, but not, I think, an unwarranted perspective.


Lewis, by training a medievalist, believed that we need to read “the canon” or, essentially, any and all great literature of the past, in order to have “something to set against the present, to remind us that the basic assumptions have been quite different in different periods and that much which seems certain to the uneducated is merely temporary fashion.” (This is from his 1939 sermon “Learning in War-Time.”) I love reading modern and contemporary literature; but I agree with him that through reading the work of the past, we cross a threshold into a new (to us) perspective. I do not know what the past knows; I have to explore, read carefully, infer, and take nothing for granted. I must take the role of observer before donning the garb of critic. For me, it’s as important to approach literature with beginner’s mind as it is to approach the garden with beginner’s mind. Perhaps this is one reason I have always enjoyed reading history: The past is a place I do not know well and therefore have to find a way to enter into anew.

Lewis continues by noting that the person who reads literature of the past “has lived in many times and is therefore in some degree immune from the great cataract of nonsense that pours from the press and the microphone of his own age.” Given the times in which we live and the nonsense pouring from the microphones of our age (which are legion), it takes a good deal of sorting to find the beautiful and the good–which do exist–amid the resounding chaos. I do not recommend a full retreat into reading Beowulf, the Iliad, or Tolstoy, but tempering my intake of current media with poems and stories reminds me that I ought to question my basic assumptions and the basic assumptions and perspectives of others, including people who lived long ago in eras and cultures about which I know very little.

A good read inclines me toward the imaginative. Whatever arts it may take to get me there, past the imaginary and into imagination, whatever aesthetic form it takes, I am grateful.

Pauline Baynes’ illustration: Narnia’s lamppost in snow.









Jargon & rhetoric

This is a kind of continuation of my last post, in which I alluded to euphemism and jargon and the weightiness of words. Herein, I take the unpopular stance and argue that the Centers for Disease Control’s suggestion that proposals avoid certain words is not entirely about censorship but about rhetoric and persuasion and, in this case–given the makeup of the current Congress–was actually appropriate. This is a situation in which jargon–wording–makes all the difference in the persuasiveness of an argument.

The CDC needs to send its annual proposals for research, for agency budgeting, and more to Congress; and each set of documents requires a Congressional Justification. Proposals for which Congress withholds justification will not be funded. (It is possible that many citizens, myself included, are not fully informed as to how these government-funded agencies operate.) Therefore, the possibly skeptical audience must be convinced of the value of these proposals.

~ ~

If this were an essay for my students, I would prompt them to think about the audience for their arguments. A writer can employ jargon effectively to help persuade a skeptical audience. Use the terms of the discipline, I tell them. That usage may seem superficial, but it actually works to prove to the audience that you know what they are seeking; you are a member of that community, you know the lingo, you’re on that side and your research will advance that cause.

It may be that your research is something well beyond the audience’s understanding, but you know how to sound like one of them; so, chances are, they’ll get on board with whatever you are conducting.

Even if they have no idea, really, what it is you’re proposing. Even if what you are proposing in fact runs counter to the audience’s ideology, effective proposal writing can hide the fact.

The CDC, from what I have been reading, has not banned those seven words nor the research or public policies concerning them; instead, the agency is cautioning its researchers and policy proposal writers to avoid language that would ring the wrong bells in the ears of this particular Congress’s majority ideology. I see nothing wrong with that. In fact, as a person who guides others in writing arguments based upon research, I encourage it.

Gotta know your audience, or your proposal will fail no matter how excellent its logic, science, and methodology may be. I hope the resourceful writers at the CDC will find ways to convince our Congress to keep the agency and its research funded. A future Congress may be less averse to evidence-based information. In the meantime, use the jargon in proposals–whatever works! But use the best words, and the most clear and accurate words, in other forms of discourse.






Complex ambiguities

Rebecca Solnit from Ploughshares, May 2016: “We live in a time when …purveyors of conventional wisdom like to report on the future more than the past. They draw on polls and false analogies to announce what is going to happen next, and their frequent errors… don’t seem to impede their habit of prophecy or our willingness to abide them. ‘We don’t actually know’ is their least favorite thing to report.” [My italics.]

I am the sort of reader who loves to hear experts announce “We don’t actually know.” But I recognize I am in the minority–in this respect–in my culture. That most Americans are willing to abide such speculative prophecies worries me a bit, and I do what I can in the classroom to waken my students to the possibility of erroneous thinking, even on the part of supposed experts and aggregate sources.

Yes, once again I am teaching argument to freshmen…the classic example of what Solnit calls naïve cynics:

Non-pundits, too, use bad data and worse analysis to pronounce with great certainty on future inevitabilities, present impossibilities, and past failures. The mind-set behind these statements is what I call naïve cynicism. It bleeds the sense of possibility and maybe the sense of responsibility out of people.

Maybe it also says something about the tendency to oversimplify. If simplification means reducing things to their essentials, oversimplification tosses aside the essential as well. It is a relentless pursuit of certainty and clarity in a world that generally offers neither, a desire to shove nuances and complexities into clear-cut binaries. Naïve cynicism concerns me because it flattens out the past and the future, and because it reduces the motivation to participate in public life, public discourse, and even intelligent conversation that distinguishes shades of gray, ambiguities and ambivalences, uncertainties, unknowns, and opportunities.

Scholarly argument should ideally create discourse, not embattled absolutism on things that cannot ever be “proven.” In fact, I have forbidden my students to employ the word “prove” (or any of its conjugations) in their argument papers. They know my rationale for this lexical excision; I also warn them away from “always,” “never,” and “everyone.” But they are not yet experienced enough critical thinkers to recognize that my practice is also to encourage research and nuance, to shove them (gently) out of their naïve cynicism into the world of no-easy-answers, no-slippery-slope-thinking; a world of wonderfully complex ambiguities waiting to be more fully explored.

I think of my oldest child who, many years ago, was insistent on knowing ahead of time how everything was going to turn out: “Does the movie have a happy ending?” “Does the Little Red Hen get anyone to help make her bread?” “Can I win this game?”

Raising a child who is temperamentally anxious requires a form of parenting that offers comfort but admits to unknowingness. (Next up on my reading list: America the Anxious, by Ruth Whippman!) Solnit says the alternative to naïve cynicism is “an active response to what arises, a recognition that we often don’t know what is going to happen ahead of time, and an acceptance that whatever takes place will usually be a mixture of blessings and curses.”

I don’t think I have ever heard a more accurate description of what being a human entails.

Continuing the discussion

The semester is almost over, and my students and I have spent a few weeks doing writing that relates to Cass Sunstein’s book Why Societies Need Dissent. As it turns out, this semester coincides with considerable current-event attention on protest, conformity, stereotyping, and other issues Sunstein explores in that text. Social media pushes the herd mentality, the “troll” mentality, and the ease of using shortcuts in thinking: justification through bad analogies, irrational responses, barely-considered ideas, culturally-entrenched concepts, knee-jerk reactions.

In other words, the gamut of human social psychology in 140 characters or thereabouts, with links, memes, and dudgeon.

A case in point that appeared on social media last week is a photo of a black man holding a sign that reads, “No mother should have to fear for her son’s life every time he robs a store.”

That was a photoshopped “joke” in which someone altered the last line of the protester’s poster. The intent was to assert that Michael Brown had robbed a store before walking down the middle of a Ferguson street, and the intent was clearly meant to suggest that Brown deserved to be shot by police–or, at any rate, to suggest that he was not “innocent.” I agree with the poster even in its altered state because I propose that none of us are innocent, and that none of us deserves to be killed. A suspected robber should be tried by jury and should be considered innocent until proven guilty because that is the way US law reads.

I do not claim that “It’s that simple.” Indeed, the situation is far from simple, which is why it feels so fraught and inflames such exertions of logic, law, and character defamation, and so many conflicting opinions–not to mention Facebook “purges” and irate newspaper columns and public protests. These are reasons that discussion can be useful. We need to continue the discussion, even though it is awfully difficult to do so.


If only we could listen to other perspectives. If only we could engage in discussion. I listened to two of my male students talking about being stereotyped. One claimed he was seldom troubled by harassment and not really bothered when people tried to stereotype him. “You’re not black,” his friend responded, “You’re Latino, or whatever.” The first man held up his arm: “Hey, man, I’m darker than you. What makes you black and me not?”

“Neighborhood. Money.”

“Look at you, bro! You’re wearing $185 shoes and new jeans. Dollars to donuts your family has more money than mine.”

They continued in this fashion awhile, sometimes asking me what I thought. If it was history that made them different, couldn’t the black man put it behind him? And he didn’t even really know much about “his” history, it turns out. If it wasn’t skin color that made one man feel less sure of himself on the street, warier, even in a “good” neighborhood, to what could it be ascribed? Was it just a personal issue? A neurosis? Was the Latino man clueless, or oblivious? Or just lucky up to now? Are these issues of confidence, self-esteem, bravado, or fear? Social issues or private ones? All of the foregoing?

And how does all of this relate to how young people of any background, religion, or color comport themselves in the world, deal with society and its assumptions, codes, expectations?


I teach writing. My job consists in instructing students in the perhaps arcane code that clear, concise, informational, and persuasive writing requires if they are to succeed in writing for academia and, later, the world of business information. I tell them: “This is what you should expect others/authorities to expect of you. It’s your choice to follow the conventions or not to follow the conventions, but you need to at least know what the conventions are.”

Meanwhile, I hope they recognize that they should follow the conventions of the rule of law; and if they choose to oppose the law, they should do so with forethought and initially, at least, within the structure of the law. But bad laws do need to be changed, and bad protocols need to be changed, and unarmed people should not be killed for defying authority; and stereotyping–a very natural and automatic human behavior though it is–should be consciously questioned, even though yes, that can make the discussion difficult.

Dissent, controversy, & opinion

It may be obvious that, in this blog, the writer tends to shy away from highly controversial contemporary issues–with the possible exception of my occasional strong views on education–even though philosophical and critical arguments are part of my job and integral to my life interests. One possible explanation is that I am, as Charles Schulz memorably popularized, “wishy-washy.” (This strip is from 1952, © Charles Schulz):


And a little destructive criticism from 1959….


Indeed, my students sometimes get annoyed with me because I do not take sides during class discussions of controversial topics. “Don’t you have an opinion?” they ask.

Why, yes, I do. It is not my job to share my opinions with students, however, as much as it is my job to make them think more than once about their own opinions. It is also my job to help them navigate the complexities of critical thought, weighing “both sides” (and pointing out that many controversies have many more than two sides), and learning that perspective can deepen understanding and sometimes even alter opinions. This approach is far from wishy-washy; it is courageous. It can be risky to analyze rationales and points of view that differ from your own, and risk takes courage.


A good book that explores the courage it takes to analyze and, often, to dissent from the normative view is Cass Sunstein’s Why Societies Need Dissent. Sunstein argues that truly free societies need to permit dissenters room for expression and criticism; he provides evidence that without dissent, societies fail to thrive through change. Because growth is a change process, societies that resist change too rigidly fall apart.

This year, my class and I will be exploring Sunstein’s text in an effort to recognize the kind of thinking and evidence needed before one writes an essay. I hope they apply these ideas in their freshman Philosophy course.

I hope they apply these ideas in my course, for starters…


Argument has a negative connotation in American English, so many critics substitute the word discourse. I have no problem with such a substitution: the term discourse seems to connote politeness and respect, behaviors necessary for useful dissent and analysis of alternative perspectives. The philosophical argument, whether taking place in philosophy class, conference hall, or koan, operates most productively and insightfully when predicated upon mutual respect for differences.

Dissent as discourse may not be the most natural behavior for human beings, but it is something we can demonstrate and coach in the university classroom.

With any luck, both students and teachers may be able to apply the techniques to other areas of our lives. In that vein, here’s an easy-to-interpret Buddhist explanation from New Lotus on how to approach argument in the Buddhist way.





Enter the philosophy paper…

My “day job” at a small university is part administrative, part teaching, part assessment, and largely tutoring in writing. The last of these requires a peculiar balancing act, because my directive says I must not tutor discipline content; I have to tutor students toward “clear expression” while staying within the areas of grammar, spelling, vocabulary use, assignment interpretation, thesis writing, paper structure, and documentation. As a job description, that all sounds quite clearly delineated and objective enough, but writing well cannot happen when the writer fails to understand content material. Enter the Philosophy paper.

In any discipline, it’s difficult to tutor “clear expression” in terms of grammar and vocabulary without also tutoring content. With philosophy that process is especially challenging, because to a large extent, philosophical understanding (content) relies on grammar (rhetoric). A student can contradict himself simply by neglecting to type the word “not” in a sentence, rendering his attempt at argument void. Or a student may announce she will use one approach to prove her claim and then prove the claim, quite adequately, with a different (and opposite!) approach.

This bust resides in the Louvre.


Cases like these cause me to ponder. How can I coach the writer without offering a content-based answer? Philosophy itself supplies the method: inquiry.

“So, you say here that because Locke believed in Natural Law, he would not apply Natural Law in the case of the social contract. Can you explain that statement? Because it seems as though you are contradicting yourself, unless you accidentally added the word ‘not’ or unless you have more to say after this sentence…maybe, why he would not do so?”

“Here, you do a pretty good job explaining why beauty is in the eye of the beholder, although you need to pay more attention to your use of the comma. But back at your claim in paragraph one, you say you will prove beauty is transcendent–and your definition of transcendent doesn’t work with your argument in paragraph three…do you mean beauty is not transcendent? Did you forget a word, or are you missing a paragraph of explanation?”

When the science students or economics students bring papers to me, it is, I admit, much easier for me to stick to grammar and mechanics. The same sorts of logical structure or argument issues crop up, however. Sometimes, I feel as though I am right on the borderline, and sometimes I think I’ve teetered a bit too far into content tutorial–especially when the students are writing about history, philosophy, or literature. Yet would any philosopher disagree that you cannot completely disentangle the logic of grammar from any other kind of logic? They stem from the same root.

Critique: an anecdote

A group of friends gathers once a month to read and share and critique one another’s poems. They know one another well enough that they can be empathic, honest, and helpful; also, they are trusting enough to bring poems that really aren’t “working.” The critique’s the best way to get some understanding of why a poem is not working.

Recently, one member shared a poem that was somewhat philosophical in its overtones. The rhetoric of the poem was shifting, getting away from her, and she knew it. But she couldn’t figure out why, or how to correct the problem.

The talk moved away from critique and into values–for a while. So the group was no longer exactly discussing the poem, or poetry, at all.

And yet, through the conversation, the writer recognized where at least part of the poem’s problem lay…which was in image and in structure (as illuminated, perhaps, by value and by rhetoric).

Reason can lead to beauty.