Writing self

Among the students I have tutored over the years was a young woman recovering from a traumatic brain injury. Writing was difficult for her on several levels. Reading on the screen or page tired her eyes and made it hard to focus; voice-activated software helped with that part of the problem, but it did not resolve her larger cognitive loss: she found she could no longer tell a story. The ability to tie together research, concepts, and chronological moments to compose a logical narrative evaded her.

As we worked together, I learned how writing can restore the self. She began to reflect, through writing, on her process and her memories and to tether things together on the page so that they “made sense” to me–her sounding board. When something made sense to me, she would re-read it and decide whether it reflected what she had been trying to say. Gradually, she felt more restored to herself: a slightly altered-by-trauma self, but a cohesive self who could tell a story again.

~

When I tutor students who are multilingual, particularly if they are fairly new immigrants here, I find that writing plays a similar role in reflecting or re-creating a self. These students learn to work and write using American English as their mode of persuasive communication, and in the process they develop as people who live in the United States and who consciously employ its terms, phrases, writing techniques, and concepts. They are far more aware than “native” speakers that they are using Americanisms and writing in an American style; what they end up with is a self that they can deploy when necessary in American society.

~

Brain diseases, strokes, and dementia dismantle the story-telling ability. Whether we use the metaphor of braiding, warp & weft, or nuts & bolts, we mean that story has structure–and in dementia, structure comes undone. With that structural demise all too often comes the unraveling of the self. Each gap weakens the links that give us our own story-made self and leaves the human bereft of that consciousness we rely upon for being. The person whose brain has stopped constructing self stories is no less human, physically; but the self–that sentient, much-valued ego–disappears.

When I am with a hospice patient whose mind has stopped composing narratives, I see that the narrative of pain and envy and sorrow seems to depart. Is there a story that contains only peace? Could that even be a human story?

I don’t know what to make of all of this.


Sometimes, I wish I had the peace and confidence of a house cat.


Transitions & ambition

I have maintained this blog pretty regularly, for years now, writing about books and poems and gardens and teaching, examining the concept of consciousness and trying to plumb–from a novice’s perspective–the brain’s wiring and functions. I suppose I am seeking a kind of “interdisciplinary” approach in these posts and in life: a philosophy of values that considers the arts, aesthetics, evolution, biology, social structures, neurology, consciousness, physics, etymology, pedagogy, ecology, and compassion (have I forgotten anything?) in a distinct but expansive method of living in which I can situate myself and which might guide my behavior as I make my life-long way through the world. If, by some chance, my words influence a reader–so much the better; this is, after all, a public space (WordPress.com).

Like many people who use social media platforms for their writing, though, I have a mixed view of such platforms’ suitability as a medium and of their perceived necessity for contemporary writers. My purpose, originally, was to practice writing prose and to promote the arts and the natural environment as necessary complements to, and instruction for, the development of empathy (compassion) and metacognition in human beings.

The blog has been reasonably suitable for practice; it gets me writing what is basically a brief essay on a more-or-less weekly basis. It has several thousand “followers,” but only a handful of readers. [I can discern this through the statistics page on WordPress, though I don’t check often.] In general, I use this platform mostly as a way of “seeing what I think,” and it serves that purpose, too.


I have come to some conclusions about the problem of consciousness (and about whether it actually is a problem) through the reading and experiences of the past ten years or so. Those conclusions are, however, private ones. While the process of discovery and inquiry works well in a public forum, the takeaway remains, for this blogger, a thing carried within.

But.

~~

But other blogger-writers have influenced my thinking about what a public forum such as blogging or Facebook can do for the writing process. Dave Bonta and Luisa Igloria, as well as Michael Czarnecki and Lou Faber–among others–promote by example the option, and value, of publishing new or unedited, unfinished, partially-revised work. Granted, not all of them have thousands of readers who weigh in with criticism or encouragement; but the very process of making public the work-in-progress seems to me to be courageous. This may be because I am a wimp, or it may be because the social aspects of the vaunted “po-biz” have dampened my willingness to show a kind of transparency in my writing methods.

I am not on the tenure track and will not be teaching in an MFA program, however, so why would it matter?

Therefore: be prepared, oh limited but blesséd audience. I may begin to foist upon you the recent sad, sad poems I’ve been writing–in draft form. Or I may begin to reveal the poems from my seven-years’-unpublished manuscript online. Or I may, like Luisa and Michael, begin to blog “a poem a day” (unlikely, but…). It seems to me that a transition is in order here. And that stands as my writing ambition for the moment, as autumn makes its way toward the solstice and I face another stack of student essays to grade.


Not enough

The fall semester is about to begin, a very busy time for me and my colleagues. I need to be nose-to-the-grindstone, yet I have some deep and worrying concerns that distract me from the evaluations, curriculum preparation, scheduling, and staff meetings. Among the 40+ students who attended our university’s summer “bridge” program for college transitions, at least six openly expressed fears about being accepted and wondered how to deal with prejudice on and off campus. I am pretty sure they spoke for others who kept such fears to themselves.

We do not have answers for them. We can only say: Be yourselves, and be that well; say what matters, and say it forcefully but non-violently; and tell us if you feel afraid or need support–we promise we will stand with you.

That promise I take as seriously as any promise I make to my family members or best-beloveds–even though my students are “strangers” to me. I will intervene if I notice that they are threatened in any way. I’m a writer; I know that words, too, can cause harm.

And maybe that promise is not enough.

And maybe marching is not enough (read about marching here and here).

I am by nature a quiet person. But being quiet is not enough.

[image: hate has no home]

It is not enough. It is, however, a start.

Science & philosophy

The small, religiously-affiliated university at which I work graduates a high percentage of its baccalaureates in the sciences, although it offers a liberal-arts-based core curriculum. How does that affect what coursework students must do? For starters, two Theology courses and one Philosophy course are required for graduation.

Three scholarly courses grounded in critical-thinking methods ought not to be more than a student in the sciences–or any other discipline–can handle; but I hear a bit of resentment among the undergrads. They question the necessity of abstract ethics classwork, wondering how such material will apply to a fast-paced, technologically-advanced, science-oriented career or life. Philosophy doesn’t seem like a skill set to them.

While I fundamentally disagree, I take their point. With so much new information coming at them, info-savvy young people might well feel skeptical about what they can gain from reading texts by Plato, Aristotle, Augustine, or Aquinas.

Philosophy has been around for millennia, though; empirical science as we know it–with electron microscopes, satellite-mounted telescopes, petri dishes and x-rays–is brand-spanking new by comparison. The techniques we use today seem concrete and tool-like rather than theoretical; yet as every real scientist knows, the only way developments occur is through hypothesis–theory–claim–assertion–question–pushing the envelope of the known.

Which is what philosophers have been doing for thousands of years.

The budding scientists and medical-studies researchers I encounter seldom realize that without philosophy, science would not exist. Philosophers asked the “why” questions, came up with theories and categories, tried to see into a future that might someday have the technology to confirm or refute the theories they came to solely through human observation and deduction. Problem-solving skills. They were the scientists of their day, and the methods of thinking they came up with are those that contemporary scientists in all disciplines continue to employ.


Descartes, 1640s

A wonderful book on the way philosophy developed into biology (to take just one of the scientific disciplines) is Marjorie Grene and David Depew’s The Philosophy of Biology: An Episodic History.

The authors–a philosophy professor and a rhetoric professor–provide a history lesson in science, taking us by steps and by leaps through the development of a scientific (empirical) skill set derived from the cognitive insights of those Dead White Guys on whose thinking Western philosophy is based.


Darwin’s finches, 1840s

Now, I am not an advocate for a strict return to the Western Civ canon; I think university education should diversify into exploring (and questioning) other modes of cognition, culture, and philosophical approaches. Yet it seems to me imperative that students continue to study, and learn to value, the history of human thought. You can be a nurse without a thorough background in Aristotle’s categorical concepts; you can learn the drill about washing hands, donning gloves, and inserting catheters–all practical, concrete skills. You can understand the rationale for all of those skills; that’s true, and practical.


Cajal’s drawing of a pyramidal neural cell, 1913

Nurses today, however, should have the thinking skills to solve unexpected problems rapidly and rationally–which is how things play out “in real life”–to deduce that something’s going wrong even when the readouts look stable, to recognize that the hurried intern added an extra zero to the number of milligrams of medicine prescribed. They need enough background in the history of medical care-giving to question a doctor or administrator when the ethics of a patient’s care seem to be at risk. These problem-solving skills are not only crucial; they are philosophically based.

~


I will dismount from my high horse now. With all the disorienting information bombarding me these days, I need a poem to reorient myself. Here’s one by Mary Oliver.

Snowy Egret (by Mary Oliver)

A late summer night and the snowy egret
has come again to the shallows in front of my house

as he has for forty years.
Don’t think he is a casual part of my life,

that white stroke in the dark.

==

We shake with joy, we shake with grief.
What a time they have, these two
housed as they are in the same body.


The 4 Cs


Sometimes, when I am in reading-after-a-hard-day-at-work mode, I feel mentally unprepared to tackle difficult books. On such days it is better to settle on the sofa with a glass of chardonnay and a text that entertains as well as informs. I confess that Ruth Whippman’s America the Anxious: How Our Pursuit of Happiness Is Creating a Nation of Nervous Wrecks had me snorting my wine a few times; her wry British cynicism kept me giggling even as her observations sharply critiqued some serious aspects of the culture and nation to which I belong.

In her book, Whippman finds understandable fault with the commodification of happiness, but she also threatens an American sacred cow: the concept of individual happiness that arises from our founding document and its assertion of our rights to life, liberty, and the pursuit of happiness. Implied in her critique of the American “happiness industry” (including positive psychology, attachment parenting, yoga, mindfulness, Facebook…) is that maybe our nativist stance of rugged individualism and the freedom to make money on anything we can capitalize upon, thus pulling ourselves up by our own bootstraps, does not result in “happiness.” (Maybe Jefferson meant something else by that term. We cannot really know.) One reviewer suggested that it is precisely Whippman’s “outsider” status as a person not raised in the USA that makes her book so useful. Changing one’s usual perspective, as I constantly reiterate to my freshman students, can hardly fail to be a valuable exercise in critical thinking and broadening one’s outlook.

Here is an observation of Whippman’s with which I heartily agree: “If happiness is community, then a psychologically healthy society takes collective responsibility for the well-being of its most vulnerable members.” I agree, however, because Whippman’s conclusion happens to coincide with my culture, upbringing, or perspective. Like her, I am willing to accept contentment–with occasional bouts of joy–rather than run relentlessly after happiness; and like her I find most contentment among human beings, though I may want them to shut up and just hang out quietly in the same room with me for a while! Furthermore, it increases my happiness to know that in my community (or nation), other people are cared for, not just me. From my point of view, happiness–including personal happiness–arises when I know that all human beings have their needs met.

But I recognize that not everyone will agree with Whippman’s, or my, conclusion that community is happiness; indeed, there is a good argument to be made for Sartre’s “Hell is other people,” too.

~

Recent discussions about diversity with colleagues in academia (what the term appears to mean, what it might include), and reflections on mortality, consciousness, the notion of the self–and spirituality and religion–not to mention science writing on evolution, have pushed me into a deeply introspective mode. Yet I find I want to converse with other people about these ideas, not hole up in my own head; I seek, and have been happy to participate in, discourse with others.

In another word: community.

~

Here’s a paragraph from Daniel Dennett’s Breaking the Spell that I want to share with my students:

If you can approach the world’s complexities, both its glories and its horrors, with an attitude of humble curiosity, acknowledging that however deeply you have seen, you have only just scratched the surface, you will find worlds within worlds, beauties you could not heretofore imagine, and your own mundane preoccupations will shrink to proper size, not all that important in the greater scheme of things. Keeping that awestruck vision of the world ready to hand while dealing with the demands of daily living is no easy exercise, but it is definitely worth the effort, for if you can stay centered, and engaged, you will find the hard choices easier, the right words will come to you when you need them, and you will indeed be a better person. [italics Dennett’s]

Complexity, community, curiosity, contentment. The four Cs?

Oh, let’s add chardonnay. Make it five.  🙂

Complex ambiguities

Rebecca Solnit from Ploughshares, May 2016: “We live in a time when …purveyors of conventional wisdom like to report on the future more than the past. They draw on polls and false analogies to announce what is going to happen next, and their frequent errors… don’t seem to impede their habit of prophecy or our willingness to abide them. ‘We don’t actually know’ is their least favorite thing to report.” [My italics.]

I am the sort of reader who loves to hear experts announce “We don’t actually know.” But I recognize I am in the minority–in this respect–in my culture. That most Americans are willing to abide such speculative prophecies worries me a bit, and I do what I can in the classroom to waken my students to the possibility of erroneous thinking, even on the part of supposed experts and aggregate sources.

Yes, once again I am teaching argument to freshmen…the classic example of what Solnit calls naïve cynics:

Non-pundits, too, use bad data and worse analysis to pronounce with great certainty on future inevitabilities, present impossibilities, and past failures. The mind-set behind these statements is what I call naïve cynicism. It bleeds the sense of possibility and maybe the sense of responsibility out of people.

Maybe it also says something about the tendency to oversimplify. If simplification means reducing things to their essentials, oversimplification tosses aside the essential as well. It is a relentless pursuit of certainty and clarity in a world that generally offers neither, a desire to shove nuances and complexities into clear-cut binaries. Naïve cynicism concerns me because it flattens out the past and the future, and because it reduces the motivation to participate in public life, public discourse, and even intelligent conversation that distinguishes shades of gray, ambiguities and ambivalences, uncertainties, unknowns, and opportunities.

Scholarly argument should ideally create discourse, not embattled absolutism on things that cannot ever be “proven.” In fact, I have forbidden my students to employ the word “prove” (or any of its conjugations) in their argument papers. They know my rationale for this lexical excision; I also warn them away from “always,” “never,” and “everyone.” But they are not yet experienced enough critical thinkers to recognize that my practice is also to encourage research and nuance, to shove them (gently) out of their naïve cynicism into the world of no-easy-answers, no-slippery-slope-thinking; a world of wonderfully complex ambiguities waiting to be more fully explored.

I think of my oldest child who, many years ago, was insistent on knowing ahead of time how everything was going to turn out: “Does the movie have a happy ending?” “Does the Little Red Hen get anyone to help make her bread?” “Can I win this game?”

Raising a child who is temperamentally anxious requires a form of parenting that offers comfort but admits to unknowingness. (Next up on my reading list: America the Anxious, by Ruth Whippman!) Solnit says the alternative to naïve cynicism is “an active response to what arises, a recognition that we often don’t know what is going to happen ahead of time, and an acceptance that whatever takes place will usually be a mixture of blessings and curses.”

I don’t think I have ever heard a more accurate description of what being a human entails.

Generation Anxious

In the shorthand of age demographics, I am marginally a Baby Boomer. I think there were earlier tag-names (Jazz Baby, for example), but the Boomer generation began a spate of efforts to define millions of people, randomly born within a few years of one another, by some generational attribute that caught on with the media. I am not sure that I fit the conventional Boomer stereotype, but naturally at least some of the generalizations of that era apply to me. That’s why people use stereotypes. It is an easy way to categorize (thanks, Aristotle).

Of course, each so-determined generation feels certain that the antecedent generations are out of touch and misconstrue the attributes and the attitudes of those younger than they–and they are justified in this conclusion.

Often, though, we understand young people’s circumstances better than they realize because, yes, times have changed, but people haven’t. Not that much.

~

I do not have any idea how long the media and demographers will go on calling young people Millennials; but the young adults I meet seem to be more anxiety-ridden than millennial, whatever that means (actual millennials are only 16 years old now). They have grown up with parents who worried about keeping them safe in a society obsessed with security after 9/11. My guess is that in the USA, society’s insecurity entered young lives insidiously through toys, media, the internet, parental conversations, games and gaming, you name it. Parents’ main goal–any tribe or nation’s goal, basic to survival instinct–is to keep their offspring safe. That has felt challenging in the last 20 years or so.

I am not blaming parents. I am not blaming young adults.

~

I do observe a tendency away from risk-taking among many young adults, along with an accompanying fear of the future; and among the young adults who do take risks, I notice that they often do so because they feel there is no future for them.

Far too many of them believe dystopia awaits: climate warming, floods, polluted waters, chaotic capitalist oligarchies as government, spying and infiltration, loss of the ozone layer, terrorists everywhere. They don’t want to believe this is their future, but they are afraid.

And the Baby Boomers, who (according to the legend) were going to march forth and change the world for the better, failed the generations that followed. That’s the current story. (The story will change and develop over the coming years with the evidence of hindsight.)

I get it. I understand the fear and I know how fear dampens motivation and fosters, instead, a muttering resentment under the surface and a pervasive feeling of stress and anxiety. Few of my children’s friends are “secure” in their careers, jobs, housing, health, or finances between the ages of 22 and 30. Most of them have education debt and few have savings.

That’s scary for them. And here’s the thing: when I was their age, I was in the same boat but felt less frightened about my situation. I did not have the feeling that the world was dangerous and things might not work out. I was probably wrong about that…

Maybe ignorance is bliss?

~

My parents’ cohort was dubbed “The Silent Generation.” That implies they accomplished nothing, sat back and served roast beef on Sundays while McCarthy and his cronies raked through American society looking for communists.

Maya Angelou, Neil Armstrong, Toni Morrison, Harvey Milk, Stephen Sondheim, and Martin Luther King, Jr. are among the “Silent Generation.”

So here’s the thing: Nomenclature is not destiny.

~

Coping with anxiety requires learning skills: whatever works for the individual; the ability to puzzle things out using critical thinking; a sense of independence; the development of self-confidence and courage. Those are things we attain with maturity and experience as our guides. Millennials–or whatever you call yourselves–you are getting there. It feels slow. It feels scary.

Your elders may forget to tell you about that part, or perhaps wanted somehow to spare you from the realities. Please forgive us.

To millennials, the anxious generation: You got this. You are more educated than any previous generation, more concerned, possibly more compassionate. You know how to tackle complicated problems–you are merely afraid you will make mistakes. Go ahead and make mistakes.