Hunger for words, words for hunger

When I was very young, our church became involved in the War on Poverty outlined by the Johnson administration (1964). My father attended events and marches to raise awareness about the fact that many people in this wealthy nation, the USA, were struggling–even starving. It seemed, probably idealistically, that a country as prosperous as the US was in ’64 would find a way to ensure that all its citizens could have enough to eat and a roof over their heads. (This was Johnson’s “Great Society.”)

A memory:

My sister, my mother, and I are seated at the table in our little apartment kitchen in Yonkers, NY. My father is away on pastoral business, but the previous evening, he’d told us that we were going to fast the next day in solidarity with poor people who never had enough food to eat. The reason for fasting was to let us feel how they must feel.

My little sister thought that was unfair. She was, in her defense, only four years old.

Of course it was unfair. That was the point. Why should some people have plenty of food while others went hungry? That is unfair. (This logic she understood, though I don’t think either of us made more of the connection at that time.)

“You kids won’t fast the whole day,” my folks said–just suppertime.

Now it is suppertime. We are at the table, and the table is bare. We each have a glass of water, not milk. And we are hungry. Our mother has fasted the whole day. Isn’t she hungry? Yes, she says. She’s hungry. It isn’t a good feeling, and we whine awhile, hungry and, in addition, bored.

“Okay,” she says, “you can each have a piece of bread. One piece.”

It is something, but it doesn’t fill the stomach.

bread

bread

Another memory:

I’m in my thirties, with young children of my own now, and talking with my mother about her past–a past she has kept from us, and from herself, and is slowly learning to accept. A past that included growing up during the Depression with five siblings. How her father refused, out of pride, any kind of government relief. How hard her mother worked to keep the family from going hungry.

I think, then, that my mother knows what it means to be hungry.

~~

Many decades later, the term for hunger has become, in legislation and grant proposals, “food insecurity.” The jargon, the euphemism, distances us from the facts. People without enough good, nutritious food are not insecure. They hunger.

I don’t want words to operate that way, moving the reader away from understanding. I want words to bring us close, to open up the mundane and horrible real and the fervently imagined possible. Language that sears and mends, the interpretation of which also can sear and mend, words that do not act as misprision but as multi-faceted revelation. Those are the words for which I am hungry.

Something that fills the stomach: embodied, flavorful, wonderful words. That’s one of the reasons I love poetry so much, that hunger for the non-distancing. The relationship that brings us truth. The truth that is often unspeakable.

Poems can take us there:

One Kind of Hunger

The Seneca carry stories in satchels.
They are made of pounded corn and a grandmother’s throat.

The right boy will approach the dampness of a forest with a sling, a modest twining

wreath for the bodies of birds. A liquid eye.

When ruffed from leaves, the breath of flight is dissolute.
What else, the moment of weightlessness before a great plunge?
In a lost place, a stone will find the boy.
Give me your birds, she will say, and I will tell you a story.
A stone, too, admits hunger.
The boy is willing. Loses all his beaks.
What necklace will his grandmother make now.
The sun has given the stone a mouth. With it, she sings of what has been lost.
She sings and sings and sings.
The boy listens, forgets, remembers. Becomes distracted.
The necklace will be heavy, impossible to wear.
~

Lehua M. Taitano


Truth as relationship

René Magritte, The Treachery of Images (1928), Los Angeles County Museum of Art

What is truth? Is there an absolute truth? Is truth relative? Can it be relative and simultaneously absolute? I have not been much of a sojourner on the path to truth, because other things interest me more; however, in 2017, what constitutes truth has become a topic of contemporary discussion–in the most superficial ways imaginable.

And as I have been reading about Josiah Royce in Harry Cotton’s text, the definition of truth and the concept of The Absolute (the philosophy of which Royce and his dear friend William James often argued over) raise their abstract heads and ask for understanding.

Royce insists that truth is not something that happens. It’s not a verb, though people often employ the concept of truth when referring to occurrences: It happened just this way, so I know it is true. That’s a scientist’s empirical truth, or a pragmatist’s truth, the truth of someone who believes in his or her own experiences–a truth-in-action. It is also, by its nature, individual.

Just so, says Royce, but what happens to that truth-in-action once the action has stopped, the occurrence is over? Is the event still true? It is now past, and therefore has passed into the realm of memory or idea. Any number of other possibilities could have occurred in experience but did not, and it would be impossible to disprove all of those possibilities in order to prove that something that happened is “true.” Cotton sums the thinking up (and you’d have to read Royce’s work on this idea to get the full picture): “Indeed, no series of events can determine the truth of an idea.” Rather than being an event, Truth, says Royce,

…is a relation whereby various possible or real objects, events, ideas, counsels, and deeds are joined, in ideal at least, into one significant whole. This whole is no one event, or mere set of distinct events. It is a connected life process.    [my italics]

I cannot say I am wholly convinced by Royce’s philosophy in general; but I do like the concept of truth as “a connected life process,” which–to my own mind and given my predilections about life and consciousness and truth–rings true.

 

Self as social

I’m an introvert. I need and, indeed, quite enjoy people–but in small groups and short doses. Much as I love you, I may still need to retire alone with a book or journal, or take a long walk in the meadow by myself, to recharge my energies, which are low enough to begin with these days.


Potter’s curled-tight hedgehog, my animal totem

I think of that as alone with my Self. But recent reading along neurological, evolutionary, and psychological lines has me questioning this Self that seems to own its singular consciousness, and makes me consider the self-less consciousness of, say, Zen Buddhism.

~

 

From Carl Zimmer’s book Soul Made Flesh:

 

Finding the mechanisms of consciousness will not mean we lack a true self. It’s just that this self looks less and less like what most of us picture in our heads–an autonomous, unchanging being that has a will all its own, that is the sole, conscious source of our actions, and that distinguishes humans from animals. All animals probably create some kind of representation of their bodies in their brains, and humans simply create a particularly complicated model…

The human self did not reach this complicated state on its own. Thought is more like a node in the social network of our species…The human brain can make a series of unconscious judgments about people…in a fraction of a second. In recent years, neuroscientists have been mapping out the networks that make this social intelligence possible, and one of their most astonishing discoveries is that a picture of the brain thinking about others is not all that different from a picture of the brain thinking about oneself. Some neuroscientists think the best explanation for this overlap is that early hominids were able to understand others before they could understand themselves.      [italics mine]

In the foregoing passage, Zimmer cites Damasio, M. D. Lieberman, and an academic-philosophical article by Endel Tulving (2001) titled “Episodic Memory and Common Sense: How Far Apart?” that basically shows how little we can depend upon our own memories as “fact” and how deeply we engage in forms of storytelling to connect our memory episodes. It is possible that our general knowledge of things-as-they-are (including the behavior and “minds” of other beings) evolved before our ability to recall episodes of experience. Tulving writes:

…when we wonder which came first, episodic memory (experiences) or semantic memory (facts), common sense tells us that the answer is episodic memory. Information gets into semantic memory “through” episodic memory: First an individual has a particular experience in the course of which he, say, learns a new fact, and later on he can use the knowledge thus acquired independently of any remembering of the original learning episode as such.

This is what many experts in the area of memory have believed (and many still do) ever since the distinction between episodic and semantic memory was drawn. The careful reader of papers in this issue will be able to spot statements to this effect in various chapters. Nevertheless, although the jury is still out on this question, and although the final answer may turn out to be of a kind that almost always is reached at the end of debates (“well, it all depends”), I believe that the correct view is the reverse of common sense: information gets into episodic memory through semantic [general knowledge] memory.

He closes with the observation that “evolution is an exceedingly clever tinkerer who can make its creatures perform spectacular feats without necessarily endowing them with sophisticated powers of conscious awareness.” Darwin would not disagree.

Now to mull over the idea that my self is part of a wide-ranging network of human relationships, and hence not so entirely my “own.” Ha–I find myself of two minds (or more!) on this one.  😀

Do we change? Can we?

I have blogged about the Myers-Briggs personality inventory–a tool that may or may not be useful to psychologists, depending on whom you talk to. Because my father used the inventory in his studies of people in groups, he “experimented” with his family, administering the inventory to the five of us. I was 17 years old the first time I took the survey; my type was INFP (introvert, intuitive, feeling, perceiving), heavy on the I and the F. Has that “type” changed over the years? The “brief” version of the test now shows a shift in the last category: still P, but slightly more toward J (judging). That makes sense, as I have had to learn how to keep myself more organized and ready for difficult decisions. After all, I am a grownup now.

The personality type does not indicate, however, what sort of thinker a person is. Certain types may tend to be more “logical” in their approach to problem-solving, others toward the organized or the intuitive, but what do we mean by those terms? For starters, logical. Does that mean one employs rhetoric? That one thinks through every possibility, checking for fallacies or potential outcomes? Or does it mean a person simply has enough metacognition to wait half a second before making a decision?

Furthermore, if personality type can change over time (I’m not sure the evidence convinces me that it can), can a person’s thinking style change over time? Barring, I suppose, drastic challenges to the mind and brain such as stroke, multiple concussion damage, PTSD, chemical substance abuse, or dementia, are we so hard-wired or acculturated in our thinking that we cannot develop new patterns?

There are many studies on such hypotheses; the evidence, interpretations, and conclusions often conflict. Finally, we resort to anecdote. Our stories illustrate our thinking and describe which questions we feel the need to ask.

~ A Story ~


This year, I did the previously unthinkable: I attended a high school reunion.

We were the Class of 1976, and because our city was directly across the Delaware River from Philadelphia–the Cradle of Liberty! The home of the Liberty Bell and Independence Hall!–the bicentennial year made us somehow special.

Not much else made us special. Our town was a blue-collar suburb of Philadelphia, a place people drove through to get to the real city across the river, a place people drove through to get from Pennsylvania to the shore towns. Our athletics were strong, our school was integrated (about 10% African-American), people had large families and few scholastic ambitions. Drug use was common among the student population, mostly pills and pot. There were almost 600 students in the class I graduated with, although I was not in attendance for the senior year–that is a different story.

But, my friend Sandy says, “We were scrappy.” She left town for college and medical school, became a doctor, loves her work in an urban area. “No one expected much of us, so we had to do for ourselves,” she adds, “And look where we are! The people here at the reunion made lives for themselves because they didn’t give up.”

It is true that our town did not offer us much in the way of privilege or entitlement, and yet many of us developed a philosophy that kept us at work in the world and alive to its challenges. The majority of the graduates stayed in the Delaware Valley region, but a large minority ventured further. Many of these folks did not head to college immediately, but pursued higher education later in their lives; many entered military service and received college-level education or specialized training through the armed forces.

Ann, 1975–76

Does this young woman look logical to you?

I wandered far from the area mentally, emotionally, and physically; but then, I was always an outlier. One friend at the reunion told me that she considered me “a rebel,” a label that astonishes me. I thought of myself as a daydreamer and shy nonconformist, not as a rebel! Another friend thanked me for “always being the logical one” who kept her out of serious trouble. It surprises me to think of my teenage self as philosophical and logical. When one considers the challenges of being an adolescent girl in the USA, however, maybe I was more logical than most.

I find that difficult to believe, but I am willing to ponder it for a while, adjusting my memories to what my long-ago friends recall and endeavoring a kind of synthesis between the two.

~

The story is inevitably partial, incomplete, possibly ambiguous. Has my thinking changed during the past 40 years? Have my values been challenged so deeply they have morphed significantly? Have I developed a different personality profile type? Are such radical changes even possible among human beings, despite the many transformation stories we read about and hear in our media and promote through our mythologies?

How would I evaluate such alterations even if they had occurred; and who else besides me could do a reasonable assessment of such intimate aspects of my personal, shall we say, consciousness? Friends who have not seen me in 40 years? A psychiatrist? My parents? A philosopher? It seems one would have to create one’s own personal mythology, which–no doubt–many of us do just to get by.

I have so many questions about the human experience. But now I am back in the classroom, visiting among the young for a semester…and who can tell where they will find themselves forty years from now? I hope they will make lives for themselves, and not give up.

 

 

At sundown

The disintegrating physical and mental situation of an elderly best-beloved recently has led me back (after a brief pause) to readings in neurology and consciousness. It has also led me to reflect on the tasks memory accomplishes for us and how the need to tell a story seems to reside deep in whatever “makes us human.” Many poems, perhaps most of them, are “inspired” by memories and a need to tell. So I will indulge myself by giving a narration here, and perhaps poems will follow later.


The best-beloved has been in and out of hospitals, rehabilitation centers, and so-called independent living placement and appears to have developed a very common but not-commonly-talked-about cognitive disarray, or hospital-induced delirium, that medical personnel call “sundowning” when it occurs in Alzheimer’s patients. But my patient does not have Alzheimer’s disease. Her meshing of realities must have been triggered by something else, but the possible factors are many. We may never figure out what it was that pitched her into delusion and lack of compassion, turning her into a person we barely know.

She took good care of her body. At 90, her physical self is in better shape than many people 20 years her junior. Her brain–and hence, her mind–has not stayed as healthy as the rest of her. Several small strokes deep in her brain began to alter not just her gait but her personal focus. Long years of hearing loss no doubt altered how her brain processes input. The reading I have been doing (most recently Carr’s The Shallows, Sacks’ On the Move, and Damasio’s Looking for Spinoza) indicates that the human brain is “plastic” but not necessarily “elastic.” It can modify in response to damage or training, but that does not mean it will spring back to the way it was before. In extreme old age, the process of adaptation slows. The brain becomes less resilient. For reasons no one really understands–a host of possible culprits includes hormones, glutamates, serotonin production, medicines, genetic predispositions, and environmental factors among others (a perfect storm…)–persons who have been sharp and cogent may suddenly experience delusions, often leading to paranoia, confusion, loss of affect, lack of social filters, violent and contrary behavior.

And we ask, “What happened to the soul I love?”


If we believe in souls, we have faith that somewhere under the changeling is the best-beloved. In flashes, she may return to us. If we believe that the brain is the person, the transition from best-beloved to aggressive complainer is harder to accept. Damasio seems to believe the brain is the person. I find it hard to agree with him fully, though I have been learning a lot about neurology in the process.

Metaphors or analogies for the situation seldom seem, to me, quite to capture the wrenching feeling I have when encountering sundowning. The idea of disintegration seems inappropriate in this case, because she recalls who we are and her mind is not collapsing so much as morphing in unaccountable ways. Threads unraveling? No, not really; the metaphor of a quilt coming undone, maybe, or an intricately woven tapestry shredded apart–but that’s far too simplistic.

Think of the mind: it encompasses the brain with its regions for motor, somatosensory, auditory, and visual processing; the body, which takes in those physically produced inputs; memories; thoughts; feelings, which are thoughts spurred by emotions; and a host of complex inter-relationships we cannot even begin to map. Somewhere in all of this is the person, the “self.” At least, that is as far as we have been able to speculate (though not everyone agrees; see my post on Hofstadter & Parfit–Parfit suggests personal identity is an invalid construct).

Perhaps an environmental analogy would suffice, being complex enough for comparison. She is the planet Earth, aging and adaptable, but not endlessly adaptable; her healthy balance has been thrown off by things she may not have had any control over. In whole regions, she becomes inhospitable. Poisoned. Dry. Hot. Overrun with invasives. She seems not to like us anymore, but that is not what’s going on at all. In fact, she’s dying.

Maybe that’s taking the metaphor too far. But in difficult times, one reaches far. There is hope she may recover at least some of her Self, and in the meantime, we have stories in which she plays a role. Mnemosyne–awaken in the consciousness of those who know her. Telling the stories is a step toward letting go.

 

Memorial


At last, the snowdrops: spring has deigned to return.

Renewal, rebirth–and remembrance.

~

In a post from 2011, I wrote about poet Chris Natale Peditto, a long-time friend who had recovered from a serious cerebral arteriovenous malformation that resulted in a temporary loss of his abilities to read, write, and speak.

Chris died in November of 2013, just before his 70th birthday. This afternoon, I will be attending a celebratory event in his memory in the city he loved and left, Philadelphia. We will be reading his poetry, letters, and prose, speaking poems aloud as he loved to do. There will be many artists of many kinds attending this gathering, and we will be honoring his place among us.

Outside this morning, a pelting rain, expected to clear a bit later today. A weather report that suits the mood.

Memoir & the lyrical narrative

I have decided to devote two class periods to exploring the lyrical narrative with my students. The reason evolved not from a revelation, exactly, but from a dawning awareness that this particular mode of poetry connects more easily with students than other modes do.

Popular music, of course, sets the contextual stage here. American country music fills the nation’s highways and airwaves with lyrical narratives and modern-day ballads. The story-song appears in a wide range of musical genres from rock to rap: born of simple blues narratives and Appalachian ballads, it ranges from John Henry and Casey Jones to glam-rock “epic rock ballads,” new wave, Motown, the British invasion (think “A Day in the Life”), and quirky indie lyrics–not to mention huge hits like “Lyin’ Eyes” and “The Devil Went Down to Georgia” or oft-played ’70s narrative songs like “Cat’s in the Cradle” and Bruce Springsteen’s “Thunder Road.” These tunes are all before my students’ time, but they have their own lyrical narrative popular songs; they “get it.”


Bruce Springsteen: lyrics
Thunder Road

Narrative lyrical poems hook readers who might not otherwise spend much time closely reading a poem because of those critically important pronouns “I” and “you” and because there’s a human impulse to stick with a story. We want to know how it ends; and we want to figure it out in our own subjective ways, to put the speaker/writer’s experience into our own (or vice versa) and interpret the narrative on our own terms. We also like to be a little surprised.

Why?

I’ve touched on the topic of the cognitive need for narrative in a previous post, and on Boyd’s story-telling impulse research (here), and now–in light of reading the lyrical narrative poem–I want to offer an excerpt from Oliver Sacks. In his essay “Speak, Memory,” Sacks writes:

“There is no way by which the events of the world can be directly transmitted or recorded in our brains; they are experienced and constructed in a highly subjective way, which is different in every individual to begin with, and differently reinterpreted or reexperienced whenever they are recollected…Frequently, our only truth is narrative truth, the stories we tell each other, and ourselves… Such subjectivity is built into the very nature of memory, and follows from its basis and mechanisms in the human brain. The wonder is that aberrations of a gross sort are relatively rare, and that, for the most part, our memories are relatively solid and reliable. We, as human beings, are landed with memory systems that have fallibilities, frailties, and imperfections—but also great flexibility and creativity.”

How can we honestly interpret a poem without acknowledging immediately that our brains are highly subjective processing organs that inherently interpret and experience input differently? Our personal narratives, our memories and recollections, limit, expand upon, and influence our interpretations. That is why I insist that my students accept all “expert interpretations” of famous works with a grain of salt. Every human brain re-creates based upon subjective, unique processing; the fact need not keep us from admitting of rational thinking, but it must affect human interpretations of phenomena. Especially subjective phenomena such as art.

This is also the reason I warn my students not to assume that the speaker of the poem is the poet himself or herself. Poets invent, and they can invent personas. Furthermore, in their efforts to write truths–emotional truths, lasting truths–they may alter physical, actual, memory-based “truth.” In other words, maybe the story happened just that way. Or didn’t. Though “for the most part, our memories are relatively solid and reliable,” the paradox of art is that altering the facts can lead to deeper truths. Sometimes the facts seem altered from one perspective but not from another. Other times…well, I confess, I myself have changed some facts in poems in order to make the poem better. In such cases, craft supersedes the need for stony factuality. I guarantee I am not the only writer who employs this strategy.

Whose life is it anyway? And whose art? Sacks reminds us of the loosey-goosey aspects of recollection: “The neuroscientist Gerald M. Edelman often speaks of perceiving as ‘creating,’ and remembering as ‘recreating’ or ‘recategorizing.’” Thus, the lyrical narrative is a form of memoir, created through individual perception and recreated through the process of memory itself. Which, all of us being human and therefore fallible or otherwise liable to err, and subconsciously quite able to lie to ourselves, means that the lyrical narrative could end up as mythical as the stories of Mount Olympus.

And just as compelling to generations of human listeners or readers.

A voyeur’s fascination–that the reader may be witness to human talking to human within the framework of a storyline–is a significant part of what engages audiences. This poem might be memoir! It may be true. It may be genuine experience, something to which I can relate. There’s emotional frisson, or thrilling curiosity, or the dread of knowing it will all end badly. But I must know; and I want to believe it might be true. Tell me sweet lies, oh troubadour!

~

*Note: the image above is not Bruce Springsteen’s handwriting. He prints. An example of his actual lyric drafts is here.