Do we change? Can we?

I have blogged about the Myers-Briggs personality inventory–a tool that may or may not be useful to psychologists, depending on whom you talk to. Because my father used the inventory in his studies of people in groups, he “experimented” with his family, administering the inventory to the five of us. I was 17 years old the first time I took the survey; my type was INFP (introvert, intuitive, feeling, perceiving), heavy on the I and the F. Has that “type” changed over the years? The “brief” version of the test now shows movement in the last category: still P, but slightly more toward J (judging). That makes sense, as I have had to learn how to keep myself more organized and ready for difficult decisions. After all, I am a grownup now.

The personality type does not indicate, however, what sort of thinker a person is. Certain types may tend to be more “logical” in their approach to problem-solving, and others may tend toward the organized or the intuitive, but what do we mean by those terms? For starters, logical. Does that mean one employs rhetoric? That one thinks through every possibility, checking for fallacies or potential outcomes? Or does it mean a person simply has enough metacognition to wait half a second before making a decision?

Furthermore, if personality type can change over time (I’m not sure the evidence convinces me that it can), can a person’s thinking style change over time? Barring, I suppose, drastic challenges to the mind and brain such as stroke, multiple concussion damage, PTSD, chemical substance abuse, or dementia, are we so hard-wired or acculturated in our thinking that we cannot develop new patterns?

There are many studies on such hypotheses; the evidence, interpretations, and conclusions often conflict. Finally, we resort to anecdote. Our stories illustrate our thinking and describe which questions we feel the need to ask.

~ A Story ~

This year, I did the previously-unthinkable: I attended a high school reunion.

We were the Class of 1976, and because our city was directly across the Delaware River from Philadelphia–the Cradle of Liberty! The home of the Liberty Bell and Independence Hall!–the bicentennial year made us somehow special.

Not much else made us special. Our town was a blue-collar suburb of Philadelphia, a place people drove through to get to the real city across the river, a place people drove through to get from Pennsylvania to the shore towns. Our athletics were strong, our school was integrated (about 10% African-American), people had large families and few scholastic ambitions. Drug use was common among the student population, mostly pills and pot. There were almost 600 students in the class I graduated with, although I was not in attendance for senior year–that is a different story.

But, my friend Sandy says, “We were scrappy.” She left town for college and medical school, became a doctor, and loves her work in an urban area. “No one expected much of us, so we had to do for ourselves,” she adds. “And look where we are! The people here at the reunion made lives for themselves because they didn’t give up.”

It is true that our town did not offer us much in the way of privilege or entitlement, and yet many of us developed a philosophy that kept us at work in the world and alive to its challenges. The majority of the graduates stayed in the Delaware Valley region, but a large minority ventured farther. Many of these folks did not head to college immediately but pursued higher education later in their lives; many entered military service and received college-level or specialized training through the armed forces.

Does this young woman look logical to you?

I wandered far from the area mentally, emotionally, and physically; but then, I was always an outlier. One friend at the reunion told me that she considered me “a rebel,” a label that astonishes me. I thought of myself as a daydreamer and shy nonconformist, not as a rebel! Another friend thanked me for “always being the logical one” who kept her out of serious trouble. It surprises me to think of my teenage self as philosophical and logical. When one considers the challenges of being an adolescent girl in the USA, however, maybe I was more logical than most.

I find that difficult to believe, but I am willing to ponder it for a while, adjusting my memories to what my long-ago friends recall and endeavoring a kind of synthesis between the two.

~

The story is inevitably partial, incomplete, possibly ambiguous. Has my thinking changed during the past 40 years? Have my values been challenged so deeply they have morphed significantly? Have I developed a different personality profile type? Are such radical changes even possible among human beings, despite the many transformation stories we read about and hear in our media and promote through our mythologies?

How would I evaluate such alterations even if they had occurred; and who else besides me could do a reasonable assessment of such intimate aspects of my personal, shall we say, consciousness? Friends who have not seen me in 40 years? A psychiatrist? My parents? A philosopher? It seems one would have to create one’s own personal mythology, which–no doubt–many of us do just to get by.

I have so many questions about the human experience. But now I am back in the classroom, visiting among the young for a semester…and who can tell where they will find themselves forty years from now? I hope they will make lives for themselves, and not give up.

Slightly less difficult books

I recently read Paul Bloom’s book Descartes’ Baby while simultaneously reading Daniel Dennett’s Content & Consciousness. Of these two, the latter falls a bit under the “difficult books” category, but it is not too hard to follow as philosophy goes. Dennett’s book is his first–the ideas that evolved as his PhD thesis–and in these arguments it is easy to see his trademark humor and his deep interest in the ways neurology and psychology have aspects useful to philosophy. Bloom’s book, a somewhat easier read, suggests that the mind-body problem evolved naturally from human development: young children are “essentialists” for whom dualism is innate; Descartes simply managed to write particularly well about the evolutionary project (with which, I should note, Bloom disagrees; as a cognitive psychologist, he maintains a more materialist stance).

It turns out that because I have read widely if shallowly in the areas of philosophy, cognitive psychology, evolution, art, aesthetics, and story-making, I find myself able to recognize the sources and allusions in texts such as these. Quine, Popper, Darwin, Pinker, and Wittgenstein; Schubert, Kant, Keats, Dostoevsky, Rilke…years of learning what to read next based on what I am currently reading have prepared me for potentially difficult books. [Next up, Gilbert Ryle and possibly Berkeley.] I don’t know why I feel so surprised and happy about this. It’s as though I finally realized I am a grownup!

And I am glad to discover I am not yet too old to learn new things, young enough to remember things I know, and intellectually flexible enough to apply the information to other topic areas. Synthesis! Building upon previously-laid foundations! Maslow’s theory of humanistic education! Bloom’s taxonomy! The autodidact at work in her solitary effort at a personal pedagogy.

If I ever really discover what consciousness is, I’ll let you know.

Dozens of views

No one has ever found the traces of memory in a brain cell. Nor are your imagination, your desires, your intentions in a brain cell. Nothing that makes us human is there.

Deepak Chopra

Chopra is not my favorite writer on consciousness, but he does an adequate job of explaining complicated concepts to people who are just getting accustomed to questioning experience and who are beginning to be open-minded about the mind, the body, and beyond. So often, we have been raised not to doubt, told what God is and is not, and trained into beliefs about the truth. This, in spite of the common human trait of curiosity that asks: who and where are we in the world? What makes me me? What happens when I die? Chopra, with his medical background and his experience spanning several major cultures, can both offer a great deal of information and pose provocative questions to his readers.

In our technologically-obsessed culture, it is easy to turn to science as the foremost authority; I happen to be fascinated by neurology and neuropsychology when it comes to consciousness, for example, but I never rule out so-called spiritual insights. Chopra’s writing often falls into the fallacy of stating “there are two views,” when in fact there are dozens of views, even among scientists. My guess (it is but a guess) is that the either/or form of presenting perspective is simpler for the “average reader”–as defined by his editor–to understand. Yet it seems to me a slight to the average reader to narrow these big questions down to “two views.”

Here’s an example, just one of many in his writing:

There are two views about consciousness in science today. One is that consciousness is an emergent property of the brain and, therefore, also an emergent property of evolution. That’s the materialist, reductionist view. There’s another view…[that] holds that consciousness is not an emergent property but inherent in the universe.

Now, I genuinely prefer what Chopra calls the “mind first” argument in which consciousness is a kind of field effect. I would not, however, suggest that matter first and mind first are the only two views today’s scientists hold; and neither would anyone else who has read a number of the elegantly-argued, well-researched, thoughtful, passionate blogs of today’s science researchers. The majority of them are atheists, but some are agnostics and some are inclined toward non-theist teachings such as Zen. Even among the ranks of non-believers (in terms of an anchoring eternal presence or god), the question of consciousness leads to intriguing inquiries.

The philosophers of today cannot ignore scientific advances any more than Maimonides could in the 12th century. Physics is a thing! as my students might put it. For the ways in which this relates to the science of neurology, I return to the framework on consciousness proposed by Douglas Hofstadter in his book I Am a Strange Loop.

What is the world but consciousness? Or illusion, in Hindu and Buddhist teachings (Maya) and, in a slightly different but related way, in Plato. And how many perspectives are there on that consciousness?

Chopra would probably say that each of us has to experience a state of awareness and interaction with whatever deep potential “god” or the creating principle offers for us. Which basically admits of not merely dozens but billions of unique interactions or perspectives…if we even agree to the schema.

goldenrod (solidago) going to seed

Noesis

1. Cognition; perception.
2. The exercise of reason.

Interesting that definition number two is dependent upon definition number one. Lately I have been thinking about the difference between consciousness and conscience; the latter seems to me to be specifically human, I guess, because isn’t conscience a sort of cultural or judgmental entity based upon rules? Yes, I am talking about morality, a term I tend not to use much when I consider cognition, consciousness, narrative, being.

I recently perused Patricia Churchland’s Braintrust and found myself intrigued about where and in what ways morality and consciousness or sentience mesh. Churchland is a moral philosopher, but this book relies largely on arguments premised on neurology, biology, evolution, and animal studies. Her critics pose interesting rebuttals, too. I found her book readable and often convincing–and it’s the kind of book that leads me to other writers and scientists; I love that in a book!

The phenomenology of consciousness–the carbon-body, brain-based “real world” idea of the word–involves intentionality, sentience, qualia, and first-person perspective. We can identify qualities based upon our first-person consciousness and respond to them. This process has led Western thinkers toward the concept of reason or rational thinking. The exercise of reason derives from perception.

This does not mean that phenomenology is the sole form of consciousness or even that it is necessarily human-only, but it seems to me to be the easiest one for human beings to wrap their minds around. Yet the earlier philosophers were not phenomenologists. Their speculations about what consciousness originated in and what morality inhered in were quite abstract.

For a good sum-up of how contemporary scholars define and discuss consciousness, go to Stanford’s site here.

~

Being cognizant or conscious does not necessarily lead to moral behavior or reason…or does it? Here we have an idea that has been debated for centuries. In her book, Churchland often returns to Hume, who wrote about morality from what eventually became known as the utilitarian stance (though I would argue Hume is not really utilitarian). Stanford offers an overview of morality as defined by philosophers over the years; the Internet Encyclopedia of Philosophy says this of Hume:

In epistemology, he questioned common notions of personal identity, and argued that there is no permanent “self” that continues over time. He dismissed standard accounts of causality and argued that our conceptions of cause-effect relations are grounded in habits of thinking, rather than in the perception of causal forces in the external world itself. He defended the skeptical position that human reason is inherently contradictory, and it is only through naturally-instilled beliefs that we can navigate our way through common life.

These concepts should feel modern to most of us thanks to cultural anthropology, sociology, and psychology, among other disciplines. Hume’s position conflicts with much religious dogma, but his ideas were not out of line with those of many of his fellow Enlightenment-era thinkers. During the Enlightenment, intellectuals were enamored of the exercise of reason (noesis).

~

So: consciousness and conscience. First we have the one–however it arises within us*–and the other develops (or evolves?) thanks to the need for social beings to navigate common life. And thanks, perhaps, to brain evolution adapting to social common life (see Churchland for more on this).

Much to mull over during my brief summer break.

~

Jiminy Cricket copyright Walt Disney Co.

*See my numerous previous posts on consciousness!

⇐ “And always let your conscience be your guide!”

Awareness, openness, & … magic mushrooms?

In a recent New Yorker article about potential medical uses for psilocybin (“The Trip Treatment”), science, culture, and food writer Michael Pollan interviewed researchers in neuroscience, medicine, and psychology. The medical potential of psychedelic drugs is not something I can comment on from reading just one article; what intrigued me most about this piece is how these drug studies overlap with studies on cognition, metacognition, consciousness, and spirituality. The medically-controlled “tripping” that volunteers have undergone overwhelmingly resulted in some form of what we term mystical or spiritual (for lack of a scientific term) feeling.

It’s almost impossible to consider these realms of experience without questioning concepts such as “soul” or “self-awareness.” Pollan writes:

Roland Griffiths is willing to consider the challenge that the mystical experience poses to the prevailing scientific paradigm. He conceded that “authenticity is a scientific question not yet answered” and that all that scientists have to go by is what people tell them about their experiences. But he pointed out that the same is true for much more familiar mental phenomena.

“What about the miracle that we are conscious? Just think about that for a second, that we are aware we’re aware!”

A man after my own heart. It is amazing, a kind of miracle. And we get consciousness and metacognition without any drug intervention at all. It just springs into our beings at some point, as we create ourselves from lived events and construct speculative worlds and an understanding (though often flawed) of other minds.

Here is another fascinating result from the psilocybin studies that may make us revise our ideas of interpersonal relationships, personhood, and creating a self. Pollan writes:

A follow-up study by Katherine MacLean, a psychologist in Griffiths’s lab, found that the psilocybin experience also had a positive and lasting effect on the personality of most participants. This is a striking result, since the conventional wisdom in psychology holds that personality is usually fixed by age thirty and thereafter is unlikely to substantially change. But more than a year after their psilocybin sessions, volunteers who had had the most complete mystical experiences showed significant increases in their “openness,” one of the five domains that psychologists look at in assessing personality traits. (The others are conscientiousness, extroversion, agreeableness, and neuroticism.) Openness, which encompasses aesthetic appreciation, imagination, and tolerance of others’ viewpoints, is a good predictor of creativity.   [italics mine]

Openness, aesthetic appreciation, imagination, tolerance, creativity…and one researcher in neuropsychopharmacology suggests that this sort of un-boundaried openness signifies a temporary regression to an infantile state, very much as Freud hypothesized. I thought instead of Bachelard and the childhood reverie state.

Intriguing, that pharmacological work of this sort makes neuroscientists resort to citing William James and Sigmund Freud on mystical experience and the subconscious!

[Robin] Carhart-Harris believes that people suffering from other mental disorders characterized by excessively rigid patterns of thinking, such as addiction and obsessive-compulsive disorder, could benefit from psychedelics, which “disrupt stereotyped patterns of thought and behavior.” In his view, all these disorders are, in a sense, ailments of the ego. He also thinks that this disruption could promote more creative thinking. It may be that some brains could benefit from a little less order.    [italics mine]

I wonder if the aesthetic experience itself, when wholly engaged, can sometimes act like a drug on the art-viewer’s or art-maker’s being. When one reads the poem that rearranges one’s world, doesn’t it disrupt stereotyped, familiar, habitual patterns of thought? That’s what happens for me when I encounter great art of any kind. It is close to mystical.

~

Blame & fear

Amazing, the human brain: consciousness layered over instinct, habits of thought, the ways we feel, rationalize, justify, seek for why. In the wake of tragedies, we tend to react with fear and blaming; it is as if, could we only discern who or what to blame, we could learn how to prevent it. So we “reason.”

But all too often, what we are doing is not using reason. Instead, people tend to blame whoever or whatever best suits their own, already-decided view of the world and use “reason” to justify their feelings–a psychological phenomenon called “confirmation bias,” on which Daniel Kahneman has much to say. Cognitive biases inherently interfere with objective analysis; bias is sometimes a lovely and rich part of the human experience, but it also leads to terrible misuses of analysis. We usually act based on biases rather than on logic (see this page for a long list of biases). So many ways to justify our often-mistaken and uninformed beliefs or responses.

Anthropologist and philosopher René Girard offers insights into the desire to blame–a sociocultural desire, deeply rooted in the way humans behave when in groups and, he believes, one of the foundations for the development of religious rituals, among other things. As we endeavor to “make sense of” impossible events, to “discover why” they occur, we seem naturally to turn to blaming. Apparently, designating a scapegoat consoles us somehow, allows us to believe we might have some control over what is terrible, not unlike sacrificing a calf to propitiate an angry god.

~

I lived just outside of Newtown, CT for a few years in the 1980s. I still have friends there and I know the area well. It was a safe town, and it is still a safe town; only now, it is a safe town in which a terrible and statistically-rare occurrence happened. That sounds rather dry and heartless: “a statistically-rare occurrence.” Yet from a logical standpoint–if we are being reasonable–it is simple to discover that by any measure, U.S. schools are the safest place a school-age child can be. Fewer than 2% of deaths and injuries among children ages 5-18 occur on school grounds. I got these numbers from the U.S. Centers for Disease Control and Prevention. Keeping an armed policeman at every U.S. school (as recently proposed by the president of the NRA) might possibly make an incrementally small difference in that tiny number. Might. Possibly. Rationally, would it not make more sense for us to address the other 98% and decrease that number? Though I am all in favor of hiring more people to safeguard our cities, the only real value of such a move would be to reduce a mistaken sense of public fear.

Because we are afraid, and fear is keeping us from rational and compassionate behavior. Fear can be useful–it probably helped us survive in the wild, and it continues to serve a good purpose occasionally–but human beings ought to recognize that the value of fear is limited in a civilized, community-based, theoretically-rational society. Rational, compassionate behavior on the part of our nation would be to remove the lens of public scrutiny from the people of Newtown and allow them to deal with grieving in the privacy of their families and community. We cannot come to terms with private loss, nor ever understand it truly, through network news, tweets, photographs on our internet feeds, or obsessive updates on ongoing police investigations.

Fear also keeps us from finding resources of our own. It blocks us from our inner strengths. The families and friends of the victims and the killer need that inner strength more than they will ever require public notice, no matter how well-intentioned the outpourings are.

~

Blame. Whose fault is it? Children and teachers and a confused and angry young man and his mother have died violently, and I’ve been listening to the outcry all week–even though I have tried to limit my exposure to “media sources.” Here are the scapegoats I have identified so far: the mental health system; semi-automatic weapons; violent computer games; the 2nd Amendment; the media; autism; school security; the killer’s father and mother (herself a victim); anti-psychotic drugs and the pharmaceutical industry; divorce; god; U.S. legislation concerning weapons and education and mental health; bullies in schools; the NRA; the victims themselves, for participating in a godless society; poor parenting; narcissism; the Supreme Court; President Obama; the CIA. I’m sure I have missed a few. (Andrew Solomon’s recent piece in the New York Times also touches on our default blame mode; his list coincides pretty closely with mine; see this article.)

Scapegoats serve several purposes. They allow us to say we, ourselves, no matter how guilty we feel, are not at fault. They give us an excuse for disaster, something to punish or something to attempt to change through controls we can think through and develop (“logically”). And in fact some good may eventually come of the changes and the control we exert, but such change is likely to be small and long in arriving. Mostly what scapegoating achieves turns out to be bad for us, however, because what it does well is give us something to fear.

Fear motivates us to read obsessively every so-called update on the killer’s presumed (and, ultimately, unknowable) motives, to argue over the best way to address the complex and intertwined issues that each of us perceives to be the root cause of any particular tragic event. Our fears make us consumers of media, and our information sources respond to our need to know why and our desire to blame. Our fears drive us to purchase guns to protect ourselves even though statistics continually prove that more U.S. citizens are killed accidentally or intentionally by someone they know intimately (including themselves, especially in the case of suicides–which Solomon also addresses in the essay I’ve cited) than by strangers or during acts of robbery, terrorism or massacres. “News,” as we have come to know it, is predicated on reporting things that are dramatic and therefore statistically unlikely. Suppose our information sources kept an accurate hourly update on weapons-related or motor vehicle-related deaths…would we become immune to the numbers? Would we say “That’s not news”? Would we be less avid consumers of such “news sources”? Would it comfort us to know we are more likely to be struck by lightning twice than to die in a terrorist act on U.S. soil or be killed by a deranged gunman in a mall or school?

Can we delve into our inner resources of rationality in order to fight our fears?

~

I think not. Fear is not easily swayed by facts. Instinct trumps reason, psychologically and cognitively, in this case. Fear is so emotional that perhaps it requires a deeply spiritual, soul-searching response instead of a reasoned one. Perhaps that is why so many of the “great religions” include stories of human encounters with a god, godhead, or cosmic intelligence which humans “fear” (though the term is used to signify awe and recognition of human insignificance rather than the fear of, say, a lunging tiger). In these stories–the Bhagavad Gita and the Book of Job among them–a human confronted with the godhead recognizes such fear/awe that he can never afterwards fear anything this world has to offer. In the face of what is beyond all human understanding, there is no reasoning, and no human “feelings” that psychology can explain.

Roosevelt said we have nothing to fear but fear itself. Words well worth recalling in times like these.

~

Finally, this:

And the angel said unto them, Fear not: For, behold, I bring you good tidings of great joy, which shall be to all people.

Namaste, Shalom, Peace, Al-Salam. May you find the strength within yourself to make your way compassionately through this world.

Art and “human intelligence”

I’ve gotten almost to the end of Brian Boyd’s intriguing and well-argued book On the Origin of Stories, which makes fairly large claims about sociality, cognition, theory of mind, art, and storytelling (i.e., fiction) from an evolutionary perspective (art as adaptation). The first 200 pages lay the foundation for his claims; he provides evidence from the “hard” sciences, most often biology and neurology, and from archeology, anthropology, and psychology, to back up his theory that art is an evolutionary adaptation humans developed in order to live as social animals–and that art is necessary for human cognition, further developing intelligence and the ability to communicate among our peers: it is cognitive play, practice and skill-strengthening for mind and muscle.

Big claims, and occasionally hard to “prove” from the hard sciences. I believe he does a good job with that set of proofs, but I’m not a scientist. His claims based on social sciences—anthropology, sociology, psychology—are very convincing; but many people have arguments with those fields because they are so apparently subjective. Most exciting to me is the way Boyd synthesizes neurological findings with evolutionary developments.

Actually, most exciting to me are his chapters on the Odyssey, but that may be because I am a literature geek. He essentially writes a literary analysis of the Odyssey based upon the inferences and findings in the first half of his book (evolution) rather than the customary literary analysis grounded in, say, context or culture, style or theme, ad infinitum. The resulting analysis is, for me, a truly exciting way to look at Homer’s work and why it matters now, as well as why it mattered then.

Boyd comes close to making the assertion that Homer made Socrates possible, and hence all of Western civilization’s philosophy and social intelligence. Of course, he is careful not to go that far in his argument—he steers as far as he can from logical fallacies— but the thought certainly feels planted in the reader’s mind. His argument does suggest that metacognition in human beings is the definer that makes us human, and art as more-than-play separates human from not-human. He also demonstrates that the Odyssey offers great leaps beyond older epics and posits that the author(s) composed the epic for contemporary audiences that were capable of intelligent, sophisticated, “modern” thought processes; the piece is therefore not primitive literature, as some critics claim.

Boyd’s work has also turned my thoughts to how the attributes of attention, perspective and foreknowledge, overturned expectations, audience-sociality, false belief, cooperation and competition work in the poem as well as in narrative. Granted, many poems have a narrative framework, however thinly sketched, but not all of them do. When there is no narrative frame, these other aspects of storytelling (audience expectations in particular) take precedence and can be employed in almost infinite ways, bounded only by imagination and the willingness of the reader to pay attention as the writer earns that attention through a host of innovative or traditional skills.

A last thought…I spent the long weekend visiting octogenarian friends, both of whom are wonderful tellers of stories. The value of such people to human society is priceless:

“Story by its nature invites us to shift from our own perspective to that of another, and perhaps another and another.”  ~Brian Boyd