Sunday, May 26, 2013

Who are we, really?

I awoke from a dream recently to realize that the person I was in the dream is not my present self but an ageless adult, a sort of composite of how I think of myself, as if I were permanently 35.

And I connected this realization to an earlier post (April 30: Genuine Freedom) and to other musings about the mysterious inner me, the "selfless self of self," as the poet G. M. Hopkins called it.

Who are we at the core of our being? That is the question. Am I the product of my conscious thought, produced by the brain, embodied in my mortal flesh, or am I an enfleshed spirit or, as I was raised to think in parochial school, a soul encased in a body?

Without considering for a moment the immortal center of my being called the soul, I think of all the many couples who, married in their twenties, find that they have drifted apart and become divorced in their forties because they have changed.  What part of them has actually changed? Of course, we are constantly changing and growing biologically; our tastes, behavior, and attitudes change, as do our values.

But the man or woman of 45-50 who is more mature than the bride or groom of 25, with different interests from his or her partner, remains essentially the same person.  The question then is, what can psychology and philosophy tell us about who that person is, that self that might grow but remains essentially true to its original form?

"What are we at our core, before anything, before everything?"  This question, posed at the opening of an article by Abigail Tucker (in the Smithsonian 1-13) comes from a researcher at the Yale Infant Cognition Center, where scientists have been studying toddlers and babies to see if altruism is an innate human element. It seems too early to say for sure that the answer is definitively 'yes.'  But I cite this example as a fundamental question underlying much of the important work I sometimes read about being done by people with infinitely more knowledge than I have or will ever have about the complexities of the human personality.

I remember, too, a psychologist at a workshop I attended some years ago introducing the distinction between "the pattern" and "the person": when we think a friend or partner or colleague is unbearable, annoying, or otherwise unpleasant to be around, what we are reacting to is the behavior pattern that this person displays. I think of several people I know who seldom listen, talk incessantly about themselves, and are clearly wound up emotionally. I shun their company.

Yet these people, beneath the surface, are bright, caring individuals who are lovable--if I can separate myself from the surface pattern to see the real person beneath. A challenging "if."  How close this approach is to Freudian or Jungian ideas of the psyche or self is something I do not know, but it helped me understand a basic human issue. Perhaps it is related to the belief of Robert Louis Stevenson and others ("Dr. Jekyll and Mr. Hyde") who contended in Victorian times that the human person is not one but two--a divided self, half good, half evil.

Although this seems too simplistic today, there remains in us a sense that our real selves remain mysterious. Even as we shun encountering them, we meet them in dreams and see them reflected in films and literature.  If other people are hard to understand (and love), we tend to remain hard to understand even to ourselves.

One of the most satisfying examinations of all this, on the spiritual level, for me has been the work of Thomas Merton and in particular the study focusing on his idea of the true self by James Finley: Merton's Palace of Nowhere.  Merton, having read very widely, was attracted to Blake as a graduate student at Columbia and then, as a Trappist monk, steeped himself in the mystical tradition of both Christianity and the East.

I will try to sum up some key aspects of Finley's study of what Merton meant by our true identity in contrast to the false self we create as a public persona or mask.  The contemplative tradition of emptiness and silence, for Merton, is the highest form of self-realization; it reveals that the person I am is not limited to the individual I am.

Involved here is the loss of the false self when, in contemplation, our being becomes one with the being of God, who is Being itself.  The person, that is, transcends everything in his or her union with God. The self that we thought ourselves to be vanishes ("He who loses his life shall find it," as Jesus said) because of love.

And this brings us back to Brennan Manning, whose death last month prompted a brief post here that expresses the same basic Mertonian idea very directly: The true self is the one loved by God; every other identity is an illusion.

Merton put it this way:  "Learning to be oneself means learning to die [to the self] in order to live. It means discovering in the ground of one's being a self, which is ultimate and indestructible..."  So, for him, the soul is the mature personal identity, the true self. Yet the question, "Who are you when you do not exist?"--the ultimate question we all ponder when we think of death--can never be answered by the mind. It requires what is difficult for many: a leap of faith.

I hope at least some of this makes sense and that it will lead readers unfamiliar with Merton to read him as well as Finley's classic book, which is challenging because the language of mysticism defies the limits of human language. But few questions are as important as who we are and what happens to us when we are here no more.

Tuesday, May 14, 2013

Cursive Redux

There has been a lively debate this month on Andrew Sullivan's Dish about cursive writing, with persuasive arguments on both sides.  One of these comments prompted my latest post here.

I still think that, while printing may be more legible in many cases, students should be taught the basics of cursive handwriting, since taking notes in college and at meetings requires the speed that cursive allows; and they should be familiar with it so they can read the cards and notes (and maybe even old-fashioned letters!) of retro-folk like me who refuse to print our greetings in ink.  I am unsure whether cursive is really better for cognitive learning.

I encourage those interested in this issue to visit www.dish.andrewsullivan.com.

Sullivan and his team have, for years, provided one of the most diverse and interesting series of posts imaginable, with links to articles and books in many fields.  Reading it, even when I disagree or get turned off, is an education.

Saturday, May 11, 2013

Technological Addiction

Today's social media, which have the potential to bring people together, can also alienate. I remember a cartoon depicting five people seated at a restaurant, each gazing at his or her cellphone rather than communicating.

There are echoes here of Ray Bradbury's story, "The Pedestrian," in which the residents of a futuristic city stay indoors and stare at TVs, afraid to go out; in fact, to take a walk is seen as almost subversive.

I was reminded of this while reading a recent interview with the star of the new Star Trek movie, Zachary Quinto, 35, who does not enthusiastically embrace all the techie wonders in his own life. He makes limited use of his smartphone but says, "I try to unplug as often as I can." This, from the latest Spock, scientific officer of the Enterprise in its 23rd century journeys.

His words were music to my ears. Tune out the noise, I want to say; be creative in thinking of other people, their needs. Take time for silence. Of course, the omnipresent cellphone is, even for me, an essential link in times of emergency, and I am grateful for such inventions. Although Quinto calls himself a Luddite, he really is no more a Luddite than I am.

It's all a matter of balance. Of using technology when necessary and not becoming addicted to or obsessed with electronic devices so that they become more important in our lives than people.

I was interested to see Quinto say that the proliferating advances in technology can dehumanize us.  We think we are so much more connected than we are, he says, but "we are actually becoming further and further away from true connection."  And that is scary. Ray Bradbury would agree with this actor, who is plugged in to what matters.

Wednesday, May 8, 2013

The Right Word

When Hemingway was asked by Paris Review editor George Plimpton what problem he was having with the ending of A Farewell to Arms, which the novelist revised at least 39 times, he famously replied, with cynical understatement, "Getting the words right."

This is every writer's challenge, of course--to select the precise word that captures the idea or feeling he or she wishes to express, not an easy task in a language like English with a vast storehouse of verbal options.

It doesn't help when notable writers knowingly or carelessly use the wrong word.

Consider Maureen Dowd, avidly read by millions in the New York Times, where her acerbic wit skewers public figures in Washington and elsewhere. Today my wife pointed to a sentence in Dowd's column on sexual abuse in the military that reads in part:  "President Obama was also lacerating on the Krusinski arrest..."  Lacerating?  Wounding is the only meaning I am aware of for this word, unless it is now being used in a new way. The sentence is unclear to us.

The Times is also singled out by William Deresiewicz for a linguistic scolding in a recent article in The American Scholar for using "apologist" to mean "one who apologizes."  The Atlantic, he reports, has used "waxed" to mean simply "talked," though the word properly means "grew" (as in waxing and waning, or "wax eloquent").  This is on a par with some of my students' bloopers.

Lorrie Moore, the noted writer, used "willy nilly" to mean something other than "by compulsion." Ann Beattie seems to think "reticent" means "hesitant."  NPR has mistakenly equated "notoriety" with "fame" and "per se" with "so to speak."  And so it goes. Does no one have time to consult a dictionary to make sure they have the right word?

Deresiewicz does a good job of noting the errors of the experts and has found, as I have, widespread misunderstanding and misuse of "hoi polloi" (Greek for "the people": the masses, not the upper crust, as so many now think) and of "penultimate," which means next-to-last, not "really ultimate," whatever that is.  "Begs the question," as I noted in an earlier post, is a hopeless case, along with "disinterested" (impartial). No one seems to remember the original meanings of certain idioms or words.

Although some words have lost their original meanings over the centuries, as usage changes, as any visit to the OED will show, "bemused" does not mean "amused," as the high-brow New York Review of Books seems to think. 

Such errors are not funny like the ones I enjoy collecting, helped along by Richard Lederer and others.  I refer to malapropisms that are hilariously wrong, as in "The sea was infatuated with sharks" (infested).  Or the valedictorian who told his or her graduating classmates, "we are moving from the world of childhood to the world of adultery."

English can be a challenge: consider the confusion that often exists in similar-looking and sounding words with divergent meanings, like ravaging, ravishing, and ravenous.  Only the last one refers to hunger, despite what many writers and speakers may think.

What is disturbing rather than amusing is to see the educated elite write carelessly, failing to edit or be edited, watering down accuracy in language and thereby weakening clear communication. Writers for some of the periodicals mentioned above set the standard for American English usage. And where are the editors?

As Mark Twain said, the difference between the right word and the almost right word is like the difference between lightning and the lightning bug.

Sunday, May 5, 2013

Cursive Revisited

Earlier, I wrote a piece about the strange and sad demise of cursive handwriting, having noticed that my adult students print their in-class written assignments, as if they were children.  Teachers for the most part have given up on teaching this, contending that it takes too much time.

It does take time--sometimes too much--and the issue remains controversial as a recent (April 30) debate among New York Times bloggers reveals.

Suzanne Ascherson, a representative of an early childhood education company, might have a vested interest in her argument, which is that learning to write cursively improves brain development. "Cursive handwriting," she says, "stimulates brain synapses and synchronicity between the left and right hemispheres, something absent from printing and typing."

She cites the College Board's conclusion that students who wrote in cursive on the SAT essay exam "scored slightly higher" than those who printed.

Yet she does not present any evidence for her claim that learning handwriting actually helps in the areas of thinking, language, and working memory. Of course, her essay is only a brief blog post.

I agree with her that students need several options.  For me, a combination of printing and cursive writing works well in note-taking, where printing alone would slow down the process.

What is missing from this debate is the problem I have discovered: that students who never learned to write cursively cannot read it, so that when I write them a note or make a comment on their essays, they say, "Huh?"

If teachers cannot spend time drilling their kids in the Palmer method that I learned in the 4th grade, they should at least help them become familiar enough with cursive to read what is written in it.

In the meantime, I would like to see more evidence supporting the benefits of learning cursive writing.

Saturday, May 4, 2013

Starting Your Own Country

Herewith a bit of historical trivia that is too hilarious to overlook: 50 years ago, the tiny village of Seborga on the Italian Riviera declared its independence as a sovereign nation when a local merchant, Giorgio Carbone, claiming some obscure link to the old Holy Roman Empire, declared himself Prince Giorgio I. Beginning in 1963, he was called His Tremendousness--or His Terrificness (Sua Tremendità in Italian).

When he died without heirs in 2009, his title passed to Prince Marcello I, who is no doubt also interested in imitating nearby Monaco and boosting tourism.  Even though Seborga has its own coinage (a bit like Arizona), it is legally still part of Italy and is not recognized by anyone outside the "principality."

Screenwriters looking for comic material for a movie might consider the 50-year reign of Prince Giorgio. I can picture someone like Silvio Berlusconi in the lead role. (Not a good idea.)

It occurs to me, at a time when extremists in America (as in Texas) often speak of seceding from the union, when Arizona proposes using gold and silver instead of dollars, and when anti-federal fury drives many people to arm themselves with enough firepower to wipe out a regiment, that such radical separatists might heed the lessons of Seborga.

First, if you are going to start your own country, have a decent population (not a mere 362 souls), have some land (not a mere 5.8 square miles), and have some money so that, unlike the Prince of Seborga, you can hire more than one man to serve in the military.

Second, if you are serious about starting your own country, have a good sense of humor: know that the idea is essentially ridiculous.

Friday, May 3, 2013

Doing What Comes Naturally

Ethel Merman, known for belting out Broadway songs for forty years in a voice that never needed amplification, was a big star.  Along the way, apparently, a musician told her, "Ethel, never let anyone teach you to sing."

Why ruin natural talent?  Of course, some who remember Merman singing "Doin' What Comes Naturally" from Annie Get Your Gun and most of the songs from Gypsy, may question the quality of that talent.

When I heard this anecdote, I immediately thought of the teaching of writing and how, all too often, it has intimidated rather than encouraged students, who grow up feeling they cannot write.  As one colleague once told me, "I don't remember the rules."  A friend in his fifties, who yearns to write, worries about punctuation, as if his hand will be slapped if he makes a minor mistake. The computer's spell-check frustrates him, telling him he doesn't know enough to write.

I tell him that the "rules" have little to do with generating ideas and tapping into his rich experience to produce interesting sentences.  What he needs is freedom from the opinions of others, especially the ones stored in his memory.

Is there such a thing as too much instruction? I suppose in music, the answer might be Yes.  Writers, who are more familiar to me, need guidance and helpful readers and practice; they do not need more prescriptive advice on what is wrong with their work.

It takes a patient teacher to nurture a writing student so that he or she is not prevented from using his or her natural talent, from remembering that the talent is in fact there. Good writing involves confidence in oneself along with liberation from the old voices of past teachers and editors that haunt us by saying, "You don't really know enough."

If I waited to write until I "knew enough," would I ever write anything?