Michael S. Russo
A Member of the SophiaOmni Network

The Limits of Loyalty

11/26/2013

I’ve often heard students of mine say things like, “you’ve got to support your family no matter what” or “friends have got to stick together no matter what.” When I hear statements like this in class, I can’t help being impressed by how important the idea of loyalty to friends and family is among the members of the Millennial Generation. I certainly don’t recall members of my own generation—Generation X, if you must know—being all that loyal to anything other than the idea of having a good time in life. So, on one level, I’m greatly pleased that a virtue as important as loyalty is making a comeback in American society.

However, as a teacher of ethics, I find myself somewhat concerned about the “no matter what” clause that Millennials often attach to their commitments of loyalty. As worthy a virtue as loyalty is, I can’t help but believe that this virtue could never be absolute in the real world. There have got to be some natural limits to our loyalty, or the fidelity that we show those we care about becomes a kind of blind fanaticism.

So when ought our commitment to support our friends and family members come to an end? I’m inclined to agree with both Aristotle and Cicero that an intimate relationship of any kind must be terminated if the other party involved turns morally bad or wicked—that is, if they begin to act in such a way that they are causing harm to themselves or to other people.

Let’s begin with the issue of harm to others, since that’s less controversial. Let’s imagine that a friend you’ve had since childhood has suddenly become obsessed with money and has developed a scheme to rob UPS trucks of their packages when they are left unattended by their drivers. Your friend has become quite successful at this and has managed to make thousands of dollars from his crimes. He confides in you about his activities one night. What should you do?

Assuming that you yourself have any moral standards, the answer would be that you should attempt to convince him that what he is doing is wrong and try to persuade him that, at the very least, he has to stop his criminal activities. But what if he chooses not to? I think at that point, were you to continue to remain loyal to your friend, you would be complicit in his crimes. Instead, you’ve got to tell him that, unless he stops what he’s doing immediately, you can no longer continue to see him or be his friend. Any obligations of loyalty that you have towards your friend would subsequently come to an end, until such time as your friend decides to change his ways.

This is a very dramatic example, of course, but I think that the principle holds in less dramatic ones as well. If your friend was a bully, a bigot, a chronic liar, a cheat, a manipulator—if he repeatedly engaged, in other words, in activities that caused harm to others, especially innocent others—then you would likewise have no choice but to end your friendship.

The example of self-harm is a bit more problematic, but I think that the principle I’ve laid out holds here as well. Imagine that you have a friend who has developed a serious substance abuse problem. His behavior is causing him to neglect his job and his responsibilities to his family. You try talking to him about his issues, but he refuses to even acknowledge that he has a problem. So what do you do at that point?

Certainly, there are those who would argue that it’s wrong to abandon a friend in a time of crisis like this one—that you ought to continue to stand by him and remain loyal for as long as he needs you. But I think that this just makes you complicit in his act of self-destruction. The right thing to do in a case like this is to try as much as possible to get your friend to change, but, when it becomes evident that he has no intention of doing so, you have to put an end to your friendship for the sake of your friend. And any loyalty that you have towards him must be suspended until he agrees to do something about his problem.

The examples I’ve used above involve friends, but what I’ve stated about the limits of loyalty applies to family as well. If a family member—a parent, a sibling, or a child—begins to engage in activities that cause harm to themselves or others, I think that we have a moral obligation to terminate our relationship with these family members in order to help them become morally responsible individuals again. To think otherwise would be to imply that family relationships trump all moral duties and obligations that we have in life, and this is simply not true.

I also think that if we really care about people—whether they are family, friends, or less intimate acquaintances—we would be as concerned about their moral welfare as we are for their physical, financial, or social welfare. And the closer individuals are to us, the greater, I believe, are our obligations to care for their moral well-being. In this sense, we should have even higher moral standards for our close family members and friends than we do for other members of society…not because we want to treat them more harshly than we do others, but because we care about them even more.

I know that there are those who would reject the position that I’ve laid out on the limits of loyalty. Some would probably argue that I am being overly rigid and moralistic and that no one could adopt the kinds of moral standards towards family and friends that I’ve argued for here. If that’s the case, feel free to challenge what I’ve said in this piece. But consider first how you would respond if you discovered that a friend or family member was involved in the kinds of situations that I’ve described above. And then reflect on whether the continuation of your absolute loyalty towards these individuals—supporting them “no matter what”—would be better or worse for them than the kind of tough love that I’ve argued for.

The Skeptic's Way

8/14/2013

I’ve been teaching philosophy now for over 20 years, and it always amazes me how gullible students are. Every year when teaching my Philosophy of Leadership course, I come in the first class and inform the students—in a very bad Irish brogue—that I am Fr. Liam McCarthy from County Galway in Ireland. I then go on with the prepared script:
“Dr. Russo, I’m afraid, has been deemed ill-suited to teach this class and I’ve been asked to take his place. What I plan to do is examine the leadership styles of our Lord and Savior, Jesus Christ, his blessed Mother Mary, and the saints and martyrs of the Catholic Church, including, but not limited to, Saints Perpetua and Felicity, St. Odo of Cluny, and, of course, the blessed Berengarius of Tours. Our text will be the Bible, which I plan to teach to you in the original Greek. Many of you, I fear, will not do well in this course, because you are weak of mind and prone to the frailties of the flesh. I want you to know that I have no problem failing every one of you, if you fail to meet my exacting standards. Does anyone have any questions? Good. Then let’s begin our class with a prayer taken from the Catholic rite of the dead.”
I say all this with a perfectly straight face, while at the same time trying to the best of my ability to maintain something like a Barry Fitzgerald-style brogue from The Quiet Man. It’s a ludicrous performance, and no one with any sense at all could possibly believe that Fr. McCarthy could be real. But the students all do. And when I can no longer sustain my performance, break out in laughter, and inform them that they’ve been had, most of my freshmen still don’t know how to react: they sit paralyzed for some time, trying to figure out how they could have believed something so patently absurd to be true.
 
I know what you’re thinking: how stupid can these freshmen be? But they’re not stupid at all. In fact, only honors-level students take my leadership class. And I would bet that, if you were in this class, you would buy into the reality of Fr. McCarthy, even with his abysmal brogue and his absurd 1950s Catholic worldview. You would accept that Fr. McCarthy is for real, because, like most human beings, you’ve been trained to accept many things on faith that you have no real evidence for at all.
 
For instance,
  • you  believe that you were born in a certain place at a certain time to certain  parents.  
  • you believe that the world you experience with your senses exists as you perceive  it.
  • you believe that this planet that we are on is part of a larger universe that is very, very large and contains many other solar systems.
  • you believe in God and that when you die your personal identity will live on in some form.
  • you believe that when you look into the mirror every morning, the person you see staring back at you is the real you.

Unlike the reality of Father McCarthy, these are all somewhat plausible beliefs, to be sure.  You’ve probably embraced many of these beliefs most of your life and people that you trust and love undoubtedly hold to them as “gospel truth.”  But how do we really know that any of these so-called “truths” are actually true at all?  
 
Mind Games
 
Let’s play a few mind games. For these games to work, you’ll have to put aside all the beliefs about your life that you have taken for granted as true.
 
We can start with your experience of reading this very text. Your assumption, I’m sure, is that you, __________________ (fill in your name), are sitting down in front of your computer reading the words that appear on the screen. But can you really be certain that this is what you are actually doing? Haven’t you had the experience of thinking that you were enmeshed in some activity—hanging out with your friends, visiting a strange, exotic place, making love to a desirable partner—only to wake up and discover that everything you thought was real was actually nothing more than a dream? But while you were dreaming, the dream seemed totally and completely real to you, didn’t it? Well, how do you know that something similar is not going on right now? Perhaps instead of reading this text on your computer, you are, in fact, in deep REM sleep, dreaming about reading this text. Can you really be 100% certain that this is not the case (remember, while you are in a dream, everything seems completely real to you)?
 
Let’s try another mind game, just for fun. Once again, you are reading this text, imagining that what you are experiencing is real. But I’m here to tell you that the you that you think is you is not really you, and the world that you think is really real is not real at all. You are actually a being of a much more highly evolved species than homo sapiens (you have a body only about 4 feet tall, four fingers on each hand, a huge cranium to support your impressive brain, and no icky genitalia, since reproduction of your species is done purely through mental contact). Every 150 years members of your species go into a coma-like state, called “The Phase,” in order to regenerate, and remain in this state for about five years. During that time, it’s not uncommon for beings like yourself to imagine themselves as completely different sorts of creatures on strange new worlds. For example, while you are in your coma-like state, you’ve imagined yourself as _______________ (fill in your name) living in a place called ___________ (fill in your town and country), on a planet called Earth, in a period described as the early 21st century. You’ve even created a bizarre physical form for yourself that is totally unlike the “real” form that you actually possess (pubic hair…yuck!). The further along you are in The Phase, the more elaborate the dream becomes, until you no longer even begin to question that it’s real. You establish relationships, develop a career, beget children, etc. But—and here’s the kicker—you are now approaching the end of your five-year sleep cycle and very soon will be ripped from the fantasy reality that your mind has created. When that happens, everything you experience in that dream-like state will become nothing more than a vague memory that you will eventually forget completely as you resume your “real” life.
 
I know that you are probably thinking that both scenarios that I’ve described are completely implausible. You know exactly who you are, and you know damn well that what you are experiencing at this very moment is precisely what it appears to be. But can you really be certain that is the case? In fact, the “certainty” that you possess about just about every aspect of your life is actually more like a belief or conviction—something that ultimately can’t be proven or disproven. You could, in fact, be sleeping, or you could be an alien creature in a coma-like state. How could you ever prove that you’re not?
 
The Way of the Skeptic
 
What’s the point of all this, you’re probably asking by now?  The point is to set you on a path that some philosophers have called the ultimate road to self-realization.  It’s called the  path of skepticism, and its practitioners—called, not surprisingly, skeptics—argue that true liberation comes from embracing the uncertainty inherent in human life.  “Dubito”—I doubt—is the motto of all skeptics, and a truly radical skeptic doubts every aspect of his experience.  
 
The way of the skeptic is the opposite of that of the dogmatist. Dogmatists believe they have certain knowledge about the nature of reality, the right way to live, how to organize society, etc. Their supposed certainty leads to conflict with other dogmatists who also believe that they hold the truth. Aggression, violence, war, and genocide are the end results of embracing a philosophy that holds that one’s own truth is absolute and everything else is error, lies, and heresy.
 
The skeptic, in rejecting the idea of universal or transcendent truth, avoids the tension and conflict that the dogmatist inevitably experiences when his views run counter to the views of others. When the skeptic encounters someone with an alternative perspective on reality, he simply acknowledges the beliefs of the other and moves on humbly and graciously. He doesn’t get angry or frustrated, because he has no personal stake in the debates dogmatists love to have among themselves.
 
The total suspension of judgment that the skeptic has about what is true or false leads to a kind of inner peace that the dogmatist can never possess. Things may “appear” or “seem” to be true to the skeptic, but when he’s shown that this is not the case, there’s no psychic rupture that occurs within him. His beliefs are recognized to be beliefs, and nothing more, and when new beliefs come along that are superior to the ones he’s previously held, he’s capable of embracing them with a cognitive flexibility that the dogmatist could never even imagine.
 
Not convinced?  Try suspending judgment for just a week on matters that you’ve always assumed to be true.  For just a week, instead of reacting dogmatically when your beliefs encounter opposition, make an effort to remain open to conflicting viewpoints.  You just might find that your life has become much more pleasant by giving up some of your certainty about the truth…and you also might find that the world around you becomes a much nicer place as a result.

This Being Unto Death

4/6/2013

I’m dying…Did you know that?

Don’t get too upset about it: you’re dying too. We’re all dying. In fact, from the very moment we’re born on this planet, our lives have been a steady, inexorable progression to the grave. We’re literally “beings unto death”—to use the memorable terminology of the philosopher Martin Heidegger.
 
I know what you’re thinking right now:  “That’s pretty obvious, isn’t it?  Who doesn’t know that they’re going to die?” 

But it isn’t really obvious at all to most people. If you’re elderly, or sickly, or have had a close friend or a family member die tragically, then maybe you have an appreciation for the fact that you are a being unto death. But if you are a typical college student at the peak of your physical development, you probably only understand death in the abstract. Death for most twenty-year-olds—actually, for most people regardless of their age—is something that happens to someone else: to Aunt Sally who had cancer, or Grandma who was 90, or to that starving child in the commercial about Africa.

But you certainly don’t think it’s going to happen to you…not for a very, very long while anyway.

When you’re in your twenties, the last thing you want to do is spend your time thinking about death. There are wild parties to go to, romances to be had, careers to be started. Who has time to think about death? When you’re young, you’re also convinced that you’re indestructible. That’s why most twenty-year-olds are almost always reckless jerks on the road. They don’t ever stop to think that getting behind the wheel drunk and driving 80 miles an hour on the expressway is the perfect recipe for a swift demise.

Believe it or not, I was young once too. At one time in my life I too thought that I would live forever. I used to laugh at old people and their assorted ailments. I remember once working a security job when I was a freshman in college and being teamed up with a 60-year-old former cop named Lenny. Lenny would have to run to the bathroom every half hour or so, and I’d inevitably make some wisecrack about his old-man bowels. I remember quite well, though, what he used to say to me: “Just you wait, Mike, one day you’ll get old and you’ll be crapping BBs all day long too.”

Thankfully, I’m not crapping BBs yet. But as I pass through my fourth decade on this planet, I am also quite aware that I am no longer that young, 130-pound smart ass who never gave a moment’s thought to sickness or old age. The hair is definitely thinning out now, and strands of grey are starting to appear out of nowhere. When I was in my twenties, I was so emaciated that I used to drink weight-gain formula that I bought at a fitness store, just so my ribs wouldn’t stick out quite so much. Now I have to watch everything that I eat and work out almost every day to forestall the inevitability of middle-aged sag.

The first time I was aware that I was no longer a young person was when I was on the subway with a group of college students for a class we were having in Manhattan.  We were all hanging onto a pole in the train car, and I happened to glance down at our hands all bunched together.  And then I saw it:  that brittle, veiny, craggy old hand in a sea of soft, collagen-rich, wrinkle-free hands.  There was no mistaking it: I was no longer young.

So you see, I really am going to die.  Maybe not today or tomorrow, but relatively soon.  And I can’t deceive myself about that fact any more the way I could when I was younger.  The old man hands that I stare at every time I type something on the computer won’t ever let me forget that fact.

And, when I die, I tell you, the universe and everything in it will die with me. What good does it do me that humanity lives on if I am to be no more? When I die, my art dies with me; when I die, my words disappear as if into thin air; when I die, all the hopes and dreams of a lifetime are buried in the grave with me.

Or is there some other state that I can hope for after death that might take away some of the bitter sting of human mortality?  Certainly thinkers much more profound than I am have developed fairly persuasive arguments for the immortality of the soul that should not be cynically dismissed.  But you and I must also acknowledge that all claims to a life beyond this one are matters of hope and faith, and may very well amount to little more than the desperate longings of fearful minds. 

Fortunately, while I’m alive I have philosophy, which Plato in the Phaedo so aptly called a “preparation for death.” He meant that philosophy prepares the soul to live out its existence after death in an incorporeal state. But I think that, when we consider philosophy a preparation for death, we mean something much more than this. We mean that philosophy places death front and center as an object of contemplation in order to teach us what’s most important about our transient human condition.

The acknowledgment that I’m going to die in fairly short order forces me to think seriously  about the way I am living out the precious time I have on this planet.   Am I living a worthy life, a noble life, a virtuous life? 
Am I leaving the world a better place than I found it?  As the Quaker Stephen Grellet once put it, “I expect to pass through this world but once. Any good that I can do, therefore, let me do it now.  For I shall not pass this way again.”  Am I, in fact, doing the good I can while I am here?  Or am I just adding to the sum total of human misery, inflicting my own nasty emotional baggage onto others around me?

I’m dying and so are you, but it’s nothing to get morbid about. In fact, you just might find that contemplating death now and then helps put our human existence into true perspective, sifting out what is really important in life from what is utterly frivolous and insignificant. And the inevitability of death teaches us, above all, that our fragile human lives are the most valuable gift imaginable and ought to be fully cherished each and every moment, for…
We shall not pass this way again.

In Defense of Augustinian Pessimism

4/2/2013

In general, I think the TV sitcom is a fairly stupid form of entertainment.  In 22 minutes, there’s some silly conflict, a happy resolution of the conflict, and the amazing advancement in insight and moral behavior that inevitably occurs at the end of each episode.  

The one exception I make to my sitcom antipathy is that old standby of 1990’s comedy—Seinfeld.  What is it that sets this show apart from more banal sitcoms, you may be wondering?  Well, the creators of  Seinfeld—Jerry Seinfeld himself and Larry David—were keen philosophers of human nature.  They understood that, for the most part, our characters are fixed, that human beings keep making the same  mistakes all the time, and that true personal growth and transformation rarely,  if ever, occurs in the real world.  “No hugging, no learning,” was Larry David’s philosophical approach to the show, and this is precisely what makes it so profound.  In the last episode of the series, the four main characters actually wind  up in prison, and they still learn nothing from their experiences.  We are almost guaranteed that when they finish serving their prison terms, Jerry, Elaine, George, and Kramer will immediately return to the cynical narcissistic behavior that got them into so much trouble in the first place.

I have no doubt that, if St. Augustine were living in the 1990s, he would have appreciated the wisdom of Seinfeld too. You see, Augustine was convinced that the effects of original sin and the force of vicious habits over long periods of time created a situation in which men and women may know what the right way to behave is, but wind up time and again doing what is wrong anyway. For example, a college sophomore who is on a very tight budget because of school expenses knows damn well that she shouldn’t spend the little money she has buying a new pair of stylish leather boots, especially since she already has five pairs of boots in her closet. She goes to the mall with her friends determined only to look, but years of rampant consumerism and the rush she gets from buying new things undermine her fragile resolve. By the time she leaves the mall, she has spent $200 on a new pair of boots that she didn’t actually need, and is now wondering whether she is going to find the funds to pay for gas to get to school.

In real life, this happens all the time: we resolve to begin eating better, but can’t resist the urge to wolf down Big Macs three or four times a week; we know we should not drink to excess, but wind up binge drinking almost every weekend; we promise ourselves that we will be kinder to our parents, but always seem to get into fights with them over the silliest things.

Augustine saw nothing strange in this pattern of behavior.  He was convinced that human nature was so corrupted that one could know darn well what the right thing to do is, but feel compelled to do what is wrong anyway.  His favorite quote was from Ovid:  “Video meliora proboque deteriora sequor.” (I see the better way and approve it, but I follow the worse way.)

What’s so radical about Augustine’s approach to human behavior is that it represents a complete and total break from the classical tradition of which he’s technically a part. The adherents of all of the great schools of antiquity—the Neo-Platonists, the Peripatetics, the Academics, the Cynics, the Epicureans, and the Stoics—basically accepted the principle that, if a person knew what was right and desired to do what was right, right moral behavior was virtually guaranteed. But Augustine clearly understood that this was an overly simplistic way of understanding human nature. He knew this because in his own life he had trouble committing to do the good that he desired to do (“Lord, make me chaste and continent, but not yet”) and because, as a priest and bishop, he encountered individuals who were sincere in their desires to live out the Christian faith, but who fell back into sin time and again.

And this is what makes Augustine seem so modern and relevant in the 21st century compared to his more idealistic counterparts in antiquity.  We understand today just how nearly impossible it is to overcome addictions. We also have a much greater understanding today of those factors—environmental, psychological, and genetic—that often interfere with the free exercise of the will.

 In this sense, I consider Augustine to be the first modern thinker in the West.  His understanding of human nature may indeed be pessimistic, but it’s also extremely realistic.  If you don’t believe me, just try this simple exercise:  commit yourself for one whole week not to lie or gossip for any reason.  See how long you succeed in carrying out this intention.

At the end of the exercise, you too may find yourself beginning to question just how free you are to do the good you desire.  And that’s exactly the sort of humility that Augustine claims is needed, if we are ever to begin to look outside of ourselves for a solution to our human problems.  

We Are the Last Men

3/30/2013

A recent study of religious attitudes and beliefs shows that 20% of Americans describe themselves as having no religious affiliations at all (up from only 8% in 1990).  This includes people who may be vaguely spiritual, but who have no interest in being part of any organized religion, as well as those who describe themselves as agnostics or atheists.   More telling still, one-third of men and women under the age of 30 claim to be “nothing in particular” when it comes to religious affiliation.

What accounts for this sudden surge in the number of people who describe themselves as having no connection to any religion? Organized religion itself is undoubtedly to blame. In the past decade we have seen leaders from all the major religious groups in the United States engage in financial improprieties, abuses of power, sexual transgressions with minors, and every manner of hypocrisy imaginable. Such behavior has most certainly tainted the “brand” of organized religion in the eyes of many Americans.

The study also indicates that the association of organized religion with a conservative political agenda that at times comes across as racist, sexist, and homophobic has played a role in its decline as well. 40% of self-identified liberals, for example, claim to have no religion (compared to 9% of conservatives). This may also be why younger Americans, who tend to be fairly liberal on social issues, make up a much larger percentage of the religiously unaffiliated than older Americans.

While it is clear that organized religion is not going away any time soon, when the youngest, best educated, and most upwardly mobile members of a society turn away from organized religion—as appears to be happening in the United States—that doesn’t bode very well for the future of religion in this country.  

It’s interesting that this trend away from organized religion was predicted in the 19th century by the German philosopher Friedrich Nietzsche.  In language that is as provocative today as it was in his own time, Nietzsche dramatically declared that “God is dead.”   In The Gay Science, Nietzsche has his “madman” run to the marketplace, shouting, “I seek God, I seek God,” only to be mocked by atheists in the crowd.  In response to their taunts, the madman proclaims the following:
“Where is God?” he cried; “I will tell you. We have killed him—you and I. All of us are his murderers. But how did we do this? How could we drink up the sea? Who gave us the sponge to wipe away the entire horizon? What were we doing when we unchained this earth from its sun? Whither is it moving now? Whither are we moving? Away from all suns? Are we not plunging continually? Backward, sideward, forward, in all directions? Is there still any up or down? Are we not straying as through an infinite nothing? Do we not feel the breath of empty space? Has it not become colder? Is not night continually closing in on us? Do we not need to light lanterns in the morning? Do we hear nothing as yet of the noise of the gravediggers who are burying God? Do we smell nothing as yet of the divine decomposition? Gods, too, decompose. God is dead. God remains dead. And we have killed him.

“How  shall we comfort ourselves, the murderers of all murderers? What was holiest and  mightiest of all that the world has yet owned has bled to death under our  knives: who will wipe this blood off us? What water is there for us to clean  ourselves? What festivals of atonement, what sacred games shall we have to invent? Is not the greatness of this deed too great for us? Must we ourselves not become gods simply to appear worthy of it? There has never been a greater deed; and whoever is born after us—for the sake of this deed he will belong to a higher history than all history hitherto.”

Here the madman fell silent and looked again at his listeners; and they, too, were silent and stared at him in astonishment. At last he threw his lantern on the ground, and it broke into pieces and went out. “I have come too early,” he said then; “my time is not yet. This tremendous event is still on its way, still wandering; it has not yet reached the ears of men. Lightning and thunder require time; the light of the stars requires time; deeds, though done, still require time to be seen and heard. This deed is still more distant from them than the most distant stars and yet they have done it themselves.” 

It has been related further that on the same day the madman forced his way into several churches and there struck up his requiem aeternam deo. Led out and called to account, he is said always to have replied nothing but: “What after all are these churches now if they are not the tombs and sepulchers of God?”  (The Gay Science, 125).

It should be noted that Nietzsche was not simply claiming that God doesn’t exist—although he certainly believed that. What he was trying to say is that the hypocrisy and fundamental dishonesty of organized religion would eventually lead to the death of religious belief itself. God is dead because religion is no longer able to provide order, meaning, and value to our lives. We are moving, he believed, from a religious era to a post-religious one, and we’ve found nothing yet to replace our belief in God.

Or have we? In Thus Spoke Zarathustra, Nietzsche describes what he calls the "last man." "There will come a time," he writes, "when man will no longer give birth to any star. Alas! There will come a time of the most despicable man, who can no longer despise himself. Behold! I show you THE LAST MAN."

In the absence of the meaning and order provided by organized religion, the last man strives to avoid suffering and struggle and lives for comfort and pleasure.  In the end, the last man is actually closer to a beast of the field than a human being, and about as far removed from the Superman—the endpoint of human evolution for Nietzsche—as one can possibly get. When the concept of God dies, when the Churches become his tombs, Nietzsche believes that the nihilism that results provides fertile ground for the propagation of decadent last men and women.

In proclaiming the death of God, the madman in The Gay Science admits that he has "come too early"—that the world is not ready for him. This was Nietzsche's problem as well. But Nietzsche's prophetic views on the end of organized religion, and what that end means for human society, seem to be coming true in our own day and age. If the study cited above is accurate, God may not be completely dead yet in the United States (as he is in most of Northern Europe), but he is certainly on life support. And, in the absence of a new generation of Supermen, what we have in our own society is a nation of the kind of "last men" that Nietzsche describes so wonderfully in Zarathustra.

Like Nietzsche's last men, we have become a people no longer capable of lofty ideals, a nation of individuals who see no value in struggling to improve ourselves and who are content instead to swill cheap beer in our La-Z-Boys, watching reality TV as we graze on an endless supply of artery-clogging snacks. The planet is literally choking on the shit that is emitted from our cars, our smokestacks, and our energy-intensive homes, but as long as we have our daily pleasures, what difference does it make to us?

Nietzsche believed that out of the nihilism caused by the death of God a new race of Supermen would emerge to create new values and ideals. I'm not so confident that a Superman will appear in our own society any time soon. And, if he did, I have no doubt that the last men would kill him off the way that other morally superior human beings, like Jesus or Gandhi, have been killed off in the past.

No, we're definitely not ready for the arrival of a Superman. We'll have to content ourselves with living in a world without God and without any transcendent values. The consolation is that when the end comes for us last men, we probably won't even notice it. And even if we do, we'll be far too absorbed by the endless pleasures provided by Lindsay Lohan and the Kardashian sisters to give a damn, anyway.
Why Socrates Still Matters

3/28/2013
The first time I encountered Socrates was as a freshman at Fordham University, where, like all freshmen, I was required to take an introductory course in philosophy to fulfill my general education requirements.

In all honesty, I didn’t have a clue about what  philosophy was.  I had come from a fairly conservative Catholic high school—a prep seminary, actually—where we were taught to accept the teachings of the Catholic Church without much questioning at all.  I had learned everything there was to know about the Old and New Testaments and the Christian doctrine, but philosophy was one thing that the good fathers who taught me seemed to have little use for.

In fact, when I told the priests at my high school that I was going to a Jesuit college, they responded with actual alarm. I remember one of my teachers telling me before I graduated, "Just be careful that you don't lose your faith, Michael. Those Jesuits will teach you to QUESTION EVERYTHING!"

So there I was back in 1982, a freshman in my Introduction to Philosophy class, not really knowing what philosophy was, but having the suspicion that I was going to be corrupted in some way by my encounter with it. When I met my instructor, a young adjunct named Ed who seemed far too cool to be a professor, I became even more concerned. "You've been living in a world where you accept everything as true based upon your upbringing, your faith, or your own biases," I remember him saying during our first class. "But I'm going to teach you how to challenge your presuppositions about reality and see if they hold up in the light of reason."

Ed’s plan for the class was fairly straightforward.  We were going to be reading what were known as the Socratic dialogues of Plato—those texts that Ed said best represented the actual thought of the philosopher Socrates.  And the reason for this was simple: Everything you needed to learn about the methods of philosophy, Ed maintained, you could learn from understanding the approach that Socrates took to the discipline.  And that approach could be summed up as “QUESTION EVERYTHING.” 
 
So the priests in my high school were right, I thought.  One week in a Jesuit college and they were already trying to brainwash me into abandoning my faith!

The first text we had to read in the class was the Apology, Socrates’ famous speech in defense of his philosophical way of life.  Despite my reservations, I found myself being captivated by the person of Socrates, who certainly was not afraid to poke fun at his accusers, even though his life was hanging in the
balance.  But what impressed me most about Socrates was his dogged determination to discover The Truth about the right way to live, no matter what the consequences. So he spent his life cross-examining those who “claimed to know”—the so-called experts—only to discover that he was far wiser than they were, because at least he realized how little he actually knew.

And this, I believe, is the key to Socrates' continued relevance some 2,400 years after his death.

We live in a world in which everyone claims to know The Truth. All around us we have experts telling us what we need to eat in order to be healthy, what policies we need to support in order to put our country back on the right path, what doctrines we have to believe in order to be saved, and so on. What Socrates teaches us is that we shouldn't simply accept the opinions of those who claim to know, but rather should be involved in a life-long process of questioning the so-called experts to see if what they say actually holds up to reason. Sometimes the opinions of the experts will be right, but quite often we'll discover that the experts, to put it frankly, are full of shit—that they know even less than we do, but their pride prevents them from admitting their woeful ignorance.

The example of Socrates also teaches us the importance of humility in our quest for the truth. Recognizing the limitations of our own knowledge is a first step to being open to the possibility of actually moving in the direction of the truth.  Like Socrates, we may not grasp this truth completely in our own lifetimes, but our lives, like his, will be much better spent for making the effort. And we will certainly become just a little bit wiser as a result.

When my first philosophy class ended, I discovered that the fears of my high school teachers were totally unfounded.  My encounter with Socrates in that class didn’t destroy my faith, but rather, helped me to sort through the teachings I had grown up with to see which actually made sense and which were the product of irrational superstition.  My encounter with Socrates also began my life-long love affair with the discipline of philosophy that has enriched my life in ways that I could hardly have imagined while I was sitting through Introduction to Philosophy.  

I may know less about the really important issues in life than I did as a freshman in college (I knew everything back then).  But now at least, I take consolation from Socrates that the recognition of my own ignorance may one day prove to be the source of my future wisdom. 
On Tragedy and Moral Responsibility

1/2/2013

Piece written for Wisdom's Haven
2012 is now officially over. Although the year saw some glimmers of economic recovery on the horizon and an Obama victory over the forces of rabid conservatism, it was not, by any stretch of the imagination, a very good year for our country or for the planet. On the East Coast, two events in particular caused the year to end on such a disturbing note that you almost can't blame people for wanting to move on as quickly as possible to 2013. These events, of course, were the destruction caused in the mid-Atlantic region by Hurricane Sandy in October and the shooting of 20 school children and 6 adults at Sandy Hook Elementary School in Connecticut just a few weeks ago.

Both these events can legitimately be described as tragedies. In the storm, lives were lost, thousands were left homeless, and there were billions of dollars in property damage. At Sandy Hook, innocent children and teachers were murdered by a deranged young gunman, Adam Lanza, who also took his own life and that of his mother. The only appropriate response to either of these tragic events is to feel immense sympathy for the victims and offer them as much emotional and financial support as we can to help them deal with their losses.

But there are two important lessons that we should take away from events like these.

The first is existential—the real recognition that human life is inherently tragic, that horrific things happen all the time to very good people, and that the attempt to insulate ourselves from the tragic nature of life is a fool’s quest.  Indeed, one could argue that the entire life project of many Americans is precisely to try to forget as much as is humanly possible just how tragic life can be.  We spend much of our time engaged in the most frivolous sorts of activities—shopping for unnecessary creature comforts, gorging ourselves on unhealthy food, traveling all over the world, building huge homes for ourselves and our bloated families—all in an attempt to forget that human life is inherently vulnerable and transient. 

The simple truth is that, as human beings, each of us will experience the death of loved ones as a regular occurrence, we will suffer physical and emotional pain as a normal part of living our lives, we will know failure, loss, and rejection, and we will eventually get sick and inevitably die.  And all this must be done alone, because no one else can live our lives for us and no one else can suffer and die for us.   It shouldn’t take a wall of water from the Atlantic Ocean sweeping our homes away or the murder of innocent school children to make us understand the tragic nature of the human condition; daily existence itself should teach us that—if we didn’t incessantly try to cover over this fact. 

In the end, however, try as we might to ignore the tragic nature of the human condition, ultimately we can’t really escape from it.  Even the “Real Housewives of New Jersey” will get fat, will get old, and will die.  And their children will die.  And their children’s children will die.  All of the riches and pleasures of the American consumeristic lifestyle can’t disguise the fact that all we really amount to at the end of our lives is a hunk of rancid flesh fit only for the consumption of the meanest parasites.  That is the inevitable conclusion of our all too brief time on this little planet of ours and there is not much we can really do about it. 

Were we to embrace the inherently tragic nature of our human condition, instead of constantly trying to run away from it, I'm convinced that we would all be much happier for it in the end. And the happiness I'm talking about is not the shallow sort that comes from buying a new iPad or designer outfit. It's the happiness that comes from understanding that life is precious, that our time on the planet is fleeting, and that we should try to live the most meaningful existence we can, "for we shall not pass this way again."

The second lesson, I believe, that we should take away from these two events is that, despite the inevitability of tragedy in our lives—or perhaps precisely because of it—we have a moral duty to do what we can to minimize the amount of unnecessary tragedy that innocent human beings are forced to experience. We also need to consider seriously how our own selfish, materialistic, consumeristic—i.e., American—lifestyles may contribute to making the tragedies that are the price we pay for corporeal existence more severe or more common than they might otherwise be.

Hurricanes, for example, are inevitable. And, as long as there are severe hurricanes, people will die as a result of them, and property will be destroyed. But just because hurricanes are part of nature, that doesn't mean that we Americans are totally blameless for the swath of devastation caused by Hurricane Sandy. Many climatologists, for example, believe that Sandy would not have been quite so destructive if water temperatures had not been artificially raised by the climate change that we are responsible for. We should also reflect on the fact that American taxpayers essentially subsidize those who live in hurricane-prone areas by providing them with government insurance that allows them to live on barrier islands, where no one probably should be permitted to live. The question that we need to begin to ask ourselves is what we collectively are going to do about facts like these to ensure that fewer Americans die as a result of disasters like hurricanes.

Similarly, there will always be insane people among us who are prone to violence. Arming every citizen in the country won't prevent mass shootings, nor will putting a police officer in every school in the country. But we might begin to question our obsessive need to cut taxes at all costs, even if this cost is the kind of community mental health counseling that might have identified Adam Lanza as a troubled individual and provided him with the kind of help he desperately needed. Similarly, we might begin to reflect upon a gun culture in the United States that allows mentally ill individuals in many parts of the country to buy assault weapons with no background check. Perhaps it's time to start questioning whether our Second Amendment rights are—or need to be—as absolute as the NRA would like them to be. If assault weapons and their ammunition were impossible to come by, Adam Lanza might still have taken innocent lives, but 20 children and 6 teachers probably wouldn't be dead right now.

Ultimately, you and I are responsible for the misery, suffering, and death caused by both Hurricane Sandy and the shooting at Sandy Hook. We are responsible not because we could have prevented events like these from happening, but because our mindless commitment to a selfish, materialistic American lifestyle has made these events far more catastrophic than they needed to be.

The question is what, if anything, are we going to do about it? 

The Ethics of Quid Pro Quo, Part Two

11/14/2012
I've got to acknowledge that my moral perspective has gotten much more restrictive in recent years than it was when I was younger. As a college student, I had a wonderful, idealistic moral vision that was founded upon the radical altruism of the Gospels, the progressive social activism of the 1960s, and the example set by the great social exemplars of the 20th century—Gandhi, Archbishop Romero, Martin Luther King, and Ralph Nader, in particular. Back then I honestly believed that selfless compassion for those in need was possible and that through collective sacrifice we could transform the world into a much better place.

As I entered middle age, I began to recognize that there was little likelihood that I would ever become a saint and that personal and collective sins are not quite so easy to eradicate as I had assumed they were.  My moral position at this point is the happy mean between the Christian altruism of my youth, which I now find far too idealistic to implement in any kind of meaningful way, and the libertarian ideology which is running rampant throughout the United States, and which I find abysmally devoid of any concern for the common good.  I call this approach the Ethics of Quid Pro Quo and wrote about it in an earlier piece.  

In a nutshell, my position is that real reciprocity is the key to authentic moral interaction with other human beings.  Our obligations extend to autonomous others to the extent that they have entered into a relationship with us in which there is a balance between what is given and what is received.  Those who take without ever giving are moral pariahs who ought to be shunned; and those who give without ever expecting anything in return are moral fools, who almost deserve to be taken advantage of.  In the balance between the quid (that which is given) and the quo (that which has been received) a true moral relationship is formed in which the mutual needs of the parties involved are recognized and respected, and as a result both parties are morally and existentially affirmed through their interactions.

I’ve come to believe that there is absolutely nothing wrong with expecting others to reciprocate in some form when we care for them or do some act of kindness for them.  The expected reciprocation (the quid) should be roughly comparable in significance to the initial act (the quo), although, depending on the specific circumstances of the other, the act of reciprocation can at times be as minimal as an expression of appreciation (a sincere and heartfelt “thank you,” in other words).  I also think that it is a sign of decent moral character to consider how to reciprocate—and to what extent to reciprocate—when one has been treated kindly or generously by another person.  The person who never thinks about reciprocating at all is either a moral imbecile, and therefore not responsible for his actions, or, as I’ve already indicated, a moral pariah, who is best not associated with by anyone but the most committed masochist. 

As I contemplated how this ethics of quid pro quo might be implemented, I began to wonder what exactly our obligations are towards those who are not able to engage in the kind of exchanges demanded in this kind of moral system. The answer quite simply is that, if an individual is incapable of truly reciprocating because of mental or physical incapacity or limitation (the seriously mentally or physically disabled or ill), age (young children), lack of free will (animals), or by virtue of the fact that they do not yet exist (future generations), then, individually and collectively, we have an obligation to work for the good of such individuals regardless of whether or not they can reciprocate. Once again, however, we must be careful not to demean such individuals by automatically assuming that they are completely incapable of any sort of reciprocity at all. Young children, for example, are able to give back much more than we typically assume and should be trained from a very early age to contribute to the good of their families and to the larger community in whatever way they are capable.

I also think that it has been a mistake of otherwise well-intentioned liberals to treat the economically disadvantaged as though they lacked the ability either to care for themselves or to provide some service in kind for the public generosity bestowed upon them. When charity, for example, is given to the poor in the form of food stamps or below-cost public housing, with no expectation of any kind of reciprocating action on the part of those receiving it, we treat such individuals as though they were not fully autonomous and therefore not quite as human as we are. It really is an insult to their dignity as human beings, and does little more than make the distributor of charitable offerings feel morally superior to those who are the recipients of his or her largesse. On the other hand, a well-constructed workfare program—and I'm not sure that such a thing actually exists right now in the United States—asks recipients of taxpayer support to give something back to the larger community, and in doing so allows those individuals the dignity of feeling like full participatory members of that community.

One should not assume that my focus on reciprocity in moral actions means that I reject the value of charity completely. There are those towards whom charity is certainly appropriate. Victims of natural disasters, wars, and famines, for example, deserve our sympathy as well as our financial and emotional support; the same is true for those who fall victim to circumstances beyond their control (sickness, disability, mental illness, etc.). We have an obligation, individually and collectively, to care for such individuals if they are not able to care for themselves. And this is true even if they are strangers who might never be able to repay our generosity in any meaningful way.
What's Wrong With Dabbling?

12/18/2011
In his book Outliers, Malcolm Gladwell makes the case that becoming a world-class expert at anything is not just a matter of having innate genius. To become a great athlete, writer, musician, or even a master criminal, he argues, requires a minimum of ten thousand hours of practice. Even Mozart—whom most people mistakenly believe sprang out of his mother's womb a musical talent—spent much of his early life honing his musical gifts before he wrote his most impressive works.

If someone like Mozart needed ten thousand hours to become an expert in his field, then you can bet your sorry ass that you and I need at least as much time engaged in some consistent and intense sort of practice before we should even think of calling ourselves “experts” in any field of human endeavor.

Ten thousand hours. That’s a hell of a long time to spend focused on anything. Maybe that’s the reason why really talented people have such unbalanced lives: they’re so intent on perfecting their skills that everything else falls by the wayside. Relationships, fun little hobbies, family, the simple pleasures in life…all get tossed aside in the expert’s fanatical quest for perfection in his or her field. Experts also tend to be jerks, because with that ten thousand hours hanging over their heads, how could they possibly find the time to think about social niceties (as a case study, see “The Social Network”)?

If this is what it takes to be an expert, please forgive me, but, for now at least, I prefer to remain a devoted dabbler. There are simply too many fascinating things to explore in this short life of ours for me to have to commit myself exclusively to any one. I suppose that my career might have been more successful if I had been able to focus on any one of the jobs I have held since I began my professional life: high school teacher, campus minister, director of religious education, director of service-learning, director of international education, director of the first year experience, professor of philosophy and ethics, and now publisher. Perhaps if I had committed myself to any one of these positions—or at least spent Gladwell's ten thousand hours perfecting my skills at any one of them—I might be renowned in at least one field of human endeavor, instead of "being all over the map," as one very charming administrator at my college so aptly put it.

I should also confess that, even when it comes to my hobbies, I've demonstrated an equally passionate fear of commitment. In my adult life, I spent years at a time studying (not in any particular order of importance) Franciscan spirituality, photography, the Beat Generation, Roman intellectual thought, web design, ecotourism, zoysia propagation, veganism, Eastern religion (including, at one time or another, Zen, Vipassana, Tibetan Buddhism, and Vedanta), the Counterculture, Augustinian eudemonistic theory, 1930s screwball comedies, the Venetian art of the cicchetti, Belgian beer production, Buddhist iconography, perennial plant science, abstract and minimalist art, acid-alkaline food combining, the music of Bob Dylan (for four entire years!), communitarian theory, voluntary simplicity, and book design. Whew! It took a lot to get all of that out.

I guess you could say that, if anyone fits the bill of a consummate dabbler, then that would most certainly be me.

Now, Gladwell probably would argue that this lust for pursuing whatever fancy caught my attention throughout the years came at a tragic price: because I never spent enough time perfecting my skills in any one area, I never really developed the mastery required to become an expert at anything. And he’s probably right about that.

But I think that there is an advantage to being a dabbler that Gladwell overlooks. Dabblers often can be proficient enough in so many areas that they can move almost seamlessly from one to another as needed in life or in their careers. Experts can’t do that. If you are an expert, as we’ve seen, it means you probably are inept in most other areas of your life because of the time commitment involved in attaining mastery. That’s no problem if you have a career that pays well, is fairly stable, and brings you long-term happiness. But, if any of these turn out not to be the case, the expert is screwed. He has nowhere else to go.

The dabbler has another advantage that the expert lacks: he may have a perspective on a broad spectrum of human thought and experience that makes him a go-to guy when the experts are puzzled. Many problems that we face in the 21st century are so complex and so interconnected that it often takes someone with the kind of expansive vision that comes from dabbling to see how all the “pieces of the puzzle fit together.” That’s precisely the kind of vision that the expert most decidedly lacks.

So consummate dabbling may not be as much of a problem as some people make it out to be, and it may, in fact, have certain advantages over maniacal specialization.

This position seems to be supported by the philosopher Plato. In the Republic, Plato maintained that his Philosopher-Kings would only be chosen after the age of 50, when, after a lifetime of education and rich experience, they would be in an ideal position to rule the polis—to become political experts, in other words. Plato's Philosopher-Kings, then, would spend the first fifty years of their lives essentially dabbling. Of course, they'd be studying philosophy intensely, but the educational program that Plato provided for them ensured that they'd probably be fairly well versed in just about every other subject imaginable as well. It is precisely this sort of broad training that guaranteed that the Philosopher-Kings would have the wherewithal to govern competently, while their philosophical expertise guaranteed that they'd govern justly.

I’ll be turning 50 myself in a few years (heavy sigh!), but until then, I plan to heed Plato’s advice. I’ll continue to joyously dabble, as I’ve always done, study subjects that interest me at any given time, learn any skills that I think might benefit me or my students, and write about whatever damn well pleases me. If I ever get appointed Philosopher-King, I’ll leave my dabbling behind and grudgingly become an “expert,” if that’s what’s required of me. In the absence of that sort of mandate, I might very well remain an inveterate dabbler for the rest of my life.

Who knows: I might even become the world’s first expert dabbler!
On Human Perfectibility (And Other Such Nonsense)

12/8/2011
When I was a graduate student at the University of Leuven and had to decide on a period of philosophy on which to focus, there really was no question in my mind that it would be late antiquity. I had always found this period of history—which begins with the death of Alexander the Great and extends to the fall of the Roman Empire in the West—to be one of the most fascinating and underappreciated in European history.

But it was the intellectual thought of the period that really grabbed me. It was during late antiquity that all of the major schools of philosophy—Epicureans, Skeptics, Cynics, Stoics, Peripatetics, and the Neo-Platonists—were engaged in a mighty philosophical battle to determine whose philosophical system would come to dominate. Now, the approaches of all these schools differed dramatically, but they all shared one basic presupposition: that human perfectibility in this life was indeed a possibility.

For a young man in his 20s with more than his own share of emotional baggage, this idea was naturally quite appealing. All you had to do, it seemed, was have the right understanding of the nature of reality and live your life according to some well-defined principles and you were set. Happiness, self-realization, moral perfection, the good life—whatever you want to call it—could be yours for the taking. And once you attained this state of ultimate perfection, as exemplified by the great sages of all these traditions, you'd never again have to be bothered by anger, fear, despair, loneliness, anxiety, and the like. You'd essentially be impervious to the vicissitudes of chance and fortune.

It was a sweet dream, but, as I get older, I have begun to realize that this dream is actually more like a dangerous fantasy. And, unfortunately, it's a fantasy that most Americans buy into to one degree or another. We think that, if we can just change our outlook on life, or get the right kind of job, or marry the right person, or have a bit more money, or find the right religion to belong to, our lives will suddenly become perfect. We'd be walking around on a cloud of bliss, never again to know the torturous pangs of unhappiness.

So, we spend our lives going from therapist to therapist, from medication to medication, from guru to guru, looking for the magic pill that will dispel the inner demons that plague us. But what we find, more often than not, is that we “flee into the desert” hoping to escape our problems, only to find that our problems follow us right into the desert (to paraphrase John Cassian). The desert, of course, is a metaphor for that secret cure that will end our unhappiness. In the early Christian Church, the desert represented escape from the wickedness and temptations of the city. But our own personal desert could be just about anything: it is basically the delusion that we have that everything in life will become just fine, if we can just change ourselves or our outlook on life just a little bit.

St. Augustine, the great Doctor of the Church, saw that this incessant quest for perfection was a dangerous tendency that would actually rob one of the little happiness that was possible in this life. As a young man, he flitted from sect to sect trying to find the magic cure for his own misery. In the end, however, he discovered that the very quest for perfection was his greatest sin and that what he really needed to do was just accept his own fallen nature and muddle through life as best he could. The rest, Augustine came to believe, was in the hands of God.

I used to think that Augustine’s approach was insufferably cynical and pessimistic. But now I think that he actually was on to something. If we can just recognize that we’re not perfect and never will be, and try to accept who we actually are, with all our foibles, neuroses, petty character traits, and gross imperfections, we might actually find the modicum of contentment that may, in fact, be quite realizable in this life.

As the great philosopher, Clint Eastwood, once profoundly said:

"A man's got to know his...limitations."