Advancing Natural Law by focusing on the Fourth Witness
By Hyrum Lewis

 

Of J. Budziszewski’s “four witnesses” that testify to the natural law—1) Synderesis (deep conscience), 2) designedness, 3) details of design, and 4) natural consequences—I believe that those who wish to advance the natural law in society should focus on the fourth.  My reasoning is that, in many ways, we have reached a stalemate on the first three witnesses, and the fourth provides the most fruitful way forward.  Budziszewski rightly argues that there are moral truths that we “can’t not know,” but even those who can’t not know them can nonetheless deny that they know them, so I don’t know that we can go much further with Synderesis.  On designedness and details of design, even I (a staunch believer in natural law) am not persuaded by those arguments, and they often become highly abstract, technical, and unappealing to common people.  [I won’t give my full rationale for rejecting 2) and 3), but suffice it to say that I do so for the same reason that I reject Dewey’s naturalist ethics—I find it impossible to reason one’s way from an “is” to an “ought”].

 

Unfortunately, it is on the first three witnesses that believers in natural law have so far directed their energies. Most natural law adherents are philosophers or political theorists, and they make their arguments in ways that are unlikely to make any more headway with the public.

 

I think this must change: we need more scholars going into the social sciences—economics and sociology in particular—to show, with hard data, the consequences of ignoring the natural law.  Science has a resonance and legitimacy in modern societies that philosophy and political theory do not.  Why not use that “bias” to our advantage and show, scientifically and empirically, that there are universal moral laws—that the natural law is true because it “works”?  Thus far, it seems that Arthur Brooks is almost alone in making these arguments, but he has nonetheless had a significant impact on the policy debate.  How much better if we could send more young believers in natural law into empirical social science, rather than philosophy and political theory, so that they can make tangible, behavior-changing, policy-affecting arguments for our position?

Thoughts on Joseph Addison's Cato
By Matthew Wright

I’ve found Joseph Addison’s popular 18th-century play, Cato—A Tragedy, to be a very helpful meditation on the ideal of leadership aspired to in the Roman republican tradition. It is notable, I think, not only for Addison’s portrayal of the conflict between the play’s major political actors, but just as much for the interesting contrasts that emerge between Cato and those among his family and friends who remain loyal to the republican order. The play not only highlights Cato’s nobility in contrast to the ambition, treachery, and capitulation of those who would take Caesar’s part, but it also draws our attention to the great difficulty, and thus rarity, of the kind of republican statesmanship that Cato embodies. A central difficulty that emerges is that virtuous republican leadership not only demands a triumph over personal and political vice; it also requires a level of detachment from real human goods that lie very close to the heart. The level of transcendent virtue and dedication that Cato embodies, while serving as an ideal, thus presents itself as a rare political commodity.

Cato’s republican/Stoic virtues are numerous, and Addison withholds no praise. At the highest level, of course, Julius Caesar’s overweening ambition is matched by Cato’s patriotic devotion. “My life is grafted on the fate of Rome: Would he [Caesar] save Cato? Bid him spare his country” (34). His self-abnegation and concern for his friends repeatedly stand in marked contrast to the treacherous opportunism of Sempronius and Syphax, men who attend to their own fates by joining with Caesar before being subjugated by him. There are those, as well, who wish to avoid impending disaster by quick capitulation. In contrast to such diffidence from Lucius, Cato greets looming destruction with courage and constancy—even hope: “While there is hope, do not distrust the gods” (33); and later, “the gods, in bounty, work up storms about us, that give mankind occasion to exert their hidden strength” (42). Finally, in determining which course to take, Cato’s moderation steers a well-reasoned course between Sempronius’ (spurious) zeal and Lucius’ milquetoast statesmanship. Thus, Cato’s virtue towers above this set of characters, each more or less hostile to the republic or unvirtuous.

Addison also gives considerable attention to Cato’s family members and loyal associates who, in addition to their concern with the immediate fate of Rome, are embroiled in various unspoken, unrequited, or simply suppressed romantic relationships. They all seem to be keenly aware—except, perhaps, for Cato’s more volatile son, Marcus—that their personal passions and attachments threaten to distract them from the republic’s present crisis. Addison does not present these personal engagements as trivial or vicious; on the contrary, Cato himself ultimately blesses the love that exists between Portius and Lucia and that between the Numidian prince, Juba, and Cato’s daughter Marcia. In fact, Cato ties the persistence of a true Rome to them: “Whoe’er is brave and virtuous, is a Roman” (96). Nevertheless, the question is whether these characters will be able to control or postpone their desires, perhaps indefinitely, in order to devote their energies to the present crisis. From the opening scene, Addison creates an interesting contrast between Cato’s two sons, Marcus and Portius. Whereas Portius, like his father, is able to maintain a dispassionate view of Caesar “in the calm lights of mild philosophy” (8) and a Stoic acceptance of the “dark and intricate” ways of heaven, Marcus is an angry and passionate man, “tortured ev’n to madness” by Caesar’s aggression and from the outset subject to the “tyrant Love” (8, 10). Despite his patriotism, from the first scene of the play Marcus’ resolution has dissolved; he is a slave to Love. Marcia, as well as Portius, is presented in a much more virtuous light. Both attempt to hold their desires in abeyance, yet both are ultimately overcome—Portius by his love for Lucia (62), Marcia by her grief at Juba’s apparent death (76). This is not to say that any of them become insensible to the perils of Rome or unable to execute their respective duties. 
When immediately confronted with battle, Portius’ heart “leaps at the trumpet’s voice, and burns for glory” (64), and Marcus, despite his volatility, dies in faithful execution of his duty. They are valiant in the moment of peril (bringing to mind Tocqueville’s observation that despite the inconstancy of souls shaped by democracy, free men are capable of tremendous exertions in the face of direct dangers). Still, all of them are bound by personal passions and attachments that, they recognize, compromise their ability to act in the republic’s best interests throughout the trial.

In contrast to this set of characters, Addison’s Cato seems perfectly poised between love of home and patriotic duty. As Marcia remarks, he is a man “fill’d with domestic tenderness, the best, the kindest Father” and always gentle with his friends (92). Yet at Marcus’ death, Juba remarks, “Behold that upright man! Rome fills his eyes with tears that flowed not o’er his own dead son” (85). At the same time, it is the fate of his friends alone that is able to incite in him any fear of Caesar. But perhaps the greatest indication of Cato’s love of home and hearth is his advice to Portius to retreat to the family estate: “There live retired, pray for the peace of Rome: Content thyself to be obscurely good. When vice prevails, and impious men bear sway, The post of honour is a private station” (86-7). Cato is a man of transcendent patriotism, yet so attached to the virtues and relationships of private life that a life of obscure goodness holds great attraction. He seems as easily able to walk away from the public realm as he is able to weep for Rome more than for his dead son. Yet this would seem to present no small internal conflict: How is one to maintain detachment from either realm (public or private) sufficient to walk away from it for the sake of the other, yet also devotion enough to both that either life is a completely fulfilling option? In other words, it would seem that a patriotism so strong that one cries more for a dying republic than for a dead son would tend to diminish or even trivialize personal attachments. But then does life in an obscure private station really present itself to such a patriot as desirable? Cato, at least, seems to have reached such a state of character, but Addison wants us to see that he is alone in it. Not only is his virtue at odds with the vice of Caesar’s cohort, it is markedly different from that of the would-be patriots around him.

Perhaps the point is that Cato’s family and friends have simply not yet reached this level of virtue, and it is still within their grasp. In that case, we should only look for true statesmen among those advanced in years. It seems likely to me, however, that Addison also wants to paint a rarefied portrait of Cato, emphasizing that even among the virtuous, transcendent statesmen will be very few. This is not simply a matter of vicious self-interest; it is just as much a function of the very real goods and virtues that pull our energies and affections away from the public interest.

I myself have serious doubts as to whether this Roman ideal of leadership is a helpful one, but Addison’s work does serve to underscore some of the reasons the Founders had for distrusting political leadership and for constructing the system they did to constrain it.

[Page citations are to the Liberty Fund edition of the play.]

Professor Flipflops, or Some Thoughts on How to Approach Summer, Part One
By Glenn Moots

Part One: Preliminary Considerations

 

You don’t have to be a farmer to appreciate the summertime break from classes that most academic contracts afford.  Though some agitate for a year-round academic calendar without this traditional reprieve, such designs demonstrate a poor understanding of both leisure and the vocation of scholarship.

 

Though I am not above making the occasional joke to friends and family that they “actually work for a living,” I know full well that my workday is as long as or longer than theirs. There is always something to be done, and it’s almost impossible to put academic work aside. But none of that work is back-breaking labor, and I am additionally thankful that much of it can be done more or less on my own schedule. Nevertheless, the “academic life” presents its own set of challenges and, strangely enough, summer is one of those challenges. I think of this every time I am tempted to close correspondence with a colleague. What to say? “I hope you have a restful summer.” Nope. That might make me look like a slacker. “Have a productive summer”? Probably not: this makes me look like an anxious workaholic. I have since dispensed with these kinds of closings. Why not make it look like I didn’t even notice the onset of summer? That’s it! Subtly implied workaholism!

 

I remember when I left a job in finance (again, humoring friends and family by referring to this kind of thing as “working for a living”) and began a full-time teaching position. Despite having a very good boss who promised to put me on the fast track to remunerative success in financial planning, the allure of summers was too strong. Yes, yes. I wanted to impart wisdom to students. Yes, I wanted to have a “life of the mind.” But I also wanted the warm and leisurely months of my youth. Little did I know that the other nine months of the year would more than balance out the so-called “lazy days of summer.” Of course, no one plays ball at the old schoolyard anymore and I don’t quite fit in the wading pool like I used to. Still, I’ve never looked back.

 

Since making that decision to transition to full-time teaching in 1995, I’ve had good summers and bad summers. I can remember particularly “productive” summers revising projects for publication or preparing for new courses. I can also remember more than one “unproductive” summer trying to help family or get a handle on other domestic fronts. I can recall a few summers where projects just didn’t get off the ground despite my best efforts.

 

With all of this in mind, allow me to offer some thoughts on how to approach summer. As we used to say in finance, your returns may vary. Be sure to read your prospectus carefully. Most importantly, think of these suggestions as a conversation starter rather than a conversation killer.

 

If you teach for most of the summer, then much of what I’m talking about doesn’t apply. Let’s face it: it’s really the break from classes that helps to define summer as a leisurely respite. Whether you should teach or not really depends on your particular circumstances. While I could really use the extra income that comes from summer teaching, my institution doesn’t incentivize it well. And unlike extra teaching during the year, there really isn’t the same efficiency to teaching in the summertime. I’m not already on campus. I’m not already teaching two other sections of the same course. Plus, I’m trading a small amount of money for a big block of time that I would otherwise have to “buy” with a grant. On the other hand, colleagues have told me that they find summer teaching to be more informal and relaxed. And some are well compensated for teaching during the summer. You’ll have to decide the trade-off here.

 

Accompanying any decision to teach or to tackle ambitious projects must be the question of what your spouse and children are doing in the summer.  (Don’t forget your parents and extended family, too.) My wife and children are home all summer, so I have a different set of challenges and opportunities compared to someone who has an empty house all summer. What does the rest of your year look like in terms of family time? I am often so busy during the academic year that I really owe it to my family to take things slower in the summer. (When the kids were younger, I did find my way into a wading pool now and then.) On the other hand, my teaching schedule makes it imperative to read and write in the summer lest all scholarship grind to a halt. This trade-off demands daily or weekly time management as well as some perspective on the relationship of summer to the rest of the year.

 

In the next post, I’ll offer some considerations on time management and try to put the summer into perspective.

The Loss of Mother
By Dr. B. Jeffrey Reno

I recently had the pleasure of a reunion with an old friend.  It wasn’t an Army buddy or a lost classmate who found me on Facebook.  It was my first musical love, Debbie Harry and her band, Blondie.  In their fifth decade of making music, the band has just released their latest single, “Mother.”

Blondie’s catalogue ranges from the punk beat of “Detroit 442” to the disco-inspired “Heart of Glass” and the pop sound of “Dreaming.”  Their new song, “Mother,” with its ethereal resonance, tends toward the latter.  The lyrics are simple, building toward a chorus that serves as a melancholic anthem:

Mother in the night,

Where are you, where are you now?

Mother's left the building.

We're the missing children.

Mother in the night,

Where are you, where are you tonight?

On first listen, I thought the band had finally gone soft.  The gang that used to rock CBGB had served up a saccharine tribute to mothers of missing children.  I envisioned a video with photos of lost teenagers reminiscent of the Soul Asylum video for “Runaway Train.”  Alternatively, I wondered if the song might be Debbie Harry reflecting on her own life in which she never became a mother (and, as an adopted child, did not know her birth mother).  I was relieved when I figured out that these interpretations were incorrect. 

The song became more meaningful when I watched the video.  It begins in a packed club in a shabby New York dive with Blondie playing on stage.  Young people dance as a miraculously alive Andy Warhol, played by Alan Midgette, saunters in and takes a table in the back.  Other notables from the 1970s club/art/fashion scene including Chi-Chi Valenti, Johnny Dynell, and Hattie Hathaway join Warhol.  Before long, zombies ascend from the basement and ravage the club, leaving a trail of carnage and new recruits in their wake as the music fades away.

After one final misguided thought that the video was a clichéd effort to latch onto pop culture’s Twilight-inspired obsession with the undead, I made the connection: “Mother” was the name of the club in the video as well as an actual bar that was part of the Manhattan scene when Blondie started out.  Chi-Chi Valenti owned Mother, and it is her club—symbolic of the entire New York counter-culture—that has gone missing.

The distinction between culture and counter-culture has eroded.  Young people today have no CBGB, no Studio 54, no Max’s, and no Mother.  The energy that once drove counter-culture has gone corporate—accepted entry into the bourgeois class it once despised.  The lukewarm mess it now generates, such as Green Day—exemplars of what the great Joe Strummer meant by “turning rebellion into money”—moves from the studio to the airwaves, but then gets choked out by something even worse: Justin Bieber and Katy Perry.

The degradation of counter-culture has been equally problematic for bourgeois culture in several ways.  First, by accommodating what used to be counter-culture, bourgeois culture has become more juvenile, providing less ballast to political society.  Second, counter-culture is one of several training fields for those who question convention.  Most grow out of it and come to embrace convention, but do so satisfied that they had their moment of youthful rebellion.  For others, the experience of growing up exposed to counter-culture whets the appetite for more mature forms of questioning, such as the study of political philosophy.  Without counter-culture we must rely on other training grounds, such as organized religion, which is also not in the best condition.  Finally, for those who fail or refuse to grow up, a counter-culture distinct from bourgeois culture provides a relatively safe containment field.  It is better to have the middle-aged stoner hanging out in a used record store than teaching high school English.

This is why “Mother” captivates me.  A mother is fundamental to life, and, as the song suggests, something fundamental has been lost.  The zombies are today’s music industry of record producers, FM radio programmers, and, of course, “American Idol.”  We need a Mother to protect us from the zombies.  I’m glad that Debbie Harry, at 66, remains ever vigilant.

History, Mastery, Utility: Or, the Public Responsibility of Historians
By Prof. Ronald Joseph Granieri

 

Last April, the organizers of the annual “College-palooza” at the University of Pennsylvania invited me to participate in their series of “one minute lectures,” where professors had one minute to crystallize their approach to their subjects. I figured that it would be a challenge to say anything coherent in only a minute, but at the same time found the experience to be quite stimulating as I tried to make a point that I often hint at in my undergraduate lectures.

This is the complete text:

"I am often asked: What does History teach us?

History doesn’t teach us anything. Historians do. History is not some independent abstraction; it is the attempt by human beings to make sense of the past.

Viewing history with the ironic distance of the hip moviegoer breeds a dangerous sense of superiority over historical actors, and the equally dangerous assumption that simply identifying past mistakes will grant mastery over the present and guarantee future success. Remembering the human dimension can help us avoid those dangers.

Historical actors are human beings, not necessarily any smarter or dumber than we are. Their weaknesses are ours: the limitations of individual perception and the inability to see the future.

None of this means we should let historical actors off the hook for their mistakes, or resign ourselves to an inability to learn from the past. It means we study History not in search of mastery, but rather to develop our sense of humility. History is not an upward march from a benighted past to an enlightened present, but the story of humanity’s eternal struggle against our limitations. Whatever nobility there is to be found in our lives springs not from smug pride over mastering the past, but from the humble awareness that all of us must continue the struggle if we hope to create a better future."

[Yes, there is a real limit to how much you can say in one minute!]

I firmly believe all that I said last year, but I also think there are larger implications that need further elaboration.

To put my cards on the table right away, I believe there is a real and growing tension between how historians view what they do and how the broader public views History, and historians need to work harder to maintain their important public role.

This is not the place for another jeremiad about the historical ignorance of the general population. People are ignorant about a great variety of subjects, in varying measures, so historians should not consider themselves either immune from ignorance or somehow superior to others. Being an educator means being aware of one’s own ignorance as well as that of others, and taking up the challenge to combat it whenever possible in the areas where one can help, even if one realizes that there is no finish line when ignorance will be decisively defeated.

What truly worries me is not ignorance, but selective knowledge. For many educated people (in universities as well as politics and other professions) History is a grab bag of case studies or potted stories used to highlight a point or justify a prejudice. Such an approach can be catastrophic when the selector gets the facts wrong, of course. But it is not much better when the facts are correct but the case is torn from its context, with the implication that a story from any time or place can be reduced to a simple lesson and applied to our own time. Of course we can learn a great many things from the people and events from the past. Contrary to what some may say, however, the people and events of the past do not speak for themselves.  Anyone who cares about historical understanding should want to be sure that the past is presented and interpreted with skill, clarity, and intellectual honesty.

I realize that there are several issues at play here: the relationship of past to present, the tension between seeking broad historical understanding and the desire for a usable past, the uneasy relationship between academic and popular History, or more generally between historical study as an academic discipline and History as a tool of civic education. Each of them has great potential as a blog post in its own right, and none can be resolved simply.

My point today is to emphasize the responsibility of the professional historian as a citizen in our current society. This means being a regular and enthusiastic contributor to popular discussions, not just as the “killjoy” who smugly punctures popular myths, but as an engaged citizen who wants to share knowledge and expertise. Humility goes both ways—as historians encourage the public to eschew simple-minded lessons and to embrace the complexity of context, they should themselves understand that they have not cornered the market on truth either. Historians should offer the context missing from many public discussions, but should not just be reactive. Historians need to seek out opportunities to discuss their own ongoing work with broader audiences, in public forums—including of course blogs—as well as popular publications. They should also cultivate a style and approach that will be inviting to non-specialists, rather than throwing up ramparts of jargon and obscurity.

Therein lies the rub. The attainment of academic credentials and the preservation of academic employment both encourage specialization over general appeal. There is no easy answer there either, but historians need to struggle against the habits and practices that threaten to isolate them.

History is a process, a three-way conversation between the past, the historian, and the public. The conversation is going on all the time, and if historians do not accept their role, someone else will take it up. Historians need to get off the sidelines, and get into the arena.

Five Books Every American Should Read: The Moviegoer, Number Three
By Jessica Hooten

 

Harold Bloom calls The Moviegoer “a permanent American book,” placing the novel in the tradition of Mark Twain. He describes the protagonist Binx Bolling as “a kind of grown-up, ruefully respectable New Orleans version of Twain’s Huckleberry Finn”—leave it to Harold Bloom to get to an idea before me. When I was deciding on the second book that every American should read, I wavered between those two. While Twain’s The Adventures of Huckleberry Finn is canonical, Percy’s first novel is just as quintessentially American, expressing who we have been, who we are, and who we are becoming. It received the National Book Award in 1962, and in 1998 the Modern Library ranked the novel number 60 on its list of the 100 best twentieth-century English-language novels.

The twentieth-century Huck Finn, Binx, is also on a quest, what he describes as “the search.” Binx defines the search thus: “The search is what anyone would undertake if he were not sunk in the everydayness of his own life. …To become aware of the possibility of the search is to be on to something. Not to be onto something is to be in despair” (13). With all their freedom, Americans now struggle with what to do with that freedom. Ironically, a nation founded on the pursuit of happiness is populated by people largely in despair.

Binx’s “cousin” Kate (the stepdaughter of his aunt) suffers from such despair, or “malaise,” as Percy terms it. After an attempt at suicide, she realizes that life is a choice. She enjoys what Percy calls in Lost in the Cosmos the life of an “ex-suicide”: “The ex-suicide opens his front door, sits down on the steps, and laughs. Since he has the option of being dead, he has nothing to lose by being alive.” Once you realize that you have a choice to live, life becomes something to value. Both Percy’s father and grandfather committed suicide, so Percy understood that choice well.

The novel ends with an affirmation of life that most people, including Bloom, either ignore or debunk. Binx and Kate end up together and their marriage signifies a change in Binx; the existentialist wanderer has made a commitment to love and to community. Moreover, Percy intended the epilogue to be a tribute to Dostoevsky’s The Brothers Karamazov. Though it concludes with the death of Binx’s young brother, Percy emphasizes the resurrection.

Walker Percy was a Catholic convert. He moved from agnosticism to faith and simultaneously made a move from science to fiction. A recent documentary by Win Riley (http://www.walkerpercymovie.com) shows this transition. In an interview in First Things, Riley says, “In the making of the film, I felt as if I had an obligation to fully explain Percy’s faith. Finally, I came to realize that it couldn’t ever be fully explained.” Percy would have wanted it that way. For Percy, faith was a gift; he would spend his life searching to understand it.

In Aliens in America: The Strange Truth About Our Souls, Peter Lawler attempts to ascertain why the average American, who currently inhabits a national utopia, “may be less happy, may experience themselves as more lonely and displaced, than ever” (x). He takes his title from Percy, who asserts that we are all aliens in this world, meant for another world. Yet, fifty years after Percy diagnosed the feeling of displacement, Americans are still searching.

The Commoners' Road
By J. Budziszewski

An intriguing question came up during one of the vigorous discussions after my presentation at ISI's American Studies Center Summer Institute at Princeton on June 21.  The presentation was an introduction to the classical natural law tradition.  I had explained that the theories we call classical take a "thick" view of natural law, always weaving together four different sources of moral knowledge -- as I like to call them, four witnesses.  These are: 

 

  1.  Deep conscience or synderesis (as distinguished from surface conscience or conscientia);
  2.  Sheer recognition that the human person has a constitution, a meaningful order, a design;
  3.  Observation of the principles of this design, for example sexual complementarity; and
  4.  The natural consequences of violation.

 

Thinkers like Augustine, Aquinas, and the last and current pope are all more or less congenial to the thick view.  I also put Aristotle and Cicero in this camp, because even though they are a little hazy about the first witness, they have an inkling of it.  Non-classical natural law theorists like Hobbes and Pufendorf are more or less hostile to the thick view; they acknowledge one or two witnesses, but at the expense of the others.  Thinkers like Grotius and Locke are ambivalent, blowing sometimes hot and sometimes cold.

Two lines of questioning might be pursued.  One is to ask whether all four witnesses are valid.  Assuming their validity, the other is to ask whether all four are helpful in conversation with the morally disoriented people with whom we share the public square.  The latter question is the one that came up.

Interestingly, though most of the participants suspected that some witnesses would be more helpful than others, they differed among themselves about which ones are likely to be helpful.  One group wanted to appeal only to deep conscience, because it is basic and spontaneous.  Another was more sympathetic to the two witnesses of design, because they show why only certain kinds of lives can flourish.  A third preferred to talk only about natural consequences, because liberal secularists get hung up on conscience and teleology.

My own view is that such strategies of engagement are partly quite right, but also partly wrong.  As a matter of timing, they make excellent sense.  I ought to begin conversation with the points my neighbor grasps already.  Even so there are two problems.  One is that different people don't always find the same points easy to grasp; the best place to begin speaking with Felicia may be the worst place to begin speaking with Felix.  Moreover, I shouldn't assume that just because Felicia and I begin with point A, point A is the only point we will have to discuss.  Each of the four witnesses implicates and interpenetrates each of the others.  In fact, whenever we push one of the four witnesses out the front door, it creeps in through the back door in disguise -- a point that "thin" theories of natural law never reckon with.

Suppose we stake everything on the witness of deep conscience.  Unfortunately, by itself deep conscience is underspecified.  It can get us to principles that would hold equally well for humans and rational Martians, such as "Do unto others as you would have them do unto you," but in order to know what I should have others do unto me, I have to know how we humans are different from Martians -- we require the society of other persons, we have two sexes rather than one or three, our young pass through childhood rather than hatching out fully mature, and so forth.  Already that brings in the second and third witnesses, and the fourth is not far behind.

Suppose we put all our eggs in the basket of design.  This time the problem is double.  In the first place, if we are going to take teleology seriously, then we must include the teleology of the moral intellect.  That reintroduces the witness of deep conscience.  In the second place, if we are going to discuss the fact that we must honor our design in order to flourish, then it is hard to see how we can avoid talking about what happens when we don't honor it.  That reintroduces the witness of the natural consequences of our actions.

Finally, suppose we try to speak solely about natural consequences.  Alas, there is no "solely."  Medical consequences make sense only in the light of bodily teleology; social consequences make sense only in the light of social teleology; noetic consequences, such as guilty knowledge, make sense only in the light of the teleology of conscience.  Besides, without such consideration we lose the distinction between natural and arbitrary consequences.  Without this distinction we can no longer explain what is wrong with seeking "systems so perfect that no one will need to be good."

Other recipes for making moral conversation easier than it really is suffer similar drawbacks.  For example, you might try to avoid mentioning that troublesome personage, God.  Good luck.  Should the conversation go on long enough, your counterpart will mention Him whether you do or not.  Worse, he is likely to think that you are up to something -- and with good reason.  The experience of conscience inevitably raises questions about the authority of its maker; the experience of ourselves as a meaningful order raises questions about the source of its meaning; and, perhaps most surprisingly, the sting of natural consequences provokes complaints about divine justice.

That last fact is likely to seem odd.  You wouldn't expect anyone to reproach the justice of God unless the question of God's existence had already been broached.  Real conversations turn out to be more surprising, more labyrinthine, and more mysterious.  Fallen man wants to be a god himself.  The system of natural consequences keeps getting in the way.  This goads him to resentment, and it seems he cannot help but feel that someone is to blame.  Don't blame me for that; I am only the reporter.  It is almost as though he reasoned, "I can't have my way; life is unjust; therefore there is a God."  He keeps on acting as though he thought this way, even while arguing that God does not exist.

Make no mistake:  A grasp of natural law is a sine qua non of moral conversation.  We should resist the academic urge to make the conversation harder than it is, because at some level the four witnesses make their appeal to every human being.  Yet we should also resist the impatient urge to oversimplify.  When King Ptolemy demanded of Euclid a simpler way to learn mathematics, Euclid replied, "There is no Royal Road to geometry."  Neither is there a Royal Road to moral sanity.  The road is the same for commoners and kings, and there are no short cuts along the way.

Top Five Books in Early American History: Numbers Two and One
By Michael Schwarz

My personal countdown, long dormant, concludes on this Independence Day with two classics on the American Revolution.  Many regular visitors to this blog not only will have read these books but, I expect, will have a working familiarity with them.  For others—indeed, for anyone who hopes to understand the Revolution—I offer my strongest recommendation on behalf of these two books.  You will enjoy them.   


Here’s a refresher (for a review of the parameters, see #5): 

Number Five: Drew McCoy, The Elusive Republic: Political Economy in Jeffersonian America (1980)

Number Four: George Dangerfield, The Era of Good Feelings (1952)

Number Three: Lance Banning, The Sacred Fire of Liberty: James Madison and the Founding of the Federal Republic (1995)


Number Two

Bernard Bailyn, The Ideological Origins of the American Revolution (Cambridge, Mass.: Belknap Press of Harvard University Press, 1967)


Not until I read this book did I fully understand why the American Revolution happened. 

For many years scholars had explained the Revolution as principally a Lockean phenomenon.  Having experienced the oppressions of arbitrary power, the rebels in 1776 established a new regime presumably inspired by, and founded upon, both natural rights and consent of the governed.  To those who studied revolutionary thought, the transcendent importance of Lockean liberalism, given its prominent place in the nation’s founding document, seemed, for lack of a better phrase, self-evident.  In the 1950s and 1960s, however, this liberal consensus began to crumble, as historians discovered that there was a good deal more to America’s eighteenth-century intellectual tradition than John Locke. 

Building on earlier studies by Caroline Robbins and H. Trevor Colbourn, Bernard Bailyn offered a new and persuasive interpretation of the Revolution’s origins.  Through extensive analysis of revolutionary-era pamphlet literature, Bailyn revealed that America’s patriots drew upon a number of sources, including, most important of all, a group of seventeenth- and early-eighteenth-century English opposition writers such as Algernon Sidney, Robert Molesworth, Viscount Bolingbroke, John Trenchard and Thomas Gordon.  These men, along with some of their contemporaries, sounded the alarm against the growing English state.  Liberty, they claimed, always had much to fear from the encroachments of those who wield power.  Taken together, these opposition writers also appeared to predict the precise pattern by which those in power would attempt to destroy liberty: debts, taxes, tax-gatherers and a multiplication of government offices (useful also for patronage), popular discontent, and, the coup de grace in this conspiracy, the imposition of a standing army.  Looking back from the 1770s, America’s revolutionaries could not help but find these earlier opposition writings most useful for explaining what had happened to the colonies since 1763. 

In short, Bailyn’s book transformed the way scholars think and write about the American Revolution.  More than forty years later, it remains indispensable.  


Number One

Gordon S. Wood, The Radicalism of the American Revolution (New York: Alfred A. Knopf, 1992)


Whereas Bailyn’s Ideological Origins explains the Revolution’s causes, Gordon Wood’s Radicalism shows with incomparable erudition how the Revolution fundamentally altered American society.  On the face of it, this assertion does not seem to square with history.  “We Americans,” Wood begins, “like to think of our Revolution as not being radical; indeed, most of the time we consider it downright conservative.”  Based on the amount of violence, bloodshed, and social upheaval it occasioned, the American Revolution does appear tame in comparison with other great revolutions—in France, Russia and China, for example.  Furthermore, by any measure the white American colonists who made this revolution were the freest and most prosperous people in the world.  How, then, could a primarily political revolution without any apparent social causes qualify as radical?  Wood explains:

“The social distinctions and economic deprivations that we today think of as the consequence of class divisions, business exploitation, or various isms—capitalism, racism, etc.—were in the eighteenth century usually thought to be caused by the abuses of government.  Social honors, social distinctions, perquisites of office, business contracts, privileges and monopolies, even excessive property and wealth of various sorts—all social evils and social deprivations—in fact seemed to flow from connections to government, in the end from connections to monarchical authority.  So that when Anglo-American radicals talked in what seems to be only political terms—purifying a corrupt constitution, eliminating courtiers, fighting off crown power, and, most important, becoming republicans—they nevertheless had a decidedly social message.” 

In 1776, Americans rejected not only royal authority but the entire monarchical and aristocratic superstructure that maintained it.  To be sure, this political revolution did not alter society overnight; history doesn’t work that way.  It did, however, constitute the most dramatic and decisive moment in early America’s republican transformation.

Wood shows that eighteenth-century American society, despite the absence of a king (in person) or a hereditary aristocracy, was decidedly monarchical.  It was a world of patricians and plebeians, gentlemen and commoners, where talent and merit prevailed only through patronage, and where one’s claims to political authority depended entirely upon one’s social standing.  Republican sentiments, however, even before 1776, already were beginning to erode the foundations of monarchism.  After 1776, as a direct consequence of the political revolution, Americans democratized their society with an enthusiasm that exceeded both the expectations and in many cases the hopes of those who had led the rebellion against royal authority in the first place.  It was this incredibly rapid democratization—the acceptance of equality for white males, the legitimization of interest, etc.—that finally destroyed the old monarchical world.  Again, Wood is worth quoting at length.  This comes from the final paragraph of his incredible book:

“America…would discover its greatness by creating a prosperous free society belonging to obscure people with their workaday concerns and pecuniary pursuits of happiness—common people with their common interests in making money and getting ahead.  No doubt the cost that America paid for this democracy was high—with its vulgarity, its materialism, its rootlessness, its anti-intellectualism.  But there is no denying the wonder of it and the real earthly benefits it brought to the hitherto neglected and despised masses of common laboring people.  The American Revolution created this democracy, and we are living with its consequences still.”

Many people write good histories.  Few write with such sweeping authority and elegance.  Small wonder, then, that Gordon Wood, the greatest historian of Early America in the last fifty years, has written what I regard as the best and most important book on Early America to appear in the last fifty years, or, for that matter, at any time since Americans began chronicling their past.      


America v. Europe: Guns and Welfare
By Hyrum Lewis

On his Jan 16 show, CNN’s Fareed Zakaria noted (transcript here) that America has a higher murder rate than most European countries. He then noted that America also has much higher gun ownership. The conclusion he drew was that America’s violent crime is caused by its high gun ownership.

But is it not just as likely that Zakaria has it backwards? That is, it’s not that America has a high violent crime rate because of its many guns, but that it has many guns because of its high crime rate. The more violent the country, the more likely its citizens are to want the means to protect themselves. This is certainly true in my experience: I and almost all of my neighbors own guns, and yet none of us has ever committed a violent crime. We own the guns in order to prevent the violent crime that we know is widespread in America. Without the crime, we would not need the guns.

Perhaps the same holds for the welfare state. We assume that the European welfare states are responsible for the lower poverty rates in those countries, but it could be that the low poverty rates (which preceded their welfare states) allow those states to be so generous, since there are fewer poor to take care of. In our country, which admits millions of impoverished immigrants and has a historically disadvantaged underclass, it is naturally more difficult to provide for the poor than it is for the Europeans (especially since the Europeans shipped many of their poor to the United States in past waves of immigration).

My point is that, in too many cases, we are too quick to identify something as a cause without knowing whether it is instead an effect. We should, then, be cautious in examining data lest we be misled.

I hope scholars will do studies to try to separate cause from effect in these matters. Can they show that there were high poverty rates in European countries before, but not after, the implementation of their programs? If so, we would then know that the welfare state did, indeed, reduce the poverty rate. But until then, we only see a correlation between welfare states and poverty rates, without knowing which causes which.

Students Seeking Comfortable Employment
By John von Heyking

Each year I invite graduates of my Political Science department to speak to current students about their careers.  Students gain better insight into their future jobs from hearing from graduates than from anything I can tell them.  I invite graduates from a cross-section of career paths, including law, the civil service, elected office, business, and administration.  I make a point of omitting graduates who have gone into academics.  Teaching students about the importance of liberal education is crucial, but there is nothing wrong with appealing to their desire for material gain.  Liberal education is about getting a life, but students still need to get a job.

When speaking of material incentives, one presenter this past year told students that while a Poli Sci degree is, on average, less lucrative than one in Nuclear Engineering or even Economics, their degree can yield greater power.  The presenter, a town manager, was only half-kidding.

Recently, Universum concluded a global study that rounds out the story.  It surveyed 10,000 university students worldwide and reports that students list government as their top choice of future employer.  This holds true across majors, including liberal arts, engineering, business, and the health professions.  It seems universities are breeding an ethos conducive to civil service rather than entrepreneurship.

The study’s authors suggest several factors contribute to this finding:

  • 60% of students are women: “Women look for different things in a job than men.”
  • The recession has students looking for more secure work, and the civil service in many cases offers lifetime tenure and defined-benefit pensions, which the private sector generally does not.
  • The civil service is purportedly more ethical than the private sector, which pollutes more.  Students still believe the Hegelian fallacy that the civil service is the universal class because it has the universal interests of society in mind.
  • For liberal arts majors, there are simply more employment opportunities: "You can probably find something you qualify for even if you are a general arts graduate…. There aren't many employers you can say that about."  Even so, the study indicates that government is the most attractive option for all majors, including engineering, business, and the health professions as well as liberal arts.

Alexis de Tocqueville noted that what today we call the administrative state would expand due to its benevolence.  In this case, the state simply seems to offer the easiest path to employment. 

However, with balancing budgets looming as perhaps the greatest domestic political challenge of the next few years, university instructors are well advised to encourage their students to consider alternative career paths, including business.  They should take a page from one of my grad school professors, who once shocked a group of Peace Studies students by telling them that the best way for them to “change the world” was to become investment bankers.

If students really want to pursue their ideals, they should pursue them through routes other than government.  The civil service is not the universal class, as Hegel believed.  Professors should encourage students to embrace risk.  This message may not be entirely believable coming from a tenured professor (and, in most cases, a state employee), but even we took a considerable risk in spending many years in graduate school studying a topic that promised few employment opportunities or other material benefits.
