August 2011

Reductionism in Political Science
By Ryan T. Anderson on August 02, 2011

During the Summer Institute this year we heard eloquent presentations on natural law from Hadley Arkes, J. Budziszewski, and Robert George. A presentation by John Mueller, on his new book Redeeming Economics and how the founders of modern economics got the discipline off on the wrong foot, got me thinking about how various theories of natural law--and of political science in general--likewise get off on the wrong foot.

So, what went wrong? Joseph Schumpeter, the great economic historian of the 20th century, wrote of Adam Smith in his History of Economic Analysis (1954) that “the Wealth of Nations does not contain a single analytic idea, principle or method that was entirely new in 1776.” John Mueller goes a step further to say not only that Smith does not add anything to economics, but that his theory actually leaves out both distribution and utility.

The elimination of these two aspects should come as no surprise to anyone familiar with the debates in philosophy and political theory over the so-called “modernity project.” Consider how a Scholastic such as Aquinas understood various sciences. At the heart of Aquinas’s social thinking was a recognition that order exists on four irreducible planes, distinguished by how they relate to our mind and will: the order that exists in nature, independent of human thought and choice; the order that we bring into our thinking itself; the order that we bring into our thinking about what to do; and the order that we bring into our thinking about how to do it. These four orders give rise to four irreducible sciences: first, metaphysics and natural science to study what is, what exists independently of human choice; second, logic to study the relations of concepts; third, ethics and practical philosophy to study what is to be, the ends of human choice; and finally, the applied arts and sciences to study how to achieve those ends, the means.

Yet much of modern social thinking rests on the explicit rejection of this third order, and thus of this third science. Machiavelli announced the ambitions of this new political science in Chapter 15 of The Prince (1513): “Many have imagined republics and principalities that have never been seen or known to exist in truth; for it is so far from how one lives to how one should live that he who lets go of what is done for what should be done learns his ruin rather than his preservation.” In a thinly veiled assault on Plato’s Republic and Augustine’s City of God as the imagined republics and principalities that argued for how one should live, Machiavelli set out to reveal the “effectual truth” of how successful people do live. Implicit here was a reduction of political thought to the first and fourth orders: investigate how people are (first order), and then reason about the means (fourth order), in this case the means of staying in power, without any concern for how people ought to be quite apart from how they might serve our interests (third order).

Thomas Hobbes and David Hume make this reduction even more explicit. In the Leviathan (1651), Hobbes writes that “the thoughts are to the desires, as scouts, and spies, to range abroad, and find the way to the things desired.” And in A Treatise of Human Nature (1739), Hume argues that “reason is, and ought only to be the slave of the passions, and can never pretend to any other office than to serve and obey them.” The third-order science that considered which ends one should act for has been eliminated. It is now for political science to study man’s passions as they are given (first order), and then to devise the best way to secure those ends (fourth order). If this is how one understands human action, then, of course, distribution (deciding which people should be the ends of one’s economic acts) will be eliminated from consideration.

Yet Adam Smith went even further. Smith was, of course, deeply influenced by David Hume and taught by Francis Hutcheson (the famous Scottish Enlightenment moral sense/sentiment theorists). Mueller argues that Smith, in his quest for a Newtonian science of economics and under the influence of Stoic pantheism, went further than Hume to deny reason a role in selecting either the ends or the means. Smith thought that the sentiments fully determine human action, so that the only variables to explain are production and exchange: a streamlined science with fewer moving parts. (And, it should be noted, this theory of production and exchange logically leads to Karl Marx’s criticisms of modern capitalism, which rest on Smith’s faulty “labor theory of value.”)

With time, economists came to understand just how thin this theory really was and reintroduced the concept of consumption based on utility (helpfully refining the idea into one of marginal utility). This reintroduction of the Augustinian understanding of consumption based on utility corrected for the problems in both Smith and Marx. Rather than viewing the worth of objects for human consumption as intrinsic to the object (or the labor that produced the object), the theory of utility saw that an actor’s preference for an object, based on the utility it brought, explained both the value of and the choice for the object. But this still left unexplained the choice of which person(s)—self, other(s)—would obtain the object. That is, it left out distribution. And rediscovering that element of economic science is at the heart of Redeeming Economics.

Renewing the Humanities
By Stefan McDaniel on August 04, 2011


Debate about liberal education has long been central to the culture wars. Most of us are familiar (perhaps wearily familiar) with the main points of disagreement. Should colleges and universities teach truth, merely sponsor the search for truth, or proclaim truth a delusion? Is it possible or desirable to identify a canon of “great books,” and what should it contain? Is it possible or desirable to practice neutrality among cultures, or neutrality among a chosen set of cultures? What (if any) restrictions on academic freedom are justifiable? To what (if any) religious or ethical standards may the opinions and behavior of students or faculty be held?

While the belligerents have been slinging heated tracts at one another, their mutual enemies have quietly gained possession of the field. Traditionalists and postmodernists alike generally agree that liberal education should be liberal (i.e., worth doing for its own sake), but the contemporary university is increasingly ordered toward creating and credentialing efficient producers for the market and effective managers for the state. Furthermore, the humanities, the most distinctively liberal disciplines, have well-earned reputations as refuges of indolence and fraud. Though our disagreements are far from trivial, all of us seriously concerned with liberal education should collaborate to promote its autonomy and rigor.

Let us begin with a notion glamorized (though not invented) during the Renaissance: Man (i.e., humankind) is worth studying. Man in all his aspects, but especially insofar as he appears to be distinctive, that is to say insofar as he is a cultural being, one who applies historical, religious, philosophical, and ethical categories to his own experience and activity.

If that is granted, we must ask whether there are any disciplines universally fundamental to the intelligent, responsible, and productive study of human culture. I believe there are at least two such disciplines—language and history.

All committed humanists must study language, broadly construed to include semiotics in general, but with natural languages as the indispensable core. Culturally speaking, man lives by signs alone, so semiotics stands in much the same relationship to the diverse branches of the humanities that mathematics does to all departments of engineering. The humanities, especially at the undergraduate level, can increase their respectability and rigor by mandating, whatever the specific field of study, high competence in several natural languages and mastery of the fundamentals of syntax, semantics, and the general theory of signs. 

Only slightly less important than language is history, which the great historian John Lukacs defines with lapidary exactness as the “remembered past.” Through the study of the remembered past (which illuminates and is illuminated by study of the languages by which it is transmitted), humanists learn how human beings construe their experience: the categories, values, connections, and problems perceived in the natural and social world that condition human thought and action.

Except where a humanist’s specialty obviously requires a different emphasis, there are good reasons for Americans strongly to prefer the history of America and Europe. People are, generally speaking, best equipped to interpret and analyze the historical materials of their own culture. Still, it is not of decisive importance whose history is studied, so long as the habit of historical thinking is firmly ingrained. We obviously cannot agree on the proper content of a shared deposit of historical knowledge, but perhaps we can agree that an educated public should be acutely aware of the value of detailed understanding of history and eager to remedy historical ignorance. Furthermore, although there is little hope of consensus on any important aspect of history, the practice of scrutinizing and comparing the methods and materials used by different historians should create common criteria for identifying honest and intelligent interpretation.  

The subjects generally classed among the humanities are usually studied most profitably when approached with the skill-set given by study of language, and are certainly seen most comprehensively within the context of history. For, in various historical circumstances and using various special modes of expression, man has asked and ventured answers to certain fundamental questions, “What is the world like, what has happened so far, and what should we do now?” and thence came philosophy (including the natural and formal sciences), religion, economics, poetry, etc.

The approach I have suggested would improve the general quality and reputation of liberal learning and, since studying language and history develops fungible skills, provide many other benefits. There’s even something in it for the utilitarians, since, as David Goldman of First Things once argued, one of the United States’ most troublesome liabilities in international relations is poor intelligence. For this, Goldman suggests, we must blame a culture that produces proportionately few people with significant knowledge of foreign languages and folkways. American colleges and universities are culturally powerful institutions. If the policies and ethos of the academy demanded, even at the undergraduate level, significant knowledge of foreign languages and the habit of historical thinking (which implies the habit of cultural analysis), the United States would have a much larger pool of qualified candidates for work in diplomacy and intelligence.

Higher Law Constitutionalism Revisited
By Paul DeHart on August 06, 2011

In a blog post published several months ago on this site there is an interesting discussion of the relation of the Constitution to majoritarianism. Without engaging that post directly, I would like to take up the issue of constitutional majoritarianism here, for the majoritarian thesis is controversial.

Is it correct to think of the Constitution as the institutionalization of popular rule as determined by the majority? This is a question to which some constitutional theorists of the first rank have given a negative answer. Daniel Walker Howe, for instance, sees the Constitution as an institutional design for promoting rule by a natural aristocracy of wisdom and virtue (as opposed to a landed one). And, to be sure, one can find endorsements of rule by a natural aristocracy in the writings of founders such as Jefferson and Adams. Howe believes he finds this way of thinking manifest also in key passages of The Federalist (perhaps especially in the argument of Federalist 10).

I have the utmost regard for Prof. Howe (his article and subsequent chapter on the political psychology of the Federalist were groundbreaking and remain of utmost importance for any serious student of Publius). Nevertheless, I agree with the prior blog post that the Constitution does indeed seek to instantiate majority rule. But I think this point must be made with great qualification. Recall, for instance, Madison's argument in his "Vices of the Political System of the United States." There Madison argues that the principal vice of the political system of the United States is the injustice of the laws of the several states. There are two main reasons for the injustice of the laws enacted by the state legislatures. The first reason has to do with the character or virtue of most of those elected to office. Madison says individuals run for office for three reasons: ambition, interest, and public good. Unfortunately, only a few of those who run for office are animated by the last motive. And Madison notes that, once elected, those animated by the vicious motives of ambition and interest are most effective in establishing legislative coalitions. But this is not the main reason for the injustice of the laws of the several states. The main reason is the injustice of popular majorities that drive the legislatures to adopt unjust policies. So the principal vice of the political system of the United States was the consequence of the unrestrained majoritarianism that obtained in the several states.

Given that we are to have a republic rather than a monarchy, aristocracy, or even a mixed regime (in the Aristotelian sense--see Federalist 14), framers such as Madison sought to embody institutionally the maxim that in all free governments the deliberate will of the people will prevail over the will of their rulers. But the deliberate will of the people is distinct from whatever the people (or a majority) happens to will now. Majorities animated by immediate, narrow interests or by the passion of the moment are, by institutional design, resisted and suppressed. Only long lasting and broadly distributed majorities, animated by justice, should prevail. So Publius cannot be described as a mere majoritarian (or, in other words, as an unconstrained majoritarian). He favors certain majorities and rejects others--and such a distinction among exercises of majority will, as I argue in Uncovering the Constitution's Moral Design, cannot be made on the basis of majority will (or on the basis of republicanism, majoritarianism, a theory of consent, etc.).   Distinctions among exercises of majority will are only intelligible to the extent that the distinction is made on the basis of a standard that is not majoritarian--by a standard that transcends majority will. Consequently, the fact that Publius distinguishes exercises of popular will, favoring some and rejecting others, indicates that majority rule is not the end or goal of his account of politics. Rather, majority rule is but an instrument, valuable only because (or inasmuch as) it serves a higher end. What is that higher end? Federalist 51 comes to mind: "Justice is the end of government. It is the end of civil society. It has ever been, and ever will be pursued, until it be obtained, or until liberty be lost in the pursuit."  In that same essay Madison praises the extended sphere the new Constitution will put into effect just because, according to him, in the extended republic of the United States majorities will seldom form on principles other than those of justice and the general good.

In sum, the problem with characterizing the Constitution as majoritarian is that it makes rule by the majority appear to be the end or goal of the republican form. But if that were so, the constrained majoritarianism of Publius would be nonsense indeed.

Getting an Academic Job
By Anonymous on August 08, 2011

Although I am no expert on getting a job, I have some personal and professional experience that might be of value to some readers of the ISI blog.

You should know that, before becoming an academic (I received my Ph.D. only in 2009), I earned a master's degree in Human Resources Management at the Institute for Labor and Industrial Relations at the University of Illinois at Urbana-Champaign. After that I worked for over a decade with three organizations: M&I Data Services, AuditForce, and Cardinal Stritch University. With each organization, I participated in both hiring and firing employees. I joined the faculty at Carthage College in 2007, and have continued to serve on hiring committees for faculty positions, both tenure-track and contract hires.

Rather than rehash the dismal data about the job market, I will offer several tips and strategies you might consider to help separate you from the competition. My comments on this blog represent the rough draft of a larger effort I am working on about positioning in an extremely competitive academic job market. My comments here relate mostly to what you should do when submitting applications for jobs (I plan to make subsequent posts about interviews).

First, avoid appearing silly. I know this sounds a bit trite, but it matters. I have seen a disturbing number of applications with grotesque typographical errors and slapdash phrasing. For instance, one candidate for an academic job boasted that he had “developed the ability to speak orally.” (I will refrain from asking how else one might learn to speak.) Another candidate used the date 2110 for his cover letter. Our committee considered sending him a rejection letter, but, since he was obviously from the future, we decided that he had already learned he didn’t get the job. The point here is that you must avoid silly mistakes in important communications. We all make mistakes, but your application is not the place. Go above and beyond in making your materials shine. Hiring committees are an unforgiving lot and will look for any excuse to rule you out as a candidate. Don’t give them the ammunition.

Do exactly what the job advertisement asks you to do. This means that you should do no more and no less than what the employer wants in an application. Do not try to be creative or to cut corners. For instance, don’t provide a separate discussion of your research unless it is required in the guidelines. Similarly, if the application guidelines ask for a cover letter AND a philosophy of teaching statement, do not fold the two components into a single document. Write a separate cover letter and a more detailed philosophy of teaching statement. If you have doubts about what is required, do not make assumptions. Even if the contact person is the chair of a department, it is better to work up the courage to drop an e-mail or make a phone call than it is to guess at what is wanted.

Start drafting cover letters and philosophy of teaching statements now—even if you don’t have a job in mind. I believe that it is better to re-craft something than it is to start from scratch when you have a deadline approaching. Spend time now, when there is no pressure, thinking about what good teaching means to you. Ask your mentors what they think about good teaching, recall what you remember as good teaching, search the web for ideas, look at your peers’ philosophy of teaching statements, and, above all, try to determine what passes for good teaching at the institution to which you are applying (again, do your homework). Have others read your letters and statements before sending them off.

In terms of writing good cover letters, I have strong opinions. Others will certainly disagree, and I respect that. However, I believe that good cover letters follow a fairly set formula. The letter should:

1) be between 300 and 500 words;

2) indicate clearly to which position you are applying;

3) contain a brief, but confident, one-paragraph discussion of your qualitative abilities (e.g., organizational skills, excellent writing skills, ability to teach a variety of courses);

4) discuss in one to two paragraphs your objective qualifications (e.g., Ph.D. earned in 2005, dissertation on “Pet Keeping Habits of the Parisian Bourgeoisie,” award-winning teaching, service as adviser to the “Stop, Drop, and Roll Fire Safety Club”); and

5) conclude with a very brief paragraph about how you are certain that you can contribute quality scholarship, meaningful advising, and excellent teaching to the given department at whatever college. Also mention that you look forward to the opportunity to discuss your qualifications with them in more detail.

Good philosophy of teaching statements are less formulaic and require more thought than cover letters. Of the excellent statements that I have seen, no two are alike. However, I have noticed a few common themes. First, they are rarely longer than three pages (in fact, most are just about two pages). Next, good statements are well structured. In other words, they have introductory paragraphs that tell the reader what the body of the letter will say and have concluding paragraphs that sum up the statement. Finally, the content of good statements discusses not just teaching, but also research and advising (they also discuss the relationship among teaching, advising, and research). You will notice that my comments lack specificity about content. This is because I believe that teaching statements are idiosyncratic. In essence, they are a chance for you to express your thoughtfulness and individuality. As such, philosophy of teaching statements require contemplation and, above all, repeated revision.

This leads to my next major point. Tailor each cover letter and philosophy of teaching statement to each institution to which you apply. Although it is fine to work from templates, you should craft every application as if the institution to which you are applying is the ONLY one where you could EVER see yourself working. This might seem overstated, but the hiring committee wants to believe that the person it hires is destined for the position. Hiring committees look not only for excellence in a candidate, but also for a good fit.

In order to tailor your materials, you must do your homework. Before you send an application to a prospective employer, you should know as much as you can about the position to which you are applying. The web is the most obvious place to start. Look at all the programs and departments that might relate to your position. In addition, look up the biographies, bibliographies, and areas of expertise of anyone with whom you might work. You need to differentiate yourself from the faculty who work at your target institution. They don’t want to hire copies of themselves. Some overlap between you and current faculty won’t hurt. Just keep in mind that your potential employers will want someone who can expand their department’s offerings, not repeat them. Doing this initial spadework can help you to identify honest ways in which you might make a positive contribution.

Don’t be afraid to make calls. Let your mentors know where you are applying. Ask them if they have contacts. Ask your friends and peers for help. Do not be shy! In addition, start doing your homework BEFORE you get your assignment. Those of you who are still in graduate school, especially, should identify institutions where you would like to work. Investigate how many faculty the department has, how many are tenured and how many on contract, what employment opportunities there have been at the institution in the last few years, whether the institution is financially secure, and so on. Even before you are aware of a job opportunity, get in contact with department chairs at those institutions. Write them fan mail about their scholarship. Ask them questions (by e-mail) about their schools, about teaching at the type of institution where they work, and about the job market. Impress them (or at least start developing your networking skills) before jobs become available.

The strategies listed above are low-risk and low-stress. Yet they have the potential to yield substantial benefits. I am not making this up. I have used these strategies myself, and I know others who have, when searching for both professional and academic jobs.

Annual Reports and Student Evaluations
By Ryan R. Holston on August 10, 2011

A friend of mine recently posted a message on a familiar social networking site (which I’m sure can be guessed!) that revealed a great deal of angst over having to write an Annual Self-Evaluation for performance in his/her current position. This friend is an academic in a relatively new teaching job, where all members of the faculty are expected to provide evidence of “excellence in teaching.” However, in the short time at this institution, the feedback my friend has received from student evaluations has been less than stellar, leaving this person in a bind as to how to present his/her case to the department chair and dean.

Since it is that time of year when many, if not most, academics are doing such self-evaluations and since we’ve all experienced at one point or another student feedback that we wouldn’t necessarily want as the top line on our C.V., I thought this would be a good opportunity to reflect on both phenomena.

The first and most important thing to remember when considering student evaluations is to approach them with the appropriate perspective or frame of mind.  It is a cliché in academia that in assessing faculty teaching, too much emphasis is often placed on evaluations that come from students who are, in the first place, unqualified to be appraising the teaching of their professors, and, in the second place, often base their evaluations on superficial considerations, such as whether they found the professor entertaining or whether grades received in the course reflected their self-perceived intelligence level.  The problem is that these criticisms of student evaluations do nothing to help the faculty member who finds him or herself already within this system of assessment.  Moreover, and more problematically, such criticisms actually buy into the logic which treats such evaluations as serving as an - albeit flawed - metric or barometer of good or bad teaching.  But it seems to me that faculty have it within their power to alter this assumed premise about student evaluations simply by changing their outlook on and interpretation of them.  The key is to treat student evaluations simply for what they are -- the perspective on one’s teaching from the vantage point of someone who sits in the room and takes the class.  In other words, once one jettisons the assumption that such reports are designed to measure, score, or rate professors’ performances and replaces it with the idea that they merely provide the view of the course from the eyes of the student, feedback not only loses its teeth and becomes considerably less threatening but can actually be a useful tool in improving one’s teaching (not to mention impressing one’s department chair and dean).

Consider that there are two possible outlooks or ways of interpreting my friend’s negative student feedback. One is to view these student evaluations as reflecting his/her poor performance over the course of the academic year. Students, implicitly perceived as the authorities on good and bad teaching, “rated” this colleague of mine poorly, and, if this is the case, he/she is immediately put in a defensive position of having to justify the “score” received. Interestingly, many respondents to this friend’s post seemed to accept this premise and immediately began offering possible excuses for the performance: this mutual friend of ours was still relatively new to teaching, he/she did not have ample time at the institution to gauge and adjust to the student population, he/she was teaching one or more required courses that self-select for students who are forced (and thus do not want) to be there, his/her research had been occupying a substantial and disproportionate amount of time, etc. But this all misses the point. Such excuses are only warranted if one assumes that the goal is to get high “scores” from students whose job is to “rate” one’s performance. Alternatively, my advice to my friend was to reflect on the particular areas in which the students voiced critical concerns about the course and to consider which of these concerns had merit and which did not. The outcome of such self-reflection and deliberation - what is valid and what is invalid about the areas of concern to the students - is what I believe should comprise the content of an Annual Self-Evaluation that relates to teaching. The message that this sends to any department chair or dean is decidedly different from that which assumes one is being “rated.” First, it conveys the sense that one is open to - and does not just get defensive about - criticism of one’s teaching, something which even the best teachers receive (in the words of my current chair, if you only receive positive feedback from students, you must be doing something wrong). Second, it indicates that one is engaged in an active process of examining and thinking hard about what good teaching is -- one of the main things, in my opinion, that chairs and deans are actually looking for in assessing faculty teaching. Third, and related to this latter point, such qualitative discrimination between students’ critical comments tacitly reinforces the notion that the value of student feedback is its vantage point or perspective, which still requires the judgment and expertise of the professor in order to have legitimacy, e.g. student complaints about “too much reading” aren’t unequivocally valid. Indeed, even if one has overwhelmingly positive student feedback, I would argue that all of these points, which are crucial to convey to one’s supervisors, become undermined when one treats student evaluations as if they were merely a “score” or “rating” of one’s teaching performance.

In the end, I told my friend, this is about the difference between seeing student feedback as a teacher’s report card and seeing it as a self-improvement resource. The former outlook, in my opinion, creates teachers who seek only the approval of their students, while the latter cultivates a critical eye and treats student criticisms as genuinely helpful. It is important, in my view, to continually reemphasize the latter outlook on student evaluations, not only in one’s own mind, but in the minds of department chairs and administrators as well.

First Things First: Some Thoughts on Blessed John Henry Newman
By Stefan McDaniel on August 13, 2011

It is no small irony that we have come to summarize Cardinal Newman as the “Inspiration of Vatican II.” Whatever its nuances, Vatican II was unquestionably written in an exuberant allegro, with affirmation of Man and the works of Man as its resounding keynote. Newman was a humanist in the reputable sense of the word, but his appreciations of the natural man were always very strictly and clearly subordinate, in tone and substance alike, to the supernatural, that is, to the truths, duties, and privileges distinctive of revealed religion. This incessant emphasis was the whole power of the Oxford Movement, which came as a startling blow to the smirking face of Victorian England (the most self-satisfied community in history until Clinton’s America). The recent financial crisis may have dampened Fukuyama-style utopianism, but the age remains haughty, and it is high time for Christians to strike again.

A war on the pride of the world requires no novel venture of the spirit, but simply an affirmation of the absolute cultural priority of authentic religion. As Newman recognized, authentic religion fuses two qualities—beauty and severity—into a subtle synthesis easier to experience than to describe. “It is a paradox,” he says,  “how the good Christian should in all things be sorrowful yet always rejoicing, and dying yet living, and having nothing, yet possessing all things….We have not eyes keen enough to follow out the lines of God’s providence and will, which meet at length though at first sight they seem parallel.”

But though religion includes both beauty and severity, man is in very slight danger of preferring the latter to the former.  It is with good reason that Newman spends most of his ammunition on our softness and sentimentality, avidly persecuting all who would rise with Christ without dying with him. In all his religious writings, but especially in the Parochial and Plain Sermons, he shows his gifts as a religious psychologist, charting the movements of the quasi-Christian mind with a flat accuracy that is droll and sobering by turns. So intently does he pursue us along the paths of our evasions that he is sometimes driven to contortions suggestive of Augustine: “[O]ur Saviour says, ‘If ye know these things, happy are ye if ye do them.’ … When we read such passages…we pass over them as admitting them without dispute; and thus we contrive practically to forget them. Knowledge is nothing compared with doing; but the knowing that knowledge is nothing we make to be something, we make it count, and thus we cheat ourselves.”

Such analyses of human nature are never smug or morbid, but always aim to have the reader realize his state, his tendencies, his habits, his true beliefs and motives, and to notice how they differ from those of the ideal personality of the Primitive Christian, who was but a reiteration of Christ.

The Primitive Christian recognized duties that seemed almost as fanatical to comfortable Englishmen then as they do to comfortable Americans now: He knew to watch and fast.  We must remember that Newman was thoroughly neo-patristic. His commitment to beauty in liturgy and ornament, and objectivity and richness of doctrine was wholly bound up with a high, priestly ideal of Christian life modeled on the arduous way of the Apostles and Fathers.  It is “our duty to war against the flesh as they warred against it, that we may inherit the gifts of the spirit as they inherited them.”  Even Christians of impeccable orthodoxy and undeniable fervor tend severely to underrate the importance of mortification, but this is not an error to which Newman shows the slightest inclination. He would agree entirely with David Hart that “it takes formidable faith and devotion to resist the evils of one’s age, and it is to the history of Christian asceticism…that all Christians…should turn for guidance. To have no god but the God of Christ, after all, means today that we must endure the lenten privations of what is most certainly a dark age, and strive to resist the bland solace, inane charms, brute viciousness, and dazed passivity of post-Christian culture—all of which…enjoin us to believe in and adore ourselves.”

Indeed, Hart probably does not go far enough for Newman. Even at its most excellent, high civilization is a pageant of human power. It trumpets its triumphs and encourages us to accept its standards, priorities, and assurances. We easily confuse “decency,” cheerfulness, and industry with holiness, and thanks to the deceitfulness of riches we forget our poverty. From first to last Newman demanded that Christians be unseduced, that they make the Cross of Christ the measure of the world.  The Cross, that “bids us grieve for our sins in the midst of all that smiles  and glitters around us.” 

But what of the beauty of true religion? Newman did not ignore the artistic beauties inspired by devotion, and he was in the highest degree sensitive to the intellectual beauties of theology. When he speaks of the beauty of religion, however, he means the sweetness of life hidden in God.   Firm conviction and energetic obedience give a man “deep, silent, hidden peace, which the world sees not,—like some well in a retired and shady place, difficult of access.”  Ever the personalist, Newman is fascinated by the Christian personality thus shaped by grace.  Time and again, as though reciting the Divine Names, he lists its varied qualities and distinguishes them from their counterfeits. The Christian is like Love itself, “cheerful, easy, kind, gentle, courteous, candid, unassuming; has no pretence, no affectation, no ambition, no singularity; because he has neither hope nor fear about this world. He is serious, sober, discreet, grave, moderate, mild, with so little that is unusual or striking in his bearing, that he may easily be taken at first sight for an ordinary man.”

This, then, was Newman’s nakedly otherworldly ideal, an ideal which all Christians must allow to entrance and inflame them. It is because Newman in no small measure attained this standard, and not because he thought Christianity compatible with science and civil freedom, that the Catholic Church has raised him to its altars. Increasingly throughout his life, his deeds and sufferings, his goings out and his comings in, were ruled by the one thing necessary. A worldling might suppose this made for a mute ethereal figure who fled from bright lights and sudden motions to be alone with his recondite pleasures. But of course Newman’s life was rich and expansive beyond reckoning. Only Ian Ker’s remarkable biography has made some creditable attempt to measure this multifarious man: the lively and sarcastic correspondent; the movement organizer with the energy and craft of Napoleon; the significant poet and competent literary critic with an uncultivated aptitude for mathematics; the polemical genius, font of searing irony and grotesque images; the major theorist and teacher of liberal learning; the acute observer and prophetical analyst of classes, nations, movements. All these things and more, knit tightly into one improbable, imposing whole, made Newman, and thus there is no honest atheist (or even liberal Protestant) who will not own his purely worldly greatness.

Newman does not, then, teach pious mediocrity, or flight from society, or contempt of nature, but a lively perception, which nothing may bedim, that two loves built two cities.

New Natural Law Versus Historical and Revelation-Grounded Traditionalism
By Peter Haworth on August 13, 2011

Is practical reason the central feature of human ethics, or are historical realities like tradition and revelation also necessary (and possibly more important) for sound ethical understanding? Can practical reason reveal significant insights about the value of God, or are speculative reason and religious experience necessary premises for recognition of God's goodness? These questions are the crux of a long debate within philosophy and political theory. Recent interlocutors in the debate have included Germain Grisez, John Finnis, Joseph Boyle, Robert George, Christopher Tollefsen, Ralph McInerny, Alasdair MacIntyre, Jean Porter, D. Stephen Long, and Russell Hittinger. The lines of disagreement have again formed in a new symposium on theology and natural law now in progress in ANAMNESIS, A Journal for the Study of Tradition, Place, and ‘Things Divine.’

In continuity with the historical debate, the online symposium in ANAMNESIS discusses the varying views about the role of tradition, revelation, and reason as these themes relate to human ethics and knowledge of God. In his essay, "God, Religion, and the New Natural Law," R.J. Snell defends New Natural Law (NNL) views (e.g., those of Grisez, Finnis, and George) on human reason elucidating significant insights about God. Although not discussed in Snell’s essay, NNL theory also maintains that human ethics are largely reducible to second-order rules of practical reason that do not require revelation and tradition in order to be known by human subjects. This can be seen in Robert George's lucid essays on the topic. In contrast, Thaddeus Kozinski’s two essays, "Turning to an Empty Subject: A Response to R.J. Snell's God, Religion, and the New Natural Law" and "The Good, the Right, and Theology," employ a more traditionalist paradigm concerning the importance of revelation and tradition for understanding both human ethics and God.

According to Kozinski, NNL can only provide a partial account of human ethics due to its over-reliance on practical reason. Moreover, he criticizes the New Natural Lawyers (e.g., Grisez, Finnis, George, and Tollefsen) for the following failures: (1) they insufficiently employ insights (like those of Alasdair MacIntyre) about tradition and practice–i.e., their role in directing reason and subjectivity; (2) they insufficiently incorporate into their natural law theory Christian theology, which Kozinski, citing Jacques Maritain, holds to be essential for human ethics. Kozinski also criticizes the political philosophical implications of the New Natural Law, and one should peruse the whole article for the entirety of its wisdom–e.g., insights from Jean Porter, D. Stephen Long, and James Schall.

As readers will ascertain, this is an extremely important debate with major implications for the possibility of establishing publicly reasonable justifications for traditional morality. If the NNL paradigm fails to accomplish this, traditionally minded people in modern political societies might be logically compelled to either (1) settle for a modus vivendi variant of liberalism or (2) pursue creative legal solutions for establishing their own smaller traditionalist polities. In short, the implications are pivotal for the political life of American traditionalists.

Please visit ANAMNESIS and follow this debate. You can now leave comments for the authors and engage in critical discussion about the issues. Also, you can now follow ANAMNESIS on Facebook and Twitter.

Why Do Intellectuals Lean Left? It's Not Intellect
By Hyrum Lewis on August 15, 2011

It’s no secret that academics (especially in the social sciences and humanities) are generally far more leftist and statist in their politics than the average American. Why is this?  Why are those who deal in ideas more disposed to use the force of government to redistribute wealth, control commercial activity, oppose capitalism, expand the welfare state, and regulate our lives than are others?

This is a perennial and important question that I hope to address, but first I think it’s worth debunking the answer that left-wing intellectuals themselves believe: intellectuals are just smarter than others and therefore more likely to believe the right things.  They believe that intellectuals are left-wingers for the same reason that intellectuals are more likely to accept a correct scientific theory—it’s the truth and intellectuals, being smarter, are more equipped to arrive at the truth (they might even cite studies that show that, indeed, graduate education correlates strongly with leftist politics).

Besides being self-serving, this explanation fails for five main reasons that I can see (please feel free to chime in with any that I’ve missed).

1.     First, if leftist politics are simply a matter of intellectually arriving at the truth, then why do we find some of the most distinguished minds in any field rejecting leftism? (Samuel Huntington or James Q. Wilson in political science, Niall Ferguson or Walter McDougall in history, Richard Posner or Antonin Scalia in law, W.V.O. Quine or Robert Nozick in philosophy, etc.).  If high intelligence led directly to an arrival at the purely rational truths of left-wing politics, then we should expect to see the same consensus among intellectual elites on political questions that we see on scientific questions.  Such is not the case. No top-notch physicist rejects the theory of relativity, but plenty of top-notch political scientists reject Obamacare.

2.     Second, the education = leftism statistics they cite are misleading: while it’s true that the most educated in society tend leftward in their politics, it’s also true that the least educated tend strongly leftward as well.  The curve is not a simple upward slope; it rises at both ends.  So if more education means more leftist politics, why do we find that high school dropouts are more lefty in their politics than high school graduates?  Shouldn’t it be the other way around, with the enlightened high school graduate arriving at the left-wing truths more readily than the less enlightened dropout?

3.     Third, those of us who spend most of our time around left-wing intellectuals have noticed that the “reasons” they give for being lefties are not reasons at all, but assumptions based on emotion. Nobody accepts the theory of relativity for emotional reasons, and yet almost all intellectuals favor expanding the welfare state because they feel, on a gut level, that it’s “compassionate.”  They didn’t think their way to this conclusion; they felt their way to it as a moral duty.  Their leftism is a matter of the heart (emotion) rather than the head (intellect). 

4.     Fourth, if “smarter is leftier” were true then we would expect to see people becoming more left wing as they gained more experience and education in non-academic settings (e.g. greater life experience through age or contact with a wider variety of viewpoints).  In fact, research shows that the opposite occurs: people become more conservative as they get older, not less, and those who are exposed to a wider variety of outlooks are more likely to move from the left to the center in their politics, not vice versa (indeed, if academics really believed that “smarter is leftier” they wouldn’t be so paranoid about conservative speakers coming to campus because they would trust, with John Stuart Mill, that any challenges would only reinforce the “truth” of their position.  Instead they shout down, boycott, and protest those who might offer alternative viewpoints at their schools and resort to name-calling and clichés rather than reason when challenged—hardly the behavior of those who are confident that more experience and knowledge will only reinforce the “truths” that they have come to).

5.     Fifth and finally, if “smarter is leftier” were true, we would see professors themselves becoming more left-wing as they became more educated in the fields that relate to their political views.  This does not happen.  Take economics for example: academic economists are far more conservative on economic matters than their counterparts in history or sociology who know far less about economics. As intellectuals gain greater empirical knowledge of economic theories and data, they move to the right—opposite of what the “smarter is leftier” thesis would predict.

Beyond the above five reasons, it is also worth noting that intellectuals are oftentimes far less rational than the common person on the street.  For instance, the unscientific and demonstrably false notion that male/female differences are purely social constructs is much more likely to be accepted by an academic than by a truck driver. Marxism has been as soundly falsified as has the flat earth theory, and yet the only place this dangerous and outdated doctrine survives (besides lunatic states like North Korea) is on university campuses. The science writer Timothy Ferris, in his recent book, The Science of Liberty, defends and praises intellect against dogmatism, but also notes (puzzlingly) that those who are supposed to be most trained in clear thinking (university professors) are also the most susceptible to some of the most irrational viewpoints out there (e.g. postmodernism).

We all know that intellectuals were more likely than common people to be suckers for Stalinism, Maoism, and other leftist utopian promises (and even today, Fidel Castro and Hugo Chavez are only popular among intellectual elites), but John P. Diggins has shown that intellectuals were also much more likely than average Americans to praise Mussolini and Hitler. For all of their pretensions to “greater wisdom,” academics are often more likely than anyone else to embrace patent foolishness; indeed, some ideas are so ludicrous that only people in the ivory tower can believe them.  No wonder William F. Buckley Jr. famously preferred to be governed by random citizens from the phone book rather than by Harvard’s faculty.

In summary, what I have been trying to show above is that “higher intelligence” or “greater ability to arrive at the truth” is not the reason that academics lean left.  There may be a correlation between intelligence and leftism, but this is clearly not a causative relationship—there are other variables involved. What these variables are and what is causing academics to lean left is something I hope to address in the future.

James O'Keefe and Lying for a Good Cause
By Anonymous on August 17, 2011

Conservatives are singing the praises of James O’Keefe and his “undercover work” once again as a result of the publication of this NY Times Magazine profile.

I couldn’t help but laugh out loud to hear that a man whose entire career is founded upon lies calls his organization Project Veritas. You know, veritas, Latin for “truth.”

At one point in the profile O’Keefe is quoted saying the following: “A lot of people sit around discussing what to do. They draw up proposals, look for funding and nothing happens. I grab my camera and go do it.”

Acting in an upright, ethical manner may take longer to bear fruit. Yes, unethical and illegal behavior may bring about more tangible results in the short term. But in the long term? I fear that O’Keefe and others following his lead are performing a major disservice not only to the conservative movement, but to society as a whole.

There’s a reason, after all, that various groups ask members to keep e-mail conversations private and off the record. There’s a reason why both business and academic meetings will sometimes be held in private, where nothing will be reported to the press. Privacy is important. It gives us a certain amount of security to speak freely, to think freely, to try out new ideas—some of which may be stupid and which we wouldn’t want attached to our names. I don’t think we’d want a liberal O’Keefe posing as a hotel bellhop at an ISI Summer Institute, secretly filming the event, cutting and pasting the most embarrassing moments of ISI fellows misspeaking, and then publicizing that as representative of the group. I don’t think we want a liberal O’Keefe posing as a telephone repairman and entering our offices—for any purpose.

I don’t think we want to live in a world where we constantly have to be second-guessing everyone we meet as a potential undercover spy.

But regardless of the consequences, we shouldn’t be embracing dishonest methods of exposing our opponents, because of the harm it does to our own characters, for the vice it implants in our own souls. Veritas, truth-telling, honesty: these are virtues. Conservatives used to stand for virtue. Lies and dishonesty: these are vices. I’d rather not see our side embrace them. Even as means to good ends.

During this summer’s ISI Institute one of the participants asked Professor Budziszewski what he thought of recent attempts to justify lying using Thomistic natural law arguments that appealed to his four witnesses: Some people argued that, as a matter of deep conscience (witness number 1), lying in certain circumstances was obviously right. Some people argued that, as a matter of the human design after the fall (witness numbers 2 and 3), speech was no longer intended solely for truth-telling and that lying now had a certain designedness to it. And some argued that, as a matter of the consequences that would result from not lying (witness number 4), lying sometimes had to be done.

Professor Budziszewski had no patience for any of these arguments. He sided with the traditional Thomistic argument on the illicitness of all lies, called the recent attempts to defend lying from a Thomistic vantage point embarrassing, said that their authors were fudging, and added that he wished they would stop.

He’s right.

A Reflection on Teaching
By Angela Miceli on August 19, 2011

As I prepare myself to start a new semester of teaching American Government, I do what I always do: I read Fr. James Schall’s little essay “On Teaching Political Philosophy.”  (The essay appears in On the Unseriousness of Human Affairs, published by ISI in 2001 - you should definitely get this gem for yourself!) I am teaching at a local branch of a large state university, and many of my students are only taking American Government because they have to take it.  Most of them come into my classroom rolling their eyes and hoping for an easy ‘A’ while doing the bare minimum of work.  It is my task as the teacher to convince - or rather persuade - them to care and to teach them why political things are in fact very important.  Reading Fr. Schall’s little essay reminds me why I do what I do: teaching students to seek ‘what is,’ to put it in Fr. Schall’s words.

First, Schall reminds me what it means to be a teacher: “Teaching is not an exact science, something for which we can all be grateful.  Rather, teaching is an overflow of the truth of existing things we have affirmed in our souls” (p. 111).  Teaching the Declaration of Independence, the Constitution, and the principles of republicanism and federalism leaves ample room for questions of political philosophy - questions about the nature of the human person qua political and rational creature.  Even teaching something as seemingly technical as “Intro to American Government,” we teachers are still responsible for letting the students know that truth is in fact real and they should be seeking it.  But this is such a challenge! I cannot tell you how many of my students walk into my class as confirmed relativists as freshmen.  Most of them, of course, have no idea why.  I love these students.  They are searching for something more - searching in the same way I searched as a freshman and continue to search in my profession.  Telling students that knowing themselves, indeed governing themselves, is the true essence of their freedom and liberty and the foundation of our government shocks them.  They believe that I am a radical.  Schall comments on this in his essay: “These same [relativist] students are quite surprised, even sometimes pleased [and some of them, not so pleased], to learn that [the] purpose of thinking is not just thinking but thinking the truth.  They are relieved to be told, finally, that the purpose of truth is that we should live according to it, that we will not be happy unless we ‘know ourselves’” (p. 117).

And yet, this presents another challenge, something that Lee Trepanier brought up in a post from October 2010.  Professor Trepanier asks in this post: “As professors, do we profess a certain mode of inquiry or discovery or do we profess a certain philosophical (and maybe even theological) commitment?”  This is a critical question for the teacher.  One thing I find is that students always want to know my opinion on everything.  Perhaps they are used to the ideological professor telling them the ‘correct’ way of thinking.  Many students have told me that if they speak up on certain taboo issues (like being pro-life), their professors actually give them lower grades than if they just tout the party line.  I have no desire to impose my opinions or views on my students, but my goal is to teach them how to ask the right questions and point them to what might provide them some answers.  I want to teach them what I am always learning - how to be truly free.  This is the meaning of the liberal arts.

It is a delicate balance, I think, this professing ‘a mode of inquiry or discovery’ to the student versus professing ‘a certain philosophical commitment.’  To be sure, each professor holds some kind of philosophical commitment.  But as professors, we are supposed to profess what is true.  What I believe to be imperative is for us to remember, as Fr. Schall says in his essay, “If I know a truth, it is not because I believe I made it to be true, but because I discovered it in something I did not make, in a reality which was there before me. If my views are true, they are potentially everyone’s views, but only if we all proceed through the process by which they can be seen to be true” (p. 117). 
