Getting an Academic Job
By Anonymous

Although I am no expert on getting a job, I have some personal and professional experience that might be of value to some readers of the ISI blog.

You should know that, before becoming an academic (I received my Ph.D. only in 2009), I earned a master's degree in human resources management from the Institute for Labor and Industrial Relations at the University of Illinois at Urbana-Champaign. After that I worked for over a decade with three organizations: M&I Data Services, AuditForce, and Cardinal Stritch University. At each organization, I participated in both hiring and firing employees. I joined the faculty at Carthage College in 2007 and have continued to serve on hiring committees for faculty positions, both tenure-track and contract hires.

Rather than rehash the dismal data about the job market, I will offer several tips and strategies you might consider to help separate you from the competition. My comments on this blog represent the rough draft of a larger effort I am working on about positioning in an extremely competitive academic job market. My comments here relate mostly to what you should do when submitting applications for jobs (I plan to make subsequent posts about interviews).

First, avoid appearing silly. I know this sounds a bit trite, but it matters. I have seen a disturbing number of applications with grotesque typographical errors and slapdash phrasing. For instance, one candidate for an academic job boasted that he had “developed the ability to speak orally.” (I will refrain from asking how else one might learn to speak.) Another candidate used the date 2110 for his cover letter. Our committee considered sending him a rejection letter, but, since he was obviously from the future, we decided that he had already learned he didn’t get the job. The point here is that you must avoid silly mistakes in important communications. We all make mistakes, but your application is not the place. Go above and beyond in making your materials shine. Hiring committees are an unforgiving lot and will look for any excuse to rule you out as a candidate. Don’t give them the ammunition.

Do exactly what the job advertisement asks you to do. This means that you should do no more and no less than what the employer wants in an application. Do not try to be creative or to cut corners. For instance, don’t provide a separate discussion of your research unless it is required in the guidelines. Similarly, if the application guidelines ask for a cover letter AND a philosophy of teaching statement, do not fold the two components into a single document. Write a separate cover letter and a more detailed philosophy of teaching statement. If you have doubts about what is required, do not make assumptions. Even if the contact person is the chair of a department, it is better to work up the courage to drop an e-mail or make a phone call than it is to guess at what is wanted.

Start drafting cover letters and philosophy of teaching statements now—even if you don't have a job in mind. I believe that it is better to re-craft something than it is to start from scratch when you have a deadline approaching. Spend time now, when there is no pressure, thinking about what good teaching means to you. Ask your mentors what they think about good teaching, recall what you remember as good teaching, search the web for ideas, look at your peers' philosophy of teaching statements, and, above all, try to determine what passes for good teaching at the institution to which you are applying (again, do your homework). Have others read your letters and statements before sending them off.

In terms of writing good cover letters, I have strong opinions. Others will certainly disagree, and I respect that. However, I believe that good cover letters follow a fairly set formula. The letter should:

1) be between 300 and 500 words;

2) indicate clearly to which position you are applying;

3) contain a brief, but confident, one-paragraph discussion of your qualitative abilities (e.g., organizational skills, excellent writing skills, ability to teach a variety of courses);

4) discuss in one to two paragraphs your objective qualifications (e.g., Ph.D. earned in 2005, dissertation on “Pet Keeping Habits of the Parisian Bourgeoisie,” award-winning teaching, service as adviser to the “Stop, Drop, and Roll Fire Safety Club”); and

5) conclude with a very brief paragraph about how you are certain that you can contribute quality scholarship, meaningful advising, and excellent teaching to the given department at whatever college. Also mention that you look forward to the opportunity to discuss your qualifications with them in more detail.

Good philosophy of teaching statements are less formulaic and require more thought than cover letters. Of the excellent statements that I have seen, no two are alike. However, I have noticed a few common themes. First, they are rarely longer than three pages (in fact, most are just about two pages). Next, good statements are well structured. In other words, they have introductory paragraphs that tell the reader what the body of the statement will say and concluding paragraphs that sum up the statement. Finally, good statements discuss not just teaching, but also research and advising (and the relationship among teaching, advising, and research). You will notice that my comments lack specificity about content. This is because I believe that teaching statements are idiosyncratic. In essence, they are a chance for you to express your thoughtfulness and individuality. As such, philosophy of teaching statements require contemplation and, above all, repeated revision.

This leads to my next major point. Tailor each cover letter and philosophy of teaching statement to each institution to which you apply. Although it is fine to work from templates, you should craft every application as if the institution to which you are applying is the ONLY one where you could EVER see yourself working. This might seem overstated, but the hiring committee wants to believe that the person it hires is destined for the position. Hiring committees look not only for excellence in a candidate, but also for a good fit.

In order to tailor your materials, you must do your homework. Before you send an application to a prospective employer, you should know as much as you can about the position to which you are applying. The web is the most obvious place to start. Look at all the programs and departments that might relate to your position. In addition, look up the biographies, bibliographies, and areas of expertise of anyone with whom you might work. You need to differentiate yourself from the faculty who work at your target institution. They don't want to hire copies of themselves. Some overlap between you and current faculty won't hurt. Just keep in mind that your potential employers will want someone who can expand their department's offerings, not repeat them. Doing this initial spadework can help you to identify honest ways in which you might make a positive contribution.

Don't be afraid to make calls. Let your mentors know where you are applying. Ask them if they have contacts. Ask your friends and peers for help. Do not be shy! In addition, start doing your homework BEFORE you get your assignment. Those of you who are still in graduate school especially should identify institutions where you would like to work. Investigate how many faculty the department has, how many are tenured and how many on contract, what hiring at the institution has looked like in the last few years, whether the institution is financially secure, and so on. Even before you are aware of a job opportunity, get in contact with department chairs at those institutions. Write them fan mail about their scholarship. Ask them (by e-mail) questions about their schools, about teaching at the type of institution where they work, and about the job market. Impress them (or at least start developing networking skills) before jobs become available.

The strategies listed above are low-risk and low-stress, yet they have the potential to yield substantial benefits. I am not making this up. I have used these strategies myself, and I know others who have, when searching for both professional and academic jobs.

Higher Law Constitutionalism Revisited
By Paul DeHart

A blog post published several months ago on this site contains an interesting discussion of the relation of the Constitution to majoritarianism. Without engaging that post directly, I would like to take up the issue of constitutional majoritarianism here, for the thesis is a controversial one.

Is it correct to think of the Constitution as the institutionalization of popular rule as determined by the majority? This is a question to which some constitutional theorists of the first rank have given a negative answer. Daniel Walker Howe, for instance, sees the Constitution as an institutional design for promoting rule by a natural aristocracy of wisdom and virtue (as opposed to a landed one). And, to be sure, one can find endorsements of rule by a natural aristocracy in the writings of founders such as Jefferson and Adams. Howe believes he finds this way of thinking manifest also in key passages of The Federalist (perhaps especially in the argument of Federalist 10).

I have the utmost regard for Prof. Howe (his article and subsequent chapter on the political psychology of the Federalist were groundbreaking and remain of the utmost importance for any serious student of Publius). Nevertheless, I agree with the prior blog post that the Constitution does indeed seek to instantiate majority rule. But I think this point must be made with great qualification. Recall, for instance, Madison's argument in his "Vices of the Political System of the United States." There Madison argues that the principal vice of the political system of the United States is the injustice of the laws of the several states. There are two main reasons for the injustice of the laws enacted by the state legislatures. The first has to do with the character or virtue of most of those elected to office. Madison says individuals run for office for three reasons: ambition, interest, and public good. Unfortunately, only a few of those who run for office are animated by the last motive. And Madison notes that, once elected, those animated by the vicious motives of ambition and interest are most effective in establishing legislative coalitions. But this is not the main reason for the injustice of the laws of the several states. The main reason is the injustice of popular majorities that drive the legislatures to adopt unjust policies. So the principal vice of the political system of the United States was the consequence of the unrestrained majoritarianism that obtained in the several states.

Given that we are to have a republic rather than a monarchy, aristocracy, or even a mixed regime (in the Aristotelian sense--see Federalist 14), framers such as Madison sought to embody institutionally the maxim that in all free governments the deliberate will of the people will prevail over the will of their rulers. But the deliberate will of the people is distinct from whatever the people (or a majority) happens to will now. Majorities animated by immediate, narrow interests or by the passion of the moment are, by institutional design, resisted and suppressed. Only long lasting and broadly distributed majorities, animated by justice, should prevail. So Publius cannot be described as a mere majoritarian (or, in other words, as an unconstrained majoritarian). He favors certain majorities and rejects others--and such a distinction among exercises of majority will, as I argue in Uncovering the Constitution's Moral Design, cannot be made on the basis of majority will (or on the basis of republicanism, majoritarianism, a theory of consent, etc.).   Distinctions among exercises of majority will are only intelligible to the extent that the distinction is made on the basis of a standard that is not majoritarian--by a standard that transcends majority will. Consequently, the fact that Publius distinguishes exercises of popular will, favoring some and rejecting others, indicates that majority rule is not the end or goal of his account of politics. Rather, majority rule is but an instrument, valuable only because (or inasmuch as) it serves a higher end. What is that higher end? Federalist 51 comes to mind: "Justice is the end of government. It is the end of civil society. It has ever been, and ever will be pursued, until it be obtained, or until liberty be lost in the pursuit."  In that same essay Madison praises the extended sphere the new Constitution will put into effect just because, according to him, in the extended republic of the United States majorities will seldom form on principles other than those of justice and the general good.

In sum, the problem with characterizing the Constitution as majoritarian is that doing so makes rule by the majority appear to be the end or goal of the republican form. If that were so, the constrained majoritarianism of Publius would be nonsense indeed.

Renewing the Humanities
By Stefan McDaniel

Debate about liberal education has long been central to the culture wars. Most of us are familiar (perhaps wearily familiar) with the main points of disagreement. Should colleges and universities teach truth, merely sponsor the search for truth, or proclaim truth a delusion? Is it possible or desirable to identify a canon of “great books,” and what should it contain? Is it possible or desirable to practice neutrality among cultures, or neutrality among a chosen set of cultures? What (if any) restrictions on academic freedom are justifiable? To what (if any) religious or ethical standards may the opinions and behavior of students or faculty be held?

While the belligerents have been slinging heated tracts at one another, their mutual enemies have quietly gained possession of the field. Traditionalists and postmodernists alike generally agree that liberal education should be liberal (i.e., worth doing for its own sake), but the contemporary university is increasingly ordered toward creating and credentialing efficient producers for the market and effective managers for the state. Furthermore, the humanities, the most distinctively liberal disciplines, have well-earned reputations as refuges of indolence and fraud. Though our disagreements are far from trivial, all of us seriously concerned with liberal education should collaborate to promote its autonomy and rigor.

Let us begin with a notion glamorized (though not invented) during the Renaissance: Man (i.e., humankind) is worth studying. Man in all his aspects, but especially insofar as he appears to be distinctive, that is to say insofar as he is a cultural being, one who applies historical, religious, philosophical, and ethical categories to his own experience and activity.

If that is granted, we must ask whether there are any disciplines universally fundamental to the intelligent, responsible, and productive study of human culture. I believe there are at least two such disciplines—language and history.

All committed humanists must study language, broadly construed to include semiotics in general, but with natural languages as the indispensable core. Culturally speaking, man lives by signs alone, so semiotics stands in much the same relationship to the diverse branches of the humanities that mathematics does to all departments of engineering. The humanities, especially at the undergraduate level, can increase their respectability and rigor by mandating, whatever the specific field of study, high competence in several natural languages and mastery of the fundamentals of syntax, semantics, and the general theory of signs. 

Only slightly less important than language is history, which the great historian John Lukacs defines, with lapidary exactness, as the “remembered past.” Through the study of the remembered past (which illuminates and is illuminated by study of the languages by which it is transmitted) humanists learn how human beings construe their experience: the categories, values, connections, and problems perceived in the natural and social world that condition human thought and action.

Except where a humanist’s specialty obviously requires a different emphasis, there are good reasons for Americans strongly to prefer the history of America and Europe. People are, generally speaking, best equipped to interpret and analyze the historical materials of their own culture. Still, it is not of decisive importance whose history is studied, so long as the habit of historical thinking is firmly ingrained. We obviously cannot agree on the proper content of a shared deposit of historical knowledge, but perhaps we can agree that an educated public should be acutely aware of the value of detailed understanding of history and eager to remedy historical ignorance. Furthermore, although there is little hope of consensus on any important aspect of history, the practice of scrutinizing and comparing the methods and materials used by different historians should create common criteria for identifying honest and intelligent interpretation.  

The subjects generally classed among the humanities are usually studied most profitably when approached with the skill-set given by study of language, and are certainly seen most comprehensively within the context of history. For, in various historical circumstances and using various special modes of expression, man has asked and ventured answers to certain fundamental questions, “What is the world like, what has happened so far, and what should we do now?” and thence came philosophy (including the natural and formal sciences), religion, economics, poetry, etc.

The approach I have suggested would improve the general quality and reputation of liberal learning and, since studying language and history develops fungible skills, provide many other benefits. There’s even something in it for the utilitarians, since, as David Goldman of First Things once argued, one of the United States’ most troublesome liabilities in international relations is poor intelligence. For this, Goldman suggests, we must blame a culture that produces proportionately few people with significant knowledge of foreign languages and folkways. American colleges and universities are culturally powerful institutions. If the policies and ethos of the academy demanded, even at the undergraduate level, significant knowledge of foreign languages and the habit of historical thinking (which implies the habit of cultural analysis), the United States would have a much larger pool of qualified candidates for work in diplomacy and intelligence.

Reductionism in Political Science
By Ryan T. Anderson

During the Summer Institute this year we heard eloquent presentations on natural law from Hadley Arkes, J. Budziszewski, and Robert George. A presentation by John Mueller, on his new book Redeeming Economics and how the founders of modern economics got the discipline off on the wrong foot, got me thinking about how various theories of natural law--and of political science in general--likewise get off on the wrong foot.

So, what went wrong? Joseph Schumpeter, the great economic historian of the 20th century, wrote of Adam Smith in his History of Economic Analysis (1954) that “the Wealth of Nations does not contain a single analytic idea, principle or method that was entirely new in 1776.” John Mueller goes a step further to say not only that Smith does not add anything to economics, but that his theory actually leaves out both distribution and utility.

The elimination of these two aspects should come as no surprise to anyone familiar with the debates in philosophy and political theory over the so-called “modernity project.” Consider how a Scholastic such as Aquinas understood various sciences. At the heart of Aquinas’s social thinking was a recognition that order exists on four irreducible planes, distinguished by how they relate to our mind and will: the order that exists in nature, independent of human thought and choice; the order that we bring into our thinking itself; the order that we bring into our thinking about what to do; and the order that we bring into our thinking about how to do it. These four orders give rise to four irreducible sciences: first, metaphysics and natural science to study what is, what exists independently of human choice; second, logic to study the relations of concepts; third, ethics and practical philosophy to study what is to be, the ends of human choice; and finally, the applied arts and sciences to study how to achieve those ends, the means.

Yet much of modern social thinking rests on the explicit rejection of this third order, and thus of this third science. Machiavelli announced the ambitions of this new political science in Chapter 15 of The Prince (1513): “Many have imagined republics and principalities that have never been seen or known to exist in truth; for it is so far from how one lives to how one should live that he who lets go of what is done for what should be done learns his ruin rather than his preservation.” In a thinly veiled assault on Plato’s Republic and Augustine’s City of God as the imagined republics and principalities that argued for how one should live, Machiavelli set out to reveal the “effectual truth” of how successful people do live. Implicit here was a reduction of political thought to the first and fourth orders. Investigate how people are (first order), and then reason about the means (fourth order), in this case, to staying in power, without any concern for how people ought to be, quite apart from how they might serve our interests (third order).

Thomas Hobbes and David Hume make this reduction even more explicit. In Leviathan (1651), Hobbes writes that “the thoughts are to the desires, as scouts, and spies, to range abroad, and find the way to the things desired.” And in A Treatise of Human Nature (1739), Hume argues that “reason is, and ought only to be the slave of the passions, and can never pretend to any other office than to serve and obey them.” The third-order science that considered which ends one should act for has been eliminated. It is now for political science to study man’s passions as they are given (first order), and then to devise the best way to secure those ends (fourth order). If this is how one understands human action, then, of course, distribution (deciding which people should be the ends of one’s economic acts) will be eliminated from consideration.

Yet Adam Smith went even further. Smith was, of course, influenced by David Hume and Francis Hutcheson (the famous Scottish Enlightenment moral sense/sentiment theorists). Mueller argues that Smith, in his quest for a Newtonian science of economics and under the influence of Stoic pantheism, went further than Hume to deny reason a role in selecting either the ends or the means. Smith thought that the sentiments fully determine human action, so that the only variables to explain are production and exchange: a streamlined science with fewer moving parts. (And, it should be noted, this theory of production and exchange logically leads to Karl Marx’s criticisms of modern capitalism, based on Smith’s faulty “labor theory of value.”)

With time, economists came to understand just how thin this theory really was and reintroduced the concept of consumption based on utility (helpfully refining the idea into one of marginal utility). This reintroduction of the Augustinian understanding of consumption based on utility corrected for the problems in both Smith and Marx. Rather than viewing the worth of objects for human consumption as intrinsic to the object (or the labor that produced the object), the theory of utility saw that an actor’s preference for an object, based in the utility it brought, explained the value in and the choice for the object. But this still left unexplained the choice of which person(s)—self, other(s)—would obtain the object. That is, it left out distribution. And rediscovering that element of economic science is at the heart of Redeeming Economics.  

Generic Education? Headed for Bankruptcy.
By Gabriel Martinez

High-school students spend most of their waking hours trying to fit in, to blend, to be one of the crowd. And yet this is the end of the season in which students try to "differentiate" themselves: the college application season.

Interestingly, universities and colleges have the same inner conflict. They bend over backwards to fit in, to blend, to match what everyone else is doing. And then they turn around and say to potential students, "we're exceptional, look at us ... but don't freak out, we're like every other school ... except that we're special ... but in a very normal sort of way." We're bold yet bland.

It turns out that "doing what everyone else is doing" is bad business. (It's also bad for your health.) Not long ago, the Chronicle of Higher Education ran an article that concluded: "you've got to differentiate yourself. [If everyone follows] the solution everyone has come up with, and at some point the market's going to get saturated."

From a business standpoint, a university cannot survive if its strategy is to imitate what everyone else is doing (check out this classic article on business strategy by Michael Porter). Big, medium, and small colleges across the country offer an increasingly shapeless, generic education. The same classes, the same bland content, the same weak requirements, the same emphasis on entertainment. Why go someplace and spend $30K-$40K when you can get exactly the same product from home, online, at a fraction of the cost?

This is good news for schools that are already very differentiated.  I’m thinking particularly of schools with a strong commitment to liberal education and where faith is lived intensely.  Does your school have a product that other colleges cannot imitate (without tremendous cost)?  Do you have a high-quality package that cannot be copied by an online program?  Then you have a competitive advantage precisely because you are targeting an under-served market and not an over-crowded one.

That’s for the supply side. On the demand side: over the next 10 years or so, there will be fewer US residents who are of college age. Not only will the fraction of college-age people in the United States fall: the US will actually lose over a million 18-22 year-olds by 2020. The situation is particularly bad in the Midwest and the Northeast.

The picture is brighter in the West and in the South, and even more so in Florida (you can find the data here).  The share of the population that is college-age is relatively low in those regions (which explains why, historically, there have been fewer college graduates per capita in those states than elsewhere).  But unlike the country as a whole, they are expected to gain 18-22 year olds over 2010-2020.  Between 2020 and 2030, the college-age population in Florida is projected to grow by nearly 30%.

The market for generic education is not only over-crowded (and increasingly so), but shrinking.  The market for the specialized product that many schools (including mine) offer, in the location in which we offer it, is under-served and expanding.

Five Books Every American Should Read: The Uncommon Reader, Number Five
By Jessica Hooten

How do you choose only five books that every American should read? At least 100 would be a beginning, but five! On Facebook occasionally a list gets passed around of the books everyone should have read, and people highlight how many they’ve actually read. The list is unbelievably arbitrary and includes the Harry Potter series (which I love but would never include on any recommended reading list). As a substitute for this list, I’ve posted Mortimer Adler’s list from How to Read a Book, with the addition of a few female writers whom the unintentional misogynist neglected or preceded, such as Flannery O’Connor, Virginia Woolf, and Marilynne Robinson. There are just so many books to read!

To finish this list of the five books every American should read, I have the problem of too many choices. People have sent me their picks: To Kill a Mockingbird, The Great Gatsby, Catcher in the Rye… But I only get to choose one. Because I have to limit myself to five, I’ve been narrowing my choices to American authors and novels. I haven’t even pondered the canonically great books such as The Divine Comedy or the American Modern Library’s 20th century bestsellers such as The Brothers Karamazov. Nor have I ventured into poetry, prose, or drama recommendations, though Dickinson, Emerson, and Williams would make that list.  Instead, I have limited myself, and I have progressed chronologically from nineteenth century to now.

Yet, with one choice left, I wanted a work that would somehow encapsulate all other works, that would be a starting point for those who have yet to understand the importance of reading for the development of character, and, let’s be realistic, a work that was short enough for the average American reader.

So, to echo Monty Python, “and now, for something completely different”: my final recommendation is Alan Bennett’s The Uncommon Reader, a recent novella by the British playwright, published (on my birthday, of all irrelevant coincidences) in London. The book is hilarious and uplifting and a wonderful instigator for reading. With or without knowing it (not being a Bennett scholar, I will not make statements of authorial intention), Bennett has embodied C.S. Lewis’s An Experiment in Criticism in this playful tale.

The story follows the new obsession of Queen Elizabeth II with reading. Her reading causes quite a commotion among her cabinet members, family, and her subjects. At the beginning of the tale, she is not much of a reader, for “liking books was something she left to other people.” However, reading transforms her, making the monarchist into somewhat of a…democrat. For any more information than that, I recommend reading the book—it’s too short to give too much away.

This past year I attended a presentation by Yann Martel, the author of Life of Pi (another book I’d highly recommend), who admitted that he sends book recommendations to the Prime Minister of Canada for the leader’s edification. At first I laughed—how funny for an author to recommend novels to his political representative! Then I reconsidered—how audacious, how fitting, how absurd. I’ve concluded that I side with Martel, who is enacting what Bennett satirizes in The Uncommon Reader, the need for those in power to be reading books. However, I think it substantially more important for the citizens to be reading literature, especially in a country where the government should be of, by, and for the people.

In 1918 Ezra Pound wrote a poem, “Cantico del Sole,” in which he wonders what “America would be like/ If the Classics had a wide circulation.” Nearly a century later, I think the question is: what would America be like if any books had a wide circulation? What would America be like if teenage boys enjoyed reading The Odyssey or mothers spent their free time reading Augustine’s Confessions? What would America be like if we were all common readers?

Saving John Locke...and America
By Anonymous

[Please don't identify the author in any reply.]

     Critics on the left--and even many on the right--have rejected Locke’s political philosophy as too individualistic, as insufficiently concerned with man’s social character. They have also argued that the state of nature is historically inaccurate. Man, they claim, is a political animal and government is therefore not formed by means of a social contract. These criticisms threaten the foundations of the American political order, since if we reject Locke we must also reject the Lockean second paragraph of the Declaration of Independence, and with it the doctrine of modern natural rights accepted by the founders and defended by means of the Constitution of 1787.

     In fact, the critics have misread Locke. His political philosophy does not rely on an historical state of nature. Let us recall that Locke’s “Second Treatise” (ST) is written in the wake of his “First Treatise” (FT), which argues against deriving monarchical authority on the basis of its alleged origins in Adam. In general, Locke does not believe that present rule can be justified on the basis of events in the distant past. He tells us this rather explicitly in the “Second Treatise,” writing that “at best an argument from what has been, to what should of right be, has no great force” (s. 103). Men need not be in the state of nature to make their consent valid. Indeed, for the men of his day (and ours), “tacit consent” (ST, s. 119) while under a government is sufficient for Locke.

     The critics are also wrong to think that Locke is radically individualistic. Locke does not believe that men can be properly understood as men apart from society. It is for this reason that he presents men in the state of nature as social and rationally cooperative (ST, ss. 6, 15, 19). Moreover, Locke believes that men can fully flourish as individuals only if they are embedded in culturally rich, socially developed, and very civilized surroundings. Consider that although he presents Americans (i.e. Indians) as living in a society, he rejects their way of life as insufficiently rational (ST, ss. 41, 92; FT, ss. 57-58). Therefore, a fortiori, he does not think atomized men can have good lives.

     Men must live in society since all significant human goods, and all intellectual goods, are produced by men in rational and social cooperation with each other. These goods are not produced by government. Government’s task is merely to protect society from force and fraud so that men in society can actively pursue the good. As a result, government has a claim on us only to the extent that it seeks to protect our rights.

     Locke’s use of state of nature theory is therefore not literal. It is an analytical tool meant to determine the proper limits of government. It is possible to imagine a human society without government because such a society would be missing only its artificial protective shell. In the absence of this protection, it could function and prosper provided men did not violate each other’s rights. But since men are in fact untrustworthy, we must have government. How much government? Just enough to solve the problem of no government: the protection of natural rights.

     By limiting government, Locke refocuses our primary political concerns on strengthening, deepening, and enriching civil society. In a Lockean society, we have social obligations to our parents, who nurtured us, and, in general, to anyone who benefits us (ST, ss. 66-70). These social obligations and our rational need for each other are rich sources of cooperation. Lockean man is not an atomized individual; he is instead a social creature with deep connections to his community.

     There is much more to say—about consent, private property, democracy, and a host of other topics. But let me close here by noting the (relative) absence of coercion in Locke’s scheme regarding the fulfillment of social obligations. What sustains such a society, if not the government? What keeps it from drifting into vice? The unsettling answer is: civil society. This answer is unsettling because it makes the good life so contingent and uncertain. And yet, on the issue of the centrality of civil society, consider the evidence in Locke’s favor. Throughout history, most societies have been unfavorable to human liberty. Even today, most of the world's people do not have the habits and ideas necessary for them to live freely. If this is to change, it must be by means of a focus on individual habits and ideas, and the slow construction of civil society. Constitutions and laws, no matter how sound on paper, are merely futile, top-down impositions and will not bear fruit if the soil--the habits and ideas of the people, and the elements of civil society--has not been prepared. Just as a society cannot be reshaped in top-down fashion so as to be Lockean, neither can a Lockean society be sustained in top-down fashion. The hard, Lockean truth is that America’s habits of liberty--her virtues--must be sustained by individual citizens and by voluntary, sub-governmental structures. Those who seek a more centralized, planned, and certain foundation for human virtue must contend, as Locke did, with the sad history of human efforts to promote virtue coercively.

America's Approach to Global Warming...
By Hyrum Lewis

The debate over global warming policy in America is yet another example of how polarization and binary thinking have distorted our understanding of a complex issue. On the one hand, you have the “alarmists” like Al Gore claiming that we must take immediate, drastic action to halt carbon emissions or the destruction of the planet and civilization as we know it will ensue. On the other hand you have the “denialists” who claim that global warming is a hoax perpetrated by scheming scientists who want to take over our lives. I propose (as always) that seeing this debate in terms of “two sides” is terribly distorting and prevents the better understanding we seek. The reality is more nuanced than either side wants to admit. Instead of saying “yes or no” to the denialists or the alarmists, I think we need to frame the debate in terms of four questions:

1. Is global warming happening?
2. If so, is that a bad thing?
3. Is there anything we can do about it?
4. Would the benefits of doing something about it outweigh the costs?

Note that the alarmist position requires us to answer a simple “yes” to all four questions, and the denialist position requires us to answer a simple “no” to all four, but each question demands far more analysis than a simple yes or no. Let’s turn to each:

First question: is global warming happening?
If this question is asking whether the earth has gotten warmer over the last century, then the answer is unequivocally and scientifically “yes.” However, we must consider a couple of further points before answering glibly. First, just because warming happened in the recent past does not mean for sure that it’s still “happening” (present tense), any more than the fact that the stock market has risen over the past 100 years means the market is “going up” (we can assume it is going up, long term, since the past tends to predict the future, but we can’t be sure). Second, the time horizon of one century may be arbitrary. For instance, this year was colder than last year, but that does not mean the world is “getting colder”; likewise, the globe is now colder than it was 1,000 years ago (when Greenland was, indeed, green), so in that larger macro sense, the earth is “getting colder.” Why are we using a century as our measurement to determine if the earth is “getting warmer” instead of, say, a millennium or a decade?

Second question: Is this a bad thing?
For all we know, a warmer globe might be better, and a number of scientists have actually made this case (perhaps I’m biased, though, living in a quasi-arctic climate with seven-month winters). Perhaps a warming globe would simply cause a demographic shift towards the poles (opening up much of Northern Canada to settlement, for instance, or causing people to move from Southern Arizona to Montana). A warmer globe very well might open millions of acres of land to cultivation and increase agricultural productivity in many places. I don’t think the scientists know, for sure, whether a warmer globe would necessarily be a worse globe. They have their guesses and estimates, but these are highly speculative conjectures, not hard conclusions like their climate data. (I don’t understand why Al Gore’s apocalyptic tone about rising oceans can’t be met with a suggestion of adapting by building dikes or elevating cities—something humans have done countless times in our history.)

Third question: Is there anything we can do about it?
Global climate change was happening long before humans came on the scene and will likely be happening long after we are extinct. While the evidence looks pretty clear that humans can and do affect the climate with our industrial by-products, there are nonetheless forces beyond us causing the climate to change (to the warmer or cooler) without us. It may very well be that even if we de-industrialized and drastically reduced all of our emissions, the planet would nonetheless continue to warm (or cool) regardless of our actions. After all, Greenland was green before it was white and long before humans started pumping out CO2.

Fourth Question: Do the benefits of doing something about global warming outweigh the costs?
Even if the alarmists are right about the terrible consequences of global warming, it’s entirely possible that the consequences of the de-industrialization required to stop the warming would be even more terrible. The developing world would be especially hard hit, as most of the growth that has lifted millions out of poverty would be halted and perhaps reversed. Are we sure that we would prefer de-industrialization-induced mass starvation to rising sea levels? I’m not so sure, and I don’t think that scientists, for all of their theories of global warming, can answer this question conclusively either way.

So, here’s to Americans and our representatives addressing these questions with more humility and nuance than the simplistic views of either the denialists or alarmists allow. As we move forward, let’s use science as our guide to determine a sensible and moderate political course in dealing with this complex and important issue.

Five Books Every American Should Read: Beloved, Number Four
By Jessica Hooten

I put off reading Toni Morrison’s Beloved for a long time, mostly because I had been horrified by her novel Song of Solomon when I was fourteen (not a book geared for teenagers: among other terrors, it includes a woman attempting an abortion with a knitting needle). For my first upper-level English course, I assigned the book as a way to force myself to read it. I had to reread the first chapter at least six or seven times to catch on to the rhythm; it reminded me of the first time I encountered War and Peace and could not latch on to the names, vocabulary, or style of Tolstoy.

Morrison’s work is often called lyrical; it reads like poetry. The novel begins with confusion: fragments, ambiguous nouns, contrasting images, and repetitions: “124 was spiteful. Full of a baby’s venom. The women in the house knew it and so did the children.” The sounds of the words play off one another like a building beat, but it’s unfamiliar and initially thwarting. After a few efforts reading it, I chose to purchase the audiobook read by Morrison. She reads the novel as though singing it, and I highly recommend reading and listening together to understand the lyricism that critics rave about.

Recently a student came over to my house to sit out on the porch and discuss her summer reading list. Her boyfriend is an African-American student who is attempting to teach her about his culture and asked her to read Beloved; she was rightly haunted by it. Literally, she confessed to having nightmares about the book. The plot itself is full of ghosts, including the title character Beloved. Although the story occurs in post-Civil War Ohio, the main characters bring the antebellum South to the forefront, remembering the suffering they endured under slavery. Their memories make the tales of Frederick Douglass and Harriet Jacobs look Disney-worthy.

My favorite character is dead before the end of the first paragraph, but she is resurrected a quarter of the way through in a memory and preaches a sermon that outdoes Melville’s or Hawthorne’s preachers, at least aesthetically. As she sits on a large (literal and figurative) rock, Baby Suggs shouts to a congregation of former slaves in a clearing in the woods: “Let the children come;” “Let your mothers hear you laugh;” “Let the grown men come.” Her authority is resolute, and the outcast children of God respond. She preaches the immanence of God found in their flesh: “Yonder they do not love your flesh. …You got to love it. This is flesh I’m talking about here. Flesh that needs to be loved.” It’s beautiful how she claims the right to be loved for a people tragically and wrongly despised. The passage should be read aloud, preached as a healing word to those who have been persecuted.

In A.S. Byatt’s introduction to the novel, she quotes T.S. Eliot: “[E]very new work of literature altered the literature of the past—in a sense reread that literature.” Beloved enacts this alteration more forcefully than most classics. In choosing the great works that every American should read, I’m choosing a twentieth-century novel that rereads the greatest literature in our canon, from The Odyssey to the Bible to Uncle Tom’s Cabin. In Beloved Morrison somehow sings an old and new song.

Professor Flipflops, or Some Thoughts on How to Approach Summer, Part Two
By Glenn Moots

Part Two: Time Management and Putting the Summer into Perspective

The question of time management arises because, assuming you aren’t teaching or doing administrative work, you have more time in the summer. You not only don’t have to devote time to course prep, teaching, and grading; you potentially have whole blocks of time for dedicated research, writing, and course design. Woo hoo!

How might one organize one’s days or weeks in the summertime? The general consensus among writers (and I tend to agree) is that writing is best done in the morning and research and reading best done in the afternoon or evening (depending on your family needs). Summer is therefore a great time for getting up earlier because it’s brighter and warmer in the morning. Also, it is probably better to do some writing every day than to wait or hope for stretches of enormous productivity to balance out long periods of inactivity.

Speaking for myself, I feel better if I steadily chip away at projects during the whole summer. But this is only a general rule. I also enjoy having dedicated stretches of non-academic activity in the summer together with equally lengthy periods of high productivity. (Ideally, these are scheduled.) These extremes may also correspond to personal or family considerations.

As many of you know, there are unscheduled episodes with extended family, housecleaning, allergies, hospitality, car or home maintenance, or other distractions from academic projects. Go with it. Be thankful that you don’t have to juggle your classes too. Your career won’t end because of a few bad days or distractions in the summer. Along those lines, capitalize on those fits of productivity that may happen late at night or when you’re at your in-laws. Again, you don’t have to answer to the usual routine. Take advantage of that flexibility and charge ahead…

…Or not. Failure to progress on a major academic project can drive you crazy. But it isn’t an excuse for driving others crazy or neglecting your other duties and vocations – especially when you may not be obliged to teach classes as usual. If professors are supposed to be dispensing wisdom, we need to be about the chores and graces of everyday life. If you are looking for insights into wisdom, then all your attendance to family and other commitments will come around full circle into your teaching and writing. Take advantage of the summer to have a cap gun war with your kids or take a road trip to visit dear friends who won’t be around forever. This is important, too. Will your family ever really cherish your books or articles on Whiggism or nous? Or are they more likely to remember that great road trip or waterfront cottage? Hmmmm…

Also, just because you don’t feel like you’re actively thinking about your research doesn’t mean that your brain isn’t still processing it. This is why you can have insights while mowing the lawn, waxing your car, sitting on the beach, or having a cookout with friends. Don’t hesitate to launch these activities without guilt. Your brain never sleeps; it’s always working. Learn how to do things that require a different kind of thinking or, better yet, encourage active but free contemplation. Learn from time spent well with others. All of this will enrich your teaching and writing. Take a cue from the principle of Sabbath rest. You’ll come back to your projects better than you left them.

Summer can also be a good time to explore something new or overdue. Your decision to launch your own personal John Ford film festival, travel, or read outside your field can generate new ideas for writing or teaching. New specializations may sprout, so don’t hesitate to branch out.  Likewise, summer can enable you to stray a bit from the subject at hand and explore the periphery. Explore a new timeframe for your subject or a different school of interpretation, for example. This scholarly wandering can provide excellent ideas for research or teaching on the margins of what you are doing now.

Don’t forget to keep your summer in the context of the rest of the year. The only thing worse than an unproductive summer is an unproductive academic year. Therefore, see your summer as an opportune time to launch projects that will keep you productive during the rest of the year. Design a course that will be more enjoyable to teach in the fall. Write those quizzes now so that you can keep your research consistent during the regular semester. Use the summer to draft a prospectus or submit a manuscript whose next step will keep you busy during the school year. We are no less obliged to master the seasons of the year than we are to master the seasons of our lives. This requires the kind of architectonic thinking that Aristotle prescribes in his Nicomachean Ethics.

The best advice for summer is advice that holds for your whole life. Whether you type a chapter or have a squirt gun war tomorrow, take the advice of Ecclesiastes and do it with all your might. Others will have to decide whether we truly work for a living, but we must live with thankful hearts that the academic calendar can afford both a restful and a productive summer.
