Our Self-Crippled War

by Elan Journo | September 10, 2009

Watching video of the Twin Towers imploding, we all felt horror and outrage. We expected our government to fight back — to protect us from the enemy that attacked us on 9/11. We knew it must, and could, be done. Fighting all-out after Pearl Harbor, we had defeated the colossal naval and air forces of Japan. But eight years later — twice as long as it took to smash Japanese imperialism — what has Washington’s military response to 9/11 achieved?

The enemy that struck us — properly identified not as “terrorism” but as the jihadist movement seeking to impose Islamic law worldwide — is not merely undefeated, but resurgent.

Islamist factions in Pakistan fight to conquer that country and seize its nuclear weapons. The movement’s inspiration and standard-bearer, the Islamic Republic of Iran, remains the leading sponsor of terrorism, and may soon acquire its own nuclear weapons.

Then there’s the Afghanistan debacle. Eight years ago, practically everyone agreed we must (and could) eliminate the Taliban and its jihadist allies — a primitively equipped force thousands of times less powerful than Imperial Japan. Now that goal seems unreachable.

Today swaggering holy warriors control large areas of the country. They summarily execute anyone deemed un-Islamic, and operate a shadow government with its own religious law courts and “virtue” enforcers. Last year the CIA warned that virtually every major terrorist threat the agency was aware of traced back to the tribal areas near the Taliban-infested Afghan-Pakistan border.

Why have we been so unsuccessful?

No, the problem is not a shortage of troops, nor is the remedy another Iraq-like “surge.” That sham, appeasing solution entails not quelling the insurgency, but paying insurgents tens of thousands of dollars not to fight us — for as long as the money flows. And it means leaving Iraq in the hands of leaders far more sympathetic to jihadists than Hussein was. No, the crucial problem is the inverted war policy governing U.S. forces on the battlefield.

Defeating the Islamist threat demanded that we fight to crush the jihadists. Victory demanded that we recognize the unwelcome necessity of civilian casualties and lay the blame for them at the feet of the aggressor (as we were more willing to do in World War II). Victory demanded allowing our unmatched military to do its job — without qualification. Instead, our leaders waged a “compassionate” war.

Before the Afghan war began, Washington drew up lengthy “no-strike” lists — cultural sites, electrical plants, a host of legitimate strategic targets ruled untouchable — for fear of affronting or harming civilians. Meanwhile, we sent C-17 cargo planes to drop 500,000-odd Islam-compliant food packets to feed starving Afghans and, inevitably, jihadists.

Many Islamists survived, regrouped and staged a fierce comeback.

The no-strike lists lengthened. As a result, necessary bombing raids are now often canceled, sacrificing opportunities to kill Islamist fighters. Jihadists exploit this to their advantage. Lt. Gen. Gary L. North tried to justify the policy to a reporter: “Eventually, we will get to the point where we can achieve — within the constraints of which we operate, which by the way the enemy does not operate under — and we will get them.”

“Eventually” — for another eight years?

In Washington’s “compassionate” war, we give the enemy every advantage — and then compel our soldiers to fight with their hands tied . . . ever tighter.

Naturally, U.S. deaths have soared. More Americans died in the first eight months of this year (182) than in all of last year — until now the bloodiest year of the war.

If Afghanistan now seems unwinnable, blame Bush and Obama. Bush crusaded not to destroy the Taliban but to bring Afghans elections and reconstruction. Obama’s “new” tack is to insist we spend billions more on nation-building and bend over backwards to safeguard the local population. Both take for granted the allegedly moral imperative of putting the lives and welfare of Afghans first — ahead of defeating the enemy to protect Americans.

This imperative lies behind Washington’s self-crippled war — a war that could have deterred other jihadists and their state sponsors, but that instead encourages them to attempt further attacks.

How many more Americans must die before we challenge this conception of a proper war?

About The Author

Elan Journo

Senior Fellow and Vice President of Content Products, Ayn Rand Institute

An Unwinnable War?

by Elan Journo | Fall 2009 | Winning the Unwinnable War

Author’s note: The following is the introduction to Winning the Unwinnable War: America’s Self-Crippled Response to Islamic Totalitarianism.

“I don’t think you can win it. . . . I don’t have any . . . definite end [for the war]”
— President George W. Bush 1

The warriors came in search of an elusive Taliban leader. Operating in the mountains of eastern Afghanistan, the team of Navy SEALs was on difficult terrain in an area rife with Islamist fighters. The four men set off after their quarry. But sometime around noon that day, the men were boxed into an impossible situation. Three Afghan men, along with about one hundred goats, happened upon the team’s position. What should the SEALs do?

Their mission potentially compromised, they interrogated the Afghan herders. But they got nothing. Nothing they could count on. “How could we know,” recalls one of the SEALs, “if they were affiliated with a Taliban militia group or sworn by some tribal blood pact to inform the Taliban leaders of anything suspicious-looking they found in the mountains?” It was impossible to know for sure. This was war, and the “strictly correct military decision would still be to kill them without further discussion, because we could not know their intentions.” Working behind enemy lines, the team was sent there “by our senior commanders. We have a right to do everything we can to save our own lives. The military decision is obvious. To turn them loose would be wrong.”

But the men of SEAL Team 10 knew one more thing. They knew that doing the right thing for their mission — and their own lives — could very well mean spending the rest of their days behind bars at Leavenworth. The men were subject to military rules of engagement that placed a mandate on all warriors to avoid civilian casualties at all costs. They were expected to bend over backward to protect Afghans, even if that meant forfeiting an opportunity to kill Islamist fighters and their commanders, and even if that meant imperiling their own lives.

The SEALs were in a bind. Should they do what Washington and the military establishment deemed moral — release the herders and assume a higher risk of death — or protect themselves and carry out their mission — but suffer for it back home? The men — Lt. Michael Murphy; Sonar Technician 2nd Class Matthew Axelson; Gunner’s Mate 2nd Class Danny Dietz; and Hospital Corpsman 2nd Class Marcus Luttrell — took a vote.

They let the herders go.

Later that afternoon, a contingent of about 100–140 Taliban fighters swarmed upon the team. The four Americans were hugely outnumbered. The battle was fierce. Dietz fought on after taking five bullets, but succumbed to a sixth, in the head. Murphy and Axelson were killed not long after. When the air support that the SEALs had called for finally arrived, all sixteen members of the rescuing team were killed by the Islamists. Luttrell was the lone survivor, and only just. 2

The scene of carnage on that mountainside in Afghanistan captures something essential about American policy. What made the deadly ambush all the more tragic is that in reaching their decision, those brave SEALs complied with the policies handed down to them from higher-ups in the military and endorsed by the nation’s commander-in-chief. Their decision to place the moral injunction to selflessness ahead of their mission and their very lives encapsulates the defining theme of Washington’s policy response to 9/11.

Across all fronts U.S. soldiers are made to fight under the same, if not even more stringent, battlefield rules. Prior to the start of the Afghanistan War and the Iraq War, for instance, the military’s legal advisors combed through the Pentagon’s list of potential targets, and expansive “no-strike” lists were drawn up. 3 Included on the no-strike lists were cultural sites, electrical plants, broadcast facilities — a host of legitimate strategic targets ruled untouchable, for fear of affronting or harming civilians. To tighten the ropes binding the hands of the military, some artillery batteries “were programmed with a list of sites that could not be fired on without a manual override,” which would require an OK from the top brass. 4 From top to bottom, the Bush administration consciously put the moral imperative of shielding civilians and bringing them elections above the goal of eliminating real threats to our security.

This book shows how our own policy ideas led to 9/11 and then crippled our response in the Middle East, and makes the case for an unsettling conclusion: By subordinating military victory to perverse, allegedly moral constraints, Washington’s policy has undermined our national security. Only by radically rethinking our foreign policy in the Middle East can we achieve victory over the enemy that attacked us on 9/11.

But from the outset the Bush administration had insisted that we’re in a new kind of war — an unwinnable war. To scale back people’s expectations, it told us not to wait for a defeated enemy to surrender, in the way that Japan did aboard the USS Missouri in 1945.

This much is true: the “war on terror” is essentially different from our actions in World War II. Back then, we brought Japan to its knees within four years of Pearl Harbor — yet eight years after 9/11, against a far weaker enemy, we find ourselves enmeshed in two unresolved conflicts (Iraq and Afghanistan) while further mass-casualty attacks and new flashpoints (such as Pakistan) loom. Why?

It is not for lack of military strength or prowess; in that regard America is the most powerful nation on earth. Nor is it for lack of troops or planning, nor any sort of bungled execution. Our soldiers have amply demonstrated their skill and courage — when they were allowed to fight. But such occasions were deliberately few, for as a matter of policy Washington sent them, like the SEALs in Afghanistan, into combat but prohibited them from fighting to win. This underscores how Bush’s war indeed differs from the triumphant, all-out military campaign against Japan — and how it is far from a new kind of war. It is in fact an eerie replay of Vietnam.

The philosopher Ayn Rand observed, at the time, that in the Vietnam War “American forces were not permitted to act, but only to react; they were to ‘contain’ the enemy, but not to beat him.” Nevertheless Vietnam — like the fiascos of today — was seen as discrediting military action, even though (as Rand observed) U.S. soldiers in Vietnam were thrust into “a war they had never been allowed to fight. They were defeated, it is claimed — two years after their withdrawal from Vietnam. The ignominious collapse of the South Vietnamese, when left on their own, is being acclaimed as an American military failure.” 5

For good reason Vietnam was called a “no-win” war. Rand properly laid the blame for the disaster at the feet of American politicians and their intellectual advisors. The entire “war on terror” is likewise a no-win war. In words redolent of the Vietnam era, President Bush told an interviewer on NBC’s “Today” show that “I don’t think you can win [the war],” and blithely envisioned no definite end for it (see the epigraph). His words have been self-fulfilling. In the current conflict — as in Vietnam — the disaster is due not to a military failure. We are in an unwinnable war, but only because of the ideas setting the direction of our foreign policy.

Irrational ideas have shaped the Mideast policy not just of George W. Bush, but also of earlier administrations that had to confront the Islamist movement — from Jimmy Carter on. And although President Obama glided into office as the candidate of “change,” his administration brings us full circle to the appeasing policies that characterized the run-up to 9/11 (see chapter 1). The irrationality of American policy all but guarantees that the Islamist movement will continue to menace the American public and that this conflict will figure prominently in foreign-policy thinking for years to come.

But the overarching message of this book — that certain dominant ideas about morality subvert American policy — should not be taken as a rejection of the need for morality, per se, in foreign policy. Far from it. Trying to implement a foreign policy unguided by the right moral principles is like trying to cross an eight-lane freeway blindfolded and with your ears plugged. Seat-of-the-pants amoral temporizing does not a policy make, and in practice it is inimical to achieving U.S. security. What we demonstrate in the following pages is that the United States needs to challenge the specific morality that currently dominates our policy — and instead adopt better, more American, ideas.

To that end, we offer a new vision and specific policy recommendations for how to address ongoing problems and threats deriving from the Middle East. Those suggestions — and, broadly, all the critiques offered in this book — originate neither from a liberal, nor a conservative, nor a libertarian, nor a neoconservative outlook. Their frame of reference, instead, is the secular, individualist moral system defined by Ayn Rand. Taking U.S. policy in this new direction would enable us properly to conceptualize and achieve America’s long-range self-interest: the safeguarding of our lives from foreign aggressors.

No one can predict with certainty what will unfold in the interval between the writing of this book and your reading of these words. But given the entrenched policy trends described in these pages, the lessons of the last eight years will likely go unlearned — much to the detriment of our security. My hope is that this book will counteract those trends by awakening Americans to the actual nature of the war we are in — and to the fact that, if we are guided by the right ideas, the war against Islamic totalitarianism is winnable.

* * *

The book’s central argument is developed across seven essays. Although each essay is self-contained, I encourage you to read them all in sequence because they are parts of a thematic whole. To aid the reader in integrating the steps of the argument, I offer the following outline of the topics covered and their logical progression.

Part 1 considers the nature of the Islamist threat, its origin, and the role of U.S. policy in empowering that menace. Chapter 1 demonstrates how unprincipled U.S. policy — from Carter through Clinton — worked to galvanize the enemy to bring its holy war to our shores on 9/11. Chapter 2 explores the widely evaded nature and goals of the enemy, and indicates how that should figure in America’s military response.

Part 2 focuses on the policies that drove Washington’s military operations in Iraq and Afghanistan. Chapter 3 exposes the nature of Bush’s crusade for “democracy” — sometimes called the Forward Strategy for Freedom — and the destructive moral ideas that informed it. Chapter 4 identifies the ruinous impact on U.S. self-defense of “Just War Theory” — the widely accepted doctrine of morality in war. Chapter 5 brings out the profound opposition between neoconservative thought (a major ideological influence on Bush’s war policy) and America’s true national interest in foreign policy.

Taken together, what these three chapters argue is that America effectively renounced the fully achievable goal of defeating the enemy — for the sake of a welfare mission to serve the poor and oppressed of the Middle East. With the goal of victory abandoned, war certainly becomes unwinnable.

Part 3 looks at what Bush’s policies have wrought in the Middle East — and what the Obama administration should do. Chapter 6 surveys Afghanistan, post-surge Iraq, and the broader Islamist threat emanating from Pakistan, Iran, and elsewhere. Bush’s policy, it is argued, has actually left the enemy stronger than before 9/11. The enduring threats we face and the depressingly inadequate policy options being considered underline the pressing need for a real alternative to the conventional mold in foreign policy. Although the enemy grows stronger, chapter 7 argues that victory remains achievable. The way forward requires that we adopt a radically different approach to our foreign policy in the Middle East — one founded on a different moral framework.

* * *

A word on the genesis of the essays in this collection. All but three of the essays originally appeared, in somewhat different form, in The Objective Standard between 2006 and 2007. The exceptions are chapters 1, 6, and 7. These were written in winter 2008–09, and are published here for the first time.

Elan Journo
The Ayn Rand Institute
May 2009

About The Author

Elan Journo

Senior Fellow and Vice President of Content Products, Ayn Rand Institute

Endnotes

1 George W. Bush, interview by Matt Lauer, Today, MSNBC, 30 Aug. 2004, www.msnbc.msn.com/id/5866571/ (accessed 25 Aug. 2008).

2 “Excerpt from Lone Survivor,” ArmyTimes.com, 18 June 2007, www.armytimes.com/news/2007/06/navy_sealbook_excerpt_070618w/ (accessed 16 Oct. 2008); John Springer, “He knew his vote would sign their death warrant,” TODAYShow.com, 12 June 2007, www.msnbc.msn.com/id/19189482/ (accessed 16 Oct. 2008).

3 Colin H. Kahl, “How We Fight,” Foreign Affairs (November/December 2006), www.foreignaffairs.com/articles/62093/colin-h-kahl/how-we-fight (accessed 16 Oct. 2008).

4 Kahl, “How We Fight.”

5 Ayn Rand, “The Lessons of Vietnam,” in The Voice of Reason: Essays in Objectivist Thought (New York: Plume, 1990), 140, 143. Emphasis in the original.

 

Climate Vulnerability and the Indispensable Value of Industrial Capitalism

by Keith Lockitch | September 2009

ABSTRACT:

It is widely believed that man-made greenhouse gas emissions are increasing overall vulnerability to climate-related disasters, and that, consequently, policies aimed at cutting off these emissions are urgently needed. But a broader perspective on climate vulnerability suggests that the most important factors influencing susceptibility to climate-related threats are not climatologic, but political and economic. The dramatic degree to which industrial development under capitalism has reduced the risk of harm from severe climate events in the industrialized world is significantly under-appreciated in the climate debate. Consequently, so too is the degree to which green climate and energy policies would undermine the protection that industrial capitalism affords — by interfering with individual freedoms, distorting market forces, and impeding continued industrial development and economic growth. The effect of such policies would, ironically, be a worsening of overall vulnerability to climate.

1. INTRODUCTION

Severe climate events have become a weapon in the rhetorical arsenal of green politics. Hurricane Katrina became the literal poster child for global warming when the movie poster for Al Gore’s An Inconvenient Truth depicted a satellite image of the storm blowing out of a set of industrial smokestacks. No climate-related disaster occurs today without being seized upon as a cautionary tale against the purported threat of anthropogenic climate change.

The claim that man-made greenhouse gas emissions are causing large-scale changes to the earth’s climate systems — dramatically increasing the risk of climate catastrophe — is omnipresent and trumpeted daily with ever-increasing alarm. Climate-related tragedies past and present are routinely used to underscore the theme of man’s vulnerability to the climate. Consider the following from Spencer Weart’s book The Discovery of Global Warming:

In 1972 a drought ravaged crops in the Soviet Union, disrupting world grain markets, and the Indian monsoon failed. In the United States the Midwest was struck by droughts severe enough to show up repeatedly on the front pages of newspapers and on television news programs. Most dramatic of all, years of drought in the African Sahel reached an appalling peak, starving millions, killing hundreds of thousands, and bringing on mass migrations. Television and magazine pictures of sun-blasted fields and emaciated refugees brought home just what climate change could signify for all of us.1

The intended implications are clear: all of us are dangerously susceptible to the ravages of climate; to protect ourselves we must immediately adopt drastic policies aimed at cutting off greenhouse gas emissions.

And such policies are not merely being pondered, but are steadily moving toward political reality. International negotiators will meet in Copenhagen in December 2009 to hammer out a much stronger successor to 1997’s Kyoto Protocol, which imposed binding emissions cuts on its signatories.2 3 Also, as of this writing (April 2009), a draft bill before the U.S. Congress would impose energy rationing in a variety of guises: a cap-and-trade system rationing U.S. carbon emissions, a renewable energy mandate, forced energy efficiency programs, and more.4 Should the bill fail, regulation of greenhouse gases might still go forward in the United States, since the EPA — following the Supreme Court — has “found” them to be air pollutants under the Clean Air Act.5

The lurid examples of climate-related tragedy fuel this political agenda by imparting a sense of panicked urgency. They convey the impression that something is happening that is unprecedented in human history — that where mankind once flourished in a world with a stable, benign climate, we are now facing an apocalyptic hell beyond all capacity to manage.

But vulnerability to the climate has been a feature of human existence for all of human history; there have always been droughts and floods and hurricanes and heat waves — and there always will be, regardless of what happens to atmospheric greenhouse gas concentrations. Moreover, the history of industrial development has been one of an ever-increasing ability to cope with natural disasters — an ever-increasing resilience against them.

Yet none of this is sufficiently appreciated in the climate debate. We in the industrialized world tend to ignore or forget just how harsh and precarious life was in the preindustrial era, and still is today in nonindustrialized countries. We take industrial development for granted and tend not to consider the ways it actually reduces our climate vulnerability. We also take for granted the political and economic freedoms that make industrial development possible and fail to recognize the myriad ways that proposed climate and energy policies would undermine those freedoms.

A proper assessment of proposed green policies requires a broader perspective on climate vulnerability than one that focuses merely on climatologic factors. In particular, the role of political and economic factors must also be considered. To what degree is susceptibility to climate-related threats reduced by policies that expand political freedom and thereby foster industrial development and economic growth? And to what degree is climate vulnerability actually worsened by policies that interfere with market freedoms and thereby restrict development and growth? Given the far-reaching implications of proposed energy and climate policies, such a broader consideration of climate vulnerability is urgently needed.

2. CLIMATE VULNERABILITY: PREINDUSTRIAL AND POST

Nature has never been unqualifiedly hospitable to man. Whatever periods of human flourishing occurred in the preindustrial era, they occurred against a general background of unrelenting hardship and privation. For most of human history, life has consisted of a precarious struggle to eke out a bare subsistence at the constant mercy of drought and disease, storm and flood, famine and plague.

Prior to the widespread utilization of coal in the eighteenth century, the primary sources of fuel for heating, cooking, and other uses were biomass fuels such as wood and animal dung (still true in many poor countries today). With access only to fuels of such low energy-density and to rudimentary technology, people in preindustrial civilizations had little control over nature and were easily overwhelmed by its powerful forces.6

Ramshackle dwellings and primitive fuels afforded little protection against the elements. Describing everyday life in sixteenth-century Europe, historian William Manchester writes of

tiny cabins of crossed laths stuffed with grass or straw, inadequately shielded from rain, snow, and wind. They lacked even a chimney; smoke from the cabin’s fire left through a small hole in the thatched roof — where, unsurprisingly, fires frequently broke out. These homes were without glass windows or shutters; in a storm, or in frigid weather, openings in the walls could only be stuffed with straw, rags — whatever was handy.7

Shelters of such poor quality were typical for people the world over until as recently as several generations ago. In countries at even moderately northern latitudes, prodigious labor was required just to keep from freezing through a normal winter — let alone cope with unusual extremes of cold. For instance, a typical household on the early American frontier consumed thousands of pounds of firewood every year — twenty to forty cords annually, according to one estimate (forty cords being a stack of wood four feet high by four feet deep by 320 feet long) — which of course had to be gathered or chopped by hand.8 And in return for the meager warmth such fuels provided, they posed serious health risks of their own. “They can generate high levels of poisonous carbon monoxide,” writes energy analyst Vaclav Smil, “while poorly-vented combustion, in shallow pits or fireplaces, produces high concentrations of fine particulates, including various carcinogens. Repeated inhalation of this smoke leads to impaired lung function and chronic respiratory diseases (bronchitis, emphysema).”9
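A quick arithmetic check of that parenthetical figure, using the standard definition of a cord as a stack four feet high, four feet deep, and eight feet long (128 cubic feet): forty cords stacked four feet high and four feet deep would indeed stretch 320 feet.

\[
40 \times 128\ \mathrm{ft^3} = 5{,}120\ \mathrm{ft^3}, \qquad \frac{5{,}120\ \mathrm{ft^3}}{4\ \mathrm{ft} \times 4\ \mathrm{ft}} = 320\ \mathrm{ft}.
\]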

Smil also writes of the “millennia-long stagnation” in the development of preindustrial agriculture, which he attributes partly to “the inadequate power and relatively high energy cost of the only two kinds of prime movers available for field work; human and animal muscles.”10 Primitive technology and ignorance of sophisticated agricultural methods left preindustrial farmers with little control over the results of their toil. The threat of drought, crop failure, and starvation was omnipresent, and periodic famines that regularly decimated whole populations were the rule, not the exception.11

Undernourished and lacking access to clean drinking water or basic sanitation, completely ignorant of medical science, helpless before natural threats they couldn’t understand or predict — individuals in the preindustrial world were completely at the mercy of whatever adversities nature threw their way. Little wonder, then, that life expectancy has been so low for most of human history. Estimates of life expectancy in prehistoric eras put it at somewhere between twenty and thirty years, and it remained below forty years right up through the start of the nineteenth century.12

Yet life expectancy in developed countries today is as high as eighty years — and it should go without saying that the majority of people in today’s industrialized world enjoy a length and quality of life incomparably superior to the squalid misery alluded to above. In the brief span of two centuries, human life has been completely transformed — transformed by extraordinary advances in science, technology, and medicine and by the growth of market institutions and the expansion of political and economic freedom associated with the birth of industrial capitalism.

Too often, we take for granted the astonishing and life-saving products of industrial capitalism and industrial-scale energy. We in the developed world don’t think about the fact that things we regard as completely commonplace and unremarkable would seem, to anyone from any previous period in history, an absolutely unimaginable miracle. We forget, as we flood our homes with light by a casual flick of a switch, that through most of human history (and still today in many parts of the world) the close of day meant darkness and an end to all activity. The precarious existence of the preindustrial farmer doesn’t even register as a glimmer in our consciousness as we walk into our modern grocery stores, with their shelves upon shelves of fresh, healthful foods — prepared, packaged, refrigerated, and relatively inexpensive — all supplied and served by a vast infrastructure of agricultural, transportation, and business and marketing systems.

We hardly even notice when our furnaces fire up automatically, sending hot water through radiators or blowing warm air through vents in our well-insulated walls — or when a different setting sends in an air-conditioned breeze to drive off the heat of summer. Rightly concerned about heat waves and spells of extreme cold, we forget just how much more suffering and death such climate events inflict on people lacking modern amenities. This holds true even in developed countries today where the cost of energy has, for example, limited the adoption of air conditioning. More than thirty thousand deaths were attributed to the heat wave that struck Western Europe in 2003 — widely taken as a sign of the extreme threat posed by global warming.13 But, as Patrick Michaels has pointed out, the temperatures that exacted such a tragic toll that summer were lower than those in Western America, where no deaths were attributed to the heat. “The difference,” argued Michaels, “is air conditioning run by affordable energy.”14

Or, consider Spencer Weart’s drought example, which he takes as portending the future threat that climate change “could signify for all of us.”15 It is true that severe drought did indeed strike the regions he mentions in 1972, and the consequences were indeed harsh: food rationing in the Soviet Union, famine in India that persisted through the mid-’70s, and mass starvation in sub-Saharan Africa, which went on for decades as the drought continued through much of the ’80s and ’90s. But from a historical perspective, these tragic events are unfortunately nothing unusual. What really stands out as remarkable and unprecedented is the negligible effect of the drought in the United States.

Despite drought conditions severe enough to rate comparison with the 1930s Dust Bowl, Americans saw only minor economic losses and fluctuations in food prices.16 It is telling that the most that Weart could find to say was that the Midwest droughts showed up on “the front pages of newspapers and on television news programs.”17 Observe that they specifically did not “show up” at all on people’s waistlines and barely registered on their pocketbooks. Such resilience is testament to the adaptive flexibility of an industrialized economy and a (relatively) free market — to industrial capitalism’s ability to respond quickly when normal conditions are disrupted. While the other regions mentioned suffered a total failure of their food production and distribution systems, the United States donated surplus food supplies to Africa, sold food grains to India, and arranged a massive sale of wheat to the Soviet Union in late 1972.18 19 20

Contrast this to the helplessness before nature of India’s peasant farmers or the Sahel’s nomadic tribes. Why were they unable to benefit from the agricultural practices that empowered the American farmers — the irrigation of fields, the use of fertilizers and pesticides, and the application of sophisticated methods of agricultural management? What role did their primitive cultural traditions and their countries’ oppressive political systems play in suppressing the industrial development and free market mechanisms that made such advances possible? And in the case of the Soviet Union, should there really be any surprise that its state-owned collective farms were unable to cope with unfavorable weather conditions? Even under good conditions — and with the advantage of some of the most fertile agricultural land in the world — the central planners of the Soviet agricultural ministry were rarely able to coerce adequate food production.

Looked at from the vantage point of human history, recent climate-related tragedies suggest an opposite perspective to that offered by the advocates of green policies. The message these and numerous other examples convey is not “man’s vulnerability to climate,” but his vulnerability only under the wrong political and economic conditions. Standing out above all else is the unprecedented degree of protection from climate-related threats that exists under industrial capitalism.

Consider the poster child of global warming alarm: Hurricane Katrina. In 1970, a severe tropical cyclone struck the coast of the Bay of Bengal, in what is today Bangladesh. It is estimated that the storm was a category 3 cyclone, and the death toll it left in its wake was estimated to have been as high as three hundred thousand people.21 Compare this with Hurricane Katrina, which struck New Orleans in 2005. By the time it made landfall Katrina was also a category 3 storm and the directly affected population was comparable to that in Bangladesh.22 23 Yet the number of people dead or missing was far, far less — estimates put it at around two thousand.24

Without denying the tragedy of the lives lost to Katrina, two thousand versus three hundred thousand is an incredible difference. In assessing what accounts for that difference, one can debate the relative roles of social, political, geographic and climatologic factors, but there can be no question of the fundamental and decisive importance of the technology and infrastructure made possible by industrial capitalism. Unlike the helpless victims of the Bangladesh storm, the citizens of New Orleans could rely on advanced early warning systems and a functioning communications infrastructure, modern vehicles and paved roads to facilitate evacuation and transport relief supplies, sturdier homes and structures and advanced flood control systems, etc. Indeed, much of this even failed in New Orleans: the levees were breached, many people couldn’t or wouldn’t evacuate, the relief effort was delayed, and so on. Yet, even in spite of these failures, hundreds of thousands of lives were saved by the products of industrial technology and industrial-scale energy.

This is the real lesson of today’s climate-related tragedies: the immeasurable degree to which industrial development under capitalism has reduced our vulnerability to climate threats.

3. CLIMATE VULNERABILITY AND DISTORTIONS OF THE FREE MARKET

A corollary lesson is the degree to which our protection against climate disasters is weakened by government policies that obstruct the life-saving benefits of industrial capitalism or otherwise interfere with the mechanisms of the free market.

It is arguable that — though it was orders of magnitude lower than in Bangladesh — the toll in New Orleans was still higher than it need have been. Consider the following 2006 statement from ten of the world’s top hurricane experts, who point out that “a Katrina-like storm or worse was (and is) inevitable even in a stable climate” and suggest that while the “possible influence of climate change on hurricane activity” is an important scientific question, it is not “the main hurricane problem facing the United States.” 

Rapidly escalating hurricane damage in recent decades owes much to government policies that serve to subsidize risk. State regulation of insurance is captive to political pressures that hold down premiums in risky coastal areas at the expense of higher premiums in less risky places. Federal flood insurance programs likewise undercharge property owners in vulnerable areas. Federal disaster policies, while providing obvious humanitarian benefits, also serve to promote risky behavior in the long run.25

By distorting the free market price signals individuals use to guide their choices, these and myriad other government interventions and regulations, going back decades, have lured people into floodplains and produced a higher overall vulnerability to hurricanes and flooding.

Or, consider the role of government policies in enhancing the risks from wildfire — another item on the laundry list of disasters that many fear will be exacerbated by global warming. With every major blaze that occurs today the news reports never fail to include prominent mention of climate change (notwithstanding the obligatory caveat that no individual wildfire can be attributed to it).

In February 2009, for instance, a number of severe bushfires raged through southeastern Australia, killing 173 people and destroying thirteen hundred homes as they burned more than 4,500 square kilometers (1.1 million acres) — the deadliest bushfires in Australia’s history.26 27 28 Not surprisingly, this was widely reported in the press as a sign of what global warming has in store for Australia’s future. For instance, a New York Times story — ostensibly about the role of arson in setting the fires ablaze — included the following:

Climate scientists say that no single rare event like the deadly heat wave or fires can be attributed to global warming, but the chances of experiencing such conditions are rising along with the temperature. . . . The flooding in the northeast and the combustible conditions in the south were consistent with what is forecast as a result of recent shifts in climate patterns linked to rising concentrations of greenhouse gases. . . .29

Another story asserted that “the government’s failure to set tough greenhouse gas emissions targets would endanger lives.”30

But while there is no question that high temperatures and dry conditions are crucial causal factors in the risk and severity of wildfire, the “only controllable factor” — according to meteorologist and bushfire expert David Packham — is the fuel that feeds the fires: “the dead leaves, pieces of bark and grass that become the gas that feeds the 50m high flames.”31

Packham argues that the bush, properly managed, need not pose nearly such a deadly threat. The main factor that kept residents as dangerously exposed as they were was the green policies of local government councils that restricted the clearing of trees and brush. As Packham explains:

Fuels build up year after year at an approximate rate of one tonne a hectare a year, up to a maximum of about 30 tonnes a hectare. If the fuels exceed about eight tonnes a hectare, disastrous fires can and will occur. Every objective analysis of the dynamics of fuel and fire concludes that unless the fuels are maintained at near the levels that our indigenous stewards of the land achieved, then we will have unhealthy and unsafe forests that from time to time will generate disasters such as the one that erupted on Saturday.

It has been a difficult lesson for me to accept that despite the severe damage to our forests and even a fatal fire in our nation’s capital, the political decision has been to do nothing that will change the extreme threat to which our forests and rural lands are exposed.
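Packham’s figures support a simple back-of-envelope calculation: starting from cleared ground and accumulating roughly one tonne of fuel per hectare per year, the bush crosses his eight-tonne-per-hectare danger threshold in about

\[
t \approx \frac{8\ \mathrm{t/ha}}{1\ \mathrm{t/ha\ per\ year}} = 8\ \text{years},
\]

and keeps building toward the 30-tonne maximum over roughly three decades — which is why, on his account, regular fuel reduction, not emissions policy, is the controllable variable.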

In the wake of the tragedy, distressing stories emerged of bushfire victims who had repeatedly pleaded for controlled burns and other fire prevention measures, but who were rebuffed by local governments citing “threats to biodiversity.”32 Regional councils refused to trim out-of-control vegetation on public lands and even prevented people from clearing firebreaks on their own, private property.33

Liam Sheahan, a resident who disregarded such restrictions and cleared a one hundred meter swath around his property in 2002, ended up before a local magistrate facing legal charges. A two-year court battle ended with Sheahan’s conviction, costing him $100,000 in fines and legal fees. “We’ve got thousands of trees on our property. We cleared about 247,” said Sheahan. The result? “The house is safe because we did all that. We have got proof right here. We are the only house standing in a two-kilometre area.”34

In light of such political policies restricting people’s freedom to protect their own safety on their own property, it is bordering on criminal to point to emissions reductions — on the assumption that they might someday have a salutary effect on Australia’s climate — as the primary precaution against extreme bushfires.

4. THE THREAT OF MISANTHROPOGENIC CLIMATE POLICY

The industrial revolution and the development of industrial-scale energy required the unprecedented political freedom of England and the United States. This is what has made us comparatively safe from droughts, wildfires, hurricanes and the like. Policies restricting that freedom and interfering with market forces undermine this achievement and increase our climate vulnerability. Property owners have an obvious reason to reduce bush or forest fuel loads long before they pose a risk of unprecedented, extreme wildfire — but too many governments today prohibit such actions. Similarly, if the risk of living in a flood-prone coastal community were properly reflected in market prices — such as flood insurance premiums, home values, unsubsidized relief and recovery costs, and so on — individuals could act accordingly, without false assurances of safety. It is only policies that distort such price signals and market forces that give rise to mounting dangers that go unattended for decades.

And the threat of more such destructive policies is only growing. The failure to appreciate how a truly free market operates and the unprecedented degree to which industrial capitalism has reduced vulnerability to climate-related risks is behind much of the alarm over “unchecked climate change.” Ignoring the fact that no civilization in human history has ever achieved greater protection against climate disasters than today’s industrialized nations, people are whipping themselves into a hysterical frenzy over the belief that changes in the earth’s climate will be an unmanageable calamity.

But under capitalism, there is no special problem of adapting to changes in the earth’s climate — even large-scale changes. Whether man-made or not, if such changes were to occur (as they have already occurred in human history), they would merely constitute one set of factors among all the others that are constantly integrated by and reflected in a free market. Individuals are continually making decisions and taking actions to enrich their own lives, based on the best knowledge they can acquire and the opportunities in the market. If, over the course of decades, some regions become warmer and others colder, or some regions become drier and others wetter, or sea levels rise or sea levels fall — these changes would simply be reflected in people’s knowledge and economic decisions. There is no reason to regard them any differently from any other forces driving continual market evolution and adaptation. And the more widespread industrial civilization is — the more readily available industrial-scale energy and the other products of industrial capitalism are — the easier the adaptation.

But this is not a perspective widely shared today. “Needless to say, a sea level rise of one meter by 2100 would be an unmitigated catastrophe for the planet,” shrieks climate activist Joe Romm.

The first meter of SLR [sea level rise] would flood 17% of Bangladesh, displacing tens of millions of people, and reducing its rice-farming land by 50 percent. Globally, it would create more than 100 million environmental refugees and inundate over 13,000 square miles of this country [America].35

Environmental refugees? A sea level rise of one meter by next month would be a catastrophe creating environmental refugees. A sea level rise of one meter by 2100 — i.e., barely more than one centimeter per year — would be a steady change that could be addressed in myriad ways and need not create a single refugee.*
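To make that rate explicit, taking the interval from this writing (2009) to 2100:

\[
\frac{100\ \mathrm{cm}}{(2100 - 2009)\ \text{years}} = \frac{100\ \mathrm{cm}}{91\ \text{years}} \approx 1.1\ \mathrm{cm}\ \text{per year}
\]

— a pace that leaves decades for seawalls, drainage, zoning, and ordinary relocation decisions to do their work.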

But advocates of green policies are not interested in freedom. Restrictions on freedom are the essence of green climate and energy policies, which far from loosening the fetters of government interference, will tighten them considerably. “It’s important to change the light bulbs,” preaches Al Gore, “but it’s much more important to change the laws.”36

Our entire modern civilization is powered by industrial-scale energy. More than 86 percent of the world’s energy comes from burning fossil fuels — i.e., from the very process of creating carbon dioxide (and water) by oxidizing hydrocarbons. At the same time, an insignificant 2 percent of the world’s energy comes from renewable sources such as solar and wind.37 Despite the feverish claims of green energy prophets such as Gore, the obstacles to a rapid scale-up of current solar and wind technologies are beyond formidable.38 39 Yet, the almost universally accepted “solution” for the alleged problem of man-made climate change is to cut off greenhouse gas emissions by imposing worldwide draconian controls on energy production and consumption.

Even leaving aside the question of whether or not greenhouse gases are the dominant agent driving the earth’s climate (which is far from “settled” despite the insistent claims of an unchallengeable scientific consensus to the contrary) — it would still be absurd to adopt the policy of emissions reduction as the “solution.”

Even if representatives from all of the major greenhouse-gas-emitting nations could agree to binding emissions targets (including China and India, whose populations are finally enjoying the benefits of serious industrial development); and even if those agreements were to translate into laws actually enacted in each of those countries (recall that the U.S. Senate never ratified the Kyoto Protocol); and even if those laws were implemented and enforced in ways that actually reduced emissions (until the recent, severe global recession, hardly any Kyoto signatories were on track to meet their emissions targets, and emissions had been increasing under the European Union’s cap-and-trade system); and even if the net effect were that global atmospheric greenhouse gas concentrations actually stabilized and diminished; and even if that actually had the effect of stabilizing or reducing global temperatures — even if all these steps, none of which is trivial, were accomplished — what would be the result? A heavy and permanent stifling of the global economy, a significant expansion of government controls and regulations, a significant restriction of personal freedom, widespread energy privation, and considerable sacrifice inflicted on those who can least afford it — and, in the end, a global civilization that, deprived of industrialization and energy, is far, far less capable of coping with severe climate events.

Far from solving the problem of climate-related risk, this absurdly indirect, Rube Goldberg policy would, tragically and ironically, make us more vulnerable to the climate.40

5. CONCLUSION

A broader perspective on climate vulnerability suggests that industrial development under capitalism is not merely one factor among others influencing susceptibility to climate-related risks. Rather, it is the dominant factor, reducing climate vulnerability to a degree that makes all other factors irrelevant.

But the life-saving value of industrial capitalism is profoundly unappreciated in today’s culture. This is not merely because people have forgotten or ignored its history, but because its opponents have actively sought to bury and distort that history. As Ayn Rand explains:

No politico-economic system in history has ever proved its value so eloquently or has benefited mankind so greatly as capitalism — and none has ever been attacked so savagely, viciously, and blindly. The flood of misinformation, misrepresentation, distortion, and outright falsehood about capitalism is such that the young people of today have no idea (and virtually no way of discovering any idea) of its actual nature. While archeologists are rummaging through the ruins of millennia for scraps of pottery and bits of bones, from which to reconstruct some information about prehistorical existence — the events of less than a century ago are hidden under a mound more impenetrable than the geological debris of winds, floods, and earthquakes: a mound of silence.41

The debate over climate and energy policy raises fundamental questions. But ultimately, it is not a debate over how many parts per million of carbon dioxide should be in the atmosphere, or whether the average global temperature should be 57 degrees Fahrenheit or 62 — as if we could control that anyway.

Fundamentally, this is a debate about how society should be organized. The advocates of statism have made their position clear and are actively working to advance their cause. It is time for those who value freedom to do the same.

 


*Would be a steady change that could be addressed in myriad ways and need not create a single refugee — so long as people are free.

 

About The Author

Keith Lockitch

Vice President of Education and Senior Fellow, Ayn Rand Institute

Reprinted from Energy & Environment, Volume 20, No. 5, 2009.

REFERENCES

1. Weart, S. R., The Discovery of Global Warming, New Histories of Science, Technology, and Medicine (Cambridge: Harvard University Press, 2003), 71.

2. “COP15: United Nations Climate Change Conference,” (Copenhagen, 2009), http://en.cop15.dk/.

3. The full text of the Kyoto Protocol is available at http://unfccc.int/kyoto_protocol/items/2830.php.

4. Waxman, H. A., and Markey, E. J., The American Clean Energy and Security Act of 2009 (Discussion Draft Summary), 2009, http://energycommerce.house.gov/Press_11/20090331/acesa_summary.pdf.

5. Jackson, L. P., “Proposed Endangerment and Cause or Contribute Findings for Greenhouse Gases Under Section 202(a) of the Clean Air Act; Proposed Rule,” Federal Register 74, no. 78 (April 24, 2009), http://epa.gov/climatechange/endangerment/downloads/EPA-HQ-OAR-2009-0171-0001.pdf.

6. See, for example, chapter 3 of Smil, V., Energy: A Beginner’s Guide (Oxford: Oneworld, 2006).

7. Manchester, W., A World Lit Only by Fire (Boston: Little, Brown and Company, 1993), 53–4.

8. MacCleery, D. W., American Forests, A history of resiliency and recovery (USDA Forest Service, Washington, D.C., and Forest History Society, Durham, NC, 1992), cited in Hicks, R. R., Ecology and management of central hardwood forests (Canada: John Wiley and Sons, 1998).

9. Smil, Energy, 74.

10. Smil, Energy, 68.

11. See, for example, chapter 2 in Bernstein, A., The Capitalist Manifesto: The Historic, Economic and Philosophic Case for Laissez-Faire (Lanham, MD: University Press of America, 2005) and references therein; or chapters 1 and 2 in Manchester, A World Lit Only by Fire.

12. Preston, S. H., “Human Mortality Throughout History and Prehistory,” in Simon, J. L., ed., The State of Humanity, (Cambridge: Blackwell, 1995).

13. “European Heat Wave 2003: A Global Perspective,” World Climate Report, January 31, 2007, http://www.worldclimatereport.com/index.php/2007/01/31/european-heat-wave-2003-a-global-perspective/.

14. Michaels, P. J., “Energy Tax Blacks Out Many Lives in Europe,” Foxnews.com, August 20, 2003, http://www.foxnews.com/story/0,2933,95260,00.html.

15. Weart, The Discovery of Global Warming, 71.

16. See, for example, “Drought Ruining Oklahoma Wheat,” New York Times, April 16, 1972; and King, S. S., “In Midwest, Drought Worsens,” New York Times, July 26, 1974. Both accessed via ProQuest Historical Newspapers.

17. Weart, The Discovery of Global Warming, 71.

18. Ottaway, D. B., “U.S. Aids Africa Drought Airlift,” Washington Post, May 16, 1973. Accessed via ProQuest Historical Newspapers.

19. “India, Struck by Drought, Is Buying Grain From U.S.,” New York Times, January 18, 1973. Accessed via ProQuest Historical Newspapers.

20. Smith, H., “Brezhnev Meeting with Top Aides Seen as Effort to Spur Soviet Harvest,” New York Times, August 10, 1972; and “Soviets Admit a Record Crop Failure,” Washington Post, November 5, 1972. Both accessed via ProQuest Historical Newspapers.

21. Frank, N. L. and Husain, S. A., “The Deadliest Tropical Cyclone in History?” Bulletin of the American Meteorological Society 52, no. 6 (June 1971), http://ams.allenpress.com/archive/1520-0477/52/6/pdf/i1520-0477-52-6-438.pdf.

22. Knabb, R. D., Rhome, J. R. and Brown, D. P., Tropical Cyclone Report: Hurricane Katrina (National Hurricane Center, August 20, 2005), http://www.nhc.noaa.gov/pdf/TCR-AL122005_Katrina.pdf.

23. Centre for Research on the Epidemiology of Disasters (CRED), EMDAT: Emergency Events Database (Brussels: Université Catholique de Louvain, 2008), http://www.emdat.be/.

24. Estimates of fifteen hundred dead, plus several hundred missing in Knabb, R. D., Rhome, J. R. and Brown, D. P., Tropical Cyclone Report: Hurricane Katrina (National Hurricane Center, August 20, 2005), http://www.nhc.noaa.gov/pdf/TCR-AL122005_Katrina.pdf.

25. Emanuel, K., et al., “Statement on the U.S. Hurricane Problem,” July 25, 2006, http://wind.mit.edu/~emanuel/Hurricane_threat.htm.

26. Victoria police report, March 30, 2009, http://www.police.vic.gov.au/content.asp?Document_ID=19190.

27. “Australian brush fires: Police release suspect photo,” Telegraph.co.uk, February 12, 2009, http://www.telegraph.co.uk/news/worldnews/australiaandthepacific/australia/4603207/Australian-brush-fires-Police-release-suspect-photo.html.

28. “Australian bush fires: Dozens of people still unaccounted for,” Telegraph.co.uk, February 25, 2009, http://www.telegraph.co.uk/news/worldnews/australiaandthepacific/australia/4803327/Australian-bush-fires-Dozens-of-people-still-unaccounted-for.html.

29. Foley, M., “Australia Police Confirm Arson Role in Wildfires,” New York Times, February 10, 2009, http://www.nytimes.com/2009/02/10/world/asia/10australia.html.

30. Foley, M., “Fires and climate change prompt soul-searching in Australia,” International Herald Tribune, February 16, 2009, reprinted at http://www.ecoearth.info/shared/reader/welcome.aspx?linkid=118796&keybold=climate%20AND%20%20change%20AND%20%20increased%20AND%20%20wildfires.

31. Packham, D., “Victoria bushfires stoked by green vote,” Australian, February 10, 2009, http://www.theaustralian.news.com.au/story/0,25197,25031389-7583,00.html.

32. McGuirk, R., “Australia debates controlled burns,” Associated Press, February 12, 2009, reprinted on Newsvine, http://www.newsvine.com/_news/2009/02/12/2430370-australia-debates-controlled-burns.

33. Petrie, A., “Angry survivors blame council ‘green’ policy,” The Age, February 11, 2009, http://www.theage.com.au/national/angry-survivors-blame-council-green-policy-20090211-83p0.html.

34. Baker, R. and McKenzie, N., “Fined for illegal clearing, family now feel vindicated,” Age, February 12, 2009, http://www.theage.com.au/national/fined-for-illegal-clearing-family-now-feel-vindicated-20090211-84sw.html.

35. Romm, J., “An introduction to global warming impacts: Hell and High Water,” March 22, 2009, http://climateprogress.org/2009/03/22/an-introduction-to-global-warming-impacts-hell-and-high-water.

36. Eilperin, J., “Gore begins huge public campaign to go green,” Seattle Times, March 31, 2008, http://seattletimes.nwsource.com/html/nationworld/2004316880_gore31.html.

37. See, for example, Energy Information Administration, “International Energy Annual 2006,” released June–December 2008, http://www.eia.doe.gov/iea/overview.html.

38. Gore, A., “A Generational Challenge to Repower America,” speech delivered July 17, 2008, http://www.wecansolveit.org/pages/al_gore_a_generational_challenge_to_repower_america.

39. Smil, V., “Moore’s Curse and the Great Energy Delusion,” American, November 19, 2008, http://www.american.com/archive/2008/november-december-magazine/moore2019s-curse-and-the-great-energy-delusion.

40. “Rube Goldberg, 1883–1970, U.S. cartoonist, whose work often depicts deviously complex and impractical inventions,” http://www.dictionary.com; see also http://www.rubegoldberg.com.

41. Rand, A., Capitalism: The Unknown Ideal, Centennial Edition (New York: Signet, 1986), viii.

Why Is Ayn Rand Still Relevant: Atlas Shrugged and Today’s World

by Yaron Brook and Don Watkins | August 10, 2009 | CNBC

Those who haven’t yet picked up Ayn Rand’s 1957 classic novel Atlas Shrugged may be wondering why so many people are invoking the book in discussions of today’s events.

Well, the short answer is: because today’s world is strikingly similar to the world of Atlas Shrugged.

Consider the government’s affordable housing crusade, in which lenders were forced to make loans to subprime borrowers who allegedly “needed” to own homes.

“We must not let vulgar difficulties obstruct our feeling that it’s a noble plan motivated solely by the public welfare. It’s for the good of the people. The people need it. Need comes first…”

Those might sound like the words of Barney Frank, but in fact they belong to Eugene Lawson, a banker in Atlas Shrugged who went bankrupt giving loans to people on the basis of their “need” rather than their ability to repay. In the quoted scene, Lawson is urging his politically powerful friends to pass a law restricting economic freedom for the “public good” — long-range consequences be damned.

Or consider this cry from Atlas Shrugged villain Wesley Mouch, head of the “Bureau of Economic Planning and National Resources”:

“Freedom has been given a chance and failed. Therefore, more stringent controls are necessary. . . . I need wider powers!”

This mirrors the incessant claims by today’s politicians and bureaucrats that all our problems would disappear if only they had more power. They tell us that health care is expensive and ineffective — not because the government has its tentacles in every part of it and forces us to pay for other people’s unlimited medical-care wants and needs — but because there is no bureaucrat forcing us to buy insurance and dictating which tests and treatments are “necessary.” They tell us that American auto companies failed to compete — not because they were hamstrung by pro-union laws and fuel efficiency standards — but because there was no government auto czar. They tell us that we are reeling from a financial crisis — not as a result of massive, decades-long government intrusion in the financial and housing markets — but because the intrusion wasn’t big enough; we didn’t have a single, all-powerful “systemic risk” regulator.

Atlas Shrugged shows us an all-too-familiar pattern: Washington do-gooders blaming the problems they’ve created on the free market, and using them as a pretext for expanding their power. And more: it provides the fundamental explanation for why the government gets away with continually increasing its control over the economy and our lives. The explanation, according to Atlas, is to be found in the moral precepts we’ve heard all our lives.

From the time we’re young we are taught that the essence of morality is to sacrifice one’s own interests for the sake of others, and that to focus on one’s own interests is immoral and destructive. As a result, we want the government to protect us from doctors and businessmen out for their own profit. We want the government to redistribute wealth from the successful to the unsuccessful. We want the government to ensure that those in need are given “free” health care, cheap housing, guaranteed retirement pay and a job they can never lose. We want the government to take these and many other anti-freedom measures because virtually everyone today believes that they are moral imperatives.

This view of morality, Atlas argues, inevitably leads to the disappearance of freedom.

A free society is one in which the individual’s life belongs to him, where he can pursue his own happiness without interference by others. That is incompatible with the view that morally his life belongs to others. So long as you accept that self-sacrifice for the needs of others is good, you will not be able to defend a capitalist system that enshrines and protects individual freedom and the profit motive.

The only way to stop the growth of the state and return to the Founding Fathers’ ideal of limited government is to recognize that individuals not only have a political right to pursue their own happiness, but a moral right to pursue their own happiness. This is what Ayn Rand called a morality of rational self-interest. It is a selfishness that consists, not of doing whatever you feel like, but of using your mind to discover what will truly make you happy and successful. It is a selfishness that consists, not of sacrificing others in the manner of a Bernie Madoff, but of producing the values your life requires and dealing with others through mutually advantageous, voluntary trade.

It’s no accident that, at the very instant Washington is extending its grip over our lives, Atlas Shrugged is selling faster than ever before. Americans sense that Atlas has something important to say about this frightening trend. It does. If you want to understand the ideas undermining American liberty — and the ideas that could foster it once again — read Atlas Shrugged.


Justice Holmes and the Empty Constitution

by Tom Bowden | Summer 2009 | The Objective Standard

On April 17, 1905, Justice Oliver Wendell Holmes Jr. issued his dissenting opinion in the case of Lochner v. New York.1

At a mere 617 words, the dissent was dwarfed by the 9,000 words it took for the Supreme Court’s eight other Justices to present their own opinions. But none of this bothered Holmes, who prided himself on writing concisely. “The vulgar hardly will believe an opinion important unless it is padded like a militia brigadier general,” he once wrote to a friend. “You know my view on that theme. The little snakes are the poisonous ones.”2

Of the many “little snakes” that would slither from Justice Holmes’s pen during his thirty years on the Supreme Court, the biting, eloquent dissent in Lochner carried perhaps the most powerful venom. A dissent is a judicial opinion in which a judge explains his disagreement with the other judges whose majority votes control a case’s outcome. As one jurist put it, a dissent “is an appeal . . . to the intelligence of a future day, when a later decision may possibly correct the error into which the dissenting judge believes the court to have been betrayed.”3 Holmes’s Lochner dissent, though little noticed at first, soon attained celebrity status and eventually became an icon. Scholars have called it “the greatest judicial opinion of the last hundred years” and “a major turning point in American constitutional jurisprudence.”4 Today, his dissent not only exerts strong influence over constitutional interpretation and the terms of public debate, but it also serves as a litmus test for discerning a judge’s fundamental view of the United States Constitution. This means that any Supreme Court nominee who dares to question Holmes’s wisdom invites a fierce confirmation battle and risks Senate rejection. As one observer recently remarked, “The ghost of Lochner continues to haunt American constitutional law.”5

Holmes’s dissent in Lochner blasted the majority opinion endorsed by five members of the nine-man Court. Holmes, as if anticipating the modern era of “sound bites,” littered his dissent with pithy, quotable nuggets that seemed to render the truth of his opinions transparently obvious. Prominent scholars have called the dissent a “rhetorical masterpiece” that “contains some of the most lauded language in legal history.”6 His “appeal to the intelligence of a future day” was a stunning success. So thoroughly did Holmes flay the majority’s reasoning that Ronald Dworkin, a prominent modern legal philosopher, dismisses the majority decision as an “infamous . . . example of bad constitutional adjudication” that gives off a “stench”; and Richard A. Posner, prolific author and federal appellate judge, writes that Lochner is the type of decision that “stinks in the nostrils of modern liberals and modern conservatives alike.”7

What heinous offense did the Lochner majority commit to provoke Holmes’s caustic dissent? It was not the fact that they had struck down a New York law setting maximum working hours for bakers. Holmes personally disapproved of such paternalistic laws and never questioned the Supreme Court’s power to strike down legislation that violated some particular clause in the Constitution.8 No, in Holmes’s eyes the majority’s unforgivable sin did not lie in the particular result they reached, but in the method by which they reached it. The majority interpreted the Constitution as if it embodies a principled commitment to protecting individual liberty. But no such foundational principle exists, Holmes asserted, and the sooner judges realize they are expounding an empty Constitution — empty of any underlying view on the relationship of the individual to the state — the sooner they will step aside and allow legislators to decide the fate of individuals such as Joseph Lochner.

Lochner, a bakery owner whose criminal conviction sparked one of the Supreme Court’s most significant cases, never denied he had violated the New York Bakeshop Act of 1895. Instead, he contended that the statute itself was unconstitutional. The majority agreed with Lochner, and Holmes was moved to dissent — for reasons that are best understood against the background of Progressive Era reform.

The New York Bakeshop Act of 1895

The first decade of the twentieth century was a time of rapid economic and population growth in America. European immigrants streamed into the cities, searching for the upward economic and cultural mobility that defined the American Dream. Of course, they all needed to eat, and the baking industry was one of many that expanded rapidly to meet demand. From the growth pangs of that industry came the legal dispute that eventually took the form of Lochner v. New York.

The great, mechanized bakeries that today produce mass quantities of baked goods had not yet been organized. What few machines had been invented (such as the mechanical mixer, patented in 1880) were not widely owned.9 Thus three-quarters of America’s bread was baked at home, mostly in rural areas.10 But in the fast-growing cities, many people lived in tenement apartments that lacked an oven for home baking. Bread was baked here as it had been in urban environments for centuries, as it had been in ancient Rome — in commercial ovens scattered about the city. Consumers could walk a short distance and buy what they would promptly eat before it went stale (the first plastic wrap, cellophane, was not manufactured in America until 1924).11 In New York City, bakeries were often housed in tenement basements whose solid earth floors could support the heavy ovens.

From the great Midwestern farms came massive railroad shipments of flour, which was packaged and distributed by wagons and trucks to each bakery’s storeroom. Laborers were needed to unload bags and barrels that weighed as much as two hundred pounds; sift the flour and yeast; mix the flour with ingredients in great bowls, troughs, and sifters; knead the dough; fire up the ovens; shove the loaves in and out of the ovens; and clean and maintain the tools and facilities.12 Most urban bakeshops employed four or fewer individuals to perform this work.13 Long hours were typical, as was true generally of labor at the turn of the century, on farms and in factories. Indeed, bakers worked even longer hours than other laborers. Ovens were heated day and night, and bakers worked while others were sleeping, so that customers could buy fresh bread in the morning.14 A baker’s workday might start in the late evening and end in the late morning or early afternoon of the next day.15 A typical workday exceeded 10 hours; workweeks often consumed 70 or 80 hours, and on occasion more than 100 hours.16

These bakeshops did not feature the clean, well-lit, well-ventilated working conditions that mechanization and centralization would later bring to the industry. Urban bakeshops shared dark, low-ceilinged basement space with sewage pipes. Dust and fumes accumulated for lack of ventilation. Bakeshops were damp and dirty, and facilities for washing were primitive.17 In order to entice people to work long hours in these conditions, shop owners had to offer wages high enough to persuade laborers to forgo other opportunities. A typical bakeshop employee would earn cash wages of as much as $12 per week.18 Despite harsh conditions, the mortality rate for bakers did not markedly exceed other occupations.19 And many who had escaped Europe to pursue upward mobility discovered that competing employers — when they could be found — offered nothing better.

No governmental or private coercion required anyone to take a bakery job within the state of New York. Labor contracts were voluntary, and terminable at will. The law left each individual — employer and employee alike — free to make his own decisions, based on his own judgment, and to negotiate whatever terms were offered. But such voluntary arrangements were not satisfactory to the New York legislature in these, the early years of what later became known as the Progressive Era. The hallmark of that political reform movement, which began in the 1890s and ended with World War I, was increased government intervention in the marketplace through such measures as railroad regulation, antitrust legislation, and income taxation. Progressive reformers focused special attention on housing and working conditions and advanced a variety of arguments that laws should limit hours of labor. Some said this would spread jobs and wealth among more people, eliminating unemployment. Others attacked the validity of labor contracts reached between bakeshop owners and laborers. According to one critic, “An empty stomach can make no contracts. [The workers] assent but they do not consent, they submit but they do not agree.”20

The Bakeshop Act of 1895, sponsored by a coalition of prominent powers in New York politics, passed both houses of the state legislature unanimously.21 The Act made it a crime for the owner of a bakeshop to allow a laborer to work more than 10 hours in one day, or more than 60 hours in one week. Bakeshop owners, however, were exempted; only employees’ hours were limited.22 Although similar laws in other states allowed employees to voluntarily opt out, New York’s law included no such “free-contract proviso.”23 The law also provided funds for hiring four deputies to seek out violations and enforce the law.24

New York v. Lochner: Crimes and Appeals

During the first three months after the Bakeshop Act took effect, 150 bakeries were inspected, of which 105 were charged with violations.25 In 1899, inspectors brought about the arrest of Joseph Lochner, a German immigrant whose shop, Lochner’s Home Bakery, was located upstate in Utica.26 Lochner had arrived in America at age 20 and worked for eight years as a laborer before opening his own shop. In contrast to the dreary basement bakeries that furnished the Bakeshop Act’s rationale, Lochner’s bakery (at least, as shown in a 1908 photograph) seems to have been a “relatively airy and mechanized aboveground shop.”27 In any event, Lochner was indicted, arraigned, tried, and convicted of having offended the statute in December 1899, by permitting an employee to work more than 60 hours in one week. To avoid a 20-day jail sentence, Lochner paid the $20 fine.28 Two years later, Lochner was arrested again, for having allowed another employee to work more than 60 hours.29 (Not coincidentally, Lochner had been quarreling for many years with the Utica branch of the journeyman bakers’ union, an avid supporter of the maximum hours regulation.)30 Offering no defense at his 1902 trial, Lochner was sentenced to pay $50, or serve 50 days in jail. This time, however, instead of paying the fine, he appealed his conviction.31 Lochner seems to have been a “hardheaded man who had determined that no one else was going to tell him how to run his business — not the state of New York and especially not the workers or their union.”32

The first New York appellate court to consider Lochner’s case held that the parties’ right to make employment contracts was subordinate to the public’s power to promote health. The court treated the Bakeshop Act as a health law, assuming (without factual findings from the trial court) that working long hours in hot, ill-ventilated areas, with flour dust in the air, “might produce a diseased condition of the human system, so that the employees would not be capable of doing their work well and supplying the public with wholesome food.”33 Rejecting Lochner’s argument that his contract rights were being violated, the court observed that “the statute does not prohibit any right, but regulates it, and there is a wide difference between regulation and prohibition, between prescribing the terms by which the right may be enjoyed, and the denial of that right altogether.”34 In other words, a right is not violated unless it is annihilated.

The next New York appellate court to consider Lochner’s case also treated the Bakeshop Act as a health law that trumped the parties’ right to make labor contracts. The court pointed out that the statute regulated not only bakers’ working hours but a bakeshop’s drainage, plumbing, furniture, utensils, cleaning, washrooms, sleeping places, ventilation, flooring, whitewashing, and walls, even to the point that the factory inspector “may also require the wood work of such walls to be painted.”35 Given the Act’s close attention to such health-related details, the court thought it “reasonable to assume . . . that a man is more likely to be careful and cleanly when well, and not overworked, than when exhausted by fatigue, which makes for careless and slovenly habits, and tends to dirt and disease.”36

New York’s power to regulate for health reasons was grounded, the court held, in the “police power” that state governments possess as part of their sovereignty. While noting the “impossibility of setting the bounds of the police power,” the court held that the Bakeshop Act’s purpose “is to benefit the public; that it has a just and reasonable relation to the public welfare, and hence is within the police power possessed by the Legislature.”37 According to a then-prominent legal treatise cited by the court, the Act’s maximum hours provision was especially necessary to safeguard health against the supposedly mind-muddling effects of capitalism:

If the law did not interfere, the feverish, intense desire to acquire wealth . . . inciting a relentless rivalry and competition, would ultimately prevent, not only the wage-earners, but likewise the capitalists and employers themselves, from yielding to the warnings of nature and obeying the instinct of self-preservation by resting periodically from labor.38

In a concurring opinion, another judge warned that to invalidate the law would “nullify the will of the people.”39

In dissent, however, Judge Denis O’Brien urged that the Bakeshop Act be struck down as unconstitutional. He, too, acknowledged the long-established understanding that the police power authorizes legislation “for the protection of health, morals, or good order,” but he did not believe that the maximum hours provision served any such purpose.40 Instead, he urged that this portion of the law be voided as an unjustified infringement on individual liberty:

Liberty, in its broad sense, means the right, not only of freedom from actual restraint of the person, but the right of such use of his faculties in all lawful ways, to live and work where he will, to earn his livelihood in any lawful calling, and to pursue any lawful trade or avocation. All laws, therefore, which impair or trammel those rights or restrict his freedom of action, or his choice of methods in the transaction of his lawful business, are infringements upon his fundamental right of liberty, and are void.41

In so dissenting, Judge O’Brien was following leads supplied by Supreme Court Justices as to how the Constitution should be interpreted. Justice Stephen Field, dissenting in the Slaughter-House Cases of 1873, had argued that a state monopoly on slaughterhouse work violated the “right to pursue one of the ordinary trades or callings of life.”42 And in Allgeyer v. Louisiana, an 1897 case, the Supreme Court had actually struck down a Louisiana insurance law, holding that the Constitution’s references to “liberty” not only protect “the right of the citizen to be free from the mere physical restraint of his person, as by incarceration” but also “embrace the right of the citizen to be free in the enjoyment of all his faculties . . . to pursue any livelihood or avocation; and for that purpose to enter into all contracts which may be proper.”43

As Joseph Lochner pondered his next step, he found cause for hope in the fact that his conviction had been upheld by the narrowest possible margins (3–2 and 4–3) in New York’s appellate courts. The conflict between “liberty of contract” and the “police power,” like a seesaw teetering near equilibrium, seemed capable of tipping in either direction. Sensing that victory was attainable, Lochner took his fight to the highest court in the land.

Lochner v. New York: The Supreme Court’s Decision

When Lochner’s petition arrived at the Supreme Court, it was accepted for review by Justice Rufus Peckham, a noted opponent of state regulation and author of the Court’s Allgeyer opinion.44 The case was argued over two days in February 1905.45 At first the Court voted 5–4 in private conference to uphold Lochner’s conviction. But then Justice Peckham wrote a sharp dissent that convinced another Justice to change his mind. With a little editing, Peckham’s dissent then became the majority’s official opinion declaring the Bakeshop Act unconstitutional.46

Early in his opinion, Peckham conceded that all individual liberty is constitutionally subordinate to the amorphous “police power”:

There are . . . certain powers, existing in the sovereignty of each state in the Union, somewhat vaguely termed police powers, the exact description and limitation of which have not been attempted by the courts. Those powers, broadly stated, and without, at present, any attempt at a more specific limitation, relate to the safety, health, morals, and general welfare of the public. Both property and liberty are held on such reasonable conditions as may be imposed by the governing power of the state in the exercise of those powers. . . .47

Thus Peckham had to admit that the bulk of the Bakeshop Act, being directed at health hazards curable by better plumbing and ventilation, was valid under the police power. But the Act’s maximum hours provision, Peckham wrote, was not really a health law, because it lacked any “fair ground, reasonable in and of itself, to say that there is material danger to the public health, or to the health of the employees, if the hours of labor are not curtailed.”48

So if the maximum hours provision was not a health law, what was it? In the majority’s view it was a “labor law,” designed to benefit one economic class at another’s expense.49 “It seems to us,” Peckham wrote, “that the real object and purpose were simply to regulate the hours of labor between the master and his employees . . . in a private business, not dangerous in any degree to morals, or in any real and substantial degree to the health of the employees.”50 Finding that the “statute necessarily interferes with the right of contract between the employer and employees,” Peckham concluded that laws such as this, “limiting the hours in which grown and intelligent men may labor to earn their living, are mere meddlesome interferences with the rights of the individual. . . .”51 Four Justices sided with Peckham in holding that the “limit of the police power has been reached and passed in this case,” yielding a five-man majority to strike down the maximum hours portion of the New York Bakeshop Act.52 (Three Justices, not including Holmes, dissented on grounds that the law really was a health measure and therefore valid under the police power.)

At this point — that is, before taking Holmes’s dissent into account — opinions on the Bakeshop Act’s validity had been expressed by some 20 appellate judges (12 in New York, and 8 on the Supreme Court). Remarkably, these 20 had split evenly: Ten thought the Act a legitimate exercise of the police power, while ten thought it exceeded that power.53 This is the kind of split opinion one might expect from a jury that has been asked to decide a close question of fact, such as whether the noise from a woodworking shop is loud enough to be classified as an illegal nuisance. In Lochner’s case, a score of highly experienced judges split down the middle while engaged in what they saw as a similar task, namely deciding whether a provision restricting work hours was or was not a health law.

Justice Holmes, by radically reframing the issue over which his brethren had been agonizing, sought to show how this thorny problem could be made to disappear. In essence he asked a much more fundamental question: What if the Constitution contains no limit on the police power? What if the distinction between “health laws” and other types of law is just a red herring? In raising this issue, Holmes was banking on the fact that nobody — not even the five-man Lochner majority — regarded “liberty of contract” as an ironclad principle or claimed to know the precise nature of the states’ constitutional “police powers.” Before he was through, Holmes would call into question not only the majority’s decision to invalidate the Bakeshop Act but the very idea that the United States Constitution embodies principles relevant to such decisions.

Holmes in Dissent: The Empty Constitution

Uninterested in whether or not the Bakeshop Act was a health law, Holmes devoted only a single line of his dissent to the issue: “A reasonable man might think it a proper measure on the score of health.”54 As one commentator noted, he “entirely ignored his colleagues and refused to engage in their debate about how to apply existing legal tests for distinguishing health and safety laws from special interest legislation.”55 Holmes, who has been called “the finest philosophical mind in the history of judging,” had more profound issues on his mind.56

Peckham’s majority opinion had been based on the premise that the Constitution protects individual liberty, including liberty of contract. Holmes attacked that premise outright. How could liberty of contract possibly be a principle capable of yielding a decision in Lochner’s case, Holmes asked, when violations of such liberty are routinely permitted by law? “The liberty of the citizen to do as he likes so long as he does not interfere with the liberty of others to do the same,” Holmes observed, “is interfered with by school laws, by the Post Office, by every state or municipal institution which takes his money for purposes thought desirable, whether he likes it or not.” For good measure, he cited several cases in which the Court had recently approved laws prohibiting lotteries, doing business on Sunday, engaging in usury, selling stock on margin, and employing underground miners more than eight hours a day — each law a clear interference with contractual liberty. “General propositions do not decide concrete cases,” Holmes nonchalantly concluded — and what judge could have shown otherwise, given the state of American jurisprudence at the time?

With “liberty of contract” in tatters, Holmes could casually dismiss it as a mere “shibboleth,” a subjective opinion harbored by five Justices that has no proper role in constitutional adjudication.57 To drive home his contempt for the majority’s approach, Holmes included in his Lochner dissent a snide, sarcastic gem that has become the most quoted sentence in this much-quoted opinion: “The Fourteenth Amendment does not enact Mr. Herbert Spencer’s Social Statics.”58 For a modern reader to grasp the meaning of this reference, some factual background is required. The English author Herbert Spencer (1820–1903) was a prominent intellectual whose most important book, Social Statics, was originally published in 1853 and reissued continually thereafter. “In the three decades after the Civil War,” one historian has written, “it was impossible to be active in any field of intellectual work without mastering Spencer.”59 Central to Spencer’s thinking was a belief that our emotions dictate our moral values, which include an “instinct of personal rights.”60 That “instinct” Spencer defined as a “feeling that leads him to claim as great a share of natural privilege as is claimed by others — a feeling that leads him to repel anything like an encroachment upon what he thinks his sphere of original freedom.”61 This led Spencer to conclude: “Every man has freedom to do all that he wills, provided he infringes not the equal freedom of any other man.”62 Holmes, by coyly denying that Spencer’s “law of equal liberty” had the solemn status of a constitutional principle, masterfully conveyed two points: that any principle of individual liberty must emanate from a source outside the Constitution, not within it — and that the Peckham majority’s “liberty of contract” had the same intellectual status as Spencer’s emotionalist rubbish. “All my life I have sneered at the natural rights of man,” Holmes confided to a friend some years later.63 But in a lifetime of sneering, Holmes never uttered a more damaging slur than this offhand reference to Herbert Spencer’s Social Statics.

In order to mock “liberty of contract” as nothing more than a reflection of the majority’s tastes in popular reading, Holmes had to evade large swaths of evidence tending to show that the Constitution indeed embodies a substantive commitment to individual liberty. In the Declaration of Independence, the Founders clearly stated their intent to create a government with a single purpose — the protection of individual rights to life, liberty, and the pursuit of happiness. Consistent with the Constitution’s Preamble, which declares a desire to “secure the blessings of liberty to ourselves and our posterity,” every clause in the Bill of Rights imposes a strict limit on government’s power over individual liberty and property. In addition, Article I forbids the states to pass any law “impairing the obligation of contracts.”64 And to prevent future generations from interpreting such clauses as an exhaustive list, the Ninth Amendment states: “The enumeration in the Constitution, of certain rights, shall not be construed to deny or disparage others retained by the people.”

To be sure, the Constitution’s basic principle was undercut by important omissions and contradictions, the most serious being its toleration of slavery at the state level. But the Civil War tragically and unmistakably exposed the evil of a legal system that allows state governments to violate individual rights.65 Immediately after that war’s end, three constitutional amendments re-defined and strengthened the federal system, elevating the federal government to full sovereignty over the states and extending federal protection to individuals whose rights might be violated by state legislation. Two of these amendments were quite specific: The Thirteenth banned slavery, and the Fifteenth required that blacks be allowed to vote. But the Fourteenth Amendment’s reach was much broader. Not only did it endow individuals with federal citizenship, it also specified that no state government shall “abridge the privileges or immunities”66 of any citizen or deprive any person of “life, liberty, or property, without due process of law.”

In light of this context, no honest jurist in 1905 could deny that the Constitution embodies certain views on the proper relationship between the individual and his government. Reasonable disagreements might concern how that basic framework should guide interpretation of the document’s express language, but no such disagreement could obscure the fact that the Constitution was chock-full of substantive content. Yet it was precisely this fact that Holmes now urged the Court to evade. The same compromises and exceptions that rendered “liberty of contract” an easy target in Holmes’s attack on the Lochner majority also lent plausibility to his wider assault on the notion that America’s Constitution embodies any principles at all. A constitution, he wrote, “is not intended to embody a particular economic theory, whether of paternalism and the organic relation of the citizen to the State or of laissez faire.” As is evident from the two illustrations he chose, Holmes was using “economic theory” to mean a principle defining the individual’s relationship to the state. His first example, “paternalism and the organic relation of the citizen to the State,” refers to the Hegelian view that a nation, in one philosopher’s description, “is not an association of autonomous individuals [but] is itself an individual, a mystic ‘person’ that swallows up the citizens and transcends them, an independent, self-sustaining organism, made of human beings, with a will and purpose of its own.”67 Thus, as Hegel wrote, “If the state claims life, the individual must surrender it.”68 Holmes’s second example, “laissez faire,” refers to unregulated capitalism, a social system in which a nation is an association of autonomous individuals, who appoint government as their agent for defending individual rights (including private property rights) against force and fraud.

In Holmes’s view, a constitution cannot and should not attempt to embody either of these theories, or indeed any particular view on the individual’s relation to the state. Rather, a constitution is “made for people of fundamentally differing views,” any one of which may rightfully gain ascendancy if its adherents compose a sufficiently influential fraction of the electorate. As Holmes put it: “Every opinion tends to become a law,” and the reshaping of law is the “natural outcome of a dominant opinion.”69 In other words, a nation made up of capitalists, socialists, communists, anarchists, Quakers, Muslims, atheists, and a hundred other persuasions cannot reasonably expect its constitution to elevate one political view above all the others. Because opinions vary so widely, a nation that deems one superior to all others risks being torn apart by internal dissensions unable to find outlets in the political process. On this view, a proper constitution averts disaster by providing an orderly mechanism for embodying in law the constantly shifting, subjective opinions of political majorities. As one commentator explained, “Holmes believed that the law of the English-speaking peoples was an experiment in peaceful evolution in which a fair hearing in court substituted for the violent combat of more primitive societies.”70 It did not trouble Holmes that under such a constitution, society might adopt “tyrannical” laws. As he once wrote to a friend, “If my fellow citizens want to go to Hell I will help them. It’s my job.”71 And so Holmes was able to conclude, in his Lochner dissent, “that the word liberty in the Fourteenth Amendment is perverted when it is held to prevent the natural outcome of a dominant opinion.”

So there you have it. In just 617 carefully chosen words, the framework of liberty erected by the Founding Fathers and buttressed by the Civil War amendments had been interpreted out of existence.

According to Holmes, judges who claim to find fundamental principles in the Constitution are merely giving vent to their own personal political beliefs, which make some laws seem “natural and familiar” and others “novel, and even shocking.” But either reaction, in his view, is an “accident” having no proper place in adjudication. A judge’s “agreement or disagreement has nothing to do with the right of a majority to embody their opinions in law,” Holmes wrote, no matter what the judge’s reasons. “Some of these laws embody convictions or prejudices which judges are likely to share,” said Holmes. “Some may not.”72 Thus, it makes no difference whether a judge holds a conviction based on careful reflection, and an understanding of the Constitution’s specific clauses and content, its history and mission — or merely harbors a prejudice based on upbringing, social class, or a desire to please those in power. All such views are personal to the judge and hence irrelevant in adjudication — an interpretive principle to which Holmes made no exception for himself. “This case is decided upon an economic theory which a large part of the country does not entertain,” Holmes wrote in Lochner. “If it were a question whether I agreed with that theory, I should desire to study it further and long before making up my mind. But I do not conceive that to be my duty. . . .”

In short, Holmes believed that the Supreme Court presides over an empty Constitution — empty of purpose, of moral content, of enduring meaning — bereft of any embedded principles defining the relationship between man and the state. This distinctively Holmesian view, novel in 1905, is today’s orthodoxy. It dominates constitutional interpretation, defines public debate, and furnishes a litmus test for evaluating nominees to the Supreme Court. Although judges sometimes close their eyes to its logical implications when their pet causes are endangered, Holmes’s basic argument remains unrefuted by the legal establishment. In his bleak universe, there exists no principled limit on government power, no permanent institutional barrier between ourselves and tyranny — and the government can dispose of the individual as it pleases, as long as procedural niceties are observed. This pernicious Holmesian influence is reflected in the declining stature of America’s judiciary.

Lochner’s Legacy: Empty Robes

Although the Lochner decision was influential for a time, it was ultimately overshadowed by Holmes’s dissent. During the 32-year period (1905–1937) known as “the Lochner era,” the Supreme Court occasionally emulated the Lochner majority by striking down state laws in the name of individual liberty.73 For example, the Court overrode laws setting minimum wages for women, banning the teaching of foreign languages to children, and requiring children to attend public schools.74 But then, in 1937, at the height of the New Deal, the Court “finally ended the Lochner era by upholding a state minimum wage law.”75 A year later, the Court announced that all economic intervention would be presumed valid, unless a “specific prohibition of the Constitution” (for instance, Article I’s ban on export taxes at the state level) said otherwise.76 In effect, any new exercise of government power over the economy was now presumed innocent until proven guilty. As the Supreme Court said in another New Deal case, “A state is free to adopt whatever economic policy may reasonably be deemed to promote the public welfare,” and the “courts are without authority . . . to override it.”77 One scholar summarized the sea change this way: “When the New Deal Court repudiated Lochner after 1937, it was repudiating market freedom as an ultimate constitutional value, and declaring that, henceforth, economic regulation would be treated as a utilitarian question of social engineering.”78 The Lochner majority was last cited approvingly by the Supreme Court in 1941.79

Holmes’s dissent was instrumental in consigning the Lochner decision to legal hell. According to liberal Justice Felix Frankfurter, the dissent was “the turning point” in a struggle against “the unconscious identification of personal views with constitutional sanction.”80 Echoing Holmes, conservative theorist Robert Bork has reviled Lochner as a “notorious” decision that enforced “an individual liberty that is nowhere to be found in the Constitution itself.”81 Added Bork: “To this day, when a judge simply makes up the Constitution he is said ‘to Lochnerize.’ . . .”82 Other commentators agree: “Supreme Court justices consistently use Lochner as an epithet to hurl at their colleagues when they disapprove of a decision declaring a law unconstitutional.”83 “We speak of ‘lochnerizing’ when we wish to imply that judges substitute their policy preferences for those of the legislature.”84 Typical of modern attitudes are the Washington Post’s reference to the “discredited Lochner era”85 and the New York Times’s observation that the era “is considered one of the court’s darkest.”86

With the canonization of Holmes’s Lochner dissent, a miasma of judicial timidity seeped into America’s courtrooms. More than sixty years have elapsed since the Supreme Court last struck down an economic regulation on grounds that it violated unenumerated property or contract rights. And in the noneconomic realm, the Court’s Lochner-esque decision in Roe v. Wade (1973) generated fierce public and professional backlash, discouraging further forays of that type. In Roe, a decision “widely regarded as the second coming of Lochner,” a sharply divided Court held that the Constitution protects a woman’s right to abort her first-trimester fetus.87 Here, one must carefully distinguish the method of that Court’s decision from its specific content. Because the Constitution does not expressly authorize states to ban abortion, the Court was entitled to evaluate the law’s validity in light of the Constitution’s fundamental commitment to protecting individual liberty (including that of women, regardless of any errors the Founders may have made on that score). One can agree with that liberty-oriented approach and yet still acknowledge the Court’s failure to apply it persuasively. (Essentially, the Roe Court recited a grab bag of pro-liberty clauses and precedents and invited the reader to choose a favorite.)88

Predictably, however, conservatives have aimed their critical arrows — dipped in the venom of Holmes’s dissent — straight at Roe v. Wade’s conclusion that the Constitution protects individual liberty. Those arrows struck home. A large segment of the public now believes that any such holding, no matter how firmly grounded in the Constitution’s language and history, is merely rhetorical camouflage for judges’ assumption of extra-constitutional power to impose their own personal opinions on the law.89 Little wonder that recurring public protests and even death threats have dogged the Court ever since. Fear of similar backlash has hindered the administration of justice in other areas as well. For example, the Court needed seventeen years of hand-wringing to finally decide, in Lawrence v. Texas (2003), that the Constitution does not permit gays to be thrown in jail for private, consensual sex.90 Dissenting in that case, Justice Scalia referenced Lochner obliquely, asserting that the Constitution no more protects homosexual sodomy than it does the right to work “more than 60 hours per week in a bakery.”91

Notwithstanding occasional hard-won exceptions, the emasculated Supreme Court now spurns virtually every opportunity to search the Constitution for underlying principles that place limits on state power. A few years ago, when Susette Kelo’s house was seized under the eminent domain power for transfer to a private developer in Connecticut, she took her case to the Supreme Court — only to be told that the Constitution offers her no protection.92 Abigail Burroughs, terminally ill with neck and head cancer, died several years before the Court disdainfully turned its back on her survivors’ plea for a constitutional right to use experimental life-saving medicine unapproved by the Food and Drug Administration.93 And Dr. Harold Glucksberg, a physician whose terminally ill patient sought a painless suicide, lost his case on the grounds that offering voluntary medical assistance at the end of life is not “deeply rooted in this Nation’s history and tradition.”94 Cases such as these have made it painfully clear to Americans that their Constitution — as interpreted by the modern Supreme Court — imposes no principled limits on the state’s power to dispose of their property and lives. If more proof is necessary, observe that both the Bush and Obama administrations, in recent highly publicized legislation, have dramatically expanded government control of the economy and of private businesses without any discernible worry that the Supreme Court will trouble itself over the rampant abrogation of private property and contract rights.

Lochner’s Other Legacy: An Empty Debate

By arguing that the Constitution is nothing but a highly formalized mechanism for molding subjective opinions into law, Holmes shifted the terms of public debate toward discussion of whose subjective opinions count. Beginning in the 1980s, conservatives such as Edwin Meese III, the U.S. attorney general under Ronald Reagan, and Robert Bork, federal judge and failed Supreme Court nominee, successfully framed the alternatives for constitutional interpretation in Lochnerian terms. According to this view, judges have only two options: to emulate the majority in Lochner by brazenly enforcing their own subjective opinions — or to emulate Holmes in dissent by deferring to the subjective opinions of society (as manifested by legislative vote). In today’s parlance, this means judges must choose between “judicial activism” and “judicial restraint.”95 On this basis, Holmesian conservatives routinely condemn Lochner v. New York, Roe v. Wade, Lawrence v. Texas, and similar cases as illegitimate exercises of raw judicial power, “activist” decisions unauthorized by the Constitution and dangerous to the body politic. According to Bork, Lochner “lives in the law as the symbol, indeed the quintessence, of judicial usurpation of power.”96

Today’s liberals generally find themselves on the defensive against such conservative attacks. On the liberal view, a mechanically applied doctrine of “judicial restraint” would improperly tie judges’ hands, allowing legislative majorities unrestrained power to enact any law not expressly forbidden by the Constitution. As Judge Posner has observed, “This would mean that a state could require everyone to marry, or to have sexual intercourse at least once a month, or that it could take away every couple’s second child and place it in a foster home.”97 But as an alternative to the folly of such “judicial restraint,” liberals offer dubious interpretive methods of their own. Rather than refute Holmes’s attack on the Lochner majority, liberals contend that the Constitution “must draw its meaning from the evolving standards of decency that mark the progress of a maturing society.”98 Or, as Al Gore pledged during his 2000 presidential run, “I would look for justices of the Supreme Court who understand that our Constitution is a living and breathing document, that it was intended by our founders to be interpreted in the light of the constantly evolving experience of the American people.”99

In sum, neither conservatives nor liberals have advanced a method of interpretation aimed at objectively identifying and applying constitutional principles that limit the power of government over the individual. Instead, both factions accept the Holmesian model that makes all government action a matter of subjective social opinions. Although the factions differ in detail — conservatives are more likely to venerate freeze-dried opinions from centuries past, whereas liberals prefer a bubbling stew of modern sentiments — the current controversy is nothing but Lochner warmed over. As one legal history states more formally, “The majority and dissenting opinions in Lochner stand today as landmarks in the literature of judicial activism and restraint.”100 So long as Lochner sets the terms of debate, Americans will continue to believe they face a Hobson’s choice between judicial eunuchs who passively allow legislatures to dominate a helpless populace — and judicial dictators who actively impose their own personal prejudices on that same helpless populace. Given those alternatives, it is no wonder that Holmesian conservatives are winning the public debate. Any citizen who wants to have some slight influence on the “dominant opinion” will more likely prefer an all-powerful legislature beholden to the voting public, as against an all-powerful, life-tenured judiciary beholden to no one.

In recent decades, the bellwether of this struggle between “activism” and “restraint” has been Roe v. Wade — and so it will continue, until that fragile decision is either overruled or placed on a sound constitutional basis.101 For many years now, the addition of a single conservative Justice would have been enough to tip the balance against Roe. If that decision is finally overruled on Holmesian grounds, then the last ragged vestiges of a principled, content-filled Constitution will have succumbed. After that, it may become virtually impossible to hear the voices of the Constitution’s framers above the clamor of pressure groups competing to forge the next “dominant opinion.” Ultimately, the outcome may depend on whether dissenters from the Holmesian consensus continue to be exposed and ostracized at the judicial nomination stage, by means of the Lochner litmus test.

The Lochner Litmus Test

During his lifetime, Holmes took pleasure from the prospect that his work would have enduring influence after his death. He once spoke, with characteristic eloquence, of feeling

the secret isolated joy of the thinker, who knows that, a hundred years after he is dead and forgotten, men who never heard of him will be moving to the measure of his thought — the subtle rapture of a postponed power, which the world knows not because it has no external trappings, but which to his prophetic vision is more real than that which commands an army.102

And indeed, the world is still “moving to the measure of his thought.” Holmes’s dissent is largely responsible for the “modern near-consensus that unelected justices have no mandate ‘to impose a particular economic philosophy upon the Constitution.’”103 Notably, President Obama’s regulatory czar, Cass Sunstein, is a former constitutional law professor who wrote an article, “Lochner’s Legacy,” stating that “for more than a half-century, the most important of all defining cases has been Lochner v. New York.”104 In this post-Lochner world, it is not intellectually respectable to hold that the Constitution embodies any particular view of the relationship between the individual and the state. A judge who dares to suggest otherwise will inevitably be accused of resurrecting Lochner. And a judicial nominee who fails to pledge allegiance to Holmes’s empty Constitution may be grilled and required to recant, on pain of losing a confirmation vote.

Consider two examples. Clarence Thomas, before being nominated to the Supreme Court, had said in a speech that “the entire Constitution is a Bill of Rights; and economic rights are protected as much as any other rights.”105 When Thomas’s nomination reached the Senate, noted liberal constitutional scholar Laurence Tribe opposed confirmation in a New York Times op-ed that said: “Thomas would return the Court to the Lochner (1905) era — an era in which the Court was accused of sacrificing the health and safety of American workers at the altar of laissez-faire capitalism.”106 Thomas later went on the record as rejecting a return to the Lochner approach and endorsing the line of cases that discredited the majority opinion.107 The Senate then confirmed his appointment, but by a razor-thin margin (52–48). Similarly, in another confirmation fight fourteen years later, a young senator (and former law professor) named Barack Obama spoke out against the nomination of California appellate judge Janice Rogers Brown to the federal bench. It seems that Brown, in a public speech, had dared to disagree with Holmes, asserting that his “Lochner dissent has troubled me — has annoyed me — for a long time . . . because the framers did draft the Constitution with a surrounding sense of a particular polity in mind. . . .”108 Obama leaped to the attack: “For those who pay attention to legal argument, one of the things that is most troubling is Justice Brown’s approval of the Lochner era of the Supreme Court.”109 Predictably, Brown backtracked during her confirmation hearings, pledging that she would not really pursue a Lochner approach.110 She was then confirmed, narrowly, by a 56–43 vote.111

As President Obama and the Senate gear up to select a replacement for retiring Justice David Souter, the Lochner litmus test will once again serve as a powerful tool for identifying a nominee’s fundamental approach to construing the Constitution. The alternatives embodied in Lochner will be trotted out once again, and candidates will be invited to condemn the discredited majority approach and endorse the Holmesian view.

But what if the opinions set forth in Lochner do not exhaust the alternatives? What if judges can properly aspire to be, not petty despots or passive rubber stamps, but objective interpreters of a constitution by means of its fundamental principles? The question deserves attention, before the Supreme Court sinks into a timorous lassitude from which it cannot recover.

The Path Not Taken

Justice Holmes took advantage of clashing precedents to claim that the Constitution lacks all content, that the nation’s fundamental law is agnostic on the issue of man’s relation to the state. But Holmes was wrong about the empty Constitution. Not only is the document saturated with substantive content, but the deliberate disregard of that content inevitably left an interpretive vacuum where the Founders’ framework once stood, a vacuum that had to be filled by some other principle of man’s relation to the state. If the Lochner dissent was to be taken seriously, the individual had to be treated on principle as a rightless creature doomed to cringe before the “natural outcome” of society’s “dominant opinion,” and the Constitution had to be regarded on principle as an institutional juggernaut imposing society’s shifting, subjective opinions on recalcitrant individuals. Thus by intellectual sleight of hand, Holmes managed to radically redefine the Constitution’s content while presenting himself as the very soul of content-neutrality. And for more than a century now, we have been “moving to the measure of his thought,” following Holmes’s path into that shadowy, clamorous jungle where pressure groups struggle incessantly for the privilege of imposing their arbitrary “dominant opinions” on others, by force of law — while individuals are legally helpless to resist ever-growing assaults on their lives, liberties, and property. Only by retracing our steps and revisiting the Lochner decision with a different mind-set can we hope to find a clearer road.

The Lochner case arrived at the Supreme Court in the posture of a dispute over whether a restriction on working hours was a health law or not. But in his dissent Holmes highlighted a more fundamental issue: Does the Constitution protect the principle of liberty of contract? If so, then the government’s so-called police power is and must be severely limited — limited by the principle of the inalienable rights of the individual. But if a principle is a general truth that guides action in every case where it applies, brooking no exceptions, then surely neither the “police power” nor “liberty of contract,” as defined by the Court at that time, qualified as a genuine principle. The vague and undefinable “police power” gave society virtually unlimited control over the individual — yet even in Holmes’s view, that power was somehow subordinate to the equally vague “traditions of our people and our law.” On the other hand, “liberty of contract” supposedly protected an individual’s right to dispose of his labor and property — except in the dozens of situations where the police power could override it. How could a judge possibly know when to apply one and not the other? There was no objective basis for choosing.

Despite the lack of clear, consistent principles to govern cases such as Lochner’s, a Supreme Court Justice with Holmes’s penetrating philosophical skills could have explained, even in 1905, why both Holmes and the majority were erring in their approaches to Lochner’s case. That explanation would have had to begin with the realization that every constitution embodies some particular view of the individual’s relation to the state. Although Holmes was wrong to deny that the Constitution has content, the majority was also wrong in its interpretation of that content. On that score, it was surely preposterous for Justice Peckham to concede that individuals’ liberty and property are held in thrall to each state’s “vaguely termed police powers, the exact description and limitation of which have not been attempted by the courts.” After all, the term “police power” is not even mentioned in the Constitution, and nowhere does the document require that states be allowed to legislate for the “safety, health, morals and general welfare of the public,” a shapeless pile of verbiage that could excuse almost any law, regardless of content. Although it is true that the states in a federal system must be recognized as possessing power to enact and enforce laws, there was never any need to define that power in a way that threatened the Constitution’s underlying framework of protection for individual rights. Under a more objective concept of New York’s “police power,” therefore, the Court’s inquiry would have shifted to whether the Bakeshop Act protected Lochner’s rights or violated them.

As to what Lochner’s individual rights entailed, again the Constitution’s content could not properly be ignored. For example, the document’s references to the inviolable “obligation of contracts” (Article I), unenumerated rights “retained by the people” (Ninth Amendment), citizens’ inviolable “privileges or immunities” (Fourteenth Amendment), and individuals’ rights to “life, liberty, and property” (Fourteenth Amendment), all would have been recognized as relevant. Although it would not have been self-evident which clauses might apply to Lochner’s case or precisely how they should be interpreted, the Court could have taken first steps toward limiting the amorphous police power. How? By defining liberty of contract as a principle subsuming an individual’s unassailable freedom to trade his property, his money, and his labor according to his own judgment. Contra Holmes, general propositions can decide concrete cases, if those propositions are objectively defined. But such definition is impossible, at the constitutional level, so long as judges refuse to acknowledge that government exists for any particular purpose.

None of this is to deny that constitutional interpretation can be fraught with difficulty. Reasonable judges can arrive at different interpretations, especially in cases at the intersection of individual rights and legitimate exercises of government power. And even the most incisive interpretations cannot, and should not attempt to, rewrite the Constitution. So, for example, as long as the Post Office clause resides in Article I, the Supreme Court cannot abolish that ponderous government monopoly — even if it violates liberty of contract in obvious ways. Moreover, the Court must pay due respect to precedent, while never allowing an injustice to survive any longer than may be necessitated by innocent reliance on prior erroneous rulings. But in the mind of an objective judge, none of these pitfalls will obscure the fact that the Constitution has content — a specific view of the proper relation between man and the state — which content cannot be ignored without betraying the Court’s duty of objective interpretation. To take the purpose of government into account when interpreting the Constitution’s express language is not a judicial usurpation of power. On the contrary, it is an essential part of objective interpretation, no more in need of special authorization than is the use of concepts or logic.112

Ayn Rand once observed that Justice Holmes “has had the worst philosophical influence on American law.”113 The nihilistic impact of his Lochner dissent alone is enough to justify her claim. But it is not too late for a new generation of jurists to target that influence for elimination, by embarking upon the mission that Holmes and his brethren should have undertaken a century ago. Tomorrow’s jurists will need to honestly confront Lochner, that “most important of all defining cases” in American jurisprudence, with the understanding that neither the majority nor the dissents in that case properly took into account the Constitution’s substantive content. They will need to challenge the false Lochnerian alternatives of “judicial activism” and “judicial restraint.” And they will need to question whether, and on what grounds, Lochner should continue to serve as a litmus test for Supreme Court appointees. Once the “ghost of Lochner” has ceased to haunt American constitutional law, the Supreme Court can assume its proper role as ultimate legal authority on the objective meaning of America’s founding document.


About The Author

Tom Bowden

Analyst and Outreach Liaison, Ayn Rand Institute

The Objective Standard


Endnotes

Acknowledgments: The author would like to thank Onkar Ghate for his invaluable suggestions and editing, Adam Mossoff and Larry Salzman for their helpful comments on earlier drafts, Peter Schwartz for sharing his thoughts on legal interpretation, and Rebecca Knapp for her editorial assistance.

1 Lochner v. New York, 198 U.S. 45, 65 (1905) (Holmes, J., dissenting).

2 Sheldon M. Novick, Honorable Justice: The Life of Oliver Wendell Holmes (Boston: Little, Brown and Co., 1989), p. 283.

3 Charles Evans Hughes, The Supreme Court of the United States, quoted in Catherine Drinker Bowen, Yankee from Olympus: Justice Holmes and His Family (Boston: Little, Brown and Co., 1943), p. 373.

4 Richard A. Posner, Law and Literature (Cambridge, MA: Harvard University Press, 1998), p. 271; G. Edward White, Justice Oliver Wendell Holmes: Law and the Inner Self (New York: Oxford University Press, 1993), p. 324.

5 David E. Bernstein, review of Michael J. Phillips, The Lochner Court, Myth and Reality: Substantive Due Process from the 1890s to the 1930s, Law and History Review, vol. 21 (Spring 2003), p. 231.

6 Posner, Law and Literature, p. 271; Bernard H. Siegan, Economic Liberties and the Constitution (Chicago: University of Chicago Press, 1980), p. 203.

7 Ronald Dworkin, Freedom’s Law: The Moral Reading of the American Constitution (Cambridge, MA: Harvard University Press, 1997), pp. 82, 208; Richard A. Posner, Overcoming Law (Cambridge, MA: Harvard University Press, 1995), pp. 179–80.

8 Albert W. Alschuler, Law Without Values: The Life, Work, and Legacy of Justice Holmes (Chicago: University of Chicago Press, 2000), p. 63; Posner, Law and Literature, p. 269.

9 Paul Kens, Lochner v. New York: Economic Regulation on Trial (Lawrence: University Press of Kansas, 1998), pp. 7–8.

10 Ibid., p. 6.

11 “DuPont Rid of Cellophane,” New York Times, June 30, 1986, http://www.nytimes.com/1986/06/30/business/du-pont-rid-of-cellophane.html?&pagewanted=print (last accessed May 14, 2009).

12 Kens, Lochner v. New York, p. 13.

13 Ibid., p. 7.

14 73 A.D. 120, 128 (N.Y. App. Div. 1902).

15 Kens, Lochner v. New York, p. 13.

16 Ibid.

17 Ibid., pp. 8–9.

18 Ibid., p. 13. In that era, hourly wages were virtually unknown; laborers were hired by the day, or sometimes by the week.

19 Ibid., p. 10.

20 David Montgomery, Beyond Equality: Labor and the Radical Republicans, 1862–1872 (Urbana, IL: University of Illinois Press, 1981), p. 252 (emphasis in original).

21 Kens, Lochner v. New York, pp. 63–64; Session Laws of New York, 1895, vol. 1, ch. 518.

22 Kens, Lochner v. New York, p. 65.

23 Ibid., p. 21.

24 Ibid., p. 67.

25 Ibid., p. 90.

26 Ibid., p. 89; Peter Irons, A People’s History of the Supreme Court (New York: Penguin, 1999), p. 255.

27 Kens, Lochner v. New York, p. 89.

28 Ibid., p. 90.

29 Ibid., p. 89.

30 Ibid., p. 90.

31 Ibid., pp. 91–92. Ironically, Lochner’s team of appellate lawyers included one Henry Weismann, who had actually lobbied on behalf of the bakers’ union for passage of the Bakeshop Act in 1895.

32 Ibid., p. 89.

33 73 A.D. at 128.

34 73 A.D. at 127.

35 New York v. Lochner, 69 N.E. 373, 376, 378–79 (N.Y. 1904).

36 69 N.E. at 380.

37 69 N.E. at 376, 381.

38 Christopher Gustavus Tiedeman, A Treatise on the Limitations of Police Power in the United States (St. Louis: The F.H. Thomas Law Book Co., 1886), p. 181, quoted in New York v. Lochner, 73 A.D. 120, 126 (N.Y. App. Div. 1902).

39 69 N.E. at 381 (Gray, J., concurring).

40 69 N.E. at 388 (O’Brien, J., dissenting).

41 69 N.E. at 386 (O’Brien, J., dissenting).

42 83 U.S. 36, 88 (1873) (Field, J., dissenting).

43 165 U.S. 578, 589 (1897).

44 Kens, Lochner v. New York, p. 117.

45 Novick, Honorable Justice, p. 280.

46 Ibid., p. 281.

47 198 U.S. at 53 (emphasis added).

48 198 U.S. at 61.

49 198 U.S. at 57; see also Howard Gillman, The Constitution Besieged: The Rise and Demise of Lochner Era Police Powers Jurisprudence (Durham, NC: Duke University Press, 1993).

50 198 U.S. at 64.

51 198 U.S. at 53, 61.

52 198 U.S. at 58.

53 The grounds on which Judges McLennan and Williams dissented, in the first New York appellate court, are unclear, as they did not deliver written opinions. 73 A.D. at 128.

54 Unless otherwise noted, all quotations of Holmes in this section are from his dissent, 198 U.S. at 74–76.

55 Jeffrey Rosen, The Supreme Court: The Personalities and Rivalries That Defined America (New York: Times Books/Henry Holt and Company, 2007), p. 113.

56 Posner, Overcoming Law, p. 195 (emphasis in original).

57 A “word or saying used by adherents of a party, sect, or belief and usually regarded by others as empty of real meaning.” Merriam-Webster Online, “shibboleth,” http://www.merriam-webster.com/dictionary/shibboleth (last accessed January 28, 2009).

58 A Lexis/Nexis search performed on February 27, 2009, indicated that this sentence had been quoted verbatim in 59 reported appellate cases, 98 news reports, and 338 law review articles.

59 Richard Hofstadter, Social Darwinism in American Thought (New York: George Braziller, Inc., rev. ed., 1959), p. 33.

60 Herbert Spencer, Social Statics: The Conditions Essential to Human Happiness Specified, and the First of Them Developed (New York: Robert Schalkenbach Foundation, 1995), pp. 25, 86.

61 Ibid., p. 86.

62 Ibid., pp. 95–96 (emphasis in original).

63 Richard A. Posner, ed., The Essential Holmes: Selections from the Letters, Speeches, Judicial Opinions, and Other Writings of Oliver Wendell Holmes, Jr. (Chicago: University of Chicago Press, 1992), p. xxv.

64 Article I, section 10, clause 1.

65 See Harry Binswanger, “The Constitution and States’ Rights,” The Objectivist Forum, December 1987, pp. 7–13.

66 This was a 19th-century term of art denoting “fundamental rights” and “substantive liberties” of the individual, to be protected against “hostile state action.” Michael Kent Curtis, No State Shall Abridge: The Fourteenth Amendment and the Bill of Rights (Durham, NC: Duke University Press, 1986), pp. 47–48.

67 Leonard Peikoff, The Ominous Parallels: The End of Freedom in America (New York: Stein and Day, 1982), p. 27.

68 Robert Maynard Hutchins, ed., Great Books of the Western World (Volume 46: Hegel) (Chicago: W. Benton, 1952), p. 123.

69 As Holmes wrote in another dissent years later, concerning liberty of contract, “Contract is not specially mentioned in the text that we have to construe. It is merely an example of doing what you want to do, embodied in the word liberty. But pretty much all law consists in forbidding men to do some things that they want to do, and contract is no more exempt from law than other acts.” Adkins v. Children’s Hospital, 261 U.S. 525, 568 (1923) (Holmes, J., dissenting).

70 Sheldon M. Novick, “Oliver Wendell Holmes,” The Oxford Companion to the Supreme Court of the United States (New York: Oxford University Press, 1992), p. 410.

71 Letter to Harold Laski, March 4, 1920, in Mark deWolfe Howe, ed., Holmes-Laski Letters: The Correspondence of Mr. Justice Holmes and Harold J. Laski 1916–1935, vol. 1 (Cambridge, MA: Harvard University Press, 1953), p. 249.

72 Emphasis added.

73 Some legal historians hold that the Lochner Era actually began in 1897, when the Supreme Court in Allgeyer v. Louisiana struck down a state insurance law that interfered with contractual freedom.

74 Adkins v. Children’s Hospital, 261 U.S. 525 (1923); Meyer v. Nebraska, 262 U.S. 390 (1923); Pierce v. Society of Sisters, 268 U.S. 510 (1925).

75 Adam Cohen, “Looking Back on Louis Brandeis on His 150th Birthday,” New York Times (November 14, 2006), p. A26. In West Coast Hotel v. Parrish, 300 U.S. 379 (1937), the Court upheld a state minimum-wage law for women.

76 United States v. Carolene Products, 304 U.S. 144, 152 n.4 (1938).

77 Nebbia v. New York, 291 U.S. 502, 537 (1934).

78 Bruce Ackerman, We the People: Transformations (Cambridge, MA: Harvard University Press, 2000), p. 401.

79 United States v. Darby, 312 U.S. 100 (1941); see Ackerman, We the People, p. 375.

80 Quoted in White, Justice Oliver Wendell Holmes, p. 362.

81 Robert Bork, “Individual Liberty and the Constitution,” The American Spectator, June 2008, pp. 30, 32.

82 Robert H. Bork, The Tempting of America: The Political Seduction of the Law (New York: Touchstone, 1990), p. 44.

83 Bernstein, review of Phillips, The Lochner Court, Myth and Reality, p. 231.

84 William M. Wiecek, Liberty Under Law: The Supreme Court in American Life (Baltimore: Johns Hopkins University Press, 1988), p. 124.

85 Bruce Fein, “Don’t Run from the Truth: Why Alito Shouldn’t Deny His Real Convictions,” Washington Post (December 18, 2005), p. B1.

86 Adam Cohen, “Last Term’s Winner at the Supreme Court: Judicial Activism,” New York Times (July 9, 2007), p. A16.

87 Posner, Law and Literature, p. 271; 410 U.S. 113 (1973).

88 Roe v. Wade, 410 U.S. 113, 152 (1973).

89 According to a 2005 Pew Research Center public opinion poll, 26 percent of respondents believe the Supreme Court should “completely overturn” its decision in Roe v. Wade: http://people-press.org/questions/?qid=1636990&pid=51&ccid=51#top (last accessed May 4, 2009). A 2008 Gallup poll on the same issue found that 33 percent would like to see the decision overturned: http://www.gallup.com/poll/110002/Will-Abortion-Issue-Help-Hurt-McCain.aspx (last accessed May 4, 2009). On average, about one-third of Americans disapprove of the way the Supreme Court is doing its job: http://www.gallup.com/poll/18895/Public-Divided-Over-Future-Ideology-Supreme-Court.aspx (last accessed May 4, 2009).

90 In the 1986 case of Bowers v. Hardwick, 478 U.S. 186, the Supreme Court held that homosexual conduct between consenting adults in their home could be criminally punished. Not until 2003 did the Court, in Lawrence v. Texas, 539 U.S. 558, strike down a state law that put gays in jail — and then only by a 6–3 vote.

91 539 U.S. 558, 592 (Scalia, J., dissenting).

92 Kelo v. City of New London, 545 U.S. 469 (2005).

93 “Court Declines Experimental Drugs Case,” USA Today, January 14, 2008, http://www.usatoday.com/news/washington/2008-01-14-280098622_x.htm (last accessed April 30, 2009).

94 Washington v. Glucksberg, 521 U.S. 702, 720–21 (1997).

95 Although today’s legal professionals debate such interpretive concepts as “public-meaning originalism,” “living constitutionalism,” and judicial “humility,” it is the activism/restraint dichotomy that continues to dominate public discussion outside the courts and academia.

96 Bork, The Tempting of America, p. 44.

97 Richard A. Posner, Sex and Reason (Cambridge, MA: Harvard University Press, 1992), p. 328.

98 Trop v. Dulles, 356 U.S. 86, 101 (1958).

99 Transcript of Democratic Presidential Debate, Los Angeles, California, March 1, 2000, http://edition.cnn.com/TRANSCRIPTS/0003/01/se.09.html (last accessed April 30, 2009).

100 Ronald M. Labbé and Jonathan Lurie, The Slaughterhouse Cases: Regulation, Reconstruction, and the Fourteenth Amendment (Lawrence: University Press of Kansas, 2003), p. 249.

101 In a 1992 case, Planned Parenthood v. Casey, 505 U.S. 833, a plurality of the Supreme Court singled out the Fourteenth Amendment’s concept of “liberty” as the proper basis for upholding a woman’s qualified right to abortion. However, the Court also reaffirmed Roe’s holding that the states have “their own legitimate interests in protecting prenatal life.” 505 U.S. at 853. Hence this entire line of cases remains vulnerable to the Holmesian critique in Lochner. If the “police power” can be interpreted to have no limits, then why not the state’s “legitimate interests in protecting prenatal life”?

102 Posner, The Essential Holmes, p. 220 (correcting Holmes’s obsolete spelling of “subtle” as “subtile”). In a similar vein, Holmes gave a eulogy in 1891 praising men of “ambition” whose “dream of spiritual reign” leads them to seek the “intoxicating authority which controls the future from within by shaping the thoughts and speech of a later time.” Posner, The Essential Holmes, p. 214.

103 Stuart Taylor Jr., “Does the President Agree with This Nominee?” TheAtlantic.com, May 3, 2005, http://www.theatlantic.com/magazine/archive/2005/05/does-the-president-agree-with-this-nominee/304012/ (last accessed April 30, 2009).

104 Cass R. Sunstein, “Lochner’s Legacy,” Columbia Law Review, vol. 87 (June 1987), p. 873.

105 Quoted in Scott Douglas Gerber, First Principles: The Jurisprudence of Justice Clarence Thomas (New York: NYU Press, 2002), p. 54.

106 Ibid., p. 54.

107 Ibid., pp. 54–55; Dworkin, Freedom’s Law, pp. 308–10.

108 Janice Rogers Brown, “‘A Whiter Shade of Pale’: Sense and Nonsense—The Pursuit of Perfection in Law and Politics,” address to Federalist Society, University of Chicago Law School, April 20, 2000, http://www.communityrights.org/PDFs/4-20-00FedSoc.pdf.

109 “Remarks of U.S. Senator Barack Obama on the nomination of Justice Janice Rogers Brown,” June 8, 2005, http://obamaspeeches.com/021-Nomination-of-Justice-Janice-Rogers-Brown-Obama-Speech.htm (last accessed January 29, 2009).

110 Taylor, “Does the President Agree with This Nominee?” supra.

111 In addition, the late Bernard H. Siegan, a professor at the University of San Diego School of Law, was rejected by the Senate for a seat on the U.S. Court of Appeals based largely on the support for the Lochner decision expressed in his book, Economic Liberties and the Constitution. See Larry Salzman, “Property and Principle: A Review Essay on Bernard H. Siegan’s Economic Liberties and the Constitution,” The Objective Standard, vol. 1, no. 4 (Winter 2006–2007), p. 88.

112 Promising work on objective judicial interpretation is being undertaken by Tara Smith, professor of philosophy, University of Texas at Austin. See “Why Originalism Won’t Die—Common Mistakes in Competing Theories of Judicial Interpretation,” Duke Journal of Constitutional Law & Public Policy, vol. 2 (2007), p. 159; “Originalism’s Misplaced Fidelity,” Constitutional Commentary, vol. 25, no. 3 (forthcoming, August 2009).

113 Quoted in Marlene Podritske and Peter Schwartz, eds., Objectively Speaking: Ayn Rand Interviewed (Lanham, MD: Lexington Books, 2009), p. 60.

The Corrupt Critics of CEO Pay

by Yaron Brook and Don Watkins | May 2009

Since the start of this crisis, we’ve been regaled with stories of CEOs receiving lavish bonuses. Well-paid executives have been vilified as reckless and greedy. L.A. Times columnist Patt Morrison captured the mood when she declared: “I want blood.”

But this is nothing new.

Long before the current crisis, Warren Buffett, John McCain, President Obama, and many other critics condemned (supposedly) outrageous executive pay. “We have a [moral] deficit when CEOs are making more in ten minutes than some workers make in ten months,” Obama said during the presidential campaign.

With today’s government entanglement in business affairs, many Americans are open to attempts by Washington to slash CEO pay. Apparently hoping to exploit that opportunity, the chairman of the House Financial Services Committee, Barney Frank, recently floated the idea of extending the TARP executive pay caps to every financial institution, and potentially to all U.S. companies.

It’s understandable that taxpayers think they should have some say in how bailed-out businesses are run, which is one reason why Washington should never have bailed out those companies in the first place. But why have the critics been so intent on dictating to shareholders of private companies how much they can pay their CEOs?

It’s not because the supposed victims, shareholders, have been demanding it. A few ideologically motivated activists aside, most shareholders in the years leading up to the crisis weren’t complaining about CEO pay packages. Virtually every time they had a chance to vote on a “say on pay” resolution, which would have given them a non-binding vote on CEO compensation, shareholders rejected the measure. Even if they had been given a say, there is no reason to expect they would have put the brakes on high pay. In Britain, for instance, shareholders had a government-mandated right to vote on management compensation, yet CEO pay still rose unabated.

So what has the critics all riled up?

They allege that, despite appearances, executives were not really being paid for performance. Pointing to CEOs who raked in huge bonuses while their companies tanked, the critics say that executive pay was driven not by supply and demand, but by an old boys’ network that placed mutual back-scratching above shareholder welfare. As Obama put it last year, “What accounts for the change in CEO pay is not any market imperative. It’s cultural. At a time when average workers are experiencing little or no income growth, many of America’s CEOs have lost any sense of shame about grabbing whatever their . . . corporate boards will allow.”

It was a compelling tale, but this account of rising pay just doesn’t square with the facts. To name a few: (1) the rise in CEO pay was in line with that of other elite positions, such as professional athletes; (2) the rise in pay continued even as fewer CEOs chaired their boards of directors; (3) the companies that paid CEOs the most generally had stock returns much greater than other companies in their industries, while companies that paid their CEOs the least underperformed in their industries.

The critics of CEO pay ignore all of this. They take it as obvious that executives making millions are overpaid. “It turns out that these shareholders, who are wonderfully thoughtful and collectively incisive, become quite stupid when it comes to paying the boss, the guy who works for them,” Barney Frank has said. But what kind of compensation package will attract, retain, and motivate the best CEO is a complicated question. Companies have to weigh thousands of facts and make many subtle judgments in order to assess what a CEO is worth.

What should be the mix between base salary and incentive pay? What kind of incentives should be offered: stock options, restricted stock, stock appreciation rights? How should those incentives be structured: over what time frame, and using which metrics? And what about a severance plan? What kind of plan will be necessary to attract the best candidate? And so on.

The mere fact that people make their living as executive-pay consultants illustrates how challenging the task is. Central planners like Frank cavalierly dismiss this and declare that they can somehow divine that lower pay for executives will not hinder a company.

Of course, a free market doesn’t eliminate mistakes. A company can hire an incompetent CEO, or structure a pay package that rewards executives for short-term profits at the expense of the company’s long-term welfare. But a company suffers from its mistakes: shareholders earn less, managers need to be fired, and competitors gain market share.

There is, however, something that can short-circuit this corrective process and help keep highly paid incompetents in business: government coercion.

Take, for example, the Williams Act, which restricts stock accumulation for the purpose of a takeover. In a truly free market, if poor management is causing a company’s stock to tank, shareholders or outsiders have an incentive to buy enough shares to fire the CEO and improve company performance. But the Williams Act, among other regulations, makes ousting poor management more difficult.

And while the critics have tried to scapegoat “overpaid executives” for our current financial turmoil, the actual cause was, as past editions of Fusion have indicated, coercive government regulations and interventions. Far from vindicating the denunciations of “stupid” shareholders and “inept” CEOs, the recent economic downturn shows what happens when the government interferes with economic decision-making through policies such as the “affordable housing” crusade and the Fed’s artificially low interest rates.

If the critics’ goal were really to promote pay for performance, they would advocate an end to all such regulations and let the free market work.

But that’s not what they advocate. Instead, they call for more regulatory schemes, such as government-mandated “say on pay,” massive tax hikes on the rich, and even outright caps on executive compensation. They do not want pay to be determined by the market, reflect performance, or reward achievement — they just want it to be lower. Frank stated the point clearly when he threatened that if “say on pay” legislation doesn’t sufficiently reduce CEO compensation, “then we will do something more.” Another critic, discussing former Home Depot CEO Robert Nardelli, confessed that “it’s hard to believe that those leading the charge against his pay package . . . weren’t upset mainly by the fact that Nardelli had a $200 million pay package in the first place — no matter how he had performed.”

The critics want to bring down CEO pay, not because it is economically unjustifiable, but because they view it as morally unjustifiable. Robert Reich, a prominent opponent of high CEO pay, for instance, penned a Wall Street Journal column titled “CEOs Deserve Their Pay,” in which he defended CEO pay from an economic standpoint but denied that it was justified ethically. Insisting that wealth rightfully belongs to “society” rather than the individuals who create it, the critics maintain that “society” and not private owners should set salary levels. Many critics go so far as to regard all differences in income as morally unjust and the vast disparity between CEOs and their lowest-paid employees as morally obscene.

But it’s the attack on CEO pay that’s obscene.

Far from relying on nefarious backroom deals, successful CEOs earn their pay by creating vast amounts of wealth. Jack Welch, for instance, helped raise GE’s market value from $14 billion to $410 billion. Steve Jobs’s leadership famously turned a struggling Apple into an industry leader. Only a handful of people develop the virtues — vision, drive, knowledge, and ability — to successfully run a multibillion-dollar company. They deserve extraordinary compensation for their extraordinary achievements.

In smearing America’s great wealth creators as villains and attributing their high pay to greed and corruption rather than productive achievement, the critics want us to overlook the virtues that make CEOs successful. In demanding lower executive pay, despite the wishes of shareholders, the critics aim to deprive CEOs of their just deserts. In denouncing CEO pay for the sole reason that it’s higher than the pay of those who haven’t achieved so much, the critics seek to punish CEOs because they are successful.

Ultimately, how to pay CEOs is a question that only shareholders have a right to decide. But in today’s anti-business climate, it’s vital that we recognize the moral right of successful CEOs to huge rewards.

They earn them.

About The Authors

Yaron Brook

Chairman of the Board, Ayn Rand Institute

Don Watkins

Former Fellow (2006-2017), Ayn Rand Institute

America’s Unfree Market

by Yaron Brook and Don Watkins | May 2009

Since day one of the financial crisis, we have been told that the free market has failed. But this is a myth. Regardless of what one thinks were the actual causes of the crisis, the free market could not have been the source because, whatever you wish to call America’s post-World War I economy, you cannot call it a free market. America today is a mixed economy — a market that retains some elements of freedom, but which is subject to pervasive and entrenched government control.

The actual meaning of “free market” is: the economic system of laissez-faire capitalism. Under capitalism, the government’s sole purpose is to protect the individual’s rights to life, liberty, property, and the pursuit of happiness from violation by force or fraud. This means a government is limited to three basic functions: the military, the police, and the court system. In a truly free market, there is no income tax, no alphabet agencies regulating every aspect of the economy, no handouts or business subsidies, no Federal Reserve. The government plays no more role in the economic lives of its citizens than it does in their sex lives.

Thus a free market is a market totally free from the initiation of physical force. Under such a system, individuals are free to exercise and act on their own judgment. They are free to produce and trade as they see fit. They are fully free from interference, regulation, or control by the government.

Historically, a fully free market has not yet existed. But it was America’s unsurpassed economic freedom that enabled her, in the period between the Civil War and World War I, to become an economic juggernaut, and the symbol of freedom and prosperity.

That freedom has largely been curtailed. But one sector that remains relatively free is America’s high-tech industry. Throughout the late 20th century, the computer industry had no significant barriers to entry, no licensing requirements, no government-mandated certification tests. Individuals were left free for the most part to think, produce, innovate, and take risks: if they succeeded, they reaped the rewards; if they failed, they could not run to Washington for help.

The results speak for themselves.

Between 1981 and 1985, about 6 million personal computers were sold worldwide. During the first half of this decade, that number climbed to 855 million. Meanwhile, the quality of computers surged as prices plummeted. For instance, the cost per megabyte for a personal computer during the early 1980s was generally between $100 and $200; today it’s less than a cent.

That is what a free economy would look like: unbridled choice in production and trade with innovation and prosperity as the result.

But this is hardly what the economy looks like today.

The latest federal budget was $3.6 trillion, up from less than $1 billion a century ago. Taxes eat up nearly half of the average American’s income. A mammoth welfare state doles out favors to individuals and to businesses. Hundreds of thousands of regulations direct virtually every aspect of our lives. The Federal Reserve holds virtually unlimited control over the U.S. monetary and banking systems.

All of this represents the injection of government force into the market. And just as the elimination of force makes possible all the tremendous benefits of the free market, the introduction of force into markets undermines those benefits.

Nowhere is this clearer than in the highly controlled U.S. automotive industry and in the housing market.

The U.S. automotive industry is subject to thousands of regulations, but most relevant here are pro-union laws, such as the Wagner Act, which force Detroit to deal with the United Auto Workers (UAW), and the Corporate Average Fuel Economy (CAFE) law. These laws, not some innate inability to produce good cars, put American companies at a severe competitive disadvantage with foreign automakers.

In a free market, individuals would be free to voluntarily form and join unions, while employers would have the freedom to deal with those unions or not. But under current law, the UAW is protected by the coercive power of government. Individuals who wish to work for Detroit auto companies are forced to join the UAW — and Detroit is forced to hire UAW workers. This gives the UAW the ability to command above-market compensation for its members, to the detriment of the auto companies.

Compounding this, CAFE standards force Detroit to manufacture small, fuel-efficient cars in domestic (UAW) factories. These cars are notorious money-losers for American auto companies, swallowing up tens of billions of dollars. But under CAFE, the Big Three are barred by law from focusing on their more profitable lines of larger vehicles, from producing their fuel-efficient fleet overseas, or even from using the threat of offshoring as bargaining leverage.

Imagine if the same sort of anti-market policies imposed on Detroit had been applied to the computer industry. Suppose that in the mid-1980s, as IBM-compatible computers were battling Apple for preeminence, the government had decided that it favored Apple computers and would give tax incentives for computer buyers to purchase them. This would have hobbled and very likely wiped out IBM, Intel, Microsoft, and thousands of other companies. And while today Apple is an innovative, well-managed company, it is so because of market pressures that forced it to shape up or go bankrupt, pressures that would not have existed had Washington lent it a helping hand.

Now turn from the auto industry to housing.

The conventional view of the housing crisis is that it was the result of a housing market free of government control. But, once again, the notion that the housing market was free is a total fantasy.

On a free market, the government would neither encourage nor discourage homeownership. Individuals would be free to decide whether to buy or to rent. Lenders would lend based on their expectation of a profit, knowing that if they make bad loans, they will pay the price. Interest rates would be determined by supply and demand, not by government fiat.

But that is not what happened in our controlled market. Instead, the government systematically intervened to encourage homeownership and real estate speculation. Think: Fannie and Freddie, the Community Reinvestment Act, tax-code incentives for flipping homes; the list goes on and on. This was a free market?

Unquestionably, today’s crisis is complex, and to identify its cause is not easy. But the opponents of the free market are not interested in identifying the cause. Their aim since day one has been to silence the debate and declare the matter settled: we had a free market, we had a financial crisis, and therefore, the free market was to blame. The only question, they would have us believe, is how, not whether, the government should intervene.

But they are wrong. There was no free market. And when you look across the American economy, what you see is that the freer parts, like the high-tech industry, are the most productive, and the more controlled parts, like the automotive, banking and housing industries, are in crisis.

Is this evidence that we need more government intervention or more freedom?

About The Authors

Yaron Brook

Chairman of the Board, Ayn Rand Institute

Don Watkins

Former Fellow (2006-2017), Ayn Rand Institute

Energy at the Speed of Thought: The Original Alternative Energy Market

by Alex Epstein | Summer 2009 | The Objective Standard

The most important and most overlooked energy issue today is the growing crisis of global energy supply. Cheap, industrial-scale energy is essential to building, transporting, and operating everything we use, from refrigerators to Internet server farms to hospitals. It is desperately needed in the undeveloped world, where 1.6 billion people lack electricity, a deprivation that contributes to untold suffering and death. And it is needed in ever-greater, more-affordable quantities in the industrialized world: Energy usage and standard of living are directly correlated.1

Every dollar added to the cost of energy is a dollar added to the cost of life. And if something does not change soon in the energy markets, the cost of life will become a lot higher. As demand increases in the newly industrializing world, led by China and India,2 supply stagnates3 — meaning rising prices as far as the eye can see.

What is the solution?

We just need the right government “energy plan,” leading politicians, intellectuals, and businessmen tell us. Of course “planners” such as Barack Obama, John McCain, Al Gore, Thomas L. Friedman, T. Boone Pickens, and countless others favor different plans with different permutations and combinations of their favorite energy sources (solar, wind, biomass, ethanol, geothermal, occasionally nuclear and natural gas) and distribution networks (from decentralized home solar generators to a national centralized so-called smart grid). But each agrees that there must be a plan — that the government must lead the energy industry using its power to subsidize, mandate, inhibit, and prohibit. And each claims that his plan will lead to technological breakthroughs, more plentiful energy, and therefore a higher standard of living.

Consider Nobel Peace Prize winner Al Gore, who claims that if only we follow his “Repower America” plan — which calls for the government to ban and replace all carbon-emitting energy (currently 80 percent of overall energy and almost 100 percent of fuel energy)4 in ten years — we would be using

fuels that are not expensive, don’t cause pollution and are abundantly available right here at home. . . . We have such fuels. Scientists have confirmed that enough solar energy falls on the surface of the earth every 40 minutes to meet 100 percent of the entire world’s energy needs for a full year. Tapping just a small portion of this solar energy could provide all of the electricity America uses.

And enough wind power blows through the Midwest corridor every day to also meet 100 percent of US electricity demand. Geothermal energy, similarly, is capable of providing enormous supplies of electricity for America. . . . [W]e can start right now using solar power, wind power and geothermal power to make electricity for our homes and businesses.5

And Gore claims that, under his plan, our vehicles will run on “renewable sources that can give us the equivalent of $1 per gallon gasoline.”6

Another revered thinker, Thomas L. Friedman, also speaks of the transformative power of government planning, in the form of a government-engineered “green economy.” In a recent book, he enthusiastically quotes an investor who claims: “The green economy is poised to be the mother of all markets, the economic investment opportunity of a lifetime.”7 Friedman calls for “a system that will stimulate massive amounts of innovation and deployment of abundant, clean, reliable, and cheap electrons.”8 How? Friedman tells us that

there are two ways to stimulate innovation — one is short-term and the other is long-term — and we need to be doing much more of both. . . . First, there is innovation that happens naturally by the massive deployment of technologies we already have [he stresses solar and wind]. . . . The way you stimulate this kind of innovation — which comes from learning more about what you already know and doing it better and cheaper — is by generous tax incentives, regulatory incentives, renewable energy mandates, and other market-shaping mechanisms that create durable demand for these existing clean power technologies. . . . And second, there is innovation that happens by way of eureka breakthroughs from someone’s lab due to research and experimentation. The way you stimulate that is by increasing government-funded research. . . .9

The problem with such plans and claims: Politicians and their intellectual allies have been making and trying to implement them for decades — with nothing positive (and much negative) to show for it.

For example, in the late 1970s, Jimmy Carter heralded his “comprehensive energy policy,” claiming it would “develop permanent and reliable new energy sources.” In particular, he (like many today) favored “solar energy, for which most of the technology is already available.” All the technology needed, he said, “is some initiative to initiate the growth of a large new market in our country.”10

Since then, the government has heavily subsidized solar, wind, and other favored “alternatives,” and embarked on grand research initiatives to change our energy sources — claiming that new fossil fuel and nuclear development is unnecessary and undesirable. The result? Not one single, practical, scalable source of energy. Americans get a piddling 1.1 percent of their power from solar and wind sources,11 and only that much because of national and state laws subsidizing and mandating them. There have been no “eureka breakthroughs,” despite many Friedmanesque schemes to induce them, including conveniently forgotten debacles such as government fusion projects,12 the Liquid Metal Fast Breeder Reactor Program,13 and the Synfuels Corporation.14

Many good books and articles have been written — though not enough, and not widely enough read — chronicling the failures of various government-sponsored energy plans, particularly those that sought to develop “alternative energies,” over the past several decades.15 Unfortunately, the lesson that many take from this is that we must relinquish hope for dramatic breakthroughs, lower our sights, and learn to make do with the increasing scarcity of energy.

But the past failures do not warrant cynicism about the future of energy; they warrant cynicism only about the future of energy under government planning. Indeed, history provides us ample grounds for optimism about the potential for a dynamic energy market with life-changing breakthroughs — because America once had exactly such a market. For most of the 1800s, an energy market existed unlike any we have seen in our lifetimes, a market devoid of government meddling. With every passing decade, consumers could buy cheaper, safer, and more convenient energy, thanks to continual breakthroughs in technology and efficiency — topped off by the discovery and mass availability of an alternative source of energy that, through its incredible cheapness and abundance, literally lengthened and improved the lives of nearly everyone in America and millions more around the world. That alternative energy was called petroleum. By studying the rise of oil, and the market in which it rose, we will see what a dynamic energy market looks like and what makes it possible. Many claim to want the “next oil”; to that end, what could be more important than understanding the conditions that gave rise to the first oil?

Today, we know oil primarily as a source of energy for transportation. But oil first rose to prominence as a form of energy for a different purpose: illumination.

For millennia, men had limited success overcoming the darkness of the night with man-made light. As a result, the day span for most was limited to the number of hours during which the sun shone — often fewer than ten in the winter. Even as late as the early 1800s, the quality and availability of artificial light was little better than it had been in Greek and Roman times — which is to say that men could choose between various grades of expensive lamp oils or candles made from animal fats.16 But all of this began to change in the 1820s. Americans found that lighting their homes was becoming increasingly affordable — so much so that by the mid-1860s, even poor, rural Americans could afford to brighten their homes, and therefore their lives, at night, adding hours of life to their every day.17

What made the difference? Individual freedom, which liberated individual ingenuity.

The Enlightenment and its apex, the founding of the United States of America, marked the establishment of an unprecedented form of government, one established explicitly on the principle of individual rights. According to this principle, each individual has a right to live his own life solely according to the guidance of his own mind — including the crucial right to earn, acquire, use, and dispose of the physical property, the wealth, on which his survival depends. Enlightenment America, and to a large extent Enlightenment Europe, gave men unprecedented freedom in the intellectual and economic realms. Intellectually, individuals were free to experiment and theorize without restrictions by the state. This made possible an unprecedented expansion in scientific inquiry — including the development by Joseph Priestley and Antoine Lavoisier of modern chemistry, critical to future improvements in illumination.18 Economically, this freedom enabled individuals to put scientific discoveries and methods into wealth-creating practice, harnessing the world around them in new, profitable ways — from textile manufacturing to steelmaking to coal-fired steam engines to illuminants.

There had always been a strong desire for illumination, and therefore a large potential market for anyone who could deliver it affordably — but no one had been able to actualize this potential. In the 1820s, however, new scientists and entrepreneurs entered the field with new knowledge and methods that would enable them to harness nature efficiently to create better, cheaper illuminants at a profit. Contrary to those who believe that the government is necessary to stimulate, invest in, or plan the development of new energy sources, history shows us that all that is required is an opportunity to profit.

That said, profiting in the illumination industry was no easy task. The entrenched, animal-based illuminants of the time, whatever their shortcomings, had long histories, good reputations, refined production processes, established transportation networks and marketing channels, and a large user base who had invested in the requisite lamps. In other words, animal-based illuminants were practical. For a new illumination venture to be profitable, it would have to create more value (as judged by its customers) than it consumed. A successful alternative would not only have to be a theoretical source of energy, or even work better in the laboratory; it would have to be produced, refined, transported, and marketed efficiently — or it would be worthless. Unlike today, no government bureaucrats were writing big checks for snazzy, speculative PowerPoint presentations or eye-popping statistics about the hypothetical potential of a given energy source. Thus, scientists and entrepreneurs developed illumination technologies with an eye toward creating real value on the market. They began exploring all manner of potential production materials — animal, vegetable, and mineral — and methods of production and distribution. Many of their attempts failed, such as forays into fish oils and certain plant oils that proved unprofitable for reasons such as unbearable smell, high cost of mass production, and low-quality light.19 But, out of this torrent of entrepreneurial exploration and experimentation, three illumination breakthroughs emerged.

One, called camphene, came from the work of the enterprising scientist Isaiah Jennings, who experimented with turpentine. If turpentine could create a quality illuminant, he believed, the product held tremendous commercial potential as the lowest-cost illuminant on the market: Unlike animal fat, turpentine was in demand neither as a food product nor as a lubricant. Jennings was successful in the lab, and in 1830, he took out a patent for the process of refining turpentine into camphene. The process he patented was a form of distillation — boiling at different temperatures in order to separate different components — a procedure that is vital to the energy industry to this day.

Before camphene could succeed on the market, Jennings and others had to solve numerous practical problems. For example, they discovered that camphene posed the threat of explosion when used in a standard (animal) oil lamp. The initial solution was to design new lamps specifically for use with camphene — but this solution was inadequate because the money saved using camphene would barely defray the expense of a new lamp. So, producers devised methods that enabled customers to inexpensively modify their existing lamps to be camphene-safe. The payoff: In the 1840s, camphene was the leading lamp oil, while use of animal oils, the higher-cost product, as illuminants declined in favor of their use as lubricants. Camphene was the cheapest source of light to date, creating many new customers who were grateful for its “remarkable intensity and high lighting power.”20

Second, whereas Jennings had focused on developing a brand-new source of illumination, another group of entrepreneurs — from, of all places, the Cincinnati hog industry — saw an opportunity to profitably improve the quality of light generated from animal lard, an already widely used source of illumination. At the time, the premium illuminant in the market was sperm whale oil, renowned for yielding a safe, consistent, beautiful light — at prices only the wealthy could afford. In the 1830s, soap makers within the hog industry set out to make traditional lard as useful for illumination as the much scarcer sperm whale oil. They devised a method of heating lard with soda alkali, which generated two desirable by-products that were as good as their sperm equivalents but less expensive: a new lard oil, dubbed stearin oil, for lamps and stearic acid for candles. This method, combined with a solid business model employing Cincinnati’s feedstock of hogs, created a booming industry that sold 2 million pounds of stearin products annually. The price of stearin oil was one third less than that of sperm whale oil, making premium light available to many more Americans.21

Thus camphene and stearin became leaders in the market for lamps and candles — both portable sources of illumination. The third and final new form of illumination that emerged in the early 1800s was a bright, high-quality source of illumination delivered via fixed pipes to permanent light fixtures installed in homes and businesses. In the 17th century, scientists had discovered that coal, when heated to extremely high temperatures (around 1,600 degrees), turns into a combustible gas that creates a bright light when brought to flame. In 1802, coal gas was used for the first time for commercial purposes in the famous factory of Boulton & Watt, near Birmingham, England.22 Soon thereafter, U.S. entrepreneurs offered coal gas illumination to many industrial concerns — making possible a major extension of the productive day for businesses, and thus increasing productivity throughout American industry. Initially, the high cost of the pipes and fixtures required by gas lighting precluded its use in homes. But entrepreneurs devised more efficient methods of installing pipes in order to bring gas into urban homes, and soon city dwellers in Baltimore, Boston, and New York would get more useful hours out of their days. Once the infrastructure was in place, the light was often cheaper than sperm whale oil, and was reliable, safe, and convenient. As a result, during the 1830s and 1840s, the coal-gas industry grew at a phenomenal rate; new firms sprang up in Brooklyn, Bristol (Rhode Island), Louisville, New Orleans, Pittsburgh, and Philadelphia.23

By the 1840s, after untold investing, risk-taking, thinking, experimentation, trial, error, failures, and success, coal gas, camphene, and stearin producers had proven their products to be the best, most practical illuminants of the time — and customers eagerly bought them so as to bring more light to their lives than ever before.

But this was only the beginning. Because the market was totally free, the new leaders could not be complacent; they could not prevent better ideas and plans from taking hold in the marketplace. Unlike the static industries fantasized by today’s “planners,” where some government-determined mix of technologies produces some static quantity deemed “the energy Americans need,” progress knew no ceiling. The market in the 19th century was a continuous process of improvement, which included a constant flow of newcomers who offered unexpected substitutes that could dramatically alter Americans’ idea of what was possible and therefore what was “needed.”

In the early 1850s, entrepreneurs caused just such a disruption with a now-forgotten product called coal oil.24 Coal oil initially emerged in Europe, which at the time also enjoyed a great deal of economic freedom. Scientists and entrepreneurs in the field of illumination were particularly inclined to look for illuminants in coals and other minerals because of the relative scarcity of animal and vegetable fats, and correspondingly high prices for both. Beginning with the French chemist A. F. Selligue, and continuing with the British entrepreneur James Young, Europeans made great strides in distilling coal at low heat (as against the high heat used to create coal gas) to liquefy it, and then distilling it (as Jennings had distilled turpentine into camphene) to make lamp oil and lubricants that were just as good as those from animal sources. Coal was plentiful, easy to extract in large quantities, and therefore cheap. The primary use of coal oil in Europe, however, was as a lubricant. In North America, the primary use would be as an illuminant.

Beginning in the 1840s, a Canadian physician named Abraham Gesner, inspired by the Europeans, conducted experiments with coal and was able to distill a quantity of illuminating oil therefrom. Gesner conceived a business plan (like so many scientists of the day, he was entrepreneurial), and teamed with a businessman named Thomas Cochrane to purchase a mining property in Albert County, New Brunswick, from which he could extract a form of coal (asphaltum), refine it at high quality, and sell it below the going price for camphene.

But in 1852 the project was aborted — not because the owners lost the means or will to see it through, but because the Canadian government forbade it. The government denied that the subsurface minerals belonged to those who harnessed their value; it held that they were owned by the Crown, which did not approve of this particular use.

Gesner’s experience in Canada highlights a vital precondition of the rapid development of the American illumination energy industry: the security of property rights. All of the industries had been free to acquire and develop the physical land and materials necessary to create the technologies, make the products, and bring them to market based on the entrepreneurs’ best judgment. They had been free to cut down trees for camphene, raise hogs for stearin, and mine coal and build piping for gas lighting, so long as they were using honestly acquired property. And this freedom was recognized as a right, which governments were forbidden to abrogate in the name of some “higher” cause, be it the Crown or “the people” or the snail darter or protests by those who say, “Not in my backyard” about other people’s property. Because property rights were recognized, nothing stopped them from acting on their productive ideas. Had property rights not been recognized, all their brilliant ideas would have been like Gesner’s under Canadian rule: worthless.

Not surprisingly, Gesner moved to the United States. He set up a firm, the New York Kerosene Company, whose coal-oil illuminant, kerosene, was safer and 15 percent less expensive than camphene, more than 50 percent less expensive than coal gas, 75 percent less expensive than lard oil, and 86 percent less expensive than sperm whale oil. Unfortunately, this was not enough for Gesner to succeed. His product suffered from many problems, such as low yields and bad odor, and was not profitable. However, his limited successes had demonstrated that coal’s abundance and ease of refining made it potentially superior to animal and vegetable sources.

That potential was fully actualized by a businessman named Samuel Downer and his highly competent technical partners, Joshua Merrill and Luther Atwood. Downer had devoted an existing company to harnessing a product called “coup oil,” the properties of which rendered it uncompetitive with other oils. Recognizing the hopelessness of coup oil, Downer set his sights on coal-oil kerosene. Downer’s firm made major advances in refining technology, including the discovery of a more efficient means of treating refined oil with sulfuric acid, and of a process called “cracking” — also known as “destructive distillation” — which uses high heat to break down larger molecules into smaller ones, yielding higher amounts of the desired substance, in this case kerosene. (Unbeknownst to all involved, these discoveries would be vital to the undreamed of petroleum industry, which would emerge in the near future.) By 1859, after much effort went into developing effective refining processes and an efficient business model, Downer’s firm was able to make large profits by selling kerosene at $1.35 a gallon — a price that enabled more and more Americans to light their houses more of the time. Others quickly followed suit, and by decade’s end, businessmen had started major coal-oil refineries in Kentucky, Cincinnati, and Pittsburgh. The industry had attracted millions in investment by 1860, and was generating revenues of $5 million a year via coal oil — a growing competitor to coal gas, which was generating revenues of $17 million a year and had attracted $56 million (more than $1 billion in today’s dollars) in investment.25

As the 1850s drew to a close, coal oil and coal gas were the two leading illuminants. These new technologies brightened the world for Americans and, had the evolution of illumination innovation ended here, most Americans of the time would have died content. Their quality of life had improved dramatically under this energy revolution — indeed, so dramatically that, were a comparable improvement to occur today, it would dwarf even the most extravagant fantasies of today’s central planners. This points to a crucial fact that central planners cannot, do not, or will not understand: The source of an industry’s progress is a free market — a market with real economic planning, profit-driven individual planning.

The revolution in illumination was a process of thousands of entrepreneurs, scientists, inventors, and laborers using their best judgment to conceive and execute plans to make profits — that is, to create the most valuable illuminant at the lowest cost — with the best plans continually winning out and raising the bar. As a result, the state of the market as a whole reflected the best discoveries and creativity of thousands of minds — a hyperintelligent integration of individual thinking that no single mind, no matter how brilliant, could have foreseen or directed.

Who knew in 1820 that, of all the substances surrounding man, coal — given its physical properties, natural quantities, and costs of extraction and production — would be the best source for inexpensive illumination? Who knew all the thousands of minute, efficiency-producing details that would be reflected in the operations of the Samuel Downer Company — operations developed both by the company and by decades of trial and error on the market? Consider, then, what it would have meant for an Al Gore or Thomas Friedman or Barack Obama to “plan” the illumination energy market. It would have meant pretending to know the best technologies and most efficient ways of harnessing them and then imposing a “plan.” And, given that neither Gore nor Friedman nor anyone else could possibly possess all the knowledge necessary to devise a workable plan, what would their “plan” consist of? It would consist of what all central planners’ “plans” consist of: prohibition, wealth transfers, and dictates from ignorance. Depending on when the “planners” began their meddling and who was whispering in their ear, they might subsidize tallow candles or camphene, thereby pricing better alternatives out of the market or limiting lighting choices to explosive lamps.

Thankfully, there was no such “planner” — there were only free individuals seeking profit and free individuals seeking the best products for their money. That freedom enabled the greatest “eureka” of them all — from an unlikely source.

George Bissell was the last person anyone would have bet on to change the course of industrial history. Yet this young lawyer and modest entrepreneur began to do just that in 1854 when he traveled to his alma mater, Dartmouth College, in search of investors for a venture in pavement and railway materials.26 While visiting a friend, he noticed a bottle of Seneca Oil — petroleum — which at that time was sold as medicine. People had known of petroleum for thousands of years, but thought it existed only in small quantities. This particular bottle came from an oil spring on the land of physician Dr. Francis Beattie Brewer in Titusville, Pennsylvania, which was lumber country.

At some point during or soon after the encounter, Bissell became obsessed with petroleum, and thought that he could make a great business selling it as an illuminant if, first, it could be refined to produce a high quality illuminant, and, second, it existed in substantial quantities. Few had considered the first possibility, and most would have thought the second out of the question. The small oil springs or seeps men had observed throughout history were thought to be the mere “drippings” of coal, necessarily tiny in quantity relative to their source.

But Bissell needed no one’s approval or agreement — except that of the handful of initial investors he would need to persuade to finance his idea. The most important of these was Brewer, who sold him one hundred acres of property in exchange for $5,000 in stock in Bissell’s newly formed Pennsylvania Rock Oil Company of New York.

To raise sufficient funds to complete the project, Bissell knew that he would have to demonstrate at minimum that petroleum could be refined into a good illuminant. He solicited Benjamin Silliman Jr., a renowned Yale chemist, who worked with the petroleum, refined it, and tested its properties for various functions, including illumination. After collecting a $500 commission (which the cash-strapped firm could barely put together), Silliman delivered his glowing report: 50 percent of crude petroleum could be refined into a fine illuminant, and 90 percent of the crude could be useful in some form or another.

Proof of concept in hand, Bissell raised just enough money to enact the second part of his plan: to see if oil could be found in ample quantities. According to the general consensus, his plan — to drill for oil — was unlikely to uncover anything. (One of Bissell’s investors, banker James Townsend, recalled his friends saying, “Oh, Townsend, oil coming out of the ground, pumping oil out of the earth as you pump water? Nonsense! You’re crazy.”) But Bissell’s organization had reason to suspect that the consensus was wrong — mostly because saltwater driller Samuel Kier had inadvertently found modest quantities of oil apart from known coal deposits, which contradicted the coal-drippings theory. And so Bissell proceeded, albeit with great uncertainty and very little money.

He sent Edwin Drake, a former railroad conductor and jack-of-many-trades, to Titusville to find oil. Drake and his hired hands spent two years and all the funds the company could muster, but after drilling to 69.5 feet with their self-made, steam-powered rig, they had found nothing. Fortunately, just as the investors told Drake to wrap up the project, his crew noticed oil seeping up out of the well. Ecstatic, they attempted to pump the oil out of the well — and succeeded. With that, a new industry was born.

That is, a new potential industry was born. In hindsight we know that oil existed in quantities and had physical qualities that would enable it to supplant every other illuminant available at the time. But this was discovered only later by entrepreneurs with the foresight to invest time and money in the petroleum industry.

Bissell and other oilmen faced a difficult battle. They had to extract, refine, transport, and market at a profit this new, little-understood material, whose ultimate quantities were completely unknown — while vying for market share with well-established competitors. Fortunately, they were up to the task, and many others would follow their lead.

When word got out about Drake’s discovery, a “black gold” rush began, a rush to buy land and drill the earth for as much of this oil as possible. For example, upon seeing Drake’s discovery, Jonathan Watson, a lumber worker on Brewer’s land, bought what would become millions of dollars worth of oil land. George Bissell did the same. Participants included men in the lumber industry, salt borers turned oil borers, and others eager to take advantage of this new opportunity.27

Progress in this new industry was messy and chaotic — and staggering. In 1859, a few thousand barrels were produced; in 1860, more than 200,000; and in 1861, more than 2 million.28 Capital poured in from investors seeking to tap into the profits. In the industry’s first five years, private capitalists invested $580 million — $7 billion in today’s dollars.29 Even in the middle of the 19th century, when wealth was relatively scarce, the supposed problem of attracting capital to fund the development of a promising energy source did not exist so long as the energy source was truly promising.

As producers demonstrated that enormous quantities of oil existed, they created a huge profit opportunity for others to build businesses performing various functions necessary to bring oil to market. At first, would-be transporters were hardly eager to build rail lines to Titusville, and would-be refiners were hardly eager to risk money on distillation machines (“stills”) that might not see use. As such, the oil industry was not functioning efficiently, and much of the oil produced in the first three years went to waste. The oil that did not go to waste was expensive to bring to market, requiring wagon-driving teamsters to haul it 20–40 miles to the nearest railroad station in costly 360-pound barrels.30

But once production reached high levels, driving crude oil prices down, the transportation, refining, and distribution of oil attracted much investment and talent. An early, price-slashing solution to transportation problems was “pond fresheting.” Entrepreneurial boatmen on Oil Creek and the Allegheny River, which led to Pittsburgh, determined that they could offer cheaper transportation by strapping barrels of oil onto rafts and floating them down the river. But this worked only half the year; the rest of the time, water levels were too low. The ingenious workaround they devised was to pay local dam owners to release stored water (a “freshet”) at certain points in the year in order to raise water levels, thereby enabling them to float their rafts downstream. The method worked, and Pittsburgh quickly became the petroleum refining capital of America.31

Railroads entered the picture as well, building lines to new cities, which allowed them to become refining cities. In 1863, the Lake Shore Railroad built a line to Cleveland, inspiring many entrepreneurs to establish refineries there — including a 23-year-old named John Rockefeller.32 Another innovation in oil transport was “gathering lines” — small, several-mile-long pipelines that connected drilling sites to local storage facilities or railroads. At first, gathering lines were stymied by the Pennsylvania government’s lax enforcement of property rights: the politically influential teamsters would tear down new pipelines, and the government would look the other way. But once rights were protected, gathering lines could be constructed quickly for any promising drilling site, enabling sites to pump oil directly to storage facilities or transportation centers without the loss, danger, and expense of using barrels and teamsters. Still another innovation was the tank car. These special railroad cars could carry far more oil than could normal boxcars loaded with barrels, and, once certain problems were solved (wooden cars were replaced by iron ones, and measures were taken to prevent explosion), they became the most efficient means of transportation.33

In the area of refining, innovation was tremendous. Certain industry leaders, such as Joshua Merrill of the Samuel Downer Company and Samuel Andrews of Clark, Rockefeller, and Andrews (later to be named Standard Oil), continuously experimented to solve difficulties associated with the refining process. To refine crude oil is to extract from it one or more of its valuable “fractions,” such as kerosene for illumination, paraffin wax for candles, and gasoline for fuel. The process employs a still to heat crude oil at multiple, increasing temperatures to boil off and separate the different fractions, each of which has a different boiling point. Distillation is simple in concept and basic execution, but to boil off and bottle kerosene was hugely problematic: Impure kerosene could be highly noxious and highly explosive. Additionally, early stills did not last very long, yielded small amounts of kerosene per unit, took hours upon hours to cool between batches, and raised numerous other challenges.
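
The distillation logic described here is simple enough to sketch. Below is a minimal illustration in Python (not any historical refinery’s actual process), using rough, assumed boiling ranges to show how raising the still’s temperature separates crude into fractions:

```python
# Illustrative sketch of fractional distillation. The boiling ranges
# (degrees Celsius) are rough modern approximations assumed for the
# example; real cuts varied with the crude and the still.
FRACTIONS = [
    ("gasoline", 30, 200),                 # long a waste product of kerosene refining
    ("kerosene", 150, 275),                # the prized illuminant of the 1860s
    ("heavy oils and paraffin", 275, 400),
]

def boiling_off(still_temp_c):
    """Name the fractions that vaporize at a given still temperature."""
    return [name for name, low, high in FRACTIONS if low <= still_temp_c <= high]

# Heating the still in stages boils off successive fractions, each of
# which is condensed and collected separately.
for temp in (100, 180, 250, 320):
    names = ", ".join(boiling_off(temp)) or "nothing yet"
    print(f"at {temp} deg C: {names}")
```

Note that the ranges overlap, which points to one reason early kerosene was so dangerous: a refiner who cut the kerosene fraction too wide left explosive light ends in the lamp oil.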

Throughout the 1860s, the leading refiners experimented with all aspects of the refining process: Should stills be shaped horizontally or vertically? How should heat be applied for evenness of temperature? How can the life of the still be maximized? How can the tar residue at the bottom be cleaned quickly and with as little damage to the still as possible? What procedures should one employ to purify the kerosene once distillation has been performed? When the process involves a chemical treatment, how much of that treatment should be used? Is it profitable to “crack” the oil, heating it at high temperature to create more kerosene molecules, which creates more kerosene per barrel but takes longer and requires expensive purification procedures?

The leading refiners progressively asked and answered these questions, and profited immensely from the knowledge they gained. By the end of the 1860s, the basics of refining technology had been laid down,34 though it would not be until the 1870s — the Rockefeller era — that they would be employed industry-wide.

On the marketing and distribution end, kerosene became a widely available good. Refining firms made arrangements with end sellers, most notably wholesale grocers and wholesale druggists, to sell their product. Rockefeller’s firm was a pioneer in international sales, setting up a New York office to sell kerosene all around the world — where it was in high demand thanks to its quality and cheapness, and to the lack of alternatives.35

The pace of growth of the oil industry was truly phenomenal. Within five years of its inception, with no modern communication or construction technology, the industry had made light accessible to even some of the poorest Americans. In 1864, a chemist wrote:

Kerosene has, in one sense, increased the length of life among the agricultural population. Those who, on account of the dearness or inefficiency of whale oil, were accustomed to go to bed soon after the sunset and spend almost half their time in sleep, now occupy a portion of the night in reading and other amusements.36

Within five years, an unknown technology and an unimagined industry had become a source of staggering wealth creation. Had the early days of this industry been somehow filmed, one would see oilmen in every aspect of the business building up an enormous industry, moving as if the film were being fast-forwarded. Almost nothing in history rivals this pace of development, and it is inconceivable today that any construction-heavy industry could progress as quickly. It now takes more than five years just to get a permit to start building an oil derrick, let alone to complete the derrick, much less thousands of them.

But in the mid-1800s, no drilling permits or other government permissions were required to engage in productive activity. This did not mean that oilmen could pollute at will — property rights laws prohibited polluting others’ property (though some governments, unfortunately, were lax in their enforcement of such laws). It did mean that, for the most part, they were treated as innocent until proven guilty; and they knew that so long as they followed clearly defined laws, their projects would be safe.37

Anyone with an idea could implement it as quickly as his abilities permitted. If he thought a forest contained a valuable mineral, he could buy it. If he thought drilling was the best means of extracting the mineral, he could set up a drilling operation. If he thought a railroad or a pipeline was economical, he could acquire the relevant rights-of-way, clear the land, and build one. If he thought he could do something better than others, he could try — and let the market be the judge. And he could do all of these things by right, without delay — in effect, developing energy at the speed of thought.

As one prominent journalist wrote:

It is certain . . . the development [of the petroleum industry] could never have gone on at anything like the speed that it did except under the American system of free opportunity. Men did not wait to ask if they might go into the Oil Region: they went. They did not ask how to put down a well: they quickly took the processes which other men had developed for other purposes and adapted them to their purpose. . . . Taken as a whole, a truer exhibit of what must be expected of men working without other regulation than that they voluntarily give themselves is not to be found in our industrial history.38

Imagine if George Bissell and Edwin Drake were to pursue the idea of drilling for oil in today’s political context. At minimum, they would have to go through a multiyear approval process in which they would be required to do environmental impact studies documenting the expected impact on every form of local plant and animal life. Then, of course, they would have to contend with zoning laws, massive taxes, and government subsidies handed to their competitors. More likely, the EPA would simply ax the project, declaring Titusville “protected” government land (the fate of one-third of the land in the United States today). More likely still, Bissell would not even seriously consider such a venture, knowing that the government apparatus would wreck it with unbearable costs and delays, or a bureaucratic veto.

The speed of progress depends on two things: the speed at which men can conceive of profitable means of creating new value — and the speed at which they can implement their ideas. Since future discoveries depend on the knowledge and skills gained from past discoveries, delays in market activity retard both the application and the discovery of new knowledge.

In 1865, the oil industry experienced a tiny fraction of the government interference with which the modern industry regularly contends: the Civil War–era Revenue Act of 1865, a $1 per barrel tax on crude inventory — approximately 13 percent of the price. The Act “slowed drilling to a virtual standstill” and “put hundreds of marginal producers out of business” by eating into businesses’ investment and working capital.39 Remarkably, the damage done by the Act scared the government away from taxing crude and oil products for decades — an effective apology for its previous violation of property rights. Such was the general economic climate of the time.

After the brief but crushing bout of confiscatory taxation, the economic freedom that made possible the rise of the oil industry resumed, as did the industry’s explosive growth. In 1865, kerosene cost 58 cents a gallon, much less expensive than any prior product had been — and half the price of coal oil.40 But entrepreneurs did not have time to revel in the successes of the past. They were too busy planning superior ventures for the future — knowing that with creativity they could always come up with something better, and that customers would always reward better, cheaper products.

The paragon of this relentless drive to improve was Rockefeller, who developed a new business structure that would bring the efficiency of oil refining — and ultimately, the whole process of producing and selling oil — to new heights. Rockefeller was obsessed with efficiency and with careful accounting of profit and loss. In seeking to maximize his efficiency, he had one central realization that steered the fate of his company: Tremendous efficiency could be achieved through scale. From his first investment in a refinery in 1863, when he built the largest refinery in Cleveland, to his continual borrowing to expand the size of his operations, Rockefeller realized that the more oil he refined, the more he could invest in expensive but efficient devices and practices whose often-high costs could be spread over a large number of units. He created barrel-making facilities that cut his barrel costs from $3 to $1 each. He built large-scale refineries that required less labor per barrel. He purchased a fleet of tank cars, and created an arrangement with a railroad that lowered his costs from $900,000 to $300,000 a trip. (Such savings are the real basis of Rockefeller’s much-maligned rebates from railroads.)
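
The scale logic can be made concrete with a toy calculation. In the sketch below, only the $3 and $1 barrel figures come from the text; the fixed cost of a barrel-making facility and the volume tiers are hypothetical, chosen purely to illustrate how a fixed investment pays off only at scale:

```python
# Toy model of economies of scale: a fixed investment is worthwhile
# only when spread over enough units. The $3 (bought) and $1 (made)
# per-barrel figures are from the text; the plant cost and volumes
# below are hypothetical.
def unit_cost(fixed_cost, variable_cost, units):
    """Average cost per barrel once the fixed investment is amortized."""
    return fixed_cost / units + variable_cost

COOPER_PRICE = 3.00      # per barrel, bought from an outside cooper
MADE_VARIABLE = 1.00     # per barrel, once an in-house plant exists
PLANT_COST = 50_000      # hypothetical fixed cost of the plant

for units in (10_000, 25_000, 100_000):
    cost = unit_cost(PLANT_COST, MADE_VARIABLE, units)
    verdict = "build the plant" if cost < COOPER_PRICE else "keep buying"
    print(f"{units:>7,} barrels/yr -> ${cost:.2f} each: {verdict}")
```

On these assumed numbers, a small refiner handling 10,000 barrels a year is better off buying barrels; at Rockefeller’s volumes, making them in-house cuts the cost by two-thirds, which is the sense in which scale itself was the efficiency.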

Rockefeller’s improvements, which can be enumerated almost indefinitely, helped lower the prevailing per-gallon price of kerosene from 58 cents in 1865, to 26 cents in 1870 — a price at which most of his competitors could not afford to stay in business — to 8 cents in 1880. These incredible prices represented the continuous breakthroughs that the Rockefeller-led industry was making. Every five years marked another period of dramatic progress — whether through long-distance pipelines that eased distribution or through advances in refining that made use of vast deposits of previously unrefinable oil. Oil’s potential was so staggering that no alternative was necessary. But then someone conceived of one: the electric lightbulb.

Actually, many men had conceived of electric lightbulbs in one form or another; but Thomas Edison, beginning in the late 1870s, was the first to successfully develop one that was practical and potentially profitable. Edison’s lightbulb lasted hundreds of hours, and was conceived as part of a practical distribution network — the Edison system, the first electrical utility and distribution grid. As wonderful as kerosene was, it generated heat and soot and odor and smoke and had the potential to explode; lightbulbs did not. Thus, as soon as Edison’s lightbulb was announced, the stock prices of publicly traded oil refiners plummeted.

Oil, it appeared, was no longer the future of illumination energy; electricity was. This fact, and the competitive pressures it placed on the oil industry, prompted entrepreneurs to figure out whether their product could enjoy comparable consumer demand in any other sphere, inside or outside of the energy industry. They worked to expand the market for oil as a lubricant and as a fuel for railroads and tankers. But the fate of the industry would hinge on the rise of the automobile in the 1890s.41

It is little known that most builders of automobiles did not intend them to run on gasoline. Given the growth and popularity of electricity at the time, many cars were designed to run on electric batteries, whereas other cars ran on steam or ethanol. Gasoline’s dominance was not a fait accompli.

If the market had not been free, the electric car would likely have been subsidized into victory, given the obsession with electricity at the time. But when the technologies were tested in an open market, oil/gasoline won out — because of the incredible efficiency of the Rockefeller-led industry coupled with gasoline’s energy density. Per unit of mass and volume, it could take a car farther than an electric battery or a pile of coal or a vat of ethanol (something that remains true to this day). Indeed, Thomas Edison himself explained this to Henry Ford, in a story told by electricity entrepreneur Samuel Insull.

“He asked me no end of details,” to use Mr. Ford’s own language, “and I sketched everything for him; for I have always found that I could convey an idea quicker by sketching than by just describing it.” When the conversation ended, Mr. Edison brought his fist down on the table with a bang, and said: “Young man, that’s the thing; you have it. Keep at it. Electric cars must keep near to power stations. The storage battery is too heavy. Steam cars won’t do, either, for they require a boiler and fire. Your car is self-contained — carries its own power plant — no fire, no boiler, no smoke and no steam. You have the thing. Keep at it.”. . . And this at a time when all the electrical engineers took it as an established fact that there could be nothing new and worthwhile that did not run by electricity.42
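
Edison’s point about the storage battery is, at bottom, a point about energy density. Here is a minimal back-of-the-envelope comparison, using rough modern figures that are assumed for illustration rather than taken from the article, along with ballpark efficiency factors (an engine wastes far more of its fuel’s energy than an electric drivetrain does):

```python
# Rough energy-density comparison. All numbers are approximate and
# assumed for illustration: MJ/kg of stored energy, and the fraction
# of that energy delivered as useful work.
sources = {
    "gasoline":            (46.0, 0.25),  # chemical energy; ~25% engine efficiency
    "lithium-ion battery": (0.7, 0.85),   # modern chemistry; ~85% drivetrain efficiency
    "lead-acid battery":   (0.14, 0.80),  # the chemistry of Edison's day
}

for name, (mj_per_kg, efficiency) in sources.items():
    useful = mj_per_kg * efficiency  # useful work per kilogram carried
    print(f"{name:>19}: ~{useful:5.2f} MJ of work per kg carried")
```

Even after charging gasoline for its inefficient engine, a kilogram of it delivers on these assumptions roughly twenty times the useful work of a kilogram of modern battery, and about a hundred times that of the lead-acid batteries Edison had in mind. “The storage battery is too heavy” was, and remains, a quantitative judgment.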

By 1912, gasoline had become a staple of life — and was on the way to changing it even more than kerosene had. A trade journal from 1912, Gasoline — The Modern Necessity, read:

It seems almost unbelievable that there was once a time when the refiners and manufacturers of petroleum products concerned themselves seriously with finding a market for the higher distillates. At the present time it is the higher distillate known as gasoline that is giving not alone the refiners grave concern but modern civilization as well. Then it was how to find an adequate and profitable market for it; now it is how to meet the ever-increasing demand for it.43

Oil was the ultimate alternative energy — first for illumination, then for locomotion. In a mere half century, oil went from being useless black goo to the chief energy source leading the illumination and mobilization of the world. Young couples filling up their automobiles in 1910 had nary a clue as to how much thought and knowledge went into their ability to power their horseless carriages so cheaply and safely. Nor did most appreciate that all of this depended on a political system in which the government’s recognition and protection of the right to property and contract enabled businessmen to develop the world around them, risk their time and money on any innovation they chose, and profit from the results.

If we compare today’s “planned” energy market to the rights-respecting energy market that brought about the emergence of oil, we can see in concrete fact the practicality of a genuinely free market.

Instead of protecting property rights and unleashing the producers of energy to discover the best forms of energy and determine how best to deploy them (which includes genuine privatization of the electricity grid and other transcontinental development),44 our government randomly dictates what the future is to be. Today, we are told, as if it were written in the stars, that plug-in hybrids powered by solar and wind on a “smart grid” are the way to go — a claim that has no more validity than an 1860s claim that a network of wagon drivers should deliver coal oil nationwide.

What sources of energy are best pursued and how best to pursue them can be discovered only by millions of minds acting and interacting freely in the marketplace — where anyone with a better idea is free to prove it and unable to force others to fund his pursuit. When the government interferes in the marketplace, countless productive possibilities are precluded from coming into existence.

Today’s government as “energy planner” not only thwarts the market by coercively subsidizing the “right” energy technologies; it damages the market by opposing or even banning the “wrong” energy technologies or business models. Today’s energy policy severely restricts the production of every single practical, scalable form of energy: coal, natural gas, oil, and, above all, nuclear.

Nuclear energy deserves special mention because it has tremendous proven potential, the result of an incredible energy density more than one million times that of any fossil fuel, and because, unlike oil, coal, or natural gas, it has never been allowed to develop in anything resembling a free market. Thanks to environmentalist hysteria, this proven-safe source of energy has been virtually banned in the United States. And when nuclear plants have been permitted, construction costs and downtime losses have been multiplied many times over by multi-decade regulatory delays. Even in other countries, where nuclear power is much more welcome, it is under the yoke of governments and is therefore progressing at a fraction of its potential.

If the scientists, engineers, and businessmen in the nuclear power industry had been able to pursue their ideas and develop their products in a free market — as oilmen once were able to do — how much better would our lives be today? What further technologies would have blossomed from that fertile foundation? Would automobiles even be running on gasoline? Would coal be used for anything anymore? And if entrepreneurs with other, perhaps even better, energy ideas had been free to put them into practice as quickly as their talents would allow — just as their 19th-century forebears had — might we by now have realized the dream of supplanting nuclear fission with nuclear fusion, which many consider the holy grail of energy potential?

The fact is, we cannot even dream of what innovations would have developed or what torrents of energy would have been unleashed. As the history of the original alternative energy industry illustrates, no one can predict the revolutionary outcomes of a market process. Happily, however, with respect to the future, we can do better than dream: We can see for ourselves what kind of untapped energy potential exists, by learning from the 19th century. We can — and must — remove the political impediments to energy progress by limiting the government to the protection of rights. Then, we will witness something truly spectacular: energy at the speed of 21st-century thought.

About The Author

Alex Epstein

Alex Epstein was a writer and a fellow on staff at the Ayn Rand Institute between 2004 and 2011.

Endnotes

1 Robert Bryce, Gusher of Lies (New York: PublicAffairs, 2008), p. 132.

2 Ibid., pp. 267–70.

3 International Energy Outlook 2008, “Highlights,” Energy Information Administration, U.S. Department of Energy, http://www.eia.doe.gov/oiaf/ieo/pdf/highlights.pdf.

4 Annual Energy Review, “U.S. Primary Energy Consumption by Source and Sector, 2007,” Energy Information Administration, U.S. Department of Energy, http://www.eia.doe.gov/emeu/aer/pecss_diagram.html.

5 “Al Gore’s Challenge to Repower America,” speech delivered July 17, 2008, http://www.repoweramerica.org/about/challenge/.

6 Ibid.

7 Thomas L. Friedman, Hot, Flat, and Crowded: Why We Need a Green Revolution—and How It Can Renew America (New York: Farrar, Straus & Giroux, 2008), p. 172.

8 Ibid., p. 186.

9 Ibid., pp. 187–88.

10 Jimmy Carter, “NATIONAL ENERGY PLAN—Address Delivered Before a Joint Session of the Congress,” April 20, 1977, http://www.presidency.ucsb.edu/ws/index.php?pid=7372.

11 http://www.eia.doe.gov/aer/txt/ptb0103.html.

12 Richard Nixon, “Special Message to the Congress on Energy Policy,” April 18, 1973, http://www.presidency.ucsb.edu/ws/index.php?pid=3817&st.

13 Linda R. Cohen and Roger G. Noll, The Technology Pork Barrel (Washington, DC: The Brookings Institution, 1991), pp. 217–18.

14 Ibid., pp. 259–313.

15 In this regard, I recommend Gusher of Lies by Robert Bryce and The Technology Pork Barrel by Linda R. Cohen and Roger G. Noll.

16 Harold F. Williamson and Arnold R. Daum, The American Petroleum Industry 1859–1899: The Age of Illumination (Evanston, IL: Northwestern University Press, 1963), p. 29.

17 Ibid., p. 320.

18 Ibid., p. 28.

19 M. Luckiesh, Artificial Light (New York: The Century Co., 1920), pp. 51–56.

20 Williamson and Daum, The American Petroleum Industry, pp. 33–34.

21 Ibid., pp. 34–36.

22 Ibid., p. 32.

23 Ibid., pp. 32, 38–42.

24 This discussion is based on Williamson and Daum, The American Petroleum Industry, pp. 43–60.

25 Calculated using GDP Deflator and CPI, http://www.measuringworth.com/.

26 This discussion is based on Williamson and Daum, The American Petroleum Industry, pp. 63–81.

27 Ibid., pp. 86–89.

28 Ibid., p. 103.

29 Robert L. Bradley, Oil, Gas, and Government: The U.S. Experience, vol. 1 (London: Rowman & Littlefield, 1996), p. 18.

30 Williamson and Daum, The American Petroleum Industry, pp. 85, 106.

31 Ibid., pp. 165–69.

32 Burton W. Folsom, The Myth of the Robber Barons (Herndon, VA: Young America’s Foundation, 1996), p. 85.

33 Williamson and Daum, The American Petroleum Industry, pp. 183–89.

34 Ibid., pp. 202–31.

35 Alex Epstein, “Vindicating Capitalism: The Real History of the Standard Oil Company,” The Objective Standard, Summer 2008, pp. 29–35.

36 Williamson and Daum, The American Petroleum Industry, p. 320.

37 For a comprehensive account of the existence and decline of economic freedom in the oil industry, see Bradley, Oil, Gas, and Government.

38 Paul Henry Giddens, The Birth of the Oil Industry (New York: The Macmillan Company, 1938), p. xxxix.

39 Bradley, Oil, Gas, and Government.

40 Discussion based on Alex Epstein, “Vindicating Capitalism.”

41 Discussion based on Harold F. Williamson, Ralph L. Andreano, Arnold R. Daum, and Gilbert C. Klose, The American Petroleum Industry, 1899–1959: The Age of Energy (Evanston, IL: Northwestern University Press, 1963), pp. 184–95.

42 Samuel Insull, The Memoirs of Samuel Insull (Polo, IL: Transportation Trails, 1934, 1992), pp. 142–43.

43 Williamson, Andreano, Daum, and Klose, The American Petroleum Industry, p. 195.

44 Raymond C. Niles, “Property Rights and the Electricity Grid,” The Objective Standard, Summer 2008.

No More Green Guilt

by Keith Lockitch | May 01, 2009

Every investment prospectus warns that “past performance is no guarantee of future results.” But suppose that an investment professional’s record contains nothing but losses, of failed prediction after failed prediction. Who would still entrust that investor with his money?

Yet, in public policy there is one group with a dismal track record that Americans never seem to tire of supporting. We invest heavily in its spurious predictions, suffer devastating losses, and react by investing even more, never seeming to learn from the experience. The group I’m talking about is the environmentalist movement.

Consider their track record — like the dire warnings of catastrophic overpopulation. Our unchecked consumption, we were told, was depleting the earth’s resources and would wipe humanity out in a massive population crash. Paul Ehrlich’s 1968 bestseller, The Population Bomb, forecast hundreds of millions of deaths per year throughout the 1970s, to be averted, he insisted, only by mass population control “by compulsion if voluntary methods fail.”

But instead of global-scale famine and death, the 1970s witnessed an agricultural revolution. Despite a near-doubling of world population, food production has continued to grow as technological innovation creates more and more food on each acre of farmland. The U.S., which has seen its population grow from 200 million to 300 million, is more concerned about rampant obesity than about a shortage of food.

The Alar scare of 1989 is another great example. The NRDC, an environmentalist lobby group, engineered a media frenzy over the baseless assertion that Alar, an apple-ripening agent, posed a cancer threat. The ensuing panic cost the apple industry over $200 million, and Alar was pulled from the market even though it was a perfectly safe and value-adding product.

Or consider the campaign against the insecticide DDT, beginning with Rachel Carson’s 1962 book Silent Spring. The world had been on the brink of eradicating malaria using DDT — but for Carson and her followers, controlling disease-carrying mosquitoes was an arrogant act of “tampering” with nature. Carson issued dire warnings that nature was “capable of striking back in unexpected ways” unless people showed more “humility before [its] vast forces.” She asserted, baselessly, that among other things DDT would cause a cancer epidemic. Her book led to such a public outcry that, despite its life-saving benefits and mountains of scientific evidence supporting its continued use, DDT was banned in the United States in 1972. Thanks to environmentalist opposition, DDT was almost completely phased out worldwide. And while there is still zero evidence of a DDT cancer risk, the resurgence of malaria needlessly kills over a million people a year.

Time and time again, the supposedly scientific claims of environmentalists have proven to be pseudo-scientific nonsense, and the Ehrlichs and Carsons of the world have proven to be the Bernard Madoffs of science. Yet Americans have ignored the evidence and have instead invested in their claims — accepting the blame for unproven disasters and backing coercive, harmful “solutions.”

Today, of course, the Green doomsday prediction is for catastrophic global warming to destroy the planet — something that environmentalists have pushed since at least the early 1970s, when they were also worried about a possible global cooling shifting the planet into a new ice age.

But in this instance, just as with Alar, DDT, and the population explosion, the science is weak and the “solutions” drastic. We are told that global warming is occurring at an accelerating rate, yet global temperatures have been flat for the last decade. We are told that global warming is causing more frequent and intense hurricanes, yet the data does not support such a claim. We are warned of a potentially catastrophic sea-level rise of 20 feet over the next century, but that would require significant melting of the land-based ice in Antarctica and Greenland. Greenland has retained its ice sheet for over 100,000 years despite wide-ranging temperatures, and Antarctica has been cooling moderately for the last half-century.

Through these distortions of science we are again being harangued to support coercive policies. We are told that our energy consumption is destroying the planet and that we must drastically reduce our carbon emissions immediately. Never mind that energy use is an indispensable component of everything we do, that 85 percent of the world’s energy is carbon-based, that there are no realistic, abundant alternatives available any time soon, and that billions of people are suffering today from lack of energy.

Despite all of that, Americans seem to once again be moving closer to buying the Green investment pitch and backing destructive Green policies. Why don’t we learn from past experience? Do you think a former Madoff investor would hand over money to him again?

It’s not that we’re too stupid to learn; it’s that we are holding onto a premise that distorts our understanding of reality. Americans are the most successful individuals in history — even in spite of this economic downturn — in terms of material wealth and the quality of life and happiness it brings. We are heirs to the scientific and industrial revolutions, which have increased life expectancy from 30 years to 80 and improved human life in countless, extraordinary ways. Through our ingenuity and productive effort, we have achieved an unprecedented prosperity by reshaping nature to serve our needs. Yet we have always regarded this productivity and prosperity with a certain degree of moral suspicion. The Judeo-Christian ethic of guilt and self-sacrifice leads us to doubt the propriety of our success and makes us susceptible to claims that we will ultimately face punishment for our selfishness — that our prosperity is sinful and can lead only to an apocalyptic judgment day.

Environmentalism preys on our moral unease and fishes around for doomsday scenarios. If our ever-increasing population or life-enhancing chemicals have not brought about the apocalypse, then it must be our use of fossil fuels that will. Despite the colossal failures of past Green predictions, we buy into the latest doomsday scare because, on some level, we have accepted an undeserved guilt. We lack the moral self-assertiveness to regard our own success as virtuous; we think we deserve punishment.

It is time to stop apologizing for prosperity. We must reject the unwarranted fears spread by Green ideology by rejecting unearned guilt. Instead of meekly accepting condemnation for our capacity to live, we should proudly embrace our unparalleled ability to alter nature for our own benefit as the highest of virtues.

It’s time to recapture our Founding Fathers’ admiration for the virtue of each individual’s pursuit of his own happiness.

About The Author

Keith Lockitch

Vice President of Education and Senior Fellow, Ayn Rand Institute

The Real Meaning of Earth Hour

by Keith Lockitch | March 23, 2009

On Saturday, March 28, cities around the world will turn off their lights to observe “Earth Hour.” Iconic landmarks from the Sydney Opera House to Manhattan’s skyscrapers will be darkened to encourage reduced energy use and signal a commitment to fighting climate change.

While a one-hour blackout will admittedly have little effect on carbon emissions, what matters, organizers say, is the event’s symbolic meaning. That’s true, but not in the way organizers intend.

We hear constantly that the debate is over on climate change — that man-made greenhouse gases are indisputably causing a planetary emergency. But there is ample scientific evidence to reject the claims of climate catastrophe. And what’s never mentioned? The fact that reducing greenhouse gases to the degree sought by climate activists would, itself, cause significant harm.

Politicians and environmentalists, including those behind Earth Hour, are not calling on people merely to change a few light bulbs; they are calling for a truly massive reduction in carbon emissions — as much as 80 percent below 1990 levels. Because our energy is overwhelmingly carbon-based (fossil fuels provide more than 80 percent of world energy), and because the claims of abundant “green energy” from breezes and sunbeams are a myth, this necessarily means a massive reduction in our energy use.
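
The scale of the demanded cut can be checked with simple arithmetic. Here is a minimal sketch using the article’s own shares; it assumes emissions scale with carbon-based energy use, and it is generous in holding non-carbon energy constant and in treating today’s emissions as merely equal to, rather than greater than, 1990 levels:

```python
# Back-of-the-envelope version of the argument above. The 80 percent
# figures are the article's; everything else is simple proportion.
total_energy = 100.0                # index: world energy use today
carbon_share = 0.80                 # "more than 80 percent" is carbon-based
emissions_cut = 0.80                # demanded cut in carbon emissions

carbon_after = total_energy * carbon_share * (1 - emissions_cut)  # 16 units
non_carbon = total_energy * (1 - carbon_share)                    # 20 units
remaining = carbon_after + non_carbon                             # 36 units

print(f"energy remaining: {remaining:.0f}% of today's use")
print(f"implied cut in total energy use: {100 - remaining:.0f}%")
```

On those assumptions, an 80 percent emissions cut implies giving up roughly two-thirds of the world’s energy use, unless the “green energy” sources the organizers invoke somehow scale up to fill the gap.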

People don’t have a clear view of what this would mean in practice. We, in the industrialized world, take our abundant energy for granted and don’t consider just how much we benefit from its use in every minute of every day. Driving our cars to work and school, sitting in our lighted, heated homes and offices, powering our computers and countless other labor-saving appliances, we count on the indispensable values that industrial energy makes possible: hospitals and grocery stores, factories and farms, international travel and global telecommunications. It is hard for us to project the degree of sacrifice and harm that proposed climate policies would force upon us.

This blindness to the vital importance of energy is precisely what Earth Hour exploits. It sends the comforting-but-false message: Cutting off fossil fuels would be easy and even fun! People spend the hour stargazing and holding torch-lit beach parties; restaurants offer special candle-lit dinners. Earth Hour makes the renunciation of energy seem like a big party.

Participants spend an enjoyable sixty minutes in the dark, safe in the knowledge that the life-saving benefits of industrial civilization are just a light switch away. This bears no relation whatsoever to what life would actually be like under the sort of draconian carbon-reduction policies that climate activists are demanding: punishing carbon taxes, severe emissions caps, outright bans on the construction of power plants.

Forget one measly hour with just the lights off. How about Earth Month, without any form of fossil fuel energy? Try spending a month shivering in the dark without heating, electricity, refrigeration; without power plants or generators; without any of the labor-saving, time-saving, and therefore life-saving products that industrial energy makes possible.

Those who claim that we must cut off our carbon emissions to prevent an alleged global catastrophe need to learn the indisputable fact that cutting off our carbon emissions would be a global catastrophe. What we really need is greater awareness of just how indispensable carbon-based energy is to human life (including, of course, to our ability to cope with any changes in the climate).

It is true that the importance of Earth Hour is its symbolic meaning. But that meaning is the opposite of the one intended. The lights of our cities and monuments are a symbol of human achievement, of what mankind has accomplished in rising from the cave to the skyscraper. Earth Hour presents the disturbing spectacle of people celebrating those lights being extinguished. Its call for people to renounce energy and to rejoice at darkened skyscrapers makes its real meaning unmistakably clear: Earth Hour symbolizes the renunciation of industrial civilization.

About The Author

Keith Lockitch

Vice President of Education and Senior Fellow, Ayn Rand Institute

Further Reading

Ayn Rand | 1957
For the New Intellectual

The Moral Meaning of Capitalism

An industrialist who works for nothing but his own profit guiltlessly proclaims his refusal to be sacrificed for the “public good.”
Ayn Rand | 1961
The Virtue of Selfishness

The Objectivist Ethics

What is morality? Why does man need it? — and how the answers to these questions give rise to an ethics of rational self-interest.