What’s Really Driving the Toyota Controversy?

by Don Watkins and Yaron Brook | March 26, 2010

How many Congressmen does it take to identify the cause of a runaway Toyota Prius? No, it’s not a trick question. Yesterday a Congressional panel issued a draft report on a case of supposed runaway acceleration reported last week in San Diego.

Why wasn’t that left to the objective assessment of the police and courts? The answer to that question was made clear during last month’s Congressional hearings on the Toyota recalls.

In February Toyota execs were hauled in front of Congress, purportedly so that renowned auto experts like Henry Waxman could determine the cause of reported cases of unintended acceleration and evaluate Toyota’s alleged failure to respond.

The hearings had the fingerprints of politics all over them. Indeed, it was uncanny how easily a careful observer could predict whether a given Congressman would defend or deride Toyota. Just take stock of which pressure groups dominate his district.

Would you believe that Rep. John Dingell of Detroit was highly critical of Toyota? Or that Henry Cuellar, whose district is home to thousands of Toyota employees, came to the car company’s defense?

The whole spectacle is a rogues’ gallery of pressure groups descending upon Washington.

The United Auto Workers union showed up hoping to use Toyota’s problems as leverage to force the company to keep open its sole UAW factory. Rumor has it that some in the UAW camp even hope to unionize all of Toyota’s U.S. factories.

The trial lawyers are drooling at the prospect of parlaying Toyota president Akio Toyoda’s apology into hundred-million-dollar awards. No need to wait for the evidence to come in, either — they have their own “experts” who already “know” Toyota is covering up pedals of doom.

And then there’s the Detroit lobby, which is hard to distinguish from the government itself now that the government holds a 60 percent stake in General Motors. Even if concerns about an electronic problem in Toyota’s pedals turn out to be baseless, domestic manufacturers stand to benefit by prolonging the parade of bad publicity.

Anyone who thinks Toyota’s executives were summoned to Congress to discuss the evidence concerning Toyota’s pedals has missed the point. Regardless of what we ultimately discover about Toyota, this was about a horde of pressure groups seeking to impose their economic agendas via political power. But pressure groups are only a symptom. The cause is the government’s power to intervene in the market to pick winners and losers.

In the auto industry alone, the government controls everything from whom car companies can hire (unionized employees) to what kind of vehicles they must build (hybrids). And elsewhere it decides which businesses are “too big to fail,” which industries “deserve” massive subsidies, and which unproven technologies warrant billions of taxpayer “investment.” That’s a recipe for pressure group warfare.

This is not what Madison and Jefferson had in mind. Their vision was of a strictly limited government, which would perform one basic function — guard individual rights. Its role was to protect the individual’s rights to life, liberty, and property from infringement by thugs and frauds, while otherwise leaving people free to produce and trade in a free market. In the original American system, it’s the job of the market to pick winners and losers, and the job of the courts — not Congress — to arbitrate disputes, such as that between Toyota and drivers harmed in accidents.

The truth is Toyota’s troubles should not be a political issue. On a free market, Toyota would have to address the real or alleged problems with its cars and work to restore its reputation with consumers, or suffer the consequences. And if the company were proved in a court of law to be guilty of negligence, it would be held accountable. In any case, there would be no need for the circus now taking place, with all its sordid political posturing and favor-trading.

So here’s a proposal. Make Washington come up with a plan to disentangle government from the economy. It might even start with a Congressional investigation.

About The Authors

Don Watkins

Former Fellow (2006-2017), Ayn Rand Institute

Yaron Brook

Chairman of the Board, Ayn Rand Institute

Anti-Smoking Paternalism: A Cancer on American Liberty

by Don Watkins | March 06, 2010

Newport Beach is considering banning smoking in a variety of new places, potentially including parks and outdoor dining areas. This is just the latest step in a widespread war on smoking by federal, state, and local governments — a campaign that includes massive taxes on cigarettes, advertising bans, and endless lawsuits against tobacco companies. This war is infecting America with a political disease far worse than any health risk caused by smoking; it is destroying our freedom to make our own judgments and choices.

According to the anti-smoking movement, restricting people’s freedom to smoke is justified by the necessity of combating the “epidemic” of smoking-related disease and death. Cigarettes, we are told, kill hundreds of thousands each year, and expose countless millions to secondhand smoke. Smoking, the anti-smoking movement says, in effect, is a plague, whose ravages can only be combated through drastic government action.

But smoking is not some infectious disease that must be quarantined and destroyed by the government. It’s a voluntary activity that every individual is free to abstain from (including by avoiding restaurants and other private establishments that permit smoking). And, contrary to those who regard any smoking as irrational on its face, cigarettes are a potential value that each individual must assess for himself. Of course, smoking can be harmful — in certain quantities, over a certain period of time, it can be habit forming and lead to disease or death. But many understandably regard the risks as minimal if one smokes relatively infrequently, and they see smoking as offering definite value, such as physical pleasure.

Are they right? Can it be a value to smoke cigarettes — and if so, in what quantity? This is the sort of judgment that properly belongs to every individual, based on his assessment of the evidence concerning smoking’s benefits and risks, and taking into account his particular circumstances (age, family history, etc.). If others believe the smoker is making a mistake, they are free to try to persuade him of their viewpoint. But they should not be free to dictate his decision, any more than they should be able to dictate his decision on whether and to what extent to drink alcohol or play poker. The fact that some individuals will smoke themselves into an early grave is no more a justification for banning smoking than the existence of alcoholics is a justification for prohibiting you from enjoying a drink at dinner.

Implicit in the war on smoking, however, is the view that the government must dictate the individual’s decisions with regard to smoking, because he is incapable of making them rationally. To the extent the anti-smoking movement succeeds in wielding the power of government coercion to impose on Americans its blanket opposition to smoking, it is entrenching paternalism: the view that individuals are incompetent to run their own lives, and thus require a nanny-state to control every aspect of those lives.

This state is well on its way: from trans-fat bans to bicycle helmet laws to prohibitions on gambling, the government is increasingly abridging our freedom on the grounds that we are not competent to make rational decisions in these areas — just as it has long done by paternalistically dictating how we plan for retirement (Social Security) or what medicines we may take (the FDA).

Indeed, one of the main arguments used to bolster the anti-smoking agenda is the claim that smokers impose “social costs” on non-smokers, such as smoking-related medical expenses — an argument that perversely uses an injustice created by paternalism to support its expansion. The only reason non-smokers today are forced to foot the medical bills of smokers is that our government has virtually taken over the field of medicine, in order to relieve us inept Americans of the freedom to manage our own health care, and bear the costs of our own choices.

But contrary to paternalism, we are not congenitally irrational misfits. We are thinking beings for whom it is both possible and necessary to rationally judge which courses of action will serve our interests. The consequences of ignoring this fact range from denying us legitimate pleasures to literally killing us: from the healthy 26-year-old unable to enjoy a trans-fatty food to the 75-year-old man unable to take an unapproved, experimental drug without which he will certainly die.

By employing government coercion to deprive us of the freedom to judge for ourselves what we inhale or consume, the anti-smoking movement has become an enemy, not an ally, in the quest for health and happiness.

About The Author

Don Watkins

Former Fellow (2006-2017), Ayn Rand Institute

Were the Founding Fathers Media Socialists?

by Don Watkins | March 01, 2010

The Federal Communications Commission’s Chief Diversity Officer, Mark Lloyd, wants government to socialize the media. In his 2006 book Prologue to a Farce, Lloyd calls for a far-reaching government program that would straitjacket private media companies and funnel tens of billions of dollars into a tax-supported “public” media — an agenda shared by many of his associates. A massive nonprofit media run by the state would better inform Americans, Lloyd claims, although, feeling generous, he allows that “there should be a place for private communications services in a republic.”

You might think this radical call for government control of the media is at odds with the First Amendment and the ideals of its authors. Not according to Lloyd and his fellow travelers, who portray their vision of a government-funded press as a continuation of the American tradition. The Founders, they say, weren’t committed to protecting a profit-seeking press from government control. Instead, their primary concern was making sure the press could effectively educate and inform Americans, and they obsessively sought to subsidize the press in order to achieve that goal.

Let’s review the facts. During the founding era, America was buzzing with newspapers — all of them privately owned and for-profit. Profit-seeking was so much a part of the American press that, as Professor Paul Starr notes, “The word ‘advertiser’ appeared in the title of 5 of 8 dailies published in 1790 and 20 of 24 dailies in 1800.” The Founders did not curtail this profit-seeking press or supplement it with a government press. Instead they created a limited, rights-protecting government that secured freedom of speech and of the press. They were keenly aware that a free country depended on the free communication of ideas; indeed, it was America’s burgeoning press that had helped transform the colonists from loyal subjects into intransigent rebels, something that would have been impossible had the British government controlled or restricted the press.

Lloyd’s plan is point for point a repudiation of the Founders’ ideals.

Lloyd advocates billions in new taxes on the private media, while the Founders reviled the 1765 Stamp Act, which sparked the chain of events climaxing in the Revolution, in large measure because it taxed the press.

Lloyd calls for “federal regulations over commercial broadcast and cable programs regarding political advertising and commentary, educational programs for children” and even “the number of commercials” they can run, while the Founders solemnly declared that “Congress shall make no law” abridging the freedom of speech.

Lloyd advocates a government-run “public” media that would force you to support through taxes ideas you may oppose, while the Founders recognized the individual’s freedom of conscience, which includes the right not to support views you object to.

The most Lloyd can dig up to substantiate his claim that a sprawling “public” press and crippling restrictions on the private press are consistent with the Founders’ ideals is an obscure 1792 act that reduced postal shipping rates for newspapers. According to Lloyd, the Founders’ “advocacy of the Postal Act of 1792 put communication service and a subsidy for political discourse at the center of our republic.”

It was not a subsidy but freedom that the Founders put at the center of our republic. Even if we grant Lloyd that the Founders supported the Postal Act because they saw a modest role for government in promoting the spread of news, an objective assessment of such support would have to conclude that it contradicts their fundamental commitment to a free press. The reporting of news must be left to the voluntary actions of private individuals — any news subsidy inevitably sets the stage for government control of the press (just observe Washington’s intrusion into the affairs of today’s bailout recipients).

At the deepest level, Lloyd’s is an act of moral embezzlement. He is using what is at most a minor inconsistency on the part of the Founders to smash their achievement and destroy America’s free press. The FCC’s adoption of his proposals would not continue the American tradition. It would end it.

About The Author

Don Watkins

Former Fellow (2006-2017), Ayn Rand Institute

Commercialism Only Adds to Joy of the Holidays

by Onkar Ghate | December 18, 2009 | USNews.com

I’m an atheist, and I love Christmas. If you think that’s a contradiction, think again.

Do you remember as a child composing wish lists of things you genuinely valued, thought you deserved, and knew would bring you pleasure? Do you remember eagerly awaiting the arrival of Christmas morning and the new bike, book, or chemistry set you were hoping for? That childhood feeling captures the spirit of Christmas and explains why so many of us look forward to the season each year.

You may no longer anticipate Christmas morning with that same childhood excitement. After all, even if you still make a wish list, couldn’t you just go out and buy the items yourself? Yet the pleasure of exchanging gifts as a token of friendship and love remains. Particularly when you receive (or purchase) a gift that could come only from someone who knows you well — say, a shirt that broadens your style or a new wine that becomes one of your favorites — it serves as a material reminder of a spiritual bond.

More widely, through cards, telephone calls, parties, long-distance travel, and vacation, Christmas serves as a time to reconnect with cherished family and friends, to share important events of the past year, and to look forward to the next. It’s a time to enjoy delectable chocolates, spiced eggnog, four-course meals, festive music, and party games.

Christmas is a spiritual holiday whose leitmotif is personal, selfish pleasure and joy. The season’s commercialism, far from detracting from this celebration, as we’re often told, is integral to it.

“The best aspect of Christmas,” Ayn Rand once observed, is “that Christmas has been commercialized.” The gift buying “stimulates an enormous outpouring of ingenuity in the creation of products devoted to a single purpose: to give men pleasure. And the street decorations put up by department stores and other institutions — the Christmas trees, the winking lights, the glittering colors — provide the city with a spectacular display, which only ‘commercial greed’ could afford to give us. One would have to be terribly depressed to resist the wonderful gaiety of that spectacle.”

Before Christians co-opted the holiday in the fourth century (there is no reason to believe Jesus was born in December), it was a pagan celebration of the winter solstice, of the days beginning to grow longer. The Northern European tradition of bringing evergreens indoors, for instance, was a reminder that life and production were soon to return to the now frozen earth.

This focus on earthly joy is the actual source of the emotion most commonly identified with Christmas: goodwill. When you genuinely feel good about your own life and when you’re allowed to acknowledge and celebrate that joy, you come to wish the same happiness for others. It is those who despise their own lives who lash out at and make life miserable for the rest of us.

The commercialism of Christmas reinforces our goodwill. When you scour the malls in search of the perfect gift for a loved one and witness the cornucopia of goods and lights and decorations, you can’t help but feel that your fellow human beings are not enemies to be feared or fools to be avoided but fellow travelers and potential allies in the quest for joy. It’s no accident that America, the world’s most productive country, is also its most benevolent.

Christmas’s relation to goodwill leads many to believe the holiday is inseparable from Christianity, allegedly the religion of goodwill. But the connection is tenuous. A doctrine that tells you that you’re a sinner — that you must seek redemption but cannot earn it yourself and that Jesus, sinless, has endured an excruciating death to redeem you, who doesn’t deserve his sacrifice but who should accept it anyway — can hardly be characterized as expressing a benevolent view of man.

Christianity from the outset has been suspicious of human, earthly pleasure and joy. At best, these are seen as unbecoming a sinner, who should be busy repenting and fretting over his fate in an imagined next life. There once existed a war against Christmas — when religionists held sway in America. The Puritans canceled Christmas; in Boston from 1659 to 1681, the fine for exhibiting Christmas merriment was 5 shillings.

Christmas as we know it, with its twinkling lights, flying reindeer, and dancing snowmen, is largely a creation of 19th-century America. That era, one of the most un-Christian periods in Western history, was a time of worldly invention, industrialization, and profit. Only such an era would think of a holiday dominated by commercialism and joy and sense the connection between the two.

Christmas in America is not a Christian holiday. And besides, in a country that separates church from state, no national holiday can be regarded as the purview of a religion.

But any celebration can be corrupted. It’s not uncommon today to hear people say Christmas is their most stressful period. Pressed for time (and this year probably for money, too), they feel there are just too many lights to put up, meals to cook, and gifts to buy. Seeking something to blame, they blame the commercialism of the season. But there is no commandment, “Thou shalt buy a present for everyone you know.” This is the religious mentality of duty rearing its ugly head again. Do and buy only that which you can truly afford and enjoy; there are myriad ways to celebrate with loved ones without spending a cent.

But whatever you do end up doing, don’t let the state of the economy rob you of the gaiety of the season. Perhaps now more than ever, we all need to remind ourselves that reaching joy on this Earth is the meaning of life.

Merry Christmas!

About The Author

Onkar Ghate

Chief Philosophy Officer and Senior Fellow, Ayn Rand Institute

Smash the Labor Monopolies!

by Tom Bowden | September 15, 2009

When President Obama addresses the AFL-CIO on Sept. 15, he is expected to reiterate his support for the so-called Employee Free Choice Act. Congress is sharply divided over the proposed law, which would change the voting and arbitration procedures by which federal law forces companies to deal with labor unions.

Because the changes favor Big Labor, pro-union Democrats have been locked in a prolonged partisan squabble with their Republican opponents, and legislative compromise seems likely. But that’s really beside the point. Instead of quibbling over the methods by which unions can be forced upon unwilling employers and employees, Congress should be debating how to make the labor market truly free — free from government coercion.

For more than seventy years, Congress has maintained a statutory scheme that fastens coercive labor monopolies on individual companies. Starting with the Wagner Act in 1935, any union that wins a simple majority of employee votes becomes, by force of law, the exclusive bargaining agent for every single employee in that workplace. Such a victory slams the door shut on individuals who want to deal directly with the company, and leaves the union with a government-protected stranglehold on that firm’s labor supply. Predictably, these company-by-company labor monopolies have had the kind of deadening effects that come with all coercive monopolies.

Here’s how it works in practice: Each company is required by law to “bargain in good faith” with the union before making any important decision affecting jobs, wages, or working conditions. The union, in its legally privileged position, can just say no. When pressed, it can mobilize a crippling strike even if thousands of employees would rather keep working — because here, too, the outcome of an employee majority vote binds everyone. Usually, however, the mere threat of such a strike is enough to keep employers in line.

Now suppose a unionized firm wants to sell or close an unprofitable plant, or revamp a workflow to save expenses. At the “bargaining” table, the union’s predictable resistance is typically followed by one of two results. Either the union stands firm, in which case the unprofitable practices continue — or the union acquiesces, in exchange for higher wages and benefits, or a job for the shop steward’s son, or some other favor. This is not genuine bargaining but organized extortion, made possible by federal labor law.

So, while non-unionized competitors charge ahead with nimble, inventive, rapid responses to market challenges, unionized companies learn to slow down, “negotiate,” compromise, draw up rules — in other words, kowtow to the union. The inevitable results are bloated prices and declining product quality, as witness the domestic auto industry.

Detroit’s automakers, having suffered through painful work stoppages in the decades following World War II, discovered they could avoid labor unrest by caving in to the United Auto Workers’ demands. Over the years, meeting those demands gave rise to labor agreements as thick as telephone books, testaments to the stultifying regimentation that sapped Detroit’s competitive juices.

Because car manufacturing is complex and capital intensive, many years passed before competitors from Japan, Korea, and Germany could establish non-unionized plants in America’s southland. Now, however, the sun is setting on Detroit. GM and Chrysler are writhing in red ink, drained to the point of bankruptcy by costly union concessions, and Ford struggles to survive.

Not all labor unions wield UAW-level power, but most would like to. That’s why the Employee Free Choice Act would eliminate secret ballots in union elections and replace them with individually signed cards, open to union inspection. This would allow union organizers to more easily target, and intimidate, anti-union employees — and therefore win more often. The Act would also allow government arbitrators to impose initial “contract” terms if the union and employer disagree. That’s contrary to existing law, which allows for a no-contract impasse in that situation.

Congress should not only reject the transparent power grab known as the Employee Free Choice Act, it should start hacking at the root of the complex federal regime that denies free choice in bargaining. That means repealing the Wagner Act, so that labor law can recognize and protect the absolute right of companies and employees to deal with each other on an entirely voluntary basis.

About The Author

Tom Bowden

Analyst and Outreach Liaison, Ayn Rand Institute

Our Self-Crippled War

by Elan Journo | September 10, 2009

Watching video of the Twin Towers imploding, we all felt horror and outrage. We expected our government to fight back — to protect us from the enemy that attacked us on 9/11. We knew it must, and could, be done. Fighting all-out after Pearl Harbor, we had defeated the colossal naval and air forces of Japan. But eight years later — twice as long as it took to smash Japanese imperialism — what has Washington’s military response to 9/11 achieved?

The enemy that struck us — properly identified not as “terrorism” but rather the jihadist movement seeking to impose Islamic law worldwide — is not merely undefeated, but resurgent.

Islamist factions in Pakistan fight to conquer that country and seize its nuclear weapons. The movement’s inspiration and standard-bearer, the Islamic Republic of Iran, remains the leading sponsor of terrorism, and may soon acquire its own nuclear weapons.

Then there’s the Afghanistan debacle. Eight years ago, practically everyone agreed we must (and could) eliminate the Taliban and its jihadist allies — a primitively equipped force thousands of times less powerful than Imperial Japan. Now that goal seems unreachable.

Today swaggering holy warriors control large areas of the country. They summarily execute anyone deemed un-Islamic, and operate a shadow government with its own religious law courts and “virtue” enforcers. Last year the CIA warned that virtually every major terrorist threat the agency was aware of traced back to the tribal areas near the Taliban-infested Afghan-Pakistan border.

Why have we been so unsuccessful?

No, the problem is not a shortage of troops, nor is the remedy another Iraq-like “surge.” That sham, appeasing solution entails not quelling the insurgency, but paying tens of thousands of dollars to insurgents not to fight us, for as long as the money flows. And it means leaving Iraq in the hands of leaders far more committed to jihadists than Hussein. No, the crucial problem is the inverted war policy governing U.S. forces on the battlefield.

Defeating the Islamist threat demanded that we fight to crush the jihadists. Victory demanded we recognize the unwelcome necessity of civilian casualties and lay the blame for them at the feet of the aggressor (as we were more willing to do in World War II). Victory demanded allowing our unmatched military to do its job — without qualification. Instead, our leaders waged a “compassionate” war.

Before the Afghan war began, Washington defined lengthy “no-strike” lists including cultural sites, electrical plants — a host of legitimate strategic targets ruled untouchable — for fear of affronting or harming civilians. Meanwhile, we sent C-17 cargo planes to drop 500,000-odd Islam-compliant food packets to feed starving Afghans and, inevitably, jihadists.

Many Islamists survived, regrouped and staged a fierce comeback.

The no-strike lists lengthened. So, necessary bombing raids are now often canceled, sacrificing the opportunity to kill Islamist fighters. Jihadists exploit this to their advantage. Lt. Gen. Gary L. North tried to justify the policy to a reporter: “Eventually, we will get to the point where we can achieve — within the constraints of which we operate, which by the way the enemy does not operate under — and we will get them.”

“Eventually” — for another eight years?

In Washington’s “compassionate” war, we give the enemy every advantage — and then compel our soldiers to fight with their hands tied . . . ever tighter.

Naturally, U.S. deaths have soared. More Americans died in the first eight months of this year (182) than in all of last year — the bloodiest year of the war, up till now.

If Afghanistan now seems unwinnable, blame Bush and Obama. Bush crusaded not to destroy the Taliban but to bring Afghans elections and reconstruction. Obama’s “new” tack is to insist we spend billions more on nation-building and bend over backwards to safeguard the local population. Both take for granted the allegedly moral imperative of putting the lives and welfare of Afghans first — ahead of defeating the enemy to protect Americans.

This imperative lies behind Washington’s self-crippled war — a war which could have worked to deter other jihadists and their state-sponsors, but instead encourages them to attempt further attacks.

How many more Americans must die before we challenge this conception of a proper war?

About The Author

Elan Journo

Senior Fellow and Vice President, Content and Advanced Training, Ayn Rand Institute

Why Is Ayn Rand Still Relevant: Atlas Shrugged and Today’s World

by Yaron Brook and Don Watkins | August 10, 2009 | CNBC

Those who haven’t yet picked up Ayn Rand’s 1957 classic novel Atlas Shrugged may be wondering why so many people are invoking the book in discussions of today’s events.

Well, the short answer is: because today’s world is strikingly similar to the world of Atlas Shrugged.

Consider the government’s affordable housing crusade, in which lenders were forced to make loans to subprime borrowers who allegedly “needed” to own homes.

“We must not let vulgar difficulties obstruct our feeling that it’s a noble plan motivated solely by the public welfare. It’s for the good of the people. The people need it. Need comes first…”

Those might sound like the words of Barney Frank, but in fact they belong to Eugene Lawson, a banker in Atlas Shrugged who went bankrupt giving loans to people on the basis of their “need” rather than their ability to repay. In the quoted scene, Lawson is urging his politically powerful friends to pass a law restricting economic freedom for the “public good” — long-range consequences be damned.

Or consider this cry from Atlas Shrugged villain Wesley Mouch, head of the “Bureau of Economic Planning and National Resources”:

“Freedom has been given a chance and failed. Therefore, more stringent controls are necessary. . . . I need wider powers!”

This mirrors the incessant claims by today’s politicians and bureaucrats that all our problems would disappear if only they had more power. They tell us that health care is expensive and ineffective — not because the government has its tentacles in every part of it and forces us to pay for other people’s unlimited medical-care wants and needs — but because there is no bureaucrat forcing us to buy insurance and dictating which tests and treatments are “necessary.” They tell us that American auto companies failed to compete — not because they were hamstrung by pro-union laws and fuel efficiency standards — but because there was no government auto czar. They tell us that we are reeling from a financial crisis — not as a result of massive, decades-long government intrusion in the financial and housing markets — but because the intrusion wasn’t big enough; we didn’t have a single, all-powerful “systemic risk” regulator.

Atlas Shrugged shows us an all-too-familiar pattern: Washington do-gooders blaming the problems they’ve created on the free market, and using them as a pretext for expanding their power. And more: it provides the fundamental explanation for why the government gets away with continually increasing its control over the economy and our lives. The explanation, according to Atlas, is to be found in the moral precepts we’ve heard all our lives.

From the time we’re young we are taught that the essence of morality is to sacrifice one’s own interests for the sake of others, and that to focus on one’s own interests is immoral and destructive. As a result, we want the government to protect us from doctors and businessmen out for their own profit. We want the government to redistribute wealth from the successful to the unsuccessful. We want the government to ensure that those in need are given “free” health care, cheap housing, guaranteed retirement pay and a job they can never lose. We want the government to take these and many other anti-freedom measures because virtually everyone today believes that they are moral imperatives.

This view of morality, Atlas argues, inevitably leads to the disappearance of freedom.

A free society is one in which the individual’s life belongs to him, where he can pursue his own happiness without interference by others. That is incompatible with the view that morally his life belongs to others. So long as you accept that self-sacrifice for the needs of others is good, you will not be able to defend a capitalist system that enshrines and protects individual freedom and the profit motive.

The only way to stop the growth of the state and return to the Founding Fathers’ ideal of limited government is to recognize that individuals not only have a political right to pursue their own happiness, but a moral right to pursue their own happiness. This is what Ayn Rand called a morality of rational self-interest. It is a selfishness that consists, not of doing whatever you feel like, but of using your mind to discover what will truly make you happy and successful. It is a selfishness that consists, not of sacrificing others in the manner of a Bernie Madoff, but of producing the values your life requires and dealing with others through mutually advantageous, voluntary trade.

It’s no accident that, at the very instant Washington is extending its grip over our lives, Atlas Shrugged is selling faster than ever before. Americans sense that Atlas has something important to say about this frightening trend. It does. If you want to understand the ideas undermining American liberty — and the ideas that could foster it once again — read Atlas Shrugged.

About The Authors

Yaron Brook

Chairman of the Board, Ayn Rand Institute

Don Watkins

Former Fellow (2006-2017), Ayn Rand Institute

The Corrupt Critics of CEO Pay

by Yaron Brook and Don Watkins | May 2009

Since the start of this crisis, we’ve been regaled with stories of CEOs receiving lavish bonuses. Well-paid executives have been vilified as reckless and greedy. L.A. Times columnist Patt Morrison captured the mood when she declared: “I want blood.”

But this is nothing new.

Long before the current crisis, Warren Buffett, John McCain, President Obama, and many other critics condemned (supposedly) outrageous executive pay. “We have a [moral] deficit when CEOs are making more in ten minutes than some workers make in ten months,” Obama said during the presidential campaign.

With today’s government entanglement in business affairs, many Americans are open to attempts by Washington to slash CEO pay. Apparently hoping to exploit that opportunity, the chairman of the House Financial Services Committee, Barney Frank, recently floated the idea of extending the TARP executive pay caps to every financial institution, and potentially to all U.S. companies.

It’s understandable that taxpayers think they should have some say in how bailed-out businesses are run, which is one reason why Washington should never have bailed out those companies in the first place. But why have the critics been so intent on dictating to shareholders of private companies how much they can pay their CEOs?

It’s not because the supposed victims, shareholders, have been demanding it. A few ideologically motivated activists aside, most shareholders in the years leading up to the crisis weren’t complaining about CEO pay packages. Virtually every time they had a chance to vote on a “say on pay” resolution, which would have given them a non-binding vote on CEO compensation, shareholders rejected the measure. Even if they had been given a say, there is no reason to expect they would have put the brakes on high pay. In Britain, for instance, shareholders had a government-mandated right to vote on management compensation, yet CEO pay still rose unabated.

So what has gotten the critics all riled up?

They allege that, despite appearances, executives were not really being paid for performance. Pointing to CEOs who raked in huge bonuses while their companies tanked, the critics say that executive pay was driven not by supply and demand, but by an old boys’ network that placed mutual back-scratching above shareholder welfare. As Obama put it last year, “What accounts for the change in CEO pay is not any market imperative. It’s cultural. At a time when average workers are experiencing little or no income growth, many of America’s CEOs have lost any sense of shame about grabbing whatever their . . . corporate boards will allow.”

It was a compelling tale, but this account of rising pay just doesn’t square with the facts. To name a few: (1) the rise in CEO pay was in line with that of other elite positions, such as professional athletes; (2) the rise in pay continued even as fewer CEOs chaired their board of directors; (3) the companies that paid CEOs the most generally had stock returns much greater than other companies in their industries, while companies that paid their CEOs the least underperformed in their industries.

The critics of CEO pay ignore all of this. They take it as obvious that executives making millions are overpaid. “It turns out that these shareholders, who are wonderfully thoughtful and collectively incisive, become quite stupid when it comes to paying the boss, the guy who works for them,” Barney Frank has said. But what kind of compensation package will attract, retain, and motivate the best CEO is a complicated question. Companies have to weigh thousands of facts and make many subtle judgments in order to assess what a CEO is worth.

What should be the mix between base salary and incentive pay? What kind of incentives should be offered: stock options, restricted stock, stock appreciation rights? How should those incentives be structured: over what time frame and using which metrics? And what about a severance plan? What kind of plan will be necessary to attract the best candidate? And so on.

The mere fact that people make their living as executive-pay consultants illustrates how challenging the task is. Central planners like Frank cavalierly dismiss this and declare that they can somehow divine that lower pay for executives will not hinder a company.

Of course, a free market doesn’t eliminate mistakes. A company can hire an incompetent CEO, or structure a pay package that rewards executives for short-term profits at the expense of the company’s long-term welfare. But a company suffers from its mistakes: shareholders earn less, managers need to be fired, and competitors gain market share.

There is, however, something that can short-circuit this corrective process and help keep highly paid incompetents in business: government coercion.

Take the Williams Act, which restricts stock accumulation for the purpose of a takeover, for example. In a truly free market, if poor management is causing a company’s stock to tank, shareholders or outsiders are incentivized to buy enough shares to fire the CEO and improve company performance. But the Williams Act, among other regulations, makes ousting poor management more difficult.

And while the critics have tried to scapegoat “overpaid executives” for our current financial turmoil, the actual cause was, as past editions of Fusion have indicated, coercive government regulations and interventions. Far from vindicating the denunciations of “stupid” shareholders and “inept” CEOs, the recent economic downturn shows what happens when the government interferes with economic decision-making through policies such as the “affordable housing” crusade and the Fed’s artificially low interest rates.

If the critics’ goal were really to promote pay for performance, they would advocate an end to all such regulations and let the free market work.

But that’s not what they advocate. Instead, they call for more regulatory schemes, such as government-mandated “say on pay,” massive tax hikes on the rich, and even outright caps on executive compensation. They do not want pay to be determined by the market, reflect performance, or reward achievement — they just want it to be lower. Frank stated the point clearly when he threatened that if “say on pay” legislation doesn’t sufficiently reduce CEO compensation, “then we will do something more.” Another critic, discussing former Home Depot CEO Robert Nardelli, confessed that “it’s hard to believe that those leading the charge against his pay package . . . weren’t upset mainly by the fact that Nardelli had a $200 million pay package in the first place — no matter how he had performed.”

The critics want to bring down CEO pay, not because it is economically unjustifiable, but because they view it as morally unjustifiable. Robert Reich, a prominent opponent of high CEO pay, for instance, penned a Wall Street Journal column titled “CEOs Deserve Their Pay,” in which he defended CEO pay from an economic standpoint but denied that it was justified ethically. Insisting that wealth rightfully belongs to “society” rather than the individuals who create it, the critics maintain that “society” and not private owners should set salary levels. Many critics go so far as to regard all differences in income as morally unjust and the vast disparity between CEOs and their lowest-paid employees as morally obscene.

But it’s the attack on CEO pay that’s obscene.

Far from relying on nefarious backroom deals, successful CEOs earn their pay by creating vast amounts of wealth. Jack Welch, for instance, helped raise GE’s market value from $14 billion to $410 billion. Steve Jobs’s leadership famously turned a struggling Apple into an industry leader. Only a handful of people develop the virtues — vision, drive, knowledge, and ability — to successfully run a multibillion-dollar company. They deserve extraordinary compensation for their extraordinary achievements.

In smearing America’s great wealth creators as villains and attributing their high pay to greed and corruption rather than productive achievement, the critics want us to overlook the virtues that make CEOs successful. In demanding lower executive pay, despite the wishes of shareholders, the critics aim to deprive CEOs of their just deserts. In denouncing CEO pay for the sole reason that it’s higher than the pay of those who haven’t achieved so much, the critics seek to punish CEOs because they are successful.

Ultimately, how to pay CEOs is a question that only shareholders have a right to decide. But in today’s anti-business climate, it’s vital that we recognize the moral right of successful CEOs to huge rewards.

They earn them.

About The Authors

Yaron Brook

Chairman of the Board, Ayn Rand Institute

Don Watkins

Former Fellow (2006-2017), Ayn Rand Institute

A Critique of Climate Change Science and Policy

by Keith Lockitch | April 13, 2009

It is now widely believed that man-made greenhouse gases are causing an unnatural warming of the earth that will have devastating consequences for human life. Environmentalists and politicians are pressing for severe restrictions on greenhouse gas emissions in order to prevent climate change. But what does the scientific evidence actually support regarding the causes of climate variability and the role of anthropogenic greenhouse gases? Are the predictions of catastrophic changes supported by scientific fact? Are governmental economic intervention and restrictions on emissions an appropriate policy response? Drs. Keith Lockitch and Willie Soon address these critical issues and answer audience questions in a lively panel discussion. (Recorded April 13, 2009.)

About The Author

Keith Lockitch

Vice President of Education and Senior Fellow, Ayn Rand Institute

The Real Meaning of Earth Hour

by Keith Lockitch | March 23, 2009

On Saturday, March 28, cities around the world will turn off their lights to observe “Earth Hour.” Iconic landmarks from the Sydney Opera House to Manhattan’s skyscrapers will be darkened to encourage reduced energy use and signal a commitment to fighting climate change.

While a one-hour blackout will admittedly have little effect on carbon emissions, what matters, organizers say, is the event’s symbolic meaning. That’s true, but not in the way organizers intend.

We hear constantly that the debate is over on climate change — that man-made greenhouse gases are indisputably causing a planetary emergency. But there is ample scientific evidence to reject the claims of climate catastrophe. And what’s never mentioned? The fact that reducing greenhouse gases to the degree sought by climate activists would, itself, cause significant harm.

Politicians and environmentalists, including those behind Earth Hour, are not calling on people just to change a few light bulbs; they are calling for a truly massive reduction in carbon emissions — as much as 80 percent below 1990 levels. Because our energy is overwhelmingly carbon-based (fossil fuels provide more than 80 percent of world energy), and because the claims of abundant “green energy” from breezes and sunbeams are a myth, this necessarily means a massive reduction in our energy use.

People don’t have a clear view of what this would mean in practice. We, in the industrialized world, take our abundant energy for granted and don’t consider just how much we benefit from its use in every minute of every day. Driving our cars to work and school, sitting in our lighted, heated homes and offices, powering our computers and countless other labor-saving appliances, we count on the indispensable values that industrial energy makes possible: hospitals and grocery stores, factories and farms, international travel and global telecommunications. It is hard for us to project the degree of sacrifice and harm that proposed climate policies would force upon us.

This blindness to the vital importance of energy is precisely what Earth Hour exploits. It sends the comforting-but-false message: Cutting off fossil fuels would be easy and even fun! People spend the hour stargazing and holding torch-lit beach parties; restaurants offer special candle-lit dinners. Earth Hour makes the renunciation of energy seem like a big party.

Participants spend an enjoyable sixty minutes in the dark, safe in the knowledge that the life-saving benefits of industrial civilization are just a light switch away. This bears no relation whatsoever to what life would actually be like under the sort of draconian carbon-reduction policies that climate activists are demanding: punishing carbon taxes, severe emissions caps, outright bans on the construction of power plants.

Forget one measly hour with just the lights off. How about Earth Month, without any form of fossil fuel energy? Try spending a month shivering in the dark without heating, electricity, refrigeration; without power plants or generators; without any of the labor-saving, time-saving, and therefore life-saving products that industrial energy makes possible.

Those who claim that we must cut off our carbon emissions to prevent an alleged global catastrophe need to learn the indisputable fact that cutting off our carbon emissions would be a global catastrophe. What we really need is greater awareness of just how indispensable carbon-based energy is to human life (including, of course, to our ability to cope with any changes in the climate).

It is true that the importance of Earth Hour is its symbolic meaning. But that meaning is the opposite of the one intended. The lights of our cities and monuments are a symbol of human achievement, of what mankind has accomplished in rising from the cave to the skyscraper. Earth Hour presents the disturbing spectacle of people celebrating those lights being extinguished. Its call for people to renounce energy and to rejoice at darkened skyscrapers makes its real meaning unmistakably clear: Earth Hour symbolizes the renunciation of industrial civilization.

About The Author

Keith Lockitch

Vice President of Education and Senior Fellow, Ayn Rand Institute
