Christie shows once again that the Republican Party is no friend to the working class

No one likes to play pension politics more than the Republicans. For example, over the past few years, Republican politicians and their fellow travelers have accused public workers of receiving overly generous pensions as a means to drive a wedge between them and the rest of the middle class. Rather than force the banks that manipulated Detroit into bankruptcy to forgo their truly onerous profits, the overseers of the Motor City—appointed by Republican Governor Rick Snyder—are taking money from retired Detroit workers.

Republicans use pension politics not only to hurt unions, but to harm government operations. Under Bush II, the Republican Congress saddled the Post Office, a quasi-governmental organization, with onerous payments to prefund retiree benefits, payments that have forced it to raise postage rates and cut operations. The Post Office’s private competitors—FedEx, UPS and DHL—don’t have to set aside anywhere near as much money for their retirees.

The latest Republican to play the pension politics game is New Jersey’s bully-in-residence, Governor Chris Christie, who intends to balance the state budget by skipping $2.43 billion in payments to New Jersey’s ailing public pension system over the next two years.

In skipping these payments, Christie reneges on a deal he made with the public unions a few years back. Christie’s pension overhaul shifted more costs to public workers, raised their retirement age to 65, and froze yearly cost-of-living adjustments. In exchange, Christie and lawmakers agreed to make bigger payments into the pension fund each year to repair the financial damage done by years of former administrations paying little or nothing into the system.

When announcing his dead-beating of New Jersey public workers, Christie tried to make himself the hero of the day by declaring passionately that he refused to cut funds for Medicaid and schools. Of course, Medicaid and support of public schools are two other issues on which Republicans and many Democrats also play games. After years of advocating cuts to social service programs, Christie came off looking like what he is—a very big hypocrite!

What Christie didn’t say is that New Jersey, like most other states and the federal government, has purposely starved itself over the past three decades by lowering the tax burden, especially on businesses and the wealthy.  This fiscal anorexia is the true cause of the budget shortfall that New Jersey faces.

Instead of weakening the already damaged finances of state pensions, Christie could curtail tax breaks for corporations. He could call for the repeal of the massive tax breaks and other incentives to businesses he signed into law late last year, which will cost the state millions of dollars. One article two years ago computed the value of Christie’s corporate handouts at that point to be $1.57 billion. If the state reneged on those corporate giveaways, it would still be left with a $1.03 billion pension shortfall, which could be covered by a tax bump of about $382 a year for two years (and probably going forward as well) on the approximately 1.35 million New Jerseyites making more than $75,000 a year.
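(A quick check of that arithmetic: 1.35 million taxpayers times $382 comes to roughly $516 million a year, or about $1.03 billion over the two years.)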

But making businesses and the relatively well-off pay their fair share is not part of the Republican playbook. At least since Ronald Reagan gained influence, Republicans have been dedicated to cutting taxes ever further on the wealthy and businesses while shrinking the services that government offers to everyone—from support of public schools and universities to funding for libraries, roads and the indigent. It’s now been more than 30 years since wealth and income started flowing from the poor and middle class to the wealthy and especially the ultra-wealthy. By refusing to fund pensions for public employees and ratcheting up giveaways to businesses, Christie is merely going with the flow, floating along on a rising tide that lifts the oversized yachts of the rich while sinking everybody else.

NH town should fire police commissioner who called the President the N-word

Everyone has the right to exercise free speech, except when they are serving as a representative of an organization. As an organizational representative, what you say should hew to the standards and beliefs of the organization.

It’s amazing how many times public language controversies hinge on whether the individual who made the statement was representing him or herself or an organization. In essence, representing the organization was at the heart of the “Duck Dynasty” and Donald Sterling controversies. A&E thought Phil Robertson, star of the reality show “Duck Dynasty,” was representing the network when he made homophobic and then racist statements. If Donald Sterling didn’t represent a professional basketball team, his vilely racist comments would not have made such news. The National Basketball Association recognized that as an owner Sterling always represented the league, and it therefore fined him and is trying to force him to sell his team.

Representing the organization is also the issue with Wolfeboro, New Hampshire police commissioner Robert Copeland, who called President Obama a “nigger.”

For at least the last 150 years, virtually everyone who speaks English has regarded “nigger” as a term of disparagement, similar to “kike,” “dago” or “frog.”

Some defenders of the use of the term “nigger” point out that calling a white from the rural south a “cracker” is the same thing.  Since one is supposedly “allowed,” why not the other?

This view neglects the historical fact of slavery and the legal and institutional racism that poisoned much of the United States for more than a century after the demise of slavery and still affects the country in negative ways. “Niggers” were chattel that could be sold. “Niggers” had no control over their own lives. “Niggers” suffered the physical pains and humiliation of whippings, forced separation of families and rape. “Niggers” were considered genetically and morally inferior creatures who couldn’t take care of themselves, who shunned work and didn’t know how to handle money. “Niggers” were less than human. “Niggers” could be easily fooled since they had the emotions of children. “Niggers” deserved to live in poverty since they didn’t know how to work hard.

That’s what most whites meant when they used the N-word from about 1800 onwards.

Everyone knows that the word “nigger” reflects all these meanings, which makes it a more scurrilous and damaging word than other ethnic insults, at least in the United States.

Even the Afro-American men who use the term with each other as a kind of endearment know that it’s a horrible insult. Male bonding often devolves into gentle competitive sparring. I don’t know how many times I have called my male cousins “assholes” or “bozos” to their faces, and none got mad, because they knew that my diminishment of them was a form of affection—or perhaps a replacement for the affection that American men are not supposed to display to other men. The very fact that the word “nigger” is especially harmful and disgusting makes it an ideal choice for male bonding between Afro-American men. That doesn’t make the word acceptable in any other context, and certainly not acceptable for whites to use, because when whites say “nigger,” it carries all its historical baggage.

Barack Obama is a smooth-talking city slicker who had an outstanding career as a scholar and elected official and has excelled at everything he has ever tried (except being a progressive president). He shares not a single trait with the composite “nigger” that people evoke when they use that word. To call Obama a “nigger” can only be understood as an insult. In refusing to apologize, Copeland said that the President “meets and exceeds my criteria for such.” He could not possibly have meant anything other than the vilest of insults.

Now Copeland is entitled to his opinion about the President and African-Americans in general, and he has the right to express it.

But he doesn’t have the right to express it as a representative of Wolfeboro.

The simple fact of the matter is that when you are an elected or appointed official, you pretty much sign on to representing your jurisdiction on a 24/7 basis. Once you become a mayor, Congressperson, police commissioner or judge, you de facto give up some of your right to free speech. So while you might be entitled to your opinion and to use language that is inherently insulting, exercising that right will conflict with your public duties and with the image your jurisdiction wants to maintain.

Executives of corporations face a similar constraint: if the news media discovered that the chief executive officer of a Fortune 500 company said “nigger” during a private dinner party, he or she would be unable to hide behind the right to free speech. The board of directors would summarily fire him or her. And they would be right to do it.

While no decision has been made yet, I’m guessing that the Wolfeboro town commissioners are going to end up firing Copeland or asking him to resign. They will have no choice. Otherwise, they will be tarred with the same racist brush that has rightfully dirtied their police commissioner. That kind of institutional racism might fly in the rural south, but probably not in New England.

Colleges across the country show that two wrongs don’t make a right

Across the country, colleges are holding graduation ceremonies for hundreds of thousands of graduates. But what used to be called the graduation season is rapidly gaining a new name: commencement speaker cancellation season.

This year in particular there seems to be a large number of high-profile commencement speakers who have backed out or been disinvited after campus protests. Former Bush II Secretary of State Condoleezza Rice was forced out as Rutgers’ commencement speaker. International Monetary Fund (IMF) chief Christine Lagarde withdrew from speaking at Smith College after protests. Former University of California-Berkeley Chancellor Robert J. Birgeneau withdrew from speaking at Haverford after protesters demanded he apologize for having campus police use batons against Occupy protesters. Brandeis University reversed its previously announced decision to award an honorary degree to Ayaan Hirsi Ali, a feminist critic of Islam, after protests by right-wing Moslems over her criticism of their religion. Even Michelle Obama changed her mind about going to a Kansas high school graduation after right-wingers protested.

In three of these cases, progressive protesters forced out conservative speakers. In the other two, religious radicals forced out moderates whom they mistakenly labeled as extremists.

Taken together, these commencement speaker cancellations involve a series of laudable and not-so-laudable actions.  In all examples, we should admire the protesters for exercising their right to make their opinions known.

But we should be disappointed in and ashamed of the institutions and the prominent individuals who backed down. Protests would have made for messy commencements, which are usually drawn-out affairs that involve a lot of sophisticated choreography to move thousands of graduates on to and off the stage in a short amount of time. But so what? Life is messy and democracy is messy. By backing down or backing out, the individuals and institutions demonstrated a lack of respect for public discourse.

While we can admire organizations such as the New York Public Library that back down when it turns out their plans are not in the public’s best interests, giving a speaker a platform is never an occasion for backing down. Instead, the institutions could expand the venue—for example, getting another speaker to balance the controversial speaker or creating a special forum to discuss the controversies during commencement week.  The colleges could even give the protesters 10 minutes at the ceremonies to make their points.

To the degree that the speakers themselves made the decision to withdraw, they should be ashamed of themselves. They took the actions that made them controversial. They should own what they did or repudiate it. They should not run away from a spotlight that they themselves created.

For some of the educational institutions in question, backing down from the original plan marks their second mistake. Their first was the invitation itself.

Let’s start with Birgeneau: To invite an obscure university administrator known for one action only is an open endorsement of that action. Haverford officials were stating that they thought it was right to beat up peaceful Occupy protesters. No wonder faculty and staff mobilized against the decision to ask Birgeneau to speak.

The case of Condoleezza Rice is also easy: Commencement speakers are supposed to send graduates off on the journey that will be the rest of their lives with hopeful advice that spurs their enthusiasms and aspirations. The commencement speaker thus carries a certain moral authority. How can anyone who was associated with the decision to create a worldwide network of torture facilities be considered a moral authority? Rice, Cheney, Bush II, Rumsfeld, John Yoo, David Addington and anyone else involved in deciding to pursue torture as an instrument of war should be American pariahs. No university invited Joseph McCarthy or Roy Cohn to speak after their disgrace, and none should invite these international criminals, either.

Some would argue that Christine Lagarde is also a war criminal by virtue of her activities as head of the IMF. I would agree that many IMF actions have hurt people while protecting the interests of banks, but that’s a political argument between left and right. Lagarde’s politics do not in and of themselves cost her the moral standing to be a commencement speaker, as the actions of Condoleezza Rice and Robert Birgeneau cost them theirs.

The case of Brandeis University is the trickiest. On the surface, there is nothing morally objectionable in Ayaan Hirsi Ali’s actions and statements. She has fought for years against female genital mutilation and formed an organization, the AHA Foundation, whose mission is to work “to protect and reinforce the basic rights and freedoms of women and girls, including security and control of their own bodies, access to an education, the ability to work outside the home and control their own income, freedom of expression and association, and the myriad other basic civil rights defined under the laws of Western democracies and the Universal Declaration of Human Rights.” The fact that she is a fellow of the ultra-conservative American Enterprise Institute is troubling, but I can see why any mainstream organization would want to demonstrate its commitment to human rights by awarding Ali an honorary degree.

Except that Brandeis is not just any mainstream organization. It is a Jewish organization giving an award to a woman who is disrupting Islam. Yes, we should support her disruptions, just as we support the disruptions that pro-choice advocates and supporters of LGBT marriage make to Christian religious institutions and beliefs. But when Brandeis does it, it carries stark and obvious symbolism, because it’s as if a Jewish organization is taunting Islamists, purposely getting their goat. It plays into the myths that many right-wing Moslems have about Jews.

Someone—actually a lot of people—at Brandeis must have known that giving an honorary degree to Ali would piss off the Islamic right wing, which makes the withdrawal of the degree particularly obnoxious and cowardly. If you are going to stir the pot, don’t wimp out, which is what Brandeis has done.

If comic strips are an indication, breakfast in bed is out for moms on Mother’s Day

As in 2012, I thought I would analyze Mother’s Day this year through the lens of the Sunday comics.  And what a difference two years makes!

For one thing, two years ago I looked at all 20 comics printed in the Sunday edition of the Pittsburgh Post-Gazette. This year I looked at a semi-random selection of 30 comics from the more than 70 on the Yahoo! Comics web page; those 30 included most of the 20 I remembered from the Post-Gazette.

Judging from the results of the two surveys, Mother’s Day is not as important as it used to be. Two years ago, 50% of all comics had a Mother’s Day theme.  This year, it was down to one-third. Even family-centered comic strips such as “Momma,” “Fox Trot” and “Family Tree” avoided the holiday.

In analyzing the topics of Mother’s Day comics two years ago, I found that half of them—or 25% of all cartoons that day—focused on bringing mother breakfast in bed. I concluded that breakfast in bed had become the standard practice for this manufactured holiday. Perhaps it was just a fad, because this year not a single strip I saw depicted family members preparing and serving mother breakfast in bed.

The interesting thing about breakfast in bed is that the act of preparing and serving it doesn’t involve extra consumer purchases (since breakfast typically consists of foods already in the fridge). This lack of consumerism made Mother’s Day unlike other holidays, which tend to reduce to buying and giving gifts.

Breakfast in bed is missing from this year’s comics, but so, for the most part, is consumerism. True, “Arlo & Janis” creates an emotional competition between the husband giving jewelry and a phone call from a faraway son. And the “Peanuts” rerun details the act of selecting a card, part of the buying process.

But most of the strips focus on serving, words of appreciation and things one can make or do to show mother the love:

  • Doing chores for mom (“B.C.”)
  • Giving flowers (“Luann”)
  • Gathering flowers and making a card (“Nancy”)
  • Mom reversing roles and doing everything for the child (“Cathy”)
  • Mom getting drunk at a multi-family barbecue (“Stone Soup”)
  • The magic mirror on the wall calling mother the fairest of them all (“Wizard of Id”)
  • How we love mom’s nagging (“Drabble”)
  • Mother and daughter spending the day together, bicycling on the street (“Jump Start”)

Note that the food service takes place at home—in the kitchen or backyard. No restaurants.

So even as comic strip moms are denied the pleasure of breakfast in bed, their families are nonetheless giving more of themselves in a direct way and depending less on commercial transactions to celebrate the holiday and express their emotions.

This turn to the virtues of interaction and investment of self that we see in the comics may or may not reflect a change in society. Depending on which report you read, Mother’s Day spending will be either up or down this year in the aggregate. Per capita spending will go down by $5. Yet even at the low end, Americans will spend more on Mother’s Day than on any other holiday but Christmas. But they spend on very few things: one study reports that people mostly give their moms traditional gifts of a card (81.3 percent), flowers (66.6 percent) or a nice meal out (56.5 percent). The first two predominate in the cartoon world.

Mother’s Day has thus not been privatized into a holiday that exists only within families. It still finds expression in the economic realm. People still interact with the rest of the world in the planning and implementation of holiday plans, and they interact the way they know best—by making a purchase that represents an emotion.

News media finally pay attention to a problem they helped to create—the infantilization of adults

The American news media may finally be starting to cover a trend that they helped to create—the infantilization of American adults.

The infantilized adult continues the pursuits, hobbies, predilections, opinions and thought processes of youth instead of growing into mature, adult activities. Reading Harry Potter and comic books, playing with Legos or My Little Pony dolls, collecting action-hero paraphernalia, spending much of their free time playing video games, vacationing at Disney resorts and amusement parks—these are all signs that an adult is wallowing in callow youth instead of growing up. The mass media have of course glorified each and every one of the trends that together are creating the infantilized American adult.

For years, American comedy movies in particular have celebrated adults who refuse to grow up. The “Harold & Kumar” movies, “Old School,” “Big,” “Grandma’s Boy,” “Ted,” “Wedding Crashers,” “Billy Madison,” “You, Me and Dupree,” “Dodgeball,” “Step Brothers,” “The 40-Year-Old Virgin,” “Knocked Up,” all three “Hangovers,” the “Jackass” movies, “Bridesmaids,” “Hall Pass” and “Identity Thief”—this off-the-top-of-my-head list doesn’t even scratch the surface of the multitude of movies released over the past 20 years centered on men and women who refuse to grow up.

Now A.O. Scott, a New York Times film critic, has realized that staying a child is a major theme of American comedy films.

Scott announces his discovery in his review of “Neighbors,” which explores the trick-filled feud between a suburban couple who retain an adolescent lifestyle and the unruly fraternity that moves next door.

He really does nail the current state of American comedy, so I want to give an extended quote:

“The central problem in American film comedy for the past 15 years or so — let’s say from middle-period Sandler through prime Apatow and late ‘Hangover’ — has been maturity, or, more precisely, its avoidance. In the old days, adulthood was a fact. Now it’s a vague, unproven theory. Adolescence used to represent constraint and frustration, to be left behind as quickly as possible. For the heroes of the New American Comedy, it represents a blissful state of hedonistic freedom, to be held onto for as long as possible.

“How to stay a child when the world expects otherwise — and how to make the world love you anyway — has usually been, in these movies, a male predicament. Women have been sirens or mommies, on hand to inflame the boys’ desires or soothe their fears. This has begun to change recently, although mainly on television, where shows like ‘Girls’ and ‘Broad City’ have extended the privileges of arrested development on a more or less equal-opportunity basis.”

It’s not Scott’s job to put the tide of comedies about adults remaining children into a broader social context, but it’s clear to me that these movies both reflect the cultural shift and help to shape it.  In most of these movies, the immature heroes and heroines grow up a little bit in the end, but these movies are not cautionary tales about arrested development.  No, they all glorify and endorse infantilization—it’s much more fun than behaving as an adult.

When you add the number of these comedies about men and women remaining boys and girls to the number of fantasy superhero movies, the conclusion is clear: Hollywood is dedicated to promoting a lifestyle of perpetual adolescence to the American public.

Article claims to tell us what rich people believe but is really telling us what they want us to believe

U.S. News & World Report is fronting another article that purports to tell us how rich people think. As with other articles of this ilk, “7 Things Rich People Believe” reduces to a series of ideological beliefs presented as facts.  In the article, the writer doesn’t even attempt to justify his assertions about the minds of rich folks. No studies, not even an anecdote—just a series of wishes, assumptions and stale advice, all tinged with the ideology of greed and consumerism.

As it turns out, the writer is Tom Sightings, whom I have chided before for his ideologically tinged and accuracy-challenged articles arguing that big cities are not good for retirement and that people should move to avoid paying school taxes once their kids are out of school. Sightings seems to specialize in advocating the politics of selfishness in cute, homey articles that render general advice that always seems to be the same pabulum extolling greed, consumerism and the belief that the rich are better people.

The article opens by asserting that most people both love and hate money—like it but believe it’s evil to be greedy. Sightings then exhorts us to “get beyond your mixed feelings about money and start thinking like a rich person.”

And what does Sightings say a rich person thinks about money? That it’s not evil. That there’s nothing wrong with wanting more of it. And that you (the rich person) deserve to have it. Those three thoughts proffered by Sightings are all permissions to be greedy. For example, he never considers that it might be wrong for a billionaire to want more money or that people should feel ashamed to display enormous wealth when others are starving or struggling. Consider this statement: “The wealthy are not inherently dishonest; they do not feel ashamed of their first-class lifestyle or their bulging portfolios. In fact, most rich people take pride in their accomplishments and enjoy the fruits of their labors.” What these two sentences are really saying is that 1) the only sign of success is making money; 2) the only way to take pride in your accomplishments is to spend money; and 3) all rich people earn their wealth (meaning they deserve it). These are all basic principles of the American consumer ideology, hammered into us daily by the news media, our civic leaders and mass entertainment. But nowhere does Sightings prove that any of these statements are true.

To these general ideological beliefs, Sightings adds an out-and-out lie: that the way to achieve real wealth is to earn more, not save more. Tell that to all the trust-fund babies; the inexperienced kids who go to the front of the line for jobs because of their rich parents’ connections; or those born wealthy, like Bill Gates, Michael Dell and Mitt Romney, who leveraged their parents’ wealth into multi-millions or billions. All studies suggest that the best way to achieve wealth is to be born into it. Now maybe Sightings is right that rich people believe the lie that the road to riches runs through your job—but I don’t think Sightings ever asked, and it’s convenient that it’s exactly what rich folk who don’t want to pay a lot of taxes would want us to believe.

Sightings completes his list of what rich folk think with some of the more common business success tips we’ve heard since the days of Dale Carnegie and before: Rely on brainwork. Live below your means. Spend more on education and less on entertainment. Like all writers on business success, though, when Sightings says “education,” he really means vocational training: “Yet these people typically do not put a lot of faith in formal education or fancy degrees. They focus on useful, practical skills that are relevant to their career.” In Sightings’ world, you won’t catch a rich person reading Plato or Proust, studying environmental science or contemplating the lessons of Chinese history.

Articles claiming or implying that rich people think differently, and articles giving tips on how to think like a rich person, pop up in the mass business media about every six weeks. All build their case on assumptions and anecdotes. All happily support the status quo.

These lists of what the rich think or how they differ from others always communicate three hidden messages:

  1. There is one route to success, which, of course, is to buy into the American ideology of selfishness and consumerism.
  2. Rich people deserve to be rich; their wealth does not depend on luck, connections, prior wealth or the accidents of birth.
  3. Everyone can become rich. All you have to do is think and act like a rich person.

The flip side of the third message is that when you don’t become rich, it’s your fault. You didn’t work as hard as that investment banker (even if you worked as many hours at your job as a janitor). You got too little training, or the wrong kind (I guess that associate’s degree was a mistake—too bad you didn’t have the bucks for Harvard!). You didn’t have the right attitude or the right thought process. Maybe you stayed poor because in your heart you didn’t like yourself enough to get rich. Whatever the reason, it’s all your own fault.

These articles purporting to analyze the wealthy thus serve to reinforce the American ideology—to make us like the wealthy and not resent them, to make us want to be like them and to accept their version of what’s best for society.

Just the kind of stuff that rich people—those who own the media and advertise on it—want us to believe.

Is the U.S. giving up its support of the rule of law?

Note: I’m giving over today’s blog to distinguished anti-death penalty attorney, Marshall Dayan.  Here’s what Marshall has to say about the rule of law in contemporary America:

Americans are rightfully proud of our historical leadership when it comes to support of the rule of law: the idea that the law prevails and that our independent judicial system will apply the law to all in a fair and consistent manner.

But a number of events over the past few weeks make me wonder if the rule of law is losing some of its vitality in the United States.

Supreme Court Justice Antonin Scalia recently told law students at the University of Tennessee that they should think about revolting if taxes get too high.  He did not recommend that those opposed to high taxes organize politically and elect representatives who would reduce high taxes. He suggested that they consider a revolt.

In Nevada, a rancher named Cliven Bundy has refused to pay grazing fees for grazing his cattle on federal land for the past twenty years. Some 16,000 other western cattle ranchers graze their cattle on federal lands, and they pay grazing fees for doing so. Bundy has outspokenly rejected the authority of the federal government and the Bureau of Land Management (BLM) to charge him grazing fees, asserting that the land belongs to the State of Nevada. The federal court rejected this argument and ordered him to pay the fees. The court also found him to be trespassing on federal land in the absence of payment. Bundy became a libertarian cause célèbre by defying the court’s orders requiring him to pay the grazing fees. (His image became tarnished when he revealed himself to be a blatant racist, ironically chastising impoverished African-Americans for availing themselves of federal government economic programs while he abused government resources for his own economic advantage.) Rather than enforce the court’s orders, the BLM backed down, at least temporarily, in the face of armed resisters who gathered in Bunkerville, Nevada, to defend Bundy’s continued illegal grazing on federal lands.

In both cases, federal authorities—Justice Scalia and the BLM—have weakened the concept of the rule of law.

Another example of abandoning the rule of law came when the Supreme Court of Oklahoma recently issued a stay of execution to protect its jurisdictional right to decide whether an Oklahoma statute barring the revelation of the manufacturer of drugs used for lethal injection violated the state constitution. After the court ruled, Oklahoma Governor Mary Fallin defied the court’s stay order. She issued an executive order scheduling two executions for April 29, 2014. In issuing her executive order, Governor Fallin wrongfully argued that the Oklahoma Supreme Court had acted beyond its constitutional authority and that she therefore would not follow its order. As an aside, Oklahoma badly botched the first of the two attempted executions: the condemned prisoner, Clayton Lockett, died of a heart attack forty-three minutes after the lethal injection failed. Governor Fallin then delayed the execution of the other prisoner, Charles Warner.

Our second president, John Adams, supposedly coined the phrase “a government of laws, and not of men.” Adams believed that while all people are fallible, we should strive to create rules that are applied fairly and consistently. This idea comes directly from the Hebrew Bible. Leviticus 19:15 commands, “You shall not render an unfair decision: do not favor the poor or show deference to the rich; judge your neighbor fairly.”

There will always be disputes about the boundaries of government power. An independent judiciary is necessary to settle these disputes.  Without it, we run the risk of devolving into chaos.  In United States v. United Mine Workers, Justice Felix Frankfurter wrote that “[t]here can be no free society without law administered through an independent judiciary. If one man can be allowed to determine for himself what is law, every man can. That means first chaos, then tyranny.”

Political differences are healthy and are to be wrestled with in a democratic republic. But courts must remain independent, and must be honored and respected by people of good will on all sides of all issues, or we risk losing our democratic republic to a tyranny of raw power. The recent statements and decisions by Justice Scalia, the BLM and Governor Fallin undercut this basic principle of American rule of law.

Supreme Court makes a major mistake by allowing Christian prayers before public meetings

I’m still flabbergasted at the naiveté—or perhaps lack of experience in the world—displayed by Supreme Court Justice Anthony Kennedy in his majority opinion upholding the right of an upstate New York town government to open its public meetings with Christian prayers.

Kennedy writes that the case comes down to whether people are offended by the prayers. His widely quoted words are 1,000% wrong: “Adults often encounter speech they find disagreeable…Legislative bodies do not engage in impermissible coercion merely by exposing constituents to prayer they would rather not hear and in which they need not participate.”

Maybe he should have asked Jews, Muslims or atheists how they feel. I’m quite certain that many, if not most, will tell you that they feel oppressed and assaulted by prayers that invoke Christ or a Christian god at a public or government meeting. Many also feel angry and betrayed by those allowing and enabling prayers of one religion in what is supposed to be a secular and diverse society.

I personally have encountered maybe 20 situations in my life in which clergy or lay people have offered public prayers for one religion—always a form of Christianity—at a public event.  And every single time, I have complained, usually joined by others.  Why? A combination of a deep feeling of oppression and a disappointment that the ideals of a secular society are being trampled upon.

My earliest example was when the coach of my high school football team in Miami, Florida, would ask a member of the clergy to give a prayer before every game. The clergy were mostly Christian, with an occasional rabbi; it was long before the days of Islamic or Buddhist awareness. The prayers were almost always quite ecumenical, with some clergymen not even mentioning a deity. But once, a preacher invoked Christ several times. The three Jewish members of the team (the other two of whom made All City; I was a scrub) hit the roof. We felt angry and betrayed by our coach, Frank Downey, an otherwise wonderful man who had actually played on the same high school football team as my father years before. Coach Downey made sure it never happened again.

When you are different from the majority or from what is considered the social norm, it always feels a little bit like you don’t really belong, whether you are a different color, a different nationality or a different religion. The majority culture impinges on everything—think of the hype and displays of the Christmas season, of the Christian holidays that have become national holidays like St. Valentine’s Day or All Hallows’ Eve, or of the many times politicians talk about their Christian faith. Imagine being a Moslem and trying to explain to your children why you don’t exchange presents on the morning of December 25.

Luckily, our Constitution and the First Amendment guarantee religious freedom and a secular society. I personally believe that a correct reading of the Constitution would prohibit prayer of any type before government meetings, and certainly prayer to a specific deity.

I suggest that Justice Kennedy spend a few hours walking in someone else’s shoes. He might change his mind about what he considers to be coercive or oppressive.

Someday we will get a Supreme Court which is dedicated to interpreting the Constitution and not to completing the Reagan right-wing agenda. Maybe then, this awful Supreme Court decision will be reversed.

Law dean’s rationale for making insider trading legal would allow murder, theft & anything else bad people do

It seems as if the bad idea of the week always shows up on the opinion pages of The Wall Street Journal.

This week it’s the idea that insider trading of stocks should be legal, proffered by Henry Manne, dean emeritus of the George Mason University School of Law, in an article titled “Busting Insider Trading: As Pointless as Prohibition.”

Manne’s reasoning is that, as with the prohibition of drinking alcohol in effect in the United States from 1920 to 1933, the law against insider trading doesn’t stop people from doing it. If people are going to do it anyway, it might as well be legal.

By Manne’s reasoning, murder, theft, incest, rape and every other crime might as well be legal, since people are still going to do it.

We all know, however, that if murder, theft or insider trading were legal, instead of just a few sociopaths doing it, a large number of people would. I don’t think Manne would advocate making murder legal.

The difference between Prohibition on the one hand and these crimes—including insider trading—on the other is the difference between “who cares” and “it’s wrong.” It’s not wrong to drink alcohol, and it never was, except to snoopy moralists. Nor does drinking alcohol hurt anyone but the drinker, unless the drinker does something stupid like driving drunk or giving alcohol to minors, both of which are still against the law.

But it is wrong and unethical to buy and sell stocks based on information that the general public doesn’t have yet. It also hurts other people, especially when the insider is selling a stock that’s about to go into the tank. Near the end of his article, Manne makes the outrageous claim that insider trading does no harm and can have significant social and economic benefits. Of course, he never says what those benefits are. That’s because there are none. Insider trading has been illegal since 1934 because it is unfair and allows the insider to profit unfairly. It is akin to getting an extra out in baseball or starting on third base. I know that a lot of Wall Street insiders did start on third base and think they hit a triple, but that sense of privilege often held by the moneyed—so many of whom are the bankers and executives who obtain the most insider information—should not and does not legally extend to special treatment as an investor.

Manne hides the lunacy of his argument behind an extended simile—a comparison between the FBI tracking bootleggers and other gangsters and the efforts of Manhattan federal prosecutor Preet Bharara to go after insider traders such as Steven A. Cohen’s firm. He spends a goodly number of words glorifying Eliot Ness, only to point out that Ness’s gallant activities led nowhere, since Prohibition was repealed. His analogy is bogus not only because insider trading can’t be compared to drinking alcohol, but because the focal point of the comparison—Eliot Ness—didn’t really get much done. Ness’s reputation was mostly manufactured by “The Untouchables” television series and movies. In real life, he was pretty mediocre, although he did help gather evidence that put gangster Al Capone away—on charges of tax evasion!

I suppose there is some consistency in creating a false comparison in which one of the objects under comparison is also false.

Whenever I see articles like this one, I wonder why a major newspaper—and specifically the Wall Street Journal—would publish them. I know that Manne has a big name in legal circles as an emeritus dean and as a legal theoretician. His big idea—to use economics to analyze legal problems—certainly fits into the Journal’s bailiwick.  But a crackpot idea is a crackpot idea.

British Lord puts a happy face on environmental degradation and resource shortages

One of the most powerful rhetorical devices is to cherry-pick your criteria to get the result you want. We see a classic example of it in “The Scarcity Fallacy,” the lead essay in the Wall Street Journal’s “Review” section this week. Author Matt Ridley, a member of the British House of Lords, says that “ecologists worry that the world’s resources come in fixed amounts that will run out, but we have broken through such limits again and again.”

Lord Ridley’s logical fallacy, which animates his rhetorical trickery, is that he refers only to the human race over the past 10,000-some-odd years of recorded history. If he looked either closer or over a longer term, he might not conclude that because we have always overcome resource shortages, we always will.

The Spanish philosopher Ortega y Gasset once said that the best point of view from which to look at history is the one from which you can just make out the warts on Cleopatra’s nose: close enough to see detail, but not so close that all you see is detail. Ortega believed this theoretical sweet spot reveals overarching truths.

Here’s an extreme example of the impact of measurement parameters on conclusions: In evaluating the greatest center fielders of all time, baseball numbers guru Bill James noted that he usually used the best five years of a career as a major criterion, and by this measurement Mickey Mantle beat Ty Cobb; but if he had measured the best 4, 6, 7 or 8 years, Cobb would have won.

In Ridley’s case, he’s measuring all of humanity over 10,000 years.

But what if he looked more closely? He would find that a number of human societies and cultures have disappeared because of resource depletion: the American Indians at Cahokia, the Pacific Islanders on Rapa Nui, the ancient Minoans on Crete, the citizens of Mohenjo-Daro in the Indus Valley, to name a few.

Ridley could have also taken a wide lens and looked at the 3.6 billion year history of life on earth, or even the 200 million years since mammals first emerged. In both these cases, one of the big lessons of history is that the overwhelming majority of species will eventually become extinct, as they fail to adapt to the ever-transforming environment on Earth.

The danger in Ridley’s conclusion that we’ll figure it out because we have always figured it out in the past is that everyone who says it, including Ridley, uses it to justify a laissez-faire approach that lets the marketplace determine how we meet the resource-depletion challenges we face. In fact, if we are to survive as a species, we need to look at things in new ways and organize societies in new ways. Many are working to save human beings from extinction, for example the scientists researching planets with living conditions similar to Earth’s. These scientists know that our sun’s ever-intensifying heat will evaporate all the water on the earth in about a billion years, so we have to find another place to live before then. The work of these scientists requires public support, and public support requires higher taxes, something that laissez-fairenistas never like. Note, too, that Ridley applauds fracking as an example of the human ingenuity that will supposedly overcome every resource shortage. Well, maybe not the shortage of clean air and water that fracking causes.

Ridley also thinks that large parts of the world haven’t yet been introduced to fertilizer and other advanced agricultural techniques, which seems meager proof that we won’t run out of food. Not only that, he lauds the positive influence humans have on the environment, since birds and other animals often carry fertilizer used on crops into the forests. The article presents the world as seen through the rose-colored glasses of a true believer in technology controlled by private interests.

Ridley is so busy shoveling fertilizer about fertilizer that he ignores the real degradations we are inflicting on our planet and the real threat that resource depletion poses to our future well-being. His smug complacency about the human race will no doubt make many readers breathe a sigh of relief and go about their business using resources profligately. After all, we’ve always muddled through before.

And so did the stegosaurus, until it didn’t.