Sexually explicit TV star replaces stolid man of honor as American Airlines’ ideal traveler

Over the past three weeks, I have found the same slick, mostly black-and-white, eight-page advertising brochure for American Airlines inserted in one or the other of my daily newspapers—The New York Times and The Wall Street Journal—at least six times. This extended advertisement for American’s rebranded first-class international service demonstrates how the image of the upscale consumer has changed over the past 60 years.

The cover depicts Gregory Peck, a film star active from 1944 into the 1980s, nattily dressed in a business suit. His hands are in his pockets as he looks up thoughtfully at a great expanse of sky. The top of a pencil-thin, pin-striped handkerchief peeks out of his coat’s breast pocket. In the blurred background we see the wing of an American airplane. A small headline reads, “In 1953, we invented transcontinental service.”

Turn the page and we see a two-page spread that shows an entire jet and actor Neil Patrick Harris, dressed just as nattily as Peck and just as much the businessman, walking through the terminal. The headline is bigger: “Today, we reinvent it.”

What follows are two more photo spreads. In both, Neil is on the left page inside a plane and actress Julianna Margulies is on the right page, also in a plane. The two thespians are each alone and engaged in activities that reflect the headline of each page: Service. Comfort. Connectivity. Luxury. Julianna looks stunning, dressed in classic simplicity in what could be business wear or a cocktail dress. The photo is black and white, but if it were in color, her dress would still be black, I assure you.

The last page shows the tail of an American jet in full color with the headline “The legend is back.” In fact, the tail is the dominant visual image throughout the eight pages, appearing in three photos, as many as Neil Patrick Harris does.

American’s image of the luxury traveler sure has changed! From Gregory Peck to Neil Patrick Harris (with an assist from Julianna Margulies).

Already by 1953, Gregory Peck was typecast as the mythic American straight arrow—a little stiff and formal, honest and forthright, sincere, always following the rules, no nonsense, dedicated to principles. He had established this stereotype in such films as “The Yearling,” “Spellbound,” “Gentleman’s Agreement,” “The Snows of Kilimanjaro,” “Captain Horatio Hornblower” and “The Gunfighter.” He even played the ruthless and devious King David as an earnest schoolboy in “David and Bathsheba.” This stolid, straight-shooting image of Peck developed long before “The Man in the Gray Flannel Suit,” “On the Beach” and “To Kill a Mockingbird.” American Airlines uses Peck’s persona to represent its image of its customers in 1953—the idealistic American off to conquer the world for democracy (and capitalism).

And what does Neil Patrick Harris represent? For the past 10 years, his resume has consisted primarily of playing the same role in both a long-running sitcom and three Harold and Kumar movies: that of a raunchy and immoral stud who will bed any woman and whose only interest in women is their bodies and sexuality. Whether playing it straight or satirizing it, he is the quintessential cool-as-Sinatra laddie boy with emotions suspended in early adolescence, down to the interest in style and consumer toys and the inability to engage the opposite sex except in games of sexual conquest. Ironically, all the time that he has played the role of an insufferable heterosexual lothario, in real life Harris has been completely out of the closet as a gay man. He is currently playing the lead role in the revival of “Hedwig and the Angry Inch,” a Broadway musical about a transgender rock star.

Whether one approves or disapproves of Harris or the characters he plays, no one can doubt that he represents an amoral, polymorphous sexual adventurism.

Julianna Margulies also represents sexual freedom in a backhanded way, since she is best known for two roles on TV—in “ER” and “The Good Wife”—in which she played a beautiful and talented woman whose man sleeps around and otherwise embarrasses her. Perhaps the secret narrative of the eight-page ad is that she is running away from these sorrows, but still looks like ten million bucks.

The overt message is obvious: The ideal international traveler is still rich and stylish, but now he can also be a she, and in either case, travel abroad is sexy and sensual. The self-sacrificing idealist has been replaced by the fun-loving sybarite.

But beneath the surface commercial message lies an ugly anti-feminist narrative. The mythic contemporary man in the ad is a spiffy philanderer (whose real-life alter ego excludes women as potential sexual partners); the mythic contemporary woman is an alluring victim of philandering. The cultural references that viewers conjure in seeing these two actors do not paint a pretty picture for women. American Airlines seems to be saying that no matter how accomplished, wealthy or beautiful women are, at heart they are just pieces of tail.

U.S. history is studded with presidential dynasties from day one

Whenever the news media begins to stir about Jeb Bush running for president, a pundit or two does some verbal hand-wringing about the ruinous state of our democracy if the wife of one former president were to run against the brother and son of two other presidents.

There are certainly many reasons to fret often about the weakening of democracy in the 21st century: the massive increase in election spending by the ultra-wealthy; the demise of trade unions; the prevalence of lying in public discourse, suborned by the mainstream media; and the refusal of politicians to follow the expressed will of the American people on matters such as taxes on the wealthy (we want them higher) and unemployment compensation (we want it extended).

But the fact that relatives of presidents may be running for our highest office is not a manifestation or a cause of a diminishment in our democratic traditions. Presidential dynasties have been a major part of presidential politics since the birth of the Republic. Most Americans living a full life since 1800 have experienced two presidents who were closely related.

Let’s do the math:

1.  32 years rolled by between the time father John Adams, our second president (1797-1801), took office and his son John Quincy Adams (1825-1829) left office.

    • 12 years passed before William Henry Harrison was inaugurated.

2.  52 years rolled by between the time grandfather William Henry Harrison (1841) took office and his grandson Benjamin Harrison (1889-1893) left office.

    • 8 years passed before Theodore Roosevelt was inaugurated.

3.  44 years rolled by between the time cousin Theodore Roosevelt took office and his cousin Franklin Roosevelt died in office.

    • 44 more years passed before George H.W. Bush was inaugurated.

4.  20 years rolled by between the time father George H.W. Bush took office and his son George W. Bush left office.

If Jeb is elected in 2016 and serves eight years, the Bush presidential dynasty will have lasted 36 years. If Hillary is elected and serves two full terms, the Clinton presidential dynasty would have lasted 32 years, with zero time between dynasties.
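For anyone who wants to check the spans, here is a minimal sketch in Python (purely illustrative; the only inputs are the inauguration and exit years cited above) that tallies each dynasty’s length and the gap before the next one:

```python
# Tally of the presidential dynasty spans cited above.
# Years are each family's first inauguration and last exit (or death in office),
# taken from the dates given in the text.
dynasties = [
    ("Adams", 1797, 1829),      # John Adams takes office -> John Quincy Adams leaves
    ("Harrison", 1841, 1893),   # William Henry Harrison -> Benjamin Harrison leaves
    ("Roosevelt", 1901, 1945),  # Theodore Roosevelt -> Franklin Roosevelt dies in office
    ("Bush", 1989, 2009),       # George H.W. Bush -> George W. Bush leaves
]

previous_end = None
for name, start, end in dynasties:
    if previous_end is not None:
        print(f"  {start - previous_end} years until the next dynasty began")
    print(f"{name}: {end - start} years from first inauguration to last exit")
    previous_end = end

# Hypothetical projections from the text: two full terms ending in 2025.
print(f"Bush dynasty through 2025: {2025 - 1989} years")
print(f"Clinton dynasty through 2025: {2025 - 1993} years")
```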

This catalogue of presidential dynasties leaves out the dozens of other national political dynasties that have always dominated national politics: the Cabots, Dirksens/Bakers, Gores, Hydes, Kennedys, Lehmans, Macks, Madisons, Marshalls, Masons, Rockefellers, Schuylers, Tafts, Talmadges, Wadsworths, Walkers—the list is not endless, but could go on for pages.

It looks to me as if dynastic families have always played a major role in American politics. Nothing has changed.

I’m not saying that presidential dynasties are good for the country. All things being equal, I would prefer that people get by on their talents, not their names. But the fact that a Bush may run against a Clinton does not symbolize the bankruptcy of American democracy. Rather, it serves as an example of how tightly a narrow sliver of the wealthy and the connected has always controlled our politics. We can see that by looking at the backgrounds of the 10 men and one woman involved in this discussion of presidents who were related to other presidents or might be in the future. The Adamses, Harrisons, Roosevelts, Bushes and Hillary Rodham all came from privileged and connected backgrounds; all had every opportunity to succeed handed to them on a silver platter. All, of course, except Bill Clinton, who truly did fulfill the quintessential American myth that anyone can grow up to be President, assuming he or she has talent and drive.

Christie shows once again that the Republican Party is no friend to the working class

No one likes to play pension politics more than the Republicans. For example, over the past few years, Republican politicians and their fellow travelers have accused public workers of receiving overly generous pensions as a means to drive a wedge between them and the rest of the middle class. Rather than force the banks that manipulated Detroit into bankruptcy to give up their outsized profits, the overseers of the Motor City—appointed by Republican Governor Rick Snyder—are taking money from retired Detroit workers.

Republicans use pension politics not only to hurt unions, but to harm government operations. Under Bush, the Republican Congress saddled the Post Office, a quasi-governmental organization, with onerous pension fund payments that have forced it to raise postage rates and cut operations. The Post Office’s private competitors—FedEx, UPS and DHL—don’t have to set aside anywhere near as much money for their pension plans.

The latest Republican to play the pension politics game is New Jersey’s bully in residence, Governor Chris Christie, who intends to balance this year’s state budget by not paying $2.43 billion into New Jersey’s ailing public pension system over the next two years.

In not making these payments, Christie reneges on a deal he made with public unions a few years back. Christie’s pension overhaul shifted more costs to public workers, raised their retirement age to 65, and froze yearly cost-of-living adjustments. In exchange, Christie and lawmakers agreed to make bigger payments each year to the pension fund to repair the financial damage of years of former administrations paying nothing into the system.

When announcing his dead-beating of New Jersey public workers, Christie tried to make himself into a hero of the day by declaring passionately that he refused to cut funds for Medicaid and schools. Of course, Medicaid and support of public schools are two other issues on which the Republicans and many Democrats also play games. After years of advocating cuts to social service programs, Christie came off looking like what he is—a very big hypocrite!

What Christie didn’t say is that New Jersey, like most other states and the federal government, has purposely starved itself over the past three decades by lowering the tax burden, especially on businesses and the wealthy.  This fiscal anorexia is the true cause of the budget shortfall that New Jersey faces.

Instead of weakening the already damaged finances of state pensions, Christie could curtail tax breaks for corporations. He could call for the repeal of the massive tax breaks and other incentives to businesses he signed into law late last year, which will cost the state millions of dollars.  One article two years ago computed the value of Christie’s corporate handouts at that point to be $1.57 billion. If New Jersey reneged on those corporate giveaways, it would leave New Jersey with a $1.03 billion shortfall on pensions, which could be paid for with a bump in taxes of about $382 a year for two years (but probably moving forward as well) on the approximately 1.35 million New Jerseyites making more than $75,000 a year.
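To show where that last figure comes from, here is a minimal back-of-the-envelope sketch in Python, using only the numbers cited above (the 1.35 million filers over $75,000 and the $382-a-year bump are the figures in the text, not independent estimates):

```python
# Back-of-the-envelope check of the pension arithmetic cited above.
# All inputs are the figures quoted in the text, not independent estimates.
taxpayers = 1.35e6      # New Jerseyites making more than $75,000 a year
bump_per_year = 382     # proposed tax bump per filer, in dollars
years = 2               # the two years over which Christie is skipping payments

raised = taxpayers * bump_per_year * years
print(f"The two-year bump would raise roughly ${raised / 1e9:.2f} billion")
# -> roughly $1.03 billion, the shortfall figure cited in the text
```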

But making businesses and the relatively well-off pay their fair share is not part of the Republican play book. At least since Ronald Reagan gained influence, Republicans have been dedicated to cutting taxes ever more on the wealthy and businesses while shrinking the services that government offers to everyone—from support of public schools and universities to funding for libraries, roads and the indigent.  It’s now been more than 30 years since wealth and income started flowing from the poor and middle class to the wealthy and especially the ultra-wealthy. By refusing to fund pensions for public employees and ratcheting up giveaways to businesses, Christie is merely going with the flow, just floating along a rising tide which is lifting the oversized yachts of the rich while sinking everybody else.

NH town should fire police commissioner who called the President the N-word

Everyone has the right to exercise free speech, except when they are serving as a representative of an organization. As an organizational representative, what you say should hew to the standards and beliefs of the organization.

It’s amazing how many times public language controversies hinge on whether the individual who made the statement was representing him or herself or an organization. In essence, representing the organization was at the heart of the “Duck Dynasty” and Donald Sterling controversies. A&E thought Phil Robertson, star of the reality show “Duck Dynasty,” was representing the network when he made sexist and then racist statements. If Donald Sterling didn’t represent a professional basketball team, his vilely racist comments would not have made such news. The National Basketball Association recognized that as an owner Sterling always represented the league and therefore fined him and is trying to force him to sell his team.

Representing the organization is also at issue with Wolfeboro, New Hampshire police commissioner Robert Copeland, who called President Obama a “nigger.”

For at least the last 150 years, virtually everyone who speaks English has regarded “nigger” as a term of disparagement, similar to “kike,” “dago” or “frog.”

Some defenders of the use of the term “nigger” point out that calling a white from the rural south a “cracker” is the same thing.  Since one is supposedly “allowed,” why not the other?

This view neglects the historical fact of slavery and the legal and institutional racism that poisoned much of the United States for more than a century after the demise of slavery and still affects the country in negative ways. “Niggers” were chattel that could be sold. “Niggers” had no control over their own lives. “Niggers” suffered the physical pains and humiliation of whippings, forced separation of families and rape. “Niggers” were considered genetically and morally inferior creatures who couldn’t take care of themselves, who shunned work and didn’t know how to handle money. “Niggers” were less than human. “Niggers” could be easily fooled since they had the emotions of children. “Niggers” deserved to live in poverty since they didn’t know how to work hard.

That’s what most whites meant when they used the N-word from about 1800 onwards.

Everyone knows that the word “nigger” reflects all these meanings, which makes it a more scurrilous and damaging term than other ethnic insults, at least in the United States.

Even the Afro-American men who use the term with each other as a kind of endearment know that it’s a horrible insult. Male bonding often devolves into gentle competitive sparring. I don’t know how many times I have called my male cousins “assholes” or “bozos” to their faces, and none got mad, because they knew that my diminishment of them was a form of affection—or perhaps a replacement for the affection that American men are not supposed to display to other men. The very fact that the word “nigger” is especially harmful and disgusting makes it an ideal choice for male bonding between Afro-American men. It doesn’t make the word acceptable in any other context, and certainly not acceptable for whites to use, because when whites say “nigger,” it carries all its historical baggage.

Barack Obama is a smooth-talking city slicker who had an outstanding career as a scholar and elected official, and has excelled at everything he ever tried (except being a progressive president).  He shares not a trait with the composite “nigger” that people evoke when they use that word. To call Obama a “nigger” can only be understood as an insult. When Copeland refused to apologize, he said that the President “meets and exceeds my criteria for such.”  He could not have possibly meant anything other than the vilest of insults.

Now Copeland is entitled to his opinion about the President and African-Americans in general, and he has the right to express it.

But he doesn’t have the right to express it as a representative of Wolfeboro.

The simple fact of the matter is that when you are an elected or an appointed official, you pretty much sign on to representing your jurisdiction on a 24/7 basis. Once you become a mayor, Congressperson, police commissioner or judge, you de facto give up some right of speech. So whereas you might be entitled to your opinion and to use language that is inherently insulting, exercising that right will conflict with your public duties and with the image that your jurisdiction wants to maintain.

Executives of corporations face a similar constraint: if the news media discovered that the chief executive officer of a Fortune 500 company had said “nigger” during a private dinner party, he or she would be unable to hide behind the right to free speech. The board of directors would summarily fire him or her. And they would be right to do it.

While no decision has been made yet, I’m guessing that the Wolfeboro township commissioners are going to end up firing Copeland or asking him to resign. They will have no choice. Otherwise, they will be tarred with the same racist brush that has rightfully dirtied their police commissioner. Now that kind of institutional racism might fly in the rural south, but probably not in New England.

Colleges across the country show that two wrongs don’t make it right

Across the country, colleges are holding graduation ceremonies for hundreds of thousands of graduates. But what used to be called the graduation season is rapidly gaining a new name: commencement speaker cancellation season.

This year in particular there seems to be an unusually large number of high-profile commencement speakers who have backed out or have been disinvited after campus protests. Former Bush II Secretary of State Condoleezza Rice was forced out as Rutgers’ commencement speaker. International Monetary Fund (IMF) chief Christine Lagarde withdrew from speaking at Smith College after protests. Former University of California-Berkeley Chancellor Robert J. Birgeneau withdrew from speaking at Haverford after protesters demanded that he apologize for having campus police use batons against Occupy protesters. Brandeis University reversed its decision to award feminist critic of Islam Ayaan Hirsi Ali an honorary degree after protests by right-wing Moslems over her criticism of their religion. Even Michelle Obama changed her mind about going to a Kansas high school graduation after right-wingers protested.

In three of these cases, progressive protesters forced out conservative speakers. In the two other examples, religious radicals forced out moderates whom they mistakenly labeled as extremists.

Taken together, these commencement speaker cancellations involve a series of laudable and not-so-laudable actions.  In all examples, we should admire the protesters for exercising their right to make their opinions known.

But we should be disappointed in and ashamed of the institutions and the prominent individuals who backed down. Protests would have made for messy commencements, which are usually drawn-out affairs that involve a lot of sophisticated choreography to move thousands of graduates on to and off the stage in a short amount of time. But so what? Life is messy and democracy is messy. By backing down or backing out, the individuals and institutions demonstrated a lack of respect for public discourse.

While we can admire organizations such as the New York Public Library that back down when it turns out their plans are not in the public’s best interests, giving a speaker a platform is never an occasion for backing down. Instead, the institutions could expand the venue—for example, getting another speaker to balance the controversial speaker or creating a special forum to discuss the controversies during commencement week.  The colleges could even give the protesters 10 minutes at the ceremonies to make their points.

To the degree that the speakers themselves made the decision to withdraw, they should be ashamed of themselves. They took the actions that made them controversial. They should own what they did or repudiate it. They should not run away from a spotlight that they themselves created.

For some of the educational institutions in question, backing down from the original plan marks their second mistake. Their first was to invite the speaker in the first place.

Let’s start with Birgeneau: To invite an obscure university administrator known for one action only is an open endorsement of that action. Haverford officials were stating that they thought it was right to beat up peaceful Occupy protesters. No wonder faculty and staff mobilized against the decision to ask Birgeneau to speak.

The case of Condoleezza is also easy: Commencement speakers are supposed to send graduates off on the journey that will be the rest of their lives with hopeful advice that spurs their enthusiasms and aspirations. The commencement speaker thus carries a certain moral authority.  How can anyone who was associated with the decision to create a world-wide network of torture facilities be considered a moral authority? Rice, Cheney, Bush II, Rumsfeld, John Yoo, David Addington and anyone else who was involved in deciding to pursue torture as an instrument of war should be American pariahs. No university invited Joseph McCarthy or Roy Cohn to speak after their disgrace and none should invite these international criminals, either.

Some would argue that Christine Lagarde is also a war criminal by virtue of her activities as head of the IMF. I would agree that many IMF actions have hurt people while protecting the interests of banks. But that is a political argument between left and right. Lagarde’s politics do not in and of themselves forfeit her moral right to be a commencement speaker, as the actions of Condoleezza Rice and Robert Birgeneau do.

The case of Brandeis University is the trickiest. On the surface, there is nothing morally objectionable in Ayaan Hirsi Ali’s actions and statements. She has fought for years against female genital mutilation and formed an organization, the AHA Foundation, whose mission is to work “to protect and reinforce the basic rights and freedoms of women and girls, including security and control of their own bodies, access to an education, the ability to work outside the home and control their own income, freedom of expression and association, and the myriad other basic civil rights defined under the laws of Western democracies and the Universal Declaration of Human Rights.” The fact that she is a fellow of the ultra-conservative American Enterprise Institute is troubling, but I can see why any mainstream organization would want to demonstrate its commitment to human rights by awarding Ali an honorary degree.

Except that Brandeis is not just any mainstream organization. It is a Jewish organization giving an award to a woman who is disrupting Islam. Yes, we should support her disruptions, just as we support the disruptions that pro-choice advocates and supporters of LGBT marriage make to Christian and Catholic religious institutions and beliefs. But when Brandeis does it, it carries stark and obvious symbolism, because it’s as if a Jewish organization is taunting Islamists, purposely getting their goat. It plays into the myths that many right-wing Moslems have about Jews.

Someone—actually a lot of people—at Brandeis must have known that giving an honorary degree to Ali would piss off the Islamic right wing, which makes the withdrawal of the degree particularly obnoxious and cowardly. If you are going to stir the pot, don’t wimp out, which is what Brandeis has done.

If comic strips are an indication, breakfast in bed is out for moms on Mother’s Day

As in 2012, I thought I would analyze Mother’s Day this year through the lens of the Sunday comics.  And what a difference two years makes!

For one thing, two years ago I looked at all 20 comics printed in the Sunday edition of the Pittsburgh Post-Gazette. This year I looked at a semi-random sample of 30 comics out of the more than 70 on the Yahoo! Comics web page; those 30 included most of the 20 I remembered from the Post-Gazette.

Judging from the results of the two surveys, Mother’s Day is not as important as it used to be. Two years ago, 50% of all comics had a Mother’s Day theme.  This year, it was down to one-third. Even family-centered comic strips such as “Momma,” “Fox Trot” and “Family Tree” avoided the holiday.

In analyzing the topics of Mother’s Day comics two years ago, I found that half of them—or 25% of all cartoons that day—focused on bringing mother breakfast in bed. I concluded that breakfast in bed had become the standard practice for this manufactured holiday. Perhaps it was just a fad, because this year not a single strip I saw depicted other family members preparing and serving mother breakfast in bed.

The interesting thing about breakfast in bed is that the act of preparing and serving it doesn’t involve extra consumer purchases (since breakfast typically consists of foods already in the fridge). This lack of consumerism made Mother’s Day unlike other holidays, which tend to reduce to buying and giving gifts.

Breakfast in bed is missing from this year’s comics, and so, for the most part, is consumerism. True, “Arlo & Janis” creates an emotional competition between the husband giving jewelry and a phone call from a faraway son. The “Peanuts” rerun details the act of selecting a card, part of the buying process.

But most of the strips focus on serving, words of appreciation and things one can make or do to show mother the love:

  • Doing chores for mom (“B.C.”)
  • Giving flowers (“Luann”)
  • Gathering flowers and making a card (“Nancy”)
  • Mom reversing roles and doing everything for the child (“Cathy”)
  • Mom getting drunk at a multi-family barbecue (“Stone Soup”)
  • The magic mirror on the wall calling mother the fairest of them all (“Wizard of Id”)
  • How we love mom’s nagging (“Drabble”)
  • Mother and daughter spending the day together, bicycling on the street (“Jump Start”)

Note that the food service takes place at home—in the kitchen or backyard. No restaurants.

So even as comic strip moms are denied the pleasure of breakfast in bed, their families are nonetheless giving more of themselves in a direct way and depending less on engaging in commercial transactions as the means to celebrate the holiday and express their emotions.

This turn to the virtues of interaction and investment of self that we see in comics may or may not reflect a change in society. Depending on which report you read, Mother’s Day spending will be either up or down this year in the aggregate. Per capita spending will go down by $5. Yet even at the low end, Americans will spend more on Mother’s Day than on any other holiday but Christmas. But they spend on very few things: one study reports that people mostly give their moms traditional gifts of a card (81.3 percent), flowers (66.6 percent) or a nice meal out (56.5 percent). The first two predominate in the cartoon world.

Mother’s Day has thus not been privatized into a holiday that exists only within families. It still finds expression in the economic realm. People still interact with the rest of the world in the planning and implementation of holiday plans, and they interact the way they know best—by making a purchase that represents an emotion.

News media finally pay attention to a problem they helped to create—the infantilization of adults

The American news media may finally be starting to cover a trend that they helped to create—the infantilization of American adults.

The infantilized adult continues the pursuits, hobbies, predilections, opinions and thought processes of youth instead of growing into mature, adult interests and activities. Reading Harry Potter and comic books, playing with Legos or My Little Pony dolls, collecting action hero paraphernalia, spending much of their free time playing video games, vacationing at Disney resorts and amusement parks—these are all signs that an adult is wallowing in callow youth instead of growing up. The mass media has of course glorified each and every one of the trends that together are creating the infantilized American adult.

For years, American comedy movies in particular have celebrated adults who refuse to grow up. The “Harold & Kumar” movies, “Old School,” “Big,” “Grandma’s Boy,” “Ted,” “Wedding Crashers,” “Billy Madison,” “You, Me and Dupree,” “Dodgeball,” “Step Brothers,” “The 40-Year-Old Virgin,” “Knocked Up,” all three “Hangovers,” the “Jackass” movies, “Bridesmaids,” “Hall Pass” and “Identity Thief”—this off-the-top-of-my-head list doesn’t even scratch the surface of the multitude of movies released over the past 20 years centered on men and women who refuse to grow up.

Now A.O. Scott, a New York Times film critic, has realized that staying a child is a major theme of American comedy films.

Scott announces his discovery in his review of “Neighbors,” which explores the trick-filled feud between a suburban couple who retain an adolescent lifestyle and the unruly fraternity that moves next door.

He really does nail the current state of American comedy, so I want to give an extended quote:

“The central problem in American film comedy for the past 15 years or so — let’s say from middle-period Sandler through prime Apatow and late ‘Hangover’ — has been maturity, or, more precisely, its avoidance. In the old days, adulthood was a fact. Now it’s a vague, unproven theory. Adolescence used to represent constraint and frustration, to be left behind as quickly as possible. For the heroes of the New American Comedy, it represents a blissful state of hedonistic freedom, to be held onto for as long as possible.

“How to stay a child when the world expects otherwise — and how to make the world love you anyway — has usually been, in these movies, a male predicament. Women have been sirens or mommies, on hand to inflame the boys’ desires or soothe their fears. This has begun to change recently, although mainly on television, where shows like ‘Girls’ and ‘Broad City’ have extended the privileges of arrested development on a more or less equal-opportunity basis.”

It’s not Scott’s job to put the tide of comedies about adults remaining children into a broader social context, but it’s clear to me that these movies both reflect the cultural shift and help to shape it.  In most of these movies, the immature heroes and heroines grow up a little bit in the end, but these movies are not cautionary tales about arrested development.  No, they all glorify and endorse infantilization—it’s much more fun than behaving as an adult.

When you add the number of these comedies about men and women remaining boys and girls to the number of fantasy superhero movies, the conclusion is clear: Hollywood is dedicated to promoting the perpetual adolescence lifestyle to the American public.

Article claims to tell us what rich people believe but is really telling us what they want us to believe

U.S. News & World Report is fronting another article that purports to tell us how rich people think. As with other articles of this ilk, “7 Things Rich People Believe” reduces to a series of ideological beliefs presented as facts.  In the article, the writer doesn’t even attempt to justify his assertions about the minds of rich folks. No studies, not even an anecdote—just a series of wishes, assumptions and stale advice, all tinged with the ideology of greed and consumerism.

As it turns out, the writer is Tom Sightings, whom I have chided before for his ideologically tinged and accuracy-challenged articles arguing that big cities are not good for retirement and that people should move to avoid paying school taxes once their kids are out of school. Sightings seems to specialize in advocating the politics of selfishness in cute, homey articles that render general advice that always seems to be the same pabulum extolling greed, consumerism and the belief that the rich are better people.

The article opens by asserting that most people both love and hate money—like it but believe it’s evil to be greedy. Sightings then exhorts us to “get beyond your mixed feelings about money and start thinking like a rich person.”

And what does Sightings say a rich person thinks about money? That it’s not evil. That there’s nothing wrong with wanting more of it. And that you (the rich person) deserve to have it. Those three thoughts proffered by Sightings are all permissions to be greedy. For example, he never considers that it might be wrong for a billionaire to want more money or that people should feel ashamed to display enormous wealth when others are starving or struggling. Consider this statement: “The wealthy are not inherently dishonest; they do not feel ashamed of their first-class lifestyle or their bulging portfolios. In fact, most rich people take pride in their accomplishments and enjoy the fruits of their labors.” What these two sentences are really saying is that 1) the only sign of success is making money; 2) the only way to take pride in your accomplishments is to spend money; and 3) all rich people earn their wealth (meaning they deserve it). These are all basic principles of the American consumer ideology, hammered into us daily by the news media, our civic leaders and mass entertainment. But nowhere does Sightings prove that any of these statements are true.

To these general ideological beliefs, Sightings adds an out-and-out lie: that the way to achieve real wealth is to earn more, not save more. Tell that to all the trust fund babies; the inexperienced kids who go to the front of the line for jobs because of their rich parents’ connections; or those born millionaires, like Bill Gates, Michael Dell and Mitt Romney, who leveraged their parents’ wealth into multi-millions or billions. All studies suggest that the best way to achieve wealth is to be born into wealth. Now maybe Sightings is right that rich people believe the lie that the road to riches runs through your job—but I don’t think Sightings ever asked, and it’s convenient that it’s what rich folk who don’t want to pay a lot of taxes would want us to believe.

Sightings completes his list of what rich folk think with some of the more common business success tips that we’ve heard since the days of Dale Carnegie and before: Rely on brainwork. Live below your means. Spend more on education and less on entertainment. Like all writers on business success, though, when Sightings says “education,” he really means vocational training. “Yet these people typically do not put a lot of faith in formal education or fancy degrees. They focus on useful, practical skills that are relevant to their career.” In Sightings’ world, you won’t catch a rich person reading Plato or Proust, studying environmental science or contemplating the lessons of Chinese history.

Articles claiming or implying that rich people think differently, and those giving tips on how to think like a rich person, pop up in the mass business media about every six weeks. All build their case on assumptions and anecdotes. All happily support the status quo.

These lists of what the rich think or how they differ from others always communicate three hidden messages:

  1. There is one route to success, which, of course, is to buy into the American ideology of selfishness and consumerism.
  2. Rich people deserve to be rich; their wealth does not depend on luck, connections, prior wealth or the accidents of birth.
  3. Everyone can become rich. All you have to do is think and act like a rich person.

The flip side of the third message is that when you don’t become rich, it’s your fault. You didn’t work as hard as that investment banker (even if you worked as many hours at your job as a janitor). You didn’t get enough training, or you got the wrong kind of training (I guess that associate’s degree was a mistake—too bad you didn’t have the bucks for Harvard!). You didn’t have the right attitude or the right thought process. Maybe you stayed poor because in your heart you didn’t like yourself enough to get rich. Whatever the reason, it’s all your own fault.

These articles purporting to analyze the wealthy thus serve to enforce the American ideology—to make us like the wealthy and not resent them, to make us want to be like them and to accept their version of what’s best for society.

Just the kind of stuff that rich people—those who own the media and advertise on it—want us to believe.

Is the U.S. giving up its support of the rule of law?

Note: I’m giving over today’s blog to distinguished anti-death-penalty attorney Marshall Dayan. Here’s what Marshall has to say about the rule of law in contemporary America:

Americans are rightfully proud of our historical leadership when it comes to support of the rule of law: this idea that the law prevails and that our independent judicial system will apply the law to all in a fair and consistent manner.

But a number of events over the past few weeks make me wonder if the rule of law is losing some of its vitality in the United States.

Supreme Court Justice Antonin Scalia recently told law students at the University of Tennessee that they should think about revolting if taxes get too high.  He did not recommend that those opposed to high taxes organize politically and elect representatives who would reduce high taxes. He suggested that they consider a revolt.

In Nevada, a rancher named Cliven Bundy has refused to pay fees for grazing his cattle on federal land for the past twenty years. Some 16,000 other western cattle ranchers graze their cattle on federal lands, and they pay grazing fees for doing so. Bundy has outspokenly rejected the authority of the federal government and the Bureau of Land Management (BLM) to charge him grazing fees, asserting that the land belonged to the State of Nevada. The federal court rejected this argument and ordered him to pay the fees. The court also found him to be trespassing on federal land in the absence of payment. Bundy became a libertarian cause célèbre by defying the court’s orders requiring him to pay the grazing fees. (His image became tarnished when he revealed himself to be a blatant racist, ironically chastising impoverished African-Americans for availing themselves of federal government economic programs while he abused government resources for his own economic advantage.) Rather than enforce the court’s orders, BLM backed down, at least temporarily, in the face of armed resisters who have gathered in Bunkerville, Nevada to defend Bundy’s continued illegal grazing on federal lands.

In both cases, federal officials—Justice Scalia and BLM—have weakened the concept of the rule of law.

Another example of abandoning the rule of law came when the Supreme Court of Oklahoma recently issued a stay of execution to protect its jurisdictional right to decide whether an Oklahoma statute barring the revelation of the manufacturer of drugs for lethal injection violated the state constitution. After the court ruled, Oklahoma Governor Mary Fallin defied the court’s stay order. She issued an executive order scheduling two executions for April 29, 2014. In issuing her executive order, Governor Fallin wrongly argued that the Oklahoma Supreme Court had acted beyond its constitutional authority and that therefore she would not follow its order. As an aside, Oklahoma badly botched the first of the two scheduled executions. The condemned prisoner, Clayton Lockett, died of a heart attack forty-three minutes after the lethal injection failed. Governor Fallin then delayed the execution of the other prisoner, Charles Warner.

Our second president, John Adams, supposedly coined the phrase “a government of laws, and not of men.” Adams believed that while all people are fallible, we strive to create rules to be applied fairly and consistently. This idea comes directly from the Hebrew Bible. Leviticus 19:15 commands, “You shall not render an unfair decision: do not favor the poor or show deference to the rich; judge your neighbor fairly.”

There will always be disputes about the boundaries of government power. An independent judiciary is necessary to settle these disputes.  Without it, we run the risk of devolving into chaos.  In United States v. United Mine Workers, Justice Felix Frankfurter wrote that “[t]here can be no free society without law administered through an independent judiciary. If one man can be allowed to determine for himself what is law, every man can. That means first chaos, then tyranny.”

Political differences are healthy, and are to be wrestled with in a democratic republic. But courts must remain independent, and must be honored and respected by people of good will on all sides of all issues, or we risk losing our democratic republic to a tyranny of raw power. The recent statements and decisions by Justice Scalia, the BLM and Governor Fallin undercut this basic principle of American rule of law.

Supreme Court makes a major mistake by allowing Christian prayers before public meetings

I’m still flabbergasted at the naiveté—or perhaps lack of experience in the world—displayed by Supreme Court Justice Anthony Kennedy in his majority opinion upholding the right of upstate New York government officials to say Christian prayers before town meetings.

Kennedy writes that the case comes down to whether people are offended by the prayers. His widely quoted words are 1,000% wrong: “Adults often encounter speech they find disagreeable…Legislative bodies do not engage in impermissible coercion merely by exposing constituents to prayer they would rather not hear and in which they need not participate.”

Maybe he should have asked Jews, Muslims or atheists how they feel. I’m quite certain that many, if not most, will tell you that they feel oppressed and assaulted by prayers that invoke Christ or a Christian god at a public or government meeting. Many also feel angry and betrayed by those allowing and enabling prayers for one religion in what is supposed to be a secular and diverse society.

I personally have encountered maybe 20 situations in my life in which clergy or lay people have offered public prayers for one religion—always a form of Christianity—at a public event.  And every single time, I have complained, usually joined by others.  Why? A combination of a deep feeling of oppression and a disappointment that the ideals of a secular society are being trampled upon.

My earliest example came when the coach of my high school football team in Miami, Florida, would ask a member of the clergy to give a prayer before every game. The clergy were mostly Christian, with an occasional rabbi; it was long before the days of Islamic or Buddhist awareness. The prayers were almost always quite ecumenical, with some clergymen not even mentioning a deity. But one time, a preacher invoked Christ several times. The three Jewish members of the team (the other two of whom made All City; I was a scrub) hit the roof. We felt angry and betrayed by our coach, Frank Downey, an otherwise wonderful man who had actually played on the same high school football team as my father years before. Coach Downey made sure it never happened again.

When you are different from the majority or from what is considered the social norm, it always feels a little bit like you don’t really belong, whether you are a different color, a different nationality or a different religion. The majority culture impinges on everything—think of the hype and the displays of the Christmas season, of the Christian holidays that have become national holidays like St. Valentine’s Day or All Hallows’ Eve, or of the many times politicians talk about their Christian faith. Imagine being a Moslem and trying to explain to your children why you don’t exchange presents the morning of December 25.

Luckily, our Constitution and the First Amendment guarantee religious freedom and a secular society. I personally believe that a correct reading of the Constitution would prohibit every type of prayer before government meetings, let alone prayer to a specific deity.

I suggest that Justice Kennedy try to walk a mile in someone else’s shoes for a few hours.  He might change his mind about what he considers to be coercive or oppressive.

Someday we will get a Supreme Court which is dedicated to interpreting the Constitution and not to completing the Reagan right-wing agenda. Maybe then, this awful Supreme Court decision will be reversed.