Marketers are discovering a rapidly growing group of consumers: adults who want to remain children

The latest marketer to cash in on the trend of adults wanting to remain children is a museum.

The American Museum of Natural History (AMNH), that venerable icon of the natural sciences, is now offering special sleepover parties—for adults only. That’s right, for a mere $375 a person ($325 for members), you can snuggle up in jammies in your sleeping bag on a cot provided by the museum under the enormous blue whale in the Hall of Ocean Life with 149 people you have never met before. I don’t know if they’re serving cookies and milk or s’mores and hot chocolate, but I understand that lights out is about 1:42 am. BTW, the museum has offered sleepovers for families for about eight years.

Now, the adult meaning of sleepover is much different from its meaning when applied to children. For adults, a sleepover means having sex, usually for the first time or early in a relationship. For kids through their late teens, by contrast, it means making popcorn, watching movies, talking through the night and having mom make pancakes or French toast in the morning.

Which do you think the AMNH sleepover resembles? There is no way the museum trustees, the insurance companies or the police are going to allow condoned sex, nor do I think many adult couples who attend the sleepover are going to want to engage in conjugal relations in full sight and earshot of everyone else trying to sleep on a cot. There may be some hidden hanky-panky among the mastodons or in a bathroom stall, but the point of the AMNH sleepover is not sex. It is therefore not an adult sleepover, at least not in the conventional or traditional sense.

What is it then? Well, you get a chance to see the exhibits, just like on a regular visit or at a special event such as a singles night or members day. You get to hear guest lecturers, just like on a regular visit or at a special event. You get the run of the place pretty much to yourself, which is not like the wall-to-wall masses of chattering humanity of a regular visit, but very much like a special museum event.

The only thing that differentiates the sleepover from other museum events, then, is the sleepover itself. The big selling point of this adult event is something from childhood.

In other words, a major American museum is appealing to those adults who want to do something from their childhood—have a sleepover. The museum’s marketing department is trying to cash in on the growing number of adults who collect My Little Pony dolls, play with Legos, like to go to Disney theme parks, read comic books and juvenile fiction like Harry Potter or spend a lot of time playing shoot-’em-up video games. And judging from the stories we see in the mass media, their number is growing by leaps and bounds. You can see just how far the infantilization of American adults has progressed when you peruse the growing number of movies dedicated to adults preserving the life they led as children: the “Harold & Kumar” movies, “Neighbors,” “The Internship,” “Old School,” “Big,” “Grandma’s Boy,” “Ted,” “The Wedding Crashers,” “Billy Madison,” “You, Me and Dupree,” “Dodgeball,” “Step Brothers,” “The 40-Year-Old Virgin,” “Knocked Up,” all three “Hangovers,” the “Jackass” movies, “Bridesmaids,” “Hall Pass” and “Identity Thief” start the long list of movies that glorify not growing up.

Going to the museum is a pretty adult thing to do, unless it’s a children’s museum or the museum has decided to focus an exhibit on a child’s level of discourse. And keep in mind that the purpose of children’s museums or children’s exhibits is to guide children in learning how to appreciate the adult experience that is museum-going. So how does a palace dedicated to the scientific education of all ages attract the fast-growing segment of adults who don’t want to grow up? AMNH has come up with a brilliant solution: combining the very adult pleasure of looking at scientific specimens and analyzing information about the natural world with the child’s treat of having a sleepover.

While we should all retain a child’s sense of wonder and curiosity, I believe that at a certain point it’s time, as Saul of Tarsus said, to put away childish things. His full quote, according to the King James Version of the Christian Bible, is: “When I was a child, I spake as a child, I understood as a child, I thought as a child: but when I became a man, I put away childish things.”

The infantilization of American adults is a clear and present danger for representational democracy because adults who constantly participate in child-like activities are not practicing their adult thinking and emotional skills. I believe that mass marketers like infantilized adults because they make more docile and credulous consumers. But I for one would much rather have those who think like adults make decisions in the real world.

American political situation begins to resemble Alice’s Adventures in Wonderland

Have we fallen down a rabbit hole and entered a surrealistic world as Alice did when she fell into Wonderland?  Have we walked through a looking glass to a world that looks like ours but operates on a weird kind of logic?

I can’t be the only one who looks at the political scene in Washington, D.C. and concludes that it looks a lot like Lewis Carroll’s 19th-century fantasies, Alice’s Adventures in Wonderland and Through the Looking-Glass, and What Alice Found There.

What could be more bizarre than the lawsuit of the Republicans against President Obama because he has failed to implement parts of a law that they vehemently opposed and then spent four years trying to repeal?

Or how about this bit of logic from John Boehner, Speaker of the House of Representatives? In the morning he says that Congress will come up with a plan to solve the current border crisis and in the afternoon we find out the plan is to tell the president to do something. Of course, if the president does do something about the border crisis, he will be expanding the powers of the presidency, the same sin of which the lawsuit accuses him.

Or how about this bit of lunacy? Dick Cheney, John McCain and others want to send more troops to Iraq to support a government known for repressing segments of its population. Or think of a world turned upside down in which the United States supports a coup d’état and Russia supports the constitutionally elected government. Believe it or not, that’s what has happened in Ukraine—although I should add that this by no means serves as an endorsement of Russia’s support of the Ukrainian rebels.

As our political landscape begins to resemble the Wonderland into which Alice fell, I find myself assigning characters to an imaginary Washington Wonderland: Our President would have to be Alice. John Boehner as the unctuous but ineffectual White Rabbit and Ted Cruz as the completely bonkers Mad Hatter are easy calls, and if Cruz is the Mad Hatter, then Jeff Sessions is the March Hare and Mitch McConnell is the Dormouse who sits squeezed tight between these two crazies. The Mock Turtle, a creature that doesn’t exist outside of a mediocre pun, is Marco Rubio. Paul Ryan is the mean-spirited and hypocritical Duchess.

And who is the self-satisfied Cheshire Cat, sitting in a tree above the action—or should I say above the rhetoric and posturing masquerading as action—with a sly, knowing smile? It has to be America’s ultra-wealthy as represented by those fat cats, the Koch brothers. While our elected officials, especially our legislators, continue to create chaos out of order, the ultra-rich enjoy a regime of low taxes, enormous tax loopholes for corporations, insufficient environmental regulations and a foreign policy driven solely by the needs of the 1%. The Cheshire Cats smile, too, because they know they are sitting pretty, what with current campaign finance laws that allow them to buy elections and the Republican campaign to restrict voting rights (which, BTW, takes a page from wealthy Northerners of the 1870s and from the South between the end of Reconstruction and the Civil Rights Act of 1964, both of which moved to dramatically restrict the pool of eligible voters).

The more things don’t change, the more the ultra-wealthy smile like Cheshire Cats.

Greedy, self-serving billionaire thinks buying yacht is act of charity

Dennis M. Jones is a billionaire who claims that the $34 million he paid for his new yacht was a form of charity since the yacht creates jobs in the manufacturing, maintenance, cleaning, furnishing, decorating, cooking, and serving industries.

Funny, that’s the very same self-serving rationale that society matron Cornelia Martin gave to justify an egregiously ostentatious masked ball in New York in 1897—in the middle of a depression—at which one wealthy woman came wearing today’s equivalent of $6.4 million in jewelry sewn into her dress. Thus reports Sven Beckert on the very first page of The Monied Metropolis, his history of the concentration of wealth in the hands of New York financiers, manufacturers and merchants during the Gilded Age (1850–1896). The Gilded Age ranks second among American epochs in the flow of wealth upwards to a handful of very wealthy people. First place, of course, goes to the current era, which started in about 1980 and which I like to call the Age of Reagan.

If you’ve wondered why rich folk can so glibly come up with convoluted excuses for why it’s great for the government to follow policies that take money from the middle class and poor and give it to them, now you know: they’ve been laying down the same line for more than a century.

Let’s take a look at two scenarios for an alternative use of Mr. Jones’ $34 million and the untold millions he pays every year for the upkeep of his metal-and-fiberglass leviathan.

The first scenario is a fantasy socialist utopia in which hundreds of families enjoy a weekend or week owning the yacht. The government pays the cost of maintaining the yacht for the succession of temporary owners. Living on the yacht is open to everyone, or at least to everyone who likes that sort of thing. I’ll take the three-week trip with luxury accommodations across the old Silk Road instead!

Now a second, realistic scenario: the government spends the additional $34 million it collects from Jones on teacher salaries, thereby shrinking class sizes and providing an environment more conducive to learning. Or maybe the government could fund $34 million in research into Alzheimer’s disease or cancer. $34 million would repair a lot of highways and feed a lot of hungry children. What about spending $34 million to subsidize consumer purchases of solar heating equipment?

Every single one of these uses for $34 million creates jobs. More importantly, they all improve our society and future economy more than does buying and operating a yacht for the gratification of a single individual.

“But it’s Mr. Jones’ money!” will shout those like George Will and the Roberts court who place property rights above human rights and social needs.

Yes, but in any other decade since we enacted the 16th Amendment in 1913, Mr. Jones would be paying a higher rate on the taxes from both his income and investments.

Moreover, no matter how hard Mr. Jones worked, he did not really earn that money by himself. The pharmaceutical company he sold 14 years ago for $3.4 billion used roads, bridges, sewers, airports and pipelines built with government money, all protected from both local and foreign threats by government organizations. His employees were mostly educated in public schools. They all made a ton less money than he did—and does—and none benefited the way he did from the company sale.

I couldn’t find a biography of Jones, but it doesn’t matter for my argument. He earned a billion in only one of two ways: Either he was already rich and connected or he was born with a talent which he may have burnished but which he did nothing to create. Either way, luck had at least as much to do with it as hard work. I’m sure that there are servants, boat engine maintenance specialists and cooks who work just as hard. 

In other words, Mr. Jones owes a lot to society and to luck. I’m not saying he shouldn’t earn—and spend—more than other people. Nor am I proposing a ceiling on wages and wealth.

What I am saying is that we have to return tax rates to their pre-1980 levels as long as there is still widespread poverty in the world, schools are overcrowded, our infrastructure is rapidly deteriorating, global warming is exacting severe penalties in lost lives and wealth, and people still suffer from debilitating ailments.

If that means Mr. Jones couldn’t afford to spend $34 million to buy a yacht and then untold millions to operate it, so be it. Life can be tough for billionaires.

Fill in the blank: Americans are living in the land of the____. My answer: guns

What comes first to mind when you think of the United States?

High standard of living? Beacon of representational democracy? The melting pot? Consumer society?  Fast food and blockbuster movies?

Land of the free? Home of the brave?

Not me.

When I think of the United States, the first image that comes to my mind is a gun.

We are a society awash in weaponry with an economy in large part based on weaponry.

Let’s start with the fact that we sell three quarters of all the arms exported around the world. That means that of every dollar’s worth of bombs, tanks, jet fighters, ammo and machine guns sold around the world, 75 cents goes to a U.S. company.

Saudi Arabia, Iraq, Egypt, the Congo and Israel are among the many countries receiving arms from the United States, often purchased with funds borrowed from the U.S. government.

It’s a good thing the mainstream media doesn’t give much attention to the harm done by the guns we sell abroad. Judging from the space the media recently devoted to condemning Russia for giving or selling Ukrainian rebels the rocket they used to down a commercial jetliner, coverage of the people killed with U.S. weaponry would crowd out all other news every day of the week.

At more than $660 billion a year, we dedicate more money to military spending than the next nine largest military spenders combined. We spend more than three times as much as number two on the list, China, even though the population of China is more than four times ours. Let’s do the math: the autocratic Chinese spend about $139 per person per year on their military; the United States spends about $2,032 per person. (I’m using 2013 figures from the Stockholm International Peace Research Institute, which I first found in a Wikipedia article.)
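The per-person figures are easy to check yourself. Here is a minimal back-of-the-envelope sketch; the $660 billion U.S. total comes from the paragraph above, while China’s total (roughly $188 billion in 2013) and both population figures are approximations I am supplying:

```python
# Rough check of the per-capita military spending figures cited above.
# Only the U.S. total ($660 billion) appears in the text; China's total
# and both populations are approximate figures supplied for illustration.
us_spending = 660e9          # U.S. military spending, 2013, in dollars
china_spending = 188e9       # China's military spending, 2013 (approximate)
us_population = 325e6        # approximate U.S. population
china_population = 1.36e9    # approximate population of China

us_per_capita = us_spending / us_population
china_per_capita = china_spending / china_population

print(round(us_per_capita))     # about $2,031 per person
print(round(china_per_capita))  # about $138 per person
print(round(us_spending / china_spending, 1))  # U.S. outspends China roughly 3.5 to 1
```

The results land within a dollar or two of the $2,032 and $139 cited above; the small differences come from the rounded population estimates.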

The United States thus bears the major responsibility for the flood of weapons that help national and regional problems turn violent all over the world.

The violence doesn’t stop at our borders. Our militarism abroad runs parallel to our dedication to guns at home. The United States has the largest number of privately held guns per capita of any nation, almost one per person. Just as with military spending and weapons exports, our private ownership of guns far surpasses that of any other country in the world. We have 97 private guns per 100 people; no other nation has as many as 60 guns per 100 people.

More guns lead to more deaths and injuries from gunfire in the United States than in any other industrialized country. Only countries at war see more of their people killed and injured by guns than the United States does.

It seems as if we worship guns and gun ownership. State legislatures and dubious court decisions have loosened gun control laws over the past two decades. After every bloody mass murder, more states pass laws making it easier to own and carry a gun than pass laws toughening gun control. Every week, the media covers protests by gun owners who think their rights have been squeezed or who want to assert new rights to tote guns: sometimes they march into a fast food joint, sometimes onto a university campus. Very few politicians—and virtually none in the South—will come out in favor of gun control for fear that the gun lobby will pour money into an opponent’s campaign.

The funny thing is, our reverence for the weapon both inside and outside the boundaries of our country plays to a stridently vocal minority. Only about 40% of the population has a gun in the home. Many surveys show that the number of gun owners is falling—but that each owner has more guns in his or her possession. Our gun sales abroad primarily benefit the gun-makers, who are delighted by the subsidies that U.S. loans in support of arms sales represent.

What we have, then, is a society dedicated to guns and an economy in which making and selling guns play an outsized role. Instead of singing “the land of the free and the home of the brave,” we might more accurately end the “Star-Spangled Banner” with “the land of the gun…and the home of the gun sale.”

A TV commercial subtly suggests cannibalism, another makes fun of those with disabilities

Two commercials currently on TV are making me—and probably most other viewers—squirm with discomfort. Both are meant to be funny, but once explained, the logic behind the humor may turn stomachs.

The first is a spot for Lay’s potato chips that opens with an animated version of the classic Mr. Potato Head toy getting home from work. He can’t find his wife anywhere. He hears a strange crackle and then another. He follows the sounds until he sees his wife hiding in a room with a bag of Lay’s potato chips, munching away. She is suitably embarrassed at what amounts to an act of cannibalism, but the commercial explains that the chips are so delicious they are irresistible. The last shot shows Mr. & Mrs. Potato Head snacking on the chips, both with a look of mischievous glee on their faces—they know they are doing a naughty thing, but it just doesn’t matter.

The scene is reminiscent of Jean-Luc Godard’s masterpiece, “Weekend,” at the end of which the main female character sucks on a bone from a stew prepared by the revolutionary who has forcibly made her his concubine. “What is it we’re eating?” she asks, to which the punky gangster answers, “Your husband.” She has the last line of the movie: “Not bad…” and then keeps gnawing on the bone.

Eating another being of your own species is generally considered to be an abomination. Although the Potato Heads are not humans, they are stand-ins for humans with human emotions and aspirations, just like the various mice, ducks, rabbits, dogs, foxes, lions and other animals we have anthropomorphized since the beginning of recorded history. From Aesop and Wu Cheng’en to Orwell and Disney, authors have frequently used animals as stand-ins for humans in fairy tales, satires and children’s literature.

So when Mrs. Potato Head eats a potato, it’s an overt representation of cannibalism—humans eating other humans.

The advertiser is trying to make fun of transgression, to diminish the guilt that many people on a diet or watching their weight might feel in eating potato chips, which, after all, are nutritionally worthless. But behind the jokiness of a potato eating a potato chip stands more than the idea that it’s okay for humans to eat them. The implication of having a potato stand in for a human while eating other potatoes is that we are allowed to do anything transgressive, even cannibalism—everything is okay, as long as it leads to our own pleasure. The endgame of such thinking is that our sole moral compass should be our own desires.

Thus the Lay’s Potato Head commercial expresses an extreme form of the politics of selfishness, the Reaganistic dictate that everyone should be allowed to pursue his or her own best interests without the constraint of society. Like the image of the vampire living on the blood of humans or of the “Purge” series of movies in which people are allowed any violent action one night a year, the Potato Head family eating other potatoes that have first been dried, processed, bathed in chemicals, extruded and baked symbolizes and justifies what the 1% continues to do to the rest of the population.

And it’s a happy message, too! We don’t get the sense that it’s a “dog-eat-dog world in which you have to eat or be eaten.” No, Lay’s presents the gentle Reagan version: you can do anything you like to satisfy your selfish desires (no matter whom it hurts).

The kooky image of potatoes as cannibals may be funny, but I can’t imagine anyone is laughing at the Direct TV series of commercials that present human beings as string puppets who trip over furniture and get caught in ceiling fans.

To sell the fact that Direct TV—a satellite television service—can operate without wires, these commercials start by depicting a normal-looking character complaining about the wires in the entertainment system or expressing delight that he has Direct TV and therefore can go wireless. At this point in the several versions of the spot I have seen, we are introduced to another member of the family who is a string puppet. As the normal character stammers about how wireless is okay for people but not when it comes to TV, the string puppet bounces around, hands and fingers flapping, shoulders hunching together and legs and knees dangling, until it trips or gets hung up in the fan or something else that is supposed to be funny happens. But it’s only funny if one enjoys the cruel humor of slapstick and forgets that the string puppet is supposed to be part of the family—in other words, a real human being with a challenging disability.

Direct TV has a long history of commercials that make fun of its audience, such as the idiot who fails to inherit a mansion, yacht and major stock portfolio but cries with glee because his rich deceased relative has willed him the Direct TV package. But the string people in these new Direct TV spots are not buffoons, not stupid, not venal, not pompous or supercilious. No, the trait that the spot exploits for humor is that they are disabled.

The commercial tries to extract humor out of mocking people with disabilities. No wonder everyone with whom I have watched this spot has turned away with a disgusted expression.

Nothing connects these two commercials except the bad taste that led to their conception and broadcast. The Direct TV commercial has no political or social subtext; it’s a juvenile effort to make a joke at the expense of people with physical challenges. The Mr. Potato Head cannibalism commercial, however, seems to offer a fable about the relationship between the haves and the have-nots—or in this case, those who eat and those who are eaten. The fabulist is interested in selling products and making consumers feel good about the process of consumption, even when it is transgressive. Some may call it an overturning of traditional morality. I call it business as usual in a post-industrial consumer society.

George Mason professor tries to play Washington Generals to Thomas Piketty’s Harlem Globetrotters

Another transparently deceptive article on wealth inequality by George Mason economics professor Tyler Cowen has me wondering if Cowen has decided to play Washington Generals to Thomas Piketty’s Harlem Globetrotters.

The Harlem Globetrotters is an exhibition basketball team known for its entertaining feats of dribbling, passing and scoring, often to a catchy version of the 1920s jazz standard “Sweet Georgia Brown.” The Globetrotters have rarely lost, thanks to the fact that they usually play the Washington Generals, an exhibition team put together for the sole purpose of serving as the Globetrotters’ on-court foil.

Over the past several months, Cowen has published a number of articles that have tried to refute the main premise of Piketty’s Capital in the 21st Century, which is that inequality of wealth and income is growing in the world. His obviously fallacious reasoning makes me wonder if Cowen has decided to play the Washington Generals as his contribution to disseminating Piketty’s important theories. Just as the Generals’ weak defense has allowed such stars as Wilt Chamberlain, Connie Hawkins, Meadowlark Lemon and Curly Neal to wow spectators, so Cowen’s weak and typically devious arguments have made Piketty look good (as if the spot-on and factually scrupulous Piketty needed any help!).

First Cowen made a feeble attempt in Foreign Affairs to prove that wealth doesn’t tend to concentrate. Instead of looking at the class of the wealthy, Cowen zeroed in on wealthy individuals, pointing out that old fortunes like those of the Rockefellers and the Astors get diluted over time. If he had instead looked at the wealthy as a class, Cowen would have seen that Piketty is right to conclude that inequality has increased, because that is what the numbers say. Call this flaw in reasoning a failure to think in terms of class.

Cowen is at it again in a Sunday New York Times business article in which he claims that even though inequality is rising in many countries, it is easing globally. Cowen presents no statistics to prove the point, but gives a bunch of reasons why it must be true. Most of his reasons turn out to be trends that do act against greater inequality, but do not change the overall flow of wealth away from the poor and middle class and to the wealthy.  Yes, Cowen is right to say that international trade has improved the standard of living in developing countries, but the fact that there are more middle class people in China and fewer in the United States does not address the question of whether inequality is growing or not.

In Capital in the 21st Century, Piketty provides statistics that demonstrate that the wealthiest are grabbing a greater share of the wealth and income pie than they used to in every single country of the world.  The most extreme difference in wealth and income between the top one percent and everyone else is currently in the United States. So the fact that there has been some movement up the economic ladder for some people in some non-western countries does not mitigate the overall picture of growing inequality in the world.

Cowen commits the same logical flaw in his New York Times piece as he does in the Foreign Affairs article: instead of looking at the totality of the statistics, he looks at individual subsets, from which he draws a generalized conclusion. In a metaphorical sense, Cowen’s reasoning is similar to a 2-3 zone defense with slow guards, which leaves a basketball team vulnerable to both the three-point shot and drives to the basket. In other words, the careful reader, or anyone who has read Piketty’s book, watches Cowen trip himself up with his own words.

But the mainstream media loves deceptive arguments and outrageous statements if they support the free market or advocate against higher taxes. That explains why Cowen’s mostly nonsensical articles have found favor in The New Republic, the Wall Street Journal, Forbes, Newsweek, the Wilson Quarterly, Foreign Affairs and the New York Times. In fact, three years ago Business Week declared Cowen to be “America’s hottest economist.” That’s kind of like the newsletter of the corporation that owns the Washington Generals declaring the team the “best professional basketball team” of the century.

Except, of course, for all the others.

Former patriot of the year puts money ahead of country—but isn’t that the American way?

It turns out that Heather Bresch is as much a patriot as she was a student.

Ms. Bresch is chief executive officer of Mylan, Inc., a large maker of generic prescription drugs, which recently announced that it is buying a unit of Abbott Laboratories for the purpose of reincorporating in the Netherlands and enjoying lower taxes.

As Andrew Ross Sorkin detailed in a New York Times article titled “Reluctantly, Patriot Flees Home,” Ms. Bresch was the recipient of a “Patriot of the Year” award from Esquire magazine, one of the literally thousands of bogus awards given by nonprofit organizations and the news media to corporate executives every year. Bresch won the award not for acts of valor or self-sacrifice, but for having the connections to lobby for the Food and Drug Administration Safety and Innovation Act of 2012, which gives the FDA the authority to collect user fees for the drug and equipment reviews it conducts.

Let’s grant Ms. Bresch the benefit of the doubt and assume that, unlike virtually every other instance of an industry initiative to regulate itself, Bresch’s proposal was not a watered-down version of what should have passed if Congress truly had the best interests of the public in mind. But even making the implausible assumption that she acted altruistically, Ms. Bresch has certainly not behaved as a patriot in her massive tax avoidance scheme.

A true patriot pays taxes when represented—and Bresch is ably represented, in part by her father, a Democratic Senator and former Governor of West Virginia.

A true patriot looks at the state of our roads, the state of our education system, the diminishing sums for medical and other research, the high price of college, the staggering poverty in the land of plenty and then does what he or she—or “it” in the case of corporations—can do to help. Instead, Ms. Bresch and her company, like Pfizer, AbbVie, Tyco, Walgreens, Medtronic, Chiquita and dozens of other companies, decided to buy a smaller foreign competitor and renounce American citizenship to take advantage of a gaping loophole in the U.S. tax code.

Ms. Bresch is an old hand at not being what she seems. For years her résumé said she had earned a Master of Business Administration degree from West Virginia University. In 2007, the Pittsburgh Post-Gazette called WVU for a routine fact check after seeing a news release announcing Bresch’s appointment as Mylan’s chief operating officer, only to learn that Ms. Bresch did not in fact have the degree. What happened next bordered on low slapstick: WVU said Bresch had not earned her MBA, then called back days later to change its mind. In the interim, the university awarded Bresch a retroactive MBA even though she was some 22 credits short of a degree that requires 48 credits. To do so, higher-ups gave her grades for courses in which she had received “incompletes” and added six additional courses to her academic record. The Post-Gazette had a field day reporting on WVU’s cooking of the academic books to award a bogus degree and the university’s subsequent weak attempts to cover it up. Heads rolled throughout the university.

Forgotten in the academic scandal that dominated the news media in West Virginia and western Pennsylvania for months were two things:

  1. There must have been enormous political pressure on WVU for so many of its administrators to behave so unethically. It is unclear where that pressure came from, given that the Governor at the time was Heather’s daddy and the chairman of Mylan at the time was WVU’s largest donor.
  2. Bresch lied about having earned an MBA and continued to lie even after the Post-Gazette called her on it.  (Denying the truth may be the modus operandi at Mylan. About two years later, then CEO Robert Coury insisted that an FDA investigation had ended even after the FDA said it was ongoing.)

Sorkin, who neglects to mention Bresch’s past brush with resume-padding, expresses surprise that the “patriot” acted so unpatriotically, but it makes perfect sense to me. In the United States, we learn that the most patriotic thing to do is to open or run a business. We also learn that a business is supposed to maximize profit for its shareholders in a legal manner, no matter how ethically repugnant that may seem. Lay off thousands of workers so that profit margins increase. Leave communities to chase cheaper labor and laxer environmental regulations. Suspend manufacture of needed pharmaceuticals because the profit margins are too narrow. Buy smaller foreign companies and move abroad to avoid taxes. All of it is “just business,” in the words of fictional businessman Michael Corleone.

The syllogism is perfect:

  • Running a successful business is patriotic
  • Business rewards amoral if legal conduct, as long as it produces a profit
  • Abandoning the U.S. and denying its government millions of dollars that could be used for safety, education, infrastructure investment, protection of the weak and security is good business.

THEREFORE

  • Abandoning the U.S. is patriotic

Those who think I’m just joking haven’t followed the past 35 years of the federal government facilitating the globalization of large American businesses to the detriment of U.S. workers and communities. It’s this record that makes me doubt that Congress will hear the cries of “unfair” that many are making and change the law so that any company that makes money in the United States has to pay the U.S. tax rate no matter how the business structures itself or where it locates its headquarters.

If you want to sell to U.S. consumers, you should have to pay U.S. taxes at the same rate as domestic companies. That’s only fair, but fairness has nothing to do with business, nor, in the age of the politics of selfishness, does it have anything to do with either governance or tax policy.

Anti-tax sentiment in the 17th century was anti-war; today, it’s pro-wealthy

Reading about the 17th century in Geoffrey Parker’s Global Crisis really helps one understand our current challenges. Parker’s thesis is that the extreme weather conditions across the world in the 17th century tipped what would otherwise be normal political disruptions into rapid social, economic and political decay. The Fronde revolt in France, the Thirty Years War in Germany, the Great Revolution that led to the temporary overthrow of royalty in England, the Time of Troubles in Russia, the violent end of the Ming Dynasty and establishment of the Qing in China—these are just some of the major revolutions and wars that seemed to cluster around the middle of the 17th century, leading to serious population losses virtually everywhere.

Parker makes a compelling but not airtight case that the famines and extreme weather caused by what historians call “The Little Ice Age” did affect human societies enough to worsen all social and economic tensions and push many situations over the edge into cataclysmic upheaval.

But even if we discount or reject Parker’s climate change thesis, we can still learn a great deal from Global Crisis that applies to today’s world.

Take the topic of taxes, for example.  Parker shows that throughout the world in the 17th century rulers and their administrators collected and raised taxes for two purposes:

  1. To fund the extravagant lifestyles of royalty
  2. To fight wars of territorial conquest

No wonder there were literally hundreds of major and minor tax revolts throughout the entire world, and especially in Europe, during the middle war-torn decades of the 17th century! Who would want to pay for useless wars and the high life of the nobility?

Tax revolts have a storied and honorable history during the long and bloody era of royalty, including our own revolt against the British. Keep in mind, though, that the American colonies were not opposed to taxes, merely to being taxed without representation.

Fast forward to today and our far more complex post-industrial society.  In light of the strong historical connection between anti-tax revolts and warfare, isn’t it truly bizarre that the only budget item that none of our advocates for lowering tax rates want to cut is the military? In fact, virtually everyone who wants to lower taxes is also in favor of increased military spending.  They will gladly cut back spending on education, unemployment insurance, the space program, medical research, safety inspections, IRS audits and everything else the government does, but not on guns, bombs and ammo.

In the 17th century, tax protestors and rioters were mostly outsiders—peasants, merchants and minor nobles who objected to paying for foreign wars. By contrast, since the ascent of Reaganism and the politics of selfishness, most of those in favor of lowering taxes and against raising them to meet pressing needs are members of the establishment—rich folk like Pete Peterson, the Koch brothers and executives of large corporations plus their congressional factotums. And while they talk about lowering taxes as a general mantra, when you take a look at their tax proposals, they invariably call for lowering taxes on just two groups: the wealthy and corporations.

The rich control the news media, the multitude of think tanks that advise elected officials and the political process, which explains why the idea that taxes are bad is now so ingrained in the public consciousness. Anti-tax fever has gotten so bad that Congress cannot even pass an adequate law to fund the repair and upkeep of our highways. Members of Congress either are afraid to pass a higher gas tax or are so adamantly against any tax that they just don’t care how much our roads deteriorate.

No one likes driving through potholes or over bridges that need structural work. Providing adequate permanent funding for our highways creates jobs and will lead to faster and more energy-efficient travel. To the degree that the tax would discourage driving, it may also help clean up the atmosphere.

Yet no one—not even President Obama—will come out in favor of raising the gas tax or raising other taxes to fund highway repair and maintenance. Every elected official is as afraid of anti-tax frenzy as they are of the National Rifle Association.

Some may point out that a gas tax assesses everyone and goes against my basic premise that anti-taxers are really just interested in lowering taxes for the wealthy. Let me explain: the incessant call to lower taxes, which has dominated economic discussions since about 1980, has created an atmosphere in which the default position is to hate all taxes—new, old, general or earmarked. The debate in the marketplace is about all taxes, but the bills that are passed typically give all or a large part of the tax breaks to the wealthy and corporations. We could make a new gas tax progressive by giving poor people gas rebates on income taxes, of course, but first we have to pass a new gas tax. And that’s nearly impossible in the current anti-tax environment. Meanwhile, we keep funding our senseless, goalless wars by borrowing money from the wealthy.

Let me close with a sarcastic shout-out to the New York Times, which found room in its shrinking print pages for an extensive story on a scientist who denies that climate change is occurring. I’m guessing that it’s part of a series of personality pieces on climatologists and that the series will reflect current scientific opinion, so that in each of the next 200 editions, the Times will do in-depth studies of scientists who support the reality of global warming. 200 for and 1 against will just about represent the true balance of opinion among scientists.

Or maybe today’s article is the first in a series on scientists who speak against the overwhelming flood of facts on issues that were decided years ago: Next week, the Times might feature someone who thinks the sun revolves around the Earth, then move on to someone who believes in spontaneous generation, someone who still thinks phlogiston causes things to burn, someone who believes that vaccines cause autism and someone who thinks that only gays can get AIDS.

I’m fairly confident, though, that today’s feature about one of the small number of scientists who deny climate change is not the start of a special series but rather a continuation of the Times’ and the mass media’s decades-long pandering to those advertisers who gain by postponing the changes we as a society will have to make to steer a peaceful and bloodless transition to the much warmer world of the future.

Racism reborn as theories on Western superiority

We’re seeing more theories exploring the reasons why the West dominates the world order or why the West has developed a more advanced culture. In 1997’s Guns, Germs and Steel, Jared Diamond tried to show that geography determined when and which cultures would dominate the globe at any given time, blaming the fact that “the literate societies with metal tools have conquered or exterminated the other societies” on the three items in the title of his book.  Several scholars like Bernard Lewis have made a living touting the superiority of Western culture and telling us why. A recent Economist took a reverse gambit, dedicating a long article to why Arab culture has failed—failure of course measured as an inability to move towards a Western political and economic model.

Most of these theories assume that the following traits define Western superiority:

  • A free market system
  • Free trade across borders, with restraints on the movement of labor
  • A representative democracy
  • An industrialized, and now post-industrialized economy
  • A consumer society built on cars, cell phones and disposable clothing

The latest to proclaim and then explain Western superiority is Nicholas Wade, whose A Troublesome Inheritance describes research that found minor differences in the genetic makeup of Asians, Caucasians, sub-Saharan Africans, Native Americans and the original inhabitants of Australia and Papua New Guinea. He assumes, without any proof or explanation, that these differences explain the differences in the cultures of these peoples and the superiority of white ways.

Too bad that the premise of Wade and of all of these writers—that the West has forged a superior way of life—is false.

One argument against the assumption of Western superiority is to point out the ills caused by Western ways: an epidemic of obesity and diabetes, resource shortages, the mass extinction of species, and human-made global warming.

But I prefer to do the numbers.

We have to start with a measurable standard. I know a lot of readers are going to go for standard of living or gross domestic product per person, but consider this: The only goal in the broadest of all unfolding histories—evolution—is survival. I’m going to assert that the best measurement of surviving is the size of the population.

And the Chinese win, hands down. When population historians analyze every extant population survey of different cultures, countries, continents and parts of the world, they find that at every point in the recorded history of humankind there have been more Han Chinese than any other race, culture and/or nationality. That’s 10,000 years of continual Chinese demographic superiority, even when they seemed to be under the paw of Western Europe militarily and economically.

As obnoxious as the idea of Western superiority is the very notion that we are in some kind of world competition that is culturally or racially based.

The very assumption of Western superiority is inherently racist, as is the sometimes frenzied search for proof that the races are different. It’s true that today the West seems to dominate the world and its cultural aspirations, as Greece and Rome once did for the Mediterranean world and Persia once did among the myriad cultures of the Asian subcontinent. But Asia, the Middle East, the subcontinent and Europe/America have all taken turns being the dominant economic and cultural power over recorded history. To take one moment and call it the end game of all history didn’t work when Marx tried it and it didn’t work when Francis Fukuyama tried it. And it doesn’t work when social thinkers say the West is superior just because it may have dominated and forcibly led much of the rest of the world for much of the last 300 years. This phase will pass as surely as did the Tang Dynasty, Genghis Khan’s empire and the Mughal empire in India.

Ranking the presidents since World War II shows what a sorry lot they have been

A recent survey found that a sampling of about 1,300 Americans rank Ronald Reagan as our best president since World War II and Barack Obama as the worst—just nosing out that supreme incompetent George W. Bush AKA Bush II.

I’m not sure what goes into the thinking of most people, but if we judge the presidents on the good and bad they did, the direction into which they guided the country and the competence with which they led, Reagan should rank as the third worst president since World War II—and alas, also the third worst president ever.

Let’s start with our worst president since Roosevelt and also our worst president of all time—and it’s not even close. Harry Truman earns this dubious distinction by virtue of ordering the dropping of atom bombs on Hiroshima and Nagasaki. People make excuses for these barbarous acts, which led to the slaughter of the largest and second largest numbers of human beings in a day’s time in recorded history. Apologists say that Truman saved more American lives than the bombs took, which is absurd on its face, since Japan was already reeling and had already proposed virtually the same terms that it accepted at the final surrender. Estimates range from 150,000 to 250,000 killed by the only two atom bombs ever used on human beings. How could subduing Japan with conventional airstrikes on munitions factories and military bases have taken as many lives? The almost smarmy assertion that dropping the bombs saved lives also neglects the fact that the American lives supposedly saved were soldiers, whereas most of those actually killed at Hiroshima and Nagasaki were neither soldiers nor workers in war factories, but innocent civilians.

Outside of dropping the bombs, Truman’s record is pretty shabby: He helped to start the cold war. He selected nuclear power over solar as the primary energy source for the government to support. He nationalized steel factories to stop a strike. He let Joe McCarthy walk all over the country and tacitly approved the red scare.

Let’s move on to Bush II. Rating Bush II as a worse president than Ronald Reagan is a tough call, because they are the two ideologues most responsible for the economic mess we’re in. In a sense, Bush II completed the Reagan revolution.

But Bush II led an incompetent regime that pretty much botched everything it touched.  His team was asleep at the wheel when the 9/11 attacks hit. The response included two of the most ill-conceived and expensive wars in history, two wars that destabilized the powder keg that is the Middle East and led to a worldwide loss of trust in and respect for the United States. Bush II established a torture gulag across the globe and a spy state at home. His tax cuts starved the country of much-needed funds to invest in the future and help the needy. His handling of Hurricane Katrina displayed both incompetence and disregard for suffering.

Any discussion of Ronald Reagan should start with the fact that he and his team were traitors who should have been placed on trial for crimes against the United States. I’m referring to the deal with Iran which kept our hostages in captivity for months longer than they had to be, only so Reagan could defeat Jimmy Carter in the 1980 election. What the Reagan Administration did for Iran in return seems unconscionable to a patriot: we sold it weapons of war. And what did Reagan do with the money from arms sales to a country the president said was our enemy?  He funded a civil war in Nicaragua.

Even without this treachery, Reagan would still rank among our three worst presidents of all time. He was the leader of the turn in American politics around 1980 that has led us down a disastrous path. The economic plan of Reaganism called for and produced an enormous shift in wealth from the poor and middle class to the wealthy over a 30+ year period that continues. His game plan included all the reasons the rich have so much and the rest of us are struggling: lowering taxes on the wealthy and businesses; weakening laws that protect unions; privatizing government services; cutting social services; and gutting Social Security.

Reagan also asked the country to stick its head in the sand, ostrich-like, and ignore how our fossil-fuel dependent economy was degrading the earth and threatening our future.

Now that we have disposed of the truly incompetent and/or evil presidents, I want to reverse the order of presentation by naming Lyndon Baines Johnson as the best president we have had since FDR.  If we take away the Viet Nam War, it’s an easy call—Johnson would rank with Lincoln among our greatest leaders.  He passed the Civil Rights Act, Medicare and Medicaid. He started food stamps, work study, Head Start and a slew of other anti-poverty programs that worked, no matter how much right-wingers want to rewrite history. He passed the most generous education bill and the strictest gun control law in American history. Under Johnson, the space program thrived, and it was only a cruel twist of fate that postponed the first moon landing until early in Nixon’s first term.

Of course there have always been stories afloat about Johnson fixing elections early in his career or practicing crony capitalism (as if any president since Andrew Jackson hasn’t?). But that he was essentially a decent man comes out again and again, and especially in that transcendent moment when he learned that the FBI was spying on Martin Luther King and he hit the roof and ordered it stopped immediately. This ultimate wielder of power knew better than most that power must be restrained in a free society.

Unfortunately, there is the Viet Nam War, which he inherited from Eisenhower and Kennedy and bequeathed to Richard Nixon.  Viet Nam crystallized all the contradictions of America’s Cold War policies: imperialism parading as idealism, exaggeration of the threat from the Soviet Union and an inability to view the world from any perspective other than that of large multinational corporations.  I don’t mean to absolve Johnson—he made the decisions to escalate and bomb. The war is a major flaw that disfigures Johnson as a historical figure and sullies the rest of his accomplishments.

After Johnson, I select two presidents who were pretty mediocre, but ruled over good times, made no enormous blunders and led competent administrations that did a fairly good job of running the country on a day-to-day basis and responding to the occasional disaster. If you read the labels most pundits put on these two men, you would think they were miles apart on the political spectrum, but if you instead review their stands, you find them fairly close indeed. Both were centrist on social policy and both continued the imperialistic foreign policy that has guided the country since Roosevelt.  I’m talking about stodgy Republican Dwight Eisenhower and rock-star Democrat Bill Clinton. I personally favor Clinton because he tried to pass universal health insurance and presided over a relative shrinking of the U.S. military and U.S. militarism.

How is it possible that the evil genius of Richard Nixon can rank as high as fourth among recent presidents? His illegal actions in Southeast Asia and extension of the Viet Nam War were disgraceful. His dirty tricks and domestic spying shook the country by being the first visible signs that technology and centralized power could quickly reduce us to a police state. But Nixon also opened China, set wage and price controls, continued Johnson’s poverty and education programs and established the Environmental Protection Agency and Occupational Safety & Health Administration. He also ran a competent administration that responded with reason and rationality to most challenges, except, unfortunately, the war and Nixon’s political intrigues.

Nixon was a despicable human being by virtually all accounts, so it’s a little painful to rate him above four essentially likable men, none of whom had the competence to pursue their agendas: Carter, Obama, Kennedy and Ford. I find parts of the vision of all four of these men problematic: Carter was in favor of globalization without protections for U.S. workers or the environment. Obama is basically a pro-business, anti-union liberal who shares the consensus view that the United States should have special rights in world affairs. Kennedy was a militaristic cold-warrior who fervently believed in cutting taxes on his economic class—the ultra-wealthy. Ford was basically a continuation of Eisenhower and Nixon, a pro-business cold-warrior open to compromise with progressives on social issues.  None of these men had a great impact because none knew how to work the system like Johnson or Nixon.

That leaves us with Bush I, who is to Reagan what Ford is to Nixon-Eisenhower, a continuation. Bush I was a little more effective than Carter or Obama, but his policies kept us down the path to greater inequality.

Here, then, is the OpEdge ranking of presidents since 1945. Of these 12 white males, only three would rank in the top half of all our presidents. Again, I rate the bottom three as the three most disastrous presidencies in American history:

  1. Lyndon Johnson
  2. Bill Clinton
  3. Dwight Eisenhower
  4. Richard Nixon
  5. Jimmy Carter
  6. Barack Obama
  7. John F. Kennedy
  8. Gerald Ford
  9. Bush I
  10. Ronald Reagan
  11. Bush II
  12. Harry Truman

It’s the times that usually make the man or woman, and not the other way around. These men represented ideas that those with wealth and influence found attractive. Donors, their parties and the think tanks funded by big individual and corporate money shaped their views. It was General Electric money, after all, that helped turn Ronald Reagan from a New Dealer to the symbol of the politics of selfishness. None of these men would have found support if they didn’t buy into the basic premises of American foreign policy over the past century.

Since World War II we have made three major wrong turns as a country: The first was to create the cold war and continue to assert America’s divine right to intervene anywhere around the world at any time. The second was to ignore the threat of environmental degradation and resource shortages and build our economy on wasteful consumerism powered by fossil fuels. The third was to turn our back on the mixed-model social democracy that we began to establish from 1932-1976 or so and return to economic rules that favored the interests of the wealthy over everyone else’s. We probably would have taken these treacherous paths no matter who we had elected president.

Happy July 4th!