Devin Nunes waves a piece of paper with lies on it, accusing a government office of bad behavior. We’ve seen it before—Senator Joe McCarthy during the Red Scare

The Nunes memo is nothing more or less than a technique of McCarthyism. We’ve seen politicians wave around pieces of paper containing lies before in American history. Even when successful, the results of the gambit are usually short-lived, although, like the McCarthyism of the early 1950s, they can nonetheless hurt a lot of people. In the case of the Nunes memo, the gambit will thankfully likely recoil in short order on the person the memo is meant to protect—Donald Trump.

Like McCarthy’s famous list of communists, the Nunes memo began as a threat to produce a piece of paper that blows the whistle on a whole lot of bad stuff. Wisconsin Republican Senator Joseph McCarthy had in his hand “a list of 205—a list of names that were made known to the Secretary of State as being members of the Communist Party and who nevertheless are still working and shaping policy in the State Department.” That’s the exact quote from the first time McCarthy used the dramatic smear against the State Department that has come to symbolize “McCarthyism.” Nunes’ supposed cancer lurks in the Department of Justice, and in the Federal Bureau of Investigation (FBI) in particular.

McCarthy’s supposed traitors were trying to bring America to its knees through the spread of communism, while the contemporary traitors are trying to take down a presidency.

Both pieces of paper were part of broader strategies to achieve political ends. McCarthy wanted to be re-elected and the other prosecutors of the Red Scare wanted to turn the country against socialist solutions to national challenges, such as universal healthcare, civil rights legislation or investment in mass transit. Nunes wants to end an investigation that is looking more and more as if it will find that Donald Trump and his close associates colluded with the Russian government to win the 2016 presidential election and then sought to conceal their nefarious and probably traitorous deeds.

McCarthy never read from his list and, as far as we know, it had no names on it. But he did have an old FBI report, which he later misquoted, exaggerated and distorted to substantiate his initial accusations. The Nunes memo presents a few facts; its lies consist of the preponderance of evidence it leaves out and the false conclusions it draws.

What differentiates these two mendacious pieces of paper is that McCarthy’s list reflected perfectly the tenor of the times and was thus widely believed and supported. The entire spectrum of the American ruling elite, from the most conservative to the most liberal, and the mass media that reflected their interests were shaking in their boots about the possibility of a socialist revolution coming to the United States. But the contemporary ruling elite is divided as to the efficacy of continuing the rule of Donald Trump; it includes many who approve of and benefit from the policies of the current administration but put the rule of American law above their own self-interest. The mainstream media believed the McCarthy hogwash, but only the cynical and the true believers swallow the Nunes Kool-Aid.

It is unlikely that the Nunes memo will accomplish the long-term objective of ending the Mueller investigation, but it has accomplished one immediate goal: keeping the public’s mind off the fact that the current administration is defying the law by not enforcing the sanctions against Russia for interfering in the 2016 election, sanctions that Congress approved by overwhelming margins of 98-2 in the Senate and 419-3 in the House. If there is one thing the current administration is good at, it is governing by sleight of hand. The standard trick is for Trump to say or do something obnoxious that the news media focuses on, while the administration or Congress does something truly horrifying that gets less publicity. For example, Trump’s latest Twitter argument with a prominent minority floods the news, cannibalizing potential coverage of a new plan to dismantle environmental regulations. This time, Nunes is carrying the water.

Consider these numbers. A search of Google News over the past week produces 74,200 hits for “Nunes memo” and a mere 8,380 hits for “Russian sanctions.” All the mainstream television news shows and National Public Radio have opened with the Nunes memo and have given far more time to it than they have to the administration’s refusal to enforce the sanctions. Right-wing media outlets such as Fox News are shrieking that the Nunes memo shows the Mueller investigation must end, while completely ignoring Trump’s failure to implement the sanctions.

American history is full of politicians accusing various branches of government of nefarious behavior that didn’t exist. The post office has long been a bête noire of Republican privatizers. The GOP constantly accused the Obama administration of overreach. Trump and other Republicans accused the national security apparatus of failing in its handling of immigrants during the election. Then there was McCarthy. So the Nunes accusations represent nothing new or even rare in American politics.

But I can’t recall another example of a president so completely ignoring the will of Congress since Andrew Johnson instigated his own impeachment by firing Secretary of War Edwin Stanton after Congress had passed a special law to prevent him from doing so. What did Stanton do to make Johnson defy Congress? He was aggressively implementing Reconstruction, the program meant to bring civil rights to all citizens of the states that had recently tried to secede from the Union. Johnson, a former slave owner, was against sending troops to the South to protect the rights and lives of the recently freed or their southern supporters.

Trump’s position is equally obnoxious and anti-American, and perhaps treasonous. He doesn’t want Russia to suffer for interfering in our elections, certainly because it helped to elect him, and seems to welcome its help in 2018. That more Republicans seem interested in ending the Mueller investigation than forcing Trump to obey a law that achieved rare bipartisan support demonstrates that the GOP has become a stinking, putrefying corpse of corruption. Rotten to the core, with the foul stench starting from the head, as it always does.

Government privatization of services doesn’t only fail in the United States, as the experience of Great Britain is demonstrating

The New York Times is giving us new evidence that privatization of government services is a failed concept. The Times reports that Great Britain’s decades-old experiment with privatizing government services is failing. Privatized facilities for the elderly and the disabled have run into a slew of abuse charges in the recent past. Moreover, a report by the British government found that over the next 25 years schools could cost 40 percent more, and hospitals 70 percent more, if run by private firms instead of through the government. Sounds like a typical privatized American prison, which costs more to operate than the government facility it replaced.

Why anyone ever thought that privatization of government services would lower costs and improve quality is beyond comprehension. When the government does something, its chief concerns are quality of service and cost to taxpayers. But once a private company gets involved, another factor enters the decision-making process: profit, which in the private sector is primarily split between owners and senior management. That profit has to come from somewhere, and it does: from the total pool available for providing the service—from salaries, equipment, supplies, transportation and facilities. Whatever money is set aside to provide the services, the cut taken for profit will diminish it.

But wait, privatizers say. The private sector will run things more efficiently.

But how? Through economies of scale, which assumes that private companies will have the purchasing power of the federal government or most states. For the most part, that’s just not true. And in those rare cases in which a large private sector business might have an edge over a local government in purchasing supplies or maximizing the productivity of equipment, that government can always band together with other municipalities to buy supplies or share technology and staff.

As it turns out, it’s not economies of scale on which privatizers depend; it’s cutting the cost of labor. Typically, virtually all employees of privatized government services receive lower compensation than their government-paid counterparts. Why? Because privatized employees generally aren’t in unions, while government employees often are. So what, you might ask? Who cares how an organization splits the pie, as long as the service is provided at a high level of quality and costs taxpayers as little as it has to? There are unfortunately two flies in this ointment: 1) Paying lower salaries will attract less qualified employees; 2) Cutting the salaries of large numbers of people—unionized or not—drives down the entire wage scale of an economy, which leads to all the problems that inequality of wealth brings, including an increase in asset bubbles and recessions, a decrease in the possibility of individuals moving up the income ladder and anti-democratic distortions to the political system.

(The exception to the rule that a privatized worker will make less than a government worker is the military, for which privatization brings on other problems such as a lack of loyalty of the mercenary to the values of the U.S. armed forces and pressure by privatizing lobbies to instigate or continue wars so that the profit train keeps running.)

But wait, privatizers say. The private sector is more likely to innovate, and those innovations will lead to higher quality and lower costs. That’s not the way it has worked out in real life. In fact, when researchers Christopher Lubienski and Sarah Theule Lubienski ran the numbers, they found that one of the major reasons public schools outperform private schools (when adjusted for poverty and disabilities) is that public schools are more innovative, introducing more new teaching techniques and technology than private schools do. (The other reason, FYI, is that public school teachers are more experienced and participate in more continuing education classes than private school teachers. Makes sense, since paying more attracts better employees—that’s the American way!—and if private schools can cut teacher professional development, they can produce more profit.) No one has found any innovations at private prisons, except perhaps in the area of information technology, which would occur at the governmental level, too. The privatized section of the armed forces has access to all the advanced technology it wants—all developed by the U.S. military!

But wait, privatizers say. Privatization ends the special interest group politics surrounding government programs. That assertion is also belied by the facts. What happens in the real world is that the industry offering the privatized services becomes another special interest that finances and influences politicians. Teachers’ unions lobby for higher salaries and smaller classes, both of which lead to better outcomes for students, especially in the elementary school years, at least according to the research. The prison industry lobbies for longer prison sentences, high bail and round-ups of undocumented immigrants, all to fill its jails. The defense industries lobby for higher military budgets and more military excursions. For those dear readers who don’t see the painfully obvious difference, let me explain: what the teachers want helps society; what private prisons and military contractors want does not.

But wait, privatizers say. The private sector always does it better than government, by definition. Now that’s just a lie, as a landfill’s worth of evidence demonstrates. All we have to do is compare the cost and outcomes of the American system of healthcare insurance and delivery to those of every other western democracy, all of which have one form or another of single-payer healthcare. We rate first in costs and close to last in infant mortality and life expectancy. BTW, some nationalized healthcare systems like Germany’s do find a place for private, highly regulated health insurance companies. Not surprisingly, the most nationalized parts of the American system—Medicare, Medicaid and, before Bush II and the Republicans gutted its budget, the Veterans Administration—do the best job on costs and quality.

Is it possible that government control or ownership works best for the delivery of all goods and services? Based on the evidence of the Soviet Union and its satellites, it would be hard to make that assertion.

On the other hand, many types of industries seem suited to government control—certainly education, prisons, the military and probably healthcare. One key similarity of these enterprises is that they require large numbers of people who interact intimately with those served. While a telecommunications company or a solar panel manufacturer may require thousands of employees, technology, facilities and equipment are at least as important to the business as people. A phone company sells phone service delivered over landlines or on wireless frequencies. A school may use computers and science labs, but it sells teachers and teaching. A military sells armed forces (although modern warfare has increased the military’s dependence on capital goods).

Another similarity of the industries that have seen disastrous results from privatization (or, in the case of education, merely mediocre results) is that they all involve the entire public and the public good. No society since about 1850 can survive without universal education and literacy. Everyone needs healthcare. We build prisons and maintain armies to protect everyone. One can make a case that everyone needs electrical, telephone, water and natural gas service, too. Evidence is mixed as to whether government or the private sector most efficiently delivers these capital-intensive utilities, but we do know that when privatized they always require a lot of regulation to make sure that everyone has cheap, ubiquitous and reliable access to them.

A final similarity I see in the industries for which past experience demonstrates that government control beats privatization is that they are either mature industries, meaning that the market for their services will not increase except through population growth, or industries that it is in the public interest not to grow. We are certainly better off when we have less need for prisons and the military.

I suspect that a whole lot of industries would be better off if they were nationalized. Of course I do; I’m a democratic socialist. But the experience in the United States and elsewhere suggests that even the most extreme free-market conservative should see the benefit of centralized public education, prisons, healthcare, military, mass transit, roads and other services that governments routinely provide in most western nations. Except, of course, those right-wingers who hope to profit from privatization, or who do not believe that rich folk should be taxed so that everyone can enjoy the service in question, e.g., affordable and high quality education and health care, and who hope to use privatization as a Trojan horse to achieve that end.

Four decades of the politics of selfishness has turned America from a “winner take more” country to a “winner take most” country

In reading After Piketty, a collection of essays by leading economists in response to Thomas Piketty’s seminal Capital in the Twenty-First Century, I ran across the expression “winner take all” economy. The “winner take all” economy describes an economy in which a very few people get most of the income and therefore accumulate most of the wealth, kind of like today in the United States. More accurate perhaps would be to call it the “winner take most” economy, since our system of compensation, taxation and government benefits leaves scraps for the upper middle class and breadcrumbs for everyone else.

The “winner take most” ideology is a harsh one for several reasons. On the most obvious level, it creates a society in which there are winners and losers, and proposes to reward winners and make losers suffer. “Winner take most” ideologues typically attribute strident moral values to winning and losing. Winners deserve everything they get because they worked hard and are more talented, forgetting that people do nothing to earn the talent with which they are born, that much of “winning” results from the luck of having connections or money, and that others may work just as hard for less reward because they don’t have the connections or have less appreciated talents. Besides glorifying the rich, “winner take most” ideologues often demonize the poor. Not only are they losers—a negative attribute—but they’re losers for a reason. This view neglects both the importance of luck and the great costs exacted by having to deal with food, housing, education and other scarcities. My metaphor for summing up the importance of luck is to imagine Willie Mays turning 20 in 1850 instead of 1951.

Behind this harsh view of society as a human jungle rests the idea that everything can be reduced to the brutally immoral lowest common denominator of money, even self-worth and morality.

People like to compete and to win. There are winners and losers in wars and in games, which are civilization’s sublimation of the urge to wage war. The pursuit of winning often leads to new knowledge and improvements in how we play, in both games and real life. For example, Babe Ruth created the uppercut swing to get an edge—the home run—that helped his team win at about the same moment that Henry Ford reorganized production to lower the cost of his cars and thereby sell more cars than anyone else. It feels good to compete and it feels better to win. A less crass, less celebrity-addled society in which civic virtue and not private greed is the primary value may not have such a clear divide between the winners and everyone else, but it would nevertheless have winners who enjoy more material possessions and greater recognition of their worth as individuals.

The question therefore should not be whether creating a society of winners and losers is good or bad. The question should be: how much do winners and losers deserve? A slew of studies have shown that the top one percent and the top one tenth of one percent of income earners take a far larger share of income and wealth today than in 1980. The Congressional Budget Office, for example, estimates that since 1979, the income of the top 1% has grown by 275% before taxes (and 314% after taxes). By comparison, the incomes of the next 19%, the middle 60% and the bottom 20% have grown by 68% (73% after taxes), 38% (43%) and 41% (44%), respectively. Piketty shows that wealth has also skewed toward the top 1% over the past 40 years, the share held by one-percenters rising from 33.5 percent of all capital in 1979 to 54 percent in 2010. If you break out the top one tenth of the top one percent, the increase in wealth and income since the election of Ronald Reagan is even more extreme. Some studies conclude that these people got more money because they earned it by being better or more productive at what they did. But as Piketty and other economists point out, the top 1% and 0.1% are not dominated by the LeBron Jameses and Bruno Marses of the world, but by owners and operators of and investors in businesses who have taken advantage of changes in technology, tax laws and business customs to increase their share of the pie through greater profits, or “rents” as economists like to say. All have benefited from a political culture that has continually cut taxes on the wealthy and slashed government services and benefits for everyone else over the past four decades.

We have turned America, once a “winner take more” economy, into a “winner take most” economy.

Perhaps it’s because I have always been an avid game player and my son participated in many games and sports growing up, but when I reflect on whether we should as a society actively seek greater equity in income and wealth, I think of the concept of “participation” trophies, which are essentially keepsakes of having participated and not done well in a youth competition.

A few years back, the participation trophy was a minor motif in the mass media for a few weeks, as shrinks, economists, public intellectuals and pundits opined about the so-called danger of awarding a trophy to every child. Instead of bolstering the children’s self-esteem, many thought participation trophies made the children feel entitled to rewards and contributed to making them soft mentally and emotionally—kind of like the “snowflake” that is now a derogatory term for someone too sensitive to stand up to the pressures of the real world (of course, as defined by the accuser!). One female teenage athlete’s opinion piece in the New York Times in 2016 condemned participation trophies for making everyone believe they’re a winner, when in real life, there are few winners. Interestingly enough, a Google News search for “participation trophy” reveals that over the past year, only professional athletes have used the term in the mass media, always in the derogatory sense of labelling someone as the loser or potential loser of a competition.

Clearly these comments represent the views of winners. The worst-paid player in the major leagues will make $545,000 a year in 2018 (or roughly 5 times the annual salary of Mickey Mantle and Willie Mays—the two best players since World War II not tied to steroids), so even the worst players on the last-place teams are “winners” when it comes to harvesting the rewards of society.

When I first encountered participation trophies in the mass media, I remembered that in the 1990s organizers of the chess tournaments and baseball leagues in which my son participated often gave participation ribbons. I liked the idea of ribbons because they rewarded kids for their hard work and gave them something to commemorate the season or tournament. I wondered, though, if the trophy was going too far. One trophy per kid certainly raises the organizer’s budget, which might lead to an increase in entry fees. To have a truly open tournament or league, fees have to be low enough for almost every family to afford and need to include the possibility of scholarships or free entries. Of course, an organizer could save money by making the winners’ trophies less elaborate. At the time, I didn’t recognize that being okay with ribbons but not trophies also represented the winners’ viewpoint.

Youth chess tournaments have a complicated method of awarding a wide variety of trophies and ribbons. A tournament of 250 children might have 10 sections based on chess ratings. Every section awards three to five trophies to the top finishers, plus other lesser ribbons. There is also a school team competition that combines scores across sections, usually won by schools that bring along a large number of bad chess players. Because there are so many sections, everyone has a chance to win. As a chess parent, I liked this approach because it gave so many kids a chance to win and increased participation. My only gripe was that the winners of the top section didn’t get slightly bigger trophies than the winners of other sections. After all, the kid who came in fifth in the top section would likely handily trounce any of the 230 kids playing in the lower sections.

That was also winners’ thinking, and I’m now ashamed that the idea ever occurred to me. Like the achievements of all winners, my son’s many top trophies in the highest section depended on many things beyond his control: his immeasurably high IQ in math; his parents’ ability to afford chess camps, classes and private lessons; the role model of his disciplined and career-driven parents; the fact that he never had to worry about where his next meal was coming from. I’m proud of his efforts and achievements, but ashamed that I ever thought about diminishing the size of the trophy of the kid who came in third in the beginners’ section. I’m especially ashamed when I start to conceive of participation trophies as a metaphor for the competition for income and wealth in which all of us are forced to participate from the day we leave school—except of course for trust fund babies like the Trumps, Kochs, Mercers and others for whom the pursuit of additional wealth really is a game.

Translated into real life, the participation ribbon or trophy equals the cut of the income and wealth pie that the 99% of non-winners get. But whereas a trophy or ribbon merely fills up space on a dresser, that real-life cut buys food, shelter, medical care, transportation, clothing, entertainment, education, charitable contributions and retirement. Whether they are ribbons or trophies, we give them to children to motivate them and to satisfy an emotional need for dignity. There are also emotional components to what makes humans loyal to any society: whether it’s an open society or a fascist one, people want to feel that they have a chance to win, that they will be okay whether they win or lose the money wars, that they have a reason to participate in the game, and, most significantly for the stability of the game, that they have a stake in its outcome.

Over the past 40 years, American society has time and again faced the choice between giving participation ribbons or trophies, and we have always chosen to downgrade the trophy to a ribbon, or to remove the incentive altogether. Lowering the buying power of the minimum wage. Defunding public colleges. Implementing policies that made it harder for workers to unionize. Funding tax cuts for the wealthy with tax increases on others and lower levels of social welfare benefits. Government and large-company outsourcing of functions to other organizations that pay employees less. It’s as if we’re always cutting corners on the rewards of participation to provide bigger, glitzier tributes to the winners of a game in which luck matters much more than actual skill or performance.

New book on Ulysses Grant shows Grant was the anti-Trump: quiet, modest, honest, competent, an ardent supporter of African-Americans, an American hero

The administration of Ulysses Grant, 1869-1877, marks the beginning of America’s first Gilded Age, an era in which corruption was rampant throughout federal and state governments and a select few ruthless individuals—primarily white males—accumulated enormous fortunes, while incomes in general polarized and inequality of wealth increased. Over a relatively short period, business interests grew to dominate the one party that ruled both houses of Congress and held the presidency. At the same time, the southern states that had unsuccessfully tried to leave the United States experienced a secondary civil war, waged by white racists against the newly freed black slaves’ claims to citizenship and full participation in economic and political life. Violent guerilla attacks throughout the former Confederacy on blacks and on whites who supported black equality combined with the vilest sort of racist propaganda, in both north and south, about the superiority of the white race and the relative backwardness of others.

Kind of sounds like today.

The difference, of course, was the president: Ulysses Grant versus the current occupant of the Oval Office.

First and foremost, on matters of race, Grant was the polar opposite of the white supremacist Trump. As Ron Chernow’s recent 900-plus-page biography of Grant details, our 18th president ardently fostered and protected the rights of African-Americans. Against the growing opposition of his own party, which grew tired of re-litigating the Civil War in southern states, he stood steadfast in supporting both the goals and the methods of Reconstruction. He gave an inordinate number of African-Americans jobs in his administration. He sent troops to a number of communities to fight the Ku Klux Klan and its allies. He never wavered in the respect he showed both to individual African-Americans and to African-Americans as a people. Chernow studs his book with many examples of Grant’s generous and open-hearted support of African-American causes. In any argument about which white person did the most to help blacks in America, Grant should rate at the very top of the list, alongside Lincoln and LBJ. In this sense, he was the anti-Trump.

(Not to get sidetracked, but on Native Americans, Grant’s attitudes were not as admirable, maybe equivalent to those of the “I believe in civil rights, but don’t move into my neighborhood” centrist bourgeoisie. He sympathized with Native American tribes, but wanted them to integrate into American society or live a non-nomadic lifestyle on reservations. His desire to see the West settled was greater than his empathy for the peoples being displaced.)

While Grant ruled over a corrupt administration, he was not personally corrupt, nor did he benefit from the illegal and barely legal machinations of certain cabinet members and others in government. He established the first civil service commission as a temporary panel and wanted to make it permanent, only to be overruled by a Congress dominated by practitioners of an earlier form of crony capitalism. The Civil War had created an upsurge in government activity and governmental control of the economy: governments gave land usage rights and awarded large contracts. Without the constraints of a merit-based civil service system and adequate corruption laws, business moguls were able to buy legislation and administrative decisions. Graft lubricated the Gilded Age money machine.

The forces leading to corruption today—much of it legal or impossible to prosecute—are the privatization movement, which encourages crony capitalism, and the Citizens United decision, which has enabled the buying of legislators. But whereas Grant was an unwilling victim and mostly an opponent of corrupt practices, Trumpty-Dumpty and his family’s many conflicts of interest put him at the very center of the current administration’s corruption.

Another difference is in the execution of foreign policy. With the help of Hamilton Fish, his competent and honest Secretary of State, Grant focused exclusively on diplomacy to solve disputes with other countries (not including the Native American “nations”). For example, historians consider the settling of the Alabama Claims by Grant and Fish to be one of the most important steps in the development of international arbitration. That agreement got Great Britain to pay the United States for the damage done to U.S. ships by Confederate vessels built in British shipyards. Brilliant! After his presidency, Grant took a two-and-a-half-year tour of the world during which the government of every country he visited consulted him as a great general and unofficial representative of the growing North American powerhouse that was the United States.

Sounds like the antithesis of Trump, who is tearing up treaties, getting us knee-deep in another shooting war in Syria and has a reputation in virtually every foreign capital for being dangerously misinformed, offensive and erratic.

Grant’s background before assuming the highest office of the land has some similarities to, but one major difference from, Trump’s. Although both men failed at every business they tried early in their careers, Trump’s fame came from the self-promoted lie that he was a business genius, a lie boosted by the scripted reality TV program on which he pretended to be successful in business. By contrast, while Grant had many stumbles in his first military career and in business, once he returned to active duty at the beginning of the Civil War he experienced unparalleled success, almost exclusively through his own competence. His reluctance to be a self-promoter actually slowed his rise to the top of the Union armed forces, but made him a more effective general.

(Another diversion: The objections to Grant’s military greatness raised by southern apologists for the tactically sound but strategically hopeless Robert E. Lee all prove to be false. These historians, who dominated university history departments during the first half of the 20th century, aver that Grant won because he was a butcher—yet he firmly established and executed a policy of taking from the civilian population only what was needed to prosecute the war. No rape, no senseless destruction. He was generous in his treatment of enemy soldiers who surrendered. Those claiming Grant won because he had a superiority of forces over Lee forget that the prior generals prosecuting the war in the east had a similar edge, but failed to end the conflict. The claim that Lee was tactically better ignores the fact that Grant won every battle he ever led, typically with daring tactics, like creating a safe 40-mile supply line through hostile territory to feed hungry Union soldiers at Chattanooga or frequently using naval forces to transport soldiers to the far side of enemy armies.)

The question is not whether Grant is the greatest general in U.S. history, but whether he is the greatest general in the history of organized warfare. A similar question could not reasonably be asked of a man who sent six companies into bankruptcy, has had a long string of failed businesses and has been involved in thousands of business lawsuits.

Observers all agree that the best words to describe Grant are modest, honest and disciplined, and that he was a man of his word. Another way that Grant is the anti-Trump, or Trump the anti-Grant.

Chernow’s book did reveal one similarity between the two men. Grant was, and Trump is, a true believer in strict pro-business orthodoxy in economic matters. Grant’s first administration enjoyed boom times, fueled by the rapid construction of new railroads. But Grant agreed with Congress that it was necessary to fight inflation by ending the policy of coining silver and helped to pass an 1873 law that essentially put the U.S. on a gold standard and deflated the currency. There was now less money around, which meant less money for railroads to borrow. The houses of cards that were the financial structures of most of the railroads toppled, starting with the Panic of 1873. Deflating the currency had a similar impact in 1873 to that of the 1929 stock market crash and the bursting of the housing bubble in 2007: all led to a rapid decrease in the money circulating in the economy. Some historians say that the ensuing worldwide depression that started in 1873 lasted only six years; others say it went on for two decades!

Trump and the GOP have already set into motion the next major recession or depression through the passage of the Tax Cuts and Jobs Bill of 2017. Because most of the money being taken from the government in this tax cut will be given to the ultra-wealthy, it will leave the economy and instead be invested in dead assets. A bubble will form in one or more assets. After it bursts, the hard times will come.

All of our presidents have been flawed. All have been products and reflections of their times, captive to the prevailing myths and enthusiasms of the ruling elite that identified them as appropriate candidates for national election. There are many examples of presidential actions reflecting the zeitgeist or the party: Grant with his obsession with the gold standard; Theodore Roosevelt with his trust-busting and imperialistic foreign policy; LBJ with his escalation of Vietnam; Nixon with his opening of relations with China; Clinton with welfare reform and mass incarceration policies; Obama with his continuation of Bush II’s wars. To a large degree, presidents are acted upon as well as actors.

It’s in that context that we have to consider the phenomenon of Donald Trump. He is the apotheosis of the narcissistic politics of selfishness and the gaudy materialistic and anti-intellectual culture of consumption that has dominated the Republican Party and America since the late 1970’s. If Trump is our great national shame, it is not because he is an outlier, but rather because he is a symbol of the times.

Double-speak in proposed new U.S. nuclear policy masks the fact that it makes it much more likely that America or someone else will drop the big one

The draft of the Pentagon’s proposed plan to “update” the United States’ nuclear weapon strategy is a masterpiece of double-speak.

The plan, titled the “Nuclear Posture Review,” proposes that we modernize our nuclear weaponry, which is euphemistic phrasing for building more nuclear weapons and more efficient ways to deliver them accurately. The call for spending more than a trillion dollars on new nuclear bombs continues the unfortunate policy of the Obama administration of increasing our nuclear capabilities even while calling for the total dismantling of the world’s nuclear forces at some future date.

More significantly, the document also proposes to expand the number of reasons for which the United States would strike first. In 2010, the Obama administration significantly narrowed the scenarios in which the United States would drop nuclear weapons without first enduring a nuclear attack. Obama ruled out attacking any country that did not have a nuclear capability, and limited our first use of nuclear weapons to responses to large-scale conventional, chemical or biological attacks. But of course, that’s not how the documents put the conditions under which we’re willing to drop the bomb. In both 2010 and 2018, the Pentagon talks abstractly about nuclear weapons “playing a role” or making “essential contributions to the deterrence of nuclear and non-nuclear aggression.” Nowhere do these documents use explicit language to describe our willingness, under certain conditions, to poison the Earth’s atmosphere and water.

The new Pentagon report calls for widening the circumstances in which we would unleash the fury of our nuclear arsenal to include cyber threats and terrorism, or as the current draft puts it, “violent non-state actors.” That’s right—the new strategy would consider letting a U.S. president drop an atomic bomb on a country harboring terrorists, killing tens if not hundreds of thousands of innocent civilians and spewing deadly radiation throughout the planet. Interestingly enough, most stories about the updated nuclear strategy fail to mention the expansion of reasons for dropping the big one. Those that do, like the New York Times, focus exclusively on using nuclear weapons to deter “attempts to destroy wide-reaching infrastructure, like a country’s power grid or communications, that would be most vulnerable to cyberweapons.” No one mentions that the U.S. would now consider the nuclear option to fight terrorism, a far scarier change, since the definition of terrorism and of who is a terrorist is so amorphous and subject to manipulation. As with past nuclear strategy documents, the 2018 draft also covers about 30 countries we consider allies, which means that, at least theoretically, if a country dismantled Great Britain’s electrical distribution capability using a computer virus, the United States might literally go nuclear!

Double-speak is everywhere in the report. Consider this clever bit of logical twisting: “In no way does this approach ‘lower the nuclear threshold.’ Rather, by convincing adversaries that even limited use of nuclear weapons will be more costly than they can countenance, it raises the threshold.” In other words, the report claims that being willing to use nuclear weapons in more scenarios lowers the possibility of using them. It sounds as if the same propaganda machine that belches out the nonsense that allowing more guns will make people safer from gun violence is advising the Pentagon. And in fact, it might be, seeing that a number of companies manufacturing weaponry for the United States and the dozens of countries to which we sell arms also have divisions which sell firearms to individuals.

My favorite instance of twisted logic in the Nuclear Posture Review is the oft-quoted statement: “We must look reality in the eye and see the world as it is, not as we wish it to be.” Those of us who have watched the current administration develop and implement immigration, tax, trade, environmental and education policies that fly in the face of reality find an enormous amount of chutzpah in the ostensibly sober admonishment to “look reality in the eye.”

But beyond the irony of the Trump Pentagon invoking reality to justify expanding the possibility of a first use of nuclear bombs is the rhetorical slipperiness of the statement. The Pentagon says it looked at reality, but it really only considered that part of reality that helped to justify the decision to spend a trillion dollars on new weapons of mass destruction and loosen first-use standards.

It didn’t look at the interconnectedness of the world through trade, treaties and computerization, which makes it much more dangerous for any country to launch any kind of attack on a big power like the United States—an interconnectedness providing the same kind of deterrence that nuclear advocates claim the possession of atomic bombs does. It ignores the great progress we have made in quelling disturbances through negotiations, economic sanctions and treaties. It doesn’t take into account the fact that with non-nuclear weaponry we have managed to reduce the threat of ISIS, or that the number of terrorist episodes in the United States is down significantly over the past four decades. It doesn’t look at the reality of limited resources that could better be put to use strengthening the American economy and helping to lift up the poor and afflicted in the United States and throughout the world.

Finally, and most importantly, the Pentagon does not consider the awful reality of nuclear weapons: that they kill so many with one explosion and that the damage is not limited to the bomb site, but affects the entire globe. The writers of this proposal—which will likely soon become the official policy of the United States—should take a hard look at the reality portrayed in the thousands of photographs of the damage to humans wrought at Hiroshima and Nagasaki. Maybe then they would understand that the reality is that no first use of our nuclear capability is defensible or justifiable. Nor is retaliation against someone dropping a bomb on us, for that matter. The only realistic policy is to stop developing new nuclear weapons and start decommissioning the weapons we have. Our standing army and economic power in a tightly interconnected world should be deterrence enough to prevent others from exploding nuclear weapons and to persuade them to follow suit by eradicating their own weapons.

When Trump called Haiti and African countries “s***holes,” he was also calling the people who come from there “s***,” and he included all African-Americans

GOP and FOX apologists have offered four justifications for Donald Trump’s outrageous characterization of Haiti and the countries of Africa as “s***holes.” The excuses range from the duplicitous to the cynical:

1. He didn’t really say it: Only the true believer believes this obvious lie. No one has denied that Trump used that word and several witnesses have confirmed it.
2. These countries really are “s***holes,” or hellholes, as many journalists now put it. So what? We’re talking about people, not the countries they come from. We investigate every refugee and immigrant for terrorist tendencies or a criminal past. Why should the economic condition of the country matter? If anything, one could rationalize a preference for people fleeing hellholes over those from cushy countries. If you’re unhappy in the utopia that is Norway, what kind of malcontent or socially maladjusted person are you anyway? Whereas, if you have the gumption to leave a “hellhole” and better yourself, you’re the type of person we want and need.
3. That’s the language average Joes use in bars. That doesn’t make it right. The President is supposed to uplift the level of discourse, not debase it to the lowest common denominator.
4. It appeals to his base. Which is why the base and Trump are so dangerous. We can’t forgive or justify a racist comment by saying it’s okay because some people like it.

These attempts to forgive, explain or contextualize Trump’s remark avoid the uglier truth behind the ugly statement: that the only countries the Donald labels as “s***holes” have primarily black populations. While his comment denigrated African countries and Haiti, it also communicated that Trumpty-Dumpty believes that the people from those countries are inferior. The not so hidden subtext was a slam at African-Americans, a group that Trump has long abused both verbally and with his actions. We thus cannot regard his remarks as solely anti-immigration, or anti-immigration from certain countries. The remarks also manifest an explicitly anti-African American mentality. When Trump called Haiti and African countries “s***holes,” he was also calling the people who come from there “s***” and he meant every African-American.

The last person as racist as Donald Trump to be afforded the majority of votes by the Electoral College was Woodrow Wilson, who as president re-segregated washrooms, cafeterias and work areas throughout the federal government, in the process terminating or downgrading the employment of thousands of African-Americans. His many actions and comments disparaging blacks and elevating whites gave permission to the growing racist sentiments of the Progressive Era. It’s no surprise that it was during Wilson’s presidency and afterwards that the KKK got its second wind, becoming an important social and political force not only in the South, but also in the Midwest and West. About the KKK, Wilson said, “The white men were roused by a mere instinct of self-preservation…until at last there had sprung into existence a great Ku Klux Klan, a veritable empire of the South, to protect the Southern country.” He sympathized with white supremacists who hated the Reconstruction period “because the dominance of an ignorant and inferior race was justly dreaded.” Wilson also feared and hated East Asians, as witnessed by this choice nugget: “Oriental Coolieism will give us another race problem to solve and surely we have had our lesson.”

In his new biography of Ulysses Grant (who may have done more than any other white person to advance the cause of blacks in America), popular historian Ron Chernow builds the case that no American president has held as openly racist views as Andrew Johnson, who ascended to office after the assassination of Abraham Lincoln. One quote from Johnson should suffice: “This is a country for white men and by God, as long as I am President, it shall be a government for white men.” Johnson actually told Congress that “negroes have shown less capacity for government than any other race of people.” In private remarks, Johnson often used the n-word, which may be next on the list of taboo words and phrases that Trumpty-Dumpty rehabilitates now that we can say “s***hole” in public discourse without too much embarrassment. As president, Johnson tried to slow down the Reconstruction process of integrating freed slaves into Southern economies and governments, for which he was impeached and almost convicted.

Both Johnson and Wilson managed to whip up hope among the significant minority of Americans who believed that whites were superior to blacks and that blacks posed a threat to white America’s way of life. During both their administrations, violence against African-Americans increased, especially in the South, while the federal government pulled back on its protection of the civil rights of minorities. It’s no surprise that we’re seeing the same thing happen under the current administration. The head of government sets the public conversation and is one of the main forces in determining what is appropriate and inappropriate in the marketplace of ideas. There may be several points of view on any given issue, but one of them is always the president’s.

While racism has played a quietly growing role in Republican ideology since Goldwater, no recent national candidate before Trump had the gall and lacked the good taste to play up racist ideas in an explicit manner. Trump makes himself absolutely clear even to those not attuned to the subtle degradations of racial coding. When he says, “Make America great again,” no one has any doubt that his true message is “Make America white.”

It won’t happen, because within three years, he’ll be impeached, resign or lose the next presidential election. But while in power Trump and his minions will do a lot of damage to our minority and immigrant populations that will last years after he is gone. Moreover, he has opened the same Pandora’s box of racism that Andrew Johnson refused to shut after the Civil War and Wilson helped to pry open again in the Progressive Era.

Surprise, surprise, surprise! Another attempt at privatization of a government function fails. This time it’s collection of back taxes

Once again, privatization of a basic government function has failed. As the New York Times reports, the Internal Revenue Service paid $20 million last year to private collection companies to collect unpaid back taxes. The companies were able to dun people for a mere $6.7 million in back taxes. Sometimes they were paid a 25% commission on back taxes collected solely through the efforts of the IRS. But there’s much worse: 45% of the take came from taxpayers they weren’t supposed to go after, hardship cases for whom paying back taxes would make it impossible to cover basic living expenses.

As Gomer Pyle, the rube played by Jim Nabors for years on two situation comedies, would put it, “Surprise, surprise, surprise!”

If Congress had studied history, it would have known that privatization of tax collecting—which used to be called tax farming—doesn’t work as efficiently as the government doing its own collection. (To be accurate, private companies are trying to collect back taxes for the IRS; historical tax farming collected all taxes.) Ancient Egypt, Rome and Greece all had third-party for-profit enterprises collecting taxes, as did the early kings of Europe. Historians now consider having the government bureaucracy collect its own taxes as one of the earliest signs of a modern government. The advantage of the government doing its own collections is twofold: It gets more of the revenues and the taxpayers are happier, since tax farming generally led to abusive practices by the tax farmers, who sought to maximize profits by squeezing taxpayers. Of course, even if Congress had known what a failure privatized tax collection is, it might have still passed the 2015 bill requiring the IRS to use private contractors.

When will politicians, both Republican and Democratic, learn that the private sector doesn’t always work as well as government does in providing services, especially when those services involve most if not all of the population?

Let’s tally the performance of the private sector when it takes over functions previously performed successfully by government.

We’ll start with private schools. Advanced research now demonstrates without a doubt that when you correct for poverty and disabilities, public school students do better on standardized tests and improve their performance more over time than private school students do. And no wonder. Compared to private schools, public schools are more innovative, have more experienced teachers and provide those teachers with more continuing education.

Private prisons have proven to be a complete disaster virtually everywhere they have been tried in the United States. A few years back, the Wall Street Journal detailed the woes the state of Idaho suffered after it privatized its state prisons in 2000; its private prison contractor walked away from a new contract, leaving Idaho with several lawsuits alleging that understaffing led to gangs rampaging violently through Idaho’s private prisons. The Journal reported that Michigan recently dropped plans to house 968 cons in a privately run prison after the bids by private companies exceeded, by millions of dollars, what it would have cost the state to do the job itself. A few years back, a study showed that private prisons cost the state of Arizona more than a public system would have. When the State of Ohio gave its first inspection to one prison after a private company had taken over the facility, it found that compliance with regulations had fallen from 97.3% to 66.7%, a stunning decline in quality. As long ago as 2010, the Lexington Herald-Leader called the privatized prison system in Kentucky a failure and cited the many abuses at one notorious privately run correctional facility.

How about private armies? People seem to forget that since Bush II, we have farmed out a large part of our military functions to private companies like Blackwater (formerly run by the brother of our current pro-privatization Secretary of Education) and Halliburton (the former company of Bush II’s vice president, Dick Cheney). At one point, we had well over 100,000 military contractors in Iraq and Afghanistan, hired mercenaries loyal to their company, not to the United States, and not indoctrinated with the values and ethics of the American military. Using contractors drove up the cost of the Iraq and Afghanistan wars and led to a number of scandals involving abuse of local civilians. And perhaps most significantly, we have failed to win either of these wars. Why would anyone think we’d have a better chance to win with hired guns instead of well-trained U.S. soldiers? The United States has recognized the inferiority of privatized armies since we beat one in the Revolutionary War.

Then there’s our privatized health insurance system, which costs much more than other nations’ systems and yet delivers inferior health care, if judged by outcomes. In 2015, the United States spent almost three times as much on healthcare as the average of other countries with comparable incomes. Despite these outsized U.S. expenditures, people live longer in 29 of the 34 other countries surveyed. When compared to nationalized health insurance systems, we have fewer hospital beds per capita, higher rates of infant mortality and fewer people covered by health insurance. If you don’t think government does a better job of providing healthcare, ask anyone who has just switched from a commercial plan to Medicare what they think. I immediately noticed the improvement in benefits and access to care that Medicare provides.

For one more example, let’s go back in time to the early days of data processing. Many states and the federal government outsourced data processing to private companies such as Ross Perot’s Electronic Data Systems (EDS). Back in the 1980s, it cost Perot’s company about 6-8 cents a transaction, including labor costs. EDS charged governments 72 cents a transaction, a pretty fat profit margin. It doesn’t take much of a business head to figure out that the governments would have saved their constituents a lot of money if they had bought the computers and done the work themselves.

Of course, evidence means nothing to Trumpty-Dumpty and Republicans. Just this week, we learned that the Trump administration has abruptly halted work on the National Registry of Evidence-Based Programs and Practices, which analyzes programs addressing substance abuse and behavioral health problems and recommends which work and which don’t, based on actual evidence. The program has been around about 20 years and features a website listing 453 behavioral health programs that have been shown, by rigorous outcomes measures, to be effective. Like all Republicans, Trump and his advisors would prefer that the public be free to select from any number of treatments, even those that evidence has shown to be little better than quackery. To them, the freedom to select something that doesn’t work because you don’t have enough information and/or have been fooled by an unethical charlatan is more important than cutting the cost of health care and getting people the best treatment.

The contemporary Republican Party seems to run from science, which disproves virtually all of its fundamental ideas regarding the economy, free markets, the environment and public health. Its religious faith in the power of privatization is greater than the evidence that shows it doesn’t work.

The danger of an Oprah presidency: Instead of being chief of state, POTUS becomes a celebrity figurehead

No one can really answer the question “Would you vote for Oprah for president?” outside the context of who the other choices are. In a primary against almost any other Democrat, I would vote for the opponent—though maybe I’d vote for Oprah over Cory Booker or Heidi Heitkamp, because they’re so conservative. Maybe. But I would certainly vote for Oprah over just about any Republican. Maybe I’d vote for John Kasich, Jon Huntsman or Nikki Haley instead of Oprah. Maybe.

Do I think Oprah is qualified to serve as Commander in Chief?

Hell, no.

Sure, she’s a personable entertainer and a shrewd businesswoman, truly a self-made billionaire. Her politics seem to fit in with Hillary Clinton’s and Barack Obama’s: a centrist looking left who would repair the damage of almost 35 years of Reaganism and move us along towards a socialist democracy, but at a snail’s pace and while ignoring the needs of labor unions. She seems to believe in science and respect expertise. She is as sincerely compassionate as the current generation of Republicans is mean-spirited. I am certain that if elected, Oprah would surround herself with left-leaning centrists distinguished in their respective fields or with bright political minds. She would be a feel-good president, much like all our presidents were between Carter and the current occupant of the Oval Office. With a solid Democratic majority in both houses, she could have a historic presidency, on par with the accomplishments of Roosevelt and Johnson. But that’s what I predicted about the impending Hillary Clinton presidency, back before anyone knew that the Russians were attempting to fix the results.

But that doesn’t mean that Oprah should run. She’s just not qualified. She’s never run a government bureaucracy. She’s never shepherded legislation to passage. She’s never juggled electoral concerns with governmental realities. She’s never voted for or against something that affects the lives—and sometimes deaths—of thousands, or millions.

Moreover, what the United States doesn’t need right now is to solidify the idea that a celebrity is qualified to be president, or that to be president of the United States, you must be a celebrity. The mainstream news media would like nothing better than to continue a reality TV approach to covering politics. It’s much easier than getting knowledgeable and asking intelligent questions about real issues. But if we keep electing people with no government experience who have primarily been entertainers or athletes, we’ll end up with a titular presidency, a front person who spouts off the words of others, with the real power up for grabs in the back rooms of the White House. Okay, we’ve sometimes had that situation before, but a celebrity presidency institutionalizes the ceremonial approach to the office, with the real power lying elsewhere, hidden away from public scrutiny.

I’m not saying Oprah will act like a reality show participant as president. She’ll certainly know how to comport herself with dignitaries, foreign leaders, the news media, Congress and the public. There will be no displays of ignorant rudeness, like riding in a golf cart while everyone else walks or pushing past the head of another country to get into the center of a photo. I’m quite certain that while she will tweet and give Facebook updates with great frequency, a President Oprah would not resort to feuding, insulting, exaggerating, bullying, bragging and the other reality TV techniques that Trumpty-Dumpty employs.

But hers would still be another celebrity presidency. And that has to be bad for the country, no matter what her administration might accomplish.

At various times, Bill Clinton and Barack Obama were labeled rock stars, a derogatory phrase meant to imply that they were no better than celebrities. Opponents denigrated Ronald Reagan as an actor. But all these individuals had been involved in government before running for president. All had won elections before. That’s what Oprah needs to do, maybe even serve in a cabinet post. At 63, she would be getting a late start, but even if she served one term as a senator or governor, it would transform her from celebrity to elected official. It worked for Nelson Rockefeller and Jack Kennedy. It worked for George Murphy, Al Franken and Ronald Reagan. It worked for Sonny Bono. Okay, maybe not for Sonny.

Now that we’ve had our fantasy about possibly electing one of the most likeable people in America to follow one of the most despised, the Democratic Party has to get back to sifting through likely candidates who have electoral and governmental experience. There are any number of qualified Democrats out there: Kamala Harris, Elizabeth Warren, Kirsten Gillibrand, Bill de Blasio, Sherrod Brown, Chris Murphy, Amy Klobuchar, Gavin Newsom, Tammy Baldwin, Chuck Schumer, even (holding my nose and frowning) Andrew Cuomo. But not Oprah, please. (Also not Bernie, Joe Biden or Hillary.)

What connects deporting dreamers and refugees & the new war on pot? The need to fill the private prisons owned & operated by Trump/Sessions supporters

The body count is getting higher: Add 200,000 Salvadorans to 780,000 dreamers and 45,000 Haitians. That’s more than a million people now, mostly productive and hard-working, ripped out of our economy and communities. These are many of the people who make our hotel beds, fix our pipes, take care of our elderly, slaughter our chickens, pick our crops, deliver our groceries and build our roads and housing. We can expect labor shortages in all these and other industries.

Don’t expect people postponing retirement to fill the gap—the ones who work past 62 are mostly professionals in desk jobs. You might see a senior staffing a cash register at Walmart or flipping burgers at McDonald’s, but Baby Boomers’ expanding waistlines, bad knees, sore rotator cuffs, aching hips and general arthritis will rule out plucking oranges from trees, walking patients around hospital wards, making deliveries or operating a jackhammer. As with most of its policies, the Trump GOP’s deportation of more than a million productive Americans runs counter to the best interests of the country. We need to address global warming; he walks away from the Paris Accord. We need to raise taxes on the wealthy; they lower them. We need more workers or we’ll face a labor shortage; he kicks out millions.

Add to the more than a million refugees and other immigrants Trump intends to kick out of the country by 2020 an as-yet unknown number: the additional people who will be thrown in jail as a result of the Justice Department’s new crackdown on marijuana. Will it be 10,000? 20,000? 50,000? However many, they won’t be leading productive lives contributing to the economy.

On the surface, what unites deporting dreamers and refugees with ratcheting up arrests for something that should be legal—and is in many states—are the sheer stupidity of the actions, the mean-spirited cruelty underlying both policies and the deleterious effect each will likely have on the American economy and on many individuals.

A follow-the-money analysis uncovers another connection between these two deplorable stupidities: Both will line the pockets of the operators of for-profit prisons. The way back to wherever someone or their parents started usually runs through a detention center, so virtually every dreamer or refugee kicked out of the country will spend long periods under lock and key. And every stoner or pot entrepreneur busted will end up detained, sometimes for years.

What a boon to for-profit prisons, which the Obama Administration had begun to phase out. The incarceration industry and its investors have been riding high since Trump announced that he was rescinding the Obama decision and relying even more heavily on private prisons. Now instead of facing a contraction of business, private prisons are looking at boom times.

As usual, Trump gets it wrong. By almost every measure private prisons have been a disaster: prisoners are more likely to be mistreated and often don’t get enough to eat or adequate medical care; drug use and violence are greater in private prisons. Often the private solution ends up being more expensive. It has never produced a greater rate of rehabilitation.

Besides being a disaster, private prisons also have a distorting effect on our politics and criminal justice system. Private prisons make money only if they are filled with prisoners, and so their operators have long lobbied for three-strikes-you’re-out and other harsh sentencing laws. They have contributed to the campaigns of many law-and-order candidates, primarily Republicans and including Jeff Sessions and now Trumpty-Dumpty.

“Crony capitalism” means giving large government contracts to your friends and financial supporters. It’s been around since the Revolutionary War and served as one of the primary sources of ultra-wealth during the Civil War and the Gilded Age. Private prisons are the quintessential “crony capitalists,” an industry that emerged only from a desire to privatize and thereby create more opportunities to use government revenues to generate profit for private individuals at the expense of taxpayers. Government is the sole market for their services. It is in their best interest to increase that market by increasing the number of people detained and incarcerated. In an era of rapidly falling crime, that means expanding what is considered a crime and lengthening the jail time demanded of the perpetrators. Criminalizing both immigrants and pot smokers fits the bill quite nicely.

The Trump Administration operates primarily on hate, fear and crony capitalism. We can see all three motives coming together in granting the wishes of another large industry dependent on government largess.

Upside of downsizing the American dream: It may slow down use of fossil fuels and global warming

In discussing climate change, the very broadest view we can take is the unfolding of evolution. Recent findings uncover a strong connection between the composition of gases in the atmosphere and the development of life on Earth. Factors such as earthquakes, volcanoes, the activity of the sun, the warming and cooling of the globe, Earth’s slightly irregular rotation in orbit and the impact of asteroids have affected the amount of methane, oxygen, carbon dioxide and other gases in the atmosphere and water. Some species thrive and others falter when this mix of gases changes, either suddenly or over large expanses of time.

Most relevant to this discussion is the percentage of oxygen in the air. Paleontologist Peter Ward (University of Washington) and geologist Joe Kirschvink (California Institute of Technology) explain in A New History of Life: The Radical New Discoveries about the Origins and Evolution of Life on Earth that in Triassic times, just before the extinction event that ushered in the Jurassic period, the precursors of mammals, called therapsids, dominated the earth. Compared to reptiles, these ur-mammals had less efficient lungs (as do mammals), but it didn’t matter, since the earth’s atmosphere was relatively rich in oxygen.

But something happened during the extinction event that separates the Triassic and Jurassic periods to reduce the percentage of oxygen in the atmosphere, enabling reptiles, including dinosaurs, to thrive and impeding the development of mammals. The rise of the dinosaurs may have resulted directly from a reduction of oxygen and an increase in nitrogen in the atmosphere. While science now confirms that the crash of a large asteroid is implicated in the death of all land dinosaurs and most avian dinosaurs (the surviving flyers becoming birds), evolutionary scientists now believe that the central factor in the rise of mammals, and thus primates and humans, was the increase in the amount of oxygen in the atmosphere from 14-16% to about 21% roughly 65 million years ago.

Humanity’s current spewing of billions of tons of carbon dioxide into the atmosphere every year is already changing the mix of gases dissolved in our oceans. If we are still in the midst of our fossil-fuel-burning spree once the waters become saturated with carbon dioxide, we can be reasonably certain that the overall percentage of CO2 in the atmosphere will increase and, more significantly for the survival of humanity, that the percentage of oxygen will decrease.

That’s the long-term threat of failing to severely limit the amount of carbon we release into the environment. But before the composition of atmospheric gases could radically change, humanity would already have suffered—and perhaps gone extinct—from pandemics, famines, extreme weather events and resource wars: the four horsemen of the apocalypse known as global warming.

I recently took the latest version of the individual footprint test, which estimates the number of earths it would take to provide the resources to support every human being living in my style. Now, I’m a voluntary simplicity warrior: I walk or ride the subway as much as possible, only occasionally taking the bus. I was in a car for less than 200 miles last year and took one airplane trip. We eat primarily locally grown food, and I eat energy-intensive red meat but once a week. We compost. We live in a 1,200-square-foot apartment in a 17-story building that recently switched to gas heating. We buy only wind-powered electricity and recycle everything allowed. But despite these best efforts, my footprint comes to 1.5 earths for everyone. What else can I do without government intervention, besides maybe getting my building to go solar? The subway has to start using less energy, and the buses eventually have to run on wind or solar power, probably on rails. My food, clothes, computers—everything will have to be made and delivered using less energy.

And that’s in energy-efficient New York City! What about the rest of the country, where automobile travel dominates, mass transit has been allowed to wither, and people live in, and therefore heat, larger free-standing houses, and do so less efficiently? If everyone in the world lived as the average American does, it would take the resources of five earths.

The upside of the downsizing of the American dream that the growing inequality of wealth and income has produced is that it will soon shrink the footprint of many Americans. But we have to change what we do with the vast excess capital produced by squeezing the middle and lower classes. Currently, we give it to a small group of very lucky, if typically well-connected, individuals and families, AKA the super wealthy. Instead, we should use taxes to confiscate this excess capital and fund mass transit, wind and solar power, alternative technology development and adaptation, and local sourcing projects throughout the United States.

We should also invest heavily in promoting negative population growth. Imagine if everyone in the world limited themselves to having one child: the population would naturally shrink to a more manageable size. With fewer people, we could sustain a higher average quality of life.
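To make that intuition concrete, here is a minimal sketch of the arithmetic, under deliberately simplified assumptions of my own: every couple has exactly one child, generations are treated as discrete steps, and mortality, age structure and migration are ignored. Real demography is far messier, but the direction and rough pace of the shrinkage hold.

# Minimal illustration of generational shrinkage under a one-child norm.
# Assumptions (for illustration only): one child per couple, discrete
# generations, no modeling of mortality, age structure or migration.

def one_child_projection(start_population, generations):
    """Each generation, two parents have one child, so the cohort roughly halves."""
    populations = [start_population]
    for _ in range(generations):
        populations.append(populations[-1] / 2)
    return populations

world_2018 = 7.6e9  # rough world population at the time of writing
for gen, pop in enumerate(one_child_projection(world_2018, 4)):
    print(f"Generation {gen}: ~{pop / 1e9:.2f} billion")
# After four generations (roughly a century), this toy model yields
# fewer than half a billion people.

The point of the sketch is simply that halving each generation compounds quickly; even a less drastic norm than one child per couple would bend the curve downward within a few generations.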

Make no mistake about it—for humanity to survive, Americans will have to start using less energy and fewer other resources, and there will have to be a lot fewer not only of us, but of all the peoples of all the nations.