The Demise of the Republican Effort to Repeal the Affordable Care Act Is Hardly the End of the Story

Posted on Jul 30, 2017

By Robert Reich

The demise of the Republican effort to repeal the Affordable Care Act is hardly the end of the story. Donald Trump will not let this loss stand.

Since the Affordable Care Act’s inception in 2010, Republicans have made it into a symbol of Obama-Clinton overreach – part of a supposed plot by liberal elites to expand government, burden the white working class, and transfer benefits to poor blacks and Latinos.

Ever the political opportunist, Trump poured his own poisonous salt into this festering wound. Although he never really understood the Affordable Care Act, he used it to prey upon resentments of class, race, ethnicity, and religiosity that propelled him into the White House.

Repealing “Obamacare” has remained one of Trump’s central rallying cries to his increasingly angry base. “The question for every senator, Democrat or Republican, is whether they will side with Obamacare’s architects, which have been so destructive to our country, or with its forgotten victims,” Trump said last Monday, adding that any senator who failed to vote against it “is telling America that you are fine with the Obamacare nightmare.”

Now, having lost that fight, Trump will try to subvert the Act by delaying subsidies so some insurance companies won’t be able to participate, failing to enforce the individual mandate so funding won’t be adequate, not informing those who are eligible about when to sign up and how to do so, and looking the other way when states don’t comply.

But that’s not all. Trump doesn’t want his base to perceive him as a loser.

So be prepared for scorched-earth politics from the Oval Office, including more savage verbal attacks on Barack Obama and Hillary Clinton, more baseless charges of voter fraud in the 2016 election, and further escalation of the culture wars.

Most Americans won’t be swayed by these pyrotechnics because they’ve become inured to our unhinged president.

But that’s not the point. They’ll be intended to shore up Trump’s “base” – the third of the country that supports him, who still believe they’re “victims” of Obamacare, who continue to believe Trump himself is the victim of a liberal conspiracy to unseat him.

Trump wants his base to become increasingly angry and politically mobilized, so they’ll continue to exert an outsized influence on the Republican Party.

There is a deeper danger here. As Harvard political scientist Archon Fung has argued, stable democracies require that citizens be committed to the rule of law even if they fail to achieve their preferred policies.

Settling our differences through ballots and agreed-upon processes rather than through force is what separates democracy from authoritarianism.

But Donald Trump has never been committed to the rule of law. For him, it’s all about winning. If he can’t win through established democratic processes, he’ll mobilize his base to change them.

Trump is already demanding that Mitch McConnell and Senate Republicans obliterate the filibuster, thereby allowing anything to be passed with a bare majority.

Last Saturday he tweeted, “Republican Senate must get rid of 60 vote NOW!” adding that the filibuster “allows 8 Dems to control country,” and “Republicans in the Senate will NEVER win if they don’t go to a 51 vote majority NOW. They look like fools and are just wasting time.”

What’s particularly worrisome about Trump’s attack on the long-established processes of our democracy is that his assault comes at a time when the percentage of Americans who regard the other party as a fundamental threat is growing.

In 2014 – even before the acrimony of the 2016 presidential campaign – 35 percent of Republicans saw the Democratic Party as a “threat to the nation’s well being” and 27 percent of Democrats regarded Republicans the same way, according to the Pew Research Center.

Those percentages are undoubtedly higher today. If Trump succeeds, they’ll be higher still.

Anyone who regards the other party as a threat to the nation’s well being is less apt to accept outcomes in which the other party prevails – whether it’s a decision not to repeal the Affordable Care Act, or even the outcome of a presidential election.

As a practical matter, when large numbers of citizens aren’t willing to accept such outcomes, we’re no longer part of the same democracy.

I fear this is where Trump intends to take his followers, along with much of the Republican Party: Toward a rejection of political outcomes they regard as illegitimate, and therefore a rejection of democracy as we know it.

That way, Trump will always win.

 

The Uninhabitable Earth

Famine, economic collapse, a sun that cooks us: What climate change could wreak — sooner than you think.

By David Wallace-Wells

I. ‘Doomsday’

Peering beyond scientific reticence.

It is, I promise, worse than you think. If your anxiety about global warming is dominated by fears of sea-level rise, you are barely scratching the surface of what terrors are possible, even within the lifetime of a teenager today. And yet the swelling seas — and the cities they will drown — have so dominated the picture of global warming, and so overwhelmed our capacity for climate panic, that they have occluded our perception of other threats, many much closer at hand. Rising oceans are bad, in fact very bad; but fleeing the coastline will not be enough.

Indeed, absent a significant adjustment to how billions of humans conduct their lives, parts of the Earth will likely become close to uninhabitable, and other parts horrifically inhospitable, as soon as the end of this century.

Even when we train our eyes on climate change, we are unable to comprehend its scope. This past winter, a string of days 60 and 70 degrees warmer than normal baked the North Pole, melting the permafrost that encased Norway’s Svalbard seed vault — a global food bank nicknamed “Doomsday,” designed to ensure that our agriculture survives any catastrophe, and which appeared to have been flooded by climate change less than ten years after being built.

The Doomsday vault is fine, for now: The structure has been secured and the seeds are safe. But treating the episode as a parable of impending flooding missed the more important news. Until recently, permafrost was not a major concern of climate scientists, because, as the name suggests, it was soil that stayed permanently frozen. But Arctic permafrost contains 1.8 trillion tons of carbon, more than twice as much as is currently suspended in the Earth’s atmosphere. When it thaws and is released, that carbon may evaporate as methane, which is 34 times as powerful a greenhouse-gas warming blanket as carbon dioxide when judged on the timescale of a century; when judged on the timescale of two decades, it is 86 times as powerful. In other words, we have, trapped in Arctic permafrost, twice as much carbon as is currently wrecking the atmosphere of the planet, all of it scheduled to be released at a date that keeps getting moved up, partially in the form of a gas that multiplies its warming power 86 times over.

Maybe you know that already — there are alarming stories every day, like last month’s satellite data showing the globe warming, since 1998, more than twice as fast as scientists had thought. Or the news from Antarctica this past May, when a crack in an ice shelf grew 11 miles in six days, then kept going; the break now has just three miles to go — by the time you read this, it may already have met the open water, where it will drop into the sea one of the biggest icebergs ever, a process known poetically as “calving.”

But no matter how well-informed you are, you are surely not alarmed enough. Over the past decades, our culture has gone apocalyptic with zombie movies and Mad Max dystopias, perhaps the collective result of displaced climate anxiety, and yet when it comes to contemplating real-world warming dangers, we suffer from an incredible failure of imagination. The reasons for that are many: the timid language of scientific probabilities, which the climatologist James Hansen once called “scientific reticence” in a paper chastising scientists for editing their own observations so conscientiously that they failed to communicate how dire the threat really was; the fact that the country is dominated by a group of technocrats who believe any problem can be solved and an opposing culture that doesn’t even see warming as a problem worth addressing; the way that climate denialism has made scientists even more cautious in offering speculative warnings; the simple speed of change and, also, its slowness, such that we are only seeing effects now of warming from decades past; our uncertainty about uncertainty, which the climate writer Naomi Oreskes in particular has suggested stops us from preparing as though anything worse than a median outcome were even possible; the way we assume climate change will hit hardest elsewhere, not everywhere; the smallness (two degrees) and largeness (1.8 trillion tons) and abstractness (400 parts per million) of the numbers; the discomfort of considering a problem that is very difficult, if not impossible, to solve; the altogether incomprehensible scale of that problem, which amounts to the prospect of our own annihilation; simple fear. But aversion arising from fear is a form of denial, too.

In between scientific reticence and science fiction is science itself. This article is the result of dozens of interviews and exchanges with climatologists and researchers in related fields and reflects hundreds of scientific papers on the subject of climate change. What follows is not a series of predictions of what will happen — that will be determined in large part by the much-less-certain science of human response. Instead, it is a portrait of our best understanding of where the planet is heading absent aggressive action. It is unlikely that all of these warming scenarios will be fully realized, largely because the devastation along the way will shake our complacency. But those scenarios, and not the present climate, are the baseline. In fact, they are our schedule.

The present tense of climate change — the destruction we’ve already baked into our future — is horrifying enough. Most people talk as if Miami and Bangladesh still have a chance of surviving; most of the scientists I spoke with assume we’ll lose them within the century, even if we stop burning fossil fuel in the next decade. Two degrees of warming used to be considered the threshold of catastrophe: tens of millions of climate refugees unleashed upon an unprepared world. Now two degrees is our goal, per the Paris climate accords, and experts give us only slim odds of hitting it. The U.N. Intergovernmental Panel on Climate Change issues serial reports, often called the “gold standard” of climate research; the most recent one projects us to hit four degrees of warming by the beginning of the next century, should we stay the present course. But that’s just a median projection. The upper end of the probability curve runs as high as eight degrees — and the authors still haven’t figured out how to deal with that permafrost melt. The IPCC reports also don’t fully account for the albedo effect (less ice means less reflected and more absorbed sunlight, hence more warming); more cloud cover (which traps heat); or the dieback of forests and other flora (which extract carbon from the atmosphere). Each of these promises to accelerate warming, and the geological record shows that temperature can shift as much as ten degrees or more in a single decade. The last time the planet was even four degrees warmer, Peter Brannen points out in The Ends of the World, his new history of the planet’s major extinction events, the oceans were hundreds of feet higher.*

The Earth has experienced five mass extinctions before the one we are living through now, each so complete a slate-wiping of the evolutionary record it functioned as a resetting of the planetary clock, and many climate scientists will tell you they are the best analog for the ecological future we are diving headlong into. Unless you are a teenager, you probably read in your high-school textbooks that these extinctions were the result of asteroids. In fact, all but the one that killed the dinosaurs were caused by climate change produced by greenhouse gas. The most notorious was 252 million years ago; it began when carbon warmed the planet by five degrees, accelerated when that warming triggered the release of methane in the Arctic, and ended with 97 percent of all life on Earth dead. We are currently adding carbon to the atmosphere at a considerably faster rate; by most estimates, at least ten times faster. The rate is accelerating. This is what Stephen Hawking had in mind when he said, this spring, that the species needs to colonize other planets in the next century to survive, and what drove Elon Musk, last month, to unveil his plans to build a Mars habitat in 40 to 100 years. These are nonspecialists, of course, and probably as inclined to irrational panic as you or I. But the many sober-minded scientists I interviewed over the past several months — the most credentialed and tenured in the field, few of them inclined to alarmism and many advisers to the IPCC who nevertheless criticize its conservatism — have quietly reached an apocalyptic conclusion, too: No plausible program of emissions reductions alone can prevent climate disaster.

Over the past few decades, the term “Anthropocene” has climbed out of academic discourse and into the popular imagination — a name given to the geologic era we live in now, and a way to signal that it is a new era, defined on the wall chart of deep history by human intervention. One problem with the term is that it implies a conquest of nature (and even echoes the biblical “dominion”). And however sanguine you might be about the proposition that we have already ravaged the natural world, which we surely have, it is another thing entirely to consider the possibility that we have only provoked it, engineering first in ignorance and then in denial a climate system that will now go to war with us for many centuries, perhaps until it destroys us. That is what Wallace Smith Broecker, the avuncular oceanographer who coined the term “global warming,” means when he calls the planet an “angry beast.” You could also go with “war machine.” Each day we arm it more.

II. Heat Death

The bahraining of New York.

Humans, like all mammals, are heat engines; surviving means having to continually cool off, like panting dogs. For that, the temperature needs to be low enough for the air to act as a kind of refrigerant, drawing heat off the skin so the engine can keep pumping. At seven degrees of warming, that would become impossible for large portions of the planet’s equatorial band, and especially the tropics, where humidity adds to the problem; in the jungles of Costa Rica, for instance, where humidity routinely tops 90 percent, simply moving around outside when it’s over 105 degrees Fahrenheit would be lethal. And the effect would be fast: Within a few hours, a human body would be cooked to death from both inside and out.

Climate-change skeptics point out that the planet has warmed and cooled many times before, but the climate window that has allowed for human life is very narrow, even by the standards of planetary history. At 11 or 12 degrees of warming, more than half the world’s population, as distributed today, would die of direct heat. Things almost certainly won’t get that hot this century, though models of unabated emissions do bring us that far eventually. This century, and especially in the tropics, the pain points will pinch much more quickly even than an increase of seven degrees. The key factor is something called wet-bulb temperature, which is a term of measurement as home-laboratory-kit as it sounds: the heat registered on a thermometer wrapped in a damp sock as it’s swung around in the air (since the moisture evaporates from a sock more quickly in dry air, this single number reflects both heat and humidity). At present, most regions reach a wet-bulb maximum of 26 or 27 degrees Celsius; the true red line for habitability is 35 degrees. What is called heat stress comes much sooner.

Actually, we’re about there already. Since 1980, the planet has experienced a 50-fold increase in the number of places experiencing dangerous or extreme heat; a bigger increase is to come. The five warmest summers in Europe since 1500 have all occurred since 2002, and soon, the IPCC warns, simply being outdoors that time of year will be unhealthy for much of the globe. Even if we meet the Paris goals of two degrees warming, cities like Karachi and Kolkata will become close to uninhabitable, annually encountering deadly heat waves like those that crippled them in 2015. At four degrees, the deadly European heat wave of 2003, which killed as many as 2,000 people a day, will be a normal summer. At six, according to an assessment focused only on effects within the U.S. from the National Oceanic and Atmospheric Administration, summer labor of any kind would become impossible in the lower Mississippi Valley, and everybody in the country east of the Rockies would be under more heat stress than anyone, anywhere, in the world today. As Joseph Romm has put it in his authoritative primer Climate Change: What Everyone Needs to Know, heat stress in New York City would exceed that of present-day Bahrain, one of the planet’s hottest spots, and the temperature in Bahrain “would induce hyperthermia in even sleeping humans.” The high-end IPCC estimate, remember, is two degrees warmer still. By the end of the century, the World Bank has estimated, the coolest months in tropical South America, Africa, and the Pacific are likely to be warmer than the warmest months at the end of the 20th century. Air-conditioning can help but will ultimately only add to the carbon problem; plus, the climate-controlled malls of the Arab emirates aside, it is not remotely plausible to wholesale air-condition all the hottest parts of the world, many of them also the poorest. And indeed, the crisis will be most dramatic across the Middle East and Persian Gulf, where in 2015 the heat index registered temperatures as high as 163 degrees Fahrenheit. As soon as several decades from now, the hajj will become physically impossible for the 2 million Muslims who make the pilgrimage each year.

It is not just the hajj, and it is not just Mecca; heat is already killing us. In the sugarcane region of El Salvador, as much as one-fifth of the population has chronic kidney disease, including over a quarter of the men, the presumed result of dehydration from working the fields they were able to comfortably harvest as recently as two decades ago. With dialysis, which is expensive, those with kidney failure can expect to live five years; without it, life expectancy is in the weeks. Of course, heat stress promises to pummel us in places other than our kidneys, too. As I type that sentence, in the California desert in mid-June, it is 121 degrees outside my door. It is not a record high.

III. The End of Food

Praying for cornfields in the tundra.

Climates differ and plants vary, but the basic rule for staple cereal crops grown at optimal temperature is that for every degree of warming, yields decline by 10 percent. Some estimates run as high as 15 or even 17 percent. Which means that if the planet is five degrees warmer at the end of the century, we may have as many as 50 percent more people to feed and 50 percent less grain to give them. And proteins are worse: It takes 16 calories of grain to produce just a single calorie of hamburger meat, butchered from a cow that spent its life polluting the climate with methane farts.

Pollyannaish plant physiologists will point out that the cereal-crop math applies only to those regions already at peak growing temperature, and they are right. Theoretically, a warmer climate will make it easier to grow corn in Greenland. But as the pathbreaking work by Rosamond Naylor and David Battisti has shown, the tropics are already too hot to efficiently grow grain, and those places where grain is produced today are already at optimal growing temperature — which means even a small warming will push them down the slope of declining productivity. And you can’t easily move croplands north a few hundred miles, because yields in places like remote Canada and Russia are limited by the quality of soil there; it takes many centuries for the planet to produce optimally fertile dirt.

Drought might be an even bigger problem than heat, with some of the world’s most arable land turning quickly to desert. Precipitation is notoriously hard to model, yet predictions for later this century are basically unanimous: unprecedented droughts nearly everywhere food is today produced. By 2080, without dramatic reductions in emissions, southern Europe will be in permanent extreme drought, much worse than the American dust bowl ever was. The same will be true in Iraq and Syria and much of the rest of the Middle East; some of the most densely populated parts of Australia, Africa, and South America; and the breadbasket regions of China. None of these places, which today supply much of the world’s food, will be reliable sources of any. As for the original dust bowl: The droughts in the American plains and Southwest would not just be worse than in the 1930s, a 2015 NASA study predicted, but worse than any droughts in a thousand years — and that includes those that struck between 1100 and 1300, which “dried up all the rivers East of the Sierra Nevada mountains” and may have been responsible for the death of the Anasazi civilization.

Remember, we do not live in a world without hunger as it is. Far from it: Most estimates put the number of undernourished at 800 million globally. In case you haven’t heard, this spring has already brought an unprecedented quadruple famine to Africa and the Middle East; the U.N. has warned that separate starvation events in Somalia, South Sudan, Nigeria, and Yemen could kill 20 million this year alone.

IV. Climate Plagues

What happens when the bubonic ice melts?

Rock, in the right spot, is a record of planetary history, eras as long as millions of years flattened by the forces of geological time into strata with amplitudes of just inches, or just an inch, or even less. Ice works that way, too, as a climate ledger, but it is also frozen history, some of which can be reanimated when unfrozen. There are now, trapped in Arctic ice, diseases that have not circulated in the air for millions of years — in some cases, since before humans were around to encounter them. Which means our immune systems would have no idea how to fight back when those prehistoric plagues emerge from the ice.

The Arctic also stores terrifying bugs from more recent times. In Alaska, already, researchers have discovered remnants of the 1918 flu that infected as many as 500 million and killed as many as 100 million — about 5 percent of the world’s population and almost six times as many as had died in the world war for which the pandemic served as a kind of gruesome capstone. As the BBC reported in May, scientists suspect smallpox and the bubonic plague are trapped in Siberian ice, too — an abridged history of devastating human sickness, left out like egg salad in the Arctic sun.

Experts caution that many of these organisms won’t actually survive the thaw and point to the fastidious lab conditions under which they have already reanimated several of them — the 32,000-year-old “extremophile” bacteria revived in 2005, an 8 million-year-old bug brought back to life in 2007, the 3.5 million–year–old one a Russian scientist self-injected just out of curiosity — to suggest that those are necessary conditions for the return of such ancient plagues. But already last year, a boy was killed and 20 others infected by anthrax released when retreating permafrost exposed the frozen carcass of a reindeer killed by the bacteria at least 75 years earlier; 2,000 present-day reindeer were infected, too, carrying and spreading the disease beyond the tundra.

What concerns epidemiologists more than ancient diseases are existing scourges relocated, rewired, or even re-evolved by warming. The first effect is geographical. Before the early-modern period, when adventuring sailboats accelerated the mixing of peoples and their bugs, human provinciality was a guard against pandemic. Today, even with globalization and the enormous intermingling of human populations, our ecosystems are mostly stable, and this functions as another limit, but global warming will scramble those ecosystems and help disease trespass those limits as surely as Cortés did. You don’t worry much about dengue or malaria if you are living in Maine or France. But as the tropics creep northward and mosquitoes migrate with them, you will. You didn’t much worry about Zika a couple of years ago, either.

As it happens, Zika may also be a good model of the second worrying effect — disease mutation. One reason you hadn’t heard about Zika until recently is that it had been trapped in Uganda; another is that it did not, until recently, appear to cause birth defects. Scientists still don’t entirely understand what happened, or what they missed. But there are things we do know for sure about how climate affects some diseases: Malaria, for instance, thrives in hotter regions not just because the mosquitoes that carry it do, too, but because for every degree increase in temperature, the parasite reproduces ten times faster. Which is one reason that the World Bank estimates that by 2050, 5.2 billion people will be reckoning with it.

V. Unbreathable Air

A rolling death smog that suffocates millions.

Our lungs need oxygen, but that is only a fraction of what we breathe. The fraction of carbon dioxide is growing: It just crossed 400 parts per million, and high-end estimates extrapolating from current trends suggest it will hit 1,000 ppm by 2100. At that concentration, compared to the air we breathe now, human cognitive ability declines by 21 percent.

Other stuff in the hotter air is even scarier, with small increases in pollution capable of shortening life spans by ten years. The warmer the planet gets, the more ozone forms, and by mid-century, Americans will likely suffer a 70 percent increase in unhealthy ozone smog, the National Center for Atmospheric Research has projected. By 2090, as many as 2 billion people globally will be breathing air above the WHO “safe” level; one paper last month showed that, among other effects, a pregnant mother’s exposure to ozone raises the child’s risk of autism (as much as tenfold, combined with other environmental factors). Which does make you think again about the autism epidemic in West Hollywood.

Already, more than 10,000 people die each day from the small particles emitted from fossil-fuel burning; each year, 339,000 people die from wildfire smoke, in part because climate change has extended forest-fire season (in the U.S., it’s increased by 78 days since 1970). By 2050, according to the U.S. Forest Service, wildfires will be twice as destructive as they are today; in some places, the area burned could grow fivefold. What worries people even more is the effect that would have on emissions, especially when the fires ravage forests arising out of peat. Peatland fires in Indonesia in 1997, for instance, added to the global CO2 release by up to 40 percent, and more burning only means more warming only means more burning. There is also the terrifying possibility that rain forests like the Amazon, which in 2010 suffered its second “hundred-year drought” in the space of five years, could dry out enough to become vulnerable to these kinds of devastating, rolling forest fires — which would not only expel enormous amounts of carbon into the atmosphere but also shrink the size of the forest. That is especially bad because the Amazon alone provides 20 percent of our oxygen.

Then there are the more familiar forms of pollution. In 2013, melting Arctic ice remodeled Asian weather patterns, depriving industrial China of the natural ventilation systems it had come to depend on, which blanketed much of the country’s north in an unbreathable smog. Literally unbreathable. A metric called the Air Quality Index categorizes the risks and tops out at the 301-to-500 range, warning of “serious aggravation of heart or lung disease and premature mortality in persons with cardiopulmonary disease and the elderly” and, for all others, “serious risk of respiratory effects”; at that level, “everyone should avoid all outdoor exertion.” The Chinese “airpocalypse” of 2013 peaked at what would have been an Air Quality Index of over 800. That year, smog was responsible for a third of all deaths in the country.

VI. Perpetual War

The violence baked into heat.

Climatologists are very careful when talking about Syria. They want you to know that while climate change did produce a drought that contributed to civil war, it is not exactly fair to say that the conflict is the result of warming; next door, for instance, Lebanon suffered the same crop failures. But researchers like Marshall Burke and Solomon Hsiang have managed to quantify some of the non-obvious relationships between temperature and violence: For every half-degree of warming, they say, societies will see between a 10 and 20 percent increase in the likelihood of armed conflict. In climate science, nothing is simple, but the arithmetic is harrowing: A planet five degrees warmer would have at least half again as many wars as we do today. Overall, social conflict could more than double this century.

This is one reason that, as nearly every climate scientist I spoke to pointed out, the U.S. military is obsessed with climate change: The drowning of all American Navy bases by sea-level rise is trouble enough, but being the world’s policeman is quite a bit harder when the crime rate doubles. Of course, it’s not just Syria where climate has contributed to conflict. Some speculate that the elevated level of strife across the Middle East over the past generation reflects the pressures of global warming — a hypothesis all the more cruel considering that warming began accelerating when the industrialized world extracted and then burned the region’s oil.

What accounts for the relationship between climate and conflict? Some of it comes down to agriculture and economics; a lot has to do with forced migration, already at a record high, with at least 65 million displaced people wandering the planet right now. But there is also the simple fact of individual irritability. Heat increases municipal crime rates, and swearing on social media, and the likelihood that a major-league pitcher, coming to the mound after his teammate has been hit by a pitch, will hit an opposing batter in retaliation. And the arrival of air-conditioning in the developed world, in the middle of the past century, did little to solve the problem of the summer crime wave.

VII. Permanent Economic Collapse

Dismal capitalism in a half-poorer world.

The murmuring mantra of global neoliberalism, which prevailed between the end of the Cold War and the onset of the Great Recession, was that economic growth would save us from anything and everything.
But in the aftermath of the 2008 crash, a growing number of historians studying what they call “fossil capitalism” have begun to suggest that the entire history of swift economic growth, which began somewhat suddenly in the 18th century, is not the result of innovation or trade or the dynamics of global capitalism but simply our discovery of fossil fuels and all their raw power — a onetime injection of new “value” into a system that had previously been characterized by global subsistence living. Before fossil fuels, nobody lived better than their parents or grandparents or ancestors from 500 years before, except in the immediate aftermath of a great plague like the Black Death, which allowed the lucky survivors to gobble up the resources liberated by mass graves. After we’ve burned all the fossil fuels, these scholars suggest, perhaps we will return to a “steady state” global economy. Of course, that onetime injection has a devastating long-term cost: climate change.

The most exciting research on the economics of warming has also come from Hsiang and his colleagues, who are not historians of fossil capitalism but who offer some very bleak analysis of their own: Every degree Celsius of warming costs, on average, 1.2 percent of GDP (an enormous number, considering we count growth in the low single digits as “strong”). This is the sterling work in the field, and their median projection is for a 23 percent loss in per capita earning globally by the end of this century (resulting from changes in agriculture, crime, storms, energy, mortality, and labor).
Tracing the shape of the probability curve is even scarier: There is a 12 percent chance that climate change will reduce global output by more than 50 percent by 2100, they say, and a 51 percent chance that it lowers per capita GDP by 20 percent or more by then, unless emissions decline. By comparison, the Great Recession lowered global GDP by about 6 percent, in a onetime shock; Hsiang and his colleagues estimate a one-in-eight chance of an ongoing and irreversible effect by the end of the century that is eight times worse.

The scale of that economic devastation is hard to comprehend, but you can start by imagining what the world would look like today with an economy half as big, which would produce only half as much value, generating only half as much to offer the workers of the world. It makes the grounding of flights out of heat-stricken Phoenix last month seem like pathetically small economic potatoes. And, among other things, it makes the idea of postponing government action on reducing emissions and relying solely on growth and technology to solve the problem an absurd business calculation.
Every round-trip ticket on flights from New York to London, keep in mind, costs the Arctic three more square meters of ice.

VIII. Poisoned Oceans

Sulfide burps off the skeleton coast.

That the sea will become a killer is a given. Barring a radical reduction of emissions, we will see at least four feet of sea-level rise and possibly ten by the end of the century. A third of the world’s major cities are on the coast, not to mention its power plants, ports, navy bases, farmlands, fisheries, river deltas, marshlands, and rice-paddy empires, and even those above ten feet will flood much more easily, and much more regularly, if the water gets that high. At least 600 million people live within ten meters of sea level today.

But the drowning of those homelands is just the start. At present, more than a third of the world’s carbon is sucked up by the oceans — thank God, or else we’d have that much more warming already. But the result is what’s called “ocean acidification,” which, on its own, may add a half a degree to warming this century. It is also already burning through the planet’s water basins — you may remember these as the place where life arose in the first place. You have probably heard of “coral bleaching” — that is, coral dying — which is very bad news, because reefs support as much as a quarter of all marine life and supply food for half a billion people. Ocean acidification will fry fish populations directly, too, though scientists aren’t yet sure how to predict the effects on the stuff we haul out of the ocean to eat; they do know that in acid waters, oysters and mussels will struggle to grow their shells, and that when the pH of human blood drops as much as the oceans’ pH has over the past generation, it induces seizures, comas, and sudden death.

That isn’t all that ocean acidification can do. Carbon absorption can initiate a feedback loop in which underoxygenated waters breed different kinds of microbes that turn the water still more “anoxic,” first in deep ocean “dead zones,” then gradually up toward the surface. There, the small fish die out, unable to breathe, which means oxygen-eating bacteria thrive, and the feedback loop doubles back. This process, in which dead zones grow like cancers, choking off marine life and wiping out fisheries, is already quite advanced in parts of the Gulf of Mexico and just off Namibia, where hydrogen sulfide is bubbling out of the sea along a thousand-mile stretch of land known as the “Skeleton Coast.” The name originally referred to the detritus of the whaling industry, but today it’s more apt than ever. Hydrogen sulfide is so toxic that evolution has trained us to recognize the tiniest, safest traces of it, which is why our noses are so exquisitely skilled at registering flatulence. Hydrogen sulfide is also the thing that finally did us in that time 97 percent of all life on Earth died, once all the feedback loops had been triggered and the circulating jet streams of a warmed ocean ground to a halt — it’s the planet’s preferred gas for a natural holocaust. Gradually, the ocean’s dead zones spread, killing off marine species that had dominated the oceans for hundreds of millions of years, and the gas the inert waters gave off into the atmosphere poisoned everything on land. Plants, too. It was millions of years before the oceans recovered.

IX. The Great Filter

Our present eeriness cannot last.

So why can’t we see it? In his recent book-length essay The Great Derangement, the Indian novelist Amitav Ghosh wonders why global warming and natural disaster haven’t become major subjects of contemporary fiction — why we don’t seem able to imagine climate catastrophe, and why we haven’t yet had a spate of novels in the genre he basically imagines into half-existence and names “the environmental uncanny.” “Consider, for example, the stories that congeal around questions like, ‘Where were you when the Berlin Wall fell?’ or ‘Where were you on 9/11?’ ” he writes. “Will it ever be possible to ask, in the same vein, ‘Where were you at 400 ppm?’ or ‘Where were you when the Larsen B ice shelf broke up?’ ” His answer: Probably not, because the dilemmas and dramas of climate change are simply incompatible with the kinds of stories we tell ourselves about ourselves, especially in novels, which tend to emphasize the journey of an individual conscience rather than the poisonous miasma of social fate.

Surely this blindness will not last — the world we are about to inhabit will not permit it. In a six-degree-warmer world, the Earth’s ecosystem will boil with so many natural disasters that we will just start calling them “weather”: a constant swarm of out-of-control typhoons and tornadoes and floods and droughts, the planet assaulted regularly with climate events that not so long ago destroyed whole civilizations. The strongest hurricanes will come more often, and we’ll have to invent new categories with which to describe them; tornadoes will grow longer and wider and strike much more frequently, and hail rocks will quadruple in size. Humans used to watch the weather to prophesy the future; going forward, we will see in its wrath the vengeance of the past. Early naturalists talked often about “deep time” — the perception they had, contemplating the grandeur of this valley or that rock basin, of the profound slowness of nature. What lies in store for us is more like what the Victorian anthropologists identified as “dreamtime,” or “everywhen”: the semi-mythical experience, described by Aboriginal Australians, of encountering, in the present moment, an out-of-time past, when ancestors, heroes, and demigods crowded an epic stage. You can find it already watching footage of an iceberg collapsing into the sea — a feeling of history happening all at once.

It is. Many people perceive climate change as a sort of moral and economic debt, accumulated since the beginning of the Industrial Revolution and now come due after several centuries — a helpful perspective, in a way, since it is the carbon-burning processes that began in 18th-century England that lit the fuse of everything that followed. But more than half of the carbon humanity has exhaled into the atmosphere in its entire history has been emitted in just the past three decades; since the end of World War II, the figure is 85 percent. Which means that, in the length of a single generation, global warming has brought us to the brink of planetary catastrophe, and that the story of the industrial world’s kamikaze mission is also the story of a single lifetime. My father’s, for instance: born in 1938, among his first memories the news of Pearl Harbor and the mythic Air Force of the propaganda films that followed, films that doubled as advertisements for imperial-American industrial might; and among his last memories the coverage of the desperate signing of the Paris climate accords on cable news, ten weeks before he died of lung cancer last July. Or my mother’s: born in 1945, to German Jews fleeing the smokestacks through which their relatives were incinerated, now enjoying her 72nd year in an American commodity paradise, a paradise supported by the supply chains of an industrialized developing world. She has been smoking for 57 of those years, unfiltered.

Or the scientists’. Some of the men who first identified a changing climate (and given the generation, those who became famous were men) are still alive; a few are even still working. Wally Broecker is 84 years old and drives to work at the Lamont-Doherty observatory across the Hudson every day from the Upper West Side. Like most of those who first raised the alarm, he believes that no amount of emissions reduction alone can meaningfully help avoid disaster. Instead, he puts his faith in carbon capture — untested technology to extract carbon dioxide from the atmosphere, which Broecker estimates will cost at least several trillion dollars — and various forms of “geoengineering,” the catchall name for a variety of moon-shot technologies far-fetched enough that many climate scientists prefer to regard them as dreams, or nightmares, from science fiction. He is especially focused on what’s called the aerosol approach — dispersing so much sulfur dioxide into the atmosphere that when it converts to sulfuric acid, it will cloud a fifth of the horizon and reflect back 2 percent of the sun’s rays, buying the planet at least a little wiggle room, heat-wise. “Of course, that would make our sunsets very red, would bleach the sky, would make more acid rain,” he says. “But you have to look at the magnitude of the problem. You got to watch that you don’t say the giant problem shouldn’t be solved because the solution causes some smaller problems.” He won’t be around to see that, he told me. “But in your lifetime …”

Jim Hansen is another member of this godfather generation. Born in 1941, he became a climatologist at the University of Iowa, developed the groundbreaking “Zero Model” for projecting climate change, and later became the head of climate research at NASA, only to leave under pressure when, while still a federal employee, he filed a lawsuit against the federal government charging inaction on warming (along the way he got arrested a few times for protesting, too). The lawsuit, which is brought by a collective called Our Children’s Trust and is often described as “kids versus climate change,” is built on an appeal to the equal-protection clause, namely, that in failing to take action on warming, the government is violating it by imposing massive costs on future generations; it is scheduled to be heard this winter in Oregon district court. Hansen has recently given up on solving the climate problem with a carbon tax, which had been his preferred approach, and has set about calculating the total cost of extracting carbon from the atmosphere instead.

Hansen began his career studying Venus, which was once a very Earth-like planet with plenty of life-supporting water before runaway climate change rapidly transformed it into an arid and uninhabitable sphere enveloped in an unbreathable gas; he switched to studying our planet by 30, wondering why he should be squinting across the solar system to explore rapid environmental change when he could see it all around him on the planet he was standing on. “When we wrote our first paper on this, in 1981,” he told me, “I remember saying to one of my co-authors, ‘This is going to be very interesting. Sometime during our careers, we’re going to see these things beginning to happen.’ ”

Several of the scientists I spoke with proposed global warming as the solution to Fermi’s famous paradox, which asks, If the universe is so big, then why haven’t we encountered any other intelligent life in it? The answer, they suggested, is that the natural life span of a civilization may be only several thousand years, and the life span of an industrial civilization perhaps only several hundred. In a universe that is many billions of years old, with star systems separated as much by time as by space, civilizations might emerge and develop and burn themselves up simply too fast to ever find one another. Peter Ward, a charismatic paleontologist among those responsible for discovering that the planet’s mass extinctions were caused by greenhouse gas, calls this the “Great Filter”: “Civilizations rise, but there’s an environmental filter that causes them to die off again and disappear fairly quickly,” he told me. “If you look at planet Earth, the filtering we’ve had in the past has been in these mass extinctions.” The mass extinction we are now living through has only just begun; so much more dying is coming.

And yet, improbably, Ward is an optimist. So are Broecker and Hansen and many of the other scientists I spoke to. We have not developed much of a religion of meaning around climate change that might comfort us, or give us purpose, in the face of possible annihilation. But climate scientists have a strange kind of faith: We will find a way to forestall radical warming, they say, because we must.

It is not easy to know how much to be reassured by that bleak certainty, and how much to wonder whether it is another form of delusion; for global warming to work as parable, of course, someone needs to survive to tell the story. The scientists know that to even meet the Paris goals, by 2050, carbon emissions from energy and industry, which are still rising, will have to fall by half each decade; emissions from land use (deforestation, cow farts, etc.) will have to zero out; and we will need to have invented technologies to extract, annually, twice as much carbon from the atmosphere as the entire planet’s plants now do. Nevertheless, by and large, the scientists have an enormous confidence in the ingenuity of humans — a confidence perhaps bolstered by their appreciation for climate change, which is, after all, a human invention, too. They point to the Apollo project, the hole in the ozone we patched in the 1980s, the passing of the fear of mutually assured destruction. Now we’ve found a way to engineer our own doomsday, and surely we will find a way to engineer our way out of it, one way or another. The planet is not used to being provoked like this, and climate systems designed to give feedback over centuries or millennia prevent us — even those who may be watching closely — from fully imagining the damage done already to the planet. But when we do truly see the world we’ve made, they say, we will also find a way to make it livable. For them, the alternative is simply unimaginable.

*This article appears in the July 10, 2017, issue of New York Magazine.

*This article has been updated to clarify a reference to Peter Brannen’s The Ends of the World.

How to Be a Buddhist in Today’s World

Religion now faces three main challenges: communism, science and consumerism.

Today the world faces a crisis related to lack of respect for spiritual principles and ethical values. Such virtues cannot be forced on society by legislation or by science, nor can fear inspire ethical conduct. Rather, people must have conviction in the worth of ethical principles so that they want to live ethically.

The U.S. and India, for example, have solid governmental institutions, but many of the people involved lack ethical principles. Self-discipline and self-restraint of all citizens—from CEOs to lawmakers to teachers—are needed to create a good society. But these virtues cannot be imposed from the outside. They require inner cultivation. This is why spirituality and religion are relevant in the modern world.

India, where I now live, has been home to the ideas of secularism, inclusiveness and diversity for some 3,000 years. One philosophical tradition asserts that only what we know through our five senses exists. Other Indian philosophical schools criticize this nihilistic view but still regard the people who hold it as rishis, or sages. I promote this type of secularism: to be a kind person who does not harm others regardless of profound religious differences.

In previous centuries, Tibetans knew little about the rest of the world. We lived on a high and broad plateau surrounded by the world’s tallest mountains. Almost everyone, except for a small community of Muslims, was Buddhist. Very few foreigners came to our land. Since we went into exile in 1959, Tibetans have been in contact with the rest of the world. We relate with religions, ethnic groups and cultures that hold a broad spectrum of views.

Further, Tibetan youth now receive a modern education in which they are exposed to opinions not traditionally found in their community. It is now imperative that Tibetan Buddhists be able to explain clearly their tenets and beliefs to others using reason. Simply quoting from Buddhist scriptures does not convince people who did not grow up as Buddhists of the validity of the Buddha’s doctrine. If we try to prove points only by quoting scripture, these people may respond: “Everyone has a book to quote from!”

Religion faces three principal challenges today: communism, modern science and the combination of consumerism and materialism. Although the Cold War ended decades ago, communist beliefs and governments still strongly affect life in Buddhist countries. In Tibet, the communist government controls the ordination of monks and nuns while also regulating life in the monasteries and nunneries. It controls the education system, teaching children that Buddhism is old-fashioned.

Modern science, up until now, has confined itself to studying phenomena that are material in nature. Scientists largely examine only what can be measured with scientific instruments, limiting the scope of their investigations and their understanding of the universe. Phenomena such as rebirth and the existence of the mind as separate from the brain are beyond the scope of scientific investigation. Some scientists, although they have no proof that these phenomena do not exist, consider them unworthy of consideration. But there is reason for optimism. In recent years, I have met with many open-minded scientists, and we have had mutually beneficial discussions that have highlighted our common points as well as our diverging ideas—expanding the world views of scientists and Buddhists in the process.

Then there is materialism and consumerism. Religion values ethical conduct, which may involve delayed gratification, whereas consumerism directs us toward immediate happiness. Faith traditions stress inner satisfaction and a peaceful mind, while materialism says that happiness comes from external objects. Religious values such as kindness, generosity and honesty get lost in the rush to make more money and have more and “better” possessions. Many people’s minds are confused about what happiness is and how to create its causes.

If you study the Buddha’s teachings, you may find that some of them are in harmony with your views on societal values, science and consumerism—and some of them are not. That is fine. Continue to investigate and reflect on what you discover. In this way, whatever conclusion you reach will be based on reason, not simply on tradition, peer pressure or blind faith.

The 14th Dalai Lama, Tenzin Gyatso, is the spiritual leader of Tibet. He is co-author, with Thubten Chodron, of “Approaching the Buddhist Path,” from which this article is adapted.

Conservatives Go Third ‘I’ Blind

Bret Stephens, July 6, 2017

On the subject of cycles, Warren Buffett likes to talk about “the natural progression, the three I’s.” As he put it to Charlie Rose in 2008, those I’s are “the innovators, the imitators and the idiots.” One creates, one enhances — and one screws it all up. Then, presumably, the cycle starts afresh.

Buffett was describing the process that led to the 2008 housing and financial crises. But he might as well have been talking about the decline of the conservative movement in America.

I was reminded of this again last week, on news that the Fox News host Sean Hannity will receive the William F. Buckley Jr. Award for Media Excellence later this year at a gala dinner in Washington, D.C. As honors go, neither the award nor the organization bestowing it — the Media Research Center — is particularly noteworthy.

But sometimes symbolism is more potent than fact. If we have reached the point where rank-and-file conservatives see nothing amiss with giving Hannity an award named for Buckley, then surely there’s a Milton Friedman Prize awaiting Steve Bannon for his insights on free trade. And maybe Sean Spicer can receive the Vaclav Havel International Prize for Creative Dissent for his role in exposing “fake news.” The floor’s the limit.

Or, in Hannity’s case, the crawl space beneath it.

In 1950, Lionel Trilling wrote that there were no conservative ideas “in general circulation,” only “irritable mental gestures which seek to resemble ideas.” By the time Trilling died 25 years later the opposite was true: The only consequential ideas at the time were conservative, while it was liberalism that had been reduced to an irritable mental gesture.

This was largely Buckley’s doing. Through National Review, his magazine, he gave a hidden American intelligentsia a platform to develop conservative ideas. Through “Firing Line,” his TV show, he gave an unsuspecting American public a chance to sample conservative wit. Not all of the ideas were right, but they were usually smart. And as they evolved, they went in the right direction.

Buckley “learned to free himself of views that had come to him by the circumstances of his background that he concluded ran counter to values he cherished,” notes Alvin Felzenberg in his superb new biography, “A Man and His Presidents.” Buckley shed isolationism, segregationism and anti-Semitism, and insisted the conservative movement do likewise. Over 50 years as the gatekeeper of conservative ideas, he denounced the inverted Marxism of Ayn Rand, the conspiracy theories of Robert Welch (founder of the John Birch Society) and the white populism of George Wallace and Pat Buchanan.

In March 2000, he trained his sights on “the narcissist” and “demagogue” Donald Trump. “When he looks at a glass, he is mesmerized by its reflection,” he wrote in a prophetic short essay in Cigar Aficionado. “The resistance to a corrupting demagogy,” he warned, “should take first priority” for Americans.

Buckley died in 2008. The conservatism he nourished was fundamentally literary: To play a significant part in it you had to know how to write, and in order to write well you had to read widely, and in order to do that you had to, well, enjoy reading. In hindsight, 2008, the year of Sarah Palin, was also the year when literary conservatism went into eclipse.

Suddenly, you didn’t need to devote a month to researching and writing a 7,000-word critique of the Obama administration’s policy on, say, Syria to be taken seriously as a conservative foreign-policy expert. You just needed to mouth off about it for five minutes on “The O’Reilly Factor.” For books there were always ghostwriters; publicity on Fox ensured they would always top The Times’s best-seller lists.

Influence ceased to be measured by respectability — op-eds published in The Wall Street Journal; keynotes delivered to the American Enterprise Institute — and came to be measured by ratings. The quality of an idea could be tested not by its ability to withstand scrutiny from experts, but by the willingness of people to swallow it.

It shouldn’t be a surprise that a post-literate conservative world should have been so quick to embrace a semi-literate presidential candidate. Nor, in hindsight, is it strange that, with the role Buckley once played in maintaining conservative ideological hygiene now retired, the ideas he expunged should have made such a quick and pestilential comeback.

Thus, when Hannity peddles conspiracy theories about Seth Rich, the young Democratic National Committee staffer murdered in Washington last year, that’s an echo of John Birch. When fellow Fox host Tucker Carlson — who once aspired to be the next Buckley and now aims to be the next Ann Coulter — tries to reinvent himself as the tribune of the working class, he’s speaking for the modern-day George Wallace voter. Isolationism is already back, thanks to Trump. Anti-Semitism can’t be far behind, either, and not just on the alt-right.

And so we reach the Idiot stage of the conservative cycle, in which a Buckley Award for Sean Hannity suggests nothing ironic, much less Orwellian, to those bestowing it, applauding it, or even shrugging it off. The award itself is trivial, but it’s a fresh reminder of who now holds the commanding heights of conservative life, and what it is that they think.

In the financial world, we know how this stage ended for investors, not to mention the rest of the country. The political right might consider that a similar destiny awaits.

Posted in Marty's Blog | Leave a comment

Why the media has broken down in the age of Trump

Michael Goodwin July 2, 2017

Since President Trump was elected, the media landscape has divided and hardened more than ever. Even the once-unimpeachable New York Times has been guilty of “fake news,” while on Tuesday CNN had to retract an article that slimed a Trump aide based on flimsy reporting. In April, The Post’s Michael Goodwin delivered this speech at a Hillsdale College National Leadership Seminar in Atlanta, analyzing how we got here — and how journalism can survive.

I’ve been a journalist for a long time. Long enough to know that it wasn’t always like this. There was a time not so long ago when journalists were trusted and admired. We were generally seen as trying to report the news in a fair and straightforward manner. Today, all that has changed. For that, we can blame the 2016 election or, more accurately, how some news organizations chose to cover it. Among the many firsts, last year’s election gave us the gobsmacking revelation that most of the mainstream media puts both thumbs on the scale — that most of what you read, watch and listen to is distorted by intentional bias and hostility. I have never seen anything like it. Not even close.

It’s not exactly breaking news that most journalists lean left. I used to do that myself. I grew up at the New York Times, so I’m familiar with the species. For most of the media, bias grew out of the social revolution of the 1960s and ’70s. Fueled by the civil rights and anti-Vietnam War movements, the media jumped on the anti-authority bandwagon writ large. The deal was sealed with Watergate, when journalism was viewed as more trusted than government — and far more exciting and glamorous. Think Robert Redford in “All the President’s Men.” Ever since, young people became journalists because they wanted to be the next Woodward and Bernstein, find a Deep Throat, and bring down a president. Of course, most of them only wanted to bring down a Republican president. That’s because liberalism is baked into the journalism cake.

During the years I spent teaching at the Columbia University School of Journalism, I often found myself telling my students that the job of the reporter was “to comfort the afflicted and afflict the comfortable.” I’m not even sure where I first heard that line, but it still captures the way most journalists think about what they do. Translate the first part of that compassionate-sounding idea into the daily decisions about what makes news, and it is easy to fall into the habit of thinking that every person afflicted by something is entitled to help. Or, as liberals like to say, “Government is what we do together.” From there, it’s a short drive to the conclusion that every problem has a government solution.

The rest of that journalistic ethos — “afflict the comfortable” — leads to the knee-jerk support of endless taxation. Somebody has to pay for that government intervention the media loves to demand. In the same vein, and for the same reason, the average reporter will support every conceivable regulation as a way to equalize conditions for the poor. He will also give sympathetic coverage to groups like Occupy Wall Street and Black Lives Matter.

A new dimension

I knew all of this about the media mindset going into the 2016 presidential campaign. But I was still shocked at what happened. This was not naïve liberalism run amok. This was a whole new approach to politics. No one in modern times had seen anything like it. As with grief, there were several stages. In the beginning, Donald Trump’s candidacy was treated as an outlandish publicity stunt, as though he wasn’t a serious candidate and should be treated as a circus act. But television executives quickly made a surprising discovery: The more they put Trump on the air, the higher their ratings climbed. Ratings are money. So news shows started devoting hours and hours simply to pointing the cameras at Trump and letting them run.

As his rallies grew, the coverage grew, which made for an odd dynamic. The candidate nobody in the media took seriously was attracting the most people to his events and getting the most news coverage. Newspapers got in on the game too. Trump, unlike most of his opponents, was always available to the press, and could be counted on to say something outrageous or controversial that made a headline. He made news by being a spectacle.

Despite the mockery of journalists and late-night comics, something extraordinary was happening. Trump was dominating a campaign none of the smart money thought he could win. And then, suddenly, he was winning. Only when the crowded Republican field began to thin and Trump kept racking up primary and caucus victories did the media’s tone grow more serious.


One study estimated that Trump had received so much free airtime that if he had had to buy it, the price would have been $2 billion. The realization that they had helped Trump’s rise seemed to make many executives, producers and journalists furious. By the time he secured the nomination and the general election rolled around, they were gunning for him. Only two people now had a chance to be president, and the overwhelming media consensus was that it could not be Donald Trump. They would make sure of that. The coverage of him grew so vicious and one-sided that last August, I wrote a column on the unprecedented bias. Under the headline “American journalism is collapsing before our eyes,” I wrote that the so-called cream of the media crop was “engaged in a naked display of partisanship” designed to bury Trump and elect Hillary Clinton.

The evidence was on the front page, the back page, the culture pages, even the sports pages. It was at the top of the broadcast and at the bottom of the broadcast. Day in, day out, in every media market in America, Trump was savaged like no other candidate in memory. We were watching the total collapse of standards, with fairness and balance tossed overboard. Every story was an opinion masquerading as news, and every opinion ran in the same direction — toward Clinton and away from Trump.

For the most part, I blame the New York Times and the Washington Post for causing this breakdown. The two leading liberal newspapers were trying to top each other in their demonization of Trump and his supporters. They set the tone, and most of the rest of the media followed like lemmings.

On one level, tougher scrutiny of Trump was clearly defensible. He had a controversial career and lifestyle, and he was seeking the presidency as his first job in government. He also provided (and continues to provide) lots of fuel with some of his outrageous words and deeds. But from the beginning there was also a second element to the lopsided coverage. The New York Times has not endorsed a Republican for president since Dwight Eisenhower in 1956, meaning it would back a dead raccoon if it had a “D” after its name. Think of it — George McGovern over Richard Nixon? Jimmy Carter over Ronald Reagan? Walter Mondale over Reagan? Any Democrat would do. And the Washington Post, which only started making editorial endorsements in the 1970s, has never once endorsed a Republican for president.

But again, I want to emphasize that 2016 had those predictable elements plus a whole new dimension. This time, the papers dropped the pretense of fairness and jumped headlong into the tank for one candidate over the other. The Times media reporter began a story this way:

“If you’re a working journalist and you believe that Donald J. Trump is a demagogue playing to the nation’s worst racist and nationalist tendencies, that he cozies up to anti-American dictators and that he would be dangerous with control of the United States nuclear codes, how the heck are you supposed to cover him?”

I read that paragraph and I thought to myself, well, that’s actually an easy question. If you feel that way about Trump, normal journalistic ethics would dictate that you shouldn’t cover him. You cannot be fair. And you shouldn’t be covering Hillary Clinton either, because you’ve already decided who should be president. Go cover sports or entertainment. Yet the Times media reporter rationalized the obvious bias he had just acknowledged, citing the view that Clinton was “normal” and Trump was not.

New York Times executive editor Dean Baquet. Credit New York Times

I found the whole concept appalling. What happened to fairness? What happened to standards? I’ll tell you what happened to them. The Times’ top editor, Dean Baquet, eliminated them. In an interview last October with the Nieman Foundation for Journalism at Harvard, Baquet admitted that the piece by his media reporter had nailed his own thinking. Trump “challenged our language,” he said, and Trump “will have changed journalism.” Of the daily struggle for fairness, Baquet had this to say: “I think that Trump has ended that struggle. . . . We now say stuff. We fact check him. We write it more powerfully that [what he says is] false.”

Baquet was being too modest. Trump was challenging, sure, but it was Baquet who changed journalism. He’s the one who decided that the standards of fairness and nonpartisanship could be abandoned without consequence.

With that decision, Baquet also changed the basic news story formula. To the age-old elements of who, what, when, where and why, he added the reporter’s opinion. Now the floodgates were open, and virtually every so-called news article reflected a clear bias against Trump. Stories, photos, headlines, placement in the paper — all the tools that writers and editors have — were summoned to the battle. The goal was to pick the next president.

Thus began the spate of stories, which continues today, in which the Times routinely calls Trump a liar in its news pages and headlines. Again, the contrast with the past is striking. The Times never called Barack Obama a liar, despite such obvious opportunities as “you can keep your doctor” and “the Benghazi attack was caused by an internet video.” Indeed, the Times and the Washington Post, along with most of the White House press corps, spent eight years cheerleading the Obama administration, seeing not a smidgen of corruption or dishonesty. They have been tougher on Hillary Clinton during her long career. But they still never called her a liar, despite such doozies as “I set up my own computer server so I would only need one device,” “I turned over all the government emails,” and “I never sent or received classified emails.” All those were lies, but not to the national media. Only statements by Trump were fair game.

As we know now, most of the media totally missed Trump’s appeal to millions upon millions of Americans. The prejudice against him blinded those news organizations to what was happening in the country. Even more incredibly, I believe the bias and hostility directed at Trump backfired. The feeling that the election was, in part, a referendum on the media gave some voters an extra incentive to vote for Trump. A vote for him was a vote against the media and against Washington. Not incidentally, Trump used that sentiment to his advantage, often revving up his crowds with attacks on reporters. He still does.

If I haven’t made it clear, let me do so now. The behavior of much of the media, but especially the New York Times, was a disgrace. I don’t believe it ever will recover the public trust it squandered.

The Times’ previous reputation for having the highest standards was legitimate. Those standards were developed over decades to force reporters and editors to be fair and to gain public trust. The commitment to fairness made the New York Times the flagship of American journalism. But standards are like laws in the sense that they are designed to guide your behavior in good times and in bad. Consistent adherence to them was the source of the Times’ credibility. And eliminating them has made the paper less than ordinary. Its only standards now are double standards.

Abe Rosenthal. Credit AP

I say this with great sadness. I was blessed to grow up at the Times, getting a clerical job right out of college and working my way onto the reporting staff, where I worked for a decade. It was the formative experience of my career where I learned most of what I know about reporting and writing. Alas, it was a different newspaper then. Abe Rosenthal was the editor in those days, and long before we’d ever heard the phrase “zero tolerance,” that’s what Abe practiced toward conflicts of interest and reporters’ opinions. He set the rules and everybody knew it.

Here is a true story about how Abe Rosenthal resolved a conflict of interest. A young woman was hired by the Times from one of the Philadelphia newspapers. But soon after she arrived in New York, a story broke in Philly that she had had a romantic affair with a political figure she had covered, and that she had accepted a fur coat and other expensive gifts from him. When he saw the story, Abe called the woman into his office and asked her if it was true. When she said yes, he told her to clean out her desk — that she was finished at the Times and would never work there again. As word spread through the newsroom, some reporters took the woman’s side and rushed in to tell Abe that firing her was too harsh. He listened for about 30 seconds and said, in so many words, “I don’t care if you f–k an elephant on your personal time, but then you can’t cover the circus for the paper.” Case closed. The conflict-of-interest policy was clear, absolute, and unforgettable.

As for reporters’ opinions, Abe had a similar approach. He didn’t want them in the news pages. And if you put them in, he took them out. They belonged in the opinion pages only, which were managed separately. Abe said he knew reporters tended to lean left and would find ways to sneak their views into the stories. So he saw his job as steering the paper slightly to the right. “That way,” he said, “the paper would end up in the middle.” He was well known for this attitude, which he summed up as “keeping the paper straight.” He even said he wanted his epitaph to read, “He kept the paper straight.” Like most people, I thought this was a joke. But after I related all this in a column last year, his widow contacted me and said it wasn’t a joke — that, in fact, Abe’s tombstone reads, “He kept the paper straight.” She sent me a picture to prove it. I published that picture of his tombstone alongside a column where I excoriated the Times for its election coverage. Sadly, the Times’ high standards were buried with Abe Rosenthal.

Looking to the future

Which brings us to the crucial questions. Can the American media be fixed? And is there anything that we as individuals can do to make a difference? The short answer to the first question is, “No, it can’t be fixed.” The 2016 election was the media’s Humpty Dumpty moment. It fell off the wall, shattered into a million pieces, and can’t be put back together again. In case there is any doubt, 2017 is confirming that the standards are still dead. The orgy of visceral Trump-bashing continues unabated.

But the future of journalism isn’t all gloom and doom. In fact, if we accept the new reality of widespread bias and seize the potential it offers, there is room for optimism. Consider this: The election showed the country is roughly divided 50-50 between people who will vote for a Democrat and people who will vote for a Republican. But our national media is more like 80-20 in favor of Democrats. While the media should, in theory, broadly reflect the public, it doesn’t. Too much of the media acts like a special interest group. Detached from the greater good, it exists to promote its own interest and the political party with which it is aligned.

Ronald Reagan’s optimism is often expressed in a story that is surely apocryphal, but irresistible. He is said to have come across a barn full of horse manure and remarked cheerfully that there must be a pony in it somewhere. I suggest we look at the media landscape in a similar fashion. The mismatch between the mainstream media and the public’s sensibilities means there is a vast untapped market for news and views that are not now represented. To realize that potential, we need only three ingredients, and we already have them: first, free speech; second, capitalism and free markets; and third, you, the consumers of news.

Free speech is under assault, most obviously on many college campuses, but also in the news media, which presents a conformist view to its audience and gets a politically segregated audience in return. Look at the letters section in the New York Times — virtually every reader who writes in agrees with the opinions of the paper. This isn’t a miracle; it’s a bubble. Liberals used to love to say, “I don’t agree with your opinion, but I would fight to the death for your right to express it.” You don’t hear that anymore from the Left. Now they want to shut you up if you don’t agree. And they are having some success.


But there is a countervailing force. Look at what happened this winter when the Left organized boycotts of department stores that carried Ivanka Trump’s clothing and jewelry. Nordstrom folded like a cheap suit, but Trump’s supporters rallied on social media and Ivanka’s company had its best month ever. This is the model I have in mind for the media. It is similar to how FOX News got started. Rupert Murdoch (who owns the New York Post) thought there was an untapped market for a more fair and balanced news channel, and he recruited the late Roger Ailes to start it more than 20 years ago. Ailes found a niche market, all right — half the country!

Incredible advances in technology are also on the side of free speech. The explosion of choices makes it almost impossible to silence all dissent and gain a monopoly, though certainly Facebook and Google are trying.

As for the necessity of preserving capitalism, look around the world. Nations without economic liberty usually have little or no dissent. That’s not a coincidence. In this, I’m reminded of an enduring image from the Occupy Wall Street movement. That movement was a pestilence, egged on by President Obama and others who view other people’s wealth as a crime against the common good. This attitude was on vivid display as the protesters held up their iPhones to demand the end of capitalism. As I wrote at the time, did they believe Steve Jobs made each and every Apple product one at a time in his garage? Did they not have a clue about how capital markets make life better for more people than any other system known to man? They had no clue. And neither do many government officials, who think they can kill the golden goose and still get golden eggs.

Which brings me to the third necessary ingredient in determining where we go from here. It’s you. I urge you to support the media you like. As the great writer and thinker Midge Decter once put it, “You have to join the side you’re on.” It’s no secret that newspapers and magazines are losing readers and money and shedding staff. Some of them are good newspapers. Some of them are good magazines. There are also many wonderful, thoughtful, small publications and websites that exist on a shoestring. Don’t let them die. Subscribe or contribute to those you enjoy. Give subscriptions to friends. Put your money where your heart and mind are. An expanded media landscape that better reflects the diversity of public preferences would, in time, help create a more level political and cultural arena. That would be a great thing. So again I urge you: Join the side you’re on.

Posted in Marty's Blog | Leave a comment

How Twitter Pornified Politics

Bret Stephens JUNE 23, 2017

This is the column in which I formally forswear Twitter for good. I’ll keep my Twitter handle, and hopefully my followers, but an editorial assistant will manage the account from now on. I’ll intercede only to say nice things about the writing I admire, the people I like and the music I love.

Why now? Because, while reading a cover story in New York magazine, it occurred to me that Twitter is the political pornography of our time: revealing but distorting, exciting but dulling, debasing to its users, and, well, ejaculatory. It’s bad for the soul and, as Donald Trump proves daily, bad for the country.

The story, by Maureen O’Connor, makes use of a decade’s worth of big-data analytics from the website Pornhub, which attracts 75 million visitors a day. The result is what she calls “the Kinsey Report of Our Time” — an unvarnished and unfiltered portrait of the unchecked libido.

Since this is a family newspaper, readers will have to learn the more salacious details of O’Connor’s article by consulting it for themselves. But one important point stands out. “Pornography trains us to redirect sexual desire as mimetic desire,” she writes. “That is, the sociological theory — and the marketers’ dream — that humans learn to want what they see.”

Steve Jobs expressed a similar thought in 1998: “People don’t know what they want until you show it to them.” Technology doesn’t merely service needs. It also teaches wants. You never thought you’d need an iPhone, but you do. You didn’t know you were into kinky massage videos, but you are. We discover our innermost — and bottom-most — selves only when someone else opens the basement door.

That is what Twitter has been for our politics. Short-form writing can be informative, aphoristic and funny. Twitter is terrific when tailored as a personalized wire service and can be a useful way to communicate with readers. And where would our literary culture be without @WtfRenaissance or @LosFelizDaycare?

But Twitter’s degrading uses tend to overwhelm its elevating ones. If pornography is about the naked, grunting body, Twitter is about the naked, grunting brain. It’s whatever pops out. And what pops out is altogether too revealing.

Another insight from O’Connor’s article: “Porn has always been a place for indulging irrational, secret and socially unacceptable desires — which makes it a place where people feel free to let their racial prejudices and fantasies run wild, too.”

Twitter is no different. Bigotry flourishes on Twitter, since it offers bigots the benefits of anonymity along with instantaneous, uncensored self-publication. It’s the place where their political minds can be as foul as they want to be — without the expense or reputational risk of showing their faces at a Richard Spencer rally.

Twitter doesn’t merely amplify ugliness. It erases nuance, coarsens thought, turns into a game of “Telephone” in which original meaning becomes hopelessly garbled with every successive re-tweet. It also facilitates a form of self-righteous digital bullying and mob-like behavior that can wreck people’s lives.

Ask Justine Sacco, a P.R. executive who in 2013 sent an ironic tweet to her 170 followers just as she was about to step on a flight to Cape Town. “Going to Africa,” she wrote. “Hope I don’t get AIDS. Just kidding. I’m white!”

She emerged from the plane to discover that what she had intended as a mordant observation about white privilege hadn’t been read that way, and that in 11 short hours she had become the poster racist in a worldwide shaming campaign. She lost her job. Twitter, as the author Jon Ronson has noted, is the 21st century’s answer to the pillory.

That, too, is part of the pornography of Twitter: pleasurably bearing witness to the mockery or humiliation of others. Things we would never say in person, acts we would never perform, become safe to indulge thanks to the prophylactic of a digital interface. After I took this job, one wag on Twitter wrote that he hoped I’d be “Danny Pearl-ed.” He must have found it funny. My 11-year-old son didn’t.

No discussion of the evils of Twitter would be complete without trying to understand the 45th president’s fondness for it. It should be no surprise that he’s a keen user, since it’s the reptilian medium for the reptilian brain.

But it’s also ideally suited for his style of crowd politics: unmediated, blunt and burst-like. It’s how he escapes the softening influence of his advisers and speechwriters. It’s how he maintains the aura of charismatic authenticity that is the prerequisite of populist politics. It’s how he pretends to mingle with his followers while increasing his distance from them. Juan Perón would have loved Twitter.

Politics, like eros, can open the way to the elevation of our souls. Or it can do the opposite. Time for people who care about politics and souls to get off Twitter.

Posted in Marty's Blog | Leave a comment

Our Fake Democracy

Timothy Egan JUNE 23, 2017


We tell ourselves stories in order to live, as Joan Didion said. We do this as a nation, as individuals, as families — even when that construct is demonstrably false. For the United States, the biggest institutional lie of the moment is that we have a government of the people, responding to majority will.

On almost every single concern, Congress — whether it’s the misnamed People’s House, or the Senate, laughably mischaracterized as the world’s greatest deliberative body — is going against what most of the country wants. And Congress is doing this because there will be no consequences.

We have a fake democracy, growing less responsive and less representative by the day.

The biggest example of this is the monstrosity of a health care bill, which a cartel of Republicans finally allowed us to peek at on Thursday. The lobbyists have seen it, of course. But for the rest of us, our first look at a radical overhaul of one-sixth of the economy, something that touches every American, comes too late to make our voices heard.

Crafted in total darkness, the bill may pass by a slim majority of people who have not read it. Inevitably, with something that deprives upward of 23 million Americans of health care, people will die because of this bill. States will be making life and death decisions as they drop the mandated benefits of Obamacare and cut vital care for the poor, the elderly, the sick and the drug-addicted through Medicaid. The sunset of Obamacare is the dawn of death panels.

It would be understandable if Republicans were doing this because it’s what most Americans want them to do. But it’s not. Only about 25 percent of Americans approved of a similar version of this bill, the one passed by the House. By a nearly 2 to 1 margin, people would prefer that the Affordable Care Act be kept in place and fixed, rather than junked for this cruel alternative.

The Senate bill is “by far, the most harmful piece of legislation I have seen in my lifetime,” said Senator Bernie Sanders. At age 75, he’s seen a lot.

Remember when Republicans used to pretend to care about crafting the people’s business in sunlight? “It’s simply wrong for legislation that will affect 100 percent of the American people to be negotiated behind closed doors.” That was Mike Pence in 2010.

Why are they doing it? Why would the people’s representatives choose to hurt their own people? The answer is further evidence of our failed democracy. About 75 million Americans depend on Medicaid. This bill will make their lives more miserable and perilous in order to give the wealthiest 2 percent of Americans a tax cut.

So, little surprise that Republicans are also working to make it even harder for the poor to vote. They can seek to disenfranchise one class of Americans, and get away with it from the safety of gerrymandered seats.

The symptoms of democratic collapse — from the opioid crisis among people who long ago checked out of active citizenship to the stagnation of class mobility — cry for immediate action.

It takes the median worker twice as many hours a month to pay rent in a big city today as it did in the early years of the baby boomer era, as Edward Luce notes in his new book, “The Retreat of Western Liberalism.” Add towering increases in health care and college costs to that and you’ve got an unclimbable wall between low-income limbo and a chance at the middle class. The United States, once known for its American Dream, now has the lowest class mobility of any Western democracy, according to Luce.

What is Congress doing? Nothing on wages. Nothing on college tuition. And the health care bill will most surely force many people to choose between buying groceries and being able to visit a doctor.

Our fake democracy reveals itself daily. Less than a third of Americans support President Trump’s decision to withdraw from the Paris Climate Agreement. In a truly representative government, you would see the other two-thirds, the common-sense majority, howling from the halls of Congress.

Most Americans are also against building a wall along the Mexican border. They would prefer putting taxpayers’ billions into roads, bridges, schools and airports. But the wall remains a key part of President Trump’s agenda.

Trump is president, of course, despite losing the popular vote by nearly 3 million people. Almost 60 percent of the public is against him now. In a parliamentary system, he’d be thrown out in a no-confidence vote. In our system, he’s primed to change life for every citizen, against the wishes of a majority of Americans. Try calling that a democracy while keeping a straight face.

Posted in Marty's Blog | Leave a comment

What Monkeys Can Teach Us About Fairness

 

Monkeys were taught in an experiment to hand over pebbles in exchange for cucumber slices. They were happy with this deal.

Then the researcher randomly offered one monkey — in sight of a second — an even better deal: a grape for a pebble. Monkeys love grapes, so this fellow was thrilled.

The researcher then returned to the second monkey, but presented just a cucumber for the pebble. Now, this offer was insulting. In some cases the monkey would throw the cucumber back at the primatologist in disgust.

In other words, the monkeys cared deeply about fairness. What mattered to them was not just the reward itself but how it compared with what the other monkey received.

Monkeys aren’t the only primates instinctively offended by inequality. For example, two scholars examined data from millions of flights to identify what factors resulted in “air rage” incidents. One huge factor: a first-class cabin.

An incident in a coach section was four times as likely if the plane also had a first-class cabin; a first-class section increased the risk of a disturbance as much as a nine-hour delay did.

When there is a first-class section, it is at the front of the plane, and economy passengers typically walk through it to reach their seats; on some flights, though, passengers board in the middle of the plane. The researchers found that an air-rage incident in coach was three times as likely when economy passengers had to walk through first class compared with when they bypassed it.

Keith Payne, a professor of psychology at the University of North Carolina at Chapel Hill, tells of this research in a brilliant new book, “The Broken Ladder,” about how inequality destabilizes societies. It’s an important, fascinating read arguing that inequality creates a public health crisis in America.

The data on inequality is, of course, staggering. The top 1 percent in America owns more than the bottom 90 percent. The annual Wall Street bonus pool alone is more than the combined annual earnings of all Americans working full time, year-round, at the federal minimum wage of $7.25 an hour, according to the Institute for Policy Studies. And what’s becoming clearer is the fraying of the social fabric that results.

Payne challenges a common perception that the real problem isn’t inequality but poverty, and he’s persuasive that societies are shaped not just by disadvantage at the bottom but also by inequality across the spectrum. Addressing inequality must be a priority, for we humans are social creatures: society becomes dysfunctional when we see some receiving grapes and others cucumbers.

The dysfunction affects not only those at the bottom, but also the lucky ones at the top. Consider baseball: Some teams pay players much more disparately than others do, and one might think that pay inequality creates incentives for better performance and more wins.

In fact, economists have crunched the data and found the opposite is true. Teams with greater equality did much better, perhaps because they were more cohesive.

What’s more, it turned out that even the stars did better when they were on teams with flatter pay. “Higher inequality seemed to undercut the superstar players it was meant to incentivize, which is what you would expect if you believed that the chief effect of pay inequality was to reduce cooperation and team cohesion,” Payne notes.

Something similar emerges in national statistics. Countries with the widest gaps in income, including the United States, generally have worse health, more homicides and a greater array of social problems.

People seem to understand this truth intuitively, for they want much less inequality than we have. In a study of people in 40 countries, liberals said C.E.O.s should be paid four times as much as the average worker, while conservatives said five times. In fact, the average C.E.O. at the largest American public companies earns about 350 times as much as the average worker.

Presented with unlabeled pie charts depicting income distributions of two countries, 92 percent of Americans said they would prefer to live with the modest inequality that exists in Sweden. Republicans and Democrats, rich and poor alike — all chose Sweden by similar margins.

“When the level of inequality becomes too large to ignore, everyone starts acting strange,” Payne notes. “Inequality affects our actions and our feelings in the same systematic, predictable fashion again and again.”

“It makes us believe weird things, superstitiously clinging to the world as we want it to be rather than as it is,” he says. “Inequality divides us, cleaving us into camps not only of income but also of ideology and race, eroding our trust in one another. It generates stress and makes us all less healthy and less happy.”

Think of those words in the context of politics today: Doesn’t that diagnosis of stress, division and unhappiness strike a familiar chord?

So much of the national conversation now is focused on President Trump, for understandable reasons. But I suspect that he is a symptom as well as a cause, and that to uncover the root of our national dysfunctions we must go deeper than politics, deeper than poverty, deeper than demagoguery, and confront the inequality that is America today.

Posted in Marty's Blog | Leave a comment

Rebecca Solnit: The Loneliness of Donald Trump: On the Corrosive Privilege of the Most Mocked Man in the World

May 30, 2017  By Rebecca Solnit

Once upon a time, a child was born into wealth and wanted for nothing, but he was possessed by bottomless, endless, grating, grasping wanting, and wanted more, and got it, and more after that, and always more. He was a pair of ragged orange claws upon the ocean floor, forever scuttling, pinching, reaching for more, a carrion crab, a lobster and a boiling lobster pot in one, a termite, a tyrant over his own little empires. He got a boost at the beginning from the wealth handed him and then moved among grifters and mobsters who cut him slack as long as he was useful, or maybe there’s slack in arenas where people live by personal loyalty until they betray, and not by rules, and certainly not by the law or the book. So for seven decades, he fed his appetites and exercised his license to lie, cheat, steal, and stiff working people of their wages, made messes, left them behind, grabbed more baubles, and left them in ruin.

He was supposed to be a great maker of things, but he was mostly a breaker. He acquired buildings and women and enterprises and treated them all alike, promoting and deserting them, running into bankruptcies and divorces, treading on lawsuits the way a lumberjack of old walked across the logs floating on their way to the mill, but as long as he moved in his underworld of dealmakers the rules were wobbly and the enforcement was wobblier and he could stay afloat. But his appetite was endless, and he wanted more, and he gambled to become the most powerful man in the world, and won, careless of what he wished for.

Thinking of him, I think of Pushkin’s telling of the old fairytale of The Fisherman and the Golden Fish. After being caught in the old fisherman’s net, the golden fish speaks up and offers wishes in return for being thrown back in the sea. The fisherman asks him for nothing, though later he tells his wife of his chance encounter with the magical creature. The fisherman’s wife sends him back to ask for a new washtub for her, and then a  second time to ask for a cottage to replace their hovel, and the wishes are granted, and then as she grows prouder and greedier, she sends him to ask that she become a wealthy person in a mansion with servants she abuses, and then she sends her husband back. The old man comes and grovels before the fish, caught between the shame of the requests and the appetite of his wife, and she becomes tsarina and has her boyards and nobles drive the husband from her palace. You could call the husband consciousness—the awareness of others and of oneself in relation to others—and the wife craving.

Finally she wishes to be supreme over the seas and over the fish itself, endlessly uttering wishes, and the old man goes back to the sea to tell the fish—to complain to the fish—of this latest round of wishes. The fish this time doesn’t even speak, just flashes its tail, and the old man turns around to see on the shore his wife with her broken washtub at their old hovel. Overreach is perilous, says this Russian tale; enough is enough. And too much is nothing.

The child who became the most powerful man in the world, or at least occupied the real estate occupied by a series of those men, had run a family business and then starred in an unreality show based on the fiction that he was a stately emperor of enterprise, rather than a buffoon barging along anyhow, and each was a hall of mirrors made to flatter his sense of self, the self that was his one edifice he kept raising higher and higher and never abandoned.

I have often run across men (and rarely, but not never, women) who have become so powerful in their lives that there is no one to tell them when they are cruel, wrong, foolish, absurd, repugnant. In the end there is no one else in their world, because when you are not willing to hear how others feel, what others need, when you do not care, you are not willing to acknowledge others’ existence. That’s how it’s lonely at the top. It is as if these petty tyrants live in a world without honest mirrors, without others, without gravity, and they are buffered from the consequences of their failures.

“They were careless people,” F. Scott Fitzgerald wrote of the rich couple at the heart of The Great Gatsby. “They smashed up things and creatures and then retreated back into their money or their vast carelessness or whatever it was that kept them together, and let other people clean up the mess they had made.” Some of us are surrounded by destructive people who tell us we’re worthless when we’re endlessly valuable, that we’re stupid when we’re smart, that we’re failing even when we succeed. But the opposite of people who drag you down isn’t people who build you up and butter you up.  It’s equals who are generous but keep you accountable, true mirrors who reflect back who you are and what you are doing.


We keep each other honest, we keep each other good with our feedback, our intolerance of meanness and falsehood, our demands that the people we are with listen, respect, respond—if we are allowed to, if we are free and valued ourselves. There is a democracy of social discourse, in which we are reminded that as we are beset with desires and fears and feelings, so are others; there was an old woman in Occupy Wall Street I always go back to who said, “We’re fighting for a society in which everyone is important.” That’s what a democracy of mind and heart, as well as economy and polity, would look like.

This year Hannah Arendt is alarmingly relevant, and her books are selling well, particularly The Origins of Totalitarianism. She’s been the subject of an extraordinary essay in the Los Angeles Review of Books and of a conversation between scholar Lyndsey Stonebridge and Krista Tippett on the radio show “On Being.” Stonebridge notes that Arendt advocated for the importance of an inner dialogue with oneself, for a critical splitting in which you interrogate yourself—for a real conversation between the fisherman and his wife you could say: “People who can do that can actually then move on to having conversations with other people and then judging with other people. And what she called ‘the banality of evil’ was the inability to hear another voice, the inability to have a dialogue either with oneself or the imagination to have a dialogue with the world, the moral world.”

Some use their power to silence that and live in the void of their own increasingly deteriorating, off-course sense of self and meaning. It’s like going mad on a desert island, only with sycophants and room service. It’s like having a compliant compass that agrees north is whatever you want it to be. The tyrant of a family, the tyrant of a little business or a huge enterprise, the tyrant of a nation. Power corrupts, and absolute power often corrupts the awareness of those who possess it. Or reduces it: narcissists, sociopaths, and egomaniacs are people for whom others don’t exist.

We gain awareness of ourselves and others from setbacks and difficulties; we get used to a world that is not always about us; and those who do not have to cope with that are brittle, weak, unable to endure contradiction, convinced of the necessity of always having one’s own way. The rich kids I met in college were flailing as though they wanted to find walls around them, leapt as though they wanted there to be gravity and to hit ground, even bottom, but parents and privilege kept throwing out safety nets and buffers, kept padding the walls and picking up the pieces, so that all their acts were meaningless, literally inconsequential. They floated like astronauts in outer space.

Equality keeps us honest. Our peers tell us who we are and how we are doing, providing that service in personal life that a free press does in a functioning society. Inequality creates liars and delusion. The powerless need to dissemble—that’s how slaves, servants, and women got the reputation of being liars—and the powerful grow stupid on the lies they require from their subordinates and on the lack of need to know about others who are nobody, who don’t count, who’ve been silenced or trained to please. This is why I always pair privilege with obliviousness; obliviousness is privilege’s form of deprivation. When you don’t hear others, you don’t imagine them, they become unreal, and you are left in the wasteland of a world with only yourself in it, and that surely makes you starving, though you know not for what, if you have ceased to imagine others exist in any true deep way that matters. This is about a need for which we hardly have language or at least not a familiar conversation.

A man who wished to become the most powerful man in the world, and by happenstance and intervention and a series of disasters was granted his wish. Surely he must have imagined that more power meant more flattery, a grander image, a greater hall of mirrors reflecting back his magnificence. But he misunderstood power and prominence. This man had bullied friends and acquaintances, wives and servants, and he bullied facts and truths, insistent that he was more than they were, than it is, that it too must yield to his will. It did not, but the people he bullied pretended that it did. Or perhaps it was that he was a salesman, throwing out one pitch after another, abandoning each one as soon as it left his mouth. A hungry ghost always wants the next thing, not the last thing.

This one imagined that the power would repose within him and make him great, a Midas touch that would turn all to gold. But the power of the presidency was what it had always been: a system of cooperative relationships, a power that rested on people’s willingness to carry out the orders the president gave, and a willingness that came from that president’s respect for rule of law, truth, and the people. A man who gives an order that is not followed has his powerlessness hung out like dirty laundry. One day earlier this year, one of this president’s minions announced that the president’s power would not be questioned. There are tyrants who might utter such a statement and strike fear into those beneath them, because they have instilled enough fear.

A true tyrant does not depend on cooperative power but has a true power of command, enforced by thugs, goons, Stasi, the SS, or death squads. A true tyrant has subordinated the system of government and made it loyal to himself rather than to the system of laws or the ideals of the country. This would-be tyrant didn’t understand that he was in a system where many in government, perhaps most beyond the members of his party in the legislative branch, were loyal to law and principle and not to him. His minion announced the president would not be questioned, and we laughed. He called in, like courtiers, the heads of the FBI, of the NSA, and the director of national intelligence to tell them to suppress evidence, to stop investigations, and found that their loyalty was not to him. He found out to his chagrin that we were still something of a democracy, and that the free press could not be so easily stopped, and the public itself refused to be cowed and has mocked him earnestly at every turn.

A true tyrant sits beyond the sea in Pushkin’s country. He corrupts elections in his country, eliminates his enemies with bullets, with poisons, with mysterious deaths made to look like accidents—he spreads fear and bullies the truth successfully, strategically. Yet he too overreached with his intrusions into the American election, and what he had hoped would remain invisible caused the whole world to scrutinize him and his actions and history and impact with concern and even fury. Russia may have ruined whatever standing and trust it had, may have exposed itself, with this intervention in the US and then in European elections.

The American buffoon’s commands were disobeyed, his secrets leaked at such a rate his office resembled the fountains at Versailles or maybe just a sieve (this spring there was an extraordinary piece in the Washington Post with thirty anonymous sources), his agenda was undermined even by a minority party that was not supposed to have much in the way of power, the judiciary kept suspending his executive orders, and scandals erupted like boils and sores. Instead of the dictator of the little demimondes of beauty pageants, casinos, luxury condominiums, fake universities offering fake educations with real debt, fake reality TV in which he was master of the fake fate of others, an arbiter of all worth and meaning, he became fortune’s fool.

He is, as of this writing, the most mocked man in the world. After the women’s march on January 21st, people joked that he had been rejected by more women in one day than any man in history; he was mocked in newspapers, on television, in cartoons, was the butt of a million jokes, and his every tweet was instantly met with an onslaught of attacks and insults by ordinary citizens gleeful to be able to speak sharp truth to bloated power.

He is the old fisherman’s wife who wished for everything and sooner or later he will end up with nothing. The wife sitting in front of her hovel was poorer after her series of wishes, because she now owned not only her poverty but her mistakes and her destructive pride, because she might have been otherwise, but brought power and glory crashing down upon her, because she had made her bed badly and was lying in it.

The man in the white house sits, naked and obscene, a pustule of ego, in the harsh light, a man whose grasp exceeded his understanding, because his understanding was dulled by indulgence. He must know somewhere below the surface he skates on that he has destroyed his image, and like Dorian Gray before him, will be devoured by his own corrosion in due time too. One way or another this will kill him, though he may drag down millions with him. One way or another, he knows he has stepped off a cliff, pronounced himself king of the air, and is in freefall. Another dungheap awaits his landing; the dung is all his; when he plunges into it he will be, at last, a self-made man.

Posted in Marty's Blog | Leave a comment

Donald Trump Poisons the World

David Brooks JUNE 2, 2017

President Trump at the White House on Thursday. Credit Doug Mills/The New York Times

This week, two of Donald Trump’s top advisers, H. R. McMaster and Gary Cohn, wrote the following passage in The Wall Street Journal: “The president embarked on his first foreign trip with a cleareyed outlook that the world is not a ‘global community’ but an arena where nations, nongovernmental actors and businesses engage and compete for advantage.”

That sentence is the epitome of the Trump project. It asserts that selfishness is the sole driver of human affairs. It grows out of a worldview that life is a competitive struggle for gain. It implies that cooperative communities are hypocritical covers for the selfish jockeying underneath.

The essay explains why the Trump people are suspicious of any cooperative global arrangement, like NATO and the various trade agreements. It helps explain why Trump pulled out of the Paris global-warming accord. This essay explains why Trump gravitates toward leaders like Vladimir Putin, the Saudi princes and various global strongmen: They share his core worldview that life is nakedly a selfish struggle for money and dominance.

It explains why people in the Trump White House are so savage to one another. Far from being a band of brothers, their world is a vicious arena where staffers compete for advantage.

 In the essay, McMaster and Cohn make explicit the great act of moral decoupling woven through this presidency. In this worldview, morality has nothing to do with anything. Altruism, trust, cooperation and virtue are unaffordable luxuries in the struggle of all against all. Everything is about self-interest.

We’ve seen this philosophy before, of course. Powerful, selfish people have always adopted this dirty-minded realism to justify their own selfishness. The problem is that this philosophy is based on an error about human beings and it leads to self-destructive behavior in all cases.

The error is that it misunderstands what drives human action. Of course people are driven by selfish motivations — for individual status, wealth and power. But they are also motivated by another set of drives — for solidarity, love and moral fulfillment — that are equally and sometimes more powerful.

People are wired to cooperate. Far from being a flimsy thing, the desire for cooperation is the primary human evolutionary advantage we have over the other animals.

People have a moral sense. They have a set of universal intuitions that help establish harmony between peoples. From their first moments, children are wired to feel each other’s pain. You don’t have to teach a child about what fairness is; they already know. There’s no society on earth where people are admired for running away in battle or for lying to their friends.

People have moral emotions. They feel rage at injustice, disgust toward greed, reverence for excellence, awe before the sacred and elevation in the face of goodness.

People yearn for righteousness. They want to feel meaning and purpose in their lives, that their lives are oriented toward the good.

People are attracted by goodness and repelled by selfishness. N.Y.U. social psychologist Jonathan Haidt has studied the surges of elevation we feel when we see somebody performing a selfless action. Haidt describes the time a guy spontaneously leapt out of a car to help an old lady shovel snow from her driveway.

One of his friends, who witnessed this small act, later wrote: “I felt like jumping out of the car and hugging this guy. I felt like singing and running, or skipping and laughing. Just being active. I felt like saying nice things about people. Writing a beautiful poem or love song. Playing in the snow like a child. Telling everybody about his deed.”

Good leaders like Lincoln, Churchill, Roosevelt and Reagan understand the selfish elements that drive human behavior, but they have another foot in the realm of the moral motivations. They seek to inspire faithfulness by showing good character. They try to motivate action by pointing toward great ideals.

Realist leaders like Trump, McMaster and Cohn seek to dismiss this whole moral realm. By behaving with naked selfishness toward others, they poison the common realm and they force others to behave with naked selfishness toward them.

By treating the world simply as an arena for competitive advantage, Trump, McMaster and Cohn sever relationships, destroy reciprocity, erode trust and eviscerate the sense of sympathy, friendship and loyalty that all nations need when times get tough.

By looking at nothing but immediate material interest, Trump, McMaster and Cohn turn America into a nation that affronts everybody else’s moral emotions. They make our country seem disgusting in the eyes of the world.

George Marshall was no idealistic patsy. He understood that America extends its power when it offers a cooperative hand and volunteers for common service toward a great ideal. Realists reverse that formula. They assume strife and so arouse a volley of strife against themselves.

I wish H. R. McMaster were a better student of Thucydides. He’d know that the Athenians adopted the same amoral tone he embraces: “The strong do what they can and the weak suffer what they must.” The Athenians ended up making endless enemies and destroying their own empire.

Posted in Marty's Blog | Leave a comment