Existing Law May Not Solve Our Presidential Crisis

Oct 17, 2017

Bill Blum-Truthdig


What’s that you say? Getting rid of Trump won’t be as easy as 1, 2, 3. (Alex Brandon / AP)

For those of you looking for a legal avenue to cut short the tenure of the 45th president of the United States—our very own real-life madman in the high tower—I have some good news, and some bad.

Starting with the upside, your ranks are growing. According to a Public Policy Polling survey conducted in late September, 48 percent of American voters want Trump impeached. A Harvard-Harris Poll conducted a month earlier pegged support for impeachment at 43 percent.

You can also take comfort in the fact that you’re right to regard Trump as a unique threat to democratic values and institutions, not to mention world peace. From his almost-daily diatribes against the “fake news” media to his Twitter taunting of Kim Jong Un, he’s proved as much over the past nine months.

Although I’m about to drop some bad news as well, let me add first that I’m with you. I have been a “never Trumper” ever since the trash-talking real estate mogul descended an escalator at his midtown Manhattan headquarters in June 2015, with a vacuous-looking Melania by his side, to announce his bid for the presidency, pledge to “make America great again” and denounce undocumented Mexican migrants as criminals, rapists and purveyors of drugs.

Throughout the long and bizarre campaign that followed, I warned in multiple Truthdig pieces of the grave dangers a Trump presidency would pose in such areas of law and policy as freedom of the press, birthright citizenship, immigration enforcement and travel bans, climate change, abortion rights and future appointments to the Supreme Court.

I also was among the first to sound the alarm about Trump’s emotional stability, in a column titled “The Psychopathology of Donald Trump,” published in July 2016. The subject is now the focus of a best-selling book, “The Dangerous Case of Donald Trump,” featuring essays written by 27 distinguished mental health experts.

Since the election, rather than mourn the defeat of Hillary Clinton, I’ve looked ahead, not back. Eyes fixed on Trump, I’ve explored the actual prospects for impeaching the president, as well as indicting him for obstruction of justice, stemming from the firing of former FBI Director James Comey.

But now for the downside: There is no quick legal fix for removing Trump. As long as the GOP controls Congress, impeachment remains a long shot, as it requires a majority vote in the House in favor of articles of impeachment and a two-thirds vote in the Senate to obtain a conviction and removal from office. Two House Democrats—Al Green of Texas and Brad Sherman of California—have introduced impeachment resolutions, but at present they’re going nowhere.

And while Robert Mueller’s investigation of Russian meddling in the election and the dismissal of Comey may result in the prosecution of such former Trump associates as Paul Manafort, Michael Flynn and even Donnie Jr., it’s premature to think that the president will find himself in the crosshairs of a grand jury any time soon. Although the issue remains unsettled, the weight of scholarly opinion is that a sitting president cannot be indicted.

The same, unfortunately, holds true for invoking the 25th Amendment—the latest deus ex machina championed by leading Democrats as a means for sacking Trump. If anything, the amendment is a more implausible vehicle than impeachment.

Ratified in 1967, the 25th Amendment was crafted in the aftermath of the assassination of John F. Kennedy to clear up ambiguities and fill gaps in the Constitution’s original provisions on presidential succession.

The Constitution, as it emerged from the founding convention of 1787, addressed the issue of succession in Article II, Section 1, which stipulates:

In case of the removal of the President from office, or of his death, resignation, or inability to discharge the powers and duties of the said office, the same shall devolve on the Vice President, and the Congress may by law provide for the case of removal, death, resignation, or inability, both of the President and Vice President, declaring what officer shall then act as President, and such officer shall act accordingly, until the Disability be removed or a President shall be elected.

The rule of vice-presidential succession was restated by the 12th Amendment, which dealt primarily with the Electoral College and was ratified in 1804. The 20th Amendment, ratified in 1933, offered more clarification, stating that if the president-elect dies before being sworn into office, the vice president-elect would be sworn in instead.

However, not until 1947, with the passage of the Presidential Succession Act, did the current line of succession take shape, extending from the vice president through the speaker of the House, the president pro tempore of the Senate, the secretary of state, and then to other Cabinet officials.

Still, questions about succession remained—among them, how to define a president’s inability to serve, particularly when the inability is mental or emotional in nature. Who gets to make the determination that such an inability exists? And can the president resist efforts to have himself declared unable to serve?

This is where Section 4 of the 25th Amendment comes into play in the debate over Trump’s mental fitness to hold the most powerful office in the land. The first paragraph of Section 4 advises:

Whenever the Vice President and a majority of either the principal officers of the executive departments [the Cabinet] or of such other body as Congress may by law provide, transmit to the President pro tempore of the Senate and the Speaker of the House of Representatives their written declaration that the President is unable to discharge the powers and duties of his office, the Vice President shall immediately assume the powers and duties of the office as Acting President.

The second and final paragraph of Section 4 instructs, in so many words, that the president can attempt to override a declaration of inability by notifying the Senate and House leadership that no such inability exists. Thereafter, the vice president, with the support of a majority of the Cabinet, or “the other body” referred to in the first paragraph, can contest the president’s override. To resolve the conflict and place the vice president in charge, a two-thirds vote of both houses of Congress is required to confirm that the president, in fact, is “unable to discharge the powers and duties of his office.”

The procedures outlined in Section 4 have never been invoked, and it is unlikely that they will be used against Trump. The amendment simply contains too many moving parts and depends on too many external contingencies to make it a viable option.

First and foremost, only the most cockeyed optimists could believe that Vice President Mike Pence would lead what would amount to a de facto palace coup against Trump by initiating the procedures outlined in Section 4. Nor, as an aside, would the nation be better off by having Pence—a religious fanatic—take charge of the federal government.

Second, it is doubtful that Congress, acting without Pence, would enact legislation creating another body that would make a finding of presidential incapacity. To be sure, two bills are pending in the House to do just that. Maryland Democrat Jamie Raskin introduced one to establish an oversight commission on presidential capacity, staffed largely by physicians and psychiatrists. Earl Blumenauer, D-Ore., authored the other bill, which would create an oversight body composed of all former living presidents and vice presidents.

Neither measure, however, has shown any sign of progressing to a committee hearing. And even assuming, improbably, that either bill could win approval in both the House and the Senate, it would have to pass by a two-thirds supermajority in each chamber to withstand an inevitable presidential veto.

This does not mean, however, that it is pointless to agitate for impeachment or call for the removal of Trump via the 25th Amendment, or that it is a waste of time to discuss the possibility that Mueller and his colleagues will conclude they can prove an obstruction case against the president. Rather, it means that progressives and never-Trumpers should view the avenues for creating an early Trump exit not just as ends in themselves, but as organizing tools that can draw increasing numbers of Americans into a wider political effort to build a new progressive movement aimed at sweeping the GOP (and eventually, center-right Democrats) from power.

In the final analysis, and most importantly, dumping Trump and ensuring that no one like him ever accedes to the presidency again will require the promotion of an alternative to oligarchic corporate capitalism. As I have written in this column before:

Every major movement of social and political transformation, in addition to championing specific short-term reforms, has been animated by higher principles promising both solidarity and liberation. The American Revolution was moved by the demand for “life, liberty and the pursuit of happiness.” The French version was driven by the ideals of “liberté, égalité, fraternité.” The civil rights movement was propelled by Martin Luther King Jr.’s “dream” of racial harmony and justice. Even Obama’s 2008 presidential run was keyed by a single word of inspiration: “Hope.”

What, then, in this critical hour, is our shared vision of the future? I don’t pretend to have the answers, except to say that in the broadest terms it will be communitarian, diverse, inclusive, respectful of democratic institutions and the environment, and welcoming toward individual freedoms. It will not, if it is to succeed, call for a restoration of the hierarchical neoliberalism of the recent past.

The outcome is uncertain, which only makes the undertaking all the more necessary.


The Art of Thinking Well

Oct 10, 2017

David Brooks-NY Times

Richard Thaler has just won an extremely well deserved Nobel Prize in economics. Thaler took an obvious point, that people don’t always behave rationally, and showed the ways we are systematically irrational.

Thanks to his work and others’, we know a lot more about the biases and anomalies that distort our perception and thinking, like the endowment effect (once you own something you value it more than before you owned it), mental accounting (you think about a dollar in your pocket differently than you think about a dollar in the bank) and all the rest.

Before Thaler, economists figured it was good enough to proceed as if people are rational, utility-maximizing creatures. Now, thanks to the behavioral economics revolution he started, most understand that’s not good enough.

But Thaler et al. were only scratching the surface of our irrationality. Most behavioral economists study individual thinking. They do much of their research in labs where subjects don’t intimately know the people around them.

It’s when we get to the social world that things really get gnarly. A lot of our thinking is for bonding, not truth-seeking, so most of us are quite willing to think or say anything that will help us be liked by our group. We’re quite willing to disparage anyone when, as Marilynne Robinson once put it, “the reward is the pleasure of sharing an attitude one knows is socially approved.” And when we don’t really know a subject well enough, in T. S. Eliot’s words, “we tend always to substitute emotions for thoughts,” and go with whatever idea makes us feel popular.

Richard Thaler, left, won the Nobel Prize in economics on Monday in part because he realized people act irrationally. (Scott Olson / Getty Images)

This is where Alan Jacobs’s absolutely splendid forthcoming book “How to Think” comes in. If Thaler’s work is essential for understanding how the market can go astray, Jacobs’s emphasis on the relational nature of thinking is essential for understanding why there is so much bad thinking in political life right now.

Jacobs makes good use of C. S. Lewis’s concept of the Inner Ring. In every setting — a school, a company or a society — there is an official hierarchy. But there may also be a separate prestige hierarchy, where the cool kids are. They are the Inner Ring.

There are always going to be people who desperately want to get into the Inner Ring and will cut all sorts of intellectual corners to be accepted. As Lewis put it, “The passion for the Inner Ring is most skillful in making a man who is not yet a very bad man do very bad things.”

People will, for example, identify and attack what Jacobs calls the Repugnant Cultural Other — the group that is opposed to the Inner Ring, which must be assaulted to establish membership in it.

Other people will resent the Inner Ring, and they will cut all sorts of intellectual corners in order to show their resentment. These people are quick to use combat metaphors when they talk about thinking (he shot down my argument, your claims are indefensible). These people will adopt shared vague slurs like “cuckservative” or “whitesplaining” that signal to the others in the outsider groups that they are attacking the ring, even though these slurs are usually impediments to thought.

Jacobs notices that when somebody uses “in other words” to summarize another’s argument, what follows is almost invariably a ridiculous caricature of that argument, in order to win favor with the team. David Foster Wallace once called such people Snoots. Their motto is, “We Are the Few, the Proud, the More or Less Constantly Appalled at Everyone Else.”

Jacobs nicely shows how our thinking processes emerge from emotional life and moral character. If your heart and soul are twisted, your response to the world will be, too. He argues that by diagnosing our own ills, we can begin to combat them. And certainly I can think of individual beacons of intellectual honesty today: George Packer, Tyler Cowen, Scott Alexander and Caitlin Flanagan, among many others.

But I’d say that if social life can get us into trouble, social life can get us out. After all, think of how you really persuade people. Do you do it by writing thoughtful essays that carefully marshal facts? That works some of the time. But the real way to persuade people is to create an attractive community that people want to join. If you do that, they’ll bend their opinions to yours. If you want people to be reasonable, create groups where it’s cool to be reasonable.

Jacobs mentions that at the Yale Political Union members are admired if they can point to a time when a debate totally changed their mind on something. That means they take evidence seriously; that means they can enter into another’s mind-set. It means they treat debate as a learning exercise and not just as a means to victory.

How many public institutions celebrate these virtues? The U.S. Senate? Most TV talk shows? Even the universities?

Back when they wrote the book of Proverbs it was said, “By long forbearing is a prince persuaded, and a soft tongue breaketh the bone.” These days, a soft tongue doesn’t get you very far, but someday it might again.


Faces of Pain, Faces of Hope

Oct 9, 2017
Chris Hedges
Mr. Fish

ANDERSON, Ind.—It was close to midnight, and I was sitting at a small campfire with Sybilla and Josh Medlin in back of an old warehouse in an impoverished section of the city. The Medlins paid $20,000 for the warehouse. It came with three lots. They use the lots for gardens. The produce they grow is shared with neighbors and the local homeless shelter. There are three people living in the warehouse, which the Medlins converted into living quarters. That number has been as high as 10.

“It was a house of hospitality,” said Josh, 33, who like his wife came out of the Catholic Worker Movement. “We were welcoming people who needed a place to stay, to help them get back on their feet. Or perhaps longer. That kind of didn’t work out as well as we had hoped. We weren’t really prepared to deal with some of the needs that people had. And perhaps not the skills. We were taken advantage of. We weren’t really helping them. We didn’t have the resources to help them.”

“For the Catholic Workers, the ratio of community members to people they’re helping is a lot different than what we had here,” Sybilla, 27, said. “We were in for a shock. At the time there were just three community members. Sometimes we had four or five homeless guests here. It got kind of chaotic. Mostly mental illness. A lot of addiction, of course. We don’t know how to deal with hard drugs in our home. It got pretty crazy.”

Two or three nights a month people gather, often around a fire, in back of the warehouse, known as Burdock House.

“The burdock is seen as a worthless, noxious weed,” Josh said. “But it has a lot of edible and medicinal value. A lot of the people we come into contact with are also not valued by our society. The burdock plant colonizes places that are abandoned. We are doing the same thing with our house.”

Those who come for events bring food for a potluck dinner or chip in five dollars each. Bands play, poets read and there is an open mic. Here they affirm what we all must affirm—those talents, passions, feelings, thoughts and creativity that make us complete human beings. Here people are celebrated not for their jobs or status but for their contributions to others. And in associations like this one, unseen and unheralded, lies hope.

“We are an intentional community,” said Josh. “This means we are a group of people who have chosen to live together to repurpose an old building, to offer to a neighborhood and a city a place to express its creative gifts. This is an alternative model to a culture that focuses on accumulating as much money as possible and on an economic structure based on competition and taking advantage of others. We value manual labor. We value nonviolence as a tactic for resistance. We value simplicity. We believe people are not commodities. We share what we have. We are not about accumulating for ourselves. These values help us to become whole people.”

The message of the consumer society, pumped out over flat screen televisions, computers and smartphones, to those trapped at the bottom of society is loud and unrelenting: You are a failure. Popular culture celebrates those who wallow in power, wealth and self-obsession and perpetuates the lie that if you work hard and are clever you too can become a “success,” perhaps landing on “American Idol” or “Shark Tank.” You too can invent Facebook. You too can become a sports or Hollywood icon. You too can rise to be a titan. The vast disparity between the glittering world that people watch and the bleak world they inhabit creates a collective schizophrenia that manifests itself in our diseases of despair—suicides, addictions, mass shootings, hate crimes and depression. Our oppressors have skillfully acculturated us to blame ourselves for our oppression.

Hope means walking away from the illusion that you will be the next Bill Gates, Mark Zuckerberg, Kim Kardashian. It means rejecting the lust for public adulation and popular validation. It means turning away from the maniacal creation of a persona, an activity that defines presence on social media. It means searching for something else—a life of meaning, purpose and, ultimately, dignity.

The bottomless narcissism and hunger of consumer culture cause our darkest and most depraved pathologies. It is not by building pathetic, tiny monuments to ourselves that we become autonomous and free human beings; it is through acts of self-sacrifice, by recovering a sense of humility, by affirming the sanctity of others and thereby the sanctity of ourselves. Those who fight against the sicknesses, whether squatting in old warehouses, camped out at Zuccotti Park or Standing Rock or locked in prisons, have discovered that life is measured by infinitesimal and often unseen acts of solidarity and kindness. These acts of kindness, like the nearly invisible strands of a spider’s web, slowly spin outward to connect our atomized and alienated souls to the souls of others. The good, as Daniel Berrigan told me, draws to it the good. This belief—held although we may never see empirical proof—is profoundly transformative. But know this: When these acts are carried out on behalf of the oppressed and the demonized, when compassion defines the core of our lives, when we understand that justice is a manifestation of this solidarity, even love, we are marginalized and condemned by the authoritarian or totalitarian state.

Those who resist effectively will not negate the coming economic decline, the mounting political dysfunction, the collapse of empire, the ecological disasters from climate change, and the many other bitter struggles that lie ahead. Rather, they draw from their acts of kindness the strength and courage to endure. And it will be from their relationships—ones formed the way all genuine relationships form, face to face rather than electronically—that radical organizations will be created to resist.

Sybilla, whose father was an electrician and who is the oldest of six, did not go to college. Josh was temporarily suspended from Earlham College in Richmond, Ind., for throwing a pie at William Kristol as the right-wing commentator was speaking on campus in 2005. Josh never went back to college. Earlham, he said, like most colleges, is a place “where intellectualism takes precedence over truth.”

“When I was in high school I was really into the punk rock community,” Sybilla said. “Through that I discovered anarchism.”

“Emma Goldman?” I asked.

“Yeah, mostly that brand of anarchism,” she said. “Not like I’m going to break car windows for fun.”

She was attracted to the communal aspect of anarchism. It fit with the values of her parents, who she said “are very anti-authoritarian” and “who always taught me to think for myself.” She read a book by an anonymous author who lived outside the capitalist system for a couple of years. “That really set me on that direction even though he is a lot more extreme,” she said, “only eating things from the garbage. Train hopping. As a teenager, I thought, ‘Wow! The adventure. All the possible ways you could live an alternative lifestyle that’s not harmful to others and isn’t boring.’ ”

When she was 18 she left Anderson and moved to Los Angeles to join the Catholic Worker Movement.

“I [too] became pretty immersed in the anarchist scene,” Josh said. “I’m also a Christian. The Catholic Worker Movement is the most well known example of how to put those ideas in practice. Also, I really didn’t want anything to do with money.”

“A lot of my friends in high school, despite being a part of the punk rock community, went into the military,” Sybilla said. “Or they’re still doing exactly what they were doing in high school.”

The couple live in the most depressed neighborhood of Anderson, one where squatters inhabit abandoned buildings, drug use is common, the crime rate is high, houses are neglected and weeds choke abandoned lots and yards. The police often fail to appear when someone from this part of the city dials 911. When they do appear, they are usually hostile.

“If you’re walking down the street and you see a cop car, it doesn’t make you feel safe,” Josh said.

“A lot of people view them [police] as serving the rich,” Sybilla said. “They’re not serving us.”

“Poor people are a tool for the government to make money with small drug charges,” she added. “A lot of our peers are in jail or have been in jail for drugs. People are depressed. Lack of opportunity. Frustration with a job that’s boring. Also, no matter how hard you want to work, you just barely scrape by. One of our neighbors who is over here quite a bit, he had a 70-hour-a-week job. Constant overtime. And he still lives in this neighborhood in a really small one-bedroom apartment. I think Anderson has really bad self-esteem. A lot of young people, a lot of people my age I know are working for $9, $10 an hour. Moving from job to job every six months. Basically, enough money to buy alcohol, cigarettes and pay the rent.”

“My mom’s generation grew up thinking they were going to have a solid job,” she said. “That they were just going to be able to start a job and have a good livelihood. And that’s not the case. Just because you want to work really hard it doesn’t necessarily mean you’re going to make it.”

“I work as a cashier at the local Christian college,” she said. “It’s a small school with 2,000 students. I work in the cafeteria. The contract changed. The school stopped doing its own food service many years ago. Has been hiring private companies. After I worked there for a year the contract was up. It was a new company and they’re huge. … I think it’s the biggest food service company. They do most hospitals, schools, prisons. And the job conditions changed so dramatically. Our orientation with this new company, they had this HR guy come. He’s like, ‘You’re going to work for the biggest company in the world. You should be so excited to be a part of our team. We’re going to make you great. Anderson used to be this really powerful city full of industry. The employees demanded so much from the companies. And [the companies] all left.’ ”

“We’re just looking at him,” she said. “Why is this relevant? Basically the message was, ‘You guys have no other choice. So you don’t choose to work with us. You have to. And we’re going to do what we want you to do.’ At the time I was taking $7.50 an hour. They hired me at $7.50 from the old company. They hired the people beside me for $8, which I was not happy with. The old employees were making more money because they got consistent raises throughout the years. They would have them do jobs like carrying crates of heavy food up the stairs. Or they moved them to the dish room. Jobs that they knew they physically couldn’t do, in hopes that they would quit. I think. They didn’t want to pay that higher wage. And the students weren’t happy either. So many employees were really upset. Everyone was talking about quitting. We lost about half the workforce. There were 100 employees when they came in. They had reduced down to 50. That makes my job twice as hard. But I still make $7.50. With no hope for a raise anytime soon.”

“I went up to them,” she continued. “I said, ‘I need to make as much as these people at least. I’ve been here for a year. I’m a more valuable employee.’ And they were like, ‘If you don’t like it, quit. Maybe you can get a job at Burger King.’ I was so angry. How dare they tell me to quit. I started talking to some of my co-workers to see if they were interested in making the job better rather than quitting. And a lot of them were. Especially the people who’d been there for years and years and who were familiar with GM and UAW [the United Automobile Workers union]. And weren’t scared of it. So we started having meetings. I think the campaign took two years. And we successfully organized. It’s been a huge improvement. Even though it’s still a low-paying job, everything is set. They can’t change their mind about when we get raises. They can’t change their mind about what the hiring rate is. They can’t take these elderly people and make them start carrying boxes rather than run a cash register. They were also firing people for no reason. That doesn’t happen anymore. … The employees have a voice now. If we don’t like something, when our contract is up for renegotiation we can change it.”

“The jobs we have are boring,” she said. “My job was so boring. Having this as an outlet, also with the challenge of creating the union there, I was able to not feel so useless.”

Sybilla also publishes The Heartland Underground. The zine, which sells for $2 a copy and comes out every four or five months, reviews local bands like the punk group Hell’s Orphans, publishes poets and writers and has articles on subjects such as dumpster diving.

In a review of Hell’s Orphans, which has written songs such as “Too Drunk to Fuck” and “Underage Donk,” the reviewer and the band sit in a basement drinking until one of the band members, Max, says, “Feel free to take anything we say out of context. Like, you can even just piece together individual words or phrases.” (Donk, as the article explains, is “a slang term for a very round, attractive, ghetto booty and is a derivative of the term Badonkadonk.”) The review reads:

Hell’s Orphans has really played some unusual shows like a high school open house, a show in a garage where the audience was only four adults, four kids, a dog and a chicken, and out of the Game Exchange a buy/sell/trade video game store. They’ve also played under some uncomfortable circumstances like a flooded basement in which Nigel was getting shocked by the mic and guitar every few seconds and at the Hollywood Birdhouse one night when Max and Nigel were both so paranoid on some crazy pot that they were also too frozen to perform and couldn’t look at the audience. For such a young band that has done zero touring they’ve had a lot of adventures and experiences.

A poet who went by the name Timotheous Endeavor wrote in a poem titled “The Monkey Song”:

please just let me assume
that there is room for us and our lives
somewhere between your lies
and the red tape that confines

please just let me assume
we’re all monkeys
we’re trained to self alienate
but it’s not our fates

i was walking down the road
i wonder if there’s anywhere around here
that i am truly welcome
spend my dollar move along
past all of the closed doors

In one edition of The Heartland Underground there was this untitled entry:

They pay me just to stay out of thier [sic] world. They don’t want me at work I would just get in their way. They pay me just to sit at home. I feel things harder and see with such different eyes it’s easier for everyone if I just stay at home, if I just stay out of their world and wait to die. I am not inept. I just don’t fit into their neatly paved grids, their machines and systems.

There is no place for a schizophrenic in this world and there is no place for anything wild, crooked, or untamed anymore. When did things go so wrong? Everything is wrong!! They paved paradise and put up a parking lot. They paved the entire fucking planet. I’m on a mission to liberate myself from all the lies that poison me and rot inside my mind, holding me captive and causing me to hate myself and the world. I’m ready to stop hating! I’m ready to become fully human and join life.

The truth is: We’re all drowning.

They think I’m crazy? At least I can see that I’m drowning. No one else is in a panic because they can’t see or feel how wrong everything is. I don’t want to drown. I want to swim and climb up to a high place. I want to rise above.

Arbitrary Aardvark wrote an article called “I was a Guinea Pig for Big Pharma,” about earning money by taking part in medical experiments. He would stay in a lab for about two weeks and take “medicine, usually a pill, and they take your blood a lot. You might be pissing in a jug or getting hooked up to an EKG machine or whatever the study design calls for, and they take your blood pressure and temperature pretty often.” He made between $2,000 and $5,000 depending on how long the study lasted. Most of his fellow “lab rats” were “just out of jail or rehab.” In one study he had a bullet-tipped plastic tube inserted down his nose into his intestines. “It was the most painful thing I’ve been through in my adult life.” He said he and the other subjects did not like reporting side effects because they were “worried that they’ll be sent home without a full paycheck, or banned from future studies.” He admitted this probably affected the viability of the studies. He became ill during one of the experiments. The pharmaceutical company refused to pay him, blaming his illness on a pre-existing condition. He wrote:

I signed up for one that was going to pay $5,000, but a week into it my liver enzymes were all wrong, and they took me out of the study but kept me at the site, because I was very sick. It turned out I’d come down with mono just before going into the study. And then I got shingles, not fun. …

I’d spent 3 years trying to be a lawyer and failed. I’d worked in a warehouse, as the Dalai lama’s nephew’s headwaiter, as a courier and a temp. Lost money day trading and then in real estate. I was ready to try medical experiments again. I tried 3 times to get in at Eli Lilly but never did. Lilly no longer does its own clinical trials after a girl … killed herself during an anti-depressant study. …

Jared Lynch wrote an essay titled “Sometimes the Voices in Your Head are Angry Ghosts” that included these lines:

Death shrouded the whole spacetime of the latter half of high school, coating it in an extra vicious layer of depression. The first night we stayed in the house I sat in the living room, writing about ghosts in a composition book… I had a package of single edge blades in the back of my top desk drawer and sometimes I flirted too closely with that edge of darkness. I thought a lot about the blades at school. My daydreams were consumed by untold suicides, and countless times I came home to find one of my past selves in the tub with his forearm opened wide and grinning with his life essence surrounding him in the tub on the wrong side of his skin.

It was a strange, beautiful time. Melancholia wrapped around the edges with the golden glow of nostalgia for a time that felt like I had died before it existed… I fell into an expected, but incredibly deep pool of depression and I found the single edge razors that one of my future selves had so graciously left behind in my top drawer. I bled myself because I wanted to be with the lovely, lonely ghosts. I bled myself more than I ever had, but I didn’t bleed enough to capture myself in the barbs of the whirlpool of my depression.

He ended the essay with “I still bear my scars.”

Tyler Ambrose wrote a passage called “Factory Blues.”

What is a factory? What is a factory? A factory is a building of varied size. Some immense tributes to humanistic insatiability, others homely, almost comfortable. Mom and Pop type places, each run-down corner lot a puzzle piece in a greater maze. Gears if you will, all part of the capitalism machine. Some so small they fall out like dandruff, plummeting into the furnaces that keep the monster thriving. Constantly shaking loose another drop of fuel from its decaying hide. For the more fuel it consumes, the drier and deader does its skin become. Until one day, when all the skin has fallen into the fires, and all that remains are rustic factory bones, the beast will fall, and all the peoples of the earth will feel its tumble. And all will fall beside it, when its decaying stench kills the sun.

The cri de coeur of this lost generation, orphans of global capitalism, rises up from deindustrialized cities across the nation. These Americans struggle, cast aside by a society that refuses to honor their intelligence, creativity and passion, that cares nothing for their hopes and dreams, that sees them as cogs, menial wage slaves who will do the drudgery that plagues the working poor in postindustrial America.

Parker Pickett, 24, who works at Lowe’s, is a poet and a musician. He frequently reads his work at Burdock House. He read me a few of his poems. One was called “This is a poem with no Words.” These were the concluding lines:

out of, the affection I receive from concrete whether broken or spray
painted, the old men want that money now want that control even
though they are sad and delusional, if I could I would die from the beauty
of her eyes as they shudder and gasp and relax with natural imperfections which
I hold in high regards, the glow of the city around me reaches to the night
sky, a slate of black chalkboard I wipe off the stars with my thumb one
by one, songs end stories end lives end, but the idea of some grand, silly
truth to everyone and everything will never die, we are born in love with precious life, and with that truth I will giggle and smile until I’m laid to rest in my
sweet, sweet grave.

I sat on a picnic table next to Justin Benjamin. He cradled his guitar, one of his tuning pegs held in place by locking pliers. The fire was dying down. Justin, 22, calls himself WD Benjamin, the “WD” being short for “well dressed.” He wore a white shirt, a loosely knotted tie and a suit coat. He had long, frizzy hair parted in the middle that fell into his face. His father was a steelworker. His mother ran a day care center and later was an insurance agent.

“Kids would talk about wanting something better or leaving,” he said. “Yet they weren’t doing steps to take it. You saw they were going to spend their whole lives here begrudgingly. They would talk stuff. They would never do anything about it. It was all just talk.”

He paused.

“Substance [abuse] ruined a lot of lives around here,” he said.

He estimates that by age 14 most kids in Anderson realize they are trapped.

“We had seen our parents or other people or other families not go anywhere,” he said. “This business went under. Pizzerias, paint stores, they all go under. About that time in my life, as much as I was enthralled with seeing cars rushing past and all these tall buildings, we all saw, well, what was the point if none of us are happy or our parents are always worrying about something. Just not seeing any kind of progression. There had to be something more.”

“I’ve had friends die,” he said. “I had a friend named Josh. We’d say, ‘He Whitney Houston-ed before Whitney Houston.’ He pilled out and died in a bathtub. It happened a month before Whitney Houston died. So that was a weird thing for me. Everyone is going to remember Whitney Houston but no one will remember Josh. At the time he was 16.”

“I see friends who are taking very minimal jobs and never thinking anywhere beyond that,” he said. “I know they’re going to be there forever. I don’t despise them or hold anything against them. I understand. You have to make your cut to dig out some kind of a living. … I’ve done manual labor. I’ve done medical, partial. Food service. I’ve done sales. Currently I’m working on a small album. Other than that, I play for money. I sell a lot of odds and ends. I’ve been doing that for years. Apparently I have a knack for collecting things and they’re of use for somebody. Just paying my way with food and entertainment for somebody. I live right across from the library. Eleventh Street. I can’t remember the address. I’m staying with some people. I try to bring them something nice, or make dinner, or play songs. I do make enough to pay my share of utilities. I wouldn’t feel right otherwise.”

He is saved, he said, by the blues—Son House, Robert Johnson, all the old greats.

“My finger got caught in a Coke bottle trying to emulate his style of slide guitar,” he said of House. “I asked my dad to help me please get it out. There was just something about people being downtrodden their whole lives. I used to not understand the plight of the black community. I used to think why can’t they just work harder. I was raised by a father who was very adamant about capitalism. Then one day my sister-in-law told me, ‘Well, Justin, you just don’t understand generational poverty. Please understand.’ People were told they were free yet they have all these problems, all these worries. … It’s the natural voice. You listen to Lead Belly’s ‘Bourgeois Blues,’ it’s a way of expressing their culture. And their culture is sad. ‘Death Don’t Have No Mercy’ talks about the great equalizer of death. It didn’t matter if you’re black or white, death will come for you.”

He bent over his guitar and played Robert Johnson’s “Me and the Devil Blues.”

Early this morning
When you knocked upon my door
Early this morning, oooo
When you knocked upon my door
And I said hello Satan
I believe it’s time to go

“I’ve seen a lot of GM people, they just live in this despair,” he said of the thousands of people in the city who lost their jobs when the General Motors plants closed and moved to Mexico. “They’re still afraid. I don’t know what they’re afraid of. It’s just the generation they came out of. I worked with plenty of GM people who were older and having to work for their dollars begrudgingly. They’re like, ‘I was made promises.’ ”

“I was born 3 pounds,” he said. “I was not destined for this world. Somehow I came out. I did the best I could. That’s all I’ve done. I’ll never say I’m good at anything. At least I have the ability to think, speak and act. Three pounds to this now. I just can’t see the use of not fighting. You always have to think about what’s going to lay down in the future. What’s going to happen when the factories close down? Are you going to support your fellow co-workers? Are you going to say, ‘No, things will come back?’ Are you going to cast everything to damnation? Cast your neighbors down, say it was their fault the jobs are gone.”

“I’ve never seen the heights of it,” he said of capitalism. “But I’ve seen the bottom. I’ve seen kids down here naked running around. I’ve seen parents turn on each other and kids have to suffer for that. Or neighbors. I’d just hear yelling all night. It’s matters of money. It’s always the kids that suffer. I always try to think from their perspective. When it comes down to kids, they feel defeated. When you grow up in a household where there’s nothing but violence and squabbling and grabbing at straws, then you’re going to grow up to be like that. You’re going to keep doing those minimum jobs. You’re fighting yourself. You’re fighting a system you can’t beat.”

“I’ve seen poets, phenomenal guitarists, vocalists, percussionists, people who have tricks of the trade, jugglers, yo-yo players, jokesters,” he went on. “I admire those people. They might go on to get a different job. They might find a sweetheart. They might settle down. They have that thing that got them to some point where they could feel comfortable. They didn’t have to work the job that told them, ‘So what if you leave? You don’t matter.’ I know a fellow who works at the downtown courthouse. Smart as can be. One of my favorite people. We talk about Nietzsche and Kafka in a basement for hours. The guy never really let the world get him down. Even though he’s grown up in some rough situations.”

And then he talked about his newborn niece.

“I wrote this in about 10 minutes,” he said. “I race down the street because no one else was available. I went to a friend. I said, ‘I wrote a song! I think it’s neat. I don’t think it’s good. But I like the idea.’ I’d never done that.”

He hunched back over his guitar and began to play his “Newborn Ballad.”

You were brushed and crafted carefully

They knew young love and now they know you
How two lives figure into one beats me
But either I’m sure they’ll agree with you

Your eyes will open proud I pray
May the breakneck sides around you come down
Little darling I’ll be your laughing stock
So the mean old world won’t get you down

I ain’t gonna say I ain’t crazy
All are fronting and pestering your soul
When we first meet I can promise to
To listen, to play with, to talk to, to love

There’s nothing no better they’ll tell you
Than your youth, no weight will end
No matter the preference child hear me
Not a moment you’ll have will be absent

My pardon, my dearest apologies
For the scenes and the faces I make
For now you might find them quite funny
But they’ll get old as will I, I’m afraid

Your comforts they don’t come easy
With an hour twenty down the road
We made lives in telling you sweetly
But you can make it, we love you, you know.


Folks, We’re Home Alone

Sept 27, 2017

Thomas L. Friedman-NY Times

Former Secretary of State Dean Acheson wrote a famous memoir, “Present at the Creation,” about the birth of the post-World War II order — an order whose institutions produced six decades of security and growth for a lot of people. We’re now at a similar moment of rapid change — abroad and at home. Many institutions have to be rethought. But any book about Washington today would have to be called “Absent at the Creation.”

Surely one of the most cynical, reckless acts of governing in my lifetime has been President Trump and the G.O.P.’s attempt to ram through a transformation of America’s health care system — without holding hearings with experts, conducting an independent cost-benefit analysis or preparing the public — all to erase Barack Obama’s legacy to satisfy a few billionaire ideologue donors and a “base” so drunk on Fox News that its members don’t understand they’ll be the ones most hurt by it all.

Democrats aren’t exactly a fire hose of fresh ideas, but they do respect science and have a sense of responsibility to not play around with big systems without an ounce of study. Not so Trump. He scrapped the Paris climate treaty without consulting one climate scientist — and no G.O.P. leader protested. Think about that.

That failure is particularly relevant because, as this column has been arguing, “climate change” is the right analytical framework for thinking about how we shape policy today. Why? Because we’re going through three climate changes at once:

We’re going through a change in the actual climate — disruptive, destructive weather events are steadily on the rise.

We’re going through a change in the “climate” of globalization — going from an interconnected world to an interdependent one, from a world of walls where you build your wealth by hoarding the most resources to a world of webs where you build your wealth by having the most connections to the flow of ideas, networks, innovators and entrepreneurs. In this interdependent world, connectivity leads to prosperity and isolation leads to poverty. We got rich by being “America Connected” not “America First.”

Finally, we’re going through a change in the “climate” of technology and work. We’re moving into a world where computers and algorithms can analyze (reveal previously hidden patterns); optimize (tell a plane which altitude to fly each mile to get the best fuel efficiency); prophesize (tell you when your elevator will break or what your customer is likely to buy); customize (tailor any product or service for you alone); and digitize and automatize more and more products and services. Any company that doesn’t deploy all six elements will struggle, and this is changing every job and industry.

What do you need when the climate changes? Adaptation — so your citizens can get the most out of these climate changes and cushion the worst. Adaptation has to happen at the individual, community and national levels.

At the individual level, the single most important adaptation is to become a lifelong learner, so you can constantly add value beyond what machines and algorithms can do.

“When work was predictable and the change rate was relatively constant, preparation for work merely required the codification and transfer of existing knowledge and predetermined skills to create a stable and deployable work force,” explains education consultant Heather McGowan. “Now that the velocity of change has accelerated, due to a combination of exponential growth in technology and globalization, learning can no longer be a set dose of education consumed in the first third of one’s life.” In this age of accelerations, “the new killer skill set is an agile mind-set that values learning over knowing.”

At the community level, the U.S. communities that are thriving are the ones building what I call complex adaptive coalitions. These comprise local businesses that get deeply involved in shaping the skills being taught in the public schools and community colleges, buttressed by civic and philanthropic groups providing supplemental learning opportunities and internships. Then local government catalyzes these coalitions and hires recruiters to go into the world to find investors for their local communal assets.

These individual and communal adaptation strategies dictate the national programs you want: health care that is as portable as possible so people can easily move from job to job; as much free or tax-deductible education as possible, so people can afford to be lifelong learners; reducing taxes on corporations and labor to stimulate job creation and relying instead on a carbon tax that raises revenues and mitigates costly climate change; and immigration and trade policies that are as open as possible, because in an age of acceleration the most open country will get the change signals first and attract the most high-I.Q. risk takers who start new companies.

There was no good time for Donald Trump to be president. But this is a uniquely bad time for us to have a race-baiting, science-denying divider in chief. He is impossible to ignore, and yet reacting to his daily antics only makes us stupid — only makes our society less focused on the huge adaptation challenges at hand.


The Abuses of History

Sept 25, 2017

Chris Hedges

Mr. Fish

Historians, like journalists, are in the business of manipulating facts. Some use facts to tell truths, however unpleasant. But many more omit, highlight and at times distort them in ways that sustain national myths and buttress dominant narratives. The failure by most of the United States’ popular historians and the press to tell stories of oppression and the struggles against it, especially by women, people of color, the working class and the poor, has contributed to the sickening triumphalism and chauvinism that are poisoning our society. The historian James W. Loewen, in his book “Lies Across America: What Our Historic Markers and Monuments Get Wrong,” calls the monuments that celebrate our highly selective and distorted history a “landscape of denial.”

The historian Carl Becker wrote, “History is what the present chooses to remember about the past.” And as a nation founded on the pillars of genocide, slavery, patriarchy, violent repression of popular movements, savage war crimes committed to expand the empire, and capitalist exploitation, we choose to remember very little. This historical amnesia, as James Baldwin never tired of pointing out, is very dangerous. It feeds self-delusion. It severs us from recognition of our propensity for violence. It leads us to project onto others—almost always the vulnerable—the unacknowledged evil that lies in our past and our hearts. It shuts down the voices of the oppressed, those who can tell us who we are and enable us through self-reflection and self-criticism to become a better people. “History does not merely refer to the past … history is literally present in all we do,” Baldwin wrote.

If we understood our real past we would see as lunacy Donald Trump’s bombastic assertions that the removal of Confederate statues is an attack on “our history.” Whose history is being attacked? And is it history that is being attacked or the myth disguised as history and perpetuated by white supremacy and capitalism? As the historian Eric Foner points out, “Public monuments are built by those with sufficient power to determine which parts of history are worth commemorating and what vision of history ought to be conveyed.”

The clash between historical myth and historical reality is being played out in the president’s disparaging of black athletes who protest indiscriminate police violence against people of color. “Maybe he should find a country that works better for him,” candidate Trump said of professional quarterback Colin Kaepernick, who knelt during the national anthem at National Football League games to protest police violence. Other NFL players later emulated his protest.

Friday at a political rally in Alabama, Trump bellowed: “Wouldn’t you love to see one of these NFL owners, when somebody disrespects our flag, to say, ‘Get that son of a bitch off the field right now. Out! He’s fired. He’s fired!’ ” That comment and a Saturday morning tweet by Trump that criticized professional basketball star Stephen Curry, another athlete of African-American descent, prompted a number of prominent sports figures to respond angrily. One addressed the president as “U bum” on Twitter.

The war of words between the president and black athletes is about competing historical narratives.

Historians are rewarded for buttressing the ruling social structure, producing heavy tomes on the ruling elites—usually powerful white men such as John D. Rockefeller or Theodore Roosevelt—and ignoring the underlying social movements and radicals that have been the true engines of cultural and political change in the United States. Or they retreat into arcane and irrelevant subjects of minor significance, becoming self-appointed specialists of the banal or the trivial. They ignore or minimize inconvenient facts and actions that tarnish the myth, including lethal suppression of groups, classes and civilizations and the plethora of lies told by the ruling elites, the mass media and powerful institutions to justify their grip on power. They eschew transcendental and moral issues, including class conflict, in the name of neutrality and objectivity. The mantra of disinterested scholarship and the obsession with data collection add up, as the historian Howard Zinn wrote, “to the fear that using our intelligence to further our moral ends is somehow improper.”

“Objectivity is an interesting and often misunderstood word,” Foner said. “I tell my students what objectivity means is you have an open mind, not an empty mind. There is no person who doesn’t have preconceptions, values, assumptions. And you bring those to the study of history. What it means to be objective is if you begin encountering evidence, research, that questions some of your assumptions, you may have to change your mind. You have to have an open mind in your encounters with the evidence. But that doesn’t mean you don’t take a stance. You have an obligation. If you’ve done all this studying, done all this research, if you understand key issues in American history better than most people, just because you’ve done the research and they haven’t, you have an obligation as a citizen to speak up about it. …We should not be bystanders. We should be active citizens. Being a historian and an active citizen is not mutually contradictory.”

Historians who apologize for the power elites, who in essence shun complexity and minimize inconvenient truths, are rewarded and promoted. They receive tenure, large book contracts, generous research grants, lucrative speaking engagements and prizes. Truth tellers, such as Zinn, are marginalized. Friedrich Nietzsche calls this process “creative forgetfulness.”

“In high school,” Foner said, “I got a history textbook that said ‘Story of American History,’ which was very one-dimensional. It was all about the rise of freedom and liberty. Slavery was omitted almost entirely. The general plight of African-Americans and other non-whites was pretty much omitted from this story. It was very partial. It was very limited. That’s the same thing with all these statues and [the debate about them]. I’m not saying we should tear down every single statue of every Confederate all over the place. But if we step back and look at the public presentation of history, particularly in the South, through these monuments, where are the black people of the South? Where are the monuments to the victims of slavery? To the victims of lynching? The monuments of the black leaders of Reconstruction? The first black senators and members of Congress? My view is, as well as taking down some statues, we need to put up others. If we want to have a public commemoration of history, it ought to be diverse enough to include the whole history, not just the history that those in power want us to remember.”

“Civil War monuments glorify soldiers and generals who fought for Southern independence,” Foner writes in “Battles for Freedom: The Use and Abuse of American History,” “explaining their motivation by reference to the ideals of freedom, states’ rights and individual autonomy—everything, that is, but slavery, the ‘cornerstone of the Confederacy,’ according to its vice president, Alexander Stephens. Fort Mill, South Carolina, has a marker honoring the ‘faithful slaves’ of the Confederate states, but one would be hard pressed to find monuments anywhere in the country to slave rebels like Denmark Vesey and Nat Turner, to the 200,000 black soldiers and sailors who fought for the Union (or, for that matter, the thousands of white Southerners who remained loyal to the nation).”

The United Daughters of the Confederacy, as Loewen points out, erected most of the South’s Confederate monuments between 1890 and 1920. This campaign of commemoration was part of what Foner calls “a conscious effort to glorify and sanitize the Confederate cause and legitimize the newly installed Jim Crow system.”

Gen. Nathan Bedford Forrest, who Loewen writes was “one of the most vicious racists in American history,” was one of the South’s biggest slave traders, commander of the forces that massacred black Union troops after they surrendered at Fort Pillow and the founder of the Ku Klux Klan. Yet, as Foner notes, “there are more statues, markers and busts of Forrest in Tennessee than of any other figure in the state’s history, including President Andrew Jackson.”

“Only one transgression was sufficiently outrageous to disqualify Confederate leaders from the pantheon of heroes,” Foner writes. “No statue of James Longstreet, a far abler commander than Forrest, graces the Southern countryside, and Gen. James Fleming Fagan is omitted from the portrait gallery of famous figures of Arkansas history in Little Rock. Their crime? Both supported black rights during Reconstruction.”

The American myth also relies heavily on a distorted history of the westward expansion.

“The mythology of the West is deeply rooted in our culture,” Foner said, “whether it’s in Western movies or the idea of the lone pioneer, the individual roughing it out in the West, and of course, the main lie is that the West was kind of empty before white settlers and hunters and trappers and farmers came from the East to settle it. In fact, the West has been populated since forever. The real story of the West is the clash of all these different peoples, Native Americans, Asians in California, settlers coming in from the East, Mexicans. The West was a very multicultural place. There are a lot of histories there. Many of those histories are ignored or subordinated in this one story of the westward movement.”

“Racism is certainly a part of Western history,” Foner said. “But you’re not going to get that from a John Wayne movie [or] the paintings by [Frederic] Remington and others. It’s a history that doesn’t help you understand the present.”

Remington’s racism, displayed in paintings of noble white settlers and cowboys battling “savages,” was pronounced. “Jews—inguns—chinamen—Italians—Huns,” he wrote, were “the rubbish of the earth I hate.” In the same letter he added, “I’ve got some Winchesters and when the massacreing begins … I can get my share of ’em and whats more I will.”

Nietzsche identified three approaches to history: monumental, antiquarian and critical, the last being “the history that judges and condemns.”

“The monumental is the history that glorifies the nation-state that is represented in monuments that do not question anything about the society,” Foner said. “A lot of history is like that. The rise of history as a discipline coincided with the rise of the nation-state. Every nation needs a set of myths to justify its own existence. Another one of my favorite writers, Ernest Renan, the French historian, wrote, ‘The historian is the enemy of the nation.’ It’s an interesting thing to say. He doesn’t mean they’re spies or anything. The historian comes along and takes apart the mythologies that are helping to underpin the legitimacy of the nation. That’s why people don’t like them very often. They don’t want to hear these things. Antiquarian is what a lot of people are. That’s fine. They’re looking for their personal roots, their family history. They’re going on ancestry.com to find out where their DNA came from. That’s not really history exactly. They don’t have much of a historical context. But it stimulates people to think about the past. Then there’s what Nietzsche calls critical history—the history that judges and condemns. It takes a moral stance. It doesn’t just relate the facts. It tells you what is good and what is evil. A lot of historians don’t like to do that. But to me, it’s important. It’s important for the historian, having done the research, having presented the history, to say here’s where I stand in relation to all these important issues in our history.”

“Whether it’s Frederick Douglass, Eugene Debs, Elizabeth Cady Stanton, Martin Luther King Jr., those are the people who were trying to make America a better place,” Foner said. “King, in particular, was a very radical guy.”

Yet, as Foner points out, King is effectively “frozen in one speech, one sentence: I want my children to be judged by the content of their character, not just the color of their skin. [But] that’s not what the whole civil rights movement was about. People forget, he died leading a poor people’s march, leading a strike of sanitation workers. He wasn’t just out there talking about civil rights. He had moved to economic equality as a fundamental issue.”

Max Weber wrote, “What is possible would never have been achieved if, in this world, people had not repeatedly reached for the impossible.”

Foner, like Weber, argues that it is the visionaries and utopian reformers such as Debs and the abolitionists who brought about real social change, not the “practical” politicians. The abolitionists destroyed what Foner calls the “conspiracy of silence by which political parties, churches and other institutions sought to exclude slavery from public debate.” He writes:

For much of the 1850s and the first two years of the Civil War, Lincoln—widely considered the model of a pragmatic politician—advocated a plan to end slavery that involved gradual emancipation, monetary compensation for slave owners, and setting up colonies of freed blacks outside the United States. The harebrained scheme had no possibility of enactment. It was the abolitionists, still viewed by some historians as irresponsible fanatics, who put forward the program—an immediate and uncompensated end to slavery, with black people becoming US citizens—that came to pass (with Lincoln’s eventual help, of course).

The political squabbles that dominate public discourse almost never question the sanctity of private property, individualism, capitalism or imperialism. They hold as sacrosanct American “virtues.” They insist that Americans are a “good” people steadily overcoming any prejudices and injustices that may have occurred in the past. The debates between the Democrats and the Whigs, or today’s Republicans and Democrats, have roots in the same allegiance to the dominant structures of power, myth of American exceptionalism and white supremacy.

“It’s all a family quarrel without any genuine, serious disagreements,” Foner said.

Those who challenge these structures, who reach for the impossible, who dare to speak the truth, have been, throughout American history, dismissed as “fanatics.” But, as Foner points out, it is often the “fanatics” who make history.

Posted in Marty's Blog

The Great Flood

Chris Hedges
Sept 12, 2017
Mr. Fish

How many times will we rebuild Florida’s cities, Houston, coastal New Jersey, New Orleans and other population centers ravaged by storms lethally intensified by global warming? At what point, surveying the devastation and knowing more is inevitable, will we walk away, leaving behind vast coastal dead zones? Will we retreat even further into magical thinking to cope with the fury we have unleashed from the natural world? Or will we respond rationally and radically alter our relationship to this earth that gives us life?

Civilizations over the past 6,000 years have unfailingly squandered their futures through acts of colossal stupidity and hubris. We are probably not an exception. The physical ruins of these empires, including the Mesopotamian, Roman, Mayan and Indus, litter the earth. They elevated, during acute distress, inept and corrupt leaders who channeled anger, fear and dwindling resources into self-defeating wars and vast building projects. The ruling oligarchs, driven by greed and hedonism, retreated into privileged compounds—the Forbidden City, Versailles—and hoarded wealth as their populations endured mounting misery and poverty. The worse it got, the more the people lied to themselves and the more they wanted to be lied to. Reality was too painful to confront. They retreated into what anthropologists call “crisis cults,” which promised the return of the lost world through magical beliefs.

“The most significant characteristic of modern civilization is the sacrifice of the future for the present,” philosopher and psychologist William James wrote, “and all the power of science has been prostituted to this purpose.”

We are entering this final phase of civilization, one in which we are slashing the budgets of the very agencies that are vital to prepare for the devastation ahead—the National Oceanic and Atmospheric Administration, the Federal Emergency Management Agency and the Environmental Protection Agency, along with programs at the National Aeronautics and Space Administration dealing with climate change. Hurricane after hurricane, monster storm after monster storm, flood after flood, wildfire after wildfire, drought after drought will gradually cripple the empire, draining its wealth and resources and creating swathes of territory defined by lawlessness and squalor.

These dead zones will obliterate not only commercial and residential life but also military assets. As Jeff Goodell points out in “The Water Will Come: Rising Seas, Sinking Cities and the Remaking of the Civilized World,” “The Pentagon manages a global real estate portfolio that includes over 555,000 facilities and 28 million acres of land—virtually all of it will be impacted by climate change in some way.”

As this column is being written, three key military facilities in Florida have been evacuated: the Miami-area headquarters of the U.S. Southern Command, which oversees military operations in the Caribbean and Latin America; the U.S. Central Command in Tampa, in charge of overseas operations in the Middle East and Southwest Asia; and the Naval Air Station in Key West. There will soon come a day when the obliteration of infrastructure will prevent military operations from returning. Add to the list of endangered military installations Eglin Air Force Base in the Florida Panhandle, the U.S. missile base in the Marshall Islands, the U.S. naval base on Diego Garcia and numerous other military sites in coastal areas, and it becomes painfully clear that the existential peril facing the empire is not in the Middle East but in the seas and the skies. There are 128 U.S. military installations at risk from rising sea levels, including Navy, Air Force, Marine and Army facilities in Virginia. Giant vertical rulers dot the highway outside the Norfolk naval base to allow motorists to determine whether the water is too deep to drive through. In two decades, maybe less, the main road to the base will be impassable daily at high tide.

Cities across the globe, including London, Shanghai, Rio de Janeiro, Mumbai, Lagos, Copenhagen, New Orleans, San Francisco, Savannah, Ga., and New York, will become modern-day versions of Atlantis, along with countries such as Bangladesh and the Marshall Islands and large parts of New Zealand and Australia. There are 90 coastal cities in the U.S. that endure chronic flooding, a number that is expected to double in the next two decades. National economies will go into tailspins as wider and wider parts of the globe suffer catastrophic systems breakdown. Central authority and basic services will increasingly be nonexistent. Hundreds of millions of people, desperate for food, water and security, will become climate refugees. Nuclear power plants, including Turkey Point, which is on the edge of Biscayne Bay south of Miami, will face meltdowns, such as the one that occurred at the Fukushima nuclear plant in Japan after it was destroyed by an earthquake and tsunami. These plants will spew radioactive waste into the sea and air. Exacerbated by disintegration of the polar ice caps, the catastrophes will be too overwhelming to manage. We will enter what James Howard Kunstler calls “the long emergency.” When that happens, our experiment in civilization might approach an end.

“The amount of real estate at risk in New York is mind-boggling: 72,000 buildings worth over $129 billion stand in flood zones today, with thousands more buildings at risk with each foot of sea-level rise,” writes Jeff Goodell. “In addition, New York has a lot of industrial waterfront, where toxic materials and poor communities live in close proximity, as well as a huge amount of underground infrastructure—subways, tunnels, electrical systems. Finally, New York is a sea-level-rise hot spot. Because of changes in ocean dynamics, as well as the fact that the ground beneath the city is sinking as the continent recovers from the last ice age, seas are now rising about 50 percent faster in the New York area than the global average.”

A society in crisis flees to the reassuring embrace of con artists and charlatans. Critics who ring alarm bells are condemned as pessimists who offer no “hope,” the drug that keeps a doomed population passive. The current administration—which removed Barack Obama’s Climate Action Plan from the White House website as soon as Donald Trump took office—and the Republican Party are filled with happy climate deniers. They have adopted a response to climate change similar to that of the Virginia Legislature: ban discussion of climate change and replace the term with the less ominous “recurrent flooding.” This denial of reality—one also employed by those who assure us we can adapt—is driven by the fossil fuel and animal agriculture industries that, along with the rich and corporations, fund the political campaigns of elected officials. They fear that a rational, effective response to climate change will impede profits. Our corporate media, dependent on advertising dollars, contributes to the conspiracy of silence. It ignores the patterns and effects of climate change, focusing instead on feel-good stories about heroic rescues or dramatic coverage of flooded city centers and storm refugee caravans fleeing up the coast of Florida.

Droughts, floods, famines and disease will eventually see the collapse of social cohesion in large parts of the globe, including U.S. coastal areas. The insecurity, hunger and desperation among the dispossessed of the earth will give rise to ad hoc militias, crime and increased acts of terrorism. The Pentagon report “An Abrupt Climate Change Scenario and Its Implications for United States Security” is blunt. “Disruption and conflict will be endemic features of life,” it grimly concludes.

But as Goodell points out, “In today’s political climate, open discussion of the security risks of climate change is viewed as practically treasonous.” When in 2014 then-Secretary of State John Kerry called climate change “perhaps the world’s most fearsome weapon of mass destruction” and compared it to the effects of terrorism, epidemics and poverty, the right-wing trolls, from John McCain to Newt Gingrich, went into a frenzy. Gingrich called for Kerry’s resignation because “a delusional secretary of state is dangerous to our safety.”

James Woolsey, the former head of the CIA, wrote in a climate change report for the Pentagon titled “The Age of Consequences: The Foreign Policy and National Security Implications of Global Climate Change”:

If Americans have difficulty reaching a reasonable compromise on immigration legislation today, consider what such a debate would be like if we were struggling to resettle millions of our own citizens—driven by high water from the Gulf of Mexico, South Florida, and much of the East Coast reaching nearly to New England—even as we witnessed the northward migration of large populations from Latin America and the Caribbean. Such migration will likely be one of the Western Hemisphere’s early social consequences of climate change and sea level rise of these orders of magnitude. Issues deriving from inundation of a large amount of our own territory, together with migration towards our borders by millions of our hungry and thirsty southern neighbors, are likely to dominate U.S. security and humanitarian concerns. Globally as well, populations will migrate from increasingly hot and dry climates to more temperate ones.

We will react like most patients with a terminal disease as they struggle to confront their imminent mortality. The gradual diminishing of space, perception and strength will weaken our capacity to absorb reality. The end will be too horrible to contemplate. The tangible signs of our demise will be obvious, but this will only accelerate our retreat into delusional thinking. We will believe ever more fervently that the secular gods of science and technology will save us.

As Goodell writes, “People will notice higher tides that roll in more and more frequently. Water will pool longer in streets and parking lots. Trees will turn brown and die as they suck up salt water.” We will retreat to higher ground, cover our roofs with solar panels, finally stop using plastic and go vegan, but it will be too late. As Goodell writes, “even in rich neighborhoods, abandoned houses will linger like ghosts, filling with feral cats and other refugees looking for their own higher ground.”

The water will continue to rise. “It will have a metallic sheen and will smell bad,” Goodell writes. “Kids will get strange rashes and fevers. More people will leave [low areas]. Seawalls will crumble. In a few decades, low-lying neighborhoods will be knee-deep. Wooden houses will collapse into a sea of soda bottles, laundry detergent jugs, and plastic toothbrushes. Human bones, floated out of caskets, will be a common sight. Treasure hunters will kayak in, using small robotic submersibles to search for coins and jewelry. Modern office buildings and condo towers will lean as salt water corrodes the concrete foundations and eats away at the structural beams. Fish will school in the classrooms. Oysters will grow on submerged light poles. Religious leaders will blame sinners for the drowning of the city.”

The damage suffered by Houston, Tampa and Miami is not an anomaly. It is the beginning of the end. Ask not for whom the bell tolls. It tolls for thee.

Posted in Marty's Blog

The unholy alliance of Trump voters

Posted: 8 September 2017 in Uncategorized

It wasn’t a homogeneous bloc—whether the white working class or anti-immigrant nativists or the victims of globalization—that put Donald Trump into the White House. That’s the kind of reductionist narrative that has proliferated both before and after the fateful 2016 presidential election, all in an attempt to make sense of Trump’s “base.”

Instead, it was a complex coalition of voters, with different resentments and desires, that combined, at least via the electoral college (but not, of course, in the popular vote), to defeat Hillary Clinton and elect Trump.

That’s the conclusion arrived at by Emily Ekins [ht: db] of the Cato Institute and the Democracy Fund Voter Study Group.

According to Ekins, there were five unique clusters of Trump voters—American Preservationists (20 percent), Staunch Conservatives (31 percent), Anti-Elites (19 percent), Free Marketeers (25 percent), and the Disengaged (5 percent)—who hold very different views on a wide variety of issues, including immigration, race, American identity, moral traditionalism, international trade, and economics.

Here’s how Ekins describes these different clusters:

Staunch Conservatives are steadfast fiscal conservatives, embrace moral traditionalism, and have a moderately nativist conception of American identity and approach to immigration.

Free Marketeers are small government fiscal conservatives, free traders, with moderate to liberal positions on immigration and race. (Their vote was a vote primarily against Clinton and not a vote for Trump.)

American Preservationists lean economically progressive, believe the economic and political systems are rigged, have nativist immigration views, and a nativist and ethnocultural conception of American identity.

Anti-Elites lean economically progressive, believe the economic and political systems are rigged, and take relatively more moderate positions on immigration, race, and American identity than American Preservationists. They are also the most likely group to favor political compromise.

The Disengaged do not know much about politics, but what they do know is they feel detached from institutions and elites and are skeptical of immigration.

Call it the “unholy alliance” of Trump voters—clusters of people who had different motivations in mind when they went to the voting booth.

[Figure 4, Ekins]

A good example of their diversity is their response to the question, do you favor raising taxes on families with incomes over $200,000 a year? Overwhelming majorities of American Preservationists and Anti-Elites (and a plurality of the Disengaged) favor raising taxes, while Staunch Conservatives and Free Marketeers are opposed.

[Figure 12, Ekins]

Much the same differences arise when voters are asked whether the economic system in the United States is biased in favor of the wealthiest Americans.

In fact, Ekins found only four issues that clearly distinguish Trump voters from non-Trump voters: an intense dislike of Clinton, a more dismal view of their personal financial situations, support for a temporary ban on Muslim immigration, and opposition to illegal immigration. Otherwise, as Ekins explains, Trump voters diverge on a wide variety of salient issues, including taxes, entitlements, immigration, race, pluralism, traditionalism, and social conservatism.

As I see it, Ekins’s analysis of Trump voters is significant for two reasons: First, it reveals how complex—and shaky or unstable—the coalition is. It’s going to make it difficult for Trump and the Republican Congress to govern in any kind of unified fashion. Second, it creates real opportunities for the political opposition, depending on how it reorganizes itself in the months and years ahead and whether or not it is able to move beyond the Clinton-dominated wing of the Democratic Party, to peel off significant numbers of Trump voters.

That’s only possible if, as Ekins writes, we acknowledge that “different types of people came to vote for Trump and not all for the same reasons.”

Posted in Marty's Blog

Diseases of Despair

Chris Hedges
Sept 3, 2017
Mr. Fish

The opioid crisis, the frequent mass shootings, the rising rates of suicide, especially among middle-aged white males, the morbid obesity, the obsession with gambling, the investment of our emotional and intellectual life in tawdry spectacles and the allure of magical thinking, from the absurd promises of the Christian right to the belief that reality is never an impediment to our desires, are the pathologies of a diseased culture. They have risen from a decayed world where opportunity, which confers status, self-esteem and dignity, has dried up for most Americans. They are expressions of acute desperation and morbidity.

A loss of income causes more than financial distress. It severs, as the sociologist Émile Durkheim pointed out, the vital social bonds that give us meaning. A decline in status and power, an inability to advance, a lack of education and health care and a loss of hope are crippling forms of humiliation. This humiliation fuels loneliness, frustration, anger and feelings of worthlessness. In short, when you are marginalized and rejected by society, life often has little meaning.

“When life is not worth living, everything becomes a pretext for ridding ourselves of it … ,” Durkheim wrote. “There is a collective mood, as there is an individual mood, that inclines nations to sadness. … For individuals are too closely involved in the life of society for it to be sick without their being affected. Its suffering inevitably becomes theirs.”

White men, more easily seduced by the myth of the American dream than people of color who understand how the capitalist system is rigged against them, often suffer feelings of failure and betrayal, in many cases when they are in their middle years. They expect, because of notions of white supremacy and capitalist platitudes about hard work leading to advancement, to be ascendant. They believe in success. When the American dream becomes a nightmare they are vulnerable to psychological collapse. This collapse, more than any political agenda, propelled Donald Trump into power. Trump embodies the decayed soul of America. He, like many of those who support him, has a childish yearning to be as omnipotent as the gods. This impossibility, as the cultural anthropologist Ernest Becker wrote, leads to a dark alternative: destroying like the gods.

In “Hitler and the Germans” the political philosopher Eric Voegelin dismissed the myth that Hitler—an uneducated mediocrity whose only strength was an ability to exploit political opportunities—mesmerized and seduced the German people. The Germans, he wrote, voted for Hitler and the “grotesque, marginal figures” surrounding him because he embodied the pathologies of a diseased society, one beset by economic collapse, hopelessness and violence. This sickness found its expression in the Nazis, as it has found its expression in the United States in Trump.

Hannah Arendt said the rise of radical evil is caused by collective “thoughtlessness.” Desperate to escape from the prison of a failed society, willing to do anything and abuse anyone to advance, those who feel trapped see the people around them as objects to be exploited for self-advancement. This exploitation mirrors that carried out by corrupt ruling elites. Turning people into objects to be used to achieve wealth, power or sexual gratification is the core practice espoused by popular culture, from reality television to casino capitalism. Trump personifies this practice.

Plato wrote that the moral character of a society is determined by its members. When the society abandons the common good it unleashes amoral lusts—violence, greed and sexual exploitation—and fosters magical thinking. The Greek philosopher Heraclitus called those who severed themselves from the moral and reality-based universe idiotes. When these idiotes, whose worldview is often the product of relentless indoctrination, form a majority or a powerful minority, the demagogue rises from the morass.

The demagogue is the public face of collective stupidity. Voegelin defined stupidity as a “loss of reality.” This loss of reality meant people could not “rightly orient his [or her] action in the world, in which he [or she] lives.” The demagogue, who is always an idiote, is not a freak or a social mutation. The demagogue expresses the society’s demented zeitgeist. This was true in Nazi Germany. It is true in the United States.

“The fool in Hebrew, the nabal, who because of his folly, nebala, creates disorder in the society, is the man who is not a believer, in the Israelite terms of revelation,” Voegelin wrote. “The amathes, the irrationally ignorant man, is for Plato the man who just does not have the authority of reason or who cannot bow to it. The stultus for Thomas [Aquinas] is the fool, in the same sense as the amathia of Plato and the nebala of the Israelite prophets. This stultus now has suffered loss of reality and acts on the basis of a defective image of reality and thereby creates disorder. … If I have lost certain sectors of reality from my range of experience, I will also be lacking the language for appropriately characterizing them. That means that parallel to the loss of reality and to stupidity there is always the phenomenon of illiteracy.”

A society convulsed by disorder and chaos, as Voegelin pointed out, elevates and even celebrates the morally degenerate, those who are cunning, manipulative, deceitful and violent. In an open society these attributes are despised and criminalized. Those who exhibit them are condemned as stupid—“a man [or woman] who behaves in this way,” Voegelin notes, “will be socially boycotted.” But the social, cultural and moral norms in a diseased society are inverted. The attributes that sustain an open society—a concern for the common good, honesty, trust and self-sacrifice—are detrimental to existence in a diseased society. Today, those who exhibit these attributes are targeted and silenced.

The deep alienation experienced by most Americans, the loss of self-esteem and hope, has engendered what Durkheim referred to as a collective state of anomie. Anomie is a psychological imbalance that leads to prolonged despair, lethargy and yearnings for self-annihilation. It is caused by a collapse of societal norms, ideals, values and standards. It is, in short, a loss of faith in the structures and beliefs that define a functioning democracy. The result is an obliteration of purpose and direction. It leads to what Friedrich Nietzsche called an aggressive despiritualized nihilism. As Durkheim wrote in his book “On Suicide”:

It is sometimes said that, by virtue of his psychological make-up, man cannot live unless he attaches himself to an object that is greater than himself and outlives him, and this necessity has been attributed to a supposedly common need not to perish entirely. Life, they say, is only tolerable if one can see some purpose in it, if it has a goal and one that is worth pursuing. But the individual in himself is not sufficient as an end for himself. He is too small a thing. Not only is he confined in space, he is also narrowly limited in time. So when we have no other objective than ourselves, we cannot escape from the feeling our efforts are finally destined to vanish into nothing, since that is where we must return. But we recoil from the idea of annihilation. In such a state, we should not have the strength to live, that is to say to act and struggle, since nothing is to remain of all the trouble that we take. In a word, the state of egoism is in contradiction with human nature and hence too precarious to endure.

Pope John Paul II in 1981 issued an encyclical titled “Laborem exercens,” or “Through Work.” He attacked the idea, fundamental to capitalism, that work was merely an exchange of money for labor. Work, he wrote, should not be reduced to the commodification of human beings through wages. Workers were not impersonal instruments to be manipulated like inanimate objects to increase profit. Work was essential to human dignity and self-fulfillment. It gave us a sense of empowerment and identity. It allowed us to build a relationship with society in which we could feel we contributed to social harmony and social cohesion, a relationship in which we had purpose.

The pope castigated unemployment, underemployment, inadequate wages, automation and a lack of job security as violations of human dignity. These conditions, he wrote, were forces that negated self-esteem, personal satisfaction, responsibility and creativity. The exaltation of the machine, he warned, reduced human beings to the status of slaves. He called for full employment, a minimum wage large enough to support a family, the right of a parent to stay home with children, and jobs and a living wage for the disabled. He advocated, in order to sustain strong families, universal health insurance, pensions, accident insurance and work schedules that permitted free time and vacations. He wrote that all workers should have the right to form unions with the ability to strike.

The encyclical said:

[In spite of toil]—perhaps, in a sense, because of it—work is a good thing for man. Even though it bears the mark of a bonum arduum, in the terminology of Saint Thomas, this does not take away the fact that, as such, it is a good thing for man. It is not only good in the sense that it is useful or something to enjoy; it is also good as being something worthy, that is to say, something that corresponds to man’s dignity, that expresses this dignity and increases it. If one wishes to define more clearly the ethical meaning of work, it is this truth that one must particularly keep in mind. Work is a good thing for man—a good thing for his humanity—because through work man not only transforms nature, adapting it to his own needs, but he also achieves fulfillment as a human being and indeed, in a sense, becomes “more a human being.”

Work, the pope pointed out, “constitutes a foundation for the formation of family life, which is a natural right and something that man is called to. These two spheres of values—one linked to work and the other consequent on the family nature of human life—must be properly united and must properly permeate each other. In a way, work is a condition for making it possible to found a family, since the family requires the means of subsistence which man normally gains through work. Work and industriousness also influence the whole process of education in the family, for the very reason that everyone ‘becomes a human being’ through, among other things, work, and becoming a human being is precisely the main purpose of the whole process of education. Obviously, two aspects of work in a sense come into play here: the one making family life and its upkeep possible, and the other making possible the achievement of the purposes of the family, especially education. Nevertheless, these two aspects of work are linked to one another and are mutually complementary in various points.”

“It must be remembered and affirmed that the family constitutes one of the most important terms of reference for shaping the social and ethical order of human work,” the encyclical continued. “The teaching of the Church has always devoted special attention to this question, and in the present document we shall have to return to it. In fact, the family is simultaneously a community made possible by work and the first school of work, within the home, for every person.”

We will not bring those who have fled a reality-based world back into our fold through argument. We will not coerce them into submission. We will not find salvation for them or ourselves by supporting the Democratic Party. Whole segments of American society are bent on self-immolation. They despise this world and what it has done to them. Their personal and political behavior is willfully suicidal. They seek to destroy, even if destruction leads to death. We must organize our communities to create a new socialist order and overthrow the corporate state through sustained acts of mass civil disobedience. We must achieve full employment, guaranteed minimum incomes, health insurance, free education at all levels, robust protection of the natural world and an end to militarism and imperialism. We must create the possibility for a life of dignity, purpose and self-esteem. If we do not, the idiotes will ensure our obliteration.

Posted in Marty's Blog

How America Lost Its Mind

The nation’s current post-truth moment is the ultimate expression of mind-sets that have made America exceptional throughout its history.

Kurt Andersen September 2017 Issue

“You are entitled to your own opinion,
but you are not entitled to your own facts.”

— Daniel Patrick Moynihan

“We risk being the first people in history to have been
able to make their illusions so vivid, so persuasive,
so ‘realistic’ that they can live in them.”

— Daniel J. Boorstin, The Image: A Guide to
Pseudo-Events in America (1961)


When did America become untethered from reality?

I first noticed our national lurch toward fantasy in 2004, after President George W. Bush’s political mastermind, Karl Rove, came up with the remarkable phrase reality-based community. People in “the reality-based community,” he told a reporter, “believe that solutions emerge from your judicious study of discernible reality … That’s not the way the world really works anymore.” A year later, The Colbert Report went on the air. In the first few minutes of the first episode, Stephen Colbert, playing his right-wing-populist commentator character, performed a feature called “The Word.” His first selection: truthiness. “Now, I’m sure some of the ‘word police,’ the ‘wordinistas’ over at Webster’s, are gonna say, ‘Hey, that’s not a word!’ Well, anybody who knows me knows that I’m no fan of dictionaries or reference books. They’re elitist. Constantly telling us what is or isn’t true. Or what did or didn’t happen. Who’s Britannica to tell me the Panama Canal was finished in 1914? If I wanna say it happened in 1941, that’s my right. I don’t trust books—they’re all fact, no heart … Face it, folks, we are a divided nation … divided between those who think with their head and those who know with their heart … Because that’s where the truth comes from, ladies and gentlemen—the gut.”

Whoa, yes, I thought: exactly. America had changed since I was young, when truthiness and reality-based community wouldn’t have made any sense as jokes. For all the fun, and all the many salutary effects of the 1960s—the main decade of my childhood—I saw that those years had also been the big-bang moment for truthiness. And if the ’60s amounted to a national nervous breakdown, we are probably mistaken to consider ourselves over it.

 

Each of us is on a spectrum somewhere between the poles of rational and irrational. We all have hunches we can’t prove and superstitions that make no sense. Some of my best friends are very religious, and others believe in dubious conspiracy theories. What’s problematic is going overboard—letting the subjective entirely override the objective; thinking and acting as if opinions and feelings are just as true as facts. The American experiment, the original embodiment of the great Enlightenment idea of intellectual freedom, whereby every individual is welcome to believe anything she wishes, has metastasized out of control. From the start, our ultra-individualism was attached to epic dreams, sometimes epic fantasies—every American one of God’s chosen people building a custom-made utopia, all of us free to reinvent ourselves by imagination and will. In America nowadays, those more exciting parts of the Enlightenment idea have swamped the sober, rational, empirical parts. Little by little for centuries, then more and more and faster and faster during the past half century, we Americans have given ourselves over to all kinds of magical thinking, anything-goes relativism, and belief in fanciful explanation—small and large fantasies that console or thrill or terrify us. And most of us haven’t realized how far-reaching our strange new normal has become.

Much more than the other billion or so people in the developed world, we Americans believe—really believe—in the supernatural and the miraculous, in Satan on Earth, in reports of recent trips to and from heaven, and in a story of life’s instantaneous creation several thousand years ago.

We believe that the government and its co-conspirators are hiding all sorts of monstrous and shocking truths from us, concerning assassinations, extraterrestrials, the genesis of AIDS, the 9/11 attacks, the dangers of vaccines, and so much more.

And this was all true before we became familiar with the terms post-factual and post-truth, before we elected a president with an astoundingly open mind about conspiracy theories, what’s true and what’s false, the nature of reality.

We have passed through the looking glass and down the rabbit hole. America has mutated into Fantasyland.

How widespread is this promiscuous devotion to the untrue? How many Americans now inhabit alternate realities? Any given survey of beliefs is only a sketch of what people in general really think. But reams of survey research from the past 20 years reveal a rough, useful census of American credulity and delusion. By my reckoning, the solidly reality-based are a minority, maybe a third of us but almost certainly fewer than half. Only a third of us, for instance, don’t believe that the tale of creation in Genesis is the word of God. Only a third strongly disbelieve in telepathy and ghosts. Two-thirds of Americans believe that “angels and demons are active in the world.” More than half say they’re absolutely certain heaven exists, and just as many are sure of the existence of a personal God—not a vague force or universal spirit or higher power, but some guy. A third of us believe not only that global warming is no big deal but that it’s a hoax perpetrated by scientists, the government, and journalists. A third believe that our earliest ancestors were humans just like us; that the government has, in league with the pharmaceutical industry, hidden evidence of natural cancer cures; that extraterrestrials have visited or are visiting Earth. Almost a quarter believe that vaccines cause autism, and that Donald Trump won the popular vote in 2016. A quarter believe that our previous president maybe or definitely was (or is?) the anti-Christ. According to a survey by Public Policy Polling, 15 percent believe that the “media or the government adds secret mind-controlling technology to television broadcast signals,” and another 15 percent think that’s possible. A quarter of Americans believe in witches. Remarkably, the same fraction, or maybe less, believes that the Bible consists mainly of legends and fables—the same proportion that believes U.S. officials were complicit in the 9/11 attacks.

When I say that a third believe X and a quarter believe Y, it’s important to understand that those are different thirds and quarters of the population. Of course, various fantasy constituencies overlap and feed one another—for instance, belief in extraterrestrial visitation and abduction can lead to belief in vast government cover-ups, which can lead to belief in still more wide-ranging plots and cabals, which can jibe with a belief in an impending Armageddon.

Why are we like this?

The short answer is because we’re Americans—because being American means we can believe anything we want; that our beliefs are equal or superior to anyone else’s, experts be damned. Once people commit to that approach, the world turns inside out, and no cause-and-effect connection is fixed. The credible becomes incredible and the incredible credible.

The word mainstream has recently become a pejorative, shorthand for bias, lies, oppression by the elites. Yet the institutions and forces that once kept us from indulging the flagrantly untrue or absurd—media, academia, government, corporate America, professional associations, respectable opinion in the aggregate—have enabled and encouraged every species of fantasy over the past few decades.

A senior physician at one of America’s most prestigious university hospitals promotes “miracle cures” on his daily TV show. Cable channels air documentaries treating mermaids, monsters, ghosts, and angels as real. When a political-science professor attacks the idea “that there is some ‘public’ that shares a notion of reality, a concept of reason, and a set of criteria by which claims to reason and rationality are judged,” colleagues just nod and grant tenure. The old fringes have been folded into the new center. The irrational has become respectable and often unstoppable.

Our whole social environment and each of its overlapping parts—cultural, religious, political, intellectual, psychological—have become conducive to spectacular fallacy and truthiness and make-believe. There are many slippery slopes, leading in various directions to other exciting nonsense. During the past several decades, those naturally slippery slopes have been turned into a colossal and permanent complex of interconnected, crisscrossing bobsled tracks, which Donald Trump slid down right into the White House.

American moxie has always come in two types. We have our wilder, faster, looser side: We’re overexcited gamblers with a weakness for stories too good to be true. But we also have the virtues embodied by the Puritans and their secular descendants: steadiness, hard work, frugality, sobriety, and common sense. A propensity to dream impossible dreams is like other powerful tendencies—okay when kept in check. For most of our history, the impulses existed in a rough balance, a dynamic equilibrium between fantasy and reality, mania and moderation, credulity and skepticism.

The great unbalancing and descent into full Fantasyland was the product of two momentous changes. The first was a profound shift in thinking that swelled up in the ’60s; since then, Americans have had a new rule written into their mental operating systems: Do your own thing, find your own reality, it’s all relative.

The second change was the onset of the new era of information. Digital technology empowers real-seeming fictions of the ideological and religious and scientific kinds. Among the web’s 1 billion sites, believers in anything and everything can find thousands of fellow fantasists, with collages of facts and “facts” to support them. Before the internet, crackpots were mostly isolated, and surely had a harder time remaining convinced of their alternate realities. Now their devoutly believed opinions are all over the airwaves and the web, just like actual news. Now all of the fantasies look real.

Today, each of us is freer than ever to custom-make reality, to believe whatever and pretend to be whoever we wish. Which makes all the lines between actual and fictional blur and disappear more easily. Truth in general becomes flexible, personal, subjective. And we like this new ultra-freedom, insist on it, even as we fear and loathe the ways so many of our wrongheaded fellow Americans use it.

Treating real life as fantasy and vice versa, and taking preposterous ideas seriously, is not unique to Americans. But we are the global crucible and epicenter. We invented the fantasy-industrial complex; almost nowhere outside poor or otherwise miserable countries are flamboyant supernatural beliefs so central to the identities of so many people. This is American exceptionalism in the 21st century. The country has always been a one-of-a-kind place. But our singularity is different now. We’re still rich and free, still more influential and powerful than any other nation, practically a synonym for developed country. But our drift toward credulity, toward doing our own thing, toward denying facts and having an altogether uncertain grip on reality, has overwhelmed our other exceptional national traits and turned us into a less developed country.

People see our shocking Trump moment—this post-truth, “alternative facts” moment—as some inexplicable and crazy new American phenomenon. But what’s happening is just the ultimate extrapolation and expression of mind-sets that have made America exceptional for its entire history.

America was created by true believers and passionate dreamers, and by hucksters and their suckers, which made America successful—but also by a people uniquely susceptible to fantasy, as epitomized by everything from Salem’s hunting witches to Joseph Smith’s creating Mormonism, from P. T. Barnum to speaking in tongues, from Hollywood to Scientology to conspiracy theories, from Walt Disney to Billy Graham to Ronald Reagan to Oprah Winfrey to Trump. In other words: Mix epic individualism with extreme religion; mix show business with everything else; let all that ferment for a few centuries; then run it through the anything-goes ’60s and the internet age. The result is the America we inhabit today, with reality and fantasy weirdly and dangerously blurred and commingled.


The 1960s and the Beginning of the End of Reason


I don’t regret or disapprove of many of the ways the ’60s permanently reordered American society and culture. It’s just that along with the familiar benefits, there have been unreckoned costs.

In 1962, people started referring to “hippies,” the Beatles had their first hit, Ken Kesey published One Flew Over the Cuckoo’s Nest, and the Harvard psychology lecturer Timothy Leary was handing out psilocybin and LSD to grad students. And three hours south of San Francisco, on the heavenly stretch of coastal cliffs known as Big Sur, a pair of young Stanford psychology graduates founded a school and think tank they named after a small American Indian tribe that had lived on the grounds long before. “In 1968,” one of its founding figures recalled four decades later,

Esalen was the center of the cyclone of the youth rebellion. It was one of the central places, like Mecca for the Islamic culture. Esalen was a pilgrimage center for hundreds and thousands of youth interested in some sense of transcendence, breakthrough consciousness, LSD, the sexual revolution, encounter, being sensitive, finding your body, yoga—all of these things were at first filtered into the culture through Esalen. By 1966, ’67, and ’68, Esalen was making a world impact.

This is not overstatement. Essentially everything that became known as New Age was invented, developed, or popularized at the Esalen Institute. Esalen is a mother church of a new American religion for people who think they don’t like churches or religions but who still want to believe in the supernatural. The institute wholly reinvented psychology, medicine, and philosophy, driven by a suspicion of science and reason and an embrace of magical thinking (also: massage, hot baths, sex, and sex in hot baths). It was a headquarters for a new religion of no religion, and for “science” containing next to no science. The idea was to be radically tolerant of therapeutic approaches and understandings of reality, especially if they came from Asian traditions or from American Indian or other shamanistic traditions. Invisible energies, past lives, astral projection, whatever—the more exotic and wondrous and unfalsifiable, the better.

Not long before Esalen was founded, one of its co-founders, Dick Price, had suffered a mental breakdown and been involuntarily committed to a private psychiatric hospital for a year. His new institute embraced the radical notion that psychosis and other mental illnesses were labels imposed by the straight world on eccentrics and visionaries, that they were primarily tools of coercion and control. This was the big idea behind One Flew Over the Cuckoo’s Nest, of course. And within the psychiatric profession itself this idea had two influential proponents, who each published unorthodox manifestos at the beginning of the decade—R. D. Laing (The Divided Self) and Thomas Szasz (The Myth of Mental Illness). “Madness,” Laing wrote when Esalen was new, “is potentially liberation and renewal.” Esalen’s founders were big Laing fans, and the institute became a hotbed for the idea that insanity was just an alternative way of perceiving reality.

These influential critiques helped make popular and respectable the idea that much of science is a sinister scheme concocted by a despotic conspiracy to oppress people. Mental illness, both Szasz and Laing said, is “a theory not a fact.” This is now the universal bottom-line argument for anyone—from creationists to climate-change deniers to anti-vaccine hysterics—who prefers to disregard science in favor of his own beliefs.

You know how young people always think the universe revolves around them, as if they’re the only ones who really get it? And how before their frontal lobes, the neural seat of reason and rationality, are fully wired, they can be especially prone to fantasy? In the ’60s, the universe cooperated: It did seem to revolve around young people, affirming their adolescent self-regard, making their fantasies of importance feel real and their fantasies of instant transformation and revolution feel plausible. Practically overnight, America turned its full attention to the young and everything they believed and imagined and wished.

If 1962 was when the decade really got going, 1969 was the year the new doctrines and their gravity were definitively cataloged by the grown-ups. Reason and rationality were over. The countercultural effusions were freaking out the old guard, including religious people who couldn’t quite see that yet another Great Awakening was under way in America, heaving up a new religion of believers who “have no option but to follow the road until they reach the Holy City … that lies beyond the technocracy … the New Jerusalem.” That line is from The Making of a Counter Culture: Reflections on the Technocratic Society and Its Youthful Opposition, published three weeks after Woodstock, in the summer of 1969. Its author was Theodore Roszak, age 35, a Bay Area professor who thereby coined the word counterculture. Roszak spends 270 pages glorying in the younger generation’s “brave” rejection of expertise and “all that our culture values as ‘reason’ and ‘reality.’ ” (Note the scare quotes.) So-called experts, after all, are “on the payroll of the state and/or corporate structure.” A chapter called “The Myth of Objective Consciousness” argues that science is really just a state religion. To create “a new culture in which the non-intellective capacities … become the arbiters of the good [and] the true,” he writes, “nothing less is required than the subversion of the scientific world view, with its entrenched commitment to an egocentric and cerebral mode of consciousness.” He welcomes the “radical rejection of science and technological values.”

Earlier that summer, a University of Chicago sociologist (and Catholic priest) named Andrew Greeley had alerted readers of The New York Times Magazine that beyond the familiar signifiers of youthful rebellion (long hair, sex, drugs, music, protests), the truly shocking change on campuses was the rise of anti-rationalism and a return of the sacred—“mysticism and magic,” the occult, séances, cults based on the book of Revelation. When he’d chalked a statistical table on a classroom blackboard, one of his students had reacted with horror: “Mr. Greeley, I think you’re an empiricist.”

As 1969 turned to 1970, a 41-year-old Yale Law School professor was finishing his book about the new youth counterculture. Charles Reich was a former Supreme Court clerk now tenured at one of ultra-rationalism’s American headquarters. But hanging with the young people had led him to a midlife epiphany and apostasy. In 1966, he had started teaching an undergraduate seminar called “The Individual in America,” for which he assigned fiction by Kesey and Norman Mailer. He decided to spend the next summer, the Summer of Love, in Berkeley. On the road back to New Haven, he had his Pauline conversion to the kids’ values. His class at Yale became hugely popular; at its peak, 600 students were enrolled. In 1970, The Greening of America became The New York Times’ best-selling book (as well as a much-read 70-page New Yorker excerpt), and remained on the list for most of a year.

At 16, I bought and read one of the 2 million copies sold. Rereading it today and recalling how much I loved it was a stark reminder of the follies of youth. Reich was shamelessly, uncritically swooning for kids like me. The Greening of America may have been the mainstream’s single greatest act of pandering to the vanity and self-righteousness of the new youth. Its underlying theoretical scheme was simple and perfectly pitched to flatter young readers: There are three types of American “consciousness,” each of which “makes up an individual’s perception of reality … his ‘head,’ his way of life.” Consciousness I people were old-fashioned, self-reliant individualists rendered obsolete by the new “Corporate State”—essentially, your grandparents. Consciousness IIs were the fearful and conformist organization men and women whose rationalism was a tyrannizing trap laid by the Corporate State—your parents.

And then there was Consciousness III, which had “made its first appearance among the youth of America,” “spreading rapidly among wider and wider segments of youth, and by degrees to older people.” If you opposed the Vietnam War and dressed down and smoked pot, you were almost certainly a III. Simply by being young and casual and undisciplined, you were ushering in a new utopia.

Reich praises the “gaiety and humor” of the new Consciousness III wardrobe, but his book is absolutely humorless—because it’s a response to “this moment of utmost sterility, darkest night and most extreme peril.” Conspiracism was flourishing, and Reich bought in. Now that “the Corporate State has added depersonalization and repression” to its other injustices, “it has threatened to destroy all meaning and suck all joy from life.” Reich’s magical thinking mainly concerned how the revolution would turn out. “The American Corporate State,” having produced this new generation of longhaired hyperindividualists who insist on trusting their gut and finding their own truth, “is now accomplishing what no revolutionaries could accomplish by themselves. The machine has begun to destroy itself.” Once everyone wears Levi’s and gets high, the old ways “will simply be swept away in the flood.”

The inevitable/imminent happy-cataclysm part of the dream didn’t happen, of course. The machine did not destroy itself. But Reich was half-right. An epochal change in American thinking was under way and “not, as far as anybody knows, reversible … There is no returning to an earlier consciousness.” His wishful error was believing that once the tidal surge of new sensibility brought down the flood walls, the waters would flow in only one direction, carving out a peaceful, cooperative, groovy new continental utopia, hearts and minds changed like his, all of America Berkeleyized and Vermontified. Instead, Consciousness III was just one early iteration of the anything-goes, post-reason, post-factual America enabled by the tsunami. Reich’s faith was the converse of the Enlightenment rationalists’ hopeful fallacy 200 years earlier. Granted complete freedom of thought, Thomas Jefferson and company assumed, most people would follow the path of reason. Wasn’t it pretty to think so.

I remember when fantastical beliefs went fully mainstream, in the 1970s. My irreligious mother bought and read The Secret Life of Plants, a big best seller arguing that plants were sentient and would “be the bridesmaids at a marriage of physics and metaphysics.” The amazing truth about plants, the book claimed, had been suppressed by the FDA and agribusiness. My mom didn’t believe in the conspiracy, but she did start talking to her ficuses as if they were pets. In a review, The New York Times registered the book as another data point in how “the incredible is losing its pariah status.” Indeed, mainstream publishers and media organizations were falling over themselves to promote and sell fantasies as nonfiction. In 1975 came a sensational autobiography by the young spoon bender and mind reader Uri Geller as well as Life After Life, by Raymond Moody, a philosophy Ph.D. who presented the anecdotes of several dozen people who’d nearly died as evidence of an afterlife. The book sold many millions of copies; before long the International Association for Near Death Studies formed and held its first conference, at Yale.

During the ’60s, large swaths of academia made a turn away from reason and rationalism as they’d been understood. Many of the pioneers were thoughtful, their work fine antidotes to postwar complacency. The problem was the nature and extent of their influence at that particular time, when all premises and paradigms seemed up for grabs. That is, they inspired half-baked and perverse followers in the academy, whose arguments filtered out into the world at large: All approximations of truth, science as much as any fable or religion, are mere stories devised to serve people’s needs or interests. Reality itself is a purely social construction, a tableau of useful or wishful myths that members of a society or tribe have been persuaded to believe. The borders between fiction and nonfiction are permeable, maybe nonexistent. The delusions of the insane, superstitions, and magical thinking? Any of those may be as legitimate as the supposed truths contrived by Western reason and science. The takeaway: Believe whatever you want, because pretty much everything is equally true and false.

These ideas percolated across multiple academic fields. In 1965, the French philosopher Michel Foucault’s Madness and Civilization appeared in America, echoing Laing’s skepticism of the concept of mental illness; by the 1970s, Foucault was arguing that rationality itself is a coercive “regime of truth”—oppression by other means. His suspicion of reason became deeply and widely embedded in American academia.

Meanwhile, over in sociology, in 1966 a pair of professors published The Social Construction of Reality, one of the most influential works in their field. Not only were sanity and insanity and scientific truth somewhat dubious concoctions by elites, Peter Berger and Thomas Luckmann explained—so was everything else. The rulers of any tribe or society do not just dictate customs and laws; they are the masters of everyone’s perceptions, defining reality itself. To create the all-encompassing stage sets that everyone inhabits, rulers first use crude mythology, then more elaborate religion, and finally the “extreme step” of modern science. “Reality”? “Knowledge”? “If we were going to be meticulous,” Berger and Luckmann wrote, “we would put quotation marks around the two aforementioned terms every time we used them.” “What is ‘real’ to a Tibetan monk may not be ‘real’ to an American businessman.”

When I first read that, at age 18, I loved the quotation marks. If reality is simply the result of rules written by the powers that be, then isn’t everyone able—no, isn’t everyone obliged—to construct their own reality? The book was timed perfectly to become a foundational text in academia and beyond.

A more extreme academic evangelist for the idea of all truths being equal was a UC Berkeley philosophy professor named Paul Feyerabend. His best-known book, published in 1975, was Against Method: Outline of an Anarchistic Theory of Knowledge. “Rationalism,” it declared, “is a secularized form of the belief in the power of the word of God,” and science a “particular superstition.” In a later edition of the book, published when creationists were passing laws to teach Genesis in public-school biology classes, Feyerabend came out in favor of the practice, comparing creationists to Galileo. Science, he insisted, is just another form of belief. “Only one principle,” he wrote, “can be defended under all circumstances and in all stages of human development. It is the principle: anything goes.”

Over in anthropology, where the exotic magical beliefs of traditional cultures were a main subject, the new paradigm took over completely—don’t judge, don’t disbelieve, don’t point your professorial finger. This was understandable, given the times: colonialism ending, genocide of American Indians confessed, U.S. wars in the developing world. Who were we to roll our eyes or deny what these people believed? In the ’60s, anthropology decided that oracles, diviners, incantations, and magical objects should be not just respected, but considered equivalent to reason and science. If all understandings of reality are socially constructed, those of Kalabari tribesmen in Nigeria are no more arbitrary or faith-based than those of college professors.

In 1968, a UC Davis psychologist named Charles Tart conducted an experiment in which, he wrote, “a young woman who frequently had spontaneous out-of-body experiences”—didn’t “claim to have” them but “had” them—spent four nights sleeping in a lab, hooked up to an EEG machine. Her assigned task was to send her mind or soul out of her body while she was asleep and read a five-digit number Tart had written on a piece of paper placed on a shelf above the bed. He reported that she succeeded. Other scientists considered the experiments and the results bogus, but Tart proceeded to devote his academic career to proving that attempts at objectivity are a sham and magic is real. In an extraordinary paper published in 1972 in Science, he complained about the scientific establishment’s “almost total rejection of the knowledge gained” while high or tripping. He didn’t just want science to take seriously “experiences of ecstasy, mystical union, other ‘dimensions,’ rapture, beauty, space-and-time transcendence.” He was explicitly dedicated to going there. A “perfectly scientific theory may be based on data that have no physical existence,” he insisted. The rules of the scientific method had to be revised. To work as a psychologist in the new era, Tart argued, a researcher should be in the altered state of consciousness he’s studying, high or delusional “at the time of data collection” or during “data reduction and theorizing.” Tart’s new mode of research, he admitted, posed problems of “consensual validation,” given that “only observers in the same [altered state] are able to communicate adequately with one another.” Tart popularized the term consensus reality for what you or I would simply call reality, and around 1970 that became a permanent interdisciplinary term of art in academia. Later he abandoned the pretense of neutrality and started calling it the consensus trance—people committed to reason and rationality were the deluded dupes, not he and his tribe.

Even the social critic Paul Goodman, beloved by young leftists in the ’60s, was flabbergasted by his own students by 1969. “There was no knowledge,” he wrote, “only the sociology of knowledge. They had so well learned that … research is subsidized and conducted for the benefit of the ruling class that they did not believe there was such a thing as simple truth.”

Ever since, the American right has insistently decried the spread of relativism, the idea that nothing is any more correct or true than anything else. Conservatives hated how relativism undercut various venerable and comfortable ruling ideas—certain notions of entitlement (according to race and gender) and aesthetic beauty and metaphysical and moral certainty. Yet once the intellectual mainstream thoroughly accepted that there are many equally valid realities and truths, once the idea of gates and gatekeeping was discredited not just on campuses but throughout the culture, all American barbarians could have their claims taken seriously. Conservatives are correct that the anything-goes relativism of college campuses wasn’t sequestered there, but when it flowed out across America it helped enable extreme Christianities and lunacies on the right—gun-rights hysteria, black-helicopter conspiracism, climate-change denial, and more. The term useful idiot was originally deployed to accuse liberals of serving the interests of true believers further on the left. In this instance, however, postmodern intellectuals—post-positivists, poststructuralists, social constructivists, post-empiricists, epistemic relativists, cognitive relativists, descriptive relativists—turned out to be useful idiots most consequentially for the American right. “Reality has a well-known liberal bias,” Stephen Colbert once said, in character, mocking the beliefs-trump-facts impulse of today’s right. Neither side has noticed, but large factions of the elite left and the populist right have been on the same team.


Conspiracy and Paranoia in the 1970s


As the Vietnam War escalated and careened, antirationalism flowered. In his book about the remarkable protests in Washington, D.C., in the fall of 1967, The Armies of the Night, Norman Mailer describes chants (“Out demons, out—back to darkness, ye servants of Satan!”) and a circle of hundreds of protesters intending “to form a ring of exorcism sufficiently powerful to raise the Pentagon three hundred feet.” They were hoping the building would “turn orange and vibrate until all evil emissions had fled this levitation. At that point the war in Vietnam would end.”

By the end of the ’60s, plenty of zealots on the left were engaged in extreme magical thinking. They hadn’t started the decade that way. In 1962, Students for a Democratic Society adopted its founding document, drafted by 22-year-old Tom Hayden. The manifesto is sweet and reasonable: decrying inequality and poverty and “the pervasiveness of racism in American life,” seeing the potential benefits as well as the downsides of industrial automation, declaring the group “in basic opposition to the communist system.”

Then, kaboom, the big bang. Anything and everything became believable. Reason was chucked. Dystopian and utopian fantasies seemed plausible. In 1969, the SDS’s most apocalyptic and charismatic faction, calling itself Weatherman, split off and got all the attention. Its members believed that they and other young white Americans, aligned with black insurgents, would be the vanguard in a new civil war. They issued statements about “the need for armed struggle as the only road to revolution” and how “dope is one of our weapons … Guns and grass are united in the youth underground.” And then factions of the new left went to work making and setting off thousands of bombs in the early 1970s.

Left-wingers weren’t the only ones who became unhinged. Officials at the FBI, the CIA, and military intelligence agencies, as well as in urban police departments, convinced themselves that peaceful antiwar protesters and campus lefties in general were dangerous militants, and expanded secret programs to spy on, infiltrate, and besmirch their organizations. Which thereby validated the preexisting paranoia on the new left and encouraged its wing nuts’ revolutionary delusions. In the ’70s, the CIA and Army intelligence set up their infamous Project Star Gate to see whether they could conduct espionage by means of ESP.

The far right had its own glorious ’60s moment, in the form of the new John Birch Society, whose founders believed that both Republican and Democratic presidential Cabinets included “conscious, deliberate, dedicated agent[s] of the Soviet conspiracy” determined to create “a world-wide police state, absolutely and brutally governed from the Kremlin,” as the society’s founder, Robert Welch, put it in a letter to friends.

This furiously, elaborately suspicious way of understanding the world started spreading across the political spectrum after the assassination of John F. Kennedy in 1963. Dallas couldn’t have been the work of just one nutty loser with a mail-order rifle, could it have? Surely the Communists or the CIA or the Birchers or the Mafia or some conspiratorial combination must have arranged it all, right? The shift in thinking didn’t register immediately. In his influential book The Paranoid Style in American Politics, published two years after the president’s murder, Richard Hofstadter devoted only two sentences and a footnote to it, observing that “conspiratorial explanations of Kennedy’s assassination” don’t have much “currency … in the United States.”

Elaborate paranoia was an established tic of the Bircherite far right, but the left needed a little time to catch up. In 1964, a left-wing American writer published the first book about a JFK conspiracy, claiming that a Texas oilman had been the mastermind, and soon many books were arguing that the official government inquiry had ignored the hidden conspiracies. One of them, Rush to Judgment, by Mark Lane, a lawyer on the left, was a New York Times best seller for six months. Then, in 1967, New Orleans’s district attorney, Jim Garrison, indicted a local businessman for being part of a conspiracy of gay right-wingers to assassinate Kennedy—“a Nazi operation, whose sponsors include some of the oil-rich millionaires in Texas,” according to Garrison, with the CIA, FBI, and Robert F. Kennedy complicit in the cover-up. After NBC News broadcast an investigation discrediting the theory, Garrison said the TV segment was a piece of “thought control,” obviously commissioned by NBC’s parent company RCA, “one of the top 10 defense contractors” and thus “desperate because we are in the process of uncovering their hoax.”

The notion of an immense and awful JFK-assassination conspiracy became conventional wisdom in America. As a result, more Americans than ever became reflexive conspiracy theorists. Thomas Pynchon’s novel Gravity’s Rainbow, a complicated global fantasy about the interconnections among militarists and Illuminati and stoners, and the validity of paranoid thinking, won the 1974 National Book Award. Conspiracy became the high-end Hollywood dramatic premise—Chinatown, The Conversation, The Parallax View, and Three Days of the Condor came out in the same two-year period. Of course, real life made such stories plausible. The infiltration by the FBI and intelligence agencies of left-wing groups was then being revealed, and the Watergate break-in and its cover-up were an actual criminal conspiracy. Within a few decades, the belief that a web of villainous elites was covertly seeking to impose a malevolent global regime made its way from the lunatic right to the mainstream. Delusional conspiracism wouldn’t spread quite as widely or as deeply on the left, but more and more people on both sides would come to believe that an extraordinarily powerful cabal—international organizations and think tanks and big businesses and politicians—secretly ran America.

Each camp, conspiracists on the right and on the left, was ostensibly the enemy of the other, but they began operating as de facto allies. Relativist professors enabled science-denying Christians, and the antipsychiatry craze in the ’60s appealed simultaneously to left-wingers and libertarians (as well as to Scientologists). Conspiracy theories were more of a modern right-wing habit before people on the left signed on. However, the belief that the federal government had secret plans to open detention camps for dissidents sprouted in the ’70s on the paranoid left before it became a fixture on the right.

Americans felt newly entitled to believe absolutely anything. I’m pretty certain that the unprecedented surge of UFO reports in the ’70s was not evidence of extraterrestrials’ increasing presence but a symptom of Americans’ credulity and magical thinking suddenly unloosed. We wanted to believe in extraterrestrials, so we did. What made the UFO mania historically significant rather than just amusing, however, was the web of elaborate stories that were now being spun: not just of sightings but of landings and abductions—and of government cover-ups and secret alliances with interplanetary beings. Those earnest beliefs planted more seeds for the extravagant American conspiracy thinking that by the turn of the century would be rampant and seriously toxic.

A single idée fixe like this often appears in both frightened and hopeful versions. That was true of the suddenly booming belief in alien visitors, which tended toward the sanguine as the ’60s turned into the ’70s, even in fictional depictions. Consider the extraterrestrials that Jack Nicholson’s character in Easy Rider earnestly describes as he’s getting high for the first time, and those at the center of Close Encounters of the Third Kind eight years later. One evening in southern Georgia in 1969, the year Easy Rider came out, a failed gubernatorial candidate named Jimmy Carter saw a moving moon-size white light in the sky that “didn’t have any solid substance to it” and “got closer and closer,” stopped, turned blue, then red and back to white, and then zoomed away.

The first big nonfiction abduction tale appeared around the same time, in a best-selling book about a married couple in New Hampshire who believed that while driving their Chevy sedan late one night, they saw a bright object in the sky that the wife, a UFO buff already, figured might be a spacecraft. She began having nightmares about being abducted by aliens, and both of them underwent hypnosis. The details of the abducting aliens and their spacecraft that each described were different, and changed over time. The man’s hypnotized description of the aliens bore an uncanny resemblance to the ones in an episode of The Outer Limits broadcast on ABC just before his hypnosis session. Thereafter, hypnosis became the standard way for people who believed that they had been abducted (or that they had past lives, or that they were the victims of satanic abuse) to recall the supposed experience. And the couple’s story established the standard abduction-tale format: Humanoid creatures take you aboard a spacecraft, communicate telepathically or in spoken English, medically examine you by inserting long needles into you, then let you go.

The husband and wife were undoubtedly sincere believers. The sincerely credulous are perfect suckers, and in the late ’60s, a convicted thief and embezzler named Erich von Däniken published Chariots of the Gods?, positing that extraterrestrials helped build the Egyptian pyramids, Stonehenge, and the giant stone heads on Easter Island. That book and its many sequels sold tens of millions of copies, and the documentary based on it had a huge box-office take in 1970. Americans were ready to believe von Däniken’s fantasy to a degree they simply wouldn’t have been a decade earlier, before the ’60s sea change. Certainly a decade earlier NBC wouldn’t have aired an hour-long version of the documentary in prime time. And while I’m at it: Until we’d passed through the ’60s and half of the ’70s, I’m pretty sure we wouldn’t have given the presidency to some dude, especially a born-again Christian, who said he’d recently seen a huge, color-shifting, luminescent UFO hovering near him.


The 1980s and the Smog of Subjectivity


By the 1980s, things appeared to have returned more or less to normal. Civil rights seemed like a done deal, the war in Vietnam was over, young people were no longer telling grown-ups they were worthless because they were grown-ups. Revolution did not loom. Sex and drugs and rock and roll were regular parts of life. Starting in the ’80s, loving America and making money and having a family were no longer unfashionable.

The sense of cultural and political upheaval and chaos dissipated—which lulled us into ignoring all the ways that everything had changed, that Fantasyland was now scaling and spreading and becoming the new normal. What had seemed strange and amazing in 1967 or 1972 became normal and ubiquitous.

Extreme religious and quasi-religious beliefs and practices, Christian and New Age and otherwise, didn’t subside, but grew and thrived—and came to seem unexceptional.

Relativism became entrenched in academia—tenured, you could say. Michel Foucault’s rival Jean Baudrillard became a celebrity among American intellectuals by declaring that rationalism was a tool of oppressors that no longer worked as a way of understanding the world, pointless and doomed. In other words, as he wrote in 1986, “the secret of theory”—this whole intellectual realm now called itself simply “theory”—“is that truth does not exist.”

This kind of thinking was by no means limited to the ivory tower. The intellectuals’ new outlook was as much a product as a cause of the smog of subjectivity that now hung thick over the whole American mindscape. After the ’60s, truth was relative, criticizing was equal to victimizing, individual liberty became absolute, and everyone was permitted to believe or disbelieve whatever they wished. The distinction between opinion and fact was crumbling on many fronts.

Belief in gigantic secret conspiracies thrived, ranging from the highly improbable to the impossible, and moved from the crackpot periphery to the mainstream.

Many Americans announced that they’d experienced fantastic horrors and adventures, abuse by Satanists, and abduction by extraterrestrials, and their claims began to be taken seriously. Parts of the establishment—psychology and psychiatry, academia, religion, law enforcement—encouraged people to believe that all sorts of imaginary traumas were real.

America didn’t seem as weird and crazy as it had around 1970. But that’s because Americans had stopped noticing the weirdness and craziness. We had defined every sort of deviancy down. And as the cultural critic Neil Postman put it in his 1985 jeremiad about how TV was replacing meaningful public discourse with entertainment, we were in the process of amusing ourselves to death.


How the Right Became More Unhinged Than the Left


The Reagan presidency was famously a triumph of truthiness and entertainment, and in the 1990s, as problematically batty beliefs kept going mainstream, presidential politics continued merging with the fantasy-industrial complex.

In 1998, as soon as we learned that President Bill Clinton had been fellated by an intern in the West Wing, his popularity spiked. Which was baffling only to those who still thought of politics as an autonomous realm, existing apart from entertainment. American politics happened on television; it was a TV series, a reality show just before TV became glutted with reality shows. A titillating new story line that goosed the ratings of an existing series was an established scripted-TV gimmick. The audience had started getting bored with The Clinton Administration, but the Monica Lewinsky subplot got people interested again.

Just before the Clintons arrived in Washington, the right had managed to do away with the federal Fairness Doctrine, which had been enacted to keep radio and TV shows from being ideologically one-sided. Until then, big-time conservative opinion media had consisted of two magazines, William F. Buckley Jr.’s biweekly National Review and the monthly American Spectator, both with small circulations. But absent a Fairness Doctrine, Rush Limbaugh’s national right-wing radio show, launched in 1988, was free to thrive, and others promptly appeared.

For most of the 20th century, national news media had felt obliged to pursue and present some rough approximation of the truth rather than to promote a truth, let alone fictions. With the elimination of the Fairness Doctrine, a new American laissez-faire had been officially declared. If lots more incorrect and preposterous assertions circulated in our mass media, that was a price of freedom. If splenetic commentators could now, as never before, keep believers perpetually riled up and feeling the excitement of being in a mob, so be it.

Limbaugh’s virtuosic three hours of daily talk started bringing a sociopolitical alternate reality to a huge national audience. Instead of relying on an occasional magazine or newsletter to confirm your gnarly view of the world, now you had talk radio drilling it into your head for hours every day. As Limbaugh’s show took off, in 1992 the producer Roger Ailes created a syndicated TV show around him. Four years later, when NBC hired someone else to launch a cable news channel, Ailes, who had been working at NBC, quit and created one with Rupert Murdoch.

Fox News brought the Limbaughvian talk-radio version of the world to national TV, offering viewers an unending and immersive propaganda experience of a kind that had never existed before.

For Americans, this was a new condition. Over the course of the century, electronic mass media had come to serve an important democratic function: presenting Americans with a single shared set of facts. Now TV and radio were enabling a reversion to the narrower, factional, partisan discourse that had been normal in America’s earlier centuries.

And there was also the internet, which eventually would have mooted the Fairness Doctrine anyhow. In 1994, the first modern spam message was sent, visible to everyone on Usenet: global alert for all: jesus is coming soon. Over the next year or two, the masses learned of the World Wide Web. The tinder had been gathered and stacked since the ’60s, and now the match was lit and thrown. After the ’60s and ’70s happened as they happened, the internet may have broken America’s dynamic balance between rational thinking and magical thinking for good.

Before the web, cockamamy ideas and outright falsehoods could not spread nearly as fast or as widely, so it was much easier for reason and reasonableness to prevail. Before the web, institutionalizing any one alternate reality required the long, hard work of hundreds of full-time militants. In the digital age, however, every tribe and fiefdom and principality and region of Fantasyland—every screwball with a computer and an internet connection—suddenly had an unprecedented way to instruct and rile up and mobilize believers, and to recruit more. False beliefs were rendered both more real-seeming and more contagious, creating a kind of fantasy cascade in which millions of bedoozled Americans surfed and swam.

Why did Senator Daniel Patrick Moynihan begin remarking frequently during the ’80s and ’90s that people were entitled to their own opinions but not to their own facts? Because until then, that had not been necessary to say. Our marketplace of ideas became exponentially bigger and freer than ever, it’s true. Thomas Jefferson said that he’d “rather be exposed to the inconveniences attending too much liberty than those attending too small a degree of it”—because in the new United States, “reason is left free to combat” every sort of “error of opinion.” However, I think if he and our other Enlightenment forefathers returned, they would see the present state of affairs as too much of a good thing. Reason remains free to combat unreason, but the internet entitles and equips all the proponents of unreason and error to a previously unimaginable degree. Particularly for a people with our history and propensities, the downside of the internet seems at least as profound as the upside.

The way internet search was designed to operate in the ’90s—that is, the way information and beliefs now flow, rise, and fall—is democratic in the extreme. Internet search algorithms are an example of Gresham’s law, whereby the bad drives out—or at least overruns—the good. On the internet, the prominence granted to any factual assertion or belief or theory depends on the preferences of billions of individual searchers. Each click on a link is effectively a vote pushing that version of the truth toward the top of the pile of results.
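A minimal sketch of that click-as-vote dynamic, in Python; the page names, click counts, and ranking rule below are invented purely for illustration and are not any real search engine’s algorithm:

    # Toy model of click-driven ranking: each click acts as a vote that
    # pushes that version of "the truth" toward the top of the results.
    from collections import Counter

    clicks = Counter()

    def record_click(url):
        # One user click counts as one vote for that page.
        clicks[url] += 1

    def ranked_results(candidates):
        # Order candidate pages by accumulated clicks, most-clicked first.
        return sorted(candidates, key=lambda url: clicks[url], reverse=True)

    pages = ["skeptics-faq.example", "chemtrails-proof.example"]
    for _ in range(900):
        record_click("chemtrails-proof.example")  # the exciting falsehood
    for _ in range(100):
        record_click("skeptics-faq.example")      # the sober correction

    print(ranked_results(pages))
    # ['chemtrails-proof.example', 'skeptics-faq.example']

Because popularity is the only signal in such a loop, whichever page excites the most clicking rises, regardless of accuracy.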

Exciting falsehoods tend to do well in the perpetual referenda, and become self-validating. A search for almost any “alternative” theory or belief seems to generate more links to true believers’ pages and sites than to legitimate or skeptical ones, and those tend to dominate the first few pages of results. For instance, beginning in the ’90s, conspiracists decided that contrails, the skinny clouds of water vapor that form around jet-engine exhaust, were composed of exotic chemicals, part of a secret government scheme to test weapons or poison citizens or mitigate climate change—and renamed them chemtrails. When I Googled chemtrails proof, the first seven results offered so-called evidence of the nonexistent conspiracy. When I searched for government extraterrestrial cover-up, only one result in the first three pages didn’t link to an article endorsing a conspiracy theory.

Before the web, it really wasn’t easy to stumble across false or crazy information convincingly passing itself off as true. Today, however, as the Syracuse University professor Michael Barkun saw back in 2003 in A Culture of Conspiracy, “such subject-specific areas as crank science, conspiracist politics, and occultism are not isolated from one another,” but rather

they are interconnected. Someone seeking information on UFOs, for example, can quickly find material on antigravity, free energy, Atlantis studies, alternative cancer cures, and conspiracy.

The consequence of such mingling is that an individual who enters the communications system pursuing one interest soon becomes aware of stigmatized material on a broad range of subjects. As a result, those who come across one form of stigmatized knowledge will learn of others, in connections that imply that stigmatized knowledge is a unified domain, an alternative worldview, rather than a collection of unrelated ideas.

Academic research shows that religious and supernatural thinking leads people to believe that almost no big life events are accidental or random. As the authors of some recent cognitive-science studies at Yale put it, “Individuals’ explicit religious and paranormal beliefs” are the best predictors of their “perception of purpose in life events”—their tendency “to view the world in terms of agency, purpose, and design.” Americans have believed for centuries that the country was inspired and guided by an omniscient, omnipotent planner and interventionist manager. Since the ’60s, that exceptional religiosity has fed the tendency to believe in conspiracies. In a recent paper called “Conspiracy Theories and the Paranoid Style(s) of Mass Opinion,” based on years of survey research, two University of Chicago political scientists, J. Eric Oliver and Thomas J. Wood, confirmed this special American connection. “The likelihood of supporting conspiracy theories is strongly predicted,” they found, by “a propensity to attribute the source of unexplained or extraordinary events to unseen, intentional forces” and a weakness for “melodramatic narratives as explanations for prominent events, particularly those that interpret history relative to universal struggles between good and evil.” Oliver and Wood found the single strongest driver of conspiracy belief to be belief in end-times prophecies.
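To make “strongest predictor” concrete, here is a schematic of the kind of regression analysis such survey research relies on, sketched in Python with scikit-learn; every variable name and number below is synthetic, invented only to illustrate the method, and is not Oliver and Wood’s data:

    # Simulate survey responses in which end-times belief is, by construction,
    # the strongest driver of conspiracy belief, then recover that ordering
    # by comparing logistic-regression coefficients.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 2000
    end_times = rng.integers(0, 2, n)      # believes end-times prophecies
    unseen_forces = rng.integers(0, 2, n)  # attributes events to unseen, intentional forces
    melodrama = rng.integers(0, 2, n)      # favors good-vs-evil explanations

    logit = -2.0 + 1.8 * end_times + 1.0 * unseen_forces + 0.6 * melodrama
    believes_conspiracy = rng.random(n) < 1 / (1 + np.exp(-logit))

    X = np.column_stack([end_times, unseen_forces, melodrama])
    model = LogisticRegression().fit(X, believes_conspiracy)
    for name, coef in zip(["end_times", "unseen_forces", "melodrama"], model.coef_[0]):
        print(f"{name}: {coef:+.2f}")  # the largest coefficient marks the strongest predictor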


The Triumph of the Fantasy-Industrial Complex


As a 13-year-old, I watched William F. Buckley Jr.’s Firing Line with my conservative dad, attended Teen Age Republicans summer camp, and, at the behest of a Nixon-campaign advance man in Omaha, ripped down Rockefeller and Reagan signs during the 1968 Nebraska primary campaign. A few years later, I was a McGovern-campaign volunteer, but I still watched and admired Buckley on PBS. Over the years, I’ve voted for a few Republicans for state and local office. Today I disagree about political issues with friends and relatives to my right, but we agree on the essential contours of reality.

People on the left are by no means all scrupulously reasonable. Many give themselves over to the appealingly dubious and the untrue. But fantastical politics have become highly asymmetrical. Starting in the 1990s, America’s unhinged right became much larger and more influential than its unhinged left. There is no real left-wing equivalent of Sean Hannity, let alone Alex Jones. Moreover, the far right now has unprecedented political power; it controls much of the U.S. government.

Why did the grown-ups and designated drivers on the political left manage to remain basically in charge of their followers, while the reality-based right lost out to fantasy-prone true believers?

One reason, I think, is religion. The GOP is now quite explicitly Christian. The party is the American coalition of white Christians, papering over doctrinal and class differences—and now led, weirdly, by one of the least religious presidents ever. If more and more of a political party’s members hold more and more extreme and extravagantly supernatural beliefs, doesn’t it make sense that the party will be more and more open to make-believe in its politics?

I doubt the GOP elite deliberately engineered the synergies between the economic and religious sides of their contemporary coalition. But as the incomes of middle- and working-class people flatlined, Republicans pooh-poohed rising economic inequality and insecurity. Economic insecurity correlates with greater religiosity, and among white Americans, greater religiosity correlates with voting Republican. For Republican politicians and their rich-getting-richer donors, that’s a virtuous circle, not a vicious one.

Religion aside, America simply has many more fervid conspiracists on the right, as research about belief in particular conspiracies confirms again and again. Only the American right has had a large and organized faction based on paranoid conspiracism for the past six decades. As the pioneer vehicle, the John Birch Society zoomed along and then sputtered out, but its fantastical paradigm and belligerent temperament have endured in other forms and under other brand names. When Barry Goldwater was the right-wing Republican presidential nominee in 1964, he had to play down any streaks of Bircher madness, but by 1979, in his memoir With No Apologies, he felt free to rave on about the globalist conspiracy and its “pursuit of a new world order” and impending “period of slavery”; the Council on Foreign Relations’ secret agenda for “one-world rule”; and the Trilateral Commission’s plan for “seizing control of the political government of the United States.” The right has had three generations to steep in this, its taboo vapors wafting more and more into the main chambers of conservatism, becoming familiar, seeming less outlandish. Do you believe that “a secretive power elite with a globalist agenda is conspiring to eventually rule the world through an authoritarian world government”? Yes, say 34 percent of Republican voters, according to Public Policy Polling.

In the late 1960s and ’70s, the reality-based left more or less won: retreat from Vietnam, civil-rights and environmental-protection laws, increasing legal and cultural equality for women, legal abortion, Keynesian economics triumphant.

But then the right wanted its turn to win. It pretty much accepted racial and gender equality and had to live with social welfare and regulation and bigger government, but it insisted on slowing things down. The political center moved right—but in the ’70s and ’80s not yet unreasonably. Most of America decided that we were all free marketeers now, that business wasn’t necessarily bad, and that government couldn’t solve all problems. We still seemed to be in the midst of the normal cyclical seesawing of American politics. In the ’90s, the right achieved two of its wildest dreams: The Soviet Union and international communism collapsed; and, as violent crime radically declined, law and order was restored.

But also starting in the ’90s, the farthest-right quarter of Americans, let’s say, couldn’t and wouldn’t adjust their beliefs to comport with their side’s victories and the dramatically new and improved realities. They’d made a god out of Reagan, but they ignored or didn’t register that he was practical and reasonable, that he didn’t completely buy his own antigovernment rhetoric. After Reagan, his hopped-up true-believer faction began insisting on total victory. But in a democracy, of course, total victory by any faction is a dangerous fantasy.

Another way the GOP got loopy was by overdoing libertarianism. I have some libertarian tendencies, but at full-strength purity it’s an ideology most boys grow out of. On the American right since the ’80s, however, they have not. Republicans are very selective, cherry-picking libertarians: Let business do whatever it wants and don’t spoil poor people with government handouts; let individuals have gun arsenals but not abortions or recreational drugs or marriage with whomever they wish; and don’t mention Ayn Rand’s atheism. Libertarianism, remember, is an ideology whose most widely read and influential texts are explicitly fiction. “I grew up reading Ayn Rand,” Speaker of the House Paul Ryan has said, “and it taught me quite a bit about who I am and what my value systems are, and what my beliefs are.” It was that fiction that allowed him and so many other higher-IQ Americans to see modern America as a dystopia in which selfishness is righteous and they are the last heroes. “I think a lot of people,” Ryan said in 2009, “would observe that we are right now living in an Ayn Rand novel.” I’m assuming he meant Atlas Shrugged, the novel that Trump’s secretary of state (and former CEO of ExxonMobil) has said is his favorite book. It’s the story of a heroic cabal of men’s-men industrialists who cause the U.S. government to collapse so they can take over, start again, and make everything right.

For a while, Republican leaders effectively encouraged and exploited the predispositions of their variously fantastical and extreme partisans. Karl Rove was stone-cold cynical, the Wizard of Oz’s evil twin coming out from behind the curtain for a candid chat shortly before he won a second term for George W. Bush, about how “judicious study of discernible reality [is] … not the way the world really works anymore.” These leaders were rational people who understood that a large fraction of citizens don’t bother with rationality when they vote, that a lot of voters resent the judicious study of discernible reality. Keeping those people angry and frightened won them elections.

But over the past few decades, a lot of the rabble they roused came to believe all the untruths. “The problem is that Republicans have purposefully torn down the validating institutions,” the political journalist Josh Barro, a Republican until 2016, wrote last year. “They have convinced voters that the media cannot be trusted; they have gotten them used to ignoring inconvenient facts about policy; and they have abolished standards of discourse.” The party’s ideological center of gravity swerved way to the right of Rove and all the Bushes, finally knocking them and their clubmates aside. What had been the party’s fantastical fringe became its middle. Reasonable Republicanism was replaced by absolutism: no new taxes, virtually no regulation, abolish the EPA and the IRS and the Federal Reserve.

When I was growing up in Nebraska, my Republican parents loathed all Kennedys, distrusted unions, and complained about “confiscatory” federal income-tax rates of 91 percent. But conservatism to them also meant conserving the natural environment and allowing people to make their own choices, including about abortion. They were emphatically reasonable, disinclined to believe in secret Communist/Washington/elite plots to destroy America, rolling their eyes and shaking their heads about far-right acquaintances—such as our neighbors, the parents of the future Mrs. Clarence Thomas, who considered Richard Nixon suspiciously leftish. My parents never belonged to a church. They were godless Midwestern Republicans, born and raised—which wasn’t so odd 40 years ago. Until about 1980, the Christian right was not a phrase in American politics. In 2000, my widowed mom, having voted for 14 Republican presidential nominees in a row, quit a party that had become too Christian for her.

The Christian takeover happened gradually, but then quickly in the end, like a phase change from liquid to gas. In 2008, three-quarters of the major GOP presidential candidates said they believed in evolution, but in 2012 it was down to a third, and then in 2016, just one did. That one, Jeb Bush, was careful to say that evolutionary biology was only his truth, that “it does not need to be in the curriculum” of public schools, and that if it is, it could be accompanied by creationist teaching. A two-to-one majority of Republicans say they “support establishing Christianity as the national religion,” according to Public Policy Polling.

Although constitutionally the U.S. can have no state religion, faith of some kind has always bordered on mandatory for politicians. Only four presidents have lacked a Christian denominational affiliation, the most recent one in the 1880s. According to Pew, two-thirds of Republicans admit that they’d be less likely to support a presidential candidate who doesn’t believe in God.

As a matter of fact, one of the Constitution’s key clauses—“no religious test shall ever be required as a qualification to any office or public trust”—is kind of a theoretical freedom. Not only have we never had an openly unbelieving president, but of the 535 members of the current Congress, exactly one, Representative Kyrsten Sinema of Arizona, lists her religion as “none.” Among all 7,383 state legislators, there is apparently only one avowed atheist: Nebraska Senator Ernie Chambers.

I’m reminded of one of H. L. Mencken’s dispatches from the Scopes “monkey trial” in 1925. “Civilized” Tennesseans, he wrote, “had known for years what was going on in the hills. They knew what the country preachers were preaching—what degraded nonsense was being rammed and hammered into yokel skulls. But they were afraid to go out against the imposture while it was in the making.” What the contemporary right has done is worse, because it was deliberate and national, and it has had more-profound consequences.


The Rise of Donald Trump


I have been paying close attention to Donald Trump for a long time. Spy magazine, which I co-founded in 1986 and edited until 1993, published three cover stories about him—and dozens of pages exposing and ridiculing his lies, brutishness, and absurdity. Now everybody knows what we knew. Donald Trump is a grifter driven by resentment of the establishment. He doesn’t like experts, because they interfere with his right as an American to believe or pretend that fictions are facts, to feel the truth. He sees conspiracies everywhere. He exploited the myths of white racial victimhood. His case of what I call Kids R Us syndrome—spoiled, impulsive, moody, a 71-year-old brat—is acute.

He is, first and last, a creature of the fantasy-industrial complex. “He is P. T. Barnum,” his sister, a federal judge, told his biographer Timothy O’Brien in 2005. Although the fantasy-industrial complex had been annexing presidential politics for more than half a century, from JFK through Reagan and beyond, Trump’s campaign and presidency are its ultimate expression. From 1967 through 2011, California was governed by former movie actors more than a third of the time, and one of them became president. But Trump’s need for any and all public attention always seemed to me more ravenous and insatiable than any other public figure’s, akin to an addict’s for drugs. Unlike Reagan, Trump was always an impresario as well as a performer. Before the emergence of Fantasyland, Trump’s various enterprises would have seemed a ludicrous, embarrassing, incoherent jumble for a businessman, let alone a serious candidate for president. What connects an Islamic-mausoleum-themed casino to a short-lived, shoddy professional football league to an autobiography he didn’t write to buildings he didn’t build to a mail-order meat business to beauty pageants to an airline that lasted three years to a sham “university” to a fragrance called Success to a vodka and a board game named after himself to a reality-TV show about pretending to fire people?

What connects them all, of course, is the new, total American embrace of admixtures of reality and fiction and of fame for fame’s sake. His reality was a reality show before that genre or term existed. When he entered political show business, after threatening to do so for most of his adult life, the character he created was unprecedented—presidential candidate as insult comic with an artificial tan and ridiculous hair, shamelessly unreal and whipped into shape as if by a pâtissier. He used the new and remade pieces of the fantasy-industrial complex as nobody had before. He hired actors to play enthusiastic supporters at his campaign kickoff. Twitter became his unmediated personal channel for entertaining outrage and untruth. And he was a star, so news shows wanted him on the air as much as possible—people at TV outlets told me during the campaign that they were expected to be careful not to make the candidate so unhappy that he might not return.

Before Trump won their nomination and the presidency, when he was still “a cancer on conservatism” that must be “discarded” (former Governor Rick Perry) and an “utterly amoral” “narcissist at a level I don’t think this country’s ever seen” (Senator Ted Cruz), Republicans hated Trump’s ideological incoherence—they didn’t yet understand that his campaign logic was a new kind, blending exciting tales with a showmanship that transcends ideology.

During the campaign, Trump repeated the falsehood that vaccines cause autism. And instead of undergoing a normal medical exam from a normal doctor and making the results public, as nominees had before, Trump went on The Dr. Oz Show and handed the host test results from his wacky doctor.

Did his voters know that his hogwash was hogwash? Yes and no, the way people paying to visit P. T. Barnum’s exhibitions 175 years ago didn’t much care whether the black woman on display was really George Washington’s 161-year-old former nanny or whether the stitched-together fish/ape was actually a mermaid; or the way today we immerse in the real-life fictions of Disney World. Trump waited to run for president until he sensed that a critical mass of Americans had decided politics were all a show and a sham. If the whole thing is rigged, Trump’s brilliance was calling that out in the most impolitic ways possible, deriding his straight-arrow competitors as fakers and losers and liars—because that bullshit-calling was uniquely candid and authentic in the age of fake.

Trump took a key piece of cynical wisdom about show business—the most important thing is sincerity, and once you can fake that, you’ve got it made—to a new level: His actual thuggish sincerity is the opposite of the old-fashioned, goody-goody sanctimony that people hate in politicians.

If he were just a truth-telling wise guy, however, he wouldn’t have won. Trump’s genius was to exploit the skeptical disillusion with politics—there’s too much equivocating; democracy’s a charade—but also to pander to Americans’ magical thinking about national greatness. Extreme credulity is a fraternal twin of extreme skepticism.

“I will give you everything,” Trump actually promised during the campaign. Yes: “Every dream you’ve ever dreamed for your country” will come true.

Just as the internet enabled full Fantasyland, it made possible Trump as candidate and president, feeding him pseudo-news on his phone and letting him feed those untruths directly to his Twitter followers. He is the poster boy for the downside of digital life. “Forget the press,” he advised supporters—just “read the internet.” After he wrongly declared on Twitter that one anti-Trump protester “has ties to ISIS,” he was asked whether he regretted tweeting that falsehood. “What do I know about it?” he replied. “All I know is what’s on the internet.”

Trump launched his political career by embracing a brand-new conspiracy theory twisted around two American taproots—fear and loathing of foreigners and of nonwhites. In 2011, he became the chief promoter of the fantasy that Barack Obama was born in Kenya, a fringe idea that he brought into the mainstream. Only in the fall of 2016 did he grudgingly admit that the president was indeed a native-born American—at the same moment a YouGov/Huffington Post survey found that a majority of Republicans still believed Obama probably or definitely had been born in Kenya. 

Conspiracies, conspiracies, still more conspiracies. On Fox & Friends Trump discussed, as if it were fact, the National Enquirer’s suggestion that Ted Cruz’s father was connected to JFK’s assassination: “What was he doing with Lee Harvey Oswald shortly before the death, before the shooting? It’s horrible.” The Fox News anchors interviewing him didn’t challenge him or follow up. He revived the 1993 fantasy about the Clintons’ friend Vince Foster—his death, Trump said, was “very fishy,” because Foster “had intimate knowledge of what was going on. He knew everything that was going on, and then all of a sudden he committed suicide … I will say there are people who continue to bring it up because they think it was absolutely a murder.” He has also promised to make sure that “you will find out who really knocked down the World Trade Center.” And it has all worked for him, because so many Americans are eager to believe almost any conspiracy theory, no matter how implausible, as long as it jibes with their opinions and feelings.

Not all lies are fantasies and not all fantasies are lies; people who believe untrue things can pass lie-detector tests. For instance, Trump probably really believed that “the murder rate in our country is the highest it’s been in 47 years,” the total falsehood he told leaders of the National Sheriffs’ Association at the White House in early February. The fact-checking website PolitiFact looked at more than 400 of his statements as a candidate and as president and found that almost 50 percent were false and another 20 percent were mostly false.

He gets away with this as he wouldn’t have in the 1980s or ’90s, when he first talked about running for president, because now factual truth really is just one option. After Trump won the election, he began referring to all unflattering or inconvenient journalism as “fake news.” When his approval rating began declining, Trump simply refused to believe it: “Any negative polls” that may appear, the president tweeted at dawn one morning from Mar-a-Lago, “are fake news.”

The people who speak on Trump’s behalf to journalists and the rest of the reality-based world struggle to defend or explain his assertions. Asked about “the president’s statements that are … demonstrably not true,” the White House counselor Kellyanne Conway asked CNN’s Jake Tapper to please remember “the many things that he says that are true.” According to The New York Times, the people around Trump say his baseless certainty “that he was bugged in some way” by Obama in Trump Tower is driven by “a sense of persecution bordering on faith.” And indeed, their most honest defense of his false statements has been to cast them practically as matters of religious conviction—he deeply believes them, so … there. When White House Press Secretary Sean Spicer was asked at a press conference about the millions of people who the president insists voted illegally, he earnestly reminded reporters that Trump “has believed that for a while” and “does believe that” and it’s “been a long-standing belief that he’s maintained” and “it’s a belief that he has maintained for a while.”

Which is why nearly half of Americans subscribe to that preposterous belief themselves. And in Trump’s view, that overrides any requirement for facts.

“Do you think that talking about millions of illegal votes is dangerous to this country without presenting the evidence?,” David Muir, the anchor of ABC’s World News Tonight, asked Trump in January.

“No,” he replied. “Not at all! Not at all—because many people feel the same way that I do.”

The idea that progress has some kind of unstoppable momentum, as if powered by a Newtonian law, was always a very American belief. However, it’s really an article of faith, the Christian fantasy about history’s happy ending reconfigured during and after the Enlightenment as a set of modern secular fantasies. It reflects our blithe conviction that America’s visions of freedom and democracy and justice and prosperity must prevail in the end. I really can imagine, for the first time in my life, that America has permanently tipped into irreversible decline, heading deeper into Fantasyland. I wonder whether it’s only America’s destiny, exceptional as ever, to unravel in this way. Or maybe we’re just early adopters, the canaries in the global mine, and Canada and Denmark and Japan and China and all the rest will eventually follow us down our tunnel. Why should modern civilization’s great principles—democracy, freedom, tolerance—guarantee great outcomes?

Yet because I’m an American, a fortunate American who has lived in a fortunate American century, I remain (barely) more of an optimist than a pessimist. Even as we’ve entered this long winter of foolishness and darkness, when too many Americans are losing their grip on reason and reality, it has been an epoch of astonishing hope and light as well. During these same past few decades, Americans reduced the rates of murder and violent crime by more than half. We decoded the human genome, elected an African American president, recorded the sound of two black holes colliding 1 billion years ago, and created Beloved, The Simpsons, Goodfellas, Angels in America, The Wire, The Colbert Report, Transparent, Hamilton. Since 1981, the percentage of people living in extreme poverty around the globe has plummeted from 44 percent to 10 percent. I do despair of our devolution into unreason and magical thinking, but not everything has gone wrong.

What is to be done? I don’t have an actionable agenda, Seven Ways Sensible People Can Save America From the Craziness. But I think we can slow the flood, repair the levees, and maybe stop things from getting any worse. If we’re splitting into two different cultures, we in reality-based America—whether the blue part or the smaller red part—must try to keep our zone as large and robust and attractive as possible for ourselves and for future generations. We need to firmly commit to Moynihan’s aphorism about opinions versus facts. We must call out the dangerously untrue and unreal. A grassroots movement against one kind of cultural squishiness has taken off and lately reshaped our national politics—the opposition to political correctness. I envision a comparable struggle that insists on distinguishing between the factually true and the blatantly false.

It will require a struggle to make America reality-based again. Fight the good fight in your private life. You needn’t get into an argument with the stranger at Chipotle who claims that George Soros and Uber are plotting to make his muscle car illegal—but do not give acquaintances and friends and family members free passes. If you have children or grandchildren, teach them to distinguish between true and untrue as fiercely as you do between right and wrong and between wise and foolish.

We need to adopt new protocols for information-media hygiene. Would you feed your kids a half-eaten casserole a stranger handed you on the bus, or give them medicine you got from some lady at the gym?

And fight the good fight in the public sphere. One main task, of course, is to contain the worst tendencies of Trumpism, and cut off its political-economic fuel supply, so that fantasy and lies don’t turn it into something much worse than just nasty, oafish, reality-show pseudo-conservatism. Progress is not inevitable, but it’s not impossible, either.

 


The Corruption of the Law

Chris Hedges
Harlan Fiske Stone’s conservatism was grounded in the belief that the law is designed to protect the weak from the powerful. (Mr. Fish)

ISLE AU HAUT, Maine—I drink coffee in the morning on a round, ornate oak table that once belonged to Harlan Fiske Stone, a U.S. Supreme Court justice from 1925 to 1946 and the chief justice for the last five of those years. Stone and his family spent their summers on this windswept, remote island six miles off the coast of Maine.

Stone, a Republican and close friend of Calvin Coolidge and Herbert Hoover, embodied a lost era in American politics. His brand of conservatism, grounded in the belief that the law is designed to protect the weak from the powerful, bears no resemblance to that of the self-proclaimed “strict constitutionalists” in the Federalist Society who have accumulated tremendous power in the judiciary. The Federalist Society, at the behest of President Trump, is in charge of vetting the 108 candidates for the federal judgeships that will be filled by the administration. The newest justice, Trump appointee Neil Gorsuch, comes out of the Federalist Society, as did Chief Justice John Roberts and Justices Clarence Thomas and Samuel Alito. The self-identified “liberals” in the judiciary, while progressive on social issues such as abortion and affirmative action, serve corporate power as assiduously as the right-wing ideologues of the Federalist Society. The Alliance for Justice points out that 85 percent of President Barack Obama’s judicial nominees—280 in all, or a third of the federal judiciary—had been either corporate attorneys or government prosecutors. Those who came out of corporate law firms accounted for 71 percent of the nominees; only 4 percent came from public interest groups, and the same percentage had been attorneys who represented workers in labor disputes.

Stone repeatedly warned that unchecked corporate power would mean corporate tyranny and the death of democracy. He was joined in that thinking by Louis D. Brandeis, his fellow justice and ally on the court, who stated, “We can have democracy in this country, or we can have great wealth concentrated in the hands of a few, but we can’t have both.”

The supposed clash between liberal and conservative judges is largely a fiction. The judiciary, despite the Federalist Society’s high-flown rhetoric about the sanctity of individual freedom, is a naked tool of corporate oppression. The most basic constitutional rights—privacy, fair trials and elections, habeas corpus, probable-cause requirements, due process and freedom from exploitation—have been erased for many, especially the 2.3 million people in our prisons, most of whom were put there without ever going to trial. Constitutionally protected statements, beliefs and associations are criminalized. Our judicial system, as Ralph Nader has pointed out, has legalized secret law, secret courts, secret evidence, secret budgets and secret prisons in the name of national security.

Our constitutional rights have steadily been stripped from us by judicial fiat. The Fourth Amendment reads: “The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.” Yet our telephone calls and texts, emails and financial, judicial and medical records, along with every website we visit and our physical travels, can be and commonly are tracked, recorded, photographed and stored in government computer banks.

The executive branch can order the assassination of U.S. citizens without trial. It can deploy the military into the streets to quell civil unrest under Section 1021 of the National Defense Authorization Act (NDAA) and seize citizens—seizures that are in essence acts of extraordinary rendition—and hold them indefinitely in military detention centers while denying them due process.

Corporate campaign contributions, which largely determine who gets elected, are viewed by the courts as protected forms of free speech under the First Amendment. Corporate lobbying, which determines most of our legislation, is interpreted as the people’s right to petition the government. Corporations are legally treated as persons except when they carry out fraud and other crimes; the heads of corporations routinely avoid being charged and going to prison by paying fines, usually symbolic and pulled from corporate accounts, while not being forced to admit wrongdoing. And corporations have rewritten the law to orchestrate a massive tax boycott.

Many among the 1 million lawyers in the United States, the deans of our law schools and the judges in our courts, whether self-identified liberals or Federalist Society members or supporters, refuse to hold corporate power accountable to the law. They have failed us. They alone have the education and skill to apply the law on behalf of the citizens. They alone know how to use the courts for justice rather than injustice. When this period of American history is written, the legal profession will be found to have borne much of the responsibility for our descent into corporate tyranny. Lawyers are supposed to be “officers of the court.” They are supposed to be sentinels and guardians of the law. They are supposed to enlarge our access to justice. They are supposed to defend the law, not subvert it. This moral failure by the legal profession has obliterated our rights.

The radical libertarians in the Federalist Society, now ascendant within the legal system, champion a legal doctrine that is essentially preindustrial. It is centered exclusively on the rights of the individual and on restricting the power of government. This can at times lead to rulings that protect personal liberty. The followers of this doctrine on the Supreme Court, for example, voted in Kelo v. City of New London to overturn Connecticut’s eminent-domain rape of a New London working-class neighborhood to make way for a pharmaceutical plant. The liberals, who formed the court majority, endorsed the taking of the neighborhood.

Another example of radical libertarianism on the bench occurred when attorneys Bruce Afran and Carl Mayer and I sued President Obama over Section 1021 of the NDAA, which overturned the Posse Comitatus Act of 1878, the law that prohibited the government from using the military as a domestic police force. We garnered support from some charter members of the Federalist Society. The Federalist Society’s proclivity to uphold the primacy of individual rights became especially important when, after the temporary injunction of Section 1021 issued by the U.S. District Court for the Southern District of New York was overturned by the appellate court, we had to file a petition for certiorari, or “cert,” to request that the case, Hedges v. Obama, be heard by the Supreme Court.

“As obnoxious as [Antonin] Scalia was on cultural issues, he was the strongest modern justice in terms of protecting First Amendment speech, press and assembly rights—no liberal came anywhere near him in these areas,” Afran told me about the late justice. “In fact, Scalia was the justice who sympathized with our cert petition in the NDAA case. [Justice Ruth Bader] Ginsburg denied our petition without circulating it among the other justices. When we went to Scalia, he immediately asked for additional briefs to circulate. It was his dissents in the Guantanamo cases that we relied on in our cert petition. He issued strong dissents holding that the Guantanamo inmates and others taken by the military in Afghanistan should have complete civil rights in criminal prosecutions. He went much further than the majority did in these cases and condemned any holding of civilians by the military.”

But although the Federalist Society purports to be against curtailment of civil liberties, with some members embracing traditional liberal positions on issues such as drug laws and sexual freedom, the organization also supports the judicial system’s position that corporations hold the rights of individuals. It is hostile to nearly all government regulations and regulatory agencies, including the Environmental Protection Agency and the Securities and Exchange Commission. It opposes the rights of labor unions, voting rights laws, gender equality laws and the separation of church and state. It seeks to outlaw abortion and overturn Roe v. Wade. The Federalist Society’s self-proclaimed philosophy of “originalism,” or “textualism,” has crippled the ability of citizens to act en masse in class-action suits against corrupt corporate entities. And for all the rhetoric about championing individual liberty, as Mayer pointed out, “they never did a thing about any First Amendment intrusions that all of the legislation passed after 9/11 involved.” The Supreme Court did not accept our cert petition, leaving Section 1021 as law.

The Federalist Society says it seeks legal interpretations that are faithful to those that would have been made at the time the Constitution was written in the late 18th century. This fossilization of the law is a clever rhetorical subterfuge to advance the interests of the corporations and the oligarchs who have bankrolled the Federalist Society—the Mercer Foundation, the late John Olin, the late Richard Scaife, the Lynde and Harry Bradley Foundation, the Koch brothers and the fossil fuel industry. The Federalist Society has close ties with the American Legislative Exchange Council (ALEC), whose lobbyists draft and push corporate-sponsored bills through state legislatures and Congress.

Stone knew that the law would become moribund if it was frozen in time. It was a living entity, one that had to forever adapt to changing economic, social and political reality. He embraced what Oliver Wendell Holmes called “legal realism.” The law was not only about logic but also about the experience of a lived human society. If judges could not read and interpret that society, if they clung to rigid dogma or a self-imposed legal fundamentalism, then the law would be transformed into a sterile constitutionalism. Stone called on judges to “have less reverence for the way in which an old doctrine was applied to an old situation.” The law had to be flexible. Judges, to carry out astute rulings, had to make a close study of contemporary politics, economics, domestic and global business practices and culture, not attempt to intuit what the Founding Fathers intended.

Stone was wary of radicals and socialists. He could be skeptical of New Deal programs, although he believed the court had no right to reverse New Deal legislation. But he understood that the law was the primary institution tasked with protecting the public from predatory capitalism and the abuses of power. He voted consistently with Holmes and Brandeis, two of the court’s most innovative and brilliant jurists. The three were so often in dissent to the conservative majority they were nicknamed “The Three Musketeers.”

The law, Stone said, must never “become the monopoly of any social or economic class.” He condemned his fellow conservatives for reading their economic preferences into the law and “into the Constitution as well.” By doing so, he said, they “placed in jeopardy a great and useful institution of government.”

Stone embraced the doctrine of “preferred freedoms”—the position that First Amendment freedoms are preeminent in the hierarchy of constitutional rights, permitting justices to override any legislation that curbs these freedoms. This became the basis for court decisions to overturn state laws that persecuted and silenced African-Americans, radicals—including communists, anarchists and socialists—and labor and religious activists.

Stone, as dean of Columbia Law School before being named U.S. attorney general in 1924 and joining the Supreme Court the year after that, said the school’s mission was “devoted to teaching its students how to live rather than how to make a living.” He denounced the Palmer Raids and mass deportations of radicals that ended in 1920. He supported five Socialist members of the New York State Assembly who were stripped of their elected seats by their legislative colleagues in 1920 because of their political views. And he said that everyone, including aliens—meaning those who were not citizens but who lived in the United States—deserved due process.

“[A]ny system which confers upon administrative officers power to restrain the liberty of individuals, without safeguards substantially like those which exist in criminal cases and without adequate authority for judicial review of the action of such administrative officers, will result in abuse of power and in intolerable injustice and cruelty to individuals,” he wrote of laws that deprived aliens of constitutional rights.

As attorney general he weeded out corrupt officials and zealously enforced antitrust laws, swiftly making enemies of many leading industrialists, including Andrew Mellon. He also, ominously, appointed J. Edgar Hoover to run the Bureau of Investigation, the forerunner of the FBI. His aggressive antitrust campaigns led to calls by the business community for his removal as attorney general, and he was elevated to the Supreme Court in 1925, a move that, as the New York Globe and Commercial Advertiser newspaper observed, “protected business from disturbing litigation or the threat of such litigation [and] has saved the [Coolidge] administration from the charge that it has betrayed business. …”

The 1920s were, as Alpheus Thomas Mason wrote in his 1956 biography, “Harlan Fiske Stone: Pillar of the Law,” “a decade pre-eminent for the exploitative large-scale business; its leaders preached the ‘Gospel of Goods.’ ‘Canonization of the salesman’ was seen ‘as the brightest hope of America.’ The absorbing ambition was to make two dollars grow where one had grown before, to engineer, as utilities magnate Samuel Insull put it, ‘all I could out of a dollar’—that is, get something for nothing.”

Organized labor, which before World War I had been a potent social and political force, had been crushed through government repression, including the use of the Espionage and Sedition acts. Government regulations and controls had been weakened or abolished. It was a time when, as Sinclair Lewis said of Babbittry—referring to the philistine behavior of the lead character in his 1922 novel “Babbitt,” about the vacuity of American culture—the goal in life was to be “rich, fat, arrogant, and superior.” Inequality had reached appalling levels, with 60 percent of American families existing barely above subsistence level by the time of the 1929 crash. The American god was profit. Those not blessed to be rich and powerful were sacrificed on the altar of the marketplace.

The New Hampshire-born Stone, grounded in rural New England conservatism and Yankee thrift, was appalled by the orgy of greed and inequality engineered by his fellow elites. He denounced a hedonistic culture dominated by unethical oligarchs and corporations very similar to those that exist today.

“Wealth, power, the struggle for ephemeral social and political prestige, which so absorb our attention and energy, are but the passing phase of every age; ninety-day wonders which pass from man’s recollection almost before the actors who have striven for them have passed from the stage,” he wrote. “What is significant in the record of man’s development is none of these. It is rather those forces in society and the lives of those individuals, who have, in each generation, added something to man’s intellectual and moral attainment, that lay hold on the imagination and compel admiration and reverence in each succeeding generation.”

Wall Street’s crash in 1929 and the widespread suffering caused by the Depression confirmed Stone’s fears about unfettered capitalism. Victorian-era writer Herbert Spencer, who coined the term “survival of the fittest” and whose libertarian philosophy was widely embraced in the 1920s, argued that liberty was measured by the “relative paucity of restraint” that government places on the individual. Stone saw this belief, replicated in the ideology of neoliberalism, as a recipe for corporate oppression and exploitation.

If the law remained trapped in the agrarian, white male, slave-holding society in which the authors of the Constitution lived, if it was used to exclusively defend “individualism,” there would be no legal mechanisms to halt the abuse of corporate power. The rise of modern markets, industrialization, technology, imperial expansion and global capitalism necessitated a legal system that understood and responded to modernity. Stone bitterly attacked the concept of natural law and natural rights, used to justify the greed of the ruling elites by attempting to place economic transactions beyond the scope of the courts. Laissez faire economics was not, he said, a harbinger of progress. The purpose of the law was not to maximize corporate profit. In Stone’s reasoning, a clash between the courts and the lords of commerce was inevitable.

Stone excoriated the legal profession for its failure to curb the avarice of the “giant economic forces which our industrial and financial world have created.” Lawyers, he went on, were not supposed to be guardians of corporate power. He asked why “a bar which has done so much to develop and refine the technique of business organization, to provide skillfully devised methods for financing industry, which has guided a world-wide commercial expansion, has done relatively so little to remedy the evils of the investment market; so little to adapt the fiduciary principle of nineteenth-century equity to twentieth-century business practices; so little to improve the functioning of the administrative mechanisms which modern government sets up to prevent abuses; so little to make law more readily available as an instrument of justice to the common man.” The law, he said, was about “the advancement of the public interest.” He castigated the educated elites, especially lawyers and judges, who used their skills to become “the obsequious servant of business” and in the process were “tainted with the morals and manners of the marketplace in its most anti-social manifestations.” And he warned law schools that their exclusive focus on “proficiency” overlooked “the grave danger to the public if this proficiency be directed wholly to private ends without thought of the social consequences.” He lambasted “the cramped mind of the clever lawyer, for whom intellectual dignity and freedom had been forbidden by the interests which he served.” He called the legal profession’s service to corporate power a “sad spectacle” and attorneys who sold their souls to corporations “lawyer criminals.”

He was viciously attacked. The Wall Street lawyer William D. Guthrie responded in the Fordham Law Review, warning readers that Stone was peddling “subversive doctrines” championed by “false prophets” that had as their goal “national socialism, the repudiation of standards and obligation heretofore upheld, the leveling of classes, the destruction of property, and the overthrow of our federal system designed to be composed of sovereign and indestructible states.”

But Stone understood a seminal fact that eludes our day’s Federalist Society and the Republican and Democratic party leaderships: Corporations cannot be trusted with social and political power. Stone knew that the law must be a barrier to the insatiable corporate lust for profit. If the law failed in this task, then corporate despotism was certain.

He wrote of the excesses of capitalism that led to the Depression:

I venture to assert that when the history of the financial era which has just drawn to a close comes to be written, most of its mistakes and its major faults will be ascribed to the failure to observe the fiduciary principle, the precept as old as holy writ, that “a man cannot serve two masters.” More than a century ago equity gave a hospitable reception to that principle, and the common law was not slow to follow in giving it recognition. No thinking man can believe that an economy built upon a business foundation can long endure without some loyalty to that principle. The separation of ownership from management, the development of the corporate structure so as to vest in small groups control over the resources of great numbers of small and uninformed investors, make imperative a fresh and active devotion to that principle if the modern world of business is to perform its proper function. Yet those who serve nominally as trustees, but relieved, by clever legal devices, from the obligation to protect those whose interests they purport to represent, corporate officers and directors who award themselves huge bonuses from corporate funds without the assent or even the knowledge of their stockholders, reorganization committees created to serve interests other than those whose securities they control, financial institutions which, in the infinite variety of their operations, consider only last, if at all, the interests of those whose funds they command, suggest how far we have ignored the necessary implications of that principle. The loss and suffering inflicted on individuals, the harm done to a social order founded upon business and dependent upon its integrity, are incalculable.

The corporate coup d’état Stone attempted to thwart is complete. His worst fears are our nightmare.

Stone had his flaws. After he refused to grant a stay of execution for Nicola Sacco and Bartolomeo Vanzetti, the two anarchists were executed in August 1927. (A courier took a fishing boat to retrieve the fateful decision that Stone made while he was at his vacation home here on Isle au Haut. He probably signed off on their execution orders on the table where I sit each morning.) He sometimes ruled against the rights of unions. He endorsed the internment of Japanese-American citizens during World War II. He was not sympathetic to conscientious objectors except on religious grounds. He did not always protect the constitutional rights of communists. And at times he used the law to curb what he saw as Franklin Roosevelt’s consolidation of power within the executive branch.

But Stone had the integrity and courage to throw bombs at the establishment. He attacked, for example, the Nuremberg Trials of the Nazi leadership after World War II, calling them a “high-grade lynching party.” “I don’t mind what he [the chief Nuremberg prosecutor, Supreme Court Justice Robert H. Jackson] does to the Nazis, but I hate to see the pretense that he is running a court and proceeding according to common law,” he wrote. “This is a little too sanctimonious a fraud to meet my old-fashioned ideas.” He noted acidly that the Nuremberg Trials were being used to justify the proposition that “the leaders of the vanquished are to be executed by the victors.”

Stone spent his summers in a gray-shingled cottage with blue-green trim overlooking a small island harbor. He and his wife built the cottage, which still stands, in 1916. He tramped about the island in old clothes. One day at the dock a woman mistook the Supreme Court justice for a porter. She asked him to carry her bags. Stone, a burly man who had played football in college, lifted the suitcases and followed her without a word.

Stone did not possess the Emersonian brilliance and rhetorical flourishes of a Holmes or the trenchant social analysis of a Brandeis, but he was an astute legal scholar. There would be no place for him in today’s Republican or Democratic parties or judiciary, seized by the corporate interests he fought. The Federalist Society, along with corporate lobbyists, would have mounted a fierce campaign to block him from becoming attorney general and a Supreme Court justice. His iron fidelity to the rule of law would have seen him, like Ralph Nader, tossed into the political and judicial wilderness.

Stone opposed socialism because, as he told his friend Harold Laski, the British political philosopher and socialist, he believed the judicial system could be reformed and empowered to protect the public from the tyranny of corporate elites. If the judicial system failed in its task to safeguard democracy, he conceded to Laski, socialism was the only alternative.
