judicialsupport

Legal Writing for Legal Reading!

Archive for the category “Articles By Others”

Southern Baptists versus United Methodists

There’s a pervasive narrative today of conservative Christian demographic decline. This narrative is partly based on reality and partly based on wishful thinking by some. But this narrative typically ignores the far more dramatic implosion of liberal white Mainline Protestantism.

The popular conventional narrative asserts that young people in droves are quitting evangelical Christianity because it’s too socially and politically conservative. Of course, the implication is that if only Evangelicalism would liberalize, especially on sexuality, then it might become more appealing.

But all the available evidence as to what happens to liberalizing churches strongly indicates the opposite. Mainline Protestantism is in many ways what critics of Evangelicalism wish it would become. And yet the Mainline, composed primarily of the “Seven Sister” historic denominations, has been in continuous free-fall since the early to mid-1960s. Its implosion accelerated after most of these denominations specifically liberalized their sexuality teachings over the last 20 years.

The facts of Mainline Protestant decline are easily available. And yet the Mainline, once the dominant religious force in America, has declined so calamitously that for many it’s become almost forgotten. Often, when I speak to young people, I must explain what the Mainline is. Many young people, when they think of non-Catholic Christianity, are only familiar with Evangelicalism, which displaced the Mainline decades ago as America’s largest religious force.

So it’s necessary to repeat what’s happened to the Mainline. The Episcopal Church peaked in 1966 with 3.4 million and now has 1.7 million (50% loss). What is now the Presbyterian Church (USA) peaked, in its predecessor bodies that later merged, in 1965 with 4.4 million, and is at 1.4 million (68% loss). The United Church of Christ peaked in 1965 with 2.1 million and now has 850,000 (60% loss). What is now the Evangelical Lutheran Church in America (ELCA), in its predecessor bodies that later merged, peaked in 1968 with 5.9 million and now has 3.5 million (41% loss). The Christian Church (Disciples of Christ) peaked in 1964 with over 1.9 million and now has just over 400,000 (80% loss). United Methodism, in its predecessor bodies, peaked in 1965 with over 11 million and now has 6.9 million in the USA (nearly 40% loss). The American Baptist Church peaked in 1963 with over 1.5 million and now has less than 1.2 million (25% loss).

During the Mainline implosion the percentage of Americans belonging to the Seven Sister denominations declined from one of every six Americans to one of every 22. If the Mainline had simply retained its share of population it would stand today at about 55 million instead of about 16 million.

Nearly all the Mainline denominations have liberalized their sexuality standards over the last 15 years, precipitating accelerated membership loss. For example, the Presbyterian Church (USA) overturned its disapproval of homosexual practice in 2011 and declined from 1.9 million to 1.4 million in 2017, losing half a million members, or 25% in just 6 years. The Episcopal Church elected its first openly homosexual bishop in 2003 and declined from 2.3 million to 1.7 million, or 26%. The two Mainline denominations that have not officially liberalized on sexuality, United Methodism and American Baptists, have declined the least.

So the claim from some that conservative stances on sexuality precipitate church decline is not, by itself, supported by the evidence, as the fastest-declining denominations in America, and throughout the West, have liberalized on sexuality. Some conservative denominations are declining, but all growing denominations in America and the world are conservative theologically and on sexuality.

Recently I have tweeted some of these statistics about Mainline decline, with respondents insisting that Evangelicals are declining too. But by some counts, Evangelicalism is retaining its share of the American population while liberal Protestantism is plunging.

All growing denominations in America are conservative, including the Assemblies of God, which in 1965 had 572,123 members and now has 3.2 million (460% increase); the Church of God (Cleveland, Tennessee), which in 1964 had 220,405 and now has 1.2 million (445% increase); the Christian and Missionary Alliance, which in 1965 had 64,586 and now has 440,000 (576% increase); and the Church of the Nazarene, which in 1965 had 343,380 and now has 626,811 (82% increase).

A common response to any reference to Mainline decline is BUT THE SOUTHERN BAPTISTS! And it’s true that America’s largest Protestant body has been declining for 18 years. But its decline from 16.4 million to 15 million represents an 8 percent loss, not comparable to the average Mainline loss of nearly 50%. Southern Baptists displaced Methodism as America’s largest Protestant body in 1967 and now outnumber United Methodists by two to one.

Southern Baptist leaders commonly bewail their 18-year membership decline and urge more focus on evangelism. Their aggressive church planting resulted in 270 additional congregations in 2017 and a twenty percent increase in congregations over the last 20 years, with a strong focus on creating new black and Hispanic congregations. The Southern Baptist Convention likely is more racially diverse than Mainline Protestant denominations, which are over 90% white. And Southern Baptist worship attendance, even amid membership decline, increased by 120,000 in 2017.

Mainline Protestantism shows no sign of any institutional desire to reverse its 53-year membership decline, instead doubling down on the theological and political stances that fueled much of this decline. Some of its denominations, like the Presbyterian Church (USA), at current rates of decline, may cease to exist within 15 years.

Sometimes the demise of Mainline Protestantism is equated with the demise of American Christianity. Media sometimes report on dying Mainline congregations without noting the different stories unfolding at newer evangelical churches. But just as common, if not more so, is the narrative of ostensible Evangelical decline. White Evangelicalism may be in decline, but Evangelicalism is increasingly multiethnic. Some evangelical denominations, like the Assemblies of God, which has no racial majority, successfully reach immigrant populations, while Mainline Protestantism fails to do so.

Here’s my suggestion for why there’s so much focus on supposed Evangelical decline based on its purportedly unappealing moral stances. Evangelicalism surged during the 1970s through 1990s, including growing campus ministries, creating new generations of evangelical young people, some of whom later recoiled from the conservative religious upbringing of their youths. They sometimes blog and pontificate on the failures of evangelical culture, commending an idealized, more liberal Christianity, usually unaware that such a liberal Christianity already exists and has dramatically collapsed.

Meanwhile, Mainline Protestantism, when its implosion started in the early to mid-1960s, began losing baby boomers and barely had representation among subsequent generations. In recent decades there have not been many young people left in the Mainline who could subsequently complain or pontificate about experiences in their liberal denominations.

It’s important to reiterate the details of Mainline Protestantism’s long and ongoing spiral as a warning to other churches. Whatever the problems of evangelical Christianity, becoming more like liberal Mainline Protestantism is not a remedy.

By Mark Tooley, published on December 14, 2018 in Juicy Ecumenism, and can be found here.

 

America Is Intolerably Intolerant

A nation devoid of grace immiserates its people.

When you think of the sheer vindictiveness of what happened to Oklahoma quarterback Kyler Murray, it takes your breath away. On the very night of his greatest career triumph, a reporter dug up his old tweets (composed when he was a young teenager), reported on the most offensive insults, and immediately and irrevocably transformed his online legacy. Now he’s not just “Kyler Murray, gifted quarterback and humble Heisman winner,” but also the man who was forced to apologize for his alleged homophobia. And for what purpose? Which cause did the reporter advance? Where was the cultural gain in Murray’s pain?

The incidents happen so fast, and the firings are so quick, that they start to blur together. Can you remember November’s victims? October’s? Who lost their jobs this summer? Who was forced to apologize this spring?

In other words, if you’re in the middle of the shame storm, you can only take it. Even the act of self-defense magnifies the incident and magnifies the harm. It’s as if one doesn’t just wear the scarlet letter: It’s tattooed on one’s forehead in ever-brighter and bolder shades the longer the controversy endures.

I know that complex social phenomena have multiple and complex causes, but consider the terrible surge in teen depression and suicides — a surge that led Jean Twenge to ask in The Atlantic, “Have Smartphones Destroyed a Generation?” She traces the tipping point to the moment when smartphone ownership became ubiquitous among young Americans. In 2012, the percentage of Americans who owned smartphones passed 50 percent; that same year, the mental health of teenagers declined dramatically.

Of course, the “smartphone” is a stand-in for what’s on the phone, and what’s on the phone is a stunning amount of fury and intolerance. Consider, for example, the data on political hatred in the United States presented in the new book Prius or Pickup: How the Answers to Four Simple Questions Explain America’s Great Divide.

Teen depression, adult political anger, adult “deaths of despair,” shame campaigns — I don’t think we can look at any of these things entirely in isolation. Instead, I see them as symptoms of a post-Christian America that has become intolerably intolerant. It is a society without grace. It’s a society that’s all too often devoid of mercy — or in which the merciful don’t have nearly the same cultural power as the merciless.

Human beings need forgiveness like we need oxygen. The thing that is so shattering about the shame storm is that it is usually grounded in something a person did wrong — even if it’s a minor transgression. Even if it’s just momentary thoughtlessness. Even if it’s just a tweet. In her essay “Shame Storm,” Helen Andrews described how the attack from her ex-boyfriend was grounded in her very real mistreatment of him during their relationship. Take any given controversy, and you’ll usually find that the person at the center isn’t proud of what they did. They wish they hadn’t done it. At some level, the person at the center of the shame storm is also ashamed of themselves.

Oh, we can “do justice” — with vindictive glee. But are we kind? Do we have the slightest trace of humility? As any Christian who grew up in the bonds of fundamentalist legalism can tell you, justice untempered by mercy grinds the human heart into dust. And now we’re besieged by a secular fundamentalism that positively delights in inflicting pain on its enemies.

Of course we can and should disagree — even sharply — with bad ideas, but we should take very great care before any person uses the power of their platform — great or small — to attempt to humiliate another human being. Criticism can be conducted with respect and with the humble awareness that our own mistakes are ample and easily found. In fact, it’s hard to improve on Helen Andrews’s wise counsel:

The solution, then, is not to try to make shame storms well targeted, but to make it so they happen as infrequently as possible. Editors should refuse to run stories that have no value except humiliation, and readers should refuse to click on them. It is, after all, the moral equivalent of contributing your rock to a public stoning. We should all develop a robust sense of what is and is not any of our business. Shame can be useful — and even necessary — but it is toxic unless a relationship exists between two people first. A Twitter mob is no more a basis for salutary shaming than an actual mob is for reasoned discussion. That would be true even if the shaming’s relics were not preserved forever by Google, making any kind of rehabilitation impossible.

Or, perhaps it is better to end less with an exhortation than a warning — one grounded in ancient truth: “For with the judgment you pronounce you will be judged, and with the measure you use it will be measured to you.” An intolerant nation is a miserable nation. Only forgiveness can light the trail out of the darkness.

By David French, published in National Review on December 12, 2018, and can be found here.

 

Shame Storm

After a lifetime of impeccably correct opinions, Ian Buruma found himself on the wrong side of the liberal consensus in September 2018, when he was forced to resign as editor of the New York Review of Books for having commissioned a piece called “Reflections from a Hashtag” from the disgraced Canadian broadcaster Jian Ghomeshi. One does not get to be editor of the NYRB without having filament-like sensitivity to the boundaries of acceptable opinion. Buruma’s virtuosic handling in 2007 of the controversy over his New York Times Magazine profile of Tariq Ramadan, in which he wrote indulgently of his subject’s radical Islamic views—and scathingly of Ayaan Hirsi Ali’s secularist opposition to them—was a model of politically correct equipoise. If Buruma was caught flat-footed this time, it must be the times that have changed.

Unlike Leon Wieseltier, Lorin Stein, Garrison Keillor, John Hockenberry, Ryan Lizza, or any of the other editors and journalists who have lost their jobs in the last twelve months due to the movement known as #MeToo, Buruma was not accused of any sexual misconduct. His crime was to give space in his magazine to a man who had been accused (but not, in any of four court cases, convicted) of sexual harassment and non-consensual roughness during sex. Buruma told Slate in an interview five days before his resignation, “I think nobody has quite figured out what should happen in cases like his, where you have been legally acquitted but you are still judged as undesirable in public opinion, and how far that should go, how long that should last.”

Too true, as Buruma found out to his cost. No one has yet figured out what rules should govern the new frontiers of public shaming that the Internet has opened. New rules are obviously required. Shame is now both global and permanent, to a degree ­unprecedented in human history. No more moving to the next town to escape your bad name. However far you go and however long you wait, your disgrace is only ever a Google search away. Getting a humiliating story into the papers used to require convincing an editor to run it, which meant passing their standards of newsworthiness and corroborating evidence. Those gatekeepers are now gone. Most attempts so far to devise new rules have taken ideology as their starting point: Shaming is okay as long as it’s directed at men by women, the powerless against the powerful. But that doesn’t address what to do afterward, if someone is found to have been wrongfully shamed, or when someone rightfully shamed wants to put his life back together.

In the essay that got Buruma fired, Ghomeshi claims to have been a pioneer in online shaming. “There are lots of guys more hated than me now. But I was the guy everyone hated first.” Actually, a better candidate for original victim is Justine Sacco, the PR executive who tweeted to her 170 Twitter followers before getting on a plane to Cape Town, “Going to Africa. Hope I don’t get AIDS. Just kidding. I’m white!” It was during the Christmas holidays when news is always slow, so a Gawker post about the tweet quickly went viral. People around the world were soon enjoying the suspense of knowing Sacco was on a plane with no Internet access and no way to know that she had become an object of global ridicule. That was in December 2013, almost a year before the Ghomeshi story broke.

And before that, in the Precambrian era of online shaming, there was me.

In October 2010, I appeared on a panel to promote a book of essays by young conservatives, Proud to Be Right: Voices of the Next Conservative Generation. The moderator was Jonah Goldberg. One of the other panelists was my ex-boyfriend Todd Seavey. During the Q&A, Todd launched into a rant about my personal failings. He accused me of opposing Obamacare on the grounds that it would diminish human suffering, which allegedly I preferred to increase; of wanting to repeal laws against fistfights for the same reason; of being a sadistic and scheming heartbreaker in my personal life; and of generally living according to a “disturbing” and “brutal” set of values. For three minutes and forty-five seconds, which, unfortunately for me, were captured on film for broadcast two weeks later on C-SPAN2, he made an impassioned case that I was a sociopath.

Todd is not a psychologist, but a psychologist with no evidence to go on except my treatment of Todd might well have arrived at the same conclusion. I treated him awfully. I can only plead in mitigation that I was twenty-two. Todd is from Connecticut and has that charming New England stolidity, and I behaved as if his patience, which seemed so infinite when we were dating, really had no limits. The bit about opposing Obamacare because I favored human suffering was outlandish, and other parts of his rant were not quite how I remembered things, but everything he said, he really believed, and he had arrived at those beliefs by a hard road.

I braced myself for the broadcast. Maybe no one would notice? Within minutes, the offending clip had been posted on YouTube, where it got half a million hits in the first forty-eight hours. It made the evening news on Washington’s Fox affiliate. Greg Gutfeld did a segment about it on RedEye. It was written up in Gawker, the Washington Post, Talking Points Memo, and a hundred lesser sites, and then written up again when Todd expanded his remarks about me into a series of blog posts on his personal website. My inbox exploded with media inquiries, none of which I answered, except to give a short statement to Mary Katharine Ham at the Daily Caller:

I wish I could say it was all a plan hatched by our new media consultant, who told us we had to “think outside the box” to make our C-SPAN panel “go viral,” but no, it is exactly what it looks like.

As a matter of policy, I don’t comment on my personal life in public, but I will clarify that his tirade thoroughly mischaracterizes my political views. For instance, I do not believe that laws against assault should be repealed—nor do I think there should be an exception in cases when one’s ex-boyfriend behaves unacceptably on national television, though I admit that’s a tougher question. Nor do I oppose Obamacare for the contorted reason he states—I oppose it for the usual reasons.

To the personal friends who emailed commiserations, I replied with an old Aaron Sorkin line about bad publicity: “It’s like seasickness. You think you’re gonna die, and everyone else just thinks it’s funny.”

That, it turned out, was overly optimistic. Everyone at work was supportive (“if you want us to form the committee to horsewhip todd seavey, just say the word,” one colleague emailed, bless him), but no amount of support could counteract the paranoia that settled in over the next weeks and months. My colleagues probably didn’t believe the woman they worked alongside was secretly a comic-book ­villain—but surely the suspicion had been planted? I never knew whether someone on the subway was giving me a second glance because he knew me, or because he recognized me from the video. Fellow journalists reported back to me from conferences where Todd expatiated on my depravity at length—in one case, before an audience that included my boss. An old friend called to say he had posted a supportive comment about me at the New Republic and shortly after received an email from Todd, who had guessed his identity from his screen name, explaining all the reasons I did not deserve to be defended. I wondered how many such incidents I never heard about.

I tried to process the experience intellectually. I read Lord Jim and The House of Mirth. No grand lesson presented itself, which, in a way, was lucky. It meant there was no ideological interpretation I could superimpose on my experience, which would have slowed my progress toward acceptance by allowing me to indulge in resentment and indignation. I couldn’t tell myself it had happened because I was a woman. Had the genders been reversed, I probably would have received less sympathy than I did. I could not blame society, or C-SPAN, or Jonah Goldberg. A year and a half later, when I was looking for a new job, I could not even blame the prospective employers who demonstrated a marked reluctance to bring me in for interviews. If I had to choose between a candidate whom no one had ever called a sociopath on national television, and one who probably wasn’t a psycho but might be, I would play it safe, too, even if the probability was only a fraction of a percent.

In 2012, I decided I would rather be Lord Jim than Lily Bart, so I accepted an offer from my boyfriend (now husband) to move with him to Australia—the best decision I ever made. On my last night in New York, in a burst of either sentimentality or bravado, I called Todd. We met in Midtown for a drink, and I found, to my surprise, that there was nothing I particularly wanted to say to him. If I was looking for some kind of closure, I wasn’t ready for it yet. In the end I had only one question for him: When we were chatting in the courtyard before the panel, was it some kind of deliberate foreshadowing when he mentioned how much he always liked Pink Floyd’s The Wall and started singing a song from the album that goes, “Since, my friend, you have revealed your deepest fear, I sentence you to be exposed before your peers”? He said it was just a coincidence.

Moving to the other side of the world did not diminish the video’s place in my life as much as I thought it would. It was still the first result when you Googled my name, which presumably is one reason I couldn’t find a job for the first eighteen months. Eventually, I found a position at a think tank. When I released my first report, an Australian MP tweeted a link to the video and asked why anyone should care about this nutcase’s opinions on regulation. Even after I got married and took my husband’s last name, the video still popped up on social media when I did a TV appearance or had an op-ed in the paper. In 2017, when I moved back to Washington, D.C., and started meeting some of the younger writers in town, it took them less than a week to find the clip and ask me about it. Most of them had been in high school when it happened.

In a funny coincidence, the day I began writing this essay, my husband was attending a conference of free-market activists when his lunch table started talking about bad breakups in the conservative movement. One man pulled out his iPhone and said, “If you want to talk about bad conservative breakups, you have to see this.” He put the phone away when Tim told him that the woman in the video was his wife. That was eight years and twenty-one days since the broadcast first aired.

There is a celebrity fashion blog called Go Fug Yourself that specializes—or specialized back in 2011, the one and only time I visited the site—in unflattering paparazzi shots and red carpet disasters. The odd thing about Go Fug Yourself, I discovered, was that all its nastiest posts featured the same tic. After unloading whatever brutal snark she had for Jennifer Lawrence or whomever, the writer would always include the same disclaimer: A celebrity has one job, and that’s to look glamorous, so if you can’t manage the one thing you owe us in exchange for all the money and fame, then find another line of work, and until then lay off the cheeseburgers and hire a decent stylist. This dime-store Joan Rivers can’t think she’s fooling anyone, I thought as I scrolled through the archives to see if every post really included this lame moral alibi. Her motivation has nothing to do with celebrities falling short of their duty to the public. She’s making fun of ugliness for the same reason anyone does: It stimulates our lizard brains.

People who read the Atlantic are smarter than the readers of Go Fug Yourself, but sometimes smarter people don’t make better decisions; they just come up with better excuses. Kevin Williamson was fired by the Atlantic in April 2018 over an unearthed audio recording in which he said that abortion was a form of murder and should carry the same punishment, up to and including the death penalty. The aspect of the resulting Twitter storm that surprised me was not the way his statement was warped out of context into a defense of lynch justice for pregnant teenagers but the purported concern for his female coworkers. “How can you say that you want a workplace that values women when you hire someone who wants 25% of those women dead?” asked feminist Jessica Valenti. When Williamson’s firing was announced, in a memo that made delicate reference to “the values of our workplace,” Valenti responded, “I am very relieved for the women who work at the magazine.”

At the risk of insulting the reader: No one actually believed Williamson was a threat to his female colleagues. It was only a pretext for what was really an exercise in raw power. People made the same kind of excuses when it was my turn in the dunk tank. Again and again, I read commenters insisting that what might at first glance appear to be prurient gossip was, in fact, fair political commentary, because I was a family-values scold and thus open to charges of hypocrisy, or because I was a hard-core Randian who needed a lesson in the dog-eat-dog heartlessness advocated by my idol. As far as I can tell, these characterizations were extrapolated from the fact that I worked at National Review. Certainly, they had no basis in anything I’d written (an Objectivist, really?).

The more online shame cycles you observe, the more obvious the pattern becomes: Everyone comes up with a principled-sounding pretext that serves as a barrier against admitting to themselves that, in fact, all they have really done is joined a mob. Once that barrier is erected, all rules of decency go out the window, but the pretext is almost always a lie. Matthew Yglesias once claimed that the reason he mocked David Brooks for his divorce was because Brooks had written columns about the social value of marriage, but I do not believe him. He did it because it’s fun to humiliate your political opponents. Moira Donegan claims that she created the Shitty Media Men List—a clearinghouse of anonymous accusations optimally parked for maximum dissemination in the Google Spreadsheet cloud—for altruistic reasons and with no thought of its being used to hurt anyone, but I do not believe her. If it was about protecting women in media from harassment, then why no attempt to sort the true accusations from the false? Why the coy protestations that “I thought that the document would not be made public,” when of course she knew that it would be spread far and wide, or she wouldn’t have bothered creating it?

Donegan’s defenders do not behave like people interested in finding the truth. They stirred up a Twitter mob against Katie Roiphe before her Harper’s piece about the Shitty Media Men List was even published. Claims to be motivated by concern about possible backlash against Donegan, if Roiphe revealed her as the creator of the list, were more than a little disingenuous. Since being outed, Donegan has gotten a book deal with Simon & Schuster and a regular column in the Guardian, which is precisely what anyone could have predicted. When John Hockenberry, also in Harper’s, wrote about his experience being #MeToo’d out of his job at NPR, admitting some charges and explaining why he thought others were bogus, his detractors did not bother refuting his case. They simply ridiculed him. And no one has offered him a book deal.

In Trust Me, I’m Lying: Confessions of a Media Manipulator, Ryan Holiday’s memoir of his years as a PR consultant, he describes a roundtable meeting at the Huffington Post where the editors discussed how a certain big company should have handled its recent PR crisis. The editors offered the usual bromides: “Transparency is critical.” “Be proactive.” “Get out in front of it.” Holiday replied, “None of you know what you’re talking about.” The old rules don’t apply in the free-for-all world of online journalism, and they especially don’t work when the figure at the center of the controversy is one lonely individual. If a client came to him because he was being called a racist or sexist on Twitter, Holiday says (pardon the vulgarity), “I would tell him to bend over and take it. And then I’d apologize. I’d tell him the whole system is broken and evil, and I’m sorry it’s attacking him. But there’s nothing that can be done.”

Any attempt to defend yourself or clarify your original remarks is “the equivalent of a squeaky cry of, ‘Why is everyone making fun of me?!’ on the playground,” Holiday says. “Whether it happens in front of snarky blogs or a real-life bully, the result is the same: Everyone makes fun of you even more.” The idea that online shaming is a form of debate—or in any way oriented toward finding the truth—is a delusion. Dialogue is not the point. The day Brett Kavanaugh and Christine Blasey Ford testified before the Senate Judiciary Committee, the New Yorker—not Gawker, but the New Yorker—ran thirty-two Kavanaugh headlines in twenty-four hours, many of them on the subject of the nominee’s supposed whininess: “The Tears of Brett Kavanaugh”; “An Angry, Tearful Opening”; “Brett Kavanaugh’s Damaging, Revealing Partisan Bitterness”; “A Grotesque Display of Patriarchal Resentment.” The man had been accused of being a brutal rapist, and the most prestigious magazine in America ridiculed him for responding to the allegation as any innocent man would have. No, dialogue is not the point.

When I was debating whether or not to write this essay, which, after all, revisits an unpleasant incident that has long been at least semi-dormant, if not quite forgotten, I saw a headline in the New York Times: “His Body Was Behind the Wheel a Week Before It Was Discovered.” The man, Geoffrey Corbis, had committed suicide in a parked car in the East Village. Only his name wasn’t really Geoffrey Corbis, the Times explained. He had been born Geoffrey Weglarz. He changed it after an incident in 2013 at a McDonald’s near his home in Connecticut, when he threw a sandwich at a pregnant server who had given him the wrong order. Newspaper coverage of this funny local fracas did not mention Weglarz’s recent divorce or long-term unemployment after leaving his job as a computer programmer at Dell. He couldn’t find work with the McDonald’s story at the top of his Google results, hence the attempt at a fresh start as Geoffrey Corbis.

It happens more often than you would think. At least half a dozen cases mentioned in Laws of Image: Privacy and Publicity in America, Samantha Barbas’s 2015 history of shame and libel, end with suicides. Jon Ronson’s So You’ve Been Publicly Shamed describes an English chef, living in France, who killed himself after his wife-swapping hobby was revealed by the News of the World. It also tells of a rural Welsh preacher who found himself the subject of a photo spread in the same publication for hosting an orgy in his caravan—after which he, too, killed himself. Most victims of public shaming aren’t nationally famous editors like Ian Buruma. They are ordinary folks like “ID Adam,” who lost his job at a box assembly company in Winston-Salem after reports that he racially profiled a black woman at a community pool. It turned out that he, as the pool chair on duty, had asked to see her ID, because, when signing in, she had given an address on a street in the neighborhood where no houses had yet been built. It took him days to get his side of the story into the papers, and it didn’t make him any less fired.

An essay about public shaming should have advice for those people, I thought. When I couldn’t think of any, I called Todd. He had, after all, suffered quite as much from the C-SPAN2 fallout as I had. He left his job at Fox—not right away, but after three months, when he refused to sign a statement from HR saying that such TV appearances were a violation of their “outside media” policy, even though they had never expressed a problem with his extracurricular projects before. Four years later, he returned to the NewsCorp building to film a segment on the Kennedy show, only to be stopped in the lobby by security and told he was on a no-admit list. He makes a living as a ghostwriter now, and his book Libertarianism for Beginners was published to positive reviews in 2016. When I asked if he would do it over again if he had the choice, he said he is now a believer in handling things privately. “In the future, if I get married, if my wife stabs me, you won’t hear me shouting in public about it.”

“Things really can get so much bigger than you and your own efforts that you just kind of have to ride the wave,” Todd said. “I was obscure enough before that any public attention I got was the result of me trying really hard.” He told me he never expected the clip to go as viral as it did, “far beyond my ability to control or even monitor,” which sounded implausible—until I remembered just how unfamiliar these online shame cycles were in the years before Justine Sacco’s tweet. Todd thought he would say his piece—which, in his mind, was not just that I was a bad girlfriend, but that I had a “cruelty-based worldview” that future editors and employers should be warned against unwittingly promoting by giving me work—and that would be that.

Todd’s advice for our fellow-shamed was no better than mine. “When a tsunami is heading for your house, at a certain point you have to say, ‘I’m just gonna stand here and hold this piece of plywood and see what’s left standing when it’s all over.’” Arguing back is no use. “If you’re tweeting, you’re losing.” Even in the immediate aftermath of the C-SPAN2 incident, when Todd, on his blog, attempted to make his case at length against my evil beliefs, he saw his arguments get lost in the maelstrom—equally ignored by both supporters and detractors. If we had a breakthrough in our conversation, that was it: There is no content to a shame storm. It is mindless by its very nature. It is indifferent to truth, even in cases where the truth could possibly be determined. Therefore, like the Ring, it cannot be used for good.

The solution, then, is not to try to make shame storms well targeted, but to make it so they happen as infrequently as possible. Editors should refuse to run stories that have no value except humiliation, and readers should refuse to click on them. It is, after all, the moral equivalent of contributing your rock to a public stoning. We should all develop a robust sense of what is and is not any of our business. Shame can be useful—and even necessary—but it is toxic unless a relationship exists between two people first. A Twitter mob is no more a basis for salutary shaming than an actual mob is for reasoned discussion. That would be true even if the shaming’s relics were not preserved forever by Google, making any kind of rehabilitation impossible.

If Stephen Elliott has his way, would-be shamers will have to consult more than just their consciences. He is suing Moira Donegan for defamation over her media men list, in which his entry reads: “Rape accusations, sexual harassment, coercion, unsolicited invitations to his apartment, a dude who snuck into Binders???” (Binders is a Facebook group for women writers.) What it means to be accused of “rape accusations” will doubtless be clarified at trial. It sounds like the person who wrote this was speaking from rumor herself, which proves how cavalierly career-ending allegations of sexual assault are now thrown around. I have no legal opinion on whether Elliott’s lawsuit will be for #MeToo what Peter Thiel and Hulk Hogan’s heroic lawsuit was for Gawker, but, unless we all begin to respond more responsibly to public shaming, we can expect to see more attempts to (as President Trump put it) “open up our libel laws.”

As for the people who find themselves at the center of an online shaming, I can only report how I made peace with mine. Ironically, the disagreement that gave Todd the idea that I had a “cruelty-based worldview” was over my belief that suffering is sometimes necessary for personal growth, and an essential part of God’s plan for our salvation—a belief that, as a strict utilitarian, Todd completely rejects. We had a dozen fights about it. The irony, of course, is that there is no belief my brush with online shaming confirmed more. I had heard the maxim that there is no humility without humiliation—how true it proved. My first reaction to the video was to feel aggrieved, thinking that I did not deserve what was happening to me, but on the Day of Judgment all my sins will be shouted from the housetops, and Todd’s rant will sound like a retirement luncheon toast in comparison. Of course I deserved it, and worse; most of us poor sinners do.

Of all history’s martyrs to shame, the one whose example consoled me most was Oscar Wilde. He is remembered today as a gay rights pioneer, but, in the letters he wrote after his release from prison, he never rails against the injustice of the law that put him away. He did not think it was a good law; he simply believed that the justice or injustice of the charge against him was irrelevant. What mattered was that he had been rescued from his own pride and selfishness by his experience, when he could not have been saved by any gentler medicine. This lesson, which produced “The Ballad of Reading Gaol” (“I know not whether Laws be right, / Or whether Laws be wrong”), he put into plain prose in a letter written during his exile in July 1897. Sporus was the slave boy whom the emperor Nero freed and “married”:

To me, suffering seems now a sacramental thing, that makes those whom it touches holy. I think I am in many respects a much better fellow than I was, and I now make no more exorbitant claims on life: I accept everything. I am sure it is all right. I was living a life unworthy of an artist, and though I do not hold with the British view of morals that sets Messalina above Sporus, I see that any materialism in life coarsens the soul, and that the hunger of the body and the appetites of the flesh desecrate always, and often destroy. . . . I learnt many things in prison that were terrible to learn, but I learnt some good lessons that I needed.

The man to whom this letter was addressed was Carlos Blacker, who himself had fled England for France in 1890, when he was accused of being a card cheat. The charge against Blacker happened to be false, just as the charge against Wilde happened to be true, but that made no difference in the two men’s experiences. The truth that Wilde came to understand, which he shared with his fellow exile, was that they should accept their chastening in a spirit of gratitude. Nothing had been taken from them that would not be restored a hundredfold if they allowed their ­experience to do its redemptive work.

By Helen Andrews, published in the January 2019 issue of First Things, and can be found here.

It Shouldn’t Be This Easy to Fool the Academic Left

Somehow it is fitting that the most extraordinary academic hoax of our time would deal with dog parks, dildos, Hooters, masturbation, fat shaming, and a feminist Mein Kampf.

In a prank that is alternately hilarious, appalling, and disturbing, three puckish academics managed to place no fewer than seven “shoddy, absurd, unethical” articles in “respectable” academic journals that trafficked in the growing field of grievance studies—a field that includes gender and queer studies, critical race theory and a variety of post-modern constructivist theories now fashionable in the humanities and social sciences. If nothing else, they demonstrated that academic leftism is a target ripe for ridicule as well as outrage.

As they note in their paper, “Academic Grievance Studies and the Corruption of Scholarship,” the seven fake papers were the “tip of the iceberg” of sophistry in the hyper-ideological swamps of academia.

The absurdity of one of the fake papers was first highlighted by the Twitter account known as @RealPeerReview, which exposes a wide range of junk scholarship (if you don’t follow it, you really ought to). When the Wall Street Journal and others began sniffing around to ascertain the authorship of the piece, however, the jig was up and the three hoaxers decided to come clean. They admitted that they were also behind the “nutty and inhumane” idea to make white male students sit on the floor as a form of reparations, a paper that explored why straight men “rarely anally self-penetrate using sex toys,” and had even gotten a paper accepted in a feminist journal that was actually a chapter from Mein Kampf, “with fashionable buzzwords switched in.”

In addition to the seven papers that were accepted (three of them not yet published), another seven were “still in play,” and only six had been rejected by peer reviewers.

Their success was so spectacular—and the results so farcical—that Harvard’s Yascha Mounk has labeled the grievance-studies hoax “Sokal Squared,” a reference to what, until now, had been academia’s most elaborate ruse.

On May 18, 1996, the New York Times broke the story that one of the trendiest, most prestigious academic journals in the country had been the victim of an elaborate hoax. The journal Social Text had published a lengthy post-modernist critique of science, unaware that the whole thing was a parody, a complete spoof of academia’s “self-indulgent nonsense.”

The article, written by a physicist named Alan Sokal, was “a hodgepodge of unsupported statements, outright mistakes, and impenetrable jargon,” wrote the editors of Lingua Franca, the journal that exposed the prank. Filled with references to “hip theorists” like Jacques Derrida, it was “full of nonsense and errors.” But it had been published nonetheless.

Hilarity ensued as the implications of the prank became clear, not least because it seemed to confirm suspicions that beneath the academic gibberish lurked … well, just gibberish.

But the latest attempt to expose academic drivel is in some ways more ambitious, because rather than simply relying on word salads of jargon, the authors, Helen Pluckrose, James A. Lindsay, and Peter Boghossian, set out to mimic the mind-set of the “identitarian madness coming out of the academic and activist left.”

They began each paper with a bizarre or outrageous thesis—that astronomy is sexist, or that men should be trained as dogs—but hijacked the logic, language, and dogmas of existing grievance literature to support their claims. The papers were notable for their shoddiness and preposterousness, but—and here was the key—they fit seamlessly into what passes for scholarship in the world of grievance studies.

“While our papers are all outlandish or intentionally broken in significant ways, it is important to recognize that they blend in almost perfectly with others in the disciplines under our consideration,” they explained. “As we progressed, we started to realize that just about anything can be made to work, so long as it falls within the moral orthodoxy and demonstrates understanding of the existing literature.”

So, for example, they wanted to see if they could get a respected journal to “publish papers that seek to problematize heterosexual men’s attraction to women and will accept very shoddy qualitative methodology and ideologically-motivated interpretations which support this.” Again, success—the journal Sex Roles published “An Ethnography of Breastaurant Masculinity: Themes of Objectification, Sexual Conquest, Male Control, and Masculine Toughness in a Sexually Objectifying Restaurant,” in which the (fake) authors argued that men go to Hooters “because they are nostalgic for patriarchal dominance and enjoy being able to order attractive women around.”

Because the three pranksters wanted to see if they could get journals to accept arguments “which are ludicrous and positively dangerous to health if they support cultural constructivist arguments around body positivity and fatphobia,” they wrote a paper arguing for the sport of “fat bodybuilding.” It was duly published in Fat Studies.

And the journal Sexuality and Culture eagerly accepted a piece on sex toys—“Going in Through the Back Door: Challenging Straight Male Homohysteria and Transphobia through Receptive Penetrative Sex Toy Use”—that concluded that the male reluctance to use dildos was “actually homophobic, transphobic, and anti-feminist.”

But their pièce de résistance was their success in getting the journal Affilia to publish a rewrite of a chapter of Mein Kampf by titling it “Our Struggle is My Struggle: Solidarity Feminism as an Intersectional Reply to Neoliberal and Choice Feminism,” and leavening it with feminist jargon to distract from its Hitlerian antecedents.

Beneath the merriment, the authors made a deadly serious point. “The problem we’ve been studying is of the utmost relevance to the real world and everyone in it,” they write. Much of the work now being produced by academia’s growing and voluble grievance industry, “is positively horrifying and surreal while exerting considerable influence on the field and beyond.”

Pluckrose, Lindsay and Boghossian—who are all self-proclaimed liberals—warn progressives who care about advancing social justice that “these fields of study do not continue the important and noble liberal work of the civil rights movements; they corrupt it while trading upon their good names to keep pushing a kind of social snake oil onto a public that keeps getting sicker.”

Something has gone wrong in the university—especially in certain fields within the humanities. Scholarship based less upon finding truth and more upon attending to social grievances has become firmly established, if not fully dominant, within these fields, and their scholars increasingly bully students, administrators, and other departments into adhering to their worldview. This worldview is not scientific, and it is not rigorous….

This makes the problem a grave concern that’s rapidly undermining the legitimacy and reputations of universities, skewing politics, drowning out needed conversations, and pushing the culture war to ever more toxic and existential polarization.

The whistleblowers called on universities to launch a “thorough review” of all of the fields suffused with grievance studies “in order to separate knowledge-producing disciplines and scholars from those generating constructivist sophistry.”

But such sustained introspection or reformation seems unlikely, given the arc of academia’s new orthodoxies, which are regarded as authoritative in so many disciplines. Reaction from the academic left has been predictable, with the authors attacked as part of a racist, sexist, homophobic, and transphobic right-wing assault on legitimate scholarship. Feminist scholar Alison Phipps issued a call via Twitter (which has since been deleted) for her fellow academics to “please stand by colleagues in Gender Studies/Critical Race Studies/Fat Studies & other areas targeted by this journal article hoax. This is a coordinated attack from the right.” A critique in Slate also downplayed the gravity of the hoax and questioned why the story would be released “in the midst of the Kavanaugh imbroglio—a time when the anger and the horror of male anxiety is so resplendent in the news.”

Academia’s professional response to the whistleblowers is likely to be even harsher. The Wall Street Journal notes that the hoax will likely result in the academic excommunication of the three scholars.

Mr. Boghossian doesn’t have tenure and expects the university will fire or otherwise punish him. Ms. Pluckrose predicts she’ll have a hard time getting accepted to a doctoral program. Mr. Lindsay said he expects to become “an academic pariah,” barred from professorships or publications.

Even so, Lindsay thinks it was all worth it, telling the Journal’s Jillian Kay Melchior, “For us, the risk of letting biased research continue to influence education, media, policy and culture is far greater than anything that will happen to us for having done this.”

By Charles J. Sykes, published in The Weekly Standard on October 8, 2018, and can be found here.

Is this it?: A Trump-hater’s Guide to Mueller Skepticism

Mueller’s comportment suggests a man who’s fallen prey to the same state of mind that warped Ken Starr—namely disgust over the people you’re investigating and a desire to justify the sunk capital. Even if the special counsel presents one hell of a report, Democrats must ask: was it worth it?

In the autumn of 1995, millions of Indians flocked to New Delhi after reports that a statue of Ganesha, the Hindu deity of good luck, was drinking milk from a spoon. It turned out that Ganesha, in the form of carved white stone, was a bit porous, and he wasn’t drinking the milk so much as getting coated in it, as each of the thousands of spoonfuls trickled down his side, but a collective thrill prevailed for a while. I relate this incident because its rhythms—big news, then frenzy, then comedown—bear a strong resemblance to those of Russiagate, with each development setting the Resistance into a frenzy of milk-buying and statue-feeding that fades only after a few days, replaced by an unspoken agreement to wait for further reports on Ganesha’s movements.

For many Robert Mueller watchers, the air these days is electric. People sense the big shoes are about to drop. Donald Trump has submitted his written answers to Mueller’s questions. Paul Manafort has entered a plea agreement, but then continued to lie—at least according to Mueller. Jerome Corsi, fringe-right author and personality, is vowing to go to jail for life rather than sign on to Mueller’s version of events. Roger Stone is expecting to be indicted for something. So is Donald Trump Jr. And, most significant of all to those looking for a big payoff, Michael Cohen has pleaded guilty to lying to Congress about the timeline of a deal he was trying to make to construct a 100-story Trump-branded tower in Moscow. It turns out that the deal exploration continued past the time Trump had secured the Republican nomination, and Cohen and his associate Felix Sater, a real-estate promoter and one-time racketeer, had even discussed giving Vladimir Putin a $50 million penthouse in the building. “This is it,” people are saying. “This is the big one!”

Certainly, Trump’s ethical standards are low, but if sleaziness were a crime then many more people from our ruling class would be in jail. It is sleazy, but not criminal, to try to find out in advance what WikiLeaks has on Hillary Clinton. It is sleazy, but not criminal, to take a meeting in Trump Tower with a Russian lawyer promising a dossier of dirt on Clinton. (Just as, it should be mentioned, it is sleazy, but not criminal, to pay a guy to go to Russia to put together a dossier of dirt on Trump. This is one reason why the Clinton campaign lied about its connection to the Steele dossier, albeit without the disadvantage of being under oath.) It is sleazy, but not criminal, to pursue a business deal while you’re running for president. Mueller has nailed people for trying to prevaricate about their sleaze, so we already have a couple of guilty pleas over perjury, with more believed to be on the way. But the purpose of the investigation was to address suspicions of underlying conspiracy—that is, a plan by Trump staffers to get Russian help on a criminal effort. Despite countless man-hours of digging, this conspiracy theory, the one that’s been paying the bills at Maddow for a couple of years now, has come no closer to being borne out. (Or, as the true believers would say, at least not yet.)

Partisanship is hostile to introspection, but at some point maybe we’ll look back and think again about what was unleashed in the panic over Russian influence. Trump’s White House has pursued what is arguably the harshest set of policies toward Russia since the fall of Communism—hardly something to celebrate—yet nearly all the pressure, from the center-left as much as the right, is toward making it even tougher. As for those tapping along to S.N.L. songs in praise of Mueller and his indictments, they might want to remember that Trump won’t always be in office. The weapons you create for your side today will be used by the other side against you tomorrow. Do we really want the special-counsel investigation to become a staple of presidential life? It’s a creation with few boundaries on scope and a setup that encourages the selection of a suspect followed by a search for the crime, rather than the other way around. This caused calamities in the era of Bill Clinton, and it doesn’t get any better just because the partisan dynamics are reversed.

Let’s take a moment to consider Mueller himself. The cut of his jib is likable, and the trad Brooks Brothers vibe of his wardrobe is a perfect complement to his job title. But it’s hard to avoid the suspicion that he’s playing a political game at this point. To be fair, I’m vulnerable to confirmation bias of my own in this assessment, since about a year ago I suggested that Mueller was going to drag out his investigation until 2019, when Democrats were likely to be back in charge of the House, and seeing a prediction play out can lead to unwarranted certitude. But the reports we’re starting to see suggest a man who’s fallen prey to the same state of mind that warped Ken Starr—namely disgust over the people you’re investigating and a desire to justify the sunk capital.

Our justice system gives prosecutors a frightening amount of power as it is, and nothing tempts misuse of it quite like the belief in a narrative in the face of a disappointing witness. George Papadopoulos has told people he pleaded guilty to perjury because Mueller was threatening to prosecute him as an unregistered agent of Israel. Jerome Corsi insists that Mueller was (and is) threatening him with a raft of indictments unless he signed on to an untrue story of how he came to believe (or know) that WikiLeaks had hacked the e-mails of John Podesta.

If it’s any consolation to Trump haters, we can say this much: the special counsel’s office is going to put together a hell of a report. It will have less sex than Starr’s did, but that’s for the best, and the testimony of Michael Cohen will still guarantee a lot of great scenes, many of them certain to become immortal and embarrassing. Trumpworld won’t fare well under a bright light. Like Starr, Mueller is also likely to include footnotes and selections that will hint at criminality, the things he suspects but couldn’t prove, and the most ardent believers in collusion will claim vindication. But the international conspiracies will be few, and the collateral damage of the Russia scare will be extensive, stretching far beyond Trump or his circle to the country as a whole. It might hurt a president whom many Americans hate, but even the president’s most ardent foes should reflect on a question that will linger: Was it worth it?

Why We Miss the WASPs

Their more meritocratic, diverse and secular successors rule us neither as wisely nor as well.

The nostalgia flowing since the passing of George H.W. Bush has many wellsprings: admiration for the World War II generation and its dying breed of warrior-politicians, the usual belated media affection for moderate Republicans, the contrast between the elder Bush’s foreign policy successes and the failures of his son, and the contrast between any honorable politician and the current occupant of the Oval Office.

But two of the more critical takes on Bush nostalgia got closer to the heart of what was being mourned, in distant hindsight, with his death. Writing in The Atlantic, Peter Beinart described the elder Bush as the last president deemed “legitimate” by both of our country’s warring tribes — before the age of presidential sex scandals, plurality-winning and popular-vote-losing chief executives, and white resentment of the first black president. Also in The Atlantic, Franklin Foer described “the subtext” of Bush nostalgia as a “fondness for a bygone institution known as the Establishment, hardened in the cold of New England boarding schools, acculturated by the late-night rituals of Skull and Bones, sent off to the world with a sense of noblesse oblige. For more than a century, this Establishment resided at the top of the American caste system. Now it is gone, and apparently people wish it weren’t.”

I think you can usefully combine these takes, and describe Bush nostalgia as a longing for something America used to have and doesn’t really any more — a ruling class that was widely (not universally, but more widely than today) deemed legitimate, and that inspired various kinds of trust (intergenerational, institutional) conspicuously absent in our society today.

Put simply, Americans miss Bush because we miss the WASPs — because we feel, at some level, that their more meritocratic and diverse and secular successors rule us neither as wisely nor as well.

Foer suggests this nostalgia is mostly bunk, since the WASPs were so often bigots (he quotes Henry Adams’s fears of a “furtive Yacoob or Ysaac still reeking of the ghetto”), since their cultivation of noblesse oblige was really all about “preserving [a] place at the high table of American life,” and since so many of their virtues were superficial, a matter of dressing nicely while practicing imperialism, or writing lovely thank-you notes while they outsourced the dirty work of politics to race-baiting operatives.

“Those who are mourning the passing of the old Establishment should mourn its many failures, too,” he writes. Which is fair enough: The old ruling class was bigoted and exclusive and often cruel, it had failures aplenty, and as a Catholic I hold no brief for its theology (and don’t get me started on its Masonry).

However, one of the lessons of the age of meritocracy is that building a more democratic and inclusive ruling class is harder than it looks, and even perhaps a contradiction in terms. You can get rid of the social registers and let women into your secret societies and privilege SATs over recommendations from the rector of Justin and the headmaster of Saint Grottlesex … and you still end up with something that is clearly a self-replicating upper class, a powerful elite, filling your schools and running your public institutions.

Not only that, but you even end up with an elite that literally uses the same strategy of exclusion that WASPs once used against Jews to preserve its particular definition of diversity from high-achieving Asians — with the only difference being that our elite is more determined to deceive itself about how and why it’s discriminating.

So if some of the elder Bush’s mourners wish we still had a WASP establishment, their desire probably reflects a belated realization that certain of the old establishment’s vices were inherent to any elite, that meritocracy creates its own forms of exclusion — and that the WASPs had virtues that their successors have failed to inherit or revive.

Those virtues included a spirit of noblesse oblige and personal austerity and piety that went beyond the thank-you notes and boat shoes and prep school chapel going — a spirit that trained the most privileged children for service, not just success, that sent men like Bush into combat alongside the sons of farmers and mechanics in the same way that it sent missionaries and diplomats abroad in the service of their churches and their country.

The WASP virtues also included a cosmopolitanism that was often more authentic than our own performative variety — a cosmopolitanism that coexisted with white man’s burden racism but also sometimes transcended it, because for every Brahmin bigot there was an Arabist or China hand or Hispanophile who understood the non-American world better than some of today’s shallow multiculturalists.

And somehow the combination of pious obligation joined to cosmopolitanism gave the old establishment a distinctive competence and effectiveness in statesmanship — one that from the late 19th century through the middle of the 1960s was arguably unmatched among the various imperial elites with whom our establishment contended, and that certainly hasn’t been matched by our feckless leaders in the years since George H.W. Bush went down to political defeat.

So as an American in the old dispensation, you didn’t have to like the establishment — and certainly its members were often eminently hateable — to prefer their leadership to many of the possible alternatives. And as an American today, you don’t have to miss everything about the WASPs, or particularly like their remaining heirs, to feel nostalgic for their competence.

The interesting question is whether they had to die off as they did. The decline of the old establishment is often portrayed as a simple inevitability — with all those baby boomers storming the universities, all that demographic change sweeping away white Protestant America, how could the WASPs hope to preserve their rule?

Certainly something had to change. But along with the establishment failure in Vietnam, which hastened the collapse of the old elite’s authority, there was also a loss of religious faith and cultural confidence, and a belief among the last generation of true WASPs that the emerging secular meritocracy would be morally and intellectually superior to their own style of elite. Thus under ’60s mandarins like the Yale president Kingman Brewster the WASP ascendancy did not simply fall; it pre-emptively dissolved itself.

I’m not sure that self-abnegation has aged well. In any scenario the WASP elite would have had to diversify and adapt. But its virtues were to some extent transferable to a more diverse society: The establishment had always been somewhat permeable to arrivistes, Jews and Catholics imitated WASP habits in the 1940s and 1950s, and in our era that admirable influence is still felt in figures as different as Barack Obama and Mitt Romney.

So it’s possible to imagine adaptation rather than surrender as a different WASP strategy across the 1960s and 1970s. In such a world the establishment would have still admitted more blacks, Jews, Catholics and Hispanics (and more women) to its ranks … but it would have done so as a self-consciously elite-crafting strategy, rather than under the pseudo-democratic auspices of the SAT and the high school resume and the dubious ideal of “merit.” At the same time it would have retained both its historic religious faith (instead of exchanging Protestant rigor for a post-Christian Social Gospel and a soft pantheism) and its more self-denying culture (instead of letting all that wash away in the flood of boomer-era emotivism). The goal would have been to keep piety and discipline embedded in the culture of a place like Harvard, rather than the mix of performative self-righteousness and raw ambition that replaced them.

Such an effort might also have had spillover effects on politics. It’s de rigueur for liberals to lament the decline of the Rockefeller Republicans, or the compromises that a moderate northeastern WASP like George H.W. Bush made with Sunbelt populism. But a WASP establishment that couldn’t muster the self-confidence to hold on to Yale and Harvard was never likely to maintain its hold on a mass political organization like the G.O.P. Whereas an establishment that still believed in its mission within its own ivied bastions might have been more politically imposing in the wider world — instead of seeing its last paladin, a war hero and statesman in a grand American tradition, dismissed in the boomer era as a “wimp.”

The point of this counterfactual is not to just join the nostalgic chorus around Bush’s departure for the Great Kennebunkport in the Skies. Rather it’s to look forward, and to suggest that our current elite might someday be reformed — or simply replaced — through the imitation of the old establishment’s more pious and aristocratic spirit.

Right now, almost all the discussion of our meritocracy’s vices assumes the system’s basic post-WASP premises, and hopes that either more inclusion (the pro-diversity left’s fixation) or a greater emphasis on academic merit (the anti-affirmative-action right’s hobbyhorse) will cure our establishment’s all-too-apparent ills.

But nostalgia for what was best about the old establishment might point to a more radical theory of the case, one proposed by Helen Andrews in a 2016 Hedgehog Review essay on meritocracy and its discontents:

The meritocracy is hardening into an aristocracy — so let it. Every society in history has had an elite, and what is an aristocracy but an elite that has put some care into making itself presentable? Allow the social forces that created this aristocracy to continue their work, and embrace the label. By all means this caste should admit as many worthy newcomers as is compatible with their sense of continuity. New brains, like new money, have been necessary to every ruling class, meritocratic or not. If ethnic balance is important to meritocrats, they should engineer it into the system. If geographic diversity strikes them as important, they should ensure that it exists, ideally while keeping an eye on the danger of hoovering up all of the native talent from regional America. But they must give up any illusion that such tinkering will make them representative of the country over which they preside. They are separate, parochial in their values, unique in their responsibilities. That is what makes them aristocratic.

This idea is heresy to our current ruling class; it would have been simple wisdom to the WASPs. If we would learn from their lost successes in our own era of misrule, reconsidering this idea — that a ruling class should acknowledge itself for what it really is, and act accordingly — might be a fruitful place to start.

Ross Douthat has been an Opinion columnist for The Times since 2009. He is the author of several books, most recently, “To Change the Church: Pope Francis and the Future of Catholicism.”

Breaking Faith

The culture war over religious morality has faded; in its place is something much worse.

Over the past decade, pollsters charted something remarkable: Americans—long known for their piety—were fleeing organized religion in increasing numbers. The vast majority still believed in God. But the share that rejected any religious affiliation was growing fast, rising from 6 percent in 1992 to 22 percent in 2014. Among Millennials, the figure was 35 percent.

Some observers predicted that this new secularism would ease cultural conflict, as the country settled into a near-consensus on issues such as gay marriage. After Barack Obama took office, a Center for American Progress report declared that “demographic change,” led by secular, tolerant young people, was “undermining the culture wars.” In 2015, the conservative writer David Brooks, noting Americans’ growing detachment from religious institutions, urged social conservatives to “put aside a culture war that has alienated large parts of three generations.”

That was naive. Secularism is indeed correlated with greater tolerance of gay marriage and pot legalization. But it’s also making America’s partisan clashes more brutal. And it has contributed to the rise of both Donald Trump and the so-called alt-right movement, whose members see themselves as proponents of white nationalism. As Americans have left organized religion, they haven’t stopped viewing politics as a struggle between “us” and “them.” Many have come to define us and them in even more primal and irreconcilable ways.

When pundits describe the Americans who sleep in on Sundays, they often conjure left-leaning hipsters. But religious attendance is down among Republicans, too. According to data assembled for me by the Public Religion Research Institute (PRRI), the percentage of white Republicans with no religious affiliation has nearly tripled since 1990. This shift helped Trump win the GOP nomination. During the campaign, commentators had a hard time reconciling Trump’s apparent ignorance of Christianity and his history of pro-choice and pro-gay-rights statements with his support from evangelicals. But as Notre Dame’s Geoffrey Layman noted, “Trump does best among evangelicals with one key trait: They don’t really go to church.” A Pew Research Center poll last March found that Trump trailed Ted Cruz by 15 points among Republicans who attended religious services every week. But he led Cruz by a whopping 27 points among those who did not.

Why did these religiously unaffiliated Republicans embrace Trump’s bleak view of America more readily than their churchgoing peers? Has the absence of church made their lives worse? Or are people with troubled lives more likely to stop attending services in the first place? Establishing causation is difficult, but we know that culturally conservative white Americans who are disengaged from church experience less economic success and more family breakdown than those who remain connected, and they grow more pessimistic and resentful. Since the early 1970s, according to W. Bradford Wilcox, a sociologist at the University of Virginia, rates of religious attendance have fallen more than twice as much among whites without a college degree as among those who graduated college. And even within the white working class, those who don’t regularly attend church are more likely to suffer from divorce, addiction, and financial distress. As Wilcox explains, “Many conservative, Protestant white men who are only nominally attached to a church struggle in today’s world. They have traditional aspirations but often have difficulty holding down a job, getting and staying married, and otherwise forging real and abiding ties in their community. The culture and economy have shifted in ways that have marooned them with traditional aspirations unrealized in their real-world lives.”

The worse Americans fare in their own lives, the darker their view of the country. According to PRRI, white Republicans who seldom or never attend religious services are 19 points less likely than white Republicans who attend at least once a week to say that the American dream “still holds true.”

But non-churchgoing conservatives didn’t flock to Trump only because he articulated their despair. He also articulated their resentments. For decades, liberals have called the Christian right intolerant. When conservatives disengage from organized religion, however, they don’t become more tolerant. They become intolerant in different ways. Research shows that evangelicals who don’t regularly attend church are less hostile to gay people than those who do. But they’re more hostile to African Americans, Latinos, and Muslims. In 2008, the University of Iowa’s Benjamin Knoll noted that among Catholics, mainline Protestants, and born-again Protestants, the less you attended church, the more anti-immigration you were. (This may be true in Europe as well. A recent thesis at Sweden’s Uppsala University, by an undergraduate named Ludvig Broomé, compared supporters of the far-right Swedish Democrats with people who voted for mainstream candidates. The former were less likely to attend church, or belong to any other community organization.)

How might religious nonattendance lead to intolerance? Although American churches are heavily segregated, it’s possible that the modest level of integration they provide promotes cross-racial bonds. In their book, Religion and Politics in the United States, Kenneth D. Wald and Allison Calhoun-Brown reference a different theory: that the most-committed members of a church are more likely than those who are casually involved to let its message of universal love erode their prejudices.

Whatever the reason, when cultural conservatives disengage from organized religion, they tend to redraw the boundaries of identity, de-emphasizing morality and religion and emphasizing race and nation. Trump is both a beneficiary and a driver of that shift.

So is the alt-right. Read Milo Yiannopoulos and Allum Bokhari’s famous Breitbart.com essay, “An Establishment Conservative’s Guide to the Alt-Right.” It contains five references to “tribe,” seven to “race,” 13 to “the west” and “western” and only one to “Christianity.” That’s no coincidence. The alt-right is ultra-conservatism for a more secular age. Its leaders like Christendom, an old-fashioned word for the West. But they’re suspicious of Christianity itself, because it crosses boundaries of blood and soil. As a college student, the alt-right leader Richard Spencer was deeply influenced by Friedrich Nietzsche, who famously hated Christianity. Radix, the journal Spencer founded, publishes articles with titles like “Why I Am a Pagan.” One essay notes that “critics of Christianity on the Alternative Right usually blame it for its universalism.”

Secularization is transforming the left, too. In 1990, according to PRRI, slightly more than half of white liberals seldom or never attended religious services. Today the proportion is 73 percent. And if conservative nonattenders fueled Trump’s revolt inside the GOP, liberal nonattenders fueled Bernie Sanders’s insurgency against Hillary Clinton: While white Democrats who went to religious services at least once a week backed Clinton by 26 points, according to an April 2016 PRRI survey, white Democrats who rarely attended services backed Sanders by 13 points.

Sanders, like Trump, appealed to secular voters because he reflected their discontent. White Democrats who are disconnected from organized religion are substantially more likely than other white Democrats to call the American dream a myth. Secularism may not be the cause of this dissatisfaction, of course: It’s possible that losing faith in America’s political and economic system leads one to lose faith in organized religion. But either way, in 2016, the least religiously affiliated white Democrats—like the least religiously affiliated white Republicans—were the ones most likely to back candidates promising revolutionary change.

The decline of traditional religious authority is contributing to a more revolutionary mood within black politics as well. Although African Americans remain more likely than whites to attend church, religious disengagement is growing in the black community. African Americans under the age of 30 are three times as likely to eschew a religious affiliation as African Americans over 50. This shift is crucial to understanding Black Lives Matter, a Millennial-led protest movement whose activists often take a jaundiced view of established African American religious leaders. Brittney Cooper, who teaches women’s and gender studies as well as Africana studies at Rutgers, writes that the black Church “has been abandoned as the leadership model for this generation.” As Jamal Bryant, a minister at an AME church in Baltimore, told The Atlantic’s Emma Green, “The difference between the Black Lives Matter movement and the civil-rights movement is that the civil-rights movement, by and large, was first out of the Church.”

Black Lives Matter activists sometimes accuse the black Church of sexism, homophobia, and complacency in the face of racial injustice. For instance, Patrisse Cullors, one of the movement’s founders, grew up as a Jehovah’s Witness but says she became alienated by the fact that the elders were “all men.” In a move that faintly echoes the way some in the alt-right have traded Christianity for religious traditions rooted in pagan Europe, Cullors has embraced the Nigerian religion of Ifa. To be sure, her motivations are diametrically opposed to the alt-right’s. Cullors wants a spiritual foundation on which to challenge white, male supremacy; the pagans of the alt-right are looking for a spiritual basis on which to fortify it. But both are seeking religions rooted in racial ancestry and disengaging from Christianity—which, although profoundly implicated in America’s apartheid history, has provided some common vocabulary across the color line.

Critics say Black Lives Matter’s failure to employ Christian idiom undermines its ability to persuade white Americans. “The 1960s movement … had an innate respectability because our leaders often were heads of the black church,” Barbara Reynolds, a civil-rights activist and former journalist, wrote in The Washington Post. “Unfortunately, church and spirituality are not high priorities for Black Lives Matter, and the ethics of love, forgiveness and reconciliation that empowered black leaders such as King and Nelson Mandela in their successful quests to win over their oppressors are missing from this movement.” As evidence of “the power of the spiritual approach,” she cited the way family members of the parishioners murdered at Charleston’s Emanuel AME church forgave Dylann Roof for the crime, and thus helped persuade local politicians to remove the Confederate flag from South Carolina’s Capitol grounds.

Black Lives Matter’s defenders respond that they are not interested in making themselves “respectable” to white America, whether by talking about Jesus or wearing ties. (Of course, not everyone in the civil-rights movement was interested in respectability either.) That’s understandable. Reformists focus on persuading and forgiving those in power. Revolutionaries don’t.

Black Lives Matter activists may be justified in spurning an insufficiently militant Church. But when you combine their post-Christian perspective with the post-Christian perspective growing inside the GOP, it’s easy to imagine American politics becoming more and more vicious.

In his book Twilight of the Elites, the MSNBC host Chris Hayes divides American politics between “institutionalists,” who believe in preserving and adapting the political and economic system, and “insurrectionists,” who believe it’s rotten to the core. The 2016 election represents an extraordinary shift in power from the former to the latter. The loss of manufacturing jobs has made Americans more insurrectionist. So have the Iraq War, the financial crisis, and a black president’s inability to stop the police from killing unarmed African Americans. And so has disengagement from organized religion.

Maybe it’s the values of hierarchy, authority, and tradition that churches instill. Maybe religion builds habits and networks that help people better weather national traumas, and thus retain their faith that the system works. For whatever reason, secularization isn’t easing political conflict. It’s making American politics even more convulsive and zero-sum.

For years, political commentators dreamed that the culture war over religious morality that began in the 1960s and ’70s would fade. It has. And the more secular, more ferociously national and racial culture war that has followed is worse.

By Peter Beinart, published in the April 2017 edition of The Atlantic, and can be found here.

Gil Smart on Liberalism

Studies Are Usually Bunk, Study Shows

An interesting detail went overlooked in the fury over fired Google engineer James Damore’s “diversity memo.” At the end of the document he calls for an end to mandatory “Unconscious Bias training.” Large corporations often force employees into re-education classes; this one is a dull, hourlong, 41-slide seminar supported by study after study. Can these studies be trusted? Doubtful. Hands down, the two most dangerous words in the English language today are “studies show.”

The world is inundated with flighty studies manipulated to prove some larger point about mankind in the name of behavioral science. Pop psychologists have churned out mountains of books purporting to prove some intuitive point that turns out to be wrong. It’s “sciencey,” with a whiff of (false) authenticity.

Malcolm Gladwell is the master. In his 2008 book, “Outliers,” he argues that studies show no one is born better than anyone else. Instead, success comes to those who put in 10,000 hours of practice. That does sound right, but maybe Steph Curry shoots hoops for 10,000 hours because he is better at basketball than everyone else in the first place. Meanwhile I watch 10,000 hours of TV. Facing criticism, Mr. Gladwell somewhat recanted: “In cognitively demanding fields, there are no naturals.” News alert: Professional sports are cognitively demanding.

Many of the studies quoted in newspaper articles and pop-psychology books are one-offs anyway. In August 2015, the Center for Open Science published a study in which 270 researchers spent four years trying to reproduce 100 leading psychology experiments. They successfully replicated only 39. Yes, I see the irony of a study debunking a study, but add to this a Nature survey of 1,576 scientists published last year. “More than 70% of researchers have tried and failed to reproduce another scientist’s experiments,” the survey report concludes. “And more than half have failed to reproduce their own experiments.”

Bunk medical studies are worrisome, but who really cares about pop behavioral science? It’s easy to write this off as trivial, except millions take these studies and their conclusions seriously. The 2008 book “Nudge,” from academics Richard Thaler and Cass Sunstein, called for “libertarian paternalism” to push people in the right direction. But who decides what’s the right direction? Turns out the answer is Mr. Sunstein. He was hired by the Obama administration in 2009 to run the Office of Information and Regulatory Affairs. Call it psychobabble authoritarianism.

In his best seller “Blink,” Mr. Gladwell finds studies suggesting we are all unconsciously biased sexists, racists, genderists, ableists, and a litany of other “ists”—victimhood’s origin story. Newer research has deflated this theory, but the serious conclusions, and boring training seminars they inevitably lead to, remain. In her first debate against Donald Trump, Hillary Clinton channeled her inner Malcolm Gladwell and declared: “Implicit bias is a problem for everyone, not just police.” Everyone? Speak for yourself. It’s as if she called an entire slice of society deplorable.

Psych labs are being replaced. In the past decade, companies have built vast platforms to probe, test and study humans every time they search, like or snap. Google runs what are called Split A/B tests, dividing users into groups and testing usability and other features to see what works best. In 2014, Facebook caused a bit of a stir after altering 689,000 users’ newsfeeds to see if the company could manipulate their emotions. It could. Good or bad, this is the future of studies.
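
To make concrete what a split A/B test involves, here is a minimal, hypothetical sketch in Python. The bucketing scheme, function names, and numbers are illustrative assumptions, not a description of Google’s or Facebook’s actual tooling: users are deterministically assigned to one of two variants, and the difference in outcomes is then checked for statistical significance.

import hashlib
import math
import random

def assign_bucket(user_id: str, experiment: str) -> str:
    """Deterministically split users into variant A or B by hashing their ID."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates."""
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (conv_a / n_a - conv_b / n_b) / se
    # Convert |z| to a two-sided p-value via the standard normal CDF.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Simulate 10,000 users; variant B is built to convert slightly better.
random.seed(1)
counts = {"A": [0, 0], "B": [0, 0]}  # [conversions, exposures]
for i in range(10_000):
    bucket = assign_bucket(f"user{i}", "new_layout")
    true_rate = 0.10 if bucket == "A" else 0.12
    counts[bucket][0] += random.random() < true_rate
    counts[bucket][1] += 1

(ca, na), (cb, nb) = counts["A"], counts["B"]
print(f"A: {ca}/{na}  B: {cb}/{nb}  p = {two_proportion_p_value(ca, na, cb, nb):.4f}")

Hashing the user ID, rather than flipping a coin on each visit, keeps a returning user in the same bucket, which is what lets a platform run thousands of such tests at once without users flickering between variants.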

The world is not binary, but conclusions drawn from studies always are. These studies show whatever someone wants them to. So stay skeptical and remember: Correlation doesn’t equal causation. If only I could find a study that shows this.

Mr. Kessler writes on technology and markets for the Journal.

Appeared in the August 14, 2017, print edition of the Wall Street Journal and can be found here.

Bums

Living the low-stakes life

I have lost touch with my friend Mark, and, assuming he is alive, it will be some work to track him down, because he is periodically homeless or semi-homeless. My first impression was that his economic condition was mainly the result of his having been for many years a pretty good addict and a pretty poor motorcyclist, a combination that had predictable neurological consequences. I never knew Mark “before” — there is something in such men as Mark suggesting an irrevocably bifurcated life — but the better I got to know him, the more I came to believe that he probably had been much the same man, but functional, or at least functional enough.

Part of it was an act, but not all of it. If you saw him on the street and called his name, he’d spin around on you, fists balled up, half enraged and half afraid, ready to fight, until he recognized you, which could sometimes take a few seconds longer than it should have. But then he was all smiles and wry commentary on the passers-by and the police. He’d gesture at passing police cars (he lived about two blocks from the police station) and say, “They all know me,” which was true. We talked about motorcycles and his longing to ride again, and he’d explain to me all the reasons why that was never, ever going to happen. “They’d lock me up,” he’d say darkly, which also was true. He’d sometimes ask to borrow mine, and I’d explain to him all the reasons why that was never, ever going to happen. “You’re a maniac.” This was an approved line of argument. “That’s right!” he’d thunder. Maniac was fine, but he objected to lunatic. He didn’t like bum very much, either, but he was a realist.

Necessity used to be what forced us to grow up. That was the stick, and sex was the carrot, and between the two of them young men were forced/inspired to get off their asses, go to work, and start families of their own from time immemorial until the day before yesterday. A 20-year-old man with adequate shelter, cheap food, computer games, weed, and a girlfriend is apt to be pretty content. Some of them understand that there is more to life than that, but some do not. David Foster Wallace’s great terror in Infinite Jest was entertainment so engrossing that those consuming it simply stopped doing anything else. (Is it necessary to issue a spoiler alert for a 1,000-page novel that’s 20 years old? Well, spoiler alert: It’s Québécois separatists.) He revisited the idea later in “Datum Centurio,” which is one of the all-time great short stories, one that is written in the form of a dictionary entry from the future for the word “date.” Over the course of the definition (and the inevitable footnotes), we learn that pornography has become so immersive in the future that conventional sexual behavior has been restricted entirely to procreation. The final footnote reads: “Cf. Catholic dogma, perverse vindication of.”

Tyler Cowen considers some of this in his new book, The Complacent Class, in which he argues (in the words of Walter Russell Mead’s review) that “the apparent stability of American society . . . is an illusion: behind the placid façade, technological change and global competition have combined with domestic discontent to bring forth a new age of disruption.”

By Kevin D. Williamson, published in National Review on February 26, 2017, and can be found here.
