So asked Reverend Cleophus James of Jake and Elwood Blues in “The Blues Brothers.” Yet James Brown now sings the blues. He died more than eight years ago, at age 73, on December 25, 2006, and his estate remains unsettled, lost in darkness, with his body not yet permanently interred at the intended memorial. Caught in estate litigation of Dickensian proportions, with personal representatives who have been appointed, have resigned, and have been dismissed and reappointed, one wonders whether any funds will be left for the charitable beneficiaries of his estate as it perambulates through the legal system.
Mr. Brown, the “Godfather of Soul,” led a complicated life, including numerous marriages, children, and allegations of drug use and domestic violence. While one might expect his estate planning documents, or the lack thereof, to be equally complex or unplanned, he did leave a will and an irrevocable trust, which left substantial portions of his wealth to provide scholarships to needy children. However, despite the passage of over eight years, the trustees have allegedly not distributed scholarships to needy students through the charitable trust. Instead, the South Carolina Attorney General intervened in the matter to an allegedly unprecedented extent. Ultimately, the South Carolina Supreme Court overruled the Attorney General, preventing the implementation of the Attorney General’s proposed changes to Mr. Brown’s dispositive intentions.
This summer, the Court sent the matter back to the local South Carolina probate court, which, due to various claims, has yet to enter a final decision. Why the matter did not promptly proceed to trial in the probate court, so that all issues could be resolved and a decision rendered, remains unclear.
Adding further complexity, the value of Mr. Brown’s estate remains unknown. That value is a critical factor in determining the reasonableness of the fees of the various professionals, which could ultimately further deplete the estate.
Presumably, Mr. Brown’s estate continues to earn millions of dollars a year in royalties. How the income earned after his passing has been allocated remains unknown. If the executors/trustees could not, or simply failed to, distribute the net income to the charitable beneficiaries and the charitable trust, then almost certainly the IRS, on behalf of the U.S. Treasury, and possibly state or local governments, will receive substantial income tax revenues from the estate, instead of the estate deducting the funds passing to charity from the income earned.
Many individuals do not understand that an estate or trust, just like a person or corporation, must pay income tax unless the estate distributes the income to the beneficiaries or the funds pass to a qualified charitable beneficiary, such as a charitable trust or other like organization. When such a distribution occurs, the income received by the beneficiaries or charitable trusts generally results in deductions at the estate level, with non-charitable individuals and entities generally then considered to have received income to the extent the funds distributed exceed principal.
A qualified charity receiving income from a qualified charitable entity generally does not have to pay any income tax. But when the income is accumulated and not distributed, the U.S. Treasury levies an income tax on estate income over $600 (the exemption is $100 for certain trusts), with the maximum income tax rate currently set, in general, at 39.6% for all income over $12,150 in a year. By contrast, income allocated to qualified charities is generally not subject to income tax. The foregoing does not even factor in any state or local income taxes. Thus, the litigation over Mr. Brown’s estate has exposed it to substantial income tax due to the lack of a prompt settlement.
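To make the arithmetic concrete, here is a minimal sketch in Python, purely illustrative and not tax advice, contrasting the two outcomes described above. The $600 exemption, the 39.6% top rate, and the $12,150 threshold come from the discussion above; the lower bracket breakpoints and the $1,000,000 royalty figure are assumptions added for illustration.

```python
# Illustrative only: federal income tax on income an estate accumulates,
# versus income it distributes to a qualified charity (effectively taxed at
# zero after the estate-level deduction). Bracket breakpoints below the top
# rate are assumed 2014 figures for estates and trusts.

BRACKETS = [           # (upper bound of bracket, marginal rate)
    (2_500, 0.15),
    (5_800, 0.25),
    (8_850, 0.28),
    (12_150, 0.33),    # the 39.6% top rate applies above $12,150
    (float("inf"), 0.396),
]

EXEMPTION = 600        # an estate is taxed only on income over $600


def accumulated_income_tax(gross_income: float) -> float:
    """Progressive federal tax on income the estate keeps rather than distributes."""
    taxable = max(gross_income - EXEMPTION, 0.0)
    tax, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        if taxable <= lower:
            break
        tax += rate * (min(taxable, upper) - lower)
        lower = upper
    return tax


# A hypothetical year of $1,000,000 in royalties, accumulated in the estate:
print(round(accumulated_income_tax(1_000_000)))  # roughly $394,000 in federal tax
# The same $1,000,000 distributed to a qualified charity is deductible at the
# estate level, so the federal income tax on that income is effectively zero.
```

On these assumed numbers, accumulating a single year of royalties costs the estate nearly forty cents of every dollar in federal tax alone, before any state or local levies, which is precisely the exposure described above.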
How could some of the above-discussed issues have been avoided?
A family settlement agreement, prepared before Mr. Brown’s passing, in which he fully disclosed all assets, debts, income, and expenses, and which all heirs and beneficiaries executed, agreeing to the disposition he intended, might have limited litigation after his death. This was attempted to some extent, at least with the individual claiming to be his spouse; as she is now a party to the estate litigation, the agreement was presumably insufficient for some reason.
If such an agreement was not practical or possible, then years before his passing Mr. Brown could have retained an independent appraiser to value his estate and, while he was alive and in good health, transferred assets to one or more types of irrevocable charitable trusts, such as a charitable remainder annuity trust. Some funds or assets might have been allocated directly to the scholarship fund he intended to create, while other funds or assets could have been allocated to the individual beneficiaries he intended to benefit from his estate. Indeed, transferring highly appreciated assets to one or more such trusts might have resulted in a substantial reduction in taxes, whether income taxes, estate taxes, or both.
Of course, funding charitable bequests while alive, or otherwise gifting assets to heirs in advance of one’s passing, requires a willingness to live on less income and to accept one’s mortality. Moreover, there are tax consequences. There are potential gift taxes, depending on the sums in question. Also, when a person dies, the cost basis used to determine capital gain is normally the value of the asset in question on the date of death, which is known as a “step-up” in cost basis. In effect, this means that if you hold a highly appreciated asset until you die, there may be little to no capital gains tax when it is sold. But if the asset had been transferred while alive to anyone other than a qualified charity, there might have been substantial capital gains taxes. There are a variety of charitable trusts that, when used, can minimize or eliminate the capital gain tax in question. Accepting one’s mortality, and careful planning, can thus minimize the possibility of disputes.
Here is an article by Adam S. Bernick, Esquire, who is of counsel with my firm. This article was originally published in Upon Further Review on January 22, 2015.
An anti-lapse statute is a rule of interpretation intended to cure a lapse in a will by ensuring that the surviving issue of a predeceased child, or of certain other close relatives, receive the testator’s devise or bequest. Specifically, the Pennsylvania anti-lapse statute states that “[A] devise or bequest to a child or other issue of the testator …shall not lapse if the beneficiary shall fail to survive the testator and shall leave issue surviving the testator but shall pass to such surviving issue who shall take per stirpes the share which their deceased ancestor would have taken had he survived the testator….” 20 Pa. C.S. § 2514(9).
When does the anti-lapse statute apply to a will in which the testator included language that two children should “share and share alike” and one of the children predeceased the testator? In a case of first impression at the appellate level, the Superior Court recently provided guidance on this matter in In re Estate of Harper, — A.2d —-, 2009 WL 1510255 (Pa.Super.), 2009 PA Super 104.
In Estate of Harper, Testator’s will stated as follows:
SECOND: I give, devise and bequeath all the rest, residue and remainder of my estate to my wife…
THIRD: In the event my wife, FLORENCE J. HARPER, fails to survive me, then I give, devise and bequeath all the rest, residue and remainder of my estate, real, personal and mixed, of whatsoever kind and nature and wheresoever the same may be situate, of which I shall die seized and possessed, or to which at the time of my death I may be entitled, to my son, SAMUEL CARL HARPER and to my son, WILLIAM D. HARPER, share and share alike.
In Estate of Harper, the Testator’s wife and his son William predeceased him, William by almost five years. The surviving son, the currently serving Personal Representative (Executor) of the Estate, claimed that he, as the surviving beneficiary of the Testator, was entitled to the entire residue of the Testator’s Estate, because the phrase “share and share alike” denotes a per capita, or individual, distribution and necessarily negates any right of representation.
At the trial level, the Orphans’ Court noted that the phrase “share and share alike” “is standard language,” observing that “these are words that have been used in wills for hundreds of years.” (Superior Court, citing Notes of Testimony, 5/29/07, at 10.)
The Superior Court noted that it failed to see how the language “share and share alike” was intended to avoid the anti-lapse statute. For example, “while the testator provided for the possibility that his wife might predecease him, he did not use any survivorship language in the residuary clause such as ‘provided this person is living at my death’ or ‘if this person does not survive me’ with regard to the two sons.” The Superior Court also noted that “the testator had almost five years after the death of his son to revise the Will if he did not want his son’s share to pass through.”
Ultimately, the Superior Court found Appellant’s interpretation of the words “share and share alike” to be unreasonable and insufficient, “standing alone, to overcome the statutory presumption against lapsed bequests,” because to hold otherwise would mean that “the anti-lapse statute would be effectively eviscerated.”
As a related matter, the Superior Court held that the Orphans’ Court correctly declined to hear extrinsic evidence of an alleged ambiguity in the testator’s will relating to the beneficiary who would now receive the property as a result of the anti-lapse statute.
Individuals engaged in estate planning on behalf of clients need to make clear to their clients that, if the client wants assets to pass to the surviving co-beneficiaries, the will must include language to that effect; otherwise, the anti-lapse statute may be applied, and instead of the client’s remaining beneficiaries inheriting the property, it may pass to the issue of the deceased beneficiary.
By Adam S. Bernick, Esquire, Law Office of Adam S. Bernick and of counsel to the Law Office of Faye Riva Cohen, P.C.
This article was originally published in Upon Further Review on August 7, 2009, and can be seen here.
This past weekend, Americans learned of another mass shooting, this time by an employee who decided to murder as many of the people he had worked with for years as possible. As of this writing, the murder toll is 12 people.
Every American asks why. What was the killer’s motive? When we read there is “no known motive,” we are frustrated. Human beings want to make sense of life, especially of evil.
Liberals (in this regard, liberals’ views are essentially the same as leftists’) are virtually united in ascribing these shootings to guns. Just this past weekend, in a speech in Brazil, former President Barack Obama told an audience:
“Our gun laws in the United States don’t make much sense. Anybody can buy any weapon any time — without much, if any, regulation. They can buy (guns) over the internet. They can buy machine guns.”
That the former president fabricated a series of falsehoods about the United States — and maligned, on foreign soil, the country that twice elected him president — speaks to his character and to the character of the American news media that have been completely silent about these falsehoods. But the main point here is that, like other liberals and leftists, when Obama addresses the subject of mass shootings — in Brazil, he had been talking about the children murdered at Sandy Hook Elementary School in 2012 — he talks about guns.
Yet, America had plenty of guns when its mass murder rate was much lower. Grant Duwe, a Ph.D. in criminology and director of research and evaluation at the Minnesota Department of Corrections, gathered data going back 100 years in his 2007 book, “Mass Murder in the United States: A History.”
Duwe’s data reveal:
In the 20th century, every decade before the 1970s had fewer than 10 mass public shootings. In the 1950s, for example, there was one mass shooting. And then a steep rise began. In the 1960s, there were six mass shootings. In the 1970s, the number rose to 13. In the 1980s, the number increased 2 1/2 times, to 32. And it rose again in the 1990s, to 42. As for this century, The New York Times reported in 2014 that, according to the FBI, “Mass shootings have risen drastically in the past half-dozen years.”
Given the same ubiquity of guns, wouldn’t the most productive question be what, if anything, has changed since the 1960s and ’70s? Of course it would. And a great deal has changed. America is much more ethnically diverse, much less religious. Boys have far fewer male role models in their lives. Fewer men marry, and normal boy behavior is largely held in contempt by their feminist teachers, principals and therapists. Do any or all of those factors matter more than the availability of guns?
Let’s briefly investigate each factor.
Regarding ethnic diversity, the countries that not only have the fewest mass murders but the lowest homicide rates as well are the least ethnically diverse — such as Japan and nearly all European countries. So, too, the American states that have homicide rates as low as Western European countries are the least ethnically and racially diverse (the four lowest are New Hampshire, North Dakota, Maine and Idaho). Now, America, being the most ethnically and racially diverse country in the world, could still have low homicide rates if a) Americans were Americanized, but the left has hyphenated — Balkanized, if you will — Americans, and b) most black males grew up with fathers.
Regarding religiosity, the left welcomes — indeed, seeks — the end of Christianity in America (though not of Islam, whose robustness it fosters). Why don’t we ask a simple question: What percentage of American murderers attend church each week?
Regarding boys’ need for fathers, in 2008, then-Sen. Obama told an audience: “Children who grow up without a father are five times more likely to live in poverty and commit crime; nine times more likely to drop out of schools; and 20 times more likely to end up in prison.”
Yet, the Times has published columns and “studies” showing how relatively unimportant fathers are, and more and more educated women believe this dangerous nonsense.
Then there is marriage: Nearly all men who murder are single. And their number is increasing.
Finally, since the 1960s, we have been living in a culture of grievance. Whereas in the past people generally understood that life is hard and/or that they have to work on themselves to improve their lives, for half a century the left has drummed into Americans’ minds the belief that their difficulties are caused by American society — in particular, its sexism, racism and patriarchy. And the more aggrieved people are, the more dulled their consciences become.
When you don’t ask intelligent questions, you cannot come up with intelligent answers. So, then, with regard to murder in America, until Americans stop allowing the left to ask the questions, we will have no intelligent answers.
By Dennis Prager and published on June 8, 2019 in The Daily Wire and can be seen here.
Critics think that the genre was an embarrassing dead end. So why do fans and musicians still love it?
In April, 1971, Rolling Stone reviewed the début album by a band with a name better suited to a law firm: Emerson, Lake & Palmer. The reviewer liked what he heard, although he couldn’t quite define it. “I suppose that your local newspaper might call it ‘jazz-influenced classical-rock,’ ” he wrote. In fact, a term was being adopted for this hybrid of highbrow and lowbrow. People called it progressive rock, or prog rock: a genre intent on proving that rock and roll didn’t have to be simple and silly—it could be complicated and silly instead. In the early nineteen-seventies, E.L.P., alongside several more or less like-minded British groups—King Crimson, Yes, and Genesis, as well as Jethro Tull and Pink Floyd—went, in the space of a few years, from curiosities to rock stars. This was especially true in America, where arenas filled up with crowds shouting for more, which was precisely what these bands were designed to deliver. The prog-rock pioneers embraced extravagance: odd instruments and fantastical lyrics, complex compositions and abstruse concept albums, flashy solos and flashier live shows. Concertgoers could savor a new electronic keyboard called a Mellotron, a singer dressed as a batlike alien commander, an allusion to a John Keats poem, and a philosophical allegory about humankind’s demise—all in a single song (“Watcher of the Skies,” by Genesis). In place of a guitarist, E.L.P. had Keith Emerson, a keyboard virtuoso who liked to wrestle with his customized Hammond organ onstage, and didn’t always win: during one particularly energetic performance, he was pinned beneath the massive instrument, and had to be rescued by roadies. Perhaps this, too, was an allegory.
Most of these musicians took seriously the “progressive” in “progressive rock,” and believed that they were helping to hurry along an ineluctable process: the development of rock music into what Jon Anderson, of Yes, once called “a higher art form.” Even more than most musicians, the prog rockers aimed for immortality. “We want our albums to last,” Robert Fripp, the austere guitar scientist behind King Crimson, said. In a literal sense, he got his wish: although the progressive-rock boom was effectively over by the end of the seventies, it left behind a vast quantity of surplus LPs, which filled the bins in used-record stores for decades. (Many people who have never heard this music would nonetheless recognize some of the album covers.) Progressive rock was repudiated by what came next: disco, punk, and the disco-punk genre known as New Wave. Unlike prog rock, this music was, respectively, danceable, concise, and catchy. In the story of popular music, as conventionally told, progressive rock was at best a dead end, and at worst an embarrassment, and a warning to future musical generations: don’t get carried away.
The genre’s bad reputation has been remarkably durable, even though its musical legacy keeps growing. Twenty years ago, Radiohead released “OK Computer,” a landmark album that was profoundly prog: grand and dystopian, with a lead single that was more than six minutes long. But when a reporter asked one of the members whether Radiohead had been influenced by Genesis and Pink Floyd, the answer was swift and categorical: “No. We all hate progressive rock music.”
It is common to read about some band that worked in obscurity, only to be discovered decades later. In the case of progressive rock, the sequence has unfolded in reverse: these bands were once celebrated, and then people began to reconsider. The collapse of prog helped reaffirm the dominant narrative of rock and roll: that pretension was the enemy; that virtuosity could be an impediment to honest self-expression; that “self-taught” was generally preferable to “classically trained.”
In the past twenty years, though, a number of critics and historians have argued that prog rock was more interesting and more thoughtful than the caricature would suggest. The latest is David Weigel, a savvy political reporter for the Washington Post who also happens to be an unabashed fan—or, more accurately, a semi-abashed fan. His new history of prog rock is called “The Show That Never Ends,” and it begins with its author embarking on a cruise for fans, starring some of the great prog-rock bands of yore, or what remains of them. “We are the most uncool people in Miami,” Weigel writes, “and we can hardly control our bliss.”
Almost no one hated progressive rock as much, or as memorably, as Lester Bangs, the dyspeptic critic who saw himself as a rock-and-roll warrior, doing battle against the forces of fussiness and phoniness. In 1974, he took in an E.L.P. performance and came away appalled by the arsenal of instruments (including “two Arthurian-table-sized gongs” and “the world’s first synthesized drum kits”), by Emerson’s preening performance, and by the band’s apparent determination to smarten up rock and roll by borrowing from more respectable sources. E.L.P. had reached the Top Ten, in both Britain and America, with a live album based on its bombastic rendition of Mussorgsky’s “Pictures at an Exhibition.” Bangs wanted to believe that the band members thought of themselves as vandals, gleefully desecrating the classics. Instead, Carl Palmer, the drummer, told him, “We hope, if anything, we’re encouraging the kids to listen to music that has more quality”—and “quality” was precisely the quality that Bangs loathed. He reported that the members of E.L.P. were soulless sellouts, participating in “the insidious befoulment of all that was gutter pure in rock.” Robert Christgau, the self-proclaimed “dean of American rock critics,” was, if anything, more dismissive: “These guys are as stupid as their most pretentious fans.”
The story of this reviled genre starts, though, with the most acclaimed popular music ever made. “If you don’t like progressive rock, blame it on the Beatles,” a philosophy professor named Bill Martin wrote, in his 1998 book, “Listening to the Future,” a wonderfully argumentative defense of the genre. Martin is, in his own estimation, “somewhat Marxist,” and he saw progressive rock as an “emancipatory and utopian” movement—not a betrayal of the sixties counterculture but an extension of it. Martin identified a musical “turning point” in 1966 and 1967, when the Beach Boys released “Pet Sounds” and the Beatles released “Sgt. Pepper’s Lonely Hearts Club Band,” which together inspired a generation of bands to create albums that were more unified in theme but more diverse in sound. Using orchestration and studio trickery, these albums summoned the immersive pleasure of watching a movie, rather than the kicky thrill of listening to the radio.
When bands set out to make hit albums, rather than hit singles, some of them abandoned short, sharp love songs and began to experiment with intricate compositions and mythopoetic lyrics. By the dawn of the seventies, the term “progressive rock” was being applied to a cohort of rock-and-roll groups that thought they might be outgrowing rock and roll. In 1973, Columbia Records released a double-album compilation called “The Progressives.” The liner notes informed listeners that “the boundaries between styles and categories continue to blur and disappear.”
But this inclusive musical movement was also, as Weigel emphasizes, a parochial one. “American and British youth music had grown together from the moment the Beatles landed at J.F.K.,” he writes. “In 1969, the two sounds finally started to grow apart.” Weigel quotes an interview with Lee Jackson, the lead singer of a British rock band called the Nice—Keith Emerson’s previous band. “The basic policy of the group is that we’re a European group,” Jackson said. “We’re not American Negroes, so we can’t really improvise and feel the way they can.” (Ironically, the Nice’s biggest hit was an instrumental version of Leonard Bernstein’s “America.”) In a thoughtful 2009 autobiography, Bill Bruford, a drummer who was central to the development of prog rock, noted that many of the music’s pioneers were “nice middle-class English boys,” singing songs that were “self-consciously British.” Genesis, for instance, was formed at Charterhouse, a venerable boarding school in Surrey; the band’s album “Selling England by the Pound” was an arch and whimsical meditation on national identity. Bruford pointed out that even Pink Floyd, known for free-form jam sessions and, later, cosmic rock epics, found time to record songs like “Grantchester Meadows,” a gentle ode to the East Anglian countryside.
In 1969, King Crimson, the most rigorous and avant-garde of the major prog bands, released what is now considered the genre’s first great album, a strange and menacing début called “In the Court of the Crimson King.” The album used precise dissonance and off-kilter rhythms to evoke in listeners a thrilling sensation of ignorance: you got the feeling that the musicians understood something you didn’t. At a career-making concert in Hyde Park, opening for the Rolling Stones, King Crimson played a ferocious set that ended with an acknowledgment of England’s musical heritage: a rendition of “Mars, the Bringer of War,” by the English composer Gustav Holst.
From the start, King Crimson was the kind of band that musicians love—as opposed, that is, to the kind of band that non-musicians love. (King Crimson never had a hit single, although “21st Century Schizoid Man,” the first song from its first album, served, in 2010, as the basis for “Power,” by Kanye West.) Bill Bruford, the drummer, was astonished by an early King Crimson performance, and resolved to make equally ambitious music with his own band, a sweetly melodic group called Yes. In its own way, Yes, too, was profoundly English—Jon Anderson, the lead singer, generally eschewed faux-American bluesiness, and the band instead deployed pleasing multipart harmonies that recall the choral tradition of the Anglican Church.
In 1971, Yes released an album called “Fragile,” which included a hummable—and very progressive—song called “Roundabout.” On the album, it lasted more than eight minutes, but unsentimental record executives trimmed it to three and a half, and the edited version found a home on U.S. radio stations. This music, so self-consciously English, sounded different in America, where its rather nerdy creators were greeted as exotic rock stars. That summer, Yes played its first U.S. concert, at an arena in Seattle. A fan who approached Jon Anderson before the show remembered that Anderson was nervous. “I don’t know what is going to happen,” the singer told him. “I’ve never been in a place like this.”
When Anderson sang, “I’ll be the roundabout,” most American listeners surely had no idea that he was referring to the kind of intersection known less euphoniously, in the U.S., as a traffic circle. (The song was inspired by the view from a van window.) Why, then, did this music seduce so many Americans? In 1997, a musician and scholar named Edward Macan published “Rocking the Classics,” in which he offered a provocative explanation. Noting that this artsy music seemed to attract “a greater proportion of blue-collar listeners” in the U.S. than it had in Britain, he proposed that the genre’s Britishness “provided a kind of surrogate ethnic identity to its young white audience”: white music for white people, at a time of growing white anxiety. Bill Martin, the quasi-Marxist, found Macan’s argument “troubling.” In his view, the kids in the bleachers were revolutionaries, drawn to the music because its sensibility, based on “radical spiritual traditions,” offered an alternative to “Western politics, economics, religion, and culture.”
The genre’s primary appeal, though, was not spiritual but technical. The musicians presented themselves as virtuosos, which made it easy for fans to feel like connoisseurs; this was avant-garde music that anyone could appreciate. (Pink Floyd might be the most popular prog-rock band of all time, but Martin argued that, because the members lacked sufficient “technical proficiency,” Pink Floyd was not really prog at all.) In some ways, E.L.P. was the quintessential prog band, dominated by Emerson’s ostentatious technique—he played as fast as he could, and sometimes, it seemed, faster—and given to grand, goofy gestures, like “Tarkus,” a twenty-minute suite that recounted the saga of a giant, weaponized armadillo. The members of E.L.P. betrayed no particular interest in songwriting; the group’s big hit, “Lucky Man,” was a fluke, based on something that Greg Lake wrote when he was twelve. It concluded with a wild electronic solo, played on a state-of-the-art Moog synthesizer, that Emerson considered embarrassingly primitive. An engineer had recorded Emerson warming up, and the rest of the band had to convince him not to replace his squiggles with something more precise—more impressive. In the effortful world of prog, there was not much room for charming naïveté or happy accidents; improvised solos were generally less important than composed instrumental passages.
The audience for this stuff was largely male—Bruford writes ruefully that, throughout his career, women “generally and rather stubbornly stayed away” from his performances. The singer-songwriter John Wesley Harding, an obsessive prog-rock fan, suggests that these musicians were “afraid of women,” and that they expressed this fear by shunning love songs. What they provided, instead, was spectacle. As the American crowds got bigger, the stages did, too, which meant more elaborate shows, which in turn drew more fans. Weigel notes that, in one tour program, the members of Genesis promised to “continually feed profits back into the stage show.” (At one point, the show included a stage-wide array of screens displaying a sequence of hundreds of images, and, for the lead singer, a rubbery, tumorous costume with inflatable testicles.) Yes toured with sets designed by Roger Dean, the artist who painted its extraterrestrial album covers. Dean’s innovations included enormous, sac-like pods from which the musicians could dramatically emerge. Inevitably, one of the pods eventually malfunctioned, trapping a musician inside and prefiguring a famous scene from “This Is Spinal Tap.” The competition among bands to create bigger and brighter spectacles was absurd but also irresistible, and quite possibly rational. American arena stages, like LPs, needed to be filled, and so these bands set out to fill them.
Weigel’s book has an unlikely flaw, given its subject: it is too short. Wary, perhaps, of taxing readers’ patience, he finishes his tour in three hundred pages, resisting what must have been an overwhelming urge to interrupt the narrative with disco-graphical digressions. Martin, less diffident, included in his book a list of sixty-two “essential” progressive-rock albums—partly to provide a shopping list for newcomers, and partly, one suspects, because he liked the idea of outraging hard-core fans with his omissions.
So what is the greatest progressive-rock album of all time? One perennial and deserving candidate is “Close to the Edge,” by Yes, from 1972, which consists of three long songs that are, by turns, gently pastoral and gloriously futuristic, responding to the genre’s contradictory impulses: to explore musical history and to leave it behind. Earlier this year, Will Romano published “Close to the Edge: How Yes’s Masterpiece Defined Prog Rock,” a frankly obsessive study that makes no pretense of levelheadedness. Romano notes that he listened to the album “easily over a thousand times” while working on the book, and, when he wonders about a “low pulse that pervades entire sections” of the title track, it seems possible that he has begun to hallucinate. He embarks upon a brave attempt to decode Anderson’s inane lyrics, provides an astute technical description of the way Steve Howe seems to play lead and rhythm guitar at the same time, and identifies the pivotal moment when Rick Wakeman, the keyboard player, met Denise Gandrup, a designer of sparkly capes, which became his signature.
Romano ends with a note of defiance, pointing out that Yes still hadn’t been accepted by the cultural élitists in charge of the Rock & Roll Hall of Fame. This spring, not long after the book’s publication, Yes was finally inducted—more than two decades after it became eligible. And yet Romano is right: there is something inspiring about the indigestibility of prog, which still hasn’t quite been absorbed into the canon of critically beloved rock and roll, and which therefore retains some of its outsider appeal. Often, we celebrate bygone bands for being influential, hearing in them the seeds of the new; the best prog provides, instead, the shock of the old.
Listeners who wonder what they have been missing should probably ignore E.L.P. entirely and head straight for “Close to the Edge”—or, if they want something a bit more bruising, “Red,” an austere album that a new version of King Crimson (including Bruford) released in 1974. One of the most underappreciated progressive-rock groups was Gentle Giant, but there was a reason for this neglect: none of the band members happened to be a great singer. So they used interlocking instrumental lines, shifting time signatures, and close harmonies to construct songs that seemed to occupy some phantom limb of music’s evolutionary tree.
Gentle Giant was one of the bands featured on “The Progressives,” the Columbia Records compilation, which turned out to have a hidden agenda: it was, in large part, a jazz album, seemingly designed to help prog fans develop a taste for Ornette Coleman, Charles Mingus, and Mahavishnu Orchestra. Jazz played an important but disputed role in the story of progressive rock. While some British bands were trying to turn inward, away from American influences, others were finding ways to forge new ties between rock and jazz. Indeed, Mahavishnu Orchestra, a jazz-fusion group led by the English guitarist John McLaughlin (who previously played with Miles Davis), is sometimes considered an honorary prog band—at the time, the distinctions between these genres could be hazy. And in Canterbury, in the southeast of England, a cluster of interconnected bands created their own jazz-inflected hybrids: Soft Machine, Matching Mole, Hatfield & the North. These are the bands most likely to charm—and perhaps convert—listeners who think that they hate progressive rock. Unlike the swashbucklers who conquered arenas, the Canterburians were cheerfully unheroic, pairing adventurous playing with shrugging, self-deprecating lyrics about nothing much. (One Hatfield & the North song goes, “Thank all the mothers who made cups of tea. / If they didn’t care for us, we wouldn’t be / here to sing our songs and entertain. / Plug us in and turn on the mains!”) This is music animated by a spirit of playful exploration—recognizably progressive, you might say, though not terribly prog.
The question of progress bedevilled many of the prog bands: the ethos, which implied constant transformation, was at odds with the sound, which was identifiable, and therefore stuck. Robert Fripp solved this problem by disbanding King Crimson just as “Red” was being released. “The band ceased to exist in 1974, which was when all English bands in that genre should have ceased to exist,” he said later. Once some album-side-long songs had been recorded, and some snippets of classical music appropriated, it was not obvious how further progress might be made, especially since the bands now had big crowds to please. In 1978, E.L.P. released an infamous album called “Love Beach,” which was recorded in the Bahamas, and whose cover depicted something less enticing than a battle-ready armadillo: the three grinning band members, displaying white teeth and varying amounts of chest hair.
Progressive rock was a stubborn genre, and yet a number of its adepts proved to be surprisingly flexible; it turned out that their considerable musical skill could be put to new uses. In 1980, Steve Howe, the guitarist from Yes, told the Los Angeles Times that his band had been “modernized” and simplified. “Whatever’s been leveled at us in the past, we want to be re-judged,” he said. This kind of desperate ploy isn’t supposed to work, but it did: in 1983, Yes topped the American pop chart with “Owner of a Lonely Heart,” which barely sounded like it had come from the same band. A new group called Asia, made up of refugees from Yes, King Crimson, and E.L.P., released an album that reached No. 1 on the American chart. Genesis did something even more impressive, transforming into a Top Forty band while spawning three successful solo careers. The singer, Peter Gabriel, became a pop star, and so did the drummer, Phil Collins, and the bassist, Mike Rutherford, who led Mike + the Mechanics. For a few of the genre’s biggest stars, the music industry offered an attractive bargain: leave prog behind and you can be bigger than ever.
Some true believers remained, of course. In the seventies, prog-inspired American bands like Kansas and Styx had conquered arenas, and by the end of the decade there was Rush, a Yes-obsessed trio of Canadians who received even worse reviews than their British forebears. One reason was their avowed love of Ayn Rand; an influential and absurd review in New Musical Express, a British magazine, accused them of preaching “proto-fascism.” Another reason was that, by the late seventies, progressive rock was about the most unhip music in existence. “The fans showing up to hear Rush were the wrong kind of fans—the mockable ones, with mockable taste in music,” Weigel writes, holding up this judgment for ridicule without quite dissenting from it. (No doubt he was sorely tempted to use the term “deplorables.”) By the time Rush emerged, progressive rock had entered its never-ending defensive phase; uncoolness is now part of the genre’s identity, and even a devoted fan like Weigel may not be entirely sure whether he wants that to change.
Progressive rock, broadly defined, can never disappear, because there will always be musicians who want to experiment with long songs, big concepts, complex structures, and fantastical lyrics. You can hear a trace of the genre in the fearless compositions of Joanna Newsom or, equally, in “Pyramids,” an epic Frank Ocean slow jam that blends Afrocentric mythology with a narrative about sex work. At Coachella this year, one of the breakout stars was Hans Zimmer, the German composer, who performed excerpts from his film scores with an orchestra and a rock band. (Anyone who cheered him on has forever lost the right to make snarky jokes about bands like Yes.) Plenty of revivalist bands play what might, paradoxically, be called retro-prog. And there have been latter-day innovators. Tool emerged, a quarter century ago, as an awesome new kind of prog band: precise but unremittingly heavy, all rumbles and hums. In Sweden, Meshuggah, in the nineties, built roaring, ferocious songs atop fiendish riffs in prime-number time signatures; Opeth, in the aughts, found a connection between death-metal fury and Pink Floydian reverie.
What can disappear—what long ago disappeared, in fact, at least among rock bands—is the ideology of progress in pop music: the optimistic sense, shared by all those early-seventies pioneers, that the form was evolving and improving, and that prog rock offered a sneak peek at our future. The bands thought that the arc of the musical universe bent toward keyboard solos. This is part of what drove Lester Bangs crazy—he couldn’t understand why these musicians thought they had improved upon old-fashioned rock and roll. But contemporary listeners might find the genre’s optimistic spirit more exotic, and therefore more endearing, than it once seemed. Of course, prog rock was not the future—at least, not more than anything else was. Nowadays, it seems clear that rock history is not linear but cyclical. There is no grand evolution, just an endless process of rediscovery and reappraisal, as various styles and poses go in and out of fashion. We no longer, many of us, believe in the idea of musical progress. All the more reason, perhaps, to savor the music of those who did.
By Kelefa Sanneh and published on June 12, 2017 in The New Yorker and can be found here.
Dr. Paul R. McHugh, the Distinguished Service Professor of Psychiatry at Johns Hopkins University and former psychiatrist-in-chief for Johns Hopkins Hospital, who has studied transgendered people for 40 years, said it is a scientific fact that “transgendered men do not become women, nor do transgendered women become men.”
All such people, he explained in an article for The Witherspoon Institute, “become feminized men or masculinized women, counterfeits or impersonators of the sex with which they ‘identify.’”
Dr. McHugh, who was psychiatrist-in-chief for 26 years at Johns Hopkins Hospital – the medical institution that initially pioneered sex-change surgery and later ceased the practice – stressed that the cultural meme, or idea, that “one’s sex is fluid and a matter of choice” is extremely damaging, especially to young people.
The idea that one’s sexuality is a feeling and not a biological fact “is doing much damage to families, adolescents, and children and should be confronted as an opinion without biological foundation wherever it emerges,” said Dr. McHugh in his article, Transgenderism: A Pathogenic Meme.
“I am ever trying to be the boy among the bystanders who points to what’s real,” said Dr. McHugh, who is also professor of Psychiatry and Behavioral Sciences at Johns Hopkins. “I do so not only because truth matters, but also because overlooked amid the hoopla—enhanced now by Bruce Jenner’s celebrity and Annie Leibovitz’s photography—stand many victims.”
“Think, for example, of the parents whom no one—not doctors, schools, nor even churches—will help to rescue their children from these strange notions of being transgendered and the problematic lives these notions herald,” warned McHugh.
They rarely find therapists who are willing to help them “work out their conflicts and correct their assumptions,” said McHugh. “Rather, they and their families find only ‘gender counselors’ who encourage them in their sexual misassumptions.”
In addition, he said, “both the state and federal governments are actively seeking to block any treatments that can be construed as challenging the assumptions and choices of transgendered youngsters.”
“As part of our dedication to protecting America’s youth, this administration supports efforts to ban the use of conversion therapy for minors,” said Valerie Jarrett, a senior advisor to President Obama, as quoted by Dr. McHugh in his article.
However, there is plenty of evidence showing that “transgendering” is a “psychological rather than a biological matter,” said Dr. McHugh.
“In fact, gender dysphoria—the official psychiatric term for feeling oneself to be of the opposite sex—belongs in the family of similarly disordered assumptions about the body, such as anorexia nervosa and body dysmorphic disorder,” said McHugh.
“Its treatment should not be directed at the body as with surgery and hormones any more than one treats obesity-fearing anorexic patients with liposuction,” he said.
In fact, at Johns Hopkins, where they pioneered sex-change surgery, “we demonstrated that the practice brought no important benefits,” said Dr. McHugh. “As a result, we stopped offering that form of treatment in the 1970s.”
In recent years, though, the notion that one’s sex is fluid has flooded the culture. It is “reflected everywhere in the media, the theater, the classroom, and in many medical clinics,” said McHugh.
It is biologically false that one can change one’s sex, explained McHugh.
“Transgendered men do not become women, nor do transgendered women become men,” he said. “All (including Bruce Jenner) become feminized men or masculinized women, counterfeits or impersonators of the sex with which they ‘identify.’ In that lies their problematic future.”
When “the tumult and shouting dies,” McHugh continued, “it proves not easy nor wise to live in a counterfeit sexual garb. The most thorough follow-up of sex-reassigned people—extending over 30 years and conducted in Sweden, where the culture is strongly supportive of the transgendered—documents their lifelong mental unrest.”
“Ten to 15 years after surgical reassignment, the suicide rate of those who had undergone sex-reassignment surgery rose to 20 times that of comparable peers,” said McHugh.
Nonetheless, the false “assumption that one’s sexual nature is misaligned with one’s biological sex” can be treated with therapy and medication, said McHugh.
He further stressed that, “What is needed now is public clamor for coherent science—biological and therapeutic science—examining the real effects of these efforts to ‘support’ transgendering.”
“But gird your loins if you would confront this matter,” warned Dr. McHugh. “Hell hath no fury like a vested interest masquerading as a moral principle.”
Dr. McHugh’s article, Transgenderism: A Pathogenic Meme, can be read in full at the website of The Witherspoon Institute.
By: Michael W. Chapman and published on May 5, 2016 in CNS News and can be seen here.
By Glenn T. Stanton and published on January 22, 2018 in The Federalist and can be found here.
Are churchgoing and religious adherence really in ‘widespread decline,’ so much so that conservative believers should suffer ‘growing anxiety’? Absolutely not.
“Meanwhile, a widespread decline in churchgoing and religious affiliation had contributed to a growing anxiety among conservative believers.” Statements like this are uttered with such confidence and frequency that most Americans accept them as uncontested truisms. This one emerged just this month in an exceedingly silly article in The Atlantic on Vice President Mike Pence.
Religious faith in America is going the way of the Yellow Pages and travel maps, we keep hearing. It’s just a matter of time until Christianity’s total and happy extinction, chortle our cultural elites. Is this true? Are churchgoing and religious adherence really in “widespread decline,” so much so that conservative believers should suffer “growing anxiety”?
Two words: Absolutely not.
New research published late last year by scholars at Harvard University and Indiana University Bloomington is just the latest to expose the myth. This research questioned the “secularization thesis,” which holds that the United States is following most advanced industrial nations into the death of their once vibrant faith cultures. Churches becoming mere landmarks, dance halls, boutique hotels, museums, and all that.
Not only did their examination find no support for this secularization in terms of actual practice and belief, but the researchers also proclaim that religion continues to enjoy “persistent and exceptional intensity” in America. These researchers hold that our nation “remains an exceptional outlier and potential counter example to the secularization thesis.”
What Accounts for the Difference in Perceptions?
How can their findings appear so contrary to what we have been hearing from so many seemingly informed voices? It comes down primarily to what kind of faith one is talking about. Not the belief system itself, per se, but the intensity and seriousness with which people hold and practice that faith.
Mainline churches are tanking as if they have super-sized millstones around their necks. Yes, these churches are hemorrhaging members in startling numbers, but many of those folks are not leaving Christianity. They are simply going elsewhere. Because of this shifting, other very different kinds of churches are holding strong in crowds and have been for as long as such data has been collected. In some ways, they are even growing. This is what this new research has found.
The percentage of Americans who attend church more than once a week, pray daily, and accept the Bible as wholly reliable and deeply instructive to their lives has remained absolutely, steel-bar constant for the last 50 years or more, right up to today. These authors describe this continuity as “patently persistent.”
The percentage of such people is also not small. One in three Americans prays multiple times a day, while in other countries, on average, one in 15 does. Attendance at services more than once a week continues to be twice as high among Americans as in the next highest-attending industrial country, and three times higher than in the average comparable nation.
One-third of Americans hold that the Bible is the actual word of God. Fewer than 10 percent believe so in similar countries. The United States “clearly stands out as exceptional,” and this exceptionalism has not been decreasing over time. In fact, these scholars determine that the percentages of Americans who are the most vibrant and serious in their faith is actually increasing a bit, “which is making the United States even more exceptional over time.”
This also means, of course, that those who take their faith seriously are becoming a markedly larger proportion of all religious people. In 1989, 39 percent of those who belonged to a religion held strong beliefs and practices. Today, these are 47 percent of all the religiously affiliated. This all has important implications for politics, indicating that the voting bloc of religious conservatives is not shrinking, but actually growing among the faithful. The declining influence of liberal believers at the polls has been demonstrated in many important elections recently.
These Are Not Isolated Findings
The findings of these scholars are not outliers. There has been a growing gulf between the faithful and the dabblers for quite some time, with the first group growing more numerous. Think about the church you attend, relative to its belief system. It is extremely likely that if your church teaches the Bible with seriousness, calls its people to real discipleship, and encourages daily intimacy with God, it has multiple services to handle the coming crowds.
Most decent-size American cities have a treasure trove of such churches for believers to choose from. This shows no sign of changing. If, however, your church is theologically liberal or merely lukewarm, it’s likely laying off staff and wondering how to pay this month’s light bill. People are navigating toward substantive Christianity.
The folks at Pew have been reporting for years that while the mainline churches are in drastic free fall, the group that “shows the most significant growth is the nondenominational family.” Of course, these nondenominational churches are 99.9 percent full-blooded evangelical. Pew also notes that “evangelical Protestantism and the historically black Protestant tradition have been more stable” over the years, with even a slight uptick in the last decade, because many congregants leaving the mainline churches are migrating to evangelical churches that hold fast to the fundamentals of the Christian faith.
When the so-called “progressive” churches question the historicity of Jesus, deny the reality of sin, support abortion, ordain clergy in same-sex relationships and perform their marriages, people desiring real Christianity head elsewhere. Fact: evangelical churches gain five new congregants exiled from the liberal churches for every one they lose for any reason. They also do a better job of retaining believers from childhood to adulthood than do mainline churches.
The Other Key Factor: Faithful People Grow More Children
There is another factor at work here beyond orthodox belief. The University of London’s Eric Kaufmann explains in his important book “Shall the Religious Inherit the Earth?” (he says yes) that the sustaining vitality, and even significant per capita growth, of serious Christian belief is as firmly rooted in fertility as it is in faithful teaching and evangelism. Globally, he says that the more robust baby-making practices of orthodox Jews and Christians, as opposed to the baby-limiting practices of liberals, create many more seriously religious people than a secular agenda can keep up with.
Fertility determines who influences the future in many important ways. He puts it bluntly, “The secular West and East Asia are aging and their share of the world population declining. This means the world is getting more religious even as people in the rich world shed their faith.”
Fertility is as important as fidelity for Christianity and Judaism’s triumph from generation to generation. Kaufmann contends, “Put high fertility and [faith] retention rates together with general population decline and you have a potent formula for change.”
It comes down to this: God laughs at the social Darwinists. Their theory is absolutely true, but just not in the way they think. Those who have the babies and raise and educate them well tend to direct the future of humanity. Serious Christians are doing this. Those redefining the faith and reality itself are not.
This is why Orthodox theologian David Bentley Hart proclaimed in First Things, long before the proposal of the Benedict Option, that the most “subversive and effective strategy we might undertake [to counter the culture] would be one of militant fecundity: abundant, relentless, exuberant, and defiant childbearing.” The future rests in the hands of the fertile.
What About All the Millennial Ex-Christians?
But what about our young people? We are constantly hearing that young people are “leaving the church in droves,” followed by wildly disturbing statistics. This also requires a closer look at who is actually leaving and from where. Pew reports that of young adults who left their faith, only 11 percent said they had a strong faith in childhood while 89 percent said they came from a home that had a very weak faith in belief and practice.
It’s not a news flash that kids don’t tend to hang onto what they never had in the first place. Leading sociologist of religion Christian Smith has found through his work that most emerging adults “report little change in how religious they have been in the previous five years.” Surprisingly, he also found that those who do report a change say they have become more religious, not less. This certainly does not mean there is a major revival going on among young adults, but neither does it mean the sky is falling.
Add to this Rodney Stark’s warning that we should not confuse leaving the faith with attending less often. He and other scholars report that young adults begin to attend church less often in their “independent years” and have always done so for as long back as such data has been collected. It’s part of the nature of emerging adulthood. Just as sure as these young people do other things on Sunday morning, the leading sociologists of religion find they return to church when they get married, have children, and start to live a real adult life. It’s like clockwork and always has been. However, the increasing delay among young adults in entering marriage and family is likely lengthening this gap today.
More Americans Attend Church Now Than At the Founding
What is really counter-intuitive is what Stark and his colleagues at the Baylor Institute for Studies of Religion found when looking at U.S. church attendance numbers going back to the days of our nation’s founding. They found that the percentage of church-attending Americans relative to overall population is more than four times greater today than it was in 1776. The number of attendees has continued to rise each and every decade over our nation’s history right up until the present day.
People are making theological statements with their feet, shuffling to certain churches because they offer what people come seeking: clear, faithful, practical teaching of the scriptures, help in living intimately with and obediently to God, and making friends with people who will challenge and encourage them in their faith. To paraphrase the great Southern novelist Flannery O’Connor, if your church isn’t going to believe and practice actual Christianity, then “to hell with it.” This is what people are saying with their choices.
Or as Eric Kaufmann asserts, “Once secularism rears its head and fundamentalism responds with a clear alternative, moderate religion strikes many as redundant. Either you believe the stuff or you don’t. If you do, it makes sense to go for the real thing, which takes a firm stand against godlessness.”
If your Christianity is reconstituted to the day’s fashion, don’t be surprised if people lose interest in it. Few are seeking 2 Percent Christianity. They want the genuine deal, and the demographics on religion of the last few decades unmistakably support the fact.
Don’t get married before you live together. You just never know what the other person will be like to live with, and you need to figure that out before marriage.
Definitely don’t tie the knot until you’ve traveled together. You absolutely have to find out if your future spouse is a good travel companion – what if you get married and then learn they cry during turbulence but not The Notebook? If you don’t have money to travel, simply hitchhike together and see how your partner reacts when they have a gun to their head.
You need to make sure your marriage can withstand major life changes, so don’t get married until one of you has been fired from their job. If you like your job, then plan to marry someone who’s bad at theirs. Or intentionally sabotage them by hacking into their work email and sending nudes to their boss. If you’re uncomfortable sending your partner’s nudes, send your own. Marriage is about compromise.
Don’t get married before you’ve had children together. Seeing what the other person is like as a parent is key to determining if they’re the right person for you. If it turns out they’re a completely negligent parent, at least you know before you do something extreme like buying an expensive white dress. Throw the tester-baby out the window and call it a day.
You’ll be humiliated if your loved ones travel dozens of miles to see your nuptials and then you get divorced a mere two decades later, so don’t get married until one of you has completely altered their physical appearance. You want to make sure this marriage is about love, not just physical attraction and his trust fund. Shave your head, gain weight, cut your nose off, stay inside for 6 years — I really don’t care. Just do it.
What if weddings just aren’t for you? You need to find that out before you marry your partner, so don’t get married until you’ve married someone else first. Please consider me for the role of your starter bride. And then don’t get divorced — it’s a turn off to future lovers.
Don’t join in holy union until you’ve turned 25 together. It’s imperative that you’ve seen your partner hit the quarter-century mark and come out the other end. Ideally, this will happen to you on exactly the same day (if you’re a twin, you’re in luck!). You have no idea how many good relationships have failed just because one of them hit the wrong side of 25 and the other couldn’t handle it. If you meet after age 25, don’t get married. If you’re single at 25, sorry, but society warned you — excessively.
And don’t get married before you’ve both gone through periods of extreme depression. If your other half isn’t a naturally depressed person, try to induce it by depriving them of food and sleep. You need to know what they’re like when they’re nearly suicidal before your parents drop $60,000 on a wedding.
Are you “in love” enough to go visit them while they’re serving a life sentence in jail? You must find this out before you throw your whole life away! So instead, throw your neighbor Jeanine’s life away by murdering her, and then wait and find out if your LOML comes to see you every weekend. If so, you can walk down the aisle. Or walk a lap with the other prisoners. Orange is the new White.
Don’t get married before you’ve watched the other person die. Honestly, that’s a really traumatic event in a marriage, and you want to make sure you can handle it before you commit to spending your whole life together. Bravely volunteer yourself as the one to watch your partner meet their maker. This is usually the step where couples realize it’s not going to work out, so make sure not to skip it.
By Ginny Hogan
Published on February 13, 2019, in McSweeney’s; it can be found here.
It would be rather simple to write a series of articles discussing the positive and negative aspects of video games or commenting on the coolest graphics and best storylines. But such a set of articles could not genuinely be called “Catholic.” Something that is “Catholic” considers the whole of things, meaning it does not interpret reality piecemeal or as a set of facts in isolation. The Catholic thinker is someone who contemplates, discusses, and writes radically (from the Latin radix, “at the root of things”), seeing reality as it is in its entirety, seeing a thing as it fits within the entire framework of existence. Thus, before we begin a dialogue about specific video games, we must first situate the topic within the context of civilization as a whole. We must go to the origin of this phenomenon and ask why it has taken the world by storm. The question, therefore, is: “Why are video games?”
Video games are first and foremost an expression of contemporary culture. A brief study in etymology will clarify the point. The word “culture” comes from the Proto-Indo-European root kwelə, meaning “to revolve,” “sojourn,” or “dwell.” This later evolved into the Latin word incola, “someone who inhabits/dwells” in a certain area. The activity by which an incola cares for his or her sustenance is named by the verb colere, “to cultivate/till” the earth. Colere is also a word of self-awareness, a recognition of humanity’s capacity for agriculture, construction, and landscaping. The human being is not like other creatures; humans can interact and cooperate with the world around them in a dramatic way. One has only to recall the great edifices of Giza, Athens, and Rome for proof.
Even amidst their achievements, however, ancient people were mindful of mystery. They sensed that at the deepest core of reality, the world is given to man, not made by him. It is simultaneously something for us to subdue and something beyond us to be wondered at. The ancients’ realization of this fact led to the development of the verb colere into the noun cultura (culture), denoting an “acknowledgment” or “honoring” of those things which are essential to a community’s livelihood yet not under its immediate control. One could till the soil and plant the seed (colere), but ultimately it was the cosmic work of Renenutet, Demeter, or Ceres to provide for its growth and harvest (cultura). By studying this etymological and historical relationship between colere (to cultivate) and cultura (cult/culture), we can come to a better appreciation of “culture” in the proper sense. Culture properly defined represents a claim about the human person’s role in the infrastructure of the world; it is the fruit of seeing where one truly stands in the grand scheme of things; it is the expression of a person’s understanding of reality and their relationship to the order of the universe.
In light of the above reflection, let us return to our original question, “Why are video games?” Everything in a civilization is directly influenced by culture: language, food, clothing, music, inventions, architecture, etc. Each of these is a tangible manifestation of a metaphysical presupposition. In other words, the stuff we say, how we say it, what we wear when we say it, and the design of the building we say it in…all these things come from the same place. They are the fruits of culture, the consequences of a philosophical judgment made by society about the essence of reality. Video games are no different. As a matter of fact, I see video games as an apex expression of our postmodern technological culture. More than any other media, video games respond to and affirm the keystone assertion of our civilization: reality is what I make of it. The following quote from Shigeru Miyamoto (the famous creator of Mario, The Legend of Zelda, Star Fox, F-Zero, Donkey Kong, and Pikmin) summarizes the point lucidly: “Players [Gamers] are artists who create their own reality within the game.”
As such, video games have become a fascinating place to watch people recognize and deal with the fallout of postmodernity. The virtual world is a seemingly limitless medium in which gamers can experience, suffer, respond to, and escape the egoism, relativism, atheism, and mechanism of culture. I recall one person on YouTube who posted beneath a video game soundtrack: “This Soundtrack, this Game…it feels like a therapy. Especially when you feel down it feels like every sound, every movement you make, everything you can see is there to heal your wounds, your soul…I really love it…” This comment is a perfect example of what we have been discussing.
On the one hand, video games make clear where our culture has failed, where we as a people have lost the language, skill, and discernment to engage the deepest and most vital facets of our being. On the other hand, video games are a rich mine in which to excavate the needs of our people so as to reintroduce basic human qualities and reignite the divine spark of a sedated society.
In the end, what we millennials and post-millennials want is the real world, not the artificial world. Our wanderings in the lands of Minecraft and the mountains of Skyrim are a crying out for reality, not a rejection of it. We long to witness the breathtaking beauty of creation, soar to the heights of authentic heroism, and experience the life-giving dynamism of true freedom. “We want reality!” This is the rallying cry of our generation. Unfortunately, many of us are convinced that it no longer exists. So we seek in the virtual world what we wish existed in the real world. The world outside our suburban home or terraced row-house is a cold, uninviting place flanked on all sides by the ravenous beast of materialistic industrialism and the constant noise of the machine. We sympathize with Romano Guardini when he first saw the decrepit smokestack of a modern factory disrupting the flawless majesty of Lake Como, Italy. At that moment, he knew the “world of natural humanity, of nature in which humanity dwells, was perishing” (Romano Guardini, Letters from Lake Como). A world of money, flashing billboards, and high-rise corporations is nothing compared to the peaceful islands of Uncharted 4 or the awe-inspiring scenery of Final Fantasy X.
Besides, why should we participate in the “real world” when all it seems to offer is passing fads, superficial pleasures, and relativistic opinions? We would rather save a magical kingdom, run through endless leagues of virtual pristine forests, or complete a daring mission to gain XP for our avatars. At least then we can feel like we have purpose; we can feel like we have the opportunity to achieve greatness and see a world left better by our living in it.
Show us something beautiful. Prove to us that the world outside our game room can be as inspiring, challenging, and fulfilling as the world within our game consoles. If you can do that, then you will awaken the hearts of millions and summon a generation of men and women ready to complete the greatest quest of all time: the quest for holiness and sainthood in Jesus Christ.
To understand how American politics got the way it is today, it helps to rewind the tape to the presidential campaign of John McCain—specifically to his effort to win back a listless crowd at an otherwise forgettable campaign event in south-central Pennsylvania in the summer of 2008. The Republican nominee had opened by promising a country-over-party approach to politics, recalling his compromises with Democrats like Ted Kennedy: “We’ll have our disagreements, but we’ve got to be respectful.” The Republican crowd sat in silence. McCain then denounced Vladimir Putin’s incursion into independent Georgia, warning that “history is often made in remote, obscure places.” No one seemed interested in that particular remote and obscure place.
McCain just couldn’t connect with the crowd, until he unleashed a garbled riff about how Congress shouldn’t be on recess when gasoline prices were soaring. “My friends,” he said, “the message we want to send to Washington, D.C. is: ‘Come back off your vacation, go back to Washington, fix our energy problems, and drill and drill now, drill offshore and drill now!’” It lacked the poetic brevity of the “Drill, baby, drill” line his future running mate, Sarah Palin, would use to fire up crowds, but the York Expo Center suddenly erupted with raucous cheers. It felt visceral, almost violent, as if McCain had given his supporters permission to drill someone they hated. McCain flashed an uneasy grin, like a kid who had just set off his first firecracker, delighted but also a bit frightened by its power. He wasn’t really a drill-baby-drill politician, but he could sense his party drifting toward drill-baby-drill politics.
The Republicans clamoring for drilling that day in Pennsylvania weren’t reacting to the science of global warming or the economics of petroleum or the geopolitics of energy policy. They loved the idea of drilling now, and drilling everywhere, because their political enemies hated it. They were enjoying the primal experience of owning the libs, lashing out at the smug Democratic hippies who wanted to take away their SUVs and guns and Big Gulps. Oil exploration is a complex issue, but in the arena it was just another blunt-force weapon in a simple culture war.
A decade later, McCain is dead, bipartisanship is just about dead—his funeral felt like the rare exception that proved the rule—and the leader of the Republican Party is a world-class polarizer who mocked McCain’s service while cozying up to Putin on his way to the White House. President Donald Trump has pioneered a new politics of perpetual culture war, relentlessly rallying his supporters against kneeling black athletes, undocumented Latino immigrants and soft-on-crime, weak-on-the-border Democrats. He reverses the traditional relationship between politics and governance, weaponizing policy to mobilize his base rather than mobilizing his base to change policy. And in the Trump era, just about every policy issue is a wedge issue, not only traditional us-against-them social litmus tests like abortion, guns, feminism and affirmative action, or even just the president’s pet issues of immigration and trade, which he has wielded as cultural cudgels to portray Americans as victims of foreign exploiters. These days, even climate change, infrastructure policy and other domestic issues normally associated with wonky panels at Washington think tanks have been repackaged into cultural-resentment fodder.
At a time when Blue and Red America have split into two warring tribes inhabiting two separate realities, and “debate” has been redefined to evoke split-screen cable-news screamfests, this ferocious politicization of everything might seem obvious and unavoidable. But it’s also dangerous. It’s as if the rowdy cultural slap-fight the kids were having in the back seat has moved into the front, threatening to swerve the national car off the road. Transforming difficult analytical questions into knee-jerk emotional battlegrounds will dramatically increase the danger that thoughtless short-term choices will throw off our long-term national trajectory. And even beyond the impact on the quality of our public policy decisions, the ferocious politicization of everything is not healthy for the American body politic, which is why a Russian troll farm used fake social media accounts to gin up protests and counterprotests about hot-button issues like police shootings and Trump’s border wall. Our foreign adversaries like it when we yell at one another.
Honestly, though, we don’t need much prodding. Democrats and Republicans are increasingly self-segregated and mutually disdainful, each camp deploying the furious language of victimhood to justify its fear and loathing of the gullible deplorables in the other. One side boycotts Chick-fil-A (over gay rights), Walmart (over sweatshops) and companies that do business with the National Rifle Association, while the other boycotts Nike (over Colin Kaepernick), Starbucks (over refugees, gay marriage and non-Christmas-specific holiday cups) and companies that stop doing business with the NRA. We live in an era of performative umbrage. Every day is Festivus, a ritual airing of our grievances about Kathy Griffin, Roseanne Barr, fake news, toxic masculinity and those fancy coffee machines that Sean Hannity’s viewers decided to destroy for some reason. Every decision about where to shop or what to drive or what to watch is now an opportunity to express our political identities. The 24-hour news cycle has become a never-ending national referendum on Trump.
Politically, it makes sense that debates over highly technical challenges like energy and climate change have been transformed into shirts-and-skins identity issues. Ron DeSantis, the Trump-loving Republican former congressman running for governor of Florida, recently proclaimed that he’s “not in the pews of the Church of Global Warming Leftists,” a very 2018 way of expressing opposition to carbon regulations, renewable energy subsidies and other forms of climate action. He wasn’t disputing that the planet is getting hotter, or questioning the scientific data on the dangers of fossil fuels. He was clarifying which team he’s on, and more specifically which team he isn’t on, the team of tree-hugging scolds who look down on ordinary Americans for eating bacon and using plastic straws. You can see that sentiment expressed in less genteel ways if you search YouTube for “rolling coal,” where pollution-porn videos flaunt diesel trucks (sometimes dubbed “Prius repellents”) retrofitted to spew thick clouds of black smoke into the air, the transportation version of a middle finger to the opposing tribe. And there’s no denying that the opposing tribe of conspicuous composters and recyclers and Tesla drivers have their own identitarian rituals that pointedly broadcast their wokeness.
But while DeSantis may win points with his base by distancing himself from the Church of Global Warming Leftists, just as Trump does by dismissing global warming as a hoax manufactured in China, global warming is real, no matter who belongs to its church. Greenhouse gases don’t care whether they’re a wedge issue. Culture-war politics are often a crutch, a look-at-the-shiny-ball distraction, an easy way to shift complicated policy debates from inconvenient facts to emotion and identity.
As long as America keeps sorting itself into two factions divided by geography, ethnicity and ideology, pitting a multiracial team of progressives who live in cities and inner-ring suburbs against a white team of conservatives who live in exurbs and rural areas, this is what debates about public policy—or for that matter about the FBI, the dictator of North Korea and the credibility of various sexual assault allegations—will look like. We will twist the facts into our partisan narratives. The self-inflicted wounds will infect more and more of our lives. And if you want something else to worry about, consider where it might be spreading next.
Politics has always been adversarial. Traditionally, though, we’ve had a fairly robust national consensus about a fairly broad set of goals—a strong defense, a decent safety net, freedom from excessive government interference—even though we’ve squabbled over how to achieve them. What’s different about drill-baby-drill politics is the transformation of even nonpartisan issues into mad-as-hell battles of the bases, which makes it virtually impossible for politicians to solve problems in a two-party system. Cooperation and compromise start to look like capitulation, or even treasonous collusion with the enemy.
Take infrastructure spending, which was once reasonably uncontroversial, at least in principle. Today, many conservatives portray it as a liberal plot to siphon rural tax dollars into urban bike paths, subways, and high-speed rail boondoggles that unions will build and Democratic city slickers will use. The Trump administration actually changed the rules of the most prominent grant program for local transportation projects so that it explicitly favors rural projects, infuriating liberals who now see it as a slush fund for sprawl roads to nowhere serving out-in-the-boonies Trump voters. The war over Obamacare has a similar mine-versus-yours feel; many Republicans see it as a scheme to redistribute tax dollars (and the hard-earned Medicare benefits of older Americans) to lazy and entitled Barack Obama voters, while Democrats see the intense opposition to universal health care as generational warfare on behalf of the aging white GOP base.
Trump has never shown much interest in the details of policy, but he does understand how to use the levers of government to reward his allies and punish his enemies. He froze the pay of federal employees, a key Democratic constituency, while approving a $12 billion bailout for farmers, who, like other industries, have taken a hit from his trade wars, but, unlike other industries, tend to vote as a Republican bloc. Trump’s tax bill hammered blue states by reining in deductions for state and local taxes, while his energy policies have provided relief to red states that rely heavily on fossil fuels. His administration has picked fights with California, the epicenter of coastal-elite Blue America, over fuel-efficiency standards, net neutrality and water policy.
Policy skirmishes tend to metastasize into cultural battles when they involve identity issues, and after spending time on the campaign trail recently, I got the sense the next big Republican culture war will be a war on college. For generations, the notion of higher education as a ladder of opportunity for everyone has been an anodyne nonpartisan talking point, even if Democrats and Republicans disagreed on the appropriate levels of federal funding and regulation. But Republican attitudes are changing. In Ohio, I heard them talk about taxpayer-funded school bureaucrats who trick kids into believing that expensive and often useless liberal-indoctrination universities are the only way to get ahead in life, siphoning students away from vocational programs that could prepare them for well-paying jobs.
It’s probably not a coincidence that this shift is happening at a time when college-educated voters are trending Democratic and noncollege whites have been Trump’s most reliable constituency. Policies that hurt colleges, like policies that hurt cities, are policies that hurt Democrats. To listen to pols talk about college these days is to watch a wedge issue in its embryonic stage, as substantive questions about the cost and relevance of higher ed, the burdens of student debt, the adequacy of worker training and the power of political correctness on campus start to morph into red-meat attacks on pointy-headed elitists who look down on ironworkers and brainwash America’s youth. Republicans are starting to fit the Democratic push for universal free college into their larger critique of the Democratic urge to hand out free stuff to Democratic voters. And they’re portraying a liberal arts education as a culturally liberal thing, like kale or Kwanzaa or reusable shopping bags.
I saw a soft-edged version of this anti-college theme at a manufacturing roundtable that Ohio Attorney General Mike DeWine, the Republican candidate for governor, held in September in Youngstown. DeWine listened for an hour as a group of executives complained how teenagers are constantly told they need college degrees to get ahead in life, how students who might flourish in programs to prepare them for factory jobs are steered into mainstream classes they hate. DeWine perked up when the director of a local career center said that only 12 percent of students who pursue four-year degrees end up earning enough to pay off their loans and that many never learn about other options. “The goal should be exposing kids to more things, not forcing them into anything,” DeWine interjected.
Jim Renacci, the Republican congressman running for U.S. Senate in Ohio, went further at a campaign event with local farmers. “We need to stop pushing everyone into college,” Renacci said. “Let’s get this stigma off our backs: You can live the American dream without college.”
Renacci’s event was supposed to be about trade, but none of the local farmers expressed any concern about the beating they’re taking from Trump’s trade war. What they expressed concern about was illegal immigrants who commit crimes and demand handouts; the deep state; Democrats who want to steal from Medicare to fund Obamacare; and Antifa thugs. Even though their party controls Washington and Columbus, they believe they’re under siege; one 60-something farmer told me he’s afraid to speak out because “radical Democrats will burn your house down.” When I said that seemed unlikely in the rural expanses of Ashtabula County, he said I should check out the angry leftist millennials he’s seen when he’s visited the Ohio State campus, “wearing boots and backpacks and shouting stupid slogans.” I asked him whether he supports government spending on higher education for those millennials, and he shot back: “I’ll tell you what I don’t support: free college for illegals and higher taxes for me.”
There are real policy debates to be had over higher education, and they’re important. U.S. universities aren’t blameless: They’ve jacked up their tuition costs much faster than inflation, overpopulated their faculties with liberals, failed to hold themselves accountable for the employment outcomes of their students and policed speech to the point that they look more concerned with stamping out “micro-aggressions” than promoting free inquiry. At the same time, a lot of work has been done to try to make colleges, especially community colleges, more relevant to the job market; DeWine’s roundtable event highlighted a model partnership between local educators and manufacturers. The Obama administration also established tough new rules limiting federal dollars to institutions that don’t move students into gainful employment. Ironically, the Trump administration is trying to roll back those rules, as well as others providing relief to students defrauded by Trump University-style for-profit diploma mills.
But modern politics isn’t about these nuances of policy substance. It isn’t evidence-based. The debate over immigration isn’t really about measured wage effects or growth effects; it’s about whether a diverse America is the “real” one, and whether nonwhite newcomers make the country great. The Trump fans who came to see Renacci in Ashtabula County didn’t care any more about the details of higher education studies than they cared about the details of Paul Manafort’s guilty plea or our trade deficit with Canada. (It’s actually a surplus, a fact that will change approximately zero minds about Trump’s trade rhetoric.) The signal of substance breaks through the noise of politics so rarely that the noise has become the signal.
Nevertheless, substance does end up affecting people’s lives. Our higher education system is still one of America’s most valuable competitive assets, and breaking it in a fit of cultural fury would be the national equivalent of choking on diesel smoke to own the libs. Meanwhile, polls show that Americans, and particularly Republicans, are already increasingly suspicious that four-year colleges are really worth the money. That could affect their future choices and limit their own children’s options, all because “college” now feels like the other team.
Donald Trump was not the first Republican president to exploit America’s divisions. Think of Richard Nixon rallying his “silent majority” against bra-burning, free-loving, acid-dropping hippies, or even George H.W. Bush running against flag-burning and Willie Horton. And Trump didn’t create the so-called Big Sort of Americans into two ideologically polarized, geographically and racially segregated, mutually suspicious partisan camps. The rift between the mostly white camp of gun-owning, evangelical-church-going Fox News watchers who live relatively spread out and the more diverse camp of Whole Foods-shopping, funky-cafe-going NPR listeners who live closer together has been widening for decades.
Trump may be America’s leading culture warrior, but a war requires two armies. The frequent journalistic safaris into the right side of America’s divide tend to focus on the unwavering faith that Trump supporters have in Trump, but polls suggest the left side is just as prone to motivated reasoning about politics, and perhaps even more consumed by anger over politics. In a Pew Research Center survey, 47 percent of liberal Democrats said that if a friend supported Trump, it would put a strain on their friendship, and 68 percent of all Democrats said it’s “stressful and frustrating” to talk to Trump supporters. Andrew Gillum, the Democratic candidate for governor of Florida, had to fire his youth outreach director for posing for an Instagram post while wearing a shirt featuring the 2016 electoral map, with blue states labeled “United States of America” and Trump states labeled “Dumbfuckistan.” It was a perfect manufactured-outrage episode for our time—needless to say, similar shirts on which the blue states are labeled Dumbfuckistan are available for purchase—but it did reflect a common Democratic disdain for Republican rubes in the provinces.
So the culture war is not all about Trump. But Trump has a destructive genius for exploiting the culture war, exploding Washington’s norms of decorum and euphemism to trash his adversaries and torture the truth, portraying Puerto Ricans as ungrateful, immigrants as dangerous and Democrats as un-American. You’re with him or you’re with the terrorists. And the rest of Washington, which was already uncelebrated for civility, has followed him into perennial attack mode, to the point that even Supreme Court Justice Brett Kavanaugh bellowed partisan conspiracy theories during his confirmation hearing.
Trump’s entire Make America Great Again theme was always a cultural call to arms, deeply rooted in nostalgia for the supposedly good old days of the 1950s, before the messy disruptions of Black Lives Matter or #MeToo, before the steel industry had to worry about global competition or the coal industry faced limits on pollution. And since he’s abandoned his populist promises to crack down on Wall Street, build $1 trillion worth of infrastructure projects and get every American good health care, he’s doubling down on his racial and cultural messaging to his white working-class supporters, betting his attacks on the intelligence of LeBron James and CNN’s Don Lemon will overshadow his efforts to strip protections for pre-existing conditions and gut oversight of financial rip-offs. So far, it seems like a good bet.
It’s hard to have serious public debates about the massive changes in public policy that Trump is pursuing, because there’s no longer a clear path for facts and logic to break through the daily onslaught of demonization and obfuscation. We’re too busy fighting to think. It’s especially tough to have an evidence-based debate about an issue like trade when Trump proclaims at one rally that his tariffs have prompted U.S. Steel to open seven new plants, and after fact-checkers point out the actual number is zero, he ups the number to eight or nine at his next rally. He understands that modern political debates don’t depend on facts or logic. Where you stand—on questions of whether to believe Kavanaugh’s accusers and whether there was any collusion with Russia, as well as questions about corporate tax rates or lifetime insurance caps—depends almost entirely on where you sit. Deficits are bad when your team is in charge, benign when my team is in charge. I’m being denied due process by a witch hunt, but you belong in jail. I’m no puppet; you’re the puppet.
This is presumably how entire countries turn into Dumbfuckistan. The solutions to our political forever war are pretty obvious: Americans need to rebuild mutual trust and respect. We need to try to keep open minds, to seek information rather than partisan ammunition. We need to agree on a shared foundation of facts from authoritative sources. But those words looked ridiculous the moment I typed them. Americans are not on the verge of doing any of those things. Once the dogs of war have been unleashed, it’s hard to call them back. And we should at least consider the possibility that we’re fighting this forever war because we like it.
The thing I remember most about Trump’s rallies in 2016, especially the auto-da-fé moments in which he would call out various liars and losers who didn’t look like the faces in his crowds, was how much fun everyone seemed to be having. The drill-baby-drill candidate would drill the Mexicans, drill the Chinese, drill the gun-grabbers, drill all the boring Washington politicians who had made America not-great. It sure as hell wasn’t boring. It was a showman putting on a show, a culture-war general firing up his internet troops. It wasn’t a real war, like the one that Trump skipped while John McCain paid an unimaginable price, but it made the spectators feel like they were not just spectating, like they had joined an exhilarating fight. They got the adrenaline rush, the sense of being part of something larger, the foxhole camaraderie of war against a common enemy, without the physical danger.
It’s not clear how a fight like that would ever end.