So asked Reverend Cleophus James of Jake and Elwood Blues in “The Blues Brothers”. Yet, James Brown now sings the blues. He died over eight years ago, at age 73, on December 25, 2006, and his estate remains unsettled, lost in darkness, with his body not permanently interred at the intended memorial. Caught in estate litigation of Dickensian proportions, with personal representatives who were appointed, then resigned or were dismissed, only to be reappointed, one wonders whether any funds will be left for the charitable beneficiaries of his estate as it perambulates through the legal system.
Mr. Brown, the “Godfather of Soul”, led a complicated life, including numerous marriages, children, and accusations of drug use and domestic violence. While one might expect his estate planning documents, or the lack thereof, to be equally complex or unplanned, he did leave a will and an irrevocable trust, which left substantial portions of his wealth to provide scholarships to needy children. However, despite the passage of over eight years, the trustees have allegedly not distributed scholarships to needy students via the charitable trust. Instead, the South Carolina Attorney General intervened in the matter to an allegedly unprecedented degree. Ultimately, the South Carolina Supreme Court overruled the Attorney General, preventing the implementation of the Attorney General’s proposed changes to Mr. Brown’s dispositive intentions.
This summer, the Court sent the matter back to the local South Carolina probate court, which has yet to enter a final decision due to various claims. Why the matter did not promptly proceed to a trial in the local probate court, so that all issues could be resolved and a decision rendered, remains unclear.
Adding further complexity, the value of Mr. Brown’s estate remains unknown. This is a critical factor in determining the reasonableness of the fees of the various professionals, which could ultimately further deplete the estate.
Presumably, Mr. Brown’s estate continues to earn millions of dollars a year in royalties. An unknown factor is the allocation of the income earned after his passing. If the executors/trustees could not, or failed to, distribute the net income to the charitable beneficiaries/charitable trust, then almost certainly the IRS, on behalf of the U.S. Treasury, and possibly state or local governments, will receive substantial income tax revenues from the estate, instead of those funds being deducted as charitable distributions from the income earned.
Many individuals do not understand that an estate or trust, just like a person or corporation, must pay income tax, unless the estate distributes the income to the beneficiaries or the funds pass to a qualified charitable beneficiary, such as a charitable trust or other like organization. When such a distribution occurs, the income received by the beneficiaries/charitable trusts generally results in deductions at the estate level, with non-charitable individuals and entities generally then considered to have received income to the extent the funds distributed exceed principal.
A qualified charity receiving income from a qualified charitable entity generally does not have to pay any income tax. But when the income is accumulated and not distributed, the U.S. Treasury levies an income tax on estate income over $600 (the exemption is $100 for certain trusts), with the maximum income tax rate currently set at 39.6% for all income over $12,150 in a year. By contrast, income allocated to qualified charities is generally not subject to income tax. The foregoing does not even factor in any state or local income taxes. Thus, the litigation has exposed Mr. Brown’s estate to substantial income tax due to the lack of a prompt settlement of his estate.
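As a rough illustration of the stakes, here is a simplified sketch in Python using only the figures cited above: the $600 estate exemption and the 39.6% top rate on income over $12,150. The lower brackets and any state or local taxes are ignored for simplicity, and the $2,000,000 royalty figure is purely hypothetical.

```python
def accumulated_income_tax(gross_income, charitable_distribution,
                           exemption=600, top_rate=0.396, bracket_floor=12150):
    """Rough federal income tax on income an estate retains in a year.

    Income distributed to a qualified charity is deducted before tax.
    Simplification: only the 39.6% top bracket over $12,150 is modeled;
    the lower brackets and state/local taxes are ignored.
    """
    taxable = max(gross_income - charitable_distribution - exemption, 0)
    return top_rate * max(taxable - bracket_floor, 0)

# Hypothetical: the estate earns $2,000,000 in royalties in one year.
tax_if_accumulated = accumulated_income_tax(2_000_000, 0)
tax_if_distributed = accumulated_income_tax(2_000_000, 2_000_000)

print(round(tax_if_accumulated))  # 786951 -- lost to federal tax
print(tax_if_distributed)         # 0.0 -- the charity receives the full amount
```

Even under this simplified model, a single year of undistributed royalty income could cost the estate roughly three-quarters of a million dollars in federal tax that a prompt charitable distribution would have avoided.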
How could some of the above-discussed issues have been avoided?
A family settlement agreement prepared before Mr. Brown’s passing, in which he fully disclosed all assets, debts, income and expenses, and which all heirs and beneficiaries executed, agreeing to the disposition he intended, may have limited litigation after his passing. While this was attempted to some extent, at least with the individual claiming to be his spouse, she is now a party to the estate litigation, so presumably the agreement was insufficient for some reason.
If such an agreement was not practical or possible, then years before his passing Mr. Brown could have retained an independent appraiser to value his estate, and eventually transferred, while he was alive and in good health, assets to one or more types of irrevocable charitable trusts, such as a charitable remainder annuity trust. Some funds/assets might have been allocated directly to the scholarship fund he intended to create, while other funds/assets could have been allocated to the individual beneficiaries he intended to benefit from his estate. Indeed, transferring highly appreciated assets to one or more such trusts may have resulted in a substantial reduction in income and/or estate taxes.
Of course, funding charitable bequests while alive, or otherwise gifting assets to heirs in advance of one’s passing, requires a willingness to live on less income and to accept one’s mortality. Moreover, there are tax consequences. There are potential gift taxes, depending on the sums in question. Also, when a person dies, the cost basis used to determine capital gain is normally the value of the asset in question on the date of death, which is known as a “step-up” in cost basis. This means, in effect, that if you wait until you die and hold a highly appreciated asset, there may be little to no capital gains tax when the asset is sold after death. But if the asset had been transferred to a non-qualified charity while alive, there might have been substantial capital gains taxes. There are a variety of charitable trusts that, when used, can minimize or eliminate the capital gains tax in question. Accepting one’s mortality, and careful planning, can then minimize the possibility of disputes.
Here is an article by Adam S. Bernick, Esquire who is of counsel with my firm. This article was originally published in Upon Further Review on January 22, 2015.
It seems as if every decade or so, the Legislature makes changes to the statutes governing powers of attorney. It did so again with the passage of House Bill 1429, which Governor Corbett signed into law in July 2014, thereby becoming Act 95 of 2014 (“Act 95”).
Act 95 modifies many provisions of the statutes governing financial powers of attorney (“POA”) in Pennsylvania. See, e.g., 20 Pa. C.S. § 5600 et seq. Some of the changes are currently in effect, while others become effective on January 1, 2015. As such, the statutes governing POAs should be consulted before meeting with a prospective client and drafting the document.
Act 95 changes the formal requirements for the execution of POAs. They will now be required to be executed before two witnesses and a notary public. The changes to the statute clarify that neither the prospective agent nor the notary may serve as a witness. Also, the current statutory Notice alerting principals to their rights, which the principal must read and sign before proceeding to execute the POA, has been modified. Likewise, the Acknowledgment form to be executed by the Agent before beginning to serve under the POA has also undergone changes. As the statute is not retroactive, it should not have any effect on currently executed valid POAs. However, if a prospective Agent has not yet executed an acknowledgment and begun service under a current POA, there is an open question as to whether the Agent must execute the new form of Acknowledgment or the old one.
It is generally permitted for an attorney to act as a witness to the execution of certain documents when a notary is not available. The attorney would then have his signature witnessed by a notary public, attesting that he witnessed the principal execute the POA and witnessed the witnesses sign in his presence and in the presence of the principal. Such execution procedures were permitted before the Act 95 changes and remain permitted with regard to wills. See, 42 Pa. C.S. § 327(a); 57 Pa. C.S. § 316(2.1). Typically this might occur when a notary is not available and time is of the essence because the client is due to undergo a medical procedure or leave the country. Under Act 95, counsel may no longer stand in for a notary in witnessing the execution of a POA, and a notary may not serve as a witness to a POA if they are also witnessing the execution of the POA in their capacity as a notary public.
Act 95, in an effort to clarify the scope of POAs, stipulates certain powers that must be specifically provided for in order for the Agent to act for the Principal. These include: (1) creating, amending, revoking or terminating a living trust, (2) making a gift, (3) creating or changing rights of survivorship, (4) creating or changing a beneficiary designation, (5) delegating authority granted under the POA, (6) waiving the principal’s right to be a beneficiary of a joint and survivor annuity, including a survivor benefit under a retirement plan, (7) exercising fiduciary powers that the Principal has authority to delegate, and (8) disclaiming property, including a Power of Appointment. If the POA does not specifically list these powers, and ideally, explain the scope of the powers in detail, then the Agent may not so act on behalf of the Principal.
Previously, the Pennsylvania Supreme Court issued a ruling that an individual or financial entity relying in good faith on a POA could be liable to the principal or another party for doing so, even if the company had no way of knowing the POA was forged or otherwise void on its face. Vine v. Commonwealth, 607 Pa. 648 (Pa. 2010). While the Court recognized the abuses that occasionally occur by purported Agents under POAs, the resulting decision has led many financial companies to refuse to recognize a valid POA and insist that Principals execute separate POAs on the financial companies’ forms, contrary to the plain language of the statute. Act 95 clarifies the protection granted such third parties by granting them the right to reject POAs or to otherwise require verification of same before acting on them. How financial companies will act with regard to Act 95 remains unclear at present. Certainly, agents may now have additional hurdles to overcome before having a valid POA accepted by the financial institution.
Here is an article by Adam S. Bernick, Esquire who is of counsel with my firm. This article was originally published in Upon Further Review on December 16, 2014.
Attorneys who write wills and trusts as part of their practice have frequently been requested to write detailed clauses disposing of various forms of property, from tangible personal property, such as jewelry, to intangible personal property, such as trademarks, copyrights and patents, and real property. Within the last decade, a fourth category has come into being: digital property. Problematically, the law governing the disposition of digital property is out of date.
For example, who has the right to passwords and user names, and can these be given, bequeathed, or otherwise devised via a will or trust? Do domain names constitute intellectual property? What about the content of material posted on social media, such as photographs posted on the Facebook page of someone who is now deceased? Two fundamental documents affect the answers to these questions: the terms of service agreements between the now-deceased individual and the company providing the service, such as Comcast®, Google®, or Yahoo®, and the 1986 Stored Communications Act (“SCA”, codified at 18 U.S.C. Chapter 121 §§ 2701–2712).
The SCA, in essence, prohibits consumer electronic-communications companies, such as AOL® or Google®, from disclosing the content of an individual’s account and communications and postings therein without the owner’s consent or a government order such as a warrant. 18 U.S.C.A. § 2703. While the statute predates popular use of the Internet or the more recent advent of social media, it remains in effect.
Internet Service Providers (“ISP”) such as Comcast® or Verizon®, Internet Services such as Yahoo® and Google® (“IS”), and social media companies such as Facebook® (collectively, “Internet Companies”) have voluntary Terms of Service Agreements (“ToS”) between the company and the individual or entity that creates an account with the Internet Company. By way of illustration, Facebook’s® ToS specifically states that “[Y]ou [the account holder] will not share your password (or in the case of developers, your secret key), let anyone else access your account, or do anything else that might jeopardize the security of your account.” Facebook®, Statement of Rights and Responsibilities, Section 4, 8. Likewise, the account cannot be transferred without the permission of Facebook®. Similarly, Yahoo’s® ToS specifically states that “[N]o Right of Survivorship and Non-Transferability. You agree that your Yahoo! account is non-transferable and any rights to your Yahoo! ID or contents within your account terminate upon your death. Upon receipt of a copy of a death certificate, your account may be terminated and all contents therein permanently deleted.” Yahoo® Terms of Service Agreement, Paragraph 28. While Yahoo® “may” allow the account to transfer and data to be accessed if there is a specifically worded clause in a will, there is no guarantee this will occur. Internet Companies, especially social media companies such as Facebook, have resisted such efforts, going so far as to seek court orders denying access in order to protect the privacy of the deceased account holder. Therefore, whether Internet Companies must allow access to the content of a deceased user’s account is up to the company providing the service, absent a state law to the contrary, which in any event could be considered in conflict with the SCA’s prohibition on such access absent a warrant.
While some Internet Companies may permit the memorialization of someone’s account, so that they have a presence on a social media site even after they pass away, generally the account cannot be altered or otherwise accessed.
As of March 1, 2013, five states have enacted statutes to enable fiduciaries to access online accounts: Connecticut Statutes § 45a–334a (see also Proposed Bill 5227, introduced January 11, 2013); Idaho Statutes § 15–3–715(28); Oklahoma Statutes § 58–269; Rhode Island General Laws Chapter 33–27; Indiana Code § 29–1–13–1.1. Whether the Internet Companies will respect these statutes, or whether the courts will uphold them in light of the SCA, remains to be seen, as no cases regarding executors accessing such content have reached the U.S. Supreme Court.
Pennsylvania had proposed legislation, PA HB 2580 in the 2012 session of the Legislature, to amend Title 20 (Decedents, Estates and Fiduciaries) of the Pennsylvania Consolidated Statutes to provide personal representatives with power over a decedent’s accounts on social networking, microblogging, short message service, and e-mail websites. The statute has not been so amended, however, and new legislation will need to be introduced in 2013.
Individuals may be tempted to consider the executor of the estate or the heirs as third-party beneficiaries of the digital information, but many ToS specifically waive this. Moreover, due to the SCA, even if you were to leave a list of your passwords to various sites, arguably only the person who registered for the account could use such passwords, and it would be an arguable breach of the SCA for an heir or executor to do so after the account holder died.
Based on the above, then, even if one writes a will and specifically grants the executor authority over social media accounts, email, and the like, the Internet Companies may not allow access to such accounts of a deceased individual or to the contents thereof. Moreover, once the Internet Companies are aware that the individual is deceased, they will generally begin closing such accounts.
By contrast, if one executes a Power of Attorney or is under a guardianship, it may be possible to obtain such data and information because the agent/guardian is in essence acting for an individual who is currently alive. Nonetheless, agents and guardians should expect a lot of red tape in accessing this data, and may have to take legal action to secure it.
I will not discuss the intricacies of the Internet in terms of who has the right to lease or otherwise grant the right to the use of a domain name. However, it is generally recognized that if you are the registered owner of a domain name, you own the legal rights to it. If the domain name is a registered trademark, it may be considered intellectual property, and thus possible to bequeath; however, a critical issue arises if the registration lapses because the fees to renew or extend the rights to the domain name are not timely paid, in which case the domain name might be re-issued to other individuals. Likewise, if the domain name is registered in the name of a corporation, and the corporation has perpetual existence, then even if a decedent was the sole owner of the corporation, it would be possible to renew the registration with only the payment of the registration fee.
A separate issue concerns forms of digital property that are not owned in any fashion, but merely used by the deceased individual. By way of illustration, an individual does not generally own any of the items purchased through Apple’s iTunes® or Amazon’s Kindle® application, because the purchaser generally purchases a license to use the digital files, not the actual song or book in digital form. Also, the license is generally “non-transferable”. While the files in the account may not be considered an asset that can be used by others, the account itself may be considered an asset. It may be possible to create a trust while you are alive to own the license for the benefit of your heirs. While this in itself raises a variety of issues, such as who would serve as trustee and whether the digital licensor would recognize or continue to recognize such rights, it is an option to consider.
Digital assets could be lent or sold to third parties if the licensor of the product permits such transfers by the retail store that has the right to resell the right to use the licensed product. Presumably, there would be a variety of threshold questions to address, primarily compensation to the publisher of the material, compensation to the licensor, and eliminating access by the initial purchaser. By way of illustration, were Apple to permit the sale of a music collection purchased on its iTunes® store by Customer A to Customer B, the transaction would have to proceed through the iTunes® store. Unlike the purchase of a book or CD from a retail store, which could then be sold or left to heirs via a will, even if Internet companies such as Apple permit transfer of the right to use a licensed product, freely transferring the license or leaving it to heirs may not be contemplated, or, even if eventually granted, would likely still involve fees for the transfer. While there may be an eventual arrangement addressing these issues, with licenses eventually being transferable via the retail store that sold Customer A the right to use the initial licensed product, this is not generally the case at present.
Consequently, individuals should be aware that when they create their wills, their executor may not have any authority to retrieve data, including photographs posted to social media websites. While such clauses may be added to their wills, the Internet Companies could still decline access to the data. Likewise, their books on Kindle® or collections of music on iTunes® may vanish when they die, instead of being transferable to heirs. Clients should be advised to save all such photographs and data that they want passed to heirs to a hard drive that can be backed up, or to print out the data in question, to the extent that there is no license infringement in doing so. If Internet providers or companies allow multiple account holders, consideration should be given to adding heirs to accounts so that they can access the data.
By Adam S. Bernick, Esquire, and originally published in Upon Further Review on April 3, 2013.
This past weekend, Americans learned of another mass shooting, this time by an employee who decided to murder as many of the people he had worked with for years as possible. As of this writing, the murder toll is 12 people.
Every American asks why. What was the killer’s motive? When we read there is “no known motive,” we are frustrated. Human beings want to make sense of life, especially of evil.
Liberals (in this regard, liberals’ views are essentially the same as leftists’) are virtually united in ascribing these shootings to guns. Just this past weekend, in a speech in Brazil, former President Barack Obama told an audience:
“Our gun laws in the United States don’t make much sense. Anybody can buy any weapon any time — without much, if any, regulation. They can buy (guns) over the internet. They can buy machine guns.”
That the former president fabricated a series of falsehoods about the United States — and maligned, on foreign soil, the country that twice elected him president — speaks to his character and to the character of the American news media that have been completely silent about these falsehoods. But the main point here is that, like other liberals and leftists, when Obama addresses the subject of mass shootings — in Brazil, he had been talking about the children murdered at Sandy Hook Elementary School in 2012 — he talks about guns.
Yet, America had plenty of guns when its mass murder rate was much lower. Grant Duwe, a Ph.D. in criminology and director of research and evaluation at the Minnesota Department of Corrections, gathered data going back 100 years in his 2007 book, “Mass Murder in the United States: A History.”
Duwe’s data reveal:
In the 20th century, every decade before the 1970s had fewer than 10 mass public shootings. In the 1950s, for example, there was one mass shooting. And then a steep rise began. In the 1960s, there were six mass shootings. In the 1970s, the number rose to 13. In the 1980s, the number increased 2 1/2 times, to 32. And it rose again in the 1990s, to 42. As for this century, The New York Times reported in 2014 that, according to the FBI, “Mass shootings have risen drastically in the past half-dozen years.”
Given the same ubiquity of guns, wouldn’t the most productive question be what, if anything, has changed since the 1960s and ’70s? Of course it would. And a great deal has changed. America is much more ethnically diverse, much less religious. Boys have far fewer male role models in their lives. Fewer men marry, and normal boy behavior is largely held in contempt by their feminist teachers, principals and therapists. Do any or all of those factors matter more than the availability of guns?
Let’s briefly investigate each factor.
Regarding ethnic diversity, the countries that not only have the fewest mass murders but the lowest homicide rates as well are the least ethnically diverse — such as Japan and nearly all European countries. So, too, the American states that have homicide rates as low as Western European countries are the least ethnically and racially diverse (the four lowest are New Hampshire, North Dakota, Maine and Idaho). Now, America, being the most ethnically and racially diverse country in the world, could still have low homicide rates if a) Americans were Americanized, but the left has hyphenated — Balkanized, if you will — Americans, and b) most black males grew up with fathers.
Regarding religiosity, the left welcomes — indeed, seeks — the end of Christianity in America (though not of Islam, whose robustness it fosters). Why don’t we ask a simple question: What percentage of American murderers attend church each week?
Regarding boys’ need for fathers, in 2008, then-Sen. Obama told an audience: “Children who grow up without a father are five times more likely to live in poverty and commit crime; nine times more likely to drop out of schools; and 20 times more likely to end up in prison.”
Yet, the Times has published columns and “studies” showing how relatively unimportant fathers are, and more and more educated women believe this dangerous nonsense.
Then there is marriage: Nearly all men who murder are single. And their number is increasing.
Finally, since the 1960s, we have been living in a culture of grievance. Whereas in the past people generally understood that life is hard and/or that they have to work on themselves to improve their lives, for half a century, the left has drummed into Americans’ minds the belief that their difficulties are caused by American society — in particular, its sexism, racism and patriarchy. And the more aggrieved people are, the more dulled their consciences.
When you don’t ask intelligent questions, you cannot come up with intelligent answers. So, then, with regard to murder in America, until Americans stop allowing the left to ask the questions, we will have no intelligent answers.
By Dennis Prager, published on June 8, 2019, in The Daily Wire.
Critics think that the genre was an embarrassing dead end. So why do fans and musicians still love it?
Virtuosos such as the keyboardist Keith Emerson made fans feel like connoisseurs. In April, 1971, Rolling Stone reviewed the début album by a band with a name better suited to a law firm: Emerson, Lake & Palmer. The reviewer liked what he heard, although he couldn’t quite define it. “I suppose that your local newspaper might call it ‘jazz-influenced classical-rock,’ ” he wrote. In fact, a term was being adopted for this hybrid of highbrow and lowbrow. People called it progressive rock, or prog rock: a genre intent on proving that rock and roll didn’t have to be simple and silly—it could be complicated and silly instead. In the early nineteen-seventies, E.L.P., alongside several more or less like-minded British groups—King Crimson, Yes, and Genesis, as well as Jethro Tull and Pink Floyd—went, in the space of a few years, from curiosities to rock stars. This was especially true in America, where arenas filled up with crowds shouting for more, which was precisely what these bands were designed to deliver. The prog-rock pioneers embraced extravagance: odd instruments and fantastical lyrics, complex compositions and abstruse concept albums, flashy solos and flashier live shows. Concertgoers could savor a new electronic keyboard called a Mellotron, a singer dressed as a batlike alien commander, an allusion to a John Keats poem, and a philosophical allegory about humankind’s demise—all in a single song (“Watcher of the Skies,” by Genesis). In place of a guitarist, E.L.P. had Keith Emerson, a keyboard virtuoso who liked to wrestle with his customized Hammond organ onstage, and didn’t always win: during one particularly energetic performance, he was pinned beneath the massive instrument, and had to be rescued by roadies. Perhaps this, too, was an allegory.
Most of these musicians took seriously the “progressive” in “progressive rock,” and believed that they were helping to hurry along an ineluctable process: the development of rock music into what Jon Anderson, of Yes, once called “a higher art form.” Even more than most musicians, the prog rockers aimed for immortality. “We want our albums to last,” Robert Fripp, the austere guitar scientist behind King Crimson, said. In a literal sense, he got his wish: although the progressive-rock boom was effectively over by the end of the seventies, it left behind a vast quantity of surplus LPs, which filled the bins in used-record stores for decades. (Many people who have never heard this music would nonetheless recognize some of the album covers.) Progressive rock was repudiated by what came next: disco, punk, and the disco-punk genre known as New Wave. Unlike prog rock, this music was, respectively, danceable, concise, and catchy. In the story of popular music, as conventionally told, progressive rock was at best a dead end, and at worst an embarrassment, and a warning to future musical generations: don’t get carried away.
The genre’s bad reputation has been remarkably durable, even though its musical legacy keeps growing. Twenty years ago, Radiohead released “OK Computer,” a landmark album that was profoundly prog: grand and dystopian, with a lead single that was more than six minutes long. But when a reporter asked one of the members whether Radiohead had been influenced by Genesis and Pink Floyd, the answer was swift and categorical: “No. We all hate progressive rock music.”
It is common to read about some band that worked in obscurity, only to be discovered decades later. In the case of progressive rock, the sequence has unfolded in reverse: these bands were once celebrated, and then people began to reconsider. The collapse of prog helped reaffirm the dominant narrative of rock and roll: that pretension was the enemy; that virtuosity could be an impediment to honest self-expression; that “self-taught” was generally preferable to “classically trained.”
In the past twenty years, though, a number of critics and historians have argued that prog rock was more interesting and more thoughtful than the caricature would suggest. The latest is David Weigel, a savvy political reporter for the Washington Post who also happens to be an unabashed fan—or, more accurately, a semi-abashed fan. His new history of prog rock is called “The Show That Never Ends,” and it begins with its author embarking on a cruise for fans, starring some of the great prog-rock bands of yore, or what remains of them. “We are the most uncool people in Miami,” Weigel writes, “and we can hardly control our bliss.”
Almost no one hated progressive rock as much, or as memorably, as Lester Bangs, the dyspeptic critic who saw himself as a rock-and-roll warrior, doing battle against the forces of fussiness and phoniness. In 1974, he took in an E.L.P. performance and came away appalled by the arsenal of instruments (including “two Arthurian-table-sized gongs” and “the world’s first synthesized drum kits”), by Emerson’s preening performance, and by the band’s apparent determination to smarten up rock and roll by borrowing from more respectable sources. E.L.P. had reached the Top Ten, in both Britain and America, with a live album based on its bombastic rendition of Mussorgsky’s “Pictures at an Exhibition.” Bangs wanted to believe that the band members thought of themselves as vandals, gleefully desecrating the classics. Instead, Carl Palmer, the drummer, told him, “We hope, if anything, we’re encouraging the kids to listen to music that has more quality”—and “quality” was precisely the quality that Bangs loathed. He reported that the members of E.L.P. were soulless sellouts, participating in “the insidious befoulment of all that was gutter pure in rock.” Robert Christgau, the self-proclaimed “dean of American rock critics,” was, if anything, more dismissive: “These guys are as stupid as their most pretentious fans.”
The story of this reviled genre starts, though, with the most acclaimed popular music ever made. “If you don’t like progressive rock, blame it on the Beatles,” a philosophy professor named Bill Martin wrote, in his 1998 book, “Listening to the Future,” a wonderfully argumentative defense of the genre. Martin is, in his own estimation, “somewhat Marxist,” and he saw progressive rock as an “emancipatory and utopian” movement—not a betrayal of the sixties counterculture but an extension of it. Martin identified a musical “turning point” in 1966 and 1967, when the Beach Boys released “Pet Sounds” and the Beatles released “Sgt. Pepper’s Lonely Hearts Club Band,” which together inspired a generation of bands to create albums that were more unified in theme but more diverse in sound. Using orchestration and studio trickery, these albums summoned the immersive pleasure of watching a movie, rather than the kicky thrill of listening to the radio.
When bands set out to make hit albums, rather than hit singles, some of them abandoned short, sharp love songs and began to experiment with intricate compositions and mythopoetic lyrics. By the dawn of the seventies, the term “progressive rock” was being applied to a cohort of rock-and-roll groups that thought they might be outgrowing rock and roll. In 1973, Columbia Records released a double-album compilation called “The Progressives.” The liner notes informed listeners that “the boundaries between styles and categories continue to blur and disappear.”
But this inclusive musical movement was also, as Weigel emphasizes, a parochial one. “American and British youth music had grown together from the moment the Beatles landed at J.F.K.,” he writes. “In 1969, the two sounds finally started to grow apart.” Weigel quotes an interview with Lee Jackson, the lead singer of a British rock band called the Nice—Keith Emerson’s previous band. “The basic policy of the group is that we’re a European group,” Jackson said. “We’re not American Negroes, so we can’t really improvise and feel the way they can.” (Ironically, the Nice’s biggest hit was an instrumental version of Leonard Bernstein’s “America.”) In a thoughtful 2009 autobiography, Bill Bruford, a drummer who was central to the development of prog rock, noted that many of the music’s pioneers were “nice middle-class English boys,” singing songs that were “self-consciously British.” Genesis, for instance, was formed at Charterhouse, a venerable boarding school in Surrey; the band’s album “Selling England by the Pound” was an arch and whimsical meditation on national identity. Bruford pointed out that even Pink Floyd, known for free-form jam sessions and, later, cosmic rock epics, found time to record songs like “Grantchester Meadows,” a gentle ode to the East Anglian countryside.
In 1969, King Crimson, the most rigorous and avant-garde of the major prog bands, released what is now considered the genre’s first great album, a strange and menacing début called “In the Court of the Crimson King.” The album used precise dissonance and off-kilter rhythms to evoke in listeners a thrilling sensation of ignorance: you got the feeling that the musicians understood something you didn’t. At a career-making concert in Hyde Park, opening for the Rolling Stones, King Crimson played a ferocious set that ended with an acknowledgment of England’s musical heritage: a rendition of “Mars, the Bringer of War,” by the English composer Gustav Holst.
The prog-rock pioneers embraced extravagance. Concertgoers could savor a new electronic keyboard called a Mellotron, a singer dressed as a batlike alien commander, an allusion to a John Keats poem, and a philosophical allegory about humankind’s demise—all in the space of a single song (“Watcher of the Skies,” by Genesis).
From the start, King Crimson was the kind of band that musicians love—as opposed, that is, to the kind of band that non-musicians love. (King Crimson never had a hit single, although “21st Century Schizoid Man,” the first song from its first album, served, in 2010, as the basis for “Power,” by Kanye West.) Bill Bruford, the drummer, was astonished by an early King Crimson performance, and resolved to make equally ambitious music with his own band, a sweetly melodic group called Yes. In its own way, Yes, too, was profoundly English—Jon Anderson, the lead singer, generally eschewed faux-American bluesiness, and the band instead deployed pleasing multipart harmonies that recall the choral tradition of the Anglican Church.
In 1971, Yes released an album called “Fragile,” which included a hummable—and very progressive—song called “Roundabout.” On the album, it lasted more than eight minutes, but unsentimental record executives trimmed it to three and a half, and the edited version found a home on U.S. radio stations. This music, so self-consciously English, sounded different in America, where its rather nerdy creators were greeted as exotic rock stars. That summer, Yes played its first U.S. concert, at an arena in Seattle. A fan who approached Jon Anderson before the show remembered that Anderson was nervous. “I don’t know what is going to happen,” the singer told him. “I’ve never been in a place like this.”
When Anderson sang, “I’ll be the roundabout,” most American listeners surely had no idea that he was referring to the kind of intersection known less euphoniously, in the U.S., as a traffic circle. (The song was inspired by the view from a van window.) Why, then, did this music seduce so many Americans? In 1997, a musician and scholar named Edward Macan published “Rocking the Classics,” in which he offered a provocative explanation. Noting that this artsy music seemed to attract “a greater proportion of blue-collar listeners” in the U.S. than it had in Britain, he proposed that the genre’s Britishness “provided a kind of surrogate ethnic identity to its young white audience”: white music for white people, at a time of growing white anxiety. Bill Martin, the quasi-Marxist, found Macan’s argument “troubling.” In his view, the kids in the bleachers were revolutionaries, drawn to the music because its sensibility, based on “radical spiritual traditions,” offered an alternative to “Western politics, economics, religion, and culture.”
The genre’s primary appeal, though, was not spiritual but technical. The musicians presented themselves as virtuosos, which made it easy for fans to feel like connoisseurs; this was avant-garde music that anyone could appreciate. (Pink Floyd might be the most popular prog-rock band of all time, but Martin argued that, because the members lacked sufficient “technical proficiency,” Pink Floyd was not really prog at all.) In some ways, E.L.P. was the quintessential prog band, dominated by Emerson’s ostentatious technique—he played as fast as he could, and sometimes, it seemed, faster—and given to grand, goofy gestures, like “Tarkus,” a twenty-minute suite that recounted the saga of a giant, weaponized armadillo. The members of E.L.P. betrayed no particular interest in songwriting; the group’s big hit, “Lucky Man,” was a fluke, based on something that Greg Lake wrote when he was twelve. It concluded with a wild electronic solo, played on a state-of-the-art Moog synthesizer, that Emerson considered embarrassingly primitive. An engineer had recorded Emerson warming up, and the rest of the band had to convince him not to replace his squiggles with something more precise—more impressive. In the effortful world of prog, there was not much room for charming naïveté or happy accidents; improvised solos were generally less important than composed instrumental passages.
The audience for this stuff was largely male—Bruford writes ruefully that, throughout his career, women “generally and rather stubbornly stayed away” from his performances. The singer-songwriter John Wesley Harding, an obsessive prog-rock fan, suggests that these musicians were “afraid of women,” and that they expressed this fear by shunning love songs. What they provided, instead, was spectacle. As the American crowds got bigger, the stages did, too, which meant more elaborate shows, which in turn drew more fans. Weigel notes that, in one tour program, the members of Genesis promised to “continually feed profits back into the stage show.” (At one point, the show included a stage-wide array of screens displaying a sequence of hundreds of images, and, for the lead singer, a rubbery, tumorous costume with inflatable testicles.) Yes toured with sets designed by Roger Dean, the artist who painted its extraterrestrial album covers. Dean’s innovations included enormous, sac-like pods from which the musicians could dramatically emerge. Inevitably, one of the pods eventually malfunctioned, trapping a musician inside and prefiguring a famous scene from “This Is Spinal Tap.” The competition among bands to create bigger and brighter spectacles was absurd but also irresistible, and quite possibly rational. American arena stages, like LPs, needed to be filled, and so these bands set out to fill them.
Weigel’s book has an unlikely flaw, given its subject: it is too short. Wary, perhaps, of taxing readers’ patience, he finishes his tour in three hundred pages, resisting what must have been an overwhelming urge to interrupt the narrative with discographical digressions. Martin, less diffident, included in his book a list of sixty-two “essential” progressive-rock albums—partly to provide a shopping list for newcomers, and partly, one suspects, because he liked the idea of outraging hard-core fans with his omissions.
So what is the greatest progressive-rock album of all time? One perennial and deserving candidate is “Close to the Edge,” by Yes, from 1972, which consists of three long songs that are, by turns, gently pastoral and gloriously futuristic, responding to the genre’s contradictory impulses: to explore musical history and to leave it behind. Earlier this year, Will Romano published “Close to the Edge: How Yes’s Masterpiece Defined Prog Rock,” a frankly obsessive study that makes no pretense of levelheadedness. Romano notes that he listened to the album “easily over a thousand times” while working on the book, and, when he wonders about a “low pulse that pervades entire sections” of the title track, it seems possible that he has begun to hallucinate. He embarks upon a brave attempt to decode Anderson’s inane lyrics, provides an astute technical description of the way Steve Howe seems to play lead and rhythm guitar at the same time, and identifies the pivotal moment when Rick Wakeman, the keyboard player, met Denise Gandrup, a designer of sparkly capes, which became his signature.
Romano ends with a note of defiance, pointing out that Yes still hadn’t been accepted by the cultural élitists in charge of the Rock & Roll Hall of Fame. This spring, not long after the book’s publication, Yes was finally inducted—more than two decades after it became eligible. And yet Romano is right: there is something inspiring about the indigestibility of prog, which still hasn’t quite been absorbed into the canon of critically beloved rock and roll, and which therefore retains some of its outsider appeal. Often, we celebrate bygone bands for being influential, hearing in them the seeds of the new; the best prog provides, instead, the shock of the old.
Listeners who wonder what they have been missing should probably ignore E.L.P. entirely and head straight for “Close to the Edge”—or, if they want something a bit more bruising, “Red,” an austere album that a new version of King Crimson (including Bruford) released in 1974. One of the most underappreciated progressive-rock groups was Gentle Giant, but there was a reason for this neglect: none of the band members happened to be a great singer. So they used interlocking instrumental lines, shifting time signatures, and close harmonies to construct songs that seemed to occupy some phantom limb of music’s evolutionary tree.
Gentle Giant was one of the bands featured on “The Progressives,” the Columbia Records compilation, which turned out to have a hidden agenda: it was, in large part, a jazz album, seemingly designed to help prog fans develop a taste for Ornette Coleman, Charles Mingus, and Mahavishnu Orchestra. Jazz played an important but disputed role in the story of progressive rock. While some British bands were trying to turn inward, away from American influences, others were finding ways to forge new ties between rock and jazz. Indeed, Mahavishnu Orchestra, a jazz-fusion group led by the English guitarist John McLaughlin (who previously played with Miles Davis), is sometimes considered an honorary prog band—at the time, the distinctions between these genres could be hazy. And in Canterbury, in the southeast of England, a cluster of interconnected bands created their own jazz-inflected hybrids: Soft Machine, Matching Mole, Hatfield & the North. These are the bands most likely to charm—and perhaps convert—listeners who think that they hate progressive rock. Unlike the swashbucklers who conquered arenas, the Canterburians were cheerfully unheroic, pairing adventurous playing with shrugging, self-deprecating lyrics about nothing much. (One Hatfield & the North song goes, “Thank all the mothers who made cups of tea. / If they didn’t care for us, we wouldn’t be / here to sing our songs and entertain. / Plug us in and turn on the mains!”) This is music animated by a spirit of playful exploration—recognizably progressive, you might say, though not terribly prog.
The question of progress bedevilled many of the prog bands: the ethos, which implied constant transformation, was at odds with the sound, which was identifiable, and therefore stuck. Robert Fripp solved this problem by disbanding King Crimson just as “Red” was being released. “The band ceased to exist in 1974, which was when all English bands in that genre should have ceased to exist,” he said later. Once some album-side-long songs had been recorded, and some snippets of classical music appropriated, it was not obvious how further progress might be made, especially since the bands now had big crowds to please. In 1978, E.L.P. released an infamous album called “Love Beach,” which was recorded in the Bahamas, and whose cover depicted something less enticing than a battle-ready armadillo: the three grinning band members, displaying white teeth and varying amounts of chest hair.
Most of the musicians took seriously the “progressive” in “progressive rock,” and believed that they were helping to hurry along an ineluctable process: the development of rock music into what Jon Anderson, of Yes, once called “a higher art form.”
Progressive rock was a stubborn genre, and yet a number of its adepts proved to be surprisingly flexible; it turned out that their considerable musical skill could be put to new uses. In 1980, Steve Howe, the guitarist from Yes, told the Los Angeles Times that his band had been “modernized” and simplified. “Whatever’s been leveled at us in the past, we want to be re-judged,” he said. This kind of desperate ploy isn’t supposed to work, but it did: in 1983, Yes topped the American pop chart with “Owner of a Lonely Heart,” which barely sounded like it had come from the same band. A new group called Asia, made up of refugees from Yes, King Crimson, and E.L.P., released an album that reached No. 1 on the American chart. Genesis did something even more impressive, transforming into a Top Forty band while spawning three successful solo careers. The singer, Peter Gabriel, became a pop star, and so did the drummer, Phil Collins, and the bassist, Mike Rutherford, who led Mike + the Mechanics. For a few of the genre’s biggest stars, the music industry offered an attractive bargain: leave prog behind and you can be bigger than ever.
Some true believers remained, of course. In the seventies, prog-inspired American bands like Kansas and Styx had conquered arenas, and by the end of the decade there was Rush, a Yes-obsessed trio of Canadians who received even worse reviews than their British forebears. One reason was their avowed love of Ayn Rand; an influential and absurd review in New Musical Express, a British magazine, accused them of preaching “proto-fascism.” Another reason was that, by the late seventies, progressive rock was about the most unhip music in existence. “The fans showing up to hear Rush were the wrong kind of fans—the mockable ones, with mockable taste in music,” Weigel writes, holding up this judgment for ridicule without quite dissenting from it. (No doubt he was sorely tempted to use the term “deplorables.”) By the time Rush emerged, progressive rock had entered its never-ending defensive phase; uncoolness is now part of the genre’s identity, and even a devoted fan like Weigel may not be entirely sure whether he wants that to change.
Progressive rock, broadly defined, can never disappear, because there will always be musicians who want to experiment with long songs, big concepts, complex structures, and fantastical lyrics. You can hear a trace of the genre in the fearless compositions of Joanna Newsom or, equally, in “Pyramids,” an epic Frank Ocean slow jam that blends Afrocentric mythology with a narrative about sex work. At Coachella this year, one of the breakout stars was Hans Zimmer, the German composer, who performed excerpts from his film scores with an orchestra and a rock band. (Anyone who cheered him on has forever lost the right to make snarky jokes about bands like Yes.) Plenty of revivalist bands play what might, paradoxically, be called retro-prog. And there have been latter-day innovators. Tool emerged, a quarter century ago, as an awesome new kind of prog band: precise but unremittingly heavy, all rumbles and hums. In Sweden, Meshuggah, in the nineties, built roaring, ferocious songs atop fiendish riffs in prime-number time signatures; Opeth, in the aughts, found a connection between death-metal fury and Pink Floydian reverie.
What can disappear—what long ago disappeared, in fact, at least among rock bands—is the ideology of progress in pop music: the optimistic sense, shared by all those early-seventies pioneers, that the form was evolving and improving, and that prog rock offered a sneak peek at our future. The bands thought that the arc of the musical universe bent toward keyboard solos. This is part of what drove Lester Bangs crazy—he couldn’t understand why these musicians thought they had improved upon old-fashioned rock and roll. But contemporary listeners might find the genre’s optimistic spirit more exotic, and therefore more endearing, than it once seemed. Of course, prog rock was not the future—at least, not more than anything else was. Nowadays, it seems clear that rock history is not linear but cyclical. There is no grand evolution, just an endless process of rediscovery and reappraisal, as various styles and poses go in and out of fashion. We no longer, many of us, believe in the idea of musical progress. All the more reason, perhaps, to savor the music of those who did.
By Kelefa Sanneh, published on June 12, 2017, in The New Yorker.
Dr. Paul R. McHugh, the Distinguished Service Professor of Psychiatry at Johns Hopkins University and former psychiatrist-in-chief for Johns Hopkins Hospital, who has studied transgendered people for 40 years, said it is a scientific fact that “transgendered men do not become women, nor do transgendered women become men.”
All such people, he explained in an article for The Witherspoon Institute, “become feminized men or masculinized women, counterfeits or impersonators of the sex with which they ‘identify.’”
Dr. McHugh, who was psychiatrist-in-chief at Johns Hopkins Hospital for 26 years, the medical institute that had initially pioneered sex-change surgery – and later ceased the practice – stressed that the cultural meme, or idea that “one’s sex is fluid and a matter of choice” is extremely damaging, especially to young people.
The idea that one’s sexuality is a feeling and not a biological fact “is doing much damage to families, adolescents, and children and should be confronted as an opinion without biological foundation wherever it emerges,” said Dr. McHugh in his article, Transgenderism: A Pathogenic Meme.
“I am ever trying to be the boy among the bystanders who points to what’s real,” said Dr. McHugh, who is also professor of Psychiatry and Behavioral Sciences at Johns Hopkins. “I do so not only because truth matters, but also because overlooked amid the hoopla—enhanced now by Bruce Jenner’s celebrity and Annie Leibovitz’s photography—stand many victims.”
“Think, for example, of the parents whom no one—not doctors, schools, nor even churches—will help to rescue their children from these strange notions of being transgendered and the problematic lives these notions herald,” warned McHugh.
They rarely find therapists who are willing to help them “work out their conflicts and correct their assumptions,” said McHugh. “Rather, they and their families find only ‘gender counselors’ who encourage them in their sexual misassumptions.”
In addition, he said, “both the state and federal governments are actively seeking to block any treatments that can be construed as challenging the assumptions and choices of transgendered youngsters.”
“As part of our dedication to protecting America’s youth, this administration supports efforts to ban the use of conversion therapy for minors,” said Valerie Jarrett, a senior advisor to President Obama, as quoted by Dr. McHugh in his article.
However, there is plenty of evidence showing that “transgendering” is a “psychological rather than a biological matter,” said Dr. McHugh.
“In fact, gender dysphoria—the official psychiatric term for feeling oneself to be of the opposite sex—belongs in the family of similarly disordered assumptions about the body, such as anorexia nervosa and body dysmorphic disorder,” said McHugh.
“Its treatment should not be directed at the body as with surgery and hormones any more than one treats obesity-fearing anorexic patients with liposuction,” he said.
In fact, at Johns Hopkins, where they pioneered sex-change-surgery, “we demonstrated that the practice brought no important benefits,” said Dr. McHugh. “As a result, we stopped offering that form of treatment in the 1970s.”
In recent years, though, the notion that one’s sex is fluid has flooded the culture. It is “reflected everywhere in the media, the theater, the classroom, and in many medical clinics,” said McHugh.
It is biologically false that one can exchange one’s sex, explained McHugh.
“Transgendered men do not become women, nor do transgendered women become men,” he said. “All (including Bruce Jenner) become feminized men or masculinized women, counterfeits or impersonators of the sex with which they ‘identify.’ In that lies their problematic future.”
When “the tumult and shouting dies,” McHugh continued, “it proves not easy nor wise to live in a counterfeit sexual garb. The most thorough follow-up of sex-reassigned people—extending over 30 years and conducted in Sweden, where the culture is strongly supportive of the transgendered—documents their lifelong mental unrest.”
“Ten to 15 years after surgical reassignment, the suicide rate of those who had undergone sex-reassignment surgery rose to 20 times that of comparable peers,” said McHugh.
Nonetheless, the false “assumption that one’s sexual nature is misaligned with one’s biological sex,” can be treated with therapy and medication, said McHugh.
He further stressed that, “What is needed now is public clamor for coherent science—biological and therapeutic science—examining the real effects of these efforts to ‘support’ transgendering.”
“But gird your loins if you would confront this matter,” warned Dr. McHugh. “Hell hath no fury like a vested interest masquerading as a moral principle.”
Dr. McHugh’s article, Transgenderism: A Pathogenic Meme, can be read in full at the website of The Witherspoon Institute.
By Michael W. Chapman, published on May 5, 2016, in CNS News.
By Glenn T. Stanton, published on January 22, 2018, in The Federalist.
Is churchgoing and religious adherence really in ‘widespread decline’ so much so that conservative believers should suffer ‘growing anxiety’? Absolutely not.
“Meanwhile, a widespread decline in churchgoing and religious affiliation had contributed to a growing anxiety among conservative believers.” Statements like this are uttered with such confidence and frequency that most Americans accept them as uncontested truisms. This one emerged just this month in an exceedingly silly article in The Atlantic on Vice President Mike Pence.
Religious faith in America is going the way of the Yellow Pages and travel maps, we keep hearing. It’s just a matter of time until Christianity’s total and happy extinction, chortle our cultural elites. Is this true? Is churchgoing and religious adherence really in “widespread decline” so much so that conservative believers should suffer “growing anxiety”?
Two words: Absolutely not.
New research published late last year by scholars at Harvard University and Indiana University Bloomington is just the latest to reveal the myth. This research questioned the “secularization thesis,” which holds that the United States is following most advanced industrial nations in the death of their once vibrant faith culture. Churches becoming mere landmarks, dance halls, boutique hotels, museums, and all that.
Not only did their examination find no support for this secularization in terms of actual practice and belief, but the researchers also proclaim that religion continues to enjoy “persistent and exceptional intensity” in America. These researchers hold that our nation “remains an exceptional outlier and potential counter example to the secularization thesis.”
What Accounts for the Difference in Perceptions?
How can their findings appear so contrary to what we have been hearing from so many seemingly informed voices? It comes down primarily to what kind of faith one is talking about. Not the belief system itself, per se, but the intensity and seriousness with which people hold and practice that faith.
Mainline churches are tanking as if they have super-sized millstones around their necks. Yes, these churches are hemorrhaging members in startling numbers, but many of those folks are not leaving Christianity. They are simply going elsewhere. Because of this shifting, other very different kinds of churches are holding strong in crowds and have been for as long as such data has been collected. In some ways, they are even growing. This is what this new research has found.
The percentage of Americans who attend church more than once a week, pray daily, and accept the Bible as wholly reliable and deeply instructive to their lives has remained absolutely, steel-bar constant for the last 50 years or more, right up to today. These authors describe this continuity as “patently persistent.”
The percentage of such people is also not small. One in three Americans prays multiple times a day, while one in 15 does so in other countries on average. Attending services more than once a week continues to be twice as high among Americans compared to the next highest-attending industrial country, and three times higher than the average comparable nation.
One-third of Americans hold that the Bible is the actual word of God. Fewer than 10 percent believe so in similar countries. The United States “clearly stands out as exceptional,” and this exceptionalism has not been decreasing over time. In fact, these scholars determine that the percentages of Americans who are the most vibrant and serious in their faith is actually increasing a bit, “which is making the United States even more exceptional over time.”
This also means, of course, that those who take their faith seriously are becoming a markedly larger proportion of all religious people. In 1989, 39 percent of those who belonged to a religion held strong beliefs and practices. Today, these are 47 percent of all the religiously affiliated. This all has important implications for politics, indicating that the voting bloc of religious conservatives is not shrinking, but actually growing among the faithful. The declining influence of liberal believers at the polls has been demonstrated in many important elections recently.
These Are Not Isolated Findings
The findings of these scholars are not outliers. There has been a growing gulf between the faithful and the dabblers for quite some time, with the first group growing more numerous. Think about the church you attend, relative to its belief system. It is extremely likely that if your church teaches the Bible with seriousness, calls its people to real discipleship, and encourages daily intimacy with God, it has multiple services to handle the coming crowds.
Most decent-size American cities have a treasure trove of such churches for believers to choose from. This shows no sign of changing. If, however, your church is theologically liberal or merely lukewarm, it’s likely laying off staff and wondering how to pay this month’s light bill. People are navigating toward substantive Christianity.
The folks at Pew have been reporting for years that while the mainline churches are in drastic free fall, the group that “shows the most significant growth is the nondenominational family.” Of course, these nondenominational churches are 99.9 percent thorough-blooded evangelical. Pew also notes that “evangelical Protestantism and the historically black Protestant tradition have been more stable” over the years, with even a slight uptick in the last decade because many congregants leaving the mainline churches are migrating to evangelical churches that hold fast to the fundamentals of the Christian faith.
When the so-called “progressive” churches question the historicity of Jesus, deny the reality of sin, support abortion, ordain clergy in same-sex relationships and perform their marriages, people desiring real Christianity head elsewhere. Fact: evangelical churches gain five new congregants exiled from the liberal churches for every one they lose for any reason. They also do a better job of retaining believers from childhood to adulthood than do mainline churches.
The Other Key Factor: Faithful People Grow More Children
There is another factor at work here beyond orthodox belief. The University of London’s Eric Kaufmann explains in his important book “Shall the Religious Inherit the Earth?” (he says yes) that the sustaining vitality, and even significant per capita growth, of serious Christian belief is as firmly rooted in fertility as it is in faithful teaching and evangelism. Globally, he says that the more robust baby-making practices of orthodox Jews and Christians, as opposed to the baby-limiting practices of liberals, create many more seriously religious people than a secular agenda can keep up with.
The growth of serious Christian belief is as firmly rooted in fertility as it is in faithful teaching and evangelism.
Fertility determines who influences the future in many important ways. He puts it bluntly, “The secular West and East Asia are aging and their share of the world population declining. This means the world is getting more religious even as people in the rich world shed their faith.”
Fertility is as important as fidelity for Christianity and Judaism’s triumph from generation to generation. Kaufmann contends, “Put high fertility and [faith] retention rates together with general population decline and you have a potent formula for change.”
It comes down to this: God laughs at the social Darwinists. Their theory is absolutely true, but just not in the way they think. Those who have the babies and raise and educate them well tend to direct the future of humanity. Serious Christians are doing this. Those redefining the faith and reality itself are not.
This is why Orthodox theologian David Bentley Hart proclaimed in First Things, long before the proposal of the Benedict Option, that the most “subversive and effective strategy we might undertake [to counter the culture] would be one of militant fecundity: abundant, relentless, exuberant, and defiant childbearing.” The future rests in the hands of the fertile.
What About All the Millennial Ex-Christians?
But what about our young people? We are constantly hearing that young people are “leaving the church in droves,” followed by wildly disturbing statistics. This also requires a closer look at who is actually leaving and from where. Pew reports that of young adults who left their faith, only 11 percent said they had a strong faith in childhood while 89 percent said they came from a home that had a very weak faith in belief and practice.
It’s not a news flash that kids don’t tend to hang onto what they never had in the first place. Leading sociologist of religion Christopher Smith has found through his work that most emerging adults “report little change in how religious they have been in the previous five years.” Surprisingly, he also found that those who do report a change say they have become more religious, not less. This certainly does not mean there is a major revival going on among young adults, but neither does it mean the sky is falling.
Add to this Rodney Stark’s warning that we should not confuse leaving the faith with attending less often. He and other scholars report that young adults begin to attend church less often in their “independent years” and have done so for as long as such data has been collected. It’s part of the nature of emerging adulthood. Just as surely as these young people do other things on Sunday morning, the leading sociologists of religion find they return to church when they get married, have children, and start to live a real adult life. It’s like clockwork and always has been. However, the increasing delay among young adults in entering marriage and family is likely lengthening this gap today.
More Americans Attend Church Now Than At the Founding
What is really counter-intuitive is what Stark and his colleagues at the Baylor Institute for Studies of Religion found when looking at U.S. church attendance numbers going back to the days of our nation’s founding. They found that the percentage of church-attending Americans relative to overall population is more than four times greater today than it was in 1776. The number of attendees has continued to rise each and every decade over our nation’s history right up until the present day.
People are making theological statements with their feet, shuffling to certain churches because they offer what people come seeking: clear, faithful, practical teaching of the scriptures, help in living intimately with and obediently to God, and making friends with people who will challenge and encourage them in their faith. To paraphrase the great Southern novelist Flannery O’Connor, if your church isn’t going to believe and practice actual Christianity, then “to hell with it.” This is what people are saying with their choices.
Or as Eric Kaufmann asserts, “Once secularism rears its head and fundamentalism responds with a clear alternative, moderate religion strikes many as redundant. Either you believe the stuff or you don’t. If you do, it makes sense to go for the real thing, which takes a firm stand against godlessness.”
If your Christianity is reconstituted to the day’s fashion, don’t be surprised if people lose interest in it. Few are seeking 2 Percent Christianity. They want the genuine deal, and the religious demographics of the last few decades unmistakably bear this out.
Don’t get married before you live together. You just never know what the other person will be like to live with, and you need to figure that out before marriage.
Definitely don’t tie the knot until you’ve traveled together. You absolutely have to find out if your future spouse is a good travel companion – what if you get married and then learn they cry during turbulence but not The Notebook? If you don’t have money to travel, simply hitchhike together and see how your partner reacts when they have a gun to their head.
You need to make sure your marriage can withstand major life changes, so don’t get married until one of you has been fired from their job. If you like your job, then plan to marry someone who’s bad at theirs. Or intentionally sabotage them by hacking into their work email and sending nudes to their boss. If you’re uncomfortable sending your partner’s nudes, send your own. Marriage is about compromise.
Don’t get married before you’ve had children together. Seeing what the other person is like as a parent is key to determining if they’re the right person for you. If it turns out they’re a completely negligent parent, at least you know before you do something extreme like buying an expensive white dress. Throw the tester-baby out the window and call it a day.
You’ll be humiliated if your loved ones travel dozens of miles to see your nuptials and then you get divorced a mere two decades later, so don’t get married until one of you has completely altered their physical appearance. You want to make sure this marriage is about love, not just physical attraction and his trust fund. Shave your head, gain weight, cut your nose off, stay inside for 6 years — I really don’t care. Just do it.
What if weddings just aren’t for you? You need to find that out before you marry your partner, so don’t get married until you’ve married someone else first. Please consider me for the role of your starter bride. And then don’t get divorced — it’s a turn off to future lovers.
Don’t join in holy union until you’ve turned 25 together. It’s imperative that you’ve seen your partner hit the quarter-century mark and come out the other end. Ideally, this will happen to you on exactly the same day (if you’re a twin, you’re in luck!). You have no idea how many good relationships have failed just because one of them hit the wrong side of 25 and the other couldn’t handle it. If you meet after age 25, don’t get married. If you’re single at 25, sorry, but society warned you — excessively.
And don’t get married before you’ve both gone through periods of extreme depression. If your other half isn’t a naturally depressed person, try to induce it by depriving them of food and sleep. You need to know what they’re like when they’re nearly suicidal before your parents drop $60,000 on a wedding.
Are you “in love” enough to go visit them while they’re serving a life sentence in jail? You must find this out before you throw your whole life away! So instead, throw your neighbor Jeanine’s life away by murdering her, and then wait and find out if your LOML comes to see you every weekend. If so, you can walk down the aisle. Or, walk in a lap with the other prisoners. Orange is the new White.
Don’t get married before you’ve watched the other person die. Honestly, that’s a really traumatic event in a marriage, and you want to make sure you can handle it before you commit to spending your whole life together. Bravely volunteer yourself as the one to watch your partner meet their maker. This is usually the step where couples realize it’s not going to work out, so make sure not to skip it.
By Ginny Hogan
Published on February 13, 2019, in McSweeney’s.
It would be rather simple to write a series of articles discussing the positive and negative aspects of video games or commenting on the coolest graphics and best storylines. But such a set of articles could not genuinely be called “Catholic.” Something that is “Catholic” considers the whole of things, meaning it does not interpret reality as piecemeal or as a set of facts in isolation. The Catholic thinker is someone who contemplates, discusses, and writes radically (from the Latin radix, “at the root of things”), seeing reality as it is in its entirety; seeing a thing as it fits within the entire framework of existence. Thus, before we begin a dialogue about specific video games, we must first situate the topic within the context of civilization as a whole. We must go to the origin of this phenomenon and ask why it has taken the world by storm. The question therefore is, “Why are video games?”
Video games are first and foremost an expression of contemporary culture. A brief study in etymology will clarify our point. The word “culture” comes from the Proto-Indo-European root kwelə meaning “to revolve,” “sojourn,” or “dwell.” This would later evolve into the Latin word incola, “someone who inhabits/dwells” in a certain area. The activities of an incola in caring for his or her sustenance are described by the verb colere, “to cultivate/till” the earth. Colere is also a word of self-awareness, a recognition of humanity’s capacity for agriculture, construction, and landscaping. The human being is not like other creatures; humans can interact and cooperate with the world around them in a drastic way. One has only to recall the great edifices of Giza, Athens, and Rome for proof.
Even amidst their achievements, however, ancient people were mindful of mystery. They sensed that at the deepest core of reality, the world is given to man, not made by him. It is something simultaneously for us to subdue and beyond us to be wondered at. The ancients’ realization of this fact led to the development of the verb colere into the noun cultura (culture), denoting “an acknowledgment of” or “honoring of” those things which are essential to a community’s livelihood yet not under its immediate control. One could plant the seed at harvest time (colere), but ultimately, it was the cosmic work of Renenutet, Demeter, or Ceres to provide for its growth (cultura). By studying this etymological and historical relationship between the words colere (to cultivate) and cultura (cult/culture), we can come to a better appreciation of “culture” in the proper sense. Culture properly defined represents a claim about the human person’s role in the infrastructure of the world; it is the fruit of seeing where one truly is in the grand scheme of things; it is the expression of a person’s understanding of reality and their relationship to the order of the universe.
In light of the above reflection, let us return to our original question, “Why are video games?” Everything in a civilization is directly influenced by culture: language, food, clothing, music, inventions, architecture, etc. Each of these is a tangible manifestation of a metaphysical presupposition. In other words, the stuff we say, how we say it, what we wear when we say it, and the design of the building we say it in: all these things come from the same place. They are the fruits of culture, the consequences of a philosophical judgment made by society about the essence of reality. Video games are no different. As a matter of fact, I see video games as an apex expression of our postmodern technological culture. More than any other medium, video games respond to and affirm the keystone assertion of our civilization: reality is what I make of it. The following quote from Shigeru Miyamoto (the famous creator of Mario, The Legend of Zelda, Star Fox, F-Zero, Donkey Kong, and Pikmin) summarizes the point lucidly: “Players [Gamers] are artists who create their own reality within the game.”
As such, video games have become a fascinating place to see people recognize and deal with the fallout of postmodernity. The virtual world is a seemingly limitless medium in which gamers can experience, suffer, respond to, and escape the egoism, relativism, atheism, and mechanism of culture. I recall one person on YouTube who posted at the bottom of a video game soundtrack: “This Soundtrack, this Game…it feels like a therapy. Especially when you feel down it feels like every sound, every movement you make, everything you can see is there to heal your wounds, your soul…I really love it…” This comment is a perfect example of what we have been discussing.
On the one hand, video games make clear where our culture has failed, where we as a people have lost the language, skill, and discernment to engage the deepest and most vital facets of our being. On the other hand, video games are a rich mine in which to excavate the needs of our people so as to reintroduce basic human qualities and reignite the divine spark of a sedated society.
In the end, what we millennials and post-millennials want is the real world, not the artificial world. Our wanderings in the lands of Minecraft and the mountains of Skyrim are a crying out for reality, not a rejection of it. We long to witness the breathtaking beauty of creation, soar into the heights of authentic heroism, and experience the life-giving dynamism of true freedom. “We want reality!” This is the rallying cry of our generation. Unfortunately, many of us are convinced that it no longer exists. So we seek in the virtual world what we wish existed in the real world. The world outside our suburban home or terraced row-house is a cold, uninviting place flanked on all sides by the ravenous beast of materialistic industrialism and the constant noise of the machine. We sympathize with Romano Guardini when he first saw the decrepit smokestack of a modern factory disrupting the flawless majesty of Lake Como, Italy. At that moment, he knew the “world of natural humanity, of nature in which humanity dwells, was perishing” (Romano Guardini, Letters from Lake Como). A world of money, flashing billboards, and high-rise corporations is nothing compared to the peaceful islands of Uncharted 4 or the awe-inspiring scenery of Final Fantasy X.
Besides, why should we participate in the “real world” when all it seems to offer is passing fads, superficial pleasures, and relativistic opinions? We would rather save a magical kingdom, run through endless leagues of virtual pristine forests, or complete a daring mission to gain XP for our avatars. At least then we can feel like we have purpose; we can feel like we have the opportunity to achieve greatness and see a world left better by our living in it.
Show us something beautiful. Prove to us that the world outside our game room can be as inspiring, challenging, and fulfilling as the world within our game consoles. If you can do that, then you will awaken the hearts of millions and summon a generation of men and women ready to complete the greatest quest of all time: the quest for holiness and sainthood in Jesus Christ.