judicialsupport

Legal Writing for Legal Reading!


The Baby Is Still Not That Important

The phenomenon in question: parents choosing to raise gender-neutral babies, who are called “theybies.”

More and more parents in the U.S. are choosing to raise gender neutral babies. They use gender neutral words and pronouns for their children, and sometimes don’t disclose what’s in their babies’ diapers except to a very close circle of friends. These children are often called “theybies”—neither boys nor girls.

So, right off the bat, it’s really sad and wrong that “what’s in the diaper” is the very narrow way that a lot of people have come to think about sex and gender. I remember a long time ago trying–and failing–to work out in my own self the curious disconnection between the mind, body, and spirit that is the property of being human. The three war against each other, which is why God commands that all three should be redirected toward him. Leaving aside the wars that human people have with each other, the internal war of the spirit against the mind against the body is painful. We all live with it in a thousand tiny, disquieting ways. All the more reason not to add to the disintegration of the self under the guise of reintegrating it. It’s not “just” what’s in the diaper. It is a whole person who has a certain kind of biology, however broken and dysfunctional.

Incidentally, I do think it is interesting that this “theybie” thing is arising at the same time as insane gender (which should be sex) reveal events, some of which are so extravagant that participants have even died. Notice that the baby is still not being celebrated. It is a lot of broken people who don’t know God, don’t know themselves, and don’t feel comfortable about anything who are foisting an ideology on their children. Christians are accused of this, of course, but the accusation can absolutely be made the other way. The baby is not the important thing here. The underlying religious belief is, and the baby will have to get along as best “they” can.

So, there are five ways you can help parents raising Theybies, which is the point of the article/listicle/whatever. And the first way is to “Remember that the intention is liberation.”

Parents who choose not to gender their children are trying to carve out space for them to be their full selves, unencumbered by gender expectations that are oh so pervasive in our gendered world. They do not want their child’s genitals or chromosomes to dictate what they should play with or how they are treated by others. These parents want their children to get the opportunity to grow up to be the truest versions of themselves possible, and this is one of the ways they are trying to make that happen. Many studies have shown that children absorb gender stereotypes at a very young age, and that these implicit expectations are damaging to their self-expression and self-confidence. I sometimes hear people critiquing parents’ choices to use gender-neutral pronouns for their children as a way to force their own ideologies onto their kids. But isn’t the ideology of “girl” or “boy” even more constraining?

No, actually, it is not “even more constraining.” There is a link attached to the word “study” which I do not have time to click right now, but I happen to know that if you go looking for something in “science” you will always be able to find it because the hearts and minds of human people are darkened by sin, wickedness, and rebellion against the Creator.

Notice that the goal of parenting articulated here is “self-expression and self-confidence.” Much like the much confused American pursuit of happiness, which, having chucked the necessary element of virtue to the curb, has produced a generation of deeply unhappy people, so also recasting the purpose of parenting to be the rearing of a self-expressive child is bringing about the collapse of society. Think I’m being hyperbolic? I’m not. Children in this culture are unhappier than ever before, and unhappier than children anywhere else in the world.

Just like children are, in their created nature, sexed and gendered, so they are, by nature, self-expressive. What children need in order to be happy is to discover that they are not the center of the world. Their natural selves need to be curbed. They need to discover the riches of self-discipline, self-denial, and the truth that something greater than them (God) is the ruler and judge not only of the cosmos, but of their own little selves. They learn this first by having loving parents who help them see the pleasant and beautiful walls that keep them safe. They learn it by discovering that their parents (and by extension God) are merciful. They learn that they can stop crying when told to, stop touching things when told to, come when told to, sit when told to. They learn that there are other people more important than themselves—first their parents, then other adults and children, and most critically, God. The ears of their little minds are opened to the astonishing wonder of Jesus, and they discover that by loving him, they are able to love themselves and others, that they were created to enjoy him in peculiar and delightful ways with their minds, hearts, and bodies.

This vision is so much bigger than behavior or the paltry, ruinous idol of “self-expression.”

And I am so sorry, but my blogging hour is up and so I will pick up the second thing you are supposed to do tomorrow. I’ll destroy my routine and keep going over here at SF and do regular quick takes and book notes over on Patheos. See you tomorrow!

By Anne Kennedy, published on Patheos on January 16, 2020; it can be found here or here.

How America Killed Play—and What We Can Do to Bring it Back

In our last piece from our interview with play expert Dr. Peter Gray, we outlined the five criteria of play. For an activity to truly be considered play, it must:

  • Be self-chosen and self-directed
  • Be done for its own sake and not an outside reward
  • Have some sort of rules/structure
  • Have an element of imagination
  • Be conducted in an alert frame of mind

When you break it down like that, much of what modern parents think of as play doesn’t actually qualify. The truth is play has been gradually declining for the past five or six decades, but it seems to have come to a head in the last 10 years. According to the Child Mind Institute, American kids now spend an average of just 4-7 minutes a day on unstructured outdoor play, and elementary schools across the country are reducing or entirely eliminating recess. Play is an absolutely critical part of our youth, as it develops life skills in a way which is very hard to replicate elsewhere. How did this crucial component of the human experience get so diminished?

The 1950s were something of a “golden era” of play. The post-World War II baby boom left no shortage of potential playmates for a kid, and child labor laws passed in the late 1930s meant children could no longer be forced to toil inside factories or coal mines. Schools had multiple recesses throughout the day, the concept of homework barely existed, and the school year itself was about 4-5 weeks shorter.

“School was not the big deal it is today. Parents were not involved. You went home, you were home. School happened at school, when you were out of school, you were out of school,” says Gray, a research professor of psychology at Boston College and the author of the book Free to Learn: Why Unleashing the Instinct to Play Will Make Our Children Happier, More Self-Reliant, and Better Students for Life. The culmination of these factors created a generation where kids played for hours each and every day.

“You could go out anytime during daylight and you’d find kids playing with no adults around. Parents shooed you outdoors, they didn’t want you in the house—moms especially,” Gray says. Organized youth sports were still in their infancy, and if they did occur, they were a far cry from some of the ultra-expensive, ultra-regimented leagues that exist now. In some little leagues, the biggest or most mature kid on the team often acted as the coach, and there was rarely a parent to be found down the foul lines. But this golden age of play didn’t last forever.

The rise of television made the indoors more attractive, sure, but it was the shift in parental attitudes around school, sports and free time that really changed things. Elementary schools (and schools, in general) began placing a greater emphasis on testing results and homework. According to the University of Michigan, students aged 6-8 went from having 52 minutes of homework a week in 1981 to 128 minutes a week in 1997.

Sensationalistic news reports led parents to believe the world was becoming increasingly dangerous for their children, though statistics show the opposite was in fact true. As time has gone on, the outdoor world’s only become safer for our children. Either way, parents became increasingly uncomfortable with the idea of their child playing around town without adult supervision, and organized sports slowly came to replace play. As the demand for organized activities for younger and younger ages increased, organizations quickly met the demand. Parents stopped allowing their kids to walk or bike to practice, instead shuttling them there themselves.

“Kids going to games themselves by bike or walking became somehow dangerous. So parents felt the need to drive them there. Then if you’re going to drive them there, you might as well watch. Then it became a sort of parental duty to stay and watch. If you don’t stay and watch you don’t care about your child. So you’re supposed to be there, you’re supposed to be cheering your child on. You’re supposed to care if your child’s team wins or loses,” Gray says. “It was gradual, it happened over time. (Organized sports) came to replace actual play in people’s minds—this is how my child gets exercise, this is how my child meets other children, and so on.”

The undercurrent in all of this was the idea that play was largely a waste of time. Adults believed structured, adult-guided activities were of greater value to their children, so they began filling their children’s free time accordingly. As the commitments mounted, time for play decreased. “Instead of the idea that childhood was an idea of freedom and play and children were largely free of adults, we began feeling increasingly responsible for the children’s development,” Gray says. “And accompanying that idea was that children’s own activities are a waste of time.”

Of course, we now know that couldn’t be further from the truth. A 2018 report from the American Academy of Pediatrics confirms that play enhances creativity, imagination, dexterity, boldness, teamwork skills, stress-management skills, confidence, conflict resolution skills, decision-making skills, problem-solving skills and learning behavior. Play is an essential part of the human experience, and a lack of play can have troubling short and long-term ramifications for children.

A major benefit of play is what’s known as “risky play.” This entails engaging in play that creates some sense of fear. This often involves ascending to great heights (climbing a pine tree), moving at great speeds (riding a bike or swinging on a rope swing), play fighting (wrestling), going off on your own (hide and seek) or engaging with dangerous tools/environments. Risky play is a fundamental part of play. Children like to test their limits and innately know how much fear they can tolerate, and when they engage with fear and survive the experience, they become more resilient, confident and better-equipped to handle stress and anxiety. While play in general has decreased over the last five or six decades, risky play has been hit particularly hard due to overprotective parents. Playgrounds have become increasingly sterile in America—most are now devoid of equipment that allows you to confront any fear of heights or high speeds, and offer little challenge in the way of dexterity or agility.

“Natural selection has designed children to play in risky ways so they learn how to deal with risk…I can do this thing that stretches my physical and emotional abilities and I can survive it, I can do it. What you’re practicing is controlling your mind and body in a somewhat fear-inducing situation. But it’s a fear-inducing situation that you can control, you put yourself there. But what you’re learning is you can deal with feeling fear, you can hold yourself together. So when you experience something that produces fear in real life, it’s not a new thing to you,” Gray says. “I feel confident I can handle this instead of panicking. I think that’s part of the reason we’re finding a lot of lack of resiliency today, we’re finding a lot of people falling apart when something difficult happens in their life. Because they haven’t practiced this kind of play where they’re deliberately putting themselves into difficult positions and learning how to deal with that.”

Gray notes that continually decreasing levels of play have coincided with increases in depression and anxiety among young people. In a 2014 TEDx Talk, he outlined how five to eight times as many children now suffer from major depression or a clinically significant anxiety disorder as compared to the 1950s. Questionnaires have also revealed a continuous decline among children and young adults in the feeling that they have “control over their own lives.” They’re increasingly micro-managed and have limited chance to cut loose or follow their intuitions. It’s not an exaggeration to say a lack of play may be at the heart of increased anxiety and decreased resiliency in young people. It’s not their fault—they’re simply ill-equipped to handle life’s ups and downs.

How can we put play back in our children’s lives? We’ll get to integrating more “true” play in a second, but you can start by shifting certain organized activities into more playful states. Are there ways to help them self-select and self-direct more of what they’re doing? Or decrease the focus on outside rewards? Or foster a grander sense of imagination? The more an adult is telling them exactly what and how to do something, the less play is taking place. The U.S. Soccer pamphlet Best Practices for Coaching Soccer in the United States sums it up nicely: “Coaches can often be more helpful to a young player’s development by organizing less, saying less and allowing the players to do more. Set up a game and let the kids play. Keep most of your comments for before and after practice and during water breaks.”

In terms of pure or “true” play, we’re not getting back to the days of the 1950s anytime soon. However, some communities are fighting to bring play back with encouraging results.

Schools around the country are integrating “play clubs” and finding great success. These clubs typically take place on school grounds for 1-2 hours directly preceding or directly following the school day. Different equipment is set out for kids to play and experiment with at their leisure, and adult supervisors (of whom there are not many) are trained to intervene only when something truly dangerous is occurring. Gray recently observed an elementary school play club that takes place prior to the school day once a week (though they’re trying to make it more frequent) and was delighted with the result.

“Free play indoors in the school and outdoors, it’s age mixed, all grades K-5…It’s working wonderfully. It’s working partially because of the age mixing. Older children are helping to solve the quarrels among younger children,” Gray says. “Children are truly running in hallways, wrestling, playing chasing games, some old-fashioned games, very vigorous play. Here’s a situation where there are adults present, but the adults aren’t initiating activities (and) they’re not intervening. I was there for an hour, there were 150 kids, and I did not see a single case of an adult intervening. It went so remarkably well.”

Gray also offers up the idea of recreation departments including more sandlot-style activities among the more organized sports. It would be formalized in the sense it would take place at a given location at a given time, but it would really be just a way to get a bunch of kids together. A volunteer could help get games going during the first few sessions, but slowly step away and intervene less over time. New equipment could be added over time to help inspire different games or different styles of play.

“Maybe one parent is there at a time to help each other put their minds at ease. It begins with something more formal, but over time, that structure falls away,” Gray says. “I think that could catch on. I think there’s enough kids and enough parents who would want to do this as an alternative…Ideally, over time, the kids who are coming together (for this) every Saturday afternoon start realizing they can do it every other day, too.”

By Brandon Hall, published in Stack on March 25, 2019; it can be found here.

Prof: Today’s Students and Professors ‘Know Hardly Anything about Anything at All’

Six months ago we shared a frightening observation from Patrick Deneen, a political science professor at Notre Dame who has also taught at Princeton and Georgetown. He described his students as “know-nothings… devoid of any substantial knowledge.”

More recently, a respected author and English professor at Providence College in Rhode Island has echoed Deneen’s concerns.

In an essay titled “Exercises in Unreality: The Decline of Teaching Western Civilization,” Anthony Esolen describes a university climate today in which many students and professors no longer possess the knowledge and skills that their peers of previous generations took for granted:

“But what if you know hardly anything about anything at all? That is an exaggeration, but it does capture much of what I must confront as a professor of English right now, even at our school, which accepts only a small fraction of students who apply for admission. Nor, I’m afraid, does it apply only to freshmen. It applies also to professors.”

He explains:

“I now regularly meet students who have never heard the names of most English authors who lived before 1900. That includes Milton, Chaucer, Pope, Wordsworth, Byron, Keats, Tennyson, and Yeats. Poetry has been largely abandoned. Their knowledge of English grammar is spotty at best and often nonexistent. That is because grammar, as its own subject worthy of systematic study, has been abandoned. Those of my students who know some grammar took Latin in high school or were taught at home. The writing of most students is irreparable in the way that aphasia is. You cannot point to a sentence and say, simply, ‘Your verb here does not agree with your subject.’ That is not only because they do not understand the terms of the comment. It is also because many of their sentences will have no clear subject or verb to begin with. The students make grammatical errors for which there are no names. Their experience of the written language has been formed by junk fiction in school, text messages, blog posts, blather on the airwaves, and the bureaucratic sludge that they are taught for ‘formal’ writing, and that George Orwell identified and skewered seventy years ago. The best of them are bad writers of English; the others write no language known to man.”

Esolen’s above lament is supported not only by similar laments from his fellow professors, but also by statistics that show only a minority of American students are proficient in reading and writing, and by the fact that billions of dollars each year are spent on remedial courses in college.

Do you think that things can be turned around in the near future? Or are we destined to slip further into an educational dark age?

Dan is a former Senior Fellow at Intellectual Takeout. He received his B.A. in Philosophy and Catholic Studies from the University of St. Thomas (MN), and his M.A. and Ph.D. in Systematic Theology from Duquesne University in Pittsburgh, Pennsylvania. You can find his academic work at Academia.edu.

By Daniel Lattier, originally published on August 8, 2016 in Intellectual Takeout; it can be found here.

Politics and Rationality: On the Uses and Limits of Science

How rational is your politics, and how rational could or should politics be, in general? What is, and what ought to be, the role of reason and of science in policy-making or in campaigning? To answer such questions in a reasonable or scientific way, it would first be necessary to define such terms as “rationality,” “reason,” and “science.” That’s a nice Socratic-style challenge, anyway, and I’m not confident that people mean anything very clear or specific by them on most occasions. And, whatever they mean, the things themselves—conceived as faculties in people’s heads or as a series of procedures or guidelines for how to gain knowledge—have little to do with why anyone has the politics they do. People who think their own politics are rational and those of their opponents irrational (that is, more or less everybody) are engaged in a self-congratulatory self-delusion.

A traditional account of the faculty of rationality might be that it encompasses the canons of deductive and inductive reasoning and perhaps the scientific method (which it then is incumbent on the rationalizers to characterize in a general way). That is, rationality is an array of techniques, variously related, for getting true conclusions from true premises, or probable conclusions from probable premises, or data from experiments, or well-tested hypotheses from mere guesses: the rational procedures are the truth-preserving or truth-conducive procedures.

Then again, the alleged science of economics deploys a seemingly completely distinct conception of rationality, oriented to actions and agents rather than to generating true theories. Here, a rational person is one who pursues their own interests (conceived by economists, of course, as economic interests) by means that are most likely, or very likely, or fairly likely, or more likely than not, to be helpful in achieving those interests. In other words, a rational person is defined (admittedly this is comparatively clear) as one who knows how to get his, or who has effective techniques for securing resources, or, in short, who makes a whole bunch of money.

These two, or several, or many, senses of “rationality” may go back to Aristotle, who defined humans as “rational animals,” which raises doubts about whether he had ever met any of us. Aristotle defined “practical rationality” in terms of a certain style of deliberation, known as the “practical syllogism”: “I want thing X; action A will help me get X; so I’ll do A.” Of course, that leaves it entirely open what X is: it could itself be an irrational or evil goal.

Aristotle thought that we all had the same goal—happiness—and that the same means (study and friendship, for example) could help us each achieve it. But he did not give any rational reasons to prefer happiness to various other possible ultimate goals (union with God, for example, or a life of self-sacrifice), nor could he. Our goal, he thought, was built into our nature. Maybe so, but that does not in itself make it any more rational than any other goal. Also, it doesn’t make it clear what happiness (or, as contemporary versions have it, well-being) is, or why we should prefer it to other candidates for ultimacy; it just insists that happiness—itself an awfully vague concept, or a variable that just means “everything we want all at once”—is in fact our goal. But Aristotle at least connects what we might call “cognitive” and “deliberative” rationality, or perhaps logic, experimental science, and economic modeling, into something like the same conceptual structure, which is as much as anyone has done since, really.

As to the scientific method, which is supposed to be something clear enough for a teacher to scribble briefly on a blackboard: a general characterization is going to have to encompass the techniques, for example, of astronomers (instrument-aided observation), psychology (questionnaires), experimental chemistry (hypothesis and reproducible test), medicine (double-blind placebo studies), anthropology (immersion and empathy), and of course economics (statistics), among many other procedures. Good luck boiling it all down, or figuring out exactly which technique to use on a political or moral question, and how.

So, for example, let’s stipulate that science (whatever it may be, exactly) has delivered to us the truth that the planet is getting hotter because of human carbon emissions. It might also give reasons to think that certain procedures will be effective to ameliorate the problem. That’s when the practical syllogism or the economic model of rationality kicks in: if I want to ameliorate climate change I should act to reduce my emissions and to see whether I can convince you to do likewise. But I have many goals that I’m trying to achieve simultaneously, including goals that economists assert to be rational, such as maximizing my income, or paying as little as possible for the things I need. The sheer fact that I’m deliberating about how to reach some goal rationally isn’t going to help me decide which of these goals to pursue when they conflict. It’s not going to help me fix my ultimate goals, or order my goals in a list of priorities. In order to do that, I’m going to have to figure out what I really want, what I think is most important. On that matter, the practical syllogism, like particle physics, is silent.

In general, fixing our ultimate values—in politics or anywhere else—is not an activity that lends itself to rational deliberation. It rests, rather, on visceral commitment. If I think that justice is more important than tradition, or world peace than national borders, for example, I am going to have to screw up my emotions one way or another and make the choice. And to persuade you to do likewise, I am going to have to express passion, not present a series of practical syllogisms or scientific papers. No one’s politics is based on deliberative rationality. And no one’s politics is based on science, of course.

This is one thing that David Hume meant when he made his famous declaration that “Reason is, and ought only to be, the slave of the passions.” Another thing he meant was that while passion, emotion, or desire can motivate people to action, sheer reason cannot. Though people sometimes say that science demands that we act now, it demands no such thing. It might tell us that if we don’t act now, various things will happen. It can’t show us why we don’t want them to happen, or why we should try not to let them happen, if we don’t really care as much about being screwed in the long run as we do about what’s for dinner tonight. Reason might tell us that if we want dinner tonight we should go to the grocery store and crank up the grill; it can’t tell us how much to care, or what to care about. Perhaps reason is a group or a family of strategies for generating beliefs, but, if so, it looks like they are only tangentially related to each other. At any rate, when you’ve told me that I should select my political beliefs rationally, I still don’t know exactly what you mean, or how I possibly could.

Political scientists—who are an interesting kind of scientist—tell us that, statistically speaking, our political positions tend to follow our demographics. The sort of “predictive analytics” that drove Cambridge Analytica’s interventions in the 2016 campaign on behalf of candidates like Trump indicate the same thing. It seems that if I know your race, your region, your age, your gender, your education level, or what movies you watched last month, I can predict your political positions with a fair amount of accuracy. This would be a bizarre circumstance if people were coming to their political positions through rational procedures. The oft-remarked “tribalism” of American politics, which applies just as well to college professors as to truck drivers, gives the lie to the alleged fact that some of these people (the people you agree with, no doubt) are basing their politics on reason while other people (the people you oppose) are not. People, by and large, believe to belong. But at what rate we ought to value belonging: on that, science offers no help.

Perhaps science, whatever it may be, can provide some information that would be useful to us, given that we have certain purposes. It cannot give us purpose, however. If “rationality” meant something, our politics would turn out to be no more rational than we are, overall. What did you expect?

By Crispin Sartwell, published on December 29, 2010 in Quillette; it can be found here.

Why blackface still dogs the Mummers 50 years after it was banned


Court injunctions, human chains, riots, “blood in the streets” — it’s the 50th anniversary of the year blackface was banned in the Mummers Parade.

On Broad Street, two giant, white-gloved hands roll back to reveal the Ferko String Band, one of the oldest and most respected groups in all of Mummerdom; they’ve marched in every New Year’s Parade since 1923. Their routine involves an Irving Berlin tune; it’s not one of his most well-known ones, but sing along if you know the words:

I recall when I was small and the minstrels came to town

I was glad because my dad

Always used to take me down

Often my memory strays

Back to those wonderful days

Those good, old minstrel days. 

From the title, “Ferko’s Bringing Back the Minstrel Days,” on down, the group’s entry in last year’s New Year’s Parade was hardly subtle about what it was paying tribute to. Even if a parade goer wasn’t familiar with “minstrel” in its definition as “an entertainer, generally white, who performed songs and jokes in blackface and probably had his best days between 1830 and 1900,” there were plenty of hints. There’s an homage to Al Jolson. A large black piece of scenery is painted with the word MINSTREL and a floating, white-lipped smile; most of the musicians, nearly all of whom are Caucasian, have white-outlined grins — painted over tan, orange or red faces.

For the finale, out come four large, big-lipped, wide-grinning, top-hatted prop heads. The faces technically aren’t black — blackface has been banned from the parade for 50 years. Their skin is light tan: not blackface, so they’re not breaking the rule. It does, however, still resemble something torn from a creatively crayoned Amos & Andy coloring book.

The Mummers’ minstrel days ended in 1964. It was unusually warm and sunny on the day of the 64th New Year’s Parade, and at the time, decent weather usually meant a turnout of between a million and a million and a half spectators. But only 300,000 people showed up this time — for every one person who came, three or four decided Broad Street was not a good place to be.

That’s not surprising, given the circulating warnings about potential “blood in the streets” and rumors that black people had been recruited from Harlem and Chicago to wreak havoc. “If there is violence and we get beat, then we’ll take our beatings,” said Louis Smith, president of the Philadelphia chapter of the Congress of Racial Equality, of their plan to make a human chain to block the parade if blackface was allowed. Leave had been cancelled for the entire Philadelphia Police force, and thousands of officers were on duty. Newspapers were full of metaphors of impending explosion: “sitting on a volcano,” “powder keg,” “pouring oil on a fire.” Philadelphia had been on edge for a long time, and on this sunny January day, it seemed likely that the blackface ban might be what would push it over.

“If I can’t wear a black face, it’ll be brown or purple,” one Mummer protesting the ban had declared. And Mummer historian Charles Welch, taking notes from the judges’ seats, writes about a makeup-less Al Jolson, comics in “dark blue makeup, kinky hair,” and “a picture of a minstrel in blackface on a large poster with ‘Gone Yes — Forgotten Never.’”

In 1964 and 2013, these Mummers were adhering to the letter of the law rather than the spirit. The early groups were protesting, actively trying to infuriate people. In 2013, though, Ferko String Band members seemed genuinely surprised and hurt at the negative reaction to their show. (Calls to their captain went unreturned.) And so go the microbattles of race in the modern era, which so often boil down to an infinite, crescendoing loop of “But I followed the rules!” and “But you missed the point!”

The Mummers’ history with minstrelsy is, unsurprisingly, something that nearly all contemporary Mummers wish would just stay in the past. But one of their own always seems to dredge it up again: “Ferko’s Bringing Back the Minstrel Days,” or the Goodtimers’ “Al Jolson Sings Again” in 2003, or the South Philadelphia String Band’s request to be allowed to use blackface in a Cotton Club-themed show in 1985 (denied), or the blacked-up faces that stubbornly show up on Two Street year after year.

Without historical context to provide the leaky life rafts of nostalgia, tribute and homage, the recurrence of blackface seems baffling. And since most people don’t know the history, they’ll assume the most obvious explanation is the correct one: That Mummers are fucking racist.

But nothing’s ever that easy, is it?

In nearly every collection of newspaper photos of the New Year’s Parade, there are a few shots that always appear: A close-up on the face of a wide-eyed kid. Lots of photogenic Fancies — at the very least, someone in one of those enormous, feathered backpieces and a wide shot capturing the full length of a 100-foot-long cape. A beauty shot of a pretty wench who could almost pass for female. A close-up of someone from a string band playing a saxophone.

And then there’s the wide group shot capturing a crowd of strutting Comics, the sector that has historically devoted itself to satiric pantomime, usually of current events. The clown-like figures — some in dresses, most looking intoxicated — brandish parasols and smile up at the camera. After 1964, their faces tend to be painted to match their outfits. Before 1964, there’s usually a significant percentage in blackface. (The video below, from 1930, has some good examples early on.)

The history of the Mummers, particularly the comic brigades, is entwined with the blackface minstrelsy popular in the mid- to late 1800s. But the roots go back further, before South Philadelphia was even officially part of Philadelphia. It was a poor satellite town of immigrant laborers and free blacks, and its poorest neighborhood was a swampy, near-rural shantytown known for its garbage-fed pig herds. This was the Neck — the birthplace of Mummery.

The early history of the Mummers is inexact, says Christian DuComb — who teaches theater at Colgate University, wrote his doctoral thesis on racial impersonation in the Mummers and until recently was a member of the Vaudevillains NYB club — because nobody wrote it down. “Most of the Mummers’ own history is oral; it’s a working-class tradition, and the working class hasn’t always had the resources to write its own history.”

While the early days are fuzzy, one thing’s clear: In the 1830s, rowdy bands of proto-Mummers shooting guns off like Yosemite Sam started showing up at holiday time in arrest records and in the diaries of irritated rich people in Philadelphia proper, the area now called Center City. The drunk, costumed men were seen as a dangerous nuisance — the first formal parade in 1901 was essentially appeasement, as the city, unable to keep things under control in the holiday season, bribed the Mummers with prize money to hold their celebration in an organized parade on Broad Street. There was a definite sense of “At least we know where they are now.”

“In contrast to older, rural themes of semi-human disguise, Philadelphians commonly impersonated kinds of people,” writes Susan G. Davis in “Making Night Hideous,” a 1982 paper on historical South Philly holiday traditions published in American Quarterly. The two most popular types of disguises were the ones that required the least effort: “Wearing women’s clothing was an easy transformation and popular. … Dressing as a woman could be as simple as filching a sister’s dress.” And: “Blackface was a popular theme in the street Christmas from the 1830s. … Like transvestism, blacking-up was quick and cheap.”

“It’s an easy disguise to get access to — all you need to do is burn something and smear it on your face to at least partially mask your identity, especially if you’re performing or carousing at night,” says DuComb. “Blacking up” had been a practice in many of the home countries of the Neck’s residents, “but when the European practice migrated to the Americas,” and especially to the Neck, a neighborhood with a number of free blacks, “it quickly took on an obvious racial significance it may have previously lacked.”

And so, at least 70 years before the first formal New Year’s Parade, there it is, the origin of wenches and blackface: The first Mummers were too poor to buy costumes.

Imagine the reaction, if tomorrow, a couple weeks before the parade, a large group of women came forward and declared that the wench brigades were insulting caricatures and therefore offensive to women. (As wenches are now fairly inoffensive, picture the suffragettes-putting-babies-on-spikes ones of earlier days.) Their group, these women say, will be doing everything in their power to ensure no wenches march in the parade: legal action, boycotts, picketing, even blocking the parade with their bodies.

Unlikely to work, right? The wenches are a tradition going back hundreds of years, impossible to uproot in a few weeks. But, though it was just as long-lived a tradition in the Mummers, that’s essentially what happened with the blackface controversy in 1964.

1963 had been a frustrating, violent year for the civil rights movement, with most of the victories still on the horizon. Martin Luther King Jr. was thrown into jail. Bull Connor used fire hoses and police dogs on a nonviolent protest in Birmingham, Ala. Medgar Evers was murdered in Mississippi. Despite JFK’s request that it be put off for fear of violence, the March on Washington was held in August. In September, Birmingham rioted after the bombing of a Baptist church killed four little girls. And just before Thanksgiving, JFK was shot in Dallas.

Philadelphia, though not in the South, was tense. Between 1940 and 1960, the black population had more than doubled as hundreds of thousands of blacks fled the Jim Crow South in search of good jobs in Philadelphia’s industrial sector. Instead, the first arrivals were greeted by a highly segregated union system in which most skilled, technical and professional jobs were closed to blacks, and housing policies that crowded them into cramped ghettos, mostly in North Philly.

Later arrivals had it even worse — the defense-industry jobs dwindled after the war, and with the beginnings of white flight, blacks found themselves marooned in an increasingly empty and blighted city with few jobs of any kind available. Early civil-rights leaders like Sadie and Raymond Pace Alexander got promising things through City Council, but lacked the muscle to enforce them. In 1963, newly elected local NAACP president Cecil B. Moore (constantly referred to as “fiery” by reporters of the ’60s) had started pushing hard for more confrontation, a voice for the growing frustration with the status quo.

Parade Magistrate Elias Myers must have felt it in the air. In mid-December, without much fanfare, he banned blackface from the ’64 parade — perhaps because this would be the first time it would be broadcast nationally. A couple of days later, he found his Two Street home being picketed by a bunch of young Mummers. A few days after that, Myers reversed his decision and re-allowed blackface. And that was what set things off — Cecil B. Moore, the NAACP and other civil-rights groups converged on the Mummers, with only about a week before the parade.

Many Mummers resented the last-minuteness of it all, and that it was being led by outsiders who didn’t understand South Philly traditions — Moore and the bulk of his supporters were from North Philadelphia.

Interviews in Patricia Masters’ book The Philadelphia Mummers: Building Community Through Play suggest that a lot depended on north and south as well as black and white. One white comic club member who jokes that he’s “probably the last living person involved [directly] with the blackface controversy,” said he was confused about the suddenness of the anger about blackface: “If this was such a pressing issue, number one question I had at the time … was why wasn’t this brought up in February, March, April for discussion?”

Another man Masters spoke with, a black musician from South Philly whose brass band had been employed by Mummers for years, also saw Moore as an outsider: “Why he [Cecil B. Moore] brought them [the group of blacks] from North Philadelphia I have no idea, because they were not even associated with the parade. They were so far away. The blacks in the community in South Philadelphia lined the streets to see the parade because from Washington Avenue up to Lombard Street was a black area. … I think [Moore started the controversy] to start some problems, I really do.”

But Moore hardly invented black opposition to blackface. In March 1954, North Philadelphia Councilman Raymond Pace Alexander had introduced a resolution for official disapproval of  “the ridiculing, satirizing, or holding in contempt or derision any race of people on the part of organizations and club members of the Mummers Parade on New Year’s Day.”

After nine months, the Tribune headline “Alexander Wins Fight for Decency” is less accurate than a later description, “a victory of a sort.” That is, City Council politely asked judges not to award the taxpayer-funded prize money to “groups that used ridicule of a racial or religious group as their theme.” It’s hard to say exactly what would qualify as over-the-top “ridicule” in any given decade — one comic club wore white, pointy robes and called themselves the “Koo Koo Kan” for a few years in the ’20s — but the judges promptly awarded first-place prize money to “African Voodoo Warriors” in 1955. (You can observe for yourself what was allowed in the 1955 parade in this video clip.)

Cecil B. Moore was less interested in keeping the peace than in making things actually happen. Recalling covering desegregation in Philadelphia in 1963, journalist Lawrence O’Rourke wrote that “reporters who wanted a quote from black leaders frequently sought out the same people, starting with Judge Raymond Pace Alexander and his wife, Sadie, distinguished members of the black community. For more firebrand quotes, reporters called Cecil B. Moore.”

So when magistrate Myers brought back blackface, Moore and the NAACP immediately went after the Mummers’ money. Using the courts, protests, boycotts and political pressure, they targeted the Mummers’ parade permit, the taxpayer-funded prize money and lucrative broadcast deals. Now, both sides were angry.

Tensions peaked when the NAACP’s court motions were thrown out on New Year’s Eve, less than 24 hours before the parade was supposed to begin, with the impossible reassurance that “the city, which sponsors the parade, had promised that no part of the performance would offend anybody, racially or otherwise,” wrote the Associated Press. The patronizing tone infuriated activists, and they began to sound a bit threatening, as in this quote in the Tribune: “I’m worried about what’s going to happen when the parade passes Broad and South Streets,” [Moore] said. “I’m trying to prevent tension.” Every police officer who’d taken New Year’s Day off was called up for duty. The threat of imminent violence was now big news even outside the region; a front-page Washington Post article began: “It was New Year’s Eve in Philadelphia when the Rev. Henry H. Nichols told a reporter: ‘I pray the good Lord will stop the parade with a snowstorm.’”

Defusing the situation took the intercession of less-confrontational, old-guard black activists, church leaders and, some claimed, the good Lord. (Or a Nor’easter.)

“We prayed for snow on New Year’s Day and got it,” said the Rev. Nichols, after a sudden bout of bad weather caused the parade to be postponed only two hours before it was to start. “Now we are praying that understanding will come.” And in the three days the snow bought, Nichols and other older-school activists proposed a compromise that would bar both blackface and protesters from the parade. It was quickly approved by the same court that had just tossed out the NAACP’s requests.

The parade itself was an anticlimax. Few Mummers showed up in blackface, and those who did were removed from the parade. Activists stayed home. The most noticeable sign of tension, aside from the above-average number of police officers and below-average number of spectators — if you want an idea of what the day looked like, there are some really interesting snippets of silent footage of the police situation here and the Mummer protestors here — was probably Hammond Comic Club’s chants of “1, 2, 3, 4, we hate Cecil Moore!” being briefly audible on the national TV broadcast.

Hammond had headed the resistance to the blackface ban. At the parade, members wore no makeup at all, though apparently some applied blackface on the fly as they marched; they refused, as did a few other groups, to perform as they walked through a stretch of black neighborhoods; they even had a “sit-in protest.” According to historian Charles Welch, they “sat down in the middle of the street, some shouting, ‘Negroes sat down in City Hall, we’ll sit down here.’ … The police quickly moved in and forced the Mummers to rise. The entire incident lasted about 20 minutes, after which the paraders again started up the street.”

Welch also has notes on Hammond from the judging area: “HAMMOND: Blackface used by some Mummers — no reaction from the crowd. Young Negro boy dressed as an American Indian, elaborate costume, red makeup.”

The same kid turns up a week later in the Inquirer brief “50 in Blackface Join Parade Honoring Myers,” on a 200-Mummer parade in support of the magistrate’s ultimatum that he’d resign if the blackface ban was made permanent.

Though the Mummers have always been predominantly white, there were more black Mummers before the Depression hit, when minorities made up less than a tenth of Philadelphia’s population. Today, when “minority” describes nearly two-thirds of Philadelphia’s population, there are barely any. The brass bands, which provide music for the comics, tend to mostly have black members, but they’re being paid, don’t dress up and don’t call themselves Mummers.

Willis Fluelling, the kid who marched with Hammond, was proud to call himself a Mummer. Fluelling, now 64, has suffered several strokes over the past decade, and has difficulty speaking. However, he’s happy to express how much he enjoyed his first year in 1964 — a photo of him and a younger kid playing Custer to his Crazy Horse appeared in the Daily News. “The costumes were so extraordinary — I wanted to be in it, real bad. I was a kid, I just jumped right in. … Some white boys got me in.”

Was it strange being the only black Mummer in Hammond, and nearly the whole parade? “No, it wasn’t. No.”

Was being in Hammond during the blackface controversy strange for him? “No, it wasn’t.”

What was it like marching in the ’64 parade? “Nice.”

What did he wear? “An Indian costume — Custer’s Last Stand.” What was he doing? “Killing Custer.”

His wife, Bernette, interjects: “He was struttin’! Or whatever they call it.” She giggles at the feel of the unfamiliar word.

When the two met in 1971, Fluelling was still an active Mummer, which surprised Bernette. “He told me that he was marching in the parade, and, of course, I laughed. I told him at that time that I didn’t see any black people in the parade!” She laughs again. “And he said, ‘Well, come down Broad Street and see me parade!’ And I went down Broad Street and saw him parade; I was laughing! I was shocked because everybody knew him in the parade, since he’d been doing it since he was little.”

The Mummers were much less of a presence in her own childhood, growing up in a mostly black neighborhood “up on 22nd Street.” In fact, she says, she never really thought about them at all. “We had our own parades up our way — there wasn’t no white people in the parades we had,” she laughs, so the Mummers being mostly white didn’t seem unusual. It was just who lived in that South Philly neighborhood. “At that time, blacks was throwing more parades with precision — like, twirling the baton and doing the stepping.”

Fluelling says the blackface controversy didn’t bother him at the time, and that his feelings about the parade haven’t changed. “I figured that blackface was in the parade all those years, and nobody said nothing about it.”

Bernette jumps in: “He was young! We were all young at the time.”

And as kids, she says, they didn’t really follow the news. “All we were thinking was how pretty the costumes were. … We weren’t thinking about any of the overtones, because we didn’t know anything about that.”

After the two started dating, she says, they got more into studying the Bible, and he started spending his free time on that instead of the Mummers. More than that, Bernette would rather not say.  “We don’t get into political affairs — I have my opinion, but me and my husband would rather keep our opinion to ourselves; if you say something, it can get misconstrued.”

In 1963, a Mummer told the Associated Press, “Minstrelsy is a part of the tradition. … No offense ever was meant, and so far as I know, none was ever taken.” Five decades later, a surprising number of people are still using this argument anytime anybody is offended by anything. These days, it often comes with an undercurrent of “and fuck ’em if they can’t take a joke,” suggesting that 50 years of being called racist grates on the nerves.

The Mummers still have a race problem. It’s not the occasional tribute to Al Jolson or the Joey Vento-sponsored “Speak English/Our jerbs!” skit about illegal immigration. It’s the impression that they give off: that the Mummers are by white people and for white people, and don’t particularly care if anyone else likes them. “Anyone else” used to be an insignificant number. But since the first official parade in 1901, Philadelphia has gone from 95 percent white to 37 percent white. Not thinking about what things like “Ferko’s Bringing Back the Minstrel Days” will look like to 5 percent of the population is thoughtless; not thinking about what they look like to more than half of your city is slow suicide.

“It’s not just, ‘It made me uncomfortable,’ it’s ‘The Mummers are racist!’ That’s what they say,” says Stu Bykofsky. Last year, in response to a bunch of commentary on Ferko and other iffy performances (including “Indi-insourcing,” shown below) that year, the Daily News columnist and skilled troll responded by slamming people who were “finding other forms of ‘racism’ in the parade, drawing insipid conclusions from their aggressive ignorance.” His advice then: “If you’re offended, here’s a buck. Try to buy a sense of humor. Or an ounce of sense.”

The Philadelphia Police used to take an annual stab at estimating the number of parade spectators; the highest was 2 million in 1949, and estimates of over a million were common in the 1950s and ’60s. They stopped doing these after an estimate of 22,000 at the 1994 parade caused a lot of angry blowback. It was later revised to 60,000 to 70,000 — three drops in a bucket rather than one. Today, the turnout gets only an adjective, nearly always something like “thin,” “disappointing” or “anemic.”

Bykofsky has been a staunch advocate and defender of the Mummers for decades. But even as he skewers humor-lacking donkeys, he’s lamented the loss of the “Good Old Days,” and the decline in spectators, clubs, and attention the Mummers are dealing with these days.

Bykofsky says he doesn’t think the “accusations of racism” are a factor, but that something’s definitely been lost. He sounds genuinely sad. “I think that, for whatever reason, the Mummers’ time is past. This wonderful, beautiful, truly spontaneous folk celebration has run out of steam. For whatever reason.”

Is the massive decline in spectators because of TV? Less free time on weekends? The ungodly inefficiency of the parade of late? Yes, probably.

But here’s another correlation: Philadelphia’s white population has steadily and dramatically declined since the end of WWII; it’s a third of what it was in 1950.

[Chart: racial demographics vs. parade attendance]

It’s hard talking about this without using jargon like “privilege,” but I’ve avoided it because for many people, “privilege” evokes images of rich kids driving expensive cars. In this context, though, what it means is “a group that only has to think about this stuff in the abstract, because it doesn’t affect them.” Bykofsky rolls his eyes; though he scoffs at “white privilege,” he says that “black un-privilege” is obvious.

So let’s say it like this: The Mummers began as the smallest of the small, in the crappiest, poorest, most violent and gang-ridden area of South Philadelphia. But they crossed the line from weak to strong a long time ago, probably around the time the Irish or Italian or Polish crossed the line from despised ethnic minority to white. This isn’t saying that they were suddenly on top of the world or that their lives had no tough problems, but they were able to get the factory jobs that were closed to blacks in the ’50s and ’60s. They are no longer the “un-privileged,” but they don’t seem to recognize this.

When the weak poke fun at the strong, it’s satire; when the strong poke fun at the weak, it’s just kind of being a jerk. And people remember that sort of thing.

So anyone who truly wants the Mummers to flourish will stop telling anyone who speaks up about things that offend them to sit down, shut up and to learn to take a joke. Because those people probably will shut up, but that silence isn’t the sound of a newly purchased sense of humor. It’s the silence of the deadliest predators of the new era: the raised eyebrow and the 180-degree turn, currently sucking the life out of the Republican Party, the Atlas Shrugged film adaptations and Charlie Sheen. If the Mummers don’t start thinking more seriously about what they’re saying to most of the city, there’s likely to be silence on Broad Street within a decade or two.

Court injunctions, human chains, riots, blood in the streets: With all the high drama swirling around the 1964 New Year’s Parade, how strange that in just 50 years, the greatest threat to the Mummers is not a bang, but a shrug.

By Emily Guendelsberger, originally published in Philadelphia City Paper on December 19, 2013; it can be found here.

Mapping the Incarnation: How the Christian Narrative Makes Sense of our World

I always wanted to align my life with what was true. Discovering the truth, unfortunately, proved to be rather more difficult than I had realized.

I stopped being an atheist while I was a student at Oxford University late in 1971, partly because of my growing realization of the intellectual over-ambition of the forms of atheism I had earlier espoused, but also because I came to realize that Christianity offered a way of making sense of the world I observed around me and experienced within me.

Christianity thus offered me a rationally plausible and imaginatively compelling “big picture” of reality that brought my worlds and concerns into focus. It is not so much a collection of isolated individual beliefs, but a web of interconnected beliefs, which gains its strength and appeal partly because of its comprehensiveness, and partly because of its intellectual and imaginative resilience.

Christian theology weaves together the threads of biblical truth to disclose a pattern of meaning – like a tapestry, which brings many individual threads together, thus allowing their deeper significance and interconnections to be appreciated.

No single thread can show that pattern; it only emerges through the theological process of weaving the threads together.

A central theme of this “big picture” is the incarnation. This idea came to be of particular importance to me as I began to grasp that atheism was much less intellectually resilient and existentially satisfying than I had once believed to be the case. One of my objections to Christian belief had been my feeling that God was existentially irrelevant. I conceived this non-existent God as a distant figure, without any involvement in the world. God was in heaven – wherever that was. And I was located in the flow of space and time that we call human history. Since God was absent from the flow of history, God seemed to me to be an irrelevance.

Yet as I began to grasp what Christianity was about, I came to see that the core Christian idea of incarnation addressed this deep existential concern. The incarnation spoke of a God who chose to inhabit history; who chose to come to the place which I inhabited as one of us; who suffered, as I and so many others did, but who also chose to make that same suffering the basis of our salvation. I discovered a God who journeyed to my place of exile in order to bring me home.

Some are drawn to Christianity because it offers a strong sense of identity and purpose; others because of the beauty of its vision of God and the world. In my own case, I experienced an intellectual conversion, which changed the way I saw and understood things. Both the New Testament and many early Christian writers speak of metanoia. Although this Greek term is often translated as “repentance,” this does not really convey the full richness of its meaning. The Greek term metanoia means something like “a complete change of mind,” a mental about-face, or a fundamental re-orientation of the way in which we think, leading to a new way of seeing or imagining the world and acting within it. The incarnation offers us this new way of seeing Christ – and in doing so, changes the way in which we see both God and ourselves.

While the doctrine of the incarnation helps us grasp the significance of Jesus Christ for humanity, it also tells us something about the kind of God that Christians love and worship. As I’ve already mentioned, during my own atheist phase I thought of God as a distant reality standing behind or outside history, detached from human existential concerns and shielded from the traumas of history. I could see no intellectual or existential case for believing in a God like that.

Yet the biblical affirmation that the “word became flesh and lived among us” (John 1:14) offers a radically different concept of God: not the abstract and remote “God of the philosophers,” but a God who cares for us; not as a passive distant observer, but as an active fellow traveller and constant companion within the historical process. God is someone we can know and address in worship and prayer. The philosopher Roger Scruton expressed this point rather nicely: “The God of the philosophers disappeared behind the world, because he was described in the third person, and not addressed in the second.” The “incarnation” is not a static and timeless idea, but the Christian way of interpreting something which happened – the life, death and resurrection of Christ – and its implications.

“Crucified under Pontius Pilate”

So how can we begin to do justice to the astonishingly rich Christian understanding of the identity and significance of Jesus Christ? One answer is offered by the British philosopher Mary Midgley, who argues that we need multiple maps to cope with the complexity of our world. No single map is good enough to do justice to the many layers and aspects of human existence. Alongside maps of our physical worlds, we need some way of representing deeper truths about human existence, and relating these to our everyday experiences. One map helps us to understand the shape of our world, and how it works; another helps us to understand our true nature and destiny, and why we are here. We need both maps to inhabit this world meaningfully. These two maps need to be superimposed on one another, allowing us to journey through our physical world and discover our meaning and purpose. The maps work at different levels; yet both are essential to human wellbeing and flourishing.

Our need for multiple maps is brought out by reflecting on a famous historical event – Julius Caesar’s crossing of the River Rubicon, just south of the Italian city of Ravenna. The Roman historian Suetonius is one of several writers who tell us how Caesar led his army southwards in 49 BC, crossing the Rubicon on his way to Rome from Cisalpine Gaul. This action can be mapped onto the physical landscape of Italy, so that the general course of Caesar’s march southwards towards Rome can be tracked. The Rubicon is not a particularly broad or deep river, so the physical act of crossing it was not remarkable in itself. So why do we remember this event?

To understand why Caesar’s crossing of the Rubicon is of such historical importance, we need to use another map. The Rubicon marked a political frontier between the territories of the Roman provinces, and the area controlled directly by Rome itself. This political map allows the physical act of crossing this river to be supplemented with an appreciation of its deeper significance. In crossing the river, Caesar was declaring war against the Roman republic, thus precipitating a civil war. If we are to appreciate the full significance of this event, physical and political maps need to be laid over one another.

The same principle applies to theological maps of meaning. The New Testament sees the death of Jesus Christ on a cross as being of decisive importance. The Creeds declare that Jesus was “crucified under Pontius Pilate,” making clear that this was an execution, which can be dated historically to the period during which Pontius Pilate was prefect of the Roman province of Judaea (AD 26-36). It can be situated geographically somewhere immediately outside the ancient city walls of Jerusalem, although the archaeological evidence is not sufficiently clear to allow the site to be located more precisely.

Yet here is the point: a geographical, legal and historical mapping of the crucifixion of Jesus Christ fails to disclose its full significance. Paul, repeating the compact summary of the Christian faith that was passed on to him after his conversion, speaks of Christ’s death using an additional theological map. It was a prophetically predicted event which had the potential to transform the human situation. “I handed on to you as of first importance what I in turn had received: that Christ died for our sins in accordance with the Scriptures” (1 Corinthians 15:3). Jesus Christ thus did not simply die; he died for our sins. The historical assertion that Jesus died is affirmed, but it is supplemented by an understanding of the significance of this event, using a theological map of meaning.

We can thus think of the incarnation in terms of a theological map, which brings the life, death and resurrection of Jesus Christ into sharp theological and spiritual focus. It takes nothing away from the historical narrative, but allows the full significance of that narrative to be grasped. Above all, it allows the narrative of Jesus Christ to be connected with God and humanity.

Christianity’s grand narrative

How does the incarnation fit into this loose and shifting intellectual context? The rise of postmodernism reflects – or has created, depending on your perspective – a growing confidence in the power of narratives to express and communicate deep truths. Postmodern thinkers may have misgivings and suspicions about the ambition of “grand stories” or metanarratives; they have, however, no problems about narratives themselves, realizing that it is impossible to give an account of our individual and communal lives without using the structure of a story.

The use of narratives in Christian theology is of especial importance, as the Lucerne theologian Edmund Arens reminds us:

“Storytelling is fundamental for faith because it is only through this act of telling that our story can be connected with that of God and Jesus; because this story must be told; and so that it can be told as an unfinished story into which the faithful write their own stories and, in doing so move the story forward. Thus at its basic level, the Christian faith has a ‘narrative deep structure’.”

You can see how this incarnational narrative is able to correlate three stories into a coherent whole, allowing our own stories to be connected with the biblical narratives about God and about Jesus Christ. As I hope to show in a forthcoming book on narrative apologetics, the Christian narrative and its many sub-narratives offer us imaginatively winsome and intellectually engaging ways of illuminating the meaning of life.

Christianity tells a story about God, humanity and the world, a story that pivots around the life, death and resurrection of Christ. The incarnation both gives coherence and focus to the entire Christian narrative, and allows us to grasp its relevance for human life and thought. Above all, it expands our vision of reality, helping us to realize that we too often satisfy ourselves with inadequate accounts of ourselves in the universe. As the American novelist and theologian Marilynne Robinson puts it, rationalism ends up imprisoning us within a limited world, diminishing our hopes and expectations, and failing to capture what is so important about being human:

“The modern world, insofar as it is proposed to humankind as its habitation, is too small, too dull, too meager for us. After all, we are very remarkable. We alone among the creatures have learned a bit of the grammar of the universe.”

The Christian story affirms that, and unfolds how, God has paid us the compliment of coming to where we are in the incarnation, taking our form, partly in order to free us from this restricted and restricting vision of reality. God constructs and unveils a new habitation, which we are invited to enter – and enjoy.

The New Testament and the long tradition of Christian reflection on its foundational documents affirm that God entered into the world of time and space in Jesus Christ: “The Word became flesh and lived among us, and we have seen his glory” (John 1:14). If God entered human history, when did this happen? Where did this happen? What did it look like? Who saw this happen?

Answering these questions demands a story – an incarnational narrative of the entry of God into our world by those who witnessed it and appreciated its significance: “We declare to you what was from the beginning, what we have heard, what we have seen with our eyes, what we have looked at and touched with our hands, concerning the word of life – this life was revealed, and we have seen it and testify to it, and declare to you the eternal life that was with the Father and was revealed to us” (1 John 1:1-2).

By Alister McGrath and published on July 16, 2018 in ABC Religion & Ethics and can be found here.

Christianity is the greatest engine of moral reform and cultural riches the world has known

Christians spread cultural values such as religious liberty, mass education, volunteerism and more

Often media outlets, it seems, are uninterested in religion — especially Christianity — except when it is connected to scandals or electoral controversies. Thus, we get a steady diet of news about sex abuse cover-ups, politicized preachers and more. Unfortunately, a great deal of this negative coverage is entirely deserved by Christians and their churches.

But as Christians reflect on the incarnation and birth of Jesus Christ, it is a good time to remember that Christianity has massively contributed to good in world history as well. Other religions have done so, too, and Christianity’s effects are impossible to disconnect from the Jewish tradition from which it sprang. Christianity, however, is arguably the greatest engine of moral reform and cultural riches that the world has known.

That’s a big claim, but many studies and books back it up. To cite just one, sociologist Robert Woodberry showed in a landmark 2012 article that Christian missionaries were responsible for much of the global spread of cultural values such as “religious liberty, mass education, mass printing, newspapers, voluntary organizations, and colonial reforms” from Latin America to East Asia. For a century, skeptical scholars have lambasted missionaries as tools of the British and American empires. Sometimes those charges were warranted, as significant numbers of mission stations became sites of economic exploitation, or worse.

But Woodberry demonstrated that the enduring effects of Christian missions were overwhelmingly positive for the countries receiving them. Even when many indigenous people did not convert to Christianity, they still enjoyed benefits of Christianity’s influence. Where Christian (especially Protestant) missionaries went, Woodberry found higher levels of education, literacy, economic flourishing, the rule of law, and effective government than elsewhere. In other words, Christianity has inculcated healthy patterns of democracy.

Similarly, who could imagine Western civilization’s art and literature without Christianity? The Christian influence on Western societies, especially the United States, is so pervasive that it is easy not to notice it, especially as our biblical literacy has declined over the past half century. From Leonardo da Vinci to Caravaggio, you’d be hard-pressed to find a major European artist before the modern period who didn’t paint the Nativity of Christ, or the shepherds, or wise men. The Kimbell Art Museum’s The Adoration of the Magi by Renaissance artist Jacopo Bassano is just one of hundreds of evocative depictions of the Christ child between the 1400s and 1800s. Through the early 20th century, scenes from the Hebrew Bible and the New Testament were the most common sources of inspiration for great artists.

In spite of our increasingly secular culture, we still see Christianity’s impact in countless instances today in art, literature, music and movies, such as those of Austin-based filmmaker Terrence Malick. But perhaps the most important way America depends on its lingering Christian heritage is in works of charity. For example, a Baylor-sponsored 2017 study demonstrated that faith-based groups such as the Salvation Army supply almost 60 percent of all emergency shelter beds for America’s homeless. The Salvation Army and the Southern Baptist Convention also manage two of the three biggest disaster-relief agencies in America (the other is the Red Cross). And these represent just a sliver of the vast spectrum of benevolent works provided by denominations and other faith-inspired organizations.

So despite the unrelenting bad news about Christians, I shudder to think of a world to which Christ never came. His arrival brought hope of human flourishing and redemption, in this life and the next.

Why children’s lives have changed radically in just a few decades

Childhood has changed out of all recognition, says Barbara Beck. What does that mean for children, parents and society at large?

“When I was a kid, we were out and about all the time, playing with our friends, in and out of each other’s houses, sandwich in pocket, making our own entertainment. Our parents hardly saw us from morning to night. We didn’t have much stuff, but we came and went as we liked and had lots of adventures.” This is roughly what you will hear if you ask anyone over 30 about their childhood in a rich country. The adventures were usually of a homely kind, more Winnie the Pooh than Star Wars, but the freedom and the companionship were real.

Today such children will spend most of their time indoors, often with adults rather than with siblings or friends, be supervised more closely, be driven everywhere rather than walk or cycle, take part in many more organised activities and, probably for several hours every day, engage with a screen of some kind. All this is done with the best of intentions. Parents want to protect their offspring from traffic, crime and other hazards in what they see as a more dangerous world, and to give them every opportunity to flourish.

Originally published in The Economist on January 3, 2019 and can be found here.

Opinion: The Ts are out to erase the Ls, Gs, and Bs

Get the popcorn ready because they don’t even see it coming — at least most of them don’t.

For decades the LGBs have mastered the Saul Alinsky method in dealing with orthodox Christians and all those holding to traditional morality – “pick the target, freeze it, personalize it, and polarize it.” The bigot label has been applied to Bible believers with such precision that only a fool would confuse who has won the hearts and minds of the culture on the issue of sexuality.

No, the imminent blindside coming for LGBs won’t be originating from the religious boogeymen they’ve soundly defeated in the courtroom of public opinion. It will be coming from their fellow sexual revolutionaries – the Ts.

It has long been a political axiom that revolutionaries will eventually turn on one another. It’s the nature of a revolutionary, after all. There’s always a new cause, a new victim, a new enemy. And while the vanquishing of the Bible bangers and street preachers has brought these wildly divergent sexual lobbies together to unite in a common cause, anyone paying attention can see what’s coming.

Take this video clip from the 2019 gay pride documentary, “Are You Proud?” and listen closely to what the transgender activist says at the very end.

In a 2019 documentary about gay pride, ‘Are You Proud?’, someone proposes that in the future lesbians and gays shouldn’t exist. A far-right politician? No, a trans activist.

If you didn’t get all that, she says (quite presciently and logically I might add):

“It’s quite challenging to LGB people, because if gender is on a spectrum, then homosexuality doesn’t really exist cause it can only exist in a binary. So when it comes down to it, it’s really just two people, or maybe three, or whatever, loving each other. It has nothing to do with sexuality.”

If you listen closely, you can hear the transgender jackhammer busting a gaping hole in the foundation of everything this movement has claimed for the last three decades. The very nature of lesbianism, gayness, and bisexuality rests on the presupposition that there exists a so-called “gender binary.” That is, there are boys and there are girls.

  • A lesbian is a female who has romantic and sexual attractions to other females.
  • To be a gay man is to be a male who has romantic and sexual attractions to other males.
  • Bisexuals are those who are either male or female, but who experience romantic and sexual attractions to both males and females.

Meanwhile, the entire premise of transgenderism is the belief that there is neither “male” nor “female.” Instead, all beings exist on a sliding scale of gender identity, which makes any appeal to a male/female gender reality oppressive.

In transgenderism, lesbianism is a ruse because you can’t really be female, and what you’re attracted to can’t really be female either. Ditto that for gayness and bisexuality. They don’t really exist; they can’t exist if transgender theory is to be accepted as viable, legitimate, and true. In other words, as the activist in the video states, “homosexuality doesn’t really exist.”

It’s kind of funny to think that not long ago gay crusaders were standing in solidarity with transgender culture warriors in demanding that society not “erase” trans identity. Seeking to erase lesbian and gay identity is a most peculiar way of saying “thank you,” it would seem.

By Peter Heck and published in Disrn on December 11, 2019 and can be found here.

It’s time to put down our swords in the culture wars and talk about the Bible

“We may be through with the past, but the past ain’t through with us.” So says Jimmy Gator in the 1999 film Magnolia. Twenty years later, that line retains its authenticity. There’s just no point in denying that everything has a back story.

Tom Holland is an acclaimed British writer, who has risen to fame as a popular historian over the last 15 years.

His most recent book, Dominion, is an intriguing attempt to recover our culture’s past, in spite of our best attempts to look the other way.

Holland’s target is the place of Christianity in shaping the Western mind, and his point is simple: we’re all kind of Christian now.

This might come as a surprise to many, both believer and unbeliever. Our contemporary moment works with a baked-in back-story about the waning of religion.

Belief has moved from being normal to being optional, and a fading option at that.

Holland has no quibble here; indeed, his own personal story includes surrendering his childhood faith, and he remains philosophically agnostic, albeit sympathetic to Christian morality.

No, this is not an argument for reconversion. Instead, the point of Dominion is far more subtle, and for that reason, far wider.

Why would a culture believe weakness is strength?

Holland claims that Christianity has revolutionised our posture towards everything: be it religion, sex, power, love, or people, our secular age remains instinctually Christian.

Surely not, I hear you say, and fair enough too. We’re not in the mood for this right now. But 500 pages of compelling historical narrative and lively vignettes do offer plenty to chew on.

For example, if you think religion becomes authentic only when freely chosen and personal, then you’re riding in a slipstream partly forged by descendants of the Protestant Reformation.

Or when we all praise humility as a virtue, it’s easy to forget that it wasn’t always so. Why would a culture come to believe that the way of weakness is the way of strength?

Holland, along with many others, argues that this ethic has a date-stamp, because it is the story of Jesus that functions as a fountainhead for seeking the good of others above my own personal honour.

Take the pre-eminence of love as another example. The Beatles proclaimed all we need is love, and Taylor Swift just released an album which she says is a love letter to love itself.

Yet Dominion points out that our veneration of love as the chief end of life and our compass point for ethics has strong roots in the teaching of Jesus, Paul, and St Augustine.

Finally, it seems obvious today that every person is entitled to dignity and rights.

It’s “self-evident.” Except that it isn’t. For as the historian Lynn Hunt has put it: “if equality of rights is so self-evident…why was it only made in specific times and places?”

What do we want to keep from the past?

Dominion argues that the democratisation of dignity rests upon the biblical claim that we are all made in God’s image.

This was then augmented by the early Christian claim that all people, no matter their social location, can experience salvation through Jesus.

It leads to the potentially uncomfortable conclusion that human rights originate because of the presence of religion, rather than its absence.

For each of these examples there is plenty we should debate. Dominion functions well as a first word rather than a last word.

But the book raises vital questions about the social function of history in general, and the meaning of our history in its particulars.

One of the general values of history is that it reminds us things didn’t have to turn out this way.

Values and practices don’t spring to life of their own accord. They have an origin and a story. It could have been different; it was different.

Which presents us with a challenge about how we assess our present and our future.

What do we want to keep from the past? And if we want to keep it, can we adequately nourish that future while we ignore or disparage the past?

But it is the particulars of our Western history that prove most intriguing in this work.

We can’t avoid Christ — for better or worse

Dominion concludes with a chapter examining how the values of Christianity are now wielded by a secular culture as weapons against the church.

In a world where we treasure personal freedom, praise acts of charity, and where we prioritise generous inclusion, Holland suggests that the moral disputes at the heart of our culture are between rival versions of Christian ethics.

For the witness of history says that the same Christians who practised radical generosity to the poor also practised intense forms of religious devotion and piety.

What was a seamless garment for them is now torn fabric in our modern culture wars. Even our art bears poignant witness to this tension.

Within the story world of the Handmaid’s Tale there are multiple instances where the text of the Bible is used by both oppressor and oppressed. So who does the text speak for?

If Holland is right, then it seems unavoidable that, as a culture, we need to talk about how we read and sift the Bible and Christian history.

From the most ardent believer through to those who’ve never darkened the door of a church, we would all do well to know how we arrived here, and to confess that we all use the past with confirmation bias.

But we might want to take a moment to put our swords down and ask — have I read this text right? Do I have the story straight?

As the historian Margaret MacMillan has said: “We use history to understand ourselves, and we ought to use it to understand others.”

If we are talking about ourselves here in the West, we can’t avoid Christ and the Christian movement, for better or for worse.

By Mark Stephens and published on November 3, 2019 in Abc.net.au and can be found here.
