Mortal Follies, by William Murchison

ONE
Sunset and Evening Star
And yet, before we can start to see where we are, we have to look with some attention at the places where we have been.
Nor need we travel far to get there. Many of us carry around intimate memories of the decade known as “the Fifties.” Others know the period by its legends, whether of repression, cultural twitching, or hi-honey-I’m-home complacency. A point easy to miss about the 1950s is that the period was deeply complex, neither one thing nor the other (as indeed might be said, in varying degrees, of all human eras).
I was there. So were many who read these words, and may appraise differently the factors I lay out for consideration. On one point we might agree. It is that whereas the 1950s fairly tingled with religious energy, and worshippers overflowed the churches, and to some observers the Kingdom of God seemed ready to break forth right here in the good old U.S.A., there was much more to be taken in—restlessness, disquiet, a growing sense of agitation.
The years that followed the defeat of the Axis Powers, and of the Great Depression as a notable bonus, were as good, seemingly, for American churches as for America itself. Everyone (so you might have guessed from externals) wanted to be a Christian. Returning veterans sought to get on with normal life, as they remembered or envisioned it. The church was a large part of that quest, holding out possibilities of community and fulfillment—and sometimes even of that spiritual redemption that churches were in business to encourage.
As families expanded with the onset of the baby boom, steeples rose and pews multiplied to accommodate their needs. Religion, one might have said, was in fashion, possibly to a degree previously unknown even in a country proud of its longtime sacred commitments. By 1960, a truly flabbergasting 69 percent of Americans claimed membership in a church. Certain automatic assumptions filled the lungs. Of the new lawyer or teacher or store manager in town, someone was sure to ask, in a welcoming spirit, “Where do you go to church?” Not, “Do you go?” The only conceivable question was, “Where?” It was a genial form of recruiting, and it expressed the social conviction that the place to be on Sunday mornings was in God’s house.
Popular culture, being popular, was suffused with religion—anyway, a sort of religion, obscuring more often than highlighting doctrinal differences. Where the customers led, the entertainment industry followed. The early Christian martyrs figured centrally in successful movies like Quo Vadis and The Robe. Charlton Heston parted, then un-parted, the Red Sea, to the admiration of millions who lined up to see The Ten Commandments, never mind their personal views on the original document.
Television, then coming into its own, offered as spiritual commentator and guide the eloquent Roman Catholic Bishop Fulton J. Sheen, whose weekly program, "Life Is Worth Living," began its influential run in February 1952. The program drew immense audiences, quickly landing Sheen on the cover of Time. Another consequential personality was the popularizing Protestant minister Norman Vincent Peale, whose book, The Power of Positive Thinking, published the year of Sheen's television debut, purported to establish a vital link between religion and success, a link never long out of sight in American business culture. There was also—powerfully, unforgettably—the Rev. Billy Graham, inviting the world, in fervent gospel preaching, to find and affirm its Savior.
New possibilities arose. Was it not time for Christian soldiers of every denominational variety to march onward in unity? The National Council of Churches was the great cooperative venture that twenty-nine denominations, representing 33 million Christians, launched in 1950. The Episcopal bishop who was its first president spoke of the NCC's goal: "a Christian America in a Christian world." (Could any other statement by a public figure so starkly demonstrate the distance between the sixth decade of the twentieth century and the first decade of the twenty-first?)
Congress played its own part in the religious revival, making us, via the Pledge of Allegiance, “one nation under God.” President Dwight D. Eisenhower himself declared that “our government makes no sense unless it is founded in a deeply felt religious faith—and I don’t care what it is.” The remark, despite exciting ridicule on the part of some theologians, was aimed, not inaccurately, at a broad and sensible consensus.
And yet were matters as good as they looked? If, at the popular level, religion—Christianity in particular—seemed the going thing, many "professionals"—theologians, pastors, priests—felt an unease bordering on discomfort. Were the churches doing what churches ought to be doing? Was it enough that people were showing up in increasing numbers for a closer walk with God? Perhaps, just perhaps, more was indicated.
As various professionals saw it, a great deal more was indeed indicated. A bubble needed puncturing—the bubble of popular complacency and social comfort. The war had not long been over when an Episcopal priest, the Rev. Theodore Wedel, had intimations that all this might be so. In a series of lectures published in 1950, just as the National Council of Churches was beckoning forward the massed ranks of Christian soldiers, Wedel lamented that “Christianity is today, among a majority of educated men and women, including many nominal Protestant Christians, an almost unknown religion.”
Unknown? How would that be, amid the bustle of masons and carpenters aggressively throwing together new churches? To Wedel’s mind, “Golden Rule idealism,” “moralism,” and biblical ignorance had become features of the religious landscape hardly less common than new white-painted steeples. “Main Street Christianity” was “a kind of Christianity without theology, one which does not repudiate the name of God, but which has basically little to do with him.”
Many shared Wedel’s apprehensions. The Jewish theologian Will Herberg found America’s “common religion” to be nothing less than the American way of life, “the operative faith of the American people.” With a “spiritual structure,” yes—one that “embraces such seemingly incongruous elements as sanitary plumbing and freedom of opportunity, Coca-Cola and an intense faith in education—all felt as moral questions relating to the proper way of life.” God functioned, said Herberg, “as sanction and underpinning for the supreme values of the faith embodied in the American Way of Life.” It all amounted, he thought, to “secularized Puritanism, a Puritanism without transcendence, without sense of sin or judgment.”
The bestselling sociological commentator Vance Packard declared wincingly that most Americans saw church attendance as “the nice thing that people do on Sundays. It advertises their respectability, gives them a warm feeling that they are behaving in a way their God-fearing ancestors would approve, and adds (they hope) a few cubits to their social stature by throwing them with a social group with which they wish to be identified.” A desired “social group”! Did anyone mention the Episcopal Church, with its ancient silver plate and gilt-edged membership rolls? There was some of that going about; there would be much more.
The Christian churches were rising—slowly, gradually, but emphatically—to critique, if not to flagellate, the culture from which they were receiving such hearty back thumps of fellowship and encouragement. There might seem some oddness in this, except for the greater oddness we will note as we go along—that of the churches’ coming to embrace cultural norms more hostile to traditional Christianity than anything on display in the postwar 1950s.
From the straight course the churches were making toward Scylla they tacked suddenly, pennants flying and canvas fully stretched, toward Charybdis. The 1960s began with thoughtful churchmen asking how the church might more fittingly identify itself with the culture’s genuine needs. The decade ended in religious warfare of a new kind. It seemed the churches of Jesus Christ were more than disappointed with the old culture; they were angry at it, and at its continuing manifestations—angry and ready to see in the old culture’s place a new cultural style altogether. As course corrections go, that of the American churches, in the 1960s, was something to behold, not least for velocity and noise.
Just what was it the churches felt, increasingly, called on to do? This took some working out. Methodists, Presbyterians, and the like, closely tied to the hard-working middle classes, took their time when it came to stirring. An Episcopal tradition of social commitment, observed irregularly but generally with heart and spirit, did begin to assert itself in the 1950s.
In a generalized sense, one Episcopal theologian, Norman Pittenger, had in 1950 called for “an awareness of the privilege of Christian discipleship . . . a consciousness of the fact that one is a Christian, called to a peculiar kind of life and a unique loyalty to the divine imperative.” Yes, but what then? The sociologist Peter Berger, in The Noise of Solemn Assemblies, published in 1961, identified the “what” as “Christian outreach to the distressed.” Indeed, said he, it was more than that. The call, a loud one, was to “modify the social structure itself,” and to erect “Christian signs in the world.” Not billboards—rather, plain evidences of Christ’s presence in the world. There had to be “complete identification,” Berger argued, with the conditions of daily life.
So also argued the Rev. Gibson Winter, Episcopal priest and professor at the University of Chicago Theological School, whose influential and much-quoted book, The Suburban Captivity of the Church, published the same year as Berger’s, called for repudiation of the model of the church as refuge and resting place. No! cried Winter. The church was a place for exertion. The manifest duty of America’s churches, he said, was to create “a human environment in the metropolis.” “The churches,” he affirmed, “must now become publicly accountable institutions with a vision of the metropolis and their mission to it.” Service to purely middle-class interests could no longer suffice, if ever it had. By such a model the church was “an island of conformity in the metropolis—a treadmill where men and women grind out their salvation.” Such an “introverted church” was a contradiction in terms. The church had to turn outward, in love and reconciliation. The church of Jesus Christ had, in short, to take itself, and its purposes, with utmost seriousness.
This was no extravagant claim. Most Americans understood, vaguely at least, the extent to which the Christian churches had influenced the course of their history, from the theological empires of New England where the anti-slavery crusade germinated, then on through the push for Prohibition. Particular ministers and laymen had attached themselves fervently to the cause of laboring Americans—preaching, urging a Social Gospel, a message of justice.
First had come the Congregationalist minister Washington Gladden (1836-1918), with his ringing appeals on behalf of the rights of labor. There followed the outspoken, and even more influential, American Baptist minister Walter Rauschenbusch (1861-1918), who read the Scriptures as a condemnation of "socialized, institutionalized, and militant" evil that "the Kingdom of God and its higher laws" could displace "only by conflict." To which end he foresaw "prophet minds" fighting "for the freedom of the people in political government and for the substitution of cooperation for predatory methods in industry."
In the years following World War II, the call to engage the world often fell upon willing ears, of which one set belonged to a young Episcopal priest with extraordinary credentials—a wealthy family, a Yale degree, and a Silver Star and Navy Cross awarded for gallantry on Guadalcanal. The Rev. Paul Moore, as he would write later, was “drawn to the cause of the poor and the persecuted.” With solemn intentions, this scion of a prominent New York banking family sought ministry, his young family alongside him, in slums, and downtrodden places. The world needed the church—that was Moore’s idea, and the idea of other Episcopalians like him, including a young Harvard-educated lawyer, William Stringfellow, who moved to East Harlem in order to provide the poor with legal services otherwise unavailable to them.
There was, at least in Anglican—hence Episcopal—thought, strong theological grounding for commitments of this kind. It lay in the “incarnational” understanding of Jesus as the enduring, undying savior, positioned at the center of human affairs, suffering for the sake of all, not merely for the well-born and comfortable who lived in climate-controlled homes, sent their children to good schools, and on Sunday mornings turned up, freshly scrubbed and accoutered, at the nearest Episcopal church. The nineteenth-century English theologian Frederick D. Maurice had embodied this understanding, with concern (as the twentieth-century American Protestant theologian Reinhold Niebuhr epitomized it) for “[t]he conversion of mankind from self-centeredness to Christ-centeredness.” Maurice stood against mere “religion”—heartless formalism scarcely cognizant of Christ’s kingship. By the 1940s, another English theologian, Alec Vidler, saw theology coming around to Maurice’s “emphasis on the solidarity of the human race, on the doctrine of the mystical body of Christ, on Christ’s Lordship over all men and his relationship to the whole created order.”
Incarnational theology was earthy, material—incarnational. That was what was meant by “incarnation”: Word made flesh; God here among us, as one of us. The here-ness, the now-ness of Jesus drew reinforcement and splendor from his “real presence” in the Eucharist, the elements of bread and wine rendered flesh and blood in the spiritual sense, the crucified Son of God taken on the tongue of the faithful worshipper. Not that all Episcopalians, or Anglicans, by any means shared this somewhat rarefied understanding. (The low-church style of worship put far greater emphasis on Scripture and sermon than on regular celebration of the Holy Communion, as the Eucharist was then styled in the Book of Common Prayer.) Yet the wonder and power of Christ here among us under the aspects of bread and wine made a powerful appeal to worshippers with hearts turned in the indicated direction. Paul Moore called the Eucharist “the pattern and the power”—“an exchange of our bodies for his body, of the Cross for Resurrection, of captivity for freedom, of death for life, of all else for joy.” On this intuition he built a consequential ministry.
As did, to somewhat different ends and purposes, the kinetic, not to say frenzied, bishop who until his death in 1969 would symbolize Episcopal wrestlings and fidgets. James Albert Pike founded no school of thought within Anglicanism, nor did he leave much trace of his compulsive activities, apart from a number of short, glib books that still turn up regularly in secondhand sales. Episcopalians scarcely knew whether to acknowledge him as prophet or provocateur. He was likely, in indeterminable portions, a mixture of both. As charismatic dean, in the early 1950s, of New York City’s Cathedral of St. John the Divine, and subsequently as bishop of California, Pike won for himself a large and attentive audience, outside as well as inside Episcopalianism. (For a few seasons he was host of his own local television show.) The man with the large black-framed spectacles and unstoppable mouth, wearing proudly the garments of church office, was given to judgments and observations not generally associated with men of the cloth. It was a major part of his reckless charm.
Surveying the Christian faith at large, Pike, like growing numbers of churchmen of non-Episcopal kidney, glimpsed mold, encrustation, and piles of rubbish. Old ideas in a new time simply weren't the ticket, not in the eyes of James A. Pike. At the 1964 General Convention, he attacked "outdated, incomprehensible, and nonessential doctrinal statements, traditions, and codes"—a considerable statement from a bishop, sworn like his brothers to drive from the church "all strange and erroneous doctrine."
“The fact is,” said Pike, “we are in the midst of a theological revolution. Many of us feel that it is urgent that we rethink and restate the unchanging gospel in terms which are relevant to our day and to the people we would have hear it; not hesitating to abandon or reinterpret concepts, words, images, and myths developed in past centuries when men were operating under different world views and different philosophical structures.”
Oh.
Pike, whose visage Time magazine featured on its cover (as it had featured that of Bishop Sheen), spoke with some acuity, and certainly fluency, to an age more full of skeptical questions and blunt assertions than of reverent assents. It was an age of hard facts, with yet a high regard for the experts who came in so many guises: television commentators, soapbox haranguers on college campuses, secular politicians, founders of causes. He was all the more in tune with the times on account of his willingness to drink constantly from different springs and wells and chalices just to sample the taste. That he loved the church seems clear enough—though his manner of showing affection outraged a large portion of the church's membership and leadership, who judged him to be putting souls in danger while bringing the church itself into disrepute. In the early 1960s, the Episcopal Church charged him formally with heresy, in response to an accusation by fellow bishops. There was general cringing at the idea of actual attempts to reprimand wrong teaching. Was there not about this whole enterprise the odor of wood smoke and burning martyrs? The church shrank back. Pike walked from his auto-da-fé bearing only a judgment against his theological "irresponsibility."
Which had been considerable.
The bishop of California, it seemed, would say nearly anything that came to mind. St. Paul, to Pike, was “crotchety.” Ancient doctrinal formulas such as “came down from heaven” were to the bishop “incredible . . . in this Space Age.” The Virgin Birth of Christ he found troublesome “for many intelligent people.” The doctrine of the Trinity, one God in three Persons, was “unintelligible and misleading to men of our day.” More “unintelligible” by far seems the notion that no harm could come of taking particular Christian doctrines and giving them a swift kick, conveying to listeners the idea that modern folk who held to these ancient Christian ideas really weren’t (wink, wink) People Like Us.
Large numbers of American Christians listened. Large numbers nodded, or nodded and chuckled and smiled, with appreciation. On some of the points that streamed forth (endlessly, it began to seem) from Jim Pike and other theological jostlers of elbows, there was room for reservation, if not furious objection. Yes. Of course. But you never could tell. What if, after all, there was something to this new business about straitjacketed thinking and scandalously unpaid debts to the Lord of Life? Consider. If the life of the world could run in new, exciting courses after so much darkness and division, might not understandings of life and eternity be ripe for reappraisal? Might it not be time to . . .
So much for the 1950s, its questionings, quibbles, and persistent sense of unease. The noise of solemn assemblies began to fade. The church—not in every place, by any means, and with many a finger crossed, or set of arms folded in opposition—began to move its stiffened legs. Dead ahead lay the 1960s. Those years would prove decisive.
I am at the Phoenix airport, changing planes as I head home from a West Coast university for Christmas break. The year is 1963. And what is this? A man, possibly a year or two younger than I, hurries past. I turn around. I look. No, the truth is, I stare—at a head of hair, thick and loose, the bangs, like window drapes, falling to the eyebrows. It might be a sheepdog, or Moe from "The Three Stooges." Then, as the saying goes, recognition dawns. This is a Beatle. An imitation Beatle, to be sure, but with the same hair we've seen in the newspaper photos relayed from England. One had known there was such hair; one just hadn't expected it outside its proper context, at the Phoenix airport. Suddenly the age of the male flat top and of Brylcreem ("A Little Dab'll Do Ya," the TV commercial promised) seemed less certain than before, to the extent one had thought to question its staying power.
It was possibly my first glimpse of that which later we would call "the Sixties." I do not recall the moment as unsettling or premonitory. On the other hand, I find, looking back, no coincidence in the timing. President John Fitzgerald Kennedy had been assassinated barely two weeks earlier, less than sixty miles north of my hometown. The two things I desired most to see on landing that day in Dallas were my father, who was due to collect me, and, on the drive home, the Texas School Book Depository, a helpless monument to horror, surrounded still by gawkers, with index fingers pointed toward the sixth floor. I forgot for a long time about my Phoenix airport encounter. Afterwards, concerning that hinge moment, a great many things came to me, things that were fading even then, though we had no inkling of it, and other things that were putting out small shoots and tendrils.
The times, as a not-yet-world-famous Bob Dylan was to sing (and as we were to hear incessantly afterwards) were a-changing. Equally, as some of the ancient Romans had said, we were changing with those times (et nos mutamur in illis).
Was there something new in all this? No change, no growth, is the law of life. Yet the changes we were to experience—or endure, as the case might be—in the 1960s and afterwards were deeper, darker, and more disruptive than anyone could have foreseen in that deep, darkening fall of 1963.
Older assumptions about life, about norms, about reality itself commenced a slow fade-out. Into focus came new assumptions, rattling alike the windows and the nerves. It was more than just a case of getting used to daily sensations like "campus protest" and "flower power," to cite two popular terms of the time. There was a sense, prevalent among the younger set but shared increasingly by older onlookers, that personal expectations suddenly counted much more than seemingly stale viewpoints and definitions. What did parents know, anyway? They were so . . . old! As were their notions about life and how best to get along in it. It was appropriate, seemingly, to live by the slogan, "Never trust anyone over thirty." (Until—naturally—becoming thirty yourself.)
Whatever justice and love and duty and hope had meant previously, these commodities no longer enjoyed special “relevance” (another then-popular term). A certain kind of sensitivity would lead us to the understandings necessary to carry on with modern life. What kind of sensitivity? Clearly the kind that people of sensitive outlook (people such as us!) were only too happy to employ. The logic of the new creed was never other than circular: What we say is so because we’re the ones saying it! Nor was it likely to be confused with the older wisdom founded on tradition and the slow, careful exploration of possibility and limits. It now seemed the very notion of limits was some archaic fantasy, some artifact in a dark attic made bright by the sudden flinging back of wool curtains.
An older culture was making way, with many a groan and grunt, for a new culture, one whose varied influences radically inform the living of life.
It is common nowadays to talk loosely of “the culture,” and of the various wars that rage within it. Learned books are written on the subject. Still, the term gives pause. Many of us think of the word “culture” as pertaining chiefly to artistic pursuits. People who went to the symphony and read books, perhaps even watched subtitled foreign films at the local arts theater, were “cultured.” Culture, in that specialized sense, was acknowledged a good thing, at least by those who thought about the matter. Culture stood over against barbarism, or, as some said, Elvis.
As we use the word today, "culture" mainly refers to an environment—moral, political, economic, or whatever—and a set of attitudes, actions, and assumptions associated with that environment. That is the definition I propose for present purposes. I mean by "culture" the ocean and all schools of fish that swim in it. I mean the modes of the larger society, in which institutions of every sort exist: viewpoints, mental habits, and crotchets; entertainments and obsessions, ideas and ideals, norms and non-norms, behaviors, memories, ways.
To taxonomize all things cultural is clearly not the task of an essayist (my self-definition), but of entire teams of sociologists, aided and abetted by those whom journalists always like to identify in news stories as “experts.” My own definitions of “the culture” are bound, for some, to fall short, a certainty I acknowledge with, I hope, appropriate regret. I would say in my own defense that anyone’s definition of a beast like “the culture” is bound to fall short. That is how things are. I invite argument and dispute as to terms, even as I implore the reader’s patience.
I have advanced the notion that America’s “mainline” churches in general, and the Episcopal Church in particular, whether meaning to or not, have placed themselves in at least partial thrall to the culture. I do not mean the whole of one church or another. I do not mean the whole of the culture. What I mean is that prominent, not to say dominant, elements of church and culture now carry on this intimate and destructive relationship.
I need, before going further, to say a word about culture generally and its effects on those fish swimming daily in its depths. The mistake I want earnestly to avoid is that of implying that over here is something called “culture” and over there something called “church,” and that the latter lives under some sublime obligation to keep the former always at arm’s length. That would be nonsense. Life has never worked that way, for all the occasional efforts of good Christian folk to isolate themselves from grossness and corruption: the desert anchorites, for instance, or the Amish, or even those who merely cultivate stricter standards of personal behavior (e.g., avoidance of “bad language”) than the larger society tolerates.
In practice, the church is forever rubbing elbows with, bumping up against, “the culture,” striving to infuse it with appreciation of duties and possibilities whose source is other than human will and intellect. Often the endeavor works, the church having the better of such disputes as may follow. In the early twenty-first century, human intellect and, especially, human will seem often to enjoy the upper hand.
Why? I contend we need, for the sake of social and moral stability, to seek an answer too long deferred. We need to seek it likewise for the sake of possibly imperiled souls. The place to start is, I think, with some account of the culture to which Christianity addresses itself, the culture all around us, the ocean in which we swim—or flounder.
If asked to assign our times a hallmark, I would answer by fusing two characteristics.
Characteristic 1. Personal autonomy.
Characteristic 2. Moral fragmentation, if not actual disintegration.
What we want to do we jolly well ought to be able to do, with no one to deny us. That seems, broadly speaking, and with room for numerous exceptions and variations, the nub of the matter. As the journalist David Brooks has limned the attitude: “[T]he core mission of life is to throw off the shackles of social convention and to embark on a journey of self-discovery. Behavior is not wrong if it feels good and doesn’t hurt anybody else. Sex is not wrong so long as it is done by mutual consent.” This is a picture of life lived without moral structure or all-encompassing purpose, life as we see it all around us.
We are by now deeply “into”—one more post-1960s expression—autonomy and leave-me-aloneness. Autonomy is independence of others and of claims imposed from the outside: by religion, by family, by social code, by almost any institution seeming to communicate preferences of one kind or another. Fine that someone else should prefer a thing. To require of others that same preference—no, no. Not any more. The commercial freedom of the marketplace and the political freedom of the polling place have subtly shaped and formed us.
So have historical events and occasions. The Protestant Reformation, which chopped up western Christianity into discrete fragments, including Anglicanism, gave religious proclaimers of all sorts—some wise and devout, others addle-brained and noxious—title to say whatever they liked to whomever they liked. The scientific achievements of the seventeenth century could be read as further celebrating individual vision as over against the all-seeing-ness, all-providingness of God. A century later, the thinkers of the French Enlightenment began to heckle and jeer the claims both of church and of state. To the jeering and heckling the French Revolution, which began in 1789, added guillotining. The old regime was not much of a regime: lazy, self-satisfied, open to imputations of corruption and tyranny. It may have deserved strong rebuke. What followed was out of all proportion to offenses rightly or wrongly imputed. For the authority of crown and church the revolution substituted the authority of the mob: loud, bloodthirsty, easy for orators and agitators to manipulate. Appropriating Notre Dame Cathedral to its own purposes, the mob elevated to preeminence the newly created goddess of Reason. It was not precisely what France's own Martyrs of Lyons, in Roman times, had willingly surrendered their lives to affirm—the triumph of human passion over the solemn prescriptions of God.
That which Tennyson had called "the red fool-fury of the Seine" receded during and after the Napoleonic wars, though not to its former banks. In Britain and America, Victorianism reclaimed some lost ground for Authority. All the same, Victorianism probably encouraged more liberation of one kind and another than it shooed away. That was thanks partly to the romantic movement in art and literature but more particularly to capitalism—the economic handmaiden of democracy—and a scientism (especially Darwinism) that came over time to disdain many of religion's putatively ignorant claims. The unseen God of the Christians took up less and less of modern folks' time and interest. A new form of "critical" biblical scholarship asked questions about the authority of those Scriptures that Christians had seen as reflecting the mind, if not recording the actual words, of the Lord God Almighty.
Still, at the start of the last century, and for some time afterwards, Authority was vertical and top-down in a way that can amaze us when we look back, provided we remember to look back. Tradition, meaning the distilled experience of the past, held considerable sway; so, in greater or lesser degree, did notions of self-denial. Common forms of address (“Yes, sir,” “No, ma’am”) and gestures of deference (“Ladies first”) told their own story. It was a story of cultural gradients. Everyone had the right to an opinion, but some opinions were more sensible than others—were, in fact, not mere opinions but, rather, well-settled statements about the world and its operations; also about the obligations that daily life entailed. It seemed—yes, really—there were duties that particular people owed to other people. One performed those duties because—well, because it was the right thing to do: “right” being a reality, just like the obverse reality of “wrong.”
I interrupt these remarks to flush from his Florentine tomb the philosopher-statesman Machiavelli, so that he may give fair warning about nostalgic delusions. "Men ever praise the past," wrote Niccolò, in his Discourses, "and find fault with the present." We have to be perpetually cautious, in other words, about unflattering comparisons of present times to past times. What I suggest here is, I think, empirically verifiable. Verticality and top-down-ness in culture gave way slowly but inexorably to horizontality—a side-by-sideness of ideas, outlooks, postures, assumptions, and beliefs, especially moral and social beliefs. (Disputes over distribution of power and property naturally went on, as irresolvable in the twenty-first century as in the first.) That an idea was old or venerable created no presumption in its favor. To the contrary, age darkened the filaments of the brightest ideas. The new culture read by different lights entirely.
Well before the turn of the present century, democracy, meaning voter sovereignty, passed over from the realm of political theory into daily life. Whatever you wanted, maybe that was after all your right, your entitlement. Not to the point of anarchy, perhaps, but farther in that tipsy direction than society had ventured before. Counsels of caution, and of respect for the wisdom of the past, got barely a nod from activists laboring at their separate, often uncoordinated, projects. What if, after all, there was no abstract right and wrong? What if there were multiple ways of understanding and embracing truth? What if truth itself was just a conceit invented to keep down the town rowdies? Or perhaps over time we came to see things in a clearer light than our ancestors had done. Bless their hearts, they may have had good cause to believe thus-and-so was right and authoritative, but times changed, information accumulated, new insights formed. The past had no hammerlock on our brains. Thus the periodic duty to sweep from our cultural closets those notions and practices unsuited to a new age. Out came the brooms, and up from different sides of the cultural spectrum went the respective yells of delight and horror.
If the 1950s were far more dynamic and less “conformist” than legend maintains, still it was the decade of the 1960s—hippies, Woodstock, Eastern religions, incinerated draft cards, “Make Love Not War,” marijuana and LSD, Black Panthers, the Age of Aquarius—that smashed up older concepts of authority and left them writhing in the street. No longer, it seemed, were particular ideas, particular modes, inherently “better” than others, and therefore more deserving of respect. Shelby Steele has rightly called the 1960s “a time when seemingly every long-simmering conflict, every long-standing moral contradiction in American history, presented itself to be made right even as an ill-conceived war raged on. And the resulting loss of moral authority was the great vacuum that literally called the counterculture consciousness into being.”
Not that, prior to the 1960s, Americans had shown much disposition toward forelock-tugging and the automatic ratification of a stranger's "should" or "ought." In this most democratic of nations, champions of the authoritative long ago made a sort of peace with the Great Oh, Yeah?—with the right to heckle, deeply ingrained in the American character. All the same, the 1960s were something new in our national experience: a time of defiance, provocation, and exhibitionism for their own sake, of fist-shaking and nose-thumbing all across the cultural spectrum. A favorite exhortation from the late 1960s and early 1970s became, loosely, a kind of watchword for the period: "If it feels good, do it!" To claim that these six words were the creed of a whole culture would be, as with any slogan, going infinitely too far. Still, this particular slogan encapsulated the increasingly common notion that personal choice trumped outdated rules and regulations. The whole appeal to openness and untapped possibility found lodgment in unlikely places, such as the universities and the churches, teaching institutions where the "vertical" approach to knowledge and instruction had generally held sway.
Virtually across the board, choice exerted itself as the determinative factor in art, in music, in self-expression; in courting and marriage and personal relationships; in the use of time itself. In 1973, the U.S. Supreme Court proclaimed the right of an American woman to decide entirely for herself, and without obstruction, the vexed question of whether to bring a pregnancy to full term. In 2003, the justices found it indefensible that Texas should statutorily penalize consensual sodomy, given that (in the words of Justice Anthony Kennedy) an emerging awareness of liberty gave substantial protection “to adult persons in deciding how to conduct their private lives in matters pertaining to sex.” The justices sang sweetly of “an autonomy of self that includes freedom of thought, belief, expression, and certain intimate conduct.”
In business, said Peter Drucker, “knowledge workers” were similarly in control of their own destinies, could they but see it. Free speech came to embrace, with societal and judicial permission, attitudes like the burning of American flags and the sprinkling of movie and television scripts with words once rendered in the funny papers, delicately if suggestively, as “&!#@$#@.”
In time, partly because they seemed to have little choice and partly because the fever of the times carried them away, the teachers loosened up. Perhaps, they started to reflect—just perhaps—the old ways really had grown offensively, uselessly old. Perhaps, as "the kids" were trying to instruct us, it was time for reassessment, reevaluation, growth.
In the meantime, could someone shut the window, please? With all that racket from the street, how could anyone think? The noise, of course, was that of metaphorical bricks being hurled metaphorically from below, smashing metaphorical window-panes, scattering metaphorical books, papers, and chalk.