In the fall of 2015, another seemingly silly, culture-war controversy flared up at Yale after Lecturer Erika Christakis—assistant Master for one of the school’s residential colleges and wife of Nicholas Christakis, the renowned sociologist—responded critically to an administrative message sent to all students advising them to avoid “culturally unaware or insensitive choices” when selecting their Halloween costumes. A specialist in early education, who was then teaching a course on “the problem child,” Mrs. Christakis couldn’t help but wonder, addressing by email the students themselves: “Have we lost faith in young people’s capacity—in your capacity—to exercise self-censure, through social norming, and also in your capacity to ignore or reject things that trouble you? … What does this debate about Halloween costumes say about our view of young adults, of their strength and judgment? Whose business is it to control the forms of costumes of young people? It’s not mine, I know that.”
A very vocal contingent of Yale students immediately thought otherwise, however: it was not just her business as assistant Master to enforce social norms and protect the residents of their college from insensitive costumes; it was her moral duty. Apparently the Christakises were not charged with protecting an “intellectual space” where free speech and debate could flourish; rather, as faculty-in-residence, they were obliged to create a “safe space” where students felt “at home.” The principle of a safe space, however, was not to be applied reciprocally. In response to that one email, the couple’s own home was picketed, its windows defaced; they were cursed in person and then, as this internal spat quickly went viral, attacked online; and after receiving only minimal support from the administration and their faculty colleagues, Mrs. Christakis ceased teaching the following semester and her husband resigned as Master of the college soon thereafter.
This episode quickly proved to be less exceptional than representative. At Columbia, Northwestern, UCLA, and other highly ranked schools, students were demanding to be protected from the “microaggressions” of socially inappropriate imagery and language, and were expecting their professors to provide “trigger warnings” on their syllabi that would exempt them from engaging assigned texts whose passages, plots, or historical events they might find offensive. Such an outburst of censorious sensitivity on campus must have seemed all too familiar to David Bromwich, himself a Yale professor and the author in 1992 of Politics by Other Means—an erudite polemic that included one of the earliest and most thorough critiques of “political correctness” on campus, as viewed from a liberal rather than a right-wing perspective.
Writing in the aftermath of both the Reagan administration and the arrival of cultural studies in literature departments like his own, Bromwich was objecting then to their opposing attempts to revise the nation’s curriculum and so, too, its self-understanding. He was resisting a new formulation on the academic left that narrowly stressed the racism, sexism, and colonial exploitation of the West, even as he scorned the platitudinous idealization of that same tradition, which was then being voiced by Reagan appointee William Bennett and others. On the one side a purportedly radical “culture of suspicion”; on the other, a supposedly conservative “culture of assent.” And yet, despite their starkly conflicting agendas, these dogmatic movements shared, in Bromwich’s view, a deeper fatal flaw: a “fundamentalist” intolerance rooted in the erroneous assumption that education should be “communitarian” in its goals.
Bromwich wanted to defend the value of the Western tradition, literary and otherwise, against the new “culture of suspicion,” but only in a fashion that stringently eschewed the mindless, moribund ancestor worship of Bennett and his crew. In attempting to chart a middle way purged of what he perceived to be their common error, Bromwich then asserted that real learning could never be a collective enterprise. “Group thinking is not thinking,” he insisted flatly. “Conceits and dogmas of conformity, some of them cruel, all of them timid, cling to the group when they cannot survive in a less settled company.” “The aims of education,” he added later, “are deeply at odds with the aims of any coherent and socializing culture [italics mine]. The former are critical and ironic, the latter purposeful and supervisory.” “The voice to distrust,” therefore, was “the ‘we’-voice of collective judgment.” And although he was writing to “defend tradition as a social and personal fact,” that tradition was “personal first and last and most,” a strong claim that he then applied to the subject that mattered to him most as a literary scholar: “Every community of art is a community of one that wants to be two (not more).”
We might pause here to wonder what the oxymoronic phrase “community of one” could possibly mean, or ask what sort of society would ever support a form of education that was always “deeply at odds” with its own “coherent and socializing” beliefs. But first let’s acknowledge that some forms of “collective judgment” are not only timid, but cruel and unjust, ranging in severity from mob violence to the sort of public shaming suffered by the Christakises in person and online, and, too, commend Bromwich for resisting the inflexible dogmas that were arising at that time. As someone who published an essay in 1991 that also took on both sides of our culture wars, striving to chart a middle way, I can imagine all too easily the accusatory responses he likely received just one year later.
New political and educational dogmas, however, tend to be reactionary in the literal sense; they are belated responses to concrete changes in cultural and psychological conditions—to distressing shifts in everyday patterns of perception, communication, or behavior that have upset the old norms. In the Yale students’ fury that their “masters” were favoring free expression over domestic protection, the soapbox of the civic square over the safe sofa of the happy home, one can sense, for example—beyond a reanimation of the “culture of suspicion”—both the compensatory longings and default expectations of a generation bedeviled by the opposite extremes of parental desertion via divorce or careerism and the hypervigilance of helicopter parenting.
A girl tethered from preschool (where she is leashed to classmates whenever taking a walk) all the way through to her freshman dorm (where she remains one text away at all times of day from a consoling dad or mom) and a boy larded with trophies for “participating” in this or that sport while being dosed daily with promiscuous praise of the self-esteem sort: both may be shocked to find a wider world indifferent alike to their special needs and inherent charms. Unprepared to master the challenges of late adolescence, with its inevitable social slights and emotional wounds, they then couch their personal insecurities in the more righteous terms of an identity politics rooted in the civil rights movement and in a psychiatric therapy first developed to ease the serious suffering of PTSD. Seeking a “safe space” both emotionally and intellectually, they demand that administrators protect them from their peers’ insensitivity and that their professors exempt them from any ideas, plots, or images that, however true to history or the human condition, might “trigger” an eruption of their own worst fears.
(In such demands, along with the renewed expectation that campus administrators should police sexual behavior, one can hear, too, an ironic backlash against the freedoms won by students in the sixties, when the scorned doctrine in loco parentis was largely overthrown. To revise the young Augustine’s hypocritical prayer for today’s campus scene: “Lord, make me free, but spare me the consequences.”)
And lest this analysis begin to seem like yet another op-ed penned by the editorial pit bulls on Rupert Murdoch’s payroll, the same critique applies to the cultural right, and with much graver consequences. The politics of Reagan and pedagogy of Bennett were also based on the mindless blandishments of a self-esteem detrimental to real liberty, though aimed to flatter, of course, a different demographic: the so-called true-blue (that is to say, white) middle-class American. And now that the jelly-bean sweets and rom-com geniality of Sunny Jim have been replaced by the toxic tweets and revenge fantasies of the Donald, the pompous panegyrics of Bennett with the racist bile of Trump’s Steve Bannon, the dangers of an “identity politics” writ large have become all too clear. Under the id-driven policies of the alt-right, the liberties lost in the next few years may prove far more consequential than the right to choose our own Halloween costumes.
The fearful retreat behind the physical and psychological walls of tribal belonging; the vicious scapegoating of political opponents in racial, ethnic, or sectarian terms; the willing submission to either an autocratic demagogue or the viral “likes” and loathings of the hive mind; the corruption of the Internet by the click-bait lies of fake news, the anonymous slander of cowardly trolls, and the omnipresent ads of a digitized mammonism; the citizenry’s flight, on both the left and the right, into “safe spaces,” whether in campus dorms, inside the “filter bubbles” of partisan news, or through the promiscuous prescription of mood-enhancing drugs: with these signs of the times, we’re witnessing, I fear, the global demise of the very premises of liberal modernity. And we’re doing so just a quarter-century after its self-proclaimed final victory. The “end of history” may be arriving, all right—not in the form of liberal democracy’s utopian completion but through the internal dismantling of its key institutions and founding beliefs, including its theory of how history proceeds.
This reactionary turn should have been predictable. But insomuch as every ruling common sense has its biases and the modern mind was weaned on linear thinking, its would-be prophets have been largely blind to the cyclical aspects of human history. If our technologies keep on getting better and better, then shouldn’t our economy, our society, our own sovereign selves? … That the answer is no, not necessarily; that becoming better as a social goal is far more subtle and elusive than boosting the speed of a CPU or writing ever more intricate code; that rapid technological progress can, in fact, be socially dangerous; indeed, that the primary engines driving the demise we see all around us may be our own digital devices: these counter observations will seem outrageous to a nation that worships the wizards of Silicon Valley, even while raging against some of the changes that their clever inventions have recklessly set loose.
As Bromwich would appreciate, one of the best ways to gauge these disturbing changes is to reread the literature of the Western tradition, searching for historical trends and analogous circumstances. Having done just that, I now believe that the period most relevant to our own is the early seventeenth century, another volatile era in which rapid technological progress in the West—especially the many unanticipated effects of the printing press—resulted in widespread social and political regress, including decades of civil and sectarian war. It was also, and not accidentally, the era in which the earliest incarnations of modern consciousness and culture were taking form.
If, to reframe Bromwich’s phrase, the importance of tradition in the premodern period was also a “social and personal fact,” it was clearly in those days social “first and last and most.” The very idea of the personal—of the individual as separable from his or her community—was beyond the ken of a feudal society whose economic, political, and religious institutions were deeply configured by communitarian beliefs. By the early seventeenth century, however, a conception of the personal had begun to emerge, and was expressed then in its most optimistic terms by the poet Edward Dyer.
My mind to me a kingdom is;
Such perfect joy therein I find
That it excels all other bliss
Which God or nature hath assign’d.
This celebration of the joyous inner kingdom of a new individualism was then more than matched by Descartes’ philosophical assertions in Discourse on Method. There, he claimed that a single thinker—by segregating himself from both the sway of his peers and the presumptions of tradition, and then strictly following Descartes’ new method—could solve all of philosophy’s problems on his own. Although, in this radical revision of Plato’s project, the new philosopher had to begin from scratch, rejecting every form of group thinking, current or past, the rewards in the end would be unsurpassed. Dyer’s proclamation of unexcelled bliss was now being joined by the buoyant promise of a near omniscience—and each achieved alone, each “personal first and last and most.” (Even in modernity’s infancy, its dangerous propensity for utopian overreaching was evident.)
But where the proto-modern poet and philosopher saw emotional and intellectual self-sufficiency in the offing, others perceived a threat to the entire social order. Rather than personal bliss and wisdom, John Donne glimpsed the agonies of anarchy and the maw of sheer confusion. By atomizing the moral as well as the material world, the new philosophy of the personal, pursued unchecked, would run amok, public harmony torn asunder by private ambitions. “‘Tis all in pieces,” Donne lamented in 1611, “all coherence gone”:
All just supply, and all relation;
Prince, subject, father, son, are things forgot,
For every man alone thinks he hath got
To be a phoenix, and that then can be
None of that kind, of which he is, but he.
Shakespeare, who was Dyer and Donne’s contemporary, dramatized then nearly all the possibilities, desirable and dire, for this reconfiguration of the Western person: that is, the sovereign self or “phoenix” who would revel in his own uniqueness and completeness. In his histories and tragedies, the playwright’s extraordinary soliloquies—which, in effect, “outed” these new inner kingdoms for public scrutiny—revealed not just the acute intelligence, introspective eloquence, and unbridled ambition of the proto-modern self, but also its scary potential for antisocial behavior. Freed from conforming to the day’s group thinking, Shakespeare’s Machiavellian villains (at once appalling in their goals and appealing in their eloquence) are also liberated from any fellow-feeling for the group’s wellbeing, and so, too, from any moral limits on their aspirations. Each is not just the smartest and boldest but also the coldest person in the room, and the narrative result of such a personality—at once over-armed with ambition, bolstered by reason, and bereft of empathy—is political chaos, as Donne’s worst fears are fully enacted on Shakespeare’s tragic stage.
The likely origins of this new and inward self, one that values “the personal first and last and most,” can be found in the rapid spread of post-Gutenberg literacy and, more specifically, in the peculiar nature of the silent reading experience. With silent reading, the translation of visual signs into virtual realities occurs “in here,” within the cloister of the single reader’s mind, and with the time, place, and pacing of that conversion under his or her complete control. This naturally evokes a heightened sense of autonomous achievement—of sovereignty, if you will—as well as a collateral feeling of segregation from both social support and public censure, with the potential to induce over time emotional alienation as well as intellectual independence. As silent reading becomes the primary means for learning during this period, the intense concentration required to read normalizes the habit of mentally withdrawing from the social scene. Knowledge as such becomes associated with an introspective solitude, advanced thinking recast as a natively silent, private, and even secret act. The combination of the author’s physical absence and the everyday practice of reenacting speech within the theater of the silent reader’s own head then helps to cultivate a responsive and self-reflective inner voice—in Bromwich’s words, “a community of one that wants to be two (not more).” That is precisely the voice that we overhear in the Shakespearean soliloquies and also the one that begins to appear in the spiritual diaries of the Pilgrims and Puritans during this era.
It is difficult to imagine a sensibility more alien to Shakespeare’s than the one evident on the parched pages of the Puritans’ plain style. Yet despite their rhetorical differences, both the Elizabethan playwright and the Protestant seeker were responding to a crisis crucial to their times. Each was reacting to the threat of social regress that technological progress, in the form of the printing press, then posed to English life: literacy’s unanticipated creation of a rationally empowered but emotionally alienated and potentially antisocial individualism. Donne’s phoenix, who has forgotten that the bell of someone else’s mourning also tolls for him; Shakespeare’s Machiavellian man, whose only allegiance is to his own ambitions; the Puritan divine’s unredeemed sinner who, estranged alike from God and man, is both damned and dangerous: these figures weren’t the fictive phantoms of a few overly sensitive poets and religious fanatics, but the crude pioneers and internal agents of a new and initially brutal age.
Insomuch as print literacy both radically increased the quantity of available knowledge and democratized its accessibility, it was far too powerful an advance to be easily managed, much less rejected out of hand. But its initial dissemination in a still largely feudal Europe did prove to be exceptionally disruptive and frequently deadly. To be adopted safely, widespread literacy required a radical reimagining of the West’s social order, and the political reforms that were necessary to civilize its powers had to be matched by psychological ones. Before the silent reader’s would-be sovereign self could be licensed economically and politically, eventually leading to the free-market democracies, it had to be emotionally and ethically domesticated. This new inner kingdom, hailed by Dyer and Descartes but feared by Donne and Shakespeare, needed to be infused with, yes, communitarian beliefs. To make the atomized self fit for society, the old external pressures to conform to moral norms had to be internalized, the primacy of public shaming slowly giving way to the pangs of private guilt. The apparent non sequitur “community of one” could in fact make sense, if the new individualist were haunted daily by the ghostly voice of social values.
This psychological process, the haunting of the literate self’s solitary cloister by the communitarian voice of conscience—in Bromwich’s terms, this invasion of the personal by the ‘we’-voice of collective judgment—can be traced in the early Puritan diaries, and achieves its most vivid and complete expression later in John Bunyan’s spiritual autobiography, Grace Abounding to the Chief of Sinners. The new Protestant self that we see emerging on those pages, with its fiercely introspective, hyper-vigilant moral sense, is the religious predecessor to America’s democratic citizen and, as such, an unsuspecting pioneer for the liberal modern order soon to come.
A similar conversion, but on more secular terms, is dramatized in Hamlet, arguably Shakespeare’s most influential play. Like Edmund, say, or Richard III—two of the playwright’s evil schemers—Hamlet possesses a formidable introspective intelligence, and like them, too, he feels estranged from the current social order. But in this drama the moral poles have been reversed: the state itself has become “rotten” through regicide and fratricide, and proto-modernity’s new secret agent musters many of the same powers favored by the Machiavellian villain (including a mastery of duplicity to conceal his intentions) to restore “coherence” and the “just supply” of true “Relation.” For Hamlet, “Prince, subject, father, son are not things forgot,” and in dramatizing his conversion to moral action, Shakespeare supplied an early example of the ways in which the acute self-consciousness and analytical powers of the literate self might be bent to public service.
In ways that would have eluded Descartes, however, that bending toward justice would be driven as much by empathy as by some rationally achieved moral accuracy. As is powerfully established in Hamlet’s first soliloquy, which occurs prior to learning that his father has been murdered, the prince’s primary motive for the actions he will take is an abiding love for the dad he has lost, including a righteous anger at his mother for failing to mourn him as a loving wife should. And this intricate depiction of a new moral figure—the intensely private and estranged individualist who, nevertheless, through retaining strong emotional ties to others, makes a sacrificial commitment to restore public order—becomes a prototype for many of the later mythic heroes of liberal modernity, including the lonesome cowboy of American lore.
By way of summary, then, liberal modernity’s invention of, and deep investment in, the personal emerged during a specific period of Western history, in response to a crisis being driven by technological change. The many aftereffects of the printing press led to wholesale social chaos, as the powers unleashed by literacy demolished the checks and balances that had kept the feudal order stable and its thinking coherent. It took years of intellectual confusion and sectarian strife, including the violent loss of many thousands of lives, before the West could fashion a new system of “coherence,” of “just supply” and true “Relation,” in the form of a Protestant spirituality, a free market economy, and a democratic polity, each structured in ways that licensed the personal but with complementary restraints to protect the social. These cultural innovations, however, didn’t comprise some final solution to the perennial problem of human governance—they certainly didn’t signal the “end of history.” They were instead adept, if imperfect, adaptations to a set of cultural conditions that were themselves bound to change, and indeed now have. But let’s turn to their inherent imperfections first.
What Chesterton observed about the nature of aesthetics—“Art is limitation; the essence of every picture is the frame”—applies to philosophy as well, and to all ruling worldviews, scientific or mythic, old or new. To know is to exclude: each culture’s mental map tends to heighten certain truths and prefer certain measures to the diminishment of others, which its members then ignore like so many bats in a well-lit room. As Donne and Shakespeare warned, our emphasis on the innate bliss and wisdom of the inner kingdom has led at times to sociopathic behavior, as is evident today in the extraordinary size of our prison population. And the scorned outlaw who has defied the social compact is not so different from the acclaimed CEO, whose narrow moral compass has failed to account for his corporation’s toxic impact on both the social “ties that bind” and the natural world. More generally, our default endorsement of the personal has led to schemes of radical inequality, over-rewarding individuals with pay or praise for achievements that have been more communal in nature than we want to acknowledge.
This same bias is evident in Bromwich’s claim that “every community of art is a community of one that wants to be two (not more).” As was argued before, he is relying here, I believe, on the peculiar nature of the silent reading experience—but not all the arts are created and engaged after the model of the modern novel. Shakespeare’s dramas were not written to be read in solitude but performed in public, and before a highly diverse and interactive audience. Today, aesthetic collaboration (that is, group thinking on a higher plane) is rife in music, cinema, and serious television; the resulting works aren’t the soliloquies of some lonely genius but the communal conversations of numerous professionals, striving to compose a harmonious chorus.
As Americans, we have inherited an ethos of moralized individualism, which values the “personal first and last and most,” even as our everyday practices and intellectual discoveries have been undermining its logic. Today’s economy, for example, is no longer dominated by independent farmers, artisans, and shopkeepers, but by large corporations and wage employees. Likewise, the atomistic reductionism that prevailed in the early modern sciences has given way to more interactive approaches and understandings. “Secondary physics,” as Arthur Eddington wrote, is now “the study of ‘and’—that is, of organization.” In biology, the initial focus on the inner kingdom of the living part (anatomy) has given way to an emphasis on the interactive field of the whole (ecology). Even single-celled animals have now been found to be bilingual, “conversing” biochemically with their own species and others, initiating cooperative interaction for their mutual benefit.
Which brings me to sociology. I chose to open this essay with the controversy at Yale not merely due to the coincidence that Bromwich teaches there but because the recent work of Nicholas Christakis has also challenged the sovereignty of the personal in human affairs. Coauthored with James Fowler, his 2009 sociological study Connected insists that “our [social] connections affect every aspect of our daily lives. How we feel, what we know, whom we marry, whether we fall ill, how much money we make, and whether we vote at all depend on the ties that bind us. Social networks … are always there, exerting both dramatic and subtle influence over our choices, actions, thoughts, feelings, even desires.” Such a claim, rooted in gigabytes of epidemiological data, rejects any formulation of the fully sovereign self, whether Descartes’ solitary philosopher, Bromwich’s artistic “community of one,” or Milton Friedman’s self-sufficient “economic man.” Contrary to Satan’s desperate boast in Paradise Lost—now echoed daily by the occupant of the White House—the single mind can never be “its own place.” It can’t create a perfectly insulated “safe space” for itself or others, much less “make a Heaven of Hell.” Though, as Milton’s plot shows, and as our own immediate political future may prove: once granted sufficient authority, such a solipsistic mind can make our lives here all too hellish.
The timing of these relatively recent (re)discoveries of the power and prevalence of the social over the personal isn’t accidental. They have been occurring during yet another phase of rapid technological change, as our digital devices have been exposing, even as they have been enhancing, the collaborative nature of human behavior. That exposure, though, has also been razing the walls of privacy and dignity that once protected and empowered the Protestant self. As a consequence, that fiercely independent but morally sensitive individualist, the civilized self finally fashioned out of the silent reading experience, is now being rapidly replaced by a new and still insecure public persona: one that, posted online, can be found on Facebook, YouTube, Twitter, and their like. Far from self-sufficient, this new and needy self seeks the immediate endorsement of the digital collective, his self-worth measured in “likes” and retweets, and if ignored, as is often the case, he may then vent his rage from behind a Halloween mask of digital anonymity.
These are perilous times—not just for higher education, as David Bromwich aptly foresaw, but for liberal modernity’s entire regime. As in the early seventeenth century, cultural wisdom has yet to catch up with the ruthless disruptions of technological progress. But while it was the sociopathic individualist who threatened the social order then, today it is the immoral collective: the literal and figurative “dark web” of digital thieves, sadists, pedophiles, and racists; the viral contagion of conspiracy theories; the virtual lynching of public shaming; an identity politics (left and right, black and white) on the crudest of terms. Contrary to Bromwich’s critique, however, the primary problem is not group thinking per se—which, given the power of digital communications, is going to dominate the post-modern age—but the current quality of the group thinking that too often prevails: a hive mind that has yet to be civilized and moralized, as the inner kingdom of the Machiavellian man once was.
Let me end with a confession. As someone who was raised by religious Methodists, and who still spends most of his day reading and writing silently without the aid of a smartphone, I am a vestigial representative of that Protestant self whose demise I’ve just pronounced. When called to perform an autopsy on one’s own way of life, the first likely response is indignation: witness the reactionary rage of today’s electorate. But having sipped from that cup a few too many times, I can assert with some authority that even one drop past its initial stirring draft can turn the whole cup quickly toxic. At a certain age—a stage of maturity hard to achieve in a society delusional with Silicon Valley’s utopian dreams—one begins to grasp that the final challenge each of us faces is not how to live forever but how to die well. And because we live doubly, as nested in our meanings as we are rooted in our flesh, that difficult goal is doubled as well when our threescore and ten happens to coincide with an epochal transition between cultural identities.
“All things fall”—all, without exception—mindscapes along with landscapes, the ethos with the temple. But as Yeats once insisted, they can be “built again.” Originally conceived in the cradle of modernity, the American project now needs to be reimagined. The crude forces unleashed by its own inventions will have to be civilized, not just monetized, for a post-modern era whose new ethos, mythos, and polis have yet to be established. Given the technological tenor of the times, much of this reconstructive work will be done online, through networks and listservs in a collaborative exchange (on a higher plane) by self-organizing groups committed to addressing the quality of group thinking itself, with the added task of imagining how that thinking might preserve the best of the personal in an era when the social is bound to prevail.
And in that exchange, the old Protestant self will have something meaningful to say as it leaves center stage to enter the dark basement of the collective unconscious. Deathbed words, after all, do bear a certain gravity. And each successful cultural identity contains a core of lasting truths—including the inescapability of mortality itself—whose softly tolling bell the best of all possible post-modern Americas will have to heed as well.