
Essay One

INTERPRETIVE POLITICS: READING SYSTEMIC OPPRESSIONS WITH EVE SEDGWICK, STEPHEN BEST AND SHARON MARCUS

We can say of the eighties what Orwell could say of the forties: “In our age there is no such thing as ‘keeping out of politics.’”

W. J. T. Mitchell, “The Politics of Interpretation” (1982)

The current era of political polarization and culture war in the United States is often measured against a fantasy, an imaginary era of nonpartisan harmony in the wake of a war that established “The American Century.”1 And yet this depoliticized image signifies the same age Orwell describes as inescapably political, in an essay written to span the ocean between allies.2 Like the forties, the eighties is now often mythologized as a time of American triumph, when good struggled against an Evil Empire, and freedom overcame tyranny. This mythological narrative of holy war is presaged by Ronald Reagan’s depoliticized image of the 1964 election in “A Time for Choosing”: “There is no left or right [...] only an up or down—up to man’s age-old dream, the ultimate in individual freedom consistent with law and order, or down to the ant heap of totalitarianism.”3 Reagan is apparently undeterred by the manifest contradiction between his disavowal of partisanship and the ostentatiously partisan occasion of his speech, televised in support of Republican presidential candidate Barry Goldwater. That disavowal itself also depends on Reagan’s image of the Democratic Party as drifting toward a Stalinist authoritarian version of socialism. The bad faith of his trope is undeniable, especially when one recognizes that Reagan ostensibly refuses partisanship only to immediately define his own party as agents of “man’s age-old dream” of freedom, and dehumanize the other party as a mindless colony of insects bent on dystopian oppression. We will see homologous rhetorical gestures repeated throughout this book, as various characters deny or conflate the differences of left and right, claiming for themselves universality and agency, while objectifying others as mere negations of universal value, truth and right.

Reagan’s frame and premises—defining the struggle between left and right in the United States in terms of individualism, freedom or liberty, the same terms used to define the reasons for US opposition to the USSR—have essentially been accepted as the default explanation for the national turn away from the progressivism of Roosevelt’s New Deal. Within this frame, US democracy is defined as consistent or even coterminous with private property, both subsumed by the signifier “liberty,” and this is placed in opposition to Stalinist authoritarianism, the signifier under which the socialism of the USSR is identified with any system of communal ownership or wealth redistribution. Never mind that none of these identifications are rigorously defensible, let alone self-evident, as they ignore contradictions internal to each society. In both the US and the USSR, a professed adherence to principles of equal distribution of power is belied by traditions of terrorist governance that have maintained disproportionate power for elites. While the United States touted its principles of political equality, its unequal distribution of wealth and privilege guaranteed inequities in political and legal representation and enforcement. And while the USSR boasted of its principles of economic equality, its inequitable distribution of political and legal enforcement and representation guaranteed inequalities of wealth and privilege. In both cases, mutually reinforcing inequalities and inequities ensured that, for most citizens, their nations were neither a pure heaven of freedom and opportunity, nor a pure hell of oppression.

Reagan calls analytic attention to one such contradiction internal to US democracy in this period by means of the caveat he places on freedom, specifying that it should be “consistent with law and order.” In a speech given in support of Barry Goldwater, famously an opponent of the Civil Rights Act, this caveat carries significant semantic weight—just as it would for Richard Nixon in elections to come. The record of the Nixon and Reagan administrations demonstrates that freedom “consistent with law and order” meant freedom inconsistently distributed and defended along lines of race, sex, class and religion. This record marks Reagan as a counterrevolutionary figure in the trajectory of US democracy, in which the rejection of aristocratic rule has progressed incompletely and unevenly toward broader inclusion. And yet his inconsistent support for individual freedoms places him in the mainstream of US history, as the rhetoric of liberty has always been partly inconsistent with structural inequities in production, law and policy.

This inequitable recognition of citizens’ equal right to liberty is the condition for a function of Reagan’s rhetoric of depoliticization that is not so much persuasive as it is permissive, and which remained durably effective even in his post-presidency and after his death. This permissive gesture is also occasioned by the unpopularity of Goldwater, whose extreme views would win over only the five states of the Deep South and his home state of Arizona, earning him one of the smallest shares of the popular vote ever received by a major-party candidate for US President. By positioning himself as the voice of the unassailable center and standard, and by presenting his views as an expression of universal values, not partisan agendas, Reagan invites supporters of the far-right candidate he endorses to shelter under the strength and confidence of his rhetorical persona. It is a protective and permissive rhetorical posture, a claim to universality that guards against the disapproval of others, in which his audience is invited to share. That rhetorical posture would persist throughout his presidency, helping to normalize the polarizing policies that would constitute his administration’s putative ascension toward freedom—which included support for dictators,4 paramilitary death squads5 and apartheid regimes at home and abroad,6 as well as agitation against women’s reproductive rights,7 and the neglect of tens of thousands of queer citizens dying of an unchecked epidemic.8

In one sense, this relational pattern of assimilation or destruction of every difference, this aggressivity toward outgroups, is a function of what Slavoj Žižek calls “the totalitarian Master,” whose calls for discipline and renunciation provide cover for an invitation to transgress “ordinary moral prohibitions.”9 One follows such a figure so that one may unleash one’s own aggression on those who are designated as enemies, outsiders or subalterns, surrendering the rights and responsibilities of self-determination to the Master in exchange for permission to violate the rights of those “beneath” or outside the hierarchical order. While Žižek opposes this totalitarian Master to Theodor Adorno’s “authoritarian personality,” these explanations are not precisely at odds, as both account for the consistency of excessive deference toward those above one in a social hierarchy with demeaning violence toward those below.10 This relational structure behaves in accord with Jacques Lacan’s definition of madness, which is best exemplified not by a commoner who believes himself to be a reigning king, but by a reigning king who believes himself to truly be a king. In other words, the belief that one’s position truly expresses a substantial distinction is a signal error of madness. This faith in necessary referentiality is also a signal condition of unexamined privilege.

While Adorno’s model interprets this relational structure of deference and aggressivity in terms of a “personality type,” Žižek treats it as a social fantasy, a shared structure of enjoyment into which one is initiated by the rhetoric of a central figure, around which a group organizes the rules and values of its relations. This helps to explain why followers of authoritarian figures or personality cults sometimes seem indifferent to the harms they suffer as a result of their leader’s policies or actions. Harms caused by Reagan’s neoliberal policies were not confined to minority communities, but also hit the white working-class communities who were a key part of Reagan’s successful electoral strategies. Those communities were deeply impacted by union busting, shifts in the tax burden toward those with less wealth and income, the weakening of regulatory protections, and the encouragement of domestic deindustrialization and offshoring. These policies contributed to a period of wage stagnation that began in 1980 and still continues, even as wealth and productivity have increased exponentially.11 But if Žižek’s Master offers a compensatory exchange of social benefits for psychic benefits, this is not proffered or accepted as a conscious bargain. The mechanisms by which such contracts are foreclosed from attention, disavowed or denied—not to mention the generically fascist character of this arrangement—have only become topics of greater interest during the Trump administration. These mechanisms are important to the study of interpretation, and especially to theoretical reflection on the social construction or production of meaning.

Such foreclosures and disavowals are part of the social technics of knowledge production, information distribution and meaning making. If the negative consequences of Reagan’s policies and priorities cited above are not what first come to mind when one encounters his name, this is explained in part by the partisan politicization of education, media and interpretation. Already in 1982, Mitchell was defining the New Right in terms of this partisan attack: “The emergence of Reaganism has brought the pressure of economic and political reality directly to bear on the practice of criticism and scholarship. The intellectual and academic community, that part of society which lives by and on interpretation, finds itself threatened with loss of power, jobs, and prestige.”12 In literary studies, these losses have persisted in every decade since Mitchell’s caution, despite repeated promises of a boom just around the corner. There is no way to predict whether this trend will dramatically alter its trajectory, or simply continue until graduate education in literary study collapses entirely. But it is clear that the efforts Mitchell describes to undermine the prestige and security of workers in the academic humanities comport with other efforts to politicize interpretation: Alongside the delegitimization of nonpartisan journalism and the invention of a “fair and balanced” news network by Republican political strategist Roger Ailes, the threat to literary studies Mitchell cites is implicated in a decades-long pattern of practice, with obvious strategic value for anyone who might wish to manipulate definitions of the mainstream in US politics and culture. The damage done to Nixon’s agendas by student activism and investigative reporting has apparently not gone unanswered, as his party has sought to maintain its title to values and principles of democracy that are often contradicted by its deeds. By redefining his opponents as evil antagonists in an eternal struggle between ascension and decline, Reagan set the tone for the next 50 years of partisan struggle.

Of course, such observations about patterns of behavior, systemic consistencies and strategic incentives can be mistaken for conspiracy theory, as we will discuss in connection with Eve Sedgwick’s reflections on the AIDS epidemic in the United States. This ambiguity between conspiracy and what she calls “systemic oppressions” has often been used to the right’s advantage: Whether it be Zionist protocols, global communist plots, jihadist terror, or immigrant invasions, the right’s grand narratives inevitably define the outgroup against which aggression is permitted as an inexplicably powerful cabal drawn from the marginalized or disempowered side of an asymmetrical struggle. Like the self-aggrandizing, self-universalizing fantasy of a “silent majority” living in a “city on a hill,” melodramas of victimization in which powerful and wealthy white men are beset by demonized and demeaned minority subalterns completely ignore systemic inequities. So while this dimension of narrative, poetics and interpretation is incomparably important to the reproduction of power, it is an often underestimated and overlooked aspect of US politics and history. The period of US history in which minority rule, information bubbles and fake news have blossomed as primary drivers of public policy has been the same period in which the academic centers of expertise in narrative, poetics and interpretation have been systematically defunded and disempowered. This need not be misread as a conspiracy in order to be acknowledged as a systemic consistency that accords with the right’s strategic incentives.

As Mitchell observes of the eighties and Orwell of the forties, we still live in an age in which there is no escaping politics—and in politics, there is no escaping interpretation. Just now, in an interview with the New York Times Book Review, a Yale professor in his seventies is defending the “aristocratic spirit” of the university against the “egalitarian and democratic values” of US political culture.13 While he assures us he does not object to these values in our political life, Anthony Kronman speaks of the dangers of importing such values into academic life. As if the university were a walled sovereignty upon which students were imposing their foreign democratic culture, or else an apolitical realm in which questions of power were suspended, he accuses students of engaging in the “politicization of academic life.”14 Seemingly unaware that his aristocratic values are as political as democratic ones, Kronman offers an argument that illustrates the self-aggrandizing fantasies of victimization that are so often voiced by the most privileged wealthy white men, fantasies we will see recur throughout this book. Kronman calls the attempt to democratize higher education an “assault on American excellence,” and by defining his own values as universal, he remains etymologically true to the aristos of his preferred spirit. He warns against an invasion, already underway, of the demos into Yale’s rarefied halls, bringing with them “Orwellian” attempts at “purification” and restriction of speech on campus.

It is easy to forget that the term “aristocracy”—even in the analogical sense of “spiritual aristocracy” that Kronman invokes here—is already partisan in its valorizing redefinition of oligarchy, or minority rule, as rule by the best. The distinction “best” presumes a universalized standard of value, a central referent which would guarantee that the excellence Kronman celebrates is not merely relative or contingent, but is universal and necessary, defined by those “distinguished not in this or that particular endeavor [...] but in the all-inclusive work of being human.” His use of a vocabulary of inclusion to define an exclusionary ethos is an echo of William Bennett and Walter Jackson Bate, whose rhetoric of crisis in the humanities was provoked by challenges to the aristocratic, white, heteronormative, masculinist canon that for so long centered humanistic study. Kronman’s reaction to this destabilization of an exclusionary standard of value is indicated by the anecdote that begins his book, concerning the head of one residential college at Yale who decided to abandon the traditional title of “master,” because “he understood why black students in particular might be sensitive to the use of the term.”15

Immediately after recounting this facially reasonable decision, Kronman mocks it as a clear demonstration of the inferior intelligence of both the would-be “master” and the black students about whom he articulated concern:

I found it hard to believe he was serious. In an academic setting, the word “master” carries none of the connotations the complaining students found offensive. Instead of mindlessly deferring to their feelings, the master of Pierson should have told them what is obvious—that in this setting the word has an altogether different meaning.16

This is the implicit interpretive theory of privilege in action. Connotations, in Kronman’s view, should apparently be erased and reset at every institutional threshold, to be defined solely by those with the preponderance of institutional power. In that view, meanings are not carried over from a more familiar to a less familiar usage or rhetorical situation. Instead, words mean what those in power say they mean. Kronman treats as “obvious” the presumption that Yale tradition decides what words mean for all those who tread on its campus, and insists no other connotations are valid, no matter how the language or its population of users might change over time. The decision to alter one’s vocabulary in response to altered conditions or audiences is here depicted as ridiculous, unserious or “mindless,” as laughably irrelevant as a student’s “feelings.” But any thinking that disavows feelings thereby ignores key aspects of its own occasion, frame and motor, just as much as any inquiry that disavows its historical and social implication.

Kronman’s characterization of such incidents as posing “Orwellian” dangers to campus life at Yale illustrates that, just as Reagan and his eighties are too often reduced to stock characters of popular myth, so too are Orwell and his forties. It should not be surprising that even the briefest review of Orwell’s supposed warnings against politicized speech, in “Politics and the English Language,” reveals an argument that contradicts Kronman’s reference. But it would be a dire mistake to avoid discussing such simplistic instrumentalizations of historical and literary figures and events: first, because they are so often carried out by the privileged and powerful, like this former Dean of Yale Law School; and second, because silence on the overly familiarized tropes of mythologization or hagiography actually helps to reproduce them, as their alternative and antidote is not demonization or dismissal, but careful attention. To become familiar, after all, is to recede from attention, and figures like Reagan or Orwell are misremembered because they are referenced but not recalled. The most certain way to humanize and contextualize mythological figures is to attend closely to their words and deeds, drawing conclusions from the patterns discerned there, rather than from reputation, expectations or received wisdom.

Writing just a few months after World War II ended, Orwell reminds us of the tenor of the time when he epitomizes its political speech in some “familiar phrases”: “bestial atrocities, iron heel, bloodstained tyranny, free peoples of the world.”17 Orwell mocks these as hackneyed figures, robotically repeated, in conformity with the general rule that “orthodoxy […] seems to demand a lifeless, imitative style.” But his point is not primarily stylistic. Nor is he proposing an aesthetic program—even though, like Williams’s insistence on “no ideas but in things,” Orwell’s advice emphasizes the importance in writing of “calling up mental pictures.” His primary concern is the political force of such images, which the political speech he cites is designed to neutralize. The familiar phrases are designed to repulse their readers’ attention, by means of “euphemism, question-begging and sheer cloudy vagueness.” By drawing attention to them, Orwell’s “defense of the English language” presents an alternative to the “defense of the indefensible” accomplished through obfuscation and distraction. He reminds readers that defeating Hitler did not defeat indefensible acts as such, and to that end recalls that the USSR “purged” dissidents and Jews, the US killed civilians with atomic weapons and the UK massacred its imperial subjects. The implication is that these Allied powers should not be allowed to mythologize themselves as purely noble and virtuous figures by contrast with their Axis enemies. The political oratory Orwell decries is meant to depoliticize the atrocities of one’s own country, subtracting their horror to leave only the empty formalism of terms like “pacification,” “transfer of population,” “rectification of frontiers.” These terms allow their audience to forget or ignore the horrors that might be evoked by “mental pictures,” smoothing them over with technical jargon. Contrary to Kronman’s allusion, Orwell here argues against the depoliticization of language, and for its repoliticization. Clearly, Kronman’s desire to erase the historical connotations of “master” exemplifies the political language Orwell condemns more than the evocative writing he prefers.

In Mitchell’s gloss, Orwell’s argument is “that the pervasiveness of politics was very bad for language, that it tended to replace discussion with ‘a mass of lies, evasions, folly, hatred, and schizophrenia.’”18 While this could be taken to mean that Orwell argues for the depoliticization of discussion or debate, such an interpretation would not be consistent with his account of political speech. When Orwell claims that “political writing is bad writing,” he is not objecting to the intrusion of political concerns into discussions that would be better organized around universal values. Instead, he is objecting to the elision of specific, concrete content in discussions of public policy. Mitchell’s reading is not wrong, per se, but it is potentially misleading. In part, this is because he is repurposing Orwell’s argument as an occasion for his own rhetorical task as editor of Critical Inquiry, arguing for the pertinence of the issue’s theme, “The Politics of Interpretation.” Mitchell makes room for opposing arguments in the journal by acknowledging that Orwell’s authority might be pressed into service by more than one side of current debates—having observed that politics is inescapable, but also that it can be “very bad for language.” Mitchell’s gloss is expedient in establishing the valid and pertinent point that politics has always elicited both interest and opposition among producers and critics of literature, and in that he is certainly not wrong. But his summary would likely mislead those who have not read Orwell’s famous essay, have not read it lately, or have only read about it.

Misreadings, or even reversals of the meaning of a text, seem inevitable in the exchange of approximations and citations of arguments which, especially if they are not central to the rhetorical task, are often presented in the form of a compromise deemed least likely to raise objections from supporters or detractors. This dim bog of “what everyone knows”—where debunked myths and useful innovations blur in a haze of partial recall, expedience and impatience—seems to more or less constitute every “middle ground.” I do not dispute the inevitability of this middle ground, because certainly there are times when we must deal in compromise and sojourn in bogs to reach an objective. But this foggy place is a waypoint, not a destination. Much is lost when Orwell’s argument is so vaguely characterized, as if from a distance. When it is pulled a little closer, its words and their context comprehended firsthand, we are reminded that Orwell’s irritation with political speech is not articulated as a call for autonomously aesthetic writing cleansed of political and historical impurities. It is not aligned with an ahistorical image of close reading, or a depoliticized common sense. What the essay condemns is the euphemistic jargon that depoliticizes state terror and sanitizes the horrors consequent upon state neglect. Orwell pleads with writers to reject facile professional clichés, but his proposals do not primarily aim at restoring the beauty or even the truth of English writing. They aim instead to renew its utility to the moment, to produce writing shaped to its purpose. Orwell clearly hopes that a reinvigorated forthrightness will renew the shock of atrocity dampened by technical terminology, expose the lie of apolitical consensus hiding a status quo brutality, and unveil the exploitation at the foundation of every flawless professional façade.

In revisiting what seems familiar, we often encounter surprises. Without this seemingly inexhaustible novelty of the old, literary studies would have little warrant for its curricula. But just as we reconstruct each time the memories we seem to merely review, so we stand a chance of changing our cultures each time we reflect on what we are supposed to know. The obscure power of this process informs our reflection on “the politics of interpretation,” a phrase that recurs throughout the issue of Critical Inquiry that Mitchell frames as a response to Reaganism. That phrase implicates a broad range of personal, professional and social technics of meaning-making. However we may subdivide those technics for the purposes of academic study, analytic examination remains incomplete without a synthetic account of how these various scales and spheres of meaning interact. And however we may compartmentalize these domains of meaning in our personal and professional lives, our apprehensions of particular objects are incomplete without the comprehension of reality that conditions the meaning we make of them, even as it is conditioned by them. To study interpretation is therefore to study how the subject is formed or produced by a social order, and how a social order can be transformed or reproduced by subjects—or in other words, how a self is constructed by others, and how others are constructed by a self.

1. Reading the Lines

But when we reconsider Mitchell’s citation, the question arises as to why one might wish to “keep out of politics”—a wish implied by Orwell’s pronouncement of its impossibility. Reagan’s sheltering, permissive persona reminds us that political efforts to support or encourage social changes can threaten unexamined attachments and enjoyments, which are foundational to one’s sense of significance and worth—just as a politics that supports or restores an exploitative asymmetry of power provokes subalterns to demand recognition that their lives matter. Any view that presumes a zero-sum distribution of worth will define politics as antagonism and loss, and this in turn motivates the search for a realm without struggle or death. In other words, the notion that politics is a fight to the death—one that inevitably ends in a master–slave relation—is correlated with the wish for a domain of eternal, universal excellence, truth and beauty. The fallen world implies a higher world above it. The negative reference of each of these worlds to the other—one inevitable but undesirable, the other impossible but irresistible—constructs a sense of stability, and at the same time produces a reality effect of incompleteness or inconsistency, of compensatory losses and gains. In some sense, any academic discipline that imagines itself in terms of Matthew Arnold’s “study of perfection” necessarily participates in this ambivalent structure of interdependent but irreconcilable worlds, in which the necessity of enduring quotidian struggle, strife and cruelty is compensated by an eternal realm of universal value which guarantees the superiority of social structures that shelter and defend it. This is the logic that frames devotional scholarship.

From within that frame, Orwell’s desire to unmask the atrocities of Western humanist and liberal democratic societies might appear to undermine the institutions that keep the inescapable antagonisms of a fallen world in some degree of containment. This fear of undermining institutional stability and authority informs traditionalists and conservatives, from T. S. Eliot’s attachment to the church and “the main stream” of the Western tradition to neoconservative views of culture and religion.19 But in the last decade or so, the exposure and unveiling for which Orwell advocates has been called into question from an apparently different perspective. Stephen Best and Sharon Marcus, among others, in their arguments for “surface reading,” suggest that exposing atrocity is simply unnecessary in an era of more broadly available information, and of newly obvious government corruption and ineptitude. Though these authors argue on the basis of liberal or progressive values, their arguments entail the same discrepancy between a world of politics, necessarily defined by antagonistic contradictions, and a higher, more beautiful or harmonious world.

Their opposition to George W. Bush’s policies provides the rhetorical occasion for Best and Marcus to question “symptomatic reading,” a term they employ to characterize a range of approaches in literary studies based on demystifying interpretations, which they describe in terms that evoke Orwell’s: “The assumption that domination can only do its work when veiled, which may once have sounded almost paranoid, now has a nostalgic, even utopian ring to it.”20 Just as Mitchell situates his arguments against the backdrop of Reagan, so Best and Marcus frame their proposal for “surface reading” as a response to the Bush administration’s obvious deceit, incompetence and reliance on culture war. Bush’s failures and bad faith were so plain, Best and Marcus argue, that they obviate the need for literary studies to develop and deploy sophisticated methods of reading between the lines: “Eight years of the Bush regime may have hammered home the point that not all situations require the subtle ingenuity associated with symptomatic reading.” We might pause to wonder what Orwell would make of the tentative tenacity implied by “may have hammered home the point,” but in this uneasy combination of hedging and insistence we can discern the tensions inherent in their rhetorical purpose, which requires both an inclusive breadth and a focused urgency. Best and Marcus make their argument in the introduction to a special issue of Representations that, like Mitchell’s issue of Critical Inquiry, presents viewpoints drawn from a range of current methodologies in literary studies. This inclusiveness allows them to define “The Way We Read Now” as an attempt to overcome the hegemony of demystification.

If one agrees that methods of critical unveiling or skeptical examination were obviated by the overtness of Bush’s abuses of power, then this must be doubly true of the current administration. In a period when the euphemisms of polite society have been scrapped by many politicians in favor of raunchy hate speech and overt racism, misogyny and transphobia, served up alongside a sneering public disdain for democracy and the rule of law, many of us may be led to question the pertinence of Orwell’s cautions against the depoliticized propriety of professional jargon. In Trump’s mouth, dog whistles become foghorns, eagerly advertising all the dehumanizing hate concealed by the jargon that Orwell deplores. It is tempting to believe that in an increasingly divided country, the agendas of both sides are finally exposed, apparent to all, even self-evident—making interpretation superfluous. But because the antagonism of subcultures that we often call “tribalism” encourages self-referential “information bubbles,” it also encourages the proliferation of bad faith arguments that permit or excuse structures of enjoyment predicated on aggressivity.

As I write this, examples of this disavowal and bad faith arise daily, along with spontaneous examples of tactical response. President Trump recently opined that Representative Elijah Cummings’s district was such “a disgusting, rat and rodent infested mess” that “no human being would want to live there.”21 This came two weeks after the President told four members of Congress, all of whom are women of color, they should “go back and help fix the totally broken and crime infested places from which they came.”22 The reaction to all this prompted acting White House Chief of Staff Mick Mulvaney to appear on the next episode of Fox News Sunday, apparently with the express agenda of denying the demonstrable pattern of racism in Trump’s insults.23 Mulvaney at first seems comfortable appearing on a network guided for so long by the president’s friend Roger Ailes, a network that maintained its deep and extensive ties to the administration, including a regular exchange of personnel, even after Ailes was forced out by multiple allegations of sexual assault. Under questioning by Chris Wallace, Mulvaney quickly deploys a “both sides” defense: “When the president attacks AOC plus three, when he attacks the squad last week, he gets accused of being a racist. When Nancy Pelosi does it a few days later, the left and many members of the media […] come to Nancy’s defense, how it couldn’t possibly be racist, that she was simply attacking their ideas.” Mulvaney goes on to contend, against all apparent evidence, that the president’s attacks are comparable to those of Speaker Pelosi, that he is also attacking the ideas and records of these public figures, in ways that do not implicate their race: “Look, I was in congress for six years. If I had poverty in my district like they have in Baltimore […] and I spent all of my time in Washington, D.C., chasing down this Mueller investigation, this bizarre impeachment crusade, I’d get fired.” In what has become a favorite strategy of Trump defenders, Mulvaney ignores the president’s actual message and implicitly substitutes a more substantive and judicious message in its place.

But he also goes beyond this inventive free indirect speech to conjure a set of putatively factual premises upon which to base his conclusions. Mulvaney substitutes a stereotype of black urban poverty for the actual data from Cummings’s district, while substituting a stereotype of white suburban affluence for the data that describes his own former district. Of course, neither substitution comports with the facts. Upon seeing this segment, NBC News reporter Jonathan Allen corrected the record, tweeting that “the poverty rate in Mulvaney’s old district (14.9%) isn’t much lower than it is in Cummings’ district (16.6%). Both are above the national poverty rate of 12.3%. By Mulvaney’s standard, Republican Reps. Jeff Duncan and Tom Rice of SC should be thrown out of office.”24 Relying on the false certainty afforded by familiar racist stereotypes to establish the credibility of his claim, Mulvaney is able to argue that Cummings’s district is worthy of the president’s verbal abuse because it is poor, not because of the race of its residents or their representative. This is a perfect example of how racist attacks do not so much rely on ignorance as they do on a false knowledge supplied by aestheticized fantasies, of the same kind that undergird more complex conspiracy theories. Simple ignorance would be an improvement for Mulvaney, as not knowing would at least allow room for learning, whereas his false certainty about fictional premises leaves no space for the inquiry that leads to knowledge.

Referring to his pretextual concern with the district’s poverty, Mulvaney concludes, “I think the president is right to raise that, and it has absolutely zero to do with race.” This denial is itself a racist erasure of the historical relevance of white supremacist public policy to the accumulation of wealth, and many activists or academics could refute the point by citing history or social science. This is a valid approach, but Wallace shows that it is unnecessary if one’s aim is simply to demonstrate Mulvaney’s bad faith. Almost before Mulvaney finishes his denial of the relevance of race, Wallace launches into a demonstration of a pattern of usage in Trump’s attacks, one clearly marking their racist tone. “You say it has zero to do with race,” Wallace repeats, but “there is a clear pattern here, Mick.” He focuses on a key word associated with racist and Nazi smears against people of color, religious minorities, socialists and leftists, as well as the queer community, among others: infestation. Trump said of Representative John Lewis, “he should spend time in his crime infested district.” About “the squad” of four Representatives who are women of color, Trump tweeted “they should go back to the crime infested countries from which they come.” And about Representative Cummings, Trump tweeted that “his district was rat and rodent infested.” After rehearsing this list, Wallace again emphatically pronounces the word “infested,” to observe that “it sounds like vermin, it sounds subhuman, and these are all six members of Congress who are people of color.”

Mulvaney, whose small round metal-framed glasses suddenly seem in discomfiting harmony with the historical resonances of the accusation, rears back on his stool as if to slow a galloping horse, delivering a dismissive charge with which every literary scholar is likely familiar: “I think you’re spending way too much time reading between the lines.” Wallace, demonstrating a wit I have not generally been led to expect on Fox News, shoots back: “I’m not reading between the lines. I’m reading the lines.”

In sum, Wallace simply cites the public pronouncements of the president and observes a pattern of usage consonant with one of the most historically familiar racist tropes. His claim could not be more transparent or inarguable, but because Mulvaney does not want to explicitly acknowledge and support the clearly racist attacks, he accuses Wallace of suspiciously “reading into” the language, unveiling supposedly secret meanings that are merely the product of an overly sophisticated intellectualism. But while it clearly cannot be said that Wallace was “reading into” Trump’s words, he was also doing something more than merely reading his lines. To highlight what he calls “a clear pattern,” he had to first note and select for repeated word choices, and then draw conclusions about the significance of that repetition, by placing it in a pertinent social and historical context of usage. He attentively and faithfully reorganized Trump’s utterances to emphasize their terminological consistency. And while Mulvaney could not be expected to admit what, after all, he is paid to deny, Wallace’s techniques could at least demonstrate to the audience that neither he nor they were crazy to believe their own observations and reasoning, rather than succumbing to Mulvaney’s attempts at gaslighting—which, above all, consists of the denial of arguments from reason in favor of arguments from authority. While Mulvaney insists that Trump’s words mean whatever those in power say they mean, Wallace demonstrates that their meaning depends on rationally articulable patterns, historical connotations and a social context larger than the will or caprice of the powerful.

2. Obsolete Suspicions

Best and Marcus primarily use Fredric Jameson’s term, “symptomatic reading,” to identify the hegemonic method they seek to displace. It would be outside the scope of the present inquiry to attend closely to Jameson’s reading of Marxist and psychoanalytic concepts of symptom, which differs substantively from my own. But that analysis is also unnecessary here, because Best and Marcus do not use Jameson’s term to refer solely or specifically to his claims, instead employing it as an overarching title for a range of familiar literary-theoretical methodologies. Indeed, samples of all these approaches are included in Mitchell’s 1982 issue of Critical Inquiry—including some antagonistic and mutually exclusive strains of Marxism, psychoanalysis, feminism, deconstruction, postcolonialism, queer theory, and even New Critical “close reading.” As against what therefore amounts to almost the entirety of their disciplinary past, “surface reading” is presented as the coming hegemony, gradually and organically forming out of responses to their predecessors by many critics, including those their issue of Representations collects.

This superseding novelty of surface reading is thus presented as arising from a new political and cultural climate—as we have noted, in response to Bush’s blundering indifference to the values proper to a democratic republic. Best and Marcus imply that past political obfuscation looks sophisticated by comparison with the transparent lies and abuses of the Bush administration, which have rendered obsolete the sophisticated interpretive methods to which past generations turned in order to reveal political violence and oppression. The examples they cite are revealing enough to be worth quoting at length:

Those of us who cut our intellectual teeth on deconstruction, ideology critique, and the hermeneutics of suspicion have often found those demystifying protocols superfluous in an era when images of torture at Abu Ghraib and elsewhere were immediately circulated on the internet; the real-time coverage of Hurricane Katrina showed in ways that required little explication the state’s abandonment of its African American citizens; and many people instantly recognized as lies political statements such as “mission accomplished.”25

On first reading, their principal claim seems inarguable, in part because it is a description of their own experience: They report finding their training in “demystifying protocols” unnecessary for the purpose of decoding the administration’s blatant cruelty, incompetence and deceit. But this observation about their own experience and judgments is then immediately generalized into a normative program for the discipline, producing a claim that is complexly problematic and even self-contradictory—indeed, one might say it is “symptomatic.”

Their argument can be read as symptomatic because its aporias indicate a procedure of self-universalization characteristic of privilege, which can also be discerned as a systemic pattern in the disciplines of literary studies. And because all asymmetrical binaries of privilege and disempowerment or centrality and marginalization are intersectional, this pattern implicates even those of us for whom some salient vectors of marginalization or oppression are definitive. In other words, because few if any of us are defined by the disempowered or marginalized terms of every conceivable binary, most if not all of us are conditioned by some vector of privilege. Unacknowledged and unaddressed issues of privilege have continued to undermine literary studies’ social significance since the canon and theory wars of the eighties. Symptomatic indication can therefore be read as a trope for the homology between systemic disciplinary difficulties and the four difficulties specific to this argument: First, Best and Marcus misrepresent the novelty of the Bush administration in claiming its policies as their occasion. Second, the ambiguous relation of their methodological arguments to their experience of the Bush years is consonant with their ambiguous views on interpretation’s relation to politics. Third, this ambiguity regarding the relation between interpretive act and political implication opens the way for an immediate universalization of their own privileged position. And fourth, confusion about the politics of interpretation manifests in inconsistent representations of their intellectual lineage.

While it would be easy to dismiss their response to the Bush administration as a rhetorical device, merely an occasion for their argument with no more than decorative significance, it is rather the primary premise offered in support of their conclusion that symptomatic reading is obsolete or outmoded. Without the political frame, readers might be tempted to view their call for a new methodological consensus as arising purely from their own personal preferences or goals. While nothing in principle prevents them from framing their arguments in that way, they do not, and instead they cite a shared experience of the Bush administration as a turning point in the interpretation of politics. Demystification, they imply, was necessary at some time in the past, but the events they cite have made it “superfluous” for scholarship and teaching—because information about state violence has been made widely available (e.g., images from Abu Ghraib), inequitable state neglect has been prominently featured in news media (e.g., coverage of Katrina), and the president has made claims that “many people” know to be deceptive (e.g., “mission accomplished”). But if we attend carefully to these claims, we find that none of these events mark the epochal shift their argument presumes. None of these names a novel development of the Bush era, and so it is difficult to understand how these events can be supposed to have convinced Best and Marcus that we were entering upon a newly unmystified, unveiled era in politics.

Despite their characterization, the images from Abu Ghraib were not in fact “immediately circulated.” While it is technically true that those images were available on the Internet earlier than most Americans became aware of them, it took approximately 10 months for those abuses to garner widespread media attention. By way of comparison, this is only a few months less than it took the media to give national attention to the Mỹ Lai Massacre decades earlier, and information about that event was also available before the media thematized it as a national issue.26 Abuses at Abu Ghraib did not therefore represent an effectively new experience of immediate exposure. Similarly, what Best and Marcus characterize as “real-time coverage” of the state neglect of African Americans after Hurricane Katrina can be compared to page one of the New York Times on May 4, 1963, which prominently features the previous day’s events under the headline “Dogs and Hoses Repulse Negroes at Birmingham.”27 There does not seem to be anything new, therefore, about real-time attention to such racist state abuses.28 And after “mission accomplished,” the lies of the Iraq War do not appear to have been “instantly recognized” as such by most Americans—at least not in any way that effectively undermined Bush’s majority of the popular vote in 2004, which improved upon his failure to secure a majority in 2000. Perhaps as they were writing in 2009, amidst historically low approval ratings for Bush and in the wake of Obama’s victory, Best and Marcus forgot that the 2004 election records popular approval for Bush after two of the three events they cite.

If the same kind of immediacy and transparency that Best and Marcus impute to the Bush years could also reasonably be imputed to earlier periods, then we cannot conclude that their views were formed in response to changing conditions. This is evidence that the Bush administration functions not as a provocation for their call to reject “demystifying protocols” but as a post hoc rationalization. There is nothing new about Bush’s disregard for international law, civil rights, equitable treatment or transparently responsive governance—all of which are consistent with Republican Party rhetoric and policy under the Nixon, Reagan and Trump administrations. In the same way, the skepticism professed by Best and Marcus regarding political engagement in literary studies is not new. Hostility toward democratic governance is consistent with skepticism about the efficacy or desirability of political engagement, and both are perennial tendencies in a long-running war of position. They are especially characteristic of the neoliberal consistency that has gained in power and influence in the US during the last 50 years.

This leads us to the second difficulty, which arises when we consider the nature of this ostensible occasion for their argument: The obviousness of Bush’s abuses of power is cited in order to obviate political engagement in literary scholarship, as “the disasters and triumphs of the last decade have shown that literary criticism alone is not sufficient to effect change.” Here they admit literary scholarship’s implication in politics, by offering its lessons as germane to disciplinary objectives, but deny that such scholarship in turn has implications for politics. Literary studies is thus subordinated to determination by political events, while they deny its capacity to share in the determination of those events. So, while they disapprove of Bush’s actions, they also argue against scholars’ efforts to respond, dismissing them as pointless attempts to make literary criticism into “political activism by another name.” But if demystifying protocols cannot “explain our oppression” or “effect change,” as they claim, then why would it also be necessary to argue that their effects are obviated by the obviousness of Bush’s acts? The former implies that demystification is insufficient to the task, while the latter implies that it would be sufficient, but it is not necessary, since those abuses and lies are apparent even without sophisticated interpretive techniques. If criticism is incapable of explanation, then why insist that its explanations are unnecessary because “many people” recognize “immediately” or “instantly” the deceptions and abuses that are taking place?

Like their attribution of novelty to aspects of the Bush administration with ample precedent in US history, their argument in the alternative suggests that Bush is used as a rationalization for their views on political engagement, rather than having led to those views in the first place. The remaining two difficulties indicate some of the conditions that allow for such rationalizations to pass unnoticed. Best and Marcus call ideology critique “superfluous” in connection with a catalogue of Bush administration misdeeds that cites dehumanizing aestheticizations like Islamophobia and racism, a prominent denegation in a disavowed imperial war, and victim-blaming against subalterns impoverished by state policy—to characterize only a few aspects of the events named there. Their argument that critique of ideology is irrelevant is proposed in a rhetoric that implicitly universalizes the recognition that certain views and systemic inequities are harmful, in spite of its authors’ explicit acknowledgement that those who so recognize are “many,” and not all. The clear implication is that those who do not view Bush as they do need not be heeded, and their different interpretations need not be explained or addressed. And yet, precisely because I recognize, like Best and Marcus, that the harms of the Bush administration were unambiguous and apparent, I am genuinely puzzled by their incurious attitude toward the majority of US voters who approved of him.

To produce the recognition described by Best and Marcus generally requires a particular position in the system of US property relations, one that seems relatively common among humanities academics, but characterizes a small minority of the US population. This position is indicated by the conditions necessary to each perception Best and Marcus cite: Relatively high-speed internet access was required to encounter “images of torture at Abu Ghraib” that were “immediately circulated on the internet.” Cable television was required to view “the real-time coverage of Hurricane Katrina.” The instant recognition of “mission accomplished” as a lie is made far easier by a lack of personal connections to those fighting the war, given the deeply wish-fulfilling valence of that banner for every parent, sibling, partner or child of a US soldier, who is disproportionately likely to come from a poor or working-class background. And the initial desirability of these policies and actions is conditioned, in part, on racist, Orientalist, imperialist, militarist, patriarchal, compulsorily heterosexist, capitalist, theistic and otherwise metaphysical ideological assumptions that are usually unlearned only through some prior training in some mode of demystification, whether in an academic context or otherwise. But all this positional specificity and class or educational background is implicitly presumed as a premise for what should define the necessary features of higher education research and pedagogy in interpretation and writing. It is as though Best and Marcus believe their experience to be representative, instead of conditioned upon an exceptionally privileged position in a society where, for example, tenured professors’ income places them in the top quintile. Here the entire orientation of literary study as an educational enterprise and a social good is predicated on the class experience of that elite group.

This suggests an explanation for their indifference to the interpretive problem posed by widespread support for Bush during much of his presidency, including after most of the actions they cite. Instead of approaching this support as an occasion for pedagogical and interpretive interventions, as an opportunity to demonstrate the utility of literary studies in explaining our world, Best and Marcus describe the Bush administration solely from the perspective of its opponents, as if those opponents constituted a majority all along. This invented majority of interpreters who agree with Best and Marcus is then cited as evidence of the superfluity of politicized criticism—as if the ineffectuality of this fictional majority is proof that interpretive clarity is insufficient to affect politics. It is as though, having been trained in a culture and era that universalized and glorified the solitary heroism of the thinker and interpreter “supposed to know,” a mythic figure of Western thought whose agency is sufficient unto itself, Best and Marcus reject this agency’s omnipotence by concluding that it is utterly impotent. As if the diametrical opposite of a debunked fiction would be the only possible truth.

Far from obviating the need for sophisticated interpretive methods, the demystification of theoretical omnipotence would seem to call for unprecedented efforts. What appears transparent to one of us often appears as a mirror to those on the other side. The “MeToo” movement has lately dramatized how patriarchy and compulsory heterosexuality have encased even liberal, educated professionals in an interpersonal version of this two-way mirror, in which powerful men preen, reflecting on their own demands without reciprocal attention to others. This masculinist narcissism has long been transparent to many women who struggled to make careers in a sexist society. But so many men now seem to be shocked that their mirrors were windows to those on the other side, that their abusive behavior had been plainly visible to those who were not empowered to hold them accountable. White privilege and class privilege operate in homologous ways to that masculinist culture of assault and objectification. And on a national scale, we are increasingly divided into ideologically homogenized factions viewing entirely different realities. Just now, the increasingly desperate rhetoric used by journalists and pundits to describe these problems would seem to indicate that explanations are difficult to come by, and expertise in interpretation might be useful. Perhaps if more scholars and teachers in literary studies would acknowledge and attend to the relevance of politics to interpretation, then we could contribute to understanding the relevance of interpretation to politics.

3. Bad Faith

In their dismissiveness toward political engagement in literary studies, Best and Marcus not only present inconsistent or impoverished characterizations of their opponents’ views, but also misrepresent or misrecognize their own intellectual lineage. They seem to speak to both sides of the perennial question of autonomy and engagement, claiming the political environment of the aughts showed them “that literary criticism alone is not sufficient to effect change.”29 Similarly, they cite authors who advocate for the disqualified, demeaned, harmed and neglected, but at the same time argue against scholarship as “political activism by another name.” This difficulty is elided in their characterization of themselves as “the heirs of Michel Foucault, skeptical about the very possibility of radical freedom and dubious that literature or its criticism can explain our oppression or provide the keys to our liberation.” But like their rejection of “literary criticism alone” as sufficient for social change, the invocation of “radical freedom,” taken colloquially, implies an extreme or unreasonable opponent, and understood terminologically, it introduces a Sartrean phrase with little purchase among US activists. It is also a concept that can be critiqued as apparently voluntarist or idealist, especially in light of Marxian, Nietzschean and Freudian insights concerning determination and conditioning, choice and will, developed by subsequent theorists of sex, gender, sexuality, race and class. Without acknowledging these contexts and complexities, their characterizations of precursors and opponents as extremes on either side imply that they defend the capaciously common-sense positions of an inclusively moderate majority, as opposed to the views of unidentified critics and theorists engaged with politics, whom their vocabulary implicitly depicts as totalizing or unreasonable.

This kind of rhetorical positioning is often used to defend an exclusionary and exploitative status quo against those it excludes and exploits, and has long been employed against socialists, feminists, abolitionists and other malcontents—as Foucault, among others, has noted in tracing the historical discourses that have defined incarcerated people, the mentally or physically ill, or those who diverge from gender or sexual norms. In short, appeals to the authority of a presumed moderate majority—like those of the “silent majority” or “moral majority” to which Reagan, Nixon or Bush appealed—often succeed at the expense of the most vulnerable. Like all arguments from authority, they do not require reason or principle, because they rest on the implicit threat of exclusion. The definition of others as negative images of oneself is also consonant with a reluctance to engage with the ways in which a text is at odds with itself, as both rely on objectifications that deny internal contradiction. In this way, both are also consistent with the reluctance we noted in Best and Marcus to engage with the internal political differences of Bush’s America. This avoidance of internal difference and political history would at least partly account for a professed inability to “explain our oppression or provide the keys to our liberation.”

It is certainly strange to encounter an argument that, during the Bush administration, the veils were lifted from the authoritarian aspirations of the US right, given that Bush’s candidacy began with his claim to represent a new, “compassionate conservatism,” supported by the widespread belief that both major political parties were substantively the same, as well as by distortions in campaign coverage.30 But that same candidacy was decided by the Supreme Court’s partisan verdict in Bush v. Gore, producing charges by legal scholars of outright “illegality.”31 That era also demonstrated the powerful influence operations of Republican strategist Roger Ailes, whose full editorial control over Fox News conditioned views of the PATRIOT Act, the “Defense of Marriage” and the War on Terror, even as he branded the network’s coverage as “fair and balanced.” In the Bush years, just as in previous decades, racism, sexism, homophobia and xenophobia were obvious to many, and at the same time they were persistently disavowed or obfuscated.

Of course, it will always be enticing to think of a day when we will be able to forgo interpretive engagement with the kind of bad faith represented by George W. Bush’s “compassionate conservatism,” Reagan’s putative love of liberty or Richard Nixon’s professed devotion to law and order. It is possible that the election of Barack Obama led Best and Marcus, like many others, to believe that the United States had turned a corner, and would go on to conclusively renounce its history of hate and fear. But a decade later, there is no question that those lessons are not yet learned. What they call the “nascent fascism” of Bush is now growing up fast, maturing into something much closer to the Europe-bestriding stature proper to Orwell’s forties, complete with far-right nationalist parties rising to power in Austria and Poland, and torch-wielding mobs in US streets chanting “blood and soil.” After a decade of gains, a new conservative majority on the Supreme Court places civil rights in immediate peril, including voting, labor and reproductive rights. Transparent disregard for the rule of law and for equitable treatment of people of color, which Best and Marcus rightly abhor in the Bush administration, has long been featured in descriptions of the Nixon and Reagan administrations, and is now a prominent feature of Trump’s presidency. One need only read the lines. But such reading has never been a given of political discourse, as comprehension and recognition have never been the default in human communication. It is only on the basis of misreading, incomprehension and misrecognition that we learn, by reflecting on what has not worked and what we must change.

The bad faith characteristic of authoritarianism, imperialism and all the various forms of scapegoating or victim-blaming has rarely been more potent in US politics than it is now. But the incentive structures that support bad faith have long been discernible, for example, in the systemic oppressions that comprise white supremacy in the United States. Now, after decades of denials and disavowals, Reagan’s supporters have been confronted with new, apparently undeniable evidence of his racist views and attitudes. A recording of a private call to Nixon, made while Reagan was governor of California, was recently released, in which he casually refers to UN delegates from Africa as “monkeys” who are “still uncomfortable wearing shoes.”32 Both men blamed African countries for defeating a US-led effort to block the UN from seating the People’s Republic of China. There is no apparent reason why the African nations would be singled out, given that Nixon’s own State Department identified the responsible parties as Belgium, Cyprus, Ireland and Mexico, along with “Arab defections” from the US position.33 That is, there is no apparent reason other than the racist contempt and hatred that is apparent in Reagan’s comments. Nixon’s hearty laugh, and his subsequent repetition of the story to others, indicates that he and Reagan were of one mind about the inferiority of Africans, and did not bother to disguise their contempt in private. Of course, Reagan’s attitude toward people of color was already apparent to anyone willing to acknowledge the patterns in his rhetoric and policies as president, including his support for South African apartheid and his regular recourse to “coded” characterizations of African-Americans as lazy or criminal.

And yet when the tape was released, Melissa Giller of the Ronald Reagan Presidential Foundation and Institute responded with denials and obfuscations: “If he said that 50 years ago, he shouldn’t have. And he would be the first person to apologize.”34 Note that her conditional “if” allows for the expression of doubt about something already clearly demonstrated. This conditional allows her to continue in the vein of a hypothetical characterization based on known evidence, as if we must apply what we know of Reagan to judge an uncertainty: “If he ever mistakenly said something offensive, he would be the first to apologize,” Giller seems to say. But this speculation is clearly contradicted by the actual evidence of the call, in which he deliberately says something so casually and cavalierly dehumanizing that it can only be interpreted as evidence of his belief in his own superiority over Africans, whom he describes in literally dehumanizing terms, referencing racist and imperialist stereotypes, apparently irritated that people he considers uncivilized have a say in world affairs. To posit Reagan’s hypothetical eagerness to apologize for this is just as bizarre a gesture as Giller’s attempt to distance him from his comments by specifying that he said them “50 years ago.” This almost requires us to belabor the obvious point that, because Reagan was dead at the time of Giller’s response, his current beliefs could not be distanced from what they were decades ago. And dating the comments to 50 years previous to their release does nothing to distance them from his public record, because he was a governor at the time of the recording and began to campaign for the presidency only four years after it. In short, she can say nothing to refute the relevance of this revelation to Reagan’s public record, in elected offices that required him to represent the interests of African-Americans.

In writing as though it is a living Reagan she defends, as if what he said fifty years ago should not be held against him now, as if he were now capable of apologizing or changing his views, Giller inadvertently indicates that her rhetorical purpose is to defend the persisting and still efficacious mythologization of Reagan. Supremacist politics relies on the kind of narcissistic investment at work in this mythologization and devotion to the cult value of a personality. No matter how apparent the facts under discussion may be, this is a politics of interpretation that is founded on disavowal, as well as on the refusal to credit opponents’ equal worth, which results in refusal to credit their concerns or acknowledge harms. As indicated by Giller’s defensive response, Reagan is not remembered consistently and accurately for his policies or positions, but has instead been depicted as an aspirational figure of identification for a social fantasy of America. That fantasy universalizes a particular subject position, a straight white impenetrable Hollywood image of masculinity, a man devoted to Western civilization but anti-intellectual, Christian but aggressively nationalist and militarist, a cowboy, a businessman and a star. By apparently allowing her devotion to the aspirational fantasy of Reagan to eclipse her concern with the historical reality, Giller becomes a figure for academics whose devotion to aristocratic values exercises a censorship over our attention to the implication of our discipline in historical struggles, and its implications for those struggles.

This kind of discussion is bound to provoke anxiety among academics because, in part, our reluctance to engage with historical and political interpretation follows from an understandable desire to maintain an image of nonpartisanship. In this, we have much in common with journalists who, despite having long been targeted by the right, wish to maintain credibility as nonpartisan actors. Reagan’s taped call received a lot of attention from journalists for the same reason that Giller feels emboldened to implicitly deny the manifest evidence: Without the recording, supporters of Reagan could still disavow his racism by insisting on referential proof of a phenomenon that was already inferentially discernible, as patterns of practice in Reagan’s policies and rhetoric. Whether or not a recording emerged that included racist slurs and stereotypes, the consequences of Reagan’s rhetoric and policies would have remained the same, and they would have been just as clearly rooted in racist and imperialist beliefs about white superiority and the agenda of white supremacy. But without that recording, the clear pattern could always be rhetorically dismissed as merely reading into Reagan’s record, primarily because we continue to value referential over inferential evidence.

Reagan’s savvy in disavowing his solidarity with racists is evident from his first public appearance as the 1980 Republican nominee, when he demonstrated his intention to continue Nixon’s “Southern Strategy,” appealing to Wallace voters by signaling approval and support for institutional racism. At the Neshoba County Fair, in the county where three civil rights workers had famously been murdered in 1964, Reagan announced to a white audience, “I believe in states’ rights.”35 President Carter responded to the obvious implications of Reagan’s speech the next month at Ebenezer Baptist Church, denouncing “the stirrings of hate and the rebirth of code words like ‘states’ rights.’”36 The press responded by questioning Carter’s accusations, rather than Reagan’s rhetoric. A reporter asked Carter: “Do you think that Reagan is running a campaign of hatred and racism, and how do you answer allegations that you are running a mean campaign?” In what is by now a familiar pattern of false equivalencies, the reporter’s question balances one presidential candidate’s message of dehumanizing hate and institutional terror against concerns that it might be “mean” for his opponent to point out that hate. True to form, Carter demurred from mean-spirited accusations and equivocated on his own previous statements in assuring the room, “I do not think that my opponent is a racist in any degree.”

But as a Hollywood showman and corporate spokesman, Reagan knew when to press an advantage. He called Carter’s speech “shameful,” scolding him that “we ought to be trying to pull the country together.” After sending a message to the white South that he would protect the system that encoded the Confederacy and Jim Crow as issues of “states’ rights,” Reagan also claimed the role of messenger of unity to the whole nation. In our time, we would call this “gaslighting,” recognizing this tactic as one long used by abusers to silence their victims. And yet these kinds of tactics seem to have only contributed to Reagan’s aura of power and Carter’s image of weakness in the media narrative. By accusing Carter of divisiveness after appealing to racist support for segregation, Reagan demonstrates a form of projection, in the psychoanalytic sense of assigning to another some disliked aspect of oneself, in order to preserve a flawless image of self.37 This is a component of the splitting of the ego, in which narcissistic overvaluation requires splitting off a wholly good self-image from a wholly bad image of the other. Splitting in this way serves as a mechanism of defense against confronting the split or contradiction internal to any unity—which, if acknowledged, would pose problems and motivate change.

The avoidance of politics and history is necessarily an avoidance of internal splits and contradictions, avoidance of change and self-criticism. One of the key obstacles to social engagement in literary studies is a notion we will examine later in writings by Reagan’s Education Secretary William Bennett and Harvard humanist Walter Jackson Bate, that interpretive attention is only appropriate or worthwhile when applied to texts that incarnate perfection and universality—an assumption that can be read as a survival from the era in which the primary object of hermeneutics was scripture. This is a devotional entailment of metaphysical or mystified interpretations, like those that sustain faith in the Reagan myth. By constructing Reagan as a god-like figure of aspirational identification and his opponents as demonized figures of threatening difference, the political imagination of Reaganism is organized in homology with this splitting of the ego, marking the intersection of narcissistic identification, religious mythology and supremacist politics. Any interpretation that reduces systemic conditions to a personalized or aestheticized melodrama of struggle to the death—between forces of good and evil, being and nonbeing, or reason and unreason—participates in this mythologized pattern of interpretation, which can trace its roots to the interpretive process in which the self-image is formed. In an essay cited by Best and Marcus as providing a foundation for their criticisms of symptomatic reading, Eve Sedgwick describes precisely these interrelations, in which paranoia, conspiracy theory and narcissism indicate the politics of interpretation in literary studies and their connections to the interpretation of politics.

2. Paranoid Projects

When Mick Mulvaney accuses Chris Wallace of “spending way too much time reading between the lines,” he is exploiting ambiguities in the way we speak of conspiracy theory and interpretation as such, as well as a culturally enforced aversion toward intellectualism and self-criticism. Eve Sedgwick’s account of “paranoid reading” directly addresses this “sore spot” in literary theory, marked by our shame about participating in what Paul Ricoeur calls the “hermeneutics of suspicion.” Best and Marcus indicate their debt to Sedgwick’s definition of this hegemonic style in literary theory and criticism, and to the challenge she poses to its universality by enumerating its defining features “as one kind of cognitive/affective theoretical practice among other, alternative kinds.”38 In line with this objective, Sedgwick distances herself from “the use of ‘paranoid’ as a pathologizing diagnosis,”39 not aiming to refute or disqualify this mode of reading, but only to particularize and denaturalize it.

One defining feature of this paranoid style that Sedgwick emphasizes is “its faith in exposure.” As an example, Sedgwick describes in Foucauldian terms the purpose and method of D. A. Miller’s The Novel and the Police, an exposure of “modern discipline” as “a problem in its own right.”40 By seeking to expose political violence by means of literary criticism, this project stakes its value on an implicit claim of historical pertinence and social utility. But that claim is undermined when Sedgwick shows that the book’s problematic is more a response to theoretical predecessors like Foucault than to the social and political context in which it was produced. This autonomous response to a disciplinary intellectual history, posing as a response to a broader social and political history, risks wasting its sophisticated powers of interpretation on irrelevant or anachronistic problems.41

Sedgwick cites Miller’s book, Judith Butler’s Gender Trouble and works associated with New Historicism as examples of the broad range of influential texts that rely on paranoid reading. But she warns against confusing conventional ubiquity with necessary, universal or eternal relevance, observing that “with the passage of time […] it’s becoming easier to see the ways that such a paranoid project of exposure may be more historically specific than it seems.” The tendency to mistake hegemonic methods for necessary or definitive ones has encouraged the pursuit of anachronistic objectives in the work of Sedgwick’s students, who seem to be responding to their professors and readings more than to their own life circumstances:

I daily encounter graduate students who are dab hands at unveiling the hidden historical violences that underlie a secular, universalist liberal humanism. Yet these students’ sentient years, unlike the formative years of their teachers, have been spent entirely in a xenophobic Reagan-Bush-Clinton-Bush America where “liberal” is, if anything, a taboo category and where “secular humanism” is routinely treated as a marginal religious sect, while a vast majority of the population claims to engage in direct intercourse with multiple invisible entities such as angels, Satan, and God.42

While the terms may have changed a bit since Sedgwick’s writing in 2003, my own experience suggests that the distance she describes persists. Of course, it is easy to overlook the dimension of privilege in this attitude toward “secular humanism,” forgetting all the factors that ensure Sedgwick’s students do not emerge from an unmarked center, but are instead, as college students, disproportionately drawn from white, wealthy and highly educated households. It is reasonable to surmise that her students’ attitudes were conditioned by the overrepresentation in that informal sample of a demographic correlated with more secular, humanist and liberal views, as well as with access to public and private resources denied to many other Americans.

But this does not undermine Sedgwick’s central contention that the historical moment is consequential here. Sedgwick’s marker for the difference between her “students’ sentient years” and “the formative years of their teachers” is widely recognized as an epochal divide between two prevailing attitudes toward government in the United States. As Mitchell and Said warned in Critical Inquiry early in Reagan’s first term, his administration marked a major victory in the conservative “counter-attack” that went on to reverse much of the (uneven, incomplete, often grudging) progress made during the era of the New Deal, the world war against fascism and the Great Society. If Roosevelt’s election marked the institutionalization of an era of progressive hegemony, then Reagan’s election can be said to mark the institutionalization of a reactionary era. The victories of the New Right reinvigorated the patriarchal, white supremacist and anti-communist rhetoric of Jim Crow, John Birch and Richard Nixon as discourses of liberty and true Americanism, while stripping away many of the safety nets and safeguards put in place by progressive movements and their allies in elected office. Even in this period of hegemonic neoliberalism, Sedgwick notes, it is still common to hear professors and students in the United States speak as though we live in the “politically correct,” liberal welfare “nanny state” decried for so long by Reagan, Gingrich and others, in which a paternal government is the primary agent of power over our lives and the primary enemy for any liberative project. But this is not the government we encounter inferentially in the relevant data, even if it is the government we encounter as referent for so much political rhetoric.

Indeed, while the positive impacts of Foucault’s US influence are marked in both Said and Sedgwick, his concept of state power, formed in a political context quite different from that of the United States, can easily lend itself to conclusions compatible with neoliberal and neoconservative attitudes toward “state control.” In part, then, Sedgwick’s students overlook the problems of their own time and place because they emulate models provided by professors and influential theorists, whose work responds to other times and places. The students are working on the problems posed by an anachronistic or culturally misplaced conception of governance, rather than responding to problems posed by their experience of the society in which they live. This problem of anachronism combines with the valorization of the hegemonic class culture, especially in environments organized centripetally (to aim at concentration rather than distribution of power) and narcissistically (to focus participants’ efforts on aggrandizing a central figure of emulative identification). Where students are taught to emulate elders, rather than explain their experience, where students are encouraged to universalize a particular cultural tradition and eternalize the explanations of specific historical moments, the problem Sedgwick observes in D. A. Miller’s work will be a necessary consequence, indicating a larger systemic problem of self-universalization characteristic of privilege.

It should not be surprising that academic disciplines disproportionately composed of a privileged minority of wealthy white people—thus as insulated as possible from the precarity and caprice of markets and power, and conditioned by every prejudice in our society to see themselves as superior to others—have not broadly registered political or cultural shifts with more alacrity than Sedgwick suggests. Writing 15 years before my complaint, Sedgwick is already dismayed by the delay in responding to Reagan, citing an example from 15 years prior:

Writing in 1988—that is, after two full terms of Reaganism in the United States—D. A. Miller proposes to follow Foucault in demystifying “the intensive and continuous ‘pastoral’ care that liberal society proposes to take of each and every one of its charges.” As if! I’m a lot less worried about being pathologized by my therapist than about my vanishing mental health coverage.43

I am deeply grateful for Sedgwick’s confident scoff, which highlights the dramatic irony of Miller’s evocation of a universal welfare state never implemented in the United States. While those reared in the top income quintile of US households might have lived their lives with ample access to medical care, the remaining 80 percent of households have too often had only intermittent or precarious access to it. Writing in George W. Bush’s first term, Sedgwick observes that “since the beginning of the tax revolt, the government of the United States […] has been positively rushing to divest itself of answerability for care to its charges, with no other institutions proposing to fill the gap.”44 With dramatic increases in the discipline’s reliance on poorly compensated casual labor, it now seems certain that far fewer among English faculties have stable and reliable mental health coverage than did at the time of Sedgwick’s writing, though it is also likely that far more of us find ourselves in need of it.

Thirty years of neoliberal national policy since D. A. Miller’s rhetorical emulation of Foucault—a period in which such policies have driven the spread of precarity from manufacturing jobs to professional careers, as illustrated by the changes in English—demonstrates the victory of the counter-revolution Said warned against while Reagan was still in his first term. In the red-state, fundamentalist Christian, white working-class subculture in which I was reared, far away from the demographics most likely to produce humanities professors, the words “secular, universalist liberal humanism” are not merely marginalized, as Sedgwick rightly suggests, and not even merely subject to the silence that might attend a taboo, but actively and energetically demonized. In my own experience, these terms seem to be among the more familiar names for what is perceived in such communities as a grand internal threat to the “real America”—one that matches the external threats of the Communist “Evil Empire,” “radical Islamic terror” or the once-always-approaching “migrant caravan.” “Secular humanism” and “liberalism” are titles given to an insidious, “globalist” anti-Christian plot as reviled as any of the racial, sexual and religious slurs with which these terms are routinely connected. To someone growing up inside (though, in many ways, on the wrong side of) that Real America, making and studying literature seemed like an alternative. Of course, this liberal secular world in which I sought refuge from reactionary moralism is not as diametrically opposed to fundamentalist conservatism as it is often supposed to be.

The examples Sedgwick cites therefore serve, in part, to remind us of the principle that undermines all privilege, the truth that must be obfuscated for privilege to persist: No particular view can also be a universal view. This negative principle implies a positive correlative: Every singularity of experience indicates a universal principle. For example, because the experience of state neglect is applicable to a far larger portion of humanity than Miller’s articulation of “pastoral care,” that paternalistic relation to the state, even if it was his experience, is not definitive of the state as such. But this does not mean that Miller’s experience is not worthy of articulation or indicative of pertinent truths about the state. The political contrariety we are now tasked with overcoming, if we hope to deliver democratic governance to future generations, is a forced choice between the violence of state neglect disguised as liberty, and the violence of state manipulation disguised as care. While I welcome Sedgwick’s “as if!” for the space it makes for my experience in the conversation, the experiences of paternalism that Miller articulates are also necessary to illuminate the ways in which state care can be corruptly manipulative, abusive and controlling. Rejecting both paternalistic liberalism and “tough love” conservatism would require a recognition of both as rationalizations for exploitative violence by appeal to misrepresentations of fidelitous love (philia).

These correlative principles—that every singularity is a path to universality, but no particularity can own or identify itself with the universal as such—entail that diversity is necessary to knowledge production. Not as a matter of public relations or normative moralism, but because universalizing any particular position or life experience inevitably leads to omissions that must be continuously suppressed if that claim to universality is to be maintained. The resulting misrecognitions and misapplications impoverish any attempt to solve problems, mobilizing the resources of the ego to impede, rather than shepherd, the transformative work of learning.

This insight is tangible in Sedgwick’s treatment of “paranoid reading,” not as the essence or definitional principle of theory or criticism “as such,” but as a particular critical “position”—using the Kleinian sense of this latter term: “the characteristic posture that the ego takes up with respect to its objects.”45 In questioning paranoid reading’s project of exposure, Sedgwick seeks to expand the range of positions available to theorists beyond that of paranoia, rather than to disqualify any position. She is not questioning the utility of that project for some ends, but only the wisdom of universalizing any particular approach: “The force of any interpretive project of unveiling hidden violence would seem to depend on a cultural context, like the one assumed in Foucault’s early works, in which violence would be deprecated and hence hidden in the first place.”46 As Sedgwick observes, this context differs from the era of Reaganism, what she calls the Reagan–Bush–Clinton–Bush America in which, “while there is plenty of hidden violence that requires exposure,” the dominant culture is conditioned by “an ethos where forms of violence that are hypervisible from the start may be offered as an exemplary spectacle.” From the Evil Empire’s nukes to Saddam’s WMDs, from the shirtless suspects on Cops to Willie Horton, from “welfare queens” to casualties of AIDS, little effort is put into articulating pious concern for the victims blamed by a US government increasingly driven by neoliberal and neoconservative imperatives.

Of course, the neoliberal agenda of that period is well served by a phallic image of the state, as necessarily defined by domination or manipulative paternalism. And the inverse of the premise of phallic potency is the image of democratic governance as ineffectual, incompetent or impotent, which as much as the image of naked predation only reinforces a neoliberal “starve the beast” strategy. As a budgetary strategy, this starvation means first cutting taxes, so that spending restrictions can later be introduced as necessary to avoid deficit and debt—a strategy illustrated by Mitch McConnell and Paul Ryan musing, just after the 2017 tax cuts, about the need to cut Medicare and other social programs to address the budget deficit they had just enlarged.47 If we apply the same pattern to an electoral strategy, “starve the beast” would mean sowing mistrust of government, voter disengagement and disillusionment with the efficacy of electoral mechanisms of power. By undermining the legitimacy of democratic governance, one muffles the outcry that might otherwise follow restrictions on voting and other civil rights. In both its budgetary and its electoral forms, the monstrous state must be depicted as only capable of benefiting the bestial other, so that attacks on the state are attacks on the other. It would appear that this is the strategy pursued by the same partisan forces that support the tax cuts, and it is well supported by the strategy of division and voter suppression that US intelligence services and the Mueller Report have attributed to Putin’s efforts in the 2016 election.48

Putin’s reorientation of Russia’s burgeoning democracy into an oligarchic mafia state illustrates the benefits of undermining democratic legitimacy for those with corrupt agendas. As Russian information operations have illustrated, those who seek to undermine democratic governance encourage a definition of politics as pure antagonism, rather than as collective action in pursuit of a common good. Logically, the politics of antagonism is the politics of supremacy, because it rejects the necessary democratic principle of consent, which requires the state to serve all its citizens, not solely those who supported a given politician or party. This indifference to consent is common to patriarchy, white supremacy, imperialism and capitalism. It is also evident in both the old left paternalism of government as caretaker and the New Right notions of government as predator, because both definitions ignore or dismiss a key consequence of governance by consent: that such a principle makes the state, in principle, an instrument of the people’s self-governance, self-determination and self-care. Of course, this is not the government we have always had, and we have never had such a government entirely. But we have had such governance to some extent, in some times and places. That distance between democratic principles and historical democratic governance is what informs Masha Gessen’s description of democracy as “an aspirational ideal,” toward which a society progresses or does not.

Conspiracy Theory

By bringing together D. A. Miller’s anachronistic notion of the state with a discussion of conspiracy and systemic oppressions, Sedgwick’s essay dramatizes the consequences for interpretation of indifference to historical change and subject position. She describes Miller’s writing as a brilliant performance in emulation of Foucault’s characteristic concerns, but by relocating those themes in the United States of the eighties Miller produces an anachronistic exposure of the violence of paternalistic care in an era of neoliberal neglect, placing the state in the role of universal agent, rather than situated instrument. Disciplinary specialization, which presumes the development of a discourse that is either relatively or absolutely autonomous, here overlaps with the entailments of self-universalizing privilege to condition anachronisms and misrecognitions of the problems posed by one’s historical and political situation. And when literary studies neither produces profit nor contributes to solving social problems, what will be its constituency? And who will object to its demise?

Sedgwick illustrates the difference between the violence of state intervention and state neglect by describing Reagan’s attitude toward the AIDS epidemic. This difference is relevant even when they achieve the same ends: If Reagan had executed fifty thousand people whom his constituents’ Christian morality condemned as sinful, the action would have been condemned in the same terms as the state crimes that Orwell describes. But in allowing approximately that number to die of AIDS while he refused even to acknowledge the existence of the epidemic,49 Reagan could find cover in distraction, public inattention, or his own ideology of liberty as absence of state intervention.50 Reagan did not actively murder fifty thousand citizens, most of whom were gay men, and were therefore condemned by the particular strain of religious dogma he courted and encouraged. The evidence suggests that he merely allowed those fifty thousand humans to die, while taking no action at all.

Sedgwick’s account of the time shows that this distinction was not always clear, and that many who opposed Reagan were predisposed to understand his actions in the mold of state attack, rather than state neglect. In the eighties, Sedgwick recalls asking a friend about the idea that HIV was “deliberately engineered or spread” by the US government. Cindy Patton, a sociologist and a historian of AIDS, replied, “I just have trouble getting interested in that.” In a nuanced and considered reply, Patton questions the utility of such suspicions to projects for change, ultimately rejecting the notion that conspiracy theories are helpful in pragmatically progressive efforts to solve the problem of the epidemic—efforts that Reagan’s administration should have been undertaking. Sedgwick acquits herself well in taking time to reflect on this response, which seems to pose a challenge to her worldview, rather than dismissing or attacking Patton’s response as insufficiently radical or oppositional. She reports that, having “brooded a lot over this response” in the years since, she has come to find it “enabling” and empowering.

When Sedgwick broached the possibility of a government conspiracy to spread HIV, “sometime back in the middle of the first decade of the AIDS epidemic,” Patton replied with a clear sense of the equivalent violence of state neglect and state attack. Conspiratorial speculation was uninteresting, Patton explained, because even if proven, it would add nothing to what she already knew:

that the lives of Africans and African Americans are worthless in the eyes of the United States; that gay men and drug users are held cheap where they aren’t actively hated; that the military deliberately researches ways to kill noncombatants whom it sees as enemies; that people in power look calmly on the likelihood of catastrophic environmental and population changes. Supposing we were ever so sure of all those things—what would we know then that we don’t already know?51

At this time, when Reagan could be seen to be entrenching Nixon’s cultural agendas—the war on drugs, glorification of a masculinist militarism and misogynistic heteronormativity, white supremacy at home and abroad—Patton had no need to unearth the secrets of a new conspiracy to establish that there was support inside and outside government for actions that harmed the preponderance of the victims of AIDS. She did not need to read between the lines. Reading the lines of American history, and the overt public rhetoric and policies of the Nixon and Reagan administrations, was more than enough to prove it.

Patton’s reasoning illuminates a definitive aspect of conspiracy theories, one that challenges neat equations of reason or logic with power and privilege: Conspiracy theories are unnecessary explanations. They do not begin from an unsolved problem, a difficulty or a gap in knowledge, then seek evidence and arrive at conclusions through inquiry and sound reasoning. Instead, they begin from the assumed, from that which is experienced as known, applying what one already believes to explain what one does not understand. In doing so, conspiracy theories tend to reinforce initiating assumptions, and strain the credulity of those who do not share these assumptions. Their complexity arises not as a consequence of encounters with the overdetermined richness of an interdependent field of causes and conditions, but as a corollary of the need to explain how the presumed and easily comprehensible agenda of the conspirators is hidden from the uninitiated. Of course, sometimes this complexity of concealment is also borne out by reasoned inquiry into conspiracies, as in the reporting on the Pentagon Papers and Watergate, or the investigations into Russian influence operations. The demonstrable existence of conspiracies, Patton’s answer implies, means that it is not incredible to accuse the US government of deliberately spreading disease: There are historical precedents for this, from the spread of smallpox among Native Americans52 to the deliberate withholding of syphilis treatment from African Americans.53 However, these plots do not provide sufficient reason to default to government conspiracy as explanation for any epidemic among members of an oppressed population, especially given that what Sedgwick calls “systemic oppressions” are usually sufficient to cause harm without any additional conspiratorial plot. And inquiries into historical conspiracies have not described flawlessly executed plots by farsighted masterminds, as even intelligent, competent and experienced political actors like Presidents Johnson and Nixon were drawn deeper into secrecy and illegality by unforeseen circumstances and unintended consequences.

In short, the fantasy of conspiracy as explanation is also a fantasy of the whole agency of historical actors, an agency unimpeded by any objectifying determinations or conditions. Patton’s answer bypasses this fantasy to indicate pragmatic attempts to respond to apparent problems, rather than suspiciously leaping to a more “real” cause of the problem beneath or beyond its appearance as a situated set of conditions. Sedgwick frames this difference in terms of the relation between view and practice, or knowledge and action:

Patton’s comment suggests that for someone to have an unmystified, angry view of large and genuinely systemic oppressions does not intrinsically or necessarily enjoin that person to any specific train of epistemological or narrative consequences. To know that the origin or spread of HIV realistically might have resulted from a state-assisted conspiracy—such knowledge is, it turns out, separable from the question of whether the energies of a given AIDS activist intellectual or group might best be used in the tracing and exposure of such a possible plot.54

One’s answer to that question “represents a strategic and local decision.” In efforts to mitigate or terminate harms, exposing a plot is only necessary to the extent that the conspirators are taken to be the cause of those harms, or else continue to be a condition pertinent to the production of those harms. Patton’s activist efforts may or may not call for a “paranoid project of exposure,” because the harms they address may or may not be consequent upon state conspiracy: This is the matter to be determined by inquiry, rather than imposed according to a formalism that predetermines the inquiry.

Thus Sedgwick demonstrates the wisdom of reflecting on the historical and positional conditioning of the most reliably familiar abductions, which one has been led to expect by life experience or the bounds of one’s knowledge. Her willingness to contemplate Patton’s different view allows her to relativize the default premises determined by historical conditions in which she was constructed as a subject—conditions that include the Cold War era of opaque and far-reaching government conspiracies. One such documented conspiracy is COINTELPRO, under the auspices of which the FBI undertook vast operations to discredit, attack and assassinate leaders of left movements.55 Beginning with communists and socialists, the program soon expanded to target activism by feminists and people of color. As in the cases of the Pentagon Papers or Watergate, it made sense to investigate and expose these secret and illegal activities of US government officials in order to prosecute or end their crimes, or to ameliorate the reputational damage done. But as a result of the publicity around these conspiracies, an informed and engaged US person who lived through the seventies might be expected to perceive any harm directed at or limited to left activists or marginalized communities as the likely product of a government plot. Given the political history of the United States since Eisenhower, that conclusion is reasonable—in the literal sense that there are valid reasons to conclude thus. But in spite of the Reagan administration’s deceit, plots and cover-ups, including most notably Iran-Contra, those of us who came of age during or after his administration might presume conspiracy theory instead to be the province of right-wing extremists—as it seemed often to be under Clinton, Bush and Obama. It might even seem unlikely to us that a government plot could be effective or long remain secret, an assumption informed by the Bush-era bungling that Best and Marcus highlight. If Sedgwick manages to innovatively work through these conditioning forces, this is because she remains curious about methodological approaches outside those supported by her own conditioning, and attends to those like Patton who dispute or question her assumptions, as well as those like her students whose experience and frame of reference differ from her own.

By relativizing her premises, and by opposing the theoretical monoculture in which the paranoid position is identified in an exclusionary way with theory as such, Sedgwick exemplifies the utility of theoretical reflection defined as examination of one’s own presumptions and premises. As she demonstrates in describing her students’ anachronistic emulation of Miller’s emulation of Foucault, methodologies are tied to thematics and problematics, and all of those are conditioned by one’s life experience. While the Eisenhower–Kennedy–Johnson–Nixon America that conditioned Sedgwick’s premises might lead her to think first about state intervention as a cause of the HIV epidemic, the Reagan–Bush–Clinton–Bush America that was just emerging as she conversed with Patton would lead us to think first about the likelihood that state neglect would enable and exacerbate the epidemic. For those shaped by whatever era is now still emerging, the first thoughts will no doubt be different again.

But this relativity is not total: We can establish logically that defaulting to conspiracy is at odds with what Sedgwick calls “an unmystified, angry view of large and genuinely systemic oppressions,” because it substitutes personal, agential cause for social, systemic conditions. That oppressions are systemic entails that the normal functioning of the system itself is an elaborate conspiracy against the populations it oppresses—though not one that is comprehended, much less directed or caused, by a single individual, agency or institution. A systemically oppressed or exploited population is subject to so many vectors of state neglect and state attack that it is unnecessary, in principle, for those in power to introduce additional, elaborate conspiratorial plots to cause them harm. Of course, such plots are sometimes undertaken, as in COINTELPRO, but exposing or explaining those plots is neither necessary nor sufficient to comprehend systemic oppressions.

I do not mean to say secrecy, fraud, collusion and conspiracy are not involved in the workings of oppressive power, but rather that these are the workings of power, and thus one need not look beyond, behind or beneath the apparent abuses of power that characterize our systemic oppressions to find a “true” or “real” cause of the cause. This is even the case in actual conspiracies: Why would we not believe Black Panthers who claimed that the FBI was attacking them? Why would we not believe that Reagan sold weapons to Iran to fund anticommunist death squads in Central America and lied to cover it up, or that Bush lied to get us into the Iraq War, or that Clinton lied to cover for his affairs, or that Trump obstructed justice to prevent explanation of his campaign’s collusion with Putin? A responsible inquiry need not fear beginning from an abduction about what appears to be the case, because responsible inquiry subjects that starting point to examination, research, revision and tests of logic and evidence before a conclusion is reached. In the same way, we need not fear to acknowledge that history or politics condition our own views, preferences and expectations—we need only commit ourselves to examining our own sources and premises, and attending to those who differ from us.

So an “unmystified” view is compatible with an “angry” one, in Sedgwick’s terms, because an inquirer need not pretend to be disinterested from the start: Indeed, without anger or passion to fuel the process, what would be the motor of all the work necessary to change one’s mind? This means that conspiracy theories, which confirm rather than change one’s assumptions, are at best a useless supplement to the kind of social analysis that incentivizes transformative activism, and at worst a disempowering replacement for such analysis. The paranoia of conspiracy theorizing tends to overvalue the power of those who occupy the hegemonic position, representing them as masterminds manipulating the oppressed like objects, able to shape the world while hiding their involvement completely. Paranoia and conspiracy therefore construct a view of the world that flatters the powerful and demeans the oppressed.

Superman Cape

Because it identifies powerful enemies, paranoid reading can appear radical, only to later reveal its complicity with the narcissism of the oppressor. Kanye West provides us with a contemporary example, in the form of his unexpected recent displays of affection for President Trump. The affinity was unexpected, in part, because West famously responded to the mismanagement of Katrina relief by announcing, at a charity concert in 2005, that “George Bush doesn’t care about black people.” In the same year, West drew an analogy between the AIDS and crack epidemics, scourges of the queer community and the black community in the eighties, calling AIDS a “man-made disease […] placed in Africa just like crack was placed in the black community to break up the Black Panthers.”56 On his album of that year, Late Registration, the song “Heard ‘Em Say” announces “I know the government administered AIDS,” and “Crack Music” names the culprit behind West’s conspiratorial view of that epidemic: “How we stop the Black Panthers? Ronald Reagan cooked up an answer.”57 As we have noted, investigations revealed that it was an institutionally white supremacist and stridently anti-communist system of law enforcement that “cooked up an answer.” And that answer was much less inventive, involving the harassment, murder and imprisonment of civil rights workers and leaders, including the Panthers’ Fred Hampton, throughout the sixties and seventies.

The mechanics of exposing this truth did not require sophisticated hermeneutics, in the sense adduced by Best and Marcus, but dogged research and pattern recognition, of the sort done by journalists and Congressional staff in the sixties and seventies to uncover COINTELPRO—or in a simpler form by Chris Wallace interviewing Mick Mulvaney. And for all their complexity, the systemic explanations for crack, AIDS or the Panthers’ decline are still the parsimonious option: Given the historical commitment, or at least indifference, of US authorities to the destruction of black lives, why would Reagan need to resort to inventing a new method of ingesting cocaine in order to undermine a community organizing movement? Would it not be infinitely simpler to continue along the course of neoliberalism and culture war, relying on those policies to accomplish the destruction of black community mobilization? Union-busting, deregulation and free trade policies would undermine employment opportunities, spending cuts would limit the social safety net, Nixon’s drug war would continue to demonize people of color and a history of racism would ensure their disproportionate incarceration. In theory, none of that would require of Reagan any special conspiratorial action, apart from his already well-publicized policy orientation. For all the ways Reagan did expand government’s power and increase spending, in this area he only needed to curtail the federal government’s activities, to govern less, in order to accomplish the aims of white supremacy—as was also the case during Reconstruction and Jim Crow. This, of course, is one obvious meaning of “states’ rights.” The consequence of this reasoning is that even if a conspiracy to invent crack cocaine had been carried out by Reagan, this would still not necessarily be the most important target of activist resistance to Reaganism, because without neoliberal and neoconservative policies and rhetoric, finding operatives for such a conspiracy would not be possible, and with such policies and rhetoric, such a conspiracy is superfluous.

What does require interpretive sophistication and education is the ability to recognize the consequences of one’s own interpretive defaults, which one unwittingly universalizes like the proverbial fish in water. Public opinion in the eighties seems to have drifted toward the default assumption that crack and AIDS, not to mention the decline of the Black Panthers and other civil rights struggles, were “special interest” problems germane only to minority groups, even problems those groups brought upon themselves, not affecting “normal” or “mainstream” Americans. This is in some sense the implicit purpose of ad hominem disqualification, a dismissive or even demonizing definition of those who differ from one’s self-image in ways deemed salient: We disqualify others from our attention so that we can ignore or disavow the harms done to them. It is not obvious, without some interpretive effort, that this default to a self-centered or self-interested orientation entails a corollary orientation of default to conspiracy. And yet this is true whether the self-image on which I am centered is a valorized or disqualified image: To the extent that I accept the premise that the other is the standard, the norm or the mainstream of society, I am likely to implicitly define myself in relation to their agency, either as their object or else as an agent only insofar as I resemble them. This definition will condition all my explanations of my situation in society. If I am held personally responsible for positional consequences of my birth into poverty, patriarchy, white supremacy or heteronormativity, then I am likely to look for whom to hold personally responsible for the actions of power. When I am taught that systemic consequences are the effects of a personal, agential cause, I will oscillate between blaming or praising myself and blaming or praising another, like Reagan, for the situation that obtains.

It is perhaps the most sophisticated interpretive act, though it is in principle equally possible for each human to accomplish, to develop the capacity to define what immediately appears ubiquitous as a complexly constructed and particular theory. An old joke tells of a fish asked “How is the water today?” His reply: “What is water?” Asked earnestly, this is the definitive theoretical question. The quintessentially human interpretive act involves inferring the specificity of the ocean in which we swim, and the possible alternative oceans that implies, which may lead to inventing our way out of the ocean, onto dry land, launching ourselves out of our immersion in one kind of system into a previously unrecognizable alternative. After all, how else can one describe the immense interpretive transformation involved in the ongoing transition from the magical or divine blood right of aristocratic governance to the aspirational goal of equally distributed sovereignty that defines democratic governance? Such an all-encompassing transformation of social relations entails an equally holistic transformation of imagination and reasoning.

Nevertheless, along the way we will inevitably mistake water for land. West’s conspiratorial criticism of Reagan was greeted by at least one critic as an indicator of “Kanye’s black radical consciousness.”58 This makes sense if one reads “radical” merely as a synonym for “extreme” or “outside the mainstream of opinion.” But if one reads “radical” as Marx did, as indicating a concern with the “roots” of systemic oppressions, the principles that entail the range of particular practices, then this characterization of West is wildly mistaken. This brings us directly to a principle of difference between conspiracy theory and theory as reasoned inquiry: A conspiracy theory identifies the root with an overvalued ego as a fantasy of whole agency, while inquiry defines the root in terms of a principle by which one system can be reliably distinguished from others. In the latter definition, West’s account is not an indicator of radical consciousness, but of wish-fulfilling megalomania. West’s diagnosis turns systemic oppression into a battle between proper nouns, depicting the overdetermined consequences of a centuries-long white supremacist program, in the context of which the FBI carried on a decades-long program to target “subversives,” as the plot of a single all-powerful politician to destroy a single organization. In these ostensibly knowing pronouncements, West’s inversion of the problem also distorts its solutions, representing systemic class struggles in the manner of a “Great Man” historiography, as struggles to the death among powerful individuals. This model of politics, in which superhuman heroes and villains fight to decide the fate of faceless masses, reproduces the most conservative and dehumanizing pseudo-historical melodramas.

This personalization of systemic conditions and class struggles indicates the connection between West’s paranoid style of reading systemic oppressions and his narcissism, as it creates foes whose unrealistic omnipotence reinforces his own aggrandized heroic self-image. It therefore should not surprise anyone that West is so attracted to the narcissistic and racist melodrama of conspiracy theorist Donald Trump. Visiting Trump in the Oval Office, West described with characteristic artistry and ambiguous self-awareness the difference between an activist empowerment arising from systemic explanations of oppression, and the narcissistic self-aggrandizing potency entailed by identification with “Great Men.” Comparing Trump’s signature red hat to the Clinton campaign slogan, West explained that

this hat, it gives me—it gives me power, in a way […] The [Hillary Clinton] campaign “I’m with her” just didn’t make me feel, as a guy, that didn’t get to see my dad all the time—like a guy that could play catch with his son. It was something about when I put this hat on, it made me feel like Superman. You made a Superman. That was my—that’s my favorite superhero. And you made a Superman cape.

There is unmistakable genius in this analogy and its convincing presentation as an improvisation, in which the simple and effective branding of the red hat is psychologically linked to the red cape. And there is unmistakable symptomatic significance in West’s dismissiveness toward the value of solidarity with a professional woman, like his own single mother, an English professor who divorced his former Black Panther father when West was 3 years old. But just as unmistakable is the significance of his “favorite superhero”: Superman is often called a boring character60 precisely because he is too close to the narcissist’s fantasy of perfection, omnipotence and indestructibility. Like the Christian God of Paradise Lost, the character of Superman poses for his writers the problem of dramatizing flawlessness. This provides little opportunity for the conflict and change that conventionally shape a plot, and that allow us to invest in a character’s inconsistency or incompleteness, its reality effect.

Narratives in which omnipotent evil faces perfect good are the domain of melodrama, moralism, popular mythologies and exoteric religiosity. Successful activist strategies, in contrast, have historically proceeded by strategically targeting the mechanisms of oppression at their contingently weakest points, on the basis of some reasoned and historically grounded analysis of the systemic structures and functions of power. While superheroes usually make war, settling zero-sum conflicts by means of definitive violence, activists and community organizers like the Black Panthers are distinguished from terrorists and rebel armies by their cultivation of long-term commitments to the hard and slow work of addressing the needs of their communities. The Panthers’ school breakfast programs, funded by Johnson’s Great Society, as well as their programmatic supervision of police to deter violence, demonstrate their support for governance in accord with the democratic principle of “all power to all the people.”61

Sedgwick acknowledges this strategic approach to systemic change in describing Patton’s response, which seeks to shift focus from an unproductive concern with sensationalism and spectacle, to a more productive concern with the historical mechanisms of political power and state neglect. The paranoid orientation that seeks the “true cause” of oppression in a transcendental agent or essence, in contrast, cultivates the fear and aggression entailed by systemic oppression. This encourages intelligent young people, like Late Registration-era Kanye West and those of us who listened to him, to embrace the false empowerment of consumerist narcissism and competitive supremacy, supported and rationalized by conspiracy-theoretical melodramas and mythologies.

Narcissistic Rebellion

For Sedgwick, Patton’s reply to her query about an AIDS conspiracy confirms it is not enough to ask what is true; we must ask what that truth accomplishes, what it does and what one does with it. Sedgwick calls this “an unremarkable epiphany,” part of the “habitual practices” of academic critical theory and the hegemony of the “hermeneutics of suspicion.” However common this pragmatic insight may be in literary studies, it has certainly been linked, from Said through Sedgwick and beyond, to Foucault’s concern with the active social power of knowledge. But while both Foucault and Ricoeur treat of the hermeneutics of suspicion, Sedgwick shows deference to Ricoeur’s construction here, claiming that he defines “very productive critical habits,” which she calls “perhaps by now nearly synonymous with criticism itself.” However, she also suggests that these habits “may have had an unintentionally stultifying side effect”: “They may have made it less rather than more possible to unpack the local, contingent relations between any given piece of knowledge and its narrative/epistemological entailments for the seeker, knower, or teller.”62 In short, she suggests that the hermeneutics of suspicion may have introduced the pragmatics of truth by way of a definition too narrow in its affective range, conflating the paranoid attitude with consideration of truth’s pragmatic aspect as such. This default to paranoia, which I would correlate with a default to disqualification ad hominem, can get in the way of knowing what to do with the knowledge we produce, and obstruct our view of what others do with that knowledge.

Sedgwick makes this point in the context of opposing her own suspicious response to the AIDS epidemic to Patton’s pragmatic, caretaking response. While Sedgwick describes herself as wondering first about the agent of the harm, Patton seemed to have already moved on to a concern with addressing and ameliorating those harms. Patton makes it clear that she had already concluded the Reagan administration was to blame, at least insofar as they did not care about queer lives, and were all too happy to neglect the situation. But Sedgwick seems to have been concerned to demonstrate a culpable agency, while Patton seems content to acknowledge a pattern of action, and to act on the basis of that knowledge—by working to ameliorate the crisis, since Reagan and US government agencies could not be trusted to do so. The “paranoid” position that Sedgwick describes is not only characterized by identification with and overvaluation of a “higher” agency, as exemplified by West’s praise for Trump; it is therefore also invested in treating that agency as the arbiter of truth and significance, entreating it to recognize truth and warrant action, to confer legitimacy on one’s assignation of blame by admitting guilt. But when their power is predicated on harm, and is likely to be compromised by exposure, those in power have every reason to persist in bad faith and maintain disavowals indefinitely.

The overvaluation of the higher or central agency is the narcissistic premise that authoritarian permissiveness relies on to construct the dramatic scenarios of its social fantasy. Its obverse is the narcissistic invincibility of youthful rebellion, familiar to anyone who has consumed the popular culture produced by the American Century. As in so much popular music, both self-deification and rebellious invincibility have been consistent themes in West’s output, manifest since his 2001 breakout success as producer of four songs on Jay-Z’s classic album The Blueprint. West’s tracks are characterized by selectively sampled, often sped-up loops of the sort of self-aggrandizing choruses that are customary in hip-hop, like his edit of The Doors’ “Five to One,” cut to repeat “gonna win, yeah we’re takin’ over.” The most famous is “Izzo (H.O.V.A.),” in which the chorus spells out what Jay-Z calls his “God name,” “Jay-Hova”: “H to the izz-O, V to the izz-A.” By 2013, West was making this kind of claim much more explicitly in a solo song titled “I Am a God.”

In West’s best songs, a thrilling rebellious energy flows from this narcissistic triumphalism, but without a revolutionary systemic critique of inequities to connect that energy to its practical conditions and consequences. So while he often calls out pertinent historical enemies of the black community like Reagan and Bush, he inevitably presents an alternative that merely reverses the supremacist form of their claim to power. This rebellion is a mirror image, rather than the transformative alternative that defines revolution. That distinction can be seen in his attitude toward education, which seems like a reaction to moralistic pedagogy, but one that only posits an opposing supremacism, a superior internal standard of reference that overcomes the inferior standard of the other. He defines the message of his first solo album in terms of the implications of its title, The College Dropout: “All that’s saying is make your own decisions. Don’t let society tell you, ‘This is what you have to do.’”63 West discloses his class privilege with a passing reference to the presumption he would go to university. “People told me to stay in school,” he mentions, meaning college rather than high school. Of course, for many young people in the United States, college is not a duty to be avoided, but an unlikely opportunity that cannot be presumed, whether due to cost or to their social context, in which a degree is not the norm.

But this disclosure of privilege is not in itself to be rebuked, as it can be revelatory. In this case, it seems to condition West’s ability to disregard a common experience of moralistic education that can be called, after Jacques Lacan, “orthopedic,” in the etymological sense of “straightening” children or bringing them into line. In simple terms, the orthopedagogue begins with a predetermined standard and works to make the student conform to that standard, rather than discerning from the student’s interests and affinities a direction for development. Both a critique of orthopedic pedagogy and some of the consequences of West’s privilege are indicated—along with his famously cavalier nonconformity—by comments to a hip-hop journalist, in which West praises a white singer-songwriter known for slick, radio-friendly production and mainstream appeal: “I listen to John Mayer, and his song ‘No Such Thing’ is exactly what my [philosophy] is about, but in different words.” Upon closer attention, the observation turns out to be as apt as it is unexpected, and West’s relevance as an artist is connected with the narcissism that makes it possible for him to confess an unfashionable affinity with such idiosyncratically defiant vulnerability.

Mayer’s articulation of West’s philosophy is less revealing as a work, lacking West’s reflexive complexity. But when it is considered in the context of this evidentiary reference, mediatized and repurposed as an articulation of West’s message, Mayer’s pablum becomes, if anything, more directly indicative of the systemic fantasy in which it is implicated. The song cited is on Mayer’s 2001 debut, Room for Squares, a title that performs West’s gesture of unfashionable confession without his audacity or unexpectedness.64 This prepares the listener for the kind of hackneyed anti-intellectualism one might expect from radio pop: “They read all the books but they can’t find the answers,” Mayer tosses off, with all the casual self-assurance of one who has never bothered to read the books, because he feels no pressing need to find the answers. The chorus is a gleeful encomium to Mayer’s own unstudied, preternatural wisdom: “I wanna run through the halls of my high school, I wanna scream at the top of my lungs,” he sings in a falsetto softly mimicking the energy of a scream, “I just found out there’s no such thing as the real world, just a lie you got to rise above.” The reality on offer in his high school is, the song makes clear, the secure but workaday life of a conventional “American dream”—the kind of career, marriage and family life that can seem like paradise to those excluded from it, though it has been limned as a hell of capitulation by artists of every generation.

The metaphysical promise of American consumerism is brought to its romantic apotheosis in Mayer’s encouragement to break on through this staid image of an easy but ultimately unfulfilling life: “They love to tell you, ‘Stay inside the lines,’” he observes in the rising harmonies of the song’s bridge, before dropping back down, to again build with each word of promise, “But something’s better […]”—now rising to a crescendo of falsetto harmony—“[…] on the other side.” This is a sweetened commercial concoction of the rebellious fantasy familiar from decades past, a synthesis of Pat Boone and Jim Morrison. It is a fantasy produced by privilege, in which the only way to lose is not to try, and one has only oneself to blame for not realizing one’s boldest dreams. This fantasy is just as fundamentally predicated on faith in the whole agency of power as any conspiracy theory: A path is planned and available to you, it claims, and if you do as you are told, all your predictable needs will be provided for. The society Mayer dreams in falsetto is the family socialism of wealthy, white, well-intentioned parents, an image of society compatible with Sedgwick’s account of D. A. Miller’s paternalistic welfare state. Such authorities mean well, Mayer’s song implies, but they nevertheless fail to see the grander life each of us can live if we just throw away their velvet chains and embrace the risk of inspiration. The contempt bred by this predetermined path to success is clear in Mayer’s promise that, in order to possess a better life than this, one need only ignore the clueless adults who lack the imagination or courage of youth: “All of our parents, they’re getting older, I wonder if they’ve wished for anything better, while in their memories, tiny tragedies […].” In short, to avoid the petty, tragic lives of one’s sad old parents, Mayer explains, one need only ignore them and follow one’s own self-aggrandizing dreams of glory.

The song ends with Mayer imagining his moment of vindication, after he has proven himself definitively right and demonstrated his innate superiority over all those around him. He plans his triumphant return to the 10-year reunion held in his high school’s cafeteria, to “stand on these tables before you” and gloat about his success, presumably to all those classmates who did not listen to the advice he screamed as he ran past them in the hallways, or who doubted his inevitable apotheosis. Just before this climax comes a brief interlude in which the music slows and Mayer wails, as if to summarize the previous 50 years of white male teenage fantasy in popular song, “I am invincible, I am invincible, I am invincible as long as I am alive.” Not only are we reminded here of the godlike omnipotence that defines phallic value, but also of the immortality it implies, the eternalization that defeats history and change, that insists, like a track West produced for Jay-Z, “never, never, never, never change, I never change.”65 Mayer and West seem to imagine themselves like Vonnegut’s Harrison Bergeron, a supremely powerful figure somehow hobbled by the machinations of external authorities. Indeed, this is the fantasy necessary to invest in Reagan’s neoliberal promise of unaided success, overcoming the weights and blinders of parents’ rules or government regulations, to rise into the skies and beyond.

But this fantasy of overcoming does not propose an alternative to the powers it disdains, nor does it even reject the form of power imposed by conventional authorities. Instead, it merely posits that those currently supposed to possess authority are imposters, allowing the enunciating subject to take their rightful place of supremacy, claiming authenticity as embodiment of power. It is as though these men have discovered their parents are not the all-powerful beings they seemed to be through a child’s eyes, but instead of challenging the image of potency that shapes their expectations, they disqualify the compromised potency that defines their experience. Instead of questioning their belief in an all-powerful being—as perhaps merely the fantasy of a child still growing into full embodiment and empowerment, aspiring to a simplistic perfection because it is their first imagination of power—they angrily blame their parents for failing to realize omnipotence. By rejecting and negating the defeated wills and “tiny tragedies” of their parents, they seem to imagine they will emerge into possession of the supreme potency they presume. This is only possible on the basis of emulative, aspirational identification with a figure beyond conventional power, as in West’s ironic articulation of his own divinity: “I am a god, even though I’m a man of God, my whole life in the hand of God.” Though it may seem that a god cannot devote himself to God, it is in fact only by emulative identification with God that West can imagine himself as a god.

West’s reliance here on the apparent contradiction of supremacy and subordination to provoke a second thought in his listener illustrates how even his simplest lyrics demonstrate an artistic complexity lacking in Mayer’s. Even when they communicate the same thing, “but in different words,” West exploits internal contradictions to achieve unexpected rhetorical effects, while Mayer projects a flawless consistency without internal division or surprise. In consequence, both West’s and Mayer’s lyrics can be cited to illustrate common fantasies, but West’s can also be mined for trajectories of traversal. “I am a god, even though […]” In this dialectical movement, West includes in his lyric the motor that produced it, evidence of the drive that pushed him to such extremes of articulation. West’s “even though” depends on the capacity to entertain a perspective despite being convinced it is wrong: The phrase signals that, while he can see why his divinity would appear to contradict his dependence upon a divinity, this contradiction is only apparent. He concludes on the side of wholeness, and thus like Mayer on the side of narcissism, but he acknowledges at least the appearance of contradiction. An equivalent gesture in Mayer’s song might be a prominent acknowledgement that the certainty with which he announces his exceptional destiny and condemns the conformity of his parents must appear to others as the utterly banal and commonplace hubris of a privileged youth. West’s acknowledgment of contradictory perspectives suggests how an element of double consciousness can interject contradiction and contingency even into the closed loop of his narcissistic fantasy.

This acknowledgment of contradiction also allows for the possibility of inferentially traversing privilege. For example, if one can acknowledge that others’ evaluations are wrong, one can also see when others are treating one as superior without succumbing to belief in one’s superiority. In other words, it allows for the possibility of sanity. The relation West describes, between the self as divinity and its dependence upon a divine, is one that Lacan describes in terms of the relation between two Freudian terms, the Ichideal (“I-ideal”) and the Idealich (“ideal-I”), terms that other psychoanalysts have used interchangeably.66 In one’s relations with others, those others seem to desire and therefore perceive more than is accounted for in one’s self-image or Idealich, one’s idea of oneself. This seems to corroborate a potential for identity with the Ichideal, the ideal image of a self in one’s ideological context. If others seem to like or want something in me that is greater or other than I see in myself, then perhaps I am more like the standard of personal perfection than others are. This ideal of a perfect person, which differs among cultures and times in its definitions of what is universally and perfectly beautiful, intelligent or good, this Ichideal “governs the interplay of relations on which all relations with others depend.”67

One way this governance functions is by organizing affinities, constituting a central standard of value, in emulative relation to which all judgments of taste, truth or justice can be determined. Such judgments are called “moralistic,” in a tradition that one can trace from Nietzsche, through Heidegger and Sartre to Lacan, and beyond to Paul de Man’s use of the term, to be examined in a later chapter. Orthopedagogy is the implementation of this moralism in education, disregarding the student as they are, in expectation that they will conform—and inevitably, this can only be conceived as an inadequate, asymptotic conformity—to the standard of perfection. A moralistic teacher devalues their student’s self-image, except insofar as it corresponds to the teacher’s ideal. So if the student’s ideal differs from the teacher’s—as it inevitably does, for example, when the teacher and students have been shaped by different cultural or subcultural contexts—the student cannot see their aims and aspirations reflected in the teacher’s regard. In that situation, students must choose between themselves and the confirmation promised by their teachers, and whichever they choose, they suffer an untenable loss.

Nonconformists reject orthopedagogy because its procedures are indifferent to their singularity. The wisdom of narcissism, its necessity as the basis for the formation of the I, is its irrationally uncompromising defense of the value of singularity, without which the human organism would quickly perish of self-sacrifice or neglect. This is also the wisdom and necessity of the individualist rebellion that defines the tenor of so much American popular culture. Without this rebellious insistence on the singular worth of the self, there can be no revolution, even though there can be no revolution that does not go beyond this individualist rebellion.

3. Defining the Opposition

In the Freudian notion of ego splitting, externalizing internal contradictions and differences allows one to maintain the whole worth and goodness of the self and the whole worthlessness and evil of the other, obviating and therefore avoiding any inquiry that might challenge such beliefs. This suppression of differences illustrates how intrasubjective narrative strategies can be homologous with intersubjective ones, insofar as it agrees with Hayden White’s cautionary definition of the politicization of interpretation. While for Mitchell the phrase designates the political implications of interpretive practices, for White it designates the application of political power to decide the outcome of “interpretive conflicts.”68 Rather than allowing contradictions to function as the motor of inquiry, driving the resolution of disputes by reasoned debate, such differences are suppressed when “political power or authority is invoked to resolve them.” For those using interpretation as a pretext or rationalization, rather than as part of an inquiry, the exposure of insufficiently convincing reasons is not an educative feature of interpretive conflicts, but a threat to be avoided. White’s definition highlights this distinction between rationalization and reason, pointing to the use of reason as a weapon against authority and power: Because the capacity to reason is distributed democratically as a structural feature of human embodiment, regardless of the disproportionate distribution of power in oppressive systems, the powerful can never completely eliminate the threat posed by the reasoned arguments of those they exploit, exclude or demean.

Again, the Trump administration has supplied my argument with a late-breaking illustration of White’s definition. After Trump’s incorrect announcement that Alabama “will most likely be hit (much) harder than anticipated” by Hurricane Dorian, the National Weather Service in Birmingham corrected the president, assuring the public that “Alabama will NOT see any impacts from #Dorian.”69 After media attention to this rebuttal, Trump’s commerce secretary reportedly threatened firings in order to force a rebuke from the Weather Service’s parent agency NOAA, which issued an unsigned statement claiming the Birmingham office’s tweet was “inconsistent with probabilities from the best forecast products available at the time.” Trump then continued to claim vindication, even apparently using a black marker to alter the forecasted path of the hurricane on a weather map he displayed in the Oval Office, and then pretending when asked that he knew nothing about it. While there is a silliness to this blatant demonstration of narcissistic pettiness from the president that makes it all too easy to dismiss, that very pettiness emphasizes the level at which state power is now being used to deny the most obvious realities and the most rigorously demonstrable facts.

This pattern of avoidance, denial and disavowal has reached heights—or depths, depending on one’s preferred metaphor—from which we can foresee existential threats to the rule of law, the conduct of scientific inquiry and the habitability of regions of the planet. After all, opposition to education in evolutionary biology had long ago politicized scientific consensus, and climate change has now brought us to the point at which the weather is politicized. The Trump administration’s pressure on NOAA staff is an example of trying to decide a dispute about facts by the application of force, here in the form of threats against employees’ careers. COINTELPRO is an example of the application of lethal institutional force to win arguments. And Reagan’s accusation of divisiveness against Carter is an example of rhetorical force deployed to neutralize inquiry. All these illustrate White’s definition of politicization, in which force silences reasoned debate.

Spiro Agnew pioneered the tactic of defining the opposition in the way Reagan did with Carter, famously calling reporters, who dared to expose the truth of Nixon’s secret military adventures and failures in Vietnam, “nattering nabobs of negativism.” Agnew was also a pioneer in the adaptation of George Wallace’s racist appeals to the expansion of Republican Party support, efforts that won him a spot on the Nixon ticket as part of the Southern Strategy, which in turn paved the way for Reagan’s Neshoba County campaign debut. Agnew’s “nattering nabobs” charge encapsulated a persistent national strategy by the right, in which media professionals have been shamed for partisanship when reporting facts the right found objectionable. Journalist Will Bunch noted this enduring influence in his 2009 obituary for William Safire, author of Agnew’s famous phrase.

The words that William Safire penned and that Spiro Agnew mouthed actually had enormous impact that has lasted until this day. They helped foster among conservatives and the folks that Nixon called “the silent majority” a growing mistrust of the mainstream media, a mistrust that grew over two generations into a form of hatred. It also started a dangerous spiral of events—journalists started bending backwards to kowtow to their conservative critics, beginning in the time of Reagan, an ill-advised shift that did not win back a single reader or viewer on the right.70
