Centenary now
Modernism, James Michener and the crisis of the endless 1970s
1. The Commission
Only another Substacker, someone who had poured their heart out into a good post that got five hundred pageviews, could appreciate the thrill of receiving an email inviting me to a video chat with CEO Chris Best.
“You wrote that blog about Yellowstone, didn’t you?” said Best eagerly once we were on the call. “And the one about, what was it, El Dorado or rock painting or something? You haven’t posted in a while. What gives?”
“Well, I guess I just haven’t found the time,” I said hesitantly. “And I think my posts are a little more complicated than —”
“Yeah, yeah — have you ever considered writing about the suffocating illiberal orthodoxies of the successor ideology?” Best asked questioningly. “Or how a minor personnel dispute at a small liberal arts college in New England makes it okay to become a fascist? Or how cancel culture has made everyone afraid to even just ask questions about vaccine safety?” He paused, and a grave look came over his face. “Wait, Chase — have you stopped posting because you’ve been silenced by the woke mob?”
“I think I’m just looking for the right idea,” I replied. “I still want to do a big long thing on Edward Abbey, or maybe Ivan Doig. And I’m interested in interrogating the whole concept of the post-Western, some people call it the anti-Western, or the revisionist Western —”
“Listen, we probably need to hurry to the section break before people stop reading,” Best said quickly. “What do you think about a post on Centennial?”
“By God!” I yelled shoutily. “The thousand-page airport novel? The middlebrow doorstop they made into an unwatchable 20-hour miniseries? No one’s ever actually read that, have they? Not reading it is the point. It’s like Infinite Jest for high school gym teachers.”
“We can pay you nothing,” Best said.
“I’ll start today,” I replied.
2. The Author
On a warm Friday night in Doylestown, Pennsylvania in June 1910, a company of firemen raced towards the north side of town, where there were reports of a barn on fire. Only when they’d arrived and run out their hoses did the firefighters discover, in the words of the Doylestown Daily Intelligencer, that “the alarm was a fake”; the blaze was in fact a bonfire that industrialist Henry Chapman Mercer “had made on top of his concrete mansion,” a 44-room structure known to locals as Fonthill Castle. For Mercer, the fire was at once a birthday celebration and a thumbing of his nose at all those around town who’d disdained his use of what was then a newfangled construction method, in which concrete was poured into place and reinforced with iron bars. The journal Cement Age, which had already profiled Mercer and his castle the year before, declared it “the sort of story that is causing the insurance man to sit up and take notice, and likewise the citizen who wants an indestructible house.”
Among the Doylestown mothers who struggled in those years to keep their children from adventuring into the sites where Mercer built Fonthill and other reinforced-concrete landmarks was one Mabel Michener, whose son James was known for tramping about and bothering townsfolk at the courthouse or the train station. “Most people in the borough thought Mercer was a fool,” writes Michener biographer John Hayes. “No one could erect a building without a hammer and nails and expect it to stand, they claimed … Parents warned their children to stay away from the site and, for that matter, from Mercer, too.” Young James didn’t listen, and even as a child “could explain in detail how they had been constructed,” adds Hayes. “Once or twice he even managed to strike up a conversation with Mercer.”
The two would’ve made an odd pair: the wealthy, Harvard-educated eccentric who rode around town on his bicycle and the nearly destitute Michener boy, whose dubious paternity kept his widowed mother on the fringes of Doylestown society. Mabel scrounged for work as a laundress and later opened a boardinghouse for orphaned children; though James claimed for much of his life to have been one of those orphans, accounts unearthed by Hayes indicate that Mabel gave birth to him in New York in 1907, five years after the death of her husband Edwin and ten years before taking in her first foster child.
A scholarship to Swarthmore College became Michener’s ticket out of poverty and into a stable middle-class career in education, taking him from Pennsylvania prep school classrooms to instruction at the Colorado State Teachers College in Greeley and ultimately to an editorship at the Macmillan Publishing Company’s textbook division in New York City. When the U.S. entered World War II, he sought a commission as a naval officer, and though the 37-year-old Lieutenant Michener’s tour of the Pacific as a supply clerk was effectively little more than an extended tropical vacation, it provided him with the material he needed for a long-planned leap into fiction writing.
Tales of the South Pacific was published in 1947, awarded a flukish and instantly controversial Pulitzer Prize the next year and adapted into a Rodgers and Hammerstein musical the year after that. The book’s unexpected success set Michener up both for lifelong professional and financial security and for a lifelong anxiety about his critical reputation and literary worthiness. In its wake, he spent a decade as what Hayes calls “the sole literary proprietor of the vast territory west of San Francisco,” penning short “Asiatic” novels in the vein of South Pacific.
Then, at age 50, Michener suddenly ditched Hemingwayesque understatement for the maximalist formula that would define the long latter half of his life and career. The 937-page Hawaii was the product of more than three years of intensive research and consultation with academics across a multitude of disciplines; Michener wrote and rewrote the 500,000-word manuscript from start to finish at least four times. Published three months after Hawaii became the 50th state, the novel quickly topped the bestseller lists, and the Michenerian epic was born.
In his sweeping The Dream of the Great American Novel, Harvard literary scholar Lawrence Buell describes the latest and most self-conscious category of attempts at the G.A.N. (as Henry James archly initialized it as early as 1880) as
compendious meganovels that assemble heterogeneous cross-sections of characters imagined as social microcosms or vanguards. These are networked loosely or tightly as the case may be, and portrayed as acting and interacting in relation to epoch-defining public events or crises, in such a way as to constitute an image of “democratic” promise or dysfunction.
Michener later named “family histories” like Glenway Wescott’s The Grandmothers and Eleanor Dark’s The Timeless Land as his chief influences in this new mode, but Hawaii and all his subsequent meganovels bear an obvious conceptual and titular resemblance to John Dos Passos’ U.S.A. trilogy, regarded for a time as an American Ulysses, the nation’s totemic achievement in modernist literature. In 1947, the same year that South Pacific had been received so indifferently by critics, a deluxe U.S.A. reissue prompted the New York Times Book Review to declare it “not only the longest but also the most important and the best of the many American novels written in the naturalistic tradition,” placing Dos Passos atop a list that included John Steinbeck, Sinclair Lewis and other giants of early 20th-century fiction.
Late in life, Dos Passos would tell the Paris Review that he had “always been a frustrated architect,” and as he was finishing the U.S.A. trilogy in the mid-1930s he penned an influential essay titled “The Writer as Technician.” Drawing explicit parallels between writing and scientific discovery, he argued that the mission of the writer should be to “discover the deep currents of historical change under the surface of opinions, orthodoxies, heresies, gossip and the journalistic garbage of the day,” endeavoring always to “probe deeper and deeper into men and events as you find them.”
With Hawaii, writes Hayes, Michener was satisfied that he’d proved himself as a “serious” writer, rather than just a “popular” one. Certainly no one could accuse the book of lacking a deep probing of history. His process was not unlike Henry Mercer’s: casting thick, solid layers of character and drama upon a sturdy frame of fact and scholarship, aiming to erect towering artistic edifices that could entertain and educate all at once — and stand the test of time. At the height of their ambition in the mid-20th century, the architectural modernists believed they could use the newest materials and the latest ways of thinking to build a better world from the ground up, and so it was with Michener.
So earnest was this belief, in fact, that his next major project after Hawaii was not a novel at all, but a hopeless run for Congress as a Democrat in his heavily Republican home district in Pennsylvania. Though he lost that race by 10 points, later in the 1960s he won a consolation of sorts in his election to a convention called to amend and modernize Pennsylvania’s state constitution. He seemed to enjoy this kind of thankless blue-ribbon committee work, and went on to accept appointments by President Richard Nixon to the advisory board of the U.S. Information Agency and, pivotally, the American Revolution Bicentennial Commission, tasked years in advance with developing a plan for the 1976 celebration.
An inveterate traveler, Michener had in the intervening years repeated the Hawaii formula with the 909-page The Source, set in Israel-Palestine, and the 818-page, nonfictional Iberia. But now he set his sights on something many of his most loyal fans, in their letters to him, had long clamored for: a meganovel set on the U.S. mainland, an epic his readers could claim as more authentically American. His agent suggested California as the natural subject for such a book, while his longtime editor proposed a continent-spanning America. (That would have been an unmistakably direct challenge to Dos Passos, who died the same year.) Instead, Michener began plotting out a narrative that he soon realized had been more than three decades in the making, dating back to his days teaching history in northern Colorado, to visions of “an imaginary plains town … which has lived with me since 1937 when I first saw the Platte.”
Michener was an idealist deeply troubled by the bitter political tensions and civil unrest that plagued the country in the late 1960s, and through both the Bicentennial Commission and the novel that it inspired, he hoped to author what he called “a new spiritual agreement” among his fellow countrymen. Approaching his 64th birthday, he hinted to colleagues that it could well be his final book, and by the time he road-tripped to Colorado in late 1970 to begin another period of exhaustive research, Michener and his inner circle had wholly bought into the notion that Centennial would be his magnum opus, a career valedictory, his parting gift to America.
3. The Setting
Charles W. Caryl had already lived several versions of the American Dream before he settled in Colorado in 1894: an inventor and entrepreneur who’d patented a successful commercial fire extinguisher; a financier who’d gone bust in the industrializing South; a moralizing antipoverty crusader in the slums of New York, Boston and Philadelphia. Within a few years of his arrival in Denver, Caryl had rebuilt his fortune in gold mining, and once again tried his hand at progressive social reform with the publication of his utopian novel and stage play, New Era, in 1897.
This delightfully deranged book — its full title is New Era: Presenting the Plans for the New Era Union to Help Develop and Utilize the Best Resources of This Country: Also to Employ the Best Skill There Is Available to Realize the Highest Degree of Prosperity That is Possible for All Who Will Help to Attain It: Based on Practical and Successful Business Methods — is Caryl’s humble attempt to solve all the problems of human civilization in a brisk 194 pages.
In a tone immediately familiar to those of us in the age of self-assured ultrarich dilettantes weighing in on the issues of the day on social media, Caryl confidently explained how financial panics, mass unemployment, deadly labor strife and all the other ills that plagued the Gilded Age would be brought to an end by his new, rationally designed paradise — a liberal technocrat’s vision of a world where, through the power of “all of the best inventions and improvements” furnished by modernity, wealth inequality was eased, but “there was no cause whatever for capitalists to fear their present investments would be impaired in any way.” In his fantasy, Caryl sketched out everything from a new system of government and fixed wage schedules to a minutely detailed street plan for the New Era Model City, home to a population of one to five million people, the building of which he predicted would be “by far the most stupendous and important enterprise ever accomplished on this planet by human beings.”
Colorado and its neighbors have long attracted dreamers and schemers like Charles Caryl. The conquered and depopulated lands of the West presented white settlers with a terra incognita upon which visionaries and zealots could imagine society rebuilt from the ground up. It was in the decades after the Centennial State’s founding, at the dawn of the modern age, that such utopianism was at its height across the world, uniting novelists and philosophers and reformers and bureaucrats in an exhilarating wave of optimism for the labor-saving mechanical production and bright electric lights of the future.
The mountain mining districts and empty grasslands of Colorado offered planners the rare opportunity to put their designs into action: Nathan Meeker’s Union Colony was the largest of a handful of such idealistic agrarian communities on the Eastern Plains, while farther west the town founded by the short-lived New Utopia Cooperative Land Association is today just called Nucla. An enduring utopian impulse can be detected in everything from New Deal resettlement proposals and the master-planned communities of the postwar period to present-day co-ops and off-the-grid communes. Coloradans may have adopted the primeval individualism of the Wild West as a kind of state ethos, but it’s arguably the search for a more enlightened, more advanced form of civilization, not the absence of it, that better describes Colorado’s past.
The utopian Union Colony, founded in 1869 and later incorporated as the city of Greeley, is the nearest geographical equivalent to Michener’s fictional Centennial, a mid-sized cow town situated at the confluence of the South Platte and Poudre rivers. Michener spent five years studying and teaching history in Greeley, no doubt becoming intimately familiar with its origins in agrarian utopianism, and yet his great saga of the West seems wholly uninterested in this enduring throughline of Western settlement; “the Greeley colony” is mentioned only a few times in passing, and the novel’s long list of stock characters includes plenty of frontiersmen, cowboys and homesteaders but no utopian dreamers, no communitarian idealists and no midcentury master planners.
The omission might not be worth noting at all were it not for the fact that Centennial is, at heart, just such an experiment in narrative form. In all its earnestness, its sprawling ambition, its faith in grand structures, there may be no purer expression of American literary modernism than this novel that announces itself in its opening chapter — a loopy framing device featuring a Michener stand-in and a fictional magazine assignment — as an attempt to capture “nothing less than the soul of America … as seen in microcosm.” Here is Michener, the intrepid novelist-historian, attempting a great act of authorial synthesis, drawing upon and fusing together the collected bodies of geological, climatological, biological and anthropological research into a coherent story spanning from the formation of the earth to the present.
Once the book’s main narrative gets going, the reader endures a hundred pages of descriptions of glacial erosion and bovine evolution before encountering a human story — and nothing is more essential to Centennial, or to the modernist project as a whole, than the conviction that a continuity of method should exist between the two. Through patient empirical observation and rational analysis, the modernist maintains, one can just as easily reveal objective truths about human beings and the optimal ways for their societies to be organized as radiocarbon analysis can precisely date a mastodon fossil or a Clovis point.
The 20th century was to be the canvas on which modernism’s dream of a world made new could finally be realized. The “scientific management” of Frederick Winslow Taylor and the assembly line of Henry Ford, counterbalanced by the hard-won gains of progressive reformers and trade unionists, promised a new era of industrial prosperity in which the world’s working classes could finally share. In the laboratories and lecture halls of Europe, scientists on the bleeding edge of chemistry and physics, direct descendants of the great Enlightenment thinkers, inched closer to a unified theory of the fundamental building blocks of the universe. Visionaries like Charles-Édouard Jeanneret, the Swiss-French architect better known as Le Corbusier, drew up plans for buildings and city districts that reflected the efficiency and rationality of the machine age. “We claim,” wrote Jeanneret in 1923’s Towards a New Architecture, “in the name of the steamship, the airplane and the automobile, the right to health, logic, daring, harmony, perfection.”
Dos Passos probably still thought of himself as a political radical when he wrote “The Writer as Technician” in 1935, but the central point of his essay brims with this kind of ingenuous faith in the power of reason, in the perfectibility of human society and in the artist’s role in pursuing it. In a world being remade by technological advances and mass mechanization, even radicals could conceive of their political project as an earnest application of Enlightenment principles to persuade and inform the public of scientific truths:
The professional writer discovers some aspect of the world and invents out of the speech of his time some particularly apt and original way of putting it down on paper. If the product is compelling, and important enough, it molds and influences ways of thinking … The process is not very different from that of scientific discovery and invention. The importance of a writer, as of a scientist, depends upon his ability to influence subsequent thought. In his relation to society a professional writer is a technician just as much as an electrical engineer is.
The problem, of course, is that language is not a scientific instrument. An electrician can make a light bulb turn on because of certain undeniable physical properties of its wiring and the energy source that powers it; photons emitted by the light bulb reveal the material objects upon which it shines. Nothing of the sort can be said for acts of writing or speech. That an idea is “compelling” is no guarantee that it’s true, and if all that is required for a writer to be important is to “influence subsequent thought,” society can go to some very dark places indeed.
In 1898, Charles Caryl attempted to bring his New Era Union to fruition, establishing a colony of a few hundred people at one of his mining camps in the foothills of Boulder County. Little is known about how the experiment fared, beyond the fact that it fizzled within a year. Its failure seems to have sent Caryl on a manic downward spiral in search of increasingly far-fetched ways of realizing his utopian vision: from membership in a cult known as the Brotherhood of Light, which established a series of orphanages that were shut down by the state of Colorado following the deaths of at least nine young children; to claims made ahead of the 1904 World’s Fair that he had invented a “sun-ray” machine that would “soon replace steam and electricity”; to a newfound belief in the magical powers of a secret life-force called “vril,” which he turned out to have plagiarized from a different utopian novelist; and finally to the founding of another colony in California, where he was eventually caught seducing a number of “spiritual mates” with whom he wished to father a new race of “perfect human beings.” For Caryl and his followers, these ideas seem to have been plenty compelling.
Debate continues to this day over which of these strains of intellectual and political thought were ultimately spliced together into the forces responsible for the 20th century’s greatest madnesses and horrors. The Polish poet Czesław Miłosz, who lived under both Nazi and Stalinist occupation before emigrating to the West in 1951, summed up one view: “Innumerable millions of human beings were killed in this century in the name of utopia.” And yet in the postwar decades, many people retained a faith in new, modern ways of thinking to build a world that was, if not Utopia, at least a better one than before.
Le Corbusier did things with concrete that Henry Mercer could only dream of, and his sketchbooks for La Ville Radieuse, a grand utopian city plan, would have made Charles Caryl blush. Though tainted by his brief association with the Vichy regime, Le Corbusier emerged as the architectural world’s leading postwar visionary, influencing a generation of urban planners with his proto-Brutalist model housing development, the Unité d'habitation. Each iteration of the dense, modular Unité promised ample residential, recreational and commercial space for its 1,600 inhabitants, a scaled-down realization of the Ville Radieuse vision. The first was completed in Marseilles in 1952, and before long Le Corbusier’s principles could be seen at work in modernist housing projects all over the world: Cabrini-Green, Le Vele di Scampia, Heygate, Pruitt-Igoe.
One need not know the ultimate fate of each of these projects, or share Miłosz’s view of the dangers of utopian thought, to grasp some of the problems inherent in modernism, whether they manifest in art or architecture or literature or political theory. It’s one thing to dream up a new world, and another to actually build it out of the raw, entrenched materials of the old one. How do you design a city, or a city block, or even just a building, to meet all of its residents’ needs? How do you author a narrative grand enough to organize a society around? How do you tell the story of an entire state, an entire region, an entire country, in a single book?
4. The Novel
I.
Well, maybe you can’t.
That Centennial is a failure on its own terms is plain not just from its ill-advised self-promotion as “the soul of America … in microcosm,” but from the expectation that Michener set for the book soon after its publication, when he predicted to a reporter that it would be read for hundreds of years, “because I told a story that needed to be told.” Just 25 years after Michener’s death, though, his work seems at risk of fading away completely.
At best, you’ll find present-day literary criticism listing Michener alongside other “diminished” midcentury writers like Bernard Malamud and Herman Wouk; at worst, he’s lumped in with the mass-market “drivel” of Tom Clancy and Danielle Steel. His epics are nowhere to be found on college syllabi, best-of lists or even in IP-hungry Hollywood pitch meetings. They may still be a bookstore staple, sure to be found on local-interest shelves in the Anchorage or Norfolk or Dallas airports, but for how much longer? The cover design copied across the latest paperback editions of Michener’s works gives some idea of how little the literary world now knows what to make of them: low-saturation stock photos and faux-distressed textures market even the most panoramic of these epics as something between a true-crime thriller and an NCAA football memoir.
Michener could be forgiven, though, for feeling boastful in 1974, when Centennial smashed records to become the year’s best-selling work of fiction, even after Random House raised its hardcover price to an unprecedented $12.50. Certainly the publisher knew how to market the book at the time, its dust jacket of solid black and primary colors — an echo, deliberate or not, of the first complete edition of Dos Passos’ U.S.A. — and a literal Great Seal marking the enclosed 909 pages as a must-read national tome. Such was the interest in the book that Random House published two companion titles, About Centennial by Michener himself and In Search of Centennial by John Kings, a British-born Wyoming rancher who helped with the novel’s research and preparation. Adaptation rights were soon snatched up by Universal. It was in Centennial’s afterglow that Michener received his highest honor since South Pacific’s Pulitzer: President Gerald Ford awarded him the Presidential Medal of Freedom in 1977, crediting the “master storyteller” with having “expanded the knowledge and enriched the lives of millions.”
About a novel and a novelist that try so hard to be everything all at once, perhaps it serves to start with everything they definitively are not. In his biography, Hayes mounts a running defense of Michener that ends up conceding a great deal: his subject is “not a risk taker”; “neither stylist nor scholar”; “less thinker than craftsman”; “more social journalist than novelist.” Michener’s prose was “often stilted and awkward” and “not good writing,” much less “great literature,” Hayes adds. His work contained “no ambiguity, no irony, no paradox, no conceits,” wrote Pearl K. Bell in a 1981 Commentary piece. Even Michener himself, in the course of decades of agonizing over his critical reputation, arrived at a certain frankness about his own limitations. In Centennial’s frame story, his narrator and stand-in, historian Lewis Vernor, is advised by his editors “not to bother about literary style.” (“Evidently he takes them at their word,” quipped the New York Times’ review.) One year after Centennial’s publication, he told Hayes: “Character, dialogue, plot — all of that I leave to others.”
And so we have a thousand-page historical novel that excels neither as history nor as a novel, and offers little in the way of risk, style, technique, thought, character, dialogue or plot. Other than that, though, it can’t be denied that there’s a great deal to be found in Centennial: chiefly and most obviously, an immense amount of factual and near-factual material, rendered into long stretches of a shared national past portrayed more or less accurately, ordered into clauses and sentences and paragraphs that are almost entirely comprehensible, and — less obviously at first but in great bludgeoning blows over the reader’s head by the book’s end — plenty of didactic moralism about America’s place in and responsibility to the world.
Michener’s deep love of country was being tested like never before as he began to work on Centennial in earnest; his two previous books, the poorly received novel The Drifters and the nonfictional Kent State: What Happened and Why, both published in 1971, show him agonizing over the state of American society, and particularly the direction of its youth. Even the honor of a seat on Nixon’s Bicentennial Commission proved a bitter disappointment to Michener when the panel, prevented by partisan infighting from realizing some of its members’ highest-minded hopes for the national celebration, accomplished little before being dissolved in December 1973 — by which time, of course, Nixon himself had plunged the country into an unprecedented constitutional crisis.
Michener “never doubted the country’s resourcefulness, or its ability to survive,” contends Hayes, but believed in the need for a “new set of national goals, a new consensus for America.” Hoping to author a national epic that could heal the wounds of the fractious 1960s and the distrustful Watergate years, Michener returned to the two themes that more than any others endured through all his works — what the scholar George J. Becker, in a 1983 monograph on Michener, identified as “human courage” paired with “human tolerance.”
The Quaker-educated Michener, who credited much of his success to his zeal for exhaustive research and his prolific writing output, consistently sought, wrote Becker, “to emphasize the heroic in human endeavor, to show in all five of his major novels the prodigies of valor, of endurance, of unremitting hard work that ordinary people are capable of.” Even more prominent, from South Pacific onwards, was Michener’s abhorrence of racial prejudice; like earlier novels set in Asia and the Middle East, Becker wrote, both Centennial and the subsequent Chesapeake “face up to the intensity of American racial discrimination, first against Indians, then against imported Negro slaves, and more recently against Mexicans … [and] assert the dignity and worth of so-called inferior races.”
We can view Michener’s pairing of these two values, in the year of Centennial’s publication, as one aging centrist liberal’s attempt to broker a post-Nixonian generational bargain: if the hippies would agree to cut their hair and start holding down jobs, their reactionary antagonists should agree to stop standing in the way of pluralism and multiracial democracy.
It was a simpler task for Michener to craft a narrative around the first idea than the second, if only because it has far more often been the object of American historical scholarship to valorize the pioneer spirit and the Protestant work ethic than to extol the virtues of secular humanist multiculturalism. As Centennial proceeds through the centuries, its plot is driven almost entirely by a series of classically heroic figures in U.S. history — men who, through singular acts of grit or ingenuity, become responsible for key developments in the colonization of Colorado and the West. It’s the bravery of French trapper Pasquinel that opens up the Platte River Valley and its Native tribes to the fur trade. It’s the ambition of cattleman John Skimmerhorn that blazes a trail across the open range and births Colorado’s ranching industry. It’s the pluck of Volga Deutsch immigrant Hans “Potato” Brumbaugh that digs the ditches to irrigate the dry benchlands along the river and make the town of Centennial possible:
[I]t was then that Potato Brumbaugh glimpsed the miracle, the whole marvelous design that could turn The Great American Desert into a rich harvestland. … The once-arid land on the bench proved exceptionally fertile as soon as water was brought to it, and Potato Brumbaugh’s farm became the wonder of Jefferson Territory. As he had foreseen that wintry night on Pikes Peak, it was the farmer, bringing unlikely acreages into cultivation by shrewd devices, who would account for the wealth of the future state.
It’s this last example, especially, that reveals what’s at work in Michener’s parable of the High Plains, since the real-world development of irrigated agriculture in Colorado was famously the product of a collective, cooperative vision, not an individual one. With high-profile support from utopian reformers like Horace Greeley, the people of the Union Colony, in pursuit of an egalitarian political project inspired in part by the early socialist thinker Charles Fourier, dug two major irrigation canals in 1870 and cultivated the “Greeley Spud” to feed Colorado’s fast-growing population. Similar efforts were undertaken by the copycat Fort Collins Agricultural Colony two years later, and similar principles were at work in the irrigation ditches dug beginning in the 1850s by Hispano settlers in the San Luis Valley, where the communal acequia system is still in use today.
Why are these things so invisible in Centennial? How can a work of social history be so empty of social movements? In Michener’s telling, each transformative event in the West’s history is necessarily balanced between two extremes of meaning: It must be neither the product of pure randomness, the cold materialist logic of an indifferent universe, nor the fulfillment of some grand design, belonging to a recognizable politics. Instead it must sit at a finely calibrated, market-liberal midpoint: a humble act of self-interest that pioneers a new trade, an individual innovation that confers broad benefits on society, a profit-motivated enterprise that enriches but trickles down. The utopians of the Union Colony, full of a conscious, communal desire to design and build a better world, have no place in such a narrative.
Michener was far from the first writer to shy away from sociological storytelling, and any dramatization of history is bound to rely on compression, allegory and composite characters. But if a primary strength of his work, as defenders like Hayes and Becker argue, is its capacity to educate a mass audience about the past, then it’s fair to evaluate the quality of that education — and not just with regard to the timing of North American interglacial periods, the relative merits of the Hawken and Lancaster rifles, the best breeds to cross with Hereford cattle and other trivia, but about some of the biggest and most important questions we’ve faced as a nation.
II.
Michener deserves credit for the extent of Centennial’s efforts to incorporate the historical perspectives of the Cheyenne and Arapaho into its grand narrative arc, and, well, he knew it; one of the only real uses he gets out of the novel’s framing device is to satirize the magazine editors who brusquely advise Vernor that his story can “safely” begin with the arrival of white settlers in 1844. Centennial’s best stretches come in the patient 300 or so pages that Michener devotes to the Platte River Valley’s long transition from untrammeled wilderness to commercial frontier to bloody borderland to imperial outpost.
It’s not to be taken for granted that any popular novelist telling America’s story in the early 1970s would have gone to such lengths to foreground and humanize the Indigenous people from whom its land was taken, or to confront millions of white readers in middle America with vivid details of the betrayals and atrocities that facilitated this dispossession. At the time of the novel’s publication, the 1864 massacre of hundreds of peaceful Cheyenne and Arapaho men, women and children by a U.S. Army regiment in southeastern Colorado was still widely known, and even memorialized on a monument outside the state Capitol, as the “Battle of Sand Creek.” Generations of schoolchildren in Colorado and beyond were raised on a version of American history that ignored such acts of brutality altogether.
In Centennial, the location of the Sand Creek Massacre is shifted to the north, and its name and those of its key actors changed, but little else about the event itself is disguised or sanitized. In a book composed of so many interlocking historical currents, there’s no turning point more singular than the arrival of preacher-turned-Army-officer Frank Skimmerhorn, who’s called to the Colorado Territory by a religious epiphany and promptly takes to the newspapers to lay out his mission in stark, biblical terms:
Patient men across this great United States have racked their brains trying to work out some solution for the Indian problem, and at last the answer stands forth so clear that any man even with one eye can see it. The Indian must be exterminated. He has no right to usurp the land that God intended us to make fruitful. … Today everyone cries, “Make Colorado a state!” Only when we have rid ourselves of the red devils will we earn the right to join the other states with honor.
If Skimmerhorn’s messianic, genocidal ravings sound too melodramatically villainous, it’s only to those readers unacquainted with the documented statements of the man he’s based on, U.S. Army Colonel John Chivington, who told his officers that he had “come to kill Indians, and believe it is right and honorable to use any means under God’s heaven to kill Indians,” or those of newspaperman William Byers, a Chivington ally who used the Rocky Mountain News to endorse a “few months of active extermination against the red devils” a few weeks before the massacre. (Around the same time, the editors of the Nebraska City News urged “a religious extermination of the Indians generally.”) The fictional Skimmerhorn proceeds to issue a declaration of martial law that echoes the historical proclamations of territorial Governor John Evans, who granted Colorado settlers free rein to “pursue, kill, and destroy all hostile Indians that infest the plains” and to “hold to their own private use and benefit all the property” of their victims.
Michener didn’t let his aversion to graphic violence outweigh the need to depict the massacre itself in gruesome detail, including its scalpings, mutilations and Skimmerhorn’s sanctioning of the mass murder of Cheyenne and Arapaho children on the grounds that “nits grow into lice,” another direct Chivington quote. If the Indigenous peoples of Colorado largely disappear from Centennial in the wake of this slaughter, it’s hard to blame that on an author attempting a faithful retelling of history, and Michener’s choice to devote an entire chapter to “The Massacre” at the narrative heart of his Big Book About America suggests there’s ample truth to Skimmerhorn’s prediction that the country could only become what it wished to be, and what it is today, through brutal and knowing acts of genocide.
And yet it doesn’t take long for the ever-patriotic Michener to begin to soften the weight of this judgment. Skimmerhorn, initially a hero to the people of the Colorado Territory for his role in the “battle,” is soon disgraced after shooting an Indian prisoner in the back. Again we see the limitations of Michener’s historical fiction and its reliance, however unavoidable, on load-bearing composite characters. Skimmerhorn is Chivington and Byers and Evans all at once, and as only one man, the forces he represents are more easily defeated and discredited. His march on the Native American camp is portrayed as essentially a rogue operation, enabled by an ill-timed recall of his commanding general to Kansas, when in fact no such circumstance existed (nor had Chivington’s family been victimized by Indians in the Midwest, as Skimmerhorn’s was in another too-neat plot flourish). The last we hear of Skimmerhorn, he’s fled the state, hounded by public scorn for his cowardly deeds. In reality, not even Chivington faced such a degree of accountability — though multiple official investigations reprimanded him for his actions, he eventually returned to Denver and remained a respected figure there until his death from cancer in 1894 — while Byers and Evans still rank among the most venerated figures from Colorado’s pioneer era.
Letting the Skimmerhorn character shoulder all of this is, to be fair, simply the flip side of Michener’s heroicism, a sense that the evils to be found in American history, just like the virtues, belong to the individual, rather than to larger systems and structures. To the extent that Michener considers those systems and structures at all, he is, like any good modernist, inclined to view them with a benign neutrality — as modes of economic and social organization that, if rationally designed, can help human civilization realize its full potential. Just as a well-regulated market mediates and harnesses the power of entrepreneurial capitalism, sorting beneficial productivity and innovation from destructive greed, so do republican institutions mediate and harness the best of the popular will, sorting democracy’s cooperative, egalitarian spirit from the darkest impulses of the mob. It’s classical liberalism’s non-utopian utopia: self-reliance and self-determination, each one the price of the other.
So then where did it all go wrong? Writing a decade after Centennial topped the bestseller lists, Becker noted the contradiction that haunts so many of Michener’s epics: “The novels show a past that is heroic, yet in their last episodes — those dealing with recent or contemporary events — they show a present that is confused, divisive, without coherent purpose or system of values.”
Michener’s great Western saga begins to lose steam as soon as the reader feels the frontier coming to a close, and declines precipitously when the narrative crosses into the 20th century, steamrolling through accounts of industrial beet farming, the Mexican Revolution and the Dust Bowl at a pace that manages to feel both rushed and interminable. But the bottom truly drops out in “November Elegy,” Centennial’s final and most embarrassing chapter. At last Dr. Vernor, arriving in the town of Centennial in late 1973 to finish his magazine assignment, can enter the action — but no sooner has he arrived than his place in the narrative is superseded by another, more aspirational Michener self-insertion.
Paul Garrett, whose introduction comes complete with a full-page family tree identifying him as the improbable descendant of nearly all of the book’s preceding major characters, is Michener’s ultimate fantasy of the Good American, a gruff no-nonsense cattle rancher with an environmentalist streak and a love of Chicano culture, a political Mary Sue who votes mostly for Republicans on the grounds that they “represented the time-honored values of American life” while allowing that “in time of crisis, when real brains were needed to salvage the nation, it was best to place Democrats in office, since they usually showed more imagination.”
Garrett is named chair of the committee planning Colorado’s state centennial celebration, a chance for Michener to rewrite his own dismal personal history with the Bicentennial Commission, but the wish fulfillment doesn’t end there. Soon Garrett accepts another appointment, as chief deputy to the man elected to a newly created statewide office, a position that Michener gives the hysterical title “Commissioner of Resources and Priorities” and describes, without elaboration, as having the authority to “steer the state in making right industrial and ecological choices.”
What follows is Garrett’s whistle-stop tour around early 1970s Colorado, a state grappling with the impacts of environmental destruction, economic inequality, simmering racial tensions, rapid population growth and the looming threat of water shortages. In one or another of his official capacities — before long, they seem to blend together — it’s up to Garrett to deal with each of these problems in turn. In Vail, he warns ski resort operators to expand their runs along the highway corridors, not to “commercialize the back valleys.” In Centennial, where the beet factory has been shuttered and housing developers are moving in, he orders the town’s noxious feedlot to relocate. At a research station in the Cache la Poudre Valley, he plots with hydrologists to curb industrial water use and save Colorado agriculture from the grimmest of futures. On his own ranch, Garrett signs off on a plan to cross his prized purebred Herefords with more efficient and profitable hybrids, and advises his radical Chicano brother-in-law to fight for the revolution by getting a college education: “Learn their system, Ricardo. Beat them over the head with it.”
It would be unfair to say that Michener left readers with the impression that all of the country’s problems could be easily solved, and these vignettes mostly serve as vessels for the author’s humdrum and largely harmless conservationist moralizing: “This nation is running out of everything,” opines Garrett after one meeting. “We forgot the fact that we’ve always existed in a precarious balance, and now if we don’t protect all the components, we’ll collapse.” But the animating faith behind a message like this is the sense that we’re still masters of our own destiny, that the precariously balanced system being described is still well within our control. If the system falls out of balance, we need only tweak the parameters, pull the right sequence of levers, rewrite the code.
From the early overtures of the Enlightenment, all the way through the dizzying heights of utopianism to the harried technocratic tinkering of the late modernists, this faith in the power of reason, in mankind’s mastery over the technologies and systems it had created, endured. But what if it no longer held? What if more and more people began to ask themselves, not without justification, whether the essential condition of their lives was not to be in control of these systems, but to be at the mercy of them?
5. The Miniseries
I.
On any given night in the early 1970s, between 40 and 60 million U.S. households turned on their televisions and chose among the programs airing on CBS, NBC or ABC — unless, of course, a presidential address was being carried by all three. Centennial was just beginning to take shape as an outline in one of Michener’s notebooks on one such night, August 15, 1971, when President Nixon preempted Bonanza to deliver an 18-minute speech that changed the global economy forever.
Nielsen ratings showed that fully two-thirds of all households in America were tuned in as Nixon laid out a new economic agenda, hashed out at Camp David over the preceding three days and spurred by the president’s fear that rising inflation and ballooning Vietnam War debt would lead to a recession and cost him the 1972 election. Months earlier, Nixon had made headlines with the remark that he was “now a Keynesian in economics,” and for the moment it was true enough; on balance, the three-pronged plan outlined in his address, “The Challenge of Peace,” was a sweeping government intervention in the economy, pairing stimulative tax relief with a nationwide freeze on wages and prices.
It was unlike anything the country had seen since World War II — and anything it has seen since. Only in retrospect can Nixon’s program be seen as one of the last gasps of the New Deal order, the bipartisan domestic consensus that government should play an active, central role in the economy to promote the general welfare, and the broader postwar period of international cooperation built upon the same principle. Within a few short years, it would become unthinkable for any American president, let alone a Republican, not just to impose price controls or promise full employment but even to speak of government as if it had the ability or responsibility to do anything of the sort.
In part, that’s the legacy of the third prong in Nixon’s economic plan: the dismantling of the so-called Bretton Woods system of international finance, accomplished by suspending the convertibility of the U.S. dollar into gold. Though overshadowed domestically at the time by the attention-grabbing (and enormously popular) price controls, it was the unilateral termination of Bretton Woods that reverberated around the world and became known as the Nixon Shock. While the structure established by the 1944 Bretton Woods Agreement had been under stress for years, Nixon’s surprise announcement abruptly overthrew an international order that had underpinned the decades of unprecedented prosperity enjoyed by Western democracies and Japan in the mid-20th century, and, in some sense, a system of commodity-backed money that had existed in one form or another for thousands of years.
The causes and effects of the unraveling that followed were complex, and would take years to play out. But Nixon had accomplished the critical delinking of monetary policy from a set of institutions, however imperfect, of democratic, multilateral oversight and accountability. In Europe, wrote the historian Tony Judt in Postwar, the resulting melee of devaluation and free-floating exchange rates “steadily depriv[ed] national governments of their initiative in domestic policy,” an outcome that suited an emerging generation of policymakers just fine:
In the past, if a government opted for a “hard money” strategy by adhering to the gold standard or declining to lower interest rates, it had to answer to its local electorate. But in the circumstances of the later 1970s, a government in London — or Stockholm, or Rome — facing intractable unemployment, or failing industries, or inflationary wage demands, could point helplessly at the terms of an IMF loan, or the rigors of pre-negotiated intra-European exchange rates, and disclaim liability.
In the U.S., too, the stage was set for what a recent book from the historian Fritz Bartel terms “the politics of broken promises,” the shattering of the New Deal consensus and the steady rollback of the midcentury welfare state in all its grand modernist ambition. Any positive effects that Nixon’s plan may have had on the domestic economy were soon rendered moot by the escalating crises of the mid-1970s, chief among them the supercharged inflation brought on by the Arab oil embargo in 1973.
An abundance of cheap oil had been a crucial component of the West’s robust postwar growth, but the embargo, though only in effect for the six months following the Yom Kippur War, ended that era forever. After a sixfold increase in Western oil demand since 1950, the U.S. and its allies found themselves heavily dependent on developing Arab states who were now keen, in a post-Bretton Woods world, to “recapture the value they had lost with the dollar’s decline,” Bartel writes. By the time the embargo was lifted, the price of oil had quadrupled. “With prices already galloping ahead at a steady pace,” he notes, “the fourfold increase in the price of the commodity that formed the basis of industrial society was bound to have dramatic economic effects.”
The end of Bretton Woods further meant that the fallout from economic turmoil in the West and the amassing of vast new pools of oil wealth in the Middle East would now be managed not by a set of international agreements between democratically elected governments, but by an unstable and unregulated new system of supranational capital flows. Global financial markets as we know them today had hardly existed at all in the 30 years of Bretton Woods’ reign, but after “[t]he meteoric rise of financial and energy wealth in the 1970s,” writes Bartel, suddenly their influence could hardly be overstated:
With the world’s surplus capital now at its disposal, the international financial community became an arbiter of politics around the world. … Any nation-state, East or West, that relied on borrowed capital to fund the products of its domestic politics was now subject to the capricious confidence of capitalists. As long as markets remained convinced that borrowed capital could be repaid on time and with interest … [p]oliticians could continue to promise their people prosperity, and the legitimacy of the government could survive unquestioned. But should market confidence ever falter, the domestic politics of borrowing states would be thrown into immediate disarray.
Once unleashed, it was only a matter of time before these forces would spell the ruin of the West’s modern industrialized affluence — strong manufacturing sectors powered by the mass production of Henry Ford, strong middle classes and labor unions protected by the welfare state of John Maynard Keynes. Fordism, wrote the sociologist Simon Clarke, had “promised to sweep away all the archaic residues of pre-capitalist society by subordinating the economy, society and even the human personality to the strict criteria of technical rationality.” But the rise of a truly global capitalism came with a new overriding logic, a new set of efficiencies to be realized. It hardly made rational sense, after all, to locate labor- and energy-intensive manufacturing industries in developed countries with high labor costs and severe dependencies on foreign oil.
What, then, was to become of the American economy as the long shadow of deindustrialization crept over the heartland? With what could it hope to replace the Fordist systems of manufacture that had fueled an era of prosperity unlike anything seen before in U.S. history? Bright minds had already been at work on the answer. At the time of the Nixon Shock, sociologists had spent years theorizing the “postindustrial society,” while management consultants had begun to lecture corporate America on the coming “knowledge economy.” By 1977, the “Information Age” was a concept popular enough for Senator George McGovern, the Democrat Nixon had walloped to win reelection five years earlier, to pen a New York Times op-ed on its implications. “Almost everything we can manufacture can also be made abroad,” wrote McGovern. “Today, there is a school of thought contending that the greatest economic and political resource we have — and one possible difference between future red or black ink in the trade-balance account — is the sale of information and know-how.”
In the ensuing decades, Americans would experience this shift to a service- and information-based economy in a wide variety of ways. They would experience it in factory closings and layoffs and declining wages and lost pensions, in a growing emphasis on the importance of postsecondary education, in the aggressive deregulation of the communications, transportation and financial-services sectors.
Mostly, though, they would experience it through television.
II.
The products of the post-Fordist economy would include TV shows, movies, music, sports and other entertainments, sure, but to a much greater extent they would be lifestyles, consumer goods and the brands and slogans used to sell them to mass audiences. Growth no longer depended on trade balances, assembly-line outputs and economies of scale, but on the ability of a new class of “creatives to defibrillate our jaded sensibilities by acquiring products marginally, but tellingly, different from the last model we bought,” writes the journalist Stuart Jeffries in his book Everything, All the Time, Everywhere. “Fordism may have offered car buyers any color so long as it was black; post-Fordism offers very nearly too many colors, including some that buyers had never heard of.”
Nothing was more essential to the rise of this new economic reality than television, which as an industry quickly underwent an information-age revolution of its own. By the late 1960s the networks had finally rid themselves of an early business model whereby a small handful of corporate sponsors purchased whole timeslots and themselves produced the programs to fill them, opting instead for advertiser “scatter plans” and multiple commercial spots per show. In 1971 the networks cut commercials’ minimum duration to 30 seconds from 60, instantly doubling the number of different messages viewers could be bombarded with nightly. Over the next 10 years the advertising revenues of the Big Three tripled, and TV executives, under increasingly acute financial pressures from Wall Street, waged a newly frantic war for every last ratings point. “Until the early 1970s, network programmers had relied primarily on hunches to keep attuned to the shifting tastes and attitudes of their audience,” writes the journalist Sally Bedell in Up the Tube, her 1981 history of the decade in television. “But as the financial stakes rose, they hired whole batteries of researchers to produce scientific analyses of popular views and program preferences.”
One consequence was the rapid maturation of scripted television as a popular art form, driven by boundary-pushing creators like Norman Lear. After rejecting its first two pilots, skeptical CBS executives stranded the January 1971 premiere of Lear’s All in the Family in a Tuesday graveyard slot; by the end of the summer it was the highest-rated show in America, a title it would hold for five years running. Its success opened the floodgates for a new generation of primetime shows that steered into the skid of the era’s social and political turbulence, abruptly ending the two-decade reign of provincial sitcoms and variety shows that pandered to conservative rural whites.
It didn’t take long for embattled executives at the Big Three to recognize the potential of All in the Family and its many spinoffs and imitators to better reflect the country’s diversity and draw viewers in with topical material that broke a host of longstanding network taboos. In 1969, Gomer Pyle, U.S.M.C. had gone off the air without so much as ever having mentioned the Vietnam War; within three years, viewers were offered a fall lineup that included M*A*S*H, its strident antiwar politics only thinly disguised by its Korean War setting, alongside contemporary explorations of race, sex, poverty, mental health, drug use and other issues on sitcoms like Sanford and Son and Lear’s Maude, which featured a special two-episode arc on its title character’s decision to have an abortion.
There are distinct echoes of James Michener in Lear’s long, prolific career, in his hopeful, accommodationist liberalism, and in his uneven reputation among high-culture critics for works that proved massively popular with middle America. Michener himself had briefly ventured into TV writing just as Hawaii went to print in 1959, penning scripts for ABC’s Adventures in Paradise, the first series to be filmed in the 50th state. Though a mild commercial success, the show was panned by critics, and the experience left Michener so embittered that 15 years later, Centennial’s final chapter reserved some of its harshest criticisms for broadcast media and the “illiterate cheapening” they had inflicted on society. “Radio and television could have been profound educative devices,” muses Michener via his hero Paul Garrett. “[I]nstead, most of them were so shockingly bad that a reasonable man could barely tolerate them.”
The creative renaissance of the 1970s raised hopes that television could at last take its rightful place as the instrument of mass moral edification that Michener and other lettered paternalists had dreamed of. As competition between the networks reached a fever pitch, budgets for primetime series soared, and programming executives experimented with attention-grabbing special-event films and miniseries, many of which took on hot-button issues or retold important stories from U.S. and world history. For the better part of a century, the modernists had dreamed of the power of new technologies and ambitious works of art to reshape society for the better. How intoxicating must it have been, then, to glimpse the world’s third-largest country, still healing from the scars of slavery and Jim Crow, tuned in together to the 1977 adaptation of Alex Haley’s Roots, beamed in technicolor into the homes of 130 million people at once?
It was the apex of American monoculture, and it was over almost as soon as it began — but not before NBC, languishing in third place in the ratings, greenlit a 21-hour adaptation of Centennial with a budget of $30 million, five times what Roots had cost ABC.
By the time it debuted in October 1978, Michener’s national epic had missed its moment in more ways than one. The Bicentennial had come and gone without anything remotely like a new consensus. The miniseries’ opening monologue, delivered by Michener against the scenic backdrop of the Grand Tetons and full of moralizing about the country’s responsibility to “this earth we depend upon for life,” was a message the country was well on its way to rejecting. Because of the show’s demanding production schedule, its 12 feature-length episodes were spread out over a period of five months; on its debut it drew fewer than half the viewers that Roots had, and declined from there. Despite sky-high expectations, it ended up as only the 28th-ranked program of the 1978-79 season. NBC sank even lower to record its worst ratings performance in over a decade.
For once, a Michener blockbuster had underwhelmed commercially while mostly winning over the critics — many of whom, at least at first, seem to have been bamboozled by the sheer scale of what NBC had marketed, probably accurately, as “the longest motion picture ever made.” Longtime Los Angeles Times TV writer Cecil Smith proclaimed it “a movie to match its mountains, big and brawny and beautiful as the awesome Colorado Rockies it celebrates,” adding that “[m]ore than any of its predecessors, this is truly a ‘television novel.’” In praise like this, one can detect a certain determination not to let Centennial’s undeniable seriousness go unrewarded. After decades of two-bit screwball westerns and lowbrow country fare, the TV business and its commentariat finally had before them a more grounded and literary rendering of the American mythos to prove the worthiness of the medium. Here was Wild Wild West star Robert Conrad in a hard-nosed dramatic portrayal of French mountain man Pasquinel. Here was no less a symbol of TV’s bumpkin era than Andy Griffith, the sheriff of Mayberry himself, playing against type as tweed-jacketed historian Dr. Vernor.
Mostly, though, Griffith’s arrival in Centennial’s twelfth and final installment just makes the viewer long for a bit of well-written farce and the easy comfort of a laugh track — anything to liven up the slog. Eventually, the spell the show’s ambition had cast on critics wore off just as it had on audiences, and it collected only two nominations and no wins at the 1979 Emmy Awards. Though $30 million could buy an A-list cast and enough cranes, dollies and helicopters to ensure a handful of striking landscape shots per episode, Centennial provides little else in the way of filmmaking artistry. What little compelling narrative texture Michener’s novel had to offer is sanded away into nothing by John Wilder’s scripts and papered over with the hand-holding and melodrama of ultraconventional TV storytelling. (One of Wilder’s only other notable credits in a long small-screen career was 1993’s criminal Return to Lonesome Dove.) The show’s sterile set design and costuming rob its frontier setting of its natural charm, and it’s dated terribly by the cheap cosmetics used to age its key characters through the decades — poor Richard Chamberlain, especially, is forced to spend the latter half of his turn as Scottish trapper Alexander McKeag with the most unsettling ghost-white, glued-on beard the world would see until David Duke’s.
Worse still is the series’ treatment of its Native American characters, the majority of whom are played by white actors in a kind of low-effort redface. Whatever else can be said about Michener’s novel, it devotes nearly one-fifth of its 909 pages to the story of the West before Pasquinel and other white settlers arrive; Wilder and NBC, by contrast, judged that audiences would tolerate fewer than six of the adaptation’s 1,248 minutes.
Unsurprisingly, the series’ most significant and least forgivable revisions to Michener’s tale serve to further ignore and whitewash the history of Native dispossession and genocide, a list of offenses topped by the woeful sequence that concludes episode five, “The Massacre.” While the novel’s ahistorical account of Chivington/Skimmerhorn’s fall from grace was bad enough, Wilder’s adaptation debases it further by inventing a climactic code-of-honor duel between Skimmerhorn and the requisite Good Soldier, Captain Maxwell Mercy, after which the massacre’s perpetrator is banished forever from the territory by his own son. “Skimmerhorn’s driven out of Colorado, out of power, and he’ll never be back. … All the wars are finished,” a remorseful but triumphant Mercy tells Arapaho chief Lost Eagle in the year 1865, promising supply wagons from Denver to save the surviving tribespeople from starvation. Among the series’ final images of Indigenous life is a wide shot of Mercy, gently weeping, and Lost Eagle, holding an American flag, embracing one another beside a cluster of tipis.
It’s an extremely potent symbol — not, of course, of anything remotely having to do with the real history of the West, but of the pathetic inadequacy of a certain brand of self-absolving liberal guilt, and of mass-market entertainment’s unlimited capacity for delusion and deceit.
Students of the Norman Lear school of TV storytelling had hoped, like the literary realists before them, to capture the whole of the human experience, the good and the bad — and believed, even if they wouldn’t admit it in so many words, in the power of these naturalistic narratives to be tools of social improvement. It was John Dos Passos’ old notion of the writer as scientist or engineer, discovering fundamental truths about society and how it functions, celebrating what helps it run smoothly while satirizing and discouraging what ails it.
But what if much of the country didn’t actually want to be confronted with the full truth of its genocidal past? (How many of the millions of Americans who’d made Centennial a bestseller do we really think didn’t just skip those chapters, anyway?) What if the only way for Roots to succeed, in the eyes of ABC brass, was with a massive promotional campaign that marketed the series as a soapy Gone With the Wind-esque plantation story with, as one executive told Bedell, “a lot of white people in the promos”? What if much of the country liked to laugh at, and even laugh with, Archie Bunker’s bigotry every week only so long as neither he nor they learned any lasting lessons about it? Who gets to say what’s good or bad anyway? For the networks, the answer to any and all such questions was automatic: The ratings decide. There were no contradictions worth worrying about so long as they could turn to a simple measure of success in their pitched three-way battle for 200 million pairs of eyes nightly.
By the time Centennial finished its run in February 1979, though, all involved — network execs, Hollywood producers, creative talent and, not least, viewers at home — were growing more and more dissatisfied with the whole arrangement. The ratings war descended into mania; the networks had taken to premiering 20 or more new shows in short runs every fall, then axing most of them after as few as one or two episodes based on overnight Nielsen figures and telephone surveys. The overexposure of made-for-TV movies and showcase miniseries, each one hyped as every bit the must-watch national event that the last had been, rapidly diminished their power to draw large audiences. After a decade-long development binge, cost-cutting measures returned in force. Low-budget game shows proliferated again. ABC, which had for most of its history been a distant third among the Big Three, surged to #1 with a lineup that relied heavily on the schlock and sex appeal of shows like Charlie’s Angels, The Love Boat and Fantasy Island.
“After twenty-five years of exploitation and repetition, culminating in the excesses of the 1970s, even the most dim-witted viewers began feeling some resentment,” writes Bedell. By the end of the decade, surveys had begun to show that for the first time in the medium’s history, TV viewership was in decline. For a brief historical moment, mass media had given America a genuine popular monoculture — a messy, modern, vulgar, pluralistic, multiracial reflection of itself — before it began to drive people away in disgust.
III.
The great fragmentation that followed wasn’t the product of shifting consumer tastes alone, but of complex new feedback loops created by the technological innovations, economic imperatives and policy reforms of the global information age. By the mid-1980s nearly half of American homes had a VCR, thanks in large part to a booming Japanese consumer electronics industry, and time-shifting and the home video market posed a new threat to the networks’ monopoly on viewers’ attention. The rise of satellite communications systems transformed international news broadcasting and enabled pay-TV services like HBO and national “superstations” like TBS. Though cable TV had been around in the U.S. in limited, local formats since the late 1940s, a wave of deregulation soon cleared the way for CNN, ESPN, MTV and a long list of other specialized national channels to explode the number of programming options audiences had to choose from. And all of this would be merely a dress rehearsal for the information maelstrom that had been brewing in those years not in the studios of Hollywood or the skyscrapers of Manhattan, but in the laboratories of Silicon Valley.
The hippies had gotten jobs, after all, and many of them saw in the emerging information economy the tools by which the highest hopes of the 1960s counterculture, the dream of a world remade by the liberatory powers of mind-expanding journeys of discovery and kaleidoscopic individualism, could at long last be achieved. Steve Jobs, a longhaired college dropout who’d traveled to India seeking enlightenment and cited Stewart Brand’s Whole Earth Catalog as a formative intellectual influence, gave the world the signal declaration of the tech industry’s values in Apple’s famous “1984” ad: the lifeless monotony of dystopian Fordism, the tyrannical horror of belonging to “one people, with one will, one resolve, one cause,” shattered and overthrown forever by the limitless freedom of self-expression the personal computer would offer. In 1995, as the dot-com era dawned, a triumphant Brand made sure readers of Time knew whom to thank for the “cyberrevolution” they were beginning to enjoy. “Reviled by the broader social establishment, hippies found ready acceptance in the world of small business,” he wrote. “The counterculture’s scorn for centralized authority provided the philosophical foundations of … the entire personal-computer revolution.”
Once bitterly at odds with the establishment and with each other, the Boomers were reconciled to both through an economy, and a politics, that joined liberal pluralism to profit-seeking hierarchy. Brand’s generational victory lap came at the height of post-Cold War American triumphalism, and though it was only in 1986 that Czesław Miłosz had spoken for many in the West when he blamed utopianism for the deaths of “innumerable millions,” it didn’t take long after the fall of the Soviet Union for many of those same intellectuals to wonder if humanity had achieved, in liberal democratic capitalism, a history-ending final stage of civilization.
It was a vision not of a modern utopia, but of modern utopianism transcended. Where the master planners had erred, the thinking went, was in believing that society could ever be subordinated to a single vision, no matter how idealistic, no matter how rationally designed. In the final decades of the 20th century, people in the U.S. and other Western countries watched as the great projects of high modernism crumbled around them — in some cases literally, as with the ambitious public housing developments that Le Corbusier and his disciples had helped governments erect in cities around the world.
The sprawling Pruitt-Igoe complex in St. Louis, designed by the Japanese-American architect Minoru Yamasaki, became one of the first and largest such projects to be demolished beginning in 1972, but it was by no means the last. Seemingly wherever these projects had been built, in the decades that followed they had fallen steadily into disrepair, plagued by poverty and crime, leaving the movement that had inspired them permanently discredited. “Modern Architecture died in St. Louis, Missouri on July 15, 1972, at 3:32 p.m. (or thereabouts),” wrote the critic Charles Jencks in a 1977 book whose title helped popularize a term that had long been floated in art and academia to describe what would come next — The Language of Post-Modern Architecture.
Crucially, the emerging ideologies of the postmodern period located the cause of these failures in key values embedded within modernism itself — in the scope and sincerity of its ambition, in its faith in science and rationality, in its pursuit of objective truth. Architecture’s postmodern critics took an evident joy in identifying the design flaws (Pruitt-Igoe’s efficient but ill-fated skip-stop elevators, the wanton destruction of Robert Moses’ expressways, the logical but lifeless plazas of Le Corbusier’s Chandigarh or Lúcio Costa’s Brasília) that exposed their designers not as impartial arbiters of pure reason but as fallible bureaucrats who’d failed to account for the breadth and multiplicity of the human experience. The theorist James C. Scott collected stories of high-modernist hubris in his 1998 book Seeing Like a State, from experiments with “scientific forestry” in 18th-century Prussia to forced villagization programs in postcolonial Africa. “The progenitors of such plans regarded themselves as far smarter and farseeing than they really were and, at the same time, regarded their subjects as far more stupid and incompetent than they really were,” Scott wrote.
Postmodernism heralded itself as a radical democratizing force, bent on dismantling an ideology that had promised to make things new but instead only reproduced old, old forms of rule of the many by a cruel or incompetent few. Just as suspect as grand architectural or social development projects were the grand narratives of mass culture; postmodern thought meant skepticism not just of authority, but of authorship itself. How could any so-called Great American Novel, almost exclusively the work of upper-middle-class cisgender heterosexual white men, ever hope to represent or speak to the experiences and needs of America’s wildly diverse populace? Why, for that matter, should “American” matter at all?
It’s hard to imagine a work less prepared to withstand such criticism than Michener’s Centennial, its insistence on a rectilinear, just-the-facts presentation of U.S. history wholly undermined by its glaring blind spots and eventual descent into heavy-handed lecturing. For all the artifices of high modernism, for all its gestures towards rationalism and systematization, at the end of the day there was only ever a flawed, flesh-and-blood individual at the controls of the machine, attempting to tell the masses where and how to live or what core values America stood for. Not for nothing did several consecutive generations know the 20th century’s archetypal oppressor as Big Brother, or simply The Man.
By contrast, postmodernity promised true liberation not through the establishment of a new order but through the effacement and erasure of order altogether, supplanted by the mesmerizing chaos produced by the information economy and increasingly decentralized communications technology. By its nature, postmodern culture was infinitely variable in its appeal, crossing ideological and generational lines as it drew consumers of the cable-TV and internet ages into insular ecosystems where their individual interests and values could be endlessly reflected back to them. To the punks, the ironists, the avant-garde, it imbued experimentation and iconoclasm with new artistic merit and political urgency. To theorists in the academy, and to people belonging to long-oppressed demographic groups, it meant the emergence of a wide array of emancipatory ideologies that aimed to deconstruct harmful binaries and power relations. The postmodern critique of the modern was persuasive and expansive enough to win over both anarchists like Scott and world-conquering capitalist entrepreneurs like Jobs, and the lessons drawn from all this never needed to be especially coherent: “Our generation proved in cyberspace that where self-reliance leads, resilience follows,” declared Brand in 1995, “and where generosity leads, prosperity follows.”
And yet many, many Americans did not find the experience of the postindustrial economy to be quite so liberating or prosperous. Perhaps this was most acutely true of the factory workers and Rust Belt towns who were among the earliest and most obvious losers of globalization, but it was by no means limited to them. From a psychological standpoint alone, the era came with new compulsions and social pressures, new modes of stress and alienation, new paradoxes of choice. To live in the information age has meant being surrounded at all times by an exhausting cacophony of language and image, much of it designed for the express purpose of manipulating our cultural tastes and consumer habits.
More to the point, for many people in the U.S. and other Western countries it has meant believing, often with very good reason, that things have come unstuck — that too much of the information we encounter no longer bears any meaningful relation to the material world, and that beneath the noise of all the promotional bluster and insistent hard sells and aspirational lifestyle marketing we’re deluged with, in one key way or another our lives are getting not better but worse.
Not surprisingly, it was literature that proved best able to anticipate and give shape to this feeling. Even before postmodernism had a name, a small handful of writers had swan-dived into the torrents of language cascading through the information economy and surfaced performing exhilarating genre experiments and acrobatic feats of prose. They were stylists whose virtuosic facility with language granted them a kind of backdoor into the rapidly assembling source code of the postindustrial age: for Thomas Pynchon, it was the carnival-barker gab of midcentury ad men and the epigrammatic menace of the ascendant security state; for William Gaddis, it was the unbearable accumulated weight of two millennia of Western artistic tradition and Christian religious doctrine. Even Dos Passos, as early as the 1930s, had prefigured many of the moods and methods of postmodern literature with U.S.A.’s collages of newsreel excerpts, short biographies and experimental “Camera Eye” segments, the frantic social and political currents of the era channeled through prose heavy on nonstandard compound words and long breathless sentences that heighten the sense of anxiety and overload. The novel’s characters are sailors and stevedores, aristocrats and actresses, labor organizers and P.R. pros, but they all share a certain helplessness, finding themselves continually adrift on world-encircling tides of commerce and conflict that they could never hope to control.
Deindustrialization was far from the only consequence of the tumultuous global capital flows unleashed by the reforms of the 1970s. In competing for the favor of international financial markets, governments faced new pressures to implement drastic programs of austerity, deregulation and “labor discipline.” Some nations, especially in western Europe, proved more resistant to these pressures than others, managing to maintain their strong welfare states even through prolonged periods of high unemployment — but in the U.S., President Ronald Reagan’s revolution rapidly remade the American economy into a haven for the world’s surplus capital. Tax rates were slashed, unions were crushed, inequality skyrocketed, and the widening tears in the country’s social fabric began to be patched together with readily accessible credit. “In place of the postwar social contract of rising incomes, job security and full employment, working- and middle-class Americans were offered a new deal of debt-fueled consumption,” writes Bartel. Reagan proved unable to meaningfully shrink the size of the federal government, and his supply-side economics failed miserably to revive the domestic industrial economy. But the influx of foreign capital proved instrumental in bringing about the end of the Cold War, allowing his administration to run massive deficits as it paired tax cuts with a military spending spree and flexed its geopolitical muscle to force a wave of financial reckonings in the debt-riddled Eastern Bloc.
By then, much of the communist world was speaking the language of global capitalism, anyway. Mikhail Gorbachev’s perestroika, launched in the mid-1980s after a decade of cascading commodity shocks and mounting international financial pressures, was administered in large part by the Soviet economist Abel Aganbegyan, who praised Margaret Thatcher and spoke of the U.S.S.R.’s need to develop “an effective system of individual incentive and responsibility.” Another Thatcher admirer, the Hungarian communist leader Károly Grósz, assured representatives of the International Monetary Fund in 1986 that “Marxism has never accepted egalitarianism, but rather the postulate of equal opportunity.” In the end, Western governments didn’t so much win the Cold War on ideological grounds as prove better equipped to justify to their people the miseries and indignities of the late 20th century’s economic turmoil. As Bartel writes, while the communist state “could claim no special right to rule if it was just going to be any empty vessel for the allocation of the market’s punishment and rewards,” the ideology that governed the West, to put it mildly, “suffered from no such quandary. Not only did it seek to turn the state into as much of an empty vessel as possible for the market’s provision of luxury and anguish, but … sought to use the state to actively advance the interests of the wealthy few, all while justifying this pursuit in a lexicon of political and economic freedom.”
Each in their own way, the two great world-shaping ideological and economic forces of the modern age — democratic capitalism and state socialism — had succumbed to a force altogether greater than themselves. As Eastern Bloc countries threw off the yoke of Soviet rule only to find themselves subject to the demands and volatilities of global financial markets, so too did people in the West feel the institutions of the welfare state and postwar internationalism shrink, and the levers of majoritarian self-rule grow gradually less responsive to their material needs. “The end of the Cold War, we must conclude, was the moment in which the people’s power peaked and the moment in which it was overcome,” writes Bartel. “Every government that lived on credit after the oil crisis … was beholden to the twin masters of global capital and their own people.”
In Bartel’s telling, this transition demonstrated the greater capacity of the West’s democratic and capitalist institutions to “break promises.” But of course it’s just as true to say that postmodernity gave those institutions the means to make a whole new set of promises to citizens and consumers. Promises, after all, are the stuff of which the financialized postindustrial information economy is made: investment and credit products that promise security and prosperity; consumer products that promise infinite aestheticization and self-expression; entertainment products that promise endless diversion and play. Gone were the universalist covenants of a market-liberal New Deal or a Marxist-Leninist classless society, replaced by an ever-changeable set of values and ideas that could be continually reassembled to construct your own personal promised land. “Rejecting the grand utopian visions of total planning and total design,” wrote the architectural theorist Katharine Kia Tehranian in 1995, “the postmodernists are calling for an eclectic city of many faces and neighborhoods that can accommodate a whole range of utopias in miniature.”
And if those promises are broken, too? If you find yourself growing unhappy, in your bespoke mini-utopia? Well, maybe you just need to choose a different one. Maybe you need a new job, or a new house, or a new hobby, or a new set of clothes. Maybe you need to become a gym rat, or a Deadhead, or a born-again Christian, or a New Age beach bum, or an urbanist cargo-bike dad, or an off-the-grid goat-farming mommy blogger. Still not happy? Still have a feeling that something’s not quite right? Have you tried yoga or video games or a new tattoo or Paxil? Could you be an outdoorsy granola girl, a grindset finance bro, a K-pop stan, a flat-earth YouTuber, a Disney adult? Not your thing? Hypebeast, tankie, incel, doomer, tradcath? No? Nothing’s working — job sucks, rent too high, insurance fleecing you, planet on fire, too many promises broken? Well — who are you going to complain to, exactly?
Who do you think is in charge here? Who do you expect to hold accountable for the iniquities of the postindustrial economy, in its diffuse, decentralized networks of transactions and counterparties, its mystifying language games and epiphenomena, its transnational cascades of cause and effect? Fish caught off the coast of Norway are shipped to Indonesia to be processed and packaged for sale in New York. Aquifers in Arizona run dry to grow feed for dairy cows in Saudi Arabia. Code is tweaked in an office in Menlo Park and ethnic tensions rise in Ethiopia. A fire breaks out at a semiconductor plant in Japan and U.S. used-car prices hit historic highs.
In Centennial’s “November Elegy,” Michener, who at 67 had lived nearly his entire adult life in an America forged by the New Deal consensus, dreamed of a rational bureaucrat empowered to set right all the economic and environmental ills that plagued the country in the 1970s. But his vision was hopelessly outdated almost as soon as the novel went to print. In the world created by the later developments of that decade, the management of “resources and priorities” is something far more opaque, governed by a logic far less legible to us.
Markets move, balance sheets are adjusted, carbon accumulates in the atmosphere — and a nurse in Memphis loses her home to foreclosure, a pensioner in Thessaloniki sees his monthly payment slashed, a family in Balochistan escapes the flood that washes their home away. None of it’s the fault of anyone in particular. No president can be elected, no dictator installed, to stand meaningfully in the way of any of it. In postmodernity, we no longer fear the man behind the curtain; now the thing to be raged against is the machine, or the system. “No one at all is in the executive suite,” the academic Charles A. Reich wrote presciently in 1970’s The Greening of America. “What looks like a man is only a representation of a man who does what the organization requires. He (or it) does not run the machine; he tends it.”
IV.
Amid the radical diversity of subject, genre, style and tone that characterizes postmodern fiction, it’s the paranoid “systems novel” that stands out as its most enduring and essential form — stories that break some of the oldest rules in literature to portray characters whose actions hardly matter at all, so fully are they at the mercy of the tectonic structural forces that dictate the conditions of life in postmodernity. For some authors, like Pynchon or Kurt Vonnegut, these feelings of fragmented helplessness found expression in manic globe-trotting satires whose convoluted conspiracies and madcap plot contrivances — “If the world is absurd, if what passes for reality is distressingly unreal, why spend time representing it?” asked the Vonnegut scholar Jerome Klinkowitz — did not, and were not meant to, ever resolve satisfyingly for the reader. Others, like Don DeLillo in White Noise or Toni Morrison in Beloved, evoke the same feelings in novels nearer to everyday life, their quieter scenes and more grounded characters haunted by unspeakable traumas from the past or menaced by unknowable dangers lurking in the skies overhead.
What, then, is this sinister presence in our lives, this ungovernable machine whose capriciously destructive power we’re primed to recognize in the shadowy schemes of Nazi rocket scientists or invisible airborne clouds of toxic chemicals? Of course, for many writers and thinkers in the postmodern age, belonging at least in a broad sense to the political left, it’s the governing logic of the global economic order — what’s often called, in reference to the distinct turn made by much of the developed world beginning in the 1970s, neoliberalism, though in plenty of other contexts plain old capitalism serves just fine. But one can fault those terms not just on narrow ideological grounds but on the suspicion that -isms do too much to portray the thing in question as the product of human intention, a choice that we make or a belief that we subscribe to. By the same token, the rote inevitability connoted by globalization, deindustrialization, digitization, financialization and others probably does too much to let us off the hook.
Whatever the thing is, it has the qualities of both an -ism and an -ization. We’re complicit in it, but not in control. We give it certain inputs, but it has a mind of its own. Is this not a lot less strange, a lot less paranoid, than it once may have sounded? Is there not something very much like this we experience daily, omnipresent but invisible, predetermining for us the content we see on social media or the terms we’re offered for a bank loan, which we’re accustomed to just calling the algorithm, and soon the A.I.? These too, though, may be metaphorically useful while getting us little closer than Reich’s the machine does to a satisfying name for what we’re talking about; to assign responsibility to and even anthropomorphize some artificial creation of ours is to make both of the preceding mistakes at once.
Other transformations that loom large over us — not least that most postmodern of catastrophes, climate change — may be symptoms and accelerants of the thing, but not the thing itself. The economic historian Adam Tooze has helped popularize the polycrisis as a catch-all for the present moment’s hopeless entanglement of global emergencies, but it feels like the ultimate surrender to postmodernity and its fear of narrative to refuse even to try to give its organizing force a name.
Maybe instead of looking upward and outward, into sci-fi futures dominated by Skynet or the Matrix or Silicon Valley’s all-too-real ambitions for a neofeudal metaverse, we’re better served by searching for something hard-wired into the circuitry of the material world. What DeLillo’s Jack Gladney comes to fear about the Airborne Toxic Event, after all, is not just large things looming overhead but the microscopic molecules at work in his bloodstream, chemical bonds breaking and forming and breaking again according to certain inviolable laws of matter.
As the modern age dawned in the mid-19th century, science introduced us to entropy, from the ancient Greek for “turning,” the endless tumbling-down randomness and dissipation of a universe at war with itself, the insatiable drive for energy to be released and for capital to accumulate — inevitable in the long run, but eager to be hastened by human agency, coaxing us along to ensure that its flows are made maximally efficient, that nothing stands in the way of its all-consuming imperatives. The second law of thermodynamics, which decrees that entropy can only ever grow, is “the only equation of fundamental physics that knows any difference between past and future,” writes the Italian physicist Carlo Rovelli. “The only one that speaks of the flowing of time.” Perhaps what Pynchon in Gravity’s Rainbow described simply as the system is all of these things at once, high-tech and mundane and apocalyptic and primordial:
Kekulé dreams the Great Serpent holding its own tail in its mouth, the dreaming Serpent which surrounds the World. But the meanness, the cynicism with which this dream is to be used. The Serpent that announces, “The World is a closed thing, cyclical, resonant, eternally-returning,” is to be delivered into a system whose only aim is to violate the Cycle. Taking and not giving back, demanding that “productivity” and “earnings” keep on increasing with time, the System removing from the rest of the World these vast quantities of energy to keep its own tiny desperate fraction showing a profit: and not only most of humanity — most of the World, animal, vegetable and mineral, is laid waste in the process. The System may or may not understand that it’s only buying time. And that time is an artificial resource to begin with, of no value to anyone or anything but the System, which sooner or later must crash to its death, when its addiction to energy has become more than the rest of the World can supply, dragging with it innocent souls all along the chain of life.
But perhaps, too, it is worse still than that. We understand now how little we can actually know about Kekulé’s benzene molecule, which is not in fact a comforting cyclical constant but, in its constituent parts, just another instrument of doubt and decay. We can’t even say with certainty where the electrons that make up its bonds are at any given instant; if we try to zoom in further we find only an indeterminate morass of probability and superposition. It was the theorist Jean-François Lyotard, in the foundational text of postmodernist thought, who analogized the radical discoveries of quantum mechanics to the late 20th century’s cultural collapse of objectivity and authority.
All that’s left, if neither modernity nor postmodernity can supply us with the nomenclature we’re after, is to rummage around for it in the flotsam of the various spiritual traditions that wrecked upon their shores. If you’ve seen footage of the demolition of the Pruitt-Igoe complex, the spectacular endpoint of high modernism, it was almost certainly in the 1982 experimental film that took as its title the Hopi word koyaanisqatsi, “life out of balance.” The long, dizzying sequence in which cinematographer Ron Fricke’s camera captures the planned detonations at Pruitt-Igoe and other scenes of urban decay follows a series of Pynchonian images of firebombs, ballistic missiles and nuclear warheads, the terrible uses to which so much modern science was put, as if to argue that nothing humanity achieves through technological progress can outweigh or outrun the destructive powers that such progress unleashes. In the film’s final scene, an Atlas rocket is launched skyward, hurtling towards the heavens, putting the stars within our grasp — and then explodes in a flash, falling lifelessly to earth, its flames slowly flickering out in the void. Another translation of koyaanisqatsi, an end card tells us, is “life disintegrating.”
Much of Buddhist philosophy revolves around the similarly entropic concept of pratītyasamutpāda, “dependent origination,” the contingency and impermanence of worldly phenomena, closely linked with the doctrine of śūnyatā, the “emptiness” or lack of essence of all things. These are ideas, thousands of years old, that resonate with the postmodern artist and the quantum physicist alike, as they’re drawn into ever stranger experiments that contain, in the observation that things are indefinite and not what they seem, the promise that they might become whatever we want them to be. An electron takes on the qualities of both a particle and a wave; it passes through space where it both is and isn’t. An eclectic mishmash of cylindrical and pyramidal geometries bursts from the drab rectilinear base of a public library; an office parking lot in downtown New Orleans becomes a knowing parody of a classical Roman piazza. A film tells its story out of chronological order; a stage actor is directed to break character and address the audience as himself. A pop star assumes and discards alter egos like costumes; a conductor and her symphony perform a three-movement piece that consists of no music at all.
But where can these ideas lead except the one place they seem to? Early Buddhist thinkers wrestled with the question of whether conceiving of the world as endlessly conditional, empty of intrinsic being, was a path towards ucchedavāda, “annihilationism.” A woman wanders the world alone, after every other person on earth has inexplicably disappeared. She crosses oceans, visits museums, pecks at a typewriter, but there is no plot to speak of, only a disjointed series of philosophical meditations and fading memories, a jumble of signs and symbols being drained of their meaning, the bonds between them unraveling. It’s a novel. Or is it? “Even when one’s telephone still does function,” says David Markson’s narrator, “one can be as alone as when it does not.”
“There is a sense of wandering now, an aimless and haunted mood, sweet-tempered people taken to the edge,” DeLillo writes. “In the altered shelves, the ambient roar, in the plain and heartless fact of their decline, they try to work their way through confusion.”
A train, careening down the track in a manner determined only by the imperatives of the system, derails outside of a small Midwestern town, sending a cloud of toxic chemicals billowing into the sky. In the end, what is it that we fear lurking unseen above, below and all around us? Jack Gladney, for one, just knows it as death.
Musical interlude
𝄂𝄃
6. The Novel (reprise)
I.
“Emptiness misunderstood destroys the slow-witted,” wrote the Indian philosopher Nāgārjuna in a revolutionary second-century Buddhist text, “like a snake incorrectly seized, or a spell incorrectly cast.” Rejecting both the nihilism of ucchedavāda and the “grasp for permanence” of fixed absolutes, Nāgārjuna’s Mūlamadhyamakakārikā instead endorsed Buddhism’s Middle Way between the extremes: “Whoever sees dependent origination also sees suffering, and its arising, and its cessation, as well as the path.” Perhaps the impermanence, conditionality and interdependence of everything in the world fills it with meaning and wonder and beauty, rather than draining it of them; perhaps one can achieve nirvāṇa not by halting or escaping saṃsāra, the “cyclic existence” of all things, but by embracing it.
I first read passages from the Mūlamadhyamakakārikā in Rovelli’s Helgoland, which uses Nāgārjuna’s radical view of emptiness as a framework for grappling with the revelatory 20th-century uncertainties of quantum physics and general relativity. An Italian theoretical physicist, inspired by the epiphanies of a German scientist on a remote island in the North Sea in 1925, drawing upon the teachings of a Buddhist monk who lived in India 2,000 years ago, writes a meditative treatise on relational quantum mechanics that is interpreted by two British translators and made available instantaneously around the world on a small device manufactured in China by a Seattle-based tech giant. Of course there are wonders, too, in the tangled webs that postmodernity weaves.
It didn’t take long for the first furious wave of postmodernism, led by artists and thinkers who believed they could wield it as an emancipatory weapon against the prevailing social and economic order, to prove unequal to the task. The literary postmodernists were everything that James Michener and Centennial were not — bitingly satirical, stylistically agile, equipped with an expansive toolbox of mind-bending techniques that challenged some of our longest-held assumptions. But perhaps, despite John Gardner’s famous claim to the contrary, when it came to core questions about literature’s moral purpose, they weren’t all that different from the instrumentalizing modernists they succeeded. Why bother to make art that mirrors the complex absurdities of the system, after all, if not because you hold out hope that you might spark rebellion against it? In their insurrection against postindustrial hegemony, authors unleashed chaotic fusillades of camp, parody, pastiche, subversion, absurdism, manic esoterica, cool detachment, winking meta-reference — in a word, irony, the writer’s handiest tool for infiltrating the gap between appearance and meaning, between the material reality of the postmodern world and the cacophonous falsity of the information age.
None of it so much as made a dent. As early as 1993, David Foster Wallace wrote of the ease with which postmodernity was able to “capture and neutralize” the irony-tipped barbs of its supposed antagonists, epitomized by the willing embrace by network TV and its corporate sponsors of irreverent, fourth-wall-breaking self-parody. “Televisual culture has somehow evolved to a point where it seems invulnerable to any such transfiguring assault,” Wallace wrote. “For at least ten years now television has been ingeniously absorbing, homogenizing, and re-presenting the very cynical postmodern aesthetic that was once the best alternative to the appeal of low, over-easy, mass-marketed narrative.” What was true of TV in the early 1990s is exponentially truer of today’s internet, where users are invited to indulge any liberatory, profane or anarchic impulse they want, so long as they do so on a small handful of platforms controlled by billionaire tech monopolists and autocratic sovereign wealth funds. The cultural forces that dethroned modernism have produced only a superficially dissimilar system that, in Wallace’s words, “takes elements of the postmodern — the involution, the absurdity, the sardonic fatigue, the iconoclasm and rebellion — and bends them to the ends of spectation and consumption.”
Writing in the early 1990s, a decade or so after “postmodernism” had entered the discourse, Wallace could afford to hope that some potent new cultural force might simply come along to replace it. Thirty years later, we can’t possibly be so deluded. It’s barely worth rehashing Wallace’s critique of postmodern irony, his call for a return to earnestness, “reverence” and “conviction” in literary fiction, because how many times have we cycled between the two in the last 30 years of criticism? How many more rounds will we go on irony vs. the New Sincerity, snark vs. smarm, hipster vs. normcore, David Brent vs. Ted Lasso, Weird Twitter vs. everyone? The cultural authorities who once had the power to reify and periodize new literary movements, who could forge for us a viable post-postmodernism, have been thoroughly discredited and destroyed by postmodernity itself. (The poet and professor most responsible for the brief injection of “metamodern” into pop consciousness a decade ago now makes his living as an anti-Trump #Resistance reply guy.) A latent sense that postmodernism has been exhausted as an artistic force can’t change the fact that “we remain trapped within its embrace,” argues Stuart Jeffries. “The market-based culture of differentiated consumerism and individual libertarianism that postmodernism extolled was ultimately realized in Amazon, Facebook and Twitter,” he writes. “We’re all postmodernists now.”
It seems impossible, then, to identify any quality in particular that’s meaningfully shared by much of the most influential 21st-century fiction, which instead seems fated to oscillate forever, propelled to lightspeed by the particle collider of the digital knowledge economy, between poles of irony and sincerity, classicism and experimentation, minimalism and maximalism, the real and the speculative. The one thing that an overwhelming majority of authors and their works have in common in the Age of Amazon, as Mark McGurl’s Everything and Less convincingly names this period of literary history, is that virtually no one is reading them. Among the tens of thousands of novels released each year by traditional publishing houses, two-thirds sell fewer than 1,000 copies. By McGurl’s reckoning, hundreds of thousands of other works of fiction are self-published each year through Amazon’s Kindle platform and “essentially never read by anyone.” Americans are buying more books but reading fewer of them, surveys show, as time spent consuming TV, video games and social media continues to grow.
Still, when we widen our scope to include these vast troves of genre and YA fiction, McGurl’s “underlist” and adapted and original works in TV and film, there are a few signals amid all the noise. It’s difficult to ignore the staggering rise in popularity of dystopian and post-apocalyptic storytelling over the last few decades, a trend that spans from the most highbrow literary fiction to the most populist blockbuster multimedia franchises. With few real precedents prior to the late 20th century, tales about zombies, plagues, nuclear holocausts, climate catastrophes, totalitarian takeovers and A.I. singularities have at various points in recent memory seemed poised to monopolize narrative entertainment completely. Even if few of these works are postmodernist in a purely artistic sense, they are almost indisputably the defining literary product of the postmodern era — and it’s difficult, too, not to see this as an inversion of the golden age of utopian literature, which peaked in popularity as modernity dawned in the late Victorian period and has since disappeared almost entirely.
It could be tritely observed that where once masses of readers were drawn to imagining a new and better world, today we seem drawn to nothing so much as imagining its end. But of course that’s not really what post-apocalyptic fiction offers us, is it? The apocalypse in question is rarely the point, and even in the most desolate tales of a world in ruins, the very act of telling a story set within it suggests the beginnings of a new world to come. “The only true account of the world after a disaster as nearly complete and as searing as the one (Cormac) McCarthy proposes,” observes Michael Chabon in his review of The Road, “would be a book of blank pages, white as ash. But to annihilate the world in prose one must simultaneously write it into being.” In other, less despairing post-apocalyptic and dystopian titles, the collapse of civilization as we know it often seems almost purely incidental, drained of any larger political or moral meaning. McGurl calls it “a fantasy of refreshment and reset,” but for readers and authors alike, it’s most practically a tool of simplification and avoidance, clearing the decks for timeless, archetypal dramas to be staged free from the complications and monotony of our digitized, globalized society. “While it is apparently ‘epic’ in all sorts of ways, not least in its length,” McGurl writes, “the central appeal of the genre is in how severely it reduces the scale of the social world.”
What the least valuable recent dystopian fiction, then, shares with the most forgotten utopian fiction of the 19th century — which is to say most of it — is this lack of willingness or ability to grapple with the present state of the world in its full, material complexity. Whether in bestsellers from the likes of Edward Bellamy or cult manifestos from the likes of Charles Caryl, vulgar literary expressions of modern utopianism tended to imagine sprawling new universal systems without any compelling theories of how the existing order might be disentrenched and dismantled. Today, the polarities are reversed; we are full of subversive, deconstructive energies but lack the capacity to envision any real alternatives. When the New Yorker’s Jill Lepore describes contemporary dystopianism, she might as well be Wallace critiquing postmodern irony three decades ago: “Dystopia used to be a fiction of resistance; it’s become a fiction of submission, the fiction of an untrusting, lonely, and sullen twenty-first century, the fiction of fake news and infowars, the fiction of helplessness and hopelessness,” Lepore writes. “It cannot imagine a better future, and it doesn’t ask anyone to bother to make one.”
II.
If anything is to come after postmodernism, perhaps it will be best characterized not as an evolution, nor as a return to modernism, nor even (as some “metamodernists” would put it) as a kind of synthesis of the two, but as something that transcends and confounds them both, definable not along any of the familiar axes but in some new dimension we can’t conceive of yet. Just as the modern and the postmodern only came fully into view decades after they began to reshape culture, it could be a lifetime before we’re equipped with the language or the technology to clearly articulate what their successor might be.
But if we’re eager to move on, to detect in the literary fiction of the present some early overtures of what’s next, perhaps we can find them in what the writer Lincoln Michel terms the “speculative epic”: the maximalist, multilinear, time- and genre-hopping sagas that have proliferated in the last few years, epitomized by Anthony Doerr’s Cloud Cuckoo Land and Emily St. John Mandel’s Sea of Tranquility but observable in at least a half dozen other major titles and in more explicitly genre fiction as well.
Many of these novels contain visions of a future not easily categorized as either utopia or apocalypse, full of wondrous technologies but forever altered by deadly pandemics or climate change — “perhaps,” writes Michel, “because the issues we now face feel so overwhelming that only a vast, imaginative canvas can begin to tackle them.” In one sense, these works of kaleidoscopic pastiche could hardly be more characteristic of postmodern literature, and they rely heavily on the information-age reader’s facility in switching back and forth between different perspectives, styles and social contexts. But they also tend to be unapologetically sincere and universalist, their panoplies of characters and settings often threaded together by a fictional work of art (in Doerr’s case, a literal grand narrative) that offers hope that our own atomized, discontinuous existences might be given structure and meaning by something larger than ourselves. A foundational text in this genre, David Mitchell’s Cloud Atlas, goes to great lengths not just to bridge the historical early modernity of the mid-19th century to far-flung dystopian and post-apocalyptic futures but to loop back and tie each of its narratives to every other.
Already, postmodernity has dislodged the novel — hopelessly old-fashioned and rigidly univocal as it is — from the central place in the culture it once occupied, a development that seems unlikely to be reversed. But we are drawn atavistically, the speculative epic suggests, towards the pure act of storytelling, undiluted by the spectacle of moving images or the dopamine traps of video games and social media. No technology presently conceivable will displace a human being typing words on a page as the lowest-cost and most accessible way to experiment with what kinds of stories we want to tell — something that will be demonstrated, not disproven, by the rapid, curdling exhaustion of A.I. models that are capable only of endlessly recycling and regenerating at the most superficial possible level.
There’s no need to reject this particular onslaught, which is lucky, since there would surely be no stopping it anyway. Like any other technological leap in the information age, it’s more or less inevitable, and it will bring new wonders and new horrors, new efficiencies and new annoyances. Our phones and computers make us better informed and more depressed. Globalization has ameliorated some of the world’s most extreme forms of poverty even as it has exposed billions to new precarities and vulnerabilities. Our decentralized, democratized communications networks allow a far richer diversity of identities and experiences to be widely shared and also seem to be making us despise and distrust each other more.
Is there some way to retreat from the furthest excesses of all this that isn’t a reactionary flight to ancient hierarchies? Is there a path forward through it all that isn’t the millenarian transhumanism of megalomaniacal tech oligarchs? If there is, fiction, however broadly defined, has a role to play in helping us find it. In a preface to a 2016 edition of Thomas More’s Utopia, the author China Miéville urges us to “learn to hope with teeth,” to dream of fantastic new worlds but “never mistake those dreams for blueprints, nor for mere absurdities”:
There is bad pessimism as well as bad optimism. … There is hope. But for it to be real, and barbed, and tempered into a weapon, we cannot just default to it. We have to test it, subject it to the strain of appropriate near-despair. We need utopia, but to try to think utopia, in this world, without rage, without fury, is an indulgence we can’t afford.
It’s science fiction, of course, that is best positioned to offer this to us most directly, and indeed already does — in Miéville’s own work, in the “ambiguous utopia” of Ursula K. Le Guin’s The Dispossessed, in the stress-testing futurism of Kim Stanley Robinson’s Mars trilogy.
What the same impulses might look like in the less speculative environs of literary fiction is harder to imagine, though it is almost certainly not the voguish minimalist autofiction of the 2010s. Contemporary authors, few of whom dare to dream of winning fame or fortune in any great measure, have all but given up on the panoramic social novels of generations past.
The most famous American novelist of the last three decades, Jonathan Franzen, spent the first phase of his career trying and failing to win national acclaim with paranoid, DeLillo-esque systems novels, ultimately breaking through only after a philosophical shift he chronicled in a 1996 Harper’s essay. “Expecting a novel to bear the weight of our whole disturbed society — to help solve our contemporary problems — seems to me a peculiarly American delusion,” Franzen writes. “As soon as I jettisoned my perceived obligation to the chimerical mainstream, my third book began to move again.” The Corrections and Freedom don’t neglect the cultural, political and economic contexts in which they take place — “as if, in peopling and arranging my own little alternate world, I could ignore the bigger social picture even if I wanted to,” Franzen assures the reader — but are principally concerned with the fraying family relationships at their centers. Near the end of the Harper’s essay he quotes a letter he received from DeLillo himself: “If we’re not doing the big social novel fifteen years from now, it’ll probably mean our sensibilities have changed in ways that make such work less compelling to us.”
That’s a depressing thing to read 25 years after the publication of Underworld, DeLillo’s sprawling, time-hopping American epic and likely his last great novel. Immersed in its pages, tracing its narrative through New York baseball stadiums and New Mexico missile ranges, through mystical hippie reveries and dreary dot-com banalities, through the fiscal breakdowns of the 1970s and the free-market delusions of the early 1990s, there’s very little that’s more compelling to me than the idea of a novel that could situate and intertwine the events of our last turbulent quarter century into a similarly sweeping and continuous national mythos. Instead, we seem to have developed a powerful collective mental block against dramatizing our present and our recent past in all but the most detached and fragmented ways.
Underworld is loosely organized around the fate of a McGuffin-like piece of sports memorabilia, the pennant-winning home run ball slugged by the New York Giants’ Bobby Thomson in 1951’s Shot Heard ’Round the World. When we meet the aging collector who’s spent a lifetime tracing the ball’s provenance, his own memory is fading, and he admits to a buyer that in meticulously documenting the chain of ownership he has come up just short, failing to prove beyond doubt that the artifact is genuine. In the end, the novel can show us the missing link, but can it make us believe? There are things that scientific inquiry won’t ever be able to determine for us. There are things that technological progress won’t ever deliver. In the endless web of conditionality there may be no fixed absolutes, but in the telling, in the searching, in the relations and crosscurrents and interdependencies that arise from it, we can still find meaning and truth. The postmodern theorists called this rhizomatic. Rovelli might call it entanglement.
In 2021, Franzen published Crossroads, his best-reviewed work since The Corrections and the first in a planned trilogy titled “A Key to All Mythologies.” That’s a joking reference to the fictional unfinished opus of Middlemarch’s haughty Reverend Casaubon, and Franzen continues, at least in public, to disown “all the po-mo hijinks and the grand plot elements” and instead embrace his identity as a “novelist of character and psychology.” But while Crossroads shows its author keeping his focus on the intimate tragedies spiraling outward from a single American family, the promised scope of his undertaking — a saga “mirroring the preoccupations and dilemmas of the United States from the Vietnam War to the 2020s,” per his publisher — betrays a welcome flash of the old Systems ambition.
If the crack-up ever ends, we’re going to have to put ourselves back together. We’re going to need literature that is constitutive of something new, that aims unabashedly for the totemic and the totalizing. We’re going to need speculative epics and big social systems novels and multiversal space operas and panoramic historical fiction and audacious attempts at the G.A.N. — not just reflexive and arguably self-serving paeans to the Power of Storytelling, but actual grand narratives with content and bite and blazing moral purpose. We need monument and myth. We need to believe again in the possibility of progress, and we need artists and authors who can show us the way.
In a country that has only become vastly more fragmented and ridiculous in the last half century, capturing “nothing less than the soul of America” in the pages of a novel, as James Michener tried and failed to do, seems an even more impossible task. It would take massive, miraculous talent to even partially succeed in such an endeavor, and even if he were interested, it would probably not be achieved by an aging white male birdwatcher with a penchant for badly misapprehending urgent political issues. But why shouldn’t our most brilliant young novelists dare to believe that they might be capable of it? Why can’t we sustain all the objections of postmodernism, accept all its valid critiques of monocratic moralism and technocratic folly, while striving anew to make one out of many?
If there are still literary critics a hundred years from now, it seems likely that Franzen will be viewed as a transitional figure, with one foot in the postmodern past and another in whatever’s next. Less certain is how far he’ll go in helping discover and define what that might be. Crossroads, successful as it is, has the benefit of taking place in 1971, on the precipice of the slow national disintegration that has bedeviled American culture ever since. We don’t know yet how its two planned sequels will carry the Hildebrandts’ struggles to live full and virtuous lives forward into the chaotic flood of information and identity we’re submerged in today. Perhaps we shouldn’t be surprised if it takes many years for Franzen to figure it out, if the results are less widely acclaimed, or even if, like its fictional namesake, the trilogy is never finished at all. The work will go on. The searching will go on.
“If we take utopia seriously, as a total reshaping, its scale means we can’t think it from this side,” writes Miéville. “It’s the process of making it that will allow us to do so.” Franzen is almost certainly correct to say that we can’t expect literature to solve big contemporary social problems. But maybe a society that produces literature that tries is a society capable of solving its problems with or without literature’s help.
7. The Setting (reprise)
Not far from my house in Denver, there are two perfectly nice parks, situated almost adjacent to one another along the banks of the Platte River. One, Grant-Frontier, commemorates a real historical event that actually happened: the founding of the 1858 Montana City camp, the first chartered white settlement in the Denver area, which lasted less than a year before its inhabitants packed up and moved seven miles downriver to the confluence with Cherry Creek.
The other park, just across Evans Avenue to the north, is named Pasquinel’s Landing.
There’s no explanatory marker here, no sun-bleached still of Robert Conrad in his fur-trapper costume, no plaque memorializing James Michener and his best-selling endorsement of the nascent green movement of the early 1970s — only some hardscaping and disused fitness equipment, a path that winds down to a narrow gravel bar, the exposed riverbank that will bake in the sun until a half dozen newly planted cottonwoods can grow into a decent overstory. Maybe there’s no better monument, in the end, than the river itself.
For over half a century now, since the passage of the Clean Water Act, it has been the official policy of the U.S. federal government that the Platte, along with every other waterway in the country, should be swimmable and fishable. The law has produced a biblical flood of regulatory proceedings and scientific analyses, complaints and rulings and appeals and case law, master plans and environmental reviews and feasibility studies and open letters and newspaper features and annual reports and grant applications and job postings and fundraising emails and ArcGIS maps and memoranda of understanding — and at the end of it all there is a river, still moving sluggishly between broken concrete and invasive vegetation, still contaminated by arsenic and E. coli and phosphorus, the early progress that was made in its cleanup now stalled. A decade ago, Denver’s mayor made another promise to make the Platte swimmable by 2020. This time, when the deadline passed, the city didn’t bother to set a new one.
It’s a difficult thing, it turns out, to restore an entire river to good health after more than a century of abuse. In the mountains there are wildfires and debris flows, and in the foothills there is sprawl and erosion. In the city there are Superfund sites and storm drains and streets littered with waste, and in the fields to the north and east there is agricultural runoff full of pesticides and fertilizer nitrates. The urban planners of the postmodern era may have dreamed of building multitudes of “utopias in miniature,” but as long as there are still rivers that run through them all, there will never be any such thing.
It matters what kinds of stories we tell ourselves. But when we’re done, there will always be the river that may or may not be safe to swim in, the air that may or may not be toxic to breathe, the planet that may or may not become uninhabitably hot. The collapse of Pruitt-Igoe and other midcentury public housing projects wasn’t a failure of architectural modernism — it was the inevitable product of austerity and neglect as governments deprived the projects of the resources they needed and showered them on the suburbs instead. Meanwhile, northern India’s Le Corbusier-designed Chandigarh, which in the latter half of the 20th century continued to receive the investment due an important postcolonial national project, remains one of the country’s wealthiest and happiest cities.
Should it really surprise us that our politics and our culture have been stuck for so long in the same superstructural cul-de-sacs? Have we ever really escaped the crises of the 1970s? Were any of the material problems at the base of American society that Michener and his hero Paul Garrett sought to grapple with — energy shocks, ecological degradation, economic inequality, a violent backlash to multiracial democracy — ever at any point settled, or were they just deferred, transmuted, strung along on credit, buried under mountains of information-age abstraction?
If you go looking for them, you can find reasons to hope — reasons to believe that we might manage to stumble our way through all the artificial noise and come out on the other side with a new appreciation for the meaningful and the real. For the moment, at least, there is a sense that Silicon Valley has exhausted its power to bedazzle us, its dreams for the metaverse derailed, its investors less eager to light cash on fire for the sake of cartoon ape JPEGs and profitless Uber-for-everythings, its billionaire overlords more liable to be publicly recognized not as visionaries and geniuses but as the hucksters, sociopaths and lottery-winning mediocrities they are. In parallel with the contractions and financial reckonings faced by Big Tech, media companies and other engines of the knowledge economy, for the first time in decades there are bright spots in American manufacturing and heavy industry, guided in part by a national green-development policy that, while 50 years overdue and far from equal to the scale of the threat posed by climate change, may represent the start of a genuine course correction. There’s a worldwide urgency now behind collective, democratic visions of sustainable prosperity, an abundance of clean energy and housing for all, a utopia that we won’t find in a book or a VR headset but will have to mold from the clay of the earth with our own bare hands.
Of course, as comforting as it would be to believe that the answer is as simple as logging off — that to stand in the mud and drink the wild air and feel your body heave and ache is itself a radical act of resistance to the system — we can’t afford that delusion, either. There’s nothing the system has proven more capable of, over the centuries, than control of the land beneath our feet, the fruits of its soil, the resources it hides underground. Even as Silicon Valley has burned through more and more of their capital with less and less to show for it, even as the world’s economic output has become increasingly dominated by artifice and ephemera and measurable only in lines of code or cells in a spreadsheet, the richest and most powerful people on earth have consolidated their hold over the raw essentials of life. Census data shows that U.S. homeownership has sunk to its lowest rate in 50 years, while in 2019, the Land Report found that the country’s 100 largest private landowners controlled 42 million acres nationwide, a 50% jump in just over a decade. A 2021 McKinsey report calculated that even at this late date, more than two-thirds of global wealth is made up of real estate. Infrastructure and inventories of physical goods account for nearly all of the rest, while “intangibles” — all the software we rely on, every word ever written, every painting and film and TV series and video game ever made, every binary bit ever magnetized onto the disk of a hard drive, the sum total of the intellectual property produced by 10,000 years of human civilization — make up just four percent.
There’s a plan on the books, of course, to continue restoring the little section of the Platte River that passes by Pasquinel’s Landing. Channels need regrading, riparian buffers need widening, drop structures that impede fish need modifying. All told, it would cost about $12 million, a sum equal to the value of the economic activity generated by the Denver metro area roughly once every 20 minutes. Five years after its approval, the plan from the U.S. Army Corps of Engineers remains unfunded.
Another, much larger, much more expensive plan developed by the Corps, though, is already underway along a 6.5-mile stretch of the river to the north. Securing a federal outlay to cover two-thirds of the cost of the $550 million South Platte River and Tributaries project was a top priority for city, state and congressional leaders. One clue as to why is buried in the list of signatories to one of those memoranda of understanding: the name Revesco Properties, a front of sorts for billionaire Stan Kroenke, fifth on the latest edition of the Land Report 100 list with 1.63 million acres to his name. Sixty-two of those acres lie along the Platte just south of downtown Denver, and though they’re currently occupied by parking lots and the Elitch Gardens amusement park, Kroenke and Revesco have laid out a long-term vision for the site’s redevelopment into the River Mile, a massive new mixed-use district in the heart of the city.
Revesco’s CEO says the project “can be a catalyst for change,” and in a way it’s an honest-enough pitch. If the Corps does its restoration work and the River Mile vision is fully realized, it will provide 8,000 units of high-density infill housing in a city that badly needs it, and achieve the “rediscovery and revitalization of the South Platte” for generations to come. It will also make a bunch of Denver’s richest and most powerful people that much richer and more powerful. “We live in utopia,” notes Miéville. “It just isn’t ours.”
Is this the only kind of grand vision we’ll ever be capable of again? Can we imagine broad, transformative social benefits as anything other than the trickle-down byproducts of the system’s relentless drive to concentrate ever-larger pools of wealth at the top of our society? “A planner’s mission is to imagine a better world,” writes the urban planner Samuel Stein in his 2019 book Capital City, “but their day-to-day work involves producing a more profitable one.”
A few weeks ago I walked along the Platte from Pasquinel’s Landing to the future River Mile and back. After a wet spring the river was as high as I’d ever seen it, but I saw no rafters, kayakers, tubers or paddleboarders all day. A couple of anglers picked out spots to fish for walleye and smallmouth bass that would be dangerous to eat, and at Confluence Park a few people, defying official warnings, waded in and splashed around in the shallows. For almost three years now, bikers and pedestrians following the Greenway trail along the river near Vanderbilt Park have been forced to take a mile-long detour through the industrial zone to the west, because the city couldn’t come up with the cash to fix a collapsing retaining wall. There are other detours, and washouts, and thick stands of invasive Siberian elm, and places where people still illegally dump tires and other trash. At the end of it all there is a river, inconstant and unpredictable but flowing forever downhill. Stand at its edge and watch the water in motion on its surface, the muddy churn and wisps of white foam and pale reflections of sky, and by the time you’ve taken it all in, it will be gone around the bend.
8. The Author (reprise)
Entering his late sixties as he finished Centennial, James Michener at one point allowed himself to believe that it might be his last major novel, according to John Hayes. In fact, in terms of published work, he hadn’t even hit his career’s midpoint. The 1978 publication of Chesapeake launched the most prolific period of his life, a decade during which he published a novel with an average length of 717 pages at a pace of once every 15 months.
That Michener, ever the ingenuous modernist, never thought too deeply about the implications of the new information economy and its relentless commoditization of culture probably owes a great deal to the fact that, more than anyone else in American letters at the time, he was its star and beneficiary. At the height of his literary fame in the 1980s, the demand for Michener’s epics was such that when he wrote Texas and Alaska it was at the direct urging of those states’ governors, who arranged to furnish him with university office space and any other research assistance he needed. Fittingly, Texas, his longest work at 1,096 pages, broke a record for hardcover fiction with a first printing of 1.1 million copies — and was judged even by his fans as a step down in quality. Wrote a Los Angeles Times reviewer: “Recent Michener fare isn’t so much high literature as it is business — big business, pardner.”
This is the Michener that is and will be remembered, for as long as these monolithic titles continue to collect dust on grandparents’ bookshelves across the country — not the author of Tales of the South Pacific, not the roving Reader’s Digest correspondent, but Michener, Inc., efficiently producing what the system demanded of it. His historical meganovel format lent itself to collaboration and routinization — “it wasn’t ‘written,’ it was compiled,” said the New York Times’ review of Centennial — and evidence suggests that the amount of juice that could be squeezed out of Random House’s most bankable brand wasn’t something left entirely up to one septuagenarian’s work ethic. Biographer Stephen May writes of at least two instances, most notably 1980’s The Covenant, in which Michener was “guilty of excessive collaboration without proper acknowledgement,” and rumors that other late-period works were wholly or partly ghostwritten have persisted.
As any writer or actor on the picket line in Hollywood this summer can attest, there’s no limit to the level of automation the system will pursue in the production of culture, as long as it thinks there’s profit in it. It’s perhaps a final damning comment on Michener’s merits as an author that it’s quite easy to imagine, at some point in the near future, entirely new works bearing his name and trademark style being assembled by an A.I. language model. And maybe it’s a hopeful sign for the long-term survival of real, human art in that future that — just one generation removed from Michener’s peak as a bestseller-list juggernaut — it’s hard to imagine a market would exist for Random House or anyone else to bother with the effort.
Is that, then, the lot of the solitary author, the artist, the architect — to do their best to outrun the machinery of the assembly line and the computations of the algorithm, straining to stay always one step ahead of assimilation and annihilation, forever? “Strange and almost traitorous as it sounds,” writes Mark McGurl, “the monumental waste of internet-enabled literary history… clear[s] conceptual space for a world that doesn’t need so much fiction, at least not as we know it.” If that sounds a little too bleak, it also treats literature as the solemn and impossible responsibility that it is, the miracle of leaving one’s mark armed only with the power of language, which is only the power to make the world more abstract and conditional, to make one thing stand for another, to draw lines between what was and is and ought to be.
That power ends at the question of what will be done with the lines once they’re drawn: how the blueprints will be executed, the chains of action and reaction they will provoke in the audience. As the age of hyperconnection dawned in the mid-1990s, DeLillo was fixated on these limits in the final pages of Underworld:
You try to imagine the word on the screen becoming a thing in the world, taking all its meanings, its sense of serenities and contentments out into the streets somehow, its whisper of reconciliation, a word extending itself ever outward, the tone of agreement or treaty, the tone of repose, the sense of mollifying silence, the tone of hail and farewell, a word that carries the sunlit ardor of an object deep in drenching noon, the argument of binding touch, but it’s only a sequence of pulses on a dullish screen and all it can do is make you pensive.
The DeLillo of a decade earlier had ended White Noise preoccupied by death from above and within, the postmodern paranoias of the nuclear age, the invisible apocalypses of ICBMs overhead or pesticides in the water. Underworld, by contrast, concludes with a dying character’s ascension into a kind of afterlife — though “she is in cyberspace, not heaven,” and its fears and anxieties are no less acute. It’s 1997, the Cold War is over, and it’s nice to imagine that the computerized images of hydrogen-bomb explosions don’t have quite the same power they once did. The word on the screen, DeLillo isn’t afraid to tell us, is peace. But what will it cost us?
There’s little ideological determinism in an artist’s skill at their craft, the influences they draw on, the movements they’re a part of — and there’s more than one way for a legacy to be diminished. John Dos Passos, the prose stylist Michener could never hope to be, the oracle of an alienating and cacophonous postmodernity he wouldn’t live to witness, is far less remembered today than Hemingway, Fitzgerald and the other Lost Generation writers he once stood shoulder-to-shoulder with, a development attributable in part to a dramatic political conversion that disappointed many of his contemporaries. Soon after publishing the final book in the U.S.A. trilogy, Dos Passos traveled to Spain in 1937, where he became one of many Western artists and intellectuals to grow disillusioned with the left as a result of the Spanish Civil War. Unlike others, however, his rightward shift didn’t stop at anticommunist liberalism, and by the 1960s, the final decade of his life, Dos Passos was a committed conservative, contributing to National Review and penning his final novel, Midcentury, in which he attempted to once again wield the experimental techniques and satirical eye of U.S.A., this time not against big business but against union bosses, left-wing academics and New Deal bureaucrats.
In truth, Dos Passos seems to have lost interest in fiction in the postwar years, devoting himself instead to polemical memoirs and works of political history, paeans to the early American republic with titles like The Men Who Made the Nation and The Head and Heart of Thomas Jefferson. In the Founders, an aging ex-anarchist could find an example of men whose radical energies were ultimately fused with a counterrevolutionary vigilance rooted in classical fears of civilizational decline. “Cogitations in a Roman Theatre,” an essay published the year Dos Passos enthusiastically endorsed Barry Goldwater’s presidential campaign, featured the author brooding over the possibility that the country for which he’d developed such an unlikely patriotic fervor was headed for a decline and fall like Rome before it. “For a number of generations we thought the Declaration of Independence had repealed history,” he wrote. “[But] one of the teachings of history is that whole nations and tribes of men can be crushed under the imperial weight of institutions of their own devising.”
We’ll never know if Dos Passos would have fully welcomed the transformations that began to rock the world shortly after his death in 1970 — the triumph of Goldwater’s vision over the New Deal consensus, of the West over the East, of global capitalism over any alternative — or if, witnessing their late effects on his idealized “common man,” he would have once again felt as betrayed as he did in Madrid in 1937. But in light of these events, his doomsaying rings hollow, foreshadowing the catastrophizing conspiracism of a postmodern American right that warns incessantly of dystopian collapse but only ever seems to get everything it wants. Dos Passos and other midcentury conservatives, united above all by the fight against communism, got the peace they wanted. But what did it cost?
Michener, too, felt a sense of foreboding about America’s future late in his life, a feeling that is vaguely present in Centennial’s final chapter but gradually solidified and sharpened in the years leading up to 1996’s This Noble Land, a short political tract and the last of Michener’s works published during his lifetime. Though his fiction rarely left readers with any doubts as to his political sympathies, This Noble Land offers one final, unadorned look at Michener’s worldview: his centrist liberalism, his love of country, his undeterred high-modernist faith in rationally designed systems. Each chapter finds the author reckoning with an America that, noble though it was, found itself plagued by “problems” — racial inequality, gun violence, deteriorating educational and health-care systems, the decline of the industrial working class — that needed to be “solved.” Surely there was no question facing the nation that couldn’t be answered by the plainspoken, scratch-paper reckonings of a man who, before he was an author, had come to understand the world as a hard-working pedagogue, and before that as a curious boy in Doylestown, educating whoever would listen about Henry Mercer’s mansion and its superior, scientifically supported concrete construction.
The Michener of the mid-1990s, eager as always to be the reasonable consensus-builder, was an unsurprising ally of Clinton-era triangulation on issues like crime and welfare, and the parts of This Noble Land that have aged the worst are those dealing with topics like the O.J. Simpson trial and the “assault on the family,” along with one especially muddled section on Charles Murray and The Bell Curve. But in the wake of 1994’s Republican Revolution, Michener seems to have recognized, somewhat belatedly, the toppling of the political and economic order under which he had spent his entire adult life. “The social revolution that Franklin Delano Roosevelt engineered in 1932 and that Lyndon Johnson extended in the 1960s was largely to be scuttled,” he wrote, “with the major result that citizens at the lower end of the economic hierarchy would be deprived of many of their aid programs while those at the upper end would watch their fortunes grow.”
Alas, there was as little unique insight into the American experience in Michener’s farewell manifesto as there had been in any of his novels. But there is something surprising — in both, perhaps — hiding in plain sight. What is it, after all, that Centennial’s final chapter is an “elegy” for? It may be too easy to mistake Michener’s earnestness, his surface-level prescriptions for what ailed (and still ails) his country, for dewy-eyed patriotic optimism. His words were a rationalist’s proposition, a series of if-then statements, laying out a path towards a more just and sustainable future and making plain, logically sound arguments for why the country should take it. By the end of his life, Michener was hardly brimming with faith that it would, in fact, be the path the country chose. The man who had invested so much in America’s second centenary didn’t expect that it would have much to celebrate at its third — and even dared to doubt that it would survive to celebrate it at all.
The cosmopolitan Michener echoed Dos Passos on the lessons to be drawn from the full sweep of history. “I have spent my adult life studying the decline of once great powers whose self-indulgent errors condemned them not only to decline but in many cases to extinction … I have never thought that we were exempt from that rule of destiny,” he wrote in This Noble Land, reiterating a point he had expressed in speeches and interviews throughout the 1980s. But in Michener’s writing there is none of Dos Passos’ paranoid melodrama, none of the irascibility of a repressed radical. There are only the patient, empirical computations of a modern man of reason — measurements taken, equations balanced, conclusions reached.
At best, he figured, the U.S. could hope to prolong its status as the world’s dominant superpower until the middle of the 21st century. “With dedication to the principles that made us great, we can at least borrow time,” Michener mused diffidently in the book’s final paragraph, among the very last words written for publication by one of the most prolific authors in American history. “Clear sailing — albeit through increasingly roiled waters — till 2050, then the beginning of twilight.” If, on the other hand, the wrong choices were made, Michener predicted this decline would be hastened by one or more self-inflicted calamities, beginning with a “looming racial crisis” that he estimated would erupt “in the early years of the next century.” The rise of extremist militia groups, together with talk-radio propagandists and Republican demagogues, he judged, portended a “large movement that … endangers the democracy.” Plainest of all to him were the widening wealth gaps that threatened the very foundations of the country’s social contract. “If my extrapolations are correct,” he wrote, “I would expect that some time in the next century something will snap and there will be a violent upheaval.”
Michener was no oracle; though he can be excused for not foreseeing events like 9/11, the War on Terror and the COVID-19 pandemic, there are plenty of world-shaping forces missing from his book — the internet, climate change, immigration and nativist backlash — that a more perceptive analyst writing in 1996 might not have overlooked. But it is still striking to find one of history’s most popular chroniclers of American life, a decorated member of its milquetoast, middlebrow cultural firmament, preoccupied in the triumphalist Decade About Nothing by visions of the country’s decline and fall. And it would be difficult, a generation later, to dismiss out of hand the broad strokes of Michener’s prognosis, even if, in imagining Japan as the nation most likely to supplant the U.S. as global hegemon, he may have picked the wrong rising power in East Asia, or if, lacking familiarity with the next century’s most catastrophic tail risks, he was wrong to assume that the mantle would be taken up by any country at all.
What, indeed, will peace cost us? The river flows, the gears of the machine keep turning, things arise and cease only to arise again, empty in and of themselves, inescapably interdependent with one another. There is no hope of the absolute, and no comfort to be found in abstraction. When Michener died in 1997, at the age of 90, he was eulogized by his publisher — that is, by the system — as “America’s storyteller.” But what any good storyteller knows is that in the fullness of time, pressed in upon by the madness of infinite possibility, the most powerful thing a story can do is end.