Zócalo Public Square – History

How Dodge City Became the Ultimate Wild West
By Robert R. Dykstra and Jo Ann Manfra | January 22, 2018

Everywhere American popular culture has penetrated, people use the phrase “Get out of Dodge” or “Gettin’ outta Dodge” when referring to some dangerous or threatening or generally unpleasant situation. The metaphor is thought to have originated among U.S. troops during the Vietnam War, but it anchors the idea that early Dodge City, Kansas, was an epic, world-class theater of interpersonal violence and civic disorder.

Consider this passage from the 2013 British crime novel, Missing in Malmö, by Torquil Macleod:

“The drive to Carlisle took about twenty-five minutes. The ancient city had seen its fair share of violent history over the centuries as warring Scots and English families had clashed. The whole Border area between the two fractious countries had been like the American Wild West, and Carlisle was the Dodge City of the Middle Ages.”

So, just how bad was Dodge, really, and why do we remember it that way?

The story begins in 1872, when a miscellaneous collection of a dozen male pioneers—six of them immigrants—founded Dodge astride the newly laid tracks of the Atchison, Topeka and Santa Fe Railroad. The town’s early years as a major shipping center for buffalo hides, its longer period as a “cowboy town” serving the cattle trails from Texas, and its easy accessibility by rail to tourists and newspaper reporters made Dodge famous. For 14 years, the media embellished the town’s belligerence and bedlam—both genuine and created—to produce the iconic Dodge City that was, and remains, a cultural metaphor for violence and anarchy in a celebrated Old West.

Newspapers in the 1870s crafted Dodge City’s reputation as a major theater of frontier disorder by centering attention on the town’s single year of living dangerously, which lasted from July 1872 to July 1873. As an unorganized village, Dodge then lacked judicial and law-enforcement structures. A documented 18 men died from gunshot wounds, and newspapers identified nearly half again that number as wounded.

But the newspapers didn’t merely report that news: They interwove it with myths and metaphors of the West that had emerged in the mid-century writings of Western travelers such as Frederick Law Olmsted, Albert D. Richardson, Horace Greeley, and Mark Twain, and in the “genteel” Western fiction of Bret Harte and its working-class counterpart, the popular yellow-back novels featuring cowboys, Indians, and outlaws.

Consequently, headlines about seriously lethal doings in Dodge echoed the make-believe West: “BORDER PASTIMES. THREE MEN BORED WITH BULLETS AND THROWN INTO THE STREET”; “FROLICS ON THE FRONTIER. VIGILANTES AMUSING THEMSELVES IN THE SOUTHWEST . . . SIXTEEN BODIES TO START A GRAVEYARD AT DODGE CITY”; “TERRIBLE TIMES ON THE BORDER. HOW THINGS ARE DONE OUT WEST.”

One visiting reporter remarked that “The Kansas papers are inclined to make mouths at Dodge, because she has existed only one month or thereabouts and already has a cemetery started without the importation of corpses.” Another quipped, “Only two men killed at Dodge City last week.” A joke circulated among Kansas weeklies: “A gentleman wishing to go from Wichita to Dodge City, applied to a friend for a letter of introduction. He was handed a double-barreled shot-gun and a Colt’s revolver.”

Ken Curtis and James Arness in “Gunsmoke,” the hit TV show that popularized Dodge City’s Wild West aura. Photo courtesy of Wikimedia Commons.

The bad news out of Dodge made its major East Coast debut in 10 column inches in the nation’s then most prestigious newspaper, the late Horace Greeley’s New York Tribune. Titled “THE DIVERSIONS OF DODGE CITY,” it condemned the village for the lynching of a black entrepreneur. “The fact is that in charming Dodge City there is no law,” it concluded. “There are no sheriffs and no constables. . . . Consequently there are a dozen well-developed murderers walking unmolested about Dodge City doing as they please.”

Conditions of well-publicized anarchy, though they sold out-of-town papers, were not what Dodge City’s business and professional men wanted. From the town’s founding they had feared more for their pocketbooks than for their lives. Their investments in buildings and goods, to say nothing of the settlement’s future as a collective real-estate venture, stood at risk. For their common business enterprise to pay off, they had to attract aspiring middle-class newcomers like themselves.

And so, in the summer of 1873, Dodge’s economic elite wrested control of the situation. The General Land Office in Washington at last approved its group title to the town’s land and the electorate chose a slate of county officers, of whom the most important was a sheriff. Two years later Kansas granted Dodge municipal status, authorizing it to hire a city marshal and as many assistant lawmen as needed.

From August 1873 through 1875 apparently no violent deaths occurred, and from early 1876 through 1886 (Dodge’s cattle-trading period and during its ban on the open carry of sidearms), the known body count averaged less than two violent deaths per year, hardly shocking. Still, the cultural influence of that infamous first year has colored perceptions of the settlement’s frontier days ever since. Part of the reason was a Swedish immigrant, Harry Gryden, who arrived in Dodge City in 1876, established a law practice, inserted himself into the local sporting crowd, and within two years began penning sensationalist articles about the town for the nation’s leading men’s magazine, New York’s National Police Gazette, known as the “barbershop bible.”

In 1883 a Dodge City reform faction briefly assumed control at City Hall and threatened to start a shooting war with professional gamblers. Alarmist dispatches, including some by Gryden, circulated as Associated Press stories in at least 44 newspapers from Sacramento to New York City. The Kansas governor was preparing to send in the state militia when Wyatt Earp, arriving from Colorado, brokered a peace before anyone got shot. Gryden, having already introduced both Earp and his friend Bat Masterson to a national readership, penned a colorful wrap-up for the Police Gazette.

With the end of the cattle trade at Dodge in 1886, its middle-class citizenry hoped that its bad reputation would at last subside. But interest in the town’s colorful history never disappeared. This enduring attention eventually led to Dodge’s inauguration in 1902 as a staple item in the upscale mass-circulation magazines of the new century, including the very widely read Saturday Evening Post.

With that, the dangers of Dodge became a permanent commodity—a cultural production that was retailed to a primary market of tourists, and wholesaled to readers and viewers. Thereafter writers catering to the public’s fascination with the town’s violent reputation seemingly tried to outdo one another in lurid generalizations: “In Dodge . . . the revolver was the only sign of law and order that could command respect.” And: “The court of last resort there was presided over by Judge Lynch.” And: “When one was ‘bumped off,’ the authorities just hustled the body out to Boot Hill and speculated upon what else the day would bring forth in bloodshed.”

Dodge’s local handful of yarn-spinners endorsed such nonsense, and bogus estimates of those interred on Boot Hill ranged from 81 to more than 200. By the 1930s the town’s consensus had settled on 33, a number that included victims of illness as well as violence—but a best-selling biography of Wyatt Earp, published in 1931 by the California writer Stuart Lake and still in print, boosted the body count back up to 70 or 80. Lake’s book’s success, a burgeoning auto-borne tourism, and the Great Depression’s severe economic effect on southwest Kansas collaborated in wiping out any remaining local resistance to memorializing Dodge City’s bygone days.

Movies and then television also got into the act. As early as 1914, Hollywood had discovered the old frontier town. In 1939 Dodge got major film treatment. But it was a TV series set in Dodge that ensured its continuing cultural importance. “Gunsmoke” entertained literally millions of Americans for a phenomenal twenty years (1955-1975), becoming one of the longest-running prime-time serials ever aired. Ironically, although the hour-long weekly program appears to have prompted the “Get outta Dodge” trope, the population of Hollywood’s Dodge was an interesting soap-opera collaborative of reasonable citizens beset with weekly onslaughts of assorted trouble-making outsiders. It was a dangerous place only because of the people who did not live there.

Imaginary Dodge is still hard at work helping Americans chart their moral landscape as the archetypal bad civic example. Inserted into the national narrative, it promotes belief that things can never be as dreadful as they were in the Old West, thereby confirming that we Americans have evolved into a civilized society. As it reassures the American psyche, the Dodge City of myth and metaphor also incites it to celebrate a frontier past brimming with aggression and murderous self-defense.

When a Fiery Populist Inflamed the Nation—but His Political Rivals Won the War
By David S. Brown | January 12, 2018

Two centuries after he served as president, Andrew Jackson remains an enduring figure both in history—the 1820s and ’30s are known as “The Age of Jackson”—and in American political conversation, with Donald Trump associating himself with Old Hickory’s nationalism and populism.

Jackson’s contemporary notoriety, however, far exceeds his actual impact. To be sure, he remains well known for his “war” on the Second National Bank of the United States and for signing the Indian Removal Act, which resulted in the forcible eviction of thousands of Native Americans from their homes to lands west of the Mississippi River. But soon after he left the presidency, Jackson’s way of social, economic, and racial thinking was eclipsed—and it is not likely to be durably revived, by Trump or anyone else.

The true long-term winners of the “Age of Jackson” were actually his opposition—the Whig Party. After the 1860s, the Jacksonians’ articles of faith—agrarianism, states’ rights, and slavery—were relegated to history’s ash heap. It was the priorities of the Whig Party—the short-lived moderate party of antebellum America—that prevailed, and shaped the world to come.

The Whigs did not capture as many presidential elections as the Jacksonians (only two in five contests), rarely controlled Congress, and in 1854 dissolved into the Republican Party. But their party was in the forefront of modern American development in a way that the Jacksonians, a Southern-dominated coalition, never were. And thus its impact was both superior and lasting.

To say that the Whigs advanced a “modern” outlook is to note their support for what the Kentucky statesman Henry Clay called the American System, an economic plan to use the power of the central government to encourage internal improvements in the states. Canals, railroads, industry, and a more centralized banking system were to be the fruits of this program. Culturally, Whigs were known as the party of religious and educational reform. Typically, they were opposed to Indian removal and many were hostile to slavery. They comprised a coalition of entrepreneurs and evangelicals.

The Jacksonians’ unreflective emphasis on private capitalism and states’ rights came to a head in the disastrous Panic of 1837, which touched off a major recession that lasted for several years. The panic resulted from Jackson’s destruction of the Second National Bank, which had been a vital engine of development and financial regulation in the country. Jackson put the Bank on the road to extinction by vetoing an 1832 bill to recharter it, and then removing government deposits from the Bank. In the early 20th century, Congress would rectify his mistake—and acknowledge that Whiggery had gotten it right—with the creation of the Federal Reserve System.

During his Bank War, Jackson’s actions concerned many Americans. His rejection of the Bank bill was one of 12 presidential vetoes during his two terms—more than all the vetoes by the previous six presidents combined. Jackson sometimes refused to sign congressionally approved legislation that he merely disagreed with personally, which led to accusations that he was governing as a king rather than as a president.

Opposition to Jackson’s monarchal behavior was a big part of the Whigs’ identity. In fact, the very name “Whig” was chosen to align Jackson’s critics philosophically with the British Whig Party, thus portraying the Jackson camp as American Tories. In a similar vein, one of the Whigs’ biggest concerns was the growth of executive authority—a concern that was not only justified, it remains of vital importance in our republic today.

It’s also significant that Jackson’s opponent in two presidential elections, John Quincy Adams, is in many respects a more relatable figure to us than Old Hickory. His support for federal aid in economic development, his criticism of slavery, and his desire to see the United States move beyond a narrow agrarian states’ rights orientation were not always political winners in his own time. But they were positions that would push the Republican Party to victory in 1860, continuing in their modern permutations to inform economic and cultural conversations today. Perhaps that is why John Quincy Adams is such a hot topic for contemporary historians. Since 2013 no fewer than four major biographies—by Harlow Giles Unger, Fred Kaplan, James Traub, and William J. Cooper—have appeared; earlier this year the Library of America released an edition of The Diaries of John Quincy Adams.

Whigs were not perfect, of course—they could be elitist, patronizing, and condescending. Too much a party of the Anglo, the industrial, and the educated, Whigs alienated some voters. But they also believed in societal unity—a view that contrasted with the slash-and-burn tactics of Jackson, who often spoke of irreconcilable interests in America and seldom sought compromise. Northern Whigs in particular, who were closer to industry and further from slavery than their Southern colleagues, demonstrated a far keener understanding of the challenges facing the nation by taking centrist positions on the vital issues of the day—counseling reform rather than destruction of the Bank, advising peaceful border negotiations with Mexico rather than war, and seeking to limit planter expansion into the country’s western territories.

Jackson and his party were undeniably the “victors” of the 1830s. But the Jacksonian vision collapsed within a generation and the Republican Party under Abraham Lincoln (a former Whig) put together a new American system that included a protective tariff, subsidized industry, and promoted education. These planks—along with the destruction of slavery—definitively overturned the Jacksonian order. Old Hickory may have carried the day, but his more moderate opponents won the war.

The Cookbook That Declared America’s Culinary Independence
By Keith Stavely and Kathleen Fitzgerald | January 11, 2018

American Cookery, published by the “orphan” Amelia Simmons in 1796, was the first cookbook by an American to be published in the United States. Its 47 pages (in the first edition) contained fine recipes for roasts—stuffed goose, stuffed leg of veal, roast lamb. There were stews, too, and all manner of pies. But the cakes expressed best what this first cookbook had to say about its country. It was a place that acknowledged its British heritage, to be sure—but was ultimately a new kind of place, with a new kind of cuisine, and a new kind of citizen cook.

The recipe for “Queen’s Cake” was pure social aspiration, in the British mode, with its butter whipped to a cream, pound of sugar, pound and a quarter of flour, 10 eggs, glass of wine, half-teacup of delicate-flavored rosewater, and spices. And “Plumb Cake” offered the striving housewife a huge 21-egg showstopper, full of expensive dried and candied fruit, nuts, spices, wine, and cream.

Then—mere pages away—sat johnnycake, federal pan cake, buckwheat cake, and Indian slapjack, made of familiar ingredients like cornmeal, flour, milk, water, and a bit of fat, and prepared “before the fire” or on a hot griddle. They symbolized the plain, but well-run and bountiful, American home. A dialogue on how to balance the sumptuous with the simple in American life had begun.

American Cookery sold well for more than 30 years, mainly in New England, New York, and the Midwest, before falling into oblivion. Since the 1950s it has attracted an enthusiastic audience, from historians to home cooks. The Library of Congress recently designated American Cookery one of the 88 “Books That Shaped America.”

The collection of recipes, which appeared in numerous legitimate and plagiarized editions, is as much a cultural phenomenon as a cooking book. In the early years of the Republic, Americans were engaged in a lively debate over their identity; with freedom from Britain and the establishment of a republican government came a need to assert a distinctly American way of life. In the words of 20th-century scholar Mary Tolford Wilson, this slight cookbook can be read as “another declaration of American independence.”

The book accomplished this feat in two particularly important ways. First, it was part of a broader initiative, led by social and political elites in Connecticut, that advanced a particular brand of Yankee culture and commerce as a model for American life and good taste. At the same time, its author spoke directly to ordinary American women coping with everyday challenges and frustrations.

The title page of American Cookery. Image courtesy of Library of Congress.

American Cookery was a Connecticut project. There, a still mainly agricultural society of small independent farms was positioned to benefit from trading networks, near and far. But moving beyond mere subsistence farming required an openness to these new markets and to the world of commerce in general. Connecticut’s Federalist leaders were well-connected to influential newspapers, printers, and booksellers, and were able to promulgate a vision of an America where agriculture would flourish with the help of commerce—rather than in opposition to it.

Jeffersonians who disagreed with this outlook emphasized rural life as an end in itself. For them, the future of American society depended on the spread of the smallhold farmer, whose rustic simplicity would inoculate their fledgling country against the corrupting influence of the luxury to which Britain had succumbed.

The two camps took part in a public debate about luxuries—were they totems of prosperity or symbols of social decay? Some American thinkers, such as Joel Barlow, the author of the popular poem The Hasty Pudding, maintained that thoroughgoing simplicity should form the basis of American cooking and eating. But the Connecticut Federalists thought such asceticism left too little room for the aspirations of common people to improve their lot. These moderates preferred to encourage a kind of restrained gentility that would, in time, become the parlor rectitude of Victorian America. For those in the Federalist camp, encouraging education and the modest enjoyment of worldly goods would help build an enlightened society.

While their way of thinking was nothing if not temperate, the Connecticut Federalists promoted their views vigorously. They published Noah Webster’s popular Blue Back Speller (1783), the first American spelling book and primer, so called because of its cheap blue paper covers; Jedidiah Morse’s American Geography (1789), the first general compendium of political and geographic information about the new nation; as well as the writings of a literary circle known as the Connecticut Wits, whose poems allegorized the American Revolution and envisioned a glorious destiny for the new country. Many of these best-selling works were published by the firm of Hudson & Goodwin—which also published the first edition of American Cookery. Complementing this new American literary harvest were other ventures in locally-made goods. Imports were far from rare, but the message was clear: Everything—books, clothing, furniture, and even food—could be given an American slant.

With its new take on a practical topic, American Cookery caught the spirit of the times. It was the first cookbook to include foods like cranberry sauce, johnnycakes, Indian slapjacks, and custard-style pumpkin pie.

Moreover, Simmons had a keen understanding of the care that went into the construction of American household abundance. Behind every splendidly arrayed table lay the precise management of all the fruits and vegetables, meats and poultry, preserves and jellies, and cakes and pies that sustained the home and family—and American Cookery gave cooks and housewives tips for everyday cooking as well as occasions when the aim was to express greater gentility.

Simmons explained how to keep peas green until Christmas and how to dry peaches. She introduced culinary innovations like the use of the American chemical leavener pearlash, a precursor of baking soda. And she substituted American food terms for British ones—treacle became molasses, and cookies replaced small cakes or biscuits.

Above all, American Cookery proposed a cuisine combining British foods—long favored in the colonies and viewed as part of a refined style of life—with dishes made with local ingredients and associated with homegrown foodways. It asserted cultural independence from the mother country even as it offered a comfortable level of continuity with British cooking traditions.

American Cookery also carried emotional appeal, striking a chord with American women living in sometimes-trying circumstances. Outside of this one book, there is little evidence of Amelia Simmons’s existence. The title page simply refers to her as “An American Orphan.” Publishers Hudson & Goodwin may have sought her out, or vice versa: The cookbook’s first edition notes that it was published “For the Author,” which at the time usually meant that the writer funded the endeavor.

Whatever Simmons’s backstory might have been, American Cookery offers tantalizing hints of the struggles she faced. Although brief, the prefaces of the first two editions and an errata page are written in a distinctive (and often complaining) voice. In her first preface, Simmons recounts the trials of female orphans, “who by the loss of their parents, or other unfortunate circumstances, are reduced to the necessity of going into families in the line of domestics or taking refuge with their friends or relations.”

She warns that any such young female orphan, “tho’ left to the care of virtuous guardians, will find it essentially necessary to have an opinion and determination of her own.” For a female in such circumstances, the only course is “an adherence to those rules and maxims which have stood the test of ages, and will forever establish the female character, a virtuous character.” Lest the point somehow be missed, Simmons again reminds readers that, unlike women who have “parents, or brothers, or riches, to defend their indiscretions,” a “poor solitary orphan” must rely “solely upon character.”

The book appears to have sold well, despite Simmons’s accusation on the errata page of “a design to impose on her, and injure the sale of the book.” She ascribes these nefarious doings to the person she “entrusted with the recipes” to prepare them for the press. In the second edition she thanks the fashionable ladies, or “respectable characters,” as she calls them, who have patronized her work, before returning to her main theme: the “egregious blunders” of the first edition, “which were occasioned either by the ignorance, or evil intention of the transcriber for the press.” Ultimately, all her problems stem from her unfortunate condition; she is without “an education sufficient to prepare the work for the press.” In an attempt to sidestep any criticism that the second edition might come in for, she writes: “remember, that it is the performance of, and effected under all those disadvantages, which usually attend, an Orphan.”

These parts of the book evoke sympathy. Women of her time seem to have found the combination of Simmons’s orphan status and her collection of recipes hard to resist, and perhaps part of the reason lies in her intimations of evil as much as her recipes. When the pennywise housewife cracked American Cookery open, she found a guide to a better life, which was the promise of her new country. But worry and danger lurked just below the surface of late 18th-century American life, especially for women on the social margins. In a nation still very much in the making, even a project as simple as the compilation of a cookbook could trigger complex emotions. American Cookery offered U.S. readers the best in matters of food and dining as well as a tale of the tribulations facing less fortunate Americans—including, it seems, the “American Orphan” Amelia Simmons herself.

When Does a Garden-Variety Demagogue Become Dangerous?
By Thomas Weber | January 10, 2018

In the summer of 1923, Adolf Hitler realized he had a problem. Germany was in the midst of an extreme economic crisis that inspired widespread feelings of disaffection, worries about national and personal decline, a wave of anti-globalism, and the political turmoil that the 34-year-old Nazi leader had been longing for.

But for Hitler, this air of imminent national revolution had come too soon—because no one yet realized that he should be Germany’s natural leader.

This was his own fault. For years, he had steadfastly refused to be photographed and had not given anything about himself away in his speeches. Instead, he had relied solely on the power of his voice to create a following for himself. And while his carefully choreographed speeches had been sufficient to turn him into the enfant terrible of Bavarian politics, Hitler concluded that his chances of becoming the face, or at least a face, of the national revolution were close to nil if people did not even know what he looked like.

So he went to the opposite extreme—producing picture postcards of himself and distributing them widely.

Hitler’s radical recasting of his public image in 1923 went further than that—and said a great deal about the kind of leader he was aspiring to become. A garden-variety demagogue might have simply created an outsized image for himself, an inadvertent sort of cartoon. Hitler did something more sophisticated. He made the case for a new kind of leader, and created a semi-fictional alternative version of himself that would fit his own job description.

To sell the idea that he was Germany’s savior-in-waiting, and to boost his profile outside of Bavaria, he wrote a very short autobiography to be published together with a selection of his speeches. In the autobiography, he told the story of how his experiences as a young man provided him with revelations about the nature of politics that would allow him to save Germany from misery and make it safe for all times.

But publishing such a self-aggrandizing portrait would have repelled Germany’s traditional conservatives, so Hitler searched for a writer with impeccable conservative credentials willing to pretend to have written the book. Doing so would come with a double payoff: Hitler’s shameless act of self-promotion would be concealed, while the impression would be created that he already enjoyed widespread support among traditional conservatives.

This led Hitler to Victor von Koerber, a blue-eyed and blond young military hero and writer. A North-German aristocrat, von Koerber was attracted by the promise of a new conservatism fused with the youthful idealism of National Socialism.

The book—published under the title Adolf Hitler, sein Leben, seine Reden (Adolf Hitler: His Life and His Speeches)—was banned soon after publication, limiting its intended impact. Yet the book sheds light on how Hitler—in a moment ripe for demagoguery—managed to rise to the top against all odds.

Hitler often paid lip service to the myth—which historians have tended to believe to the present day—that he was only “a drummer” who was doing the bidding of others and had no ambitions to lead Germany into the future. But in the book, he put into the mouth of Koerber his own determination that he was “the leader of the most radically honest national movement […] who is ready as well as prepared to lead the German struggle for liberation.”

Hiding behind Koerber’s name, Hitler could get away with pronouncing himself Germany’s “messiah.” His autobiography-in-disguise repeatedly uses biblical language, arguing that the book should “become the new bible of today as well as the ‘Book of the German People.’” It also directly compares Hitler to Jesus, likening the purported moment of his politicization in Pasewalk to Jesus’s resurrection:

“This man, destined to eternal night, who during this hour endured crucifixion on pitiless Calvary, who suffered in body and soul; one of the most wretched from among this crowd of broken heroes: this man’s eyes shall be opened! Calm shall be restored to his convulsed features. In the ecstasy that is only granted to the dying seer, his dead eyes shall be filled with new light, new splendor, new life!”

Given that he wrote this stuff, the reason Hitler needed to pretend to be a mere “drummer” is simple: He had to square the circle. On the one hand, he desired to put himself in a position to head a national revolution. On the other hand, Germany’s conservatives had their own political ambitions. Hitler could only advance by pretending that he would be their tool, while attempting to create the impression that his support among them was already larger than it really was.

The Hitler of this episode belies the common misconception that he was a primitive, raging, and nihilistic dark elemental force. Rather, he was a man with an emerging deep understanding of how political processes, systems, and the public sphere worked. His study of propaganda techniques while serving in World War I had provided him with an appreciation for political narratives that would help him plot his way to power.

Getting Koerber to release his autobiography helped Hitler create a politically useful narrative. By making the case for a new kind of leader, without explicitly naming Hitler, it insidiously created the public perception of a gap that only he could fill: a man without a pedigree coming out of nowhere with an innate gift for seeing the hidden architecture of the world and hence for building a new Germany. In short, Hitler cleverly exploited the way the German political system and the public sphere worked, so as to build a place for himself.

Demagogues come in several varieties, from populists with no genuine core beliefs to ideologues of various political convictions. They include rational as well as irrational actors. Some are figures who know when to retreat to moderation, and others never know where to stop, thus planting the seed of their regime’s self-destruction. The problem is that it is only in hindsight that we can tell how any specific demagogue will develop.

Koerber and other conservatives thought that they could simply use Hitler. But they did not understand, at least in 1923, that the language and style of demagogues-in-the-making look very similar at the beginning, even as their inner selves vary greatly. Unlike many others, Koerber of course knew how clever a political operative Hitler was, but the young aristocrat could not really see into Hitler and misjudged him.

When confronted with emerging demagogues, in moments when people yearn for strongmen and novel kinds of leaders, history thus cannot tell us until it is too late whether an individual is a Hitler, a Franco, a Lenin—or, for instance, a populist who, while flirting with authoritarianism, ultimately manages to withstand its seduction.

Victor von Koerber eventually learned the hard way that the person he had imagined Hitler to be when lending his name to him was a very different man from the one who would rule Germany. He grew disillusioned with Hitler in the mid-1920s after seeing how he presented himself once his trial (in the wake of his failed putsch) had finally transformed him into a public figure.

In the late 1920s, Koerber began issuing warnings about the dangers Hitler posed to the world. But by then, it was already too late to stop him. Once the Nazi Party was in power, Koerber helped a prominent German Jew to get out of the country. And then Koerber began to feed the British military attaché in Berlin with intelligence. Koerber ultimately landed in one of Hitler’s concentration camps, which he barely survived.

Eisenhower’s Tax Policies Invested in the Future, Not the Few
By David Goldfield | December 19, 2017

“It’s a tax bill for the middle class. It’s a tax bill for jobs. It’s going to bring a lot of companies in. It’s a tax bill for business, which is going to create the jobs,” President Donald Trump told business leaders earlier this fall as the Republican Congress pushed through sweeping tax legislation.

None of that’s true, but it does raise the question: What would a tax policy need to look like to accomplish all this? Going back to an earlier Republican administration provides a striking example.

During the administration of President Dwight D. Eisenhower, from 1953 to 1961, the top income bracket in the United States climbed to a marginal tax rate of 91 percent. Taxes on corporate profits were two times as great as they are in 2017, and that’s before the current proposal to cut that rate to 21 percent. The tax on large estates rose to more than 70 percent. Businesses operated under a relatively high tax burden, and they employed a labor force in which one-third of the workers were unionized and bargained with executives as equals. Corporations served a diversity of stakeholders as opposed to stockholders. The result was a booming economy that benefited most Americans.

In 1955, Fortune magazine noted approvingly that the incomes of the top 0.01 percent of Americans were less than half what they had been in the late 1920s, and their share of total income was down by 75 percent. In the 1950s, the average corporate CEO received 20 times more compensation than the firm’s typical employee; by 2016, CEOs’ salaries averaged more than 200 times those of the average worker.

Americans in the 1950s enjoyed what economists called “the virtuous circle of growth”: Well-paid workers fueled consumer demand, which, in turn, generated business expansion and hiring, raising corporate profits, which produced higher wages and more hiring. A consumer culture flourished and, therefore, so did the economy. Fortune noted that, by the mid-1950s, the number of middle-class families was increasing by 1.1 million a year.

A key part of the prosperity equation of the 1950s was public policy—what the federal government accomplished with its tax revenues. The Eisenhower administration invested in human capital, not in corporate welfare. It expanded Social Security. It raised the minimum wage to $1 an hour, or $8.85 per hour in 2017 dollars. (The federal minimum wage is currently $7.25 an hour.) And Eisenhower initiated the greatest public works project in American history, the interstate highway system, a program funded by a gas tax.

Eisenhower’s 1958 budget nearly doubled the federal budget from that of the Truman era, and raised the national debt by billions of dollars. But it also provided jobs and education for millions of Americans who, in turn, would repay the nation many times over.

It would have been politically expedient in the Cold War era for Eisenhower to have spent more of the government’s largesse on military hardware and less on human capital. But he didn’t. His response to the Soviet Union’s launch of the Sputnik satellite in October 1957 is instructive. Eisenhower sponsored the National Defense Education Act to promote the study of science, foreign languages, and area studies in universities. He established the President’s Science Advisory Committee to fund both research and infrastructure. Federal funding for research and development increased from less than 0.7 percent of the GDP in 1953 to 1.8 percent of the GDP by 1959.

That earlier era of tax policy is long gone and is about to recede even further into the past. Instead of the virtuous circle of growth, we have devolved into a vicious cycle of decline as tax and spending cuts, along with deregulation, have skewed federal policy toward the benefit of the few. More than 90 percent of Americans born in 1940 at the start of the baby boom earned more than their parents; only 50 percent of those born in 1980 will do so. By 2011, 51 percent of the population was middle-class, compared to 61 percent in 1971.

The decline in federal support for higher education has corresponded with tuition hikes at public colleges and universities; those hikes make it harder for low-income students to complete their studies and enjoy the “college premium,” the increase in pay that accrues to university graduates. In the 1970s, Pell grants for low-income students covered nearly 80 percent of the costs at a public university; by 2013-14, they covered just 31 percent. Federal research spending is once again roughly at pre-Eisenhower levels.

Lower taxes on the wealthy are irrelevant to solving these problems, and may worsen them. The decline in the marginal tax rate and in the top capital gains tax rate over the past six decades has not, according to the nonpartisan Congressional Research Service, correlated with economic growth. Instead, it is “associated with the increasing concentration of income at the top of the income distribution.” Most economists discredit the notion that lower taxes will lead to greater growth and employment. Sweden, with a top marginal tax rate of 56.4 percent, experienced a GDP growth rate of 3.2 percent in 2016, more than double the U.S. rate of 1.5 percent.

Corporations today are sitting on record levels of capital, deploying it often only to buy back their own shares rather than investing to grow the economy. Investment in the national interest is clearly and historically the role for government. Eisenhower was fond of quoting another Republican, Abraham Lincoln, on the proper role of government: “The legitimate object of government is to do for a community of people whatever they need to have done, but cannot do at all, or cannot so well do, for themselves, in their separate and individual capacities.”

Taxes are the means to this end. That burden, shared widely, will indeed make America great again. It is our best investment in the future.

Why Americans Think Managing the National Budget Is Like Balancing the Family Checkbook
By Joanna Cohen | December 18, 2017

Americans are forever being urged to do things that supposedly will jump-start the economy, protect jobs, and raise the fortunes of Wall Street. Politicians and pundits implore consumers to “Buy American,” so as to help U.S. workers and keep the trade deficit low. Or to hit the shopping malls—even if it means taking on more debt—while still somehow finding a way to balance the family checkbook.

What’s striking about these demands is that the responsibilities and obligations of American consumers are understood to be stories about individual accountability. Whether it is the government asking consumers to eschew a low price for the sake of patriotism, or economists calling on them to manage debt at the same time that they unleash their desires, demands made on consumers are imposed at the personal level. Each citizen must hold the reins of protection and growth in his or her own hands, making choices that apparently resonate on the national level.

Given the widespread belief in the invisible hand of the marketplace, such a personal touch seems out of place. Yet the logic behind the importance of the individual consumer is so deeply rooted that it’s hardly surprising to find it persists. It’s a formulation that goes back to 18th-century efforts to understand the creation and preservation of wealth.

Take, for example, one 1767 diatribe written by a frustrated New York linen draper. Infuriated by the poor state of his local economy—excessive imports had left the colony constantly in debt and drained of all hard currency—the merchant published a pamphlet that was intended to act as an object lesson in the arts of economizing. He painted a picture of an industrious farmer from Staten Island, busily bringing to market all the “beef, pork, corn, butter, cheese and wool” he could spare. Having earned a thousand pounds through his sales, he went on to buy the makings of a good party, bringing home “rum, sugar, wine, cloth, silk, muslin and tea.” But this farmer never spent more than he earned. As a result, he prospered.

When his son succeeded to the estate, the parable took a darker turn. The young man’s desire to be fashionable led to overspending on pricey imports. Claret, madeira, linen, chintz, and damask—all of which could have been substituted with items made at home—led to the young man’s ruin. His farm was sold, his “body shut up in prison.” The lesson from such a tale was clear, the linen purveyor suggested: “The conduct of a single Farmer and a Province in this respect differ no more than greater and less.” In other words, a healthy economy comes simply from responsible housekeeping.

Such morality tales were not the purview of protectionists alone. Free-trade enthusiasts likewise sought to justify their ideology with counter-parables about economically righteous behavior. In 1776, when the Scottish economist Adam Smith sought to persuade readers that restrictions on imports were unhelpful, he turned to the example of the “prudent master of a family” to make his case. Why would such a sensible man seek to manufacture shoes or clothes for himself, Smith argued, when it clearly made sense for him to seek out those items from cobblers and tailors who could make them faster and cheaper than any member of his own household? Leaping from the household to the nation, Smith continued: “What is prudence in the conduct of every private family, can scarcely be folly in that of a great kingdom. If a foreign country can supply us with a commodity cheaper than we ourselves can make it, better buy it of them.”

Regardless of ideology, such parables shared a basic didactic device: They erased differences between an individual’s personal economy and the nation’s political economy. Collapsing the distinctions between household and country made it easier to imagine that the judgments and actions of a single consumer could have a direct bearing on the health of the nation’s economy. Both authors—the linen merchant, and Smith—went on to argue that economic policy could structure and influence the creation of wealth. Still, the cornerstone of national wealth remained an individual’s actions. The role of policy was obscured by the familiar metaphor of the consumer at home.

In America, such parables about personal habits and individual accountability had special resonance. Even before Smith’s prognostications were rolling off the press, Americans had confronted the question of how far their actions as individual consumers might take them and their nascent nation. By boycotting English woolens, donning homespun, and refusing to drink East India Company tea, colonial consumers challenged the might of the British empire. By 1776, Americans had learned powerful lessons about their importance as individuals. Their everyday household decisions had helped birth a nation.

In the wake of the Revolution this form of patriotism lost some appeal. Still, the idea that each individual might make the choices that would create national wealth did not disappear. Faced with an enormous national debt and seeking ways to raise urgently-needed revenue, politicians turned back to the consumer. Members of the First Congress in 1789 passed a new tariff: Each time Americans bought a foreign luxury, they would pay a price to the nation for doing so.

Once again, personal economy sat at the heart of political economy. Praising the new tariff in 1789, The Connecticut Courant editorialized: “It teaches us to economize not by forbidding us to be extravagant but by making us pay for it if we are so.” For those who were wealthy, it was possible to shop and contribute to the nation’s coffers. For those who were poor or dependent, the only virtuous course of action was to abstain from the world of foreign luxury goods altogether.

This formulation led to a notion that economic liberties in the form of consumer choice should be reserved for the rich and powerful, rather than the poor and vulnerable. In addition, the idea that consumers as individuals had such a direct impact on the nation’s economic health gave rise to the virulent surveillance of less-wealthy consumers whose actions seemed not only to undermine their own solvency but that of the nation, too.

During the Panic of 1857, for instance, retailers scrambled to unload unwanted stock by reducing prices, closing down lines of credit, and selling cheap for cash. Since most Americans, rich and poor, depended on credit to survive, such an action could have provoked widespread condemnation. But it was not the frantic retailers who came in for criticism; instead, it was a “horde” of female shoppers that caught satirists’ attention. In a piece in Harper’s Weekly, women were described as a “spending animal” whose heart was “a bargain” and whose soul was “an immense reduction.” These shallow shoppers were not just criticized for their unbridled spending: They were accused of obliterating “republican values” too.

Even as the nation succumbed to cataclysmic Civil War, citizens continued to monitor each other and make judgments. In the Union, where a new and stringent tariff became law, Northern shoppers found they could defend their personal purchases as contributing to the nation’s coffers. In this way, President Abraham Lincoln’s government helped to enshrine consumption at the heart of America’s political economy, as a civic force for good. But even as the Republican Party helped transform shopping from liability into liberty for everyday Americans, the central idea that individual shoppers bore a personal responsibility to keep the nation solvent remained intact.

As modern-day accounts of consumer behavior show, Americans have not shaken off the lingering belief that the national economy is simply the household economy writ large. With the U.S. financial collapse in 2008, both media and government were quick to scrutinize the actions of the individual consumer. Overspending and irresponsible borrowing (rather than irresponsible lending, for example) were easier to understand. As Time magazine noted in a list of 25 people most to blame for the subprime crisis, consumers deserved some blame: “we enjoyed living beyond our means…no wonder we hoped it would never end.” As had been the case throughout America’s history, government looked to individuals to make right the nation’s economic predicament. Using the language of household economics, it was easy to lay the blame on the shoulders of individuals. Tales of personal fiscal irresponsibility offered a “common sense” solution to the problem; stories of structural failures did not.

The homespun vision of the nation as household enshrines the consumer as the agent of America’s fortune or failure. But such a vision is, of course, a fiction. Though individual citizens may be in charge of their own households, they are not in charge of the American household writ large. The citizen-consumer is at the mercy of national economic policy, not the other way around. Nonetheless, as with all homespun philosophies, it’s a myth that is hard to dispel. Perhaps that is not surprising. It’s a story as old as the nation itself.

The ‘Hillbilly’ Migrants Who Made Akron, Ohio the World’s Rubber Capital
By Tom Jones
Mon, 11 Dec 2017
http://www.zocalopublicsquare.org/2017/12/11/hillbilly-migrants-made-akron-ohio-worlds-rubber-capital/ideas/essay/

In the earliest decades of the 20th century, more than 28 million men and women—black and white—began “The Great Migration” north from the Deep South and Appalachia. Among those who left their homes, literally hundreds of thousands migrated to “the Rubber Capital of the World”—Akron, Ohio. With blacks barred from factory work due to the tenor of the times in Akron, Southern white males would build the tires and produce the war materials as America entered World War I.

Although dismissively and disparagingly called “hillbillies,” these Southern whites were preferred even over locals by the rubber companies. This was because, as author John Tully noted in The Devil’s Milk: A Social History of Rubber, “they were hard workers and often individualistic in outlook, reflecting their origins as fiercely independent small-owners of farmlands.” This made them less susceptible to unionization and meant they could be easily “returned” by simply allowing them to go back home when production slackened. That very mobility, however, also meant that they were resented by locals who saw them as having no city pride and being only interested in taking their wages and returning home.

Even though they were once recognized as “Akron’s largest ethnic group,” their contributions have been all but forgotten.

My grandfather, Haskell Jones, was one of those “hillbillies.” Born in Western Kentucky in 1898, the first son of a respected magistrate and former schoolteacher, he may well have been counted among the “elite” of the community—even though his family resided in a two-room shack without running water, electricity, or an outhouse. When his father passed away, leaving Haskell as the head of a family of eight at the age of 15, he had to provide for them all by working in the fields—until he heard about the rubber factories of Akron and their insatiable demand for labor.

As it became a home for that labor, Akron’s population tripled between 1910 and 1920 to 208,435—making it the fastest-growing city in the entire nation. Of those residents, more than 75,000 were employed in the rubber industry, in factories that operated 24 hours a day, six days a week.

“All you had to do was hit town in those days and they grabbed you,” Haskell later recalled in more than 40 hours of a recorded oral history. “Rubber factories was going full blast and they was hiring every one of us hillbillies that come into town. Thousands. When I first came here, they couldn’t get enough people.”

He said that he arrived on a Saturday and couldn’t find a room. “First night I was in Akron, I slept with two other guys. Strangers. There wasn’t any rooms. There was just not any you could get ahold of unless you was acquainted and knew how to look. Two of us had to stand out in the hall while the other guy undressed and got into bed it was so small.”

By Monday, he had a job at Miller Rubber. “Wages were pretty low,” he said. “I think I was makin’ 56 cents an hour. It was better than 60 cents a day, I’ll tell you that. That’s what I got around home.”

The author’s grandfather, Haskell A. Jones, c. 1917. Photo courtesy of Tom Jones.

It was dangerous, too. “I’d worked about two or three weeks and a man got killed,” he said. “It was only about twenty feet from me. He got pulled into the machine. Some guy run a steel bar in between the rolls and cracked the machinery to get him out of there. But the poor guy died.”

When the company tried to assign the dead man’s job to him, Haskell refused, and was fired. Which was hardly a problem.

“I went over to Firestone, got another job, got examined, and was back in an hour and a half later to get my pay. Firestone and Miller was only a block apart. That’s how it worked in those days. You just walk out of one job, said ‘I want another one’ and the man gave it to ya’.”

As one of thousands of white Southerners in Akron, he was not alone in refusing certain work. Given the appalling conditions both on the factory floor and in the search for suitable housing, recruiting and retaining a workforce were not easy tasks. In her dissertation, “Industrial Voyagers: A Case Study of Appalachian Migration to Akron, Ohio, 1900-1940,” Susan Allyn Johnson found that one company reported hiring 642 new employees during a single week of 1916, only to have 652 others quit. Another company, which needed 18,000 men on its production lines, was forced to employ 88,000 men over the course of a year.

After the First World War, recession, the Great Depression, and the modernization of the factories eliminated much of that workforce. At the bottom of the Depression, Akron’s industrial unemployment rate hit a staggering 60 percent. The national economy, with an unemployment rate of only 23.6 percent, looked almost robust by comparison.

But with a tenaciousness birthed in a hardscrabble childhood, my grandfather and thousands of other hillbillies held on, working part-time at the factories, taking odd jobs, or signing up for the Works Progress Administration before World War II created yet another employment boom.

By then, Haskell had moved on to a more important role. Beginning in 1941 and continuing throughout much of the war, he served as the last marshal and first chief of police of the neighboring village of Tallmadge, Ohio. On call 24 hours a day, seven days a week, he was required to supply his own uniform and gun, as well as his own police car.

As the sole member of the community’s police department, he established its first fire department, organized scrap drives, and received FBI training to help protect the wartime industry of Akron. Or, as a recent history of the city notes, “Police work in Tallmadge began in 1941 when Haskell Jones served as the lone officer and Town Marshal in the village of Tallmadge.” It would not be the last role he would perform in service to the community.

With the wartime influx of factory workers and post-war veterans looking to live in Tallmadge, the village rapidly grew into a city—which required major changes in municipal operations. As a member of Tallmadge City Council—first elected in 1949, then re-elected in 1953 and 1955—he helped define the infrastructure on which today’s city is built.

Improvements during his tenure ranged from paved roads (many of them built on an emergency basis) and new housing developments to mail delivery (but only after the houses were numbered), the first shopping mall, the first drive-in theater, and the first city bus service. The council also forced a water system through against voters’ wishes, and oversaw the installation of gas mains and a sewage disposal system. For the new city employees, it established paid holidays and sick leave, as well as a police pension fund.

In his later years, this hillbilly migrant who had grown up without so much as an outhouse was keenly aware of how far he’d come. “Now, I got along with no education because I came at a time when all they wanted was muscle,” he would observe. “But that time is gone, see. They need somebody that can think. In today’s market, I couldn’t’ve made it. I think about it: ‘Boy, you was lucky. You were a little bit smart a few times, but lucky all the time.’”

The hillbillies did more than find a better life in Akron. They built it—on a foundation of independence, determination, pride in their heritage, and pride in their new home. In doing so, they created a modern, industrial Ohio, forever changing its culture, institutions, and people. And that could be the best hillbilly elegy of all.

When Burlap Underwear Was Fashionable
By Joy Spanabel Emery
Mon, 04 Dec 2017
http://www.zocalopublicsquare.org/2017/12/04/burlap-underwear-fashionable/ideas/essay/

In 1928, when President Calvin Coolidge visited Chicago, the ladies of a Presbyterian church presented him with a set of pajamas made from flour sacks dyed lavender and finished with silk frogs and pearl buttons in appreciation of his program on economy and thrift.

It seems surprising now, but the use of cloth feed bags for clothing and household items was once part of mainstream rural American culture—related to a long practice of utilizing all resources that is deeply imbued in the American psyche. Resourceful housewives recycled feed bags from flour, corn, sugar, salt, and even chicken feed into children’s clothes, aprons, and dresses.

At the outset, feed bag clothing was strictly utilitarian; in the Great Depression, it became a symbol of thrift and economical household management, and during the war years in the 1940s its use was promoted as part of the campaign for Allied victory. The rise and fall of the feed sack dress tells a story about a culture that once deeply valued both thrift and ingenuity. It is also a story of how commercial interests—cloth makers, feed sellers, bakers, pattern makers, and even newspapers—were keenly aware of the large, if indirect, market for such thrifty clothes.

Starting in the mid-1800s, cloth bags became a recognized resource for clothing. Foodstuffs were packaged in a range of five- to 100-pound sacks; the latter measured 36 by 42 inches. Originally made of burlap or osnaburg—a coarse off-white plain fabric that softens with subsequent washings—they were ideal for men’s, women’s, and children’s undergarments and nightwear as well as utilitarian household accessories.

Feed sack dress made by Mrs. Dorothy Overall of Caldwell, Kansas, in 1959. Photo courtesy of the National Museum of American History.

In the early 1930s, with the onset of the Great Depression, bag manufacturers added colors and prints, along with the traditional white bag that could be dyed, to attract more farm wives. Their husbands were instructed to buy feed bags in specific colors and prints in order to get sufficient yardage for garments.

Patterns for garments specifically made from feed bags were promoted in the advertising pages of newspapers. Companies such as Famous Features in New York City produced numerous patterns under brand names such as Barbara Bell and Sue Barnett, among others, from circa 1923 to 1997. The collaboration was designed to attract the farm housewife to specific products.

In partnership with the National Cotton Council, bag manufacturers produced national publications such as “Bag of Tricks for Home Sewing” and even promotional flyers to insert in loaves of bread. Patterns shown in the publications specified the number and size of bags needed to make each garment. These designs were not subject to the latest fashion trends but featured timeless styles that gave them a surprisingly long lifespan. For example, the pattern for the pajamas that the church ladies presented to President Coolidge was still in circulation in 1945.

I learned details about the feed bag clothing innovation from publications and patterns in the Commercial Pattern Archive at the University of Rhode Island. Daniel Flint, owner of Famous Features, explained in an interview that their styles were basic and intended to last for several years. Not all patterns are tagged specifically for the use of feed bags, but many can be used that way.

Enterprising women formed clubs to collect and exchange bags; purchasers sought matching colors and textile designs to meet their yardage needs. Bakers realized they could sell their flour bags, so they bought specific dress-goods bags and shipped them to millers to be filled with flour. After emptying the bags, they made up home sewing kits with four matching cotton bags, eight buttons, thread, and a pattern book.

One of the most popular garments that could be made from bags was an apron; some designs were simple, while others were fancier. Other popular items included mother/daughter fashions, infants’ and children’s wear, toys, draperies, slip covers, closet organizers, maternity wear, and undergarments.

The child’s dress prominently and unusually displays the product logo. Photo courtesy of the Commercial Pattern Archive.

Producers identified their product with company logos on each bag. Ideally these needed to be removed, which usually required soaking the bag overnight in cold water, then washing it in warm soapy water and possibly boiling it for 10 minutes to restore the color. Removal of the logos was considered essential to avoid announcing the source of the fabric and any related stigma of poverty and “home-sewn” clothing. On rare occasions, the logo was a feature of the design, such as the dress worn by the little girl pictured above.

A major contributor to the endurance of clothing made from bags during World War II was the textile restrictions imposed in support of military needs. The restrictions did not affect feed bag manufacturers because the bags were designated in the “industrial” category, so the high-quality textiles used for feed bags remained abundant for the home sewer. Consequently, feed bag clothing became even more popular during the war.

According to “Bag of Tricks for Home Sewing,” by the end of the war more than 800,000,000 yards of cotton fabric each year were made into bags. In “Bag Magic for Home Sewing” (1946) the National Cotton Council declared feed bag clothing to be the “warp & woof of daily life, the simple virtues of thrift, ingenuity and skill—the virtues upon which in the last analysis, the future of the country rests.” In addition to recycling the bags, users were encouraged to save the string used to close the bags for crocheting; patterns were included in the booklets.

In the postwar years, the National Cotton Council and the Textile Bag Manufacturers Association concentrated on additional associations with two mainstream pattern companies and expanded the line of textiles to include percale, chambray, cambric denim, toweling, and rayon with a silky sheen. At the peak of the bags’ popularity, textile designers were hired to create exciting prints to attract consumers—both millers and the public. Bag manufacturers issued a wide range of textile colors and designs. In conjunction with Simplicity and McCall’s, they promoted national sewing competitions for adults and teens as well as traveling fashion shows featuring bag clothing through at least 1961.

The demise of feed bag clothing was brought about by the increasing popularity of less expensive paper and plastic bags; in 1948, twenty states forbade the re-use of cloth bags for food products. Combined with the decline of farming populations and the increased availability of inexpensive ready-made clothing, these changes left little demand for cloth feed bags by the early 1960s. The cultural shift from primarily family-operated farms to large cooperatives also sent many families to urban centers.

Fewer women were providing the family wardrobe, since ready-made clothing was readily available and more affordable. Home-made garments and gifts celebrating economy and thrift were no longer part of the American psyche.

Why Americans Love Diners
By Richard J. S. Gutman
Mon, 27 Nov 2017
http://www.zocalopublicsquare.org/2017/11/27/americans-love-diners/ideas/essay/

Driving north on Route 95 through Connecticut, I noticed a billboard advertising a local diner. Its immense letters spelled out: “Vegan, Vegetarian, Gluten-Free and Diner Classics.” I knew a seismic shift had occurred when Blue Plate Specials—hands-down favorites for nearly a century such as meat loaf, hot turkey sandwiches, and spaghetti and meatballs—were last on a list of diner offerings.

Over their long history, diners have been a subtle part of our built environment and also our inner landscapes. They are as familiar as the language we speak and the comfort food we eat. Everyone loves diners.

There really is no other building like a classic diner: long and low, sheathed in glass, gleaming stainless steel, and colorful porcelain enamel; often ringed in neon and punctuated by a flashy, sometimes flashing, sign; going and glowing at all hours, day and night.

The first diners showed up 135 years ago, when Walter Scott served affordable fast food out of his horse-drawn wagon in Providence, Rhode Island. Patrons stood on the street to eat their lunches in the same manner as the customers of today’s ubiquitous food trucks. These eateries were constructed by wagon builders; gradually, a specialized industry developed to mass-produce diners.

These classic diners were factory-built, from the 1920s onward, and thus conformed to regular dimensions and proportions in order to be moved—by rail, barge, and truck—from where they were manufactured to where they would operate. As a result, diners have a generic similarity to one another. But, because they are mostly individually owned, and made by different manufacturers, they have distinct personalities, based upon the people on both sides of the counter.

The diner interior is all business, where form follows function—“as utilitarian as a machinist’s bench.” The customer can see the short order cook reach into the icebox, work the griddle, and deliver the food in an astonishingly short amount of time. The back bar of the diner, beneath the glass-fronted changeable letter menu boards, is a tour-de-force of stainless steel or colorful tile, with a line of work stations filled with grills, steam tables, sandwich boards, coffee urns, multi-mixers, drink dispensers, and display cases.

From 1955 to 1957 waitress Joan Hepner Wentzell constantly snapped photos of customers at the Pole Tavern Diner on Route 40 in Salem County, New Jersey. These sweethearts sharing a Coke remain anonymous. Photo courtesy of Richard J. S. Gutman.

The “counter culture” inside diners is a reflection of their wide appeal. Commentators have long fixated on it, describing how its spirit manifests itself.

A 1932 article in World’s Work depicted the all-inclusive range of patrons:

“The lunch wagon is the most democratic, and therefore the most American of all eating places. Actors, milkmen, chauffeurs, debutantes, nymphes du pave, young men-about-town, teamsters, students, streetcar motormen, messenger boys, policemen, white wings, businessmen—all these and more rub elbows at its counter.”

Five years later, there was a one-page story in The Literary Digest:

“If you joined diner devotees at a quick ‘cup o’ java,’ you’d find, if it were daytime, that you were rubbing shoulders mostly with horny-handed men in denim. If it were before dawn, you might be rubbing shoulders with men in tails, homeward bound from a night of revelry.” (I love the fact that in 1937 there were people described as “diner devotees.”)

Just as important as the diners’ look and feel is their chow: Always affordable, it has continuously adapted to fit the public’s desires. The norm is home-style cooking, breakfast anytime, and food that is real, local, and sustainable.

C. Oakley Ells in 1932 supplied his diner in Lackawanna County, Pennsylvania, with fresh eggs, milk, and vegetables from his own Ells’ Sunnyside Farms, a stone’s throw down the road. In 2017, Champ’s Diner, in Woonsocket, Rhode Island, identifies on its menu the name of the local farm that provides its eggs.

In San Diego, California, Ray and Herb Boggs operated the Airway Diner. Their July 1942 menu included an avocado cocktail appetizer (35 cents), a natural since San Diego County was the source of most avocados in the country. You wouldn’t have found that on a diner menu on the East Coast at the time. The seafood of the day was grilled Catalina swordfish (85 cents), caught off nearby Santa Catalina Island.

Today the Silver Diner is a locally owned and operated chain of 14 units that set out in 1989 to create a diner for the 21st century. They have continually tweaked their offerings to serve the food that people want to eat. In 2006, Silver Diner was the first chain in the Washington, D.C. area to completely remove trans fats from their menus. Now they feature local farms that supply all-natural, antibiotic- and hormone-free meats and provide non-GMO produce in season.

I’ve studied the world of diners for more than 45 years, beginning when these classic stainless-steel eateries were believed to be a dying breed. But, to paraphrase the supposed Mark Twain quote: “The report of their demise is premature.”

Every year there are articles and TV news magazine stories that proclaim either the death or the rebirth of the diner. I admit I once believed that diners might go extinct. One of my earliest articles was “Diners are declining, but great ones remain,” published in The Boston Globe, in 1974. Truth be told, more than half of the diners I profiled in that story have been demolished.

But the other half have survived. What accounts for their longevity?

In 1975, the National Trust for Historic Preservation included a session on diners and gas stations in its yearly meeting. The Christian Science Monitor noted the tension in the discussion with “Roadside architecture: is it treasure or trash?”

By the 1980s, the Henry Ford Museum, in Dearborn, Michigan, was restoring Lamy’s Diner, a 1946 streamliner built by the Worcester Lunch Car Company. It became the first of many diners to be installed in museums as icons of our culture. Just as notably, vintage diners were resurrected, and new old-style diners—like the Silver Diner—began a comeback.

This was largely fueled by Baby Boomers seeking the comfort and nostalgia of their youth. The diner was put on a pedestal as an exemplar of what’s good about America: mom-and-pop businesses; fresh, home-style food at a good value; and an individual experience that contrasted with the cookie-cutter fast food chains.

Now, the diner is clearly safe and here to stay. With great regularity, my Facebook feed will advise me of “The 21 Best Diners in America,” according to the Huffington Post; “The Top 12 New England Diners,” says Boston magazine; “13 Picture-Perfect LA Diners You’ve Never Heard Of,” proclaims EaterLA.com (none of which, I might add, are actual diners); and “These Are the Cutest Diners In Every State,” in the eyes of Country Living.

Social media keeps diners in the headlines, in our stream of consciousness, and constantly reminds us why we love these places. There’s a magical something in that word that conjures up a place where you feel at home, can have a great meal for a good price, and walk away satisfied and with a smile on your face.

The diner of the future will continue to change subtly and dramatically simultaneously: an American trait that makes it “feel the same” while ever accommodating the evolving tastes of its customers.

Can a Corrupt Politician Become a Good President?
By Scott S. Greenberger
Thu, 16 Nov 2017
http://www.zocalopublicsquare.org/2017/11/16/can-a-corrupt-politician-become-a-good-president/ideas/essay/

“Who you are, what you are, it doesn’t change after you occupy the Oval Office,” President Barack Obama said during the 2016 election campaign. “It magnifies who you are. It shines a spotlight on who you are.”

But at least one man was transformed by the presidency: Chester Alan Arthur. Arthur’s redemption is all the more remarkable because it was spurred, at least in part, by a mysterious young woman who implored him to rediscover his better self.

Arthur, the country’s 21st president, often lands on lists of the most obscure chief executives. Few Americans know anything about him, and even history buffs mostly recall him for his magnificent mutton-chop sideburns.

Most visitors to Arthur’s brownstone at 123 Lexington Avenue in New York City are there to shop at Kalustyan’s, a store that sells Indian and Middle Eastern spices and foods—not to see the only site in the city where a president took the oath of office. Arthur’s statue in Madison Square Park, erected by his friends in 1899, is ignored. But Arthur’s story of redemption, which illustrates the profound impact that the U.S. presidency can have on a person, deserves to be remembered.

Arthur was born in Vermont in 1829, the son of a rigid abolitionist preacher. Even in the North, abolitionism was not popular during the first decades of the 19th century, and Arthur’s father—known as Elder Arthur—was so outspoken and uncompromising in his beliefs that he was kicked out of several congregations, forcing him to move his family from town to town in Vermont and upstate New York.

Shortly after graduating from Union College in Schenectady, Arthur did what many ambitious young men from the hinterlands do: He moved to New York City. Once there, he became a lawyer, joining the firm of a friend of his father’s, who was also a staunch abolitionist.

And so he started down an idealistic path. As a young attorney, Arthur won the 1855 case that desegregated New York City’s streetcars. During the Civil War, when many corrupt officials gorged themselves on government contracts, he was an honest and efficient quartermaster for the Union Army.

But after the war Arthur changed. Seeking greater wealth and influence, he became a top lieutenant to U.S. Senator Roscoe Conkling, the all-powerful boss of the New York Republican machine. For the machine, the ultimate prize was getting and maintaining party control—even if that meant handing out government jobs to inexperienced men, or using brass-knuckled tactics to win elections. Arthur and his cronies didn’t view politics as a struggle over issues or ideals. It was a partisan game, and to the victor went the spoils: jobs, power and money.

It was at Conkling’s urging, in 1871, that President Ulysses S. Grant appointed Arthur collector of the New York Custom House. From that perch, Arthur doled out jobs and favors to keep Conkling’s machine humming.

Vice President Chester A. Arthur. Photo courtesy of Library of Congress.

He also became rich. Under the rules of the Custom House, whenever merchants were fined for violations, “Chet” Arthur took a cut. He lived in a world of Tiffany silver, fine carriages, and grand balls, and owned at least 80 pairs of trousers. When an old college classmate told him that his deputy in the Custom House was corrupt, Chet waved him away. “You are one of those goody-goody fellows who set up a high standard of morality that other people cannot reach,” he said. In 1878, reform-minded President Rutherford B. Hayes, also a Republican, fired him.

Two years later, Republicans gathered in Chicago to pick their presidential nominee. Conkling and his machine wanted the party to nominate former president Grant, whose second administration had been riddled with corruption, for an unprecedented third term. But after 36 rounds of voting, the delegates instead chose James Garfield, a longtime Ohio congressman.

Conkling was enraged. Party elders were desperate to placate him, realizing that Garfield had little hope of winning in November without the help of the New York boss. The second place on the ticket seemed to be a safe spot for one of Conkling’s flunkeys; they chose Arthur, and the Republicans triumphed in November.

Just months into Garfield’s presidency, Arthur’s meaningless post suddenly became critical. On the morning of July 2, 1881, a deranged office-seeker named Charles Guiteau shot President Garfield in a Washington railroad station. To Arthur’s horror, when Guiteau was arrested immediately afterward, he proclaimed his support for the Arthur-Conkling wing of the Republican Party—which had been resisting Garfield’s reform attempts—and exulted in the fact that Arthur would now be president. Some newspapers accused Arthur and Conkling of participating in the assassination plot.

Garfield survived the shooting, but he was mortally wounded. Throughout the summer of 1881, Americans prayed for their ailing leader and shuddered at the prospect of an Arthur presidency. Prominent diplomat and historian Andrew Dickson White later wrote: “It was a common saying of that time among those who knew him best, ‘Chet Arthur, President of the United States! Good God!’”

Big-city elites mocked Arthur as unfit for the Oval Office, calling him a criminal who belonged in jail, not the White House. Some of the people around Arthur feared that he was on the verge of an emotional collapse.

The newspapers were vicious. The Chicago Tribune lamented “a pending calamity of the utmost magnitude.” The New York Times called Arthur “about the last man who would be considered eligible” for the presidency.

At the end of August 1881, as Garfield neared death, Arthur received a letter from a fellow New Yorker, a 31-year-old woman named Julia Sand. Arthur had never met Sand, or even heard of her. They were complete strangers. But her letter, the first of nearly two dozen she wrote to him, moved him. “The hours of Garfield’s life are numbered—before this meets your eye, you may be President,” Sand wrote. “The people are bowed in grief; but—do you realize it?—not so much because he is dying, as because you are his successor.”

“But making a man President can change him!” Sand continued boldly. “Great emergencies awaken generous traits which have lain dormant half a life. If there is a spark of true nobility in you, now is the occasion to let it shine … Reform!”

Sand was the unmarried eighth daughter of Christian Henry Sand, a German immigrant who rose to become president of the Metropolitan Gas Light Company of New York. She lived at 46 East 74th Street, in a house owned by her brother Theodore V. Sand, a banker.

As the pampered daughter of a wealthy father, Julia read French, enjoyed poetry, and vacationed in Saratoga and Newport. But by the time she wrote Arthur she was an invalid, plagued by spinal pain and other ailments that kept her at home. As a woman, Julia was excluded from public life, but she followed politics closely through the newspapers, and she had an especially keen interest in Chester Arthur.

The “reform” she was most concerned about was civil service reform. Under the so-called “spoils system,” politicians doled out government jobs to loyal party hacks, regardless of their qualifications. Reformers wanted to destroy the spoils system, to root out patronage and award federal jobs based on competitive examinations, not loyalty to the party in power.

Vice President Arthur had used his position to aid Conkling and his machine—even defying President Garfield to do so. There was every reason to believe he would do the same as president.

But Garfield’s suffering and death, and the great responsibilities that had been thrust upon him, changed Chester Arthur. As president, the erstwhile party hack shocked everybody and became an unlikely champion of civil service reform, clearing the way for a more muscular federal government in the succeeding decades. Arthur started rebuilding the decrepit U.S. Navy, which the country desperately needed to assume a greater economic and diplomatic role on the world stage. And he espoused progressive positions on civil rights.

Mark Twain, who wasn’t bashful about mocking politicians, observed, “it would be hard indeed to better President Arthur’s administration.”

Arthur’s old machine buddies saw it differently—to them, he was a traitor. Meanwhile, reformers still didn’t completely trust that Arthur had become a new man, so he had no natural base of support in the party. He also secretly suffered from Bright’s disease, a debilitating kidney ailment that dampened his enthusiasm for seeking a second term. The GOP did not nominate him in 1884.

Arthur was ashamed of his political career before the presidency. Shortly before his death, he asked that almost all of his papers be burned—with the notable exception of Julia Sand’s letters, which now reside at the Library of Congress. Arthur’s decision to save Sand’s letters, coupled with the fact that he paid her a surprise visit in August 1882 to thank her, suggests that she deserves some credit for his remarkable transformation.

Arthur served less than a full term, but he was showered with accolades when he left the White House in March 1885. “No man ever entered the Presidency so profoundly and widely distrusted as Chester Alan Arthur,” newspaper editor Alexander K. McClure wrote, “and no one ever retired from the highest civil trust of the world more generally respected, alike by political friend and foe.”
