Can a Corrupt Politician Become a Good President?
By Scott S. Greenberger | November 16, 2017

“Who you are, what you are, it doesn’t change after you occupy the Oval Office,” President Barack Obama said during the 2016 election campaign. “It magnifies who you are. It shines a spotlight on who you are.”

But at least one man was transformed by the presidency: Chester Alan Arthur. Arthur’s redemption is all the more remarkable because it was spurred, at least in part, by a mysterious young woman who implored him to rediscover his better self.

Arthur, the country’s 21st president, often lands on lists of the most obscure chief executives. Few Americans know anything about him, and even history buffs mostly recall him for his magnificent mutton-chop sideburns.

Most visitors to Arthur’s brownstone at 123 Lexington Avenue in New York City are there to shop at Kalustyan’s, a store that sells Indian and Middle Eastern spices and foods—not to see the only site in the city where a president took the oath of office. Arthur’s statue in Madison Square Park, erected by his friends in 1899, is ignored. But Arthur’s story of redemption, which illustrates the profound impact that the U.S. presidency can have on a person, deserves to be remembered.

Arthur was born in Vermont in 1829, the son of a rigid abolitionist preacher. Even in the North, abolitionism was not popular during the first decades of the 19th century, and Arthur’s father—known as Elder Arthur—was so outspoken and uncompromising in his beliefs that he was kicked out of several congregations, forcing him to move his family from town to town in Vermont and upstate New York.

Shortly after graduating from Union College in Schenectady, Arthur did what many ambitious young men from the hinterlands do: He moved to New York City. Once there, he became a lawyer, joining the firm of a friend of his father’s, who was also a staunch abolitionist.

And so he started down an idealistic path. As a young attorney, Arthur won the 1855 case that desegregated New York City’s streetcars. During the Civil War, when many corrupt officials gorged themselves on government contracts, he was an honest and efficient quartermaster for the Union Army.

But after the war Arthur changed. Seeking greater wealth and influence, he became a top lieutenant to U.S. Senator Roscoe Conkling, the all-powerful boss of the New York Republican machine. For the machine, the ultimate prize was getting and maintaining party control—even if that meant handing out government jobs to inexperienced men, or using brass-knuckled tactics to win elections. Arthur and his cronies didn’t view politics as a struggle over issues or ideals. It was a partisan game, and to the victor went the spoils: jobs, power, and money.

It was at Conkling’s urging, in 1871, that President Ulysses S. Grant appointed Arthur collector of the New York Custom House. From that perch, Arthur doled out jobs and favors to keep Conkling’s machine humming.

Vice President Chester A. Arthur. Photo courtesy of Library of Congress.

He also became rich. Under the rules of the Custom House, whenever merchants were fined for violations, “Chet” Arthur took a cut. He lived in a world of Tiffany silver, fine carriages, and grand balls, and owned at least 80 pairs of trousers. When an old college classmate told him that his deputy in the Custom House was corrupt, Chet waved him away. “You are one of those goody-goody fellows who set up a high standard of morality that other people cannot reach,” he said. In 1878, reform-minded President Rutherford B. Hayes, also a Republican, fired him.

Two years later, Republicans gathered in Chicago to pick their presidential nominee. Conkling and his machine wanted the party to nominate former president Grant, whose second administration had been riddled with corruption, for an unprecedented third term. But after 36 rounds of voting, the delegates instead chose James Garfield, a longtime Ohio congressman.

Conkling was enraged. Party elders were desperate to placate him, realizing that Garfield had little hope of winning in November without the help of the New York boss. The second place on the ticket seemed to be a safe spot for one of Conkling’s flunkeys; they chose Arthur, and the Republicans triumphed in November.

Just months into Garfield’s presidency, Arthur’s meaningless post suddenly became critical. On the morning of July 2, 1881, a deranged office-seeker named Charles Guiteau shot President Garfield in a Washington railroad station. To Arthur’s horror, when Guiteau was arrested immediately afterward, he proclaimed his support for the Arthur-Conkling wing of the Republican Party—which had been resisting Garfield’s reform attempts—and exulted in the fact that Arthur would now be president. Some newspapers accused Arthur and Conkling of participating in the assassination plot.

Garfield survived the shooting, but he was mortally wounded. Throughout the summer of 1881, Americans prayed for their ailing leader and shuddered at the prospect of an Arthur presidency. Prominent diplomat and historian Andrew Dickson White later wrote: “It was a common saying of that time among those who knew him best, ‘Chet Arthur, President of the United States! Good God!’”

Big-city elites mocked Arthur as unfit for the Oval Office, calling him a criminal who belonged in jail, not the White House. Some of the people around Arthur feared that he was on the verge of an emotional collapse.

The newspapers were vicious. The Chicago Tribune lamented “a pending calamity of the utmost magnitude.” The New York Times called Arthur “about the last man who would be considered eligible” for the presidency.

At the end of August 1881, as Garfield neared death, Arthur received a letter from a fellow New Yorker, a 31-year-old woman named Julia Sand. Arthur had never met Sand, or even heard of her. They were complete strangers. But her letter, the first of nearly two dozen she wrote to him, moved him. “The hours of Garfield’s life are numbered—before this meets your eye, you may be President,” Sand wrote. “The people are bowed in grief; but—do you realize it?—not so much because he is dying, as because you are his successor.”

“But making a man President can change him!” Sand continued boldly. “Great emergencies awaken generous traits which have lain dormant half a life. If there is a spark of true nobility in you, now is the occasion to let it shine … Reform!”

Sand was the unmarried eighth daughter of Christian Henry Sand, a German immigrant who rose to become president of the Metropolitan Gas Light Company of New York. She lived at 46 East 74th Street, in a house owned by her brother Theodore V. Sand, a banker.

As the pampered daughter of a wealthy father, Julia read French, enjoyed poetry, and vacationed in Saratoga and Newport. But by the time she wrote Arthur she was an invalid, plagued by spinal pain and other ailments that kept her at home. As a woman, Julia was excluded from public life, but she followed politics closely through the newspapers, and she had an especially keen interest in Chester Arthur.

The “reform” she was most concerned about was civil service reform. Under the so-called “spoils system,” politicians doled out government jobs to loyal party hacks, regardless of their qualifications. Reformers wanted to destroy the spoils system, to root out patronage and award federal jobs based on competitive examinations, not loyalty to the party in power.

Vice President Arthur had used his position to aid Conkling and his machine—even defying President Garfield to do so. There was every reason to believe he would do the same as president.

But Garfield’s suffering and death, and the great responsibilities that had been thrust upon him, changed Chester Arthur. As president, the erstwhile party hack shocked everybody and became an unlikely champion of civil service reform, clearing the way for a more muscular federal government in the succeeding decades. Arthur started rebuilding the decrepit U.S. Navy, which the country desperately needed to assume a greater economic and diplomatic role on the world stage. And he espoused progressive positions on civil rights.

Mark Twain, who wasn’t bashful about mocking politicians, observed, “it would be hard indeed to better President Arthur’s administration.”

Arthur’s old machine buddies saw it differently—to them, he was a traitor. Meanwhile, reformers still didn’t completely trust that Arthur had become a new man, so he had no natural base of support in the party. He also secretly suffered from Bright’s disease, a debilitating kidney ailment that dampened his enthusiasm for seeking a second term. The GOP did not nominate him in 1884.

Arthur was ashamed of his political career before the presidency. Shortly before his death, he asked that almost all of his papers be burned—with the notable exception of Julia Sand’s letters, which now reside at the Library of Congress. Arthur’s decision to save Sand’s letters, coupled with the fact that he paid her a surprise visit in August 1882 to thank her, suggests that she deserves some credit for his remarkable transformation.

Arthur served less than a full term, but he was showered with accolades when he left the White House in March 1885. “No man ever entered the Presidency so profoundly and widely distrusted as Chester Alan Arthur,” newspaper editor Alexander K. McClure wrote, “and no one ever retired from the highest civil trust of the world more generally respected, alike by political friend and foe.”

How Fishing Created Civilization
By Brian Fagan | November 7, 2017

Of the three ancient ways of obtaining food—hunting, plant foraging, and fishing—only the last remained important after the development of agriculture and livestock raising in Southwest Asia some 12,000 years ago.

Yet ancient fisher folk and their communities have almost entirely escaped scholarly study. Why? Such communities held their knowledge close to their chests and seldom gave birth to powerful monarchs or divine rulers. And they conveyed knowledge from one generation to the next by word of mouth, not writing.

That knowledge remains highly relevant today. Fishers are people who draw their living from a hard, uncontrollable world that is perfectly indifferent to their fortunes or suffering. Many of them still fish with hooks, lines, nets, and spears that are virtually unchanged since the Ice Age.

The world’s first pre-industrial states emerged in the Eastern Mediterranean around 3100 B.C. Other states developed independently, somewhat later, in Asia and in the Americas. The entire superstructure of the pre-industrial state, whether Sumerian, Egyptian, Roman, Cambodian, or Inca, depended on powerful ideologies that propelled the efforts of thousands of anonymous laborers, who served on great estates, built temples, tombs, and public buildings, and produced the rations that fed not only the ruler but also his armies of officials. Some of the most important were the fishers, who, along with farmers, were the most vital of all food purveyors.

As city populations grew, fish became a commodity, harvested by the thousands. Fishers transported their catches to small towns and then cities, bringing fish to markets and temples. For the first time, some communities became virtually full-time fishers, bartering or selling fish in town and village markets in exchange for other necessities. Their catches were recorded and taxed. In time, too, fish became rations of standard size, issued to noble and commoner alike. The ruler and the state required hundreds, even thousands, of skilled and unskilled laborers. Their work might be a form of taxation, but the king had to support them in kind, often with fish.

The Land of the Pharaohs depended heavily on its fisher folk. Nile River catfish were easy to harvest, especially during the spring spawn; the catch was gutted and dried in the tropical sun on large racks. The authorities assigned teams of fishers to catch specific quotas within set periods, especially when the flood was receding. Large seine nets provided much of the catch, deployed and hauled in by teams of villagers.

The demand was enormous. Building the Pyramids of Giza alone required thousands of people. The workers’ settlement lay close to the royal tombs. In 1991, the Egyptologist Mark Lehner excavated two bakeries, including the vats for mixing dough and a cache of the large bell-shaped pots used for baking bread. A huge mud-brick building next to the bakeries contained troughs, benches, and tens of thousands of tiny fish fragments in the fine ashy deposit covering the floor.

The fresh catches had to be dried and preserved immediately. Lehner believes that the fish were laid out on reed frames to dry on well-ventilated troughs and benches in a production line that provided protein for thousands of people. At its peak, the line must have employed hundreds of people and processed thousands of fish per day—precise estimates are impossible. The fishers were thus only the first stage of an infrastructure of hundreds of people needed to process and store the dried catch for later consumption. The demands of this operation must have led to large, temporary fishing villages springing up at the same general locations every flood season.

The Ancient Egyptians were not alone. Mid-19th-century travelers, who crossed the Tonle Sap lake in Cambodia after the monsoon as the water was falling, reported catfish teeming so thickly under their canoes that one could almost walk across the water on their backs. The ancestors of these large fish fed thousands of Khmer laborers as they built the nearby stupendous temples of Angkor Wat and Angkor Thom in the 12th century.

On the other side of the world, along the arid North Coast of Peru, the inshore anchovy fisheries, nourished by natural upwelling from the sea bed, yielded enormous numbers of small fish that, when dried and turned into meal, made a valuable protein supplement for farmers in fertile river valleys inland, such as the great settlement at Caral, about 120 miles north of present-day Lima. Caravans of llamas carried bags of fish meal high into the Andes, where the fish became a major economic prop of the Inca empire. Tens of thousands of anchovies were netted, dried, and stored before being traded on a near-industrial scale.

Fish were major historical players in many places. Dried fish fed merchant seamen crossing the Indian Ocean from the Red Sea to India; dried cod from northern Norway was the beef jerky that sustained Norse crews as they sailed to Iceland, Greenland, and North America.

Those who caught the fish that fed pre-modern civilizations were anonymous folk, who appeared with their catches in city markets, then vanished quietly back to their small villages in the hinterland. Perhaps it was the smell of fish that clung to them, or the simple baskets, nets, and spears they used to harvest their catches that kept them isolated from the townsfolk. Perhaps they preferred to be taken for granted. But their efforts helped create, feed, and link great civilizations for thousands of years.

Centuries ago, urban populations numbered in the thousands, but the demand for fish was insatiable. Today, the silent elephant in the fishing room is an exploding global population that considers ocean fish a staple. Deep-water trawls, diesel trawlers, electronic fish finders, and factory ships with deep freezes have turned the most ancient of our ways of obtaining food into an industrial behemoth. Even remote fisheries are being decimated.

Despite large-scale fish farming, humans face the specter of losing our most ancient practice of food-gathering—and thus leaving behind an ocean that is almost fishless.

The Role of War and Sacrifice in Russia’s Mythic Identity
By Gregory Carleton | November 3, 2017

If you want to understand Russia better, think of war. But not the one in eastern Ukraine or the frightening possibility of a conflict with NATO.

Go back instead to Russia’s 1945 victory over Nazi Germany. That triumph is the greatest event in Russia’s thousand-year history. In the largest war ever, Russia led the Soviet Union in crushing absolute evil and thereby saved the world from destruction.

Yes, Britain and the United States played a significant role in that victory, but Russians can counter by noting—accurately—that the back of Hitler’s army was broken on the Eastern Front before the Normandy landings. Russians also can say that no country has made a greater sacrifice in war. Officially, nearly 27 million Soviet citizens lost their lives.

Or, put in a different perspective, more people died in the siege of Leningrad (around one million) than Britain and the United States lost, combined, across the entire globe during the war.

It is no wonder that May 9, when Russia celebrates VE Day, has become its greatest secular holiday. This victorious past is projected not only through the massive military parade in Red Square, which features soldiers both in contemporary and period uniforms, but also by its most demonstrative ritual: the march of the “Immortal Regiment.” This is when ordinary Russians, each holding high the photograph of a relative who served, flood the streets to form a single, massive procession.

In 2017, in Moscow alone, the official estimate put their numbers at 600,000, with President Putin at their head. Live television coverage highlighted the many children, themselves in uniforms recalling the war, reciting the feats of their great-grandparents.

Virtually every city in Russia hosts its own march of the Immortals, thus uniting the nation across 11 time zones through the blood of its greatest generation. The parade also makes a global statement both figuratively (by flying the flags of countries Russia helped save from the Nazi yoke, including the United States) and literally (with parallel marches of descendants of Soviet veterans in cities like London and New York).

VE Day has become the center of a civic religion showcasing the sacrifice Russians have made to save humanity from tyranny. The sentiment is so powerful—and not restricted to that day alone—that it anchors a prevailing myth of Russian exceptionalism.

Russia’s army, its people, and its harsh winters combined to push back Napoleon Bonaparte’s invading French forces, leading to the emperor’s disastrous retreat in 1812. Art courtesy of Wikimedia Commons.

That myth has been fueled by the Second World War, but it did not begin there. In 1812, when Napoleon invaded Russia, the conflict was framed in existential terms, with the French emperor officially tagged as the anti-Christ. The outcome of that titanic struggle was seen by contemporaries as nothing short of a miracle: Russia, by itself, destroyed the largest army the world had yet seen and then led a coalition to rescue Europe from French tyranny. They succeeded, occupying Paris in 1814, and sounding the death-knell for Napoleon’s dreams of dominating the world.

No other nation could claim such a victory, which fueled an explosion of Russian patriotism. (Napoleon’s final defeat at Waterloo in 1815 was seen as a futile last gasp.) The victory united Russian writers and intellectuals across the political spectrum—conservatives such as Fyodor Dostoevsky, socialists like Vissarion Belinsky, icons of romanticism like Mikhail Lermontov—in the idea that Russia was a special country that had accomplished a special mission.

By century’s end, this idea became doctrine in the highest echelons of the military. As the director of Russia’s equivalent of West Point proudly proclaimed in 1898 (with emphasis in the original): “It is in the Russian people’s willingness to lay down their lives for others that one finds the key to understanding the special nature of Russia’s experience of war which so acutely distinguishes it from the experiences of other countries in the West.”

Why did he use the present tense when nearly a hundred years separated him from the miracle of 1812? It was because during that century Russian scholars and writers had delved deeper into history and found evidence that their triumph over tyranny had an even earlier precursor, suggesting that stopping invaders was part of Russia’s collective identity.

When the Mongols swept into Europe in the 13th century, they never made it appreciably further west than Russia’s lands (including those of present-day Ukraine and Belarus). Was this earlier defense, Russians would ask six centuries later, yet another sign of Russia’s definitive role in sacrificing to protect others?

Russia’s greatest writer, Alexander Pushkin, was among those who thought yes.

“We have had our own special mission,” he wrote in 1836. “Russia, with its immense expanses, was what absorbed the Mongol conquest. They did not dare to cross our western frontier and leave us in the rear. They withdrew back to the desert and Christian civilization was saved. And for achieving that goal we have had to lead a completely unique existence.”

With the seeds of exceptionalism already deeply sown in Russia’s historical imagination, the 20th century, and World War II, provided further confirmation of the country’s status as a force for good in the world.

Today Russia’s historical self-image colors its current stand-off with NATO. Does that military coalition not echo previous invaders like Napoleon and Hitler whose forces were not exclusively French or German but were also multi-national coalitions? What better demonstrates the West’s ingrained, collective hostility towards Russia?

To amplify that sentiment today, Russia’s political and popular culture tap even more into its military past. Besides the Mongols, Napoleon, and Hitler, Russia has been invaded nearly every century of its existence. When the Mongols attacked from the east, its western neighbors, the Swedes and Teutonic Knights, attacked as well—only to be defeated by Russia’s greatest medieval warrior, Alexander Nevsky. In the 16th century, the Crimean Tatars drove north and burned Moscow. In the 17th, the Poles repeated that feat while deposing the tsar and killing the patriarch of the Russian Church. In the 18th century, the Swedes invaded but were stopped only by Peter the Great.

This history is applied to current events in ways that play well with the general population. The annexation of Crimea in 2014 can be spun as the necessary defense of native Russians from alleged Ukrainian persecution. The same story can justify the conflict in eastern Ukraine (though the Kremlin denies active involvement, noting that Ukrainian separatists are assisted, if at all, by Russian volunteers).

And NATO’s expansion to Russia’s very borders—how can that not be evidence of yet another plot to take Russia down? If NATO arose to counter the military threat posed by the Soviet Union, then with the latter’s collapse in 1991, what possible motivation can there be for its continued existence and eastern expansion if Russia is not its ultimate target?

Filtered through the nation’s mythic history, the answers to these questions come easily to many Russians, and they help cushion its isolation and the bite of sanctions—at least in terms of morale. Whatever the West does—from sanctions to enhanced NATO deployments close to Russia’s borders—it feeds a historical narrative in which Russia, on the defensive and sacrificing for the good and just, always wins in the end.

How Don Quixote’s Battles Predicted Piracy in the Digital Age
By Martin Puchner | November 1, 2017

Although Don Quixote wasn’t the first great novel (that honor belongs to the Tale of Genji, written by an 11th-century lady-in-waiting at the Japanese court), it was the first to do something important: capture a new world of print.

That world had begun when Johannes Gutenberg improved upon Chinese printing techniques and combined them with paper, itself an invention that had arrived from China via the Middle East and Arab-occupied Spain. (We still count paper in reams, from the Arabic rizma.)

These two inventions, brought together again in Northern Europe, encountered a rising merchant class and the alphabet, which made print with movable type much more effective than in China. Cheaper literature led to rising literacy rates, which in turn increased the demand for printed matter, beginning a virtuous cycle that has lasted until today.

Don Quixote was an early beneficiary. This irreverent story of an aristocrat who reads too many chivalric romances was perfect for a broader readership. After a first printing in 1605, new editions were produced across Castile and Aragon, resulting in 13,500 available copies in its first 10 years. Don Quixote became popular abroad as well, with editions in far-away Brussels, Milan, and Hamburg. Most significant was an English translation, which Shakespeare liked so much that he wrote a play, Cardenio (apparently co-authored by John Fletcher, and since lost), based on one of the novel’s interpolated tales. People started to dress as Don Quixote and his wily servant, Sancho Panza, fiction spilling over into the real world.

The new technologies came with significant side effects. So popular was the novel that an anonymous writer decided to write a sequel. Cervantes, who felt that he owned the famous character he had created, was dismayed. He depended on the novel to solve his perpetual financial troubles (he had been accused of defrauding the state while working as a tax collector raising funds for the Spanish Armada, and put in prison). With few legal means at his disposal, Cervantes realized that he had to fight fire with fire and write his own sequel. In it, he made Don Quixote defeat an imposter drawn from the unauthorized rival version—Quixote’s false double—showing who was really in charge of the story.

The title page of the first edition of Don Quixote. Image courtesy of Wikimedia Commons.

The experience taught Cervantes a lesson: Paper and print could help him find new readers both at home and abroad, but these same technologies made it easier for others to sell pirated editions. (Cervantes might not have called them pirates, because he knew about real ones: He had been captured by North African pirates after participating in the historic Battle of Lepanto and spent four years in captivity in Algiers, waiting for his family to come up with the ransom.)

Eventually, Cervantes came to realize that the biggest villain in the story wasn’t copycats or pirates; it was printers, who didn’t care about originality, ownership, or artistic integrity—only sales. Once he had identified the enemy, Cervantes used his most potent weapon, his character Don Quixote, and, toward the end of the same sequel, sent him straight into a print shop.

There Don Quixote marvels at the sophisticated division of labor—one of the first industrial processes of mass production—but he also finds that printers systematically cheat authors and translators. When he comes across the unauthorized version of his own life, which is being printed before his very eyes, he leaves the print shop in a huff.

Cervantes’s broadside against printers didn’t bring them down, nor was it meant to, because Cervantes knew how much he depended on them. But he would not lionize them either. His compromise was to use his great novel to take the measure of the age of print.

That age is coming to an end now, as our own digital revolution is changing how literature is read, distributed, and written. Paper and print are being replaced with screens and servers. Electronic texts are not naturally divided into discrete pages, which is why we’re scrolling again, as our forebears did before the invention of the book. We’ve also become attached to tablets, a format that takes us all the way back to the Mesopotamian clay tablets on which the first great masterpieces were written 4,000 years ago. What are the effects of these emerging technologies that combine old and new?

We could do worse than to ask Cervantes. He would not be surprised that the technologies replacing paper and print are making it infinitely easier to reach global audiences, nor that expanding readerships are changing the kinds of literature being written, from novels explicitly aimed at a global readership to ever more specialized subgenres of romance written and published on Amazon and similar platforms.

Nor would Cervantes be surprised by the price we have to pay for these services. Internet piracy is rampant because laws and enforcement mechanisms haven’t yet caught up with the new technologies; on the dark net, they probably never will. Unauthorized sequels are now so widespread that we have a new word for them: fan fiction. Most important, ownership of our new machines is even more concentrated today than it was in Cervantes’ time.

Were Cervantes to write a modern version of Don Quixote, he wouldn’t even need to change the famous scene in which his knight battles windmills (which, it should be noted, were sometimes used to power paper mills). A new Don Quixote could be fighting wind-powered server farms hosting websites instead. Knocked down by the blades, he would get up and look for the true culprit. Instead of entering a print shop, he would visit corporate headquarters in Mountain View or Cupertino, channeling the frustration we feel about depending on the technologies that undergird our writing and communication methods.

This was why Don Quixote, the deluded knight, became a modern hero in the first place: He acted out our helplessness in the face of new machines, heroically battling windmills, printers, and the new media landscape that was also the reason for his success. What could be more quixotic than that?

The Greatest Story Ever Told About Hyperbole, Humbug, and P.T. Barnum!
By Jennifer Mercieca | October 27, 2017

In 1835, Phineas Taylor Barnum was down on his luck and anxious to find an “amusement” that would attract paying customers. One lucky day a stranger came into the shop where Barnum worked and told him that he possessed half-ownership of a “curiosity”: a woman named Joice Heth who, the stranger claimed, was the 161-year-old slave who raised George Washington.

Barnum examined Heth and the stranger’s “proofs” about her age and provenance and, convinced of her seeming veracity, bought Heth from the stranger. Barnum, being “a student of human nature” as well as a natural showman, “spared no reasonable efforts” in drawing a crowd to see Joice Heth’s performances, in which Heth recounted tales of George Washington’s childhood. Barnum explained that he was “aware of the great power of the public press” and “used it” in any way he could, including printing “innumerable small bills,” flooding “the city with ‘posters’ setting forth the peculiar attraction which ‘the nurse of Washington’ presented,” and paying off editors to write up the story of Joice Heth in the most dramatic way possible.

Barnum’s advertising strategy depended “upon getting people to think, and talk, and become curious and excited over and about the ‘rare spectacle.’” His advertisements had one goal above all others: They were “calculated to extort attention.” His tool was hyperbole. While P.T. Barnum is often remembered as the founder of the Barnum and Bailey Circus—“The Greatest Show on Earth”—Barnum’s story is more broadly about America’s fascination with hyperbole and humbug.

Hyperbole is the rhetorical term for “excess” (Greek hyper “beyond” + bole “to throw,” to overthrow or throw beyond). Aristotle thought of hyperbole as a kind of metaphor, a comparison between a known thing and an unknown thing. In comparing something that is unknown to something that is already well understood, audiences would make sense of new information by using associational logic (a key form of Greek thought). Yet, Aristotle thought that because hyperbole relied on excessive exaggeration, it was a special kind of metaphor that took advantage of associational logic to distort reality. Therefore, those who used hyperbole abused the power of metaphor and demonstrated a “vehemence of character.”

In the 18th century, Joseph Priestley also saw hyperbole as a kind of metaphoric comparison that “exceeds the truth.” Priestley thought that hyperbole was justly used as part of the sublime, as an attempt to use words to describe the ineffable when “no expressions literally true sufficiently answer his purpose.” However, he thought that there were only a very “few circumstances” in which hyperbole could be used “with propriety.” Mostly, Priestley thought that hyperbole was unjustly used to appeal to “persons of little reading” who were particularly attracted to the “very extravagant” or the “marvelous and supernatural.” Hyperbole drew attention to itself, for the sake of merely drawing attention. For Priestley, hyperbole was like candy: It appealed to the very young, but it was too sweet for older people with more refined taste.

According to these accounts, hyperbole could only be used justly when an accurate description was beyond the power of human speech. All other incidences of hyperbole were an attempt to take advantage of the uninformed by misrepresenting what was not well understood.

But Barnum relished hyperbole precisely because it was the best way to reach the masses.

For a time, Barnum writes in his 1855 Autobiography, ticket sales for the Heth show were great and business was good and he was happy. He was sure to keep “up a constant succession of novel advertisements and unique notices in the newspapers,” which kept “old Joice fresh in the minds of the public, and served to sharpen the curiosity of the people.” However, soon disaster struck.

Barnum & Bailey dazzled the world with sensationalistic acts that fed a gullible public’s craving for hyperbole. Image courtesy of Wikimedia Commons.

As Barnum tells the story: “A Visitor” wrote to one of the local papers and claimed that Joice Heth was not actually the 161-year-old former slave of George Washington as had been claimed, but was actually what the hip school kids of the 1750s had started to call a “humbug.” Joice Heth, “A Visitor” claimed, was a hoax.

Specifically, “A Visitor” believed that Heth was “not a human being,” but was “simply a curiously constructed automaton, made up of whalebone, India-rubber, and numberless springs, ingeniously put together, and made to move at the slightest touch, according to the will of the operator.” Barnum, “A Visitor” charged, was nothing more than a “ventriloquist” and all of the conversations that audiences had had with Heth about George Washington were “purely imaginary” and “merely the ventriloquial voice of the exhibitor.”

The attack on Heth didn’t hurt Barnum’s show; it made it bigger. Barnum would recall that “hundreds who had not visited Joice Heth were now anxious to see the curious automaton; while many who had seen her were equally desirous of a second look, in order to determine whether or not they had been deceived.” Barnum claimed that the automaton controversy led to even greater curiosity and even greater ticket sales.

Joice Heth passed away in early 1836, ending Barnum’s show but not the nation’s curiosity over Heth. Barnum took advantage of that interest, arranging another Heth show: 1,500 audience members paid 50 cents each—double what audiences had paid to see her alive—to watch as Dr. David L. Rogers conducted an autopsy on her body. According to the February 25, 1836 edition of The New York Sun, Dr. Rogers concluded that Heth’s “wonderful old age was a wonderful humbug.” While she was, in fact, a real person, she was nearer to 80 than to 160 years old.

But Barnum had the last word. He planted a story with The Sun’s competitor, The New York Herald, on February 27, 1836, claiming that the Heth humbug story was itself humbug. In fact, reported The Herald on “good authority,” Heth was “not dead” at all, but alive and well in Connecticut.

To be clear: 1) the story about Joice Heth was a humbug, she was neither 161 years old nor the former slave of the Washington family; 2) the story about Heth actually being an automaton was a humbug; and 3) the story refuting Heth’s autopsy results was yet another humbug. Barnum’s story of Joice Heth was at least three layers of humbug deep.

By 1855, when Barnum published his Autobiography, he was world famous as an entertainment promoter: His American Museum, and shows featuring Heth, Tom Thumb, the Feejee Mermaid, and the Swedish opera singer Jenny Lind, all had made Barnum famous and rich. Barnum’s “celebrity was his life’s work and his prize possession. He bragged about it, sued people over it, threatened to kill it, but most of all, he reinvented it,” according to one account of Barnum’s influence on American life. His Autobiography was his self-promotion vehicle; he constantly embellished, revised, and expanded it. Barnum’s goal in his Autobiography was to portray himself as the world’s most tricky and entertaining fellow.

Historians can’t seem to find primary-source evidence to support Barnum’s recollection of the Heth-as-automaton newspaper exposé, but the rest of this seemingly tall tale of American curiosity and humbug checks out. As Barnum explained in his 1866 book on The Humbugs of the World (which did not include his Joice Heth humbug), a humbug was a legerdemain, a sleight of hand.

Why did Barnum’s hyperbole and humbug excite American audiences in the 19th century? For the same reason that it excites Americans today: We love to be amused and we love excess, and so we reward showmen with our attention. Some have said that we’re “amusing ourselves to death” and that we live in the “society of the spectacle.”

We’re especially attracted to hyperbole during times of great transition, when things are confusing and reality can be more easily distorted. Barnum knew this too: His “A Visitor” exposé/humbug relied upon the nation’s curiosity about the emerging technology of machinery, new commercial uses for India rubber, and new Northern concerns over the abolition of slavery.

Today is another time of great transition, and America’s showmen-leaders know it. During an election interview with NBC in 2016, Donald Trump said he had enjoyed being compared to P.T. Barnum. “We need P.T. Barnum, a little bit, because we have to build up the image of our country,” he said.

Ask yourself: Was Barnum and Bailey’s circus literally the “greatest show on Earth”? Of course not; that’s nonsensical hyperbole: “the greatest” can’t be proven or quantified. But in a supposedly classless society like America, such confident appeals to American greatness via hyperbole attract audiences. Americans are much more likely to describe their hamburger or pizza as absolutely “the world’s greatest” rather than as “probably” the best. Such exaggeration is a humbug, of course, but it’s also hyperbole: it compares the known (American hamburgers or pizza) to the unknown (the other hamburgers and pizzas of the world) and tells naïve Americans that theirs is best.

And we shouldn’t forget that “there’s a sucker born every minute.” Barnum has been credited with that phrase, but he probably never said it. Of course, there’s a humbug that says he did.

When Halloween Mischief Turned to Mayhem
By Lesley Bannatyne | October 26, 2017 | What It Means to Be American

Imagine. Pre-electricity, no moon. It’s late October, and the people whisper: This is the season for witchery, the night the spirits of the dead rise from their graves and hover behind the hedges.

The wind kicks up, and branches click like skeletal finger bones. You make it home, run inside, wedge a chair against the door, and strain to listen. There’s a sharp rap at the window and when you turn, terrified, it’s there leering at you—a glowing, disembodied head with a deep black hole where its mouth should be.

It’s just a scooped-out pumpkin, nicked from a field by some local boys and lit from the inside with the stub of a candle. But it has spooked you. When you look again, it’s gone.

Halloween in early 19th-century America was a night for pranks, tricks, illusions, and anarchy. Jack-o’-lanterns dangled from the ends of sticks, and teens jumped out from behind walls to terrorize smaller kids. Like the pumpkin patches and pageants that kids love today, it was all in good fun—but then, over time, it wasn’t.

As America modernized and urbanized, mischief turned to mayhem and eventually incited a movement to quell what the mid-20th-century press called the “Halloween problem”—and to make the holiday a safer diversion for youngsters. If it weren’t for the tricks of the past, there’d be no treats today.

Halloween was born nearly 2,000 years ago in the Celtic countries of northwestern Europe. November 1 was the right time for it—the date cut the agricultural year in two. It was Samhain, summer’s end, the beginning of the dangerous season of darkness and cold—which, according to folklore, created a rift in reality that set spirits free, both good and bad. Those spirits were to blame for the creepy things—people lost in fairy mounds, dangerous creatures that emerged from the mist—that happened at that time of year.

Immigrants from Ireland and Scotland brought their Halloween superstitions to America in the 18th and 19th centuries, and their youngsters—our great- and great-great grandfathers—became the first American masterminds of mischief. Kids strung ropes across sidewalks to trip people in the dark, tied the doorknobs of opposing apartments together, mowed down shrubs, upset swill barrels, rattled or soaped windows, and, once, filled the streets of Catalina Island with boats. Pranksters coated chapel seats with molasses in 1887, exploded pipe bombs for kicks in 1888, and smeared the walls of new houses with black paint in 1891. Two hundred boys in Washington, D.C., used bags of flour to attack well-dressed folks on streetcars in 1894.

Teens used to terrorize smaller children on Halloween. Image courtesy of The New York Public Library.

In this era, when Americans generally lived in small communities and better knew their neighbors, it was often the local grouch who bore the brunt of Halloween mischief. The children would cause trouble and the adults would just smile guiltily to themselves, amused by rocking chairs engineered onto rooftops, or pigs set free from sties. But when early 20th-century Americans moved into crowded urban centers—full of big city problems like poverty, segregation, and unemployment—pranking took on a new edge. Kids pulled fire alarms, threw bricks through shop windows, and painted obscenities on the principal’s home. They struck out blindly against property owners, adults, and authority in general. They begged for money or sweets, and threatened vandalism if they didn’t receive them.

Some grown-ups began to fight back. Newspapers in the early 20th century reported incidents of homeowners firing buckshot at pranksters who were only 11 or 12 years old. “Letting the air out of tires isn’t fun anymore,” wrote the Superintendent of Schools of Rochester, New York, in a newspaper editorial in 1942, as U.S. participation in World War II was escalating. “It’s sabotage. Soaping windows isn’t fun this year. Your government needs soaps and greases for the war … Even ringing doorbells has lost its appeal because it may mean disturbing the sleep of a tired war worker who needs his rest.” That same year, the Chicago City Council voted to abolish Halloween and instead institute a “Conservation Day” on October 31. (Implementation got kicked to the mayor, who doesn’t appear to have done much about it.)

The effort to restrain and recast the holiday continued after World War II, as adults moved Halloween celebrations indoors and away from destructive tricks, and gave the holiday over to younger and younger children. The Senate Judiciary Committee under President Truman recommended Halloween be repurposed as “Youth Honor Day” in 1950, hoping that communities would celebrate and cultivate the moral fiber of children. The House of Representatives, sidetracked by the Korean War, neglected to act on the motion, but there were communities that took it up: On October 31, 1955 in Ocala, Florida, a Youth Honor Day king and queen were crowned at a massive party sponsored by the local Moose Lodge. As late as 1962, New York City Mayor Robert F. Wagner, Jr. wanted to change Halloween to UNICEF Day, to shift the emphasis of the night to charity.

Of course, the real solution was already gaining in practice by that time. Since there were children already out demanding sweets or money, why not turn it into a constructive tradition? Teach them how to politely ask for sweets from neighbors, and urge adults to have treats at the ready. The first magazine articles detailing “trick or treat” in the United States appeared in The American Home in the late 1930s. Radio programs aimed at children, such as The Baby Snooks Show, and TV shows aimed at families, like The Jack Benny Program, put the idea of trick-or-treating in front of a national audience. The 1952 Donald Duck cartoon Trick or Treat reached millions via movie screens and TV. It featured the antics of Huey, Dewey, and Louie, who, with the help of Witch Hazel’s potions, get Uncle Donald to give them candy instead of the explosives he first pops into their treat bags.

The transition could be slow. On one episode of The Adventures of Ozzie and Harriet, costumed kids come to the door, and Ozzie and Harriet are baffled. But food companies—Beatrice Foods, Borden, National Biscuit Company—quickly took notice and got into the candy business, and even tobacco companies like Philip Morris jumped in. Halloween candy and costume profits hit $300 million in 1965 and kept rising. Trick-or-treating—child-oriented and ideal for the emerging suburbs that housed a generation of Baby Boomers—became synonymous with Halloween. Reckless behavior was muted, and porch lights welcomed costumed kids coast to coast.

Today, trick or treat has more variants: trunk or treat, where kids go car-to-car in a parking lot asking for candy; and trick or treat for UNICEF, where youngsters collect money for charity along with their treats. Few children, especially young ones, have an inkling of what mischief was once possible.

For those nostalgic about the old days of Halloween mischief, all is not lost. Query the MIT police about the dissected-and-reassembled police car placed atop the Great Dome on the college’s Cambridge campus in 1994. Or ask the New York City pranksters who decorated a Lexington Avenue subway car as a haunted house in 2008. There’s even an annual Naked Pumpkin Run in Boulder, Colorado.

The modern Halloween prank—be it spectacle, internet joke, entertainment, or clever subversion—is a treat in disguise, an offering that’s usually as much fun for the tricked as it is for the trickster. Halloween is still seen as a day to cause mischief, to mock authority, and make the haves give to the have-nots—or at least shine a light on the fact that they should. For that, Americans can thank the long line of pranksters who came before us.

It’s a Bird! It’s a Plane! It’s a Positive Symbol of American Power!
By Ian Gordon | Mon, 23 Oct 2017
http://www.zocalopublicsquare.org/2017/10/23/bird-plane-positive-symbol-american-power/ideas/essay/

I can’t really remember when I first encountered Superman. It might have been through the 1950s television series The Adventures of Superman, or it might have been in a Superman comic book—not an American comic book, but a black-and-white reprint by the Australian publisher K. G. Murray.

Growing up in Australia, I learned the basic stories of American history from the pages of these Superman comics. I read about the Boston Tea Party; Nathan Hale’s patriotism; Washington crossing the Delaware (complete with a comic book replication of Emanuel Leutze’s painting); the hoary myth of Betsy Ross, creator of the Stars and Stripes; John Paul Jones’s fighting spirit; and the importance of Lincoln. I knew whose faces are carved on Mount Rushmore. And a Lois Lane story taught me that Hawaii was the 50th state of the union.

It might be no surprise that I became a historian of the United States. But Superman’s global impact was not limited to one Australian kid, or indeed, even just to kids. From early in the existence of the character, Superman was seen as symbolic of America, for better or for worse.

After his debut in Action Comics #1, June 1938, Superman quickly became a global figure, his worldwide fame predicated on his being an American. Who could be more American than Superman, who, though Krypton-born, put his incredible powers to use serving “Truth, Justice, and the American Way”? And what did that mean for the rest of the world?

Just how important Superman was to the rest of the world after World War II is hard to evaluate. We know Superman made it as far as Italy, India, and Japan, but there’s little scholarly work that tells us what people in those countries made of him. Anecdotally, it seems that readers around the world loved him—and that cultural critics and politicians worried about the appeal of such a character, and of American culture in general. He also served as a minor flashpoint for broader anxieties about American power in the postwar years.

In 1949, the Soviet Literary Gazette attacked Superman as promoting the “mass fascisization” of American children. That same year, the French government restricted the type and amount of foreign comics allowed into the country, on the grounds that such material bred juvenile delinquency. Journalists at Le Monde defended comics and Superman, arguing that it was a bit much to blame the Man of Steel, given the negative impacts of the recent German occupation.

And when the “Superman” television show was first broadcast in the U.K. in 1957, one critic said it was not only the worst thing he’d ever seen on television but was the worst thing he had “ever encountered in any form.”

His fans applaud Superman’s kindness, decency, and small acts of charity—in contrast to the United States’ attempts at systemic change on a global scale, which sometimes go astray. Image courtesy of Flickr.

The most certain evidence of Superman’s global influence, good and bad, comes from the box office revenues of the 1978 Superman movie, starring Christopher Reeve. The U.S. domestic take for the film was $134 million, or 44.7 percent of the total. The remaining 55.3 percent, $166 million, came from the international market. In contrast, the original Star Wars movie of 1977 sold 60 percent of its tickets in the United States and only 40 percent in international markets. Superman was the first blockbuster to derive the majority of its ticket sales from the international market—and it did so nine years before foreign markets began routinely generating more revenue for American films than the domestic market.

Why was it that in 1978, after Watergate and the American war in Vietnam, an international audience took so eagerly to Superman? Marketing probably had something to do with it. So did the much-publicized hiring of Marlon Brando and Gene Hackman for the film, at great cost to the producers. But—and here lies the tricky bit—the key to the movie’s success was Superman’s role as ambassador of America’s idealistic promise to the world.

How can that be, you might ask, when Superman’s fighting for “Truth, Justice and the American Way” seems to many like an oxymoron? The answer lies in Superman’s combination of ideals and independence: He has mostly been an agent quite separate from the American state. To be sure, there were comic book covers and strips during World War II that lent themselves to the war effort, and a late 1960s story set in Vietnam that reflected America’s official views of that war, but very little else Superman did directly represented American foreign policy. The American Way, according to Superman in comics and on screen, was primarily one of simple decency and acts of charity.

The movie deliberately tapped into this side of America. In one crucial scene, as she interviews the hero to learn about his origins and purpose, Margot Kidder’s Lois Lane asks Reeve’s Superman why he is on Earth. “To fight for Truth, Justice, and the American Way,” he replies. Lois expresses incredulous cynicism, and Superman affirms his sincerity. At the time, Christopher Reeve told a reporter for The New York Times that it was important for the hero not to be cynical and to believe in the sentiments of that catchphrase.

The kindness-and-decency side to Superman is not without its critics. In the early 1960s, the eminent scholar and novelist Umberto Eco, who had some 200 Superman comic books available to him in Italy, argued that Superman’s habit of restricting himself to small acts of charity represented a particular ideology opposed to necessary systemic change. After all, he said, a hero with the powers of Superman could upend the economic and political systems that lay at the crux of society’s problems and so end the need for such charity. Eco’s Superman functions as a metaphor for American power, though a problematic one. Small acts of charity might have been more appealing to international audiences than the sorts of systemic change America has attempted on a global scale, which have not always gone as planned.

Today, as an Asia-based cultural historian who studies the United States, I sometimes wonder what it is that people across the globe still see in Superman. In June of this year I was in Milan’s Chinatown and spotted a painted portrait of Superman, very much in the Christopher Reeve mode, in a shop window. What does that artist or shop owner have in common with the man I saw wearing a Superman symbol T-shirt yesterday at lunch in São Paulo, Brazil? And what do they have in common with the Malay family of four, all wearing identical Superman symbol T-shirts, whom I saw in The Pavilion mall in Kuala Lumpur, Malaysia several years ago?

In my research on Superman, I found that for many readers around the world Superman exemplifies how values like tolerance and civility may be used constructively—to promote justice—in places where rulers shut down legitimate debate. Superman offers some hope that whatever you think of America, some of the values it exports may be worthwhile. Loving Superman, just maybe, is one way our world hopes against hope.

When Black Texans Gathered Under “Thursday Night Lights”
By Michael Hurd | Thu, 19 Oct 2017
http://www.zocalopublicsquare.org/2017/10/19/black-texans-gathered-thursday-night-lights/ideas/essay/

I had only been in and out of Houston since leaving our Sunnyside neighborhood on the city’s southeast side, in 1968, to begin eight years of Air Force service. Whenever I returned, I made only casual note of neighborhood and city changes, such as the sad state of the mom-and-pop “candy store” where we used to hang out after school, now boarded up, or a new skyscraper for a Houston skyline dotted with cranes, or another congested freeway opened to relieve existing congested freeways.

As a sportswriter, during my visits I instinctively gravitated towards athletics venues. I would drive by the crumbling Astrodome, dwarfed by the gigantic NRG Stadium, home of the NFL’s Houston Texans; the Houston Astros’ new downtown baseball playground, Minute Maid Park; Rice University Stadium, the site of Super Bowl VIII in 1974, still holding up near the Medical Center.

Nothing about the changes in those venues fazed me. Houston’s dynamic progress was to be expected. But I was disturbed, as I made my rounds one summer morning in 2015, to see a shiny new stadium for University of Houston Cougars football. I entered Third Ward, the city’s historic African-American cultural hub, drove past the Cougars’ new den for the first time, and was so shaken by the sight that I narrowly avoided veering into oncoming traffic.

Something was terribly wrong. This was not just any new testament to collegiate athletic funding. This was a state-of-the-art, gentrified grave marker, covering—no, unashamedly hiding—a significant monument to Houston’s African-American history, now bulldozed to oblivion. Underneath this 60-acre plot, my hometown had buried a place called Jeppesen Stadium. Along with it, Houston silenced the athletic ghosts of a Jim Crow past that prevailed, for decades, throughout Texas.

Segregation had necessitated the formation, in 1920, of the Prairie View Interscholastic League (PVIL), the governing body for black high school athletic, academic, and music competitions in Texas. The University Interscholastic League, which oversaw the same activities for white students, denied membership to African-American schools. So the PVIL became the guiding force behind African-American high school gridiron action in the region. Its football was highly competitive and thrilling to watch.

In Houston, from the 1940s through 1967, the league’s Wednesday and Thursday night football games took place at Jeppesen, the school district’s public football facility. Jeppesen was the last and most visible connection to a past that saw Texas’s African-American community defy a racist system and produce some of the best high school coaches, athletes, and teams in the state’s football-mad history. For me, it was also an enduring link to my adolescence.

Jeppesen had none of the engaging architectural features of its successor, the new University of Houston stadium; it was just a dirty beige-colored concrete edifice. But even if it was homely, Jeppesen had glamor. Houston was the PVIL’s largest market, and the league fostered cultural and community pride. Jeppesen drew fans from black enclaves—Fifth Ward, Fourth Ward, Third Ward, Kashmere Gardens, Sunnyside, Acres Homes, and the Gulf Coast—to cheer for their teams, which included great players like Bubba Smith, Eldridge Dickey, Mel Farr, Gene Washington, Otis Taylor, and Warren Wells.

As many as 40,000 fans gathered in Jeppesen’s stands for the annual Thanksgiving Turkey Day Classic game between heated crosstown rivals Jack Yates and Phillis Wheatley High Schools. Attendees dressed in their Sunday best, and then some, for the social event of the season. The day included pre- and post-game events, parades, alumni and family breakfasts and dinners, and flashy halftime marching band performances. Yates fans still talk about the 1958 halftime show, when Miss Yates and her court arrived in a helicopter. It landed on the 50-yard line to a deafening roar from the crowd.

A number of PVIL players went on to dominate professional football. Six of its alumni were inducted into the Pro Football Hall of Fame, including “Mean Joe” Greene (from Temple Dunbar High School), defensive back Dick “Night Train” Lane (from Austin Anderson High School), and safety Ken Houston (from Lufkin Dunbar High School). In 1965, wide receiver Jerry LeVias, from Hebert High School in Beaumont, became the first African-American football scholarship player in the old Southwest Conference when he decided to attend Southern Methodist University in Dallas.

Charles “Bubba” Smith, at 6 feet 7 inches, terrorized PVIL opponents in the early 1960s as a lineman at Charlton-Pollard High School in Beaumont, east of Houston, and later became one of the most feared defensive players in college football history at Michigan State, and in the National Football League with the Baltimore Colts. In 1992, Smith recalled playing in the PVIL to a Houston Chronicle writer: “You’re talking about people who could righteously play the game. None of the teams were diluted back then. Everyone on the field could play. If you blink, they were gone. It was more physical and tougher. And it always meant something if you could outrun somebody because everybody could run.”

Case in point: Cliff Branch was a four-time All-Pro receiver and three-time Super Bowl champion with the Oakland Raiders, and a world-class sprinter for the Colorado Buffaloes. At Worthing High School in Houston, he was a two-time PVIL 3-A state sprint champ and the first schoolboy in Texas history to run the 100-yard dash in 9.3 seconds, a record that stood only as long as it took the very next heat, for 2-A boys, to conclude. When it did, Wichita Falls Washington speedster Reggie Robinson was the new record-holder, breezing to a 9.1-second finish.

That kind of speed only mattered in athletic contests. None of the PVIL players could outrun racism and segregation, the wedges that maintained the misguided myth of white supremacy even in athletics. While there would be no shortage of newspaper and magazine articles, books, and movies about high school football in Texas, none focused on the teams and players from the PVIL. Few on the outside were willing to acknowledge the abundance of talent bursting from the under-funded schools of the “Negro League,” or the “Colored League,” as some called it in polite company. One former PVIL player spoke with disdain about the term “Friday Night Lights,” explaining, “That’s white folks.”

“There was the black side of town and the white side, and you just dealt with what was laid out there for you,” Smith would add. “You didn’t have time to think, ‘Why can’t I do that, or why is it like that?’ Sometimes I think a large part of my life was lost by that. But you could always show what you had on the football field.”

And there was so much to show. The PVIL created lasting passionate school rivalries, and instilled confidence in its athletes and students. Teachers and coaches were short on resources but eager to be mentors, and accepted the task of preparing black children for citizenship in an environment that neither welcomed nor encouraged them.

The Prairie View Interscholastic League governed athletic, academic, and music competitions for black high schools in Texas from 1920 to 1970. These athletes played football during the 1930s. Photo courtesy of PVILCA.

The league’s relationship with historically black colleges fortified it. Black colleges provided the only options in the South for African-Americans seeking higher education, and Texas had several HBCUs (Historically Black Colleges and Universities), including Prairie View, Wiley, and Bishop. Black coaches and teachers had earned degrees at black colleges—often starring as players, then returning home to work in black high schools. In turn, coaches would send their best players to their collegiate alma maters, creating a string of black men guiding young black boys to adulthood.

“I knew all the mommas and poppas,” Joe Washington Sr., father of former Oklahoma Sooners and NFL running back Joe Washington, recalled from his days as head coach at Bay City’s Hilliard High School and then at Port Arthur Lincoln High School. “They used to tell me, ‘Coach Washington, take him and do what you have to do, just don’t kill him.’ ”

Some look at the loss of those nurturing relationships as a tragedy of integration, which brought about the fall of the PVIL, the closing of black high schools, and erosion in black communities.

In the fall of 1967, black and white schools competed against each other for the first time, previously all-white programs added black players, and Texas could legitimately claim to play the best high school football in the country. White coaches had quietly salivated, waiting for the moment when they could welcome all of the most talented players to their programs, and the newcomers did not disappoint. The infusion of black athletes into UIL teams made instant winners of longtime losing teams, allowing coaches to gloat about their new black superstar: “I got me one.”

I graduated from Worthing in the spring of 1967, just as the PVIL and UIL “merged.” It was closer to a hostile takeover. As the PVIL began shutting down, so did most of its member schools, although some smaller schools lingered for another three seasons in the league. (From a high of 500, only eight former PVIL schools remain in operation today.) Successful PVIL coaches with state championships under their belts lost jobs. Other coaches retired or left the profession rather than take demotions to work as assistants on predominantly white staffs.

At first, I didn’t appreciate the sea change that integration would deliver to the PVIL. I thought it was great that black players would finally get a wider audience, and get to compare skills with white players. But when I returned that fall for our homecoming football game, against a white team in a different setting, I felt wistful and restless. The new stadium lacked the Jeppesen atmosphere. I left at halftime, thinking what PVIL folks surely had already figured out.

“You just have to have integration, we knew it all along and we wanted it,” said the late Luther Booker, a former head coach at Yates. “But you miss those days because it was such a high. It affected the black community. It was an electrifying time. There’s nothing like it now. And maybe there never will be.”

Certainly, there will be no more beacons like Jeppesen.

The Invention and Evolution of the Concentration Camp
By Andrea Pitzer | Wed, 18 Oct 2017
http://www.zocalopublicsquare.org/2017/10/18/concentration-camps-invented-punish-civilians/ideas/essay/

Before the first prisoner entered the Soviet Gulag, before “Arbeit macht frei” appeared on the gates of Auschwitz, before the 20th century had even begun, concentration camps found their first home in the cities and towns of Cuba.

The earliest modern experiment in detaining groups of civilians without trial was launched by two generals: one who refused to bring camps into the world, and one who did not.

Battles had raged off and on for decades over Cuba’s desire for independence from Spain. After years of fighting with Cuban rebels, Arsenio Martínez Campos, the governor-general of the island, wrote to the Spanish prime minister in 1895 to say that he believed the only path to victory lay in inflicting new cruelties on civilians and fighters alike. To isolate rebels from the peasants who sometimes fed or sheltered them, he thought, it would be necessary to relocate hundreds of thousands of rural inhabitants into Spanish-held cities behind barbed wire, a strategy he called reconcentración.

But the rebels had shown mercy to the Spanish wounded and had returned prisoners of war unharmed. And so Martínez Campos could not bring himself to launch the process of reconcentración against an enemy he saw as honorable. He wrote to Spain and offered to surrender his post rather than impose the measures he had laid out as necessary. “I cannot,” he wrote, “as the representative of a civilized nation, be the first to give the example of cruelty and intransigence.”

Spain recalled Martínez Campos, and in his place sent General Valeriano Weyler, nicknamed “the Butcher.” There was little doubt about what the results would be. “If he cannot make successful war upon the insurgents,” wrote The New York Times in 1896, “he can make war upon the unarmed population of Cuba.”

Civilians were forced, on penalty of death, to move into these encampments, and within a year the island held tens of thousands of dead or dying reconcentrados, who were lionized as martyrs in U.S. newspapers. No mass executions were necessary; horrific living conditions and lack of food eventually took the lives of some 150,000 people.

These camps did not rise out of nowhere. Forced labor had existed for centuries around the world, and the parallel institutions of Native American reservations and Spanish missions set the stage for relocating vulnerable residents away from their homes and forcing them to stay elsewhere. But it was not until the advent of barbed wire and automatic weapons that a small guard force could impose mass detention. With that shift, a new institution came into being, and the phrase “concentration camps” entered the world.

When U.S. newspapers reported on Spain’s brutality, Americans shipped millions of pounds of cornmeal, potatoes, peas, rice, beans, quinine, condensed milk, and other staples to the starving peasants, with railways offering to carry the goods to coastal ports free of charge. By the time the USS Maine sank in Havana harbor in February 1898, the United States was already primed to go to war. Making a call to arms before Congress, President William McKinley said of the policy of reconcentración: “It was not civilized warfare. It was extermination. The only peace it could beget was that of the wilderness and the grave.”

But official rejection of the camps was short-lived. After defeating Spain in Cuba in a matter of months, the United States took possession of several Spanish colonies, including the Philippines, where another rebellion was underway. By the end of 1901, U.S. generals fighting in the most recalcitrant regions of the islands had likewise turned to concentration camps. The military recorded this turn officially as an orderly application of measured tactics, but that did not reflect the view on the ground. Upon seeing one camp, an Army officer wrote, “It seems way out of the world without a sight of the sea,—in fact, more like some suburb of hell.”

In southern Africa, the concept of concentration camps had simultaneously taken root. In 1900, during the Boer War, the British began relocating more than 200,000 civilians, mostly women and children, behind barbed wire into bell tents or improvised huts. Again, the idea of punishing civilians evoked horror among those who saw themselves as representatives of a civilized nation. “When is a war not a war?” asked British Member of Parliament Sir Henry Campbell-Bannerman in June 1901. “When it is carried on by methods of barbarism in South Africa.”

Far more people died in the camps than in combat. Polluted water supplies, lack of food, and infectious diseases ended up killing tens of thousands of detainees. Even though the Boers were often portrayed as crude people undeserving of sympathy, the treatment of European descendants in this fashion shocked the British public. Less notice was taken of British camps for black Africans, who endured even more squalid living conditions and, at times, received only half the rations allotted to white detainees.

The Boer War ended in 1902, but camps soon appeared elsewhere. In 1904, in the neighboring German colony of South-West Africa—now Namibia—German general Lothar von Trotha issued an extermination order for the rebellious Herero people, writing “Every Herero, with or without a gun, with or without cattle, will be shot.”

The order was rescinded soon after, but the damage inflicted on indigenous peoples did not stop. The surviving Herero—and later the Nama people as well—were herded into concentration camps to face forced labor, inadequate rations, and lethal diseases. Before the camps were fully disbanded in 1907, German policies managed to kill some 70,000 Namibians in all, nearly exterminating the Herero.

It took just a decade for concentration camps to be established in wars on three continents. They were used to exterminate undesirable populations through labor, to clear contested areas, to punish suspected rebel sympathizers, and as a cudgel against guerrilla fighters whose wives and children were interned. Most of all, concentration camps made civilians into proxies in order to get at combatants who had dared defy the ruling power.

While these camps were widely viewed as a disgrace to modern society, this disgust was not sufficient to preclude their future use.

During the First World War, the camps evolved to address new circumstances. Widespread conscription meant that any military-age male German deported from England would soon return in a uniform to fight, with the reverse also being true. So Britain initially focused on locking up foreigners against whom it claimed to have well-grounded suspicions.

British home secretary Reginald McKenna batted away calls for universal internment, protesting that the public had no more to fear from the great majority of enemy aliens than they did from “the ordinary bad Englishman.” But with the sinking of the Lusitania in 1915 by a German submarine and the deaths of more than a thousand civilians, British prime minister Herbert Henry Asquith took revenge, locking up tens of thousands of German and Austro-Hungarian “enemy aliens” in England.

Tanauan reconcentrado camp, Batangas, the Philippines, circa 1901. Image courtesy of University of Michigan Digital Library Collection.

The same year, the British Empire extended internment to its colonies and possessions. The Germans responded with mass arrests of aliens from not only Britain but Australia, Canada, and South Africa as well. Concentration camps soon flourished around the globe: in France, Russia, Turkey, Austria-Hungary, Brazil, Japan, China, India, Haiti, Cuba, Singapore, Siam, New Zealand, and many other locations. Over time, concentration camps would become a tool in the arsenal of nearly every country.

In the United States, more than two thousand prisoners were held in camps during the war. German-born conductor Karl Muck, a Swiss national, wound up in detention in Fort Oglethorpe in Georgia after false rumors that he had refused to conduct “The Star-Spangled Banner.”

Unlike earlier colonial camps, many camps during the First World War were hundreds or thousands of miles from the front lines, and life in them developed a strange normalcy. Prisoners were assigned numbers that traveled with them as they moved from camp to camp. Letters could be sent to detainees, and packages received. In some cases, money was transferred and accounts kept. A bureaucracy of detention emerged, with Red Cross inspectors visiting and making reports.

By the end of the war, more than 800,000 civilians had been held in concentration camps, with hundreds of thousands more forced into exile in remote regions. Mental illness and shattered minority communities were just two of the tolls this long-term internment exacted from detainees.

Nevertheless, this more “civilized” approach toward enemy aliens during the First World War managed to rehabilitate the sullied image of concentration camps. People accepted the notion that a targeted group might turn itself in and be detained during a crisis, with a reasonable expectation to one day be released without permanent harm. Later in the century, this expectation would have tragic consequences.

Yet even as the First World War raged, the camps’ bitter roots survived. The Ottoman government made use of a less-visible system of concentration camps with inadequate food and shelter to deport Armenians into the Syrian desert as part of an orchestrated genocide.

And after the war ended, the evolution of concentration camps took another grim turn. Where internment camps of the First World War had focused on foreigners, the camps that followed—the Soviet Gulag, the Nazi Konzentrationslager—used the same methods on their own citizens.

In the first Cuban camps, fatalities had resulted from neglect. Half a century later, camps would be industrialized using the power of a modern state. The concept of the concentration camp would reach its apotheosis in the death camps of Nazi Germany, where prisoners were reduced not just to a number, but to nothing.

The 20th century made General Martínez Campos into a dark visionary. Refusing to institute concentration camps in Cuba, he had said, “The conditions of hunger and misery in these centers would be incalculable.” And once they were unleashed on the world, concentration camps proved impossible to eradicate.

What Losing a War Does to a Nation’s Psyche
By Edgar A. Porter | Tue, 17 Oct 2017
http://www.zocalopublicsquare.org/2017/10/17/losing-war-nations-psyche/ideas/essay/

In the spring of 1976, while visiting the Tokyo Zoo, I was confronted with the unforgettable sight of an aging former Japanese soldier, wearing a ragged army uniform and cap, and bowing before all who entered.

One of his legs had been amputated. A begging bowl before him, he bowed as low as he could to Japanese families coming to see the newly arrived pandas. A few placed coins in his bowl quickly and moved on. It was a shockingly sad sight, with an aura of shame, silence, and neglect surrounding him.

I reacted strongly in part because I had recently visited China. There I was struck by the self-confidence exhibited by the men and women of the military, whether walking down the street or in military formation. The people, in turn, spoke with respect and pride of the older generation who had fought in what they called “our” People’s Liberation Army.

Watching this former Japanese soldier, I found myself thinking: Whether a country wins or loses dictates society’s response to war. Of course, this is not a new observation, nor is it unique to Japan. But that does not lessen the power of the phenomenon, or the pain of defeat experienced by a Japanese society that puts so much emphasis on collective effort and shame.

A lost war created a particularly shaken culture of quiet despair in Japan. So when I am asked today, after living in Japan for 10 years, why many Japanese still refuse to acknowledge their country’s role in bringing misery to so many people through its 1931 invasion of Manchuria, and its later occupation of greater China, Singapore, the Philippines, and elsewhere during what the Japanese call the Pacific War, I think of that old soldier whose presence brought such distress to his fellow Japanese.

For several years my wife and I interviewed dozens of ordinary people for a book about Japan during World War II and the American Occupation. These conversations helped me see the old soldier again, this time from the viewpoint of Japanese still struggling with a legacy of shame mixed with forgetfulness.

Through them I could hear those families walk by the old soldier asking, “Why are you here to remind us of our loss and humiliation?” “Why did you come back but not my father, my brother, my son?” And why, to put a hard point to it, did he not commit suicide like so many others?

I came to realize that the soldier represented a depth of shame that sowed seeds for the generations that came after the war. And that shame was both infectious—touching younger generations who had no experience of the war—and normal.

This peculiar normalcy of shame mixed with faded memory has been encouraged for decades by Japan’s national elite in politics, education, business, and journalism. One reason is that many of the elite are themselves direct descendants of war-era leaders. They are not inclined to see their family members, or the names of their companies, negatively associated with either the atrocities or the loss of that war.

The Japanese media rarely challenge the government directly. With the exception of a few outlets, such as the Asahi Shimbun, most have avoided critically examining what happened during the war so as not to lose access to government and business officials.

Kiichi Kawano, a former kamikaze pilot, was scheduled to fly his mission the day after the war ended. He has built a private museum in his basement to memorialize his comrades who died. Photo by Edgar A. Porter.

The people’s willingness to follow the example of these elites should come as no surprise, for in the years immediately following the war, hunger and despondency stalked the country. The elites, in concert with the American Occupation forces, put their efforts into rebuilding the country. There was little energy left to reflect on the past, even if anyone had been predisposed to do so.

As factories and homes were built, or rebuilt, and infrastructure put in place, Japan experienced a growing optimism. By the time of the 1964 Tokyo Olympics, the country was justly proud of its achievements. The shame brought by the war never disappeared, but receded into the background of public discourse, where it lay for decades in the minds of many.

But not all. Despite the absence of any vibrant public debate, there have always been those who insisted on uncovering uncomfortable truths about their history. These efforts typically came from the grassroots: Local historical associations have published memoirs of war survivors and built war memorial museums (almost universally called Peace Museums), and progressive teachers have gone beyond the authorized texts to guide their students toward discoveries about the war years. Authors such as novelist Shusaku Endo and historian Eri Hotta have tackled painful and long-suppressed war events.

Finally, individuals such as the ones we interviewed for our book have shined a light on those terrible years, climbing over the wall of shame and silence to educate as best they can, with limited resources and often without the encouragement of family or community.

But all of these efforts exist at the margins, because the national narrative still has a chilling effect on getting to the deeper, more complex story.

Two former soldiers we spoke with showed the difficulty of piecing together a more complete story. The first man spent part of a morning sharing with us the lives of both his family and himself as a combatant during the war. But toward the end of the interview, when we asked him to pose for a photo holding a family heirloom battle flag his brother had carried during the war, he refused. He said it would be disrespectful and shameful to have his photo taken with the flag, as it would seem to honor Japan’s defeat.

The other man told us of his disbelief and desolation when, in Indonesia at the end of the war, he heard that Japan was defeated. Upon returning home after more than a year as a detainee held by Allied forces, he shuttered himself in the family home. The depression and shame brought on by the startling failure of their cause, and his humiliation that he had lived when so many of his friends had died, only began to lift two years later. Even after leaving his house, he could not cope with seeing American Occupation forces walking around his town of Beppu. He occasionally fought with them, landed in jail, and spent years in and out of trouble.

He explained that he only began to confront his anger and shame when his granddaughter asked him to tell her what he did during the war. It was a class assignment from her teacher, one of those few who pushed the students to get out and learn more about their own history.

Japan’s reluctance to address its history directly places it in a large camp of like-minded states. Growing up in the American South, I had textbooks that never honestly described the history or horror of slavery. The Chinese government censors many details of the Cultural Revolution and the full story of the Tiananmen protests and eventual killings of 1989. Under Putin, the Russian government discourages discussion of Stalin’s brutality, such as the state-ordered population displacements and famine of the 1930s. And Turkey to this day refuses to acknowledge the full extent of its slaughter of the Armenian people in the early 20th century.

But Japan need not remain in this camp. It can follow another model, that of its former ally, Germany.

The German government has established Documentation Centers around the country that detail, without ambiguity, the development and consequences of National Socialism and the Nazi Party. In Japan the closest equivalents are the small, community-based Peace Museums dotting the country. Unlike the German centers, however, the message in these museums is mixed. While all emphasize the need to learn from the horror of the war by promoting world peace, they avoid in-depth discussion of how, and why, Japan went to war. Instead they honor their own civilian and military dead, following the general story line of Japan as victim.

While Japanese children still study texts that mention the war only briefly, German schoolchildren find an honest and transparent rendering of the role Nazis, and by inference members of their own families, played in the deaths of millions of people. It is not left to individual teachers and private citizens to fill this void.

I think that the aging soldier I saw at the gate of the Tokyo Zoo 40 years ago represented both the past and future of Japan. Going off to war, he was a hero. But upon his return he served only as a reminder of the misguided and horrific tragedy orchestrated by the militarist government of wartime Japan.

Through truth telling and reconciliation, current and future generations can uncover layers of hidden history and be freed of the national shame and humiliation. We encountered many Japanese who want to move in this direction. What they hope for is a national leadership that will find inspiration from them and follow suit.
