Zócalo Public Square | Ideas: Journalism With a Head and a Heart

Is Forgiveness the Basis of a Healthy Democracy?
By Ramin Jahanbegloo | February 23, 2018

Why do we have such difficulty thinking about forgiveness? Read the news on any day and you’ll find stories of war, injustices present and past, and attacks on democracy. It’s apparently a world of apathy and lack of empathy for one another. Forgiveness is not a virtue of this de-civilizing world. But it is the responsibility of outsiders like philosophers and artists to think about forgiveness because it is a powerful personal and political tool that is essential to democracy, to peace, and for personally coming to terms with the injustices and suffering that humans experience and inflict upon those around them.

Philosophers can bring humanness out of the inhumane, as they can bring beauty out of ugliness and peace out of war. So philosophy is a powerful human tool for forgiveness, but it can also radically rethink the idea of forgiveness as the bearer of dignity. This is why philosophers are more than philosophers; they are individuals who can give meaning to the dignity of the human race. It was this flame of dignity and the power of philosophy to ignite it which led me to become a philosopher in the first place. However, philosophers have differed widely as to their answers to the question of forgiveness.

First, we have to establish what forgiveness means in a political sense. It is not an end to suffering. Suffering is part of life. As the German philosopher Arthur Schopenhauer said, “If the immediate and direct purpose of our life is not suffering then our existence is the most ill-adapted to its purpose in the world.” As such, each new generation, and every new human being, must pave anew the path of suffering.

The important question is how we deal with that suffering. One option, often taken in political situations, is to go for revenge against the person or people who have caused the suffering. But revenge doesn’t offer consolation. Only insofar as the heart can draw things into itself are they of any value. We are not only animals of reason, but also beings capable of compassion. It is only through forgiveness that we can derive consolation from the troubles of life.

Neither does forgiveness mean forgetting the wrongs that were done. Entering the process of forgiving does not necessarily mean that we hold back the bitter past. There are always memories of evil that we cannot forget. Many Holocaust survivors believe that forgiving the Nazis would fail the memory of past victims. But isn’t it true that forgiveness cannot forgive anything but the unforgivable? Otherwise it will lose its meaning.

What is important is how the action of forgiving works. Forgiving, as much as revenge, is one way of entering into a relationship with the Other. But while revenge is the negation of the Otherness of the Other—because it disregards and discards the Other as a moral person—forgiveness, instead, tries to enter into a dialogue with that Other.

Forgiveness is both the condition for dialogue and is also realized through dialogue. Dialogue is not a phenomenon that occurs from nowhere and goes nowhere—engaging in it establishes a shared past and creates a future. Furthermore, dialogue requires both questioning oneself and caring for the Other. Thus, forgiveness is about moral repair and rebuilding decency, trust, and hope.

Unlike revenge, forgiveness is not an automatic response to injustice. It requires much more reflection and thought. All human beings can be reflective, in the sense that thinking about what one does is part of doing it. Maybe what perplexes individuals so much about the concept of forgiveness is that forgiveness is seen and felt as a newcomer in our lives. Yet forgiveness, by including both the self and the Other, gives humanity a common horizon, and a shared future.

It is the creation of a shared future that makes forgiveness important in a democracy. A truly moral conception of citizenship requires that one listen to the other with empathy and learn from the past. It is the action of learning to forgive that can reverse the meaninglessness and thoughtlessness of the de-civilizing process we are currently going through.

However, there are pitfalls. The language of exclusion can easily lend itself to the invention of a revengeful worldview. When justice is no longer about compassion, it is only a table of abstract regulations that people use or abuse without care for others. This kind of formal “justice” lacks empathetic listening to the other and voids the possibility of forgiveness. In fact, forgiveness is more than a simple event: It is a paradigm shift to a new outlook on human affairs. If we seek forgiveness, whatever form it may take, we must labor to find it rather than work for an insignificant world based on values such as greed, power and hatred.

This is a responsibility that our human civilization should accept without fear or apprehension. The ethos of shared responsibility finds its best expression in the process of taming violence through acts of forgiveness. The best example of this can be seen in the moral and political efforts of Nelson Mandela to establish national reconciliation in post-apartheid South Africa. As Mandela said: “If you want to make peace with your enemy, you have to work with your enemy. Then he becomes your partner.”

This is where we should look for a political exercise of moderation and empathy and where a climate of cooperation and reconciliation could flourish.

Today, in a world suffused by feelings of insignificance and violence, indifference is no longer an option. To fail to recognize this is to betray our conscience. Indifference has cheapened our human life. Therefore, forgiveness is a quality that cannot be manufactured by businessmen and politicians. What’s more, it must have a level of sincerity—individuals have to see past their own arrogance and hostility to pursue decency and human dignity.

Its ongoing relevance makes forgiveness all the more compelling in current debates on violence, democracy, and culture. As Archbishop Desmond Tutu said in the context of the South African Truth and Reconciliation Commission, “It is ultimately in our own best interests that we become forgiving, repentant, reconciling and reconciled people, because without forgiveness, without reconciliation, we have no future.”

While some will follow Tutu’s advice, others will think that what he suggests is madness. If there is only one beautiful madness in the world which can free us from all forms of political and religious lunacy, it’s the act of forgiving the person while not forgetting the event.

This is when we enter the stage of history not from the back door, but by being fully present in the agora in order to predict the horrors and warn others. As Hannah Arendt says: “Men in plural can experience meaningfulness only because they can talk with and make sense to each other and themselves.” As such, forgiveness, as a new beginning, is not when the past is forgotten or hidden in a corner of our mind; it is when our past sufferings are not repeated and we do not repeat each other’s.

We accomplish the politics of forgiveness when we are capable of organizing our societies around the idea of decency of humanity. There is no reason to think that this struggle is a lost cause.

How White Settlers Buried the Truth About the Midwest’s Mysterious Mounds
By Sarah E. Baires | February 22, 2018

Around 1100 or 1200 A.D., the largest city north of Mexico was Cahokia, sitting in what is now southern Illinois, across the Mississippi River from St. Louis. Built around 1050 A.D. and occupied through 1400 A.D., Cahokia had a peak population of between 25,000 and 50,000 people. Now a UNESCO World Heritage Site, Cahokia was composed of three boroughs (Cahokia, East St. Louis, and St. Louis) connected to each other via waterways and walking trails that extended across the Mississippi River floodplain for some 20 square km. Its population consisted of agriculturalists who grew large amounts of maize, and craft specialists who made beautiful pots, shell jewelry, arrow-points, and flint clay figurines.

The city of Cahokia is one of many large earthen mound complexes that dot the landscapes of the Ohio and Mississippi River Valleys and across the Southeast. Despite the preponderance of archaeological evidence that these mound complexes were the work of sophisticated Native American civilizations, this rich history was obscured by the Myth of the Mound Builders, a narrative that arose ostensibly to explain the existence of the mounds. Examining both the history of Cahokia and the historic myths that were created to explain it reveals the troubling role that early archaeologists played in diminishing, or even eradicating, the achievements of pre-Columbian civilizations on the North American continent, just as the U.S. government was expanding westward by taking control of Native American lands.

Today it’s difficult to grasp the size and complexity of Cahokia, composed of about 190 mounds in platform, ridge-top, and circular shapes aligned to a planned city grid oriented five degrees east of north. This alignment, according to Tim Pauketat, professor of anthropology at the University of Illinois, is tied to the summer solstice sunrise and the southern maximum moonrise, orientating Cahokia to the movement of both the sun and the moon. Neighborhood houses, causeways, plazas, and mounds were intentionally aligned to this city grid. Imagine yourself walking out from Cahokia’s downtown; on your journey you would encounter neighborhoods of rectangular, semi-subterranean houses, central hearth fires, storage pits, and smaller community plazas interspersed with ritual and public buildings. We know Cahokia’s population was diverse, with people moving to this city from across the midcontinent, likely speaking different dialects and bringing with them some of their old ways of life.

View of Cahokia from Rattlesnake Mound ca 1175 A.D., drawn by Glen Baker. Image courtesy of Sarah E. Baires.

The largest mound at Cahokia was Monks Mound, a four-terraced platform mound about 100 feet high that served as the city’s central point. Atop its summit sat one of the largest rectangular buildings ever constructed at Cahokia; it likely served as a ritual space.

In front of Monks Mound was a large, open plaza that held a chunk yard to play the popular sport of chunkey. This game, watched by thousands of spectators, was played by two large groups who would run across the plaza lobbing spears at a rolling stone disk. The goal of the game was to land their spear at the point where the disk would stop rolling. In addition to the chunk yard, upright marker posts and additional platform mounds were situated along the plaza edges. Ridge-top burial mounds were placed along Cahokia’s central organizing grid, marked by the Rattlesnake Causeway, and along the city limits.

Cahokia was built rapidly, with thousands of people coming together to participate in its construction. As far as archaeologists know, there was no forced labor used to build these mounds; instead, people came together for big feasts and gatherings that celebrated the construction of the mounds.

The splendor of the mounds was visible to the first white people who described them. But they thought that the American Indian known to early white settlers could not have built any of the great earthworks that dotted the midcontinent. So the question then became: Who built the mounds?

Early archaeologists working to answer the question of who built the mounds attributed them to the Toltecs, Vikings, Welshmen, Hindus, and many others. It seemed that any group—other than the American Indian—could serve as the likely architects of the great earthworks. The impact of this narrative led to some of early America’s most rigorous archaeology, as the quest to determine where these mounds came from became a salacious conversation piece for America’s middle and upper classes. The Ohio earthworks, such as the Newark Earthworks, a National Historic Landmark located just outside Newark, Ohio, were thought by John Fitch (builder of America’s first steam-powered boat in 1785) to be military-style fortifications. This contributed to the notion that, prior to the Native American, highly skilled warriors of unknown origin had populated the North American continent.

This was particularly salient in the Midwest and Southeast, where earthen mounds from the Archaic, Hopewell, and Mississippian time periods crisscross the midcontinent. These landscapes and the mounds built upon them quickly became places of fantasy, where speculation as to their origins rose from the grassy prairies and vast floodplains, just like the mounds themselves. According to Gordon Sayre (The Mound Builders and the Imagination of American Antiquity in Jefferson, Bartram, and Chateaubriand), the tales of the origins of the mounds were often based in a “fascination with antiquity and architecture,” as “ruins of a distant past,” or as “natural” manifestations of the landscape.

When William Bartram and others recorded local Native American narratives of the mounds, they seemingly corroborated these mythical origins of the mounds. According to Bartram’s early journals (Travels, first published in 1791), the Creek and the Cherokee who lived around mounds attributed their construction to “the ancients, many ages prior to their arrival and possessing of this country.” Bartram’s account of Creek and Cherokee histories led to the view that these Native Americans were colonizers, just like Euro-Americans. This served as one more way to justify the removal of Native Americans from their ancestral lands: If Native Americans were early colonizers, too, the logic went, then white Americans had just as much right to the land as indigenous peoples.

Location of Cahokia, East St Louis, and St Louis sites in the American Bottom. Map courtesy of Sarah E. Baires.

The creation of the Myth of the Mounds parallels early American expansionist practices like the state-sanctioned removal of Native peoples from their ancestral lands to make way for the movement of “new” Americans into the Western “frontier.” Part of this forced removal included the erasure of Native American ties to their cultural landscapes.

In the 19th century, evolutionary theory began to take hold of the interpretations of the past, as archaeological research moved away from the armchair and into the realm of scientific inquiry. Within this frame of reference, antiquarians and early archaeologists, as described by Bruce Trigger, attempted to demonstrate that the New World, like the Old World, “could boast indigenous cultural achievements rivaling those of Europe.” Discoveries of ancient stone cities in Central America and Mexico served as the catalyst for this quest, recognizing New World societies as comparable culturally and technologically to those of Europe.

But this perspective collided with Lewis Henry Morgan’s 1881 text Houses and House-life of the American Aborigines. Morgan, an anthropologist and social theorist, argued that Mesoamerican societies (such as the Maya and Aztec) exemplified the evolutionary category of “Middle Barbarism”—the highest stage of cultural and technological evolution to be achieved by any indigenous group in the Americas. By contrast, Morgan said that Native Americans located in the growing territories of the new United States were quintessential examples of “Stone Age” cultures—unprogressive and static communities incapable of technological or cultural advancement. These ideologies framed the archaeological research of the time.

In juxtaposition to this evolutionary model, there was unease about the “Vanishing Indian,” a myth-history of the 18th and 19th centuries that depicted Native Americans as a vanishing race incapable of adapting to the new American civilization. The sentimentalized ideal of the Vanishing Indian—a people seen as noble but ultimately doomed to be vanquished by a superior white civilization—held that these “vanishing” people, their customs, beliefs, and practices, must be documented for posterity. Thomas Jefferson was one of the first to excavate a Native American burial mound, citing the disappearance of the “noble” Indians—caused by violence and the corruption of the encroaching white civilization—as the reason such excavations were needed. Enlightenment-inspired scholars and some of America’s Founders viewed Indians as the first Americans, to be used as models by the new republic in the creation of its own legacy and national identity.

During the last 100 years, extensive archaeological research has changed our understanding of the mounds. They are no longer viewed as isolated monuments created by a mysterious race. Instead, the mounds of North America have been proven to be constructions by Native American peoples for a variety of purposes. Today, some tribes, like the Mississippi Band of Choctaw, view these mounds as central places tying their communities to their ancestral lands. Similar to other ancient cities throughout the world, Native North Americans venerate their ties to history through the places they built.

What the Path of Curry Tells Us About Globalization
By Lizzie Collingham | February 21, 2018

One Sunday morning in 1993, “Bushman,” “Spider,” “Tall Boy,” and “Crab Dog” were gathered at a rum shop in the Guyanese coastal village of Mahaica. The rainy season had driven these Afro-Guyanese diamond miners out of the interior, and they had settled down for a companionable drinking session. They were joined by Terry Roopnaraine, an anthropologist gathering information for a study of gold and diamond mining. Spider, who was flush with the proceeds from a big strike, was treating the others from his earnings. Eventually, Bushman announced that he had killed an iguana the day before.

“So, let’s cook him,” the men declared.

Tall Boy, who worked as a cook out in the mining camps, persuaded the rum shop owner to let him use the kitchen in back. He softened onions in coconut oil while he slit the belly of the iguana, cleaned out its guts, and chopped up the carcass. After sprinkling a generous helping of curry powder over the onions, he threw the meat in the pot followed by a glug of cane spirit to counter its musty odor. He set a pot of rice to cook while the curry was simmering and the men drank yet more rum. When the food was ready, they eagerly wolfed it down and then sauntered off to doze away the afternoon.

The story of how a group of Afro-Guyanese diamond miners came to be making a bush-meat version of an Indian curry begins in 1627, when a band of 50 British men founded the colony of Barbados on the then-uninhabited island.

These colonists experimented with growing tobacco, cotton, ginger, and indigo as they sought to make their fortunes. But it was not until James Drax returned from a visit to Portuguese Brazil with sugar cane in 1640 that the settlers found the cash crop of their dreams. An acre of land planted with cane earned Drax four times more than an acre planted with tobacco. Within a decade, every scrap of land on the 169-square-mile island was planted with sugarcane, and Barbados became the wealthiest colony in the British Empire.

Imperial Federation map of the world showing the extent of the British Empire in 1886. Image courtesy of Boston Public Library/Flickr.

The planters initially employed white indentured laborers and convicts to do the back-breaking and dangerous work of cultivating and processing the sugar cane. In Britain under Oliver Cromwell, to be “Barbadosed” was to be transported into servitude on the West Indian plantations. But white men were gradually replaced by West African slaves, as English ships tapped into the trans-Atlantic slave trade already established by the Portuguese to supply workers for their South American sugar plantations.

At the end of the 18th century, on a tour of the West Indies, which included Barbados, ship’s purser Aaron Thomas was so shocked by the conditions in which the slaves lived and worked that he wrote in his sea diary, “I never more will drink Sugar in my Tea, for it is nothing but Negroe’s blood.”

The British dissolved most of the sugar they consumed in tea. Over the 100 years after Drax sent home his first consignment, sugar poured into Britain in unprecedented quantities and its consumption rose in tandem with that of tea. The price of both commodities fell until even the poorest could afford them. A late 18th-century survey found that rural laborers spent as much as 10 percent of their annual income on tea and sugar, while Friedrich Engels noticed that tea was “quite indispensable” to the inhabitants of Manchester’s slums.

Social commentators condemned the poor for wasting their money on these “luxuries.” But tea-drinking was a symptom not of extravagance but of workers’ impoverishment. Rising food and fuel prices meant that they could barely afford to cook a warm meal let alone simmer the wort to brew their own beer. As a consequence, they turned for sustenance to shop-bought bread washed down with tea sweet enough to provide them with energy as well as the illusion that they were eating a warm meal. One family of ironworkers dissolved four pounds of sugar—enough to fill 10 teacups—in their weekly half-pound of tea. Britain’s Industrial Revolution was fueled by tea and sugar.

During the Napoleonic Wars (1803-15), Britain acquired two new sugar-producing colonies: French Mauritius and British Guiana. But when slavery was abolished in 1838, more than half the West African slaves turned their backs on the sugar plantations, resulting in a drastic fall in production.

The planters searched for a replacement workforce and found it in India, where a growing number of impoverished laborers were seeking work. Britain’s Industrial Revolution had sent India’s economy into turmoil. In the 1820s the British imposed protective tariffs closing the British market to Indian textiles; instead, cheap, machine-made Manchester cottons flooded into India. Millions of Indian artisans went out of business and joined the growing number of landless laborers pushed off the land by debt.

And so the resources of the British Empire, which had been channeled into the slave trade, were redirected into moving hundreds of thousands of Indians to work on plantations around the globe. Between 1838 and 1916, about 240,000 Indians were taken to British Guiana.

This was how curry came to the northern coast of South America and the country now called Guyana.

Indian indentured laborers were given rations of rice, lentils, coconut oil, sugar, salt, and curry powder, all of which allowed them to make a semblance of Indian meals. The one ingredient that they would not have used in India was curry powder. This was a British invention. The Victorians would usually mix cayenne pepper, cumin, coriander, lots of turmeric (Indians tended to be more sparing in their use of this spice), and fenugreek, a spice commonly used around Madras, where the first curry powder factories were set up.

When the British had first settled in India as merchants and traders they had loved Indian food, and brought cooks and recipes back to Britain with them when they retired. The first cookbook to include a recipe for “how to make a curry the India way” was Hannah Glasse’s The Art of Cookery Made Plain and Easy published in 1747. In Indian kitchens the spices were freshly roasted and ground each morning before they were added to the dishes at different stages in the cooking process. Glasse’s recipe attempted to replicate Indian practice by instructing the cook to first roast the spices on a shovel over the fire before beating them to a powder.

However, as the British grew accustomed to making Indian food, they took shortcuts, and over time Victorian cooks transformed curries into spicy casseroles. An essential element in this transformation was the use of standardized, pre-mixed curry powders that became commercially available in the 1780s. The Victorians would use a spoonful of curry powder to curry anything from periwinkles to sheep’s trotters. British curries became unrecognizable to Indian visitors who were dismayed when they were confronted with these hashes “flavoured with turmeric and cayenne.”

Even though it was Indian indentured laborers who first taught the Africans in Guyana how to cook curry, Indian food there was diminished by the forces of Empire. Limited rations meant that the indentured laborers struggled to replicate Indian home cooking. There were only two types of pea available to make dhal and everything had to be cooked in coconut oil. The biggest handicap was that, instead of the plethora of different spices and herbs available to an Indian cook, the laborers were only given a pre-mixed curry powder. This meant that, in Guyana, the panoply of different styles and types of dish were replaced by one simplified version of a curry. Although there are distinctively Guyanese combinations, such as shrimp and pumpkin, Indo-Guyanese dishes are all variations on one theme.

The meal of iguana curry eaten by a group of Afro-Guyanese diamond miners on a Sunday morning in 1993 carries within it the story of how Africans and Indians were brought to the Americas by the British craving for sweetness.

These two groups of arrivals and their progeny developed different relationships with their new homeland. The Indo-Guyanese tended to stay close to the coast within the orbit of the plantation world. Once freed from slavery, the Afro-Guyanese had made their way into the interior to tap rubber, and prospect for gold and diamonds. Here they interacted with the Amerindians who taught them how to hunt and cook the forest animals. And so the Africans applied the currying technique they had learned from their Indian neighbors to bush meat.

The British Empire was a powerful force for spreading new foods and new ways of eating throughout the globe. And yet it was also a powerful force for homogenization. Through the collisions of history, middle-class British housewives and South American descendants of African slaves both ended up eating a similar version of Indian food: a curry of peculiar meats, made with curry powder.

Before You Push That Big Nuclear Button, Consider the Source
By Robert Bartholomew | February 20, 2018

Shortly after 8 a.m. on January 13, 2018, the Hawaii Emergency Management Agency sent out a chilling alert to residents across the state of Hawaii: “BALLISTIC MISSILE THREAT INBOUND TO HAWAII. SEEK IMMEDIATE SHELTER. THIS IS NOT A DRILL.”

Thousands of frightened people flocked to shelters; some even climbed down manholes to save themselves. Hawaii State Representative Matthew LoPresti told CNN: “I was sitting in the bathtub with my children, saying our prayers.” It was not until 38 minutes later that a second message made it clear that the first had been a false alarm.

Such episodes are not new. Since the advent of mass communications, similar scares have taken place. From intentional hoaxes to accidental alerts, we have become susceptible to reports of terrifying events that never come to pass.

Canadian philosopher and media scholar Marshall McLuhan famously observed that we all live in a global village. Where it once took months to relay news from the other side of the world, it now takes less than a second. The problem is, with so many people now reliant on the internet and mobile phones, the potential for technology-driven hoaxes, panics, and scares has never been greater. And such events, while usually short-lived, have tremendous power to wreak widespread fear and chaos.

The most famous example of mass panic in response to an announcement is the 1938 “War of the Worlds” radio drama produced by Orson Welles at WABC’s studios in New York City. Broadcast across the United States and in parts of Canada, the play narrated a fictitious Martian invasion of the New York-New Jersey metropolitan area. In his bestselling book The Invasion From Mars, American psychologist Hadley Cantril wrote that of the estimated 6 million listeners, about 1.5 million were frightened or panicked.

A smartphone displaying a false incoming ballistic missile emergency alert, sent Jan. 13, 2018. Image courtesy of Caleb Jones/Associated Press.

While contemporary sociologists who have re-examined the episode believe that the number of those who panicked was much smaller, no doubt many were frightened. Some people living near the epicenter of the play—tiny Grover’s Mill, New Jersey—tried to flee the Martian “gas raids” and “heat rays.” During the hour-long broadcast, The New York Times fielded 875 phone calls about it. Curiously, Cantril found that about one-fifth of listeners thought that America was under attack by a foreign power—most likely Germany, using advanced weapons. Historical context often plays a major role in shaping the mindset of listeners.

While perhaps the best-known incident, the 1938 broadcast did not have the most serious consequences. That distinction goes to another radio play, broadcast in 1949, that caused pandemonium in Quito, Ecuador. That highly realistic production mentioned real people and places, and included impersonations of government leaders. A correspondent for The New York Times in Quito at the time described a tumultuous scene as the drama “drove most of the population of Quito into the streets” to escape the Martian “gas raids.”

After realizing that the broadcast was a play, an enraged mob marched on the radio station, surrounded the building, and burned it to the ground. At least 20 people died in the rioting and chaos. Authorities had trouble restoring order, as most of the police and military had been sent to the nearby town of Cotocollao to repel the Martians.

These “War of the Worlds” broadcasts were not the first to provoke false alarms. One of the earliest recorded scares occurred in the United Kingdom in 1926, when the BBC reported that the government was under siege and could fall amid a bloody uprising by disgruntled workers. In reality, the streets of London were calm; people had only been listening to a radio play airing on the regularly scheduled Saturday evening radio program.

The BBC broadcast announcements throughout the evening, apologizing and reassuring listeners. But the story was plausible, given deep tensions with labor unions at the time. And, four months after the broadcast, a historic General Workers Strike rocked the government, as 1.5 million workers took to the streets over 10 days to call for higher wages and better working conditions.

Decades later, on March 20, 1983, hundreds of Americans were frightened by a broadcast of the NBC Sunday Night Movie Special Bulletin, about news coverage of a group of terrorists who were threatening to detonate a nuclear bomb in South Carolina. The program began like any other Sunday night movie but was quickly “interrupted” by breaking news bulletins from well-known news outlets such as Reuters and the Associated Press, showing scenes of devastation. The film even recreated a White House press briefing. Special bulletins throughout the show were interrupted by “live feeds” from the terrorists. Conveniently, a reporter and a cameraman happened to be among the “hostages.”

The realism of the special broadcast was instrumental in creating the panic. The fictional NBC affiliate broadcasting the siege in Charleston, WPIV, was similar to the real affiliate, WCIV. The real-world WCIV-TV received 250 calls; others rang the police. WDAF in Kansas City fielded 37 calls. At San Francisco’s KRON-TV, 50 calls were logged in the first 30 minutes. One woman was so convinced that she called to complain that too much air time was being given to the terrorists. The program began with an advisory that what viewers were about to see was a “realistic depiction of fictional events” but was not actually happening. Unfortunately, the next advisory did not run until 15 minutes later. The movie won an Emmy, but across the country viewers mistook it for live news coverage and thought a nuclear catastrophe was imminent.

It wouldn’t be the last nuke scare. On November 9, 1982, WSSR-FM in Springfield, Illinois, reported that there had been an accident at a nearby nuclear power plant. The program began by claiming “that a nuclear cloud was headed for Springfield.” Concerned residents immediately deluged police with phone calls, prompting the station, which was operated by Sangamon State University, to pull the plug on the half-hour drama after just two and a half minutes. The Illinois Emergency Services and Disaster Agency was not amused. Director Chuck Jones said: “I’m still shocked that someone out at that station let that get on the air.” The nuclear power plant depicted in the program was located 25 miles northeast of the city, and while it was not operating at the time, people didn’t have any way of knowing this.

While many false alarms have apparently been unintentional, there are egregious examples of deliberate attempts to cause widespread fear. On January 29, 1991, DJ John Ulett of radio station KSHE-FM in Crestwood, Missouri, decided to protest America’s involvement in the Persian Gulf War by airing the following announcement: “Attention, attention. This is an official civil defense warning. This is not a test. The United States is under nuclear attack.” Worried listeners flooded the station with phone calls. While Ulett’s statement did not trigger a mass panic, the Federal Communications Commission fined the station $25,000. Ulett was suspended but managed to save his job.

Poorly timed jokes have also triggered dramatic results. On August 11, 1984, just before his weekly radio address, President Ronald Reagan tested his microphone by saying: “My fellow Americans, I am pleased to tell you today that I’ve signed legislation that will outlaw Russia forever. We begin bombing in five minutes.” Americans didn’t panic, because the broadcasters knew he was joking, but the Russians, nervous at a time of considerable distrust between the two countries, placed their armed forces on standby.

Even fleeting scares can have long-term consequences. The 1938 Martian scare resulted in jammed phone lines. In Trenton, emergency services were knocked out for six hours. In Quito, damage from the rioting was estimated at $350,000—an enormous sum at the time.

In 1597, English philosopher Francis Bacon famously observed, “Knowledge is power.” But in today’s world, knowledge comes with risk. The most educated and technologically adept generation in the history of the world is also the most vulnerable.

What Benjamin Franklin Ate When He Was Homesick
By Rae Katherine Eighmey | February 19, 2018

In the midst of the American Revolution, Benjamin Franklin envisioned the turkey as an exemplar of the ideal American citizen. In a 1783 letter home to his daughter Sally, written while Franklin was serving as chief diplomat to France, he wrote about the “ribbons and medals” presented to the French by grateful Americans in thanks for significant military and financial support. The tokens bore an image of an eagle—but, Franklin explained, some recipients complained that the workmanship was not up to sophisticated French standards. They thought that the eagle looked more like a turkey.

Franklin asserted that this plucky fowl would have been a better choice in the first place. Eagles were found in many countries, but the turkey was an American native and “a bird of courage,” a fitting symbol of America’s valor and virtues. It “would not hesitate to attack a grenadier of the British Guard who should presume to invade his farm yard with a red coat on,” he wrote to Sally. Turkeys were tasty, too, Franklin further explained to her, first brought to France by the Jesuits and served to citizens of note including “at the Wedding Table of Charles the ninth,” in 1570. Some 200 years later, Franklin had served turkey to guests of his own in Philadelphia, and now the sumptuous fowl were often on his diplomatic table in Passy, France. Ever practical, Benjamin “Waste not, want not” Franklin clearly appreciated that a bird of taste and courage could nourish both the body and spirit of the nation.

I write about food, using it as an interpretive tool to understand history and historical figures, so I was delighted to see that Benjamin Franklin liked writing about the topic as well, in his letters and political articles. Franklin realized the impact that foods—particularly, native foods—had in building the identity of a new nation. By the time he wrote that letter to his daughter praising the courage of turkeys, Franklin had pondered and promoted the idea of an American identity for more than 40 years, first as a loyal colonist, later as an emerging patriot, and then as one of the nation’s founders. He championed the things that set his homeland apart from its English and European heritage. Out of his ponderings he would come to define the essential American persona and the ingredients of American success. American geography and its bounty, including its food, were central to the recipe.

Franklin recognized American potential early. In his 1751 essay Observations Concerning the Increase of Mankind, written 25 years before the Declaration of Independence, he explained, “so vast is the territory of North America, that it will require many ages to settle it fully; and, till it is fully settled, labor will never be cheap here, where no man continues long a laborer for others, but gets a plantation [farm] of his own, no man continues long a journeyman to a trade, but goes among those new settlers, and sets up for himself, &c.” Franklin contrasted this vista of opportunity with life across the Atlantic, where “Europe is generally full settled with husbandmen, manufacturers, &c. and therefore cannot now much increase in people ….”

Throughout his political and diplomatic career—including 26 years spent in England and France, before and during the American Revolution—Franklin worked hard to convey the strengths of the American setting and the American character. He became a celebrity when he lived near Paris from 1776 to 1785, his image adorning all manner of objects. People felt as though they knew him personally, and he would happily answer their questions about America.

Often, his message was that success in America required persistent hard work, and that no one should travel to the New World unprepared for the challenge. “Many I believe go to America with very little; and with such romantic schemes and expectations as must end in disappointment and poverty,” Franklin wrote in another 1783 letter to Sally and her husband Richard Bache. “I dissuade, all I can, those who have not some useful trade or art by which they may get a living; but there are many who hope for offices and public employments, who value themselves and expect to be valued by us for their birth or quality, though I tell them those things bear no price in our markets. But fools will ruin themselves their own way.”

Products of the land fit easily into Franklin’s vision of American industriousness and greatness. When he lived in London and worked as an agent for the Pennsylvania Colony, in two postings between 1757 and 1775, his wife Deborah sent him a wide variety of his favorite American foods: smoked wild venison, home-cured hams, and dried peaches—he preferred those dried without the peel. She sent kegs of cranberries, which caused great wonder among Franklin’s landlady’s kitchen staff. Deborah shipped barrels of her husband’s favorite Newtown Pippin apples, an American native variety famed for its superior keeping qualities. She sent grafted trees for planting on Franklin’s friends’ estates, symbols of the productive exchange of ideas and commerce he sought to encourage between the Colonies and the Crown.

But it was maize, the primary American grain, that was Franklin’s most ideologically impactful import. Shared by Native Americans with the pilgrims and, now, after more than 100 years of European settlement, widely cultivated on farms across the colonies, this “Indian corn,” as it had been commonly known, was often praised by Franklin for its taste and variety of uses. Deborah’s packages included dried or parched corn kernels, cornmeal, and nocake—a flour made from parched corn that Franklin’s London cook probably used to make pancakes. In letters home to Philadelphia, Franklin thanked Deborah for the corn meals and flours she sent: “For since I cannot be in America, every thing that come from thence comforts me a little, as being something like home,” he wrote, noting specifically that, “The nocake proves very good.”

Benjamin Franklin often considered the essence of his homeland from afar, especially as the 1760s brought the disintegration of the mutually supportive and respectful relationship between England and its American Colonies, which he and other patriots had sought to cultivate. In 1766, at the height of the Stamp Act controversy, Franklin, then living in London, wrote several letters to the editor using pen names such as “Homespun” to make the case for the Colonies. Again, his thoughts turned to corn. He employed it as a metaphor to dramatize the differences between the dynamic, diverse American settlements and the staid English homeland.

In one essay published in several London newspapers in early January 1766, Franklin contrasted the essential American grain’s virtuosity and variety with the limitations of lowly English wheat. “[Maize] is one of the most agreeable and wholesome grains in the world; that its green ears roasted are a delicacy beyond expression; that samp, hominy, succatash, and nocake, made of it, are so many pleasing varieties; and that a johny or hoecake, hot from the fire, is better than a Yorkshire muffin.” Franklin went on to say that British essay writers who preferred “the roast beef of Old England” and condemned corn as “disagreeable” and “indigestible” without even tasting it suffered from a misguided sense of superiority. He saw their snobbery as a metaphorical parallel to the Crown’s (faulty) assumption that it understood American people and possibilities, and thus knew best how to rule the colonies.

The first shots of the Revolution were at the battle of Lexington and Concord in April 1775, and in June and July of 1776 members of the Continental Congress wrote and signed the Declaration of Independence, with its aspirations to “life, liberty, and the pursuit of happiness”—truths that were, in the word Benjamin Franklin himself wrote into that founding document, “self-evident.” Franklin would spend the next 10 years in France promoting American freedom and possibilities.

After the war, when an old English friend, the Earl of Buchan, sought resettlement advice, Franklin told him that, “The only encouragements we hold out to strangers, are a good climate, fertile soil, wholesome air, and water, plenty of provisions and fuel, good pay for labor, kind neighbors, good laws, liberty, and a hearty welcome. The rest depends on a man’s own industry and virtue.”

Franklin cautioned that America’s streets were not paved with gold. But, he might also have added, the new nation’s fields and orchards were indeed filled with delicious turkeys, exceptional apples, and golden grains of opportunity.

Why Are There so Many Statues of Men on Horseback?
By Peter Louis Bonfitto | February 16, 2018

Statues are created to project meaning. Contemporary public artworks, for example, use purposely veiled messages aimed to generate thoughtful exchange with the viewer and to prompt reflection. By contrast, historic monumental sculptures employ symbolism that is direct and intentionally easy for viewers to understand.

The ancient Roman tradition of publicly displaying monumental equestrian statues of important historical figures is a particularly striking case of how to convey meaning in no uncertain terms.

Traditionally cast in bronze, these huge forms of horse and rider display messages of dominance, power, and virtue through strength. And, by doing so, they established a template that has persisted for centuries.

Perhaps, no statue embodies these values more than a famous depiction of the emperor Marcus Aurelius on horseback.

The figure of the emperor is seated on top of a regal horse, artfully posed as if it were moving gracefully through a crowd. The rider fully controls his muscular mount. The horse’s bit and bridle indicate that there were originally bronze reins, which were separately cast pieces lost over the centuries. The emperor’s posture, legs, and complete mental dominance over the beast underscore his great power. The emperor effortlessly motions with an outstretched left arm and hand. Art historians have defined this gesture as one of “pacification,” displaying authority and the ability to subjugate foreign enemies or forces of chaos that threatened the stability of the Empire.

This monumental bronze equestrian statue, inarguably one of the most extraordinary artworks that have come down to us from antiquity, was created to commemorate Marcus Aurelius’ great victories over Germanic tribes in 176 CE, or possibly posthumously to honor his prosperous reign (161-180 CE), when he was canonized as one of Rome’s greatest emperors—a leader who ruled with intellect and decisive action.

Thousands of years later, the statue has become a hallmark in art history textbooks and the pride of the Capitoline Museum in Rome, where droves of tourists flock to see the work and to match it against the illustrations in their guidebooks.

Equestrian statues were not first seen in Rome. Before the Roman Empire existed, other regional cultures used this form of representation to commemorate their noblemen, kings, and heroes. The Romans were acutely aware of these artworks, especially those from Greece, and sought to collect and display them in their luxury villas, as well as dedicate them in public and religious spaces.

Roman equestrian statues, like many equestrian statues before and after, were about much more than men with horses; they embodied the relationship between the leader and the military. The equites, a military class, played an incredibly important role in Roman society. Generally speaking, the equites received positions of privilege through merit and imperial favor instead of through a noble bloodline. Emperors selected members of this class for their elite Praetorian Guard, and used them politically as a counterweight to the power of the high-ranking bureaucrats of the senatorial class.

The Western tradition of creating equestrian statues is interlinked with an appreciation of the equites. In the Marcus Aurelius statue, the emperor’s powers of statecraft are stressed, but his mastery of horsemanship clearly connects him to this elite military class, upon which he depended to manifest his far-reaching power.

Observed carefully, this statue of horse and rider idealizes Marcus Aurelius in an imagined moment of triumph. Romans loved victory, which they celebrated in grand civic gestures called “Triumphs.” They erected massive stone arches, many of which survive in the streets of Rome today, for their military heroes to ceremoniously march beneath in parades. Victorious generals or emperors would have ridden in horse-drawn chariots in these orchestrated events. While it is not specifically a depiction of a “Triumph,” the emperor wears a military cloak representing, perhaps, a humbler moment of victorious homecoming, one that symbolizes all of his achievements.

Ancient sources reference 22 equi magni—colossal bronze equestrian statues—that adorned the imperial capital. It is believed that this Marcus Aurelius statue was one of them, but centuries after the empire fell, the other 21 equi magni—along with hundreds of public statues throughout the city—were melted down during times of war or strife.

This single statue was preserved by a strange twist of fate. During the turmoil of Imperial Rome’s decline and its transformation into a medieval Christian center, the statue was incorrectly identified as the later emperor Constantine, who reigned from 306-337 CE. To later Christians, Constantine was a more favorable historic figure, as they widely believed that his 313 CE Edict of Milan, also known as the “peace of the church,” created a pathway for Christianity to become the principal religion of Europe. Medieval Romans could live more easily under the shadow of a statue identified as the Christian patron Constantine than with the polytheist Marcus Aurelius.

Although modern scholarship can easily settle the matter of the statue’s identity, the conflation of Marcus Aurelius and Constantine indicates that, although the two emperors ruled Rome in very different political and social periods, the statue’s message of military prowess and leadership could be applied to them both. Constantine’s most important victory was at the Battle of the Milvian Bridge (312 CE), when he—not unlike Julius Caesar in his famous crossing of the Rubicon—led an army into the city, crushed his enemies, and consolidated his power.

The conflation of Marcus Aurelius with Constantine reveals another part of the statue’s message—in which the mounted hero embodies virtue. Marcus Aurelius has been historically cast as the emperor exemplar, a Stoic philosopher with deep moral convictions. His oft-quoted collection of writings, Meditations, remains standard college course reading. Constantine’s virtue, in contrast, is attributed to his Christianity: He reportedly had a vision of the cross before his fateful battle for control of Rome, which led to his status of sainthood as a true defender of the faith. In medieval and later depictions of Constantine, he is almost always looking up, a gesture meant to signal that God is directly communicating with the emperor in his moment of triumph.

The allure of Rome’s greatness held sway over Western culture for millennia and the Marcus Aurelius statue directly contributed to this phenomenon. Part of what made it so influential was simply the fact that it was visible.

In contrast to many excavated archeological discoveries, the statue of Marcus Aurelius and his horse stayed above ground, where it surveyed the streets of Rome for nearly 2,000 years. By the mid-10th century this statue stood near the Pope’s Lateran Palace, and in 1538 it became the focal point of Michelangelo’s new design for the Piazza del Campidoglio. A replica is there today; the original was moved inside the Capitoline Museum in 1981 for conservation reasons.

Bronze replica of the Marcus Aurelius statue in Rome’s Piazza del Campidoglio. Photo courtesy of Jean-Pol Grandmont/Wikimedia Commons.

Through the circumstances of its survival, its artistic quality, and its sheer size, the statue became widely known and revered throughout Europe long after the fall of Rome. Local artists and travelers copied it in their sketchbooks and used it as a model for their own artworks. Later rulers appropriated its formidable symbolism for themselves. Charlemagne commissioned his own portraits of dominance, using the Marcus Aurelius statue as a prototype. Machiavellian-style princes of Renaissance Italy revived the ancient Roman tradition in earnest, placing remarkably crafted bronze statues of themselves on mounts in their public squares.

From Europe, the tradition was exported to colonial territories from the early modern period well into the 20th century. The deeply embedded concepts of power and virtue on horseback were exploited by Mussolini and Franco to frame their autocratic regimes—even as the machine gun and other industrial weaponry made mounted warriors obsolete.

Although the specific messaging of any given equestrian statue changes through iconographic details, the basic model persists, as do the core themes of power and virtue. Most equestrian statues visible today in public spaces stick to form by portraying a stoic general on a sturdy horse with an outstretched arm, perhaps brandishing a sword for added effect. Typically, they wear ceremonial dress, not combat armor or field uniforms, a detail that brings them back to their greatest moments of triumph.

In Oyster Bay, the longtime home of Teddy Roosevelt, stands a statue (not to be confused with the recently vandalized statue in New York City) that depicts the 26th U.S. president on horseback in his Rough Rider gear, alluding to his leadership of a volunteer cavalry regiment in Cuba during the Spanish-American War. Pulling decisively on the reins of his horse and surveying the field before him, he is shown as a man of action, vigor, and clear mind—a rough-and-tumble message that echoes back to antiquity.

In recent months, Americans have fiercely debated whether to preserve or tear down statues of horse-mounted Confederate generals. But criticism of equestrian statues is hardly new to the 21st century; rather, uneasiness with their symbolism has been embedded within the tradition since ancient times. The famous first-century BCE orator and writer Cicero, who frequently commented on virtus—a Roman concept that combines the “masculine” virtues of excellence, valor, honor, and integrity—condemned the erection of public equestrian statues as shameless acts of arrogance. Not immune to political hypocrisy, Cicero later supported resolutions to honor his allies with equestrian statues installed in public spaces. But his core criticism of equestrian statues is clear: Representing an individual on horseback in a civic space suggests an infallibility of character and literally sets them above public scrutiny.

Some two millennia later, we can level new criticism at the Marcus Aurelius statue, or at the figure it represents. Even though Marcus Aurelius has historically been labeled one of the “good” emperors of ancient Rome, his army, like any Roman military force, would routinely massacre, torture, mutilate, and terrorize its adversaries. On the domestic front, he ruled an empire that was bound to an especially brutal and dehumanizing system of slavery. The same indictments can be made against Constantine (who, in addition, ordered the execution of his wife and son), or against any of Rome’s emperors.

Yet when one stares up at the statue today—either the original in a grand gallery of the Capitoline Museum or the replica in the historic courtyard—one is urged to push those modern criticisms away. The magnificent rider and his steed radiate power and nobility, as if they were still striding through the streets of Rome.

Who would argue that such a venerated artistic marvel is problematic?

The statue itself does not explain the complicated historical circumstances of Roman rule or Marcus Aurelius’ reign, because it was never meant to do so. Instead, it purely celebrates a man for his deeds and the mark he left on the world, as viewed by those who commissioned the statue.

Even though historic equestrian statues are direct in their messaging to the viewer, today—if we are viewing these statues for the first or the 100th time—we have the power, and hopefully the virtue, to take a deeper look.

The post Why Are There so Many Statues of Men on Horseback? appeared first on Zócalo Public Square.

The “Little Giant” Who Thought That Backing Slavery Would Unite America
By Graham A. Peck | Thu, 15 Feb 2018 | http://www.zocalopublicsquare.org/2018/02/15/little-giant-thought-backing-slavery-unite-america/ideas/essay/

One of the most ambitious attempts to unite America ended up dividing it, and altering it forever.

At the opening of the 33rd Congress on December 5, 1853, Stephen A. Douglas, the short, rotund U.S. Senator from Illinois, planned an ambitious legislative program of national expansion.

“The Little Giant,” as he was known, sought to establish a new territory—Nebraska Territory—fashioned from the immense tract of Indian-occupied land that stretched west to the Rockies from Missouri, and north to Canada. He desired to remove the Indian inhabitants; to survey, sell, and populate the land; and to construct a transcontinental railroad to unite the country.

“How are we to develop, cherish and protect our immense interests and possessions on the Pacific,” he asked fellow senators, “with a vast wilderness fifteen hundred miles in breadth, filled with hostile savages, and cutting off all direct communication?” The solution, he said, was removing the “Indian barrier.” That done, Americans soon would command a continental empire.

Only 40 years old in 1853, Douglas was already a figure of national renown. He had played the key role in defusing the nation’s crisis over slavery in 1850, when Southerners had threatened to leave the Union, by skillfully shepherding a bevy of compromise measures through Congress. These had included acts authorizing territorial settlers to determine slavery’s legality in states carved from the huge domain acquired in 1848 from Mexico. Those measures had, for the time being, silenced debate over Congress’s power to control territorial slavery.

Most Southern politicians contended that the United States Constitution required Congress to protect slaveholders’ property rights in national territories. Most Northern congressmen replied that slavery existed only if state law sanctioned it. Douglas sought to sidestep the constitutional logjam by permitting territorial settlers to determine slavery’s legality.

He initially termed this doctrine congressional “non-interference” with slavery, but subsequently invoked the talismanic phrase “popular sovereignty” to lay claim to the concept of self-rule. Popular sovereignty was the core of the country’s democratic experiment; hence, to Douglas, territorial settlers deserved the right to determine their fate. Slaves did not get a vote.

At stake was not only slavery’s potential expansion, but also the meaning of the nation. Both Southerners and Northerners lauded freedom, yet in doing so they meant different things. Southern politics reflected a proslavery perspective that rested upon the massive wealth of slave property and the immense profits of plantation agriculture. By contrast, Northern politics reflected both a burgeoning free population’s insatiable demand for western farmland and a moral resistance to making chattel of men.

At the intersection of the country’s clashing definitions of freedom and its conflicting economic interests stood Stephen A. Douglas and popular sovereignty.

Douglas, like many Northern Democrats, had long been a middleman between North and South. Born in Vermont, he had migrated to central Illinois as a young man, launched a political career, and successfully solicited the support of Southern-born settlers who had made central Illinois their home. Once in Congress, he socialized with Southern Democrats and married a wealthy slave owner’s daughter, whose inheritance in 1848 made him the beneficiary of a large slave plantation.

His political views marched in tune. A leading voice for both national expansion and national union, he invariably sought compromise when conflicts between slavery and freedom arose. To Douglas, the Union required compromise over slavery, as it had at the nation’s founding. For this reason alone, he denounced antislavery activists. Yet their egalitarian racial creed also repelled him. If, as Douglas believed, blacks were not equal to whites and did not possess inalienable rights to life, liberty, and the pursuit of happiness, then slavery was justifiable, perhaps even necessary. Hence popular sovereignty profoundly appealed to Douglas: It reflected America’s democratic practice and racial sensibilities, and, if adopted as a national principle, promised to end conflict over slavery’s expansion.

Yet the clash of slavery and freedom meant that Douglas’ dream for Nebraska Territory was not without risk. Expansion tended to bring the country’s divergence over slavery into sharp relief. In a new territory, slavery would either be lawful or not. This basic fact often brought the politics of slavery to the fore.

U.S. map of 1856 shows free and slave states and populations. Image courtesy of Wikimedia Commons.

In Nebraska, the politics of slavery were explosive. The Missouri Compromise in 1820—which had settled a bitter and prolonged debate over Missouri’s admission as a slave state—pledged to keep free the land that Douglas now sought to organize as Nebraska Territory. Yet Southern congressmen in the early 1850s resisted creating new free territories and refused to supply the votes by which future antislavery congressmen would subsequently make their way to Washington, D.C.

Likely for this reason, Southern senators in the preceding Congress had tabled Douglas’ previous Nebraska bill, which had been organized under the Missouri Compromise’s antislavery provisions. So, in 1853, Douglas incorporated popular sovereignty into his new bill, intending to resolve not only the logjam over Nebraska Territory, but also future conflicts over slavery’s expansion.

His decision forever altered American history.

The ambitious bill precipitated a massive battle over the future of freedom in America. Introduced on the first day of the congressional session, it would pass Congress as the Kansas-Nebraska Act in May 1854. But the backlash against it was stupendous. Northerners angrily rejected the proposed repeal of the Missouri Compromise’s antislavery prohibition.

Mass political meetings in countless cities protested the bill. Three thousand New England clergymen sent a signed petition against it “in the name of Almighty God.” And when Douglas traveled through Ohio’s antislavery Western Reserve on his return to Illinois, he could see his burning effigy “upon every tree” he passed.

Douglas believed that agitators had whipped up the crisis and predicted that the act would “be as popular at the North as it is at the South” once its democratic character was fully comprehended. Instead, the 1854 elections portended an antislavery revolution in the North. Candidates supported by hastily assembled and motley anti-Nebraska coalitions received an avalanche of votes that swept incumbent Democrats from Congress.

By 1856, the recently founded antislavery Republican Party had emerged as the North’s majority party. And in 1860 Republican voters put Abraham Lincoln in the White House by claiming almost every electoral vote in the Northern states. Southern states responded with secession, and then war, and soon the political contest over the meaning of freedom in America escalated into a remorseless revolutionary struggle over the fate of slavery and the nation. The nation survived the struggle. Slavery did not.

In 1853, Stephen A. Douglas had not sought to make an antislavery nation. That achievement, which our country now so justly prizes, was thus ironically made possible by one of the many Americans tolerant of slavery.

But that irony also reminds us that the powerful hold of slavery on antebellum Americans—Northerners and Southerners alike—is a central reason why we still struggle with the legacy of the Civil War. Our 19th-century forebears tried to put freedom into practice, and left a complicated heritage. We will do the same.

The post The “Little Giant” Who Thought That Backing Slavery Would Unite America appeared first on Zócalo Public Square.

How Iranian Women Turn “Pious Fashion” Into Under-the-Radar Dissent
By Elizabeth Bucar | Wed, 14 Feb 2018 | http://www.zocalopublicsquare.org/2018/02/14/iranian-women-turn-pious-fashion-radar-dissent/ideas/essay/

In 2018, Islamic clothing is officially cool. CoverGirl has a hijabi ambassador. H&M sells a popular modest clothing line. Even Barbie wears a headscarf on a doll modeled after the American fencer Ibtihaj Muhammad.

Despite this cool factor, Islamic women’s headscarves and clothing retain strong associations with piety and politics, symbolism that is wielded both by the woman in the clothes and the people around her. In countries where Muslims are minorities, as in the United States, merely wearing hijab is seen as a political act, albeit one that can be interpreted in many ways. Shepard Fairey created an image of a woman wearing a flag hijab as a sign of tolerance and inclusivity, while others claim that the scarf is a sign of Muslim women’s repression.

In Muslim-majority countries, however, the symbolism—and the way that women and the state both use hijab to express ideas—is deeper and more interesting. Rather than arguing about whether or not Muslim women should dress modestly, I study how Muslim women dress: what they are wearing and why, and how they use fashion to exert political influence.

Muslim-majority countries have a history of regulating women’s clothing through official dress codes, whether banning headscarves or requiring them. In Iran, for instance, Muslim women’s dress was a political matter long before it became the symbol of revolution in 1979. The shah banned the full-body covering called chador in 1936 as part of his attempt to undermine the authority of the Shia clerics and westernize Iranian women.

Now, of course, Islamic clothing is required for women in Iran by law. Drafted under Ayatollah Ruhollah Khomeini’s leadership as part of his vision for a public space governed by the principles of Islamic morality, these laws include harsh punishments for inadequate hijab—jail time, fines, even 74 lashes with a whip. Harassment and arrests for violations became commonplace after the revolution.

Despite conditions of discrimination—because requiring a headscarf and modest clothing is discriminatory—pious fashion comes in a remarkable range of styles in Tehran. One option is to wear the floor-length chador draped over the hair and shoulders. The alternative to chador is a coat-like manteaux with some sort of head covering. There are two popular head coverings to pair with a manteaux. One is a sort of balaclava, called a maghneh. But the fashionable women of Tehran wear a rusari—a scarf covering the head and knotted under the chin or wrapped around the neck, personalized by fabric, color, pattern, and style of drape.

In this urban casual look, the hardware on the Dr. Martens boots echoes the studs on the Valentino crossbody bag. The Topshop floral leggings are the stand-out item, made even cooler by being paired with utilitarian items like a black scarf and a military jacket. The graffiti in the background is of a dish-soap bottle. Photo courtesy of Anita Sepehry/the Tehran Times fashion blog.

Though these items represent the building blocks of modest garb, they do not define its expression. Women define what pious fashion looks like when they get dressed every morning—whether they wear structured separates accessorized with designer sunglasses, flowy pastel chiffons embellished with rhinestones, or ripped jeans tucked into combat boots. On the streets of Tehran, in its cafés and places of business, women find ways to use their clothing to make claims about what counts not only as fashion, but also as piety.

Within a regime that has attempted for decades to promote dress codes as a way to craft particular types of Muslim citizens, and in which direct political resistance is dangerous, clothing has become a form of political engagement that is potentially powerful because it can sometimes slide under the radar as a matter of culture versus statecraft.

What sort of power can modest clothing choices have? For one, dress becomes a way to access governmental office. Women hold numerous advisory roles in government. Chador is a requirement of appointment to these positions. But this limitation also creates an opportunity. Women can take advantage of the symbolic meaning of the chador to mark themselves as supporters of the theocracy, independent of their actual political views.

Then there is a more recent popular style integrating traditional motifs and embroidery that is Kurdish, Turkoman, or Indian. Called lebase mahali, which means “local clothing” in Persian, it does not push the boundaries of modesty. But it does something else: It highlights Persian and Asian aesthetics over Islamic and Arabic ones. This Persian ethnic chic undermines current Islamic authority, sometimes unintentionally, simply because it draws on sources of authority that predate the Islamization of Iran.

This power to critique through sartorial choice comes with substantial risk. Since clothing is so strongly linked to character, a bad outfit can be seen as a reflection of poor character. In Iran, there is even a term for this: bad hijab. Bad hijab can be both an ethical failure (too sexy) and an aesthetic failure (not tasteful). It’s a concern of the authorities because bad hijab disrupts the public Islamic space that Iranian theocracy tries to create.

The infamous morality police have often targeted women for what they deem bad hijab, but they are not the only ones. In fact the first time I noticed it was while shopping with my Iranian friend Homa. “Liz, this is a good example of bad hijab for you,” she said when a young woman walked by. Homa was quite happy to elaborate: “Her ankles are showing, her pants are rolled up, they are made of denim and tight. Her manteaux is short, slit up the side, tight, made of thin material, and exposes the back of her neck and her throat. And her rusari, look at her rusari. It is folded in half so that her hair sticks out in front and back and tied so loosely that we can see all her jewelry. Plus, her makeup is caked on.”

This outfit is a glam version of edgy hijab. The Alexander McQueen-style skull-patterned scarf, fur vest, and Givenchy Rottweiler print clutch give the woman a rock vibe. Photo courtesy of Donya Joshani/the Tehran Times fashion blog.

Homa’s determination of bad hijab was based on a number of perceived violations. The first problem was that the woman’s outfit exposed parts of her body legally required to be covered. Homa also disapproved of the woman’s jeans—reflecting a widely held opinion in Iran that denim is improper for women to wear for both aesthetic reasons (as a fabric that is too casual) and political reasons (as a Western fabric that might infect the subject with Western ideas).

Homa spent considerable time describing for me why this woman’s rusari was inadequate. In this case, the violation depended in part on the scarf’s gauzy material, which was translucent. The way the scarf was worn was also a problem: By folding the rusari in half lengthwise, the woman only covered half as much hair as normal. Homa had also judged the woman’s heavy hand with makeup a hijab “failure” because it made her appear more alluring to the opposite sex.

Why so catty? Of course women, even pious ones, can be hard on one another, but there is more to learn from Homa’s reaction. An accusation of bad hijab is an expression of her own concern over sartorial practice. Pious fashion creates aesthetic and moral anxiety. Am I doing it right? Do I look modest? Professional? Stylish? Feminine? Women try to resolve this anxiety by identifying who is doing it wrong. Improper pious fashion is what allows proper pious fashion to redefine itself away from stigma to style: If this mystery woman was wearing bad hijab, then surely Homa was a sartorial success.

Homa’s accusation of bad hijab might have helped legitimate her own clothing choices, but it came at a cost. Public shaming of Muslim women’s dress relies on a specific ideology of how women should appear in public, and women themselves are not exempt from promoting this aspect of patriarchy. By policing other women, they accommodate existing ideology to improve their own status.

At the same time, bad hijab is politically potent because it can shift the boundaries of successful pious fashion, sometimes expanding those boundaries, sometimes narrowing them. Homa might have been outraged by what this mystery woman was wearing, but Homa herself was violating some of the very same norms: Her own ankles were showing, her hair peeked out from her scarf, and she had on foundation, eyeliner, and mascara.

And when everyone is showing her ankles and painting her toes, it sends a very personal signal about how the state’s power to define women’s morality is declining. What are my friends wearing? What are designers producing? What are bloggers posting? These are the sorts of things that influence what Iranian women wear, not only the threat of police surveillance and arrest. Besides, there are not enough police in Tehran on a hot summer day to arrest every young woman wearing capris.

In a surprise public statement last December, Brigadier General Hossein Rahimi, head of Greater Tehran police, admitted as much. He announced that women who are found to be wearing bad hijab will no longer be arrested, but instead sent to morality classes. It is too soon to say if this is a clear sign of a shift in Iranian politics. But if this does signal a positive change, credit goes to women’s sartorial savvy, not the police. And to the public who would undoubtedly react if everyone wearing nail polish was administered the 74 lashes permitted in the penal code.

In recent weeks a few Iranian women have protested the forced dress code directly. They stand on top of utility boxes, take off their headscarves, and wave them on sticks. These protests have resulted in dozens of arrests, proving that in the current political climate bad hijab might be tolerated, but no hijab is going too far. Images of these protests on Twitter include women in full chador waving headscarves in solidarity. This is a good reminder that it is not the wearing of hijab that Iranian women oppose, but rather the government’s attempt to police their bodies. The protesters and the Iranian authorities agree on at least one thing: what women wear matters.

The post How Iranian Women Turn “Pious Fashion” Into Under-the-Radar Dissent appeared first on Zócalo Public Square.

In Whose God Do Americans Trust?
By Matthew Bowman | Tue, 13 Feb 2018 | http://www.zocalopublicsquare.org/2018/02/13/whose-god-americans-trust/ideas/essay/

Charles Bennett, a Democratic Congressman from Jacksonville, Florida, was afraid of communism. In July 1955, he spoke of his concerns on the floor of the House of Representatives. “In these days, when imperialistic and materialistic communism seeks to attack and destroy freedom, we should continually look for ways to strengthen the foundations of our freedom,” he told his fellow members of Congress. Bennett’s proposed solution was simple: Americans could add the phrase “In God We Trust” to their dollar bills. By consensus, Congress adopted Bennett’s resolution.

Americans’ ready embrace of the phrase in the 1950s seems foreign today, when so many Americans find invocations of Christianity in American politics frustrating. But viewed from another angle, the story of “In God We Trust” actually sheds light on the reasons behind that frustration. The phrase seems straightforward and clear, but any interrogation of it shows how quickly its meaning dissolves into ambiguity. The “God” of the phrase is by implication—though not explicitly—the monotheistic deity of the Hebrew and Christian Bibles. But what it means to “trust” is unclear, and “we” is undefined. Does it refer to all Americans? Moreover, the same is true for the word “Christian” itself. What does it mean to be a “Christian candidate”? What does it mean to have a “Christian nation”? During the Cold War, Protestants like Bennett imagined the existence of a religious consensus and coined the phrase “Judeo-Christian” to describe it—but whether that consensus actually exists remains up for debate.

Putting the name of God on American currency was not a new idea in 1955. The words “In God We Trust” had first been placed on coins in 1861 at the direction of Treasury Secretary Salmon P. Chase, at the urging of a Christian minister named M.R. Watkinson. For a week, Chase pondered on Watkinson’s letter, which suggested placing the phrase “GOD. LIBERTY. LAW.” on coins. Then the treasury secretary instructed the director of the Mint to stamp the declaration on the money, but he changed the wording to “In God We Trust,” which Chase—always pompous—thought stylistically superior to Watkinson’s suggestion. The phrase appeared on American coins intermittently from Chase’s time until 1938, when it began appearing on all metal currency.

Bennett’s resolution fixed its appearance on paper currency as well. In the 1950s, with the country feeling under siege, his proposal to revive Chase’s plan received wide support from both political parties and President Dwight Eisenhower. At almost precisely the same time that Bennett’s measure passed, Congress voted to add the phrase “under God” to the Pledge of Allegiance. Senator Homer Ferguson, a Republican from Michigan, explained that he supported that resolution because “Our nation is founded on a fundamental belief in God, and the first and most important reason for the existence of our government is to protect the God-given rights of our citizens.”

Americans’ enthusiasm to publicly declare their faith in God—both in the turbulent 1860s and 1950s—was more than simple piety; these were acts of public theology. American Christians did not merely advocate for public acknowledgement of God; they offered interpretations of what that faith should mean for U.S. democracy. Bennett, Ferguson, and Chase each performed a ritual of national sanctification, trying to bind together in the public eye what they took for granted: that American democracy depended upon the virtues of what they thought of as simply “Christianity,” but which in retrospect seems to be Protestantism.

Chase, Bennett, Ferguson, Eisenhower: all were Protestants, and as Protestants they argued that Christian theology provides a transcendent rationale for the importance of democracy and human freedom. Protestants had been suspicious of authority—whether spiritual (as in church leadership) or worldly (as in kings and queens)—since the Reformation. Only a few years before the American Revolution, for instance, a popular political cartoon depicted a band of restive Boston colonists banishing an Anglican bishop that the British crown had sent to them. The protesters are shown shouting slogans like “No Lords Spiritual or Temporal in New England” and “Liberty & Freedom of Conscience.” Words like “liberty,” “freedom,” and “rights” linked Christianity to individual independence, and fueled what Protestant Westerners in general and Protestant Americans in particular thought of as Christian civilization: a society in which democratic politics and Protestant religion were intertwined into a creed of personal liberty.

Grasping this connection helps explain why some American Protestant activists today insist that the health of the state depends upon whether their form of Christianity is given space in the public sphere to thrive. And yet, that insistence occludes the fact that even within American Protestantism, there have been distinct visions of the form of Protestant faith that democracy depends on.

Dwight Eisenhower, for instance, was famously convinced of the importance of religious belief to American democracy, holding to the same mainline Protestant convictions as Bennett or Ferguson. He was baptized Presbyterian soon after he was inaugurated as president because he believed in the importance of Protestantism to the health of the state. “A democracy cannot exist without a religious base,” he argued, famously stating that, “Our government has no sense unless it is founded in a deeply felt religious faith, and I don’t care what it is”—a seeming ecumenical gesture he quickly qualified by asserting that the United States was based upon the “Judeo-Christian concept.”

For Eisenhower, as for many other Cold War American Protestants, the “Judeo-Christian” tradition was a way to unite Protestants, Catholics, and Jews. It dismissed niceties of theology or particularities of religious ritual in favor of a nebulous sense of solidarity. In practice, however, “Judeo-Christianity,” like the people who used the word, often took for granted Protestant assumptions: suspicion of religious hierarchy, association of religious faith with individual piety and moral practice, an emphasis upon personal feeling rather than corporate participation. All these things were to them compatible with American democracy.

But to other American Christians—even American Protestants—the generalities of Eisenhower’s Judeo-Christianity were not sufficient, and the words “In God We Trust” took on very different meanings. What it means to be a Christian in America thus remains very much up for debate, with some insisting that the obligations of the term extend far beyond Eisenhower’s gentle platitudes.

For example, the pastor David Barton runs a large evangelical ministry devoted to demonstrating that the American Founding depended upon Christian ideas. Barton is representative of the politically and theologically conservative subset of Protestants today sometimes called the “Religious Right.” Barton goes much further than Eisenhower, formulating a Christianity of much more detailed theological expectation and far less comfort in the mainline of American culture. Not content to claim that Christians share simply generalized pieties, Barton argues that the American Founders shared his own particular brand of American evangelicalism, marked by the importance of affirming Jesus Christ, by social conservatism, and by small government. For Barton, then, not only were American leaders from the Revolution through Eisenhower influenced by Protestant ideas of liberty; they explicitly intended to found a state shaped by the sort of pious evangelical Christianity that Barton himself embraces.

Surveys have shown that conservative activists like Barton have been somewhat successful in claiming the term “Christian”—many young Americans now associate the term “Christian” with Barton’s conservative social politics. But Barton faces a number of Protestant critics, like historian John Fea, who argue that his form of evangelical patriotism is not simply bad politics, but bad Christianity. For Fea, Barton’s veneration of the American Founders is idolatry; hardly Christianity at all.

Similarly, other Protestants contend that the relationship between their faith and democracy should point American politics in different directions than Barton’s economically libertarian and socially conservative ideology. While Fea offers a theological criticism of Barton’s faith, Jim Wallis maintains that Barton’s politics are distant from what Christianity demands. Wallis, a Methodist pastor and political activist known best for assailing American capitalism and calling for a stronger welfare safety net, argues that Protestant liberty means that the Christian community should use civil authority to promote the economic independence and well-being of all its members. For Barton, Protestant morality requires social conservatism, opposition to abortion and same-sex marriage. For Wallis, Protestant morality means hospitality and welcoming the stranger.

In truth, the relationship between American democracy and American Christianity remains open to debate—and the spectrum from Barton to Eisenhower to Wallis encompasses only white American Protestants.

When he proposed adding “In God We Trust” to American currency, Charles Bennett declared that the sentiments behind the phrase were “indigenous to our country.” He assumed a common heritage and common understanding of what Christianity might mean. But contemporary disputes, like those between Barton and Wallis, illustrate that the argument about that relationship has never been so simple as any particular American, or American Christian, might wish.

The post In Whose God Do Americans Trust? appeared first on Zócalo Public Square.

The Myth of Untouched Wilderness That Gave Rise to Modern Miami
By Andrew K. Frank | Mon, 12 Feb 2018 | http://www.zocalopublicsquare.org/2018/02/12/myth-untouched-wilderness-gave-rise-modern-miami/ideas/essay/

Miami is widely known as the “Magic City.” It earned its nickname in the late 19th and early 20th centuries, shortly after the arrival of Henry Flagler’s East Coast Railroad and the opening of his opulent Royal Palm Hotel in 1897. Visitors from across the country were lured to this extravagant five-story hotel, at the edge of the nation’s southernmost frontier. From their vantage point, South Florida was the Wild West—and Miami could only exist if incoming settlers were able to tame it. And tame it they did. Miami’s population boomed, from roughly 300 in 1896 to nearly 30,000 in 1920. Onlookers marveled as the “metropolis” seemed to emerge overnight from the “wilderness.”

This legend, repeated for more than a century, blends truth with fiction, and reminds us that history is as much about forgetting as it is about remembering. Flagler and a woman named Julia Tuttle stand at the center of the story: The importance of Flagler’s East Coast Railroad and Royal Palm Hotel led some residents to propose naming the city after him, and he is often depicted as the city’s “father.” Tuttle, a businesswoman who lured Flagler to Miami and otherwise promoted the region during the 1890s, earned the title of “Mother of Miami.” But Tuttle and Flagler did not create something out of nothing. On the contrary, Tuttle’s home and Flagler’s hotel stood precisely where earlier settlers had already left indelible marks over 2,000 years of continuous occupation.

These Miamians included Tequesta Indians who lived in the area for more than 1,500 years, and Spanish missionaries who tried to convert them; enslaved Africans tasked with turning the land into sugar fields, who instead created orchards of fruit trees; Seminole Indians who came to trade and harvest the local bounty, and U.S. soldiers who waged a war to exterminate them; and a continuous stream of Bahamian mariners, fugitive soldiers from various armies, and shipwrecked sailors. These earlier generations have been forgotten largely because Tuttle and Flagler were master illusionists who engaged in a combination of physical sleight of hand and intellectual misdirection. Rather than create something out of nothing, they built upon the storied history that preceded them—and then helped others forget it.

Julia Tuttle, widely known as the Mother of Miami. State Archives of Florida, Florida Memory (https://www.floridamemory.com/items/show/29793).

Tuttle clearly knew that she was not the first occupant of her waterfront property. Recently widowed, she relocated in 1891 from Cleveland to the mouth of the Miami River on Biscayne Bay, where she worked tenaciously to promote the region as a commercial and agricultural opportunity. Tuttle moved into a 19th-century plantation house that had been built by enslaved Africans in the early 1830s, and constantly referred to it as “Fort Dallas,” which had been the name given to it when it was turned into a military outpost during the Second Seminole War (1835-1842). Tuttle’s property contained a man-made well, a stone wall, and several gravestones. There was a decades-old road that connected her home to the community on the New River—today’s Fort Lauderdale—and elsewhere up the Atlantic coast.

Still, despite all this evidence of earlier occupation, Tuttle declared to all who would listen that she was a founder of a new community. In words that would be widely repeated, she explained her ambitions. “It may seem strange to you but it is the dream of my life to see this wilderness turned into a prosperous country,” she wrote. One day, she hoped, “where this tangled mass of vine brush, trees and rocks now are to see homes with modern improvements surrounded by beautiful grassy lawns, flowers, shrubs and shade trees.” Tuttle wanted to “settle” a place that had been settled for centuries and turn it into an agricultural or commercial entrepôt.

James Henry Ingraham, president of the South Florida Railroad Company of the Plant System, was but one of the newcomers she impressed. Ingraham proclaimed that Tuttle had “shown a great deal of energy and enterprise in this frontier country where it is almost a matter of creation to accomplish so much in so short a time.” But his description of Tuttle’s efforts, too, revealed the preexisting history that made her successful. Tuttle, he wrote, “converted [Fort Dallas] into a dwelling house after being renovated and repaired with the addition of a kitchen, etc. The barracks … is used as office and sleeping rooms.” Despite her “improvement … on hammock land which fringes the river and bay,” Ingraham explained, the natural world remained largely untamed. “Lemon and lime trees,” which were planted by the earlier waves of Spanish, Bahamian, and American occupants, “are growing wild all through the uncleared hammock.” Ingraham, Tuttle, and others knew that citrus was not native to South Florida. Their claims about untamed wilderness were disingenuous.

Tuttle ignored evidence of the ancient Indian world that surrounded her. Like others of her generation, she recorded the presence of several large man-made mounds and shell middens in the area. Some were ancient burial sites or ceremonial centers, and others were basically landfills, built from generations of discarded shellfish and tools. They were all constructed by the Tequesta Indians, who had first settled the waterfront site 2,000 years earlier and lived there into the 17th century, when they attracted the unwanted attention of slave raiders, Spanish missionaries, and others moving in. Tuttle, like others who declared themselves to be on the frontier, deemed the Indian past to be inconsequential to the development that would follow.

With Tuttle engaged in acts of intellectual misdirection, Flagler and his construction crews took care of the physical destruction. Flagler, like most Gilded Age industrialists, is more typically associated with building than with razing. He earned his fame for helping found Standard Oil with John D. Rockefeller in 1870 and then creating Florida’s modern tourist industry with his railroad and luxury hotels in St. Augustine, Palm Beach, and elsewhere along Florida’s Atlantic Coast. Tuttle lured Flagler to Miami with gifts of orange blossoms after a brutal frost had destroyed the citrus crop in central Florida, and clinched the deal by dividing her property on the Miami River with him.

In 1896, Flagler’s laborers at the mouth of the river leveled the ancient Tequesta mounds that stood in the way of progress. They were unabashedly brutal about it. One of the workers noted that a burial mound “stood out like a small mountain, twenty to twenty-five feet above water” and “about one hundred feet long and seventy feet wide.” Flagler’s African American workers struggled to remove “a poison tree” that grew on the top of the mound, as it “would knock them cold.” Those workers “who were not allergic to it” leveled the mound, uncovering and hastily removing “between fifty and sixty skulls.” One of the workers took home the bones, “stored them away in barrels and gave away a great many … to anyone that wanted them.” When construction ended, he dumped the remaining skeletons “nearby where there was a big hole in the ground.” Another bayside mound was hidden behind a “great tangle of briars and wild lime trees.” The midden materials from these and other mounds were strewn across the property, becoming the foundation for Henry Flagler’s opulent Royal Palm Hotel.

The city of Miami incorporated in July 1896, a bit more than a year after the railroad reached the site of the Royal Palm Hotel. Thanks to the vision of Tuttle and the marketing genius of Flagler and others, Miami quickly became a tourist destination. City boosters built roads and canals, plotted new communities, constructed man-made beaches, and established new civic organizations. The real estate boom that followed incorporation pushed the residential community out from the mouth of the river and in only a couple of decades turned the small town into a bustling city. Tuttle died in 1898 and Flagler in 1916, but their collective imprint on Miami survived the hurricane of 1926, even as that storm destroyed the Royal Palm Hotel and, together with the Depression, temporarily slowed the city’s growth. Miami remained a city committed to reimagining the future rather than one interested in celebrating the past.

Tuttle and Flagler shared an illusion that they were settling untouched wilderness—even as they were surrounded by evidence of earlier occupation. In this way, their story is no different than those of settlers across the continent whose shared myth of the frontier allowed them to ignore the history that preceded them. In the 1880s, the frontier was a fairly simple but magical idea: It allowed white Americans to ignore the ancient history of Native America. The myth of the frontier—that pervasive and most-American idea—allowed Tuttle and others in Miami to see “unclaimed lands” in the United States as an untapped and disappearing resource, and to imagine that white American ingenuity transformed wilderness into civilization.

The post The Myth of Untouched Wilderness That Gave Rise to Modern Miami appeared first on Zócalo Public Square.
