Zócalo Public Square: Ideas Journalism With a Head and a Heart

Before You Push That Big Nuclear Button, Consider the Source
By Robert Bartholomew | February 20, 2018
Shortly after 8 a.m. on January 13, 2018, the Hawaii Emergency Management Agency sent out a chilling alert to residents across the state of Hawaii: “BALLISTIC MISSILE THREAT INBOUND TO HAWAII. SEEK IMMEDIATE SHELTER. THIS IS NOT A DRILL.”

Thousands of frightened people flocked to shelters; some even climbed down manholes to save themselves. Hawaii Congressman Matthew LoPresti told CNN: “I was sitting in the bathtub with my children, saying our prayers.” It was not until 38 minutes later that a second message made it clear that the first had been a false alarm.

Such episodes are not new. Since the advent of mass communications, similar scares have taken place. From intentional hoaxes to accidental alerts, we have become susceptible to reports of terrifying events that never come to pass.

Canadian philosopher and media scholar Marshall McLuhan famously observed that we all live in a global village. Where it once took months to relay news from the other side of the world, it now takes less than a second. The problem is, with so many people now reliant on the internet and mobile phones, the potential for technology-driven hoaxes, panics, and scares has never been greater. And such events, while usually short-lived, have tremendous power to wreak widespread fear and chaos.

The most famous example of mass panic in response to an announcement is the 1938 “War of the Worlds” radio drama produced by Orson Welles at WABC’s studios in New York City. Broadcast across the United States and in parts of Canada, the play narrated a fictitious Martian invasion of the New York-New Jersey metropolitan area. In his bestselling book The Invasion From Mars, American psychologist Hadley Cantril wrote that of the estimated 6 million listeners, about 1.5 million were frightened or panicked.

A smartphone displaying a false incoming ballistic missile emergency alert, sent Jan. 13, 2018. Image courtesy of Caleb Jones/Associated Press.

While contemporary sociologists who have re-examined the episode believe that the number of those who panicked was much smaller, no doubt many were frightened. Some people living near the epicenter of the play—tiny Grover’s Mill, New Jersey—tried to flee the Martian “gas raids” and “heat rays.” During the hour-long broadcast, The New York Times fielded 875 phone calls. Curiously, Cantril found that about one-fifth of listeners thought that America was under attack by a foreign power—most likely Germany, using advanced weapons. Historical context often plays a major role in shaping the mindset of listeners.

While perhaps the best-known incident, the 1938 broadcast did not have the most serious consequences. That distinction goes to another radio play, broadcast in 1949, that caused pandemonium in Quito, Ecuador. That highly realistic production mentioned real people and places, and included impersonations of government leaders. A correspondent for The New York Times in Quito at the time described a tumultuous scene as the drama “drove most of the population of Quito into the streets” to escape the Martian “gas raids.”

After realizing that the broadcast was a play, an enraged mob marched on the radio station, surrounded the building, and burned it to the ground. At least 20 died in the rioting and chaos. Authorities had trouble restoring order, as most of the police and military had been sent to the nearby town of Cotocollao to repel the Martians.

These “War of the Worlds” broadcasts were not the first to provoke false alarms. One of the earliest recorded scares occurred in the United Kingdom in 1926, when the BBC reported that the government was under siege and could fall amid a bloody uprising by disgruntled workers. In reality, the streets of London were calm; people had only been listening to a radio play airing on the regularly scheduled Saturday evening radio program.

The BBC broadcast announcements throughout the evening, apologizing and reassuring listeners. But the story was plausible, given deep tensions with labor unions at the time. And, four months after the broadcast, the historic General Strike rocked the government, as 1.5 million workers took to the streets over 10 days to call for higher wages and better working conditions.

Decades later, on March 20, 1983, hundreds of Americans were frightened by a broadcast of the NBC Sunday Night Movie Special Bulletin, about news coverage of a group of terrorists who were threatening to detonate a nuclear bomb in South Carolina. The program began like any other Sunday night movie but was quickly “interrupted” by breaking news bulletins from well-known news outlets such as Reuters and the Associated Press, showing scenes of devastation. The film even recreated a White House press briefing. Special bulletins throughout the show were interrupted by “live feeds” from the terrorists. Conveniently, a reporter and a cameraman happened to be among the “hostages.”

The realism of the special broadcast was instrumental in creating the panic. The fictional NBC affiliate broadcasting the siege in Charleston, WPIV, was similar to the real affiliate, WCIV. The real-world WCIV-TV received 250 calls; others rang the police. WDAF in Kansas City fielded 37 calls. At San Francisco’s KRON-TV, 50 calls were logged in the first 30 minutes. One woman was so convinced that she called to complain that too much air time was being given to the terrorists. The program began with an advisory that what viewers were about to see was a “realistic depiction of fictional events” but was not actually happening. Unfortunately, the next advisory did not run until 15 minutes later. The movie won an Emmy, but across the country viewers mistook it for live news coverage and thought a nuclear catastrophe was imminent.

It wouldn’t be the last nuke scare. On November 9, 1982, WSSR-FM in Springfield, Illinois, reported that there had been an accident at a nearby nuclear power plant. The program began by claiming that “a nuclear cloud was headed for Springfield.” Concerned residents immediately deluged police with phone calls, prompting the station, which was operated by Sangamon State University, to pull the plug on the half-hour drama after just two and a half minutes. The Illinois Emergency Services and Disaster Agency was not amused. Director Chuck Jones said: “I’m still shocked that someone out at that station let that get on the air.” The nuclear power plant depicted in the program was located 25 miles northeast of the city, and while it was not operating at the time, people didn’t have any way of knowing this.

While many false alarms have apparently been unintentional, there are egregious examples of deliberate attempts to cause widespread fear. On January 29, 1991, DJ John Ulett of radio station KSHE-FM in Crestwood, Missouri, decided to protest America’s involvement in the Persian Gulf War by airing the following announcement: “Attention, attention. This is an official civil defense warning. This is not a test. The United States is under nuclear attack.” Worried listeners flooded the station with phone calls. While Ulett’s statement did not trigger a mass panic, the Federal Communications Commission fined the station $25,000. Ulett was suspended but managed to save his job.

Poorly timed jokes have also triggered dramatic results. On August 11, 1984, just before his weekly radio address, President Ronald Reagan tested his microphone by saying: “My fellow Americans, I am pleased to tell you today that I’ve signed legislation that will outlaw Russia forever. We begin bombing in five minutes.” Americans didn’t panic because broadcasters knew he was joking, but the Soviets, nervous at a time of considerable distrust between the two countries, placed their armed forces on standby.

Even fleeting scares can have long-term consequences. The 1938 Martian scare resulted in jammed phone lines. In Trenton, emergency services were knocked out for six hours. In Quito, damage from the rioting was estimated at $350,000—an enormous sum at the time.

In 1597, English philosopher Francis Bacon famously observed, “Knowledge is power.” But in today’s world, knowledge comes with risk. The most educated and technologically adept generation in the history of the world is also the most vulnerable.

Could a New River City Transform California?
By Joe Mathews | February 19, 2018
Could the San Joaquin River, long a dividing line in the heart of California, unite the state in pursuit of a more metropolitan future for the Central Valley?

Whether that happens will be determined in Madera County, on the north side of the river from Fresno. There, a new city, consisting of multiple large planned communities, is finally under construction after decades of planning and litigation.

The city has no name and incorporation could be decades away. But within a generation, its population could grow to more than 100,000 people; by mid-century, it might double Madera County’s current population of 150,000.

And that is just on the Madera side of the river. On the Fresno side, the county is developing open space, the city of Fresno’s north side is growing, and the city of Clovis is expanding to its south and east. Rising together, the new Madera city, Fresno, and Clovis could come to constitute a tri-cities area in the center of California, offering a new model for the state’s long-neglected interior.

If the new Madera and expanded Fresno and Clovis cities could cohere into a stronger region by mid-century—and that’s an “if” as big as the Valley floor—greater Fresno could transform from a relatively poor backwater of 1 million-plus into California’s answer to Austin, an inland country metropolis of 2 million or more capable of spreading the Golden State’s coastal prosperity to its dusty interior.

Of course, such a transformation would require extensive regional planning of the sort that has been little seen in Fresno. It would require establishing new and more effective governance arrangements and funding for regional transportation, economic development, water management, recreation, and air quality. In short, it would require something just short of a revolution in California governance, and in thinking about what city governments do.

Transforming greater Fresno also would require collaboration between local governments that have spent decades using lawsuits to stall the growth of their neighbors. Madera County’s development has only recently gone forward after fights so bitter that the governor’s office intervened.

Indeed, the very structure of California, and its land-use planning, works against turning Fresno into a region, never mind a powerhouse. In our state, local jurisdictions are weak and have little power to raise their own revenues; they are incentivized to compete with other cities, often using questionable subsidies, in the chase for developments and the taxes they bring. In the Golden State, cooperating with neighboring municipalities is for saps.

The battles between the San Joaquin Valley’s cities have been especially hard-fought, since those municipalities are weak even by California’s diminished standards. (Madera County doesn’t even have a parks department.) The game is: support development that provides revenue for your city, while spreading the costs—in traffic, water, and air quality—onto your neighbors.

That has inspired nearly constant litigation. To take just two examples: The city of Fresno sued Madera County to block the new river development plan until it got a tax-sharing agreement that would compensate it for impacts like traffic. In retaliation, Madera County sued Fresno to block a new shopping center, claiming it would siphon off shopping dollars and sales taxes that should go to Madera.

Most, but not all, of such litigation is now over, offering an opportunity to build together. Potential collaborations could include a stronger and more resilient water infrastructure (the new Madera developments tout their water efficiency), a joint powers authority that could raise revenue to improve access to the river itself, and a regional transportation network. That network ought to reach as far south as Visalia, and north, across the river into Madera, along both the Highway 41 and 99 corridors.

Another problem is the lack of local government brainpower. The area’s municipalities in particular need more personnel with training and experience in regional planning. The existing regional planning includes some collaboration on trails and water treatment, but it is still too irregular and unimaginative.

That’s why the big and bold development in Madera is so promising. The county on Fresno’s northwestern flank is saying via its big new developments that it doesn’t want to be small, poor, and isolated anymore. That’s the message all of greater Fresno needs to embrace.

Indeed, Madera County is pitching its new developments as a huge step forward for central California: master-planned communities with trails and schools and job centers and water facilities wrapped in, providing the greater density and smaller lots of more urban living.

The signature project, now under construction, is Riverstone, with acres of commercial space and nearly 6,600 homes of various sizes across six themed districts, along Highway 41, best known to most Californians as a road to Yosemite. “The new-home community of Riverstone,” boasts one brochure, “will be a celebration of California living where people of every generation can enjoy the relaxed and informal spirit of the Golden State.”

Other developments in the pipeline—with names like Tesoro Viejo and Gunner Ranch—are supposed to offer a similar approach, and county officials say they are likely to be incorporated one day as the county’s third city (after Madera city and Chowchilla). These developments are close to river-adjacent Fresno County projects—like a town-size development near Friant Dam.

“This is going to be a new town and we have this opportunity with a blank canvas to do it right,” Madera County Supervisor Brett Frazier recently told local television.

Much could go wrong. If the new river city doesn’t produce promised jobs and inspire better transit, the expanded development could fuel sprawl, add to air pollution, and turn Highway 41 into a traffic nightmare.

Successful regionalization will require outside help. The state’s climate change regime must prioritize infill development in central Fresno, so that the urban core isn’t weakened as people move to the new river city. The ongoing revival of Fresno’s downtown needs the added momentum of the state’s high-speed rail project, which is already under construction across Fresno County (a signature rail bridge is being built across the river, linking Madera and Fresno in another way).

Greater Fresno badly needs high-speed rail to provide connections to Northern California and Southern California, making it an affordable crossroads between two world-class regional economies.

And Fresno has a large population of undocumented immigrants who are desperate for legal status so they can advance themselves, and their region, economically.

You should not bet the farm on the grand project of turning greater Fresno into the next great region. But if Madera’s new development can inspire progress in that direction, the state would have reason to celebrate—and perhaps call the new river city Future Town, CA.

What Benjamin Franklin Ate When He Was Homesick
By Rae Katherine Eighmey | February 19, 2018
In the midst of the American Revolution, Benjamin Franklin envisioned the turkey as an exemplar of the ideal American citizen. In a 1783 letter home to his daughter Sally, written while Franklin was serving as chief diplomat to France, he wrote about the “ribbons and medals” presented to the French by grateful Americans in thanks for significant military and financial support. The tokens bore an image of an eagle—but, Franklin explained, some recipients complained that the workmanship was not up to sophisticated French standards. They thought that the eagle looked more like a turkey.

Franklin asserted that this plucky fowl would have been a better choice in the first place. Eagles were found in many countries, but the turkey was an American native and “a bird of courage,” a fitting symbol of America’s valor and virtues. It “would not hesitate to attack a grenadier of the British Guard who should presume to invade his farm yard with a red coat on,” he wrote to Sally. Turkeys were tasty, too, Franklin further explained to her, first brought to France by the Jesuits and served to citizens of note including “at the Wedding Table of Charles the ninth,” in 1570. Some 200 years later, Franklin had served turkey to guests of his own in Philadelphia, and now the sumptuous fowl were often on his diplomatic table in Passy, France. Ever practical, Benjamin “Waste not, want not” Franklin clearly appreciated that a bird of taste and courage could nourish both the body and spirit of the nation.

I write about food, using it as an interpretive tool to understand history and historical figures, so I was delighted to see that Benjamin Franklin liked writing about the topic as well, in his letters and political articles. Franklin realized the impact that foods—particularly, native foods—had in building the identity of a new nation. By the time he wrote that letter to his daughter praising the courage of turkeys, Franklin had pondered and promoted the idea of an American identity for more than 40 years, first as a loyal colonist, later as an emerging patriot, and then as one of the nation’s founders. He championed the things that set his homeland apart from its English and European heritage. Out of his ponderings he would come to define the essential American persona and the ingredients of American success. American geography and its bounty, including its food, were central to the recipe.

Franklin recognized American potential early. In his 1751 essay Observations Concerning the Increase of Mankind, written 25 years before the Declaration of Independence, he explained, “so vast is the territory of North America, that it will require many ages to settle it fully; and, till it is fully settled, labor will never be cheap here, where no man continues long a laborer for others, but gets a plantation [farm] of his own, no man continues long a journeyman to a trade, but goes among those new settlers, and sets up for himself, &c.” Franklin contrasted this vista of opportunity with life across the Atlantic, where “Europe is generally full settled with husbandmen, manufacturers, &c. and therefore cannot now much increase in people ….”

Throughout his political and diplomatic career—including 26 years spent in England and France, before and during the American Revolution—Franklin worked hard to convey the strengths of the American setting and the American character. He became a celebrity when he lived near Paris from 1776 to 1785, his image adorning all manner of objects. People felt as though they knew him personally, and he would happily answer their questions about America.

Often, his message was that success in America required persistent hard work, and that no one should travel to the New World unprepared for the challenge. “Many I believe go to America with very little; and with such romantic schemes and expectations as must end in disappointment and poverty,” Franklin wrote in another 1783 letter to Sally and her husband Richard Bache. “I dissuade, all I can, those who have not some useful trade or art by which they may get a living; but there are many who hope for offices and public employments, who value themselves and expect to be valued by us for their birth or quality, though I tell them those things bear no price in our markets. But fools will ruin themselves their own way.”

Products of the land fit easily into Franklin’s vision of American industriousness and greatness. When he lived in London and worked as an agent for the Pennsylvania Colony, in two postings between 1757 and 1775, his wife Deborah sent him a wide variety of his favorite American foods: smoked wild venison, home-cured hams, and dried peaches—he preferred those dried without the peel. She sent kegs of cranberries, which caused great wonder among Franklin’s landlady’s kitchen staff. Deborah shipped barrels of her husband’s favorite Newtown Pippin apples, an American native variety famed for its superior keeping qualities. She sent grafted trees for planting on Franklin’s friends’ estates, symbols of the productive exchange of ideas and commerce he sought to encourage between the Colonies and the Crown.

But it was maize, the primary American grain, that was Franklin’s most ideologically impactful import. Shared by Native Americans with the pilgrims and, now, after more than 100 years of European settlement, widely cultivated on farms across the colonies, this “Indian corn,” as it had been commonly known, was often praised by Franklin for its taste and variety of uses. Deborah’s packages included dried or parched corn kernels, cornmeal, and nocake—a flour made from parched corn that Franklin’s London cook probably used to make pancakes. In letters home to Philadelphia, Franklin thanked Deborah for the corn meals and flours she sent: “For since I cannot be in America, every thing that come from thence comforts me a little, as being something like home,” he wrote, noting specifically that, “The nocake proves very good.”

Benjamin Franklin often considered the essence of his homeland from afar, especially as the 1760s brought the disintegration of the mutually supportive and respectful relationship between England and its American Colonies, which he and other patriots had sought to cultivate. In 1764, during the height of the Stamp Act controversy, Franklin, then living in London, wrote several letters to the editor using pen names such as “Homespun” to make the case for the Colonies. Again, his thoughts turned to corn. He employed it as a metaphor to dramatize the differences between the dynamic, diverse American settlements and the staid English homeland.

In one essay published in several London newspapers in early January 1764, Franklin contrasted the essential American grain’s virtuosity and variety with the limitations of lowly English wheat. “[Maize] is one of the most agreeable and wholesome grains in the world; that its green ears roasted are a delicacy beyond expression; that samp, hominy, succatash, and nocake, made of it, are so many pleasing varieties; and that a johny or hoecake, hot from the fire, is better than a Yorkshire muffin.” Franklin continued, saying that British essay writers who preferred “the roast beef of Old England” and condemned corn as “disagreeable” and “indigestible” without even tasting it suffered from a misguided sense of superiority. He saw their snobbery as a metaphorical parallel to the Crown’s (faulty) assumption that it understood American people and possibilities, and thus knew best how to rule the colonies.

The first shots of the Revolution were at the battle of Lexington and Concord in April 1775, and in June and July of 1776 members of the Continental Congress wrote and signed the Declaration of Independence, with its aspirations to “life, liberty, and the pursuit of happiness”—truths that were, in the word Benjamin Franklin himself wrote into that founding document, “self-evident.” Franklin would spend the next 10 years in France promoting American freedom and possibilities.

After the war, when an old English friend, the Earl of Buchan, sought resettlement advice, Franklin told him that, “The only encouragements we hold out to strangers, are a good climate, fertile soil, wholesome air, and water, plenty of provisions and fuel, good pay for labor, kind neighbors, good laws, liberty, and a hearty welcome. The rest depends on a man’s own industry and virtue.”

Franklin cautioned that America’s streets were not paved with gold. But, he might also have added, the new nation’s fields and orchards were indeed filled with delicious turkeys, exceptional apples, and golden grains of opportunity.

HOARDERS: CLAIRE AND VANCE
By Kate Durbin | February 16, 2018
C: I’m Claire, I’m an avid reader, aspiring writer, and I collect a lot of books filling the entire house so there are only narrow crevices to squeeze through; windows blocked with books so no natural light comes in; floors buckling under the weight of paperback stacks
C: My husband Vance and I have been married for 42 wonderful, crowded years Vance’s tie hung on the sides of a bookcase; Claire’s shirt flung over a mound of books
V: My name is Vance, I’m a teacher and a book lover from way Back Mystics and Messiahs, Basic Teachings of the Great Philosophers, Paganism Book
C: In this house, we have two very familiar phrases—“I love you” and “timbeeeeeer!” pile of books collapsing
C: We have books that go to nine feet high in some places Hurricanes and Tornadoes, Jack and the Beanstalk
C: On the first floor it is wall-to-wall books, with only a narrow path through Versailles, American Castles, Great Houses of Washington DC; Claire turns sideways to move through a crack between ceiling-high stacks; she walks on top of books, steps uneven
V: We find that uh getting up and down the stairs can be a Challenge Gaudi, Allergy, William Morris, Corpse
C: Books are our passion; we are omnivores of every kind of Information Exam Cram MSCR Core, MSCR Network Plus, Installing GNU/Linux, Windows 98, Practical Windows, Presenting Java, Official BONG, Access 2000
V: I tested over 200 on an IQ test; I instruct in some 31 different subjects, ranging from mathematics, political science, geography, psychology—all at the college level Crocheting for Dummies, Screenwriting for Dummies, Organic Chemistry for Dummies, British Sign Language for Dummies, The Ancient Egyptians for Dummies, The British Monarchy for Dummies, Catholic High School Entrance Exams for Dummies, Composting for Dummies, Atheism for Dummies, Ballet for Dummies, Baby Massage for Dummies, Dad’s Guide to Baby’s First Year for Dummies, Bird Watching for Dummies, Dog Photography for Dummies, Second Life for Dummies, Solving Cryptic Crosswords for Dummies, Workplace Conflict Resolution Essentials for Dummies New Zealand and Australian Edition, Work/Life Balance for Dummies Australian Edition, Veterans Benefits for Dummies, Starting an iPhone Application Business for Dummies, Acid Reflux Diet for Dummies, Cooking with Chia for Dummies, Building Chicken Coops for Dummies, Wilderness Survival for Dummies, Being a Great Dad for Dummies, Body Language for Dummies, Boosting Self-Esteem for Dummies, Build a Better Life Box Set for Dummies
C: My kitchen is not a kitchen anymore Cooking Without a Kitchen
V: It takes a little bit of dexterity to get to the stove Operation Dragoon
C: So mainly we eat dusty Jell-O boxes and Mike’s Hard Cranberry Lemonade nestled between massive stacks of remaindered hardcovers on top of the stove; inside the stove is The Bell Jar; the fridge doors are ajar, shelves filled with Harlequin romances
C: Next to my bed, I have a pile of books; in the middle of the night sometimes, it all comes crashing down on me Shakespeare’s Tragedies, Greek Classics, The Civil War, Gone with the Wind, Pearl Harbor, PERL, 1812: The War that Forged a Nation, Return of Depression Economics: The Year 2008, The Battle of Britain, The Battle Plan for Prayer, Terrorism, Counter-Terrorism, Great Gambles, World History, German Verbs, Introduction to Statistics, Wicca for Beginners, Wicca for Men
V: I think the conclusion we reached is we have an uneven bookshelf with a paperback of Herman Hesse’s Siddhartha precariously propping it up
V: Uh, we have reached the limit another bookshelf toppling; books falling everywhere
C: My biggest fear is my husband’s Relationships for Dummies
C: He has a heart condition and anything can happen at any time and the EMTs absolutely could not get him out of the Canadian Rockies Access Guide
C: The City patrols to make sure everybody has a parking pass, and they inferred from the stuff in our car that the house looked like that UFO CRASH, Tom Cruise Answer Book, Gray’s Anatomy, BIG BOOK OF PIZZA
C: They left a note on the door with intent to inspect the premises Nazi Germany
C: I don’t want to think about what would happen if the city came into the Battle of Waterloo
C: Vance and I met in college; we talked books, we were birds of a feather Sisson’s Synonyms
V: There was a happy collision of mutual interests Scottish Architecture, Copspeak, Encyclopedia of Psychological Problems, Gun Dog Breeds, Alternative Medications, Embracing the Moon
C: Vance and I knew each other 18 days before we got married Turning Life into Fiction
V: A few weeks after getting married, they had a book sale at the university; I came home with two shopping bags of books and my bride’s eyes got big The Dollar Crisis
C: When we moved into this house in 1977, we took out some of the appliances, and we just started moving books in bookcases where the washer and dryer would be; books stacked around and on top of the toilet
V: We moved in perhaps with 30,000 books; now it could be 500,000 Unsolved Disappearances in the Great Smoky Mountains
C: We are soulmates; we don’t have to be physical on a bed together to connect two single mattresses in separate rooms, each surrounded by walls of books
V: I don’t see how books can be a danger to anybody Berlin Wall, Bush Agenda, Gettysburg Jury, Elections, America Eats
V: Books don’t bite The Science of Jurassic Park and The Lost World, Dracula
C: We couldn’t see the books through the reading DaVinci Code
V: When you finally reach the point where you are tripping over your possessions, are you in possession of these possessions, or are they in possession of you? The Way Things Work Volume II
C: This is a financial problem for us, it is not psychological Freakonomics
V: We’re guilty of thinking that the answers in life come from books, and they decidedly do not American Poetry 1988-1997, Redneck Words of Wisdom, Hamlet

The post HOARDERS: CLAIRE AND VANCE appeared first on Zócalo Public Square.

Why Are There so Many Statues of Men on Horseback?
By Peter Louis Bonfitto | Fri, 16 Feb 2018

Statues are created to project meaning. Contemporary public artworks, for example, use purposely veiled messages aimed to generate thoughtful exchange with the viewer and to prompt reflection. By contrast, historic monumental sculptures employ symbolism that is direct and intentionally easy for viewers to understand.

The ancient Roman tradition of publicly displaying monumental equestrian statues of important historical figures is a particularly striking case of how to convey meaning in no uncertain terms.

Traditionally cast in bronze, these huge forms of horse and rider display messages of dominance, power, and virtue through strength. And, by doing so, they established a template that has persisted for centuries.

Perhaps no statue embodies these values more than a famous depiction of the emperor Marcus Aurelius on horseback.

The figure of the emperor is seated on top of a regal horse, artfully posed as if it were moving gracefully through a crowd. The rider fully controls his muscular mount. The horse’s bit and bridle indicate that there were originally bronze reins, which were separately cast pieces lost over the centuries. The emperor’s posture, legs, and complete mental dominance over the beast underscore his great power. The emperor effortlessly motions with an outstretched left arm and hand. Art historians have defined this gesture as one of “pacification,” displaying authority and the ability to subjugate foreign enemies or forces of chaos that threatened the stability of the Empire.

This monumental bronze equestrian statue, inarguably one of the most extraordinary artworks that has come down to us from antiquity, was created to commemorate Marcus Aurelius’ great victories over Germanic tribes in 176 CE, or possibly posthumously to honor his prosperous reign (161-180 CE), when he was canonized as one of Rome’s greatest emperors—a leader who ruled with intellect and decisive action.

Nearly two thousand years later, the statue has become a hallmark in art history textbooks and the pride of the Capitoline Museum in Rome, where droves of tourists flock to see the work and to match it against the illustrations in their guidebooks.

Equestrian statues were not first seen in Rome. Before the Roman Empire existed, other regional cultures used this form of representation to commemorate their noblemen, kings, and heroes. The Romans were acutely aware of these artworks, especially those from Greece, and sought to collect and display them in their luxury villas, as well as dedicate them in public and religious spaces.

Roman equestrian statues, like many equestrian statues before and after, were about much more than men with horses; they embody the relationship between the leader and the military. The equites, a military class, played an incredibly important role in Roman society. Generally speaking, the equites received positions of privilege through merit and imperial favor instead of through a noble bloodline. Emperors selected members of this class for their elite Praetorian Guard, and used them politically as a counterweight to the power of the high-ranking bureaucrats of the senatorial class.

The Western tradition of creating equestrian statues is interlinked with an appreciation of the equites. In the Marcus Aurelius statue, the emperor’s powers of statecraft are stressed, but his mastery of horsemanship clearly connects him to this elite military class, upon which he depended to manifest his far-reaching power.

Observed carefully, this statue of horse and rider idealizes Marcus Aurelius in an imagined moment of triumph. Romans loved victory, which they celebrated in grand civic gestures called “Triumphs.” They erected massive stone arches, many of which survive in the streets of Rome today, for their military heroes to ceremoniously march beneath in parades. Victorious generals or emperors would have ridden in horse-drawn chariots in these orchestrated events. While it is not specifically a depiction of a “Triumph,” the emperor wears a military cloak representing, perhaps, a humbler moment of victorious homecoming, one that symbolizes all of his achievements.

Ancient sources reference 22 equi magni—colossal bronze equestrian statues—that adorned the imperial capital. It is believed that this Marcus Aurelius statue was one of them, but centuries after the empire fell, the other 21 equi magni—along with hundreds of public statues throughout the city—were melted down during times of war or strife.

This single statue was preserved by a strange twist of fate. During the turmoil of Imperial Rome’s decline and its transformation into a medieval Christian center, the statue was incorrectly identified as the later emperor Constantine, who reigned from 306-337 CE. To later Christians, Constantine was a more favorable historic figure, as they widely believed that his 313 CE Edict of Milan, also known as the “peace of the church,” created a pathway for Christianity to become the principal religion of Europe. Medieval Romans could live more easily under the shadow of a statue identified as the Christian patron Constantine than with the polytheist Marcus Aurelius.

Although modern scholarship can easily settle the matter of the statue’s identity, the conflation of Marcus Aurelius and Constantine indicates that, though the two emperors ruled Rome in very different political and social periods, the statue’s message of military prowess and leadership could be applied to them both. Constantine’s most important victory was at the Battle of Milvian Bridge (312 CE), when he—not unlike Julius Caesar in his famous crossing of the Rubicon—led an army into the city, crushed his enemies, and consolidated his power.

The conflation of Marcus Aurelius with Constantine reveals another part of the statue’s message—in which the mounted hero embodies virtue. Marcus Aurelius has been historically cast as the emperor exemplar, a Stoic philosopher with deep moral convictions. His oft-quoted collection of writings, Meditations, remains standard college course reading. Constantine’s virtue, in contrast, is attributed to his Christianity: He reportedly had a vision of the cross before his fateful battle for control of Rome, which led to his status of sainthood as a true defender of the faith. In medieval and later depictions of Constantine, he is almost always looking up, a gesture meant to signal that God is directly communicating with the emperor in his moment of triumph.

The allure of Rome’s greatness held sway over Western culture for millennia and the Marcus Aurelius statue directly contributed to this phenomenon. Part of what made it so influential was simply the fact that it was visible.

In contrast to many excavated archeological discoveries, the statue of Marcus Aurelius and his horse stayed above ground, where it surveyed the streets of Rome for nearly 2,000 years. By the mid-10th century this statue stood near the Pope’s Lateran Palace, and in 1538 it became the focal point of Michelangelo’s new design for the Piazza del Campidoglio. A replica is there today; the original was moved inside the Capitoline Museum in 1981 for conservation reasons.

Bronze replica of the Marcus Aurelius statue in Rome’s Piazza del Campidoglio. Photo courtesy of Jean-Pol Grandmont/Wikimedia Commons.

Through the circumstances of its survival, its artistic quality, and its sheer size, the statue became widely known and revered throughout Europe long after the fall of Rome. Local artists and travelers copied it in their sketchbooks and used it as a model for their own artworks. Later rulers appropriated its formidable symbolism for themselves. Charlemagne commissioned his own portraits of dominance, using the Marcus Aurelius statue as a prototype. Machiavellian-style princes of Renaissance Italy revived the ancient Roman tradition in earnest, placing remarkably crafted bronze statues of themselves on mounts in their public squares.

From Europe, the tradition was exported to colonial territories from the early modern period well into the 20th century. The deeply embedded concepts of power and virtue on horseback were exploited by Mussolini and Franco to frame their autocratic regimes—even as the machine gun and other industrial weaponry made mounted warriors obsolete.

Although the specific messaging of any given equestrian statue changes through iconographic details, the basic model persists, as do the core themes of power and virtue. Most equestrian statues visible today in public spaces stick to form by portraying a stoic general on a sturdy horse with an outstretched arm, perhaps brandishing a sword for added effect. Typically, they wear ceremonial dress, not combat armor or field uniforms, a detail that brings them back to their greatest moments of triumph.

In Oyster Bay, the hometown of Teddy Roosevelt, stands a statue (not to be confused with the recently vandalized statue in New York City) that depicts the 26th U.S. president on horseback in his Rough Rider gear, alluding to his leadership of a volunteer cavalry regiment in Cuba during the Spanish-American War. Pulling decisively on the reins of his horse, and surveying the field before him, he is shown as a man of action, vigor, and clear mind—a rough-and-tumble message that echoes back to antiquity.

In recent months, Americans have fiercely debated whether to preserve or tear down statues of horse-mounted Confederate generals. But criticism of equestrian statues is hardly new to the 21st century; rather, uneasiness with their symbolism has been embedded within the tradition since ancient times. The famous first-century BCE orator and writer Cicero, who frequently commented on virtus—a Roman concept that combines the “masculine” virtues of excellence, valor, honor, and integrity—condemned the erection of public equestrian statues as shameless acts of arrogance. Not immune to political hypocrisy, Cicero later supported resolutions to honor his allies with equestrian statues installed in public spaces. But his core criticism regarding equestrian statues is clear: Representing an individual on horseback in a civic space suggests an infallibility of character and literally sets the subject above public scrutiny.

A few thousand years later, we can level some new criticism at the Marcus Aurelius statue, or the figure it represents. Even though Marcus Aurelius has historically been labeled one of the “good” emperors of ancient Rome, his army, as with any Roman military force, would routinely massacre, torture, mutilate, and terrorize its adversaries. On the domestic front, he ruled an empire that was bound to an especially brutal and dehumanizing system of slavery. The same indictments can be made against Constantine (who, in addition, ordered the execution of his wife and son), or any of Rome’s emperors.

Yet when one stares up at the statue today—either the one placed in a grand gallery of the Capitoline Museum or the replica in the historic courtyard—the viewer is urged to push those modern criticisms away. The magnificent rider and his steed radiate power and nobility, as if they still were striding through the streets of Rome.

Who would argue that such a venerated artistic marvel is problematic?

The statue itself does not explain the complicated historical circumstances of Roman rule or Marcus Aurelius’ reign, because it has no intention to do so. Instead, it purely celebrates a man for his deeds and the mark he left on the world as viewed by those who commissioned the statue.

Even though historic equestrian statues are direct in their messaging to the viewer, today—if we are viewing these statues for the first or the 100th time—we have the power, and hopefully the virtue, to take a deeper look.

The post Why Are There so Many Statues of Men on Horseback? appeared first on Zócalo Public Square.

The “Little Giant” Who Thought That Backing Slavery Would Unite America
By Graham A. Peck | Thu, 15 Feb 2018

One of the most ambitious attempts to unite America ended up dividing it, and altering it forever.

At the opening of the 33rd Congress on December 5, 1853, Stephen A. Douglas, the short, rotund U.S. Senator from Illinois, planned an ambitious legislative program of national expansion.

“The Little Giant,” as he was known, sought to establish a new territory—Nebraska Territory—fashioned from the immense tract of Indian-occupied land that stretched west to the Rockies from Missouri, and north to Canada. He desired to remove the Indian inhabitants; to survey, sell, and populate the land; and to construct a transcontinental railroad to unite the country.

“How are we to develop, cherish and protect our immense interests and possessions on the Pacific,” he asked fellow senators, “with a vast wilderness fifteen hundred miles in breadth, filled with hostile savages, and cutting off all direct communication?” The solution, he said, was removing the “Indian barrier.” That done, Americans soon would command a continental empire.

Only 40 years old in 1853, Douglas was already a figure of national renown. He had played the key role in defusing the nation’s crisis over slavery in 1850, when Southerners had threatened to leave the Union, by skillfully shepherding a bevy of compromise measures through Congress. These had included acts authorizing territorial settlers to determine slavery’s legality in states carved from the huge domain acquired in 1848 from Mexico. Those measures had, for the time being, silenced debate over Congress’s power to control territorial slavery.

Most Southern politicians contended that the United States Constitution required Congress to protect slaveholders’ property rights in national territories. Most Northern congressmen replied that slavery existed only if state law sanctioned it. Douglas sought to sidestep the constitutional logjam by permitting territorial settlers to determine slavery’s legality.

He initially termed this doctrine congressional “non-interference” with slavery, but subsequently invoked the talismanic phrase “popular sovereignty” to lay claim to the concept of self-rule. Popular sovereignty was the core of the country’s democratic experiment; hence, to Douglas, territorial settlers deserved the right to determine their fate. Slaves did not get a vote.

At stake was not only slavery’s potential expansion, but also the meaning of the nation. Both Southerners and Northerners lauded freedom, yet in doing so they meant different things. Southern politics reflected a proslavery perspective that rested upon the massive wealth of slave property and the immense profits of plantation agriculture. By contrast, Northern politics reflected both a burgeoning free population’s insatiable demand for western farmland and a moral resistance to making chattel of men.

At the intersection of the country’s clashing definitions of freedom and its conflicting economic interests stood Stephen A. Douglas and popular sovereignty.

Douglas, like many Northern Democrats, had long been a middleman between North and South. Born in Vermont, he had migrated to central Illinois as a young man, launched a political career, and successfully solicited the support of Southern-born settlers who had made central Illinois their home. Once in Congress, he socialized with Southern Democrats and married a wealthy slave owner’s daughter, whose inheritance in 1848 made him the beneficiary of a large slave plantation.

His political views marched in tune. A leading voice for both national expansion and national union, he invariably sought compromise when conflicts between slavery and freedom arose. To Douglas, the Union required compromise over slavery, as it had at the nation’s founding. For this reason alone, he denounced antislavery activists. Yet their egalitarian racial creed also repelled him. If, as Douglas believed, blacks were not equal to whites and did not possess inalienable rights to life, liberty, and the pursuit of happiness, then slavery was justifiable, perhaps even necessary. Hence popular sovereignty profoundly appealed to Douglas: It reflected America’s democratic practice and racial sensibilities, and, if adopted as a national principle, promised to end conflict over slavery’s expansion.

Yet the clash of slavery and freedom meant that Douglas’ dream for Nebraska Territory was not without risk. Expansion tended to bring the country’s divergence over slavery into sharp relief. In a new territory, slavery would either be lawful or not. This basic fact often brought the politics of slavery to the fore.

U.S. map of 1856 shows free and slave states and populations. Image courtesy of Wikimedia Commons.

In Nebraska, the politics of slavery were explosive. The Missouri Compromise in 1820—which had settled a bitter and prolonged debate over Missouri’s admission as a slave state—pledged to keep free the land that Douglas now sought to organize as Nebraska Territory. Yet Southern congressmen in the early 1850s resisted creating new free territories and refused to supply the votes by which future antislavery congressmen would subsequently make their way to Washington, D.C.

Likely for this reason, Southern senators in the preceding Congress had tabled Douglas’ previous Nebraska bill, which had been organized under the Missouri Compromise’s antislavery provisions. So, in 1853, Douglas incorporated popular sovereignty into his new bill, intending to resolve not only the logjam over Nebraska Territory, but also future conflicts over slavery’s expansion.

His decision forever altered American history.

The ambitious bill precipitated a massive battle over the future of freedom in America. Introduced on the first day of the congressional session, it would pass Congress as the Kansas-Nebraska Act in May 1854. But the backlash against it was stupendous. Northerners angrily rejected Congress’ possible repeal of the Missouri Compromise’s antislavery prohibition.

Mass political meetings in countless cities protested the bill. Three thousand New England clergymen sent a signed petition against the bill “in the name of Almighty God.” And, when Douglas traveled through Ohio’s antislavery Western Reserve on his return to Illinois, he could see his burning effigy “upon every tree” he passed.

Douglas believed that agitators had whipped up the crisis and predicted that the act would “be as popular at the North as it is at the South” once its democratic character was fully comprehended. Instead, the 1854 elections portended an antislavery revolution in the North. Candidates supported by hastily assembled and motley anti-Nebraska coalitions received an avalanche of votes that swept incumbent Democrats from Congress.

By 1856, the recently founded antislavery Republican Party had emerged as the North’s majority party. And in 1860 Republican voters put Abraham Lincoln in the White House by claiming almost every electoral vote in the Northern states. Southern states responded with secession, and then war, and soon the political contest over the meaning of freedom in America escalated into a remorseless revolutionary struggle over the fate of slavery and the nation. The nation survived the struggle. Slavery did not.

In 1853, Stephen A. Douglas had not sought to make an antislavery nation. That achievement, which our country now so justly prizes, was thus ironically made possible by one of the many Americans tolerant of slavery.

But that irony also reminds us that the powerful hold of slavery on antebellum Americans—Northerners and Southerners alike—is a central reason why we still struggle with the legacy of the Civil War. Our 19th-century forebears tried to put freedom into practice, and left a complicated heritage. We will do the same.

The post The “Little Giant” Who Thought That Backing Slavery Would Unite America appeared first on Zócalo Public Square.

How Iranian Women Turn “Pious Fashion” Into Under-the-Radar Dissent
By Elizabeth Bucar | Wed, 14 Feb 2018

In 2018, Islamic clothing is officially cool. CoverGirl has a hijabi ambassador. H&M sells a popular modest clothing line. Even Barbie wears a headscarf on a doll modeled after the American fencer Ibtihaj Muhammad.

Despite this cool factor, Islamic women’s headscarves and clothing retain strong associations with piety and politics, symbolism that is wielded both by the woman in the clothes and the people around her. In countries where Muslims are minorities, as in the United States, merely wearing hijab is seen as a political act, albeit one that can be interpreted in many ways. Shepard Fairey created an image of a woman wearing a flag hijab as a sign of tolerance and inclusivity, while others claim that the scarf is a sign of Muslim women’s repression.

In Muslim-majority countries, however, the symbolism—and the way that women and the state both use hijab to express ideas—is deeper and more interesting. Rather than arguing about whether or not Muslim women should dress modestly, I study how Muslim women dress: what they are wearing and why, and how they use fashion to exert political influence.

Muslim-majority countries have a history of regulating women’s clothing through official dress codes, whether banning headscarves or requiring them. In Iran, for instance, Muslim women’s dress was a political matter long before it became the symbol of revolution in 1979. The shah banned the full-body covering called chador in 1936 as part of his attempt to undermine the authority of the Shia clerics and westernize Iranian women.

Now, of course, Islamic clothing is required for women in Iran by law. Drafted under Ayatollah Ruhollah Khomeini’s leadership as part of his vision for a public space governed by the principles of Islamic morality, these laws include harsh punishments for inadequate hijab—jail time, fines, even 74 lashes with a whip. Harassment and arrests for violations became commonplace after the revolution.

Despite conditions of discrimination—because requiring a headscarf and modest clothing is discriminatory—pious fashion comes in a remarkable range of styles in Tehran. One option is to wear the floor-length chador draped over the hair and shoulders. The alternative to the chador is a coat-like manteau with some sort of head covering. There are two popular head coverings to pair with a manteau. One is a sort of balaclava, called a maghneh. But the fashionable women of Tehran wear a rusari—a scarf covering the head and knotted under the chin or wrapped around the neck, personalized by fabric, color, pattern, and style of drape.

In this urban casual look, the hardware on the Dr. Martens boots echoes the studs on the Valentino crossbody bag. The Topshop floral leggings are the stand-out item, made even cooler by being paired with utilitarian items like a black scarf and a military jacket. The graffiti in the background is of a dish-soap bottle. Photo courtesy of Anita Sepehry/the Tehran Times fashion blog.

Though these items represent the building blocks of modest garb, they do not define its expression. Women define what pious fashion looks like when they get dressed every morning—whether they wear structured separates accessorized with designer sunglasses, flowy pastel chiffons embellished with rhinestones, or ripped jeans tucked into combat boots. On the streets of Tehran, in its cafés and places of business, women find ways to use their clothing to make claims about what counts not only as fashion, but also as piety.

Within a regime that has attempted for decades to promote dress codes as a way to craft particular types of Muslim citizens, and in which direct political resistance is dangerous, clothing has become a form of political engagement that is potentially powerful because it can sometimes slide under the radar as a matter of culture versus statecraft.

What sort of power can modest clothing choices have? For one, dress becomes a way to access governmental office. Women hold numerous advisory roles in government. Chador is a requirement of appointment to these positions. But this limitation also creates an opportunity. Women can take advantage of the symbolic meaning of the chador to mark themselves as supporters of the theocracy, independent of their actual political views.

Then there is a more recent popular style integrating traditional Kurdish, Turkoman, or Indian motifs and embroidery. Called lebase mahali, which means “local clothing” in Persian, it does not push the boundaries of modesty. But it does something else: It highlights Persian and Asian aesthetics over Islamic and Arabic ones. This Persian ethnic chic undermines current Islamic authority, sometimes unintentionally, simply because it draws on sources of authority that predate the Islamization of Iran.

This power to critique through sartorial choice comes with substantial risk. Since clothing is so strongly linked to character, a bad outfit can be seen as a reflection of poor character. In Iran, there is even a term for this: bad hijab. Bad hijab can be both an ethical failure (too sexy) and an aesthetic failure (not tasteful). It’s a concern of the authorities because bad hijab disrupts the public Islamic space that Iranian theocracy tries to create.

The infamous morality police have often targeted women for what they deem bad hijab, but they are not the only ones. In fact, the first time I noticed this kind of policing was while shopping with my Iranian friend Homa. “Liz, this is a good example of bad hijab for you,” she said when a young woman walked by. Homa was quite happy to elaborate: “Her ankles are showing, her pants are rolled up, they are made of denim and tight. Her manteaux is short, slit up the side, tight, made of thin material, and exposes the back of her neck and her throat. And her rusari, look at her rusari. It is folded in half so that her hair sticks out in front and back and tied so loosely that we can see all her jewelry. Plus, her makeup is caked on.”

This outfit is a glam version of edgy hijab. The Alexander McQueen-style skull-patterned scarf, fur vest, and Givenchy Rottweiler print clutch give the woman a rock vibe. Photo courtesy of Donya Joshani/the Tehran Times fashion blog.

Homa’s determination of bad hijab was based on a number of perceived violations. The first problem was that the woman’s outfit exposed parts of her body legally required to be covered. Homa also disapproved of the woman’s jeans—reflecting a widely held opinion in Iran that denim is improper for women to wear for both aesthetic reasons (as a fabric that is too casual) and political reasons (as a Western fabric that might infect the subject with Western ideas).

Homa spent considerable time describing for me why this woman’s rusari was inadequate. In this case, the violation depended in part on the scarf’s gauzy material, which was translucent. The way the scarf was worn was also a problem: By folding the rusari in half lengthwise, the woman only covered half as much hair as normal. Homa had also judged the woman’s heavy hand with makeup a hijab “failure” because it made her appear more alluring to the opposite sex.

Why so catty? Of course women, even pious ones, can be hard on one another, but there is more to learn from Homa’s reaction. An accusation of bad hijab is an expression of her own concern over sartorial practice. Pious fashion creates aesthetic and moral anxiety. Am I doing it right? Do I look modest? Professional? Stylish? Feminine? Women try to resolve this anxiety by identifying who is doing it wrong. Improper pious fashion is what allows proper pious fashion to redefine itself away from stigma to style: If this mystery woman was wearing bad hijab, then surely Homa was a sartorial success.

Homa’s accusation of bad hijab might have helped legitimate her own clothing choices, but it came at a cost. Public shaming of Muslim women’s dress relies on a specific ideology of how women should appear in public, and women themselves are not exempt from promoting this aspect of patriarchy. By policing other women, they accommodate existing ideology to improve their own status.

At the same time, bad hijab is politically potent because it can shift the boundaries of successful pious fashion, sometimes expanding those boundaries, sometimes narrowing them. Homa might have been outraged by what this mystery woman was wearing, but she was violating some of the very same norms: Her own ankles were showing, her hair peeked out from her scarf, she had on foundation, eyeliner, and mascara.

And when everyone is showing her ankles and painting her toes, it sends a very personal signal about how the state’s power to define women’s morality is declining. What are my friends wearing? What are designers producing? What are bloggers posting? These are the sorts of things that influence what Iranian women wear, not only the threat of police surveillance and arrest. Besides, there are not enough police in Tehran on a hot summer day to arrest every young woman wearing capris.

In a surprise public statement last December, Brigadier General Hossein Rahimi, head of the Greater Tehran police, admitted as much. He announced that women found to be wearing bad hijab would no longer be arrested but would instead be sent to morality classes. It is too soon to say whether this is a clear sign of a shift in Iranian politics. But if it does signal a positive change, credit goes to women’s sartorial savvy, not the police. And to the public, which would undoubtedly react if everyone wearing nail polish were administered the 74 lashes permitted in the penal code.

In recent weeks a few Iranian women have protested the forced dress code directly. They stand on top of utility boxes, take off their headscarves, and wave them on sticks. These protests have resulted in dozens of arrests, proving that in the current political climate bad hijab might be tolerated, but no hijab is going too far. Images of these protests on Twitter include women in full chador waving headscarves in solidarity. This is a good reminder that it is not the wearing of hijab that Iranian women oppose, but rather the government’s attempt to police their bodies. The protesters and the Iranian authorities agree on at least one thing: what women wear matters.

The post How Iranian Women Turn “Pious Fashion” Into Under-the-Radar Dissent appeared first on Zócalo Public Square.

Pediatrician and Author Nadine Burke Harris
In the Green Room | Wed, 14 Feb 2018

Nadine Burke Harris is the founder and CEO of the Center for Youth Wellness in San Francisco’s Bayview-Hunters Point district. She is the author of The Deepest Well: Healing the Long-Term Effects of Childhood Adversity, and her TED talk “How Childhood Trauma Affects Health Across the Lifetime” has been viewed over three million times. Before taking part in a Zócalo Public Square event titled “Does Childhood Trauma Live In The Body Forever?” at the National Center for the Preservation of Democracy in Little Tokyo in downtown Los Angeles, she spoke in the green room about mental illness in her family, thinking about San Francisco while working in Haiti, and what’s most Jamaican about her.

The post Pediatrician and Author Nadine Burke Harris appeared first on Zócalo Public Square.

In Whose God Do Americans Trust?
By Matthew Bowman | Tue, 13 Feb 2018

Charles Bennett, a Democratic Congressman from Jacksonville, Florida, was afraid of communism. In July 1955, he spoke of his concerns on the floor of the House of Representatives. “In these days, when imperialistic and materialistic communism seeks to attack and destroy freedom, we should continually look for ways to strengthen the foundations of our freedom,” he told his fellow members of Congress. Bennett’s proposed solution was simple: Americans could add the phrase “In God We Trust” to their dollar bills. By consensus, Congress adopted Bennett’s resolution.

Americans’ ready embrace of the phrase in the 1950s seems foreign to contemporary politics, in which so many Americans find invocations of Christianity frustrating. But seen in another light, the story of “In God We Trust” actually helps explain that frustration. The phrase seems straightforward and clear, but any interrogation of it shows how quickly its meaning dissolves into ambiguity. The “God” of the phrase is by implication—though not explicitly—the monotheistic deity of the Hebrew and Christian Bibles. But what it means to “trust” is unclear, and “we” is undefined. Does it refer to all Americans? Moreover, the same is true for the word “Christian” itself. What does it mean to be a “Christian candidate”? What does it mean to have a “Christian nation”? During the Cold War, Protestants like Bennett imagined the existence of a religious consensus and coined the phrase “Judeo-Christian” to describe it—but whether that consensus actually exists remains up for debate.

Putting the name of God on American currency was not a new idea in 1955. The words “In God We Trust” had first been placed on coins in 1861 at the direction of Treasury Secretary Salmon P. Chase, at the urging of a Christian minister named M.R. Watkinson. For a week, Chase pondered Watkinson’s letter, which suggested placing the phrase “GOD. LIBERTY. LAW.” on coins. Then the treasury secretary instructed the director of the Mint to stamp the declaration on the money, but he changed the wording to “In God We Trust,” which Chase—always pompous—thought stylistically superior to Watkinson’s suggestion. The phrase appeared on American coins intermittently from Chase’s time until 1938, when it began appearing on all metal currency.

Bennett’s resolution fixed its appearance on paper currency as well. In the 1950s, with the country feeling under siege, his proposal to revive Chase’s plan received wide support from both political parties and President Dwight Eisenhower. At almost precisely the same time that Bennett’s measure passed, Congress voted to add the phrase “under God” to the Pledge of Allegiance. Senator Homer Ferguson, a Republican from Michigan, explained that he supported that resolution because “Our nation is founded on a fundamental belief in God, and the first and most important reason for the existence of our government is to protect the God-given rights of our citizens.”

Americans’ enthusiasm to publicly declare their faith in God—both in the turbulent 1860s and 1950s—was more than simple piety; these were acts of public theology. American Christians did not merely advocate for public acknowledgement of God; they offered interpretations of what that faith should mean for U.S. democracy. Bennett, Ferguson, and Chase each performed a ritual of national sanctification, trying to bind together in the public eye what they took for granted: that American democracy depended upon the virtues of what they thought of as simply “Christianity,” but which in retrospect seems to be Protestantism.

Chase, Bennett, Ferguson, Eisenhower: all were Protestants, and as Protestants they argued that Christian theology provides a transcendent rationale for the importance of democracy and human freedom. Protestants had been suspicious of authority—whether spiritual (as in church leadership) or worldly (as in kings and queens)—since the Reformation. Only a few years before the American Revolution, for instance, a popular political cartoon depicted a band of restive Boston colonists banishing an Anglican bishop that the British crown had sent to them. The protesters are shown shouting slogans like “No Lords Spiritual or Temporal in New England” and “Liberty & Freedom of Conscience.” Words like “liberty,” “freedom,” and “rights” linked Christianity to individual independence, and fueled what Protestant Westerners in general and Protestant Americans in particular thought of as Christian civilization: a society in which democratic politics and Protestant religion were intertwined into a creed of personal liberty.

Grasping this connection helps explain why some American Protestant activists today insist that the health of the state depends upon whether their form of Christianity is given space in the public sphere to thrive. And yet, that insistence occludes the fact that even within American Protestantism, there have been distinct visions of the form of Protestant faith that democracy depends on.

Dwight Eisenhower, for instance, was famously convinced of the importance of religious belief to American democracy, holding to the same mainline Protestant convictions as Bennett or Ferguson. He was baptized Presbyterian soon after he was inaugurated as president because he believed in the importance of Protestantism to the health of the state. “A democracy cannot exist without a religious base,” he argued, famously stating that, “Our government has no sense unless it is founded in a deeply felt religious faith, and I don’t care what it is”—a seeming ecumenical gesture he quickly qualified by asserting that the United States was based upon the “Judeo-Christian concept.”

For Eisenhower, as for many other Cold War American Protestants, the “Judeo-Christian” tradition was a way to unite Protestants, Catholics, and Jews. It dismissed niceties of theology or particularities of religious ritual in favor of a nebulous sense of solidarity. In practice, however, “Judeo-Christianity,” like the people who used the word, often took for granted Protestant assumptions: suspicion of religious hierarchy, association of religious faith with individual piety and moral practice, an emphasis upon personal feeling rather than corporate participation. All these things were to them compatible with American democracy.

But to other American Christians—even American Protestants—the generalities of Eisenhower’s Judeo-Christianity were not sufficient, and the words “In God We Trust” took on a very different meaning. The struggle over what it means to be a Christian in America is thus very much up for debate, with some insisting that the obligations of the term extend far beyond Eisenhower’s gentle platitudes.

For example, the pastor David Barton runs a large evangelical ministry devoted to demonstrating that the American Founding depended upon Christian ideas. Barton is representative of the politically and theologically conservative subset of Protestants today sometimes called the “Religious Right.” Barton goes much further than Eisenhower, formulating a Christianity of much more detailed theological expectation and far less comfort in the mainline of American culture. Not content to claim that Christians share simply generalized pieties, Barton argues that the American Founders shared his own particular brand of American evangelicalism, marked by the importance of affirming Jesus Christ, by social conservatism, and by small government. For Barton, then, not only were American leaders from the Revolution through Eisenhower influenced by Protestant ideas of liberty; they explicitly intended to found a state shaped by the sort of pious evangelical Christianity that Barton himself embraces.

Surveys have shown that conservative activists like Barton have been somewhat successful in claiming the term “Christian”—many young Americans now associate it with Barton’s conservative social politics. But Barton faces a number of Protestant critics, like historian John Fea, who argue that his form of evangelical patriotism is not simply bad politics, but bad Christianity. For Fea, Barton’s veneration of the American Founders is idolatry, hardly Christianity at all.

Similarly, other Protestants contend that the relationship between their faith and democracy should point American politics in different directions than Barton’s economically libertarian and socially conservative ideology. While Fea offers a theological criticism of Barton’s faith, Jim Wallis maintains that Barton’s politics are distant from what Christianity demands. Wallis, a Methodist pastor and political activist known best for assailing American capitalism and calling for a stronger welfare safety net, argues that Protestant liberty means that the Christian community should use civil authority to promote the economic independence and well-being of all its members. For Barton, Protestant morality requires social conservatism, opposition to abortion and same-sex marriage. For Wallis, Protestant morality means hospitality and welcoming the stranger.

In truth, the relationship between American democracy and American Christianity remains open to debate—and the spectrum from Barton to Eisenhower to Wallis encompasses only white American Protestants.

When he proposed adding “In God We Trust” to American currency, Charles Bennett declared that the sentiments behind the phrase were “indigenous to our country.” He assumed a common heritage and common understanding of what Christianity might mean. But contemporary disputes, like those between Barton and Wallis, illustrate that the argument about that relationship has never been so simple as any particular American, or American Christian, might wish.

The post In Whose God Do Americans Trust? appeared first on Zócalo Public Square.

The Myth of Untouched Wilderness That Gave Rise to Modern Miami
By Andrew K. Frank | Mon, 12 Feb 2018

Miami is widely known as the “Magic City.” It earned its nickname in the late 19th and early 20th centuries, shortly after the arrival of Henry Flagler’s East Coast Railroad and the opening of his opulent Royal Palm Hotel in 1897. Visitors from across the country were lured to this extravagant five-story hotel, at the edge of the nation’s southernmost frontier. From their vantage point, South Florida was the Wild West—and Miami could only exist if incoming settlers were able to tame it. And tame it they did. Miami’s population boomed, from roughly 300 in 1896 to nearly 30,000 in 1920. Onlookers marveled as the “metropolis” seemed to emerge overnight from the “wilderness.”

This legend, repeated for more than a century, blends truth with fiction, and reminds us that history is as much about forgetting as it is about remembering. Flagler and a woman named Julia Tuttle stand at the center of the story: The importance of Flagler’s East Coast Railroad and Royal Palm Hotel led some residents to propose naming the city after him, and he is often depicted as the city’s “father.” Tuttle, a businesswoman who lured Flagler to Miami and otherwise promoted the region during the 1890s, earned the title of “Mother of Miami.” But Tuttle and Flagler did not create something out of nothing. On the contrary, Tuttle’s home and Flagler’s hotel stood precisely where earlier settlers had already left indelible marks over 2,000 years of continuous occupation.

These Miamians included Tequesta Indians who lived in the area for more than 1,500 years, and Spanish missionaries who tried to convert them; enslaved Africans tasked with turning the land into sugar fields, who instead created orchards of fruit trees; Seminole Indians who came to trade and harvest the local bounty, and U.S. soldiers who waged a war to exterminate them; and a continuous stream of Bahamian mariners, fugitive soldiers from various armies, and shipwrecked sailors. These earlier generations have been forgotten largely because Tuttle and Flagler were master illusionists who engaged in a combination of physical sleight of hand and intellectual misdirection. Rather than create something out of nothing, they built upon the storied history that preceded them—and then helped others forget it.

Julia Tuttle, widely known as the Mother of Miami. Photo courtesy of the State Archives of Florida, Florida Memory (https://www.floridamemory.com/items/show/29793).

Tuttle clearly knew that she was not the first occupant of her waterfront property. Recently widowed, she relocated in 1891 from Cleveland to the mouth of the Miami River on Biscayne Bay, where she worked tenaciously to promote the region as a commercial and agricultural opportunity. Tuttle moved into a 19th-century plantation house that had been built by enslaved Africans in the early 1830s, and constantly referred to it as “Fort Dallas,” which had been the name given to it when it was turned into a military outpost during the Second Seminole War (1835-1842). Tuttle’s property contained a man-made well, a stone wall, and several gravestones. There was a decades-old road that connected her home to the community on the New River—today’s Fort Lauderdale—and elsewhere up the Atlantic coast.

Still, despite all this evidence of earlier occupation, Tuttle declared to all who would listen that she was a founder of a new community. In words that would be widely repeated, she explained her ambitions. “It may seem strange to you but it is the dream of my life to see this wilderness turned into a prosperous country,” she wrote. One day, she hoped, “where this tangled mass of vine brush, trees and rocks now are to see homes with modern improvements surrounded by beautiful grassy lawns, flowers, shrubs and shade trees.” Tuttle wanted to “settle” a place that had been settled for centuries and turn it into an agricultural or commercial entrepôt.

James Henry Ingraham, president of the South Florida Railroad Company of the Plant System, was but one of the newcomers she impressed. Ingraham proclaimed that Tuttle had “shown a great deal of energy and enterprise in this frontier country where it is almost a matter of creation to accomplish so much in so short a time.” But his description of Tuttle’s efforts, too, revealed the preexisting history that made her successful. Tuttle, he wrote, “converted [Fort Dallas] into a dwelling house after being renovated and repaired with the addition of a kitchen, etc. The barracks … is used as office and sleeping rooms.” Despite her “improvement … on hammock land which fringes the river and bay,” Ingraham explained, the natural world remained largely untamed. “Lemon and lime trees,” which were planted by the earlier waves of Spanish, Bahamian, and American occupants, “are growing wild all through the uncleared hammock.” Ingraham, Tuttle, and others knew that citrus was not native to South Florida. Their claims about untamed wilderness were disingenuous.

Tuttle ignored evidence of the ancient Indian world that surrounded her. Like others of her generation, she recorded the presence of several large man-made mounds and shell middens in the area. Some were ancient burial sites or ceremonial centers, and others were basically landfills, built from generations of discarded shellfish and tools. They were all constructed by the Tequesta Indians, who had first settled the waterfront site 2,000 years earlier and lived there into the 17th century, when they attracted the unwanted attention of slave raiders, Spanish missionaries, and others moving in. Tuttle, like others who declared themselves to be on the frontier, deemed the Indian past to be inconsequential to the development that would follow.

With Tuttle engaged in acts of intellectual misdirection, Flagler and his construction crews took care of the physical destruction. Flagler, like most Gilded Age industrialists, is more typically associated with building than with razing. He earned his fame for helping found Standard Oil with John D. Rockefeller in 1870 and then creating Florida’s modern tourist industry with his railroad and luxury hotels in St. Augustine, Palm Beach, and elsewhere along Florida’s Atlantic Coast. Tuttle lured Flagler to Miami with gifts of orange blossoms after a brutal frost had destroyed the citrus crop in central Florida, and clinched the deal by dividing her property on the Miami River with him.

In 1896, Flagler’s laborers at the mouth of the river leveled the ancient Tequesta mounds that stood in the way of progress. They were unabashedly brutal about it. One of the workers noted that a burial mound “stood out like a small mountain, twenty to twenty-five feet above water” and “about one hundred feet long and seventy feet wide.” Flagler’s African American workers struggled to remove “a poison tree” that grew on the top of the mound, as it “would knock them cold.” Those workers “who were not allergic to it” leveled the mound, uncovering and hastily removing “between fifty and sixty skulls.” One of the workers took home the bones, “stored them away in barrels and gave away a great many … to anyone that wanted them.” When construction ended, he dumped the remaining skeletons “nearby where there was a big hole in the ground.” Another bayside mound was hidden behind a “great tangle of briars and wild lime trees.” The midden materials from these and other mounds were strewn across the property, becoming the foundation for Henry Flagler’s opulent Royal Palm Hotel.

The city of Miami incorporated in July 1896, a bit more than a year after the railroad reached the site of the Royal Palm Hotel. Thanks to the vision of Tuttle and marketing genius of Flagler and others, Miami quickly became a tourist destination. City boosters built roads and canals, plotted new communities, constructed man-made beaches, and established new civic organizations. The real estate boom that followed incorporation pushed the residential community out from the mouth of the river and in only a couple of decades turned the small town into a bustling city. Tuttle died in 1898 and Flagler in 1916, but their collective imprint on Miami survived the hurricane of 1926, even as it destroyed the Royal Palm Hotel and temporarily slowed the city’s growth during the Depression. Miami remained a city committed to reimagining the future rather than one interested in celebrating the past.

Tuttle and Flagler shared an illusion that they were settling untouched wilderness—even as they were surrounded by evidence of earlier occupation. In this way, their story is no different than those of settlers across the continent whose shared myth of the frontier allowed them to ignore the history that preceded them. In the 1880s, the frontier was a fairly simple but magical idea: It allowed white Americans to ignore the ancient history of Native America. The myth of the frontier—that pervasive and most-American idea—allowed Tuttle and others in Miami to see “unclaimed lands” in the United States as an untapped and disappearing resource, and to imagine that white American ingenuity transformed wilderness into civilization.

The post The Myth of Untouched Wilderness That Gave Rise to Modern Miami appeared first on Zócalo Public Square.
