I’m Counting My Family’s Thanksgiving Blessings. My Neighbors Aren’t All So Fortunate
By Shanice Joseph

In November 2013, Shanice Joseph wrote an essay for Zócalo about how her financially challenged family was preparing to celebrate Thanksgiving. This year we asked her for an update, and she obliged.

 

With the holidays approaching, I thought I couldn’t be any happier. Over the past four years, everything has been going great. My family and friends are happy and healthy. I made supervisor at my job. I bought a car. I’m more involved in my community. Most importantly, I grew to become a better person. With everything going well, I anticipated things would be great this holiday season for the fourth year in a row, but an unexpected turn crushed my hopes.

Over the past couple of years, my community has changed a lot. Most people would say things changed for the better, since much-needed resources have been added over the past four years. For example, a new Work Source Center and The College Track, a resource center for aspiring college students in the community, opened recently. Alta Med and Children’s Institute contributed new locations in Watts. Martin Luther King, Jr. Community Hospital was modernized, and a newly designed community garden even brought a smile to my face. Beyond this, the introduction of the “community-based improvement initiative” Watts Re-imagined, whose goal is to add opportunities and resources that benefit the community, gave me even more to be thankful for just in time for Thanksgiving.

In addition to these changes, the apartment complex I lived in was renovated. The renovations included new paint, hardwood floors, a new dishwasher, and new windows, which gave residents’ homes a cozy new look. But just as everyone was elated over all the new resources, some of my neighbors were, unfortunately, given letters asking if they would consider moving in exchange for a small amount of money, or face eviction.

I didn’t agree with the idea of paying tenants to leave homes they’d lived in for years. Rightfully angry, my neighbors rallied, arguing that this was an injustice, and exactly what goes wrong when gentrification infiltrates communities. Some even said it felt unfair that they wouldn’t be able to enjoy all the new resources they had once been thankful for. If the new upgrades meant they would be left homeless, or forced to break social ties, they would rather do without.

I was equally upset, and fearful that some of the inevitable negative effects of gentrification would further encourage involuntary displacement. To me it felt like watching a child open a longed-for Christmas gift, only to have it snatched away once unwrapped. I know from personal experience how financially hard the holidays can be on some families—and the added stress and expense of moving from a beloved community to a new home had to be worse.

Although the holidays have gone well for me for the past couple of years, I couldn’t feel right inside knowing that while my community was progressing as a whole, some of its members were being left behind. It was a bittersweet feeling: being thankful for all the resources given to my community, while knowing that some people were being asked to leave, unable to enjoy the holidays in the place they’d lived for years. I’m not sure how I could enjoy my holiday knowing that all wasn’t well—but it’s definitely on my Christmas list to help in any way that I can.

How Norway Taught Me to Balance My Hyphenated-Americanness
By Eric Dregni

During the year I spent studying at the university in Trondheim, Norway, I sometimes learned more about my own country than about Norway. One day, in my immigration studies class, my professor David Mauk, who hailed from Ohio, asked, “What does it mean to be American?”

I braced myself to hear the usual stereotypes from the news repeated by the Norwegian students in my class. Then the professor clarified: “What to you is truly good about America?”

Even though I’m an American, I was stumped. I mentioned the clichés of “liberty,” “melting pot,” “representative government.”

My Norwegian classmate Astrid was much more awake: “What about your Constitution?”

“The Freedom of Information Act!” exclaimed Dag, an outspoken intellectual with his hair slicked stylishly forward. “That is what makes America great.”

I realized that Dr. Mauk was actually leading us to one of the big questions of the day in Norwegian newspapers: What did it mean to be “Norwegian”? This was especially relevant because thousands of new immigrants had recently arrived in Norway, to a less-than-enthusiastic welcome. What I didn’t realize was how this question would help me understand my place in my own country.

Like many Americans, my great-grandparents escaped poverty and endured a grueling ocean voyage to seek a better life. They came from Scandinavia 125 years ago and settled in Wisconsin and Minnesota. I had thought of this past as a curiosity rather than something with much bearing on who I am today. Now that I was living in Norway for a year—and relying on its great socialized medicine and free university tuition—I, too, was one of the new foreigners. The word in Norwegian for us was “innvandrere,” which sounds like “invaders.” My classmate Sophie pointed out: “No, it’s ‘vandre.’ That’s more like wander, so it’s people who wander in.”


Sophie assured me that I wasn’t a foreigner in Norway; I was Norwegian, because my great-grandfather was born here. She and others in Norway referred to Minnesota, Wisconsin, North Dakota and other cold spots as Norway’s “colonies” in America. I was simply a colonist returning to the homeland.

I insisted that I was American, just with Norwegian roots and a Norwegian last name. But during my time in Norway, I slowly realized that many traits I considered Minnesotan were essentially Scandinavian. Our art of avoiding confrontation and being passive-aggressive to bring down people who think too much of themselves is reflected in the Nordic idea of janteloven, which enforces the “law” that you are no better than anyone else, so get your nose out of the air and put it to the grindstone. The Minnesota accent is Scandinavian or Germanic, and Midwestern linguistic peculiarities such as leaving prepositions to dangle come right from Norwegian (“Do you want to come with?”). As in Scandinavia, government in the upper Midwest is remarkably clean and free of corruption. All we need is Nordic universal health care.

To appease the others in my class, I gave up arguing over whether I’m truly Norwegian or American. I realized this debate signaled that they accepted me, even though other new immigrants seemed to be excluded. While Sophie was positive about my “Norwegianness,” she wasn’t sure about Professor Mauk. He spoke the language fluently (as opposed to my halting Norwegian), lived in Norway, was even married to a Norwegian, but, without ancestors from Norway, definitely was not Norwegian.

Professor Mauk replied, “People keep asking me if I have become a Norwegian citizen yet, so I thought up a pat answer: ‘Would you consider me more Norwegian then?’ They pause to wonder if it’s a trick question and then reply, ‘Probably not.’ Then they stop asking if I’ve become Norwegian.” He was American, according to them, even if he knew far more about Norway than I did.

Back in the American Midwest, many people refer to themselves as simply “Norwegian” or “Swedish,” not “Norwegian-American” or “Swedish-American,” even though many have never been to Scandinavia and can’t speak the language. Most are third-, fourth-, and even fifth-generation Scandinavians who can decide which of their many different backgrounds they want to claim. Ethnicity in America becomes a choice when one has more than one background.

In the past, though, the U.S. government viewed these “hyphenated Americans” with suspicion. When World War I broke out, suddenly no one was German in Minnesota, even though Germans were the state’s largest ethnic group. Scandinavians were viewed with equal distrust. My grandfather spoke a mixture of Swedish and Norwegian at home (“Svorsk”) until he went to kindergarten—where he was punished for not speaking English, a language he didn’t know. One child in the class translated for the rest of the kids. When I asked my grandfather why he hadn’t passed on the “old language,” he told me solemnly, “We’re in America; we speak English now. It’s better that way.”

Warning posted at Scandinavian lodges and elsewhere advising against using foreign languages during WWI. It suggested that people speak “American,” rather than “English.” Image courtesy of Minnesota Historical Society.

During World War I, speaking foreign languages in the U.S. was considered tantamount to treason. Posters mounted in ethnic lodges warned, “Don’t be SUSPECTED! Use AMERICAN LANGUAGE. America is Our Home.” The Minnesota Commission of Public Safety had the power to jail anyone speaking out against the war or for “the idea of peace.” The government set up “Americanization” committees to force immigrants to give up their hyphenated identities by abandoning their old citizenship and culture and becoming Americans and only Americans. My great-grandparents, arriving in the Midwest, were ultimately able to assume a new “American” identity if they wanted. However, today’s new immigrants to Norway may never have the option of being considered Norwegian.

Remnants of Norwegian culture remain and are treasured in the Midwest, but my family lost the language, though some of us have struggled to learn it again. One Lutheran parishioner joked to the author of They Chose Minnesota about being forced to give up his native tongue: “I have nothing against the English language. I use it myself every day. But if we don’t teach our children Norwegian, what will they do when they get to heaven?”

Would I have learned all of this if I hadn’t lived in Norway? I feel that only through living abroad was I able to appreciate this complicated history and discover my place as an American. While I feel comfortable with this label, many people in the Midwest call themselves simply Norwegians without the hyphen. I told Helen, another university student, about this and how many of these “Norwegians” in Minnesota have never been to Norway and can’t speak a word of the language.

Without a hint of irony, she responded, “Then you, too, are Norwegian now.” I argued that I’m American with Norwegian roots, but she didn’t agree. “You were Norwegian first,” she insisted.

Want to Take My Civics Class? Get Ready to Squirm
By Sarah Cooper

In many conversations, the topic of civics education comes with its own halo. The conventional wisdom is that it’s good, clean medicine, and if our children just get enough of its inoculation, the American body politic will be healthy enough to survive another generation.

But after nearly two decades as a middle-school and high-school history teacher, I’ve come to understand through teaching civics—and studying how it’s taught—that learning how to be a citizen doesn’t work like that. Indeed, civics education is best when it’s messy and uncomfortable.

That’s especially true in times of conflict and transition, like the ones we are living in now.

Approaches to teaching civics have been as volatile as the country’s history.

Horace Mann, the common-school advocate of the mid-19th century, believed that schools should teach only those values that everyone agreed with, such as charity and justice, as the educational historian David Tyack recounted in Seeking Common Ground: Public Schools in a Diverse Society. At the time, “everyone” meant people like Horace Mann—white, wealthy, Protestant, and native-born, not the immigrants who poured across the Atlantic.

Later in the 19th century and into the 20th, the government waded into programs of Americanization, trying to replace the influence of the family with the power of the state in teaching how to be an American citizen. Such inculcation through schools targeted Native Americans early on and Japanese Americans later, after World War II. In contrast, with the cultural upheavals of the 1960s and ’70s, many school districts began treating each culture as something to be celebrated rather than repressed.

Regardless of the era, civics education over the past two centuries has been driven by social upheaval. As Tyack observed, “During periods of sharp demographic change, or war, or ethno-religious conflict, or economic challenge, for example, foundational principles of civic education came into sharper relief because they were less taken for granted.”

Today we live in a period of such conflicts, and I take absolutely nothing for granted in teaching civics. So I try to mix traditional and contemporary approaches.

On the traditional side, I ask my students to memorize lines from the Declaration of Independence and Lincoln’s speeches, especially those that focus on difficult compromise and principled revolt. I hope that phrases such as “We shall nobly save, or meanly lose, the last best hope of earth,” and “it is their right, it is their duty, to throw off such Government,” will ring in students’ heads well into adulthood, inciting them to action.

When I first taught eighth-grade history, I required students to memorize a lot more. On tests they had to regurgitate names that educated Americans barely remember, such as John O’Sullivan with his Manifest Destiny theory or Frederick Jackson Turner with his frontier thesis. For women’s suffrage, students had to spit out a litany of names rather than just a few who could serve as touchstones, such as Elizabeth Cady Stanton, Ida B. Wells, or Alice Paul.

With civics and the Constitution, I also used to include details that no longer matter as much to me, because they mire students in minutiae at the expense of deeper understanding. My middle schoolers no longer have to know the exact number of the amendment that limits presidential terms, but rather why it was adopted and in response to what.


Broad educational shifts, such as evolving state standards, have definitely influenced me to focus more on depth and less on breadth when teaching both civics and history. Even Advanced Placement history exams have been overhauled in the past several years to emphasize thematic understanding over disconnected details.

Yet even this kind of conceptual focus on the broader picture has not always felt like enough.

In the past several years, to get students to engage and think like citizens, I’ve found that I have to do a heck of a lot more than ask them to understand the past and dissect the present: I need them to create the future in my classroom, right now, through discussion and debate. I need them to be leaders in the classroom, not just participants, so that they can imagine presiding over boardrooms, civic groups, and family conversations in the future.

And so, on the best days, these eighth-grade U.S. history students conduct discussions on their own, pose questions that provoke and unsettle, and challenge authority—including and especially mine—with respect.

After all, the laws of a country mean little unless leaders and citizens can listen to other views and still believe that those laws mean something. The Constitution remains powerful only because citizens have gifted it that power over two centuries and counting.

As Horace Mann foresaw, open dialogue breeds discomfort. When we examine Black Lives Matter or gun control, some of my students lean in, while others squirm and slouch. I tell them that these discussions are supposed to be uncomfortable.

And each Friday in my classroom, several students bring up such uncomfortable issues in their weekly current events presentations. The presenter summarizes a news article and gives the reasons he or she chose it. Everyone writes down a question or comment—and then the floor opens for discussion.

The presenter fields questions on everything from how to prevent nuclear war, to why there are so many homeless people in Los Angeles, to what abortion looked like before Roe v. Wade. Week by week, we work on solving problems together.

When I first started teaching, students did these weekly presentations, but we talked about their articles for only a few minutes, and the questions came largely from me. A few years ago, in an attempt to make my classroom feel more democratic, I realized that more of the power needed to be in students’ hands, at least one day a week.

And so I sit back and listen, to questions they ask of each other: “So is the government doing anything yet to help with the recent flooding?” or “Why has the homicide rate gone up in that country?”

In describing such “maximally open classroom climates,” education professor Meira Levinson in No Citizen Left Behind offers questions that students can ask themselves and each other while engaging in discussion, such as “Why do people care about this topic?” and “Is this person making an argument, or just talking for the sake of talking?”

In our national discourse, Levinson’s kind of metacognition might inspire those who talk too much to listen more, and those who don’t talk enough to speak up already.

By fostering such reflective questions about how we talk, civics education can also create new lenses through which students can view the world.

I see kids change their perspectives in my classroom every week. For instance, once they understand how a concept such as federalism relates to marijuana or immigration laws in California, they begin to ask about how people in other states are affected by these issues. The thought that someone in Texas might possess entirely different rights than someone in Los Angeles shocks them. They sputter at what seems to them the unfairness of it all.

If we’ve just discussed the road to the Civil War, I’ll link civics to history—asking them to imagine what it might have felt like to be bound by even more serious state-to-state conflicts, such as the Fugitive Slave Law.

Not all of these discussions come easily. Sometimes the political opinions that presenters express, in our majority-liberal but certainly not uniformly liberal school, land the wrong way for a student who believes in pushing tax cuts or loosening gun restrictions. Sometimes I worry that those in the ideological minority don’t even want to talk because they don’t feel it’s worth it. And so I occasionally step in to play devil’s advocate for whichever side is not getting enough airtime on an issue, to remind students that our classroom bubble is not the world’s bubble.

The ultimate perspective that I hope for my students is that they assume a little more humility than adolescents (or adults) typically do. When I ask them what they don’t know about an issue, I want these eighth graders to remember that they need to rely not only on themselves for answers to civic problems, but also on the people around them. Often I’ll bring in a historical primary source to make a point like this, such as a speech that the not-always-humble Ben Franklin gave to the Constitutional Convention in September 1787.

Franklin, after four months of difficult deliberations, asked each delegate in Independence Hall to accept that the draft of the Constitution was good enough. Then he asked each man to do even more, to leap beyond mere acceptance and “doubt a little of his own infallibility.”

Franklin’s ability to step back from the debate is the real inoculation I’m looking for as a civics teacher and a citizen. Not a syringe filled with terms such as ratify and suffrage, but one packed with a heavy dose of self-doubt. And it should be a big enough dose for our children to understand that raising citizens, and being citizens, can be as messy and pungent a process as four months of sitting in a hot Philadelphia summer, until you’re ready to call something good enough, for now.

Can a Corrupt Politician Become a Good President?
By Scott S. Greenberger

“Who you are, what you are, it doesn’t change after you occupy the Oval Office,” President Barack Obama said during the 2016 election campaign. “It magnifies who you are. It shines a spotlight on who you are.”

But at least one man was transformed by the presidency: Chester Alan Arthur. Arthur’s redemption is all the more remarkable because it was spurred, at least in part, by a mysterious young woman who implored him to rediscover his better self.

Arthur, the country’s 21st president, often lands on lists of the most obscure chief executives. Few Americans know anything about him, and even history buffs mostly recall him for his magnificent mutton-chop sideburns.

Most visitors to Arthur’s brownstone at 123 Lexington Avenue in New York City are there to shop at Kalustyan’s, a store that sells Indian and Middle Eastern spices and foods—not to see the only site in the city where a president took the oath of office. Arthur’s statue in Madison Square Park, erected by his friends in 1899, is ignored. But Arthur’s story of redemption, which illustrates the profound impact that the U.S. presidency can have on a person, deserves to be remembered.

Arthur was born in Vermont in 1829, the son of a rigid abolitionist preacher. Even in the North, abolitionism was not popular during the first decades of the 19th century, and Arthur’s father—known as Elder Arthur—was so outspoken and uncompromising in his beliefs that he was kicked out of several congregations, forcing him to move his family from town to town in Vermont and upstate New York.

Shortly after graduating from Union College in Schenectady, Arthur did what many ambitious young men from the hinterlands do: He moved to New York City. Once there, he became a lawyer, joining the firm of a friend of his father’s, who was also a staunch abolitionist.

And so he started down an idealistic path. As a young attorney, Arthur won the 1855 case that desegregated New York City’s streetcars. During the Civil War, when many corrupt officials gorged themselves on government contracts, he was an honest and efficient quartermaster for the Union Army.

But after the war Arthur changed. Seeking greater wealth and influence, he became a top lieutenant to U.S. Senator Roscoe Conkling, the all-powerful boss of the New York Republican machine. For the machine, the ultimate prize was getting and maintaining party control—even if that meant handing out government jobs to inexperienced men, or using brass-knuckled tactics to win elections. Arthur and his cronies didn’t view politics as a struggle over issues or ideals. It was a partisan game, and to the victor went the spoils: jobs, power and money.

It was at Conkling’s urging, in 1871, that President Ulysses S. Grant appointed Arthur collector of the New York Custom House. From that perch, Arthur doled out jobs and favors to keep Conkling’s machine humming.

Vice President Chester A. Arthur. Photo courtesy of Library of Congress.

He also became rich. Under the rules of the Custom House, whenever merchants were fined for violations, “Chet” Arthur took a cut. He lived in a world of Tiffany silver, fine carriages, and grand balls, and owned at least 80 pairs of trousers. When an old college classmate told him that his deputy in the Custom House was corrupt, Chet waved him away. “You are one of those goody-goody fellows who set up a high standard of morality that other people cannot reach,” he said. In 1878, reform-minded President Rutherford B. Hayes, also a Republican, fired him.

Two years later, Republicans gathered in Chicago to pick their presidential nominee. Conkling and his machine wanted the party to nominate former president Grant, whose second administration had been riddled with corruption, for an unprecedented third term. But after 36 rounds of voting, the delegates instead chose James Garfield, a longtime Ohio congressman.

Conkling was enraged. Party elders were desperate to placate him, realizing that Garfield had little hope of winning in November without the help of the New York boss. The second place on the ticket seemed to be a safe spot for one of Conkling’s flunkeys; they chose Arthur, and the Republicans triumphed in November.

Just months into Garfield’s presidency, Arthur’s meaningless post suddenly became critical. On the morning of July 2, 1881, a deranged office-seeker named Charles Guiteau shot President Garfield in a Washington railroad station. To Arthur’s horror, when Guiteau was arrested immediately afterward, he proclaimed his support for the Arthur-Conkling wing of the Republican Party—which had been resisting Garfield’s reform attempts—and exulted in the fact that Arthur would now be president. Some newspapers accused Arthur and Conkling of participating in the assassination plot.

Garfield survived the shooting, but he was mortally wounded. Throughout the summer of 1881, Americans prayed for their ailing leader and shuddered at the prospect of an Arthur presidency. Prominent diplomat and historian Andrew Dickson White later wrote: “It was a common saying of that time among those who knew him best, ‘Chet Arthur, President of the United States! Good God!’”

Big-city elites mocked Arthur as unfit for the Oval Office, calling him a criminal who belonged in jail, not the White House. Some of the people around Arthur feared that he was on the verge of an emotional collapse.

The newspapers were vicious. The Chicago Tribune lamented “a pending calamity of the utmost magnitude.” The New York Times called Arthur “about the last man who would be considered eligible” for the presidency.

At the end of August 1881, as Garfield neared death, Arthur received a letter from a fellow New Yorker, a 31-year-old woman named Julia Sand. Arthur had never met Sand, or even heard of her. They were complete strangers. But her letter, the first of nearly two dozen she wrote to him, moved him. “The hours of Garfield’s life are numbered—before this meets your eye, you may be President,” Sand wrote. “The people are bowed in grief; but—do you realize it?—not so much because he is dying, as because you are his successor.”

“But making a man President can change him!” Sand continued boldly. “Great emergencies awaken generous traits which have lain dormant half a life. If there is a spark of true nobility in you, now is the occasion to let it shine … Reform!”

Sand was the unmarried eighth daughter of Christian Henry Sand, a German immigrant who rose to become president of the Metropolitan Gas Light Company of New York. She lived at 46 East 74th Street, in a house owned by her brother Theodore V. Sand, a banker.

As the pampered daughter of a wealthy father, Julia read French, enjoyed poetry, and vacationed in Saratoga and Newport. But by the time she wrote Arthur she was an invalid, plagued by spinal pain and other ailments that kept her at home. As a woman, Julia was excluded from public life, but she followed politics closely through the newspapers, and she had an especially keen interest in Chester Arthur.

The “reform” she was most concerned about was civil service reform. Under the so-called “spoils system,” politicians doled out government jobs to loyal party hacks, regardless of their qualifications. Reformers wanted to destroy the spoils system, to root out patronage and award federal jobs based on competitive examinations, not loyalty to the party in power.


Vice President Arthur had used his position to aid Conkling and his machine—even defying President Garfield to do so. There was every reason to believe he would do the same as president.

But Garfield’s suffering and death, and the great responsibilities that had been thrust upon him, changed Chester Arthur. As president, the erstwhile party hack shocked everybody and became an unlikely champion of civil service reform, clearing the way for a more muscular federal government in the succeeding decades. Arthur started rebuilding the decrepit U.S. Navy, which the country desperately needed to assume a greater economic and diplomatic role on the world stage. And he espoused progressive positions on civil rights.

Mark Twain, who wasn’t bashful about mocking politicians, observed, “it would be hard indeed to better President Arthur’s administration.”

Arthur’s old machine buddies saw it differently—to them, he was a traitor. Meanwhile, reformers still didn’t completely trust that Arthur had become a new man, so he had no natural base of support in the party. He also secretly suffered from Bright’s disease, a debilitating kidney ailment that dampened his enthusiasm for seeking a second term. The GOP did not nominate him in 1884.

Arthur was ashamed of his political career before the presidency. Shortly before his death, he asked that almost all of his papers be burned—with the notable exception of Julia Sand’s letters, which now reside at the Library of Congress. Arthur’s decision to save Sand’s letters, coupled with the fact that he paid her a surprise visit in August 1882 to thank her, suggests that she deserves some credit for his remarkable transformation.

Arthur served less than a full term, but he was showered with accolades when he left the White House in March 1885. “No man ever entered the Presidency so profoundly and widely distrusted as Chester Alan Arthur,” newspaper editor Alexander K. McClure wrote, “and no one ever retired from the highest civil trust of the world more generally respected, alike by political friend and foe.”

Surviving Managua’s Government Crackdowns and Torrential Rains
By Douglas Haynes

On an overcast afternoon, Julio Baldelomar carries his metal ring of bagged chips past a new tourist attraction called Paseo Xolotlán, named for the nearly Los Angeles-sized lake on the north side of Managua, Nicaragua. Families flock to the high-walled complex to see a miniature replica of old Managua and walk on the waterfront promenade. But 31-year-old Julio is not allowed to enter while he’s hawking the plantain and yucca chips that his family makes.

“The government has made a disaster,” he says. “You can’t go in and sell here, because it’s privatized.”

To his chagrin, construction of new attractions continues for a mile along the lakefront. He sells a bag of chips to a brown-uniformed guard at the gate of a building site. Behind the gate, leveled earth stretches along the lake to where Julio squatted in a sheet metal shanty before a 2010 flood displaced him and his partner, Aryeri.

“The tragedy,” he calls the flood.

In his memory, their lakefront home remains an oasis. He recalls the plants that grew in Aryeri’s garden as if saying their names could bring them back: olives, ten o’clock flowers, papayas, passion fruit, mango, hog plum, chili peppers.

At the gate, Julio convinces the guard to let him walk across the construction site. “What a disaster!” he says when he enters the site. Heaps of brush are piled around the few trees that remain. Potable water from a broken pipe streams toward the lake. The square outlines of house walls mark the bare, burnt-umber soil blazed with bulldozer tracks. A crumpled yellow T-shirt, a pair of underwear, and a shredded green plastic cup top a mound of dirt.

There’s nothing else left of the barrio that Julio haunted when he lived on the lakefront. In October 2014, civil defense forces forcibly evacuated about 300 families from the site, though they weren’t in imminent danger. Some wanted to leave, some didn’t. They were all taken to a nearby emergency shelter, ostensibly because of the barrio’s vulnerability to flooding. Many claim that the government cleared the barrio only to develop the lakefront, repeating a process of sacrificing poor people’s homes for profit that happens all too often around the world. But in Managua, most of the nearly 30,000 people displaced between 2010 and 2014 were voluntarily evacuated, fleeing floods, landslides, and crumbling earthquake ruins.

Julio Baldelomar vending on the lakefront in Managua. Photo by Douglas Haynes.

Next to the stream pouring from the broken pipe, a girl wearing baggy black shorts and a black T-shirt many sizes too big pounds a sledgehammer at a slab of concrete. Her curly, dark hair is flecked with crud, and the bridge of her nose bears a dirty scab. Her feet barely cling to green plastic sandals with holes in both heels.

Julio approaches the girl, grabs her sledgehammer, and starts swinging it at the concrete. It’s about a foot thick and deeply entrenched in the soft earth next to the stream. The job is too big for the nine-year-old girl, who is only twice the height of the sledgehammer.

Julio heaves and swings the hammer over and over. His green canvas backpack sways as the head hits the concrete with the full force of his considerable girth. Gradually, the concrete breaks into smaller pieces, revealing several long pieces of rebar. But the steel the girl is seeking to sell for scrap is stuck in the ground. He’s about to give up. Sweat glosses his bronze forehead and drips from his nose. He grabs a white rag out of his pocket and wipes his face. He sets the sledgehammer down, plants his boots firmly, and tugs and tugs on the rebar. He almost falls into the stream. The earth won’t loose the steel.

Then something miraculous happens. A bulldozer driver steers his machine over to Julio and opens the cab door to buy a bag of chips. Then he motions for Julio and the girl to get out of the way. The driver lowers the blade into the soil below the concrete and pushes the slab into the air. But the ground still doesn’t let the steel go.

As the bulldozer backs away, a digger with a bucket loader drives over and takes a swipe at the concrete. It only needs two swipes to loosen the concrete and long rebar from the earth. Julio whacks the sledgehammer at the remaining bits of concrete clinging to the rebar. Within minutes, he has removed nearly all of it. The girl beams.

Julio sees himself in the girl. He used to forage the city for scrap metal as a child, too. The girl will get 20 or 30 córdobas for the rebar, enough to quell her hunger with some tortillas and a soda.

Julio lopes away in his ripped jeans toward the neoclassical towers of Managua’s old cathedral, hollowed out in a 1972 earthquake that killed at least 11,000 people, left hundreds of thousands homeless, and leveled 10 square miles of the city’s heart. Managua has never recovered. Squatter settlements sprang up in the ruins of what was once Central America’s most cosmopolitan downtown. Tons of rubble were dumped in the lake where today’s recreation areas are being built and squatters are being swamped. One disaster has covered another. Despite decades of grandiose plans for urban revitalization, the poor still scrape the city’s margins for the dregs of Nicaragua’s stuttering economy.

On the other side of the cathedral, in the shady Parque Central, Julio washes his arms and hands under a tap in a fountain. He sells a few bags of chips, asking each customer if they want hot sauce. Dusk dims the park. Parakeets roost in the treetops.

After dark, he gets a bus home and walks through the blue door of the new concrete house he shares with Aryeri. It’s the first home either of them has ever had with a legal title, a bathroom, and a floor that isn’t dirt. After the flood drove them from the lakefront, they lived for 16 months in an emergency shelter’s dark cubicle. Then the Nicaraguan government gave them this furnished, three-room house in a community for flood refugees. Julio bought plants for Aryeri, and now she tends a new garden, lush with ferns and a red-flowering living fence.

The couple now worries less about Managua’s torrential rains drowning their home, though the rains’ growing intensity and the city’s rapid expansion cause the metropolis to flood regularly. They’ve left the community of squatters who make up about one-quarter of Managua’s residents. But Julio still prizes the camaraderie of the street vendors and scavengers he grew up with, and the government’s takeover of the lakefront looms like another deluge over his income.

Before Julio had gone out vending in the morning, Aryeri sat down on his lap. He squeezed her with both arms. Their faces glowed. Julio’s black hair was slicked-back; Aryeri’s frizzed in every direction. She rubbed Julio’s round belly.

“Your cooking is how I got it,” he joked.

He doesn’t have to scavenge for scraps to survive anymore. But he still knows how to find them if need be.

American Populism Shouldn’t Have to Embrace Ignorance
By Daniel R. DeNicola

Public ignorance is an inherent threat to democracy. It breeds superstition, prejudice, and error; and it prevents both a clear-eyed understanding of the world and the formulation of wise policies to adapt to that world.

Plato believed it was more than a threat: He thought it characterized democracies, and would lead them inevitably into anarchy and ultimately tyranny. But the liberal democracies of the modern era, grudgingly extending suffrage, have extended public education in parallel, in the hope of cultivating an informed citizenry. Yet today, given the persistence and severity of public ignorance, the ideal of an enlightened electorate seems a fading wish at best, a cruel folly at worst.

Unfortunately, our current civic problem cuts even deeper: We are witnessing the rise of a culture of ignorance. It is particularly insidious because it hijacks certain democratic values. To begin to understand this culture and its effects, it is helpful to identify the ways it differs from simple ignorance.

Perhaps the most noticeable aspect of a culture of ignorance is the extent of willful ignorance. Ignorance that is willful may involve resistance to learning, denial of relevant facts, the ignoring of relevant evidence, and suppression of information. Such ignorance is usually maintained in order to protect a prior belief or value—a sense of self, an ideology, a religious doctrine, or some other cherished cognitive commitment. False knowledge often bolsters one’s will in maintaining a closed mind; but of course, it is only ignorance in elaborate disguise.

When the willfully ignorant are cornered by mounting evidence, they assert their individual right to believe whatever they choose to believe. This is a hollow and silly claim. Beliefs are factive; they aspire to truth. Moreover, beliefs affect attitudes, decisions, and actions. As the Victorian mathematical philosopher William K. Clifford remarked, “No one man’s belief is in any case a private matter which concerns him alone.” He proposed “an ethic of belief” and championed our responsibility to respect evidence for and against our beliefs. Though his standard of evidence may have been too stringent, we can agree that claiming the right to believe “whatever” exploits the democratic respect for individual rights while forgoing individual responsibilities.

A related characteristic is the rejection of expertise. Liberal democratic theory and practice have always elevated individual autonomy and independence, rejecting authority and dependency. They have therefore had difficulty with any relationship that cedes individual autonomy—as consulting an expert seems to do. It is true that the place of expertise in a democracy remains contested: We may yield to the expertise of the physician, pilot, or engineer (albeit uneasily); but we may be skeptical of the expertise of the economist, climate scientist, or critic.

Are we heading toward a culture of ignorance? Photo courtesy of pxhere.

Our ambivalence regarding expertise has increasingly come to be a rejection. The rise of social media has certainly contributed to this trend. Who needs a qualified film or restaurant critic when one can find websites that provide thousands of audience or diner ratings? But the implications go far beyond aesthetics: As a senior minister famously said during the recent Brexit campaign, “Britain has had enough of experts.” Among at least a significant portion of the population, this attitude has led to a rejection of the traditional sources and certifiers of knowledge—universities, science, established journalism. As this attitude engulfs public life, it undermines the fragile but vital distinction between knowledge and belief, between informed judgment and unreflective opinion.

This epistemic populism seems radically democratic, but that image is an illusion. Democracy is, as John Dewey described, a moral climate in which each person may contribute to the construction of knowledge; but it doesn’t imply that each person possesses the truth. Moreover, one need not yield political authority to experts; it is epistemic authority—the authority of knowledge, skill, experience, and judgment—that is carried by experts.

At some point, the “wisdom of crowds” becomes the celebration of ignorance. Conspiracy theories, wild speculations and accusations, nutty claims, “alternate facts,” and pronouncements that are far afield from one’s knowledge—all these claim time or space on a par with accurate and important information. The politician who is ignorant of politics, the law, and history is seen as the person who will “get things done.” Some public figures wear their ignorance as a badge of honor. Let’s be clear: Ignorance is not stupidity, though I admit it is sometimes difficult to tell them apart in practice. And stupidity is likely to produce ignorance across a broad front. But one can be ignorant without being stupid.

Underlying all of these factors is the loss of respect for the truth. No doubt, many things have contributed: the venality of some experts, the public disagreement among experts, the continual revising of expert advice, and the often-unwarranted movement by social scientists from the descriptive to the normative, from facts to pronouncements. Religious fundamentalism, which stretches credibility, is another precipitating factor. The postmodernist deconstruction of ideals like truth, rationality, and objectivity, also contributed to this loss—though I doubt that postmodernist treatises were widely read among conspiracy theorists, religious fundamentalists, or climate change deniers.


The irony is that these folks believe they are holding the truth. Indeed, I am not suggesting that we need to claim we possess the Truth, firmly and finally; in fact, I believe those who make that claim actually disrespect the truth. Rather, we need to keep the ideal of truth to guide our inquiries, to aspire to greater truth. Not all opinions or interpretations are equally worthy. The concept of truth is required to separate knowledge from opinion; those who give up on truth, those for whom truth doesn’t matter, are—as the contemporary philosopher Harry Frankfurt said—left with bullshit.

There are signs of hope. Many young people have a naturally skeptical, even cynical, attitude regarding information sources. There is a surge of interest in investigative journalism in various forms. The teaching of critical thinking has broadened to include information literacy: Many colleges now provide ways to learn the skills of evaluating informational sources and content, including statistical integrity. Scholars are giving new attention to epistemic virtues, capacities and traits that enhance the acquisition of knowledge. There is excited talk among feminist and educational philosophers of “an epistemology and pedagogy of resistance” that confronts willful ignorance and the “epistemic injustice” of systematically discrediting certain voices.

The danger, and by the same token the hope, lies in this truth: In the end, ignorance will lead to error. Serious mistakes and their consequences may be required before there is momentum sufficient to roll back this culture.

The Southern Writers Who Defined America
By James C. Cobb

Tell about the South. What’s it like there? What do they do there? Why do they live there? Why do they live at all?
           —Shreve McCannon, to Quentin Compson

Struggling in William Faulkner’s Absalom, Absalom! to field these questions, flung at him by his Harvard roommate on a snowy evening in 1910, the young Mississippian Quentin Compson plunges into the history of his own Southern community. Drawing on the accounts of his family and fellow citizens of the small town of Jefferson, supplemented by his own experiences and tortured imaginings, he pieces together the fictional but vividly real and tragic story of Thomas Sutpen, a man whose single-minded pursuit of social acceptance and status leads to ruin for him and his family.

Rife with themes of racial and sexual exploitation, incest, miscegenation, brutality, and greed, Absalom, Absalom! seemed to tell a great deal—surely enough to shock some readers and confirm the suspicions of others—about the South. But it wasn’t the first, or the last, literary attempt to address the kinds of questions about the region that Faulkner posed so pointedly through Shreve. In the decades between the end of World War I and the birth of the Civil Rights movement, many Americans sought explanations for the region’s racial horrors, entrenched ignorance, depravity, and poverty.

Responding to this barrage of queries became a veritable raison d’être for many contributors to the great literary outpouring some have called the “Southern Renaissance.” Yet despite their focus on the South, these writers, no less than their Northern peers, gained national and international significance and acclaim through their deep engagement with broader themes of human ambition, greed, materialism, and alienation. In this, their work transcended geographic and temporal boundaries.

No novel better embodies or more brilliantly explores these universal themes than Faulkner’s 1936 classic. Its setting, plot, and historical context are specifically Southern, but its central concern, the destructiveness of the obsession with the American dream of wealth and social elevation, runs through other contemporary novels set elsewhere, including F. Scott Fitzgerald’s The Great Gatsby.

In Faulkner’s convoluted and chilling story, Thomas Sutpen’s single-minded fixation on social rank brings him to Jefferson, Mississippi, in 1833. Stunned as a young poor white when a slave turns him away from the front door of a huge Tidewater Virginia plantation house, Sutpen is soon enslaved himself, by an all-consuming obsession with owning such a mansion of his own and claiming all the aristocratic social prerogatives that come with it.

“I had a design,” he explains. “To accomplish it I should require money, a house, a plantation, slaves, a family—incidentally of course, a wife. I set out to acquire these, asking no favor of any man.” Sutpen accomplishes it all, with ruthless efficiency. Arriving in Jefferson with a gang of slaves in tow, he purchases a huge parcel of land, erects a grand manor house, and marries a respectable merchant’s daughter who gives birth to a son, Henry, and a daughter, Judith.

William Faulkner works at his Underwood typewriter in his study at his Rowan Oak home near Oxford, Mississippi. Photo courtesy of Associated Press.

But Sutpen has already set the stage for the unravelling of his design. Before coming to Mississippi, he had gone to the West Indies to obtain the money and slaves his plan required. While there, he had also married the daughter of a wealthy planter, who bore him a son. Upon discovering his wife’s mixed racial lineage, however, he abandoned her and the boy as incompatible with his ultimate objective.

Years later, his light-skinned biracial son, Charles Bon, arrives in Jefferson, befriends Henry and becomes engaged to Judith. Sutpen’s cold-blooded manipulation of these developments leads Henry to murder Bon. Like King David’s son Absalom, who also killed his half-brother and fled, Henry then takes flight, depriving his father of the male heir he will need to secure whatever may be salvaged of his “design” after the Civil War.

Tellingly, with the bulk of his plan seemingly in shambles, instead of questioning the design itself, Sutpen can only torment himself about the “mistake” he must somehow have made in implementing it. Desperate to learn “what did I do or misdo in it,” he is blind to the possibility that the inherent moral bankruptcy of his plan might have led to its undoing. Indeed, refusing to give up on his original strategy, Sutpen meets his demise in 1869 at the hands of a poor white man whose granddaughter he has impregnated in his crazed pursuit of another son—and then callously tossed aside when the child proves to be female.

Forty years later, his two surviving offspring—Henry, who has secretly returned home to die, and Clytie, a formerly enslaved mulatto daughter—perish when Clytie, fearful that Henry is about to be captured, sets the family’s great house ablaze. Thus are the final human and physical vestiges of Thomas Sutpen’s grand design reduced to ashes, meeting the same fate as the slave South or any edifice, as Faulkner put it, “erected on the shifting sands of opportunism and moral brigandage.”

Its striking response to many of the nagging questions about the South should not obscure Absalom, Absalom!’s critical implications for America as a whole. Sutpen’s absolute certainty of the success of a rational, properly executed plan surely smacked of the nation’s modern can-do mindset. Faulkner’s message about the consequences of human obsession with wealth and social rank was, as he said, also too universal “without regard to race or time or condition” to be applicable solely to the South. In his amoral reduction of respectability to a mere matter of accumulating material wealth and taking on elite airs, Sutpen was little different, one critic noted, from a “ruthless robber baron of the Gilded Age building a fake Renaissance palace on the banks of the Hudson”—or, he might have said, from Jay Gatsby flaunting his Jazz Age party palace on Long Island Sound.

Meanwhile, Faulkner’s black literary contemporaries also wrestled with broader concerns as they explored the nature of Southern black life for a curious national readership. Writers like Zora Neale Hurston and Sterling Brown drew the ire of the Northern “New Negro” advocates of the 1920s, who criticized them for casting Southern blacks as simple “pseudo-primitives whom the reading public still love to laugh with, weep over, and envy.” Instead, New Negro critics urged Southern black writers to serve up a steady diet of black role models who made material and professional strides despite white racial oppression. According to this rigidly bourgeois value scale, Brown quipped, a book about “a Negro and a Rolls-Royce” would always be superior to “one about a Negro and a Ford.” As Ralph Ellison would put it later, if these Northern critics had their way, “one would simply portray Negro experience and Negro personality as the exact opposite of any stereotype set up by prejudiced whites.”

Realizing that simply replacing one caricature with another would do nothing to counter the denial of individuality inherent in all stereotypes—racial, regional, or otherwise—Ellison resolved to address the devastating consequences of this denial in his 1952 novel, Invisible Man. Ellison was born in Oklahoma City and educated at Tuskegee Institute, and the story’s unnamed narrator and protagonist is an eager young black man who has imbibed deeply of Tuskegee founder Booker T. Washington’s doctrine of racial pragmatism and self-help. He persists in his “belief in the rightness of things,” even after he attends a white men’s club gathering, where he has been invited to deliver his high-school valedictory speech and receive a scholarship.

Expecting to be treated with respect, he is forced instead to entertain the tipsy white onlookers, boxing with other black boys and then scrambling for coins on an electrified carpet. He is finally allowed to give his speech, which parrots the accommodationist rhetoric of Washington, but he still angers his white audience by accidentally substituting “social equality” for “social responsibility.” The young man salvages the occasion (and his scholarship) only by explaining apologetically that he had been distracted by having to swallow the blood from a cut suffered during the boxing match.

He moves on to a Tuskegee-esque college, still accepting, as Ellison puts it, “the definition of himself handed down by the white South and the paternalism of northern philanthropy.” When he is expelled for taking one of the school’s Northern white benefactors out into the countryside, where the donor’s illusions about the progress of the black race are traumatically shattered, the young man heads to New York City. There, he tries to conceal his Southernness so as not to stand out, only to face demands to suppress his individual identity and worth as a black man for the greater good of his race—demands leveled, not by straitlaced New Negro advocates, but by the radical quasi-communist and Black Nationalist groups he encounters.

Caught between the two hostile contingents in the wake of a race riot in Harlem, he takes refuge underground, resigned to being an “invisible man” whose personal identity no longer matters. In time, though, he realizes that he is complicit in his own fate, because he denied his regional heritage and sacrificed his black individuality in an effort to be what others said he should be.

With the Supreme Court already mulling the constitutionality of segregated schools in 1952, many readers focused on the book’s racial implications for the South, although Invisible Man also spoke to the several million black Southerners who had landed in Northern cities since World War I. More broadly, irrespective of race, when Ellison’s narrator observes, at the novel’s conclusion, “Who knows but that, on the lower frequencies, I speak for you,” he might just as well be addressing alienated whites beset by modern America’s pervasive homogenizing and anonymizing pressures at every turn.

There was no shortage of Southern writers, black or white, who accepted Shreve McCannon’s challenge to Quentin Compson. Yet the ones whose work has proven most enduring were those who managed in the course of telling about the South to illuminate critical core truths not only about America but about what Faulkner called the “universal mutual experience, the anguishes and troubles and griefs of the human heart.”

We Shouldn’t Rely on Politicians to Memorialize Our Fallen Soldiers
By Kelly Kennedy | Fri, 10 Nov 2017
http://www.zocalopublicsquare.org/2017/11/10/shouldnt-rely-politicians-memorialize-fallen-soldiers/ideas/essay/

Five U.S. infantry soldiers died on June 21, 2007, when their 30-ton Bradley tracked vehicle hit a deep-buried bomb in Adhamiyah, Iraq.

I was embedded as a reporter with their unit when they died, and I watched as the men who served with them rallied.

They reached out to the mothers and fathers and wives, offering and seeking comfort, but also saying what they believed needed to be heard:

It was quick.
We were with them at the end.
We will never forget.

The families often reach back too, spreading wide wings over the men and women left behind in return for stories of their sons and daughters and wives and husbands.

“You can call me ‘mom,’ because he can’t.”
“Tell me again about the time she … ”

A service member’s bond with a Gold Star family feels profound because it squares so many different contradictions. The relationship is about both loss and presence, about courage and fear, and about a link with the loved one with whom we no longer can connect.

But, as the families and veterans wrap around each other, these tight bonds can exclude those in our communities who haven’t served in the military themselves or who don’t know anyone who serves now. Such exclusion may seem a small point in the immediate context of soldiers and families grieving those they loved. And letting a wider group of people into a tragedy may seem like too much for people who already carry a heavy burden of loss.

But exclusion has real-world consequences for families, communities, and the country as a whole.

How can we grieve for service members we don’t know, but who so completely represent us? How can we support families who don’t convey their grief and experiences beyond those tight bonds? And how, without paying attention to more than just headlines, can we feel the weight of a particular family member’s words, while fully understanding the diversity of that community?

If civilians don’t know about, understand, or feel comfortable reaching out to service members’ families, that can lead to those in the military, and their families, feeling isolated, abandoned, and afraid to speak.

We send people to war, but the contract shouldn’t end with their lives.

Renee Wood-Vincent, whose son Sgt. Ryan Wood died that day in Iraq, said she feels that fallen soldiers can be forgotten, and that there’s a lack of respect and knowledge in the public for what families and members of the military experience. But that also creates an obligation to reach out.

“There’s such a focus on what’s happening to us—it’s all about our sorrow, our problems, our military families—and we aren’t letting people in,” she said.

Letting people in can’t be done alone. It requires civilian leaders who can bridge and connect people. And in the United States, the highest bridge is embodied in one office, the presidency.

That’s why it’s so important that the person occupying that office be able to connect with soldiers and their families.

When the president reaches out to Gold Star families, he speaks for the civilians who made the decision with their votes to send service members to war. Even if a letter or phone call does not bring comfort, it is an acknowledgment of sacrifice for country. It’s why scrutiny of President Trump’s calls with Gold Star families is warranted.

Private First Class Ryan Hill and his mother Shawna Fenison. Photo courtesy of Kelly Kennedy.

But whatever the nature of the president’s words, the most important thing to know is that his words aren’t enough. A president should serve only as a starting point for civilians to reach out. “If I rely on politicians to memorialize Ryan and understand his sacrifice, I’m going to be sorely disappointed,” Wood-Vincent said. “They can empathize, but it’s still a number.”

Wood-Vincent received a letter from President George W. Bush, which she said was enough in a time of war, when the commander-in-chief should be dealing with national issues. So did Shawna Fenison, whose son, Private First Class Ryan Hill, served with Wood in Charlie Company, 1st Battalion, 26th Infantry Regiment. Hill died on January 20, 2007, in Iraq, when a roadside bomb exploded near his Humvee.

But the letter wasn’t enough. “I don’t think the country cares about us and would rather we just go away,” she said.

That feeling represents a failure, and a historic shift. The concept of the Gold Star family began as an invitation for conversation and caring between civilians and military.

During World War I, a family could hang the red-bordered flag with two blue stars in the front window to alert the neighborhood that two sons served overseas.

The neighbors could say, “Heard from your boy?” or “Where’s he fighting?”

If one of those stars turned to gold, the conversation changed.

“I’m sorry for your loss.”
“Thank you for your sacrifice.”

But as wars turned more political and became less of a community effort—during World Wars I and II, most families had relatives or friends serving—the conversation changed again. Some families folded their flags when they felt they brought unwanted attention during the Vietnam War. In recent wars, the rarity of the flags offered reminders of just how few people served. About 7 percent of Americans have served in the military, and less than 1 percent serve now in our all-volunteer armed forces.

Both Fenison and Wood-Vincent were initially showered with gifts: flags. Artwork anonymously sent in the mail. Letters from strangers. “I have a large, supportive family,” Wood-Vincent said. “My neighborhood happens to be very military.” At work, people knew her son and offered condolences. People told her they fly the American flag for her son.

“We had a neighbor who came down every night for two years and prayed over our home,” she said. “I had never met her. I would see her out there in the summertime, in the wintertime, standing in the rain with her little dog.”

People still leave things on her porch.

“It may be the part of the country I’m in, or the neighborhood,” she said. “But part of it is the people I’ve surrounded myself with.”

Fenison had a similar experience, at first. But then, as people moved on with their lives or grew disenchanted with the wars, they encouraged her to stop talking about her son, to take down the “shrine” she’d assembled in her home that included her son’s pictures and the flag from his coffin.

“When I talk about Ryan, many will change the subject or give me the look of ‘Here she goes again,’ so I find myself withdrawing more and more,” she said. “Communities are good about honoring on Memorial Day with their token events, but it pretty much stops there.

“While my world has stopped, the rest has moved on.”

The families ache for the engagement—for someone to care. For someone to mourn their losses. For someone to look up Niger on a map and not only think about what it might have felt like to be doing what you, yes, signed up for and loved—but also to contemplate the terror and heartbreak for service members, friends, and families.

Those flags should serve as a call to action: This family’s sacrifice represents you. Gather them up. Listen to their stories.

“It’s much more complicated than people know,” Wood-Vincent said.

“On one hand, I’m a mother who lost a child no matter how he was taken from the world. I’m not thinking of him as a soldier.”

But then she explains to strangers how he died.

“People will say, ‘Oh, what a shame. What a waste,’” she said. “Don’t assume I feel the same.”

Sometimes, she said, she gets angry and wants to walk away. Other times, she reminds herself that she can’t be mad about people’s ignorance about proper responses or “Gold Star” moms if she’s not helping to educate them.

“I’ll think, ‘That person just made me so angry,’” she said. “Why? Well, my son’s loss was not a waste. Give me 10 seconds in the parking lot to tell you why. If someone sees your Gold Star plate and says, ‘What is that?’, you don’t say, ‘Hey. You’re an idiot. You should know.’”

She sees her personal call to action as part of that big conversation. Every summer, she invites her son’s brothers in arms to a reunion. Her family created a scholarship through the university to celebrate his art: punk-rock drawings that expressed convictions about being different and doing your part to save the world. And she has told his story at several events.

She makes sure people know and remember him, and through that, she closes the divide.

She believes that communities can, too. Local organizations can invite in Gold Star family members. They can form community partnerships—Boy Scouts who adopt families, or Junior Leaguers who organize lunches, or schools that bring Gold Star alumni in as speakers. Communities can organize town halls about what families need—even if that need is simply relaying kind questions to ask. Leaders can ensure families are remembered beyond Memorial Day.

And Gold Star families have to be willing to accept those invitations.

“You’ve got to open yourself,” Wood-Vincent said. “They’ll never completely understand, and thank God for that. But they will never understand if we don’t invite them in.”

The “Crying Indian” Ad That Fooled the Environmental Movement
By Finis Dunaway | Thu, 09 Nov 2017
http://www.zocalopublicsquare.org/2017/11/09/crying-indian-ad-fooled-environmental-movement/ideas/essay/

It’s probably the most famous tear in American history: Iron Eyes Cody, an actor in Native American garb, paddles a birch bark canoe on water that seems, at first, tranquil and pristine, but that becomes increasingly polluted along his journey. He pulls his boat ashore and walks toward a bustling freeway. As the lone Indian ponders the polluted landscape, a passenger hurls a paper bag out a car window. The bag bursts on the ground, scattering fast-food wrappers all over the Indian’s beaded moccasins. In a stern voice, the narrator comments: “Some people have a deep, abiding respect for the natural beauty that was once this country. And some people don’t.” The camera zooms in on Iron Eyes Cody’s face to reveal a single tear falling, ever so slowly, down his cheek.

Cody’s tear made its television debut in 1971 at the close of a public service advertisement for the anti-litter organization Keep America Beautiful. Appearing in languid motion on TV over and over again during the 1970s, the tear also circulated in other media, stilled on billboards and print ads, forever fixing the image of Iron Eyes Cody as the Crying Indian. The ad won many prizes and is still ranked as one of the best commercials of all time. By the mid-1970s, an Advertising Council official noted, “TV stations have continually asked for replacement films” of the commercial, “because they have literally worn out the originals from the constant showings.” For many Americans, the Crying Indian became the quintessential symbol of environmental idealism. But a closer examination of the ad reveals that neither the tear nor the sentiment was what it seemed to be.

The campaign was based on many duplicities. The first of them was that Iron Eyes Cody was actually born Espera De Corti—an Italian-American who played Indian both in his life and on screen. The commercial’s impact hinged on the emotional authenticity of the Crying Indian’s tear. In promoting this symbol, Keep America Beautiful (KAB) was trying to piggyback on the counterculture’s embrace of Indian-ness as a more authentic identity than commercial culture.

The second duplicity was that KAB was composed of leading beverage and packaging corporations. Not only were they the very essence of what the counterculture was against; they were also staunchly opposed to many environmental initiatives.

KAB was founded in 1953 by the American Can Company and the Owens-Illinois Glass Company, who were later joined by the likes of Coca-Cola and the Dixie Cup Company. During the 1960s, KAB anti-litter campaigns featured Susan Spotless, a white girl who wore a spotless white dress and pointed her accusatory finger at pieces of trash heedlessly dropped by her parents. The campaign used the wagging finger of a child to condemn individuals for being bad parents, irresponsible citizens, and unpatriotic Americans. But by 1971, Susan Spotless no longer captured the zeitgeist of the burgeoning environmental movement and rising concerns about pollution.

The shift from KAB’s bland admonishments about litter to the Crying Indian did not represent an embrace of ecological values but instead indicated industry’s fear of them. In the time leading up to the first Earth Day in 1970, environmental demonstrations across the United States focused on the issue of throwaway containers. All these protests held industry—not consumers—responsible for the proliferation of disposable items that depleted natural resources and created a solid waste crisis. Enter the Crying Indian, a new public relations effort that incorporated ecological values but deflected attention from beverage and packaging industry practices.

KAB practiced a sly form of propaganda. Since the corporations behind the campaign never publicized their involvement, audiences assumed that KAB was a disinterested party. The Crying Indian provided the guilt-inducing tear KAB needed to propagandize without seeming propagandistic and countered the claims of a political movement without seeming political. At the moment the tear appears, the narrator, in a baritone voice, intones: “People start pollution. People can stop it.” By making individual viewers feel guilty and responsible for the polluted environment, the ad deflected the question of responsibility away from corporations and placed it entirely in the realm of individual action, concealing the role of industry in polluting the landscape.

When the ad debuted, KAB enjoyed the support of mainstream environmental groups, including the National Audubon Society and the Sierra Club. But these organizations soon resigned from its advisory council over an important environmental debate of the 1970s: efforts to pass “bottle bills,” legislation that would require soft drink and beer producers to sell, as they had until quite recently, their beverages in reusable containers. The shift to the throwaway was responsible, in part, for the rising levels of litter that KAB publicized, but also, as environmentalists emphasized, for the mining of vast quantities of natural resources, the production of various kinds of pollution, and the generation of tremendous amounts of solid waste. The KAB leadership lined up against the bottle bills, going so far, in one case, as to label supporters of such legislation as “Communists.”

We can still see the impact of the Crying Indian campaign today in mainstream portrayals of environmentalism that prioritize the personal over the political. The answer to pollution, as KAB would have it, had nothing to do with power, politics, or production decisions; it was simply a matter of how individuals acted in their daily lives. Ever since the first Earth Day, the mainstream media have repeatedly turned big systemic problems into questions of individual responsibility. Too often, individual actions like recycling and green consumerism have provided Americans with a therapeutic dose of environmental hope that fails to address our underlying issues.

Iron Eyes Cody (right) at a Keep America Beautiful awards ceremony with Leland C. Barbeur, president of the Fayetteville, N.C., County Youth Council, and Miss Teenage America Cathy Durden, in Washington, D.C. on Dec. 5, 1975. Photo courtesy of Associated Press.

But there is a final way that the commercial distorted reality. In the ad, the time-traveling Indian paddled his canoe out of the distant past, appearing as a visual relic of indigenous people who had supposedly vanished from the continent. He was presented as an anachronism who did not belong in the picture.

One of the commercial’s striking ironies is that Iron Eyes Cody became the Crying Indian at the same moment that actual Indians occupied Alcatraz Island in San Francisco Bay, the very same body of water in which the actor paddled his canoe. For almost two years, from late 1969 through mid-1971, a period that overlapped with both the filming and release of the Crying Indian commercial, indigenous activists demanded that the U.S. government cede control of the abandoned island. They presented themselves not as past-tense Indians, but as coeval citizens laying claim to the land. The Alcatraz activists sought to challenge the legacies of colonialism and contest contemporary injustices—to address, in other words, the realities of native lives erased by the anachronistic Indians who typically populate Hollywood film. By contrast, the Crying Indian appears completely powerless. In the commercial, all he can do is lament the land his people lost.

In recent years, the large-scale organizing and protests against the Keystone XL Pipeline, the Dakota Access Pipeline, and other fossil fuel development projects all represent a powerful rejection of the Crying Indian. While the Crying Indian appeared as a ghost from the past who erased the presence of actual Indians from the landscape, these activists have visibly proposed structural solutions for the environment while demanding indigenous land rights. Moving beyond individual-driven messages, they cast off static symbols of the past to envision a just and sustainable future.

The Origins of Burma’s Old and Dangerous Hatred
By Michael Jerryson | Wed, 08 Nov 2017
http://www.zocalopublicsquare.org/2017/11/08/origins-burmas-old-dangerous-hatred/ideas/essay/

In a recent interview with a Guardian journalist, the Burmese monk U Rarzar expressed his country’s rationale for fearing and repressing its Muslim minority. “[The] Ma Ba Tha is protecting people from terrorists like ISIS,” U Rarzar told the British newspaper. “Muslims always start the problems, such as rape and violence.” While U Rarzar’s comments might seem shocking, they follow a script that Burmese Buddhists have rehearsed for almost one hundred years.

The fear, suspicion, and ill will, if not active hatred, that Burmese Buddhists bear toward Muslims is pervasive. It is a kind of ideological indoctrination that permeates the society in ways both subtle and overt. Buddhists across Burma (also known as Myanmar)—whether they are Buddhist monks, nuns, or laity—have expressed fear that their Burmese Buddhist identity is under threat of extermination. In Myanmar, it is popularly understood that to be Burmese (the nation’s largest ethnic group) is to be Buddhist. As such, a threat to Burmese Buddhism is seen as an existential threat to the nation.

The escalating persecution and genocide of Myanmar’s Muslim minority, the Rohingya, have deep roots in the country’s Buddhist institutions. The name of the Ma Ba Tha that U Rarzar refers to translates to Association for the Protection of Race and Religion. The organization is well-known for its community outreach programs, legal clinics, donation drives, and its advocacy for Buddhism. Its membership consists of both monks and lay Buddhists.

The Ma Ba Tha is also known for its members’ active persecutions of the Rohingya Muslims in far western Burma’s Rakhine State. The attacks against the Rohingya in recent weeks have aroused international condemnation of Burma’s military government and of Aung San Suu Kyi, the formerly revered Nobel Peace Prize-winning politician who is Burma’s de facto civilian leader.

The Ma Ba Tha has been among the most vocal promoters of the notion of an imminent Muslim takeover of the country. To address these fears, in 2015 the Ma Ba Tha supported the passage of four laws, collectively known as the “Race and Religion Protection Laws.” These laws were specifically designed to control the Muslim population’s growth by regulating birth rates, marriages, and conversions. Yet even with these laws in place, fear and anxiety keep rising among Burmese Buddhists who believe that extermination at Muslim hands is nigh.

In fact, the country’s statistics show no such threat. Home to 55 million people, Myanmar has a population that is roughly 88 percent Buddhist. During the 1970s and 80s, the Muslim population stood at 3.9 percent. In the most recent census data from the Burmese Ministry of Labor, in 2016, the Muslim population had risen to 4.3 percent. However, the largest concentration of Muslims in Myanmar is the Rohingya, who have lived in Rakhine State, on the border with what is now Bangladesh, since the 1800s. While their numbers have increased over the years, their proportion of the national population has remained relatively constant.

If these numbers are accurate, why do Myanmar’s Buddhists exhibit such anxiety and fear?

Part of the answer lies in history. During the British colonization of Burma (1824-1948), there was a steady flow of South Asian immigrants into Myanmar. The British interpreted the developing South Asian Muslim community as evidence of modernization. Unfortunately, this colonial preferential stereotyping also divided South Asian Muslims from their Burmese Buddhist counterparts.

The British occupation of Burma, the promotion of Christianity, and the lauding of non-Burmese Buddhists sparked organized Buddhist responses, such as the Young Men’s Buddhist Association (YMBA), which sought to revitalize Burmese Buddhism. At the same time, South Asian Muslims were derided with the derogatory label “kalars,” due to their religion and darker skin color. Burmese Buddhists viewed South Asians as both polluting the country’s reputation and contributing to the eradication of Burmese Buddhists within their own country.

In the 1930s, Burma began boycotting “Indian goods.” Authoritative organizations such as the Legislative Council of the Governor of Burma characterized the continual immigration of South Asians as turning Burma into a dumping ground. The racialization of South Asian Muslims was not unique to Myanmar. In other Southeast Asian countries such as Thailand, South Asian Muslims and Malay Muslims have been labeled with the derogatory term khaek, another reference to skin color.

From the 1930s onward, there were periodic anti-Muslim riots and pogroms. According to Nyi Nyi Kyaw, a postdoctoral fellow at the Centre for Asian Legal Studies at the National University of Singapore who has written extensively on the history of anti-Muslim feeling in Myanmar, the Burmese Buddhist attacks focused primarily on South Asian Muslims, such as Bengali Muslims. Many of these people had emigrated from the Indian state of Bengal and what is now Bangladesh. These attacks continued throughout the Burmese military junta’s reign, from 1962 to 2011.

This background is crucial to understanding the power behind the recent Rohingya narratives in the media. When high-ranking Buddhist monks such as U Wirathu remind their Buddhist audiences about the dangers of Islam, and reference the kalars—likening the Rohingya to wild dogs or African carp—they are making use of a well-rehearsed racist narrative. This racism fuels fears of pollution, and stokes the fires of hatred and the desire to commit violence. It also allows Burmese Buddhists to see the Rohingya as the “Other”: a caricature of the foreign as sub-human, with very little moral worth.

This campaign of dehumanization has been disastrous for the Rohingya. After widespread anti-Muslim violence in 2012, the Burmese government placed many Rohingya in camps. Despite severe criticism from international organizations including Human Rights Watch, International Crisis Group, and Amnesty International, the government forced more than 120,000 Rohingya to live in cramped spaces, without sufficient food, water, or medical attention. In 2014, The New York Times columnist Nicholas Kristof identified these areas as concentration camps and noted that physicians, including Doctors Without Borders, were removed from the camps and not permitted to re-enter.

Buddhist authorities have fostered another narrative in Burmese history: the invasion and pollution of the Burmese Buddhist female body. With Myanmar’s Race and Religion Protection Laws, the Ma Ba Tha made women’s bodies the staging ground of a battle for Buddhism. The “Religious Conversion Law” “protects” Burmese Buddhist women from marrying Muslims and converting to Islam. U Wirathu has delivered sermons claiming that the Muslim strategy is to convert Buddhist women, impregnate them, and raise Muslims as enemies against the country. Hindu nationalists in India have seized on the same tactic, recently alleging Muslim plots to “seduce” their women.

In this narrative, women’s bodies are not only protected, they are avenged, with violent retaliation for the “pollution” of Burmese Buddhist women’s bodies. The most recent chapter of anti-Muslim violence began in June 2012, over allegations that Rohingyas had raped a Rakhine Buddhist woman. Even though the attack was never legally verified, Rakhine Buddhists burned the villages of the Rohingya. More than 100,000 Rohingya became refugees by the end of 2012—and were soon placed in Myanmar’s concentration camps.

In his book Colors of Violence, Indian psychoanalyst Sudhir Kakar examines the roots of Hindu-Muslim violence in India. He argues that an attack on a female body escalates a conflict and dissolves any possibility of civil discourse. Kakar writes: “Rape makes such interactions impossible and turns Hindu-Muslim animosity into implacable hatred.”

The violence also focuses on Rohingya female bodies. Burmese Buddhist soldiers have raped Rohingya women as a means of exerting their dominance. While Buddhist monks like U Wirathu allege that the Rohingya are raping Burmese Buddhist women, there have been steady reports from UN-sanctioned shelters of Rohingya women being raped by Burmese Buddhist soldiers. Annette Ekin, reporting from a Bangladeshi shelter for the Rohingya, details 20-year-old Ayesha Begun’s account of soldiers killing the men, tearing a baby away from its mother, and gang-raping Ayesha and the other women. The New York Times reporter Jeffrey Gettleman recounts an equally brutal story involving a young Rohingya woman called Rajuma.

It would be easy to discount the atrocities taking place in Myanmar as an aberration. Unfortunately, the country’s history offers a very different assessment. Sadly, this is not a new issue. It is but a new chapter of Buddhist-inspired violence, racism, and sexist rhetoric. The actions do not reflect a new development in Buddhism, or a unique strain within Burmese Buddhism.

Whether it is Japanese Zen Buddhist masters, Tibetan lamas, or Sri Lankan monks, history provides examples of Buddhist religious authorities engaging in violence and supporting wars and conflicts. There is also a long tradition of Buddhists supporting gender discrimination and military forms of governance.

I cannot emphasize enough that these dark elements do not reflect general Buddhist sentiments on a global level. More than 1 billion people practice some form of Buddhism. The vast majority of them actively support peace and contemplative behavior. But that generality does not mean that Buddhists are immune to racist tendencies, acts of rape, and other forms of violence. Instead, atrocities such as those in Myanmar serve as a grim reminder that humankind is vulnerable to vices, regardless of religion or nationality.
