Zócalo Public Square: Ideas Journalism With a Head and a Heart

A Very Cheech Marin Thanksgiving
By Joe Mathews | November 21, 2017

This week, California should give thanks for Cheech.

Richard Anthony Marin deserves our gratitude not just because his new autobiography, Cheech Is Not My Real Name … But Don’t Call Me Chong, turns out to be the best California book of the year. And not just because his career should give you hope that no matter how short, bald, or brown you are, you can be a star.

The biggest reason to thank Cheech now is that his life embodies Thanksgiving itself: a big, robust meal that includes many different flavors but is ultimately for everyone. This California entertainer’s decades-long persistence at the center of our culture reminds us, happily, that our state’s cultural mainstream is so much more interesting and inclusive than we acknowledge.

Indeed, Cheech is evidence of a California paradox: To stay in the mainstream here, you have to be something of an outsider. To achieve Cheech-level ubiquity in a place as big and diverse as this, you have to cross lines, rather than respect them.

In the case of Cheech, this is all the more so because he is most often identified as a “cult” figure—best remembered as one-half of the stoner comedy team Cheech and Chong, which produced hit comedy records and the 1978 film Up in Smoke. But his career has been much bigger and more mainstream than that.

Indeed, the dirty secret of Cheech’s life, as he tells it, is just how much of a square he’s been. Marin was a middle-class kid who saw two sides of L.A. He spent his early years in African-American sections of South Central Los Angeles, where his parents had gone to high school. His father was an LAPD officer; his mother was president of the PTA. But by his teens, the family had relocated to a white neighborhood in the San Fernando Valley.

Racially and ethnically, he was an outsider in both places, so he coped by doing everything he could to fit in. The future stoner actor-musician-writer-comedian was a Cub Scout, a Boy Scout, an altar boy, and “a little wiseass who got straight A’s,” first at Catholic schools and later at San Fernando Valley State College (now Cal State Northridge). Marin describes himself as an educational traditionalist, stumping for the values of the classic liberal arts education. He even worked in aerospace during college, manufacturing airplane galleys at Nordskog.

The book’s signature moment—recounted by Cheech as the Apostle Paul might have recalled his trip to Damascus—is when he smoked marijuana for the first time, and found that the allegedly mind-rotting substance expanded his perspective. He thought: “What else have they been lying about?”

And with that, he discovered art, awakened politically, dodged the draft, went to Canada, met the former musician Tommy Chong, and began playing shows all over the world. The rest is California history: Lou Adler started managing him, he bought a house in Malibu, and he even practiced Transcendental Meditation, as taught by the Maharishi Mahesh Yogi.

Cheech proudly identifies as Chicano and Latino, and sees his heritage as bridge, not niche. The glory of being Latino, in his telling, is that you are part of a demographic that is itself so diverse as to contain multitudes.

“My face has some kind of international malleability to it. Add your own preferences or prejudice to it, and I could be anything,” he writes, before sharing a successful romantic strategy in which he told women he was “Dutch Indonesian.”

But narrow-minded Hollywood types couldn’t see his natural breadth at first, and sought to pigeonhole him in narrow roles. Marin countered by writing his own material, most successfully in the 1987 film, Born in East L.A. The movie is quintessential Cheech—a comedy that frames the Mexican-American story as fundamentally American, and demonstrates the absurdity of trying to put people in boxes.

(Its humor holds up far too well. Marin’s character, an American citizen and native Angeleno, gets caught up in an immigration raid and mistakenly deported back to Mexico, where he is a fish out of water. Today, that comic premise is a nasty reality: The federal government has been mistakenly deporting thousands of American citizens annually, according to the Deportation Research Clinic at Northwestern University.)

Marin’s other strategy was to find roles in the most middlebrow TV, movie, and musical productions and make them his own. He did a spin-off of The Golden Girls. (“When the opportunity came to be in a very, very mainstream, down-the-middle-of-Middle-America show, I jumped at it,” he was quoted as saying at the time.) He also made a successful children’s album, and became a regular player in Robert Rodriguez movies.

He has touched most stations of the California cultural cross. For years, he co-starred with Don Johnson on the highly rated police drama Nash Bridges, which was set and filmed in San Francisco. Marin loved the city so much that he moved his family there, and he developed a golf jones by sneaking away from the set with Johnson to play Pebble Beach.

He also found time to appear in the premiere of a Sam Shepard play, The Late Henry Moss, opposite Sean Penn at San Francisco’s Theatre on the Square. And he turned himself into a regular voice in the animated films of Emeryville-based Pixar, most notably as Ramone in the Cars movies. He showed up as Banzai in The Lion King, too.

Marin is unapologetic about mainstream success. His book includes an entire chapter on how he outsmarted Anderson Cooper to become the champion of Celebrity Jeopardy. And by his account, his old partner, Tommy Chong, is a cautionary tale whose career foundered because he was not willing to evolve to reach audiences.

“I have no hard feelings about the journey,” he writes. “We’ve been successful and made a lot of people happy. I just wonder why we were so hard for people to see for so long.”

Marin has made news recently as a leading collector and public champion of Chicano art. The City of Riverside has proposed to turn over its main library—across the street from the historic Mission Inn—for the Cheech Marin Center for Chicano Art, Culture, and Industry. Marin, ever mainstream, emphasizes that, “Chicano art is American art.”

Despite his cult status, it’s hard to call Cheech countercultural now. The man who performed his most recent marriage, former Los Angeles Mayor Antonio Villaraigosa, is a leading candidate for California governor. And on January 1, 2018, recreational marijuana will become legal in his home state. The counterculture has become the mainstream, and Cheech has one foot in both. Isn’t that the California dream?

Now that Cheech is an institution, maybe it’s time to honor him as one. Perhaps California could have its own version of Mt. Rushmore; the natural spot to chisel out the faces of Golden State greats would be in the Granite Mountains, a small range in the Mojave Desert.

There would be many great candidates for this pantheon. But why not start by carving the old stoner in stone?

Want to Take My Civics Class? Get Ready to Squirm
By Sarah Cooper | November 17, 2017

In many conversations, the topic of civics education comes with its own halo. The conventional wisdom is that it’s good, clean medicine, and if our children just get enough of its inoculation, the American body politic will be healthy enough to survive another generation.

But after nearly two decades as a middle-school and high-school history teacher, I’ve come to understand through teaching civics—and studying how it’s taught—that learning how to be a citizen doesn’t work like that. Indeed, civics education is best when it’s messy and uncomfortable.

That’s especially true in times of conflict and transition, like the ones we are living in now.

Approaches to teaching civics have been as volatile as the country’s history.

Horace Mann, the mid-19th-century common school advocate, believed that schools should teach only those values that everyone agreed with, such as charity and justice, as the educational historian David Tyack recounted in Seeking Common Ground: Public Schools in a Diverse Society. At the time, “everyone” meant people like Horace Mann—white, wealthy, Protestant, and native-born, not the immigrants who poured across the Atlantic.

Later in the 19th century and into the 20th, the government waded into programs of Americanization, trying to replace the influence of the family with the power of the state in teaching how to be an American citizen. Such inculcation through schools included Native Americans early on and Japanese Americans later, after World War II. In contrast, with the cultural upheavals of the 1960s and ‘70s, many school districts began treating each culture as something to be celebrated rather than repressed.

Regardless of the era, civics education over the past two centuries has been driven by social upheaval. As Tyack observed, “During periods of sharp demographic change, or war, or ethno-religious conflict, or economic challenge, for example, foundational principles of civic education came into sharper relief because they were less taken for granted.”

Today we live in a period of such conflicts, and I take absolutely nothing for granted in teaching civics. So I try to mix traditional and contemporary approaches.

On the traditional side, I ask my students to memorize lines from the Declaration of Independence and Lincoln’s speeches, especially those that focus on difficult compromise and principled revolt. I hope that phrases such as “We shall nobly save, or meanly lose, the last best hope of earth,” and “it is their right, it is their duty, to throw off such Government,” will ring in students’ heads well into adulthood, inciting them to action.

When I first taught eighth-grade history, I required students to memorize a lot more. On tests they had to regurgitate names that educated Americans barely remember, such as John O’Sullivan with his Manifest Destiny theory or Frederick Jackson Turner with his frontier thesis. For women’s suffrage, students had to spit out a litany of names rather than just a few who could serve as touchstones, such as Elizabeth Cady Stanton, Ida B. Wells, or Alice Paul.

With civics and the Constitution, I also used to include details that no longer matter as much to me, because they mire students in minutiae at the expense of deeper understanding. My middle schoolers no longer have to know the exact number of the amendment that limits presidential terms, but rather why it happened and in response to what.

Broad educational shifts, such as evolving state standards, have definitely influenced me to focus more on depth and less on breadth when teaching both civics and history. Even Advanced Placement history exams have been overhauled in the past several years to emphasize thematic understanding over disconnected details.

Yet even this kind of conceptual focus on the broader picture has not always felt like enough.

In the past several years, to get students to engage and think like citizens, I’ve found that I have to do a heck of a lot more than ask them to understand the past and dissect the present: I need them to create the future in my classroom, right now, through discussion and debate. I need them to be leaders in the classroom, not just participants, so that they can imagine presiding over boardrooms, civic groups, and family conversations in the future.

And so, on the best days, these eighth-grade U.S. history students conduct discussions on their own, pose questions that provoke and unsettle, and challenge authority—including and especially mine—with respect.

After all, the laws of a country mean little without the ability of leaders and citizens to listen to other views and believe that our laws mean something. The Constitution remains powerful only in that citizens have gifted it that power over two centuries and counting.

As Horace Mann foresaw, open dialogue breeds discomfort. When we examine Black Lives Matter or gun control, some of my students lean in, while others squirm and slouch. I tell them that these discussions are supposed to be uncomfortable.

And each Friday in my classroom, several students bring up such uncomfortable issues in their weekly current events presentations. The presenter summarizes a news article and gives the reasons he or she chose it. Everyone writes down a question or comment—and then the floor opens for discussion.

The presenter fields questions on everything from how to prevent nuclear war, to why there are so many homeless people in Los Angeles, to what abortion looked like before Roe v. Wade. Week by week, we work on solving problems together.

When I first started teaching, students did these weekly presentations, but we talked about their articles only for a few minutes, and the questions came largely from me. In an attempt to make my classroom feel more democratic, I realized a few years ago that more of the power needed to be in students’ hands, at least one day a week.

And so I sit back and listen, to questions they ask of each other: “So is the government doing anything yet to help with the recent flooding?” or “Why has the homicide rate gone up in that country?”

In describing such “maximally open classroom climates,” education professor Meira Levinson in No Citizen Left Behind offers questions that students can ask themselves and each other while engaging in discussion, such as “Why do people care about this topic?” and “Is this person making an argument, or just talking for the sake of talking?”

In our national discourse, Levinson’s kind of metacognition might inspire those who talk too much to listen more, and those who don’t talk enough to speak up already.

By fostering such reflective questions about how we talk, civics education can also create new lenses through which students can view the world.

I see kids change their perspectives in my classroom every week. For instance, once they understand how a concept such as federalism relates to marijuana or immigration laws in California, they begin to ask about how people in other states are affected by these issues. The thought that someone in Texas might possess entirely different rights than someone in Los Angeles shocks them. They sputter at what seems to them the unfairness of it all.

If we’ve just discussed the road to the Civil War, I’ll link civics to history—asking them to imagine what it might have felt like to be bound up in even more serious state-to-state conflicts, such as those over the Fugitive Slave Law.

Not all of these discussions come easily. Sometimes the political opinions that the presenters express, in our majority-liberal but certainly not uniformly liberal school, land the wrong way for a student who believes in pushing tax cuts or loosening gun restrictions. Sometimes I worry that those in the ideological minority don’t even want to talk because they don’t feel it’s worth it. And so I occasionally step in to play devil’s advocate for whichever side is not getting enough airtime on an issue, to remind students that our classroom bubble is not the world’s bubble.

The ultimate perspective that I hope for my students is that they assume a little more humility than adolescents (or adults) typically do. When I ask them what they don’t know about an issue, I want these eighth graders to remember that they need to rely not only on themselves for answers to civic problems, but also on the people around them. Often I’ll bring in a historical primary source to make a point like this, such as a speech that the not-always-humble Ben Franklin gave to the Constitutional Convention in September 1787.

Franklin, after four months of difficult deliberations, asked each delegate in Independence Hall to accept that the draft of the Constitution was good enough. Then he asked each man to do even more, to leap beyond mere acceptance and “doubt a little of his own infallibility.”

Franklin’s ability to step back from the debate is the real inoculation I’m looking for as a civics teacher and a citizen. Not a syringe filled with terms such as ratify and suffrage, but one packed with a heavy dose of self-doubt. And it should be a big enough dose for our children to understand that raising citizens, and being citizens, can be as messy and pungent a process as four months of sitting in a hot Philadelphia summer, until you’re ready to call something good enough, for now.

American Populism Shouldn’t Have to Embrace Ignorance
By Daniel R. DeNicola | November 14, 2017

Public ignorance is an inherent threat to democracy. It breeds superstition, prejudice, and error; and it prevents both a clear-eyed understanding of the world and the formulation of wise policies to adapt to that world.

Plato believed it was more than a threat: He thought it characterized democracies, and would lead them inevitably into anarchy and ultimately tyranny. But the liberal democracies of the modern era, grudgingly extending suffrage, have extended public education in parallel, in the hope of cultivating an informed citizenry. Yet today, given the persistence and severity of public ignorance, the ideal of an enlightened electorate seems a fading wish at best, a cruel folly at worst.

Unfortunately, our current civic problem cuts even deeper: We are witnessing the rise of a culture of ignorance. It is particularly insidious because it hijacks certain democratic values. To begin to understand this culture and its effects, it is helpful to identify the ways it differs from simple ignorance.

Perhaps the most noticeable aspect of a culture of ignorance is the extent of willful ignorance. Ignorance that is willful may involve resistance to learning, denial of relevant facts, the ignoring of relevant evidence, and suppression of information. Such ignorance is usually maintained in order to protect a prior belief or value—a sense of self, an ideology, a religious doctrine, or some other cherished cognitive commitment. False knowledge often bolsters one’s will in maintaining a closed mind; but of course, it is only ignorance in elaborate disguise.

When the willfully ignorant are cornered by mounting evidence, they assert their individual right to believe whatever they choose to believe. This is a hollow and silly claim. Beliefs are factive; they aspire to truth. Moreover, beliefs affect attitudes, decisions, and actions. As the Victorian mathematical philosopher William K. Clifford remarked, “No one man’s belief is in any case a private matter which concerns him alone.” He proposed “an ethic of belief” and championed our responsibility to respect evidence for and against our beliefs. Though his standard of evidence may have been too stringent, we can agree that claiming the right to believe “whatever” exploits the democratic respect for individual rights by foregoing individual responsibilities.

A related characteristic is the rejection of expertise. Liberal democratic theory and practice have always elevated individual autonomy and independence, rejecting authority and dependency. They therefore have had difficulties with any relationship that yields individual autonomy—which seems to be involved in consulting an expert. It is true that the place of expertise in a democracy remains contested: We may yield to the expertise of the physician, pilot, or engineer (albeit uneasily); but we may be skeptical of the expertise of the economist, climate scientist, or critic.

Are we heading toward a culture of ignorance? Photo courtesy of pxhere.

Our ambivalence regarding expertise has increasingly come to be a rejection. The rise of social media has certainly contributed to this trend. Who needs a qualified film or restaurant critic when one can find websites that provide thousands of audience or diner ratings? But the implications go far beyond aesthetics: As a senior minister famously said during the recent Brexit campaign, “Britain has had enough of experts.” Among at least a significant portion of the population, this attitude has led to a rejection of the traditional sources and certifiers of knowledge—universities, science, established journalism. As this attitude engulfs public life, it undermines the fragile but vital distinction between knowledge and belief, between informed judgment and unreflective opinion.

This epistemic populism seems radically democratic, but that image is an illusion. Democracy is, as John Dewey described, a moral climate in which each person may contribute to the construction of knowledge; but it doesn’t imply that each person possesses the truth. Moreover, one need not yield political authority to experts; it is epistemic authority—the authority of knowledge, skill, experience, and judgment—that is carried by experts.

At some point, the “wisdom of crowds” becomes the celebration of ignorance. Conspiracy theories, wild speculations and accusations, nutty claims, “alternate facts,” and pronouncements that are far afield from one’s knowledge—all these claim time or space on a par with accurate and important information. The politician who is ignorant of politics, the law, and history is seen as the person who will “get things done.” Some public figures wear their ignorance as a badge of honor. Let’s be clear: Ignorance is not stupidity, though I admit it is sometimes difficult to tell them apart in practice. And stupidity is likely to produce ignorance across a broad front. But one can be ignorant without being stupid.

Underlying all of these factors is the loss of respect for the truth. No doubt, many things have contributed: the venality of some experts, the public disagreement among experts, the continual revising of expert advice, and the often-unwarranted movement by social scientists from the descriptive to the normative, from facts to pronouncements. Religious fundamentalism, which stretches credibility, is another precipitating factor. The postmodernist deconstruction of ideals like truth, rationality, and objectivity also contributed to this loss—though I doubt that postmodernist treatises were widely read among conspiracy theorists, religious fundamentalists, or climate change deniers.

The irony is that these folks believe they are holding the truth. Indeed, I am not suggesting that we need to claim we possess the Truth, firmly and finally; in fact, I believe those who make that claim actually disrespect the truth. Rather, we need to keep the ideal of truth to guide our inquiries, to aspire to greater truth. Not all opinions or interpretations are equally worthy. The concept of truth is required to separate knowledge from opinion; those who give up on truth, those for whom truth doesn’t matter, are—as the contemporary philosopher Harry Frankfurt said—left with bullshit.

There are signs of hope. Many young people have a naturally skeptical, even cynical, attitude regarding information sources. There is a surge of interest in investigative journalism in various forms. The teaching of critical thinking has broadened to include information literacy: Many colleges now provide ways to learn the skills of evaluating informational sources and content, including statistical integrity. Scholars are giving new attention to epistemic virtues, capacities and traits that enhance the acquisition of knowledge. There is excited talk among feminist and educational philosophers of “an epistemology and pedagogy of resistance” that confronts willful ignorance and the “epistemic injustice” of systematically discrediting certain voices.

The danger, and by the same token the hope, lies in this truth: In the end, ignorance will lead to error. Serious mistakes and their consequences may be required before there is sufficient momentum to roll back this culture.

What Californians Can Learn From South Korea’s Nuclear Cool
By Joe Mathews | November 13, 2017

Can Californians learn to be as cool as Koreans in the face of nuclear annihilation?

Visiting Seoul last week, I asked people how they stay sane while living within range of North Korea’s weapons. After all, Kim Jong Un’s capital, Pyongyang, is just 120 miles from Seoul—the same meager distance protecting San Diego from Los Angeles.

Seoul’s regional population is now 25 million, about half of the country’s total, and South Koreans have been living productively under North Korea’s threats for more than six decades. But for Californians, being North Korean targets is disorientingly new—because of the regime’s recent advances in developing both nuclear warheads and intercontinental missiles that might be able to reach Disneyland. North Korean propaganda has sown fears with specific threats against California and even animations of nuking San Francisco.

These threats have been amplified by President Trump’s thoughtless provocations of the North—his pledge of “fire and fury like the world has never seen” seemed to welcome nuclear war—and by more official warnings in California. The L.A.-area Joint Regional Intelligence Center issued a bulletin last summer urging state and local officials in California to update nuclear attack response plans. The bulletin also included scary details, including how, post-mushroom cloud, your pets might carry enough radioactive contamination to kill you.

Arriving in Seoul a little jittery (my mother asked me if I really had to go), I found the people there to be profoundly reassuring, especially considering that North Korea has its ground forces and thousands of pieces of artillery positioned near the border, just 35 miles to the north.

“Keep calm,” I was advised over and over again, with some citing the World War II-era British advice to “Keep calm and carry on.” In bars, patrons swapped out “and carry on” for “and drink beer”—a popular meme that nods to the Korean passion for spirits.

It’s also hard-won wisdom that Californians should adopt: To avoid war with a volatile neighbor for 60 years, you can’t lose your temper or your head. Indeed, the deepest worries I heard from South Koreans involved the reliability of President Trump, and whether he would truly honor the longstanding, ironclad American commitment to protect South Korea even if it means risking an attack on the U.S. mainland. The North Korean strategy of escalation—through nuclear bomb and missile tests—is often seen as posing a question to the United States: “Are you willing to trade Los Angeles for Seoul?”

But that question, while linking the fates of California and Korea in a frightening manner, is seen as mostly rhetorical in Seoul, a city so economically and culturally vital that its destruction is almost unthinkable.

Instead, South Koreans see the current conflict cynically—as a contest between a dictator, Kim Jong Un, and a reality-TV authoritarian, President Trump, both of whom use threats to rile people up in service of keeping power and enhancing their personal wealth. So why play into their hands? One Korean scholar, boasting that his country had impeached its own corrupt and crazy president, Park Geun-hye, back in March, asked me if the United States might follow Korea’s lead and do the same.

Rather than give in to authoritarian madness, locals say it’s better to behave nonchalantly. That’s why South Korea’s new president Moon Jae-In went on vacation after the North launched an intercontinental ballistic missile this summer. The news media reinforces such sanguinity; last week, stories about shake-ups in Korea’s business world, rising Seoul housing prices, the upcoming Winter Olympic Games here in February, and efforts to tackle the social problems of suicide and “spy-camera pornography” got more notice than the possibility of a nuclear exchange.

During a daylong conference I attended on the future of Korean democracy, North Korea got mentioned exactly once. When I asked Monk Ji-Sun—the president of the Korea Democracy Foundation, which protects the history of Korea’s democratization and promotes a more democratic future—about the situation, he argued that the best strategy is to ignore the threats and machinations of other powers and focus on developing the country’s own institutions instead.

To be sure, South Koreans are making some defensive preparations, and even discussing the possibility of nuclear weapons. In August, the government conducted a large-scale civil defense drill, though it wasn’t taken particularly seriously. And I met a few Koreans who admitted to having packed bags at home just in case of attack, with many of the items—cash, identification, water, food, first-aid supplies—that Californians assemble in their own earthquake kits.

One afternoon, I had coffee with Leif-Eric Easley, who grew up in Long Beach and now lives with his wife and two children in Seoul, where he is a professor at Ewha University. An expert on international relations and Northeast Asia, Easley argues that North Korea’s provocations are meant to divide its neighbors, so not rising to the bait is a good strategic response.

Easley says that Koreans stay cool in the face of threats because they understand the situation well, and knowledge reduces fear. But the risk of war is not zero, and he sees a certain desensitization to the war threat. After North Korea’s sixth nuclear test in September, the parks were so full of Koreans enjoying good weather and beer that his family found it hard to find a place to picnic. Part of this lack of fear is generational—the Koreans who remember the horrors of war are dying, he noted.

After our conversation, I walked by the U.S. Embassy, where I encountered competing protests—one a “No Trump Zone” that called for the pursuit of peace with the North, and the removal from South Korea of an American missile defense system known as THAAD (Terminal High Altitude Area Defense). The smaller counter-protest urged a pre-emptive American strike on the North: “You Bomb North Korea. We Support You.”

Both protests were tiny compared to two nearby events. Several hundred Koreans, mostly in their twenties, were attending a job fair, a familiar scene in a wealthy country struggling with high youth unemployment. And a short walk away, in Gwanghwamun Plaza, thousands of young people gathered to watch a rehearsal for an upcoming Olympic-themed concert by the K-pop group Twice.

The nine young women in the group were singing their huge hit, “Cheer Up.” It’s about dealing with an anxious boyfriend who keeps texting his love, escalating in desperation to something that might sound threatening.

But the chorus offers some good advice, to girlfriends and Californians alike: Stay cool and de-escalate the confrontation. “I’ll act calm,” Twice sings, “as if it’s nothing.”

We Shouldn’t Rely on Politicians to Memorialize Our Fallen Soldiers
By Kelly Kennedy | November 10, 2017

Five U.S. infantry soldiers died on June 21, 2007, when their 30-ton Bradley tracked vehicle hit a deep-buried bomb in Adhamiyah, Iraq.

I was embedded as a reporter with their unit when they died, and I watched as the men who served with them rallied.

They reached out to the mothers and fathers and wives, offering and seeking comfort, but also saying what they believed needed to be heard:

It was quick.
We were with them at the end.
We will never forget.

The families often reach back too, spreading wide wings over the men and women left behind in return for stories of their sons and daughters and wives and husbands.

“You can call me ‘mom,’ because he can’t.”
“Tell me again about the time she …. ”

A service member’s bond with a Gold Star family feels profound because it squares so many different contradictions. The relationship is about both loss and presence, about courage and fear, and about a link with the loved one with whom we no longer can connect.

But, as the families and veterans wrap around each other, these tight bonds can exclude those in our communities who haven’t served in the military themselves or who don’t know anyone who serves now. Such exclusion may seem a small point in the immediate context of soldiers and families grieving those they loved. And letting a wider group of people into a tragedy may seem like too much for people who already carry a heavy burden of loss.

But exclusion has real-world consequences for families, communities, and the country as a whole.

How can we grieve for service members we don’t know, but who so completely represent us? How can we support families who don’t convey their grief and experiences beyond those tight bonds? And how, without paying attention to more than just headlines, can we feel the weight of a particular family member’s words, while fully understanding the diversity of that community?

If civilians don’t know about, understand, or feel comfortable reaching out to service members’ families, that can lead to those in the military, and their families, feeling isolated, abandoned, and afraid to speak.

We send people to war, but the contract shouldn’t end with their lives.

Renee Wood-Vincent, whose son Sgt. Ryan Wood died that day in Iraq, said she feels that fallen soldiers can be forgotten, and that there’s a lack of respect and knowledge in the public for what families and members of the military experience. But that also creates an obligation to reach out.

“There’s such a focus on what’s happening to us—it’s all about our sorrow, our problems, our military families—and we aren’t letting people in,” she said.

Letting people in can’t be done alone. It requires civilian leaders who can bridge and connect people. And in the United States, the highest bridge is embodied in one office, the presidency.

That’s why it’s so important that the person occupying that office be able to connect with soldiers and their families.

When the president reaches out to Gold Star families, he speaks for the civilians who made the decision with their votes to send service members to war. Even if a letter or phone call does not bring comfort, it is an acknowledgment of sacrifice for country. It’s why scrutiny of President Trump’s calls with Gold Star families is warranted.

Private First Class Ryan Hill and his mother Shawna Fenison. Photo courtesy of Kelly Kennedy.

But whatever the nature of the president’s words, the most important thing to know is that his words aren’t enough. A president should serve only as a starting point for civilians to reach out. “If I rely on politicians to memorialize Ryan and understand his sacrifice, I’m going to be sorely disappointed,” Wood-Vincent said. “They can empathize, but it’s still a number.”

Wood-Vincent received a letter from President George W. Bush, which she said was enough in a time of war, when the commander-in-chief should be dealing with national issues. So did Shawna Fenison, whose son, Private First Class Ryan Hill, served with Wood in Charlie Company, 1st Battalion, 26th Infantry Regiment. Hill died on January 20, 2007, in Iraq, when a roadside bomb exploded near his Humvee.

But the letter wasn’t enough. “I don’t think the country cares about us and would rather we just go away,” she said.

That feeling represents a failure, and a historic shift. The concept of the Gold Star family began as an invitation for conversation and caring between civilians and military.

During World War I, a family could hang the red-bordered flag with two blue stars in the front window to alert the neighborhood that two sons served overseas.

The neighbors could say, “Heard from your boy?” or “Where’s he fighting?”

If one of those stars turned to gold, the conversation changed.

“I’m sorry for your loss.”
“Thank you for your sacrifice.”

But as wars turned more political and became less of a community effort—during World Wars I and II, most families had relatives or friends serving—the conversation changed again. Some families folded their flags when they felt they brought unwanted attention during the Vietnam War. In recent wars, the rarity of the flags offered reminders of just how few people served. About 7 percent of Americans have served in the military, and less than 1 percent serve now in our all-volunteer armed forces.

Both Fenison and Wood-Vincent were initially showered with gifts: flags. Artwork anonymously sent in the mail. Letters from strangers. “I have a large, supportive family,” Wood-Vincent said. “My neighborhood happens to be very military.” At work, people knew her son and offered condolences. People told her they fly the American flag for her son.

“We had a neighbor who came down every night for two years and prayed over our home,” she said. “I had never met her. I would see her out there in the summertime, in the wintertime, standing in the rain with her little dog.”

People still leave things on her porch.

“It may be the part of the country I’m in, or the neighborhood,” she said. “But part of it is the people I’ve surrounded myself with.”

Fenison had a similar experience, at first. But then, as people moved on with their lives or grew disenchanted with the wars, they encouraged her to stop talking about her son, to take down the “shrine” she’d assembled in her home that included her son’s pictures and the flag from his coffin.

“When I talk about Ryan, many will change the subject or give me the look of ‘Here she goes again,’ so I find myself withdrawing more and more,” she said. “Communities are good about honoring on Memorial Day with their token events, but it pretty much stops there.

“While my world has stopped, the rest has moved on.”

The families ache for the engagement—for someone to care. For someone to mourn their losses. For someone to look up Niger on a map and not only think about what it might have felt like to be doing what you, yes, signed up for and loved—but also to contemplate the terror and heartbreak for service members, friends, and families.

Those flags should serve as a call to action: This family’s sacrifice represents you. Gather them up. Listen to their stories.

“It’s much more complicated than people know,” Wood-Vincent said.

“On one hand, I’m a mother who lost a child no matter how he was taken from the world. I’m not thinking of him as a soldier.”

But then she explains to strangers how he died.

“People will say, ‘Oh, what a shame. What a waste,’” she said. “Don’t assume I feel the same.”

Sometimes, she said, she gets angry and wants to walk away. Other times, she reminds herself that she can’t be mad about people’s ignorance about proper responses or “Gold Star” moms if she’s not helping to educate them.

“I’ll think, ‘That person just made me so angry,’” she said. “Why? Well, my son’s loss was not a waste. Give me 10 seconds in the parking lot to tell you why. If someone sees your Gold Star plate and says, ‘What is that?’, you don’t say, ‘Hey. You’re an idiot. You should know.’”

She sees her personal call to action as part of that big conversation. Every summer, she invites her son’s brothers in arms to a reunion. Her family created a scholarship through the university to celebrate his art—punk-rock drawings that expressed convictions about being different and doing your part to save the world. And she has told his story at several events.

She makes sure people know and remember him, and through that, she closes the divide.

She believes that communities can, too. Local organizations can invite in Gold Star family members. They can form community partnerships—Boy Scouts who adopt families, or Junior Leaguers who organize lunches, or schools that bring Gold Star alumni in as speakers. Communities can organize town halls about what families need—even if that need is simply relaying kind questions to ask. Leaders can ensure families are remembered beyond Memorial Day.

And Gold Star families have to be willing to accept those invitations.

“You’ve got to open yourself,” Wood-Vincent said. “They’ll never completely understand, and thank God for that. But they will never understand if we don’t invite them in.”

The “Crying Indian” Ad That Fooled the Environmental Movement
By Finis Dunaway | November 9, 2017

It’s probably the most famous tear in American history: Iron Eyes Cody, an actor in Native American garb, paddles a birch bark canoe on water that seems, at first, tranquil and pristine, but that becomes increasingly polluted along his journey. He pulls his boat ashore and walks toward a bustling freeway. As the lone Indian ponders the polluted landscape, a passenger hurls a paper bag out a car window. The bag bursts on the ground, scattering fast-food wrappers all over the Indian’s beaded moccasins. In a stern voice, the narrator comments: “Some people have a deep, abiding respect for the natural beauty that was once this country. And some people don’t.” The camera zooms in on Iron Eyes Cody’s face to reveal a single tear falling, ever so slowly, down his cheek.

Cody’s tear made its television debut in 1971 at the close of a public service advertisement for the anti-litter organization Keep America Beautiful. Appearing in languid motion on TV over and over again during the 1970s, the tear also circulated in other media, stilled on billboards and print ads, forever fixing the image of Iron Eyes Cody as the Crying Indian. The ad won many prizes and is still ranked as one of the best commercials of all time. By the mid-1970s, an Advertising Council official noted, “TV stations have continually asked for replacement films” of the commercial, “because they have literally worn out the originals from the constant showings.” For many Americans, the Crying Indian became the quintessential symbol of environmental idealism. But a closer examination of the ad reveals that neither the tear nor the sentiment was what it seemed to be.

The campaign was based on many duplicities. The first of them was that Iron Eyes Cody was actually born Espera De Corti—an Italian-American who played Indian both in his life and on screen. The commercial’s impact hinged on the emotional authenticity of the Crying Indian’s tear. In promoting this symbol, Keep America Beautiful (KAB) was trying to piggyback on the counterculture’s embrace of Indian-ness as a more authentic identity than commercial culture.

The second duplicity was that KAB was composed of leading beverage and packaging corporations. Not only were they the very essence of what the counterculture was against; they were also staunchly opposed to many environmental initiatives.

KAB was founded in 1953 by the American Can Company and the Owens-Illinois Glass Company, who were later joined by the likes of Coca-Cola and the Dixie Cup Company. During the 1960s, KAB anti-litter campaigns featured Susan Spotless, a white girl who wore a spotless white dress and pointed her accusatory finger at pieces of trash heedlessly dropped by her parents. The campaign used the wagging finger of a child to condemn individuals for being bad parents, irresponsible citizens, and unpatriotic Americans. But by 1971, Susan Spotless no longer captured the zeitgeist of the burgeoning environmental movement and rising concerns about pollution.

The shift from KAB’s bland admonishments about litter to the Crying Indian did not represent an embrace of ecological values but instead indicated industry’s fear of them. In the time leading up to the first Earth Day in 1970, environmental demonstrations across the United States focused on the issue of throwaway containers. All these protests held industry—not consumers—responsible for the proliferation of disposable items that depleted natural resources and created a solid waste crisis. Enter the Crying Indian, a new public relations effort that incorporated ecological values but deflected attention from beverage and packaging industry practices.

KAB practiced a sly form of propaganda. Since the corporations behind the campaign never publicized their involvement, audiences assumed that KAB was a disinterested party. The Crying Indian provided the guilt-inducing tear KAB needed to propagandize without seeming propagandistic and countered the claims of a political movement without seeming political. At the moment the tear appears, the narrator, in a baritone voice, intones: “People start pollution. People can stop it.” By making individual viewers feel guilty and responsible for the polluted environment, the ad deflected the question of responsibility away from corporations and placed it entirely in the realm of individual action, concealing the role of industry in polluting the landscape.

When the ad debuted, KAB enjoyed the support of mainstream environmental groups, including the National Audubon Society and the Sierra Club. But these organizations soon resigned from its advisory council over an important environmental debate of the 1970s: efforts to pass “bottle bills,” legislation that would require soft drink and beer producers to sell, as they had until quite recently, their beverages in reusable containers. The shift to the throwaway was responsible, in part, for the rising levels of litter that KAB publicized, but also, as environmentalists emphasized, for the mining of vast quantities of natural resources, the production of various kinds of pollution, and the generation of tremendous amounts of solid waste. The KAB leadership lined up against the bottle bills, going so far, in one case, as to label supporters of such legislation as “Communists.”

We can still see the impact of the Crying Indian campaign today in mainstream portrayals of environmentalism that prioritize the personal over the political. The answer to pollution, as KAB would have it, had nothing to do with power, politics, or production decisions; it was simply a matter of how individuals acted in their daily lives. Ever since the first Earth Day, the mainstream media have repeatedly turned big systemic problems into questions of individual responsibility. Too often, individual actions like recycling and green consumerism have provided Americans with a therapeutic dose of environmental hope that fails to address our underlying issues.

Iron Eyes Cody (right) at a Keep America Beautiful awards ceremony with Leland C. Barbeur, president of the Fayetteville, N.C., County Youth Council, and Miss Teenage America Cathy Durden, in Washington, D.C. on Dec. 5, 1975. Photo courtesy of Associated Press.

But there is a final way that the commercial distorted reality. In the ad, the time-traveling Indian paddled his canoe out of the distant past, appearing as a visual relic of indigenous people who had supposedly vanished from the continent. He was presented as an anachronism who did not belong in the picture.

One of the commercial’s striking ironies is that Iron Eyes Cody became the Crying Indian at the same moment that actual Indians occupied Alcatraz Island in San Francisco Bay, the very same body of water in which the actor paddled his canoe. For almost two years, from late 1969 through mid-1971, a period that overlapped with both the filming and release of the Crying Indian commercial, indigenous activists demanded that the U.S. government cede control of the abandoned island. They presented themselves not as past-tense Indians, but as coeval citizens laying claim to the land. The Alcatraz activists sought to challenge the legacies of colonialism and contest contemporary injustices—to address, in other words, the realities of native lives erased by the anachronistic Indians who typically populate Hollywood film. By contrast, the Crying Indian appears completely powerless. In the commercial, all he can do is lament the land his people lost.

In recent years, the large-scale organizing and protests against the Keystone XL Pipeline, the Dakota Access Pipeline, and other fossil fuel development projects all represent a powerful rejection of the Crying Indian. While the Crying Indian appeared as a ghost from the past who erased the presence of actual Indians from the landscape, these activists have visibly proposed structural solutions for the environment while demanding indigenous land rights. Moving beyond individual-driven messages, they cast off static symbols of the past to envision a just and sustainable future.

California’s Fear of High-Rise Living Is Blocking Our View of the Future
By Joe Mathews | Mon, 30 Oct 2017
http://www.zocalopublicsquare.org/2017/10/30/californias-fear-high-rise-living-blocking-view-future/inquiries/connecting-california/

Want to spook your neighbors this Halloween? Don’t bother with big displays of goblins, ghouls, or ghosts. Instead, just decorate your door with a picture of an eight-story apartment building.

Californians are famously fearless in most things. We devote ourselves to extreme outdoor sports, buy homes near earthquake faults, and launch startups and make TV pilots against all odds. But in the face of tall buildings, especially multi-family residential high-rises, we turn into a bunch of scaredy-cats.

This statewide acrophobia has fueled a historic housing shortage that cuts into our incomes, holds back our economy, drives up homelessness, and forces us into long, unhealthy commutes. In other words, it’s downright frightening what our fears are doing to our future.

The case for taller buildings, especially in urban areas, is strong. Taller development creates badly needed housing on a smaller footprint, and thus the population density to support robust public transportation and thriving retail corridors. It also preserves open land for parks and agriculture. And, if the housing is near transit and job centers, it reduces traffic and pollution.

But California is laced with Munsters-era zoning codes that make tall, dense, or multi-unit buildings illegal in many neighborhoods. And, even where we permit residential towers, our frighteningly complicated processes for approving housing create so much litigation and so many delays that taller buildings become too expensive to finance.

To their credit, both cities and developers across the state have been advancing proposals for taller buildings, often in the dense centers where they’re needed. But smart plans are little match for the collective acrophobia of Californians.

If you dare, you can witness the plague of height fears right now in Long Beach, where citizens are revolting against a thoughtful city effort to update a three-decade-old land use plan to accommodate taller buildings. (This, in a place that is crippled, mummy-like, by a housing shortage.)

In Santa Monica, a new Expo Line rail connection should be encouraging taller development, but longtime residents, afflicted with the most haunting case of vertigo since Jimmy Stewart starred in the Alfred Hitchcock classic film of that name, oppose it. And in Hollywood, attempts to adopt a new community plan to accommodate high-rises have been blocked by relentless opposition from neighbors, judges, and ballot measures filed by a powerful nonprofit executive angry at what new towers might do to his views.

No California place is more firmly in the grip of the terror of the tall than Oakland. In response to sky-high housing prices that are displacing longtime residents, the city has permitted construction of thousands of units. But good luck getting them built in neighborhoods outside of downtown. Oakland has a backlog of 18,000 approved but as yet unbuilt units in its pipeline, many of which would be in taller buildings near transit.

How to stop the fear? The late Frank Herbert, a Northern California journalist better known for his Dune novels, wrote: “Fear is the mind-killer. Fear is the little-death that brings total obliteration. I will face my fear. I will permit it to pass over me and through me.”

When it comes to tall buildings, we should take his advice and face our fears. Instead, Californians construct elaborate and self-deceiving justifications for their acrophobic anxieties.

We tell ourselves that earthquakes make taller buildings less safe—even though studies show that poorly designed smaller buildings, subjected to strong ground motions, are more likely to collapse and hurt us. We focus too much on the upfront costs of high-rise multi-family buildings—which are more expensive to build than single-family homes because they require stronger materials—and ignore all the hidden costs of our fixation with single-family housing.

And we are highly selective in our fears. We fear the apartment building in our flat downtown more than the elaborate home surrounded by brush on a hill. We block tall buildings in our town centers because we worry about new crowds of new people—claustrophobia and xenophobia are cousins to our fear of heights—and then complain about all the resulting freeway traffic at rush hour. We oppose new housing on the grounds that it will change the character of our neighborhoods, and then lament the appearance of homeless encampments down the street.

Our fears literally distort our vision. Many Californians oppose tall and thin buildings, even though they are actually better for our views.

During a recent conversation, a prominent L.A. architect took out his smartphone. Holding the thin phone vertically, he explained that he could design a tall and thin building that is easy to see around. Then he turned the phone horizontally: because of our fear of heights, he said, buildings are often made much shorter and squatter, effectively becoming walls that block the views of more people.

“We’re building a bunch of fat boys,” he lamented.

Which is too bad, because the few areas of California with high-rise housing are successes. Fear somehow has blinded us to the vibrancy of high-rise-heavy precincts in the downtowns of L.A. and San Diego. And the benefits of pursuing a taller, denser housing future, particularly in coastal urban areas, would be considerable: higher annual economic growth, more tax revenue, and fewer greenhouse gases.

But it’s hard to have a conversation about this fear when there is so much else for Californians to be afraid about now, from a spike in property crime in the state to the nuclear-armed madmen who run Pyongyang and Washington.

The current gubernatorial campaign might provide an opportunity for reassessing the altitude of our development. The top two contenders, Gavin Newsom and Antonio Villaraigosa, pushed to make their cities more vertical when they were mayors of San Francisco and Los Angeles, respectively.

But both got hammered for doing so. And now the two former mayors are surrounded by protective political professionals who, when it comes to the highest and hardest issues, are perpetually scared to death.

When You Live Online, Will Anyone Know When You Die?
By Emma Electra Jones | Tue, 12 Sep 2017
http://www.zocalopublicsquare.org/2017/09/12/live-online-will-anyone-know-die/ideas/nexus/

I suspected that something was wrong on the Sunday morning when I saw the beginning of a Facebook post in my newsfeed sidebar that said, in French, “Our dear AJ has given up …” I was unable to read the rest because it was removed as I looked at it, but I was concerned that it might actually mean that AJ was hurt or in trouble.

It could have said something like, “AJ has given up his studies,” because the French wording is similar for all scenarios. That particular formulation is most often used in reference to death, true. But on this quiet Sunday morning, when I was about to go grocery shopping, I found the concept outrageous: AJ was simply too real to me to turn up dead online.

AJ loved the L.A. Zoo, Donald Duck orange juice, and cats. He often held his arms close to his chest, like a T-Rex. He was sarcastic. He could play the guitar, and would sometimes play mine, but he didn’t sing. He was very much alive in my mind.

I texted Fergus, a mutual friend from high school: "AJ is fine, right? He didn't kill himself or anything." Fergus texted back: "I don't think he killed himself. Nico got a Snapchat from him the other day." The tone was mocking. I didn't text AJ for this very reason; I was so sure that he was alive that I thought he, like Fergus, would make fun of me for being worried that he was dead.

So now all was presumably well. We had Facebook, we had Snapchat, we knew what was going on! I responded with “Ok yay, glad AJ is still alive!” and Fergus and I continued to debate whether pancakes or waffles were tastier. Something significant and heart-wrenching had happened, but it quickly disappeared, lost in the way social media collapses all distinction between the trivial and the profound.

The author (left) and AJ. Photo courtesy of Emma Electra Jones.

I met AJ sometime in the fifth grade. At first we weren’t friends. In fact, I hated him for a good portion of the sixth and seventh grades in that irrational way that children sometimes hate one another. Eventually I no longer found him an awful person to be around, and we were good friends throughout high school. When we went off to college we carried on our friendship online—the way my friends and I do almost everything. But an unforeseen hazard of living life online is that it confers a kind of immortality that doesn’t square with real life—and certainly not real death.

Around 11 p.m. that same Sunday, I was cooking chicken-less chicken nuggets for a group of friends in my New York apartment when I went to my bedroom to grab my phone. There was a text: "Emma, AJ did kill himself. I'm sorry if you wake up to this." Fergus had just gotten off the phone with AJ's mother. Later, I would find out that he had died two days earlier after overdosing on heroin. I would learn that he was an addict and had overdosed once before. I would remember that he'd dabbled with drugs while we were in high school. But the moment that I read those words, the only thought that came to mind was of being 16, out past curfew, parked somewhere in the Hollywood Hills. As I collapsed in the hallway of my apartment, weeping, I kept repeating: "Oh my god, I kissed him in the back of a truck and now he's dead."

I had some previous experience with death: the passing of a great uncle. The sensations that I was having now, though, were new and horrible. This death made me frantic. Finding out, without any human contact, not even the sound of another person's voice, that a close friend has died is disorienting. For me there was no ritual to delineate when he had died and when he had lived—no sheet pulled over the face, no pennies on the eyelids, no pronouncement and silence.

That night everyone who knew him was bewildered. We exchanged shocked online messages. Was this real? Could these texts be trusted? Because that’s all anyone was getting: texts. The mother of a friend of mine was waiting to tell her children until she had “more evidence.”

But in the end, it didn’t matter, because there wasn’t any. There were only those same words sent over and over again from person to person: “AJ is dead.”

My relationship with AJ was always somewhere between friendly and flirtatious, and it reached an apex the summer before we became seniors in high school, when we watched a lot of movies, kissed in a truck; I even had dinner with his parents. Maybe for a second we almost were something more, and then we weren't. But that was okay; our relationship had always been fluid and easy. I trusted AJ. Even though I knew he liked drugs, and would often do them at parties, I never worried that he'd take things too far. He once told me that he would never do heroin "because I know I'd like it too much."

But of course that wasn’t what happened. Fifty thousand Americans died from drug overdoses in 2016, and AJ was one of them. As soon as Fergus texted me the news I wanted to call him to find out how AJ had gotten involved with opioids. I wanted to know whether everybody else was aware of his drug problem and I was just in the dark. I wanted to understand how in the world this could have happened.

But Fergus lived in Canada and didn't have a phone plan. It would cost him a fortune to talk on the telephone. I realize how irrational that sounds—Your friend died and you were concerned with phone plans? Or: What about Skype? Maybe it was also easier not to call. Maybe I appreciated the distance that technology gave us. I knew that a call would reflect my own shock and sadness back at me, and I didn't want to stare through a screen at someone who I knew felt as hollow as I did.

And calling would also have made it real, as though I’d killed him. He wasn’t really dead yet. Not if I let the internet stand between me and his death.

After he died, I was alone in Manhattan, texting people and reading Facebook messages filled with condolences. I found out about the date of his funeral in a Facebook post from his mom. So while all of this public grief was unfolding online, each of us was experiencing it alone, tucked away in our separate corners of the world.

Later I got a message that AJ had “liked” one of my photos, but in fact his mother had taken over his Facebook page and it was no longer him, though it seemed to be. I still get the occasional notification that AJ has posted something on his wall or that he is online. This always feels existentially wrong, a dead person spending time making Facebook posts.

The first time I felt genuinely better after his death was when I flew home to L.A. for the funeral. I spent the whole weekend in a cluster of friends, and was alone for no more than two hours the entire time. We really “shared” memories and stories. We held one another. We cried. We also went go-cart racing and ate garlic fries. We existed together in a way that was impossible over social media. We couldn’t plan what to say or how to express ourselves, as you can in online forums, and I think we suffered less because of it. We got to experience the honesty and relief of laying our grief bare to one another.

And, perhaps most importantly, we could all see, plainly, that AJ was not there.
