Our Revelatory Culinary Road Trip Through the New South
By Ashli Q. Stokes and Wendy Atkins-Sayre | October 2, 2017

It was New Year’s Day in Charlotte, North Carolina, and seemingly half of Mecklenburg County had come to the K&W Cafeteria for black-eyed peas, greens, and hog jowls—foods to bring good luck for the year ahead. The Formica tables were packed with local ladies in their fancy hats, college kids, tired families, and business folks in suits, all snaking slowly through a winding line to order.

We were at the K&W reflecting on a year-long mission to understand how Southern food shaped Southern identity. Both of us are academics at Southern universities, and we had come to our project with an interest in how rhetoric plays a role in shaping Southern identity—that is, how words and symbols send messages about how we see ourselves and others, our opinions, and our actions. We also both grew up in the South. Ashli is a Virginian; Wendy, a Texan. As adults, we chose to stay here to work and raise our families, knowing that the South would always feel like our only true home.

Exploring how Southern food communicates, playing a vital role in shaping and creating the identities of Southerners today, was a dream assignment—a rare chance to combine our work with pleasure. Food is a vital cultural component. It acts rhetorically by articulating identities and inviting individuals to embrace those identities. Analyzing food in the South, we believed, would help us better understand this culture we loved, in a broad-sweeping and universal kind of way. After all, everyone eats. Southerners, in particular, take a great deal of pride in family recipes, and have strong opinions about the regional cuisine. It is easy to get people to talk “Southern food-ese,” whether they are originally from the region or not.

Our travels took us from Charlotte, North Carolina to Birmingham, Alabama, and from Memphis, Tennessee to New Orleans, Louisiana. We prepared maps and agendas, wrote interview questions and brought note-taking tools, scheduled meetings, and made reservations. Before we took off, friends and family asked us which spots made the list for the “authentic Southern food experience.” Would our itinerary include Athens, Georgia’s Weaver D’s, Charleston’s Husk, or Austin’s Franklin’s Barbecue? Or were we going to stop by a Cracker Barrel in every town?

A shrimp po’ boy, and patrons, at local favorite Domilise’s, New Orleans, Louisiana. Photo by Wendy Atkins-Sayre.

This second question was generally a joke, a knowing nod to the roadside restaurant chain’s often-derided faux-Southern decor and mediocre food. There would be no Cracker Barrels on our eating tour, we assured them. We knew “real Southern” fare when we saw it. Indeed, we would be frequenting places like Domilise’s, the legendary dive in uptown New Orleans known for its po’ boy sandwiches. The current explosion of writings on Southern food might not yield a uniform definition, but it almost always emphasizes locally grown produce, ingredients that were readily available and inexpensive, and a strong connection to the land. We would follow the lead of those in the know.

We did go to Domilise’s, and it did not disappoint. We waited 20 minutes, in wilting humidity, for the famous crusty French bread sandwiches, stuffed with fried shrimp, oysters, or roast beef. Photos, autographs, and beer bottles from long-shuttered breweries adorned the walls. The bar was weathered by the touch of thousands of visitors. Diners chatted away, happily greeting familiar faces behind the counters. It was obvious why folks in a town full of great food were so devoted to this spot. The food not only brought them to the same location, but also connected them through shared sensual experiences—the sights and sounds of the neighborhood joint, the smells and tastes of the simple meal so closely tied to the region.

Domilise’s was the real deal. But the scene at K&W, and many other eateries like it, seemed to be, too. Whether Grandma cooked the hog jowls or not, eating that particular dish, in that particular place, was necessary for the Charlotte crowd to start the year successfully. Being there, we were participating in an important community ritual. This gave us pause. Like Cracker Barrel, K&W is a chain, with dozens of locations, in several states. There is little, if anything, artisanal about it. But it was apparent that its patrons on New Year’s Day were having a “real Southern” food experience. Were chain restaurants necessarily inauthentic expressions of Southern identity, as some food critics—not to mention so many of our friends—seemed to think?

The more we traveled, the more we began to wonder. We also began to question our attempts at scholarly remove. Getting mired in details was distracting us. This epiphany dawned at a chicken shack place in Middle Georgia that sat next to an old gas station. We fumbled with greasy fingers to record our experiences, getting odd looks from fellow customers and the women behind the counter—the very people we wanted to connect with. In Atlanta our food tour schedule was packed with too many restaurants, and the places we had chosen to visit seemed too controlled. So we stopped relying on critics’ lists of places we needed to visit, chefs we needed to watch, foodie experiences we needed to have. We decided just to eat, to listen, and to let the whole weird experience wash over us. There were more spontaneous stops along the roadside, and fewer reservation-heavy New South restaurants.

Fundamental questions started to emerge. What the heck was Southern food? And what, for that matter, was being Southern? We had started with a sense of what made a dish “real,” and had set out to find eating experiences that fit our definition. When our approach changed, our definition of authenticity began to shift, too. We saw many Souths, many kinds of Southerners, and many foods and experiences that could count as “real.” Was a Southern meal defined by particular foods, like BBQ, fried chicken, and grits? We had imagined our research would focus on discovering the boundaries circumscribing Southern cuisine, but instead we found a food culture in flux. Southern food isn’t confined to a fixed territory; its reach extends beyond the traditional South, and its boundaries can be hard to pin down. Ashli’s definition of real Southern barbecue is the vinegar-y (with a touch of tomato) pulled pork you find in Piedmont North Carolina. Wendy’s is beef brisket and sausages. Appalachian soup beans are very different from shrimp and grits. All of these foods are Southern.

Collard greens, macaroni and cheese, and carrots served up at Homestyle, housed in a trailer outside of Hattiesburg, Mississippi. Photo by Wendy Atkins-Sayre.

So, we learned, is chorizo. In Charlotte, North Carolina, a Sav-Way grocery in one formerly industrial neighborhood now serves an immensely popular torta made with the sausage, at a lunch counter in the back. An expanding immigrant community in the South has brought countless additions to old recipes. Hummus, a traditionally Middle Eastern dish, has found its way to highbrow Southern tables in the form of mashed boiled peanuts, field peas, or Lima beans. Sandra Gutierrez, food writer and cookbook author, draws on Latin American traditions and adds a Southern twist. Chiles rellenos can be stuffed with pimiento cheese for a Southern flair.

There’s a long tradition of Southerners incorporating into their food ingredients that are abundantly available but not necessarily “authentic” to the region, such as casserole staples like Velveeta cheese and cream of mushroom soup. The Stokes family’s holiday shrimp mold, combining crustaceans with canned tomato soup, gelatin, celery, onion, and cream cheese, is one such example of old-school Southern food with alien origins. As Ashli’s mother-in-law explained, it has to be served with Ritz crackers and it has to be served at Christmas. It does not matter that combining shrimp with gelatin might seem weird. Chorizo in Charlotte may one day be the same kind of thing. Oddball dishes become Southern food when they make their way into family traditions and stories. They send messages about family, about togetherness, about defining who you are in your corner of the world.

We found ourselves moving away from concentrating on the boundaries of Southern food and instead letting the food speak for itself, experiencing everything from the Homestyle Restaurant, housed in a trailer home in the piney woods outside of Hattiesburg, Mississippi, to a Popeye’s franchise in Jennings, Louisiana. We learned to let the people we met define the food for us. It became less important to define Southern cuisine and more important to embrace all that is—and wants to be—Southern. Southern food more broadly defined gives us all something to discuss, and even celebrate.

Some have opined that Southern food is in danger of extinction because of changes in the region, including the proliferation of chain restaurants and the suburbanization of so many Southern towns. But our travels revealed a vibrant and evolving regional food culture that brings people together, whether through a shared love of certain dishes or even heated arguments inspired by food loyalties. That Southern cuisine is moving forward by blending its own traditional ingredients and methods with those of other cultures, regions, and times—and even taking root in chain restaurants—is a nod to its roots and a sign of breaking ground, a demonstration of how the South itself can become stronger while also changing.

Southern food may, in fact, both reflect current regional changes and lead the way in crafting a new South, one whose culture echoes its changing people—more inclusive, more diverse, and more hopeful, influenced by influxes of immigrants and by newly fast-paced Southern cities. Optimistic, yes, but possible. The South is still here, but maybe not the one you, or we, were expecting.

How the South Recast Defeat as Victory with an Army of Stone Soldiers
By Gaines M. Foster | September 28, 2017

Monuments to Robert E. Lee and other Confederate leaders have long been controversial, but monuments to nameless Confederate soldiers, those lone stone figures in public places, are far more common and have long served as an iconic symbol of the South. Understanding the origins of these stone soldiers who still loom over present-day towns and cities may help us better understand current controversies over them.

The white South began to erect soldiers’ monuments soon after the Confederacy’s defeat. In the first two decades after the war, communities most often chose a simple obelisk or other monument of funeral design and placed it in a cemetery. Former Confederates thereby mourned their dead and memorialized their cause. Even in the early years after the war, though, some monuments featured a sculpture of a soldier and occupied a more public place—a practice that increased over the next two decades.

The vast majority of Confederate monuments were erected between 1890 and 1912, and most of these consisted of a single soldier, with his hands folded over the top of his rifle’s barrel and with its stock resting on the ground. Typically, the soldier stood atop a column on the courthouse lawn or some other central public space. These statues hardly seemed martial, much less ready to attack. Indeed, they looked surprisingly calm and at ease. They did not always face north, as folklore has it, but rather whichever way the courthouse faced.

An advertisement for soldier monuments in the magazine, Confederate Veteran. Image courtesy of Gaines Foster.

The origins and purposes of these monuments to the common Confederate soldier are complex. They resulted, in part, from a commercial campaign. Monument companies advertised in veterans’ magazines and hired agents to travel the South. They offered credit terms (lest the veterans die before a town could raise the money for a memorial) and, in one ad, even offered a free marble breadboard to the secretary of any United Daughters of the Confederacy chapter that ordered a monument.

The companies, though, were exploiting an important cultural movement. Putting up soldiers’ monuments was a central ritual of the Lost Cause, a shorthand term for an organized attempt by the Daughters, Confederate veterans, and many other white Southerners to shape the memory of the Civil War. Southern whites erected Confederate soldier monuments for at least three interrelated reasons.

The leaders of the Lost Cause first sought to honor the veterans of the war. The monuments expressed white society’s appreciation and respect for the soldiers’ wartime sacrifice, constituting a more profound and permanent version of today’s off-hand “thank you for your service.” The monuments also reassured the veterans that, despite losing on the battlefield, they had fought honorably and well—and for the noblest of reasons.

The Lost Cause and the monuments that emerged from it also sought to vindicate the Confederacy itself. The white South’s memory of the war claimed that soldiers fought for states’ rights and the defense of their homes and families. The Lost Cause also proclaimed secession to be legal, denied the centrality of slavery to the war, ignored the evil inherent in the South’s peculiar institution, and over time romanticized it. The monuments thereby celebrate not just the veterans but the Confederacy and, despite the attempt to deny it, its cause—slavery.

Finally, although they celebrated the Confederacy, the monuments and the Lost Cause were as much about the present as the past. In honoring the faithful soldier, the Lost Cause’s leaders made him a model for the lower classes in a turbulent period of change in the South and the nation.

The erection of the monuments followed the populist revolt and widespread labor unrest. The soldier statues were a reminder that, as during the war—when Confederate soldiers loyally followed aristocratic leaders like Lee into battle—the middle and lower classes should be loyal to a hierarchical society. The Lost Cause thereby offered a vision of the “proper” social order, one in which the lower classes deferred to leaders, women proved loyal to men, and African Americans remained subservient to whites. In the same decades in which most of the soldiers’ monuments went up, the white South created a repressive racial order based on segregation, disfranchisement, lynching, and other forms of white racial violence.

The story of the Lost Cause’s monuments to the Confederate soldier reveals the difficulty of knowing how to honor soldiers’ sacrifices without embracing or even justifying their cause—a problem also faced by later generations of Americans struggling over some subsequent wars. It shows that monuments emerge more from memory—an attempt to shape the past—than from the history that actually happened. And, in the midst of a public debate over Confederate monuments, it reminds us that memory and its symbols have less to say about history and more to proclaim about the shape of society in the present and the future.

How Bullwinkle Helped Us Laugh Off Nuclear Annihilation
By Beth Daniels | September 25, 2017

“Mr. Chairman, I am against all foreign aid, especially to places like Hawaii and Alaska,” says Senator Fussmussen from the floor of a cartoon Senate in 1962. In the visitors’ gallery, Russian agents Boris Badenov and Natasha Fatale are deciding whether to use their secret “Goof Gas” gun to turn the Congress stupid, as they did to all the rocket scientists and professors in the last episode of Bullwinkle.

Another senator wants to raise taxes on everyone under the age of 67. He, of course, is 68. Yet a third stands up to demand, “We’ve got to get the government out of government!” The Pottsylvanian spies decide their weapon is unnecessary: Congress is already ignorant, corrupt, and feckless.

Hahahahaha. Oh, Washington.

That joke was a wheeze half a century ago, a cornball classic that demonstrates the essential charm of the Adventures of Rocky and Bullwinkle and Friends, the cartoon show that originally aired between 1959 and 1964 about a moose and a squirrel navigating Cold War politics.

High-flyin’ duo: Giant balloons of Rocky and Bullwinkle soar over Broadway in Manhattan during the Macy’s Thanksgiving Day Parade, Nov. 28, 1996. Photo by Doug Kanter/Associated Press.

I’ve been wistful about the show of late, as I’m sure many of my generation are. Last month, we lost the great June Foray, the voice of Rocky the Flying Squirrel and many others. Her passing gave me pause to reflect on how important the show was during my formative years and how far-reaching its influence is on satire today. Bullwinkle was, like so many of the really good cartoons, technically before my time (I was born the year it ended). My sister and I caught it in syndication as part of our regular weekend cartoon lineup of Looney Tunes, Jonny Quest, and The Jetsons, from elementary through high school.

It wasn’t that Bullwinkle the character was especially compelling. He was an affable doofus with a loyal heart, if limited brainpower. Rocky was the more intelligent straight man: a less hostile Abbott to Bullwinkle’s more secure Costello. They were earnest do-gooders who took every obviously shady setup at face value. Their enemies were far cleverer, better resourced, and infinitely more cunning, but Rocky and Bullwinkle always prevailed. Always. For absolutely no good reason. It was a sendup of every Horatio Alger, Tom Swift, plucky-American-hero-wins-against-all-odds story ever made.

What we didn’t know in the ’70s, when we were watching, was that this was pretty subversive stuff for a children’s program made at the height of the Cold War. Watching this dumb moose and his rodent pal continually prevail against well-funded human saboteurs gave me pause to consider, even as a kid, that perhaps it is a silly idea to believe that just because we’re the good guys we should always expect to win.

The animation was stiff but sweet, the puns plentiful and painful. The show poked fun at radio, television, and movie tropes, and took playful aim at Cold War spycraft. Part of the fun was that Bullwinkle wasn’t a regular cartoon, but an animated half-hour variety show. And variety shows used to be so much of a thing that I am stunned there is no niche cable network devoted to them today.

Every episode of the Bullwinkle show featured two cliffhanger segments in the adventures of Bullwinkle J. Moose and Rocket J. Squirrel, pitted against master spies Boris and Natasha, all narrated breathlessly by erstwhile radio star William Conrad. Between each serial installment were stand-alone features, including Peabody’s Improbable History, wherein Mr. Peabody, a genius dog, and his pet boy, Sherman, travel through time to make terrible puns; Fractured Fairy Tales, updated twists on Grimm Brothers classics; Dudley Do-Right, a parody of silent melodramas starring a cleft-chinned Canadian Mountie; and Aesop & Son, modernized versions of Aesop’s fables as told by Charlie Ruggles, star of silent and classic films. Other features included Bullwinkle’s Corner, an over-enunciated poetry reading, and Mr. Know-It-All, in which Bullwinkle tries and fails to teach us something.

Tom Lehrer’s topical, bitingly satirical songs exemplified a dark vein of humor that ran through the Eisenhower-Kennedy era. Image courtesy of Lawrence/Flickr.

The variety show format enabled three things. First, its gloss of adult sophistication completely undercut by silliness was incredibly attractive to me and my sister. Secondly, it got us to delight in the work of a revolving cast of top-notch, old school voice actors who’d grown up in radio and knew how to sell a line. June Foray, for example, is the common thread that weaves together the everyman fast-talkers of Warner Bros. films (she voiced Granny and Witch Hazel for Looney Tunes), the pop culture and political satire of Stan Freberg, and the Cold War kiddie fare of Bullwinkle (as Rocky, Nell Fenwick, Natasha, and more).

Fractured Fairy Tales were narrated by veteran actor Edward Everett Horton, a Warner Bros. stable favorite, and featured Daws Butler (Elroy Jetson), a Stan Freberg comedy show veteran, along with Paul Frees and June Foray. Before giving voice to Dudley Do-Right’s nemesis Snidely Whiplash, Hans Conried was better known as Captain Hook in Disney’s Peter Pan, as well as for his years of yeoman’s work on radio mystery shows, I Love Lucy, and Burns and Allen.

Finally, the show’s format and depth of talent connected my sister and me to a world of comedy that was well before our time but helped us navigate what came afterwards. Apart from Sesame Street and The Electric Company (whose cast was a gift to future Broadway lovers), the cartoon landscape during the 1970s was bleak. I don’t know what happened during the Summer of Love to cause formerly respectable shops like Hanna-Barbera to go from Jonny Quest to Captain Caveman and the Teen Angels, but it can’t have been pretty. In those grim years when cable was not yet available to the common man and one had to physically get up to change the channel (or make one’s sister do it), we relied on three networks, a local PBS affiliate, and a couple of random UHF stations for our home entertainment. By setting the contemporary junk fare right up against reruns of infinitely better material, regular television gave my sister and me a great education in quality satire, voice recognition, and genius parody.

There was also the added benefit of our mother’s healthy collection of comedy albums—Stan Freberg, Tom Lehrer, Nichols & May, and vintage Woody Allen—all of which are of the same era as Bullwinkle and feature some of the same performers. My parents and these comedians belong to the so-called “Silent” Generation—that cohort born between 1925 and 1945—too young to be the Greatest and too old to be Boomers. Born during times of economic insecurity, this group came of age during the McCarthy Era and is marked, understandably, by a desire not to rock the boat too much. While they weren’t as culturally radical as the Boomers of the ’60s, the artists and cultural provocateurs of the Silent Generation loved to take a whack at the Eisenhower status quo, not to mention psychoanalysis and the Bomb.

The late June Foray, shown on the job on Nov. 2, 1967, gave voice to Rocky the Flying Squirrel, babies, birds, cackling witches, and many other animated characters. Photo by George Brich/Associated Press.

Because we loved these old records and shows, my sister and I ended up singing along with Tom Lehrer about German rocket scientist Wernher von Braun (about whom we knew nothing) and doing the Vatican Rag and the Masochism Tango (ditto).

And so, through Bullwinkle, we were granted access to nearly a century’s worth of comedy and satire, three generations of backhanded patriotism tempered with gentle skepticism going back to vaudeville, a sort of atavistic psychic tool chest for navigating strange and scary times.

Bullwinkle was there when PBS pre-empted all programming to air the Watergate hearings in the summer I was eight, my last before sleepaway camp. At P.S. 19, we were still having bomb drills, and the Cold War was still very much on, as was a hot war in Vietnam, but there was no recognition of these facts in the Archies or Hong Kong Phooey.

Bullwinkle’s playful critique lives on today in SpongeBob and The Simpsons, shows whose creators openly acknowledge their debts. (Squidward’s voice on SpongeBob is an homage to Ned Sparks; Plankton’s, to Walter Brennan. All the male Simpsons have Bullwinkle and Rocky’s middle initial “J.”) These shows are a loving critique of the ways that American ideals and American reality are often out of whack.

The 1938 Hurricane That Revived New England’s Fall Colors
By Stephen Long | September 21, 2017

This morning, while driving in central Vermont, listening to the latest news about hurricanes in Florida and Texas, I caught up with my first leaf peeper of the season. Poking along at about 20 mph in his rental car, the tourist was peering at our hills of orange and crimson and gold leaves while simultaneously looking for a place to pull over to snap a photo.

Fall foliage and hurricane season go hand in hand in New England. But what few people realize is that the spectacular blazing colors from our hardwood forests are the result of the great hurricane of 1938, which brought 100 mph winds inland to Vermont, New Hampshire, and Maine 79 years ago on September 21.

The storm that came to be known as “Thirty-Eight” (the system of naming hurricanes didn’t begin until 1953) was the first Category 2 hurricane to reach Vermont and New Hampshire, and it came without warning. Thirty-Eight made landfall on Long Island, crossed the Long Island Sound into Connecticut and Rhode Island, and raced through Massachusetts and Vermont. It had been at least a generation since any hurricane had hit the region, even the coast.

Because of the lack of warning, or preparedness, more than 600 people died, most of them from the storm surge that swept beachfront houses into the sea. Floods and high winds—the fiercest wind was measured near Boston at 186 miles per hour—destroyed roads, bridges, houses, barns, and railroad tracks.

Inland, these winds uprooted nearly 1,000 square miles of forest, ripping holes in the tree canopy ranging from the size of a city yard to as large as 90 acres. And in doing so, the hurricane created a new forest across much of New England.

Most of the people who lived through the hurricane are gone, but I have been fortunate to hear the stories of many of them. One dramatic story came from Fred Hunt, at the time a 14-year-old boy playing hooky in the woods in Rindge, New Hampshire. Late in the day, a huge pine—more than 100 feet tall—was uprooted and landed five steps behind him, its trunk parallel to the ground. Thinking quickly, he scrambled into the space beneath the trunk of the fallen pine and stayed there for 10 minutes while the winds howled mercilessly and blew down every other tree in the forest. When there were no more trees left standing, Fred scrambled through the tangle of downed trees the last half-mile to home.

The white pine that served as Fred’s refuge happened to be growing in that spot because of the history of the area’s land, which was typical of much of rural New Hampshire and Massachusetts. In the 17th and 18th centuries, farmers cleared most of the original forest to grow crops and raise livestock. With the advance of the Industrial Revolution, these farmers left to work in the mills. Starting in 1860, the cleared fields reverted to forests. In New England, there’s no need to replant trees because they happily grow on their own. One of the most prolific colonizers of farm fields is white pine.

So when Thirty-Eight raged through, forests covered 80 percent of the land in New Hampshire and Massachusetts, and much of that forest was white pine. Before the storm, many rural families saw their woodlots as living bank accounts, where a few trees could be cut and sold when they needed money. Ninety percent of the trees that were blown down were white pine.

With the disaster, the federal government saw a need to get involved. The Great Depression had not yet ended, and in the forested areas of New England the New Deal make-work programs such as the Works Progress Administration (WPA) and Civilian Conservation Corps (CCC) were well-established. Fearing the kind of fires seen in the West each summer, the U.S. Forest Service directed the WPA and CCC to strip the downed trees of their branches, twigs, and needles to reduce the fire danger. Simultaneously it created the Northeast Timber Salvage Administration to purchase logs from the blowdown. Five times the annual harvest of trees had been blown down in a five-hour period, creating a huge glut of wood. NETSA created a market for the logs and purchased nearly half of the salvageable timber, providing some income to the 30,000 families that otherwise would have lost their woodland bank accounts.

And so, New England’s largest hurricane was followed by its largest logging job, and this one-two punch brought about the forest that we see today. When the towering canopy of white pine blew down, what was left were the seedlings and saplings of deciduous hardwood trees. If they hadn’t been blown down in 1938, those pines might still be there, holding the ground until they died from wind, disease, or logging. Instead, the mix of maple, birch, and oak that relished the new sunlight (having been released from the shade of the pines) grew vigorously. This new forest closely approximates the species mix of the original forest that had greeted the settlers, and its vibrant display of turning leaves attracts leaf peepers from around the globe.

Not all of New England experienced Thirty-Eight the same way. In Vermont, for example, farming had continued well into the 1930s, so only half of the state was covered in forests. As a result, hurricane damage appeared mostly in woodlots on top of ridges and in the sugar maple orchards that produced the springtime crop of maple syrup. Maple syrup was a hugely important crop in Vermont, because dairy farmers used the income from syrup to pay a year’s wages for hired help. With so many sugar orchards lying in ruins, many Vermont farmers had no choice but to get out of farming. The regrowth of the forest began in Vermont 80 years later than in Massachusetts and New Hampshire, and the process was different because Vermont’s soils are better than those of its neighbors. Vermont’s forest cover has now reached 80 percent, and the vast majority of it is the mix of northern hardwoods—maple, beech, and birch—that makes the hills come alive in the fall.

When I last spoke to Fred Hunt, just months before he died at 87, he said, “I’ve always been a white pine man.” He told me that after graduating with a degree in forestry from the University of New Hampshire, he ran a logging business for 10 years, specializing in thinning pine plantations. He then earned an M.S. and Ph.D. from the University of Massachusetts studying white pine and its effect on the water supply. Along the way, his master’s thesis served as the first management plan for the 58,000-acre forest surrounding Quabbin Reservoir, which provides the drinking water for Boston and 40 other nearby towns. He then taught forest management and managed a large forest deep in the Adirondacks for 10 years before he decided at the age of 54 to make his final career change, moving back to Reading, Vermont and tending his own forest.

Hunt spent a lifetime working to grow superior white pine because it provided a good living and because he loved the practice of forestry. But it’s possible that his lifelong affinity for white pine could have little to do with money or forestry. It could have more to do with an event when he was 14 years old. On that day, as New England’s most destructive hurricane passed through, a white pine saved his life.

How Recipe Cards and Cookbooks Fed a Mobile, Modernizing America
By Helen Zoe Veit | September 18, 2017

The first edition of The Boston Cooking-School Cook Book—now known as The Fannie Farmer Cookbook—reads like a road map for 20th-century American cuisine. Published in 1896, it was filled with recipes for such familiar 19th-century dishes as potted pigeons, creamed vegetables, and mock turtle soup. But it added a forward-looking bent to older kitchen wisdom, casting ingredients such as cheese, chocolate, and ground beef—all bit players in 19th-century U.S. kitchens—in starring roles. It introduced cooks to recipes like hamburg steaks and French fried potatoes, early prototypes of hamburgers and fries, and fruit sandwiches, peanuts sprinkled on fig paste that were a clear precursor to peanut butter and jelly.

Americans went nuts for the 567-page volume, buying The Boston Cooking-School Cook Book in numbers the publishing industry had never seen—around 360,000 copies by the time author Fannie Farmer died in 1915. Home cooks in the United States loved the tastiness and inventiveness of Farmer’s recipes. They also appreciated her methodical approach to cooking, which spoke to the unique conditions they faced. Farmer’s recipes were gratifyingly precise, and unprecedentedly replicable, perfect for Americans with newfangled gadgets like standardized cup and spoon measures, who worked in relative isolation from the friends and family who had passed along cooking knowledge in generations past. Farmer’s book popularized the modern recipe format, and it was a fitting guide to food and home life in a modernizing country.

Recipes today serve many purposes, from documenting cooking techniques, to showing off a creator’s skills, to serving up leisure reading for the food-obsessed. But their most important goal is replicability. A good recipe imparts enough information to let a cook reproduce a dish, in more or less the same form, in the future.

The earliest surviving recipes, which give instructions for a series of meaty stews, are inscribed on cuneiform tablets from ancient Mesopotamia. Recipes also survive from ancient Egypt, Greece, China, and Persia. For millennia, however, most people weren’t literate and never wrote down cooking instructions. New cooks picked up knowledge by watching more experienced friends and family at work, in the kitchen or around the fire, through looking, listening, and tasting.

Cover of 1919 edition of The Boston Cooking-School Cook Book by Fannie Merritt Farmer. Image courtesy of Smithsonian Libraries.

Recipes, as a format and genre, only really began coming of age in the 18th century, as widespread literacy emerged. This was around the same time, of course, that the United States came into its own as a country. The first American cookbook, American Cookery, was published in 1796. Author Amelia Simmons copied some of her text from an English cookbook but also wrote sections that were wholly new, using native North American ingredients like “pompkins,” “cramberries,” and “Indian corn.” Simmons’s audience was mainly middle class and elite women, who were more likely to be able to read and who could afford luxuries like a printed book in the first place.

The reach of both handwritten recipes and cookbooks would expand steadily in the coming decades, and rising literacy was only one reason. Nineteenth-century Americans were prodigiously mobile. Some had emigrated from other countries, some relocated from farms to cities, and others moved from settled urban areas to the Western frontier. Young Americans regularly found themselves living far from friends and relatives who otherwise might have offered help with cooking questions. In response, mid-19th-century cookbooks attempted to offer comprehensive household advice, giving instructions not just on cooking but on everything from patching old clothes to caring for the sick to disciplining children. American authors routinely styled their cookbooks as “friends” or “teachers”—that is, as companions that could provide advice and instruction to struggling cooks in the most isolated of spots.

Americans’ mobility also demonstrated how easily a dish—or even a cuisine—could be lost if recipes weren’t written down. The upheaval wrought by the Civil War singlehandedly tore a hole in one of the most important bodies of unwritten American culinary knowledge: pre-war plantation cookery. After the war, millions of formerly enslaved people fled the households where they had been compelled to live, taking their expertise with them. Upper-class Southern whites often had no idea how to light a stove, much less how to produce the dozens of complicated dishes they had enjoyed eating, and the same people who had worked to keep enslaved people illiterate now rued the dearth of written recipes. For decades after the war, there was a boom in cookbooks, often written by white women, attempting to approximate antebellum recipes.

Title page of Miss Beecher’s Domestic Receipt-Book, by Catharine Beecher, 1862. Image courtesy of Smithsonian Libraries.

Standardization of weights and measures, driven by industrial innovation, also fueled the rise of the modern American recipe. For most of the 19th century, recipes usually consisted of only a few sentences giving approximate ingredients and explaining basic procedure, with little in the way of an ingredient list and with nothing resembling precise guidance on quantities, heat, or timing. The reason for such imprecision was simple: There were no thermometers on ovens, few timepieces in American homes, and scant tools available to ordinary people to tell exactly how much of an ingredient they were adding.

Recipe writers in the mid-19th century struggled to express ingredient quantity, pointing to familiar objects to estimate how much of a certain item a dish needed. One common approximation, for instance, was “the weight of six eggs in sugar.” They also struggled to give instructions on temperature, sometimes advising readers to gauge an oven’s heat by putting a hand inside and counting the seconds they could stand to hold it there. Sometimes they hardly gave instructions at all. A typically vague recipe from 1864 for “rusks,” a dried bread, read in its entirety: “One pound of flour, small piece of butter big as an egg, one egg, quarter pound white sugar, gill of milk, two great spoonfuls of yeast.”

By the very end of the 19th century, American home economics reformers, inspired by figures like Catharine Beecher, had begun arguing that housekeeping in general, and cooking in particular, should be more methodical and scientific, and they embraced motion studies and standardization measures that were redefining industrial production in this era. And that was where Fannie Merritt Farmer, who started working on The Boston Cooking-School Cook Book in the 1890s, entered the picture.

Farmer was an unlikely candidate to transform American cookery. As a teenager in Boston in the 1870s, she suffered a sudden attack of paralysis in her legs, and she was 30 years old before she regained enough mobility to begin taking classes at the nearby Boston Cooking School. Always a lover of food, Farmer proved to be an indomitable student with a knack for sharing knowledge with others. The school hired her as a teacher after she graduated. Within a few years, by the early 1890s, she was its principal.

Julia Child’s handwritten recipe for pain de mie. Child’s Cambridge, Massachusetts kitchen is on view in the exhibition FOOD: Transforming the American Table 1950–2000, at the National Museum of American History. Image courtesy of the National Museum of American History.

Farmer started tinkering with a book published by her predecessor a few years earlier, Mrs. Lincoln’s Boston Cook Book. Farmer had come to believe that rigorous precision made cooking more satisfying and food more delicious, and her tinkering soon turned into wholesale revision.

She called for home cooks to obtain standardized teaspoons, tablespoons, and cups, and her recipes called for ultra-precise ingredient amounts such as seven-eighths of a teaspoon of salt, and four and two-thirds cups of flour. Also, crucially, Farmer insisted that all quantities be measured level across the top of the cup or spoon, not rounded in a changeable dome, as American cooks had done for generations.

This attention to detail, advocated by home economists and given life by Farmer’s enthusiasm, made American recipes more precise and reliable than they ever had been, and the wild popularity of Farmer’s book showed how eager home cooks were for such guidance. By the start of the 20th century, instead of offering a few prosy sentences that gestured vaguely toward ingredient amounts, American recipes increasingly began with a list of ingredients in precise, numerical quantities: teaspoons, ounces, cups.

In the more than a century since, the format has hardly changed. American cooks today might be reading recipes online and trying out metric scales, but the American recipe format itself remains extraordinarily durable. Designed as a teaching tool for a mobile society, the modern recipe is grounded in principles of clarity, precision, and replicability that emerge clearly from the conditions of early American life. They are principles that continue to guide and empower cooks in America and around the world today.

The Shoe Salesman Whose Name Became Synonymous with Basketball
By Abe Aamidor | September 14, 2017

When Chuck Taylor, who was born in rural southern Indiana in 1901, left home at age 17 to play professional basketball, he was following an unlikely dream. The game of basketball—invented by James Naismith, a YMCA physical fitness instructor in Massachusetts in 1891—was still a minor sport in America. Few competitive leagues existed, and those that did were regional. Most organized teams were subsidized by large manufacturing concerns, such as General Electric or the Firestone Tire and Rubber Co., or by fraternal organizations such as the Knights of Columbus. Professional contracts hardly existed; players were paid $5 or $10 a game. You couldn’t make a living at that.

Like struggling musicians or actors, basketball players needed a day job. But Taylor’s day job—selling the athletic shoes that he wore on court—became a lifelong career. Endowed with a love of the game, a showman’s flair, and just the right amount of self-aggrandizement, Taylor would become a veritable Johnny Appleseed for promoting basketball to Americans and around the world. By inventing himself, he helped invent what is now one of the most popular sports across the globe.

Many people know the name Chuck Taylor from the autograph that since 1932 has been stamped on the ankle patch of hundreds of millions of Converse All Star shoes—the classic beloved by everyone from James Dean to Kurt Cobain to Michelle Obama. Larry Bird and Julius Erving wore them on the court. They’re a celebrity staple, unisex and multigenerational, a kind of “hipster” fashion statement. Today, Converse Chuck Taylor All Stars take their place among sports Americana with Louisville Slugger baseball bats and Chicago Schwinn bicycles.

The real Chuck Taylor can sometimes seem like a fabrication, an ad-agency invention like Juan Valdez or Betty Crocker. In fact, Taylor was a standout forward at Columbus (Ind.) High School and captain of his team as a sophomore. He later played semi-pro and “industrial league” basketball in Indianapolis and Detroit. His most famous team was the Akron, Ohio-based Firestone Non-Skids. He proved to be a competent player but he was not the star he had been in high school. He made his name, instead, when in 1922 he took a full-time job selling the recently introduced All Star court shoe for the Converse Rubber Company’s Chicago regional office.

The All Star wasn’t the first shoe of its type—rubber-soled with a canvas top. The A.G. Spalding and Bros. company had previously introduced a similar shoe, and “plimsolls” and “sneakers” had long existed in England. But the Converse All Star, introduced in 1917, improved on earlier designs, particularly with its diamond-pattern sole that many believed led to quicker stops and starts on a hardwood court. The trademark on that design was upheld by a court ruling as recently as 2016.

A 1960 basketball game with several pairs of Chuck Taylors on the court. Photo courtesy of Wikimedia Commons.

Taylor was not a good salesman at first. According to a newspaper interview given by his widow, Lucy Taylor Hennessey, in 1979 with the Lansdale (Pa.) Reporter, he was nervous the first time he approached legendary coach Knute Rockne (he of “Win one for the Gipper” fame) at the University of Notre Dame fieldhouse. Rockne allegedly saw the young Taylor pacing outside his door and called him in, and then schooled him in the power of positive thinking and good sales technique.

Taylor’s territory included selling to high schools and colleges, which typically outfitted student athletes at school expense. It wasn’t an obvious sell. Most basketball coaches were just football or baseball coaches with time on their hands in between seasons, and didn’t care much about the finer points of the new game. Taylor, who cared very much, had a brilliant idea to boost sales: Around 1925, he began organizing in-school “clinics” to demonstrate the sport. Grateful coaches appreciated his expertise and welcomed him—and ordered the Converse shoe.

Before long, the clinics became public events, often co-hosted by a local sporting goods store. Taylor would explain the rules, do trick shots, and sometimes compete one-on-five against a local team, daring younger players to block his shots or steal the ball. His most famous trick was called “the invisible pass.” Taylor would thrust the ball toward the face of his opponent, then pass it quickly when the defender instinctively blinked. It was something of a cheat, of course, yet basketball was more of a rough-and-tumble sport in its early days, with far fewer rules and whistle calls than today. And Taylor was genuinely unstoppable against youthful players when he would dribble the ball from one end of the court to the other, weaving between a whole slew of defenders. It’s said that his passes were so forceful and straight that he could pass a ball under a truck underhanded without touching either the floor or the truck’s undercarriage.

Taylor’s theatrics—with a boost from Converse’s financial sponsorship of the Kansas City-based National Association of Basketball Coaches—eventually made the Converse All Star the dominant basketball shoe in America. In time, the company would almost exclusively hire former players or coaches as salesmen, such as Grady Lewis (who was later to replace Taylor as Sales Manager), Canadian basketball player Bob Houbregs, and the important early African American player Earl Lloyd.

Why was Taylor’s autograph added to the shoe in 1932, though? After all, he was just a salesman for the company, and while he had been a pro, he did not prove to be one of the stars of the game as an adult. It was a combination of his marketing skill (those clinics) and the fact that he’d served for a time in the late 1920s as player/coach of the well-regarded Converse All-Stars (with the hyphen), the company’s own traveling basketball team. The Converse Rubber Co. had gone through a bankruptcy in 1928, and was sold and then sold again in the early 1930s. While it’s not known whose idea it was to put Taylor’s name on the shoe, his “brand” was clearly better than that of Converse at the time.

Plus there was this little thing about the invention of “Chuck Taylor.” He was real, but he often exaggerated his earlier success as a professional, most notoriously claiming to have been a veteran of the most famous basketball team of the 1920s, the New York City-based Original Celtics. Historian Murry Nelson has written the definitive history of the Original Celtics and he found no evidence that Taylor ever played for that team.

Taylor was paid a fixed annual sum for his autograph. Exactly how much is not known (one document dating from the early 1950s shows he received $15,000), but as a traveling salesman he lived on the road nearly 365 days a year, staying in the finest hotels and expensing everything. He drove Chevrolets at first, then Lincolns and Cadillacs. Though he clearly was a showman, he showed his basic decency in other ways. After his older brother, Howard, was blinded in France during World War I, Taylor supported him for the rest of his life, even insisting in a divorce settlement that his estate always pay Howard's expenses off the top before any alimony payments were made.

Taylor also served his country during World War II as coach of the best “service” team in the military at the time, the Wright Field Air-Tecs, who competed in fundraisers against the best college and pro teams of the day and usually won. He toured South America as a goodwill ambassador for the U.S. State Department in the late 1950s.

By the 1960s, both Taylor and the canvas athletic shoe itself had become somewhat anachronistic. Mass marketers were replacing the locally owned sporting goods store, and national advertising was overtaking the glad-handing approach of men like Taylor. European manufacturers such as Adidas and Puma were introducing new leather shoes with lighter-weight soles and other high-tech materials, and in some cases were paying coaches directly to adopt their shoes. (While Converse did subsidize the National Association of Basketball Coaches for years, that money went to the organization, not to individual coaches or players. In the 1970s Converse, too, followed the herd and started paying players and coaches directly to adopt its newer shoes, but that was long after Taylor was gone from the scene.)

The real death knell of the canvas All Star in competitive sports came in 1969, when the most famous basketball coach in America, John Wooden, announced that going forward he would not outfit his players with the shoe. As he told Sports Illustrated and others, he was tired of personally trimming rough seams and loose threads inside the shoe. Converse brought Taylor out of retirement, flying him from his Florida home to meet in person with Wooden in Los Angeles, but to no avail. UCLA went to a new shoe the following year. Though Converse did eventually introduce newer, more modern footwear for competition, it was an uphill battle. Adidas, Reebok, and the upstart Nike (which now owns the Converse brand) all had taken their positions in the marketplace.

Taylor died in June 1969, shortly after he was inducted into the Naismith Memorial Basketball Hall of Fame in a stellar class that included former Boston Celtics coach Arnold "Red" Auerbach, true Original Celtics star Henry "Dutch" Dehnert, controversial University of Kentucky basketball coach Adolph Rupp, and former Oklahoma State coach Hank Iba. Taylor was inducted not as a player or coach, but as a "contributor." His tagline at the Hall of Fame reads, "Taylor … pursued his goal of building players, coaches, and spectator interest in the game of basketball by conducting clinics and demonstrations throughout the country."

The post The Shoe Salesman Whose Name Became Synonymous with Basketball appeared first on Zócalo Public Square.

]]>
http://www.zocalopublicsquare.org/2017/09/14/shoe-salesman-whose-name-became-synonymous-basketball/chronicles/who-we-were/feed/ 0
When the Idea of Home Was Key to American Identity http://www.zocalopublicsquare.org/2017/09/11/idea-home-key-american-identity/chronicles/who-we-were/ http://www.zocalopublicsquare.org/2017/09/11/idea-home-key-american-identity/chronicles/who-we-were/#comments Mon, 11 Sep 2017 07:01:17 +0000 By Richard White http://www.zocalopublicsquare.org/?p=87852 Like viewers using an old-fashioned stereoscope, historians look at the past from two slightly different angles—then and now. The past is its own country, different from today. But we can only see that past world from our own present. And, as in a stereoscope, the two views merge.

I have been living in America’s second Gilded Age—our current era that began in the 1980s and took off in the 1990s—while writing about the first, which began in the 1870s and continued into the early 20th century. The two periods sometimes seem like doppelgängers: worsening inequality, deep cultural divisions, heavy immigration, fractious politics, attempts to restrict suffrage and civil liberties, rapid technological change, and the reaping of private profit from public governance.

In each, people debate what it means to be an American. In the first Gilded Age, the debate centered on a concept so encompassing that its very ubiquity can

The post When the Idea of Home Was Key to American Identity appeared first on Zócalo Public Square.

]]>
What It Means to Be American Like viewers using an old-fashioned stereoscope, historians look at the past from two slightly different angles—then and now. The past is its own country, different from today. But we can only see that past world from our own present. And, as in a stereoscope, the two views merge.

I have been living in America’s second Gilded Age—our current era that began in the 1980s and took off in the 1990s—while writing about the first, which began in the 1870s and continued into the early 20th century. The two periods sometimes seem like doppelgängers: worsening inequality, deep cultural divisions, heavy immigration, fractious politics, attempts to restrict suffrage and civil liberties, rapid technological change, and the reaping of private profit from public governance.

In each, people debate what it means to be an American. In the first Gilded Age, the debate centered on a concept so encompassing that its very ubiquity can cause us to miss what is hiding in plain sight. That concept was the home, the core social concept of the age. If we grasp what 19th century Americans meant by home, then we can understand what they meant by manhood, womanhood, and citizenship.

I am not sure if we have, for better or worse, a similar center to our debates today. Our meanings of central terms will not, and should not, replicate those of the 19th century. But if our meanings do not center on an equivalent of the home, then they will be unanchored in a common social reality. Instead of coherent arguments, we will have a cacophony.

A Currier & Ives print called “Home Sweet Home.” Image courtesy of Library of Congress.

When reduced to the "Home Sweet Home" of Currier and Ives lithographs, the idea of "home" can seem sentimental. Handle it, and you discover its edges. Those who grasped "home" as a weapon caused blood, quite literally, to flow. And if you take the ubiquity of "home" seriously, much of what we presume about 19th century America moves from the center to the margins. Some core "truths" of what being American has traditionally meant become less certain.

It's a cliché, for example, that 19th century Americans were individualists who believed in inalienable rights. Individualism is not a fiction, but Horatio Alger and Andrew Carnegie no more encapsulated the dominant social view of the first Gilded Age than Ayn Rand does our second one. In fact, the basic unit of the republic was not the individual but the home, not so much the isolated, rights-bearing citizen as collectives—families, churches, communities, and volunteer organizations. These collectives forged American identities in the late 19th century, and all of them orbited the home. The United States was a collection of homes.

Evidence of the power of the home lurks in places rarely visited anymore. Mugbooks, the illustrated county histories sold door to door by subscription agents, constituted one of the most popular literary genres of the late 19th century. The books became monuments to the home. If you subscribed for a volume, you would be included in it. Subscribers summarized the trajectories of their lives, illustrated on the page. The stories of these American lives told of progress from small beginnings—symbolized by a log cabin—to a prosperous home.

A picture from a late 19th century "mugbook": Ira and Susan Warren of Calhoun County, Michigan, represented millions of Americans who saw the meaning of their lives in establishing, sustaining, and protecting homes. Image courtesy of History of Calhoun County, Michigan, by H. B. Pierce, L.H. Everts & Co., 1877.

The concept of the home complicated American ideas of citizenship. Legally and constitutionally, Reconstruction proclaimed a homogenous American citizenry, with every white and black man endowed with identical rights guaranteed by the federal government.

In practice, the Gilded Age mediated those rights through the home. The 13th, 14th, and 15th Amendments established black freedom, citizenship, civil rights, and suffrage, but they did not automatically produce homes for black citizens. And as Thomas Nast recognized in one of his most famous cartoons, the home was the culmination and proof of freedom.

“Emancipation,” an illustration by Thomas Nast from around 1865. Image courtesy of Library of Congress.

Thus the bloodiest battles of Reconstruction were waged over the home. The Klan attacked the black home. Through murder, arson, and rape, Southern terrorists aimed to impart a lesson: Black men could not protect their homes. They were not men and not worthy of the full rights of citizenship.

In attacking freedpeople, terrorists sought to make them cultural equivalents of Chinese immigrants and Indians—those who, purportedly, failed to establish homes, could not sustain homes, or attacked white homes. Their lack of true homes underlined their supposed unsuitability for full rights of citizenship. Sinophobes repeated this caricature endlessly.

An 1878 lithograph panel called “While they can live on 40 cents a day, and they can’t.” Image courtesy of Library of Congress.

In the iconography of the period, both so-called “friends” of the Indian and Indian haters portrayed Indians as lacking true homes and preventing whites from establishing homes. Buffalo Bill’s Wild West had Indians attacking cabins and wagon trains full of families seeking to establish homes. They were male and violent, but they were not men. Americans decided who were true men and women by who had a home. Metaphorically, Indians became savages and animals.

A poster for Buffalo Bill's Wild West and Congress of Rough Riders of the World in the late 1890s. Image courtesy of Library of Congress.

Even among whites, a category itself constantly changing during this and other eras, the home determined which people were respectable or fully American. You could get away with a lot in the Gilded Age, but you could neither desert the home nor threaten it. Horatio Alger was a pedophile, but this is not what ultimately cost him his popularity. His great fault, as women reformers emphasized, was that his heroes lived outside the home.

Position people outside the home, and rights as well as respectability slip away. Tramps were the epitome of the era's dangerous classes. Vagrancy—homelessness—became a crime. Single working women were called "women adrift" because they had broken free of the home and, like Theodore Dreiser's Sister Carrie, threatened families. (Carrie broke up homes, but she, rather than the men who thought they could exploit her, survived.) European immigrants, too, found their political rights under attack when they supposedly could not sustain true homes. Tenements were, in the words of Jacob Riis, "the death of the home."

As the great democratic advances of Reconstruction came under attack, many of the attempts to restrict suffrage centered on the home. Small “l” liberal reformers—people who embraced market freedom, small government, and individualism but grew wary of political freedom—sought to reinstitute property requirements. Failing that, they policed voting, demanding addresses for voter registration, a seemingly simple requirement, but one that required permanent residences and punished the transience that accompanied poverty. Home became the filter that justified the exclusion of Chinese immigrants, Indian peoples, eventually African Americans, transients, and large numbers of the working poor.

The home always remained a two-edged sword. American belief in the republic as a collection of homes could and did become an instrument for exclusion, but it could also be a vehicle for inclusion. Gilded Age social reformers embraced the home. The Homestead Act sought to expand the creation of homes by both citizens and non-citizens. When labor reformers demanded a living wage, they defined it in terms of the money needed to support a home and family. Freedpeople’s demands for 40 acres and a mule were demands for a home. Frances Willard and the Woman’s Christian Temperance Union made “home protection” the basis of their push for political power and the vote for women. Cities and states pushed restrictions on the rights of private landholders to seek wealth at the expense of homes. In these cases, the home could be a weapon for enfranchisement and redistribution. But whether it was used to include or exclude, the idea of home remained at the center of Gilded Age politics. To lose the cultural battle for the home was to lose, in some cases, virtually everything.

The idea of home has not vanished. Today a housing crisis places homes beyond the reach of many, and the homeless have been exiled to a place beyond the polity. But still, the cultural power of the home has waned.

A new equivalent of home—complete with its transformative powers for good and ill—might be hiding in plain sight, or it could be coming into being. When I ask students, teachers, and public audiences about a modern equivalent to the Gilded Age home, some suggest family, a concept increasingly deployed in different ways by different people. But I have found no consensus.

If we cannot locate a central collective concept which, for better or worse, organizes our sense of being American, then this second Gilded Age has become a unique period in American history. We will have finally evolved into the atomized individuals that 19th century liberals and modern libertarians always imagined us to be.

The alternative is not a single set of values, a kind of catechism for Americans, but rather a site where we define ourselves around our relationships to each other rather than by our autonomy. We would quarrel less over what we want for ourselves individually than over what we want collectively. Articulating a central concept that is the equivalent of the 19th century idea of home would not end our discussions and controversies, but it would center them on something larger than ourselves.

I wish I could announce the modern equivalent of home, but I am not perceptive enough to recognize it yet. I do know that, once identified, the concept will become the ground that anyone seeking to define what it is to be an American must seize.

The post When the Idea of Home Was Key to American Identity appeared first on Zócalo Public Square.

]]>
http://www.zocalopublicsquare.org/2017/09/11/idea-home-key-american-identity/chronicles/who-we-were/feed/ 1
The Bostonian Who Armed the Anti-Slavery Settlers in “Bleeding Kansas” http://www.zocalopublicsquare.org/2017/08/08/bostonian-armed-anti-slavery-settlers-bleeding-kansas/chronicles/who-we-were/ http://www.zocalopublicsquare.org/2017/08/08/bostonian-armed-anti-slavery-settlers-bleeding-kansas/chronicles/who-we-were/#respond Tue, 08 Aug 2017 07:01:17 +0000 By Robert K. Sutton http://www.zocalopublicsquare.org/?p=87282 On May 24, 1854, Anthony Burns, a young African-American man, was captured on his way home from work. He had escaped from slavery in Virginia and had made his way to Boston, where he was employed in a men's clothing store. His owner tracked him down and had him arrested. Under the Fugitive Slave Act of 1850 and the United States Constitution, Burns had no rights whatsoever.

To the people of Boston, his capture was an outrage. Seven thousand citizens tried to break him out of jail, and the finest lawyers in Boston tried to make a case for his freedom, all to no avail. On June 2, Burns was escorted to a waiting ship and returned to bondage.

This entire episode had a profound impact on many Bostonians, but one in particular: Amos Adams Lawrence. The Burns episode likely was the first time Lawrence came face-to-face with the evils

The post The Bostonian Who Armed the Anti-Slavery Settlers in “Bleeding Kansas” appeared first on Zócalo Public Square.

]]>
What It Means to Be American On May 24, 1854, Anthony Burns, a young African-American man, was captured on his way home from work. He had escaped from slavery in Virginia and had made his way to Boston, where he was employed in a men’s clothing store. His owner tracked him down and had him arrested. Under the Fugitive Slave Act of 1850 and the United States Constitution, Burns had no rights whatsoever.

To the people of Boston, his capture was an outrage. Seven thousand citizens tried to break him out of jail, and the finest lawyers in Boston tried to make a case for his freedom, all to no avail. On June 2, Burns was escorted to a waiting ship and returned to bondage.

This entire episode had a profound impact on many Bostonians, but one in particular: Amos Adams Lawrence. The Burns episode likely was the first time Lawrence came face-to-face with the evils of slavery, and shortly after Burns was returned to bondage, he wrote to his uncle that “we went to bed one night old-fashioned, conservative, Compromise Union Whigs and waked up stark mad Abolitionists.” (The Whig Party was divided over slavery at this time; by 1854, when the Republican Party was organized, the Whigs were no longer a strong force in U.S. politics.)

A print created in Boston in the 1850s showing Anthony Burns and scenes from his life. Image courtesy of Library of Congress.

Lawrence was a somewhat unlikely abolitionist. He was born into one of the bluest of blue-blood families in Boston and had every benefit his family’s wealth could provide, attending Franklin Academy, an elite boarding school, and then Harvard. True, the Lawrence family had a strong philanthropic ethic. Amos’s uncle, Abbott Lawrence, donated $50,000 to Harvard in 1847—which at the time was the largest single donation given to any college in the United States—to establish Lawrence Scientific School, and Amos’s father, also named Amos, retired at age 45 to devote the remainder of his life to philanthropy. In 1854, Amos Adams Lawrence wrote in his private diary that he needed to make enough money in his business practices to support charities that were important to him.

But those business practices made backing an anti-slavery charity unlikely. His family made its fortune in the textile industry, and Lawrence himself created a business niche as a commission merchant selling manufactured textiles produced in New England. Most of the textiles Lawrence and his family produced and sold were made from cotton, which was planted, picked, ginned, baled, and shipped by slaves. This fact presents an interesting conundrum. The Burns episode made Lawrence, as he wrote, “a stark mad abolitionist,” but, as far as we know, the fact that his business relied on the same people he was trying to free did not seem to bother him.

Lawrence very quickly had the opportunity to translate his new-found abolitionism into action. On May 30, 1854, in the midst of the Burns affair, President Franklin Pierce signed into law the Kansas-Nebraska Act, which established Kansas and Nebraska as territories but allowed each to decide for itself, under the concept of popular sovereignty, whether it wanted slavery or not. To many abolitionists, this was an outrage, because it opened the possibility for another slave state to enter the Union. Also, with the slave-holding state of Missouri right next door, the pro-slavery side seemed to have an undue advantage.

This was Lawrence’s chance. A friend introduced him to Eli Thayer, who had just organized the Emigrant Aid Company to encourage antislavery settlers to emigrate to Kansas with the goal of making the territory a free state. Lawrence became the company’s treasurer, and immediately began dipping into his pocket to cover expenses. When the first antislavery pioneers arrived in Kansas, they decided to call their new community “Lawrence,” knowing that without their benefactor’s financial aid, their venture likely would not have been possible.

Lawrence was frequently frustrated that the company’s leaders were not aggressive enough to raise money, but he quietly continued to cover the bills. At one point, he confided to his diary, when bills for the Emigrant Aid Company came due, he did not have enough of his own money on hand, so he sold shares in his business to cover the expenses. Whenever there was a need for special funding in Kansas, Lawrence would donate and ask others to do so as well. Lawrence and his brothers, for example, contributed to the purchase of Sharps rifles—the most advanced weapons of the day—for citizens of Lawrence.

A 44-caliber Sharps percussion sporting rifle used by abolitionist John Brown, ca. 1856. Image courtesy of the National Museum of American History.

They needed those guns. Because Lawrence, Kansas, was the center of the antislavery movement, it became the prime target of pro-slavery forces. In late 1855, Missourians massed to attack Lawrence in what was called the Wakarusa War. Nothing came of it that time, and the Missourians returned home. But less than a year later came the "Sack of Lawrence," in which pro-slavery Missourians burned much of the town to the ground. Amos Lawrence continued to support the effort to make Kansas a free state. In 1857, he again dug into his pocket and donated $12,696 to establish a fund "for the advancement of religious and intellectual education of the young in Kansas."

Eventually, in 1861, Kansas was admitted to the Union as a free state. The town of Lawrence played an important role in this development, and several of its residents became leaders in the early state government. But the wounds of the territorial period continued to fester. In August 1863, during the Civil War, Lawrence burned again: William Clarke Quantrill, a Confederate guerrilla chieftain, led his cutthroat band into the town, killed more than 200 men and boys, and set the place on fire.

Just months before, the new state legislature had approved building the University of Kansas in Lawrence, on the condition that citizens raise $15,000, and the raid had devastated the townspeople. Again, Amos Lawrence came to the rescue, digging into his pocket for $10,000 to make sure Lawrence, Kansas, would become the home of the state university.

In 1884, Amos Lawrence finally visited the town that bore his name. Citizens rolled out the red carpet: the university he had been instrumental in creating honored him, and he was the guest of honor at several other events. But Lawrence had always been a very private person, and the hoopla over his visit was too much. He stayed for a couple of days, then returned home to Boston. He never visited again.

To the people of modern-day Lawrence, Amos Lawrence has faded from memory. A reporter writing about him in a recent local newspaper article was unaware that he had visited the town. But Lawrence’s support and money were essential in making Kansas a free state. When Lawrence responded to Burns’s brutal treatment, he showed how a citizen can be shocked out of complacency and into action—and thus made history.

The post The Bostonian Who Armed the Anti-Slavery Settlers in “Bleeding Kansas” appeared first on Zócalo Public Square.

]]>
http://www.zocalopublicsquare.org/2017/08/08/bostonian-armed-anti-slavery-settlers-bleeding-kansas/chronicles/who-we-were/feed/ 0
How the Kellogg Brothers Taught America to Eat Breakfast http://www.zocalopublicsquare.org/2017/08/03/kellogg-brothers-taught-america-eat-breakfast/chronicles/who-we-were/ http://www.zocalopublicsquare.org/2017/08/03/kellogg-brothers-taught-america-eat-breakfast/chronicles/who-we-were/#respond Thu, 03 Aug 2017 07:01:42 +0000 By Howard Markel http://www.zocalopublicsquare.org/?p=87194 The popular singer and movie star Bing Crosby once crooned, "What's more American than corn flakes?" Virtually every American is familiar with this iconic cereal, but few know the story of the two men from Battle Creek, Michigan who created those famously crispy, golden flakes of corn back in 1895, revolutionizing the way America eats breakfast: John Harvey Kellogg and his younger brother Will Keith Kellogg.

Fewer still know that among the ingredients in the Kelloggs’ secret recipe were the teachings of the Seventh-day Adventist church, a homegrown American faith that linked spiritual and physical health, and which played a major role in the Kellogg family’s life.

For half a century, Battle Creek was the Vatican of the Seventh-day Adventist church. Its founders, the self-proclaimed prophetess Ellen White and her husband, James, made their home in the Michigan town starting in 1854, moving the church’s headquarters in 1904 to Takoma

The post How the Kellogg Brothers Taught America to Eat Breakfast appeared first on Zócalo Public Square.

]]>
What It Means to Be American The popular singer and movie star Bing Crosby once crooned, “What’s more American than corn flakes?” Virtually every American is familiar with this iconic cereal, but few know the story of the two men from Battle Creek, Michigan who created those famously crispy, golden flakes of corn back in 1895, revolutionizing the way America eats breakfast: John Harvey Kellogg and his younger brother Will Keith Kellogg.

Fewer still know that among the ingredients in the Kelloggs’ secret recipe were the teachings of the Seventh-day Adventist church, a homegrown American faith that linked spiritual and physical health, and which played a major role in the Kellogg family’s life.

For half a century, Battle Creek was the Vatican of the Seventh-day Adventist church. Its founders, the self-proclaimed prophetess Ellen White and her husband, James, made their home in the Michigan town starting in 1854, moving the church's headquarters in 1904 to Takoma Park, outside of Washington, D.C. Eventually, Seventh-day Adventism grew into a major Christian denomination with churches, ministries, and members all around the world. One key component of the Whites' sect was healthy living and a nutritious, vegetable- and grain-based diet.

Patrons dining at the Battle Creek Sanitarium. Image courtesy of the Ellen G. White Estate, Inc.

From the distance of more than a century and a half, it is fascinating to note how many of Ellen White’s religious experiences were connected to personal health. During the 1860s, inspired by visions and messages she claimed to receive from God, she developed a doctrine on hygiene, diet, and chastity enveloped within the teachings of Christ. She began promoting health as a major part of her ministry as early as June 6, 1863. At a Friday evening Sabbath welcoming service in Otsego, Michigan, she described a 45-minute revelation on “the great subject of Health Reform,” which included advice on proper diet and hygiene. Her canon of health found greater clarity in a sermon that she delivered on Christmas Eve 1865 in Rochester, New York. White vividly described a vision from God which emphasized the importance of diet and lifestyle in helping worshippers stay well, prevent disease, and live a holy life. Good health relied on physical and sexual purity, White preached, and because the body was intertwined with the soul her prescriptions would help eliminate evil, promote the greater good in human society, and please God.

The following spring, on May 20, 1866, “Sister” White formally presented her ideas to the 3,500 Adventists comprising the denomination’s governing body, or General Conference. When it came to diet, White’s theology found great import in Genesis 1:29: “And God said, ‘Behold, I have given you every herb bearing seed, which is upon the face of all the earth, and every tree, in the which is the fruit of a tree yielding seed; to you it shall be for meat.’” White interpreted this verse strictly, as God’s order to consume a grain and vegetarian diet.

She told her Seventh-day Adventist flock that they must abstain not only from eating meat but also from using tobacco or consuming coffee, tea, and, of course, alcohol. She warned against indulging in the excitatory influences of greasy, fried fare, spicy condiments, and pickled foods; against overeating; against using drugs of any kind; and against wearing binding corsets, wigs, and tight dresses. Such evils, she taught, led to the morally and physically destructive “self-vice” of masturbation and the less lonely vice of excessive sexual intercourse.

Dr. John Kellogg dictating to his brother Will, circa 1890. Image courtesy of John Harvey Kellogg Papers, Bentley Historical Library, The University of Michigan.

The Kellogg family moved to Battle Creek in 1856, primarily to be close to Ellen White and the Seventh-day Adventist church. Impressed by young John Harvey Kellogg’s intellect, spirit, and drive, Ellen and James White groomed him for a key role in the Church. They hired John, then 12 or 13, as their publishing company’s “printer’s devil,” the now-forgotten name for an apprentice to printers and publishers in the days of typesetting by hand and cumbersome, noisy printing presses. Like many other American printer’s devils who went on to greatness—including Thomas Jefferson, Benjamin Franklin, Mark Twain, Walt Whitman, and Lyndon Johnson—Kellogg mixed up batches of ink, filled paste pots, retrieved individual letters of type to set, and proofread the not-always-finished printed copy. He was swimming in a river of words and took to it with glee, discovering his own talent for composing clear and balanced sentences, filled with rich explanatory metaphors and allusions. By the time he was 16, Kellogg was editing and shaping the church’s monthly health advice magazine, The Health Reformer.

The Whites wanted a first-rate physician to run medical and health programs for their denomination, and they found him in John Harvey Kellogg. They sent the young man to the Michigan State Normal College in Ypsilanti, the University of Michigan in Ann Arbor, and the Bellevue Hospital Medical College in New York. It was during medical school that a time-crunched John, who prepared his own meals on top of studying around the clock, first began to think about creating a nutritious, ready-to-eat cereal.

When John returned to Battle Creek in 1876, the Battle Creek Sanitarium was born with the encouragement and leadership of the Whites, and within a few years it became a world-famous medical center, grand hotel, and spa. John tended to his growing flock of patients while Will, eight years his junior, ran the business and personnel operations. The Kellogg brothers' "San" was internationally known as a "university of health" that preached the Adventist gospel of disease prevention, sound digestion, and "wellness." At its peak, it saw some 12,000 to 15,000 new patients a year, treated the rich and famous, and became a health destination for the worried well and the truly ill.

There were practical factors, beyond those described in Ellen White’s ministry, that inspired John’s interest in dietary matters. In 1858, Walt Whitman described indigestion as “the great American evil.” A review of mid-19th-century American diet on the “civilized” eastern seaboard, within the nation’s interior, and on the frontier explains why one of the most common medical complaints of the day was dyspepsia, the 19th-century catchall term for a medley of flatulence, constipation, diarrhea, heartburn, and “upset stomach.”

Breakfast was especially problematic. For much of the 19th century, many early morning repasts included filling, starchy potatoes, fried in the congealed fat from last night’s dinner. For protein, cooks fried up cured and heavily salted meats, such as ham or bacon. Some people ate a meatless breakfast, with mugs of cocoa, tea, or coffee, whole milk or heavy cream, and boiled rice, often flavored with syrup, milk, and sugar. Some ate brown bread, milk-toast, and graham crackers to fill their bellies. Conscientious (and frequently exhausted) mothers awoke at the crack of dawn to stand over a hot, wood-burning stove for hours on end, cooking and stirring gruels or mush made of barley, cracked wheat, or oats.

Advertisement, 1934. Image courtesy of N. W. Ayer Advertising Agency Records, Archives Center, National Museum of American History, Smithsonian Institution.

It was no wonder Dr. Kellogg saw a need for a palatable, grain-based “health food” that was “easy on the digestion” and also easy to prepare. He hypothesized that the digestive process would be helped along if grains were pre-cooked—essentially, pre-digested—before they entered the patient’s mouth. Dr. Kellogg baked his dough at extremely high heat to break down starch contained in the grain into the simple sugar dextrose. John Kellogg called this baking process dextrinization. He and Will labored for years in a basement kitchen before coming up with dextrinized flaked cereals—first, wheat flakes, and then the tastier corn flakes. They were easily-digested foods meant for invalids with bad stomachs.

Today most nutritionists, obesity experts, and physicians argue that the easy digestibility the Kelloggs worked so hard to achieve is not such a good thing. Eating processed cereals, it turns out, creates a sudden spike in blood sugar, followed by an increase in insulin, the hormone that enables cells to use glucose. A few hours later, the insulin rush triggers a blood sugar “crash,” loss of energy, and a ravenous hunger for an early lunch. High fiber cereals like oatmeal and other whole grain preparations are digested more slowly. People who eat them report feeling fuller for longer periods of time and, thus, have far better appetite control than those who consume processed breakfast cereals.

By 1906, Will had had enough of working for his domineering brother, whom he saw as a tyrant unwilling to let him grow their cereal business into the empire he knew it could become. He quit the San and founded what ultimately became the Kellogg's Cereal Company, based on the brilliant observation that far more healthy people than invalids wanted a nutritious breakfast—provided the cereal tasted good, which by that point it did, thanks to the addition of sugar and salt.

The Kelloggs had the science of corn flakes all wrong, but they still became breakfast heroes. Fueled by 19th-century American reliance on religious authority, they played a critical role in developing the crunchy-good breakfast many of us ate this morning. 

The post How the Kellogg Brothers Taught America to Eat Breakfast appeared first on Zócalo Public Square.

]]>
http://www.zocalopublicsquare.org/2017/08/03/kellogg-brothers-taught-america-eat-breakfast/chronicles/who-we-were/feed/ 0
How Sears Industrialized, Suburbanized, and Fractured the American Economy http://www.zocalopublicsquare.org/2017/07/20/sears-industrialized-suburbanized-fractured-american-economy/chronicles/who-we-were/ http://www.zocalopublicsquare.org/2017/07/20/sears-industrialized-suburbanized-fractured-american-economy/chronicles/who-we-were/#comments Thu, 20 Jul 2017 07:01:22 +0000 By Vicki Howard http://www.zocalopublicsquare.org/?p=86943 The lifetime of Sears has spanned, and embodied, the rise of modern American consumer culture. The 130-year-old mass merchandiser that was once the largest retailer in the United States is part of the fabric of American society.

From its start as a 19th-century mail-order firm, to its heyday on Main Street and in suburban malls, and from its late 20th-century reorientation toward credit and financial products to its attempted return to its original retail identity, Sears has mirrored the ups and downs of the American economy. It was a distribution arm of industrial America. It drove the suburbanizing wedge of postwar shopping malls. It helped atomize the industrial economy through manufacturer outsourcing in the 1970s and 1980s. It played a key role in the diffusion of mass consumer culture and commercial values. For better and for worse, Sears is a symbol of American capitalism.

By the early 20th century, Sears

The post How Sears Industrialized, Suburbanized, and Fractured the American Economy appeared first on Zócalo Public Square.

]]>
What It Means to Be American The lifetime of Sears has spanned, and embodied, the rise of modern American consumer culture. The 130-year-old mass merchandiser that was once the largest retailer in the United States is part of the fabric of American society.

From its start as a 19th-century mail-order firm, to its heyday on Main Street and in suburban malls, and from its late 20th-century reorientation toward credit and financial products to its attempted return to its original retail identity, Sears has mirrored the ups and downs of the American economy. It was a distribution arm of industrial America. It drove the suburbanizing wedge of postwar shopping malls. It helped atomize the industrial economy through manufacturer outsourcing in the 1970s and 1980s. It played a key role in the diffusion of mass consumer culture and commercial values. For better and for worse, Sears is a symbol of American capitalism.

By the early 20th century, Sears already was a household name across the United States, one that represented rural thrift and industry as well as material abundance and consumer pleasures. The company was founded as a modest mail-order retailer of watches in the 1880s by Richard W. Sears and Alvah C. Roebuck. Julius Rosenwald, a Chicago clothing merchant who became a partner in the firm in 1895, directed its rapid growth, expanding into new products and ever-broader territory. Mail-order firms like Sears were able to penetrate underserved rural areas by leaning on new infrastructure, such as the railroads that linked far-flung regions of the country. Government regulation also aided the company’s growth, with the Rural Free Delivery Act of 1896 underwriting its distribution chain by expanding mail routes in rural areas.

Sears, Roebuck letterhead from 1907 featured the mail-order company’s state-of-the-art distribution center, a symbol of its retail dominance. Image courtesy of Sears, Roebuck & Co/Wikimedia Commons.

In an era when print media reigned supreme, Sears dominated the rural retail market through its huge catalog, an amazing work of product advertising, consumer education, and corporate branding. Titled the Book of Bargains and later, The Great Price Maker, the famous Sears catalog expanded in the 1890s from featuring watches and jewelry to including everything from buggies and bicycles to sporting goods and sewing machines. It educated millions of shoppers about mail-order procedures, such as shipping, cash payment, substitutions and returns. It used simple and informal language and a warm, welcoming tone. “We solicit honest criticism more than orders,” the 1908 catalog stated, emphasizing customer satisfaction above all else. Sears taught Americans how to shop.

Sears also demonstrated how to run a business. Cutting costs and tightly controlling distribution fueled its rise to power. The company built a massive Chicago distribution complex in 1906, which occupied three million square feet of floor space. A full-page illustration of the plant, in all its bright redbrick glory, graced the back of the Sears catalog. Any customer could see how his merchandise was received and held, how his orders were filled and shipped out, and where the catalog itself was published. The distribution center was its own best advertisement; among the largest in the world, it was a symbol of the mail-order company’s dominance.

The company innovated in other ways, too. Bricks-and-mortar retailers today have to contend with new consumer habits brought about by e-commerce. Similarly, mail-order firms like Sears faced potential loss of their markets as the nation urbanized 100 years ago and entered the automobile age. Sears navigated the challenge brilliantly when it opened its first department store in Chicago in 1925. Under the managerial leadership of Gen. Robert E. Wood, who had formerly worked with mail-order competitor Montgomery Ward, Sears initiated a rapid expansion outside of urban centers. By 1929, on the eve of the Great Depression, it operated more than 300 department stores.

In 1948, Ruth Parrington, a librarian at the Chicago Public Library, studied a Sears catalog from 1902. Photo courtesy of Associated Press.

Growth continued even during the economic downturn, because Sears wisely championed an aesthetic of thrift. The chain made its name selling dependable staples such as socks and underwear and sheets and towels, rather than fashion items like those found in traditional department stores such as Marshall Field’s in Chicago or John Wanamaker’s in Philadelphia or New York. Sears outlets were spare, catering to customers who were interested in finding good value, to meet practical needs. By the end of the Depression decade, the number of stores had almost doubled.

After World War II, still under Wood's leadership, Sears continued to open new stores across North America, in the bustling new shopping centers populating the expanding suburban landscape. In the United States, the number of Sears stores passed 700 by the mid-1950s. The firm also expanded across the borders north and south, opening its first Mexico City store in 1947 and moving into Canada in 1952 (joining with a Canadian mail-order firm to form Simpson-Sears). Sears benefited from being a pioneer chain in a landscape of largely independent department stores. Along with J.C. Penney, it became a standard shopping mall anchor. Together, the two chains, along with Montgomery Ward, captured 43 percent of all department store sales by 1975.

Sears wouldn't really lose its footing until the 1970s, when new challenges emerged. Skyrocketing inflation meant that low-price retailers such as Target, Kmart, and Walmart, all founded in 1962, lured customers away. The market bifurcated as prosperous upper-middle-class shoppers turned to more luxurious traditional department stores, while bargain seekers found lower prices at the discounters than at Sears.

Women's and girls' underskirts featured in the 1902 Sears Roebuck catalog sold for prices starting at $1.18. Image courtesy of Edward Kitch/Associated Press.

In 1991, Walmart overtook Sears as the nation’s largest retailer. As big box stores began to dominate the country, the department store industry responded through mergers, reorganization and experimentation with the department store category itself. Sears was no exception. The company took many different tacks under a series of problematic leaders, losing sight in the process of its traditional niche, which it ceded to discounters. Sears moved into insurance and financial services. Its credit card business, for example, accounted for 60 percent of its profits at the turn of the 21st century. In 2003, however, it tried returning to its retail core, selling its credit and financial business to Citigroup for $32 billion.

There is a tendency to look at Sears’s decline, and the potential loss of a grand icon of American business, with fond nostalgia. This would be a mistake. Sears embodied many of the uglier aspects of American capitalism, too. Many times, the firm’s management pushed back against forces that benefited workers. Sears tried to undermine organized labor, successfully resisting it even though several other traditional flagship department stores had unionized by the 1940s and 1950s. Company leaders resisted 20th-century progressive social movements that sought economic equality for African Americans and women. Like other department stores, Sears contributed both to structural and daily acts of racism, against customers and workers. African-American boycotts against Sears in the 1930s, for example, exposed racist hiring practices; in the late 1960s, welfare-rights activists revealed the firm’s discriminatory credit policies. Gender inequality was deeply entrenched in its work structure—and challenged, prominently and unsuccessfully, in the famous 1986 “Sears case,” which emerged from an Equal Employment Opportunity Commission complaint concerning discrimination against women, who had been passed over for lucrative commissioned sales jobs in traditionally-male departments.

All of it, good and bad, reflects our nation’s struggle to adapt to larger economic, political, and cultural forces. For historians like myself, who see business as a social institution through which to view and critique the past, the end of Sears will mean more than just one less place to buy my socks.

The post How Sears Industrialized, Suburbanized, and Fractured the American Economy appeared first on Zócalo Public Square.

]]>
http://www.zocalopublicsquare.org/2017/07/20/sears-industrialized-suburbanized-fractured-american-economy/chronicles/who-we-were/feed/ 2