Some of Wisconsin’s best writers hail from the Flatlands. Kristin A. Oakley is one of those.
Oakley’s novel Carpe Diem, Illinois (Little Creek Press, 2014) is a mystery, a suspense thriller, and a romance. Dashing but troubled reporter Leo Townsend hopes to save his career by taking on a ho-hum assignment to profile a small town, Carpe Diem, that is a haven for home schoolers. Just when Townsend arrives to interview the mayor, things in Carpe Diem are heating up, due to an auto crash involving a local activist and the wife of a crusading state senator.
In the process of investigating the town, Townsend finds himself also investigating the accident. The lives and fortunes of the town’s residents—particularly its young, “unschooled” citizens—hang in the balance. There are lots of thrills and twists, and along the way we learn about the philosophy known as “unschooling,” a form of education in which “the children determine what they need to learn, when they will learn it, and how they go about it.”
The book is well-written and moves at a brisk pace. The reader winds up cheering not only for Leo Townsend but also for various teen and adult denizens of Carpe Diem. If you like to examine important social and educational issues in context of suspense and high drama, you’ll enjoy Carpe Diem, Illinois.
Kristin Oakley, who now lives in Madison, was a founder of the In Print professional writers’ organization, is a board member of the Chicago Writers’ Association, and teaches in the UW-Madison Division of Continuing Studies writing program. She is also the mother of two daughters who were home schooled. You can learn more about her at https://kristinoakley.net.
Carpe Diem, Illinois is the first book in the Leo Townsend series. The second, God on Mayhem Street, was released in August 2016.
July 11, 1959: a sultry night all over the country, including Kenosha, Wisconsin.
We played Wiffle® ball under the streetlight at 23rd Avenue and 68th Street. Dom and Loretta Forgianni, Sandy and Pat Johnson, my sister Cynda and me. Sometime after full dark, maybe nine-thirty, we broke it up and went inside.
Mounting the stairs to our second-floor flat above the Forgiannis, I heard an NBC staff voice—sonorous, gray, authoritative—interrupt regular programming to inform us, coast-to-coast, that a jet airliner was in serious trouble in the East. Pan American World Airways Flight 102, a Boeing 707, had dropped two of four wheels of its left main landing gear into the bay on its ascent from New York International Airport—then known as Idlewild, now JFK.
Prudence and Recklessness
The pilot requested at least three thousand feet of Runway 13R be spread with fire suppression foam. During the two hours it took to accomplish this, he circled the airport, burning as much jet fuel as possible to reduce chances of a catastrophic fire on landing. When foaming was complete, the plane flew another hour—burning fuel and preparing 102 passengers and eleven crew members for a possibly rough landing.
In New York, as in Kenosha, it was a hot summer night. Many thousands of bored New Yorkers drove out to Idlewild to view the spectacle. When airport access roads became blocked with traffic, drivers abandoned cars where they stood and swarmed over the runways and taxiways on foot. Idling and abandoned cars blocked roads needed by New York fire and police units attempting to converge on the airport. Those units that did get through combined with Port Authority police to corral the ambulatory thrill-seekers north of Taxiway Q, more than eight hundred feet from Runway 13R.
Down to Earth
Preparations were as complete as possible. With millions of us glued to our TV sets, the pilot touched down his right landing gear at 130 knots and full flaps, dropped the left side gingerly—with a shower of sparks as the sheared-off wheel strut met the runway—held the craft straight and true while blasting full reverse thrust, and came to rest 1,200 feet short of the foamed area of the runway.
Cabin crew deployed emergency egress chutes immediately. Several passengers slid down them before responders cut them away and replaced them with portable stairs. All passengers deplaned in under three minutes.
There was no fire. Hundreds of curiosity seekers encroached on the scene, refused to move back, and got sprayed away by a Port Authority fire truck. Four passengers were injured getting off the plane.
No grand explosion. Nobody died. The pilot was a hero.
The Pilot Was Uncle Ed
My father’s oldest brother, then 44 years of age. The TV reported that around midnight.
Edward Foster Sommers, born in 1914, graduated from high school in Knoxville, Illinois, then attended the University of Washington on a Naval ROTC scholarship. The U.S. Navy taught him to fly. Graduation made him a Naval Reserve officer. On November 29, 1939, he joined Pan American Airways as a co-pilot. After a stint flying Pan Am’s bread-and-butter routes in South America, he came to Oakland, California, in 1940 to fly the transpacific “Clipper” routes in Boeing B-314 “flying boats.” He, his wife Mary, and young daughter Elaine lived on a hillside in Oakland, a pleasant downhill drive to the seaplane harbor at Alameda, from which he flew.
On the morning of December 7, 1941, Uncle Ed had a brush with Infamy as the Anzac Clipper he was flying inbound to Hawaii was forced to divert to Hilo. Its regular port, Pearl Harbor, had unusually heavy traffic that day. But that’s another story.
At the time of his spectacular landing at Idlewild, Uncle Ed was a captain, command pilot of the plane, with full responsibility for 113 lives. He had amassed 17,100 flying hours—not unusual for a professional pilot of two decades’ experience. Only 170 of his hours were in the Boeing 707, which had been in operational service less than eight months.
Interviewed soon after the landing, he said that despite the complex flight skills needed, he never doubted he would set the plane down safely. His main concern was that when the exposed strut touched the runway, the craft might “slew around sideways and take out a few hundred damned fools on the ground.”
A London-bound passenger on Flight 102 was movie director Otto Preminger. If Uncle Ed had not preserved this man’s life that night, we would not have had such motion pictures as Exodus, Advise and Consent, and Hurry Sundown. Or maybe we would have had them anyway—but not directed by Otto Preminger.
According to my cousin Steve, this was the first major incident for the 707. General details can be found in the Civil Aeronautics Board’s accident report. A concise summary is also posted on the Facebook page of the Pan American Museum Foundation, Inc.
In a profile that appeared a day after the big event, the New York Times said Uncle Ed “demonstrated the steady, clearheaded qualities essential to the complete airman.” I think that’s about right. It would be a mistake, however, to think of him as the steely-eyed Robert Stack or John Wayne type that we all desire in a pilot. The real-life Edward Sommers was ordinary. Though his career took him around the world—he and his family lived in Brazil, California, England, Germany, and New Jersey at various times—part of him was still the small-town boy from the flatlands of Illinois. He received at least his share of the dour, phlegmatic, mundane outlook that marks our family. Perfectly at home guiding a multi-million-dollar plane, he leaned on Mary, his charming wife, for the social niceties of life.
I imagine he relished the world’s adulation of this one particular instance of routine superlative performance at his chosen trade. Who doesn’t like recognition? But I also am certain Uncle Ed was glad to have his fifteen minutes of fame behind him. He continued flying for Pan Am until he reached the then-mandatory retirement age of 60, in 1975. His renowned employer went out of business in 1991. Mary having died some time earlier, he lived out his life as a gradually shrinking old man until his passing just a few years ago.
Need for Speed
One of his grandchildren told me of taking “Grandpa Ed” for a motorcycle ride during his later years. With the old man perched on the rear fender, my second cousin (Uncle Ed’s namesake) drove cautiously until his grandfather—like many pilots, intoxicated by speed—said, “Let her rip, Eddie!” The bike then gobbled up a few miles on a rural highway at astoundingly illegal speeds, much to the old man’s delight.
A few paragraphs above I said my uncle was dour, phlegmatic, and mundane—but I never said he was dull.
His name is known to most of us, but it’s unusual to hear it spoken, except around Thanksgiving. Each November, we briefly recall that Squanto taught the Pilgrims how to plant corn, thereby saving their colony from annihilation. We honor him for giving our English ancestors a warm welcome.
There are no true pictures of Squanto. Photography had not been invented; no artist drew him from life. The image above—adopted here mainly for its freedom from legal encumbrance—shows a man with intelligent eyes and open smile, demonstrating the use of fish to fertilize a planting. But Squanto’s story goes far beyond that.
Our knowledge of history can be ten miles wide and one millimeter deep.
“So Squanto helped the Pilgrims get started when they landed at Plymouth. Why, for crying out loud, do we need to know more?”
The Rest of the Story
It’s a fair question, and here’s the fair answer: The full story of Squanto informs us beyond the familiar triumphal tale of European colonization. We heirs of the Pilgrims should desire this information, not to dim the luster of our own history, but to remember it with wisdom and grace.
Squanto never aspired to be the native mentor to the Pilgrims. That role came about because when the Pilgrims arrived at Cape Cod in 1620, Squanto was already quite familiar with the English and even spoke their language.
Six years earlier, he and about twenty other young men of the Patuxet tribe had been snatched in one of many kidnappings by English explorers and freebooters ranging in those days up and down the Massachusetts coast. He was shipped across the Atlantic to Málaga, Spain. In Málaga he was freed by Spanish friars, or escaped on his own, or somehow avoided the life of slavery to which he had been consigned. He made his way—we know not how—to England, where he lived for some time in London.
After a few years, he managed to return to Massachusetts with an English voyage of exploration. When he returned on foot, alone, to the site of his old village, he found it abandoned. All of his people were dead or scattered to the winds.
“Virgin Soil Epidemic”
Squanto’s Patuxet tribe had been utterly wiped out by an illness that swept the Northeastern seaboard in those years. Because that illness did not afflict the many Englishmen and other Europeans mingling with the natives at that time, historians consider this great plague a “virgin soil epidemic”: the kind of epidemic that occurs when new disease organisms are brought by outsiders into the midst of a population that lacks prior exposure to them. Nobody knows for sure what single disease, or combination of diseases, rampaged the Massachusetts coast in those years, but the result was a region cleared of former inhabitants. Thus, when the Pilgrims in 1620 arrived on the Mayflower, in foul weather, desperate for a place to hunker down, there was a choice spot of land recently vacated: Squanto’s former home.
One could hardly blame Squanto had he shown hostility to the new English settlers. Not only had he been abducted and forced into years of exile far from home while his friends and family suffered extinction by disease; many similar kidnappings and other atrocities had been worked upon the local inhabitants in the years preceding the Pilgrims’ landfall. Despite all this, Squanto befriended the Pilgrims.
We would like to think the Pilgrims’ character, which stood apart from these toxic relationships, vouched for them; that remaining Indian tribes, such as the Pokanoket under Chief Massasoit, discerned their peaceful and honorable intentions, well enough at any rate to trust them and form an alliance. In this context, Squanto was far from a “noble savage” who innocently befriended newcomers with a great white vessel and strange ways. Rather, he was a capable, worldly man, acquainted with European technology and customs. He consented—given his footloose status on his former soil—to become a kind of diplomat for the neighboring tribe in its calculated attempt to forge an alliance with the least-threatening and most promising band of Englishmen in the region.
After about twenty months of generally satisfactory service in that role, Squanto himself succumbed to illness, leaving the Pilgrims bereft of one important man who had been their friend in adversity. Governor William Bradford, in Of Plimoth Plantation, writes approvingly of Squanto and his influence on the young colony.
Squanto was a complex individual. The Pilgrims were, like many of us, saints but also sinners. Chief Massasoit and other Native Americans sought to advance their own interests. Latter-day champions of the Pilgrims and other Puritans who poured into Massachusetts starting in 1630 point out that the lands occupied by these English immigrants were acquired in fair, legal purchases, duly recorded in colonial archives. It is also true that white European immigrants—legal niceties aside—began to displace the original inhabitants of the land, who retreated ever further westward. This trend eventuated in King Philip’s War of 1675-76, the first real “Indian War” fought in the English colonies. More than 600 colonists were killed; thousands of Native Americans were killed or displaced. The ultimate effect was the continued advance of English civilization and progressive decimation of the American Indian population.
The Moral of the Story
Nobody can undo the past. The people of the past had their own motives, praiseworthy and otherwise, for everything they did. Wisdom for us in the present requires owning the full truth of the past in all its messy, sometimes inconvenient, complexity.
We’ve all got a good memoir or reminiscence book buried inside us. It’s quite another thing to actually get it out on paper, virtual or real, in any useful form. Because it requires selectivity. Unless you’re a major public figure, the world probably doesn’t need your autobiography. But it might not be able to resist your own take on the choicest bits.
That’s why there is so much to admire in what my friend, Michael Bourgo, has done. His memoir, Once Upon a Time: Growing Up in the 1950s, delivers exactly what the title claims—the experience of childhood in that now-legendary era from which so much of today’s pop culture—Happy Days, Back to the Future, Leave it to Beaver—derives.
Unlike Hollywood’s version, however, Michael’s version has the smack and tang of real events as lived in a particular person’s life. That person happens to be a warm, engaging old man recounting oodles of details from a long-ago period of his life. The struggles of a young family trying to get a start in a dynamic yet unpredictable postwar economy; the thrill of shopping at Marshall Field’s in Chicago’s Loop and dining at one of that elegant store’s six on-site restaurants; the satisfaction of showing up at summer camp self-contained and not dependent on a helicopter mom (yes, they had them in those days, too!) to unpack one’s footlocker.
Most of us, when we go to write a memoir, get overwhelmed by the imperative of sharing everything we have experienced—because every bit of it is significant to us, and we are sure that if we simply spray it out in its entirety, our own deep appreciation of each detail will transfer automatically to the mind of the reader. That is a delusion.
Write for the Reader, Not the Author
What readers want is information that is in some way new and significant to them—not a catalog of what is old and significant to the author. While trotting out an abundance of details from his amazing memory, Michael Bourgo always respects the reader’s need to get something surprising and interesting from the narrative. He also knows when to quit. This never becomes a recitation of everything that happened in the author’s life. He knows that what is significant, that today’s people might need or want to know, has to do with childhood in the Fifties. He sticks to that subject.
With a format composed of solid chapters arranged on chronological and topical lines, alternating with page-long poems that shed further light on matters already covered in prose, Michael gives us a credible understanding of life in the Fifties, one that goes well beyond the stereotypical adventures of Beaver, Wally, and Eddie Haskell.
For example, describing the ritual of young boys getting haircuts in those days: “There was another side to Ken’s [barber shop]. . . . My brother, always a more astute observer than I, figured it out when he was in high school. One day he overheard a strange exchange between a patron and one of the barbers, and he realized they were using some sort of code to set up a wager. So, in addition to cutting hair, Ken’s was also a front for a bookie operation that handled bets on sports. No doubt this was a service that many citizens found useful because in those days there were only two places to place a legal bet—at a horse track or in Las Vegas.” (I also, Dear Reader, patronized that kind of a barber shop as a boy. But I only got my hair cut.)
Those of us who lived through the times Michael Bourgo describes will recognize many of our own experiences in his narrative; and we will encounter other episodes, foreign to our own experience, that reflect the broad range of life lessons disclosed to members of different families in different places.
For readers who did not arrive on the scene before the Fifties finally petered out (around 1965), this well-balanced and life-affirming memoir will showcase a whole new world in richness and nuance—a world that Marty McFly would never find in his DeLorean.
I recommend Once Upon a Time: Growing Up in the 1950s to anyone who would like to re-live the era through a different set of eyes, and also to anyone who would like to experience it for the first time as it really was—not just as shown on TV.
When geezers gather, the gab gets garrulous. There is boasting value in extremes.
“We were so poor that the patches on our jeans had patches on their jeans!”
“What! . . . You had jeans?”
Tales of poverty can still score points, but people who remember the Great Depression are mostly gone. So the extremest thing most of us can conjure these days is the weather.
Eco-warriors among us—whippersnappers!—construe any bump in the barometer, any thump in the thermometer, any slump in the sling psychrometer as a harbinger of the woe we are to reap from Global Warming. Well, maybe.
I can say this for sure: Nobody ever weathered weather like the weather we weathered, back in The Old Days. Gathered geezers may tell of the Terrible Winter of 1935-36, the Great Floods of ’93, the Summer That It Rained Alligator Eggs, or the Year With No Summer Atall. You never know, Dear Reader, when you may find yourself swamped in a five-hundred-year flood of such remembrances.
Winter of Purple Snow
When I mention the Winter of the Purple Snow, people look askance. When I claim that, actually, every winter in The Old Days was a winter of purple snow, a ceiling-mounted wide-angle lens would show a frenzy of Brownian motion away from me and toward the exits.
But it’s all true, every word. We did have purple snow, at least in Streator, Illinois, where my boyhood was misspent. Other cities must have had it, too.
Each winter, the snow tumbled down in December—pure, fluffy, altogether white. Over the next three days, the snow on the ground—not the snow in my backyard, but the snow on every city street—became empurpled. The cause of purple snow is easiest to explain in retrospect: Snow tires had not yet been invented.
In these apocalyptic times—even as we face continual peril from CNN-scale floods, hurricanes, earthquakes, volcanoes, tsunamis, and disaster films—one thing we no longer worry about, much, is sideways slippage on winter streets. All our cars wear radial tires. Radial tires slump a bit. This increases the surface that contacts the road and thus improves traction. Those who like to gild the lily may put on special “winter radial tires” in the fall. They have a deeper, more “road-gripping” tread design in addition to the famous radial slump. Most of us don’t feel a need for this. But before radial tires were invented, deep-tread “snow tires” were better than nothing.
However, in the 1950s, we didn’t even have those. There were only regular bias-ply or belted-bias tires. No special deep tread, no radial slump. They just perched on the ice and slid this way or that. In heavy snow, you might put messy, inconvenient “tire chains” on your tires. These were circular cages, made of interlinked chains, that enveloped each tire. They bit into the snow and ice. If you had to climb a long hill in the country, you needed chains. But on city streets that were half snow-covered and half clear, as is often the case, those chains chewed up the pavement, the tires, and themselves. So you didn’t use them any more than you had to.
“Where,” you ask, “is all this headed? Have you forgotten about the purple snow?” Stay with me, Kind Reader.
We needed something short of chains to help tires grip the street—especially at intersections, where most winter crashes occur. Sand would have been dandy. But why use expensive sand, when you can get crunchy, gritty cinders free of charge? This thrifty solution appealed to the city fathers in Streator and, I’ve got to believe, elsewhere.
You see, our houses were heated by coal. In Illinois, Mother Nature, 350 million years ago, had buried a generous layer of bituminous coal not far underground.
There are three forms, or “ranks,” of coal: anthracite, bituminous, and lignite. Lignite is brown, not much harder than the peat burned by poor Irish cottagers and rich Scottish distillers. Anthracite is hard, black, almost-a-diamond coal that’s mined in Pennsylvania. Bituminous is harder than lignite but not as hard as anthracite. In other words, it is just right—not too hard, not too soft. Goldilocks would have used it in her furnace, for sure.
One ton of bituminous coal cost about five dollars—1950s dollars, that is. About fifty bucks in today’s money, so it wasn’t as cheap as it sounds. But if you could heat your house halfway through the winter on fifty dollars—that wouldn’t be so bad, would it? Bituminous coal was useful, abundant, and cheap.
But “O! The horror!” Did not all this burning coal cause sulfur dioxide, hydrogen sulfide, toxic metal residues, acid rain, air pollution, and so forth? Why, yes. It did. That is why we have air-quality regulations now, why the coal industry looks for low-sulfur deposits. It’s also why most coal-burning homes converted to gas, oil, or electric in the 1960s and ’70s. Through a combination of governmental action and industry initiatives, air and water in most places are cleaner now than they were in the 1950s.
Even in the Fabulous Fifties, however, pollution from coal was not very bad—in most places. It was quite bad in some heavy industrial corridors. But for most of us, the worst side effect was a thin film of soot on our walls.
“Spring cleaning” in those days meant something very particular. Our mothers each April removed coal dust from every interior wall. This was not a happy task that added joy to Mom’s relentless mission of caring for her family. My mother seemed to regard it as an irksome chore. But it had to be done, and done it was.
She bought wall-cleaning putty at the hardware store. She rubbed it over the wall surface, then pulled it out, folded it over to expose clean putty, rubbed again. At the end we had clean walls. Plus many little balls of soiled putty to throw away. When homeowners abandoned coal, the makers of wall-cleaning putty added bright colors to the stuff and called it “Play-Doh.” That’s right, they did. (As Casey Stengel might say if he were alive today, “You could Google it.”)
“BUT WHAT ABOUT THE PURPLE SNOW?”
How to Be a Kid, 1950s Edition
When I was seven, Dad introduced me to my first regular chore—stoking the furnace. The furnace lived in the basement. It was a huge cylinder with ducts about a foot in diameter that sprouted in all directions from its head. The main chamber and all the ducts were padded with asbestos insulation. (See “O! The horror!” above.)
Bituminous coal filled a room near the furnace, called “the coal bin.” Two or three times a year, the coal deliverymen would pour a ton of coal down a metal chute into the coal bin through a basement window.
Our coal came in rough lumps the size of a baseball or softball. It was shiny and black. You could break a lump in two with your bare hands. This exposed the striations of the rock. Sometimes it also exposed a fossil—the outline of a small leaf, for example—that had been trapped in the coal back in the Pennsylvanian Age of geology.
Coal was lightweight, for a rock. It was friable; when you handled it, you got greasy black dust on your hands. I scooped it from the coal bin with a giant shovel, set it in the furnace on top of the coal already aflame there. I had to make sure the new coal caught flame, augmented the fire and did not smother it.
Then I shook down the grates. (Purple snow coming up, Gentle Reader!) Two metal handles protruded from the furnace below the coal door. I rattled these handles; dead ashes and cinders fell through the grates into a hopper below. Once a week we shoveled ashes and cinders—also called “clinkers”—out of the furnace. We carried them to the alley behind our house in a five-gallon can. When the garbage men came by to collect our refuse, they dumped our ashes and clinkers into a separate compartment on their truck.
They collected these materials from every alley in the city. The product, as donated by householders, was a mix of fine, white fly ash and dense, iridescent clinkers. The city washed the fly ash away, leaving the clinkers—small, irregular rocks of metallic slag. A single clinker could be round, bulbous, sharp, jagged—all at the same time. They were multi-hued, but dominated by purple, blue, green, and pink.
The Empurplement of Streator, Illinois
When snow blanketed city streets, crews dumped these clinkers on every intersection for traction. Every passing car crushed them into smaller pieces. Periodically the city replenished the clinkers at the intersections.
Numberless bits of cinder got dragged down the street—transferred from intersections to tires, then deposited in mid-street, in driveways, in alleys, even on sidewalks. By mid-winter, all streets were festooned with purple snow, colored by the powdered residue of our furnace clinkers. It ranged from bright purple-pink to a dull brown slush with just a bit of rosiness.
Snow melts; cinders remain. They lay in small, sharp bits, in gutters and on sidewalks. They formed a light coat over asphalt schoolyards and potholed alleys. They lay in wait for innocent children.
Cinders paved athletic running tracks before the invention of GrassTex, Tartan Track, AstroTurf. Sprinters and middle-distance runners got cinders in their low-cut track shoes, chewing up their feet. Or they fell on the track and embedded tiny chunks of metal under their skin.
The same hazard faced every child who strapped on a pair of roller skates or rode a tricycle pell-mell along uneven sidewalks while clad in short pants and tee shirts. Nobody escaped. Some kids had cinders embedded so deep that years later you could still find the black speck in cheek, knee, or elbow where the projectile had burrowed in.
I noticed last Thursday that the world is going to hell. You say, “The world has always been going to hell.” I say, “Yes, but now it is going straight to hell. Rapidly to hell. Immediately to hell.”
Do not pass “Go,” and do not bother with a handbasket.
Senior citizens have long known that civilization is on the skids. The knowledge comes free with age. You have seen too much. You remember how things were. The good things you remember keep sliding down into the dustbin of entropy. Meanwhile, bad things come up out of nowhere and metastasize across the evening sky.
However, God says:
See, I am doing a new thing! Now it springs up; do you not perceive it? I am making a way in the wilderness and streams in the wasteland.
—Isaiah 43:19 (NIV)
God is preparing Something Completely Different, and it’s on a channel our sets can’t pick up. The Great New Thing of the Future is already here, but we’re looking the wrong way. (Theologians have been known to call this “eschatological tension.”)
Eternity crashes down about our ears in more ways than Chicken Little could ever count.
War. Plague. Famine.
Inflation. Depression. Hard-heartedness.
Dissension. Criticism. Hurtfulness.
Pick your poison.
Whenever things come crashing down, a whole new arrangement waits in the wings.
The old order changeth, yielding place to new,
And God fulfills himself in many ways,
Lest one good custom should corrupt the world.
—Tennyson, Idylls of the King
We waste ourselves attacking visions that diverge from our own. History shows that diversity of viewpoints is a kind of “rocket fuel” that has propelled our society to greatness. We can’t be bothered with that. The deplorable politics of others, we take for our bête noire—perhaps because we face no real existential threats.
The Bible tells us, more than it tells us any other thing, “Fear not.” Yet we continue to be governed by fear. What if we were governed by confidence that the next new wave of things will bring the perfect, peaceable Kingdom of God that much closer to fruition?
Huntley (1911-1974) was an influential broadcaster, a television journalist who co-anchored NBC’s evening news program, The Huntley-Brinkley Report, for fourteen years beginning in 1956. When his run at NBC ended in 1970, Huntley, then 58, became front man for the founding of the Big Sky ski resort in his native Montana. Earlier, he had written a memoir titled The Generous Years: Remembrances of a Frontier Boyhood, published by Random House in 1968. This book was recommended and lent to me by my friend Jerry Peterson.
The Generous Years is a warm and interesting read. We learn much about the childhood of Chet Huntley but more importantly we learn about life in Montana in the first quarter of the twentieth century, seen through the eyes of a boy who, as his adult self tells us more than once, was privileged “to know and remember a few years and a few scenes of the nation’s last frontier.”
The Last Frontier
The Montana of Huntley’s youth was indeed, in many ways, a raw frontier. People made their livings by farming, by herding, by mining and railroading. It was a society that still went about on horseback; motor vehicles, other than steam locomotives, were rare. Old Doc Minnick, the blunt, persevering medico of Huntley’s remembrance, made his housecalls in a one-horse buckboard. The memoir includes those staples of frontier life: prairie fires, locusts, and even an enterprising bank robber foiled by the derring-do of local boys. It’s a tale worth reading, and I commend it to you.
But what of Huntley’s claim to have recorded America’s last frontier? Even while typing the phrase, I thought of Alaskan friends. “What about us?” they would cry. “What are we, chopped liver?” Alaska has been raw frontier much more recently than Montana. Many parts of Alaska still qualify for that distinction. That’s also true of vast swaths of Canada’s Yukon Territory and northern British Columbia. These places are truly “the last frontier.”
Or are they?
The Frontier Thesis
Historian Frederick Jackson Turner put forth in 1893—eighteen years before Huntley’s birth in Montana—an idea that came to be called “the Frontier Thesis” of American history. Turner figured the frontier experience was the main thing that called forth the development of American democracy and other unique aspects of our civilization. Turner’s Frontier Thesis became a mainstay in the scholarly interpretation of U.S. history. It has also been fiercely disputed; yet it still holds considerable sway.
Turner’s thesis took the frontier as a fact of physical geography. He proposed that when the frontier line reached the West Coast about 1890, the first phase of American history had ended. The frontier was no more.
This has not stopped others from declaring new frontiers. One example is likewise rooted in physical geography, although it is extraterrestrial. The moon, by this thinking, is a new frontier—and so is Mars. In 1966, forty-two years after Turner retired from Harvard, actor William Shatner declared all of space to be “the final frontier” in the opening title sequence of the Star Trek television series.
Whoever wrote Shatner’s speech (Gene Roddenberry, et al.) ought to have been more circumspect, because many more “new” and “final” frontiers have since been proposed.
Senator John F. Kennedy, accepting the Democratic nomination for president in 1960, said: “We stand today on the edge of a New Frontier—the frontier of the 1960s, a frontier of unknown opportunities and perils, a frontier of unfulfilled hopes and threats. . . . Beyond that frontier are uncharted areas of science and space, unsolved problems of peace and war, unconquered pockets of ignorance and prejudice, unanswered questions of poverty and surplus.” The phrase “New Frontier” then became a label for Kennedy’s presidential administration—like Teddy Roosevelt’s “Square Deal,” Franklin Roosevelt’s “New Deal,” or Harry Truman’s “Fair Deal.” As political branding it stood for a vaguely defined stance of confronting unknown but large national challenges of the future. In that sense, we will always have a “new frontier” to deal with.
The Perpetual Lure of the Frontier
All this frontiersmanship makes me think that Americans have been so shaped by our frontier experience that we simply cannot do without it. We always need a frontier. Unless we are out on a frontier of some kind, we are not satisfied.
I wonder if Italians, Poles, Vietnamese, or Pakistanis talk and think as much about frontiers as we do. Frederick Jackson Turner and I doubt it.
My irascible sometime friend and former work supervisor, Tim, once went ballistic in my presence over the historical fact that U.S. presidents, including George Washington, Abraham Lincoln, and, in the twentieth century, Woodrow Wilson, had on various occasions issued public calls for “fasting, humiliation, and prayer.”
Tim—alas, now deceased—was a military man. He was quite intelligent, tolerably well-educated, and gripped by a steamy anger that was never far from the surface. He had been raised in a Catholic family but in adulthood described himself as “agnostic.”
He made no quarrel with presidential calls for fasting and prayer. He understood that even in a nation that prohibits “an establishment of Religion,” a leader may give voice to the general religious impulses of the people. But he did not think a chief executive should call for the country to be humiliated.
Tim was a notable narcissist, full of pride in himself and esteeming pride as a general virtue in all cases. He considered humiliation the one thing to be avoided above all. Therefore, to call for the humiliation of the whole nation was tantamount to treason. After all—the British, the Germans, and the Japanese had tried to humiliate us, and we had not let them get away with it. Why, then, do it to ourselves?
With more time and more patience, had I been wiser and deeper, I might have helped Tim understand the concept of national humiliation in a larger context. But I did not.
In his sensitivity to that issue, Tim inadvertently put his finger on a key dimension of America’s church-state relationship. If we understand our nation’s affairs to fall within the Providence of a Power who calls each of us to approach life with Christ-like humility, then it seems proper for all of us, as a body politic, periodically to be humbled. To be reminded, that is, of our proper place in the world under the overarching care of God.
“Humiliation” in this sense may be what Lincoln had in mind when he said, in his Second Inaugural Address,
“If we shall suppose that American slavery is one of those offenses which, in the providence of God, must needs come, but which, having continued through His appointed time, He now wills to remove, and that He gives to both North and South this terrible war as the woe due to those by whom the offense came, shall we discern therein any departure from those divine attributes which the believers in a living God always ascribe to Him?”
That kind of thinking, I believe, is what Washington, Lincoln, and others meant when they called for national “humiliation.”
Past generations have mostly understood and assumed a close kinship between our lives as Christians and our lives as citizens. Although the Establishment Clause of the First Amendment has always forbidden the government to prescribe forms of prayer and worship, nobody construed it to prevent Americans from expressing our religious affiliations and sentiments in our public lives.
Under such a general understanding, it seemed perfectly natural to Americans of the mid-twentieth century to salute our national sovereignty by displaying flags in our houses of worship and recognizing national holidays during regular worship services. But expectations and understandings are much different today.
Our pastor—no bomb-throwing activist, she—called our attention to three articles in the current online Alban Weekly dealing with churches’ sometimes uneasy relationship with Independence Day celebrations. She wanted to know what we thought about them. The leading piece, a nine-year-old reflection from Duke University’s Faith & Leadership website, titled “What to do about the 4th,” written by a retired Methodist minister named Ed Moore, mentioned some “local traditions” that he called “affronting.” These were: “an American flag draped over the Lord’s Table, the Pledge of Allegiance included in the liturgy, or the choir expecting to deliver a patriotic anthem.”
I suppose these “local traditions” must exist somewhere in Christendom, or Rev. Moore would not have called them out. But they must be exceedingly rare. In all my years I have never seen any of these “affronting” cases included in the worship of any churches I have attended. Using the U.S. flag as a communion cloth or a chancel parament? Such a practice must be abhorrent both to Christians and to patriots (bearing in mind that many of us aspire to be both).
Some patriotic expression in worship space, however, has been a commonplace in most churches since the dim past. It might take the form of red/white/blue floral decorations on July Fourth (a practice Rev. Moore okays, faintly); or the display of the flag somewhere in the worship space; or the singing of a patriotic song such as “America the Beautiful” by the congregation on the Fourth, in place of a regular hymn.
The reason such practices come under the microscope of critical examination now is not that we somehow are better educated than our grandparents about the implications of the Establishment Clause. Rather, it’s because we now live in a society that is markedly less religious than theirs was. I believe we are the poorer for that. But it does not follow that those who still keep the faith must embrace a sharp divorce between that faith and our inner sense of national identity. There can be room for both.
In the church where I have been a member for the past forty years, we have never practiced extreme liturgical patriotism. Sometimes we sing a patriotic song or two on national holidays. We used to display a U.S. flag and a “Christian flag” in our sanctuary. We retired those flags a while back; I am not aware of any complaints about that.
But should we, at some future time, choose to restore flags to our worship space, that would not show that we had sold out our Christian faith to some crypto-fascist conspiracy. It would only signal that fashions, or group preferences, had shifted slightly.
Some wise person once decreed that sleeping dogs ought to be permitted their slumber. Despite any number of learned articles that may be written, already or in the future, I doubt that most American church people feel any great tension between their devotion to Christ and their loyalty to our country.
I’ll bet my combustible friend Tim, if he were here today, would at least agree with that.
One of the endearing things about experts is how much escapes their notice. I’m not talking about peripheral matters outside their sphere of expertise. Even things smack dab in their wheelhouse may elude them.
Sometimes, the oversight may be merely geographic.
Take literature. In the United States, “literary fiction” resides in one or two postal codes on the island of Manhattan. The Big Five Publishers and most of their subsidiary imprints are located there—not to mention most of the editors, agents, reviewers, and listmakers (That’s you, New York Times!) who define the genre.
Once, American Literature may have radiated from Concord, Massachusetts, home of Emerson, Thoreau, and the Alcotts. But since the Civil War or even earlier, New York has been The Place. Even otherwise sophisticated people seldom look beyond their own desk and dinner table. Ergo, “literature” is that which is written by people in New York City. Or at least, written by people who know the folkways of the Five Boroughs or could feel themselves at home there—and who write that way.
However: A funny thing happened on the way to the twentieth century. New York critics discovered “regional” writing (also called “local color”). After the Great Conflagration of the nineteenth century, a few southerners (e.g., Kate Chopin), westerners (Mark Twain), and New Englanders (Emily Dickinson) wrote works surprisingly worth reading, despite their focus on far-flung American localities—perhaps, even, because of it. In view of the Recent Unpleasantness, the literary world recognized some kind of national duty to make believe that We Were All Americans, even though some of us were entangled in local allegiances.
By the time I was a schoolboy in the 1950s and ’60s, the literati had digested this wave of regional literature and had reduced it to a few specimens in high school anthologies; a few required books, such as Willa Cather’s My Ántonia; and a general recommendation to read works by Hamlin Garland, Ole Rolvaag, William Faulkner, August Derleth, Erskine Caldwell, Joel Chandler Harris, and Sarah Orne Jewett. The tacit assumption behind this neat packaging of regional literature was that its efflorescence had been temporary, and literature could now revert to normal.
Today, however—more than fifty years later—almost every bit of what’s called “literary” (meaning serious and well-written) fiction is regional, in one way or another. “Local color” writing turns out to have been a hardy varietal that could not be weeded out.
Take Shotgun Lovesongs, a 2014 debut novel by Nickolas Butler. It presents four friends raised in the fictional hamlet of Little Wing, Wisconsin. Three had left to pursue careers in the wider world; one, Henry, had stayed in town to work the dairy farm his parents left him. Now some years have gone by. Kip the Chicago commodities trader, Ronny the rodeo rider, and Lee the music star have all returned—each drawn back by the mystical lure of home. With lots of scenes set in the VFW hall and in the town’s once-derelict (now gentrifying) feed mill, the book has plenty of the familiar cheese curds-peppermint schnapps-cow manure atmosphere that says Wisconsin. But it’s less about local color, less even about the varied career paths the four main men have taken, and more about their loves and friendships—among themselves, with various neighbors, and with the women and children in their lives. So yes, Shotgun Lovesongs is about the glory of the Wisconsin life, but it’s also about the hard things that we Badgers can inflict on one another. It’s not just a Wisconsin book, it’s also a full-fledged “literary” novel in the usual sense, and a fine one at that. It may not be coincidence that the author was educated in the Iowa Writers’ Workshop, which has influenced so many other fine writers.
Another good regional book is John Straley’s Cold Storage, Alaska. Though just as “regional” as Shotgun Lovesongs—maybe more so—and just as deserving of the “literary” label, Cold Storage, Alaska is quite different in tone and approach. For one thing, it’s at heart a crime novel. Most of the characters who move the plot are crooks, writ large or writ small. At the same time, there is something worthy of redemption in each of them. The non-criminal central character, Miles, a health care provider in the Alaska village of Cold Storage, is more reactive than active—yet he’s the stable tentpole around which the whole circus revolves. His arc, though subtler than those of his brother and the other grand and petty crooks in this book, is also perhaps more profound. His great challenge is to remain human while also honoring his compulsion to care for others. Those others, in a place like Cold Storage, are not always easy to serve. If you like crime bosses who aspire to be screenwriters, rock bands who get paid in fish, and an innkeeper-impresario whom wild creatures address in English . . . be sure to pay a visit to Cold Storage.
These are but two among hundreds of books published these days—and in an unbroken train since the beginning of literature in America—with both regional attributes and unmistakable literary talent. It is a great time to be an author . . . or a reader.
A boy named Izzy Mahler, seven years old, springs out of bed and dashes down the stairs. It is a Saturday morning in October, 1952.
Barefoot and pajama’d, Izzy makes straight for the wooden Philco radio, switches it on. Izzy remembers going downtown with Dad to bring home the Philco and its fine supporting table. Ever since—through three apartments, the birth of little Christine, and now the move to this two-story house just across the alley from Grant School—the Philco has been the Mahlers’ proudest possession, and the most useful.
Moving on to the kitchen, Izzy opens the refrigerator, takes out a quart of milk, removes the round cardboard cap from the glass bottle’s neck, and pours himself a glass. Then he sits down at the kitchen table and listens as the radio set in the living room spills forth Let’s Pretend, Buster Brown, and Space Patrol. He sees every detail of each story.
Commander Buzz Corey is just cutting his way into Jelna’s spaceship with an atomic cutting torch when Mom and Dad come out in wrinkled pajamas, rubbing their heads with their knuckles. Izzy wishes he had an atomic cutting torch like Buzz Corey’s, or even just a plain old cosmic ray gun. He would give it to President Truman for copying. That way, should American soldiers run into bug-eyed monsters from Planet Orkulon, they’d be ready.
Christine bangs her tin cup on the wooden tray of her high chair, but Izzy hardly hears. Why can’t you get a ray gun by sending in box tops? he wonders. A ray gun would take more box tops, and probably more quarters, than the usual things like the Lone Ranger decoder ring he lost while helping Buster Wiggins plant potatoes—but it would be worth it. He hopes none of the Wigginses will bite into a spud and break a tooth on his decoder ring.
Now Christine squalls to beat the band, so loud that Izzy can’t hear the radio.
“Harold,” Mom says. Dad stares into space, as usual. Mom plunks down the checkbook with a loud WHACK! Dad sighs and sits down at the kitchen table.
Izzy goes upstairs and gets dressed. When he comes down, Dad frowns over his slide rule, while Mom knits her brows over numbers scrawled on paper with a pencil.
Izzy opens the back door. Dad looks up. “Where are you going, son?”
“Out to play,” Izzy says.
“Be home for supper,” says Mom.
A fictionalized account of true events.
Out of the Ether
I was born in 1945 into a family that couldn’t, or at least didn’t, afford a television set until 1957, when everybody else had already had one for two or three years. As a result, I was privileged to be present at the last stand of radio broadcasting as a mass entertainment medium—before TV gobbled up radio’s best shows and most of its advertising revenue, added a few original programs of its own, and became—well, Television. As we know it.
If you did not experience those “radio days,” let me assure you: radio was great. All the action, all the drama, all the excitement, all the laughs of TV—only you could see it better, because everything played on the full color, panoramic, high-definition screen inside your mind—with all the pans, tilts, and zooms each story required.
Stan Freberg, the advertising world’s comic genius, produced a radio spot, “Stretching the Imagination,” that perfectly illustrates the vast cinematic potential of the sound-only medium. You can hear it at https://www.youtube.com/watch?v=ppZ57EeX6vE.
An Embarrassment of Riches
What kind of shows did radio offer? Besides the Saturday morning fare Izzy consumed in our fictional vignette, there were:
Westerns galore, all of the juvenile variety: Roy Rogers, Gene Autry, Hopalong Cassidy, Bobby Benson and the B-Bar-B Riders. But most of all, every Monday, Wednesday, and Friday at 6:30 p.m.: “In the pages of history there is no greater champion of justice than this daring and resourceful masked rider of the plains, who, with his faithful Indian companion Tonto, led the fight for law and order in the early West. . . . Return with us now to those thrilling days of yesteryear—the Lone Ranger rides again!”
Northerns, starring Royal Canadian Mounted Policemen like Sergeant Preston of the Yukon, with his famous lead dog Yukon King; and Mountie Jim West, The Silver Eagle, voiced by radio legend Jim Ameche—one of the Amici boys from Kenosha, Wisconsin—on Tuesdays and Thursdays in the Lone Ranger’s 6:30 time slot.
Game shows like The Quiz Kids and The $64 Question. That’s not a misprint. Sixty-four dollars was the top prize. That was big money. When television came along, the same show was recycled, “isolation booths” added for showmanship, and three zeroes tacked on to the prizes—so it became The $64,000 Question.
Audience-participation shows like Art Linkletter’s People Are Funny or Ralph Edwards’ Truth or Consequences, in which typical Americans made fools of themselves, on the screen in your mind, for fame, glory, and small sums of money. They may have been forerunners of what is today called “reality TV.”
Comedies, glorious comedies of all descriptions. There was the pompous ventriloquist Edgar Bergen with his dummies Charlie McCarthy and Mortimer Snerd; you could not even see his lips move—at least, on the radio. There were situation comedies of small-town life, like Fibber McGee and Molly and The Great Gildersleeve. Others relied on ethnic identities: The Goldbergs (not to be confused with the 2013 TV series of that name), Life with Luigi (in which Irish-American actor J. Carrol Naish played the title Italian character), and Amos ’n’ Andy (a show whose African American title characters were created and portrayed by white actors Freeman Gosden and Charles Correll). There were comedies about teenagers—Henry Aldrich, Corliss Archer, My Little Margie, and the high school denizens taught by Our Miss Brooks. And there were wholesome family shows like The Adventures of Ozzie and Harriet and Father Knows Best. (Leave It to Beaver, the classic exemplar of this kind of show, never appeared on radio; it was a creature of television only.)
And then there was The Jack Benny Show, in some ways the forerunner of modern shows like Seinfeld. To say the Benny show was comedy is true enough; but it hardly does justice to the subject. Jack Benny was an institution. Perhaps a good subject for a later blog post.