Memoir-ization

We’ve all got a good memoir or reminiscence book buried inside us. It’s quite another thing to actually get it out on paper, virtual or real, in any useful form, because that requires selectivity. Unless you’re a major public figure, the world probably doesn’t need your autobiography. But it might not be able to resist your own take on the choicest bits.

That’s why there is so much to admire in what my friend, Michael Bourgo, has done. His memoir, Once Upon a Time: Growing Up in the 1950s, delivers exactly what the title claims—the experience of childhood in that now-legendary era from which so much of today’s pop culture (Happy Days, Back to the Future, Leave It to Beaver) derives.

Unlike Hollywood’s version, however, Michael’s has the smack and tang of real events as lived in one particular person’s life. That person happens to be a warm, engaging old man recounting oodles of details from a long-ago period of his life: the struggles of a young family trying to get a start in a dynamic yet unpredictable postwar economy; the thrill of shopping at Marshall Field’s in Chicago’s Loop and dining at one of that elegant store’s six on-site restaurants; the satisfaction of showing up at summer camp self-contained and not dependent on a helicopter mom (yes, they had them in those days, too!) to unpack one’s footlocker.

Most of us, when we go to write a memoir, get overwhelmed by the imperative of sharing everything we have experienced—because every bit of it is significant to us, and we are sure that if we simply spray it out in its entirety, our own deep appreciation of each detail will transfer automatically to the mind of the reader. That is a delusion.

Write for the Reader, Not the Author

What readers want is information that is in some way new and significant to them—not a catalog of what is old and significant to the author. While trotting out an abundance of details from his amazing memory, Michael Bourgo always respects the reader’s need to get something surprising and interesting from the narrative. He also knows when to quit. This never becomes a recitation of everything that happened in the author’s life. He knows that what is significant, what today’s readers might need or want to know, has to do with childhood in the Fifties. He sticks to that subject.

Jerry Mathers as The Beaver. ABC Television. Public Domain.

With a format composed of solid chapters arranged on chronological and topical lines, alternating with page-long poems that shed further light on matters already covered in prose, Michael gives us a credible understanding of life in the Fifties, one that goes well beyond the stereotypical adventures of Beaver, Wally, and Eddie Haskell. 

For example, describing the ritual of young boys getting haircuts in those days: “There was another side to Ken’s [barber shop]. . . . My brother, always a more astute observer than I, figured it out when he was in high school. One day he overheard a strange exchange between a patron and one of the barbers, and he realized they were using some sort of code to set up a wager. So, in addition to cutting hair, Ken’s was also a front for a bookie operation that handled bets on sports. No doubt this was a service that many citizens found useful because in those days there were only two places to place a legal bet—at a horse track or in Las Vegas.” (I also, Dear Reader, patronized that kind of a barber shop as a boy. But I only got my hair cut.) 

Those of us who lived through the times Michael Bourgo describes will recognize many of our own experiences in his narrative; and we will encounter other episodes, foreign to our own experience, that reflect the broad range of life lessons disclosed to members of different families in different places. 

For readers who did not arrive on the scene before the Fifties finally petered out (around 1965), this well-balanced and life-affirming memoir will unveil, in all its richness and nuance, a whole new world—a world that Marty McFly would never find in his DeLorean.

I recommend Once Upon a Time: Growing Up in the 1950s to anyone who would like to re-live the era through a different set of eyes, and also to anyone who would like to experience it for the first time as it really was—not just as shown on TV.

Blessings,

Larry F. Sommers, Your New Favorite Author 

Purple Snow

When geezers gather, the gab gets garrulous. There is boasting value in extremes. 

“We were so poor that the patches on our jeans had patches on their jeans!”

“What! . . . You had jeans?”

Tales of poverty can still score points, but people who remember the Great Depression are mostly gone. So the extremest thing most of us can conjure these days is the weather. 

Sling psychrometer. CambridgeBayWeather. Public Domain.

Eco-warriors among us—whippersnappers!—construe any bump in the barometer, any thump in the thermometer, any slump in the sling psychrometer as a harbinger of the woe we are to reap from Global Warming. Well, maybe.

I can say this for sure: Nobody ever weathered weather like the weather we weathered, back in The Old Days. Gathered geezers may tell of the Terrible Winter of 1935-36, the Great Floods of ’93, the Summer That It Rained Alligator Eggs, or the Year With No Summer Atall. You never know, Dear Reader, when you may find yourself swamped in a five-hundred-year flood of such remembrances.

Winter of Purple Snow

When I mention the Winter of the Purple Snow, people look askance. When I claim that, actually, every winter in The Old Days was a winter of purple snow, a ceiling-mounted wide-angle lens would show a frenzy of Brownian motion away from me and toward the exits.

But it’s all true, every word. We did have purple snow, at least in Streator, Illinois, where my boyhood was misspent. Other cities must have had it, too. 

Each winter, the snow tumbled down in December—pure, fluffy, altogether white. Over the next three days, the snow on the ground—not the snow in my backyard, but the snow on every city street—became empurpled. The cause of purple snow is easiest to explain in retrospect: Snow tires had not yet been invented.

“So, ‘no snow tires’ equals purple snow?” Exactly.

In these apocalyptic times—even as we face continual peril from CNN-scale floods, hurricanes, earthquakes, volcanoes, tsunamis, and disaster films—one thing we no longer worry about, much, is sideways slippage on winter streets. All our cars wear radial tires. Radial tires slump a bit. This increases the surface area that contacts the road and thus improves traction. Those who like to gild the lily may put on special “winter radial tires” in the fall. These have a deeper, more “road-gripping” tread design in addition to the famous radial slump. Most of us don’t feel a need for them. But before radial tires were invented, deep-tread “snow tires” were better than nothing.

However, in the 1950s, we didn’t even have those. There were only regular bias-ply or belted-bias tires. No special deep tread, no radial slump. They just perched on the ice and slid this way or that. In heavy snow, you might put messy, inconvenient “tire chains” on your tires. These were circular cages, made of interlinked chains, that enveloped each tire. They bit into the snow and ice. If you had to climb a long hill in the country, you needed chains. But on city streets that were half snow-covered and half clear, as was often the case, those chains chewed up the pavement, the tires, and themselves. So you didn’t use them any more than you had to.

“Where,” you ask, “is all this headed? Have you forgotten about the purple snow?” Stay with me, Kind Reader.

We needed something short of chains to help tires grip the street—especially at intersections, where most winter crashes occur. Sand would have been dandy. But why use expensive sand when you can get crunchy, gritty cinders free of charge? This thrifty solution appealed to the city fathers in Streator and, I’ve got to believe, elsewhere.

Coal

You see, our houses were heated by coal. In Illinois, Mother Nature, some 300 million years ago, had buried a generous layer of bituminous coal not far underground.

There are three forms, or “ranks,” of coal: anthracite, bituminous, and lignite. Lignite is brown, not much harder than the peat burned by poor Irish cottagers and rich Scottish distillers. Anthracite is hard, black, almost-a-diamond coal that’s mined in Pennsylvania. Bituminous is harder than lignite but not as hard as anthracite. In other words, it is just right—not too hard, not too soft. Goldilocks would have used it in her furnace, for sure.

One ton of bituminous coal cost about five dollars—1950s dollars, that is. About fifty bucks in today’s money, so it wasn’t as cheap as it sounds. But if you could heat your house halfway through the winter on fifty dollars—that wouldn’t be so bad, would it? Bituminous coal was useful, abundant, and cheap.

But “O! The horror!” Did not all this burning coal release sulfur dioxide, hydrogen sulfide, and toxic metal residues, and cause acid rain and air pollution? Why, yes. It did. That is why we have air-quality regulations now, and why the coal industry looks for low-sulfur deposits. It’s also why most coal-burning homes converted to gas, oil, or electric heat in the 1960s and ’70s. Through a combination of governmental action and industry initiatives, air and water in most places are cleaner now than they were in the 1950s.

Even in the Fabulous Fifties, however, pollution from coal was not very bad—in most places. It was quite bad in some heavy industrial corridors. But for most of us, the worst side effect was a thin film of soot on our walls. 

“Spring cleaning” in those days meant something very particular. Our mothers each April removed coal dust from every interior wall. This was not a happy task that added joy to Mom’s relentless mission of caring for her family. My mother seemed to regard it as an irksome chore. But it had to be done, and done it was.

Casey Stengel. Public Domain.

She bought wall-cleaning putty at the hardware store. She rubbed it over the wall surface, then pulled it out, folded it over to expose clean putty, rubbed again. At the end we had clean walls. Plus many little balls of soiled putty to throw away. When homeowners abandoned coal, the makers of wall-cleaning putty added bright colors to the stuff and called it “Play-Doh.” That’s right, they did. (As Casey Stengel might say if he were alive today, “You could Google it.”)

“BUT WHAT ABOUT THE PURPLE SNOW?”

How to Be a Kid, 1950s Edition

When I was seven, Dad introduced me to my first regular chore—stoking the furnace. The furnace lived in the basement. It was a huge cylinder with ducts about a foot in diameter that sprouted in all directions from its head. The main chamber and all the ducts were padded with asbestos insulation. (See “O! The horror!” above.)

Bituminous coal filled a room near the furnace, called “the coal bin.” Two or three times a year, the coal deliverymen would pour a ton of coal down a metal chute, through a basement window, into the coal bin.

Our coal came in rough lumps the size of a baseball or softball. It was shiny and black. You could break a lump in two with your bare hands. This exposed the striations of the rock. Sometimes it also exposed a fossil—the outline of a small leaf, for example—that had been trapped in the coal back in the Pennsylvanian Age of geology.

Coal was lightweight, for a rock. It was friable; when you handled it, you got greasy black dust on your hands. I scooped it from the coal bin with a giant shovel and set it in the furnace on top of the coal already aflame there. I had to make sure the new coal caught flame, augmented the fire, and did not smother it.

Then I shook down the grates. (Purple snow coming up, Gentle Reader!) Two metal handles protruded from the furnace below the coal door. I rattled these handles; dead ashes and cinders fell through the grates into a hopper below. Once a week we shoveled ashes and cinders—also called “clinkers”—out of the furnace. We carried them to the alley behind our house in a five-gallon can. When the garbage men came by to collect our refuse, they dumped our ashes and clinkers into a separate compartment on their truck. 

They collected these materials from every alley in the city. The product, as donated by householders, was a mix of fine, white fly ash and dense, iridescent clinkers. The city washed the fly ash away, leaving the clinkers—small, irregular rocks of metallic slag. A single clinker could be round, bulbous, sharp, jagged—all at the same time. They were multi-hued, but dominated by purple, blue, green, and pink.

The Empurplement of Streator, Illinois

When snow blanketed city streets, crews dumped these clinkers on every intersection for traction. Every passing car crushed them into smaller pieces. Periodically the city replenished the clinkers at the intersections. 

Voilà! Purple snow. This image is a modern re-enactment, because I only had black-and-white film for my Brownie camera in those days. And besides, purple snow was so normal that nobody would have thought to photograph it. “purple snow” by TORLEY is licensed under CC BY-SA 2.0 

Numberless bits of cinder got dragged down the street—transferred from intersections to tires, then deposited in mid-street, in driveways, in alleys, even on sidewalks. By mid-winter, all streets were festooned with purple snow, colored by the powdered residue of our furnace clinkers. It ranged from bright purple-pink to a dull brown slush with just a bit of rosiness.

Snow melts; cinders remain. They lay in small, sharp bits, in gutters and on sidewalks. They formed a light coat over asphalt schoolyards and potholed alleys. They lay in wait for innocent children.

Cinders paved athletic running tracks before the invention of GrassTex, Tartan Track, and AstroTurf. Sprinters and middle-distance runners got cinders in their low-cut track shoes, which chewed up their feet. Or they fell on the track and embedded tiny chunks of metal under their skin.

The same hazard faced every child who strapped on a pair of roller skates or drove a tricycle pell-mell along uneven sidewalks while clad in short pants and tee shirts. Nobody escaped. Some kids had cinders embedded so deep that years later you could still find the black speck in cheek, knee, or elbow where the projectile had burrowed in.

Was anybody killed or maimed by these clinkers?

Come on. We were made of sterner stuff.

Blessings,

Larry F. Sommers, Your New Favorite Writer

The Next New Thing

I noticed last Thursday that the world is going to hell. You say, “The world has always been going to hell.” I say, “Yes, but now it is going straight to hell. Rapidly to hell. Immediately to hell.”  

No handbaskets need apply. Photo by Melody Bates on Unsplash.

Do not pass “Go,” and do not bother with a handbasket.

Senior citizens have long known that civilization is on the skids. The knowledge comes free with age. You have seen too much. You remember how things were. The good things you remember keep sliding down into the dustbin of entropy. Meanwhile, bad things come up out of nowhere and metastasize across the evening sky.

Wheels coming off the dustbin of entropy. Photo by Jon Toney on Unsplash.

However, God says:

See, I am doing a new thing!
    Now it springs up; do you not perceive it?
I am making a way in the wilderness
    and streams in the wasteland.

—Isaiah 43:19 (NIV)

God is preparing Something Completely Different, and it’s on a channel our sets can’t pick up. The Great New Thing of the Future is already here, but we’re looking the wrong way. (Theologians have been known to call this “eschatological tension.”)

Bubbling up from below, not quite visible, something altogether new. Photo by Daniel Chen on Unsplash

Eternity crashes down about our ears in more ways than Chicken Little could ever count. 

  • War. Plague. Famine.
  • Inflation. Depression. Hard-heartedness.
  • Dissension. Criticism. Hurtfulness.
  • Politics.

Pick your poison.

Whenever things come crashing down, a whole new arrangement waits in the wings.

The old order changeth, yielding place to new,
And God fulfills himself in many ways,
Lest one good custom should corrupt the world. 

—Tennyson, Idylls of the King

We waste ourselves attacking visions that diverge from our own. History shows that diversity of viewpoints is a kind of “rocket fuel” that has propelled our society to greatness. We can’t be bothered with that. The deplorable politics of others, we take for our bête noire—perhaps because we face no real existential threats.

The Bible tells us, more than it tells us any other thing, “Fear not.” Yet we continue to be governed by fear. What if we were governed by confidence that the next new wave of things will bring the perfect, peaceable Kingdom of God that much closer to fruition?

Road to the peaceable kingdom. Photo by Eryk on Unsplash

Blessings,

Larry F. Sommers, Your New Favorite Author

The Newest, Latest, Last Frontier

Chet Huntley has got me thinking about frontiers. 

Chet Huntley. NBC Television. Public domain.

Huntley (1911-1974) was an influential broadcaster, a television journalist who co-anchored NBC’s evening news program, The Huntley-Brinkley Report, for fourteen years beginning in 1956. When his run at NBC ended in 1970, Huntley, then 58, became front man for the founding of the Big Sky ski resort in his native Montana. Earlier, he had written a memoir titled The Generous Years: Remembrances of a Frontier Boyhood, published by Random House in 1968. This book was recommended and lent to me by my friend Jerry Peterson.

The Generous Years is a warm and interesting read. We learn much about the childhood of Chet Huntley, but, more importantly, we learn about life in Montana in the first quarter of the twentieth century, seen through the eyes of a boy who, as his adult self tells us more than once, was privileged “to know and remember a few years and a few scenes of the nation’s last frontier.”

The Last Frontier

The Montana of Huntley’s youth was indeed, in many ways, a raw frontier. People made their livings by farming, by herding, by mining, and by railroading. It was a society that still went about on horseback; motor vehicles, other than steam locomotives, were rare. Old Doc Minnick, the blunt, persevering medico of Huntley’s remembrance, made his house calls in a one-horse buckboard. The memoir includes those staples of frontier life: prairie fires, locusts, and even an enterprising bank robber foiled by the derring-do of local boys. It’s a tale worth reading, and I commend it to you.

But what of Huntley’s claim to have recorded America’s last frontier? Even while typing the phrase, I thought of Alaskan friends. “What about us?” they would cry. “What are we, chopped liver?” Alaska has been raw frontier much more recently than Montana. Many parts of Alaska still qualify for that distinction. That’s also true of vast swaths of Canada’s Yukon Territory and northern British Columbia. These places are truly “the last frontier.” 

Or are they? 

The Frontier Thesis

Frederick Jackson Turner. Public Domain.

Historian Frederick Jackson Turner put forth in 1893—eighteen years before Huntley’s birth in Montana—an idea that came to be called “the Frontier Thesis” of American history. Turner figured the frontier experience was the main thing that called forth the development of American democracy and other unique aspects of our civilization. Turner’s Frontier Thesis became a mainstay in the scholarly interpretation of U.S. history. It has also been fiercely disputed; yet it still holds considerable sway.

Turner’s thesis took the frontier as a fact of physical geography. He proposed that when the frontier line disappeared about 1890, the first phase of American history had ended. The frontier was no more.

New Frontiers

Starship Enterprise. “1701-D” by kreg.steppe is licensed under CC BY-SA 2.0 

This has not stopped others from declaring new areas of frontier-like emphasis. One example is likewise rooted in physical geography, although it is extraterrestrial. The moon, by this thinking, is a new frontier—and so is Mars. In 1966, forty-two years after Turner retired from Harvard, actor William Shatner declared all of space to be “the final frontier” in the opening title sequence of the Star Trek television series.

Gene Roddenberry. NASA. Public domain.

Whoever wrote Shatner’s speech (Gene Roddenberry, et al.) ought to have been more circumspect, because many more “new” and “final” frontiers have been proposed.

JFK. Cecil Stoughton, White House. Public domain.

Senator John F. Kennedy, accepting the Democratic nomination for president in 1960, said: “We stand today on the edge of a New Frontier—the frontier of the 1960s, the frontier of unknown opportunities and perils, the frontier of unfulfilled hopes and unfulfilled threats. . . . Beyond that frontier are uncharted areas of science and space, unsolved problems of peace and war, unconquered problems of ignorance and prejudice, unanswered questions of poverty and surplus.” The phrase “New Frontier” then became a label for Kennedy’s presidential administration—like Teddy Roosevelt’s “Square Deal,” Franklin Roosevelt’s “New Deal,” or Harry Truman’s “Fair Deal.” As political branding, it stood for a vaguely defined stance of confronting unknown but large national challenges of the future. In that sense, we will always have a “new frontier” to deal with.

The Perpetual Lure of the Frontier

Davy Crockett, “King of the Wild Frontier.” Portrait by James Hamilton Shegogue, 1806-1872. Public Domain.

All this frontiersmanship makes me think that Americans have been so shaped by our frontier experience that we simply cannot do without it. We always need a frontier. Unless we are out on a frontier of some kind, we are not satisfied. 

I wonder if Italians, Poles, Vietnamese, or Pakistanis talk and think as much about frontiers as we do. Frederick Jackson Turner and I doubt it.

Blessings,

Larry F. Sommers, Your New Favorite Author

Cross and Flag

My irascible sometime friend and former work supervisor, Tim, once went ballistic in my presence over the historic fact that U.S. presidents, including George Washington, Abraham Lincoln, and, in the twentieth century, Woodrow Wilson, had on various occasions issued public calls for “fasting, humiliation, and prayer.”

Our flag. “US Flag” by jnn1776 is licensed under CC BY-SA 2.0 

Tim—alas, now deceased—was a military man. He was quite intelligent, tolerably well-educated, and always in the grip of a steamy anger that was never far from the surface. He had been raised in a Catholic family but in adulthood described himself as “agnostic.” 

He had no quarrel with presidential calls for fasting and prayer. He understood that even in a nation that prohibits “an establishment of Religion,” a leader may give voice to the general religious impulses of the people. But he did not think a chief executive should call for the country to be humiliated.

“Cross” by dino_b is licensed under CC BY-NC-ND 2.0 

Tim was a notable narcissist, full of pride in himself and esteeming pride as a general virtue in all cases. He considered humiliation the one thing to be avoided above all. To his mind, therefore, calling for the humiliation of the whole nation was tantamount to treason. After all, the British, the Germans, and the Japanese had tried to humiliate us, and we had not let them get away with it. Why, then, do it to ourselves?

With more time and more patience, had I been wiser and deeper, I might have helped Tim understand the concept of national humiliation in a larger context. But I did not.

In his sensitivity to that issue, Tim inadvertently put his finger on a key dimension of America’s church-state relationship. If we understand our nation’s affairs to fall within the Providence of a Power who calls each of us to approach life with Christ-like humility, then it seems proper for all of us, as a body politic, periodically to be humbled. To be reminded, that is, of our proper place in the world under the overarching care of God.

“Humiliation” in this sense may be what Lincoln had in mind when he said, in his Second Inaugural Address,

Abraham Lincoln. “twlncn63” by gvgoebel is licensed under CC PDM 1.0 

“If we shall suppose that American slavery is one of those offenses which, in the providence of God, must needs come, but which, having continued through His appointed time, He now wills to remove, and that He gives to both North and South this terrible war as the woe due to those by whom the offense came, shall we discern therein any departure from those divine attributes which the believers in a living God always ascribe to Him?” 

That kind of thinking, I believe, is what Washington, Lincoln, and others meant when they called for national “humiliation.”

Past generations have mostly understood and assumed a close kinship between our lives as Christians and our lives as citizens. Although the Establishment Clause of the First Amendment has always forbidden the government to prescribe forms of prayer and worship, nobody construed it to prevent Americans from expressing our religious affiliations and sentiments in our public lives.

Under such a general understanding, it seemed perfectly natural to Americans of the mid-twentieth century to salute our national sovereignty by displaying flags in our houses of worship and recognizing national holidays during regular worship services. But expectations and understandings are much different today.

Our pastor—no bomb-throwing activist, she—called our attention to three articles in the current online Alban Weekly dealing with churches’ sometimes uneasy relationship with Independence Day celebrations. She wanted to know what we thought about them. The leading piece, a nine-year-old reflection from Duke University’s Faith & Leadership website, titled “What to do about the 4th,” written by a retired Methodist minister named Ed Moore, mentioned some “local traditions” that he called “affronting.” These were: “an American flag draped over the Lord’s Table, the Pledge of Allegiance included in the liturgy, or the choir expecting to deliver a patriotic anthem.”

I suppose these “local traditions” must exist somewhere in Christendom, or Rev. Moore would not have called them out. But they must be exceeding rare. In all my years I have never seen any of these “affronting” cases included in the worship of any churches I have attended. Using the U.S. flag as a communion cloth or a chancel parament? Such a practice must be abhorrent both to Christians and to patriots (bearing in mind that many of us aspire to be both).

Some patriotic expression in worship space, however, has been a commonplace in most churches since the dim past. It might take the form of red/white/blue floral decorations on July Fourth (a practice Rev. Moore okays, faintly); or the display of the flag somewhere in the worship space; or the singing of a patriotic song such as “America the Beautiful” by the congregation on the Fourth, in place of a regular hymn.

The reason such practices come under the microscope of critical examination now is not that we somehow are better educated than our grandparents about the implications of the Establishment Clause. Rather, it’s because we now live in a society that is markedly less religious than theirs was. I believe we are the poorer for that. But it does not follow that those who still keep the faith must embrace a sharp divorce between that faith and our inner sense of national identity. There can be room for both.

The Christian flag.

In the church where I have been a member for the past forty years, we have never practiced extreme liturgical patriotism. Sometimes we sing a patriotic song or two on national holidays. We used to display a U.S. flag and a “Christian flag” in our sanctuary. We retired those flags a while back; I am not aware of any complaints about that. 

But should we, at some future time, choose to restore flags to our worship space, that would not show that we had sold out our Christian faith to some crypto-fascist conspiracy. It would only signal that fashions, or group preferences, had shifted slightly.

Some wise person once decreed that sleeping dogs ought to be permitted their slumber. Despite any number of learned articles that may be written, already or in the future, I doubt that most American church people feel any great tension between their devotion to Christ and their loyalty to our country.

I’ll bet my combustible friend Tim, if he were here today, would at least agree with that.