Airplanes added economic, psychological factors to warfare

Alexander Leydenfrost’s oil on canvas Bombers at Night sold for $4,182 at a February 2010 Heritage auction.

By Jim O’Neal

In August 1945, a historic event occurred: Foreign forces occupied Japan for the first time in recorded history. It was, of course, the end of World War II and cheering crowds were celebrating in the streets of major cities around the world as peace returned to this little planet.

A major factor in finally ending the long, costly war against the Axis powers of Germany and Japan was the use of strategic bombing. An essential element was the development of the B-29 bomber – an aircraft that was not yet in service when Japan attacked Pearl Harbor in 1941, forcing a reluctant United States into a foreign war. Maybe it was hubris or fate, but the attack was a deeply flawed decision that would not end well for the perpetrators.

The concept of war being waged from the air dates to the 17th century, when writers speculated on when and where it would begin. The answer turned out to be the Italo-Turkish War (1911-12): On Oct. 23, 1911, an Italian pilot flew the first aerial reconnaissance mission, and a week later the first aerial bomb was dropped on Turkish troops in Libya. The Turks responded by shooting down an airplane with rifle fire.

As World War I erupted seemingly out of nowhere, the use of airplanes became more extensive. For the most part, though, the real war was still waged on the ground by static armies. One bitter legacy of that war was frustration over the futility and horror of trench warfare. Many experts sensed, almost intuitively, that airplanes could help reduce the slaughter, and a consensus evolved that they could best be used as tactical army support.

However, in the 20-year pause between the two great wars, aviation technology improved much faster than other categories of weaponry. Artillery, tanks, submarines and amphibious units underwent only incremental changes. The airplane benefited from increased civilian use and major improvements in engines and airframes. The conversion from wood to all-metal construction quickly spread to wings, crew positions, landing gear and even the lowly rivet.

As demand for commercial aircraft expanded rapidly, increased competition led to significant improvements in speed, reliability, load capacity and, importantly, range. Vintage bombers were phased out in favor of heavier aircraft with modern equipment. A breakthrough came in February 1932, when the Martin B-10 incorporated all the new technologies into a twin-engine plane. The new B-10 was rated the highest-performing bomber in the world.

Then, in response to an Air Corps competition for multi-engine bombers, Boeing produced a four-engine model that made its inaugural flight in July 1935. It was the vaunted B-17, the Flying Fortress. Henry “Hap” Arnold, who would go on to lead the U.S. Army Air Forces, called it a turning point in American airpower. The AAF had created a genuine air program.

Arnold left active duty in February 1946 and saw his cherished dream of an independent Air Force become a reality the following year. In 1949, his five-star rank was redesignated General of the Air Force, making him the only airman ever to hold that rank. He died in 1950.

War planning evolved with the technology, and in Europe strategic long-range bombing was producing results. By destroying cities, factories and enemy morale, the Allies hastened the German surrender. The strategy was comparable to Maj. Gen. William Tecumseh Sherman’s “March to the Sea” in 1864, which added economic and psychological factors to sheer force. Air power was gradually becoming independent of ground forces, and it was generally viewed as a faster, cheaper strategic weapon.

After V-E Day, it was time to force the end of the war by compelling Japan to surrender. The island battles that led toward the Japanese mainland had culminated in the April 1, 1945, invasion of Okinawa and 82 days of horrific fighting that cost some 250,000 lives. This had been preceded by the March 9-10 firebombing of Tokyo, which killed 100,000 civilians, destroyed 16 square miles of the city and left an estimated 1 million people homeless.

Now for the mainland … and the choices were stark and unpleasant: either a naval blockade with massive bombings, or an invasion. Based on experience, many believed the Japanese would never surrender; planners were acutely aware of the “Glorious Death of 100 Million” campaign, designed to convince every inhabitant that an honorable death was preferable to surrendering to “white devils.” The bombing option had the potential to destroy the entire mainland.

The decision to use the atomic bomb on Hiroshima (Aug. 6) and Nagasaki (Aug. 9) led to Japan’s offer of surrender on Aug. 10, paving the way for Gen. Douglas MacArthur to accept the formal surrender and oversee an 80-month occupation by the United States. Today, that decision still seems prudent, despite the fact that we had only those two atomic bombs. Japan now has the third-largest economy in the world at $5 trillion and is a key strategic partner of the United States in the Asia-Pacific region.

Now about those ground forces in the Middle East…

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Artists, writers tapped into America’s traveling spirit

Robert Crumb’s original illustration of Jack Kerouac sold for $33,460 at a February 2011 Heritage auction.

By Jim O’Neal

While we lived in London, I was always fascinated by one characteristic common to our employees. Irrespective of the school – be it Eton, Oxford or the London School of Economics – every curriculum vitae (CV) included an account of the student’s foreign travels in the year following graduation. Asia and Australia were the most popular destinations; only rarely did the list include the United States. After all the studying, cramming for exams and other typical campus activities, they felt an overwhelming compulsion to just travel, sometimes for as long as a year.

America was like that once, not too long ago. Novelist/journalist James Agee (A Death in the Family) wrote about it in Fortune magazine in 1934: Hunger for movement, he said, was “very probably the profoundest and most compelling of American racial hungers.” The road could help satisfy that hunger. Just put the hood ornament on the center line, the speedometer on 80 and let ’er rip. The urge was there before the car … long before … and it invariably sent the country westward. As Huck Finn said: “But I reckon I got to light out for the territory ahead of the rest, because Aunt Sally she’s going to adopt me and sivilize me, and I can’t stand it. I been there before.”

The road was our nation’s ticket to ride and, more precisely, to ride away on. Maybe it was away from who we were, but it was, for sure, away from where we were. To where? Who knew? How about just a fresh start? We could put it all behind us as fast as the car could go.

Novelist/playwright William Saroyan, who liked getting behind the wheel of his Buick, wrote about his desire to hit the road: “It isn’t simply driving at night, it’s going on … to find out what’s out there now, not so much along the highway, in the terrain, under the sky, but in the interior of the driver himself.” Romance with the road was all about get up and go. Wherever you want to go, whenever you want to leave. There were no schedules and no reservations. Time of arrival? Whenever.

Lolita’s Humbert Humbert chose to hit the road to find his interior. Humbert’s creator, the Russian lepidopterist/novelist Vladimir Nabokov, spent two summers on America’s highways, chasing butterflies. A great year for the road was 1957. It was the year painter Edward Hopper gave us his classic Western Motel, that stark symbol of mobility and restlessness, and the year Jack Kerouac, out of the grim mill town of Lowell, Mass., weighed in with his novel On the Road. The road was Kerouac’s characters’ means of escape, just as the Mississippi was for Huck and Jim. On the Road captured the energy of trying to satisfy that hunger for movement.

The true north of the road was west. The West owned those lonesome, inexhaustible roads with few-and-far-between motels designed so that cars could be parked about 20 feet from the beds. There was a lot of nowhere for these roads to cover. Distance was measured in hours (18 from Amarillo to Santa Monica), providing time to think. Playwright Sam Shepard used the road for writing, and that may explain how he got the West so right.

John Steinbeck called Route 66 our “Mother Road.” The Okies (including my whole family) called it their highway to heaven because it got us to California. We didn’t pick fruit like the Joads in The Grapes of Wrath. We bought real estate in Southern California that had its own fruit trees. I picked peaches and apricots off our three acres and sold them in front of our house for 50 cents and $1 a lug. One uncle, a carpenter, bought an entire block, built two houses, sold one for a tidy profit and lived in the other with a semi-alcoholic aunt.

My mother’s three brothers all found great jobs building airplanes and my father bought Pacific Cold Storage in Central Los Angeles (after he divorced my mother). I had two paper routes that netted me $60 a month after expenses (bicycle tires and rubber bands). I could also play night league softball in Huntington Park (we lived in Downey, home of the first Taco Bell 25 years later), and one-on-one basketball every spare minute.

My friends and I lived vicariously through TV shows like Route 66, with Martin Milner and George Maharis playing drifters in a Corvette – the only fictional series shot all over North America – and their stories of working shipyards, oil rigs and shrimp boats from Chicago to Los Angeles. (Corvette sales doubled.) This was followed by Michael Parks in Then Came Bronson and the classic Easy Rider with Peter Fonda, Jack Nicholson and Dennis Hopper.

We had scratched our itch and found gold fast (sunshine, beaches, long-legged tan girls), but it was still fun watching others make their way west.

I wonder if space travel is what itches Elon Musk and Jeff Bezos?

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Franklin Had Faults, but He Remains an Extraordinary American

Norman Rockwell’s illustration of Ben Franklin for a 1926 cover of The Saturday Evening Post sold for $762,500 at a May 2018 Heritage auction.

By Jim O’Neal

George Washington was the only United States president who lived his entire life in the 18th century (1732-1799). Every early vice president – from John Adams on – spent a significant part of his life in the 19th century. Of the Founding Fathers, Benjamin Franklin (often considered the “grandfather” of the group) was born in 1706 and died in 1790 – nine years before even Washington. He was really a member of the previous generation and spent virtually all of his life as a loyal British subject.

As a result, Franklin never had the opportunity to observe the nation he helped create as it struggled to function smoothly. Many were determined not simply to replicate the English monarchy, but to establish a more perfect union in which the common man could prosper (women, slaves and non-property owners would have to wait). Franklin was also a man of vast contradictions. He was a most reluctant revolutionary who discreetly wished to preserve the traditions of the British Empire he had grown so familiar with. He secretly mourned the final break, even as he helped lead the fight for America’s independence.

Even while signing the Declaration of Independence and the Constitution, he – like many former loyalists – hoped for some sort of reconciliation, a hopeless cause after so many careless British transgressions.

Fortunately, we have a rich record of this remarkable man’s life, since he was a prolific writer whose correspondence was greatly admired, and thus preserved by those lucky enough to receive it. Additionally, his scientific papers were highly respected and covered a vast breadth of topics that drew the interest of the brightest minds in the Western world. He knew most of them on a first-name basis from the many years he lived in France and England while traveling the European continent. Government files are replete with the letters he exchanged with heads of state.

Despite his passion for science, Franklin viewed his breakthrough experiments as secondary to his civic duties. He became wealthy as a young man, which gave him the freedom to travel and take on important government assignments. Somehow, he also maintained a pleasant marriage despite his extended absences, some as long as 10 years. He rather quickly developed a reputation as a “ladies’ man” and his social life flourished at the highest levels of society.

Some historians consider him the best-known celebrity of the 18th century. Even today, we see his portrait daily on our $100 bills – colloquially known as “Benjamins” – and earlier generations saw it on common 50-cent pieces and various denominations of postage stamps. Oddly, he is probably better known today, by people of all ages, than he was 200 years ago. That is true stardom, and very few manage to attain it.

Nearly every student in America knows something about Franklin flying a kite in a thunderstorm. They may not know that he proved clouds were electrified and that lightning is a form of electricity. Or that Franklin’s work inspired Joseph Priestley to publish a comprehensive work, The History and Present State of Electricity, in 1767. And it would be exceedingly rare if they knew the prestigious Royal Society honored him with its Copley Medal for the advancement of scientific knowledge. But they do recognize Franklin in any picture.

Others may know of his connection to the post office, unaware that the U.S. postal system was established on July 26, 1775, by the Second Continental Congress, at a time when virtually all mail was sent to Europe rather than between the Colonies. There were no post offices in the Colonies, and bars and taverns filled that role nicely. Today, there are 40,000 post offices handling 175 billion pieces of mail a year (more than 5,000 every second), and the service has an arrangement with Amazon to deliver its packages, even on Sundays. Mr. Franklin helped create this behemoth as the first Postmaster General.

Franklin was also a racist in an era when the word didn’t even exist. He eventually freed his house slaves and later became a staunch opponent of slavery, even sponsoring legislation. Yet he literally envisioned a white America, especially in the Western development of the country. He was alarmed by German immigrants flooding Philadelphia and wrote passionately about their failure to learn English or assimilate into society. He was convinced it would be better if blacks stayed in Africa. His dream was to replicate England, since the new nation had so much more room for expansion than that tiny island across the Atlantic. But we are here to examine the extraordinary mind and curiosity that led to so many successful experiments. Franklin always bemoaned that he had been born too early, and he dreamed about all the wonderful new things that would exist 300 years in the future.

Dear Ben, you just wouldn’t believe it!

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Withholding Created 75 Years Ago – Giving Us a ‘Charge It’ Government

A preliminary sketch Norman Rockwell did for a 1945 Saturday Evening Post cover, titled Income Taxes (Beating the Deadline), sold for $59,375 at a November 2014 Heritage auction.

“Our new Constitution is now established, and has an appearance of permanency, but in this world nothing can be said to be certain, except death and taxes.” – Benjamin Franklin, 1789

By Jim O’Neal

I suspect Benjamin Franklin would be pleased that our Constitution has become the most revered document of our United States, but mildly surprised that the U.S. Internal Revenue Code – while undoubtedly much more prosaic – now symbolizes the highly intimate relationship between the people and their federal government. Detailed IRS regulations guide the filing of federal tax returns, an activity that is the most universal civic act in our history. Its 14,000 pages and 4 million words represent a remarkable achievement unparalleled by any government on earth.

As the size and cost of government have grown, so have the size and difficulty of the tax return itself. In 1913, the first year the modern income tax was levied (an emergency income tax enacted during the Civil War had been allowed to expire in 1872), the top rate was 7 percent, and then only on incomes over $500,000. The rate on incomes between $3,000 and $20,000 was just 1 percent, and below that, zero. All but 1 percent of Americans were exempt. This was by design, since advocates wanted a tax directed only at excess corporate and personal profits, not the wages of ordinary people. It was a way of reasserting the values of the early republic – now focused principally on equality – in reaction to the gross inequities brought on by industrialization, and a way to force millionaire industrialists to share their wealth with society.
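To put those 1913 numbers in perspective, here is a toy calculation in Python using only the rates quoted above. The 6 percent middle band is an illustrative guess – the actual statute layered a 1 percent “normal tax” with graduated surtaxes and exemptions this sketch ignores – so treat it as arithmetic, not the law.

```python
def tax_1913(income: float) -> float:
    """Toy estimate of 1913 income tax from the rates quoted in the text."""
    brackets = [
        (3_000, 0.00),          # exempt, per the text
        (20_000, 0.01),         # the 1 percent band cited above
        (500_000, 0.06),        # assumed intermediate band (illustrative only)
        (float("inf"), 0.07),   # the 7 percent top rate cited above
    ]
    tax, lower = 0.0, 0.0
    for upper, rate in brackets:
        if income > lower:
            # Tax only the slice of income falling inside this bracket.
            tax += (min(income, upper) - lower) * rate
        lower = upper
    return tax

print(tax_1913(10_000))   # 70.0 -> just 1% of the $7,000 above the exemption
print(tax_1913(600_000))  # 35970.0 -> the 7% rate bites only above $500,000
```

Even the hypothetical top earner here keeps about 94 percent of his income – a reminder of how modest the original tax was.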

But WWI and the Great Depression increased the responsibilities of the federal government and rates took a quantum leap with the demands of WWII as the government took advantage of American patriotism. The number of tax filers rose to a point that what had been a “class tax” became a “mass tax.” The April 15 deadline is now a national rite, dreaded as much as it is observed. The complexity has become so pervasive that most filers require the aid of professional tax preparers. Looking back, it still seems remarkable that the income tax could have been extended to include so many people without creating a backlash. The wars helped, as did the success of government in defeating our enemies and the post-war economic growth. But the primary reason was that a new way had been devised to collect it.

For that, the IRS can thank Beardsley Ruml, a mid-century Macy’s executive who came up with a plan to institute what is politely called “withholding.” Until 1943, income tax was paid each year in a lump sum, and filers were expected to put aside the money to make the payment. That year, when the number of wage earners covered by the tax grew by nearly 35 million and the Treasury Department became nervous about how many were actually prepared to pay, Ruml offered an idea. Aware that customers in his store were comfortable buying big-ticket items when they could pay in installments, he suggested the government have businesses collect the tax in small increments from each paycheck and report the totals to employees and the IRS each year for future reconciliation.
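The mechanism is simple enough to sketch with made-up numbers – the point is the installments and the small year-end reconciliation, not the rates. A minimal sketch in Python, assuming a hypothetical $1,200 annual liability and biweekly paychecks:

```python
# Hypothetical numbers only: this illustrates the mechanism, not 1943 rates.
ANNUAL_LIABILITY = 1_200.00  # tax owed for the year
PAY_PERIODS = 26             # biweekly paychecks

# The employer withholds a slice of every paycheck...
withheld_per_check = round(ANNUAL_LIABILITY / PAY_PERIODS, 2)  # $46.15
total_withheld = withheld_per_check * PAY_PERIODS              # $1,199.90

# ...and the filer settles only the rounding difference in April,
# instead of scraping together the full $1,200 in one lump sum.
balance_due = ANNUAL_LIABILITY - total_withheld
print(f"${withheld_per_check:.2f} per paycheck, ${balance_due:.2f} due at reconciliation")
```

Like a Macy’s installment plan, the big bill becomes nearly invisible – which is exactly why it worked.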

To win the public’s endorsement, he also suggested a tax amnesty for the previous year. Congress did just that, forgiving 75 percent of the previous year’s tax liability while it installed the machinery for the withholding that has operated ever since. To appreciate the profound shift that a broad-based income tax brought to the Treasury, consider that in 1910, tariffs and excise taxes brought in more than 90 percent of federal monies; by the end of the century, the income tax had replaced tariffs, providing 90 percent of the nation’s revenue, or $2 trillion! More importantly, it changed the debates – from regional fights over tariffs, or whiskey producers versus cattle growers, to which income levels should be taxed more. Class versus class, and “soak the rich” is always the first reaction to feed the insatiable appetite at every level of government.

As an elastic source of revenue, the income tax became a fundamental part of statism, a tool to be used in the interest of creating a more democratic social order. Look to Washington, D.C., today to see what this has wrought: a city bursting at the seams with lobbyists, industry organizations, tax lawyers and political advocacy groups. Any tall building will have a group with the word “tax” in its title, all working to shape policy and regulations. Yet despite our best efforts, we have become addicted to spending more than our revenue and simply saying “charge it.”

I suspect even Mr. Ruml would be surprised about the success of our “buy-now-pay-later” system that so closely resembles his Macy’s secret sauce.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Do We Risk Forgetting the Past … Again?

An illustration of Jimmy Carter and Gerald Ford, dated 1977 and attributed to Al Hirschfeld, sold for $4,500 at an October 2015 auction.

By Jim O’Neal

People of my generation recall the 1970s as a decade of chronic financial instability. A lethal combination of rising inflation, slower growth and unpredictable economic policies produced a level of volatility that made the stock market a tricky place to navigate. Although the Dow Jones Industrial Average closed near 1,000 in 1966, it went sideways for the next 17 years. In 1972, the “Nifty Fifty” produced a boomlet in stock prices that was followed by a steep decline. By spring 1980, the Dow was back below 800.

Risk-averse investors piled into money market funds (MMFs) with high yields and low risk. Ross Perot supposedly bought $1 billion of 30-year Treasury bonds and locked in a 15 percent yield. Others chose to speculate in commodities or precious metals as a hedge against the pernicious effects of high inflation. President Ford waged a war on inflation with his WIN (Whip Inflation Now) program, which was more slogan than tangible financial policy. Cash was something to convert into tangible assets before it lost its buying power.

One prominent example in 1978 was the wife of the governor of Arkansas. The future first lady turned a modest bankroll of $1,000 into $100,000 in 10 short months by trading in cattle futures, soybeans and live hogs. She explained her market prowess was due to reading The Wall Street Journal. Perhaps even more remarkable was that her trades were mostly “shorts” at a time when cattle prices doubled.

But commodities generally were on the rise, and after the Soviets invaded Afghanistan in 1979, the price of gold rose to $875 an ounce. Nelson Bunker Hunt and his brothers tried to corner the silver market, buying control of 200 million ounces – equivalent to 50 percent of the world’s supply. In the process, silver prices shot up tenfold to $50. The Commodity Exchange (COMEX) and the Federal Reserve stepped in and changed the rules, and by March 1980 the price had plummeted to $10. Despite losing over a billion dollars, the brothers seemed mildly amused and still turned up at Johnny’s BBQ for the usual. Later, they were forced into bankruptcy, but a lot of silverware in Dallas homes got melted down, along with jewelry, teapots and other silver objects.

I lost a $20 gold coin when gold was at $430 by betting it would fall to $400 before it hit $500. I had won the coin on a different bet, by knowing that a horse must run 3 15/16 miles – the combined distance of the three races – to win the Triple Crown. The Wall Street Journal was not involved in either case.

Then the 1980s gave way to the rise of the professional market trader after several leading investment banks went public, transforming cautious partners risking their own limited capital into anonymous shareholders with large capital resources. “Proprietary trading” produced quick profits and large bonuses that offset the NYSE’s elimination of fixed commissions. The flashy trader became a symbol of Wall Street – “Masters of the Universe,” as chronicled in Tom Wolfe’s The Bonfire of the Vanities. It was now the era of greed, and it became an international phenomenon as deregulation and globalization exploded.

Capital whirled around the globe in 24-hour trading, and the remnants of Great Depression-era conservatism quietly vanished. Debt was now viewed as a tax-efficient way to finance corporate takeovers, and deregulation replaced supervision. Hedge funds and private partnerships proliferated – like George Soros’ Quantum Fund, which generated 25 percent returns with highly leveraged bets on stocks, currencies and “risk arbitrage.” In summer 1982, the Federal Reserve reduced the discount rate, incentivizing leveraged buyouts (LBOs) of public companies.

Falling interest rates and rising stock prices created a perfect setting for “junk bonds,” and leverage became a strategy rather than a risk. Eventually the game came to rely on trading on illegal insider information. Corporate raiders had a field day until 1986, when Ivan Boesky was arrested and the action moved to the federal courts. Naturally, the virus spread into the huge home-mortgage market, and the savings-and-loan bubble collapsed.

It took a while for a new generation of greedy financiers to come along, and this time the leverage almost took down the world’s financial system in 2008.

Philosopher George Santayana was right: “Those who do not remember the past are condemned to repeat it.” What’s in your wallet?

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Since the Days of the Pony Express, It’s Been All About Speed

Walter Martin Baumhofer’s oil on board titled The Pony Express, St. Joseph, sold for $10,000 at a March 2012 Heritage auction.

By Jim O’Neal

On Nov. 19, 2007, I watched Charlie Rose interview Jeff Bezos about the new Kindle, which debuted that day and sold out in 5½ hours. Books had been the last bastion of analog holdouts, although there were already plenty of e-books for sale.

The Kindle was something new.

It was a light (10 ounces) device that didn’t require a computer and could store a library of up to 200 books. One hundred and one of the 112 New York Times bestsellers were already available for $9.99 each, in addition to all major newspapers and magazines. You could order the device from Amazon.com for $399, and when it arrived, it automatically recognized its owner. Plus, there were lots of other cool features: a 250,000-word dictionary, access to Wikipedia and its 400 million pages, and the ability to read the first chapter of any book for free before deciding to buy.

I immediately ordered mine, despite an already two-month waiting list. It is still around somewhere gathering dust, but my new Kindle is an app on my iPad (with an audio feature).

I guess we have always been an impatient culture, with the “need for speed” in our DNA. Until the spring of 1860, it took 20 days for mail to get from St. Joseph, Mo., to Sacramento … far too long for most merchants and businessmen. Stagecoaches were simply too slow, until one freight company – Russell, Majors and Waddell of Leavenworth, Kan. – came up with an idea to cut the delivery time in half.

For a fee of $2 to $10 per ounce, depending on the distance, you could use their innovative new mail service: the Pony Express. Starting on April 3, 1860, every Wednesday and Saturday, one rider on horseback would leave Sacramento at noon heading east, and another would leave St. Joseph at 8 a.m. heading west. Riders averaged about 8 miles an hour; every 10 miles or so, they changed horses at a relay station and continued at breakneck speed.

The typical payload of 20 pounds of letters was tucked into a Mexican mochila (knapsack) with four cantinas (locked pockets) for the oiled silk-wrapped letters that kept them dry when the rider inevitably had to cross streams or rivers. The mochila was designed to slip over the saddle horn and the rider sat on it until the next change of horses. Every 75 to 80 miles was a “home station,” where the incoming rider would pass the mochila to a fresh rider and then bunk down until the mail arrived from the opposite direction. Then, somewhat rested, he was off again, back the way he came.

The young, hell-bent-for-leather riders intrigued the nation, and their dedication became a staple of many bedtime stories. Some children heard the tale of 19-year-old Jack Keetley, who rode 340 miles in 31 hours non-stop, until he was taken from the saddle … sound asleep. Fifteen-year-old William Frederick Cody was another of the daring young riders; he would later become famous as Buffalo Bill, with his own traveling Wild West show.

Still, speed was what mattered, and the formal record for the central route’s 1,966 miles is 7 days and 17 hours. That run was of special significance, since it allowed California newspapers to publish President Abraham Lincoln’s inaugural address. The Pony Express was, in every way, about speed. Even its history went by in a flash – 19 months, 2 weeks, 3 days and kaput! The bold experiment was finished, partly because it was costing about $30 a letter, but it was really just another victim of a new technology.

The ride lasted from April 3, 1860, to Nov. 20, 1861 … 596,501 miles, 30,700 pieces of mail and only one lost mochila! The Pony Express and its 80 dedicated riders, 400 horses and 190 relay stations all became irrelevant on Oct. 21, 1861, with the completion of the transcontinental telegraph line. That pattern of one innovation, one technology, giving way to another seems to be accelerating as our impatience and expectations continue to grow.
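For the numerically curious, the figures quoted in this story cross-check rather nicely – a quick back-of-the-envelope calculation (in Python, for fun):

```python
# Sanity-checking the Pony Express figures quoted above.
route_miles = 1_966
schedule_hours = 10 * 24        # the advertised 10-day run (half of 20 days)
record_hours = 7 * 24 + 17      # the 7-day, 17-hour record
relay_stations = 190

print(f"{route_miles / schedule_hours:.1f} mph on the regular schedule")  # ~8.2, the "about 8" above
print(f"{route_miles / record_hours:.1f} mph on the record run")          # ~10.6
print(f"{route_miles / relay_stations:.1f} miles between relay stations") # ~10.3, "every 10 miles or so"
```

The numbers hold up remarkably well for a 160-year-old operation.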

I just ordered four C batteries from Amazon and discovered they won’t be here until tomorrow (bummer!).

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

‘M*A*S*H’ Showed Us How Far Intelligent TV Can Go

Rick Meyerowitz’s original M*A*S*H art for a 1974 cover of TV Guide sold for $657 at a June 2008 Heritage auction.

By Jim O’Neal

By 1983, the population of the United States had grown to 232 million … and virtually everyone was watching television on a regular basis. On Feb. 28, over 50 percent of them (the best estimate is 125 million) tuned in that night to the final episode of one of their all-time favorite shows, M*A*S*H.

For weeks, newspapers had run contests asking readers to suggest how the show should end. “M*A*S*H bashes” were held in every major city, and people donned old Army fatigues to watch the show, primarily in bars. Seventy-one percent of viewers watching television that night helped “Goodbye, Farewell and Amen” become the most-watched show in history. (No. 2 is Cheers, for its finale “One for the Road.”)

We were saying farewell, not just to beloved television characters, but to an era and an anti-war spirit that the show had captured so brilliantly.

M*A*S*H, which ran for 11 years (1972-83) and 251 episodes that snagged nearly 100 Emmy nominations, is still broadcast in reruns and is considered one of network television’s finest efforts. It was based on Richard Hooker’s 1968 bestselling novel MASH: A Novel About Three Army Doctors and the 1970 feature film directed by Robert Altman. (Note: Some nitpickers claim it was really based on the failed film M*A*S*H Goes to Maine, which itself was based on Hooker’s 1972 book sequel.)

The story of a fictional Mobile Army Surgical Hospital near the front lines of the Korean War (technically a U.N. “police action”), the TV show was filled with the high jinks typical of the book and movie, yet it established its own tone of prickly intelligence, wit and sardonic warmth. In tackling the darker aspects of war, the show perfectly echoed a conscience-stricken America deeply troubled by Vietnam. In an exquisite touch of irony, the book’s author was a surgeon from Maine who had served in a MASH unit in Korea and actually hated the show for its anti-war message!

M*A*S*H creator, comedy writer Larry Gelbart, put the wise-cracking, womanizing, yet humane Benjamin Franklin “Hawkeye” Pierce (Alan Alda) at the center of the action. Sharing Hawkeye’s flea-bitten tent were fellow surgeons “Trapper” John McIntyre and Frank Burns, who was having an affair with Margaret “Hot Lips” Houlihan, the strong-willed head nurse. A favorite was Max Klinger, a cross-dresser who would try anything to get sent home. The cast changed over time, and finally even Gelbart left, exhausted from battles with network censors.

The last episode, “Goodbye, Farewell and Amen,” which ran 2½ hours, was a remarkable culmination of everything the series represented: good, funny television drama that probed the ugly underside of war – in this last case, the savagery of peace in the closing days of Korea.

“What happened on the bus?” psychiatrist Sidney Freedman keeps asking Hawkeye, who in the final episode’s opening is in a mental institution.

Slowly, we learn that on July 4, after a day at the beach, the unit’s bus stopped to pick up refugees and wounded G.I.s; to hide from an enemy patrol, the bus pulled into the bushes. Hawkeye keeps hissing at a refugee woman to keep the rooster on her lap quiet. The woman complies and, eventually, the repressed memory emerges … it was no rooster. The woman had smothered her own child.

Hawkeye shakily returns to the 4077th and on the night of the armistice, one of the worst rounds of casualties is brought in. “Does this look like peace to you?” Margaret asks. Then over the PA system comes a litany of the war’s damage, ending with “2 million killed and 100,000 Korean orphans.” As the unit is broken down, each character gropes toward civilian life.

As Hawkeye lifts off in a helicopter, he sees below him on the deserted 4077th, spelled out in stones, a message from his friend B.J. Hunnicutt: GOODBYE.

This last episode, considered the best in television history, was more than a goodbye. It was an example of how far serious and intelligent television can go, and a reminder that it very rarely does.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Sanctions Didn’t Stop Germany from Roaring Back After WWI

A 1939 political cartoon by Charles Werner (1909-1997) for Time magazine comments on the worldwide mood 20 years after the Treaty of Versailles. The original art sold for $836 at a February 2006 Heritage auction.

By Jim O’Neal

From 1939 to the winter of 1941, the German military won a series of battles rarely equaled in the history of warfare. In rapid succession, Poland, Norway, France, Belgium, Holland, Yugoslavia, Denmark and Greece all fell victim to the armed forces of the Third Reich. In the summer and fall of 1941, the USSR came close to total defeat at the hands of the Wehrmacht, losing millions of soldiers on the battlefield and witnessing the occupation of a large portion of Russia and the Ukraine. The German air force, the Luftwaffe, played a central role in this remarkable string of victories.

It was even more startling to the countries that had fought in WWI and imposed draconian disarmament measures when it ended. This was simply something that was NEVER supposed to happen again, much less a mere 20 years later. How was it even possible?

The Allied powers had been so impressed with the combat efficiency of the German air force in WWI that they made a concerted effort to eliminate Germany’s capability to wage war in the air. Then they crippled its civilian aviation capability just to be certain. The Allies demanded the immediate surrender of 2,000 aircraft and the rapid demobilization of the air service. In May 1919, the Germans were forced to surrender vast quantities of aviation materiel, including 17,000 more aircraft and engines. Germany was permanently forbidden from maintaining a military or naval air force.

No aircraft or parts were to be imported and, in a final twist of the knife, Germany was not allowed to control its own airspace. Allied aircraft were granted free passage over Germany and unlimited landing rights. On May 8, 1920, the German air service was officially disbanded.

Other provisions of the Versailles Treaty dealt with the limits of the army and navy, which were denied tanks, artillery, poison gas, submarines and other modern weapons. Germany was to be effectively disarmed and rendered militarily helpless. An Inter-Allied Control Commission was given broad authority to inspect military and industrial installations throughout Germany to ensure compliance with all restrictions.

However, one critical aspect got overlooked in the zeal to impose such a broad set of sanctions: The Allies left unsupervised one of the most influential military thinkers of the 20th century – Hans von Seeckt, who would soon command the postwar German Army. He was among the few who correctly analyzed the operational lessons of the war and accurately predicted the direction future wars would take. Allied generals clung to outdated principles like using overwhelming force to overcome defensive positions, while von Seeckt saw that maneuver and mobility would be the primary means of the future. Mass armies would become cannon fodder, and trench warfare would not be repeated.

The story of the Luftwaffe’s transformation is a fascinating one. Faced with total aerial disarmament in 1919, it was reborn only 20 years later as the most combat-effective air force in the world. Its concepts of future air war, along with its training and equipment, totally trumped the opposition, which was looking backward … always fighting the last war.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

G.I. Bill Crucial to Creation of Our ‘Greatest Generation’

Illustrator Mort Künstler’s depiction of D-Day, which began the liberation of German-occupied northwestern Europe, went to auction in May 2017.

By Jim O’Neal

By 1944, it was clear that World War II would end the following year and America had a difficult question to answer: What to do with the 16.35 million men and women serving in the armed forces when they came home from the war?

One estimate from the Department of Labor was that up to 15 million of them would be unemployed, since the winding-down economy would not be able to absorb them, especially in an orderly fashion. A similar post-war combination of lower production and a bulge of returning veterans had produced a sharp depression after WWI, in 1920-21. To further complicate things, the world was in worse economic shape following the devastation the war had produced. The government had once tried a cash bonus program, and it failed so miserably that many Americans stayed angry for the next decade.

President Franklin Roosevelt was well aware of the potential implications and was determined to avoid a repeat performance. He proactively took to the nation’s airwaves, proposing a series of benefits for all the men and women who had sacrificed so much for the country. The veterans’ self-appointed lobby, the American Legion, grabbed onto the proposal with both hands – as did the Hearst newspapers. Legion publicist Jack Cejnar came up with the term “G.I. Bill of Rights” for what officially passed as the Servicemen’s Readjustment Act of 1944.

Returning veterans could borrow up to $2,000 to buy a house, start a business or start a farm. They would receive $20 a week for up to 52 weeks while they looked for work. There would be lifelong medical assistance, improved services for those disabled in action, and a de facto bonus of $1,300 in discharge benefits.

The effect of the program was substantial and immediate. By 1955, 4.3 million home loans worth $33 billion had been granted. Veterans were responsible for 20 percent of all new homes built after the end of the war. Instead of another depression, the country enjoyed unparalleled prosperity for a generation.

However, few veterans bothered to collect their $20-a-week unemployment checks. Instead, they used the money for the most significant benefits of all: education and vocational training. Altogether, 7.8 million vets received education and training benefits. Some 2.3 million went to college, receiving $500 a year for books and tuition, plus $50 a month in living expenses. The effect was to transform American education and help create a middle class.

College was sheer bliss to men used to trenches and K-rations. By 1946, over half the college enrollments in the country were vets, who bonded into close, supportive communities within the wider campuses. Countless G.I. Bill graduates would go on to occupy the highest ranks of business, government and the professions, and even win Nobel Prizes.

The number of degrees awarded by U.S. colleges and universities more than doubled between 1940 and 1950, and the percentage of Americans with bachelor’s degrees or more rose from 4.6 percent in 1945 to 25 percent a half-century later. Joseph C. Goulden writes in The Best Years, 1945-1950 that the G.I. Bill “marked the popularization of higher education in America.” After the 1940s, a college degree was considered an essential passport for entrance into much of the business and professional world.

Thanks to the G.I. Bill, the millions of men and women who kept our world free and assured its future made a successful entrance into that world. Along the way, they also helped rebuild a world that had been ravaged.

I offer you the Greatest Generation!

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Look to 1935 If Goal Is Infrastructure Projects That Work

Joseph Christian Leyendecker’s cover illustration for the Oct. 19, 1935, edition of The Saturday Evening Post sold for $137,000 at a May 2015 Heritage auction.

By Jim O’Neal

“The social objective is to try to do what any honest government … would do: to try to increase the security and happiness of a larger number of people in all occupations of life and in all parts of the country … to give them assurance that they are not going to starve in their old age.”

Although this could have been taken directly from any Bernie Sanders speech anytime over the past 10 years … it was actually a response from President Franklin D. Roosevelt on June 7, 1935, when answering a question about the social role of government.

This was the same week that Babe Ruth announced his retirement from the Boston Braves, only six days after he hit three home runs in the last game he played. It was the end of an era and it came right in the middle of the Great Depression.

Bread lines were still long and double-digit unemployment was accepted as the new normal. People were generally depressed and hope was a rare commodity.

Technological unemployment threatened to permanently engulf huge sectors of the workforce, particularly less-skilled and older workers. Observers suggested that deep structural changes in the economy meant the majority of those over 45 would never get their jobs back. Lorena Hickok (Eleanor’s paramour) opined, “It looks like we’re in this relief business for a long, long time.” The president’s advisor Harry Hopkins was soon speaking of workers who had passed into “an occupational oblivion from which they will never be rescued,” warning that “we shall have with us large numbers of the unemployed” for years to come. Sound familiar?

Even FDR chipped in with his “Fireside Chat” on June 28, 1934: “For many years to come, we shall be engaged in rehabilitating hundreds of thousands of our American families … The need for relief will continue for a long time; we may as well recognize that fact.”

The Emergency Relief Appropriation Act became law on April 8, 1935, approving the largest peacetime appropriation in American history. This single appropriation authorized more spending than total federal revenues in 1934, with $4 billion earmarked for work relief and public works construction. Roosevelt and the bill’s architects did NOT believe they were addressing a transient disruption in the labor market, but a long-term (perhaps permanent) inability of the private economy to provide employment for all who wanted to work.

Thus were born many federal agencies, the largest being the Works Progress Administration (WPA). The WPA employed 3 million people in its first year, and over eight years it put 8.5 million people to work at a cost of $11 billion. WPA workers built 500,000 miles of highways, 100,000 bridges, a similar number of public buildings, and 8,000 parks.

When the current administration and Congress debate “infrastructure projects,” they would be well served to study this period in American history. These folks really knew how to do it!

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].