Tremendous Challenges Awaited the Plainspoken Truman

Fewer than 10 examples of this Harry Truman “60 Million People Working” political pin are known to exist. This pin sold for $19,717 at an August 2008 Heritage auction.

By Jim O’Neal

When Franklin Roosevelt died on April 12, 1945, Harry Truman became the seventh vice president to move into the Oval Office after the death of a president. Truman had been born during the White House years of Chester Arthur, who had followed James Garfield after his assassination (1881). And in Truman’s lifetime, Teddy Roosevelt and Calvin Coolidge had ascended to the presidency after the deaths of William McKinley (1901) and Warren Harding (1923). However, none of these men had been faced with the challenges awaiting the plainspoken Truman.

FDR had been a towering figure for 12 years, first leading the country out of the Great Depression and then deftly steering it through World War II, having been elected a record four times. Unfortunately, Truman had not been involved in many important decisions and was totally unaware of key strategic secrets (e.g. the development of the atom bomb), or even of side agreements made with others, notably Winston Churchill. He was not prepared to be president.

Even the presidents who preceded FDR only accentuated the gap in Truman’s experience: Woodrow Wilson was a brilliant academic and Herbert Hoover a world-famous engineer. There were enormously important decisions to be made that would shape the world for the next half century. Truman himself had sincere doubts about being able to follow FDR, even as the president’s health was rapidly failing.

The significance of these events has gradually faded, but for Truman they arrived in rapid order: April 12, FDR’s death; April 28, Benito Mussolini killed by Italian partisans; April 29, the surrender of German forces in Italy; and the next day, Adolf Hitler’s suicide. The news from the Pacific was equally dramatic, as troop landings on the critical island of Okinawa had apparently been unopposed by the Japanese. It was clearly the apex of optimism regarding the prospects for an unconditional surrender by Japan and the welcome return of world peace.

In fact, it was a miracle that turned out to be a mirage.

After victory in Europe (V-E Day), Truman faced an immediate challenge regarding the 3 million American troops there. FDR and Churchill did not trust Joseph Stalin and were wary of what the Russians would do if we started withdrawing our forces. Churchill proved to be right about Soviet motives: they secretly intended to occupy the whole of Eastern Europe permanently and expand into adjacent territories at will.

Then the U.S. government issued a report stating that the domestic economy could make a smooth transition to pre-war normalcy once the voracious demands from the military war-machine abated. Naturally, the war-weary public strongly supported “bringing the boys home,” but Truman knew that Japan would have to be forced to quit before any shifts in troops or production could start.

There was also a complex scheme under way to redeploy the troops from Europe to the Pacific if the Japanese decided to fight on to defend their sacred homeland. It was a task that George Marshall would call “the greatest administrative and logistical problem in the history of the world.”

Truman pondered in a diary entry: “I have to decide the Japanese strategy – shall we invade Japan proper or shall we bomb and blockade? That is my hardest decision to date.” (No mention was made of “the other option.”)

The battle on Okinawa answered the question. Hundreds of Japanese suicide planes had a devastating effect. Even after 10 days of heavy sea and air bombardment of the island, 30 U.S. ships were sunk and 300 more damaged; 12,000 Americans were killed and 36,000 wounded. It was now obvious that Japan would defend every single island, regardless of its losses. Surrender would not occur and America’s losses would be extreme.

So President Truman made a historic decision that is still being debated today: Drop the atomic bomb on Japan and assume that the effect would be so dramatic that the Japanese would immediately surrender. On Aug. 6, 1945, “Little Boy” was dropped on Hiroshima with devastating effects. Surprisingly, the Japanese maintained their silence, perhaps not even considering that there could be a second bomb. That second bomb – a plutonium variety nicknamed “Fat Man” – was then dropped two days ahead of schedule on Aug. 9 on the seaport city of Nagasaki.

No meeting had been held and no second order was given (other than by Enola Gay pilot Paul Tibbets). The directive that had ordered the first bomb simply said in paragraph two that “additional bombs will be delivered AS MADE READY.” However, two proved to be all that were needed. Imperial Japan surrendered on Aug. 15, thus ending one of history’s greatest wars.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

If President Jackson Had Followed Through with a Threat…

This U.S. Colt Model 1877 Bulldog Gatling Gun, with five 18-inch barrels secured in brass casement, realized $395,000 at a December 2014 Heritage auction.

“An army travels on its stomach.”

By Jim O’Neal

Both Frederick the Great and Napoleon Bonaparte are credited with versions of this aphorism, which emphasizes that a well-provisioned military is critical to its performance. In 1795, France offered a 12,000-franc prize to anyone who could solve this persistent problem. In 1809, a confectioner named Nicolas Appert claimed the prize by inventing a heating, boiling and sealing process that preserved food much as modern canning does.

During the Revolutionary War, General Washington had to contend with this issue, as well as with uniforms and ordnance (e.g. arms, powder and shot), which were essential to killing or capturing the British enemy. Responsibilities were far too dispersed and decision-making overly reliant on untrained personnel.

By the dawn of the War of 1812, the War Department convinced Congress that all these activities should be consolidated under experienced military personnel. On May 14, 1812, the U.S. Army Ordnance Corps was established. Over the past 200-plus years, 41 different men (mostly generals) have held the title of Army Chief of Ordnance. The system has evolved slowly and is regarded as a highly effective organization at the center of military actions in many parts of the world.

However, when the Civil War started in 1861, the man in charge was General James Wolfe Ripley (1794-1870), a hardheaded, overworked old veteran whom Andrew Jackson had once threatened to hang for disobedience during the war with the Creek Indians. Ripley believed the war would be short and that all the North needed was an ample supply of orthodox weapons. He flatly refused to authorize the purchase of additional rifle-muskets for the infantry, primarily because of a large inventory of smoothbore muskets in various U.S. ordnance centers. Furthermore, he adamantly refused to allow the introduction of the more modern breech-loading repeating rifles because of a bizarre belief that ammunition would be wasted.

After two years of defiantly resisting the acquisition of new, modern weaponry, he was forced to retire. He was derided by the press as an old fogy, while some military historians claim he was personally responsible for extending the war by two years – a staggering indictment, if in fact true!

One prominent example occurred in early June 1861 when President Lincoln met the first-known salesman of machine guns: J.D. Mills of New York, who performed a demonstration in the loft of a carriage shop near the Willard Hotel. Lincoln was so impressed that a second demonstration was held for the president, five generals and three Cabinet members. The generals were equally impressed and ready to place an order on the spot. But Ripley stubbornly managed to delay any action.

Lincoln was also stubborn and personally ordered 10 guns from Mills for $1,300 each without consulting anyone. It was the first machine gun order in history.

Then, on Dec. 18, 1861, General George McClellan bought 50 of the guns on a cost-plus basis for $750 each. Two weeks later, a pair of these guns debuted in the field under Colonel John Geary, a veteran of the Mexican War, the first mayor of San Francisco and, later, governor of both Kansas and Pennsylvania. Surprisingly, he wrote a letter saying they were “inefficient and unsafe to the operators.” But the colorful explorer General John C. Fremont, who commanded in West Virginia, sent an urgent dispatch to Ripley demanding 16 of the new machine guns.

Ripley characteristically replied:

“Have no Union Repeating Guns on hand and am not aware that any have been ordered.”

After several other tests produced mixed results, Scientific American wrote a requiem for the weapon, saying, “They had proved to be of no practical value to the Army of the Potomac and are now laid up in a storehouse in Washington.”

Then, belatedly, came a gifted inventor, Richard J. Gatling, who patented a six-barrel machine gun on Nov. 4, 1862. Gatling tried to interest Lincoln, who had by then turned to other new weapons. However, some of Gatling’s guns managed to get into service, and three were used to help guard The New York Times building during the draft riots of July 1863. The guns eventually made Gatling rich and famous, but it was more than a year after the end of the war – Aug. 14, 1866 – before the U.S. Army became the first army to adopt a machine gun … Gatlings!

It is always fun to consider counterfactuals (i.e. expressing what might have happened under different circumstances). In this case, if Andrew Jackson had hanged Ripley, then the North would have had vastly superior weaponry – especially the machine gun – and the war would have ended two years earlier. Many battles would have been avoided … Gettysburg … Sherman’s March to the Sea. Lincoln would have made a quick peace, thereby avoiding the assassination on April 14, 1865.

If … if … if …

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Much Has Changed Since the Invention of ‘Instant’ Cameras

An Eastman Company porcelain enamel on steel advertising sign sold for $1,625 at a September 2017 auction.

By Jim O’Neal

In 2014, Internet Trends reported that people uploaded 1.8 billion images every day, or 657 billion per year. One curious statistician calculated that every two minutes, more photos are snapped than the total number in existence 150 years ago. Since there is an obvious correlation between smartphones, populations and photos, if my math is correct, that translates to 1.2 trillion photos taken in 2017. However, all that I am certain of is that in the past three years I’ve personally taken approximately zero – despite having the newest iPhone around almost constantly. There must be a term for people like me, but I’m not familiar with it.
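The arithmetic is easy to check. Below is a minimal back-of-the-envelope sketch in Python; the 1.8 billion-a-day and 1.2 trillion figures come from the paragraph above, while the implied annual growth rate is my own inference, not a number from the Internet Trends report.

```python
# Back-of-the-envelope check of the photo figures cited above.
# 1.8 billion/day (2014) and the 1.2 trillion estimate for 2017 come from
# the article; the implied growth rate is an inference, not a sourced figure.

photos_per_day_2014 = 1.8e9
photos_2014 = photos_per_day_2014 * 365        # ~657 billion per year

photos_2017_estimate = 1.2e12                  # the article's 2017 figure

# What constant annual growth rate would bridge the two totals over three years?
implied_growth = (photos_2017_estimate / photos_2014) ** (1 / 3) - 1

print(f"2014 total: {photos_2014:,.0f}")               # 657,000,000,000
print(f"Implied annual growth: {implied_growth:.1%}")  # roughly 22%
```

In other words, the 1.2 trillion figure only requires photo-taking to have grown a bit over 20 percent a year since 2014, which seems plausible given how many more smartphones were in use by 2017.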

Anyway, the man who probably deserves most of the credit for this photographic phenomenon is George Eastman. In 1888, he invented the Kodak camera and a dry, transparent, flexible photographic film to use with the camera. Eastman was the founder of the once-famous Eastman Kodak Company and personally dreamed up a brilliant advertising slogan to induce average people to buy his new invention: “You push the button – we do the rest.”

Then in 1900, Eastman unveiled the first Brownie cameras and people quickly joined the Brownie Camera Club by the thousands. It was truly the birth of the snapshot, and it fostered the novel idea that every family (not just the rich) could create a visual history for themselves and the generations that followed. Individual snapshots may have been small in physical size, but the effect was hugely democratic.

Those first Brownies used film that sold for 15 cents a roll, making photography financially feasible for virtually everyone in the country, no matter where they lived. More than a quarter of a million Brownies were sold in the first year and an astonishing 50 million by the early 1940s.

Eastman (1854-1932) was supremely confident in the power of advertising to increase sales, and even more confident that a major effort to educate the public would be essential to making the camera a mass-market product. He wrote all the ads personally and championed the expansion of the brand internationally. One specific example was the word “Kodak” sparkling from an electric sign overlooking Trafalgar Square in central London.

When asked about the derivation of the word Kodak, Eastman would invariably say that “the letter K was my favorite. It’s a strong, incisive sort of letter. So it was simply a matter of trying out a great number of combinations of words that started and ended with K.” Add the distinctive yellow color that Eastman selected and Kodak became an instantly recognized brand all over the world.

The word that didn’t seem to fit the true impact of Eastman’s invention was “hobby.” His Brownie meant that aspiring photographers no longer had to be bothered by technical camera settings, precise focus or even film development. After exposing the film, the entire camera was then shipped back to Eastman’s factory. The film was developed, camera reloaded and mailed back to the customer along with the mounted prints. As early as 1896, the 100,000th Kodak camera was manufactured and the factory was churning out 400 miles of film and photography paper each month.

Eastman not only had powerful, creative ideas; he understood that he had to execute better than any of the inevitable competition. In time, Kodak had developed an impeccable reputation for affordable cameras and film. It was a formula that was replicated by Gillette (for razors and blades) and Sony (its Walkman provided a private, convenient concert in your ear any time of your choosing).

Yet as successful as Kodak became, there was someone else who would perceive our desire for even greater speed and instant gratification. (No, not Jeff Bezos.)

His name was Edwin Land (1909-1991) and in December 1943, he was on vacation with his family, walking around taking pictures (probably with a Kodak). Back in their room, his daughter posed a simple question: “But Daddy, why can’t I see the pictures now?” Instead of the standard reply … “Because you can’t” … Land started working on solving that problem. He recalled, “Within an hour, the camera, the film and the physical chemistry became so clear. I rushed to my patent attorney and described in great detail a dry camera which would give a picture immediately after exposure.”

Of course, he had conceptualized the Polaroid instant photographic process, which would own that category for decades. Their SX-70 instant color camera was an overnight success as we all responded to the massive TV advertising in which Sir Laurence Olivier sold us on the revolutionary idea of instant photography.

Alas, these two iconic photographic companies ended up in one of the largest patent suits in history: Polaroid sued Kodak in 1976 for infringing 12 patents, and the litigation lasted until 1985, when Kodak was found to have infringed seven of them. The story has a sad ending, since Kodak is now a shadow of its former size and Polaroid ended up in bankruptcy, both victims of digital photography, which rendered virtually everything else obsolete.

But think about all the memories stored in every house in America, just waiting for someone to take another look and spend time figuring out who all these people were and the stories behind when the pictures were taken. There is joy in all those cabinets.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

America Has a Long History of Rough-and-Tumble Politics

A cabinet card photograph dated 1852, shortly after the marriage of Rutherford and Lucy Hayes, went to auction in October 2008.

By Jim O’Neal

A surprisingly high number of political pundits ascribe the current bitter partisan divide to the presidential election of 2000, when the Supreme Court ordered the recount of “under-votes” in Florida to cease. As a result, the previously certified election results would stand and George W. Bush would receive all 25 Florida electoral votes, thus providing him a 271-266 nationwide victory over Al Gore. Democrats almost universally believed the election had been “stolen” due to the seemingly unprecedented action by the Supremes.

Although obviously a factor in the situation today, it seems too simplistic to me, as I remember the Clinton Impeachment, the start of the Iraq War (and the president who lied us into war), and, of course, Obamacare – all of which were also major contributors to the long, slow erosion of friendly bipartisanship. Now, we’re in an era when each new day seems to drag up a new issue that Americans can’t agree on and the schism widens ever so slightly.

Could it be worse?

The answer is obviously “yes,” since we once tried to kill each other into submission during the Civil War. Another good example is the highly controversial presidential election of 1876, which resulted in Rutherford B. Hayes becoming president. The loser, Samuel J. Tilden, had such staunch supporters that they promised “blood would run in the streets” if their candidate lost. After the controversial decision threw the election to Hayes, Democrats continued to make wild threats, and public disturbances were rampant across New York City hotels, saloons, bars and any other venues where crowds gathered.

The unrest ran so high that outgoing President Ulysses S. Grant gradually became convinced that a coup was imminent. This was the closest the Democrats had come to the White House since James Buchanan’s election 20 years earlier in 1856, and passions would not be calmed easily. The resentment was about much more than losing an election or the ascendancy of the Republican Party with its fierce abolitionists. It seems apparent even today that the election results had been politically rigged or, at a minimum, very cleverly stolen in a quasi-legalistic maneuver.

Grant’s primary concern was one of timing. The normal inauguration date of March 4 fell on a Sunday and tradition called for it to be held the next day, on Monday, March 5 (as with Presidents James Monroe and Zachary Taylor). Thus the presidency would be technically vacant from noon on Sunday until noon on Monday. The wily old military genius knew this would be plenty of time to pull off a coup d’état. He insisted Hayes not wait to take the oath of office.

In a clever ruse, the Grants made arrangements for a secret oath-taking on Saturday evening by inviting 38 people to an honorary dinner at the White House. While the guests were being escorted to the State Dining Room, Grant and Hayes slipped into the Red Room, where Chief Justice Morrison Waite was waiting with the proper documents. All went as planned until it was discovered there was no Bible available. No problem … Hayes was sworn in as the 19th president of the United States with a simple oath.

The peaceful passing of power has been one of the outstanding aspects of our constitutional form of governance.

Hayes was born on Oct. 4, 1822 – 2½ months after his father had died of tetanus, leaving his pregnant mother with two young children. From these less-than-auspicious beginnings, the enterprising “Rud” got a first-rate education that culminated in an LLB degree from Harvard Law School. Returning to Ohio, he established a law practice, was active in the Civil War and eventually served three terms as governor of Ohio, which proved to be a steppingstone to the White House.

Most historians believe Hayes and his family were the richest occupants of the White House until Herbert and Lou Hoover showed up 52 years later. They certainly had a reputation for living on the edge of extravagance, and some cynics believe this was in large part due to their banning of all alcohol in the White House (presidents in those days paid for booze and wine personally). Incidentally, the first lady’s nickname, “Lemonade Lucy,” was not coined until long after they left the White House.

President Hayes kept his pledge to serve only one term; he died of a heart attack in 1893 at age 70. The first Presidential Library in the United States was built in his honor in 1916.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Here’s Why Scientists Like Joseph Lister Have Made Life Better for All of Us

A March 25, 1901, letter signed by Joseph Lister went to auction in October 2014.

By Jim O’Neal

In the 1880s, American physicist Albert Michelson embarked on a series of experiments that undermined a long-held belief in a luminiferous ether that was thought to permeate the universe and affect the speed of light ever so slightly. Embraced by Isaac Newton (and almost venerated by all others), the ether theory was considered an absolute certainty in 19th century physics in explaining how light traveled across the universe.

However, Michelson’s experiments (partially funded by Alexander Graham Bell) proved the exact opposite of the theory. In the words of author William Cropper, “It was probably the most famous negative result in the history of physics.” The speed of light was the same in all directions and in every season, overturning an assumption, traced back to Newton, that had stood for the past 200 years. But not everyone agreed for a long time.

The more modern physicist Max Planck (1858-1947) explained this resistance to new facts in a rather novel way: “A scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die and a new generation grows up that is familiar with it.”

Even if true, that makes it no easier to accept the fact that the United States was the only nation “that remained unconvinced of the merits of Joseph Lister’s methods of modern antiseptic medicine.” In fact, Henry Jacob Bigelow (1818-1890), the esteemed Harvard professor of surgery and a fellow of the Academy of Arts and Sciences, derided antisepsis as “medical hocus-pocus.” This is even more remarkable when one considers that he was the leading surgeon in New England and that his contributions to orthopedic and urologic surgery are legendary.

But this short story begins with a sleight of hand by asking: In the 19th century, what do you think was the most dangerous place in the vast territories of the British Empire? The frozen wastes of the Northwest Passage or the treacherous savannas of Zululand? Or perhaps the dangerous passes of the Hindu Kush? The surprising answer is almost undoubtedly the Victorian teaching hospital, where patients entered with a trauma and exited to a cemetery after a deadly case of “hospital gangrene.”

Victorian hospitals were described as factories of death, reeking with an unmistakable stench resembling rotting fish, cheerfully described as “hospital stink.” Infected wounds were considered normal, or even beneficial to recovery. Stories abound of surgeons operating on a continuous flow of patients, their bloody smocks worn as badges of honor and evidence of their dedication to saving lives. The eminent surgeon Sir Frederick Treves (1853-1923) recalled: “There was one sponge to a ward. With this putrid article and a basin of once clear water, all the wounds in the ward were washed twice a day. By this ritual, any chance that a patient had of recovery was eliminated.”

Fortunately, Joseph Lister was born in 1827 and chose the lowly, mechanical profession of surgery over the more prestigious practice of internal medicine. In 1851, he was appointed one of four residents in surgery at London’s University College Hospital. The head of surgery was wrongly convinced that infections came from miasma, a peculiar type of noxious air that emanated from rot and decay.

Ever skeptical, Lister scraped the rotten tissue out of gangrenous wounds and applied mercury pernitrate to the healthy tissue. Thus began Lister’s lifelong journey to investigate the cause of infection and its prevention through modern techniques. He spent the next 25 years in Scotland, becoming the Regius Professor of Surgery at the University of Glasgow. After Louis Pasteur confirmed that germs, rather than bad air, caused infections, Lister discovered that carbolic acid (a derivative of coal tar) could prevent many amputations by cleaning the skin and wounds.

He then went on the road, advocating his gospel of antisepsis, which was eagerly adopted by the scientific Germans and some Scots; the plodding and practical English surgeons took much longer. That left the isolated Americans who, like Dr. Bigelow, were too stubborn to admit the obvious.

Planck was right all along. It would take a new generation, but we are the generation that has derived the greatest benefit from the astonishing medical breakthroughs of the 20th century, which only seem to be accelerating. It is a good time to be alive.

So enjoy it!

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

How Far Will We Go In Amending American History?

A collection of items related to the dedication of the Washington Monument went to auction in May 2011.

By Jim O’Neal

Four years ago, George Clooney, Matt Damon and Bill Murray starred in a movie titled The Monuments Men, about a group of almost 400 specialists who were commissioned to try to retrieve monuments, manuscripts and artwork that had been looted in World War II.

The Germans were especially infamous for this, shipping long strings of railroad cars from all over Europe to German generals in Berlin. While they occupied Paris, they nearly stripped the city of its fabled collections of works by the world’s greatest artists. Small stashes of hidden art hoards are still being discovered today.

In the United States, another generation of anti-slavery groups is doing the exact opposite: lobbying to have statues and monuments removed, destroyed or relocated to obscure museums to gather dust out of the public eye. Civil War flags and memorabilia on display were among the first to disappear, followed by Southern generals and others associated with the war. Now, streets and schools are being renamed. Slavery has understandably been the reason for the zeal to erase the past, but it sometimes appears the effort is slowly moving up the food chain.

More prominent names like President Woodrow Wilson have been targeted, and for several years protesters have pressured Princeton University over the way it still honors Wilson, asserting he was a Virginia racist. Last year, Yale removed John C. Calhoun’s name from one of its residential colleges because he was one of the more vocal advocates of slavery, opening the path to the Civil War by supporting states’ rights to decide the slavery issue in South Carolina (which is an unquestionable fact). Dallas finally got around to removing some prominent Robert E. Lee statues, although one of the forklifts broke in the process.

Personally, I don’t object to any of this, especially if it helps to reunite America. So many different things seem to end up dividing us even further and this only weakens the United States (“United we stand, divided we fall”).

However, I hope to still be around if (when?) we erase Thomas Jefferson from the Declaration of Independence and are only left with George Washington and his extensive slavery practices (John Adams did not own slaves and Massachusetts was probably the first state to outlaw it).

It would seem relatively easy to rename Mount Vernon or even Washington, D.C., the nation’s capital. But the Washington Monument may be an engineering nightmare. The Continental Congress proposed a monument to the Father of Our Country in 1783, even before the treaty conferring American independence was received. It was to honor his role as commander-in-chief during the Revolutionary War. But when Washington became president, he canceled it since he didn’t believe public money should be used for such honors. (If only that ethos were still around.)

But the idea for a monument resurfaced on the centennial of Washington’s birth in 1832 (Washington died in 1799). A private group, the Washington National Monument Society – headed by Chief Justice John Marshall – was formed to solicit contributions. However, they were not sophisticated fundraisers, since they limited gifts to $1 per person a year. (These were obviously very different times.) The funding problem was exacerbated by the economic depression that followed the Panic of 1837, and the laying of the cornerstone was delayed until July 4, 1848. An obscure congressman by the name of Abraham Lincoln was in the cheering crowd.

Even by the start of the Civil War 13 years later, the unsightly stump was still only 170 feet high, a far cry from the 600 feet originally projected. Mark Twain joined the chorus of critics: “It has the aspect of a chimney with the top broken off … It is an eyesore to the people. It ought to be either pulled down or built up and finished.” Finally, President Ulysses S. Grant got Congress to appropriate the money; construction resumed and the monument ultimately opened in 1888. At the time, it was 555 feet tall and the tallest structure in the world … a record that was eclipsed the following year when the Eiffel Tower was completed.

For me, it’s an impressive structure, with its sleek marble silhouette. I’m an admirer of the simplicity of plain, unadorned obelisks, since there are so few of them (only two in Maryland that I’m aware of). I realize others consider it on a par with a stalk of asparagus, but I’m proud to think of George Washington every time I see it.

Even so, if someday someone thinks it should be dismantled as the last symbol of a different period, they will be disappointed when they learn of all the other cities, highways, lakes, mountains and even a state that would remain to be renamed. Perhaps we can find a better use for all of that passion, energy and commitment and start rebuilding a crumbling infrastructure so in need of repairs. One can only hope.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Here’s Why Rosenwald Belongs with Titans Like Rockefeller, Carnegie

A card with signatures and a photograph of President Calvin Coolidge, New York Governor Alfred E. Smith and Julius Rosenwald, circa 1930, went to auction in 2008.

By Jim O’Neal

One fact that is difficult to verify is the total net worth of the Rockefeller family fortune. John Davison Rockefeller Sr. (1839-1937) rose from pious beginnings to become the world’s richest man by creating America’s most powerful monopoly, Standard Oil Company. Scores of muckrakers (especially Ida Tarbell) scorned it as “The Octopus” and posters protested the company by showing it swallowing the world … whole.

He is definitely the most prominent and controversial businessman in our history, especially since the trust he created refined 90 percent of the oil produced and marketed in America. His vocal critics charged he was an unscrupulous man who colluded with railroads to fix prices, conducted illegal industrial espionage and engaged in outright bribery of political officials. It took Teddy Roosevelt and his team of stalwart trustbusters to break the trust, but even that inured to his benefit, since he held ownership shares in all the new, smaller entities that were created.

Although his business practices were as ruthless and corrupt as charged, he was a quirky, passionate, temperate man who was generous and gave enormous sums to organizations like the Rockefeller Foundation, the University of Chicago and what is now Rockefeller University. As an old man (he lived to 97), he was parodied as a harmless billionaire who delighted in giving shiny dimes to needy children.

The story grew much more complex after his only son, John D. Rockefeller Jr. (1874-1960), took over the massive estate and had five sons of his own. The last of them, David Rockefeller, died last year, and his personal estate was auctioned off this month by an East Coast firm. The net proceeds were consigned to 12 of his favorite charities, which will create another layer of veneer over the money. What we know is that 1,500 items sold for over $832 million, setting 22 records in the process.

Another son of Junior was Nelson Rockefeller (1908-1979), who was governor of New York and made unsuccessful attempts to snag the GOP presidential nomination in 1960, 1964 and 1968. After serving in other high-profile positions, he was chosen by Gerald Ford to be the 41st vice president of the United States after Richard Nixon’s resignation. Rockefeller holds the distinction of being the last VP to decline to seek re-election, when he decided not to join the 1976 Republican ticket with Ford.

Andrew Carnegie (1835-1919) was another famous philanthropist who made a fortune in steel and spent the last 18 years of his life giving $350 million to charities, foundations and universities. As he put it, “I should consider it a disgrace to die a rich man.” Both the Rockefeller and Carnegie names have been well known throughout the 20th century, primarily because of the numerous foundations and buildings that bear their names.

But let’s focus now on an equally generous man who is largely forgotten because no foundations and few buildings mention him.

Julius Rosenwald (1862-1932) made his fortune the old-fashioned way. He earned it. He started running a clothing store in Springfield, Ill., and then went to New York to learn about the garment business. When he returned to Chicago, he opened another modest clothing store, but also started shrewdly investing in a small catalog store with the undistinguished name of Sears, Roebuck & Company. When co-founder Richard Sears left the company in 1908, Rosenwald assumed a leadership role. With financial help from Henry Goldman (son of Marcus Goldman of Goldman Sachs), he expanded the company with a massive 40-acre mail-order plant on Chicago’s West Side.

Then, in an unprecedented move in 1906, Goldman underwrote an initial public offering and Sears became a public company. Rosenwald climbed from vice president to chairman and CEO, and the new Chicago plant, with a staggering 3 million square feet, became the largest building in the world. In the process, Sears became America’s largest retailer, and people all over the United States discovered how to order by mail after hours of thumbing through the sacred Sears catalog.

The demise of Sears is well known, and the company is currently being dismantled and sold off by brand. It may not be forgotten as quickly as Julius Rosenwald, who went to extremes to be modest. When he died in 1932, it is estimated that he had donated the equivalent of $2 billion in today’s dollars to a wide range of interests, including projects that funded African-American education in the South. He funded a program to construct elementary and secondary schools in any willing black community. Over a 20-year period, 5,000 schools were constructed in the South, accounting for 90 percent of all the buildings in which Mississippi’s black youngsters received an education.

Not bad for a generous man who had no need for recognition, just a desire to help needy people. Now another generation will know what he did, and how humbly and modestly he did it: insisting that his foundation close after his death and opposing the attachment of his name to so many projects.

Bravo.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Fillmore Among Presidents Who Juggled Balance Between Free and Slave States

This folk art campaign banner for Millard Fillmore’s failed 1856 bid for the presidency sold for $11,950 at a June 2013 Heritage auction.

By Jim O’Neal

On his final day in office, President James Polk wrote in his diary: “Closed my official term of President of the United States at 6am this morning.”

Later, after one last stroll through the silent White House, he penned a short addendum: “I feel exceedingly relieved that I am now free from all public cares. I am sure that I will be a happier man in my retirement than I have been for 4 years ….” He died 103 days later – the shortest retirement in presidential history – and was the first president to be survived by his mother. His wife Sarah (always clad only in black) lived 42 more lonely years.

The Washington, D.C., that greeted his successor, General Zachary Taylor (“Old Rough and Ready”), still looked “unfinished” – even after 50 years of planning and development. The Mall was merely a grassy field where cows and sheep peacefully grazed. The many plans developed in the 1840s were disparate projects. Importantly, the marshy expanse south of the White House was suspected of emitting unhealthy vapors that were especially notable in the hot summers. Cholera was the most feared disease and it was prevalent until November each year when the first frost appeared.

Naturally, the affluent left the capital for the entire summer. Since the Polks had insisted on remaining, there was a widespread belief that his death so soon after departing was directly linked to spending the presidential summers in the White House. The theory grew even stronger when Commissioner of Public Buildings Charles Douglas proposed to regrade the sloping fields into handsome terraces under the guise of “ornamental improvement.” Insiders knew the real motive was drainage and sanitation, to eliminate the foul air that hung ominously around the White House. (It’s not clear if Donald Trump’s campaign promise to “drain the swamp” was another such effort or merely a political metaphor.)

President Taylor was inaugurated with a predictable storm of jubilation, since his name was a household word. After a 40-year career in the military (1808-1848), he had the distinction of having served in four different wars: the War of 1812, the Black Hawk War (1832), the Second Seminole War (1835-1842) and the Mexican-American War (1846-1848). By 1847, Taylormania had broken out and his picture was everywhere … on ice carts, tall boards, fish stands, butcher stalls, cigar boxes and so on. After four years under the dour Polk, the public was ready to once again idolize a war hero with impeccable integrity and a promise to staff his Cabinet with the most experienced men in the country.

Alas, barely 16 months later, on July 9, 1850, President Taylor became the second president to die in office (William Henry Harrison had lasted 31 days). On July 4, after too long in the hot sun listening to ponderous orations, and too much ice water to cool off, he returned to the White House. It was there that he gorged on copious quantities of cherries slathered with cream and sugar. After dinner, he developed severe stomach cramps, and then the doctors took over and finished him off with calomel, opium, quinine and, lastly, raising blisters and drawing blood. He lingered for several days, and the official cause of death was cholera morbus, a gastrointestinal illness common in Washington, where poor sanitation made it risky to eat raw fruit and fresh dairy products in the summer.

Vice President Millard Fillmore took the oath of office and spent the rest of the summer trying to catch up. Taylor had spent little time with his VP and then the entire Cabinet submitted their resignations over the next few days, which Fillmore cheerfully accepted. He immediately appointed a new Cabinet featuring the great Daniel Webster as Secretary of State. On Sept. 9, 1850, he signed a bill admitting California as the 31st state and as “a free state.” This was the first link in a chain that became the Compromise of 1850.

The Constitutional Convention did not permit the words “slave” or “slavery” in the Constitution, since James Madison thought it was wrong to admit in the document the idea that men could be considered property. To get enough states to approve it, the Constitution also prohibited Congress from blocking the importation of slaves for 20 years (until 1808), by which time it was assumed slavery would have long been abandoned for economic reasons. However, cotton production flourished after the invention of the cotton gin, and the ban on importing slaves that President Thomas Jefferson signed into law did not take effect until Jan. 1, 1808.

This explains why controlling Congress was key to controlling slavery, so all the emphasis turned to maintaining a delicate balance whenever a new state was to be admitted … as either “free” or “slave.” Fillmore thus became the first of three presidents – followed by Franklin Pierce and James Buchanan – who worked hard to maintain harmony. However, with the election of Abraham Lincoln in 1860, it was clear what would happen … and all the Southern states started moving to the exit signs.

A true Civil War was now the only option for permanently resolving the slavery dilemma, and it came with an enormous loss of life and property, and a cultural legacy that we still struggle with today. That damned cotton gin!

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Here’s Why Washington Remains Our Greatest President

A George Washington inaugural button, perhaps the earliest artifact that refers to Washington as the “Father of His Country,” realized $225,000 at a February 2018 Heritage auction.

By Jim O’Neal

Presidential scholars typically list George Washington, Abraham Lincoln and Franklin Delano Roosevelt as our finest presidents. I tend to favor Washington, since without him we would probably have a much different country in so many ways. If there were any doubts about the feats of the “Father of Our Country,” they were certainly dispelled in 2005 when David McCullough’s 1776 hit bookstores, followed five years later by Ron Chernow’s masterful Washington: A Life, which examined the man in exquisite detail. They didn’t leave much ground uncovered, but there are still a few tidbits that haven’t become overused and remain interesting for those seeking fresh anecdotes.

For example, Washington wasn’t aware that on Nov. 30, 1782, a preliminary Treaty of Paris was signed that brought American Revolutionary hostilities to an end. The United States was prevented from dealing directly with Great Britain due to an alliance with France that stipulated we would not negotiate with Britain without them. Had he known, Washington would have been highly suspicious since King George III “will push the war as long as the nation will find men or money.” In a way, Washington would have been right since the United States had demanded full recognition as a sovereign nation, in addition to removal of all troops and fishing rights in Newfoundland. The king rejected this since he was still determined to keep the United States as a British colony, with greater autonomy. Ben Franklin naturally opposed this and countered with adding 100 percent of Canada to the United States. And so it went until May 12, 1784, when the documents bringing the Revolutionary War to an end were finally ratified and exchanged by all parties.

It was during these protracted negotiations that Washington grew concerned the army might lose its fighting edge. He kept drilling the troops while issuing a steady stream of instructions: “Nothing contributes so much to the appearance of a soldier, or so plainly indicates discipline, as an erect carriage, firm step and steady countenance.” After all these years of hardship and war, Washington was still militantly committed to ending the haughty pride of the British. To help ensure the fighting spirit of his army, he introduced a decoration designated the Badge of Military Merit on Aug. 7, 1782. He personally awarded three and then authorized his subordinate officers to issue them in cases of unusual gallantry or extraordinary fidelity and essential service. Soldiers received a purple, heart-shaped cloth to be worn over the left breast. After a long lapse, the decoration was redesigned and is now the Purple Heart medal, awarded to those wounded or killed in action. The first was awarded on Feb. 22, 1932, the 200th anniversary of Washington’s birth.

The victorious conclusion of the Revolutionary War left many questions unanswered concerning American governance, prominently the relationship between the government and the military. At the end, army officers had several legitimate grievances. Congress was in arrears with pay and had not settled officer food and clothing accounts or made any provisions for military pensions. In March 1783, an anonymous letter circulated calling on officers to take a more aggressive stance, draw up a list of demands, and even possibly defy the new government! Washington acted quickly, calling for a meeting of all officers and at the last moment delivered one of the most eloquent and important speeches of his life.

After the speech, he drew a letter from a pocket that outlined Congressional actions to be undertaken. He hesitated and then fumbled in his pockets and remarked, “Gentlemen, you will permit me to put on my spectacles, for I have not only grown gray, but almost blind, in the service of my country.” By all accounts, the officers were brought to tears, and the potentially dangerous conspiracy collapsed immediately.

He gets my vote.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Walt Disney Is Proof that Great Ideas Can Come Out of Nowhere

Walt Disney’s passport, dated Aug. 19, 1965, sold for $28,680 at an April 2007 Heritage auction.

By Jim O’Neal

I never met Walt Disney. He died four months after I joined Frito-Lay in 1966 and was among the last of a generation that smoked three packs of unfiltered cigarettes a day. He died of lung cancer.

However, like virtually everyone else in Southern California, we were familiar with the incredible theme park he built in nearby Anaheim. Disneyland opened to the public on July 17, 1955, and Frito-Lay had a direct relationship with Disneyland through a Mexican food restaurant in Frontierland called Casa de Fritos (Home of Fritos). It featured a unique product called a “Ta-cup” … basically, traditional taco ingredients served in a fried tortilla cup that helped with eating on-the-go using one hand.

I met the manager, Joe Nugent, during a meeting with Frito-Lay corporate officers from Dallas (The Flying Circus) who were in Los Angeles to review our zone’s 1967 profit plan and operating budgets. The only surprise was a mild rebuke to Nugent: “Dammit Joe, we told you last year that we don’t want to make a profit. The whole idea is to expose more people to Fritos and build the brand!” Joe just nodded and resorted to his previous tactic of lowering prices in the hope of lowering profits. Alas, it seemed that the more he dropped prices, the longer the crowds lined up to buy even more. What he experienced was the 1960s version of leveraging overhead costs, which lowered margins but increased total profitability – the Sam Walton slogan of “stack ’em high and sell ’em low” in action.
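That overhead-leverage point is easy to illustrate with a toy calculation. The numbers below are purely hypothetical (nothing from Casa de Fritos’ actual books); they simply show how a lower price can thin the margin on each Ta-cup while fixed costs, spread over a much bigger crowd, push total daily profit up.

```python
# Hypothetical illustration of leveraging fixed overhead: lower price,
# thinner unit margin, yet higher total profit once volume jumps.

FIXED_OVERHEAD = 500.00  # assumed daily fixed costs (rent, labor, etc.)

def daily_profit(price: float, unit_cost: float, units_sold: int) -> float:
    return (price - unit_cost) * units_sold - FIXED_OVERHEAD

before = daily_profit(price=0.50, unit_cost=0.30, units_sold=4_000)   # $300/day
after  = daily_profit(price=0.40, unit_cost=0.30, units_sold=10_000)  # $500/day

print(f"Higher price, smaller crowd: ${before:,.2f} per day")
print(f"Lower price, bigger crowd:   ${after:,.2f} per day")
```

In the sketch, the margin per item falls from 20 cents to 10 cents, yet total profit rises, which is exactly the effect that frustrated Joe’s attempts to lose money.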

Another surprise was the mid-1967 completion of Club 33 at Disneyland, a private club with a top-notch restaurant that could only be accessed with a special card and hidden elevator. Frito-Lay was one of 33 local companies with membership, along with Carnation, Bell Telephone and Bank of America. It was the only place in the park where alcoholic beverages were served and in 2010, I read there was a 14-year waiting list for new members. Frito-Lay is now the crown jewel in the PepsiCo empire and analysts are advocating they get out of the beverage business.

From what I’ve read, Walter Elias Disney was a lot like the two men who started the Frito and Lay companies: humble beginnings, entrepreneurial and ambitious. Disney went broke when no one would buy his first animated films, the Alice Comedies, so he moved from Kansas City to Hollywood in 1923, took out a home equity loan for $2,500 and rented the back room of a local real estate office, where he created the studio that would become the Walt Disney Company. What a success story it has become from yet another “garage” operation.

We then moved to Cupertino in the Bay Area, where the business guys seemed to talk only about technology (using names like Ampex, Atari and H-P) and whispered about the next garage operation that would be a huge success. What we didn’t realize is that we had moved into one of the greatest wealth-creation areas in the history of the world … where two members of the Homebrew Computer Club – both college dropouts – set up shop to sell their techie friends the circuit boards one of them had invented. There are several versions of why they named their company Apple, but the consensus is that Steve Jobs had been working at an apple orchard. Our old backyard is not far from the $5 billion Apple campus, but nobody tipped us off about what the future might hold.

We never heard the term “Silicon Valley,” nor about two guys in Vermont who had pooled their resources to make ice cream in an abandoned gas station after installing a 4½-gallon freezer. Ben Cohen and Jerry Greenfield were on the opposite end of the technology spectrum when they made a low-tech fortune with Ben & Jerry’s. Of course, not every inventor sticks his hand into a tin and comes up with “Chunky Monkey.” Most will fail. Even those who receive patents and set up small companies will not make these kinds of fortunes. Consider poor Eli Whitney, who invented the cotton gin in 1793 while working as a tutor at a Georgia plantation. His was probably the most influential invention of the 18th century, and it got him into the history books, but nothing in the bankbook.

But that’s not the point. The dream that it’s possible, that an idea can come out of nowhere and can – with a lot of hard work – lead to success is more alive than ever. That kid working away or thinking about dropping out of Harvard to start his own company can change the world. Still skeptical? Just ask Bill Gates.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].