John Adams saw the White House as a home for ‘honest and wise men’

A vintage creamware punch bowl, commemorating “John Adams President of the United States,” sold for $15,535 at a March 2008 Heritage auction.

By Jim O’Neal

As the states prepared for the first presidential election under the new Constitution, it was clear that George Washington was the overwhelming favorite to become the first president of the United States.

Under the rules, each elector cast two votes, and at the February 1789 Electoral College all 69 electors cast one of their votes for Washington, making him the unanimous choice of the 10 participating states. Two of the original Colonies (North Carolina and Rhode Island) had not yet ratified the Constitution, and New York, mired in an internal dispute, did not choose electors in time to participate. The electors’ second votes were spread among 11 other men, with John Adams topping the list at 34 – slightly less than half. He became the first vice president.

Four years later, there were 15 states (Vermont and Kentucky having been added) and the Electoral College had grown to 132 electors. Again, Washington was elected president unanimously, with 132 votes. Adams was also re-elected, his 77 votes besting George Clinton, Thomas Jefferson and Aaron Burr. All three runners-up would later become vice presidents, with Clinton serving under two different presidents (Jefferson and Madison). Jefferson had cleverly picked Clinton as his VP because of his age, correctly assuming Clinton would be too old to succeed him … thus ensuring that Secretary of State James Madison would be the logical choice. Clinton would, in fact, become the first VP to die in office.

John Adams

Two-time Vice President John Adams finally won the presidency on his third try after Washington decided not to seek a third term in 1796. Still, Adams barely squeaked by, defeating Jefferson 71-68. Jefferson became vice president after finishing second. It was during the Adams presidency that the federal government made its final move – south to the Potomac – after residing first in New York City and then Philadelphia.

This relocation was enabled by the 1790 Residence Act, a compromise Jefferson brokered between Alexander Hamilton and James Madison, with the proviso that the federal government assume all remaining state debts from the Revolutionary War. In addition to specifying the Potomac River area as the permanent seat of government, the act authorized the president to select the exact spot and allowed a 10-year window for completion.

Washington eagerly agreed to assume this responsibility and launched into it with zeal. He personally selected the exact spot, despite expert advice against it. He even set the stakes for the foundation himself and carefully supervised the myriad details of actual construction. When the stone walls were rising, everyone on the project assembled, laid the cornerstone and affixed an engraved plate. The plate sank into the mortar and has never been located since. An effort was made to find it on the 200th anniversary in 1992: all the old maps were pored over and the area was X-rayed … all to no avail. It remains undetected.

The project was completed on time, and with Washington in his grave for 10 months, plans were made to relocate the government from Philadelphia. The first resident, President John Adams, entered the President’s House at 1 p.m. on Nov. 1, 1800. It was the 24th year of American independence, and three weeks later he delivered his fourth State of the Union address to a joint session of Congress. It was the last annual message delivered in person for 113 years; Thomas Jefferson discontinued the practice, and it was not revived until 1913, by Woodrow Wilson. With the advent of radio, followed by television, the opportunity was simply too tempting for succeeding presidents to pass up.

John Adams was a fifth-generation American. He followed his father to Harvard and dabbled in teaching before becoming a lawyer. His best-known case was defending the British captain and eight soldiers involved in the Boston Massacre of March 5, 1770. He took no part in the Boston Tea Party, but rejoiced at it, suspecting it would inevitably lead to the convening of the First Continental Congress in Philadelphia in 1774.

He married Abigail Smith, the first woman to be both the wife of one president and the mother of another. Unlike Barbara Bush, she died seven years before John Quincy Adams became president in 1825. Both father and son served only one term. Abigail had not yet joined the president at the White House, but the morning after his arrival he sent her a letter with a benediction for their new home: “I pray heaven to bestow the best blessing on this house and on all that shall hereafter inhabit it. May none but honest and wise men ever rule under this roof.” Franklin D. Roosevelt was so taken with it that he had it carved into the State Dining Room mantel in 1945.

Amen.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Benjamin Franklin’s basement was literally filled with skeletons

A pre-1850 folk art tavern sign depicting Benjamin Franklin sold for $11,250 at a May 2014 Heritage auction.

By Jim O’Neal

The Benjamin Franklin House is a formal museum in Central London near Trafalgar Square. The square is a popular location for kooky political speeches and peaceful demonstrations; although anyone is free to speak about virtually anything, many visitors pay no rapt attention, preferring instead to feed the pigeons. I never had the temerity to practice my public speaking there, although I’m sometimes tempted (“going wobbly,” as my English friends would observe).

Known once as Charing Cross, Trafalgar Square now commemorates the British naval victory in October 1805 off the coast of Cape Trafalgar, Spain. Admiral Horatio Nelson defeated the Spanish and French fleets there, resulting in Britain gaining global sea supremacy for the next century.

The Franklin House is reputedly the only building still standing where Franklin actually lived … anywhere. He resided there for several years after accepting a diplomatic role from the Pennsylvania Assembly in pre-Revolutionary times. Derelict for most of the 20th century, the site caused a stir 20-plus years ago while it was being renovated: during the extensive excavation, a cache of several hundred human bones was unearthed.

Since anatomy was one of the few scientific pursuits Franklin did not dabble in, the general consensus is that one of his colleagues did, at a time when privately dissecting cadavers was unlawful and those who did it were very discreet. I discovered the museum while riding in a black cab on the way to the American Bar at the nearby Savoy Hotel. I may take the full tour if we ever return to London.

However, my personal favorite is likely to remain the Franklin Institute in the middle of Philadelphia. A large rotunda features the official national memorial to Franklin: a 20-foot marble statue sculpted by James Earle Fraser in 1938. It was dedicated by Vice President Nelson Aldrich Rockefeller in 1976. Fraser is well known in the worlds of sculpting, medals and coin collecting. He designed the Indian Head (Buffalo) nickel, minted from 1913-38; several key dates in high grade have sold for more than $100,000 at auction. I’ve owned several nice ones, including the popular 3-Leg variety that was minted in Denver in 1937. (Don’t bother checking your change!).

Fraser (1876-1953) grew up in the West, and his father, an engineer, was one of the men asked to help retrieve remains from Custer’s Last Stand. George Armstrong Custer needs no introduction: he and his men were wiped out by the Lakota, Cheyenne and Arapaho in 1876 – the year Fraser was born – at the Battle of the Little Bighorn in Montana. The experience helps explain Fraser’s empathy for American Indians as they were forced off their lands. His famous statue End of the Trail depicts that despair in a dramatic and memorable way; the Beach Boys used it for the cover of their 1971 album Surf’s Up.

Another historic Fraser sculpture is 1940’s Equestrian Statue of Theodore Roosevelt at the American Museum of Natural History (AMNH) in New York City. Roosevelt is on horseback with an American Indian standing on one side and an African-American man on the other. The AMNH was built using private funds, including from TR’s father, and it is an outstanding world-class facility in a terrific location across from Central Park.

However, there is a movement to have Roosevelt’s statue removed, with activists claiming it is racist and emblematic of the theft of land by Europeans. Another group has been active throwing red paint on the statue while a commission appointed by Mayor Bill de Blasio studies how to respond to the seemingly endless efforts to erase history. Apparently, the city’s Columbus Circle and its controversial namesake have dropped off the radar screen.


Despite numerous failed examples, socialism still fascinates some people

An 1872 presidential campaign banner for Horace Greeley sold for $40,000 at a December 2016 Heritage auction.

By Jim O’Neal

Many credit the famous 19th-century motto “Go West, young man” to newspaperman Horace Greeley for a line in a July 1865 editorial. However, there is still debate over whether it was first penned by Greeley or by the lesser-known John Soule in an 1851 edition of the Terre Haute (Ind.) Express. Either way, the dictum helped fuel the westward movement of Americans in their quest for Manifest Destiny (“from sea to shining sea”). Clearly, Greeley did more to popularize the concept, thanks to the great influence of his successful newspaper.

Greeley was much less successful as a politician. He was sent to Congress in 1848 in a special election to represent New York. His colleagues groused that the brief three months he spent there were devoted primarily to exposing congressional corruption in his newspaper rather than passing legislation. Unable to generate any meaningful support for re-election, he returned to his real interests: reporting the news and exposing crooked politicians.

Despite this setback to his political career, Greeley remained a powerful force in American politics throughout the Civil War period and beyond. After exposing the corruption of Grant’s first term (1869-1873), he found himself in the curious position of being the presidential candidate of both the Democratic Party (which he had opposed on every issue for many years) and the Liberal Republican Party (an offshoot that objected to the corruption).

The 1872 presidential election was especially bitter, with both sides resorting to dirty tricks and wild allegations. Grant won the Republican nomination unanimously and, as the incumbent, chose not to campaign actively. Greeley was a virtual whirlwind, traveling widely and making 20 or more speeches a day. A cynic observed that he was delivering the wrong message to the wrong audience, but fundamentally Greeley was simply a poor campaigner and Grant was still a very popular president and general.

Grant easily won re-election with 56 percent of the popular vote, and Greeley died on Nov. 29 – just 24 days after the election and before the electoral votes were cast or counted. It remains the only time a major party’s presidential nominee has died during the election process. Grant went on to snag a comfortable 286 electoral votes, while the remainder were spread among several candidates, including three cast for the deceased Greeley (which were later contested).

Thus ended the life of Horace Greeley (1811-1872), founder and editor of the New-York Tribune, arguably in the top tier of great American newspapers. Established in 1841, it was renamed the New-York Daily Tribune (1842-1866) as its daily circulation exploded to 200,000. Greeley endlessly promoted utopian reforms such as vegetarianism, agrarianism, feminism and socialism. From 1852 to 1862, the paper retained Karl Marx as its London-based European correspondent, just as he was elaborating the basic tenets of Marxism.

Great Britain had just finished its decennial census, which put the population at precisely 20,959,477. This was just 1.6 percent of the world’s population, but nowhere on the planet was there a richer or more productive people. The empire produced 50 percent of the world’s iron and coal, controlled two-thirds of its shipping and accounted for one-third of all trade. London’s banks had more money on deposit than all other financial centers … combined! Virtually all the finished cotton in the world was produced in Great Britain, on machines built in Britain by British inventors.

The famous British Empire covered 11.5 million square miles and included 25 percent of the world’s population. By any measurement, it was the richest, most innovative and most skilled nation known to man. And in London – where he was living the good life, primarily on his friend Friedrich Engels’ money – Marx was still churning out socialist propaganda. He made no attempt to explain that, for the first time in history, there was plenty of everything in most people’s lives. Victorian London was not only the largest city in the world, but the only place one could buy 500 different kinds of hammers, plus a dazzling array of nails to pound.

While Marxism morphed into Bolshevism, communism and socialism – polluting the economic systems of many hopeful utopians like Greeley – capitalism and the market-based theories of Adam Smith (“the father of modern economics”) quietly crept over America almost unnoticed. Despite the numerous failed examples of socialism in the real world, there will always be a new generation of people wanting to try it.


100 Years Before Rosa Parks, There Was Octavius Catto

Rosa Parks refused to give up her seat on a segregated bus, sparking the Montgomery, Ala., bus boycott.

By Jim O’Neal

Most Americans are familiar with Rosa Parks and recall the heroic story of a weary black woman on her way home after a hard day at work who refused to give up her seat and “move to the back of the bus” to make room for white people. The date was Dec. 1, 1955, and the city was Montgomery, Ala.

She was arrested and fined $10, sparking the ensuing Montgomery bus boycott that lasted 381 days; she was ultimately vindicated by the U.S. Supreme Court, which ruled the segregation law unconstitutional. After her death, she became the first African-American woman to have her likeness depicted in National Statuary Hall in the U.S. Capitol.

Parks (1913-2005) earned her way into the pantheon of civil rights leaders, but few remember a remarkable man who preceded her by a century when streetcars were pulled by horses.

Catto

His name was Octavius Valentine Catto (1839-1871), and history was slow to recognize his astonishing accomplishments. Even the epitaph on his tombstone shouts in bold letters: “THE FORGOTTEN HERO.” One episode in his far-too-short but inspiring life is eerily similar to the events in Montgomery – only dramatically more so. Catto was a fierce enemy of the entire Philadelphia trolley car system, which banned black passengers. On May 18, 1865, The New York Times ran a story about an incident involving Catto that had occurred the previous afternoon in Philadelphia, “The City of Brotherly Love” (at least for some).

To paraphrase the story: a colored man (Catto) refused all attempts to get him to leave a strictly segregated trolley car. Frustrated, and fearing a fine if he physically ejected the passenger, the conductor cleverly shunted the car onto a side track, detached the horses and left the defiant rider in the now-empty stationary car. Apparently, the stubborn man was still on board the next morning, having spent the night. The incident caused a neighborhood sensation that led even more people to challenge the rules.

The following year, there was an important meeting of the Pennsylvania Equal Rights League to protest the forcible ejection of several black women from Philadelphia streetcars. The intrepid Catto presented a number of resolutions that condemned the inequities of segregation and a heavily biased judicial system, invoking the principles of freedom and civil liberty. He also boldly solicited support from fellow citizens in his quest for fairness and justice.

He got specific help from Pennsylvania Congressman Thaddeus Stevens, a leader of the “Radical Republicans,” who had a fiery passion for desegregation and the abolition of slavery and who criticized President Lincoln for not acting more forcefully. Stevens is a major character in Steven Spielberg’s 2012 film Lincoln, with Tommy Lee Jones earning an Oscar nomination for his portrayal. On Feb. 3, 1870, the 15th Amendment to the Constitution guaranteed suffrage to black men (women of all colors would have to wait another 50 years, until 1920, to gain the right to vote in all states). It would also lead to Catto’s death. On Election Day, Oct. 10, 1871, Catto was out encouraging black men to vote for Republicans when he was fatally shot by white Democrats intent on suppressing the black vote.

Blacks continued to vote heavily for Republicans until the early 20th century and were not even allowed to attend Democratic conventions until 1924, primarily because the Southern states had white governors who mostly discouraged equal rights and supported the Jim Crow laws that oppressed blacks. As comedian Dick Gregory (1932-2017) famously joked, when he was told at a white lunch counter, “We don’t serve colored people here,” he replied, “That’s all right. I don’t eat colored people … just bring me a whole fried chicken!”

Octavius Catto, who broke segregation on trolley cars and was an all-star second baseman long before Jackie Robinson, would have to wait until the 20th century for the recognition he deserved. I suspect he would be surprised that we are still struggling to “start a national conversation” about race when that is what he sacrificed his life for.


Here’s Why We Owe a Lot to Second President John Adams

An 1805 oil-on-canvas portrait of John Adams attributed to William Dunlap sold for $35,000 at a May 2017 Heritage auction.

By Jim O’Neal

John Adams had the misfortune of being squeezed into the presidency of the United States (for a single term) between George Washington and Thomas Jefferson, two of the most famous presidents of all time. As a result, Adams (1735-1826) was often overlooked as one of America’s greatest statesmen and perhaps the most learned and penetrating thinker of his time. The importance of his role in the founding of America was noted by Richard Stockton, a delegate to the Continental Congress: “The man to whom the country is most indebted for the great measure of independence. … I call him the Atlas of American Independence.”

On the way to that independence, his participation started as early as 1761, when he assisted James Otis in defending Boston merchants against Britain’s writs of assistance. When the American Revolution ended, Adams played a key role in the peace treaty that formally ended the war in 1783. Between those two bookends, he wrote many of the era’s most significant essays and treatises, led the radical movement in Boston, and articulated the principles of independence at the Continental Congress.

When the infamous Stamp Act came in 1765, he attacked it with a vengeance and wrote A Dissertation on the Canon and Feudal Law, asserting that the act deprived the colonists of two basic rights: taxation by consent and trial by a jury of their peers – both guaranteed to all Englishmen by the Magna Carta. Within a brief 10 years, he was acknowledged as one of America’s best constitutional scholars. When Parliament passed the Coercive Acts in 1774, Adams drafted the principal clause of the Declaration of Rights and Grievances; no man worked harder in the movement for independence and the effort to constitutionalize the powers of self-government.

After the Battles of Lexington and Concord, Adams argued for the colonies to declare independence, and in 1776 Congress passed a resolution recommending that the colonies draft new constitutions and form new governments. Adams wrote a blueprint, Thoughts on Government, that four states used to shape new constitutions. In the summer of 1776, as Congress weighed formal independence, John Adams made a four-hour speech that persuaded the assembly to vote in favor. Thomas Jefferson later recalled that “it moved us from our seats … He was our colossus on the floor.”

Three years later, Adams drafted the Massachusetts Constitution, which was copied by other states and guided the framers of the Federal Constitution of 1787.

He faithfully served two full terms as vice president under George Washington at a time when the office had only two primary duties: presiding over the Senate (and breaking any tie votes) and counting the ballots in presidential elections. Many routinely considered the office part of Congress rather than the executive branch. He served one term as president and then lost the 1800 election to his vice president, Thomas Jefferson, as the party system (and Alexander Hamilton) conspired against his re-election. Bitter and disgruntled, he left Washington, D.C., before Jefferson was inaugurated and returned to his home in Massachusetts. His wife Abigail had departed earlier; their son Charles had died in November from the effects of chronic alcoholism.

Their eldest son, John Quincy Adams, served as the sixth president (also for a single term) after a contentious election, and both men gradually sank into relative obscurity. That changed dramatically in 2001, when historian David McCullough published a wonderful biography that reintroduced John and Abigail Adams to a generation that only vaguely knew he had died on the same day as Thomas Jefferson – July 4, 1826, the 50th anniversary of the signing of the Declaration of Independence. In typical McCullough fashion, it was a bestseller, and it led to an epic TV miniseries that snagged four Golden Globes and a record 13 Emmys in 2008.

Television at its very best!


McKinley Skillfully Assumed More Presidential Power

This William McKinley political poster, dated 1900, sold for $6,875 at a May 2015 Heritage auction.

By Jim O’Neal

William McKinley was 54 years old at the time of his first inauguration in 1897. The Republicans had selected him as their nominee on the first ballot at the St. Louis convention on June 16, 1896. He had spent several years as an effective congressional representative and, more recently, as the 39th governor of Ohio. Importantly, he had the backing of a shrewd manager, Mark Hanna, and the promise of what turned out to be the largest campaign fund in history – $3.5 million – raised largely from business interests alarmed by the opposition’s portrayal of the campaign as a crusade of the working man against the rich, who had impoverished the poor by limiting the money supply.

In the 1896 election, he defeated a remarkable 36-year-old orator, William Jennings Bryan, perhaps the most talented public speaker who ever ran for any office. McKinley wisely decided he could not compete against Bryan in a national campaign filled with political speeches. He adopted a novel “front porch” campaign that resulted in trainloads of voters arriving at his home in Canton, Ohio.

Bryan would lose again to McKinley in 1900, sit out the 1904 race won by Teddy Roosevelt, and then lose a third time in 1908 to William Howard Taft. The three-time Democratic nominee did serve two years as secretary of state under Woodrow Wilson (1913-15), and died five days after the end of the Scopes Monkey Trial in 1925.

William and Ida McKinley followed Grover and Frances Cleveland into the White House after Cleveland’s non-consecutive terms as the 22nd and 24th president. Cleveland’s second term began with a disaster – the Panic of 1893 – when stock prices declined, 500 banks closed, 15,000 businesses failed and unemployment skyrocketed. This significant depression lasted all four years of his term in office and Cleveland, a Democrat, got most of the blame.

His excuse was the 1890 Sherman Silver Purchase Act, which required the Treasury to buy vast quantities of silver using notes backed by silver or gold. Enormous over-production by Western silver mines drained the government’s gold reserves, forcing the Treasury to borrow $65 million in gold from J.P. Morgan and the Rothschild family in England. Since Cleveland had been unable to turn the economy around, the depression virtually ruined the Democratic Party and cemented an era of Republican domination that lasted until 1933, interrupted only when squabbling between Roosevelt and Taft split the 1912 vote three ways and handed Woodrow Wilson the presidency.

It’s common knowledge that McKinley was assassinated in 1901 after winning re-election in 1900, but little attention is paid to his earlier time in office, beginning in 1897. The year 1898 got off to a wobbly start when his mother died, leading to a full 30 days of mourning that canceled an important diplomatic New Year’s celebration. Tensions between the United States and Spain over Cuba had electrified the diplomatic community, and it was hoped that a White House reception would provide a convenient venue to discuss strategic options.

Spain had mistreated Cuba ever since Columbus discovered the island in 1492, and in 1895 it suspended the constitutional rights of the Cuban people following numerous internal revolts. Once again the countryside raged with bloody guerrilla warfare as 200,000 Spanish troops busied themselves suppressing the insurgents and cruelly governing the peasant population. American newspapers horrified the public with details that offended its sense of justice and prompted calls for U.S. intervention. Talk of war with Spain was in the air again.

On Feb. 9, two days before a reception to honor the U.S. Army and Navy, the New York Journal published a front-page article revealing a private letter in which a Spanish diplomat denounced McKinley as a weakling, “a mere bidder for the admiration of the crowd.” The same day, the Spanish minister in Washington retrieved his passport from the State Department and boarded a train to Canada.

A rapid series of events led to war with Spain, including Congress placing $50 million at the disposal of the president for the defense of the country, with no conditions attached. McKinley was wary of war after his experience in the Civil War, but he carefully discussed the issue with his Cabinet and key senators to ensure concurrence. This was the first significant step toward war and, ultimately, toward the transformation of presidential power. On April 25, Congress formally declared war on Spain, and the first landing of forces took place on June 6, when 100 Marines went ashore at Guantanamo Bay.

McKinley’s skillful assumption of authority during the Spanish-American War subtly changed the presidency, as Professor Woodrow Wilson of Princeton University wrote: “The president of the United States is now … at the front of affairs as no president, except Lincoln, has been since the first quarter of the nineteenth century.” Those who followed McKinley into the White House would develop and expand these new powers of the presidency … starting with his vice president and successor, Theodore Roosevelt, who had eagerly joined the war with Spain and led his “Rough Riders” at San Juan Hill.

We see their fingerprints throughout the 20th century and even today, as the concept of a formal declaration of war has become murky. Urgency has gradually eroded the war power enumerated to Congress, and there is almost always “no time to wait for an impotent Congress to resolve its partisan differences.”

The Founding Fathers would be surprised at how far the pendulum has swung.


Tremendous Challenges Awaited the Plainspoken Truman

Fewer than 10 examples of this Harry Truman “60 Million People Working” political pin are known to exist. This pin sold for $19,717 at an August 2008 Heritage auction.

By Jim O’Neal

When Franklin Roosevelt died on April 12, 1945, Harry Truman became the seventh vice president to move into the Oval Office after the death of a president. Truman had been born during the White House years of Chester Arthur, who had followed James Garfield after his assassination (1881). And in Truman’s lifetime, Teddy Roosevelt and Calvin Coolidge had ascended to the presidency after the deaths of William McKinley (1901) and Warren Harding (1923). However, none of these men had been faced with the challenges awaiting the plainspoken Truman.

FDR had been a towering figure for 12 years, first leading the country out of the Great Depression and then deftly steering the United States through World War II after being elected a record four times. Unfortunately, Truman had not been involved in many important decisions and was totally unaware of key strategic secrets (e.g., the development of the atomic bomb), or even of side agreements made with others, notably Winston Churchill. He was not prepared to be president.

Even comparisons with the presidents who preceded FDR accentuated the gap in Truman’s experience: Woodrow Wilson had been a brilliant academic, Herbert Hoover a world-famous engineer. Enormously important decisions lay ahead that would shape the world for the next half century, and Truman himself had sincere doubts about his ability to follow FDR, even as the president’s health rapidly failed.

The significance of these decisions has gradually faded, but for Truman they arrived in rapid order: April 12, FDR’s death; April 28, Benito Mussolini killed by Italian partisans; April 30, Adolf Hitler’s suicide; and in the first week of May, the surrender of German forces. The news from the Pacific was equally dramatic, as troop landings on the critical island of Okinawa had apparently gone unopposed by the Japanese. It was clearly the apex of optimism about the prospects for an unconditional surrender by Japan and the welcome return of world peace.

In fact, it was a miracle that turned out to be a mirage.

After victory in Europe (V-E Day), Truman faced an immediate challenge regarding the 3 million troops there. FDR and Churchill had not trusted Joseph Stalin and were wary of what the Russians would do once American troops began withdrawing. Churchill proved to be right about Soviet motives: Moscow secretly intended to occupy the whole of Eastern Europe permanently and to expand into adjacent territories at will.

Then the U.S. government issued a report stating that the domestic economy could make a smooth transition to pre-war normalcy once the voracious demands of the military war machine abated. Naturally, the war-weary public strongly supported “bringing the boys home,” but Truman knew that Japan would have to be forced to quit before any shift in troops or production could start.

There was also a complex scheme under way to redeploy the troops from Europe to the Pacific if the Japanese decided to fight on to defend their sacred homeland. It was a task that George Marshall would call “the greatest administrative and logistical problem in the history of the world.”

Truman pondered in a diary entry: "I have to decide the Japanese strategy – shall we invade Japan proper or shall we bomb and blockade? That is my hardest decision to date." (No mention was made of "the other option.")

The battle for Okinawa answered the question. Hundreds of Japanese suicide planes had a devastating effect, even after 10 days of heavy sea and air bombardment of the island: 30 U.S. ships sunk, 300 more damaged, 12,000 Americans killed and 36,000 wounded. It was now obvious that Japan would defend every single island, regardless of its losses. Surrender would not occur, and America's losses would be extreme.

So President Truman made a historic decision that is still being debated today: Drop the atomic bomb on Japan and assume that the effect would be so dramatic that the Japanese would immediately surrender. On Aug. 6, 1945, “Little Boy” was dropped on Hiroshima with devastating effects. Surprisingly, the Japanese maintained their silence, perhaps not even considering that there could be a second bomb. That second bomb – a plutonium variety nicknamed “Fat Man” – was then dropped two days ahead of schedule on Aug. 9 on the seaport city of Nagasaki.

No meeting had been held and there was no second order given (other than by Enola Gay pilot Paul Tibbets). The directive that had ordered the first bomb simply said in paragraph two that "additional bombs will be delivered AS MADE READY." However, two proved to be all that was needed. Imperial Japan surrendered on Aug. 15, thus ending one of history's greatest wars.

Intelligent Collector blogger JIM O'NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Here’s Why Scientists Like Joseph Lister Have Made Life Better for All of Us

A March 25, 1901, letter signed by Joseph Lister went to auction in October 2014.

By Jim O’Neal

In the 1880s, American physicist Albert Michelson embarked on a series of experiments that undermined a long-held belief in a luminiferous ether, a medium thought to permeate the universe and affect the speed of light ever so slightly. Embraced by Isaac Newton (and almost venerated by those who followed), the ether theory was considered an absolute certainty of 19th-century physics, explaining how light traveled across the universe.

However, Michelson’s experiments (partially funded by Alexander Graham Bell) proved the exact opposite of the theory. In the words of author William Cropper, “It was probably the most famous negative result in the history of physics.” The fact was that the speed of light was the same in all directions and in every season, overturning an assumption that had stood unquestioned for 200 years. But not everyone agreed, and the resistance lasted a long time.

Max Planck (1858-1947) later explained this resistance to accepting new facts in a rather novel way: “A scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die and a new generation grows up that is familiar with it.”

Even if true, that makes it no easier to accept that the United States was the only nation “that remained unconvinced of the merits of Joseph Lister’s methods of modern antiseptic medicine.” In fact, Henry Jacob Bigelow (1818-1890), the esteemed Harvard professor of surgery and a fellow of the Academy of Arts and Sciences, derided antisepsis as “medical hocus-pocus.” This is even more remarkable when one considers that he was the leading surgeon in New England and that his contributions to orthopedic and urologic surgery are legendary.

But this short story begins with a sleight of hand by asking: In the 19th century, what do you think was the most dangerous place in the vast territories of the British Empire? The frozen wastes of the Northwest Passage? The treacherous savannas of Zululand? Or perhaps the dangerous passes of the Hindu Kush? The surprising answer is almost undoubtedly the Victorian teaching hospital, where patients entered with an injury and exited to a cemetery after a deadly case of “hospital gangrene.”

Victorian hospitals were described as factories of death, reeking with an unmistakable stench resembling rotting fish, cheerfully described as “hospital stink.” Infectious wounds were considered normal or beneficial to recovery. Stories abound of surgeons operating on a continuous flow of patients, and bloody smocks were badges of honor or evidence of their dedication to saving lives. The eminent surgeon Sir Frederick Treves (1853-1923) recalled, “There was one sponge to a ward. With this putrid article and a basin of once clear water, all the wounds in the ward were washed twice a day. By this ritual, any chance that a patient had of recovery was eliminated.”

Fortunately, Joseph Lister was born in 1827 and chose the lowly, mechanical profession of surgery over the more prestigious practice of internal medicine. In 1851, he was appointed one of four residents in surgery at London’s University College Hospital. The head of surgery was wrongly convinced that infections came from miasma, a peculiar type of noxious air that emanated from rot and decay.

Ever skeptical, Lister scoured rotten tissue out of gangrenous wounds and applied mercury pernitrate to the healthy tissue. Thus began his lifelong journey to investigate the cause of infection and its prevention through modern techniques. He spent the next 25 years in Scotland, becoming the Regius Professor of Surgery at the University of Glasgow. After Louis Pasteur confirmed that germs, rather than bad air, caused infections, Lister discovered that carbolic acid (a derivative of coal tar) could prevent many amputations by cleansing the skin and wounds.

He then went on the road advocating his gospel of antisepsis, which was eagerly adopted by the scientific Germans and some Scots; England’s plodding, practical surgeons took much longer to come around. That left the isolated Americans who, like Dr. Bigelow, were too stubborn and unwilling to admit the obvious.

Planck was right all along. It would take a new generation, but we are the generation that has derived the greatest benefits from the astonishing advances in 20th century medical breakthroughs, which only seem to be accelerating. It is a good time to be alive.

So enjoy it!


How Far Will We Go In Amending American History?

A collection of items related to the dedication of the Washington Monument went to auction in May 2011.

By Jim O’Neal

Four years ago, George Clooney, Matt Damon and Bill Murray starred in a movie titled The Monuments Men, about a group of almost 400 specialists commissioned to try to retrieve monuments, manuscripts and artwork that had been looted in World War II.

The Germans were especially infamous for this, shipping long strings of railroad cars full of loot from all over Europe to German generals in Berlin. During their occupation of Paris, they nearly stripped the city of its fabled collections of art by the world’s greatest artists. Small hoards of hidden art are still being discovered today.

In the United States, a new generation of anti-slavery groups is doing the exact opposite: lobbying to have statues and monuments removed, destroyed or relocated to obscure museums to gather dust out of the public eye. Civil War flags and memorabilia on display were among the first to disappear, followed by Southern generals and others associated with the war. Now, streets and schools are being renamed. Slavery has understandably been the reason for the zeal to erase the past, but at times the effort appears to be slowly moving up the food chain.

More prominent names like President Woodrow Wilson have been targeted, and for several years Princeton University has faced protests over the way it still honors Wilson, whom protesters assert was a Virginia racist. Last year, Yale removed John C. Calhoun’s name from one of its residential colleges because he was one of the most vocal advocates of slavery, opening the path to the Civil War by supporting South Carolina’s right to decide the slavery issue for itself (which is an unquestionable fact). Dallas finally got around to removing some prominent Robert E. Lee statues, although one of the forklifts broke in the process.

Personally, I don’t object to any of this, especially if it helps to reunite America. So many different things seem to end up dividing us even further and this only weakens the United States (“United we stand, divided we fall”).

However, I hope to still be around if (when?) we erase Thomas Jefferson from the Declaration of Independence and are only left with George Washington and his extensive slavery practices (John Adams did not own slaves and Massachusetts was probably the first state to outlaw it).

It would seem relatively easy to rename Mount Vernon, or even Washington, D.C., the nation’s capital. But the Washington Monument may be an engineering nightmare. The Continental Congress proposed a monument to the Father of Our Country in 1783, even before the treaty conferring American independence had been received, to honor his role as commander-in-chief during the Revolutionary War. But when Washington became president, he canceled it, since he didn’t believe public money should be used for such honors. (If only that ethos were still around.)

But the idea for a monument resurfaced on the centennial of Washington’s birth in 1832 (Washington had died in 1799). A private group, the Washington National Monument Society – headed by Chief Justice John Marshall – was formed to solicit contributions. However, its members were not sophisticated fundraisers, since they limited gifts to $1 per person per year. (These were obviously very different times.) The restriction was exacerbated by the economic depression that gripped the country after the Panic of 1837, and the laying of the cornerstone was delayed until July 4, 1848. An obscure congressman by the name of Abraham Lincoln was in the cheering crowd.

Even by the start of the Civil War 13 years later, the unsightly stump was still only 170 feet high, a far cry from the 600 feet originally projected. Mark Twain joined the chorus of critics: “It has the aspect of a chimney with the top broken off … It is an eyesore to the people. It ought to be either pulled down or built up and finished.” Finally, President Ulysses S. Grant got Congress to appropriate the money; work resumed and the monument ultimately opened in 1888. At the time, it stood 555 feet tall, the tallest structure in the world … a record eclipsed the following year when the Eiffel Tower was completed.

For me, it’s an impressive structure, with its sleek marble silhouette. I’m an admirer of the simplicity of plain, unadorned obelisks, since there are so few of them (only two in Maryland that I’m aware of). I realize others consider it on a par with a stalk of asparagus, but I’m proud to think of George Washington every time I see it.

Even so, if someday someone thinks it should be dismantled as the last symbol of a different period, they will be disappointed when they learn of all the other cities, highways, lakes, mountains and even a state that remain to go. Perhaps we can find a better use for all of that passion, energy and commitment and start rebuilding a crumbling infrastructure so in need of repairs. One can only hope.


Fillmore Among Presidents Who Juggled Balance Between Free and Slave States

This folk art campaign banner for Millard Fillmore’s failed 1856 bid for the presidency sold for $11,950 at a June 2013 Heritage auction.

By Jim O’Neal

On his final day in office, President James Polk wrote in his diary: “Closed my official term of President of the United States at 6am this morning.”

Later, after one last stroll through the silent White House, he penned a short addendum: “I feel exceedingly relieved that I am now free from all public cares. I am sure that I will be a happier man in my retirement than I have been for 4 years ….” He died 103 days later, the shortest retirement in presidential history and the first president survived by his mother. His wife Sarah (always clad only in black) lived for 42 more lonely years.

Fillmore

The Washington, D.C., that greeted his successor, General Zachary Taylor (“Old Rough and Ready”), still looked “unfinished” – even after 50 years of planning and development. The Mall was merely a grassy field where cows and sheep peacefully grazed, and the many plans developed in the 1840s remained disparate projects. Importantly, the marshy expanse south of the White House was suspected of emitting unhealthy vapors that were especially notable in the hot summers. Cholera was the most feared disease, and it was prevalent each year until November, when the first frost appeared.

Taylor

Naturally, the affluent left the capital for the entire summer. Since the Polks had insisted on remaining, there was a widespread belief that his death so soon after departing was directly linked to spending the presidential summers in the White House. The theory grew even stronger when Commissioner of Public Buildings Charles Douglas proposed to regrade the sloping fields into handsome terraces under the guise of “ornamental improvement.” Insiders knew the real motive was drainage and sanitation, to eliminate the foul air that hung ominously around the White House. (It’s not clear whether Donald Trump’s campaign promise to “drain the swamp” was another such effort or merely a political metaphor.)

President Taylor was inaugurated amid a predictable storm of jubilation, since his name was a household word. Over a 40-year military career (1808-1848), he had the distinction of serving in four different wars: the War of 1812, the Black Hawk War (1832), the Second Seminole War (1835-1842) and the Mexican-American War (1846-1848). By 1847, Taylormania had broken out and his picture was everywhere … on ice carts, tall boards, fish stands, butcher stalls, cigar boxes and so on. After four years under the dour Polk, the public was ready to once again idolize a war hero with impeccable integrity and a promise to staff his Cabinet with the most experienced men in the country.

Alas, a short two years later, on July 9, 1850, President Taylor became the second president to die in office (William Henry Harrison had lasted 31 days). On July 4, after too long in the hot sun listening to ponderous orations and too much ice water to cool off, he returned to the White House, where he gorged on copious quantities of cherries slathered with cream and sugar. After dinner, he developed severe stomach cramps; then the doctors took over and finished him off with calomel, opium, quinine and, lastly, the raising of blisters and the drawing of blood. He lingered for several days, and the official cause of death was cholera morbus, a gastrointestinal illness common in Washington, where poor sanitation made it risky to eat raw fruit and fresh dairy products in the summer.

Vice President Millard Fillmore took the oath of office and spent the rest of the summer trying to catch up. Taylor had spent little time with his VP and then the entire Cabinet submitted their resignations over the next few days, which Fillmore cheerfully accepted. He immediately appointed a new Cabinet featuring the great Daniel Webster as Secretary of State. On Sept. 9, 1850, he signed a bill admitting California as the 31st state and as “a free state.” This was the first link in a chain that became the Compromise of 1850.

The Constitutional Convention did not permit the words “slave” or “slavery,” since James Madison thought it was wrong to admit in the Constitution the idea that men could be considered property. To get enough states to approve the document, it also prohibited Congress from passing any laws blocking the slave trade for 20 years (until 1808), by which time it was assumed slavery would have long been abandoned for economic reasons. However, cotton production flourished after the invention of the cotton gin, and on Jan. 1, 1808, a law signed by President Thomas Jefferson banning the importation of slaves took effect.

This explains why controlling Congress was key to controlling slavery, so all the emphasis turned to maintaining a delicate balance whenever a new state was to be admitted … as either “free” or “slave.” Fillmore thus became the first of three presidents – including Franklin Pierce and James Buchanan – who worked hard to maintain harmony. However, with the election of Abraham Lincoln in 1860, it was clear what would happen … and all the Southern states started moving to the exit signs.

A true Civil War was now the only option for permanently resolving the slavery dilemma, and it came with an enormous loss of life, property and a culture we still struggle with today. That damned cotton gin!
