Betty Ford set a standard that all who follow should study

A portrait of Betty Ford by Lawrence Williams went to auction in 2007.

By Jim O’Neal

Every presidential trivia fan knows that Eleanor Roosevelt’s birth name was Eleanor Roosevelt. She had married her father’s fifth cousin, Franklin. Although the couple had six children, Eleanor said she disliked intimacy with him and wrote she was ill-equipped to be a mother since she didn’t understand or even like small children.

They somehow managed to stay married for 40 years, until FDR died in 1945. Franklin did enjoy intimate relations, especially with Lucy Mercer, Eleanor’s social secretary. He wanted a divorce, but his mother (who controlled the family money) would not allow it, even after a trove of love letters between Franklin and Lucy exposed their illicit relationship.

Eleanor skillfully leveraged her position as First Lady; many consider her the first modern First Lady because she personally championed so many women’s rights issues. She had an active public life and a serious relationship with reporter Lorena Hickok. Eleanor became well known during her long occupancy of the White House and was highly respected all over the world.

That was not true (initially) of Betty Ford, who became First Lady when Jerry Ford assumed the presidency after Richard Nixon resigned in 1974. Born Betty Bloomer, she had divorced William Warren after a failed five-year marriage; he was an alcoholic she nursed through the final two years of their marriage.

She was a dancer before she married the future president, who had been born Leslie Lynch King Jr. in 1913 (he legally took the name Gerald Ford in 1935). As a member of the renowned Martha Graham dance troupe, Ford performed at Carnegie Hall, and she later earned the prestigious Presidential Medal of Freedom, presented in 1991 by the late President George H.W. Bush.

Betty Ford (1918-2011) had been impressed by Eleanor Roosevelt since childhood. “She eventually became my role model because I admired her so. I loved her independence … a woman finally speaking out for herself rather than saying what would be politically helpful to her husband. That seemed healthy to me.” Others were quick to note the similarities between the two women. Major publications compared the willingness of both to offer bold, personal opinions on highly controversial issues. I would argue that Betty Ford set a higher standard for candor than any of her predecessors.

One small example was her very first press conference, held in the State Dining Room. Ford seemed to have no reservations about repeating her strong positions as a supporter of the Equal Rights Amendment, along with her pro-choice stance on abortion. She admitted she had consulted a psychiatrist, had been divorced, and used tranquilizers for physical pain. Any single one of these uttered today would instantly be “Breaking News” on the cable news channels so starved for fresh material (or innuendo).

Initially, Ford didn’t consider First Lady a “meaningful position,” but rather than letting the role define her, she decided to change it. “I wanted to be a good First Lady … but didn’t feel compelled to emulate my predecessors.” She simply decided to be Betty Bloomer Ford … “and [I] might as well have a good time doing it.” She succeeded on both counts, and the results were more than just surprising.

She talked about “demanding privilege” and “a great opportunity,” but also about the “salvation” of a genuine career of her own … on a national level she’d never experienced before. The role helped reshape her into a likable leader with broad respect.

Her creative imagination rivaled Jackie’s. “This house has been a grave,” she said. “I want it to sing!” More women were seated at the president’s table, especially second-tier political women who needed a little boost. And they were round tables, which denoted equality. This was the instinct of a free, bohemian spirit, but not by contrivance. She had been a single woman who studied modern dance and introduced it to the ghettos of Grand Rapids, Mich. She spoke deliberately and was unafraid of listening to differing viewpoints.

There were occasional curious remarks about her drug and alcohol use, but they were easily rationalized by her well-known physical pain from severe arthritis and a pinched nerve, courtesy of her dancing. Not even nosy reporters questioned or sought to investigate the extent of her medications. It wasn’t until after the Fords left the White House that the drinking prompted a family intervention.

In true Betty Ford fashion, after the denial, anger and resentment subsided, a positive outcome resulted. The Betty Ford Center was founded in Rancho Mirage, Calif. The center, known as Camp Betty, has helped celebrities and others overcome substance-abuse issues. It offers treatment without shame and, although not a cure or panacea, gives people control over their lives. Today’s response to the opioid crisis draws on some of the experience gained at Camp Betty.

However, her most lasting and important contribution concerns breast cancer. During the mid-1970s, television didn’t even allow the word “breast” until a determined Betty Ford decided to go very public with her condition. She had accompanied a friend who was having an annual checkup and the doctor suggested she do the same. After several more doctors got involved, a biopsy confirmed she had breast cancer. The White House press office squabbled over releasing information about her condition, but Betty spotted another opportunity.

By the time she was back in the White House two weeks later, women across America were having breast examinations and mammograms. The ensuing media coverage of her honest revelations was credited with saving the lives of thousands of women who had discovered breast tumors. The East Wing was flooded with 60,000 cards, letters and telegrams, 10 percent from women who had mastectomies. The First Lady told the American Cancer Society, “I just cannot stress enough how necessary it is for women to take an active interest in their own health and body … too many women are so afraid … they endanger their lives.”

Ford was a modern-day Abigail Adams, but Ford used a megaphone rather than letters, and in a practical way. Bravo to an under-appreciated First Lady who set a standard that all who follow should study.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Real Populist legacy lives on in a thousand different ways

The People’s Party, also known as the Populist Party, met in Omaha, Neb., in 1892 and nominated James B. Weaver as its presidential candidate. A convention ticket went to auction in March 2018.

By Jim O’Neal

After the Civil War, thousands of farmers found themselves mired in a European-style serfdom. By 1883, they were trapped by the monopolistic pricing of both merchants and the railroads, which consumed virtually all their profits. To make their dilemma even worse, the federal government had returned to the gold standard after the war ended, and the demands of Wall Street drained money from rural banks to the point that entire regions were essentially broke.

Poor farmers were in the classic squeeze: the harder they worked and the more they produced, the less they had. An early attempt to break the conundrum was to band together in what was called the Farmers’ Alliance, which began in Lampasas, Texas, in 1877. The Alliance quickly spread to Kansas, but within six years it had failed; market forces were simply too strong for such an amateurish organization, born out of desperation and lacking any real leverage or political power.

Voila! Enter the first in a long string of populists, a 36-year-old former tenant farmer from Mississippi: S.O. Daws. It was Daws’ goal to convert the Alliance into an overtly political organization, with its own Populist platform, formal candidates and party structure. His real genius, however, lay in dazzling oratorical skill and a grasp of political tactics. Daws persuaded the Alliance to appoint him “Traveling Lecturer,” and he quickly started spreading the word and convincing his fellow farmers of what to do.

One of his converts was a 34-year-old Tennessean named William Lamb, an undereducated (25 days of formal schooling) rail-splitter and farmer with an almost unsurpassed talent for organization. Together, Daws and Lamb provided the spark the Alliance had been missing. They used the sweeping executive power the farmers granted them and soon had hundreds of thousands of new recruits in the organization. All told, they enlisted over 2 million people in 43 states. Populist historian Lawrence Goodwyn characterized it as the most massive organizing drive of any citizen institution in 19th-century America.

But the Populists were never really about their leaders. They were about an idea, or actually many ideas … anything that might allow common men to make a living off the land while maintaining their human dignity. They were generally derided as nativist hicks, a reputation earned mostly by the movement’s later incarnations. At the beginning, when they were at their best, they were staunchly anti-racist and injected a firestorm of ideas into a political system that tended to be moribund. Populist programs included a graduated income tax, the eight-hour workday, direct election of senators, citizen referendums, the secret ballot and, above all, regulation of agricultural markets to ensure farmers a decent return for their labor.

Time and fate worked against the Populists as America became increasingly industrialized and urbanized. After running James B. Weaver of Iowa for president in 1892, the Populist Party (also known as the People’s Party or simply the Populists) folded itself into William Jennings Bryan’s silver wing of the Democratic Party. Naturally, many of its ideas were also subsumed into other progressive political movements. One example is Sam Ealy Johnson (LBJ’s father), who served in the Texas House of Representatives between 1905 and 1924 and who said: “The job of government is to help people who are caught in the tentacles of circumstances.” Clearly a Populist inspiration.

Obviously, FDR incorporated many of the same concepts into the New Deal programs that helped during the Great Depression. The real Populist legacy lives on in a thousand other ways today. These were people capable of standing in the hot sun for hours, listening to speeches about obscure and esoteric subjects, while working their way to a better life for all. This is how a democratic culture is created, and we need to ensure it doesn’t get diluted by ruinous socialist beliefs that have failed every time well-intentioned people go too far.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

President Lincoln can teach us a little about working together

A scarce copy of The Photographs of Abraham Lincoln featuring nearly 150 images relating to the president sold for $5,250 at a June 2015 Heritage auction.

By Jim O’Neal

No matter where you stand on the current controversy over the Supreme Court nomination process, it’s almost a certainty that irrespective of the outcome, you haven’t changed your opinion (much). There are just too many places to find others who totally agree with you and who validate your position. Technically, it’s called an “echo chamber” and we are surrounded by it.

Never fear, someone will share your viewpoint and increase your confidence level in believing that you are right … irrespective of what others think.

The 24/7 cable news business model was built on this premise; it increased eyeballs, resulting in higher ratings, which drive higher advertising rates. They probably caught on from analyzing newspapers, which learned early that good news doesn’t sell as well, just as their street vendors learned that shouting “Read all about it (fill in the blank)” sold papers. Living in the U.K. for five years finally broke my newspaper habit, though it was mostly sensory overload from all the tabloids rather than any loss of appetite for a juicy story, regardless of the topic.

People who study the echo chamber have been writing about the increase in “tribalism,” in the sense that people are actually moving to communities, joining clubs and sharing social media with like-minded people at an accelerating rate. I suppose this will continue, but I haven’t found a tribe that will have me. In fact, quite the opposite, since I much prefer hearing a broadly diverse spectrum of ideas.

I relish hearing opinions about climate change, gun control, border security, health care, policing our cities, the Electoral College, the Iraq War, media bias and so on … especially from smart people who have fresh ideas … instead of stale recycled talking points borrowed from others. I regularly read both The New York Times and The Wall Street Journal to get basic balance. The only line I draw is at wild conspiracies, unless they’re packaged by people who are also highly entertaining (e.g. Oliver Stone and his JFK or Platoon).

Doris Kearns Goodwin’s Team of Rivals does a terrific job of explaining this concept, using the election of 1860 and how Abraham Lincoln leveraged his administration by filling three of his top Cabinet posts with his main election rivals. They became part of the solution rather than critics. In my opinion, the U.S. Congress should practice this to gain consensus rather than relying on an appellate system and the Supreme Court to shape our legal landscape.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

John Adams saw the White House as a home for ‘honest and wise men’

A vintage creamware punch bowl, commemorating “John Adams President of the United States,” sold for $15,535 at a March 2008 Heritage auction.

By Jim O’Neal

As the states prepared for the first presidential election under the new Constitution, it was clear that George Washington was the overwhelming favorite to become the first president of the United States.

Under the rules, each Elector would cast two votes, and at the February 1789 Electoral College, all 69 Electors cast one of their votes for Washington, making him the unanimous choice of the 10 participating states. Two of the original Colonies (North Carolina and Rhode Island) had not yet ratified the Constitution, and New York had an internal dispute and did not choose Electors in time to participate. Eleven other men received a total of 69 second votes, with John Adams topping the list at 34, slightly less than 50 percent. He became the first vice president.

Four years later, there were 15 states (Vermont and Kentucky having joined) and the Electoral College had grown to 132 Electors. Again, Washington was elected president unanimously, with 132 votes. Adams was also re-elected, with 77 votes, besting George Clinton, Thomas Jefferson and Aaron Burr. All three of the runners-up would later become vice presidents, with Clinton serving under two different presidents (Jefferson and Madison). Jefferson had cleverly picked Clinton as his VP because of his age, correctly assuming Clinton would be too old to succeed him … thus ensuring that Secretary of State James Madison would be the logical choice. Clinton would become the first VP to die in office.

John Adams

Two-time Vice President John Adams would finally win the presidency on his third try after Washington decided not to seek a third term in 1796. Still, Adams barely squeaked by, defeating Jefferson 71-68. Jefferson would become vice president after finishing second. It was during the Adams presidency that the federal government made its final move south, after residing first in New York City and then Philadelphia.

This relocation was enabled by the 1790 Residence Act, a compromise that was brokered by Jefferson with Alexander Hamilton and James Madison, with the proviso that the federal government assume all remaining state debts from the Revolutionary War. In addition to specifying the Potomac River area as the permanent seat of the government, it further authorized the president to select the exact spot and allowed a 10-year window for completion.

Washington rather eagerly agreed to assume this responsibility and launched into it with zeal. He personally selected the exact spot, despite expert advice against it. He even set the stakes for the foundation himself and carefully supervised the myriad details involved during actual construction. When the stone walls were rising, everyone on the project assembled to lay the cornerstone and affix an engraved plate. Once in the mortar, the plate sank and has never been located. An effort was made to find it on the 200th anniversary in 1992: all the old maps were pored over and the area was X-rayed … all to no avail. It remains undetected.

The project was completed on time, and with Washington in his grave for 10 months, plans were made to move the president’s household from Philadelphia. The first resident, President John Adams, entered the President’s House at 1 p.m. on Nov. 1, 1800. It was the 24th year of American independence, and three weeks later he would deliver his fourth State of the Union address to a joint session of Congress. It was the last annual message delivered in person for 113 years; Thomas Jefferson discontinued the practice and it was not revived until 1913 (by Woodrow Wilson). With the advent of radio, followed by television, the opportunity was simply too tempting for any succeeding president to pass up.

John Adams was a fifth-generation American. He followed his father to Harvard and dabbled in teaching before becoming a lawyer. His most well-known case was defending the British captain and eight soldiers involved in the Boston Massacre on March 5, 1770. He was not involved in the Boston Tea Party, but rejoiced since he suspected it would inevitably lead to the convening of the First Continental Congress in Philadelphia in 1774.

He married Abigail Smith … the first woman to be both the wife of one president and the mother of another. Unlike Barbara Bush, she did not live to see it: she died in 1818, before John Quincy Adams became president in 1825. Both father and son served only one term. Abigail had not yet joined the president at the White House, but the morning after his arrival he sent her a letter with a benediction for their new home: “I pray heaven to bestow the best blessing on this house and on all that shall hereafter inhabit it. May none but honest and wise men ever rule under this roof.” Franklin D. Roosevelt was so taken with it that he had it carved into the State Dining Room mantel in 1945.

Amen.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Benjamin Franklin’s basement was literally filled with skeletons

A pre-1850 folk art tavern sign depicting Benjamin Franklin sold for $11,250 at a May 2014 Heritage auction.

By Jim O’Neal

The Benjamin Franklin House is a formal museum in Central London near Trafalgar Square, a popular location for kooky political speeches and peaceful demonstrations. Although anyone is free to speak about virtually anything, many visitors pay no rapt attention, preferring instead to feed the pigeons. I never had the temerity to practice my public speaking there, although I’m sometimes tempted (“Going wobbly,” as my English friends would observe).

Known once as Charing Cross, Trafalgar Square now commemorates the British naval victory in October 1805 off the coast of Cape Trafalgar, Spain. Admiral Horatio Nelson defeated the Spanish and French fleets there, resulting in Britain gaining global sea supremacy for the next century.

The Franklin House is reputedly the only building still standing, anywhere, in which Franklin actually lived. He resided there for several years after accepting a diplomatic role from the Pennsylvania Assembly in pre-Revolutionary times. Derelict for most of the 20th century, the site caused a stir 20-plus years ago while it was being renovated: during the extensive excavation, a cache of several hundred human bones was unearthed.

Since anatomy was one of the few sciences Franklin did not dabble in, the consensus was that one of his colleagues did, at a time when privately dissecting cadavers was unlawful and those who did it were very discreet. I discovered the museum while riding a black cab on the way to the American Bar at the nearby Savoy Hotel. I may take the full tour if we ever return to London.

However, my personal favorite is likely to remain the Franklin Institute in the middle of Philadelphia. A large rotunda features the official national memorial to Franklin: a 20-foot marble statue sculpted by James Earle Fraser in 1938. It was dedicated by Vice President Nelson Aldrich Rockefeller in 1976. Fraser is well known in the worlds of sculpting, medals and coin collecting. He designed the Indian Head (Buffalo) nickel, minted from 1913-38; several key dates in high grade have sold for more than $100,000 at auction. I’ve owned several nice ones, including the popular 3-Leg variety that was minted in Denver in 1937. (Don’t bother checking your change!).

Fraser (1876-1953) grew up in the West; his father, an engineer, was one of the men asked to help retrieve remains from Custer’s Last Stand. George Armstrong Custer needs no introduction, thanks to the famous massacre of his command by the Lakota, Cheyenne and Arapaho in 1876 – the year Fraser was born – at the Battle of the Little Bighorn (Montana). The connection helps explain Fraser’s empathy for American Indians as they were forced from their lands. His famous statue titled End of the Trail depicts the despair in a dramatic and memorable way. The Beach Boys used it for the cover of their 1971 album Surf’s Up.

Another historic Fraser sculpture is 1940’s Equestrian Statue of Theodore Roosevelt at the American Museum of Natural History (AMNH) in New York City. Roosevelt is on horseback with an American Indian standing on one side and an African-American man on the other. The AMNH was built using private funds, including from TR’s father, and it is an outstanding world-class facility in a terrific location across from Central Park.

However, there is a movement to have Roosevelt’s statue removed, with activists claiming it is racist and emblematic of the theft of land by Europeans. Another group has been actively throwing red paint on the statue while a commission appointed by Mayor Bill de Blasio studies how to respond to the seemingly endless efforts to erase history. Apparently, the city’s Columbus Circle and its controversial namesake have dropped off the radar screen.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Despite numerous failed examples, socialism still fascinates some people

An 1872 presidential campaign banner for Horace Greeley sold for $40,000 at a December 2016 Heritage auction.

By Jim O’Neal

Many credit the famous 19th-century motto “Go West, young man” to newspaperman Horace Greeley, based on a line in a July 1865 editorial. However, there is still debate over whether it was first penned by Greeley or the lesser-known John Soule in an 1851 edition of the Terre Haute (Ind.) Express. Either way, the dictum helped fuel the westward movement of Americans in our quest for Manifest Destiny (“From sea to shining sea”), and Greeley clearly did more to popularize the concept, given the great influence of his successful newspaper.

Greeley was much less successful as a politician. He was sent to Congress in 1848 in a special election to represent New York. His colleagues groused that the brief three months he spent there were devoted primarily to exposing Congressional corruption in his newspaper rather than passing legislation. He was unable to generate any meaningful support for re-election, which relegated him back to his real interests: reporting the news and exposing crooked politicians.

Despite this setback to his political career, Greeley remained a powerful force in American politics throughout the entire Civil War period and beyond. After exposing the corruption in the first term of the Grant presidency (1869-1873), he found himself in the curious position of being the presidential candidate of both the Democratic Party (which he had opposed on every issue for many years) and the Liberal Republican Party (an offshoot that objected to the corruption).

The 1872 presidential election was especially bitter, with both sides resorting to dirty tricks and making wild allegations against each other. Grant won the Republican nomination unanimously and, as the incumbent, chose not to campaign actively. Greeley was a virtual whirlwind, traveling widely and making 20 or more speeches a day. A cynic observed that he was delivering the wrong message to the wrong audience, but fundamentally, Greeley was simply a poor campaigner and Grant was still a very popular president and general.

Grant easily won re-election with 56 percent of the popular vote, and Greeley died on Nov. 29 – just 24 days after the election and before the electoral votes were cast or counted. It remains the only time a major party’s nominee for president has died during the election process. Grant went on to snag a comfortable 286 electoral votes, while the rest were spread among several candidates, including three cast for the deceased Greeley (which were later contested).

Thus ended the life of Horace Greeley (1811-1872), founder and editor of the New-York Tribune, arguably in the top tier of great American newspapers. Established in 1841, it was renamed the New-York Daily Tribune (1842-1866) as its daily circulation exploded to 200,000. Greeley endlessly promoted utopian reforms such as vegetarianism, agrarianism, feminism and socialism. From 1852 to 1862, the paper retained Karl Marx as its London-based European correspondent, giving him a platform to elaborate on the basic tenets of Marxism.

Great Britain, meanwhile, had just finished its decennial census, which put the population at precisely 20,959,477. This was just 1.6 percent of the world’s population, but nowhere on the planet was there a richer or more productive group of people. The empire produced 50 percent of the world’s iron and coal, controlled two-thirds of the shipping and accounted for one-third of all trade. London’s banks had more money on deposit than all other financial centers … combined! Virtually all the finished cotton in the world was produced in Great Britain, on machines built in Britain by British inventors.

The famous British Empire covered 11.5 million square miles and included 25 percent of the world’s population. By whatever measurement, it was the richest, most innovative and skilled nation known to man. And in London, where he was living the good life (primarily on his friend Friedrich Engels’ money), Marx was still churning out socialist propaganda. He made no attempt to explain that, for the first time in history, there was a lot of everything in most people’s lives. Victorian London was not only the largest city in the world, but the only place one could buy 500 different kinds of hammers and a dazzling array of nails to pound.

While Marxism morphed into Bolshevism, communism and socialism – polluting the economic systems of many hopeful utopians like Greeley – capitalism and the market-based theories of Adam Smith (“the father of modern economics”) quietly crept over America almost unnoticed. Despite the numerous failed examples of socialism in the real world, there will always be a new generation of people wanting to try it.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Here’s Why We Owe a Lot to Second President John Adams

An 1805 oil-on-canvas portrait of John Adams attributed to William Dunlap sold for $35,000 at a May 2017 Heritage auction.

By Jim O’Neal

John Adams had the misfortune of being squeezed into the presidency of the United States (for a single term) between George Washington and Thomas Jefferson, two of the most famous presidents of all time. As a result, Adams (1735-1826) was often overlooked as one of America’s greatest statesmen and perhaps the most learned and penetrating thinker of his time. The importance of his role in the founding of America was noted by Richard Stockton, a delegate to the Continental Congress: “The man to whom the country is most indebted for the great measure of independence. … I call him the Atlas of American Independence.”

On the way to that independence, his participation started as early as 1761 when he assisted James Otis in defending Boston merchants against Britain’s enforcement of the Sugar Tax. When the American Revolution ended, Adams played a key role in the peace treaty that formally ended the war in 1783. In between those two bookends, he wrote many of the most significant essays and treatises, led the radical movement in Boston, and articulated the principles at the Continental Congress.

When Parliament passed the infamous Stamp Act in 1765, Adams attacked it with a vengeance, writing A Dissertation on the Canon and Feudal Law and asserting that the act deprived the colonists of two basic rights: taxation by consent and trial by a jury of peers – both guaranteed to all Englishmen by the Magna Carta. Within a brief 10 years, he was acknowledged as one of America’s best constitutional scholars. When Parliament passed the Coercive Acts in 1774, Adams drafted the principal clause of the Declaration of Rights and Grievances; no man worked harder in the movement for independence and the effort to constitutionalize the powers of self-government.

After the Battles of Lexington and Concord, Adams argued for the colonies to declare independence, and in 1776, Congress passed a resolution recommending that the colonies draft new constitutions and form new governments. Adams wrote a draft blueprint, Thoughts on Government, and four states used it to shape new constitutions. In summer 1776, as Congress weighed a formal declaration of independence, John Adams made a four-hour speech that forcefully persuaded the assembly to vote in favor. Thomas Jefferson later recalled that “it moved us from our seats … He was our colossus on the floor.”

Three years later, Adams drafted the Massachusetts Constitution, which was copied by other states and guided the framers of the Federal Constitution of 1787.

He faithfully served two full terms as vice president under George Washington at a time when the office had only two primary duties: presiding over the Senate (and breaking any tie votes) and counting the ballots in presidential elections. Many routinely considered the office part of Congress rather than the executive branch. He served one term as president and then lost the 1800 election to his own vice president, Thomas Jefferson, as the party system (and Alexander Hamilton) conspired against his re-election. Bitter and disgruntled, he left Washington, D.C., before Jefferson was inaugurated and returned to his home in Massachusetts. His wife Abigail had departed earlier; their son Charles had died in November from the effects of chronic alcoholism.

Their eldest son, John Quincy Adams, served as the sixth president (also for a single term) after a contentious election, and both father and son gradually sank into relative obscurity. This changed dramatically in 2001, when historian David McCullough published a wonderful biography that reintroduced John and Abigail Adams to a generation that only vaguely knew he had died on the same day as Thomas Jefferson – July 4, 1826, the 50th anniversary of the signing of the Declaration of Independence. In typical McCullough fashion, it was a bestseller, and it led to an epic TV miniseries that snagged four Golden Globes and a record 13 Emmys in 2008.

Television at its very best!

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

McKinley Skillfully Assumed More Presidential Power

This William McKinley political poster, dated 1900, sold for $6,875 at a May 2015 Heritage auction.

By Jim O’Neal

William McKinley was 54 years old at the time of his first inauguration in 1897. The Republicans had selected him as their nominee at the St. Louis convention on the first ballot on June 16, 1896. He had spent several years as an effective congressional representative and, more recently, as the 39th governor of Ohio. Importantly, he had the backing of a shrewd manager, Mark Hanna, and the promise of what turned out to be the largest campaign fund in history – $3.5 million – much of it raised by invoking the opposition’s portrayal of the campaign as a crusade of the working man versus the rich, who had supposedly impoverished the poor by limiting the money supply.

In the 1896 election, he defeated a remarkable 36-year-old orator, William Jennings Bryan, perhaps the most talented public speaker who ever ran for any office. McKinley wisely decided he could not compete against Bryan in a national campaign filled with political speeches. He adopted a novel “front porch” campaign that resulted in trainloads of voters arriving at his home in Canton, Ohio.

Bryan would lose again to McKinley in 1900, sit out the 1904 race against Teddy Roosevelt, and then lose a third time in 1908 to William Howard Taft. The three-time Democratic nominee did serve two years as secretary of state under Woodrow Wilson (1913-15) and died five days after the end of the Scopes Monkey Trial in 1925.

William and Ida McKinley followed Grover and Frances Cleveland into the White House after Cleveland’s non-consecutive terms as the 22nd and 24th president. Cleveland’s second term began with a disaster – the Panic of 1893 – when stock prices declined, 500 banks closed, 15,000 businesses failed and unemployment skyrocketed. This significant depression lasted all four years of his term in office and Cleveland, a Democrat, got most of the blame.

His excuse was the 1890 Sherman Silver Purchase Act, which required the Treasury to buy any silver offered, using notes backed by silver or gold. An enormous overproduction of silver by Western mines forced the Treasury to borrow $65 million in gold from J.P. Morgan and the Rothschild family in England. Since Cleveland was unable to turn the economy around, the episode virtually ruined the Democratic Party and cemented the era of Republican domination that ran from 1861 to 1933, interrupted only when Woodrow Wilson won in 1912 after squabbling between Roosevelt and Taft split the vote three ways.

It’s common knowledge that McKinley was assassinated in 1901 after winning re-election in 1900, but little attention is paid to the time he spent in office beginning in 1897. The year 1898 got off to a wobbly start when his mother died, leading to a full 30 days of mourning that canceled an important diplomatic New Year’s celebration. Tensions between the United States and Spain over Cuba had electrified the diplomatic community, and it was hoped that a White House reception would provide a convenient venue to discuss strategic options.

Spain had mistreated Cuba ever since Columbus discovered the island in 1492, and in 1895 Madrid suspended the constitutional rights of the Cuban people following numerous internal revolutions. Once again, the countryside raged with bloody guerrilla warfare; 200,000 Spanish troops were busy suppressing the insurgents and cruelly governing the peasant population. American newspapers horrified the public with details that offended their sense of justice and prompted calls for U.S. intervention. Talk of war with Spain was in the air again.

On Feb. 9, two days before a reception to honor the U.S. Army and Navy, the New York Journal published a front-page article revealing a letter in which a Spanish diplomat denounced McKinley as a weakling, “a mere bidder for the admiration of the crowd.” The same day, the Spanish minister in Washington retrieved his passport from the State Department and boarded a train to Canada.

A rapid series of events led to war with Spain, including $50 million that Congress placed at the disposal of the president to be used for defense of the country, with no conditions attached. McKinley was wary of war due to his experience in the Civil War, but he carefully discussed the issue with his Cabinet and key senators to ensure concurrence. This was the first significant step to war and ultimately the transformation of presidential power. On April 25, Congress formally declared war on Spain and the actual landing of forces took place on June 6, when 100 Marines went ashore at Guantanamo Bay.

McKinley’s skillful assumption of authority during the Spanish-American War subtly changed the presidency, as Professor Woodrow Wilson of Princeton University wrote: “The president of the United States is now … at the front of affairs as no president, except Lincoln, has been since the first quarter of the 19th century.” Those who followed McKinley into the White House would develop and expand these new powers of the presidency … starting with his vice president and successor Theodore Roosevelt, who had eagerly participated in the war with Spain with his Rough Riders at San Juan Hill.

We see their fingerprints throughout the 20th century and even today, as the concept of a formal declaration of war has become murky. Urgency has gradually eroded the war powers enumerated to Congress, and there is almost always “no time to wait for an impotent Congress to resolve their partisan differences.”

The Founding Fathers would be surprised at how far the pendulum has swung.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Tremendous Challenges Awaited the Plainspoken Truman

Fewer than 10 examples of this Harry Truman “60 Million People Working” political pin are known to exist. This pin sold for $19,717 at an August 2008 Heritage auction.

By Jim O’Neal

When Franklin Roosevelt died on April 12, 1945, Harry Truman became the seventh vice president to move into the Oval Office after the death of a president. Truman had been born during the White House years of Chester Arthur, who had followed James Garfield after his assassination (1881). And in Truman’s lifetime, Teddy Roosevelt and Calvin Coolidge had ascended to the presidency after the deaths of William McKinley (1901) and Warren Harding (1923). However, none of these men had been faced with the challenges awaiting the plainspoken Truman.

FDR had been a towering figure for 12 years, first leading the country out of the Great Depression and then deftly steering the United States through World War II after being elected a record four times. Unfortunately, Truman had not been involved in several important decisions and was totally unaware of key strategic secrets (e.g. the development of the atom bomb) and even side agreements made with others, notably Winston Churchill. He was not prepared to be president.

Even comparisons with the presidents who preceded FDR tended to magnify the gap in Truman’s foreign-relations experience: Woodrow Wilson had been a brilliant academic, Herbert Hoover a world-famous engineer. There were enormously important decisions to be made that would shape the world for the next half century, and Truman himself had sincere doubts about being able to follow FDR, despite having watched the president’s rapidly failing health.

The significance of these decisions has gradually faded, but for Truman, they were foisted upon him in rapid order: April 12, FDR’s death; April 28, Benito Mussolini killed by Italian partisans; April 29, German forces in Italy surrendered; the next day, Adolf Hitler committed suicide. The news from the Pacific was equally dramatic, as troop landings on the critical island of Okinawa had apparently been unopposed by the Japanese. It was clearly the apex of optimism regarding the prospects for an unconditional surrender by Japan and the welcome return of world peace.

In fact, it was a miracle that turned out to be a mirage.

After victory in Europe (V-E Day), Truman faced an immediate challenge regarding the 3 million troops there. FDR and Churchill had not trusted Joseph Stalin and were wary of what the Russians would do if we started withdrawing our troops. Churchill proved to be right about Russian motives: the Soviets secretly intended to occupy the whole of Eastern Europe permanently and to expand into adjacent territories at will.

Then the U.S. government issued a report stating that the domestic economy could make a smooth transition to pre-war normalcy once the voracious demands of the military war machine abated. Naturally, the war-weary public strongly supported “bringing the boys home,” but Truman knew that Japan would have to be forced to quit before any shifts in troops or production could start.

There was also a complex scheme under way to redeploy the troops from Europe to the Pacific if the Japanese decided to fight on to defend their sacred homeland. It was a task that George Marshall would call “the greatest administrative and logistical problem in the history of the world.”

Truman pondered in a diary entry: “I have to decide the Japanese strategy – shall we invade Japan proper or shall we bomb and blockade? That is my hardest decision to date.” (No mention was made of “the other option.”)

The battle on Okinawa answered the question. Hundreds of Japanese suicide planes had a devastating effect, even after 10 days of heavy sea and air bombardment of the island: 30 U.S. ships sunk, 300 more damaged, 12,000 Americans killed and 36,000 wounded. It was now obvious that Japan would defend every single island, regardless of its losses. Surrender would not occur and America’s losses would be extreme.

So President Truman made a historic decision that is still being debated today: Drop the atomic bomb on Japan and assume that the effect would be so dramatic that the Japanese would immediately surrender. On Aug. 6, 1945, “Little Boy” was dropped on Hiroshima with devastating effects. Surprisingly, the Japanese maintained their silence, perhaps not even considering that there could be a second bomb. That second bomb – a plutonium variety nicknamed “Fat Man” – was then dropped two days ahead of schedule on Aug. 9 on the seaport city of Nagasaki.

No meeting had been held and no second order was given (other than by Enola Gay pilot Paul Tibbets). The directive that had ordered the first bomb simply said in paragraph two that “additional bombs will be delivered AS MADE READY.” However, two were enough. Imperial Japan surrendered on Aug. 15, thus ending one of history’s greatest wars.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

How Far Will We Go In Amending American History?

A collection of items related to the dedication of the Washington Monument went to auction in May 2011.

By Jim O’Neal

Four years ago, George Clooney, Matt Damon and Bill Murray starred in a movie titled The Monuments Men, about a group of almost 400 specialists commissioned to try to retrieve monuments, manuscripts and artwork that had been looted in World War II.

The Germans were especially infamous for this, shipping long strings of railroad cars from all over Europe to German generals in Berlin. While they occupied Paris, they nearly stripped the city of its fabled collections of works by the world’s greatest artists. Small stashes of hidden art are still being discovered today.

In the United States, another generation of anti-slavery groups is doing the exact opposite: lobbying to have statues and monuments removed, destroyed or relocated to obscure museums to gather dust out of the public eye. Civil War flags and memorabilia on display were among the first to disappear, followed by Southern generals and others associated with the war. Now, streets and schools are being renamed. Slavery has understandably been the reason for the zeal to erase the past, but it sometimes appears the effort is slowly moving up the food chain.

More prominent names like President Woodrow Wilson have been targeted; for several years, protesters have objected to the way Princeton University still honors Wilson, asserting he was a Virginia racist. Last year, Yale removed John C. Calhoun’s name from one of its residential colleges because he was one of the more vocal advocates of slavery, opening the path to the Civil War by supporting South Carolina’s asserted right to decide the slavery issue for itself (which is an unquestionable fact). Dallas finally got around to removing some prominent Robert E. Lee statues, although one of the forklifts broke in the process.

Personally, I don’t object to any of this, especially if it helps to reunite America. So many different things seem to end up dividing us even further and this only weakens the United States (“United we stand, divided we fall”).

However, I hope to still be around if (when?) we erase Thomas Jefferson from the Declaration of Independence and are only left with George Washington and his extensive slavery practices (John Adams did not own slaves and Massachusetts was probably the first state to outlaw it).

It would seem relatively easy to rename Mount Vernon, or even Washington, D.C., the nation’s capital. But the Washington Monument may be an engineering nightmare. The Continental Congress proposed a monument to the Father of Our Country in 1783, even before the treaty conferring American independence was received. It was to honor his role as commander-in-chief during the Revolutionary War. But when Washington became president, he canceled it, since he didn’t believe public money should be used for such honors. (If only that ethos were still around.)

But the idea for a monument resurfaced on the centennial of Washington’s birth in 1832 (Washington died in 1799). A private group, the Washington National Monument Society – headed by Chief Justice John Marshall – was formed to solicit contributions. However, they were not sophisticated fundraisers, since they limited gifts to $1 per person a year. (These were obviously very different times.) The shortfall was exacerbated by the economic depression that gripped the country, and the laying of the cornerstone was delayed until July 4, 1848. An obscure congressman by the name of Abraham Lincoln was in the cheering crowd.

Even by the start of the Civil War 13 years later, the unsightly stump was still only 170 feet high, a far cry from the 600 feet originally projected. Mark Twain joined the chorus of critics: “It has the aspect of a chimney with the top broken off … It is an eyesore to the people. It ought to be either pulled down or built up and finished.” Finally, President Ulysses S. Grant got Congress to appropriate the money; work started again and the monument ultimately opened in 1888. At the time, it was 555 feet tall and the tallest building in the world … a record eclipsed the following year when the Eiffel Tower was completed.

For me, it’s an impressive structure, with its sleek marble silhouette. I’m an admirer of the simplicity of plain, unadorned obelisks, since there are so few of them (only two in Maryland that I’m aware of). I realize others consider it on a par with a stalk of asparagus, but I’m proud to think of George Washington every time I see it.

Even so, if someday someone thinks it should be dismantled as the last symbol of a different period, they will be disappointed to learn of all the other cities, highways, lakes, mountains and even a state that would remain. Perhaps we can find a better use for all of that passion, energy and commitment and start rebuilding a crumbling infrastructure so in need of repairs. One can only hope.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].