100 Years Before Rosa Parks, There Was Octavius Catto

Rosa Parks refused to give up her seat on a segregated bus, sparking the Montgomery, Ala., bus boycott.

By Jim O’Neal

Most Americans are familiar with Rosa Parks and recall the heroic story of a weary black woman on her way home after a hard day at work who refused to give up her seat and “move to the back of the bus” to make room for white people. The date was Dec. 1, 1955, and the city was Montgomery, Ala.

She was arrested and fined $10, but ultimately vindicated by the U.S. Supreme Court, which ruled the segregation law unconstitutional. The ensuing Montgomery bus boycott lasted 381 days. After her death, she became the first African-American woman to have her likeness depicted in the National Statuary Hall in the U.S. Capitol.

Parks (1913-2005) earned her way into the pantheon of civil rights leaders, but few remember a remarkable man who preceded her by a century when streetcars were pulled by horses.

Catto

His name was Octavius Valentine Catto (1839-1871) and history was slow in recognizing his astonishing accomplishments. Even the epitaph on his tombstone shouts in bold letters “THE FORGOTTEN HERO.” One episode in his far-too-short but inspiring life is eerily similar to the events in Montgomery, only dramatically more so. Catto was a fierce enemy of the entire Philadelphia trolley car system, which banned black passengers. On May 18, 1865, The New York Times ran a story about an incident involving Catto that occurred the previous afternoon in Philadelphia, “The City of Brotherly Love” (at least for some).

Paraphrasing the story, it describes how a colored man (Catto) had refused all attempts to get him to leave a strictly segregated trolley car. Frustrated, and fearing a fine if he physically ejected the man, the conductor cleverly sidetracked the car, detached the horses and left the defiant passenger in the now-empty stationary car. Apparently, the stubborn man was still on board after spending the night. It caused a neighborhood sensation that led to even more people challenging the rules.

The following year, there was an important meeting of the Pennsylvania State Equal Rights League to protest the forcible ejection of several black women from Philadelphia streetcars. The intrepid Catto presented a number of resolutions that highlighted the inequities of segregation, the principles of freedom and civil liberty, and a heavily biased judicial system. He also boldly solicited support from fellow citizens in his quest for fairness and justice.

He got specific help from Pennsylvania Congressman Thaddeus Stevens, a leader of the “Radical Republicans” who had a fiery passion for desegregation and the abolition of slavery, and who criticized President Lincoln for not taking more forceful action. Stevens is a major character in Steven Spielberg’s 2012 Oscar-nominated film Lincoln, with Tommy Lee Jones gaining an Oscar nomination for his portrayal of Stevens. On Feb. 3, 1870, the 15th Amendment to the Constitution guaranteed suffrage to black men (women of all colors would have to wait another 50 years, until 1920, to gain the right to vote in all states). It would also lead to Catto’s death. On Election Day, Oct. 10, 1871, Catto was out encouraging black men to vote for Republicans. He was fatally shot by white Democrats who wanted to suppress the black vote.

Blacks continued to vote heavily for Republicans until the early 20th century and were not even allowed to attend Democratic conventions until 1924. This was primarily because Southern states had white governors who mostly discouraged equal rights and supported Jim Crow laws that were unfair to blacks. As comedian Dick Gregory (1932-2017) famously joked, he was at a white lunch counter where he was told, “We don’t serve colored people here,” and Gregory replied, “That’s all right. I don’t eat colored people … just bring me a whole fried chicken!”

Octavius Catto, who broke segregation on trolley cars and was an all-star second baseman long before Jackie Robinson, would have to wait until the 20th century to get the recognition he deserved. I suspect he would be surprised that we are still struggling to “start a national conversation” about race when that’s what he sacrificed his life for.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Here’s Why We Owe a Lot to Second President John Adams

An 1805 oil-on-canvas portrait of John Adams attributed to William Dunlap sold for $35,000 at a May 2017 Heritage auction.

By Jim O’Neal

John Adams had the misfortune of being squeezed into the presidency of the United States (for a single term) between George Washington and Thomas Jefferson, two of the most famous presidents of all time. As a result, Adams (1735-1826) was often overlooked as one of America’s greatest statesmen and perhaps the most learned and penetrating thinker of his time. The importance of his role in the founding of America was noted by Richard Stockton, a delegate to the Continental Congress: “The man to whom the country is most indebted for the great measure of independence. … I call him the Atlas of American Independence.”

On the way to that independence, his participation started as early as 1761 when he assisted James Otis in defending Boston merchants against Britain’s enforcement of the Sugar Tax. When the American Revolution ended, Adams played a key role in the peace treaty that formally ended the war in 1783. In between those two bookends, he wrote many of the most significant essays and treatises, led the radical movement in Boston, and articulated the principles at the Continental Congress.

Following the infamous Stamp Act in 1765, he attacked it with a vengeance and wrote A Dissertation on the Canon and Feudal Law, asserting it deprived the colonists of two basic rights: taxation by consent and a jury trial by peers – both guaranteed to all Englishmen by the Magna Carta. Within a brief 10 years, he was acknowledged as one of America’s best constitutional scholars. When Parliament passed the Coercive Acts in 1774, Adams drafted the principal clause of the Declaration of Rights and Grievances; no man worked harder in the movement for independence and the effort to constitutionalize the powers of self-government.

After the Battles of Lexington and Concord, Adams argued for the colonies to declare independence, and in 1776, Congress passed a resolution recommending the colonies draft new constitutions and form new governments. Adams wrote a draft blueprint, Thoughts on Government, and four states used it to shape new constitutions. In summer 1776, Congress considered arguments for a formal declaration of independence, and John Adams made a four-hour speech that forcefully persuaded the assembly to vote in favor. Thomas Jefferson later recalled that “it moved us from our seats … He was our colossus on the floor.”

Three years later, Adams drafted the Massachusetts Constitution, which was copied by other states and guided the framers of the Federal Constitution of 1787.

He faithfully served two full terms as vice president for George Washington at a time when the office had only two primary duties: preside over the Senate and break any tie votes, and count the ballots for presidential elections. Many routinely considered the office to be part of Congress as opposed to the executive branch. He served one term as president and then lost the 1800 election to his vice president, Thomas Jefferson, as the party system (and Alexander Hamilton) conspired against his re-election. Bitter and disgruntled, he left Washington, D.C., before Jefferson was inaugurated and returned to his home in Massachusetts. His wife Abigail had departed earlier, after their son Charles died in November from the effects of chronic alcoholism.

Their eldest son, John Quincy Adams, served as the sixth president (for a single term) after a contentious election, and they both gradually sank into relative obscurity. This changed dramatically in 2001 when historian David McCullough published a wonderful biography that reintroduced John and Abigail Adams to a generation that vaguely knew he had died on the same day as Thomas Jefferson, July 4, 1826 – the 50th anniversary of the signing of the Declaration of Independence. In typical McCullough fashion, it was a bestseller and led to an epic TV mini-series that snagged four Golden Globes and a record 13 Emmys in 2008.

Television at its very best!

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

McKinley Skillfully Assumed More Presidential Power

This William McKinley political poster, dated 1900, sold for $6,875 at a May 2015 Heritage auction.

By Jim O’Neal

William McKinley was 54 years old at the time of his first inauguration in 1897. The Republicans had selected him as their nominee at the St. Louis convention on the first ballot on June 16, 1896. He had spent several years as an effective congressional representative and more recently served as the 39th governor of Ohio. Importantly, he had the backing of a shrewd manager, Mark Hanna, and the promise of what turned out to be the largest campaign fund in history – $3.5 million – raised largely by alarming business interests over his opponent’s portrayal of the campaign as a crusade of the working man versus the rich, who had impoverished the poor by limiting the money supply.

In the 1896 election, he defeated a remarkable 36-year-old orator, William Jennings Bryan, perhaps the most talented public speaker who ever ran for any office. McKinley wisely decided he could not compete against Bryan in a national campaign filled with political speeches. He adopted a novel “front porch” campaign that resulted in trainloads of voters arriving at his home in Canton, Ohio.

Bryan would lose again to McKinley in 1900, duck Teddy Roosevelt in 1904, and then lose a third time in 1908 against William Howard Taft. The three-time Democratic nominee did serve two years as secretary of state for Woodrow Wilson (1913-15) and then died five days after the end of the Scopes Monkey Trial in 1925.

William and Ida McKinley followed Grover and Frances Cleveland into the White House after Cleveland’s non-consecutive terms as the 22nd and 24th president. Cleveland’s second term began with a disaster – the Panic of 1893 – when stock prices declined, 500 banks closed, 15,000 businesses failed and unemployment skyrocketed. This significant depression lasted all four years of his term in office and Cleveland, a Democrat, got most of the blame.

His excuse was the 1890 Sherman Silver Purchase Act, which required the Treasury to buy any silver offered using notes backed by silver or gold. An enormous over-production of silver by Western mines forced the Treasury to borrow $65 million in gold from J.P. Morgan and the Rothschild family in England. Since Cleveland had been unable to turn the economy around, the episode virtually ruined the Democratic Party and cemented an era of Republican domination stretching from 1861 to 1933, interrupted after Cleveland only by Woodrow Wilson, whose first win came in 1912 when squabbling between Roosevelt and Taft split the vote three ways.

It’s common knowledge that McKinley was assassinated in 1901 after winning re-election in 1900, but little attention is paid to the time he spent in office beginning in 1897. The year 1898 got off to a wobbly start when his mother died, leading to a full 30 days of mourning that canceled an important diplomatic New Year’s celebration. Tensions between the United States and Spain over Cuba had electrified the diplomatic community, and it was hoped that a White House reception would provide a convenient venue to discuss strategic options.

Spain had mistreated Cuba since Columbus discovered it in 1492, and in 1895 it suspended the constitutional rights of the Cuban people following numerous internal revolutions. Once again, the countryside raged with bloody guerrilla warfare; 200,000 Spanish troops were busy suppressing the insurgents and cruelly governing the peasant population. American newspapers horrified the public with details that offended their sense of justice and prompted calls for U.S. intervention. Talk of war with Spain was in the air again.

On Feb. 9, two days before a reception to honor the U.S. Army and Navy, the New York Journal published a front-page article revealing a private letter in which a Spanish diplomat denounced McKinley as a weakling, “a mere bidder for the admiration of the crowd.” The same day, the Spanish minister in Washington retrieved his passport from the State Department and boarded a train to Canada.

A rapid series of events led to war with Spain, including $50 million that Congress placed at the disposal of the president to be used for defense of the country, with no conditions attached. McKinley was wary of war due to his experience in the Civil War, but he carefully discussed the issue with his Cabinet and key senators to ensure concurrence. This was the first significant step to war and ultimately the transformation of presidential power. On April 25, Congress formally declared war on Spain and the actual landing of forces took place on June 6, when 100 Marines went ashore at Guantanamo Bay.

McKinley’s skillful assumption of authority during the Spanish-American War subtly changed the presidency, as Professor Woodrow Wilson of Princeton University wrote: “The president of the United States is now … at the front of affairs as no president, except Lincoln, has been since the first quarter of the nineteenth century.” Those who followed McKinley into the White House would develop and expand these new powers of the presidency … starting with his vice president and successor Theodore Roosevelt, who had eagerly participated in the war with Spain with his “Rough Riders” at San Juan Hill.

We see their fingerprints throughout the 20th century and even today, as the concept of formal declarations of war has become murky. Urgency has gradually eroded the war power enumerated to Congress, and there is almost always “no time to wait for an impotent Congress to resolve their partisan differences.”

The Founding Fathers would be surprised at how far the pendulum has swung.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Tremendous Challenges Awaited the Plainspoken Truman

Fewer than 10 examples of this Harry Truman “60 Million People Working” political pin are known to exist. This pin sold for $19,717 at an August 2008 Heritage auction.

By Jim O’Neal

When Franklin Roosevelt died on April 12, 1945, Harry Truman became the seventh vice president to move into the Oval Office after the death of a president. Truman had been born during the White House years of Chester Arthur, who had followed James Garfield after his assassination (1881). And in Truman’s lifetime, Teddy Roosevelt and Calvin Coolidge had ascended to the presidency after the deaths of William McKinley (1901) and Warren Harding (1923). However, none of these men had been faced with the challenges awaiting the plainspoken Truman.

FDR had been a towering figure for 12 years, first leading the country out of the Great Depression and then deftly steering the United States through World War II after being elected a record four times. Unfortunately, Truman had not been involved in several important decisions, and was totally unaware of several strategic secrets (e.g., the development of the atomic bomb) and even side agreements made with others, notably Winston Churchill. He was not prepared to be president.

Even comparisons with the presidents who had preceded FDR tended to magnify the gap in Truman’s foreign-relations experience: Woodrow Wilson was a brilliant academic and Herbert Hoover a world-famous engineer. There were enormously important decisions to be made that would shape the world for the next half century. Truman himself had sincere doubts about being able to follow FDR, even as the president’s health rapidly failed.

The significance of these decisions has gradually faded, but for Truman, they were foisted upon him in rapid order: April 12, FDR’s death; April 28, Benito Mussolini killed by Italian partisans; April 29, the surrender of German forces in Italy; the next day, Adolf Hitler’s suicide. The news from the Pacific was equally dramatic, as troop landings on the critical island of Okinawa had apparently been unopposed by the Japanese. It was clearly the apex of optimism regarding the prospects for an unconditional surrender by Japan and the welcomed return of world peace.

In fact, it was a miracle that turned out to be a mirage.

After victory in Europe (V-E Day), Truman was faced with an immediate challenge regarding the 3 million troops in Europe. FDR and Churchill did not trust Joseph Stalin and were wary of what the Russians would do if we started withdrawing our troops. Churchill proved to be right about Russian motives, as they secretly intended to occupy the whole of Eastern Europe permanently and expand into adjacent territories at will.

Then the U.S. government issued a report stating that the domestic economy could make a smooth transition to pre-war normalcy once the voracious demands from the military war-machine abated. Naturally, the war-weary public strongly supported “bringing the boys home,” but Truman knew that Japan would have to be forced to quit before any shifts in troops or production could start.

There was also a complex scheme under way to redeploy the troops from Europe to the Pacific if the Japanese decided to fight on to defend their sacred homeland. It was a task that George Marshall would call “the greatest administrative and logistical problem in the history of the world.”

Truman pondered in a diary entry: “I have to decide the Japanese strategy – shall we invade Japan proper or shall we bomb and blockade? That is my hardest decision to date.” (No mention was made of “the other option.”)

The battle on Okinawa answered the question. Hundreds of Japanese suicide planes had a devastating effect. Even after 10 days of heavy sea and air bombardment of the island, 30 U.S. ships were sunk and 300 more damaged, with 12,000 Americans killed and 36,000 wounded. It was now obvious that Japan would defend every single island, regardless of the losses. Surrender would not occur and America’s losses would be extreme.

So President Truman made a historic decision that is still being debated today: Drop the atomic bomb on Japan and assume that the effect would be so dramatic that the Japanese would immediately surrender. On Aug. 6, 1945, “Little Boy” was dropped on Hiroshima with devastating effects. Surprisingly, the Japanese maintained their silence, perhaps not even considering that there could be a second bomb. That second bomb – a plutonium variety nicknamed “Fat Man” – was then dropped two days ahead of schedule on Aug. 9 on the seaport city of Nagasaki.

No meeting was held and no second order was given (other than by Enola Gay pilot Paul Tibbets). The directive that had ordered the first bomb simply said in paragraph two that “additional bombs will be delivered AS MADE READY.” However, two proved to be enough. Imperial Japan surrendered on Aug. 15, thus ending one of history’s greatest wars.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Here’s Why Scientists Like Joseph Lister Have Made Life Better for All of Us

A March 25, 1901, letter signed by Joseph Lister went to auction in October 2014.

By Jim O’Neal

In the 1880s, American physicist Albert Michelson embarked on a series of experiments that undermined a long-held belief in a luminiferous ether that was thought to permeate the universe and affect the speed of light ever so slightly. Embraced by Isaac Newton (and almost venerated by all others), the ether theory was considered an absolute certainty in 19th century physics in explaining how light traveled across the universe.

However, Michelson’s experiments (partially funded by Alexander Graham Bell) proved the exact opposite. In the words of author William Cropper, “It was probably the most famous negative result in the history of physics.” The speed of light turned out to be the same in all directions and in every season – the first crack in a Newtonian framework that had been considered settled for the past 200 years. Even so, not everyone agreed for a long time.

The more modern scientist Max Planck (1858-1947) helped explain this resistance to accepting new facts in a rather novel way: “A scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die and a new generation grows up that is familiar with it.”

Even if true, that makes it no easier to accept that the United States was the only nation “that remained unconvinced of the merits of Joseph Lister’s methods of modern antiseptic medicine.” In fact, Henry Jacob Bigelow (1818-1890), the esteemed Harvard professor of surgery and a fellow of the Academy of Arts and Sciences, derided antisepsis as “medical hocus-pocus.” This is even more remarkable when one considers he was the leading surgeon in New England and his contributions to orthopedic and urologic surgery are legendary.

But this short story begins with a sleight of hand by asking: In the 19th century, what do you think was the most dangerous place in the vast territories of the British Empire? The frozen wastes of the Northwest Passage or the treacherous savannas of Zululand? Or perhaps the dangerous passes of the Hindu Kush? The surprising answer is almost undoubtedly the Victorian teaching hospital, where patients entered with a trauma and exited to a cemetery after a deadly case of “hospital gangrene.”

Victorian hospitals were described as factories of death, reeking with an unmistakable stench resembling rotting fish, cheerfully described as “hospital stink.” Infectious wounds were considered normal or beneficial to recovery. Stories abound of surgeons operating on a continuous flow of patients, and bloody smocks were badges of honor or evidence of their dedication to saving lives. The eminent surgeon Sir Frederick Treves (1853-1923) recalled, “There was one sponge to a ward. With this putrid article and a basin of once clear water, all the wounds in the ward were washed twice a day. By this ritual, any chance that a patient had of recovery was eliminated.”

Fortunately, Joseph Lister was born in 1827 and chose the lowly, mechanical profession of surgery over the more prestigious practice of internal medicine. In 1851, he was appointed one of four residents of surgery at London’s University College Hospital. The head of surgery was wrongly convinced that infections came from miasma, a peculiar type of noxious air that emanated from rot and decay.

Ever skeptical, Lister scoured out rotten tissue from gangrene wounds using mercury pernitrate on the healthy tissue. Thus began Lister’s lifelong journey to investigate the cause of infection and prevention through modern techniques. He spent the next 25 years in Scotland, becoming the Regius Professor of Surgery at the University of Glasgow. After Louis Pasteur confirmed germs caused infections rather than bad air, Lister discovered that carbolic acid (a derivative of coal tar) could prevent many amputations by cleaning the skin and wounds.

He then went on the road, advocating his gospel of antisepsis, which was eagerly adopted by the scientific Germans and some Scots; plodding and practical English surgeons took much longer. That left the isolated Americans who, like Dr. Bigelow, were too stubborn to admit the obvious.

Planck was right all along. It would take a new generation, but we are the generation that has derived the greatest benefits from the astonishing medical breakthroughs of the 20th century, which only seem to be accelerating. It is a good time to be alive.

So enjoy it!

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

How Far Will We Go In Amending American History?

A collection of items related to the dedication of the Washington Monument went to auction in May 2011.

By Jim O’Neal

Four years ago, George Clooney, Matt Damon and Bill Murray starred in a movie titled The Monuments Men, about a group of almost 400 specialists who were commissioned to try to retrieve monuments, manuscripts and artwork that had been looted in World War II.

The Germans were especially infamous for this and shipped long strings of railroad cars from all over Europe to German generals in Berlin. While they occupied Paris, they almost stripped the city of its fabled art collections by the world’s greatest artists. Small stashes of hidden art hoards are still being discovered today.

In the United States, another generation of anti-slavery groups is doing the exact opposite: lobbying to have statues and monuments removed, destroyed or relocated to obscure museums to gather dust out of the public eye. Civil War flags and memorabilia on display were among the first to disappear, followed by Southern generals and others associated with the war. Now, streets and schools are being renamed. Slavery has understandably been the reason for the zeal to erase the past, but it sometimes appears the effort is slowly moving up the food chain.

More prominent names like President Woodrow Wilson have been targeted; for several years, Princeton University has faced protests over the way it still honors Wilson, with critics asserting he was a Virginia racist. Last year, Yale removed John C. Calhoun’s name from one of its residential colleges because he was one of the more vocal advocates of slavery, opening the path to the Civil War by supporting states’ rights to decide the slavery issue in South Carolina (which is an unquestionable fact). Dallas finally got around to removing some prominent Robert E. Lee statues, although one of the forklifts broke in the process.

Personally, I don’t object to any of this, especially if it helps to reunite America. So many different things seem to end up dividing us even further and this only weakens the United States (“United we stand, divided we fall”).

However, I hope to still be around if (when?) we erase Thomas Jefferson from the Declaration of Independence and are only left with George Washington and his extensive slavery practices (John Adams did not own slaves and Massachusetts was probably the first state to outlaw it).

It would seem relatively easy to rename Mount Vernon or even Washington, D.C., the nation’s capital. But the Washington Monument may be an engineering nightmare. The Continental Congress proposed a monument to the Father of Our Country in 1783, even before the treaty conferring American independence was received. It was to honor his role as commander-in-chief during the Revolutionary War. But when Washington became president, he canceled it since he didn’t believe public money should be used for such honors. (If only that ethos were still around.)

But the idea for a monument resurfaced on the centennial of Washington’s birthday in 1832 (Washington died in 1799). A private group, the Washington National Monument Society – headed by Chief Justice John Marshall – was formed to solicit contributions. However, they were not sophisticated fundraisers, since they limited gifts to $1 per person a year. (These were obviously very different times.) The restriction was exacerbated by the economic depression that gripped the country in the late 1830s. As a result, the laying of the cornerstone was delayed until July 4, 1848. An obscure congressman by the name of Abraham Lincoln was in the cheering crowd.

Even by the start of the Civil War 13 years later, the unsightly stump was still only 170 feet high, a far cry from the 600 feet originally projected. Mark Twain joined in the chorus of critics: “It has the aspect of a chimney with the top broken off … It is an eyesore to the people. It ought to be either pulled down or built up and finished.” Finally, President Ulysses S. Grant got Congress to appropriate the money; work started again and the monument ultimately opened in 1888. At the time, it was 555 feet tall and the tallest building in the world … a record eclipsed the following year when the Eiffel Tower was completed.

For me, it’s an impressive structure, with its sleek marble silhouette. I’m an admirer of the simplicity of plain, unadorned obelisks, since there are so few of them (only two in Maryland that I’m aware of). I realize others consider it on a par with a stalk of asparagus, but I’m proud to think of George Washington every time I see it.

Even so, if someday someone thinks it should be dismantled as the last symbol of a different period, they will be disappointed to learn of all the other cities, highways, lakes, mountains and even a state that would remain. Perhaps we can find a better use for all of that passion, energy and commitment and start rebuilding a crumbling infrastructure so in need of repairs. One can only hope.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Fillmore Among Presidents Who Juggled Balance Between Free and Slave States

This folk art campaign banner for Millard Fillmore’s failed 1856 bid for the presidency sold for $11,950 at a June 2013 Heritage auction.

By Jim O’Neal

On his final day in office, President James Polk wrote in his diary: “Closed my official term of President of the United States at 6am this morning.”

Later, after one last stroll through the silent White House, he penned a short addendum: “I feel exceedingly relieved that I am now free from all public cares. I am sure that I will be a happier man in my retirement than I have been for 4 years ….” He died 103 days later – the shortest retirement in presidential history – and was the first president survived by his mother. His wife Sarah (always clad only in black) lived for 42 more lonely years.

Fillmore

The Washington, D.C., that greeted his successor, General Zachary Taylor (“Old Rough and Ready”), still looked “unfinished” – even after 50 years of planning and development. The Mall was merely a grassy field where cows and sheep peacefully grazed. The many plans developed in the 1840s were disparate projects. Importantly, the marshy expanse south of the White House was suspected of emitting unhealthy vapors that were especially notable in the hot summers. Cholera was the most feared disease and it was prevalent until November each year when the first frost appeared.

Taylor

Naturally, the affluent left the capital for the entire summer. Since the Polks had insisted on remaining, there was a widespread belief that his death so soon after departing was directly linked to spending the presidential summers in the White House. The theory grew even stronger when Commissioner of Public Buildings Charles Douglas proposed to regrade the sloping fields into handsome terraces under the guise of “ornamental improvement.” Insiders knew the real motive was actually drainage and sanitation to eliminate the foul air that hung ominously around the White House. (It’s not clear if Donald Trump’s campaign promise to “drain the swamp” was another effort or a political metaphor.)

President Taylor was inaugurated with a predictable storm of jubilation since his name was a household word. After a 40-year career in the military (1808-1848), he had the distinction of serving in four different wars: the War of 1812, the Black Hawk War (1832), the Second Seminole War (1835-1842) and the Mexican-American War (1846-1848). By 1847, Taylormania had broken out and his picture was everywhere … on ice carts, tall boards, fish stands, butcher stalls, cigar boxes and so on. After four years under the dour Polk, the public was ready to once again idolize a war hero with impeccable integrity and a promise to staff his Cabinet with the most experienced men in the country.

Alas, barely 16 months later, on July 9, 1850, President Taylor became the second president to die in office (William Henry Harrison lasted 31 days). On July 4, after too long in the hot sun listening to ponderous orations and too much ice water to cool off, he returned to the White House. It was there that he gorged on copious quantities of cherries, slathered with cream and sugar. After dinner, he developed severe stomach cramps, and then the doctors took over and finished him off with calomel, opium, quinine and, lastly, raising blisters and drawing blood. He survived this treatment for several days; the official cause of death was cholera morbus, a gastrointestinal illness common in Washington, where poor sanitation made it risky to eat raw fruit and fresh dairy products in the summer.

Vice President Millard Fillmore took the oath of office and spent the rest of the summer trying to catch up. Taylor had spent little time with his VP and then the entire Cabinet submitted their resignations over the next few days, which Fillmore cheerfully accepted. He immediately appointed a new Cabinet featuring the great Daniel Webster as Secretary of State. On Sept. 9, 1850, he signed a bill admitting California as the 31st state and as “a free state.” This was the first link in a chain that became the Compromise of 1850.

The Constitutional Convention did not permit the words “slave” or “slavery,” since James Madison thought it was wrong to admit in the Constitution the idea that men could be considered property. To get enough states to approve it, the Constitution also prohibited Congress from passing any laws blocking the slave trade for 20 years (until 1808), by which time it was assumed slavery would have long been abandoned for economic reasons. However, cotton production flourished after the invention of the cotton gin, and Congress acted at the first opportunity: President Thomas Jefferson signed a law banning the importation of slaves that took effect on Jan. 1, 1808.

This explains why controlling Congress was key to controlling slavery, so all the emphasis turned to maintaining a delicate balance whenever a new state was to be admitted … as either “free” or “slave.” Fillmore thus became the first of three presidents – including Franklin Pierce and James Buchanan – who worked hard to maintain harmony. However, with the election of Abraham Lincoln in 1860, it was clear what would happen … and all the Southern states started moving to the exit signs.

A true Civil War was now the only option for permanently resolving the slavery dilemma, and it came with an enormous loss of life, property and a culture that we still struggle with today. That damned cotton gin!

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Here’s Why Washington Remains Our Greatest President

A George Washington inaugural button, perhaps the earliest artifact that refers to Washington as the “Father of His Country,” realized $225,000 at a February 2018 Heritage auction.

By Jim O’Neal

Presidential scholars typically list George Washington, Abraham Lincoln and Franklin Delano Roosevelt as our finest presidents. I tend to favor Washington since without him, we would probably have a much different country in so many aspects. If there were any doubts about the feats of the “Father of Our Country,” they were certainly dispelled in 2005 when David McCullough’s 1776 hit bookstores, followed five years later by Ron Chernow’s masterful Washington: A Life, which examined the man in exquisite detail. They didn’t leave much ground uncovered, but a few tidbits remain that haven’t become overused and should appeal to those seeking fresh anecdotes.

For example, Washington wasn’t aware that on Nov. 30, 1782, a preliminary Treaty of Paris was signed that brought American Revolutionary hostilities to an end. The United States was prevented from dealing directly with Great Britain by an alliance with France that stipulated we would not negotiate with Britain without them. Had he known, Washington would have been highly suspicious, since King George III had vowed to “push the war as long as the nation will find men or money.” In a way, Washington would have been right, since the United States had demanded full recognition as a sovereign nation, in addition to removal of all troops and fishing rights in Newfoundland. The king rejected this since he was still determined to keep the United States as a British colony, with greater autonomy. Ben Franklin naturally opposed this and countered by proposing the addition of all of Canada to the United States. And so it went until May 12, 1784, when the documents bringing the Revolutionary War to an end were finally ratified and exchanged by all parties.

It was during these protracted negotiations that Washington grew concerned the army might lose its fighting edge. He kept drilling the troops while issuing a steady stream of instructions: “Nothing contributes so much to the appearance of a soldier, or so plainly indicates discipline, as an erect carriage, firm step and steady countenance.” After all these years of hardship and war, Washington was still a militant committed to ending the haughty pride of the British. To help ensure the fighting spirit of his army, Washington introduced a decoration designated the Badge of Military Merit on Aug. 7, 1782. He personally awarded three and then authorized his subordinate officers to issue them in cases of unusual gallantry or extraordinary fidelity and essential service. Soldiers received a purple heart-shaped cloth, to be worn over the left breast. After a lapse, it was redesigned and is now the Purple Heart medal, awarded to those wounded or killed in action. The first was awarded on Feb. 22, 1932, the 200th anniversary of Washington’s birthday.

The victorious conclusion of the Revolutionary War left many questions unanswered concerning American governance, prominently the relationship between the government and the military. At the end, army officers had several legitimate grievances. Congress was in arrears with pay and had not settled officer food and clothing accounts or made any provisions for military pensions. In March 1783, an anonymous letter circulated calling on officers to take a more aggressive stance, draw up a list of demands, and even possibly defy the new government! Washington acted quickly, calling a meeting of all officers and, at the last moment, delivering one of the most eloquent and important speeches of his life.

After the speech, he drew a letter from a pocket that outlined Congressional actions to be undertaken. He hesitated and then fumbled in his pockets and remarked, “Gentlemen, you will permit me to put on my spectacles, for I have not only grown gray, but almost blind, in the service of my country.” By all accounts, the officers were brought to tears, and the potentially dangerous conspiracy collapsed immediately.

He gets my vote.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Sea Battle off Coast of France a Crucial Union Victory

The USS Kearsarge’s sinking of the CSS Alabama gave the North a much-needed boost in morale. This image appeared on an 1864 Union ballot.

By Jim O’Neal

On Sunday, June 19, 1864, in the English Channel off Cherbourg, France, one turbulent hour brought to a climax the worldwide struggle for sea power between the North and South. Within sight of the French cliffs, lined with hundreds of people who came to see the announced spectacle of a duel, were the USS Kearsarge and the Confederate warship CSS Alabama.

French spectators munched from food baskets as the drama unfolded.

These ships, so far from home, appeared to be twins as far as the landsman could see. However, major differences in guns, crews, armor and ammunition could not be seen from shore. The Kearsarge’s 11-inch guns outmatched those of her foe. Her sides were sheathed in metal chains – covered with boards. She had been in dock for repairs the past three months, engines carefully tuned and both powder and shot in excellent condition.

By coincidence, the two captains, Raphael Semmes of the Alabama and John Winslow of the Kearsarge, were longtime friends as messmates, roommates and shipmates in the pre-war Navy – and both were Southerners. As they maneuvered their ships into position, a French warship played Confederate music as the Alabama steamed out of harbor.

The Alabama had long been on her way to this historic destiny.

Built in Liverpool, England, under subterfuge, christened anonymously as Enrica and variously known as “The 290” and the “Emperor of China’s Yacht,” she had almost literally swept United States merchant shipping from the seas. In 22 months, she had cruised 75,000 miles – equal to three times around the world – overhauled 295 vessels of many flags, taken 29 Union ships as prizes, and burned another 14 valued at over $5 million!

She had been fitted with guns in the Azores to complement her large sails, modern engines and a special propeller that could be raised for greater speed under sail. It was not by chance that she caught virtually every quarry sighted. However, she had not changed her black powder (now foul) and most of her shells were possibly defective. She had arrived in port at Cherbourg to repair and take on coal. In a rare stroke of bad luck, Napoleon III could not be reached to grant the obligatory asylum needed by any belligerent.

Alerted at Flushing, the Kearsarge pounced and was at Cherbourg in two days, patrolling the harbor and visually inspecting the Alabama through glasses. Captain Semmes, basically trapped, announced he would fight rather than sneak away at night. Cherbourg was crowded with sightseers … all had come to see the Americans in action. Semmes wisely sent ashore all the ship’s valuables and had his men compose their wills before the fight.

The gunners went to their posts, and Semmes and his officers, in full-dress uniforms, steamed out of the harbor ready for battle. Soon, the Alabama’s deck was littered with bodies, many badly mutilated, and there were gaping holes at the waterline. The Alabama – her graceful black hull bearing no name, marked only by a motto on the stern, Aide Toi, Et Dieu T’Aidera (God helps those who help themselves) – was no more.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Bikes Symbolized Progress for a Nation Ready for Growth

A rare campaign button shows presidential candidate William McKinley riding a bicycle at the height of the bike boom of the 1890s.

By Jim O’Neal

As the bicycle became more popular in the latter part of the 1800s, it was inevitable that millions of new enthusiasts would soon be demanding better roads to accommodate this object of pleasure, so symbolic of progress. It was hailed as a democratizing force for the masses and neatly bridged the gap between the horse and the automobile, which was still a work in progress.

The popularity of this silent, steel steed had exploded with the advent of the “safety bicycle” (1885), which dramatically reduced the hazards of the giant “high wheelers.” The invention of the pneumatic tire in 1889 greatly enhanced the comfort of riding and further expanded the universe of users. However, this explosion in activity also increased the level of animosity as cities tried to cope by restricting hours of use, setting speed limits and passing ordinances that curtailed access to streets.

There were protest demonstrations in all major cities, but it came to a head in 1896 in San Francisco. The city’s old dirt roads were crisscrossed with streetcar tracks, cable slots and abandoned street rail franchises. Designed for a population of 40,000, the nation’s third-wealthiest city was now a metropolis of 350,000 and growing. On July 25, 1896, advocates of good streets and organized cyclists paraded in downtown with 100,000 spectators cheering them on.

The “Bicycle Wars” were soon a relic of the past as attention shifted to a product that was destined to change the United States more than anything in its history: Henry Ford’s Model T. Production by the Ford Motor Company began in August 1908 and the new cars came rolling out of the factory the next month. It was an immediate success since it solved three chronic problems: automobiles were scarce, prohibitively expensive and consistently unreliable.

Voila, the Model T was easy to maintain, highly reliable and priced to fit the budgets of the vast number of Americans with only modest incomes. It didn’t start the Automobile Age, but it did ignite, in the hearts and souls of millions of people, an eagerness to join in the excitement that accompanied this innovation. It accelerated the advent of the automobile into American society by at least a decade.

By 1918, 50 percent of the cars in the United States were Model Ts.

There were other cars pouring into the market, but Model Ts, arriving by the hundreds of thousands, gave a sharp impetus to the support structure – roads, parking lots, traffic signals, service stations – that made all cars more desirable and inexorably changed our daily lives. Automotive writer John Keats summed it up well in The Insolent Chariots: The automobile changed our dress, our manners, social customs, vacation habits, the shapes of our cities, consumer purchasing patterns and common tasks.

By the 1920s, one in eight American workers was employed in an automobile-related industry, be it petroleum refining, rubber making or steel manufacturing. The availability of jobs helped create the beginning of a permanent middle class and, thanks to the Ford Motor Company, most of these laborers made a decent living wage on a modern five-day, 40-hour work week.

Although 8.1 million passenger cars were registered by the 1920s, paved streets were more often the exception than the rule. The dirt roads connecting towns were generally rutted, dusty and often impassable. However, spurred by the rampant popularity of the Model T, road construction quickly became one of the principal activities of government, and expenditures zoomed to No. 2 behind education. Highway construction gave birth to other necessities: the first drive-in restaurant in Dallas in 1921 (Kirby’s Pig Stand), the first “mo-tel” in San Luis Obispo in 1925, and the first public garage in Detroit in 1929.

The surrounding landscape changed with the mushrooming of gas stations from coast to coast, replacing the cumbersome practice of buying gas by the bucket from hardware stores or street vendors. Enclosed curbside pumps became commonplace, as did hundreds of brands, including Texaco, Sinclair and Gulf. The intense competition inspired dealers to distinguish themselves with identifiable stations and outlandish buildings. Then, in the 1920s, the “City Beautiful” movement resulted in gas stations styled as ancient Greek temples, log cabins, Colonial New England homes or California Spanish missions.

What a glorious time to be an American and be able to drive anywhere you pleased and see anything you wished. This really is a remarkable place to live and to enjoy the bountiful freedoms we sometimes take for granted.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].