Notorious traitors? Let’s look at Benedict Arnold

A May 24, 1776, letter by Benedict Arnold, signed, to Gen. William Thompson, realized $23,750 at an April 2016 Heritage auction.

By Jim O’Neal

Vidkun Quisling is an obscure name from World War II, yet even to those unfamiliar with the lesser-known details, “Quisling” has become a synonym for a traitor or collaborator. From 1942 to 1945, he was Prime Minister of Norway, heading a pro-Nazi puppet government after Germany invaded. For his role, Quisling was put on trial for high treason and executed by firing squad on Oct. 24, 1945.

Obviously better known are Judas Iscariot of Last Supper fame (30 pieces of silver); Guy Fawkes, who tried to assassinate King James I by blowing up Parliament (the Gunpowder Plot); and Marcus Junius Brutus, who stabbed Julius Caesar (“Et tu, Brute?”). In American history, it’s a close call between John Wilkes Booth and Benedict Arnold.

Arnold

The irony concerning Benedict Arnold (1741-1801) is that his early wartime exploits had made him a legendary figure, but Arnold never forgot the slight he received in February 1777, when Congress bypassed him while naming five new major generals … all of them junior to him. Afterward, George Washington pledged to help Arnold “with opportunities to regain the esteem of your country,” a promise he would live to regret.

Unknown to Washington, Arnold had already agreed to sell secret maps and plans of West Point to the British via British Maj. John André. There have always been honest debates over Arnold’s real motives for this treacherous act, but it seems clear that purely personal gain was the primary objective. Heavily in debt, Arnold had brokered a deal that included having the British pay him 6,000 pounds sterling and award him a British Army commission for his treason. There is also little doubt that his wife Peggy was a full accomplice, despite a dramatic performance pretending to have lost her mind rather than her loyalty.

The history of West Point can be traced back to its occupation by the Continental Army after the Second Continental Congress (1775-1781) was designated to manage the Colonial war effort. West Point – first known as Fort Arnold and later renamed Fort Clinton – was strategically located on high ground overlooking the Hudson River, with panoramic views extending all the way to New York City, ideal for military purposes. Later, in 1801, President Jefferson ordered plans to establish the U.S. Military Academy there, and West Point has since churned out many distinguished military leaders … first for the Mexican-American War and then for the Civil War, including both Ulysses S. Grant and Robert E. Lee. It is the oldest continuously operating Army post in U.S. history.

To understand this period in American history, it helps to start at the end of the Seven Years’ War (1756-63), which was really a global conflict involving every major European power and spanning five continents. Many historians consider it “World War Zero,” on the same scale as the two 20th-century world wars. In North America, the fighting had started two years earlier with the French and Indian War, in which Great Britain was an active participant.

The Treaty of Paris in 1763 ended the conflict, with the British winning a stunning series of battles, France surrendering its Canadian holdings, and Spain ceding its Florida territories in exchange for the return of Cuba. Consequently, the British Empire emerged as the most powerful political force in the world. The only issue was that these conflicts had nearly doubled England’s debt, from 75 million to 130 million pounds sterling.

A young King George III and his Parliament quietly noted that the Colonies were nearly debt free and decided it was time for them to pay for the 8,000-10,000 Redcoats of the peacetime garrison stationed in North America. In April 1764, Parliament passed the Currency Act and the Sugar Act, which limited inflationary Colonial currency and cut the trade duty on foreign molasses. In 1765, it struck again. Twice. The Quartering Act forced the Colonists to pay for billeting the king’s troops. Then the infamous Stamp Act placed direct taxes on Americans for the first time.

This was one step too far and inevitably led to the Revolutionary War, with armed conflict that involved hot-blooded, tempestuous individuals like Benedict Arnold. A brilliant military leader of uncommon bravery, Arnold poured his life into the Revolutionary cause, sacrificing his family life, health and financial well-being for a conflict that left him physically crippled. Sullied by false accusations, he became profoundly alienated from the American cause for liberty. His bitterness unknown to Washington, on Aug. 3, 1780, the future first president announced that Arnold would take command of the garrison at West Point.

The newly appointed commander calculated that turning West Point over to the British – perhaps along with Washington himself – would end the war in a single stroke by giving the British control of the Hudson River. The conspiracy failed when André was captured with incriminating documents. Arnold fled to a British warship, and the British refused to trade him for André, who was hanged as a spy after pleading to be shot by a firing squad instead. Arnold went on to lead British troops in Virginia, survived the war, and eventually settled in London. He quickly became the most vilified figure in American history and remains a symbol of treason to this day.

Gen. Nathanael Greene, often called Washington’s most gifted and dependable officer, summed it up after the war most succinctly: “Since the fall of Lucifer, nothing has equaled the fall of Arnold.”

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Selecting a justice has always been a messy, partisan process

This photograph, circa 1968, autographed by Chief Justice Earl Warren and the eight associate justices, sold for $2,031 at a June 2010 Heritage auction.

By Jim O’Neal

The Senate Judiciary Committee began hearings this week to consider the nomination of Judge Brett Kavanaugh to the Supreme Court, part of the Senate’s “advise and consent” role to the president of the United States. Once considered a formality, the process has devolved into a high-stakes political contest and is a vivid example of how partisanship has divided governance, especially in the Senate.

Fifty years ago, President Nixon provided a preview of politics gone awry as he attempted to reshape the Supreme Court to fit his vision of a judiciary. His problems actually started during the final year of Lyndon Johnson’s presidency. On June 26, 1968, LBJ announced that Chief Justice Earl Warren intended to resign the seat he had held since 1953. He also said that he intended to nominate Associate Justice Abe Fortas as his successor.

For the next three months, the Senate engaged in an acrimonious debate over the Fortas nomination. Finally, Justice Fortas asked the president to withdraw his nomination to stop the bitter partisan wrangling. Chief Justice Warren, who had been a keen observer of the Senate’s squabbling, decided to end the controversy in a different way. He withdrew his resignation and, in a moment of pique, said, “Since they won’t take Abe, they will have me!” True to his word, Warren served another full term, until June 1969.

By then, there was another new president – Richard Nixon – and he picked Warren Burger to be Warren’s replacement. Burger was a 61-year-old judge on the U.S. Court of Appeals with impeccable Republican credentials, just as candidate Nixon had promised during the 1968 presidential election campaign. As expected, Burger’s confirmation was speedy and decisive … 74-3.

Jubilant over his first confirmed nomination to the court, Nixon had also received a surprise bonus earlier in 1969. In May, Justice Fortas had decided to resign his seat. Beyond the bitter debate of the prior year, intense scrutiny of his record had uncovered a dubious relationship with Louis Wolfson, a Wall Street financier sent to prison for securities violations. Rather than face another Senate imbroglio over shady financial dealings, Fortas stepped down, becoming the first Supreme Court justice to resign under threat of impeachment.

So President Nixon had a second opportunity to add a justice. After repeating his criteria for Supreme Court nominees, Nixon chose Judge Clement Haynsworth Jr. of the U.S. Court of Appeals, Fourth Circuit, to replace Fortas. Attorney General John Mitchell had encouraged the nomination since Haynsworth was a Harvard Law alumnus and a Southern jurist with conservative judicial views. He seemed like an ideal candidate since Nixon had a plan to gradually reshape the court.

However, to the president’s anger and embarrassment, Judiciary Committee hearings exposed clear evidence of financial and conflict-of-interest improprieties. There were no actual legal implications, but how could the Senate force Fortas to resign and then overlook essentially the same issues now? The Judiciary Committee approved Haynsworth 10-7, but on Nov. 21, 1969, the full Senate rejected the nomination 55-45. A livid Nixon blamed anti-Southern, anti-conservative partisans for the defeat.

The president – perhaps in a vengeful mood – quickly countered by nominating Judge G. Harrold Carswell of Florida, a little-known, undistinguished former U.S. District Court judge with only six months of experience on the Court of Appeals. The Senate, weary of the fight, seemed inclined to approve him until suspicious reporters discovered a statement from a speech he had made to the American Legion in 1948: “I yield to no man as a fellow candidate or as a citizen in the firm, vigorous belief in the principles of White Supremacy and I shall always be so governed!”

Oops.

Even allowing for his youth and other small acts of racial bias, the worst was yet to come. It turned out that he was a lousy judge with a poor grasp of the law. His floor manager, U.S. Senator Roman Hruska, a Nebraska Republican, then made a fumbling, inept attempt to convert Carswell’s mediocrity into an asset: “Even if he is mediocre, there are lots of mediocre judges, people and lawyers. They are entitled to a little representation, aren’t they, and a little chance?” This astonishing assertion was compounded when it was seconded by Senator Russell Long, a Democrat from Louisiana! When the confirmation vote was taken on April 9, 1970, Carswell’s nomination was defeated 51-45.

A bitter President Nixon, with two nominees rejected in less than six months, continued to blame sectional prejudice and philosophical hypocrisy. So he turned to the North and selected Judge Harry Blackmun, a close friend of Chief Justice Burger, who had urged his nomination. Bingo … he was easily confirmed by a vote of 94-0. At long last, the vacant seat of Abe Fortas was filled.

There would be no further vacancies for 15 months, but in September 1971, Justices Hugo Black and John Harlan, both gravely ill, announced they were compelled to resign from the court. Nixon eventually replaced these two distinguished jurists, but only after a complicated and convoluted process. It would ultimately take him eight tries to fill four seats, and the process has only become more difficult since.

Before Judge Kavanaugh joins the court, as is widely predicted, expect the opposing party to throw up every roadblock in its bag of tricks. The process is now strictly political, dependent on partisan voting advantages. The next big event will probably involve Justice Ruth Bader Ginsburg, a member of the court since 1993 and only the second woman to serve on it, after Sandra Day O’Connor. At age 85, you can be sure that Democrats are wishing her good health until they regain control of the Oval Office and the Senate. If not, stay tuned for the Battle of the Century!


100 Years Before Rosa Parks, There Was Octavius Catto

Rosa Parks refused to give up her seat on a segregated bus, sparking the Montgomery, Ala., bus boycott.

By Jim O’Neal

Most Americans are familiar with Rosa Parks and recall the heroic story of a weary black woman on her way home after a hard day at work who refused to give up her seat and “move to the back of the bus” to make room for white people. The date was Dec. 1, 1955, and the city was Montgomery, Ala.

She was arrested and fined $10, and the ensuing Montgomery bus boycott lasted 381 days. She was ultimately vindicated by the U.S. Supreme Court, which ruled the segregation law unconstitutional. After her death, she became the first African-American woman to have her likeness depicted in the National Statuary Hall in the U.S. Capitol.

Parks (1913-2005) earned her way into the pantheon of civil rights leaders, but few remember a remarkable man who preceded her by a century when streetcars were pulled by horses.

Catto

His name was Octavius Valentine Catto (1839-1871) and history was slow in recognizing his astonishing accomplishments. Even the epitaph on his tombstone shouts in bold letters “THE FORGOTTEN HERO.” One episode in his far-too-short but inspiring life is eerily similar to the events in Montgomery, only dramatically more so. Catto was a fierce enemy of the entire Philadelphia trolley car system, which banned black passengers. On May 18, 1865, The New York Times ran a story about an incident involving Catto that occurred the previous afternoon in Philadelphia, “The City of Brotherly Love” (at least for some).

Paraphrasing the story, it described how a colored man (Catto) had refused all attempts to get him to leave a strictly segregated trolley car. Frustrated, and fearing a fine if he physically ejected the passenger, the conductor cleverly shunted the car onto a siding, detached the horses and left the defiant rider in the now-empty stationary car. Apparently, the stubborn man was still on board after spending the night. The incident caused a neighborhood sensation that led even more people to challenge the rules.

The following year, there was an important meeting at the Union League to protest the forcible ejection of several black women from Philadelphia streetcars. The intrepid Catto presented a number of resolutions that highlighted the inequities of segregation, appealed to principles of freedom and civil liberty, and condemned a heavily biased judicial system. He also boldly solicited support from fellow citizens in his quest for fairness and justice.

He got specific help from Pennsylvania Congressman Thaddeus Stevens, a leader of the “Radical Republicans” who had a fiery passion for desegregation and the abolition of slavery, and who criticized President Lincoln for not acting more forcefully. Stevens is a major character in Steven Spielberg’s 2012 film Lincoln, with Tommy Lee Jones gaining an Oscar nomination for his portrayal of Stevens. On Feb. 3, 1870, the 15th Amendment to the Constitution guaranteed suffrage to black men (women of all colors would have to wait another 50 years, until 1920, to gain the right to vote in all states). It would also lead to Catto’s death. On Election Day, Oct. 10, 1871, Catto was out encouraging black men to vote for Republicans. He was fatally shot by white Democrats who wanted to suppress the black vote.

Blacks continued to vote heavily for Republicans until the early 20th century and were not even allowed to attend Democratic conventions until 1924. This was primarily because the Democratic Party dominated the Southern states, whose leaders opposed equal rights and supported Jim Crow laws that oppressed blacks. As comedian Dick Gregory (1932-2017) famously joked, he was at a white lunch counter where he was told, “We don’t serve colored people here,” and Gregory replied, “That’s all right. I don’t eat colored people … just bring me a whole fried chicken!”

Octavius Catto, who broke segregation on trolley cars and was an all-star second baseman long before Jackie Robinson, would have to wait until the 20th century to get the recognition he deserved. I suspect he would be surprised that we are still struggling to “start a national conversation” about race when that’s what he sacrificed his life for.


Usual Fireworks Expected with Latest Supreme Court Selection

This photograph, signed by Supreme Court Chief Justice William H. Taft and the eight associate justices, circa 1927, sold for $14,340 at a September 2011 Heritage auction.

By Jim O’Neal

It is that time again when the news will be filled with predictions of pestilence, war, famine and death (the Four Horsemen of the Apocalypse) as President Trump tees up his next candidate for the Supreme Court. One side will talk about the reversal of Roe v. Wade as an example of the terrible future that lies ahead. The other side will be quick to point out that this fear-mongering first started in 1981 when Sandra Day O’Connor (the first woman to serve on the court) was nominated by President Reagan and that nothing has happened in the intervening 37 years.

My prediction is that regardless of who is confirmed, there will be no record of past opinions on “Roe,” and he or she will have been groomed by the “Murder Boards” to answer that it is settled law. Murder Boards are groups of legal experts who rehearse the nominee on how to answer every possible question the Senate Judiciary Committee might ask on any subject, not just Roe, in the Senate’s advice-and-consent role. The result is what former Vice President Joe Biden, when he was in the Senate, described as a “Kabuki dance.”

The questioning does produce great public theater, but it is a tradition that dates only to 1925, when nominee Harlan Stone actually requested that he be allowed to answer questions about rumors of improper ties to Wall Street. It worked: He was confirmed by a vote of 71-6 and would later serve as Chief Justice (1941-46). In 1955, John Marshall Harlan II was next, when Southern senators wanted to know his views on public school desegregation vis-à-vis Brown v. Board of Education. He, too, was confirmed, 71-11, and since then every nominee to the court has been questioned by the Senate Judiciary Committee. The apparent record is the 30 hours of grilling Judge Robert Bork endured in 1987, when he got “Borked” by trying to answer every single question honestly. Few make that mistake today.

Roe v. Wade was a 1973 case in which the issue was whether a state could constitutionally make it a crime to perform an abortion, except to save the mother’s life. Abortion had a long legal history dating to the 1820s, when anti-abortion statutes began to appear that resembled an 1803 British law making abortion illegal after “quickening” (the start of fetal movements), using various rationales such as illegal sexual conduct, unsafe procedures and the state’s responsibility to protect prenatal life.

The criminalization accelerated from the 1860s, and by 1900 abortion was a felony in every state. Despite this, the practice continued to grow, and in 1921 Margaret Sanger founded the American Birth Control League. By the 1930s, licensed physicians performed an estimated 800,000 procedures each year. In 1967, Colorado became the first state to decriminalize abortion in cases of rape, incest or permanent disability of the woman; by 1972, 13 states had similar laws. In 1970, Hawaii became the first state to legalize abortion at the request of the woman. So the legal situation prior to Roe was that abortion was illegal in 30 states and legal, under certain conditions, in the other 20.

“Jane Roe” was an unmarried pregnant woman who supposedly wished to terminate her pregnancy and instituted an action in the U.S. District Court for the Northern District of Texas. A three-judge panel found the Texas criminal statutes unconstitutionally vague and held that the right to choose whether to have children was protected by the 9th through the 14th Amendments. All parties appealed, and on Jan. 22, 1973, the Supreme Court ruled the Texas statute unconstitutional. The court declined to define when human life begins.

Jane Roe’s real name was Norma McCorvey. She became a pro-life advocate before her death, maintaining that she never had the abortion and that she was the victim of two young, ambitious lawyers looking for a plaintiff. Henry Wade was district attorney of Dallas from 1951 to 1987, the longest-serving DA in United States history. He was also involved in the prosecution of Jack Ruby for killing Lee Harvey Oswald. Ruby was convicted, but the verdict was overturned on appeal; he died of lung cancer before a retrial and so remains presumed innocent in the eyes of the law.

Stay tuned for the fireworks.


As Nation Moved to Civil War, the North Had the Financial Edge

Richard Montgomery was an Irish soldier who served in the British Army before joining the Continental Army.

By Jim O’Neal

Richard Montgomery (1738-75) was a little-known hero-soldier born in Dublin, Ireland, who became a captain in the British Army in 1756. Later, he became a major general in the Continental Army after the Continental Congress elected George Washington as Commander in Chief of the Continental Army in June 1775. This position was created specifically to coordinate the military efforts of the 13 Colonies in the revolt against Great Britain.

Montgomery was killed in a failed attack on Quebec City led by General Benedict Arnold (before he defected). Montgomery was mourned in both Britain and America, and his remains were interred at St. Paul’s Chapel in New York City.

A remarkably diverse group of schools, battleships and cities named in his honor remain yet today. Montgomery, Ala., is the capital and second-largest city in the state; it’s where Rosa Parks refused to give up her bus seat to a white passenger on Dec. 1, 1955, sparking the famous Montgomery bus boycott. Martin Luther King Jr. used Montgomery to great advantage in organizing the civil rights movement.

Montgomery was also the first capital of the Provisional Congress of the Confederate States, which convened its first meeting in February 1861. The first seven states to secede from the United States had hastily selected representatives to send to the new Confederate capital. They arrived to find dirty hotels, dusty roads, and noisy lobbyists overflowing the statehouse. Montgomery was not prepared to host any large group, especially a large political convention.

Especially notable was that most of the South’s most talented men had already joined the Army or the Cabinet, or were headed for diplomatic assignments. By default, the least-talented legislators were given the responsibility of writing a constitution, installing the new president (Jefferson Davis), and then authorizing a military force of up to 400,000 men, enlisted for three years or the duration of the war. As in the North, virtually everyone was confident the war would be short and decisive.

Jefferson Davis was a well-known name, having distinguished himself in the Mexican War and served as Secretary of War under President Franklin Pierce. Like many others, he downplayed the role of slavery in the war, seeing the battle as a long-overdue effort to overturn an exploitive economic system centered in the North. In his view, the evidence was obvious: The North and South were like two different countries, one a growing industrial power and the other stuck in an agricultural system that had barely evolved since 1800, when 80 percent of its labor force was on farms and plantations. The South now had only 18 percent of the nation’s industrial capacity, and the share was trending down.

That mediocre group of lawmakers at the first Confederate meeting was also tasked with determining how to finance a war against a formidable enemy with vastly superior advantages in nearly every important respect. Even new immigrants were attracted to the North’s ever-expanding opportunities, as the slave states fell further behind in manufacturing, canals, railroads and even conventional roads, all while their banking system grew weaker.

Cotton production was a genuine bright spot for the South (at least for plantation owners), but ironically, it generated even more money for the North, with its vast network of credit, warehousing, manufacturing and shipping companies. The North manufactured a dominant share of boots, shoes, cloth and pig iron, and almost all the firearms … an ominous fact for people determined to fight a war. Several regions of the South were forced to import foodstuffs. Southern politicians had spoken often of the need to build railroads and manufacturing, but these were empty, rhetorical words. Cotton had become the powerful narcotic that lulled them into complacency. Senator James Hammond of South Carolina summed it up neatly in his “Cotton is King” speech of March 4, 1858: “Who can doubt, that has looked at recent events, that cotton is supreme?”

Southerners sincerely believed that cotton would rescue them from the war and “after a few punches in the nose,” the North would gladly surrender.

One of those men was Christopher G. Memminger, who was selected as Confederate States Secretary of the Treasury and responsible for rounding up gold and silver to finance the needs of the Confederate States of America (CSA). A lawyer and member of the South Carolina legislature, he was also an expert on banking law. His first priority was for the Treasury to get cash and he started in New Orleans, the financial center of the South, by raiding the mint and customs house.

He assumed there would be at least enough gold to coin money and commissioned a design for a gold coin with the goddess of liberty seated, bearing a shield and a staff flanked by bales of cotton, sugar cane and tobacco. Before any denominations were finalized, it was discovered there was not enough gold available and the mint was closed in June.

This was followed by another nasty surprise: All the banks in the South possessed only $26 million in gold, silver and coins from Spain and France. No problem. Memminger estimated that cotton exports of $200 million would be enough to secure hundreds of millions in loans. Oops. President Lincoln had anticipated this and blockaded all the ports after Fort Sumter in April 1861. No cotton, no credit, no guns.

In God we trust. All others pay cash.

One small consolation was that his counterpart in the North, Salmon P. Chase, was also having trouble raising cash and had to resort to the dreaded income tax. However, both sides managed to keep killing each other for four long years, leaving a legacy of hate.


America has a Long History of Rough-and-Tumble Politics

A cabinet card photograph dated 1852, shortly after the marriage of Rutherford and Lucy Hayes, went to auction in October 2008.

By Jim O’Neal

A surprisingly high number of political pundits ascribe the current bitter partisan divide to the presidential election of 2000, when the Supreme Court ordered the recount of “under-votes” in Florida to cease. As a result, the previously certified election results would stand and George W. Bush would receive all 25 Florida electoral votes, thus providing him a 271-266 nationwide victory over Al Gore. Democrats almost universally believed the election had been “stolen” due to the seemingly unprecedented action by the Supremes.

Although obviously a factor in the situation today, it seems too simplistic to me, as I remember the Clinton Impeachment, the start of the Iraq War (and the president who lied us into war), and, of course, Obamacare – all of which were also major contributors to the long, slow erosion of friendly bipartisanship. Now, we’re in an era when each new day seems to drag up a new issue that Americans can’t agree on and the schism widens ever so slightly.

Could it be worse?

The answer is obviously “yes,” since we once tried to kill each other into submission during the Civil War. Another good example is the highly controversial presidential election of 1876, which resulted in Rutherford B. Hayes becoming president. The loser, Samuel J. Tilden, had such staunch supporters that they promised “blood would run in the streets” if their candidate lost. After a controversial decision threw the election to Hayes, Democrats continued to make wild threats, and public disturbances were rampant in New York City’s hotels, saloons, bars and other venues where crowds gathered.

The unrest was so high that outgoing President Ulysses S. Grant gradually became convinced that a coup was imminent. This was the closest the Dems had come to the White House since James Buchanan’s election 20 years earlier in 1856 and passions were so high that they would not be calmed easily. The level of resentment was much more than about losing an election or the ascendancy of the Republican Party with all their fierce abolitionists. It seems apparent even today that the election results had been politically rigged or, at a minimum, very cleverly stolen in a quasi-legalistic maneuver.

Grant’s primary concern was one of timing. The normal inauguration date of March 4 fell on a Sunday and tradition called for it to be held the next day, on Monday, March 5 (as with Presidents James Monroe and Zachary Taylor). Thus the presidency would be technically vacant from noon on Sunday until noon on Monday. The wily old military genius knew this would be plenty of time to pull off a coup d’état. He insisted Hayes not wait to take the oath of office.

In a clever ruse, the Grants made arrangements for a secret oath-taking on Saturday evening by inviting 38 people to an honorary dinner at the White House. While the guests were being escorted to the State Dining Room, Grant and Hayes slipped into the Red Room, where Chief Justice Morrison Waite was waiting with the proper documents. All went as planned until it was discovered there was no Bible available. No problem … Hayes was sworn in as the 19th president of the United States with a simple oath.

The peaceful passing of power has been one of the outstanding features of our constitutional form of governance.

Hayes was born on Oct. 4, 1822 – 2½ months after his father had died of tetanus, leaving his pregnant mother with two young children. From these less-than-humble beginnings, the enterprising “Rud” got a first-rate education that culminated in an LLB degree from Harvard Law School. Returning to Ohio, he established a law practice, fought in the Civil War and was elected governor of Ohio three times, which proved to be a steppingstone to the White House.

Most historians believe Hayes and his family were the richest occupants of the White House until Herbert and Lou Hoover arrived 52 years later. They certainly had a reputation for living on the edge of extravagance, which some cynics attribute in large part to the banning of all alcohol in the White House (presidents in those days paid for booze and wine personally). Incidentally, the first lady’s nickname, “Lemonade Lucy,” did not appear until long after they left the White House.

President Hayes kept his pledge to serve only one term; he died of a heart attack in 1893 at age 70. The first Presidential Library in the United States was built in his honor in 1916.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Here’s Why Scientists Like Joseph Lister Have Made Life Better for All of Us

A March 25, 1901, letter signed by Joseph Lister went to auction in October 2014.

By Jim O’Neal

In the 1880s, American physicist Albert Michelson embarked on a series of experiments that undermined a long-held belief in a luminiferous ether thought to permeate the universe and affect the speed of light ever so slightly. Embraced by Isaac Newton and almost universally venerated after him, the ether theory was considered an absolute certainty in 19th century physics as the explanation for how light traveled across the universe.

However, Michelson’s experiments (partially funded by Alexander Graham Bell) proved the exact opposite of the theory. In the words of author William Cropper, “It was probably the most famous negative result in the history of physics.” The speed of light turned out to be the same in all directions and in every season – upending an assumption that had gone unquestioned for 200 years. But not everyone agreed for a long time.

The physicist Max Planck (1858-1947) explained the resistance to accepting new facts in a rather novel way: “A scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die and a new generation grows up that is familiar with it.”

Even if true, that makes it no easier to accept that the United States was the only nation “that remained unconvinced of the merits of Joseph Lister’s methods of modern antiseptic medicine.” In fact, Henry Jacob Bigelow (1818-1890), the esteemed Harvard professor of surgery and a fellow of the Academy of Arts and Sciences, derided antisepsis as “medical hocus-pocus.” This is even more remarkable considering he was the leading surgeon in New England and his contributions to orthopedic and urologic surgery are legendary.

But this short story begins with a sleight of hand by asking: In the 19th century, what do you think was the most dangerous place in the vast territories of the British Empire? The frozen wastes of the Northwest Passage or the treacherous savannas of Zululand? Or perhaps the dangerous passes of the Hindu Kush? The surprising answer is almost certainly the Victorian teaching hospital, where patients entered with an injury and exited to a cemetery after a deadly case of “hospital gangrene.”

Victorian hospitals were described as factories of death, reeking with an unmistakable stench resembling rotting fish, cheerfully described as “hospital stink.” Infectious wounds were considered normal or beneficial to recovery. Stories abound of surgeons operating on a continuous flow of patients, and bloody smocks were badges of honor or evidence of their dedication to saving lives. The eminent surgeon Sir Frederick Treves (1853-1923) recalled, “There was one sponge to a ward. With this putrid article and a basin of once clear water, all the wounds in the ward were washed twice a day. By this ritual, any chance that a patient had of recovery was eliminated.”

Fortunately, Joseph Lister was born in 1827 and chose the lowly, mechanical profession of surgery over the more prestigious practice of internal medicine. In 1851, he was appointed one of four residents in surgery at London’s University College Hospital. The head of surgery was wrongly convinced that infections came from miasma, a peculiar type of noxious air that emanated from rot and decay.

Ever skeptical, Lister scoured rotten tissue out of gangrenous wounds and applied mercury pernitrate to the healthy tissue. Thus began Lister’s lifelong investigation into the cause of infection and its prevention through modern techniques. He spent the next 25 years in Scotland, becoming the Regius Professor of Surgery at the University of Glasgow. After Louis Pasteur confirmed that germs, not bad air, caused infections, Lister discovered that carbolic acid (a derivative of coal tar) could prevent many amputations by disinfecting the skin and wounds.

He then went on the road, preaching his gospel of antisepsis, which was eagerly adopted by the scientific Germans and some Scots; the plodding, practical English surgeons took much longer. That left the isolated Americans who, like Dr. Bigelow, were too stubborn and unwilling to admit the obvious.

Planck was right all along. It would take a new generation, but we are the generation that has derived the greatest benefits from the astonishing advances in 20th century medical breakthroughs, which only seem to be accelerating. It is a good time to be alive.

So enjoy it!


Maybe a Simple Theory Explains Nature’s Mysteries

A Charles Darwin signature is among a set of autographs by famed scientists that sold for $4,750 at a January 2017 auction.

By Jim O’Neal

Cosmologists generally agree the universe is 13.8 billion years old, and Earth 4.6 billion years old. They also agree the universe is expanding at an ever-increasing rate, creating new space in the process. It is already so immense that even traveling at the speed of light, you would simply end up back where you started due to the curvature of space. This eliminates one of my lifelong desires: to poke my head through and “see what’s out there.” The answer is nothing, a hard concept to grasp … at least for me.

What no one seems to know is where, when or how life on Earth began. Or, for that matter, whether life (as we know it) exists anywhere other than on our tiny orb, tucked in a remote part of a modest galaxy at precisely the distance from the Sun that permits our existence.

Author Bill Bryson writes about the work of a graduate student at the University of Chicago, Stanley Miller, who in 1953 tried to synthesize life in a chemistry lab. He hooked up two flasks, one containing water and the other a mixture of methane, ammonia and hydrogen gases. By adding electrical sparks to simulate atmospheric lightning, he was able to convert this concoction to a green and yellow broth of amino acids, fatty acids and other organic compounds. His euphoric professor – Nobel laureate Harold Urey – exclaimed, “If God didn’t do it this way, he missed a good bet!” Since it was subsequently pointed out that Earth never had such noxious conditions, we are no closer to creating life today, 65 years later.

Others have speculated that life on Earth arrived when a meteorite crashed into the planet in a process known as panspermia. The problem with this theory is that it still doesn’t explain how life BEGAN and just moves the problem to some other remote place.

Since modern man dates back 200,000 years to Africa, I’m more curious as to why it took us so long to fly. It was only rather recently, on Dec. 17, 1903, that two brothers from Dayton, Ohio – Orville and Wilbur Wright – rose into the air at Kitty Hawk, N.C., and landed 120 feet from the take-off point. Wilbur had tried first and stalled, but Orville took the controls and set off into a strong wind, with Wilbur running alongside to steady the wingtip.

They made three more flights that morning, with the longest covering 852 feet. When a wind gust broke the airframe, they just packed all the parts and went back to Dayton. What makes this achievement even more remarkable is that neither had any formal academic education in physics, although both were high school graduates. Today, the “Flyer” hangs proudly above the entrance at the Smithsonian Air and Space Museum in Washington under a long inscription that ends “…Taught Men to Fly and Opened the Era of Aviation.”

Of course, flying in the true sense has mostly been restricted to birds, which brings us to Charles Darwin. In his travels aboard the survey ship Beagle, he noted that finch beaks varied from island to island in the Galapagos to exploit local resources. He speculated the birds had not been originally created this way, but had adapted to gain an advantage in acquiring scarce resources. They had indeed, though it should be noted that Darwin did not coin the phrase “survival of the fittest,” and even the word “evolution” didn’t appear until the sixth edition of On the Origin of Species. The book itself was delayed for many years because his editor urged him to write about pigeons instead: “Everyone is interested in pigeons.”

A lot has been written about “locomotion,” with the flight of birds being the most interesting … and the Pterosaur from 100 million years ago especially so. With a wingspan of 16 feet and weighing a mere 22 pounds, it was able to dominate eastern England by staying aloft for extended periods on rising warm-air currents … presumably as a hovering predator.

Once again, we face the same questions. How it developed is a mystery, as is its anatomy, since it couldn’t manage take-off via traditional wing flapping. Perhaps it relied on gravity and thermals to become airborne. But this would have required plunging into the air from seaside cliffs, like modern frigate birds.

My theory to cover all these mysterious questions is more simplistic: Evolution is just “one damned thing after another.”


Roosevelt Used Radio to Encourage, Hitler to Fuel Rage

A Franklin D. Roosevelt photograph, signed and inscribed to Eleanor Roosevelt, sold for $10,000 at an October 2016 Heritage auction.

By Jim O’Neal

Saul Bellow was a Canadian-born writer who became a naturalized U.S. citizen after discovering he had immigrated to the United States illegally as a child. He hit the big time in 1964 with his novel Herzog, which won the National Book Award for fiction. Time magazine named it one of the 100 best English-language novels since “the beginning of Time” (March 3, 1923, the date of the magazine’s first issue).

Along the way, Bellow (1915-2005) also managed to squeeze in a Pulitzer Prize, the Nobel Prize for Literature, and the National Medal of Arts. He is the only writer to win the National Book Award for Fiction three times.

Saul Bellow

Bellow loved to describe his personal experience listening to President Roosevelt, an American aristocrat (Groton and Harvard educated), hold the nation together, using only a radio and the power of his personality. “I can recall walking eastward on the Chicago Midway … drivers had pulled over, parking bumper to bumper, and turned on their radios to hear every single word. They had rolled down the windows and opened the car doors. Everywhere the same voice, its odd Eastern accent, which in anyone else would have irritated Midwesterners. You could follow without missing a single word as you strolled by. You felt joined to these unknown drivers, men and women smoking their cigarettes in silence, not so much considering the president’s words as affirming the rightness of his tone and taking assurances from it.”

The nation needed the assurance of those fireside chats, the first of which was delivered on March 12, 1933. Between a quarter and a third of the workforce was unemployed. It was the nadir of the Great Depression.

The “fireside” was figurative; most of the chats emanated from a small, cramped room in the White House basement. Secretary of Labor Frances Perkins described the change that would come over the president just before the broadcasts. “His face would smile and light up as though he were actually sitting on the front porch or in the parlor with them. People felt this, and it bound them to him in affection.”

Roosevelt’s fireside chats, and indeed all of his efforts to communicate, contrasted sharply with those of another master of the airwaves, Adolf Hitler. Hitler fueled rage in the German people via radio and encouraged their need to blame, while FDR reasoned with and encouraged America. Hitler’s speeches were pumped through cheap plastic radios manufactured expressly to ensure complete penetration of the German consciousness. The appropriation of this new medium by FDR for reason and common sense was one of the great triumphs of American democracy.

Herr Hitler ended up committing suicide, having ordered that his body be burned to prevent the Allies from retrieving any of his remains. So ended the grand 1,000-year Reich he had promised … poof … gone with the wind.


Early Automotive Pioneers Among America’s Top Innovators

A Lincoln Motor Company stock certificate, issued in October 1918 and signed by Henry M. Leland, sold for $500 at an October 2013 auction.

By Jim O’Neal

Doctors called it a “chauffeur’s fracture,” the radial styloid or wrist fracture that occurred when a driver tried to start a horseless carriage by turning the crank at the front of the car. If the engine backfired, the crank would spin backward, often causing broken bones. Those early automobiles motoring down the streets of American cities were considered engineering marvels.

But what a challenge to start!

The two requirements were a blacksmith’s arm and a perfect sense of timing. The driver had to adjust the spark and the throttle before jumping out to turn the crank mounted on the car’s front grill. Once the spark caught and the motor fired, the driver dashed back to the controls to adjust the spark and throttle before the engine could die. Oh, and if the car started while in gear, it could lurch forward and run over the cranker!

Sound farfetched?

In 1908, tragedy struck when Byron Carter (1863-1908) – inventor of the Cartercar – was fatally injured trying to start a stalled car. The crank kicked back and hit him in the jaw; gangrene set in and he died of pneumonia. It was a fluke: he had stopped to help a stranded motorist who had forgotten to retard the spark. Whamo!

The car involved was a new Cadillac, one of the premier luxury brands, and Carter was good friends with the man who ran Cadillac, Henry Leland (who also owned Lincoln). When Leland found out his friend had been killed, he vowed: “The Cadillac car will kill no more men if we can possibly help it!” Cadillac engineers finally succeeded in manufacturing an electric self-starter, but were never able to scale it for commercial use.

Enter Charles Franklin Kettering (1876-1958), a remarkable man (in the same league as Thomas Edison) whose versatile skills included engineering and savvy business management. He was a prolific inventor with 186 notable patents. One of them was a self-starter small enough to fit under the hood of a car, running off a small storage battery. A New York inventor (Clyde J. Coleman) had applied for a patent in 1899 for an electric self-starter, but it was only a theoretical solution and never marketed.

After graduating from the Ohio State College of Engineering, Kettering joined the invention staff at the National Cash Register (NCR) company. There he invented a high-torque electric motor to drive a cash register, allowing a salesperson to ring up a sale without turning a hand crank. After five years at NCR, he set up his own laboratory in Dayton, Ohio, where, working with a group of engineers, mechanics and electricians, he developed the new ignition system for the Cadillac Automobile Company.

Leland sold Cadillac to General Motors in 1909 for $4.5 million and there is no record of any Cadillac ever killing another person, at least from turning a crank to start the engine! Since Cadillac had been formed from remnants of the Henry Ford Company (the second of two failed attempts by Ford), it was renamed for Antoine Laumet de La Mothe, sieur de Cadillac (the founder of Detroit 200 years earlier).

Later, Leland would sell Lincoln, his other luxury marque, to Ford Motor Company for a healthy $10 million, while Kettering and his crew formed the Dayton Engineering Laboratories Co., which became Delco, still a famous name in automotive electronic parts. Kettering went on to have a long, sterling career and was featured on the cover of Time on Jan. 9, 1933 … the week after president-elect Franklin Delano Roosevelt was named the magazine’s Man of the Year (Jan. 2).

My only quibble is the work Kettering did with Thomas Midgley Jr. in developing Ethyl gasoline, which eliminated engine knock but loaded the air we breathe with lead (a deadly neurotoxin) for the next 50 years. He also developed Freon … a much safer refrigerant, but one that released CFCs, which will keep depleting atmospheric ozone for the next 100-200 years.

I don’t recall ever personally turning an engine crank. My cars went from ignition keys to keyless and I plan to skip the driverless models and wait for a Jet-Cab … unless Jeff Bezos can provide an Uber-style version using one of his drones.

Things change.
