John Adams saw the White House as a home for ‘honest and wise men’

A vintage creamware punch bowl, commemorating “John Adams President of the United States,” sold for $15,535 at a March 2008 Heritage auction.

By Jim O’Neal

As the states prepared for the first presidential election under the new Constitution, it was clear that George Washington was the overwhelming favorite to become the first president of the United States.

Under the rules, each Elector cast two votes, and at the February 1789 Electoral College all 69 Electors cast one of their votes for Washington, making him the unanimous choice of the 10 participating states. Two of the original Colonies (North Carolina and Rhode Island) had not yet ratified the Constitution, and New York had an internal dispute and did not choose its Electors in time to participate. Eleven other men divided the remaining 69 votes, with John Adams topping the list at 34, slightly less than 50 percent. He became the first vice president.

Four years later, there were 15 states (Vermont and Kentucky had joined the Union) and the Electoral College had grown to 132 Electors. Again, Washington was elected president unanimously, with 132 votes. Adams was also re-elected, with 77 votes, besting George Clinton, Thomas Jefferson and Aaron Burr. All three runners-up would later become vice presidents, with Clinton serving a term under two different presidents (Jefferson and Madison). Jefferson had cleverly picked Clinton as his VP because of his age, correctly assuming Clinton would be too old to succeed him … thus ensuring that Secretary of State James Madison would be the logical choice. Clinton would, in fact, become the first VP to die in office.

John Adams

Two-time Vice President John Adams would finally win the presidency on his third try after Washington decided not to seek a third term in 1796. Still, Adams barely squeaked by, defeating Jefferson 71-68. Jefferson would become vice president after finishing second. It was during the Adams presidency that the federal government would make its final move to the South after residing first in New York City and then Philadelphia.

This relocation was enabled by the 1790 Residence Act, a compromise that was brokered by Jefferson with Alexander Hamilton and James Madison, with the proviso that the federal government assume all remaining state debts from the Revolutionary War. In addition to specifying the Potomac River area as the permanent seat of the government, it further authorized the president to select the exact spot and allowed a 10-year window for completion.

Washington rather eagerly agreed to assume this responsibility and launched into it with zeal. He personally selected the exact spot, despite expert advice against it. He even set the stakes for the foundation himself and carefully supervised the myriad details of the actual construction. When the stone walls were rising, everyone on the project assembled, laid the cornerstone and affixed an engraved plate. Once in the mortar, the plate sank and has never been located since. An effort was made to find it on the 200th anniversary in 1992; all the old maps were pored over and the area was X-rayed … all to no avail. It remains undetected.

The project was completed on time and, with Washington in his grave for 10 months, plans were made to relocate the government from Philadelphia. The first resident, President John Adams, entered the President’s House at 1 p.m. on Nov. 1, 1800. It was the 24th year of American independence, and three weeks later he would deliver his fourth State of the Union address to a joint session of Congress. It was the last annual message delivered in person for 113 years; Thomas Jefferson discontinued the practice and it was not revived until 1913 (by Woodrow Wilson). With the advent of radio, followed by television, the opportunity became too tempting for succeeding presidents to pass up.

John Adams was a fifth-generation American. He followed his father to Harvard and dabbled in teaching before becoming a lawyer. His best-known case was defending the British captain and eight soldiers involved in the Boston Massacre of March 5, 1770. He was not involved in the Boston Tea Party, but rejoiced at it, since he suspected it would inevitably lead to the convening of the First Continental Congress in Philadelphia in 1774.

He married Abigail Smith … the first woman married to a president who also had a son become president. Unlike Barbara Bush, she did not live to see it: She died in 1818, seven years before John Quincy Adams became president in 1825. Both father and son served only one term. Abigail had not yet joined the president at the White House, but the next morning he sent her a letter with a benediction for their new home: “I pray heaven to bestow the best blessing on this house and on all that shall hereafter inhabit it. May none but honest and wise men ever rule under this roof.” Franklin D. Roosevelt was so taken with it that he had it carved into the State Dining Room mantel in 1945.

Amen.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Benjamin Franklin’s basement was literally filled with skeletons

A pre-1850 folk art tavern sign depicting Benjamin Franklin sold for $11,250 at a May 2014 Heritage auction.

By Jim O’Neal

The Benjamin Franklin House is a formal museum in Central London near Trafalgar Square, a popular location for kooky political speeches and peaceful demonstrations. Although anyone is free to speak about virtually anything, many visitors pay no rapt attention, preferring instead to feed the pigeons. I never had the temerity to practice my public speaking, although I’m sometimes tempted (“Going wobbly,” as my English friends would observe).

Known once as Charing Cross, Trafalgar Square now commemorates the British naval victory in October 1805 off the coast of Cape Trafalgar, Spain. Admiral Horatio Nelson defeated the Spanish and French fleets there, resulting in Britain gaining global sea supremacy for the next century.

The Franklin House is reputedly the only building still standing where Franklin actually lived … anywhere. He resided there for several years after accepting a diplomatic role from the Pennsylvania Assembly in pre-Revolutionary times. Derelict for most of the 20th century, the site caused a stir 20-plus years ago while it was being renovated. During the extensive excavation, a cache of several hundred human bones was unearthed.

Since anatomy was one of the few scientific pursuits Franklin did not dabble in, the general consensus was that one of his colleagues had used the house for dissections, at a time when privately dissecting cadavers was unlawful and those who did it were very discreet. I discovered the museum while riding a black cab on the way to the American Bar at the nearby Savoy Hotel. I may take the full tour if we ever return to London.

However, my personal favorite is likely to remain the Franklin Institute in the middle of Philadelphia. A large rotunda features the official national memorial to Franklin: a 20-foot marble statue sculpted by James Earle Fraser in 1938. It was dedicated by Vice President Nelson Aldrich Rockefeller in 1976. Fraser is well known in the worlds of sculpture, medals and coin collecting. He designed the Indian Head (Buffalo) nickel, minted from 1913 to 1938; several key dates in high grade have sold for more than $100,000 at auction. I’ve owned several nice ones, including the popular 3-Leg variety minted in Denver in 1937. (Don’t bother checking your change!)

Fraser (1876-1953) grew up in the West; his father, an engineer, was one of the men asked to help retrieve remains from Custer’s Last Stand. George Armstrong Custer needs no introduction, thanks to the massacre of his command by the Lakota, Cheyenne and Arapaho at the Battle of the Little Bighorn (Montana) in 1876 – the year Fraser was born. That history helps explain Fraser’s empathy for American Indians as they were forced off their lands. His famous statue titled End of the Trail depicts the despair in a dramatic and memorable way. The Beach Boys used it for the cover of their 1971 album Surf’s Up.

Another historic Fraser sculpture is 1940’s Equestrian Statue of Theodore Roosevelt at the American Museum of Natural History (AMNH) in New York City. Roosevelt is on horseback with an American Indian standing on one side and an African-American man on the other. The AMNH was built using private funds, including from TR’s father, and it is an outstanding world-class facility in a terrific location across from Central Park.

However, there is a movement to have Roosevelt’s statue removed, with activists claiming it is racist and emblematic of the theft of land by Europeans. Another group has been actively throwing red paint on the statue while a commission appointed by Mayor Bill de Blasio studies how to respond to the seemingly endless efforts to erase history. Apparently, the city’s Columbus Circle and its controversial namesake have dropped off the radar screen.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Airplanes added economic, psychological factors to warfare

Alexander Leydenfrost’s oil on canvas Bombers at Night sold for $4,182 at a February 2010 Heritage auction.

By Jim O’Neal

In August 1945, a historic event occurred: Foreign forces occupied Japan for the first time in recorded history. It was, of course, the end of World War II and cheering crowds were celebrating in the streets of major cities around the world as peace returned to this little planet.

A major factor in finally ending this long, costly war against the Axis powers of Germany and Japan was, ultimately, the use of strategic bombing. An essential element was the development of the B-29 bomber – an aircraft not even in use when Japan attacked Pearl Harbor in 1941, forcing a reluctant United States into a foreign war. Maybe it was hubris or fate, but the attack was a highly flawed decision that would not end well for the perpetrators.

The concept of war being waged from the air dates to the 17th century, when several writers speculated about when and where it would begin. The answer turned out to be the Italo-Turkish War (1911-12): On Oct. 23, 1911, an Italian pilot flew the first aerial reconnaissance mission. A week later, the first aerial bomb was dropped on Turkish troops in Libya. The Turks responded by shooting down an airplane with rifle fire.

As World War I erupted seemingly out of nowhere, the use of airplanes became more extensive. For the most part, however, the real war was still being waged on the ground by static armies. One bitter legacy of this particular war was the frustration over the futility and horror of trench warfare, which was employed by most armies. Many experts knew, almost intuitively, that airplanes could play a role in reducing the slaughter of the trenches, and a consensus evolved that they could best be used as tactical army support.

However, in the 20-year pause between the two great wars, aviation technology improved much faster than other categories of weaponry. Arms, tanks, submarines and amphibious units underwent only incremental changes. The airplane benefited from increased domestic use and major improvements in engines and airframes. The conversion from wood to all-metal construction quickly spread to wings, crew positions, landing gear and even the lowly rivet.

As demand for commercial aircraft expanded rapidly, increased competition led to significant improvements in speed, reliability, load capacity and, importantly, increased range. Vintage bombers were phased out in favor of heavier aircraft with modern equipment. A breakthrough occurred in February 1932 when the Martin B-10 incorporated all the new technologies into a twin-engine plane. The new B-10 was rated the highest performing bomber in the world.

Then, in response to an Air Corps competition for multi-engine bombers, Boeing produced a four-engine model that had its inaugural flight in July 1935. It was the highly vaunted B-17, the Flying Fortress. Henry “Hap” Arnold, chief of the U.S. Army Air Forces, declared it was a turning point in American airpower. The AAF had created a genuine air program.

Arnold left active duty in February 1946 and saw his cherished dream of an independent Air Force become a reality the following year. In 1949, he was promoted to five-star general, becoming the only airman to achieve that rank. He died in 1950.

War planning evolved with the technology and in Europe, the effectiveness of strategic long-range bombing was producing results. By destroying cities, factories and enemy morale, the Allies hastened the German surrender. The strategy was comparable to Maj. Gen. William Tecumseh Sherman’s “March to the Sea” in 1864, which added economic and psychological factors to sheer force. Air power was gradually becoming independent of ground forces and generally viewed as a faster, cheaper strategic weapon.

After V-E Day, it was time to force the end of the war by compelling Japan to surrender. The island battles that led toward the Japanese mainland in the Pacific had ended after the invasion of Okinawa on April 1, 1945, and 82 days of horrific fighting that cost some 250,000 lives. This had been preceded by the March 9-10 firebombing of Tokyo, which killed 100,000 civilians, destroyed 16 square miles and left an estimated 1 million homeless.

Now for the mainland … and the choices were stark and unpleasant: either a naval blockade and massive bombings, or an invasion. Based on experience, many believed that the Japanese would never surrender, acutely aware of the “Glorious Death of 100 Million” campaign, designed to convince every inhabitant that an honorable death was preferable to surrendering to “white devils.” The bombing option had the potential to destroy the entire mainland.

The decision to use the atomic bomb on Hiroshima (Aug. 6) and Nagasaki (Aug. 9) led to Japan’s offer of surrender on Aug. 10, paving the way for Gen. Douglas MacArthur to gain agreement to an armistice and an 80-month occupation by the United States. Today, that decision still seems prudent, despite the fact that we had only the two atomic bombs. Japan now has the third-largest economy in the world at $5 trillion and is a key strategic partner of the United States in the Asia-Pacific region.

Now about those ground forces in the Middle East…

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

As court controversy rages, let’s not forget what we do best

A photograph of Franklin D. Roosevelt signed and inscribed to Eleanor Roosevelt sold for $10,000 at an October 2016 Heritage auction.

By Jim O’Neal

The Supreme Court was created by the Constitution, but the document wisely calls for Congress to decide the number of justices. This was vastly superior to a formula based on the number of states or population, which would have resulted in a large, unwieldy committee. The 1789 Judiciary Act established the initial number at six, with a chief justice and five associates all selected by President Washington.

In 1807, the number was increased to seven (to avoid tie votes) and in 1837 to nine, and then to 10 in 1863. The Judiciary Act of 1866 temporarily reduced the court to seven in response to post-Civil War politics and the Andrew Johnson presidency. Finally, the 1869 Act settled on nine, where it has remained to this day. The major concern has consistently been over the activities of the court and the fear it would inevitably try to create policy rather than evaluate it (ensuring that Congressional legislation was lawful and conformed to the intent of the Constitution).

The recent confirmation hearings are the latest example of both political parties vying for advantage by using the court to shape future policies, reflecting political partisanship at its worst. Although the Supreme Court can’t enforce its decisions (Congress has the power of the purse and the president the power of force), the court has assumed a de facto legislative function through its deliberations. In a sharply divided nation, policy has become the victim on most issues, largely because Congress is unable to find consensus. The appellate process is simply a poor substitute for this legislative weakness.

We have been here before and it helps to remember the journey. Between 1929 and 1945, two great travails were visited on our ancestors: a terrible economic depression and a world war. The economic crisis of the 1930s was far more than the result of the excesses of the 1920s. In the 100 years before the 1929 stock-market crash, our dynamic industrial revolution had produced a series of boom-bust cycles, inflicting great misery on capital and on many people. Even the fabled Roaring ’20s had excluded great segments of the population, especially blacks, farmers and newly arrived immigrants. Who or what to blame?

“[President] Hoover will be known as the greatest innocent bystander in history, a brave man fighting valiantly, futilely, to the end,” populist newspaperman William Allen White wrote in 1932.

The same generation that suffered through the Great Depression was then faced with war in Europe and Asia, the rationing of common items, entrance to the nuclear age and, eventually, the responsibilities for rebuilding the world. Our basic way of life was threatened by a global tyranny with thousands of nukes wired to red buttons on two desks 4,862 miles apart.

FDR was swept into office in 1932, during the depths of the Great Depression, and his supporters believed he possessed just what the country needed: inherent optimism, confidence, decisiveness and the desire to get things done. We had 13 million unemployed, 9,100 banks closed and a government at a standstill. As Roosevelt declared at his inauguration: “This nation asks for action, and action now!”

In his first 100 days, Roosevelt swamped Congress with a score of carefully crafted legislative actions designed to bring about economic reforms. Congress responded eagerly. But the Supreme Court, now dubbed the “Nine Old Men,” said no to most New Deal legislation by votes of 6-3 or 5-4. They made mincemeat of the proposals. Yet the economy did improve, resulting in an even bigger landslide re-election: FDR won 60.8 percent of the popular vote and an astonishing 98.5 percent of the electoral votes, losing only Vermont and Maine.

In his 1937 inaugural address, FDR emphasized that “one-third of the nation was ill-housed, ill-clad and ill-nourished,” and he called for more federal support. However, Treasury Secretary Henry Morgenthau worried about business confidence and argued for a balanced budget, and in early 1937 Roosevelt, almost inexplicably, ordered federal spending reduced. Predictably, the U.S. economy went into decline. Industrial production fell 14 percent, and in October alone another half million people were thrown out of work. It was now clearly “Roosevelt’s Recession.”

Fearing that the Supreme Court would continue to nullify the New Deal, Roosevelt in his ninth Fireside Chat unveiled a new plan for the judiciary. He proposed that the president have the power to appoint additional justices – up to a maximum of six, one for every member of the Supreme Court over age 70 who did not retire within six months. The Judicial Procedures Reform Bill of 1937 (known as the “court-packing plan”) hopelessly split the Democratic majority in the Senate, caused a storm of protest from bench to bar, and created an uproar among both Constitutional conservatives and liberals. The bill was doomed from the start; even the Senate Judiciary Committee reported it to the floor negatively, 10-14. The full Senate vote was even worse … 70-20 to bury it.

We know how that story ended, as Americans were united to fight a Great War and then do what we do best: work hard, innovate and preserve the precious freedoms our forebears guaranteed us.

Unite vs. Fight seems like a good idea to me.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Artists, writers tapped into America’s traveling spirit

Robert Crumb’s original illustration of Jack Kerouac sold for $33,460 at a February 2011 Heritage auction.

By Jim O’Neal

While we lived in London, I was always fascinated by one characteristic common to our employees. Irrespective of the school – be it Eton, Oxford or the London School of Economics – every curriculum vitae (CV) included an account of the student’s foreign travels in the year following graduation. Asia and Australia were the most popular; only rarely did the itinerary include the United States. After all the studying, cramming for exams and other typical activities on campus, they felt an overwhelming compulsion to just travel, for as long as a year.

America was like that at one time, not too long ago. Novelist/journalist James Agee (A Death in the Family) wrote about it in Fortune magazine in 1934: Hunger for movement, he said, was “very probably the profoundest and most compelling of American racial hungers.” The road could help satisfy that hunger. Just put the hood ornament on the center line, the speedometer on 80 and let ’er rip. The urge was there before the car … long before … and invariably sent the country westward. As Huck Finn said: “But I reckon I got to light out for the territory ahead of the rest, because Aunt Sally she’s going to adopt me and sivilize me, and I can’t stand it. I been there before.”

The road was our nation’s ticket to ride and, more precisely, to ride away on. Maybe it was away from who we were, but it was, for sure, away from where we were. To where? Who knew? How about just a fresh start? We could put it all behind us as fast as the car could go.

Novelist/playwright William Saroyan, who liked getting behind the wheel of his Buick, wrote about his desire to hit the road: “It isn’t simply driving at night, it’s going on … to find out what’s out there now, not so much along the highway, in the terrain, under the sky, but in the interior of the driver himself.” Romance with the road was all about get up and go. Wherever you want to go, whenever you want to leave. There were no schedules and no reservations. Time of arrival? Whenever.

Lolita’s Humbert Humbert chose to hit the road to find his interior. Humbert’s creator, the Russian lepidopterist/novelist Vladimir Nabokov, spent two summers on America’s highways, chasing butterflies. A great year for the road was 1957. It was the year painter Edward Hopper gave us his classic Western Motel, the stark symbol of mobility and restlessness. The year that Jack Kerouac, out of the grim mill town of Lowell, Mass., weighed in with his novel On the Road. The road was Kerouac’s characters’ means of escape like the Mississippi was for Huck and Jim. On the Road captured the energy of trying to satisfy that hunger for movement.

The true north of the road was west. The West owned those lonesome, inexhaustible roads, with few-and-far-between motels designed so that cars could be parked about 20 feet from the beds. There was a lot of nowhere for these roads to cover. Distance was measured in hours (18 hours from Amarillo to Santa Monica), providing time to think. Playwright Sam Shepard used the road for writing, and that may explain how he got the West so right.

John Steinbeck wrote that our “Mother Road” was Route 66. The Okies (including my whole family) called it their highway to heaven because it got us to California. We didn’t pick fruit like the Joads in The Grapes of Wrath. We bought real estate in Southern California that had its own fruit trees. I picked peaches and apricots off our three acres and I sold them in front of our house for 50 cents and $1 a lug. One uncle was a carpenter and he bought an entire block, built two houses, sold one for a tidy profit and lived in the other with a semi-alcoholic aunt.

My mother’s three brothers all found great jobs building airplanes and my father bought Pacific Cold Storage in Central Los Angeles (after he divorced my mother). I had two paper routes that netted me $60 a month after expenses (bicycle tires and rubber bands). I could also play night league softball in Huntington Park (we lived in Downey, home of the first Taco Bell 25 years later), and one-on-one basketball every spare minute.

My friends and I lived vicariously through TV shows like Route 66, with Martin Milner and George Maharis playing drifters in a Corvette – the only fictional series shot all over North America – with their stories of working in shipbuilding, on oil rigs and on shrimp boats from Chicago to Los Angeles. (Corvette sales doubled.) This was followed by Michael Parks in Then Came Bronson and the classic Easy Rider with Peter Fonda, Jack Nicholson and Dennis Hopper.

We had scratched our itch and found gold fast (sunshine, beaches, long-legged tan girls), but it was still fun watching others make their way west.

I wonder if space travel is what itches Elon Musk and Jeff Bezos.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

How did cotton farmers give the Union a run for its money?

A three-piece coin silver coffee set, circa 1855, that belonged to Jefferson Davis and his wife sold for $28,680 at a June 2013 Heritage auction.

By Jim O’Neal

Newly elected President Franklin Pierce quickly selected his Cabinet, strategically picking a Southern senator, Jefferson Davis of Mississippi, to be his Secretary of War in 1853. Davis, too, would become a president: the first and only president of the Confederate States of America (1861-65).

Jeff Davis, like so many others in the South, did not support the secessionist movement, since he was convinced the North would not allow this to occur peacefully. However, he was also convinced that each state was sovereign and had an unquestioned right to secede. When war finally came, loyalty to state was an easy choice to make, irrespective of personal views on slavery.

President Davis had an extensive military career and his four years as Secretary of War made him fully aware of the North’s vastly superior military and industrial power. Further, there were 21 million people in the North (mostly white), a 2-to-1 advantage over the South, which had several million slaves. Nevertheless, on April 29, 1861, Davis requested an Army of 100,000 volunteers, knowing full well it would be difficult to equip and arm them on a sustainable basis.

Another man familiar with this significant issue was Colonel Josiah Gorgas, head of the Confederate Ordnance Bureau. Gorgas had three potential sources of supply for the Confederate armed forces: inventory on hand, home production and foreign imports. By using arms seized from federal arsenals, Gorgas had (barely) enough weapons to outfit the initial 100,000 troops called out by President Davis. Then he turned his full attention to the future.

Unlike others in the South, Gorgas was savvy enough to know that the war would not be over quickly, and he realized his meager on-hand stocks of munitions would soon disappear. Given enough time, he planned to establish munitions plants that would make the new nation self-sustaining, but until then, “certain articles of prime necessity” would have to be imported from Europe. In April 1861, he dispatched Captain Caleb Huse to Great Britain to set up a purchasing arrangement for foreign supplies. Only two things went wrong: No local munitions were ever produced, and no supply lines from Europe were established, because the funding strategy failed thanks to “King Cotton.”

King Cotton was a political and economic theory based on the coercive power of Southern cotton. The British textile industry imported 80 percent of the South’s cotton. Deny them this supply and the severe impact on the British economy would force them to intervene in the war to help the South. The second tenet was that Northern textile mills were reliant on Southern cotton and starving them would disrupt the Northern economy as well.

Then, in the summer of 1861, the South curiously imposed an embargo on its own cotton shipments. Although designed to bring the British into the war, it really only deprived Huse’s European purchasing mission of the funds to buy imported supplies.

The obvious question: How did this small group of cotton farmers … with limited supplies and munitions and a failed strategy to obtain more … fight a war against an enemy backed by an industrial powerhouse and manage to last four years, inflicting great losses while sustaining even greater losses of lives and property?

My simplistic answer:

  1. President Lincoln and his generals (especially George McClellan) were not focused on the total destruction of the enemy (hopeful of coaxing them back into the Union).
  2. They were interested in winning battles rather than controlling territory.
  3. They avoided destroying infrastructure (until William Tecumseh Sherman demonstrated its benefits).
  4. The South was fighting for its future. I see similarities to both Vietnam and Afghanistan … people who would never surrender and, as the Taliban explained, “You have the watches. We have the time.”

Thank the Lord for generals Grant and Sherman.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Webster understood the power of words, common language

Webster used American spellings like “color” instead of the English “colour” and added American words, such as “skunk” and “squash,” that weren’t in English dictionaries.

By Jim O’Neal

In 1802, a French mining engineer proposed a two-level tunnel under the English Channel to connect France and Great Britain. During the Paris Peace Conference in 1919, following the end of World War I, British Prime Minister David Lloyd George revived the idea as a means of assuring France that Great Britain was ready to help if Germany ever launched another invasion. Great Britain is the largest of the British Isles and the 9th-largest island in the world.

Although a tunnel was not available to help France when Germany invaded again in World War II, construction of the Channel Tunnel – with high-speed Eurostar train service – was finally completed in 1993. Queen Elizabeth was a passenger on the inaugural trip from London to Paris after a one-day delay due to a small, embarrassing mechanical problem. We were living in London at the time and Nancy and I were able to get reservations on the second day. We got an impressive special commemorative stamp in our U.S. passports to celebrate the occasion.

The English have a long history of special stamps, and of printing and copyright laws dating to the Statute of (Queen) Anne in 1710. This act of Parliament was the first to provide copyright regulation by the government, rather than by private parties. Prior to this action, the Licensing of the Press Act of 1662 required printing presses to be approved by the Stationers’ Company, a Royal Charter group that held a literal monopoly on the entire publishing industry.

The English also had Dr. Samuel Johnson, who published A Dictionary of the English Language in 1755. This two-volume set, and Dr. Johnson himself, dominated English lexicography for more than a century.

Copyright law in America dates to the Colonial period; however, the Statute of Anne did not apply, for reasons that have been lost to history. Instead, federal law in America was established in the late 18th century by the Copyright Act of 1790. We also did not have our own dictionary, and even the spelling of words (as with their meanings) was generally left to the writer’s choice or presumed intent. Documents of that period are complicated to read and subject to varying interpretations.

Enter Noah Webster (1755-1843), who did much more than publish a dictionary. A Yale graduate with a law degree he never quite turned into a job, he was a co-founder of Amherst College and started New York City’s first daily newspaper, the American Minerva, with backing from Alexander Hamilton and other Federalists. With his legal training and publishing experience, he saw the flaws in our system and continually lobbied Congress to pass copyright laws specifically applicable to America.

Webster was also adamant that we needed our own dictionary, not Dr. Johnson’s, which he attacked and scorned, perhaps to an extreme: “Not a single page of Johnson’s dictionary is correct!” In his mind, we needed a dictionary that was clearly American, containing words unique to the young nation’s growing vernacular. Clearly, Webster also had his own agenda to advance. In the case of “equal,” he believed the Declaration of Independence was wrong in stating that “all men are created equal.”

Webster believed in equality of opportunity, but not equality of conditions. He also disagreed with “free” – that all men were free to act according to their will. That meaning threatened government, giving people the sense they were above authority. The tension with the meaning of “free” continues, and neither Webster’s dictionary nor any other can remove it.

But there is little doubt that a comprehensive American dictionary was a critical issue that had to be resolved as our society continued to grow and expand beyond the reach of English dogma.

Noah Webster thought an American dictionary would take him three to five years to complete and he bravely stated, “I ask no favors, the undertaking is Herculean, but it is of far less consequence to me than to my country…”

To prepare himself, he restudied his college Greek, Latin and Hebrew, perfected his French and German, then studied Danish, Anglo-Saxon, Welsh, Old Irish, Persian and seven Asiatic and Assyrian-based languages. His American Dictionary of the English Language had two volumes, 800 pages each, and sold for $20 in 1828, the year it was published. It had taken him nearly 20 years to complete.

Along the way to a common language, with common pronunciation, he included spelling books and improved teaching methods for Americans of all ages. Webster’s lasting place in the nation’s history derives from his conviction in the power of language and the necessity of fully understanding the language to exercise its power.

That conviction produced the final volume of independence from England.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Despite numerous failed examples, socialism still fascinates some people

An 1872 presidential campaign banner for Horace Greeley sold for $40,000 at a December 2016 Heritage auction.

By Jim O’Neal

Many credit the famous 19th-century motto “Go West, young man” to newspaperman Horace Greeley, for a line in a July 1865 editorial. However, there is still debate over whether it was first penned by Greeley or by the lesser-known John Soule in an 1851 edition of the Terre Haute (Ind.) Express. Either way, the dictum helped fuel the westward movement of Americans in our quest for Manifest Destiny (“From sea to shining sea”). Clearly, Greeley did more to popularize the concept, thanks to the great influence of his successful newspaper.

Greeley was much less successful as a politician. He was sent to Congress in 1848 in a special election to represent New York. His colleagues groused that the brief three months he spent there were devoted primarily to exposing Congressional corruption in his newspaper rather than to passing legislation. He was unable to generate any meaningful support for re-election, which relegated him back to his real interests: reporting the news and exposing crooked politicians.

Despite this setback to his political career, Greeley remained a powerful force in American politics throughout the entire Civil War period and beyond. After exposing the corruption in the first term of the Grant presidency (1869-73), he found himself in the curious position of being the presidential candidate of both the Democratic Party (which he had opposed on every issue for many years) and the Liberal Republican Party (an offshoot that objected to the corruption).

The 1872 presidential election was especially bitter, with both sides resorting to dirty tricks and making wild allegations against each other. Grant won the Republican nomination unanimously and, as the incumbent, chose not to actively campaign. Greeley was a virtual whirlwind, traveling widely and making 20 or more speeches a day. A cynic observed that it was the wrong message to the wrong audience; fundamentally, Greeley was simply a poor campaigner and Grant was still a very popular president/general.

Grant easily won his re-election bid for a second term with 56 percent of the popular vote, and Greeley died on Nov. 29 – just 24 days after the election and before the electoral votes were cast or counted. This is the first and only time a nominee for president of a major party has died during the election process. Grant went on to snag a comfortable 286 electoral votes; Greeley’s were spread among several other candidates, and the three cast for the deceased Greeley himself were later contested.

Thus ended the life of Horace Greeley (1811-1872), founder and editor of the New-York Tribune, arguably in the top tier of great American newspapers. Established in 1841, it was renamed the New-York Daily Tribune (1842-1866) as its daily circulation exploded to 200,000. Greeley endlessly promoted utopian reforms such as vegetarianism, agrarianism, feminism and socialism. From 1852 to 1862, the paper retained Karl Marx as its London-based European correspondent, a platform he used to elaborate on the basic tenets of Marxism.

Great Britain had just completed its decennial census, which put the population at precisely 20,959,477. This was just 1.6 percent of the world’s population, but nowhere on the planet was there a richer or more productive group of people. The empire produced 50 percent of the world’s iron and coal, controlled two-thirds of the shipping and accounted for one-third of all trade. London’s banks had more money on deposit than all other financial centers … combined! Virtually all the finished cotton in the world was produced in Great Britain, on machines built in Britain by British inventors.

The famous British Empire covered 11.5 million square miles and included 25 percent of the world’s population. By whatever measurement, it was the richest, most innovative and skilled nation known to man. And in London, where Marx was living the good life, primarily on his friend Friedrich Engels’ money, he was still churning out socialist propaganda. He made no attempt to explain that, for the first time in history, there was a lot of everything in most people’s lives. Victorian London was not only the largest city in the world, but the only place one could buy 500 different kinds of hammers and a dazzling array of nails to pound on.

While Marxism morphed into Bolshevism, communism and socialism – polluting the economic systems of many hopeful utopians like Greeley – capitalism and the market-based theories of Adam Smith (“the father of modern economics”) quietly crept over America almost unnoticed. Despite the numerous failed examples of socialism in the real world, there will always be a new generation of people wanting to try it.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Peaceful transfer of presidential power is one of our strengths

Seven Days in May, starring Burt Lancaster and Kirk Douglas, is a 1964 movie about a military/political cabal’s planned takeover of the U.S. government.

By Jim O’Neal

It seems clear that one of the bedrock fundamentals that contributes to the stability of the U.S. government is the American presidency. Even considering the terrible consequences of the Civil War – 11 states seceding, 620,000 lives lost and widespread destruction – it’s important to remember that the federal government held together surprisingly well. The continuity of unbroken governance is a tribute to a success that is the envy of the world.

Naturally, the Constitution, our system of justice and the rule of law – along with all the other freedoms we cherish – are all critical contributors. But it’s been our leadership at the top that’s made it all possible. In fact, one could argue that having George Washington as the first president for a full eight years is equal in importance to all other factors. His unquestioned integrity and broad admiration, in addition to precedent-setting actions, got us safely on the road to success despite many of the governed being loyal to the British Crown.

Since that first election in 1789, 44 different men have held the office of president (Grover Cleveland for two separate terms), and six of them are alive today. I agree with Henry Adams, who argued, “A president should resemble a captain of a ship at sea. He must have a helm to grasp, a course to steer, a port to seek. Without headway, the ship would arrive nowhere and perpetual calm is as detrimental to purpose as a perpetual hurricane.” The president is the one who must steer the ship, as a CEO leads an organization, be it small or large.

In the 229 intervening years, there have been brief periods of uncertainty, primarily due to vague Constitutional language. The first occurred in 1800, when two Democratic-Republicans, Thomas Jefferson and Aaron Burr, each received 73 electoral votes. It was assumed that Jefferson would be president and Burr would be vice president. The wily Burr spotted an opportunity and refused to concede, forcing the decision into the House. Jefferson and Burr remained tied for 35 ballots, until Alexander Hamilton (convinced that Jefferson was the lesser of two evils) swayed a few votes to Jefferson, who won on the 36th ballot. This glitch was fixed by the 12th Amendment in 1804, which requires each Elector to cast separate votes for president and vice president, avoiding any such uncertainty.

A second blip occurred after William Henry Harrison and John Tyler defeated incumbent Martin Van Buren. At age 68, Harrison was the oldest man yet sworn in as president, a record he held until Ronald Reagan’s inauguration in 1981 at age 69. Harrison died 31 days after his inauguration (also a record), the first time a president had died in office. A controversy arose over the succession. The Presidential Succession Act of 1792 specifically provided for a special election in the event of a double vacancy, but the Constitution was not specific about a vacancy in the presidency alone.

Vice President Tyler, at age 51, became the youngest man yet to assume the nation’s leadership. He was well educated, intelligent and experienced in governance. However, the Cabinet met and concluded he should bear the title of “Vice President, Acting as President” and addressed him as Mr. Vice President. Ignoring the Cabinet, Tyler was confident that the powers and duties fell to him automatically and immediately the moment Harrison died. He moved quickly to make this known, but doubts persisted and many arguments followed until the Senate voted 38-8 to recognize Tyler as the president of the United States. (It was not until 1967 that the 25th Amendment formally stipulated that the vice president becomes president, as opposed to acting president, when a president dies, resigns or is removed from office.)

In July 1933, an extraordinary meeting was held by a group of disgruntled financiers and Gen. Smedley Butler, a recently retired, two-time Medal of Honor recipient. According to official Congressional testimony, Butler claimed the group proposed to overthrow President Franklin Roosevelt because of the implications of his socialistic New Deal agenda, which would create enormous federal deficits if allowed to proceed.

Smedley Darlington Butler was a U.S. Marine Corps major general – the highest rank then authorized – and the most decorated Marine in U.S. history. Butler (1881-1940) testified in a closed session that his role in the conspiracy was to issue an ultimatum to the president: FDR was to immediately announce he was incapacitated due to his crippling polio and needed to resign. If the president refused, Butler would march on the White House with 500,000 war veterans and force him out of power. Butler claimed he refused despite being offered $3 million and the backing of J.P. Morgan’s bank and other important financial institutions.

A special committee of the House of Representatives (a forerunner to the Committee on Un-American Activities) headed by John McCormack of Massachusetts heard all the testimony in secret, but no additional investigations or prosecutions were launched. The New York Times thought it was all a hoax, despite supporting evidence. Later, President Kennedy privately mused that he thought a coup d’état might succeed if a future president thwarted the generals too many times, as he had done during the Bay of Pigs crisis. He cited a military plot like the one in the 1962 book Seven Days in May, which was turned into a 1964 movie starring Burt Lancaster and Kirk Douglas.

In reality, the peaceful transfer of power from one president to the next is one of the most resilient features of the American Constitution and we owe a deep debt of gratitude to the framers and the leaders who have served us so well.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Notorious traitors? Let’s look at Benedict Arnold

A May 24, 1776, letter by Benedict Arnold, signed, to Gen. William Thompson, realized $23,750 at an April 2016 Heritage auction.

By Jim O’Neal

Vidkun Quisling is an obscure name from World War II, yet “Quisling” has become a synonym for a traitor or collaborator. From 1942 to 1945, he was Prime Minister of Norway, heading a pro-Nazi puppet government after Germany invaded. For his role, Quisling was put on trial for high treason and executed by firing squad on Oct. 24, 1945.

Obviously better known are Judas Iscariot of Last Supper fame (30 pieces of silver); Guy Fawkes, who tried to assassinate King James I by blowing up Parliament (the Gunpowder Plot); and Marcus Junius Brutus, who stabbed Julius Caesar (“Et tu, Brute?”). In American history, it’s a close call between John Wilkes Booth and Benedict Arnold.

Arnold

The irony concerning Benedict Arnold (1741-1801) is that his early wartime exploits had made him a legendary figure, but Arnold never forgot the slight he received in February 1777, when Congress bypassed him while naming five new major generals … all of them junior to him. Afterward, George Washington pledged to help Arnold “with opportunities to regain the esteem of your country,” a promise he would live to regret.

Unknown to Washington, Arnold had already agreed to sell secret maps and plans of West Point to the British via British Maj. John André. There have always been honest debates over Arnold’s real motives for this treacherous act, but it seems clear that personal gain was the primary objective. Heavily in debt, Arnold had brokered a deal that included having the British pay him 6,000 pounds sterling and award him a British Army commission for his treason. There is also little doubt that his wife Peggy was a full accomplice, despite a dramatic performance pretending to have lost her mind rather than her loyalty.

The history of West Point can be traced back to when it was occupied by the Continental Army after the Second Continental Congress (1775-1781) was designated to manage the Colonial war effort. West Point – first known as Fort Arnold and later renamed Fort Clinton – was strategically located on high ground overlooking the Hudson River, with panoramic views extending all the way to New York City, ideal for military purposes. Later, in 1801, President Jefferson ordered plans to establish the U.S. Military Academy there, and West Point has since churned out many distinguished military leaders … first for the Mexican-American War and then for the Civil War, including both Ulysses S. Grant and Robert E. Lee. It is the oldest continuously operating Army post in U.S. history.

To understand this period in American history, it helps to start at the end of the Seven Years’ War (1756-63), which was really a global conflict that included every major European power and spanned five continents. Many historians consider it “World War Zero,” and on the same scale as the two 20th century wars. In North America, the skirmishes started two years earlier in the French and Indian War, with Great Britain an active participant.

The Treaty of Paris in 1763 ended the conflict, with the British winning a stunning series of battles, France surrendering its Canadian holdings, and Spain ceding its Florida territories in exchange for Cuba. Consequently, the British Empire emerged as the most powerful political force in the world. The only issue was that these conflicts had nearly doubled England’s debt, from 75 million to 130 million pounds sterling.

A young King George III and his Parliament quietly noted that the Colonies were nearly debt free and decided it was time for them to pay for the 8,000-10,000 Redcoat peacetime militia stationed in North America. In April 1764, they passed the Currency Act and the Sugar Act, which limited inflationary Colonial currency and cut the trade duty on foreign molasses. In 1765, they struck again. Twice. The Quartering Act forced the Colonists to pay for billeting the king’s troops. Then the infamous Stamp Act placed direct taxes on Americans for the first time.

This was one step too far and inevitably led to the Revolutionary War, an armed conflict that involved hot-blooded, tempestuous individuals like Benedict Arnold. A brilliant military leader of uncommon bravery, Arnold poured his life into the Revolutionary cause, sacrificing his family life, health and financial well-being for a conflict that left him physically crippled. Sullied by false accusations, he became profoundly alienated from the American cause for liberty. His bitterness unknown to Washington, on Aug. 3, 1780, the future first president announced that Arnold would take command of the garrison at West Point.

The newly appointed commander calculated that turning West Point over to the British, perhaps along with Washington as well, would end the war in a single stroke by giving the British control of the Hudson River. The conspiracy failed when André was captured with incriminating documents. Arnold fled to a British warship, and the British refused to trade him for André, who was hanged as a spy after pleading instead to be shot by firing squad. Arnold went on to lead British troops in Virginia, survived the war and eventually settled in London. He quickly became the most vilified figure in American history and remains the symbol of treason even today.

Gen. Nathanael Greene, often called Washington’s most gifted and dependable officer, summed it up after the war most succinctly: “Since the fall of Lucifer, nothing has equaled the fall of Arnold.”

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].