Link Between Value of Money and Gold a Quaint Relic of the Past

This Serial Number 1 Stephen Decatur $20 1878 Silver Certificate, Fr. 306b, is believed to be the first silver certificate ever produced. It sold for $175,375 at a May 2005 Heritage auction.

By Jim O’Neal

In 1961, I was a member of a high-powered bowling team that competed on Tuesday nights at the South Gate Bowling Center in Southern California. We all had 200-plus averages, but only managed to win one league championship in the four years we were together. In February, one of my teammates, Carl Belcher, bowled a perfect game (12 strikes) and received 250 silver dollars from a promotional gimmick the center used to attract customers. Nobody paid much attention and I personally thought it was an unnecessary inconvenience to lug the sacks to a local bank to get rid of them.

Most of the silver dollars in circulation were probably in Nevada since all the Reno and Las Vegas casino slot machines used them instead of tokens. Even paper currency was printed with the promise to “pay to the bearer on demand … one silver dollar,” which evolved into “one dollar in silver.” For a while, it was possible to get a small plastic bag of silver equivalent to the denomination of the paper currency.

Silver certificates were authorized by two Acts of Congress: the first on Feb. 28, 1878, followed by another on Aug. 9, 1886. These notes are particularly attractive, quite rare and sometimes expensive. At one time, I owned an especially distinguished $20 bill bearing the head of Captain Stephen Decatur, naval hero of the War of 1812. It was serial number 1, and experts believe that since the Treasury generally printed the $20s first, this note was probably the first silver certificate ever printed. Heritage Auctions sold it for $175,375 in May 2005, when I auctioned my currency collection.

However, after Executive Order 6102 of 1933, there were no more gold coins or silver dollars minted in the United States, and paper notes were used for denominations above 50 cents. Through 1964, dimes, quarters and half dollars were minted in 90 percent silver, and half dollars contained 40 percent silver from 1965 to 1970. Even the lowly penny had most of its copper content removed and is now made primarily of zinc, with a thin copper plating.
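As a back-of-the-envelope illustration of those compositions, here is a short Python sketch of the silver content and melt value of the coins mentioned above. The coin weights and fineness figures are the official U.S. Mint specifications; the spot price passed in is an assumed placeholder, not a quote.

```python
# Silver content and melt value of U.S. silver coinage.
# Weights (grams) and fineness are the official Mint specifications.
TROY_OZ_G = 31.1035  # grams per troy ounce

# coin name -> (gross weight in grams, silver fineness)
COINS = {
    "pre-1965 dime":    (2.5,  0.90),
    "pre-1965 quarter": (6.25, 0.90),
    "pre-1965 half":    (12.5, 0.90),
    "1965-70 half":     (11.5, 0.40),
}

def silver_oz(coin: str) -> float:
    """Troy ounces of pure silver in one coin."""
    grams, fineness = COINS[coin]
    return grams * fineness / TROY_OZ_G

def melt_value(coin: str, spot_per_oz: float) -> float:
    """Melt value in dollars at a given silver spot price ($/troy oz)."""
    return silver_oz(coin) * spot_per_oz

for name in COINS:
    print(f"{name}: {silver_oz(name):.4f} troy oz of silver")
```

A pre-1965 quarter works out to roughly 0.18 troy ounces of silver, which is why these coins vanished from circulation once the metal became worth more than the face value.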

In 4,000 years, the only period in which civilization has not based its currency on metal, especially gold and silver, is the past 46 years. On Aug. 15, 1971 (“a date that has lived in infamy”), President Richard Nixon announced the temporary suspension of the dollar’s convertibility into gold. The White House tapes from the previous week reveal that he expected gold prices to explode once the dollar was de-linked, since the Federal Reserve would print money like crazy once the currency was no longer collateralized, and that fighting the resulting inflation would cost jobs (unemployment had just gone from 4 percent to 6 percent). And Nixon was “not about to be a hero” (his words) on inflation at the expense of employment.

Then the administration imposed a rigorous regime of wage and price controls, enforced by IRS audits and leverage over federal contracts. The plan failed spectacularly and the 1970s were rife with double-digit inflation, energy shortages and ultimately the “stagflation” that torpedoed both the Ford and Carter presidencies.

Flash forward to today: we are still trying to use monetary policy to solve economic issues, while remaining unwilling even to touch the critical fiscal issues that everyone acknowledges are fundamental to future economic challenges. The only thing that has changed is that there is no need to actually print money when it can be “whistled into existence” via a bit of monetary legerdemain called quantitative easing, in which the Federal Reserve buys Treasury debt with newly created money.

Since the financial crisis of 2008, the world’s central bankers have materialized $12.25 trillion by tapping on a computer keyboard. For perspective, the value of all the gold that’s ever been mined, according to the World Gold Council, is a mere $7.4 trillion. The historical linkage between the value of our money and its metal content is a quaint relic of the past.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Only Four Presidents Never Appointed a Supreme Court Justice

An 1840 silk banner depicting William Henry Harrison realized $33,460 at a May 2010 Heritage auction.

By Jim O’Neal

When Donald Trump’s appointee fills the Supreme Court vacancy created by the death of Justice Antonin Scalia, the chief executive will leave the small group of presidents who never placed a single nominee on the court. Trump’s pick will become the 113th justice to serve, joining a line that includes 17 chief justices and, to date, four women.

Presidents without a Supreme Court appointee:

  • William Henry Harrison (1841) – Died only 31 days after being inaugurated.
  • Zachary Taylor (1849-50) – Died 16 months after inauguration.
  • Andrew Johnson (1865-69) – Victim of a hostile Congress that blocked several nominees.
  • Jimmy Carter (1977-81) – The only president to serve a full term with no court vacancy to fill.

It seems clear that the Founding Fathers did not spend a lot of time considering the importance of the Supreme Court as an equal branch of government. That would come later during the tenure of Chief Justice John Marshall, who many credit with providing the balance to ensure that our fragile democracy survived.

One example: there are no legal or constitutional requirements for a federal judgeship. There is an unwritten prerequisite of having practiced law or having been a member of the bar, but it is not mandatory. As a matter of historical record, no non-lawyer has ever been a member of the Supreme Court – and it is a virtual certainty that none ever will.

And although the methodology for judicial appointments was subject to intense debate, the criteria for such appointments were apparently not a matter of significance. Those few delegates who did raise the issue of criteria did so by assuming merit over favoritism. The framers also did not foresee the role political parties would very soon come to play in the appointment and confirmation process.

Only John Adams clearly anticipated the rise of political parties, but, of course, he was not a delegate to the Constitutional Convention. He summarized it rather well: “Partisan considerations, rather than the fitness of the nominees, will often be the controlling consideration of the Senate in passing on nominations.”

I suspect they would all be disappointed by the dramatic, partisan “gotcha” grilling that nominees face today.

Personally, I would prefer the old process the Scots used to select their highest judges. The nominations came from the lawyers, who invariably selected the most successful and talented members of the legal community. This effectively eliminated their fiercest competitors, whose best clients they could then solicit. The court would then truly be assured of getting the best of the best, while the profession competed for clientele.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Ford Viewed His Legacy as Rebuilding Confidence in the Presidency

Gerald Ford’s Presidential Seal hooked rug, used in his home office in Rancho Mirage, Calif., sold for $13,145 at a December 2012 Heritage auction.

By Jim O’Neal

Public opinion polls as early as 1975 indicated that President Gerald Ford would be unlikely to win the Republican nomination for president in 1976. The main competition came from the conservative former governor of California, Ronald Reagan. However, Ford was determined to campaign hard and plunged into an aggressive schedule.

The mass demonstrations at the White House had finally started to wind down, although there was another incident in March 1975. Sixty-two protesters entered the grounds on the regular daily tour and then refused to leave, saying the U.S. should end involvement in the Indochina war and liberate the 200,000 political prisoners in South Vietnam. President Ford’s amnesty offer to those who had avoided the draft expired on March 1, and the protesters also demanded amnesty for “anyone who had resisted the war.” Most were booked and released from jail.

As the president started his campaign trip West, there were some nasty surprises lurking in Northern California. On Sept. 5, 27-year-old Lynette “Squeaky” Fromme, a cult follower of convicted mass murderer Charles Manson, pointed a partially loaded Colt .45 at Ford from two feet away and pulled the trigger. There was no cartridge in the firing chamber, and an alert Secret Service agent grabbed the gun before it could be fired.

Three weeks later, as Ford left his San Francisco hotel (the St. Francis), 45-year-old Sara Jane Moore, a civil-rights activist, fired a .38-caliber revolver at him, but missed. A bystander prevented her from taking a second shot. Both women were convicted and given life sentences. Subsequently, both were released under a federal law that allows parole after 30 years, although “Squeaky” served two extra years for a prison escape/recapture.

President Ford

At the GOP convention in Kansas City, Ford narrowly won the nomination on Aug. 19 with 1,187 votes to Reagan’s 1,070. He chose Bob Dole for his running mate. The Democrats picked Jimmy Carter and once again the opinion polls showed that the president was far less popular than the Georgia peanut farmer.

Ford challenged Carter to a series of televised debates – the first time an incumbent president debated an opponent. Ford also campaigned hard and nearly caught Carter, but in the November election he became the first sitting president to be defeated since Herbert Hoover in 1932.

In his final State of the Union address to Congress on Jan. 12, 1977, Ford said, “I am proud of the part I have played in rebuilding confidence in the presidency, confidence in our free system and confidence in our future. Once again, Americans believe in themselves, believe in their leaders, and in the promise that tomorrow holds for their children.”

Amen.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Iranian Revolution Created Tensions That Have Yet to be Resolved

An unpublished Time cover illustration of Ayatollah Khomeini by Don Ivan Punchatz (1936-2009), dated 1984, went to auction in September 2012.

By Jim O’Neal

No act of terror in the 1970s exceeded the profound tension created by the unpredictable drama that enveloped a Middle East nation in 1979. For Americans, it closed out the decade with a new and ferocious attack on our pride and sense of well-being.

It arrived from a most unlikely source: a bearded, humorless, 76-year-old Muslim cleric, exiled from his native Iran for the previous 15 years, the last of them spent in Neauphle-le-Château (outside Paris), preaching Sharia law and campaigning for the ouster of the Shah.

Remarkably, in early 1979, the Ayatollah Ruhollah Khomeini achieved his life’s goal: toppling the Shah’s Pahlavi dynasty and replacing him as de facto head of a modern theocracy. As he did, the enthusiasm of his insurrection fanned the flames of anti-Western fanaticism throughout the Muslim world. An Islamic Revolution was formally under way.

After centuries of being guided by conservative mullahs, Iran had been wrenched into the 20th century by what the Shah described as a “white revolution” (bloodless). He was the son of an army officer who had seized control of Iran in the 1920s. The Shah succeeded his father, was briefly deposed and then reinstalled by a CIA-led coup in 1953.

The Shah was active, stripping the clergy of their vast land holdings, declaring radical new rights for women, dramatically increasing urbanization and strengthening ties to the West. In addition to being a source of oil, Iran became a strategic impediment to the advancement of its neighbor, the Soviet Union. As western alliances flourished, so did Iran. Previously a desert state, it was transformed into a stunning country with shiny steel mills, nuclear power and an army well-stocked with American artillery.

Unfortunately, much of the populace did not want to abandon their rich heritage. They found inspiration in the sermons of Muslim leaders and viewed the western world as plagued with problems. The increasing tension forced the Shah to crack down hard and by 1979, he could not prevent popular resistance.

Early on Nov. 4, 1979, a mob of demonstrators breached the American Embassy in Tehran, took the staff as hostages and began their 444-day declaration of vengeance against the Great Satan. They defied the United Nations, the United States, and a failed 1980 rescue mission that left aircraft wreckage, the bodies of eight U.S. servicemen, and Jimmy Carter’s reelection effort in the desert sands.

Thirty-seven years later, the East-West struggle continues; only the leaders have changed. Now, however, Westerners are viewed as occupiers rather than held as hostages, and multiple conflicts in various countries offer little hope for peace. Civil wars usually last about 10 years. This may turn out to be a generational conflict between competing civilizations, perhaps all armed with nuclear capabilities.

To date, no one has offered a coherent strategy for an endgame as we continue to argue and debate who or what to blame.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Washington Recognized the Chaos of Autonomous States

The bronze sculpture George Washington at Valley Forge by Henry Merwin Shrady, modeled in 1905, cast circa 1906, sold for $54,970 at an April 2007 Heritage auction.

By Jim O’Neal

It had been a long war and George Washington was both tired and relieved to be returning to his plantation in Virginia for a well-deserved retirement. Mount Vernon was badly in need of his full-time attention and his finances were frayed.

However, he was apprehensive about a central government that consisted of a chaotic, ramshackle Congress considered by GW to be “wretchedly managed.” The legislature was a one-vote, one-state body that required a quorum of nine states to operate and a unanimous vote for major laws. This was no “United States,” but a loosely governed confederation of 13 states that were largely autonomous.

It seemed clear that the Articles of Confederation were impotent and in need of major revisions. However, it would probably require a crisis to force the changes and GW could sense that others would be looking to him (once again!) to provide the leadership needed, retirement or not.

He was right on both counts.

The crisis came when thousands of farmers in rural Massachusetts rebelled against tax increases the state had imposed on land to help pay off heavy debts. The farmers, many of whom had lost their land to foreclosure, swamped courthouses and threatened judges with their pitchforks.

They were led by Daniel Shays (hence “Shays’ Rebellion”), an ex-militia captain, and they finally marched on the Springfield arsenal intent on seizing muskets and powder. This anarchy was met by the Massachusetts militia, who fired point-blank into the crowd, and then by General Benjamin Lincoln, who arrived the next day with 4,000 soldiers to quell the rebels.

Washington was mortified by these events, since he feared disgrace in the eyes of Europeans who were still skeptical of American self-rule. More importantly, it galvanized him to join James Madison, James Monroe and Edmund Randolph in the push to overhaul the Articles of Confederation and preserve the union they had fought so hard for.

Eventually, an executive branch was established and in February 1789, all 69 presidential electors chose GW unanimously to be the first president of the United States. In March, the new U.S. Constitution officially took effect and, in April, Congress formally sent word to Washington that he had won the presidency.

He borrowed money to pay off his debts and travel to New York again, this time to be inaugurated. After a second four-year term, he was finally able to resume his retirement. This time, it only lasted two years since he died on Dec. 14, 1799.

In 1976, President Gerald Ford signed a congressional joint resolution posthumously promoting Washington to General of the Armies of the United States – informally, a “six-star” rank – with the intent that he remain the highest-ranking military officer of all time. Irrespective of future grade inflation, I’m betting this rank will not be surpassed.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

State of the Union Speeches Will Continue Evolving

Twenty-three lines in Abraham Lincoln’s own handwriting from his last State of the Union address went to auction in June 2009.

By Jim O’Neal

On Jan. 15, 1975, President Gerald Ford in his State of the Union speech said:

“The State of the Union is not good. Millions of Americans are out of work. Recession and inflation are eroding the money of millions more.”

“Prices are too high and sales too slow.”

“The national debt will rise to over $500 billion.”

“We depend on others for essential energy.”

These were remarkably candid admissions, atypical of most of his predecessors, who took great leeway with the facts to spin a nice story.

George Washington personally delivered the first State of the Union to a joint session of Congress on Jan. 8, 1790.

Then Thomas Jefferson abandoned the “in person” practice because it was too similar to what a monarch might do, something he was trying to avoid (i.e., a speech from the throne).

In 1913, President Woodrow Wilson revived the practice and it has gradually become a major national event. It has also morphed into a presidential wish list rather than a practical, non-political assessment of national conditions … as designed.

Personal attendance by high-profile politicians is a “must,” except for one Cabinet member in the line of succession (the designated survivor), who stays away in case of a major catastrophe.

In 1981, Jimmy Carter felt compelled to issue an “exit” State of the Union, but that lame-duck ritual has been discontinued.

However, I suspect presidents will increasingly remind us … one more time … about everything that was accomplished, in case we forgot. It provides an excellent chance to combine a farewell with the start of a memoir … and not leave a legacy assessment in less gentle hands.

I would.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].