100 Years Before Rosa Parks, There was Octavius Catto

Rosa Parks refused to give up her seat on a segregated bus, sparking the Montgomery, Ala., bus boycott.

By Jim O’Neal

Most Americans are familiar with Rosa Parks and recall the heroic story of a weary black woman on her way home after a hard day at work who refused to give up her seat and “move to the back of the bus” to make room for white people. The date was Dec. 1, 1955, and the city was Montgomery, Ala.

She was arrested and fined $10, and the ensuing Montgomery bus boycott lasted 381 days. She was ultimately vindicated when the U.S. Supreme Court ruled the segregation law unconstitutional. After her death, she became the first African-American woman to have her likeness depicted in the National Statuary Hall in the U.S. Capitol.

Parks (1913-2005) earned her way into the pantheon of civil rights leaders, but few remember a remarkable man who preceded her by a century when streetcars were pulled by horses.


His name was Octavius Valentine Catto (1839-1871) and history was slow in recognizing his astonishing accomplishments. Even the epitaph on his tombstone shouts in bold letters “THE FORGOTTEN HERO.” One episode in his far-too-short but inspiring life is eerily similar to the events in Montgomery, only dramatically more so. Catto was a fierce enemy of the entire Philadelphia trolley car system, which banned black passengers. On May 18, 1865, The New York Times ran a story about an incident involving Catto that occurred the previous afternoon in Philadelphia, “The City of Brotherly Love” (at least for some).

Paraphrasing the story: a colored man (Catto) had refused all attempts to get him to leave a strictly segregated trolley car. Frustrated, and fearing a fine if he physically ejected the passenger, the conductor cleverly shunted the car onto a side track, detached the horses and left the defiant rider in the now-empty, stationary car. Apparently, the stubborn man was still aboard after spending the night. The incident caused a neighborhood sensation and led even more people to challenge the rules.

The following year, there was an important meeting of the Pennsylvania State Equal Rights League to protest the forcible ejection of several black women from Philadelphia streetcars. The intrepid Catto presented a number of resolutions that highlighted the inequities of segregation, appealed to principles of freedom and civil liberty, and condemned a heavily biased judicial system. He also boldly solicited support from fellow citizens in his quest for fairness and justice.

He got specific help from Pennsylvania Congressman Thaddeus Stevens, a leader of the “Radical Republicans,” who had a fiery passion for desegregation and the abolition of slavery, and who criticized President Lincoln for not acting more forcefully. Stevens is a major character in Steven Spielberg’s 2012 film Lincoln, with Tommy Lee Jones earning an Oscar nomination for his portrayal of Stevens. On Feb. 3, 1870, the 15th Amendment to the Constitution was ratified, guaranteeing suffrage to black men (women of all colors would have to wait another 50 years, until 1920, to gain the right to vote in all states). It would also lead to Catto’s death. On Election Day, Oct. 10, 1871, Catto was out encouraging black men to vote for Republicans when he was fatally shot by white Democrats intent on suppressing the black vote.

Blacks continued to vote heavily for Republicans until the early 20th century and were not even allowed to attend Democratic conventions until 1924, largely because the white governors of Southern states discouraged equal rights and supported Jim Crow laws that were unfair to blacks. As comedian Dick Gregory (1932-2017) famously joked, he was at a white lunch counter where he was told, “We don’t serve colored people here,” and Gregory replied, “That’s all right. I don’t eat colored people … just bring me a whole fried chicken!”

Octavius Catto, who broke segregation on trolley cars and was an all-star second baseman long before Jackie Robinson, would have to wait until the 20th century to get the recognition he deserved. I suspect he would be surprised that we are still struggling to “start a national conversation” about race when that is what he sacrificed his life for.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

As Sir Isaac Newton Noted, Vast Oceans of Truth Lie Undiscovered

Sir Isaac Newton’s autograph was among a group of three signatures by famous scientists that sold for $4,750 at a January 2017 Heritage auction.

By Jim O’Neal

Charles Eliot was president of Harvard College from 1869 to 1909, taking charge at the surprisingly young age of 35. He made some surprising statements, too, starting with his inauguration speech, in which he matter-of-factly observed that the world knew very little about the natural mental capacities of women. Further, he plainly stated that the university was not the place to experiment with that notion, and that women would not be admitted to the regular college. He was also concerned about women living near the university and the obvious implications of their proximity to male students. It was his firm belief that once society resolved its issues of inequality, the question might become clearer. Even long after his retirement, however, he maintained his doubts, since women were “simply physically too fragile.”

Another insight into his perspective came when the school’s baseball team had a winning season. When he learned that one of the factors contributing to this success was the use of the curveball, he opined that the skill was surely unethical and certainly not appropriate for Harvard players.

Fortunately, this was not a systemwide ethos, and he may have been unaware that one of his professors, Edward Charles Pickering (director of the Harvard College Observatory), fired his entire staff of men over their apparent inability to keep up with all the data the observatory routinely generated. Instead, he simply hired his maid/housekeeper to handle the numbers, eventually employing 80-plus women, who became better known as the Harvard Computers.

One of these women was a little-known Radcliffe College graduate named Henrietta Swan Leavitt, who was “allowed” to measure the brightness of stars using the observatory’s photographic plates (women were not allowed to actually operate the telescopes). Leavitt devised a novel way to measure the distances to certain stars, turning them into “standard candles” – a term still in common use today. Another of the computers, Annie Jump Cannon, created a new system of stellar classification. Together, their inferences would prove invaluable in answering two critical questions about the universe: How old is it, and how big?

The man who came up with the answers using their work was lawyer-turned-astronomer Edwin Powell Hubble (1889-1953). When he began peering through the Mount Wilson (Calif.) Observatory’s 100-inch Hooker telescope in 1919 (the world’s largest until 1949), there was exactly one known galaxy: our lonely little Milky Way. Hubble not only proved that the universe contained other galaxies, but that the universe was still expanding. How much credit should go to the Harvard Computers is still an area of contention – but only as to the degree their work contributed to these new discoveries.

Hubble was a handsome star athlete who won seven high school track events in one day and was also a skilled boxer. He never won a Nobel Prize, but he won everlasting fame when NASA named its long-overdue space telescope in honor of his scientific contributions. The Hubble Space Telescope was launched into orbit on April 24, 1990, aboard the Space Shuttle Discovery. Since then, it has been repaired and upgraded five times by American astronauts, and the results have been nothing short of remarkable. NASA is now confident that Hubble’s replacement, due in four to five years, will be capable of looking far enough back into deep space to see the first galaxies that formed after the big bang 13.8 billion years ago.

For perspective, consider what has been learned since Sir Isaac Newton was born on Christmas Day in 1642, when the accepted theory was a heliocentric model with Earth and the other planets orbiting our Sun – much like the one Copernicus published at the end of his life in 1543. Since then, the great scientific minds focused on our own celestial neighborhood, working out the laws of motion, the theory of light and the effects of gravity on what they believed was the entire universe. All big, important concepts (but only as they relate to our infinitesimal little piece of real estate). And then along came quantum mechanics, which added the world of the very small, with its atoms, electrons, neutrons and other pieces too numerous to name, as we bang particles into each other to see what flies off.

What Hubble has gradually exposed is that we have been gazing at our own navel while the sheer magnitude of what lies “out there” has grown exponentially – and there may be no end, as the expansion of the universe continues to speed up. I seem to recall that some wild forecasters once thought there might be as many as 140 billion galaxies in the universe. Now, thanks to Hubble and lots of very smart people, the estimate is as high as 2 trillion galaxies! If each averages 100 billion stars, the number of stars comes to 200 sextillion – a 2 followed by 23 zeros.
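Since that arithmetic is easy to get wrong by a few zeros, here is a quick back-of-the-envelope check – a minimal sketch in Python, using the rough estimates cited above:

```python
# Back-of-the-envelope check of the star count cited above.
galaxies = 2e12          # ~2 trillion galaxies (rough estimate)
stars_per_galaxy = 1e11  # ~100 billion stars per galaxy, on average

total_stars = galaxies * stars_per_galaxy
print(f"{total_stars:.0e}")  # 2e+23 -- 200 sextillion, a 2 followed by 23 zeros
```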

That is a big number, but what if we are living in a multiverse with as many as 11 more universes?

I recall a quote by Sir Isaac Newton: “To myself I am only a child playing on the beach, while vast oceans of truth lie undiscovered before me.”

Aren’t we all.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Yes, Presidential Elections Have Consequences

Chief Justice of the Supreme Court John Marshall is featured on this Fr. 375 Serial Number One $20 1891 Treasury Note, which sold for $114,000 at an April 2018 Heritage auction.

By Jim O’Neal

In theory, there is no mystery or debate regarding the intention of the Founding Fathers in the selection of members to serve on the Supreme Court.

The Constitution crisply explains, in the second paragraph of Article II, Section 2, that the president shall nominate, and by and with the advice and consent of the Senate, shall appoint judges of the Supreme Court. This provision means exactly what it says and has not been modified since its adoption. The Senate grants such consent by a simple majority vote; it may also reject the nominee or simply refuse to act.

One idea discussed, but not acted upon, was Benjamin Franklin’s explanation of the Scottish mode of appointment “in which the nomination proceeded from the lawyers, who always selected the ablest of the profession in order to get rid of him, and share his practice among themselves” – a uniquely clever way to eliminate superior competition.

What has changed is the adoption of the “nuclear option,” which allows cloture – the motion that ends a filibuster – to be invoked by a simple majority instead of a 60-vote supermajority. Senate Majority Leader Harry Reid used it to great effect in 2013 for executive and lower-court nominations while the Democrats were in the majority, and Republicans expanded it to include Supreme Court nominees in 2017. Neil Gorsuch was confirmed to the Supreme Court under this new rule by a 54-45 Senate vote, picking up three anxious Democratic votes in the process. It’s widely assumed that current nominee Judge Brett Kavanaugh will be confirmed by a similar path, since his opponents appear helpless to stop him.

As President Obama once explained, in not too subtle fashion, “Elections have consequences.”

It now seems clear that the Founding Fathers did not foresee that political parties would gradually increase their influence, or that partisan considerations in the Senate would become more prominent than experience, wisdom and merit. This was magnified in the recent effort to stymie a nomination, when the opposition announced it would oppose any candidate the Chief Executive chose. Period. That may not square with a literal reading of the Constitution, but it has gradually become routine and will only get worse (if that’s still possible).

It may astonish some to learn that no legal or constitutional requirements for a federal judgeship exist. President Roosevelt appointed James F. Byrnes as an associate justice in 1941, and Byrnes’ admission to practice came by “reading law” – a now-obsolete custom, predating the modern law school, of which Byrnes was the last justice to benefit. In his case, it’s not clear that he even had a high school diploma. But he had been a member of Congress (and would later serve as governor of South Carolina). He resigned 15 months later (the second-shortest tenure) to head the Office of Economic Stabilization; he was a trusted FDR advisor who many assumed would replace Vice President Henry Wallace as FDR’s running mate in 1944. That honor went to the little-known, high-school-educated Harry Truman, who would assume the presidency the following year when FDR died suddenly.

Thomas Jefferson never dreamed the Supreme Court would become more than a necessary evil to help balance the government in minor legal proceedings, and he would be more than astonished that it is now the final arbiter of what is or isn’t constitutional. The idea that six judges (who didn’t even have a dedicated building) would be considered equal to the president and Congress would have been anathema to him.

However, that was before former Secretary of State John Marshall became Chief Justice of the Supreme Court and, with his ruling in Marbury v. Madison in 1803, started the court’s long journey to final arbiter of the Constitution. There was a new sheriff in town, and the next 40 years witnessed the court’s transformation into the pinnacle of legal power. The justices even have their own building, thanks to Chief Justice (and former President) William Howard Taft, who died five years before it was completed. Someday, Netflix will persuade them to livestream their public sessions for all of us to watch, although I personally prefer C-SPAN, which would eliminate the mindless talking heads that pollute cable television.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Here’s Why We Owe a Lot to Second President John Adams

An 1805 oil-on-canvas portrait of John Adams attributed to William Dunlap sold for $35,000 at a May 2017 Heritage auction.

By Jim O’Neal

John Adams had the misfortune of being squeezed into the presidency of the United States (for a single term) between George Washington and Thomas Jefferson, two of the most famous presidents of all time. As a result, Adams (1735-1826) was often overlooked as one of America’s greatest statesmen and perhaps the most learned and penetrating thinker of his time. The importance of his role in the founding of America was noted by Richard Stockton, a delegate to the Continental Congress: “The man to whom the country is most indebted for the great measure of independence. … I call him the Atlas of American Independence.”

On the way to that independence, his participation started as early as 1761, when he assisted James Otis in the writs of assistance case, defending Boston merchants against Britain’s sweeping search warrants used to enforce its trade laws. When the American Revolution ended, Adams played a key role in negotiating the peace treaty that formally ended the war in 1783. Between those two bookends, he wrote many of the era’s most significant essays and treatises, led the radical movement in Boston, and articulated the principles of independence at the Continental Congress.

Following the infamous Stamp Act of 1765, he attacked it with a vengeance and wrote A Dissertation on the Canon and Feudal Law, asserting that the act deprived the colonists of two basic rights: taxation by consent and trial by a jury of peers – both guaranteed to all Englishmen by Magna Carta. Within a brief 10 years, he was acknowledged as one of America’s best constitutional scholars. When Parliament passed the Coercive Acts in 1774, Adams drafted the principal clause of the Declaration of Rights and Grievances; no man worked harder in the movement for independence and the effort to constitutionalize the powers of self-government.

After the Battles of Lexington and Concord, Adams argued for the colonies to declare independence, and in 1776, Congress passed a resolution recommending that the colonies draft new constitutions and form new governments. Adams wrote a blueprint, Thoughts on Government, and four states used it to shape new constitutions. In summer 1776, as Congress weighed a formal declaration of independence, Adams made a four-hour speech that persuaded the assembly to vote in favor. Thomas Jefferson later recalled that “it moved us from our seats … He was our colossus on the floor.”

Three years later, Adams drafted the Massachusetts Constitution, which was copied by other states and guided the framers of the Federal Constitution of 1787.

He faithfully served two full terms as George Washington’s vice president at a time when the office had only two primary duties: presiding over the Senate (and breaking any tie votes) and counting the ballots in presidential elections. Many routinely considered the office part of Congress rather than of the executive branch. He served one term as president and then lost the 1800 election to his own vice president, Thomas Jefferson, as the party system (and Alexander Hamilton) conspired against his re-election. Bitter and disgruntled, he left Washington, D.C., before Jefferson was inaugurated and returned to his home in Massachusetts. His wife Abigail had departed earlier; their son Charles had died in November from the effects of chronic alcoholism.

Their eldest son, John Quincy Adams, served as the sixth president (also for a single term) after a contentious election, and both father and son gradually sank into relative obscurity. This changed dramatically in 2001, when historian David McCullough published a wonderful biography that reintroduced John and Abigail Adams to a generation that only vaguely knew he had died on the same day as Thomas Jefferson – July 4, 1826, the 50th anniversary of the adoption of the Declaration of Independence. In typical McCullough fashion, it was a bestseller, and it led to an epic TV miniseries that snagged four Golden Globes and a record 13 Emmys in 2008.

Television at its very best!

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Withholding Created 75 Years Ago – Giving Us a ‘Charge It’ Government

A preliminary sketch Norman Rockwell did for a 1945 Saturday Evening Post cover, titled Income Taxes (Beating the Deadline), sold for $59,375 at a November 2014 Heritage auction.

“Our new Constitution is now established, and has an appearance of permanency, but in this world nothing can be said to be certain, except death and taxes.” – Benjamin Franklin, 1789

By Jim O’Neal

I suspect Benjamin Franklin would be pleased that our Constitution has become the most revered document in the United States, but mildly surprised that the U.S. Internal Revenue Code – while undoubtedly much more prosaic – now symbolizes the intimate relationship between the people and their federal government. Detailed IRS regulations guide the filing of federal tax returns, the most universal civic act in our history. At 14,000 pages and 4 million words, the code is a remarkable achievement unparalleled by any government on earth.

As the size and cost of government have grown, so have the size and difficulty of the tax return itself. In 1913, the first year the modern income tax was levied (an emergency income tax enacted during the Civil War had been allowed to expire in 1872), the top rate was 7 percent, and then only on incomes over $500,000. The rate on incomes between $3,000 and $20,000 was just 1 percent, and below that, zero. All but 1 percent of Americans were exempt from the tax. This was by design, since advocates wanted a tax directed only at excess corporate and personal profits, not the wages of ordinary people. It was a way of reasserting the values of the early republic – now focused principally on equality – in reaction to the gross inequities brought on by industrialization, and a way to force millionaire industrialists to share their wealth with society.
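To see just how light that first levy was, here is a simplified calculation using only the two rates mentioned above – a sketch, since the actual 1913 statute layered a graduated surtax across intermediate brackets that this toy model ignores:

```python
# Toy model of the 1913 income tax, using only the rates cited above.
# The real statute also taxed incomes between $20,000 and $500,000
# through a graduated surtax, which this simplification ignores.
def tax_1913_simplified(income: float) -> float:
    tax = 0.0
    if income > 3_000:
        tax += 0.01 * (min(income, 20_000) - 3_000)  # 1% band: $3,000-$20,000
    if income > 500_000:
        tax += 0.07 * (income - 500_000)             # 7% top band
    return tax

print(tax_1913_simplified(2_500))   # 0.0  -- most Americans owed nothing
print(tax_1913_simplified(10_000))  # 70.0 -- well under 1% of income
```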

But WWI and the Great Depression increased the responsibilities of the federal government, and rates took a quantum leap with the demands of WWII, as the government took advantage of American patriotism. The number of tax filers rose to the point that what had been a “class tax” became a “mass tax.” The April 15 deadline is now a national rite, dreaded as much as it is observed. The complexity has become so pervasive that most filers require the aid of professional tax preparers. Looking back, it still seems remarkable that the income tax could be extended to so many people without creating a backlash. The wars helped, as did the government’s success in defeating our enemies and the post-war economic growth. But the primary reason was that a new way had been devised to collect it.

For that, the IRS can thank Beardsley Ruml, a mid-century Macy’s executive who came up with a plan to institute what is politely called “withholding.” Until 1943, income tax was paid each year in a lump sum, and filers were expected to set aside the money to make the payment. That year, when the number of wage earners covered by the tax grew by nearly 35 million and the Treasury Department became nervous about how many were actually prepared to pay, Ruml offered an idea. Aware that customers in his store were comfortable buying big-ticket items when they could pay in installments, he suggested the government have businesses collect the tax in small increments from each paycheck and report the amounts to the employee and the IRS each year for future reconciliation.
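The installment logic is simple enough to sketch. Here is a minimal illustration – the wages, the flat rate and the final liability are all hypothetical numbers chosen for clarity; real withholding tables are far more elaborate:

```python
# Minimal sketch of Ruml's installment idea: pay-as-you-go withholding,
# then a reconciliation against the actual liability at filing time.
# All figures are hypothetical, for illustration only.
annual_wages = 2_400.00
pay_periods = 24           # semi-monthly paychecks
withholding_rate = 0.10    # hypothetical flat rate

withheld_per_check = annual_wages / pay_periods * withholding_rate
total_withheld = withheld_per_check * pay_periods

actual_liability = 230.00  # hypothetical amount computed on the annual return
balance = actual_liability - total_withheld
print(f"Withheld ${total_withheld:.2f} in {pay_periods} small installments; "
      f"{'balance due' if balance > 0 else 'refund'} of ${abs(balance):.2f}")
```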

To win the public’s endorsement, he also suggested a tax amnesty for the previous year. Congress did just that, forgiving 75 percent of the previous year’s tax liability while it installed the withholding machinery that has operated ever since. To appreciate the profound shift a broad-based income tax brought to the Treasury, consider that in 1910, tariffs and excise taxes brought in more than 90 percent of federal revenue; by the end of the century, the income tax had replaced tariffs, providing 90 percent of the nation’s revenue, or $2 trillion! More importantly, it changed the debates – from regional fights over tariffs, or whiskey producers versus cattle growers, to which income levels should be taxed more. Class versus class, and “soak the rich” is always the first response to the insatiable appetite of every level of government.

As an elastic source of revenue, the income tax became a fundamental part of statism, a tool to be used in the interest of creating a more democratic social order. Look to Washington, D.C., today to see what this has wrought: a city bursting at the seams with lobbyists, industry organizations, tax lawyers and political advocacy groups. Any tall building will have a group with the word “tax” in its title, all working to shape policy and regulations. Yet despite our best efforts, we have become addicted to spending more than our revenue and simply saying “charge it.”

I suspect even Mr. Ruml would be surprised about the success of our “buy-now-pay-later” system that so closely resembles his Macy’s secret sauce.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Let’s Not Forget What Colonists Created 400 Years Ago

This 1976 Gold Bicentennial Medal, graded PR64 NGC, sold for $23,500 at a July 2017 Heritage auction.

By Jim O’Neal

We live on a landmass currently called North America that is relatively young in its present incarnation, with most estimates in the range of 200 million years. In earlier periods, a large portion of it was called Laurentia, which drifted across the equator, joining and separating from supercontinents in the various collisions that shaped it. Some of its layers of sedimentary rock are up to 2 billion years old. Eventually, it was surrounded by ocean and passive margins (i.e., edges that sit away from active tectonic plate boundaries). Then an island chain slammed into it, raising mountain ranges. For perspective, the Appalachians were once as tall and majestic as today’s Himalayas.

For tens of millions of years, there was not a single human being in North America – for most of that span, mankind had not yet evolved anywhere, and later the continent lay under a thick sheet of ice. As the Ice Age ebbed, adventurous souls began walking across land bridges as glacial movements changed the landscape. There were multiple migrations in and out of other areas of the world, but the people who moved into the Americas were generally on a one-way ticket. Even in modern times, there is no consensus on who “discovered” America first.

North American exploration spans an entire millennium, from the Vikings in Newfoundland circa 1000 A.D. through England’s colonization of the Atlantic Coast in the 17th century. Spain and Portugal squabbled over the discoveries of explorers like Juan Ponce de León and Vasco da Gama, while France and the Netherlands had their own claims to litigate. But our America is really a British story, starting with Jamestown, Va. (1607), which gradually grew into 13 colonies. The colonists grew tired of the English yoke, declared independence in 1776 and defeated the British Army in a well-known story of revolution. They formed a somewhat imperfect union called the United States of America, with a constitution and a smallish national government that is still struggling over the line between states’ rights and federal power.

French sculptor Frédéric Auguste Bartholdi attended the 1876 Centennial Exposition in Philadelphia to celebrate the 100th anniversary of the signing of the Declaration of Independence. Bartholdi had a strong personal passion for the concepts of independence, liberty and self-determination. He became a member of the Franco-American Union organization and suggested a massive statue to commemorate the American Revolution and a century of friendship between our two countries. A national fundraising campaign was launched that included traditional contributions, as well as fundraising auctions, lotteries and even boxing exhibitions.

Bartholdi collaborated with engineer Gustave Eiffel to build a copper and iron statue that, together with its pedestal, rises 305 feet; after completion, it was disassembled for shipment to the United States. Finally, on June 17, 1885, the dismantled statue – 350 individual pieces in 200-plus cases – arrived in New York Harbor. It was a fitting gift, emblematic of the friendship between the French and American people. It was formally dedicated the following year in a ceremony presided over by President Grover Cleveland, who said, “We will not forget that Liberty has here made her home; nor shall her chosen altar be neglected.” The statue was dubbed “Liberty Enlightening the World.”

In 1892, Ellis Island opened as America’s chief immigration station, and for the next six-plus decades the statue watched over more than 12 million immigrants who came seeking freedom and the “streets of gold” of New York City. A plaque inscribed with a sonnet titled “The New Colossus” was placed on an interior wall in 1903. It had been written 20 years earlier by the poet Emma Lazarus.

Gustave Eiffel was given little credit, despite having built virtually the whole interior of what would become the Statue of Liberty, and he vowed not to make that mistake again. Perhaps that is why his magnificent Paris landmark is simply an incredible skeleton framework, with none of the conventional sheathing of most tall structures of that era.

One thing is certain: We may not know who discovered America first, but there is little doubt that whoever follows us will be aware of what those few people huddled along the East Coast 400 years ago were able to accomplish. Maybe Elon Musk will have a colony on Mars that is still functioning when the ice or oceans envelop Earth again.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Do We Risk Forgetting the Past … Again?

An illustration of Jimmy Carter and Gerald Ford, dated 1977 and attributed to Al Hirschfeld, sold for $4,500 at an October 2015 auction.

By Jim O’Neal

People of my generation recall the 1970s as a decade of chronic financial instability. A lethal combination of rising inflation, slower growth and unpredictable economic policies produced a level of volatility that made the stock market a tricky place to navigate. Although the Dow Jones Industrial Average closed near 1,000 in 1966, it went sideways for the next 17 years. 1972 produced a boomlet in “Nifty Fifty” stock prices that was followed by a steep decline. By spring 1980, the Dow was back below 800.

Risk-averse investors piled into money market funds (MMFs) with high yields and low risk. Ross Perot supposedly bought $1 billion of 30-year Treasury bonds and locked in a 15 percent yield. Others chose to speculate in commodities or precious metals as a hedge against the pernicious effects of high inflation. President Ford waged a war on inflation with his WIN (Whip Inflation Now) program, which was more slogan than tangible financial policy. Cash was something to convert into tangible assets before it lost its buying power.

One prominent example in 1978 was the wife of the governor of Arkansas. The future first lady turned a modest bankroll of $1,000 into $100,000 in 10 short months by trading cattle futures, soybeans and live hogs. She credited her market prowess to reading The Wall Street Journal. Perhaps even more remarkable, her trades were mostly “shorts” at a time when cattle prices doubled.

But commodities generally were on the rise, and after the Soviets invaded Afghanistan in 1979, the price of gold rose to $875 an ounce. Nelson Bunker Hunt and his brothers tried to corner the silver market, buying control of 200 million ounces – equivalent to 50 percent of the world’s supply. In the process, silver prices shot up tenfold to $50. The Commodity Exchange (COMEX) and the Federal Reserve stepped in and changed the rules, and the price quickly plummeted to $10 by March 1980. Despite losing over a billion dollars, the Hunts seemed mildly amused and still turned up at Johnny’s BBQ for the usual. Later, they were forced into bankruptcy, but a lot of silverware in Dallas homes got melted down, along with jewelry, teapots and other silver-based objects.

I lost a $20 gold coin when gold was at $430 by betting it would fall to $400 before it hit $500. I had won the coin on a different bet, by knowing that a horse has to run 3 15/16 miles to win the Triple Crown. The Wall Street Journal was not involved in either case.
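For anyone who wants to check that bar-bet arithmetic, the three race distances (at their standard modern lengths) do add up – a quick verification in Python:

```python
from fractions import Fraction

# Standard distances of the three Triple Crown races, in miles.
kentucky_derby = Fraction(5, 4)   # 1 1/4 miles
preakness = Fraction(19, 16)      # 1 3/16 miles
belmont = Fraction(3, 2)          # 1 1/2 miles

print(kentucky_derby + preakness + belmont)  # 63/16, i.e. 3 15/16 miles
```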

Then the 1980s gave way to the rise of the professional market trader, after several leading investment banks went public – transforming cautious partners risking their own limited capital into anonymous shareholders with large capital resources. “Proprietary trading” produced quick profits and large bonuses that offset the elimination of fixed commissions by the NYSE. The flashy trader became a symbol of Wall Street – “Masters of the Universe,” as chronicled in Tom Wolfe’s The Bonfire of the Vanities. It was now the era of greed, and it became an international phenomenon as deregulation and globalization exploded.

Capital whirled around the globe in 24-hour trading, and the remnants of Depression-era conservatism quietly vanished. Debt was now viewed as a tax-efficient way to finance corporate takeovers, and deregulation replaced supervision. Hedge funds and private partnerships like George Soros’ Quantum Fund proliferated, generating 25 percent returns with highly leveraged bets on stocks, currencies and “risk arbitrage.” In summer 1982, the Federal Reserve reduced the discount rate, incentivizing leveraged buyouts (LBOs) of public companies.

Falling interest rates and rising stock prices created a perfect setting for “junk bonds,” and leverage became a strategy rather than a risk. Eventually the game came to rely on trading on illegal insider information. Corporate raiders had a field day until 1986, when Ivan Boesky was arrested and the action moved to the federal courts. Naturally, the virus spread into the huge home-mortgage market, and the savings-and-loan bubble collapsed.

It took a while for a new generation of greedy financiers to come along, and this time the leverage almost took down the world’s financial system in 2008.

Philosopher George Santayana was right: “Those who cannot remember the past are condemned to repeat it.” What’s in your wallet?

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Twain’s Era Marked America’s Emergence on the World Stage

An 1876 first edition, first printing of Mark Twain’s The Adventures of Tom Sawyer sold for $13,750 at an August 2015 Heritage auction.

By Jim O’Neal

American writer and satirist Mark Twain was born on Nov. 30, 1835 – exactly two weeks after Halley’s Comet made its appearance. In 1909, as recorded in his biography, he said: “I came in with Halley’s Comet in 1835. It is coming again next year and I expect to go out with it. It will be the greatest disappointment of my life if I don’t go out with Halley’s Comet. The Almighty has said, no doubt, ‘Now here are these two unaccountable freaks, they came in together, they must go out together.’” Twain died shortly after the comet returned.

Twain – real name Samuel Langhorne Clemens – wrote a novel with his friend Charles Dudley Warner titled The Gilded Age: A Tale of Today. It was the only novel Twain wrote with a collaborator, supposedly the result of a dare from their wives. Whatever the truth, the novel lent its name to the post-Civil War period, which has become widely known as the Gilded Age. The novel skewered that era of American history for its widespread corruption and the materialistic greed of a few at the expense of the downtrodden masses.


From a purely economic standpoint, the period of 1870-90 was when the United States became the dominant economy in the world. For the majority of recorded history, China and India were the global powerhouses, at times accounting for as much as 70 percent of world GDP by some estimates. Until about 200 years ago, economic output was largely driven by sheer population size. But with the industrial revolution, followed by the information revolution, the significance of huge populations declined. While Europe was going through its resurgence following the Dark Ages, the Asian superpowers were divided into smaller kingdoms fighting each other.

The factors contributing to post-Civil War growth were concentrated primarily in the North, where industrial expansion surged as the slave-labor system was abolished and cotton prices collapsed. New discoveries of coal in the Appalachian Mountains, oil in Pennsylvania and iron ore around Lake Superior fueled the growth of the country’s infrastructure. Railroad mileage more than tripled from 1860 to 1880, aided by the Transcontinental Railroad (1869), which linked remote areas – with their commercial farming, ranching and mining – to the large industrial hubs. London and Paris poured money into U.S. railroads, and American steel production surpassed that of Britain, Germany and France combined. Technology flourished, with 500,000 patents issued for new inventions, as Thomas Edison and Nikola Tesla electrified the industrial world.

Capital investment increased by 500 percent and capital formation doubled. By 1890, the United States had surpassed Britain in manufacturing output, and by the beginning of the 20th century, American per-capita income was double that of Germany or France and 50 percent higher than Britain’s.

Then, inexplicably, Europeans started a world war and, 20 years later, the European and Asian nations started another global conflict. The United States strategically entered both wars late, preserving its capital, military and human resources. Excluding a few ships here and there (e.g., Pearl Harbor), we kept 100 percent of our domestic infrastructure intact. And excluding 9/11, we have probably damaged more of our own cities in domestic protests and rioting than all foreign enemies combined have in acts of war.

As Pogo wisely observed, “We have met the enemy and he is us.”

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Usual Fireworks Expected with Latest Supreme Court Selection

This photograph, signed by Supreme Court Chief Justice William H. Taft and the eight associate justices, circa 1927, sold for $14,340 at a September 2011 Heritage auction.

By Jim O’Neal

It is that time again, when the news fills with predictions of pestilence, war, famine and death (the Four Horsemen of the Apocalypse) as President Trump tees up his next candidate for the Supreme Court. One side will talk about the reversal of Roe v. Wade as an example of the terrible future that lies ahead. The other side will be quick to point out that this fear-mongering first started in 1981, when Sandra Day O’Connor (the first woman to serve on the court) was nominated by President Reagan, and that nothing has happened in the intervening 37 years.

My prediction is that whoever is confirmed will have left no paper trail of opinions on “Roe” and will have been groomed by the “murder boards” to answer that it is settled law. Murder boards are groups of legal experts who rehearse the nominee on how to answer every possible question the Senate Judiciary Committee might ask – on any subject, not just Roe – in its advice-and-consent role. The result is what former Vice President Joe Biden, during his Senate years, described as a “Kabuki dance.”

The questioning does produce great public theater, but it is a tradition that dates only to 1925, when nominee Harlan Stone actually requested that he be allowed to answer questions about rumors of improper ties to Wall Street. It worked: he was confirmed by a vote of 71-6 and would later serve as Chief Justice (1941-46). In 1955, Southern senators questioned John Marshall Harlan II about his views on public school desegregation vis-à-vis Brown v. Board of Education; he, too, was confirmed, 71-11, and since then every nominee to the court has been questioned by the Senate Judiciary Committee. The apparent record is the 30 hours of grilling Judge Robert Bork endured in 1987, when he got “Borked” by trying to answer every single question honestly. Few make that mistake today.

Roe v. Wade was a 1973 case on whether a state could constitutionally make it a crime to perform an abortion except to save the mother’s life. Abortion had a long legal history dating to the 1820s, when anti-abortion statutes began to appear that resembled an 1803 British law making abortion illegal after “quickening” (the start of fetal movements), resting on rationales such as illegal sexual conduct, unsafe procedures and the state’s responsibility to protect prenatal life.

Criminalization accelerated from the 1860s, and by 1900, abortion was a felony in every state. Despite this, the practice continued to grow, and in 1921, Margaret Sanger founded the American Birth Control League. By the 1930s, licensed physicians were performing an estimated 800,000 procedures each year. In 1967, Colorado became the first state to decriminalize abortion in cases of rape, incest or permanent disability of the woman, and by 1972, 13 states had similar laws. In 1970, Hawaii had become the first state to legalize abortion at the request of the woman. So the legal situation prior to Roe was that abortion was illegal in 30 states and legal under certain conditions in the other 20.

“Jane Roe” was an unmarried pregnant woman who supposedly wished to terminate her pregnancy and brought an action in the U.S. District Court for the Northern District of Texas. A three-judge panel found the Texas criminal statutes unconstitutionally vague and held that the right to choose whether to have children was protected by the Ninth and Fourteenth Amendments. All parties appealed, and on Jan. 22, 1973, the Supreme Court ruled the Texas statute unconstitutional. The court declined to define when human life begins.

Jane Roe’s real name was Norma McCorvey. Before she died, she became a pro-life advocate and maintained that she never had the abortion and that she was the victim of two young, ambitious lawyers looking for a plaintiff. Henry Wade was district attorney of Dallas from 1951 to 1987, the longest-serving DA in United States history. He was also involved in the prosecution of Jack Ruby for killing Lee Harvey Oswald. After Ruby was convicted, he appealed and the verdict was overturned, but he died of lung cancer before he could be retried – and so is constitutionally presumed innocent.

Stay tuned for the fireworks.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

For North, Tariffs and Taxes to Fund War Gave Way to Printing Money

Series 1861 $10 Demand Notes were placed into circulation in 1862 and were among the first U.S. federal banknotes ever issued. This sample, graded PMG Very Fine 30 EPQ, sold for $381,875 at an August 2014 Heritage auction.

By Jim O’Neal

A follow-up to my previous post:

The North had a tough time raising money for the war as well. After the defeat at Bull Run, it suffered a new crisis: the collapse of the bond market. Under the Constitution, the U.S. House of Representatives is responsible for originating all revenue measures, and under pressure from Treasury Secretary Salmon P. Chase, it started considering legislation to raise taxes. The Ways and Means Committee started with tariffs, but a storm of criticism erupted, since the burden would fall on the poor, who needed tea, coffee, sugar and whiskey.

The next option was real estate via “direct taxes,” but members of Congress objected, noting that wealth in stocks and bonds was excluded, which meant the wealthy could escape paying any taxes quite easily. The more Congress debated the property tax, the louder the opposition became. U.S. Rep. Schuyler Colfax of Indiana (a future Republican vice president) said: “I cannot go home and tell my constituents I voted for a bill that would allow a man, a millionaire, who has put his entire property in stock, to be exempt from taxation, while a farmer who lives by his side must pay a tax!” Colfax proposed a tax on stocks, bonds, mortgages and interest on money – and the income earned from them. An income tax (inevitably).

U.S. Rep. Thomas Edwards of New Hampshire proposed calling the new tax something other than a direct tax: “Why should we not impose the burdens which are to fall on this country equally, in proportion to their ability to pay them?” An amendment was passed imposing a 3 percent tax on incomes over $600 per year. Someone quoted John Milton’s Paradise Lost, comparing the taxpayer to Adam and Eve, driven by necessity “from our untaxed garden, to rely upon the sweat of our brow for support.” An income tax it was.
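That levy is simple enough to express directly – a one-line sketch of the formula as described above:

```python
# The 1861 amendment in one line: 3 percent of income above a $600 exemption.
def civil_war_income_tax(income: float) -> float:
    return 0.03 * max(0.0, income - 600)

print(civil_war_income_tax(500))    # 0.0  -- below the exemption, no tax owed
print(civil_war_income_tax(1_000))  # 12.0 -- $12 on a $1,000 annual income
```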

Secretary Chase was skeptical. He doubted that merely labeling the income tax “indirect” would make it constitutional. More importantly, no provision had been made for a bureaucracy or enforcement mechanism – the income tax was not collectible. Since it was only a recommendation, he ignored it; he was far too busy with the need to borrow money for the war. As the banks were all reluctant to loan a shaky government any money, he turned to a young Philadelphia banker, Jay Cooke, who had a scheme to market the government’s debt to the public, with Cooke taking a sales commission.

They finally got a consortium of 39 banks to loan $150 million in gold, to be paid in three $50 million installments, for resale to private individuals. The first $50 million barely sold and the second round failed completely, which killed the scheme. By Dec. 30, 1861, the banks were so stressed they were forced to stop honoring gold payments to their other customers – almost tantamount to insolvency.

By the start of 1862, Chase realized he had grossly underestimated the costs of the war. His new estimate for year one was $530 million; the assumed revenues from taxes, tariffs and other schemes were falling short, and the Treasury’s funds were almost depleted. New taxes or loans could not possibly fill the gap in time. With no other alternatives available, Chase and President Lincoln overcame their misgivings and endorsed the idea of simply printing money – $50 million in green paper money that the government would simply declare valid legal tender, though not redeemable in gold or silver.

Then Congress passed the Legal Tender Act in February 1862, providing for $150 million in currency that became known as greenbacks – the first legal-tender paper money issued by the U.S. government … a practice that continues today, as the debt has exceeded $20 trillion and seems to be accelerating. I hope to be around to see how it ends.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].