Peter the Great Modernized Russia and Opened its Path to Power

An extremely rare mint state Peter I Rouble 1723 sold for $63,250 at a May 2008 Heritage auction.

By Jim O’Neal

When I was diagnosed with prostate cancer in 2001, it didn’t seem to faze me. After reading all the literature on this pernicious disease, I was convinced of two things: First, surgery was the best option and second, the skill of the surgeon was the critical variable to ensure a positive result. I had heard the finest surgeon for this kind of procedure was a doctor at Johns Hopkins named Patrick Walsh and I had a connection that got me an appointment in June of that year.

After waiting in line behind the governor of Connecticut and the king of Spain, my operation was scheduled for Sept. 5 (it was a long line and Walsh performed only four procedures a week). Strictly by chance, I watched 9/11 unfold on CNN while recuperating in a rental recliner in a Baltimore hotel room.

PepsiCo made a healthy donation to a special research fund and invited Dr. Walsh to be a guest speaker at a boondoggle in 2003 in St. Petersburg, Russia, that coincided with the city’s 300-year anniversary. Pepsi-Cola had been the first western brand sold in the USSR (1972) since Chairman Don Kendall had a theory that trade was a better alternative than nuclear war. Since the Russians were short of hard currency, we traded Pepsi concentrate for Stolichnaya vodka. By the time I got to Europe, there were 26 Pepsi-Cola bottling plants in Russia.

St. Petersburg was always intriguing to me since it had been founded by Peter the Great in 1703. Peter (1672-1725) became ruler of Russia in 1682 (yes, he was 10 years old), at first jointly with his half-brother as co-Tsar and his mother as regent. In 1696, he became sole ruler of a vast empire. Seven years later, he founded St. Petersburg on the estuary of the River Neva and this new city, fortress and port by the Baltic Sea gave Russia direct access to Europe. This opened new opportunities for trade and military conquest, so Peter boldly made his new city Russia’s capital, stripping the title from the ancient seat of Moscow.

An admirer of Western palaces, Peter employed European architects to design the government buildings, palaces, houses and university in the fashionable baroque style. Labor was no problem with 30,000 peasants, Russian convicts and Swedish prisoners of war available for the construction gangs. More than 100,000 died, but those who survived could earn their freedom.

Peter proceeded to use his unchallenged power to make significant changes in Russia, founding the Russian navy and reforming the army along European lines while developing new iron and munitions industries to equip them. By 1725, Russia had a first-rate army of 130,000 men. His court was also transformed, adopting French-style dress, while new schools obliged the nobility to educate their children and promotion came to be based on merit. However, he treated rebels ruthlessly and adopted an aggressive foreign policy that gave him control of the Baltic Sea.

Although Peter wisely forged diplomatic ties with Western Europe, he failed to form an alliance against the Ottomans. His enlightened reforms established him as a powerful emperor of a vast empire and monarchy that survived until the bloody Russian Revolution in 1917.

As Peter reportedly declared, “I built St. Petersburg as a window to let in the light of Europe!” Not a bad legacy and certainly superior to what has occurred in the past 100 years.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Robert Morris Deserves His Place on This $1,000 Bill

The rare 1863 $1000 legal tender note featuring Robert Morris could be the most attractive bill ever printed.

By Jim O’Neal

Imagine a situation where you write a terrific biography that is nominated for prestigious awards and stays on The New York Times best-seller list for three months. Not bad. But then, imagine the elation when 10 years later, it is turned into a Tony Award-winning musical and you are part of the team that created it. That actually happened to author Ron Chernow with his book about Alexander Hamilton.

Even more remarkable is that Hamilton is still the hottest ticket in town three years later, and playwright Lin-Manuel Miranda has racked up a Pulitzer Prize, three Tony Awards, two Grammys, an Emmy, and will be honored on the Hollywood Walk of Fame in 2018. The only thing left would be a film and, not surprisingly, Hamilton the movie is already in development.

Many people now know that Alexander Hamilton was the first Secretary of the Treasury of the United States. However, he was not George Washington’s first choice when the president was forming his Cabinet. That distinction goes to another of the true Founding Fathers: Robert Morris Jr. (1734-1806), whose name has gradually faded from view. That was certainly not the case in 1775, when he was believed to be the richest man in America.

President Washington offered him the position primarily because he had been the first Superintendent of Finance of the United States (1781-84), but Morris recommended Hamilton since they shared similar views, including the idea of creating a national bank. Besides, next to Washington, Morris was already considered the most powerful man in America.

After migrating to America from England as a teenager, Morris became a partner (at age 24) of Thomas Willing when they formed the banking-shipping firm Willing, Morris & Co. This dual role allowed them to self-finance their trading activities, which included slaves. However, disputes over tariffs and taxes like the Stamp Act inevitably drew Morris into politics and eventually the war for independence from England. Robert Morris and Roger Sherman of Connecticut are the only two people who signed the Declaration of Independence, the Articles of Confederation and the United States Constitution.

Robert Morris is also credited with being one of the founders of the financial system for the United States, along with Hamilton and Albert Gallatin, who was Treasury Secretary for Jefferson and Madison from 1801 to 1814, the longest tenure in this office in history. Morris used his great wealth and financial acumen to support Continental troops under Washington when the country was broke. The dome in the U.S. Capitol Building has a fresco painting (The Apotheosis of Washington) that includes a scene with Mercury, the Roman god of commerce, handing Morris a bag of gold to commemorate his service as “the Financier of the American Revolution.”

Twenty years ago, I had the pleasure of viewing Robert Morris on a $1,000 bill when Frank Levitan sold his wonderful collection of United States paper currency. It’s my personal choice for the most attractive bill ever printed and is ultra-rare (only two are known to exist). Lot #104 sold for $451,000 – a staggering amount at the time, but a fraction of the price it would bring today.

Thank you, Mr. Morris. I won’t forget.

Link Between Value of Money and Gold a Quaint Relic of the Past

This Serial Number 1 Stephen Decatur $20 1878 Silver Certificate, Fr. 306b, is believed to be the first silver certificate ever produced. It sold for $175,375 at a May 2005 Heritage auction.

By Jim O’Neal

In 1961, I was a member of a high-powered bowling team that competed on Tuesday nights at the South Gate Bowling Center in Southern California. We all had 200-plus averages, but we managed to win only one league championship in the four years we were together. In February, one of my teammates, Carl Belcher, bowled a perfect game (12 strikes) and received 250 silver dollars from a promotional gimmick the center used to attract customers. Nobody paid much attention, and I personally thought it was an unnecessary inconvenience to lug the sacks to a local bank to get rid of them.

Most of the silver dollars in circulation were probably in Nevada since all the Reno and Las Vegas casino slot machines used them instead of tokens. Even paper currency was printed with the promise to “pay to the bearer on demand … one silver dollar,” which evolved into “one dollar in silver.” For a while, it was possible to get a small plastic bag of silver equivalent to the denomination of the paper currency.

Silver certificates were authorized by two Acts of Congress: the first on Feb. 28, 1878, followed by another on Aug. 9, 1886. These notes are particularly attractive, quite rare and sometimes expensive. At one time, I owned an especially distinguished $20 bill bearing the head of Captain Stephen Decatur, naval hero of the War of 1812. It was serial number 1, and experts believe that since the Treasury generally printed the $20s first, this note was probably the first silver certificate ever printed. Heritage Auctions sold it for $175,375 in 2005 when I sold my currency collection.

However, after Executive Order 6102 of 1933, there were no more gold coins or silver dollars minted in the United States and paper notes were used for denominations above 50 cents. Up to 1964, dimes, quarters and half dollars were minted in 90 percent silver, and half dollars contained 40 percent silver from 1965-70. Even the lowly penny had most of its copper content removed and is now made primarily of zinc, with a thin copper plating.

For 4,000 years, civilization has based its currency on metal, especially gold and silver; the only exception is the past 46 years. On Aug. 15, 1971 (“a date that has lived in infamy”), President Richard Nixon announced the temporary suspension of the dollar’s convertibility into gold. The White House tapes from the previous week reveal that he thought gold prices would explode once the dollar was de-linked, since the Federal Reserve would print money like crazy once the currency was not collateralized, and this overprinting would affect jobs (unemployment had just gone from 4 percent to 6 percent). And Nixon was “not about to be a hero” (his words) on inflation at the expense of employment.

Then the administration imposed a rigorous regime of wage and price controls, enforced by IRS audits and leverage over federal contracts. The plan failed spectacularly and the 1970s were rife with double-digit inflation, energy shortages and ultimately the “stagflation” that torpedoed both the Ford and Carter presidencies.

Flash forward to today: we are still trying to use monetary policy to solve economic issues, while remaining unwilling even to touch the critical fiscal issues that are fundamental to the future economic challenges everyone acknowledges. The only thing that has changed is that there is no need to actually print money when it can be “whistled into existence” via the monetary legerdemain called quantitative easing, in which the Federal Reserve buys Treasury securities with newly created money.

Since the financial crisis of 2008, the world’s central bankers have materialized $12.25 trillion by tapping on a computer keyboard. For perspective, the value of all the gold that’s ever been mined, according to the World Gold Council, is a mere $7.4 trillion. The historical linkage between the value of our money and its metal content is a quaint relic of the past.

Fillmore Often Makes the ‘Forgettable Presidents’ Club

Millard Fillmore appears on the lower right corner of this Union Bank of Missouri $100 Color Proof. It realized $61,687.50 at an October 2015 Heritage auction.

By Jim O’Neal

Millard Fillmore, the 13th president, was the last not affiliated with either the Democratic or Republican parties. Born in a log cabin, he developed slowly, since he did not read well, and was apprenticed when he was 14 years old. After several years, he bought out his indenture for $30, but he never saw a map of the United States until he was 19.

However, he learned to love books and spent a lot of time just reading.

Later, his entry into politics came through the New York State Assembly as an Anti-Masonic candidate. Eventually, he made it into the U.S. House by following Whig Party policies. He even made a run at being the Whig Party VP candidate in 1844, but finished a weak third. Then, to top it off, he was defeated for governor of New York that same year.

It looked like his career had peaked.

However, his luck changed in 1848 when the Whigs picked General Zachary Taylor to run for president. Taylor was a slaveholder from Louisiana, had never run for office, and had never even voted.

Taylor and Fillmore had also never met, but the Whigs hoped Fillmore would help balance the ticket … a strategy that worked!

Vice President Fillmore was largely ignored when the administration finally took office. That is, until President Taylor died unexpectedly and Fillmore was thrust into the presidency.

Alas, he gradually lost the support of the Whig Party and was unable to generate much support for reelection. One major cause was signing and then enforcing the proslavery Fugitive Slave Law, which alienated Northern Whigs.

During the 1852 convention, Fillmore made a valiant effort, but on the 53rd ballot, Winfield Scott finally prevailed as the Whig Party candidate. Scott would go on to lose the general election to Democrat Franklin Pierce.

In 1856, the American Party (“Know Nothings”) convinced Fillmore to make another run for the presidency; he won a single state. Curiously, many historians argue that Fillmore was never an actual American Party member, never attended a single meeting, and was even out of the country when all this happened.

All of this is true, but they overlook the fact that he did mail a letter affirming his acceptance of the nomination. So, I say he was an official candidate despite the unusual circumstances and the rather obvious lack of any real interest.

Fillmore often makes the “Forgettable Presidents” club … but we remember him because he was the first president to turn down an Honorary Degree … a Doctor of Civil Law from Oxford. His reason was a little hokey (he could not read or understand it since it was in Latin), but that only makes him more qualified for our club.

Julius Caesar Still Influencing Culture 2,000 Years Later

Many Romans in 44 B.C. must have been stunned to see the image of Julius Caesar stamped on newly issued silver denarii. This example sold for $57,500 at a September 2011 Heritage auction.

By Jim O’Neal

Rome, “The Eternal City,” began as a cluster of small villages on seven hills by the River Tiber and grew into a city-state. According to legend, it was first ruled by kings, who were overthrown before it became a republic. A new constitution allowed the election of two consuls to run the state. Their terms were limited to one year, and the office of king was prohibited.

Rome became remarkably successful between 500 and 300 B.C., extending its power through conquest and diplomacy until it encompassed the whole of Italy. By 120 B.C., Rome dominated parts of North Africa, the Iberian Peninsula, Greece and Southern France. The conquered territories were organized into provinces ruled by short-term governors who maintained order and ensured the collection of taxes.

By the 1st century B.C., Rome was a Mediterranean superpower, yet its long tradition of collective government, in which no individual could gain much control, was challenged by the personal ambitions of a few immensely powerful military men. A series of civil wars and unrest culminated in the dictatorship of Julius Caesar, a brilliant general and statesman.

Gaius Julius Caesar was born in Rome in 100 B.C. to a family of distinguished ancestry. From an early age, he grasped that money was the key to power in a political system that had become hopelessly corrupt. He also learned that forging a network of alliances and patronage would be crucial to his success.

After serving in the war to crush the slave revolt led by Spartacus, Caesar returned to Rome in 60 B.C. and spent vast sums of money buying influence and positions. Eventually, he teamed up with two other powerful Romans, Crassus and Pompey, to form the First Triumvirate. Caesar then became consul and, soon after, governor of Gaul, which gave him a springboard to true military glory.

Over the next eight years, he conquered Gaul, bringing the whole of France, parts of Germany, and Belgium under his personal rule. Buoyed by his achievements, he then tried to dictate the terms for returning to Rome. Roman laws required military leaders to relinquish control of their armies before returning to Rome, a prerequisite for running for public office.

When Caesar refused, the Roman Senate declared him hostis (public enemy) and then came the unthinkable: He decided to march his army on Rome! En route, he paused at the border between the Gallic provinces and Italy proper … a small river called the Rubicon. Acutely aware that crossing that river would constitute a declaration of war, he announced “alea iacta est” (the die is cast) and led his army forward, telling them, “Even yet we may draw back, but once across that little bridge, and the whole issue is with the sword.”

“Crossing the Rubicon” is still in vogue today and represents making a difficult decision that cannot be reversed once taken.

Obviously, Caesar won the ensuing civil war, but soon a conspiracy developed with 60 senators planning to assassinate him on March 15, 44 B.C. (the infamous “Ides of March”). What is curious is that even after more than 2,000 years, we find Caesar references so often. The latest is the flap over a play in NYC’s Central Park, Julius Caesar, in which the title character bears a not-so-subtle resemblance to President Trump, with The New York Times questioning whether he can survive living in Caesar’s Palace.

Et tu, Brute?

Bitter Enemies United Forever on Currency

This 1861 Confederate States of America $1000 Montgomery Note, featuring John Calhoun and Andrew Jackson, sold for $76,375 at an October 2015 Heritage auction.

By Jim O’Neal

John Caldwell Calhoun served his full four years as vice president under John Quincy Adams, but the year was now 1828 and he needed to make a decision about his political future.

He previously had been a member of the House of Representatives (1811-17) and Secretary of War (1817-25). (He was later Secretary of State, and a U.S. Senator.)

He finally decided to run for the vice presidency again. But, in a twist, he decided to switch horses and run with Andrew Jackson rather than JQA. It seemed like a prudent choice at the time, and he and Jackson easily won the 1828 election. Then they started trying to work together.

They differed on so many fundamental issues, including states’ rights and nullification, that a schism seemed inevitable. Then, to make tensions even worse, Calhoun’s wife, Floride, started meddling in White House politics … and Jackson’s famous temper was riled up. He even threatened to just grab Calhoun and hang him (another duel would apparently have been unseemly).

The end was much less dramatic, as Jackson simply picked Martin Van Buren to be his running mate in the 1832 presidential election. When they won, Calhoun resigned.

Calhoun would remain the only vice president to resign until Spiro Agnew joined the club.

On March 9, 1861, the Confederate States of America issued a $1,000 banknote depicting both Calhoun and Jackson. So the two bitter enemies remain joined for eternity.

It’s Unlikely a Wall Will Solve the Immigration Issue

The Great Wall appears on this China/People’s Republic 200 Yuan, 1949.

By Jim O’Neal

One of the Great Walls of China dates to 200 B.C., intended as a protective barrier for its inhabitants rather than a way to restrict the movement of people. The concept surfaced again in the Ming Dynasty era in the 14th century. The Chinese have always preferred a closed societal culture … until the 20th century, when they discovered the advantages of low-cost labor to produce goods for export.

Converting a strategic asset like low-cost labor into capital for economic development becomes more feasible with every technological improvement. The competitive advantage among nations has evolved into a “flat world” (Tom Friedman’s phrase), which economists typically call globalization.

However, we still have nations like Japan, which strongly prefers a monoculture (similar to a beehive) and relegates foreigners to service roles. As an island nation with a strong navy, Japan can implement immigration policies that are actually enforceable. The rub is that birth rates are so low and the population so rapidly aging that the economy has been stagnant for 20-plus years.

Europeans who migrated here in the early 17th century did not have to worry about physical barriers to entry. Their challenges were primarily crossing the dangerous Atlantic Ocean and then surviving in a new, uncivilized land. The Pilgrims (English Separatists) sailed from Plymouth, England, on the Mayflower in 1620. They were joined by Puritans, who established the Massachusetts Bay Colony. Many were seeking religious freedom, fame and fortune, or simply personal freedom.

When the U.S. Constitution was adopted on Sept. 17, 1787, it expressly gave Congress the power to establish a uniform rule of naturalization. In 1790, Congress passed the Naturalization Act, which enabled those with two years’ residency to apply for citizenship. However, it was restricted to “free white people” of good moral character. In 1795, it was modified to require five years’ residency and a three-year notice of intent. A 1798 Act increased residency to 14 years and the notice of intent to five years.

In 1802, Congress passed a new Naturalization Law that retained the “free white” restriction, reduced the notice of intent to three years and residency to five, extended citizenship to resident children and children born abroad, and barred former British soldiers from citizenship.

This was the last major naturalization act of the 19th century, except in 1870, when citizenship was opened to African-Americans. In the 20th century, the game changed: the U.S. focus shifted from legislation that regulated immigration to legislation that restricted it. The Immigration Act of 1917 included literacy tests and barred immigration from the Asia-Pacific “Barred Zone,” which covered much of Asia and the Pacific Islands. It also dramatically increased the list of “undesirables” – alcoholics, “idiots,” those with contagious diseases, and political radicals.

The Immigration Act of 1924 severely restricted the immigration of Africans and banned the immigration of Arabs and Asians. In 1986, the Immigration Reform and Control Act was designed to be the definitive fix: Millions of undocumented workers got a path to citizenship in return for “border security.” It was never fully implemented.

Build a wall? Sure, no problem. Solve the issue? Two hundred years of American history says probably not.

‘We Can Never Know Enough About the American Revolution’

The 1998-S Crispus Attucks $1 was struck to commemorate the 275th anniversary of the birth of Attucks and to honor the nation’s Black Patriots.

By Jim O’Neal

A friend of mine, NBA Hall of Fame player Oscar Robertson, gained fame in 1955 by leading Crispus Attucks High School to the Indiana state championship, making it the first all-black school in the nation to win a state title. In 1956, Oscar and his teammates won the state championship again, and this time they became the first Indiana high school to complete a season undefeated.

Crispus Attucks was the first person killed in the Boston Massacre (March 5, 1770) and many consider the former slave the first casualty of the American Revolution. In the 1850s, he became a martyr for the abolitionist movement. His probable mixed-race heritage – African and American Indian – allowed both African Americans and Native Americans to leverage his fame in their struggles for justice.

Despite the many eyewitness accounts, scholarly research and dozens of highly acclaimed books, this period is filled with alternate versions and is a continuing source of debate and uncertainty.

A common denominator in many of the high-profile events of the era is the city of Boston, with the Stamp Act of 1765 being a convenient place to start. This was an egregious act of the British Parliament putting a tax on all printed matter – newspapers, books, playing cards and legal documents. It aroused a storm of protest in all the colonies, with Boston’s reaction particularly violent. A Stamp Act administrator was burned in effigy and a mob ransacked the governor’s mansion.

Parliament repealed the Stamp Act, but insisted they maintained the right to pass laws regulating all trade and issuing new taxes at will. This caused protests that were even more violent and into this highly volatile situation, Britain landed 4,000 troops in Boston, strictly “in anticipation of a crisis.”

By 1770, Boston was in an economic decline and the population of 15,000 was smaller than 30 years earlier in 1740. There was continual competition for scarce resources and tensions between British troops and citizens continued to increase. Finally, an argument over payment for a haircut escalated into an angry mob that challenged troops stationed at the Customs House.

The people taunted the soldiers with “Fire! Fire! Fire! We dare you to fire!” At some point, an order was given and the soldiers shot into the crowd. Five people were killed and several others wounded. The next day, British Captain Thomas Preston and a small group of soldiers were arrested and taken to the Queen Street jail to await trial. Future President John Adams and Josiah Quincy Jr. agreed to serve as their lawyers. A little-known fact is that four citizens were also accused of shooting into the crowd; they were found not guilty, along with all but two of the British soldiers.

Then came the famous Boston Tea Party (1773), when colonists dressed as Indians destroyed 342 chests of tea on three ships in Boston Harbor after the British Parliament levied taxes on tea and granted a monopoly to the British East India Company. All the elements were in place for a war, and it would last for eight years.

The 35 years from 1765 to 1800 are some of the most interesting times in American history and will continue to attract scholarly research and an unending parade of books. However, few have the insight of Pulitzer Prize-winning historian David McCullough, who has said, “We can never know enough about the American Revolution if we want to understand who we are, why we are the way we are, and why we’ve accomplished what we’ve been able to accomplish that no other country has.”

I agree.

After World War II, America Immediately Faced Challenges in China, Russia

Taiwan struck a gold 2000 Yuan coin in Year 55 (1966) to commemorate the 80th birthday of Chiang Kai-shek.

By Jim O’Neal

Chiang Kai-shek joined the Chinese Nationalist Party in 1918 and, after the death of founder Sun Yat-sen in 1925, succeeded him as leader. He expelled Chinese Communists from the party and led a successful reunification of China. When the Allies declared war on Japan in 1941, China took its place among the Allied nations.

Chiang may have been an ally of the United States, but he presided over a corrupt society made ungovernable by China’s decade-long occupation at the hands of the Japanese and by the growing strength of communist revolutionary Mao Zedong. Inflation was rampant, as was starvation, but Chiang’s police crushed opposition and no amount of American pressure could dissuade him.

In 1946, George Marshall made a valiant effort to broker a power-sharing arrangement between Chiang and Mao, but it proved futile. As the Cold War advanced, Americans increasingly saw their own security as tied to supporting the anti-communists. Then the Communist Revolution created an ardent hatred of all things American, followed by more bad news in September 1949: As the last of the Chinese Nationalists fled to Formosa (now Taiwan), a squadron of USAF B-29s detected traces of radioactive material over the North Pacific. This provided irrefutable evidence that the Soviet Union had successfully exploded its first atomic bomb.

Americans were disillusioned. This was not the way things were supposed to go. Right was supposed to triumph over wrong, freedom over oppression, God over the godless. Hadn’t the Allies just finished proving this on the beaches of Normandy and in the vast waters of the Pacific? And hadn’t the gods determined that Americans alone should possess the atomic secrets to keep the forces of evil in check?

Mao’s victory and Joseph Stalin’s bomb forced a reconsideration of plans for occupied Japan, for now the line between East and West had to be drawn even more firmly, and every American decision had to be viewed through the prism of the Cold War. The initial strategy, as it had been for occupied Germany, had been to halt Japan’s capacity for future aggression, to disarm the former enemy and slowly introduce democracy. But, just as the Russian actions in Eastern Europe had changed the pace of reeducation in West Germany, the victory of the Chinese Communists made it essential that Japan be immediately strengthened to resist the spread of the Red Tide in Asia.

General Douglas MacArthur, the supreme commander of occupied Japan, had personally overseen the new Japanese constitution, which declared that “land, sea and air forces, as well as other war potential, will never be maintained,” ruling out any military industry. Within a few years, that ban was effectively circumvented with the creation of a “self-defense force” of 75,000.

Today, as North Korean nuclear threats continue to grow, there are discussions about Japan assuming total responsibility for their own defense, including the possibility of a nuclear deterrence, something that many believe could be viable in a matter of months.

We seem to be incapable of eradicating or even mitigating war capabilities. Maybe there is just too much profit potential involved.

Clash of Democracy and Oligarchy Dates to Ancient Times

This double-daric gold coin, provisionally dated to the years Alexander the Great was King of Persia, sold for $70,500 at a September 2013 Heritage auction.

“Dictatorship naturally arises out of democracy and the most aggravated form of tyranny and slavery out of the most extreme liberty.” – Plato

By Jim O’Neal

During the Peloponnesian War (431-404 B.C.), Athens was ultimately defeated by the Spartans, and Athenian democracy was twice suspended. In 411 and 404 B.C., Athenian oligarchs claimed that Athens’ weak position was due to democracy and led counter-revolutions to replace democratic rule with an extreme oligarchy. In both cases, democratic rule was restored within a year.

Democracy flourished for the next eight decades. However, after the Macedonian conquest of Greece under Philip II and his son Alexander (later Alexander the Great), Athenian democracy was abolished. It was intermittently restored during the Hellenistic age, but the Roman conquest of Greece in 146 B.C. effectively killed it off.

Although democratic rule had been quashed, Athenian science and philosophy lived on. The renown and influence of Plato and Aristotle endured through the ages that followed and much of their work continues to influence Western thought to this day.

It is ironic that Aristotle tutored Alexander the Great from age 16, since throughout antiquity Alexander was widely viewed as the most remarkable man who ever lived. When his father was assassinated in 336 B.C., he secured the Macedonian throne by destroying his rivals, forced the Greek city-states to accept his authority, and in 334 B.C. marched into Asia Minor (modern-day Turkey) at the head of an army of 43,000 foot soldiers and a cavalry of 5,500. At its heart lay the Macedonian phalanx, a well-drilled corps of 15,000 men armed with the sarissa, a double-pointed pike up to 23 feet long. They were simply invincible.

He then defeated the Persian emperors, subdued Greece, drove his troops across mountains, deserts and rivers into Afghanistan, Central Asia and on to the Indian Punjab, ruthlessly crushing all resistance. Alexander was now king of a vast and ethnically diverse empire that included 70 newly founded cities. It is said that he sat down and cried when he ran out of new places to conquer. He died in 323 B.C., having been history’s most successful military commander.

Not bad for a 32-year-old.
