Cotton Gin Extended America’s Abhorrent Practice of Slavery

The 1796 patent signed by George Washington for “new machinery called the Cotton Gin” realized $179,250 at a May 2011 Heritage auction.

By Jim O’Neal

In 1776, Scottish economist, philosopher and teacher Adam Smith wrote The Wealth of Nations, a book that helped create a new understanding of modern economics. A pervasive theme was the idea that any economic system could be automatic and self-regulating if it was not burdened by monopolies or artificial trade barriers. This theory has become widely known as “the invisible hand.” It heavily influenced my favorite economist, Milton Friedman, and his basic Free to Choose philosophy.

One highly topical insight was that slavery was not economically viable and contributed to inefficient markets. Aside from the obvious moral issue, Smith believed slave owners would benefit by switching to a wage-labor model, since hiring workers was far less expensive than owning them. Slaves carried the ongoing costs of feeding, housing and care, and with a high mortality rate, they eventually had to be replaced.

In the United States, there was also a major disconnect between the concept of all men being created equal and the cruel practice of slavery, which was especially prevalent in the agrarian states of the South. Although many sincerely believed that slavery would gradually die out, powerful Southern states demanded assurances before they would agree to the new federal Constitution. Article 1, Section 9 of the Constitution barred any attempt to outlaw the slave trade before 1808. Other provisions prohibited states from freeing slaves who had fled from other states, and further required them to return “chattel property” (slaves) to their owners. Kicking the issue down the road 20 years enabled the delegates to reach a consensus.

Historian James Oliver Horton wrote about the power slaveholder politicians had over Congress and the influence commodity crops had on the politics and economy of the entire country. A remarkable statistic: in the 72 years between the elections of George Washington (1788) and Abraham Lincoln (1860), the president of the United States was a slaveholder for 50 of those years, as was every single two-term president.

The passage in 1807 of the Act Prohibiting Importation of Slaves in America, and of the Slave Trade Act in Great Britain, marked a radical shift in Western thinking. Even as late as the 1780s, the trade in slaves was still regarded as natural economic activity. Both U.S. and European colonies in the Caribbean depended on slave labor, which was relatively easily obtained in West Africa.

However, it was really the invention of the cotton gin by Eli Whitney in 1793 that dramatically extended the abhorrent practice of slavery. Cotton was suddenly transformed from a labor-intensive, low-margin commodity with limited demand into a highly lucrative crop. Production in Southern states exploded as demand skyrocketed. The number of slaves grew concurrently, from 700,000 in 1790 to 3.2 million by 1850. The United States quickly grew into the largest supplier in the world and snagged 80 percent of the market in Great Britain, whose appetite seemed insatiable.

As an economist, Adam Smith was undoubtedly right about hiring workers versus owning them, but everybody was too busy getting rich to worry about optimizing labor costs. And the more abolitionists in the industrializing North denounced slavery, the more determined the Southern states became to retain it. It would take a bloody four-year Civil War and 630,000 casualties to settle the matter.

Harry Truman once explained why he preferred one-armed economists: It was because they couldn’t say “On the other hand…”

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

McCarthy Exploited Vulnerabilities of Frightened Public by Simplifying Complex Issues

A copy of Joseph McCarthy’s McCarthyism: The Fight for America, 1952, signed by the senator, sold for $206.25 at an October 2013 auction.

By Jim O’Neal

It’s rather interesting to compare the 1930s with the late 1940s, and the transition away from the era of the New Deal, when liberal ideas were ascendant and communism, while not popular, was hardly the abhorrent demon it would become.

To Whittaker Chambers (whose 1952 book Witness became a bestseller) and many other Americans, communism was more than a system of government. It had morphed into a campaign for control of the mind and the masses.

Too many Americans seemed to have fallen victim to the “Soviet Experiment” and were infatuated by its promise of egalitarianism, while ignoring the crimes of its authoritarian leadership. Chambers was a gifted intellectual writer, but the anti-communists were to find their most vocal champion by accident. And he was a buffoon.

Joseph McCarthy

Senator Joseph McCarthy of Wisconsin was a hard-drinking, coarse man who later said he knew so little about his crusade that he would find it hard to distinguish Karl Marx from Groucho Marx. In a February 1950 speech to Republicans in Wheeling, W.Va., he claimed to have a list of 205 communists working in the State Department. He had no list, and in subsequent speeches the number kept changing.

But, with self-aggrandizement being his real personal goal, he soon realized he was onto something big when reporters started asking for more information. He played along and became anti-communism’s most captivating spokesman. By suggestion, innuendo and diversion, McCarthy pointed his finger at labor and liberals, at America’s elite, its prominent educational institutions, and at FDR and the New Deal.

Soon, he was not the only one ruining careers and smearing reputations. Around the country, untold numbers of civil servants, schoolteachers and scientists were driven from their jobs by witch-hunts just as vicious as the Wisconsin senator’s. The hysteria included schools banning the tale of Robin Hood for its communist themes; the Cincinnati Reds changing their name to the Redlegs; and Mickey Spillane having his tough private eye go after communist subversives instead of gangsters. Jackie Robinson was called before the House Committee on Un-American Activities to testify about communism’s influence in the black community. Even Hollywood had its own “blacklist” of writers, directors and actors.

Only when McCarthy challenged the character of President Truman’s Secretary of Defense George Marshall did public opinion begin to sour on him.

There were plenty of communist agents or sympathizers in America, but it is unlikely that McCarthy or his followers ever found any. What they did was exploit the vulnerability of frightened or insecure people by simplifying complex international developments into language that tapped into cultural divisions. McCarthy helped them find someone to blame.

Fortunately, it didn’t last long after the Senate censured him. He died a hopeless alcoholic at age 48.

The 2005 movie Good Night, and Good Luck with David Strathairn and George Clooney does a terrific job of capturing the era of McCarthyism through the lens of TV journalist Edward R. Murrow’s experience. It’s among my top 20 favorite movies.


Even with United Nations, War and Terrorism Persist

This 1945 copy of Charter of the United Nations and Statute of the International Court of Justice, signed by John F. Kennedy, Henry Cabot Lodge and Adlai E. Stevenson, sold for $2,375 at an April 2014 auction.

“… To save succeeding generations from the scourge of war …” — From the United Nations Charter

By Jim O’Neal

Edward Stettinius, chair of the U.S. delegation to the United Nations conference in San Francisco, signed the U.N. Charter in 1945. President Harry Truman was in attendance and later signed the document by which the United States ratified the charter of the United Nations.

The charter established the structure of the United Nations and outlined its guiding principles to prevent war, affirm fundamental human rights, facilitate international peace and security, promote improved living standards, and support social progress and economic advancements (whew!).

The United States, Britain and the USSR were the primary designers of the decision-making structure. The General Assembly consisted of all member countries. The Security Council, which was responsible for international peace and security, originally had 11 members, six of which were elected to two-year terms. Five – the United States, Britain, USSR, France and China – were permanent members, and each had veto power on Security Council resolutions.

Disagreements based on national interests plagued the discussions at the April conference, but they did not prevent the formal creation of the United Nations. There was also considerable debate about the voting process and veto provisions. Finally, on June 25, the delegates unanimously adopted the charter, and the next day they all signed the document.

After the permanent members of the Security Council and most other members ratified the charter, the United Nations was officially established on Oct. 24, 1945. The world had entered a new period of international collaboration determined to avoid a repeat of the two wars that had caused so much devastation in the first half of the 20th century.

Alas, these lofty aims did not last long as the Cold War soon started, followed by major conflicts in Korea, Vietnam (twice), Afghanistan, etc. When we look around the world today, it’s estimated that the United States has Special Forces in over 70 countries (at least) and ad hoc terrorism is a routine, daily occurrence in many places. A new Cold War is gradually taking shape and even nuclear proliferation is back in the news.

Maybe conflict is in our DNA.

One thing is certain. Assuming the United Nations survives, it will have plenty to do for a long time.


For President Johnson, Goal was Reached with ‘Great Society’ Legislation

A complete set of 50 pens President Johnson used to sign “Great Society” legislation in 1965 sold for $18,750 at a November 2015 Heritage auction.

By Jim O’Neal

Whether Lyndon B. Johnson intended to run a second time for the presidency (after his 1964 election) is uncertain. Many of his predecessors had made it clear that one elected term was enough.

Theodore Roosevelt made a campaign promise not to run again for president and regretted it so much that he later ran anyway (in 1912). Rutherford B. Hayes never intended to run more than once (and was happy he hadn’t), and neither did Harry Truman or Calvin Coolidge. Except for TR, these men were no longer popular by the end of their first elected term, and running again most likely would have been a waste of time.

So it was with LBJ. On March 31, 1968, he took the nation by surprise when he announced abruptly in a televised address from his office, “I shall not seek, and I will not accept, the nomination of my party for another term as your president.”

Johnson had even spoken of resigning, but if anything deterred him, it was the fear of losing his “Great Society” programs in Congress. Even the media-fueled support for Robert Kennedy was threatening, because Johnson never trusted him and doubted he had the clout with Congress to be sure the programs got enacted. Johnson cared more about his agenda than the presidency.

President Johnson signs legislation.

Then, shortly after his retirement speech, came the assassinations of Dr. Martin Luther King Jr. (April) and Kennedy (June), which stirred even more violence in the streets. The military was on stand-by and ready to pour into Washington if rioting was too much for the police. For the man in the White House, the outside world was a horror show and the idea of returning to his ranch grew more appealing. A long-time colleague from the old days, Congressman Jack Brooks, said the president did not seek reelection because he “kind of wanted to get back home,” adding for those who might not understand, “It’s not so bad out on the ranch, you know.”

Some presidents depart the White House invigorated, but most leave exhausted. For LBJ, the office had drained his vigor and confidence. He also believed that history would never give him credit for achieving the most powerful social agenda since Roosevelt’s New Deal. It was Johnson’s political skill that made it happen, not JFK’s, but Johnson believed that somehow the applause would inevitably go to his more popular predecessor. Sadly, he was right, though in recent years a more balanced narrative has evolved.

Republicans nominated Richard Nixon in August 1968 and the Democrats chose VP Hubert Humphrey. LBJ did not attend the convention to share Humphrey’s triumph since he didn’t want to add any Vietnam War baggage to the ticket. During the campaign, the war flared on and LBJ was still impassioned to end it. On Oct. 31, just days before the election, he even announced a halt to the bombing, but it was too late.

On Jan. 14, 1969, President Johnson delivered his final State of the Union to Congress. It was strong, pragmatic and well-received by his old Senate colleagues – and in a venue where he was very comfortable.

Then it was time to pack up and head back to Texas.


Eisenhower Crucial to ‘Greatest Engineering Project in World History’

A photograph of President Dwight D. Eisenhower’s inauguration on Jan. 20, 1953 – autographed by Eisenhower, Richard Nixon, Harry Truman and Herbert Hoover – realized $8,365 at an October 2006 Heritage auction.

By Jim O’Neal

As federal war-game planners considered their objectives in mobilizing a West Coast battle response, railroads were quickly ruled out because they could not carry the amount of equipment involved and some of the weapons, especially tanks, were too heavy for trains and tracks.

Since the Army already had plenty of wheeled and tracked vehicles, dispatching a test expedition by road and having a Motor Transport Corps drive the convoy could prove, once and for all, the superiority of wheels over hoofs or railways. Inexplicably, they failed to include any assumptions about the condition of the roads en route.

At the appointed time in 1919, the convoy gathered at a monument by the South Lawn of the White House. The column was three miles long and consisted of 79 vehicles, including 34 heavy trucks, oil and water pumpers, a mobile blacksmith shop, a tractor, staff observation cars, searchlight carriers, a mobile hospital and other wheeled necessities to support the actual war machines.

Nine vehicles were wrecked en route and 21 men injured – leaving 237 soldiers, 24 officers and 15 observers – including then-Brevet Lt. Col. Dwight D. Eisenhower (who kept a concise daily diary). When they arrived in Lincoln Park in San Francisco 62 days later, it was undisputed that the conditions of the roads – essentially non-existent west of the Missouri River – would preclude any timely defense of the West Coast and that any Asian enemy would have been victorious in any battles along the way.

The journey left an indelible impression on the young officer from West Point, who would later be Commander-in-Chief of the nation. The Army and Eisenhower had indisputably proved what many in the capital had suspected. The American West had few, if any, roads that were even remotely usable for military or civilian use.

Only when they reached California and beyond the state capital of Sacramento did the roads become great – with macadamized surfaces, proper drainage, road rules, gas stations and tire-repair depots … all in sufficient quantity to service existing needs.

But this did not appease Eisenhower in the slightest. This great convoy, called into action to deal with a hypothetical threat to the country’s vital West Coast, had crossed 3,251 miles of the country at an average speed of 5.6 mph, making any potential response virtually useless. The vehicles were in fine shape and the men brave and intelligent, but the roads were deplorable. If nothing else, Eisenhower wrote, the experience of this expedition should spur the building – as a national effort – of a fast, safe and properly designed system of transcontinental highways.

This led to the creation of America’s Interstate Highway System – the greatest engineering project in world history … an intricate network of high-speed roads built with the sole purpose of uniting the corners, edges and center of this vast nation.

Fittingly, the system was authorized by the Federal-Aid Highway Act of 1956, signed by the 34th president of the United States, and it was later renamed “The Dwight D. Eisenhower National System of Interstate and Defense Highways” in his honor. “I LIKE IKE!”


For Germany, Economic Development Has Trumped Disastrous Wars

Stanley Kramer’s 1961 film about the trial of Nazi war criminals, Judgment at Nuremberg, featured some of the best actors working in Hollywood, including Burt Lancaster, Marlene Dietrich, Spencer Tracy and Maximilian Schell.

By Jim O’Neal

The 34th Academy Awards ceremony was held on April 9, 1962, to honor films from 1961. West Side Story dominated the field with 11 nominations and 10 Oscar wins.

Another strong contender was Judgment at Nuremberg with 11 nominations, including two for best actor: Maximilian Schell (the winner) and Spencer Tracy for his portrayal of Chief Judge Dan Haywood, a fictionalized character. Many moviegoers (and probably others) naturally assumed this was the extent of post-war judicial actions. In fact, the film depicted only the third (“The Judges’ Trial”) of 12 trials for German war crimes.

Even before Germany surrendered, the Allies had planned to establish courts to try Nazi military and political leaders for their actions during the war. On May 2, 1945, President Harry S. Truman selected Supreme Court Justice Robert Jackson to organize the proceedings and represent the United States.

Justice Jackson started by developing the London Charter, which established the International Military Tribunal and trial procedures. It was agreed to hold the trials in Nuremberg, where the Nazis had held their annual rallies. Much of the city was damaged, but the huge Palace of Justice and a prison remained intact.

On Nov. 20, 1945, the Nuremberg Trials began.

“The wrongs which we seek to condemn and punish have been so devastating that civilization cannot tolerate their being ignored, because it cannot survive their being repeated.” – Justice Robert Jackson, November 1945

In the first trial, 22 Nazis faced one or more charges of war crimes, crimes against peace or crimes against humanity. The defendants included Luftwaffe Commander Hermann Goering, Adolf Hitler’s deputy Rudolf Hess and the Fuhrer’s successor Admiral Karl Donitz. (Martin Bormann was tried in absentia and Hitler, Joseph Goebbels and Heinrich Himmler had committed suicide.)

Over the next 10 months, prosecutors offered evidence including propaganda movies, vivid films of concentration camp liberations and damning testimony from many eyewitnesses. The evidence was so overwhelming that the 250 journalists attending the trial were often heard weeping in the courtroom or sobbing in the hallways.

On Oct. 1, 1946, the court handed down the verdicts.

Twelve high-ranking men, including Goering, were sentenced to death by hanging. Three more received life sentences in prison. Four got prison terms of 10 to 20 years, and three minor political figures were acquitted.

The Nazi leaders had been tried in courtroom 600 of the Palace of Justice, where all proceedings were recorded. Some were broadcast in radio reports. Many people still claim it was the first time they learned of Nazi atrocities, the concentration camps or the gas chamber horrors (“The Final Solution”).

What is interesting, at least to me, is just how much more the Germans have accomplished through economic development than they ever did with guns, planes and tanks. Just ask the Greeks.
