After Independence, United States and South America Took Different Paths

This Banco de Venezuela 1000 Bolivares ND, circa 1890, proof, featuring Simón Bolívar, sold for $2,115 at a January 2017 Heritage auction.

By Jim O’Neal

Some prominent scholars believe that Simón Bolívar could have been the George Washington of South America. He, too, overthrew an empire (Spain’s) – but he obviously failed to create an Estados Unidos of South America. The American Revolution not only achieved unity for the former British colonies; independence also set the United States on the road to unsurpassed prosperity and power. South America, owing to a complex chain of events, ended up in a different place entirely.

Bolívar was born in Caracas in July 1783 to a prosperous, aristocratic family. Orphaned by age 10, he was a soldier by the tender age of 14. He studied in Spain and France, lingering in Paris even after all foreigners were expelled in response to a food shortage. By 1807, he had returned to Venezuela, inspired by Napoleon and disgusted with Spanish rule. Sent to London to seek British help, he met Francisco de Miranda, the veteran campaigner for Venezuelan independence.

On their return in July 1811, they boldly proclaimed the First Republic of Venezuela.

The Republic ended in failure: a flawed constitution excluded a disproportionate number of citizens from voting, and Bolívar betrayed Miranda to the Spanish before fleeing to New Granada. From there, he proclaimed a Second Republic – with himself in the role of dictator – and won the epithet El Libertador.

Eventually, Bolívar became master of what he termed Gran Colombia, which encompassed New Granada, Venezuela and Quito (modern Ecuador). José de San Martín, the liberator of Argentina and Chile, yielded political leadership to him. By April 1825, his men had driven the last Spanish forces from Peru, and Upper Peru was renamed Bolivia in his honor. The next step was to create an Andean Confederation of Gran Colombia, Peru and Bolivia. Why did Bolívar fail to establish this as the core of a United States of Latin America? The superficial answer – his determination to centralize power and the resistance of local warlords – misses much more complicated circumstances.

The first is that South Americans had virtually no experience or history of democratic decision-making or representative government of the sort that had been normal in North America’s colonial assemblies. So Bolívar’s dream of democracy turned into dictatorship because, as he once said, “our fellow citizens are not yet able to exercise their rights … because they lack the political virtues that characterize true republicans.” Under the constitution he devised, Bolívar was to be dictator for life, with the right to nominate his successor. “I am convinced to the very marrow of my bones that America can only be ruled by an able despotism … We cannot afford to place laws above leaders and principles above men.”

For remaining skeptics, perhaps it is better to let Simón Bolívar explain in his own words, in a December 1830 letter he wrote a month before his death:

“I ruled for 20 years … and I derived only a few certainties:

  • South America is ungovernable.
  • Those who serve a revolution plough the sea.
  • The only thing one can do in America is to emigrate.
  • This country will fall inevitably into the hands of the unbridled masses.
  • Once we have been devoured by every crime and extinguished by utter ferocity, the Europeans will not even regard us as worth conquering.
  • If it were possible for any part of the world to revert to primitive chaos, it would be [South] America in her final hour.”

This was a painfully accurate prediction of the next 150 years of Latin American history: a cycle of revolution and counter-revolution, coup and counter-coup. One need only read about Venezuela today to grasp how dire the situation remains.

By the way, much of this insight comes by way of the highly recommended Civilization: The West and the Rest by Niall Ferguson.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Napoleonic-Era Book Explains Evolving Dark Art of War

A title lobby card for the 1927 silent French epic film Napoléon sold for $10,157 at a July 2008 Heritage auction.

By Jim O’Neal

It is generally accepted dogma that the French Revolution devoured its own children – and not only figuratively. Many of those who fought against it were literally children: Carl von Clausewitz was only 12 when he first saw action against the French.

A true warrior-scholar, Clausewitz (1780-1831) survived the shattering defeat at Jena-Auerstedt (in today’s Germany) in 1806, refused to fight alongside the French against the Russians in 1812 and saw action at Ligny in 1815. In his book Civilization: The West and the Rest, British historian Niall Ferguson argues it was Clausewitz who, better than anyone (including Napoleon himself), understood how the Revolution transformed the dark art of war.

The Prussian general’s posthumously published masterpiece On War (1832) remains the single most important work on the subject produced by a Western author. Though in many ways timeless, Ferguson points out On War is also the indispensable commentary on the Napoleonic era. It explains why war had changed in its scale and the implications for those who chose to wage it.

Clausewitz declared that war is “an act of force to compel our enemy to do our will … (it is) not merely an act of policy but a true political instrument, a continuation of political intercourse, carried on with other means.” These are considered his most famous words, and also the most misunderstood and mistranslated (at least from what I have read … which is extensive).

But they were not his most important.

Clausewitz’s brilliant insight was that in the wake of the French Revolution, a new passion had arrived on the field of battle. “Even the most civilized of peoples [ostensibly referring to the French] can be fired with passionate hatred for each other…” After 1793, “war again became the business of the people,” as opposed to the hobby of kings, Ferguson writes. It became a juggernaut, driven by the temper of a nation.

This was new.

Clausewitz did acknowledge Bonaparte’s genius as the driver of this new military juggernaut, yet he judged Napoleon’s exceptional generalship less significant than the new “popular” spirit that propelled his army. Clausewitz called it a paradoxical trinity of primordial violence, hatred and enmity. If that was true, it helps explain the many people’s wars of the 19th century, but it is perplexing (at least to me) when applied to events a century later.

The Battle of the Somme, which began on July 1, 1916, is infamous primarily for the 58,000 British casualties (one-third of them killed) suffered on its first day – to this day a one-day record for the British Army. It was the main Allied attack on the Western Front in 1916 and lasted until Nov. 18, when terrible weather brought it to a halt. The battle ultimately resulted in over 620,000 British and French casualties; German casualties were estimated at 500,000. It is one of the bloodiest battles in human history.

The Allies gained a grand total of 12 kilometers of (non-strategic) ground!

It is hard to fit Clausewitz’s thesis into this form of military stupidity. I prefer the rationale offered by the greatest mind of the 20th century: “Older men declare war. But it is the youth that must fight and die,” said Albert Einstein.


Libertarian Streak Set United States Apart from Rest of Continent

A rare 1874 Venezuelan Republic silver proof, featuring Simón Bolívar and struck by the Paris mint, realized $70,500 at an August 2014 Heritage auction.

By Jim O’Neal

By 1775, North and South America had become remarkably different societies, with profoundly dissimilar economic systems. The only significant similarity was that both were still composed of colonies ruled by kings in distant lands.

That was about to change, rather dramatically.

On July 2, 1776, the Second Continental Congress declared the colonies independent of Britain and King George. Spain’s rule in Latin America would end some 40 years later, but the North’s revolution secured the rights of property owners and established a federal republic that would become the world’s wealthiest nation in a relatively short 100 years.

Latin American revolutions, on the other hand, consigned nations south of the Rio Grande to 200 years of instability, disunity and economic underdevelopment.

One important reason was that the independence claimed by Britain’s 13 North American colonies was driven by a libertarian society of merchants and farmers who rebelled against an overzealous extension of imperial authority. It was not only the old issues of taxation and representation; land had become a much more important issue that, in turn, fueled the revolutionary fires. The British government’s efforts to limit further settlement west of the Appalachians struck at the heart of the colonists’ vision of the future – a vision of “manifest larceny” that was especially attractive to property speculators like George Washington.

Still, war might have been averted by concessions on taxes, better diplomacy, or British generals who were more adept and less arrogant. It is even possible to imagine the colonies falling apart instead of coming together. Post-war economic conditions were severe: inflation near 400 percent, per capita income slashed by 50 percent, a mountain of debt over 60 percent of GDP. But shedding the yoke of the British Crown created a sense of newfound freedom and brotherhood. These states were now united.

However, had the revolution not progressed beyond the Articles of Confederation, then perhaps the fate of the United States would have been more like that of South America – a story of fragmentation rather than unification. It took the Constitution of 1787, the most impressive piece of political institution-building in all of history, to establish a viable federal structure for the new republic. There was a single market, a single trade policy, a single currency, a single army, and a single law of bankruptcy for people whose debts exceeded assets.

The major flaw was the failure to resolve the issue of slavery and the naive assumption that it would vanish over time. It obviously did not, and the burden of the Civil War nearly destroyed all of the astonishing progress that followed. Abraham Lincoln’s sheer brilliance lay in avoiding any action that would lead to disunion, resisting the temptation to entangle the questions of states’ rights and slavery. “One nation, indivisible…”

Yet independence from Spain left much of Latin America with an enduring legacy of conflict, poverty and inequality. Why did capitalism and democracy fail to thrive?

The short answer was Simón Bolívar.

A longer, more balanced view will have to wait for its own blog entry.


Roosevelt Used Radio to Encourage, Hitler to Fuel Rage

A Franklin D. Roosevelt photograph, signed and inscribed to Eleanor Roosevelt, sold for $10,000 at an October 2016 Heritage auction.

By Jim O’Neal

Saul Bellow was a Canadian-born writer who became a naturalized U.S. citizen when he discovered he had immigrated to the United States illegally as a child. He hit the big time in 1964 with his novel Herzog, which won the U.S. National Book Award for Fiction. Time magazine named it one of the 100 best English-language novels since “the beginning of Time” (March 3, 1923).

Along the way, Bellow (1915-2005) also managed to squeeze in a Pulitzer Prize, the Nobel Prize for Literature, and the National Medal of Arts. He is the only writer to win the National Book Award for Fiction three times.


Bellow loved to describe his personal experience listening to President Roosevelt, an American aristocrat (Groton and Harvard educated), hold the nation together, using only a radio and the power of his personality. “I can recall walking eastward on the Chicago Midway … drivers had pulled over, parking bumper to bumper, and turned on their radios to hear every single word. They had rolled down the windows and opened the car doors. Everywhere the same voice, its odd Eastern accent, which in anyone else would have irritated Midwesterners. You could follow without missing a single word as you strolled by. You felt joined to these unknown drivers, men and women smoking their cigarettes in silence, not so much considering the president’s words as affirming the rightness of his tone and taking assurances from it.”

The nation needed the assurance of those fireside chats, the first of which was delivered on March 12, 1933. Between a quarter and a third of the workforce was unemployed. It was the nadir of the Great Depression.

The “fireside” was figurative; most of the chats emanated from a small, cramped room in the White House basement. Secretary of Labor Frances Perkins described the change that would come over the president just before the broadcasts. “His face would smile and light up as though he were actually sitting on the front porch or in the parlor with them. People felt this, and it bound them to him in affection.”

Roosevelt’s fireside chats – indeed, all of his efforts to communicate – contrasted with those of another master of the airwaves, Adolf Hitler, who used radio to fuel rage in the German people and stoke their need to blame, while FDR reasoned with and encouraged America. Hitler’s speeches were pumped through cheap plastic radios manufactured expressly to ensure complete penetration of the German consciousness. The appropriation of this new medium by FDR for reason and common sense was one of the great triumphs of American democracy.

Herr Hitler ended up committing suicide after ordering that his body be burned to prevent the Allies from retrieving his remains. So ended the grand 1,000-year Reich he had promised … poof … gone with the wind.


Early Automotive Pioneers Among America’s Top Innovators

A Lincoln Motor Company stock certificate, issued in October 1918 and signed by Henry M. Leland, sold for $500 at an October 2013 auction.

By Jim O’Neal

Doctors called it a “chauffeur’s fracture” – a break of the radial styloid in the wrist that occurred when a driver tried to start a horseless carriage by turning the crank at the front of the car. If the engine backfired, the crank would spin backward, often breaking bones. Yet those early automobiles motoring down the streets of American cities were considered engineering marvels.

But what a challenge to start!

The two requirements were a blacksmith’s arm and a perfect sense of timing. The driver had to adjust the spark and the throttle before jumping out to turn the crank mounted on the car’s front grille. Once the spark caught and the motor fired, the driver dashed back to the controls to readjust the spark and throttle before the engine could die. Oh, and if the car started while in gear, it could lurch forward and run over the cranker!

Sound farfetched?

In 1908, tragedy struck when Byron Carter (1863-1908) – inventor of the Cartercar – was fatally injured trying to start a stalled car. It was a fluke: he had stopped to help a stranded motorist who had forgotten to retard the spark. The crank spun back and hit him in the jaw; gangrene set in, and he died of pneumonia. Whamo!

The car involved was a new Cadillac, one of the premier luxury brands, and Carter was good friends with the man who ran Cadillac, Henry Leland (who would later found Lincoln). When Leland learned his friend had been killed, he vowed: “The Cadillac car will kill no more men if we can possibly help it!” Cadillac engineers succeeded in building an electric self-starter, but were never able to scale it for commercial use.

Enter Charles Franklin Kettering (1876-1958), a remarkable man (in the same league as Thomas Edison) whose versatile skills included engineering and savvy business management. He was a prolific inventor with 186 notable patents. One of them was a self-starter small enough to fit under the hood of a car, running off a small storage battery. A New York inventor (Clyde J. Coleman) had applied for a patent in 1899 for an electric self-starter, but it was only a theoretical solution and never marketed.

After graduating from the Ohio State University College of Engineering, Kettering joined the inventions staff at the National Cash Register (NCR) Company, where he invented a high-torque electric motor to drive a cash register, allowing a salesperson to ring up a sale without turning a hand-crank. After five years at NCR, he set up his own laboratory in Dayton, Ohio. Working with a group of engineers, mechanics and electricians, he developed the new ignition system for the Cadillac Automobile Company.

Leland sold Cadillac to General Motors in 1909 for $4.5 million, and there is no record of a Cadillac ever killing another person – at least not from turning a crank to start the engine! Cadillac had been formed from the remnants of the Henry Ford Company (the second of Ford’s two failed attempts) and renamed for Antoine Laumet de La Mothe, sieur de Cadillac, the founder of Detroit 200 years earlier.

Later, Leland would sell Lincoln, his other luxury marque, to Ford Motor Company for a healthy $10 million, while Kettering and his crew formed the Dayton Engineering Laboratories Co., which became Delco, still a famous name in automotive electronics. Kettering went on to a long, sterling career and was featured on the cover of Time on Jan. 9, 1933 … the week after President-elect Franklin Delano Roosevelt was named the magazine’s Man of the Year (Jan. 2).

My only quibble is the work Kettering did with Thomas Midgley Jr. in developing Ethyl gasoline, which eliminated engine knock but loaded the air we breathe with lead (a potent neurotoxin) for the next 50 years. And he developed Freon … a much safer refrigerant at the time, but a chlorofluorocarbon (CFC) that will keep destroying our atmospheric ozone for the next 100-200 years.

I don’t recall ever personally turning an engine crank. My cars went from ignition keys to keyless and I plan to skip the driverless models and wait for a Jet-Cab … unless Jeff Bezos can provide an Uber-style version using one of his drones.

Things change.


President Lincoln Understood Technology and Adapted

This photograph of Abraham Lincoln was among 348 Civil War albumen images in a collection that sold for $83,650 at a December 2010 Heritage auction.

By Jim O’Neal

Presidents have always been challenged to communicate their policies and priorities to the public. As the political party system evolved, newspapers became more partisan depending on their level of editorial bias – usually due to strong-willed owners/editors – forcing administrations to devise creative ways to deliver unfiltered messages.

In the 20th century, President Wilson held the first presidential press conference in March 1913. All of his successors have continued using this innovation with only minor variations. FDR used “Fireside Chats” to help ease public concerns during the Great Depression, offering reassurances like “The only thing we have to fear is fear itself” and explaining how the banking system worked to restore confidence in the financial system.

President Eisenhower preferred off-the-record sessions with reporters and heavily edited film clips.

Then, by 1960, with 87 percent of households owning televisions, people could tune in twice a month and see the young, telegenic JFK – live and uncut – deliver his aggressive agenda for America. Until then, press conferences had been strictly off the record, providing the opportunity to correct any gaffes or poorly phrased answers to difficult questions. President Truman once told reporters “the greatest asset the Kremlin has is Senator [Joe] McCarthy” … but the quote was reworded before being released!

President Trump has adopted modern technology to bypass the media and communicate directly with anyone interested (which includes his base and the frustrated media). Daily White House briefings have become increasingly adversarial as many in the media are in various stages of open warfare, especially The New York Times and CNN. The 24/7 news cycle allows viewers to choose outlets consistent with their personal opinions, and the result is a giant echo chamber.

In the 19th century, President Lincoln often confronted extreme press hostility, especially from the three large New York City newspapers, which attacked him personally and assailed his failing Civil War policies, particularly after the Draft Riots. Lincoln retaliated with dramatic public letters in 1862-63 – ostensibly addressed to New York Tribune editor Horace Greeley, but strategically released to all newspapers to reach a far wider audience. At the very least, he reduced editorial influence, and in doing so he revolutionized the art of presidential communication.

And then it was suddenly Nov. 19, 1863, at Gettysburg, Pa. What Lincoln said that day has been analyzed, memorized and explained … but never emulated. The only flaw was the prediction that “The world will little note, nor long remember, what we say here …”

The compactness and concision of the Gettysburg Address have something to do with the mystery of its memorability. It ran 271 words in 10 sentences, the final one accounting for a third of the entire length; 205 words had a single syllable, 46 had two, and 20 had three or more. The pronoun “I” was never uttered. Lincoln admired the telegraph and had seen at once its future – a medium that required one to get to the point, with clarity. The telegraphic quality can be clearly heard in the speech: “We cannot dedicate, we cannot consecrate, we cannot hallow this ground.” Rhythm, compression, precision … all were emphasized.

Perhaps the most overshadowed speech in history was the one featured as the main event that day: Edward Everett’s oration. He was a Harvard man (later its president), a professor of Greek, governor of Massachusetts, and ambassador to England. Everett’s two-hour speech (13,607 words) was well received. Lincoln congratulated him.

Afterward, in a note to Lincoln, Everett wrote: “I should be glad to flatter myself that I came as near to the central idea of the occasion, in two hours, as you did in two minutes.” Lincoln’s grateful reply concluded with “I am pleased to know that in your judgment, the little I did say was not a failure.”

Not bad for a man traveling with the fever of a smallpox infection! 


Yes, George C. Marshall Earned Title of ‘Greatest Living American’

A photograph of General George C. Marshall, signed, went to auction in October 2007.

By Jim O’Neal

In Harvard Yard, a venue carefully chosen as dignified and non-controversial, Secretary of State George C. Marshall’s 15-minute speech on June 5, 1947, painted a grim picture for the graduates. With words crafted and refined by the most brilliant minds in the State Department, Marshall outlined the “continuing hunger, poverty, desperation and chaos” in a Europe still devastated after the end of World War II.

Marshall, one of the greatest Secretaries of State the United States has ever produced, asserted unequivocally that it was time for a comprehensive recovery plan. The only caveat was that “the initiation must come from Europe.” His words were much more than typical boilerplate commencement rhetoric, and Great Britain’s wily Foreign Secretary, Ernest Bevin, heard the message loud and clear. By July 3, he and his French counterpart, Georges Bidault, had invited 22 nations to Paris to develop a European Recovery Program (ERP).

Bevin had been alerted to the speech’s importance by Dean Acheson, Marshall’s Under Secretary of State. Acheson was point man for the old Eastern establishment and had already done a masterful job of laying the groundwork for Marshall’s speech, making the public aware that European cities still looked as if the bombs had just stopped falling, that ports were still blocked, and that farmers were hoarding crops because they couldn’t get a decent price. Further, the Communist parties of France and Italy (on direct orders from the Kremlin) had launched waves of strikes, destabilizing already shaky governments.

President Harry S. Truman was adamant that any assistance plan be called the Marshall Plan, honoring the man he believed to be the “greatest living American.” Yet much of Congress still viewed it as “Operation Rat Hole,” pouring money into an untrustworthy socialist blueprint.

The Soviets and their Eastern European satellites refused the invitation to participate, and in February 1948, Joseph Stalin’s vicious coup in Prague toppled Czechoslovakia’s coalition government, which spurred speedy passage of the ERP. This dramatic action marked a significant step away from the FDR-era policy of non-commitment in European matters, especially expensive aid programs. The Truman administration had pragmatically accepted a stark fact: the United States was the only Western country with any money after WWII.

Shocked by reports of starvation in most of Europe and desperate to bolster friendly governments, the administration offered huge sums of money to any democratic country in Europe able to develop a plausible recovery scheme – even those in the Soviet sphere of influence – despite the near-maniacal resistance of the powerful and increasingly paranoid Stalin.

With no trepidation, the freighter John H. Quick steamed out of Galveston Harbor in Texas on April 14, bound for Bordeaux with 9,000 tons of American wheat. Soon, 150 ships were shuttling across the Atlantic carrying the food, fuel, industrial equipment and construction materials essential to rebuilding entire countries. The Marshall Plan’s most impressive achievement was its inherent magnanimity, for its very success returned Europe to a competitive position with the United States!

Winston Churchill wrote, “Many nations have arrived at the summit of the world, but none, before the United States, on this occasion, has chosen that moment of triumph, not for aggrandizement, but for further self-sacrifices.”

Truman may have been right about this greatest living American and his brief speech, which altered a ravaged world and changed history for millions of people – people who may have long forgotten the debt they owe him. Scholars are still studying the brilliant tactics involved.


For a While, the Tiny Mosquito Was Destroying Armies

A key to the success of the Panama Canal was Colonel William C. Gorgas’ mosquito-eradication program, which saved thousands of workers’ lives. This example of a U.S. gold coin commemorating the construction of the Panama Canal and the rebuilding of the City of San Francisco sold for $152,750 at a January 2017 Heritage auction.

By Jim O’Neal

The French were no strangers to Stegomyia fasciata. It was, after all, that tiny mosquito that wiped out the troops Napoleon had dispatched to squash a slave uprising in Saint-Domingue, the French-controlled colony that became Haiti. From 1802 to 1803, yellow fever ravaged 50,000 troops, including their commanding officer, General Emmanuel Leclerc. His replacement, General Rochambeau, retreated with a mere 3,000 survivors.

Experts estimate that twice as many soldiers were lost in Haiti as were killed in the world-famous Battle of Waterloo!

Napoleon finally conceded that his armies were no match for this mysterious, silent killer and abandoned his ambitious plans to expand the empire into the Louisiana Territory, selling it instead for $15 million – a deal that doubled the size of the young United States. It was an epic bargain for the Americans and dramatically reduced the risk of otherwise almost inevitable future wars with France.

Later, in 1889, another Frenchman, Ferdinand de Lesseps, saw his decade-long and terminally troubled attempt to build a canal across the Isthmus of Panama crumble after 20,000 workers (one-third of his force) died from yellow fever – a usually fatal disease contracted from a single mosquito bite.

Au revoir, monsieurs.

In America, the fever reached epidemic proportions as well. More than 300,000 cases were reported in the United States between 1793 and 1900, at times with mortality rates of up to 85 percent. The disease attacks the liver, turns the skin yellow, raises body temperature and causes internal bleeding before the victim lapses into a coma. More U.S. troops were killed by yellow fever in the Spanish-American War than by the enemy. The disease, nicknamed “yellow jack” after the pennants flown to signal a quarantine, had arrived in Central America in the mid-16th century aboard slave ships traveling from Africa. Despite countless hypotheses, its cause and rapid spread remained a mystery.

Dr. Carlos Finlay of Cuba had long theorized that mosquitos carried and spread yellow fever. The conventional medical establishment criticized Finlay, calling him “mosquito man.” But no one had a better idea. In desperation, U.S. Army Major Walter Reed, his fellow doctors, and a detachment of soldiers traveled to Havana in June 1900 and tested Finlay’s theory by volunteering to let indigenous mosquitos bite them.

On Aug. 27, Dr. James Carroll allowed himself to be bitten; he fell ill with the disease but survived. Reed survived his bout as well. Several other colleagues died, and both Reed and Carroll sustained lasting damage to their health. The soldiers refused to accept a $250 bonus, believing it would cheapen their sacrifice. Public opinion was cynical and negative; American newspapers mocked the experiment or simply ignored it. Congress even denied a pension to the first soldier to develop the disease, even though the experiment left him paralyzed.

Yet the team prevailed, and in October 1900, Walter Reed finally declared publicly that “the mosquito served as the intermediate host for the parasite of yellow fever.” The disease’s cycle was soon unraveled: female mosquitos picked up yellow fever in the first three days of a patient’s infection and became infectious themselves after a 12-day incubation period.

Eventually, Maj. William Crawford Gorgas eradicated the disease in Panama and the Canal Zone. He also wiped out another mosquito that spread malaria, and the rats that carried bubonic plague. Gorgas’ triumph allowed the United States to begin its canal dig and finish it by 1914. Panama’s death rate from yellow jack had dropped to only half that of the United States.

Problem solved.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International (retired) and earlier served as chair and CEO of PepsiCo Restaurants International (KFC, Pizza Hut and Taco Bell).

People Flocked to California in Hopes of Finding Instant Riches

A daguerreotype of a California gold-mining scene by Robert Vance, circa 1850, sold for $83,650 at a May 2011 auction.

By Jim O’Neal

When Brigham Young heard that fellow Mormon Sam Brannan had been tithing the gold miners at the Mormon Diggings in California, he sent an envoy to demand the church’s money. In a version of the story circulated by sawmill operator John Sutter, Brannan replied, “You go back and tell Brigham Young that I’ll give up the Lord’s money when he sends me a receipt signed by the Lord!”

Brannan’s success transcended his dealings with local miners. As the rush to the mines accelerated, his Sacramento store did huge business, as much as $5,000 a day. With the proceeds, the wily entrepreneur opened additional stores throughout gold territory and constructed hotels, warehouses and other commercial buildings. In San Francisco, he organized a consortium that built the city’s first large wharf at a cost of $200,000. By quickly repaying all owner-investors, Brannan saw his reputation and wealth continue to grow.

Sam Brannan is widely recognized as the first authentic millionaire in California.

When gold was discovered on the American River above Sutter’s Fort in January 1848, California was a sparsely populated frontier. The gold had been formed over a 200-million-year period through the constant recycling of the earth’s crust as minerals precipitated out in streaks or veins. Gold occurs in the crust of the earth at an average concentration of 5 parts per billion. But the melting and cooling that produced the Sierra Batholith yielded veins of gold-bearing quartz as high as 100 parts per billion.

Most of this gold was trapped far below the surface of the earth, where it remained for tens of millions of years until the crust crumbled and the glaciers took over. The heat of the earth – which had driven the crustal plates to their collisions with the western edge of North America – then melted the rock and boiled out the precious metal. All that remained was for humans to harvest what the earth had collected. And they did so with enormous zeal.

The astonishing news of “Gold! In California!” prompted hundreds of thousands of people from around the world to flock to California in hopes of finding instant riches. They sailed from Australia and China, from Europe and South America. They ventured across the disease-plagued Isthmus of Panama and through the treacherous waters of Cape Horn. And they traveled by foot, wagon and horseback and over the towering Sierras. They abandoned wives and families, homesteads and farms.

Sacramento and San Francisco popped up overnight, as did scores of mining camps. Entrepreneurs such as Leland Stanford and Sam Brannan, and merchants like Levi Strauss, amassed fortunes simply by supplying miners with picks and shovels, tents, food and other items needed to harvest the gold. By 1850, California had become a state … marking the fastest journey to statehood in United States history.

Sam Brannan hit a bad streak when a divorce forced him to liquidate his entire holdings to pay a court-ordered 50/50 division of assets … in cash. He died penniless, establishing a precedent that would plague future husbands divorced in California.


Bikes Symbolized Progress for a Nation Ready for Growth

A rare campaign button shows presidential candidate William McKinley riding a bicycle at the height of the bike boom of the 1890s.

By Jim O’Neal

As the bicycle became more popular in the latter part of the 1800s, it was inevitable that millions of new enthusiasts would soon be demanding better roads to accommodate this object of pleasure, so symbolic of progress. It was hailed as a democratizing force for the masses and neatly bridged the gap between the horse and the automobile, which was still a work in progress.

The popularity of this silent, steel steed had exploded with the advent of the “safety bicycle” (1885), which dramatically reduced the hazards of the giant “high wheelers.” The invention of the pneumatic tire in 1889 greatly enhanced the comfort of riding and further expanded the universe of users. However, this explosion in activity also increased the level of animosity as cities tried to cope by restricting hours of use, setting speed limits and passing ordinances that curtailed access to streets.

There were protest demonstrations in all major cities, but matters came to a head in 1896 in San Francisco. The city’s old dirt roads were crisscrossed with streetcar tracks, cable slots and abandoned street rail franchises. Designed for a population of 40,000, the nation’s third-wealthiest city was now a metropolis of 350,000 and growing. On July 25, 1896, advocates of good streets and organized cyclists paraded through downtown as 100,000 spectators cheered them on.

The “Bicycle Wars” were soon a relic of the past as attention shifted to a product that was destined to change the United States more than anything in its history: Henry Ford’s Model T. Production by the Ford Motor Company began in August 1908 and the new cars came rolling out of the factory the next month. It was an immediate success since it solved three chronic problems: automobiles were scarce, prohibitively expensive and consistently unreliable.

Voila, the Model T was easy to maintain, highly reliable and priced to fit the budgets of the vast number of Americans with only modest incomes. It didn’t start the Automobile Age, but it did stir the hearts and souls of millions of people eager to join in the excitement that accompanied this innovation. It accelerated the automobile’s arrival in American society by at least a decade.

By 1918, 50 percent of the cars in the United States were Model Ts.

There were other cars pouring into the market, but Model Ts, arriving by the hundreds of thousands, gave a sharp impetus to the support structure – roads, parking lots, traffic signals, service stations – that made all cars more desirable and inexorably changed our daily lives. Automotive writer John Keats summed it up well in The Insolent Chariots: The automobile changed our dress, our manners, social customs, vacation habits, the shapes of our cities, consumer purchasing patterns and common tasks.

By the 1920s, one in eight American workers was employed in an automobile-related industry, be it petroleum refining, rubber making or steel manufacturing. The availability of jobs helped create the beginning of a permanent middle class and, thanks to the Ford Motor Company, most of these laborers made a decent living wage on a modern five-day, 40-hour work week.

Although 8.1 million passenger cars were registered by the 1920s, paved streets were more often the exception than the rule. The dirt roads connecting towns were generally rutted, dusty and often impassable. However, spurred by the rampant popularity of the Model T, road construction quickly became one of the principal activities of government, and expenditures zoomed to No. 2 behind education. Highway construction gave birth to other necessities: the first drive-in restaurant in Dallas in 1921 (Kirby’s Pig Stand), the first “mo-tel” in San Luis Obispo in 1925, and the first public garage in Detroit in 1929.

The surrounding landscape changed with the mushrooming of gas stations from coast to coast, replacing the cumbersome practice of buying gas by the bucket from hardware stores or street vendors. Enclosed curbside pumps became commonplace, as did hundreds of brands, including Texaco, Sinclair and Gulf. The intense competition inspired dealers to distinguish themselves with identifiable stations and outlandish buildings. Then, in the 1920s, the “City Beautiful” movement produced gas stations styled as ancient Greek temples, log cabins, or regional motifs such as Colonial New England and California Spanish mission.

What a glorious time to be an American and be able to drive anywhere you pleased and see anything you wished. This really is a remarkable place to live and to enjoy the bountiful freedoms we sometimes take for granted.
