News Reporting Has Come a Long Way, but Kinks Remain

A signed sketch of Edward R. Murrow, by artist Johnny Raitt, is among six sketches of famous newscasters that went to auction in July 2010.

By Jim O’Neal

In the past year, several prominent newspapers and TV networks have corrected or retracted provocative political stories that were factually wrong. Critics are prone to blame the insatiable appetite to feed the 24/7 news-cycle beast and, increasingly, a news organization’s rush to be first. This has been compounded by the steady transition from costly field correspondents to much less expensive panelists sitting around a table in the TV studio offering personal opinions.

Most of these discussions start with “I think” or “In my opinion,” which by definition blurs facts with subjective comments. Unbiased, factual reporting gets mixed into a lethal cocktail of commentary that obscures reality and has inexorably led to an environment where charges of “fake news” are routine. Then social media further distort issues and reality. People can now easily shop for any “facts” on TV or the internet that support their opinions. However, the “need for speed” is not a recent phenomenon.

Triggered by the oldest of journalism’s preoccupations – the desire to be first with a dramatic story – Edward R. Murrow, William L. Shirer and their network, the Columbia Broadcasting System (CBS), made broadcast journalism history on March 13, 1938. What set the stage was CBS founder and CEO William S. Paley’s realization that his radio network had just been soundly beaten again by the National Broadcasting Company (NBC) and its reporter, “Ubiquitous Max” Jordan, with his eyewitness account of Austria’s fall.

Worse, the fault was Paley’s. Until Jordan’s story and its effect on America, Paley had supported news director Paul White’s decision not to use network employees for hard-news reporting. To their increasing chagrin, men like Murrow and Shirer were forced to cover truly soft stories like concerts instead of Adolf Hitler and the Third Reich’s actions and intentions. Paley had had enough. He asked White to call Shirer and tell him, “We want a European roundup tonight.” The broadcast would cover the European reaction to the Nazis’ Austrian takeover. The players would include Shirer in London with a member of Parliament; Murrow in Vienna; and American newspaper correspondents in Paris, Berlin and Rome.

They had eight hours to put together what had never been done before. As Stanley Cloud and Lynne Olson describe in their book The Murrow Boys: “Never mind that it was five o’clock, London time, on a Sunday afternoon, which meant that all offices were closed and that all technicians and correspondents and members of Parliament they would need were out of town, off in the country or otherwise unreachable. Never mind the seemingly insuperable technical problems of arranging the lines and transmitters, of ensuring the necessary split-second timing. Never mind any of that. That was what being a foreign correspondent was all about. It was part of the code of the brotherhood. When the bastards asked if you could do something impossible, the only acceptable answer was yes. Shirer reached for the phone and called Murrow in Vienna.”

Beginning at 8 p.m., with announcer Robert Trout’s words, “We take you now to London,” Murrow, Shirer and their comrades proved radio was not only able to report news as it occurred but also able to put it into context, to link it with news from elsewhere – and do it with unprecedented speed and immediacy. They set in motion with that 30-minute broadcast in March 1938 a chain of events that would lead, in only one year, to radio’s emergence as America’s chief news medium and to the beginning of CBS’s decades-long dominance of broadcast journalism. The broadcasts by Murrow and his team during the London blitz and over the entire course of the war set the standard for broadcast reporting style and eloquence.

We have come a long way since then, but it’s not clear to me if we’ve made any progress.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Chester Arthur Surprised His Critics, Overcame Negative Reputation

This ribbon with an engraved portrait of Chester Alan Arthur, issued as a souvenir for an Oct. 11, 1882, “Dinner to The President of the United States by The City of Boston,” sold for $437 at a November 2014 auction.

By Jim O’Neal

President Ulysses S. Grant appointed Chester Alan Arthur to the lucrative post of Collector of the Port of New York in 1871. Arthur held the job for seven years and, with an annual gross income of $50,000, was able to accumulate a modest fortune. He was responsible for collecting about 75 percent of the entire nation’s duties from ships that landed in his jurisdiction, which included the entire coast of New York state, the Hudson River and ports in New Jersey.

In 1872, he raised significant contributions from Custom House employees to support Grant’s successful re-election for a second term. The spoils system was working as designed, despite occasional charges of corruption.

Five years later, the Jay Commission was created to formally investigate corruption in the New York Custom House, and (future president) Chester Arthur was the primary witness. The commission recommended a thorough housecleaning, and President Rutherford B. Hayes fired Arthur and then offered him an appointment as consul general in Paris. Arthur refused and went back to New York law and politics.

At the 1880 Republican National Convention, eventual nominee James Garfield first offered the VP slot to wealthy New York Congressman Levi Morton (later vice president under Benjamin Harrison), who refused. Garfield then turned to Chester Arthur, who, when he accepted, declared, “The office of the vice president is a greater honor than I ever dreamed of attaining.” It would be the only election he would ever win, but it was enough to thrust him into the presidency.

The Garfield-Arthur ticket prevailed, and after being sworn in on March 4, 1881, the 49-year-old Garfield’s first act was to turn and kiss his aged mother. It was the first time a president’s mother had ever been present at an inauguration. She would outlive her son by almost seven years. President James K. Polk (1845-1849), who died three years before his mother, had been the first president whose mother outlived him.

On the morning of July 2, President Garfield was entering the Baltimore and Potomac Railroad Station in Washington, D.C., where he was to board a train to attend the 25th reunion of his class at Williams College. A mentally disturbed office seeker, Charles J. Guiteau, shot him twice. He died 80 days later, and for the fourth time in history, a man clearly meant only to be vice president ascended to the presidency.

“CHET ARTHUR PRESIDENT OF THE UNITED STATES! GOOD GOD!”

Although President Arthur’s greatest achievement may have been the complete renovation of the White House, he surprised even some of his harshest critics. Mark Twain may have summed it up best: “I am but one in 55 million, still in the opinion of this one-fifty-five millionth of the country’s population, it would be hard to better President Arthur’s administration.”

Faint praise, yet probably accurate. (First, do no harm.)

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

‘M*A*S*H’ Showed Us How Far Intelligent TV Can Go

Rick Meyerowitz’s original M*A*S*H art for a 1974 cover of TV Guide sold for $657 at a June 2008 Heritage auction.

By Jim O’Neal

By 1983, the population of the United States had increased to 232 million … and virtually everyone was watching television on a regular basis. On Feb. 28, over 50 percent of them (best estimate is 125 million) tuned in to the last episode of one of their all-time favorite shows, M*A*S*H.

For weeks, newspapers had run contests asking readers to suggest how the show should end. “M*A*S*H Bashes” were held in every major city, and people donned old army fatigues to watch the show, primarily in bars. Seventy-seven percent of viewers watching television that night helped “Goodbye, Farewell and Amen” become the most-watched show in history. (No. 2 is Cheers for its finale, “One for the Road.”)

We were saying farewell not just to beloved television characters, but to an era and an anti-war spirit that the show had captured so brilliantly.

M*A*S*H, which ran for 11 years (1972-83), with 251 episodes that snagged nearly 100 Emmy nominations, is still broadcast in reruns and is considered one of network television’s finest efforts. It was based on Richard Hooker’s 1968 bestselling novel MASH: A Novel About Three Army Doctors and the 1970 feature film directed by Robert Altman. (Note: Some nitpickers claim it was really based on the failed film M*A*S*H Goes to Maine, which itself was based on Hooker’s 1972 book sequel.)

The story of a fictional Mobile Army Surgical Hospital near the front lines of the Korean War (technically a U.N. “police action”), the TV show was filled with the high jinks typical of the book and movie, yet it established its own tone of prickly intelligence, wit and sardonic warmth. In tackling the darker aspects of war, the show perfectly echoed a conscience-stricken America, deeply troubled by Vietnam. In reality, and with an exquisite touch of irony, the book’s author was a surgeon from Maine who served in a MASH unit in Korea and actually hated the show for its anti-war message!

M*A*S*H creator, comedy writer Larry Gelbart, put the wise-cracking, womanizing, yet humane Benjamin Franklin “Hawkeye” Pierce (Alan Alda) at the center of the action. Sharing Hawkeye’s flea-bitten tent were fellow surgeons “Trapper” John McIntyre and Frank Burns, who was having an affair with Margaret “Hot Lips” Houlihan, the strong-willed head nurse. A favorite was Max Klinger, a cross-dresser who would try anything to get sent home. The cast changed over time, and finally even Gelbart left, exhausted from battles with network censors.

The last episode, “Goodbye, Farewell and Amen,” which ran 2½ hours, was a remarkable culmination of everything the series represented: good, funny television drama that probed the ugly underside of war – in this final case, the savagery of the peace in the closing days of the Korean War.

“What happened on the bus?” psychiatrist Sidney Freedman keeps asking Hawkeye, who in the final episode’s opening is in a mental institution.

Slowly, we learn that on July 4, after a day at the beach, the unit’s bus stopped to pick up refugees and wounded GIs, who told them to drive the bus into the bushes to hide from an enemy patrol. Hawkeye keeps hissing at a refugee woman to keep the rooster on her lap quiet. The woman complies, and eventually the repressed memory emerges … the woman has smothered her own child.

Hawkeye shakily returns to the 4077th and on the night of the armistice, one of the worst rounds of casualties is brought in. “Does this look like peace to you?” Margaret asks. Then over the PA system comes a litany of the war’s damage, ending with “2 million killed and 100,000 Korean orphans.” As the unit is broken down, each character gropes toward civilian life.

As Hawkeye lifts off in a helicopter, he sees below, on the deserted 4077th, a message from his friend B.J. Hunnicutt spelled out in stones: GOODBYE.

This last episode, considered by many the best in television history, was more than a goodbye. It was an example of how far serious and intelligent television can go, and a reminder that it very rarely does.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Carnegie Coveted Crown of Richest Man in the World

This Andrew Carnegie photograph – inscribed, signed and dated Dec. 11, 1917 – realized $1,015 at a September 2011 Heritage auction.

By Jim O’Neal

Jeff Bezos of Amazon is the world’s richest man, with an estimated net worth of more than $100 billion. A hundred years ago (1916), John D. Rockefeller became America’s first billionaire; adjusted for today’s economy, his fortune would be two to three times greater than Bezos’. In the late 19th century, Andrew Carnegie coveted this crown and saw steel as his road to stardom.

In the post-Civil War era, America grew rapidly as railroads crisscrossed the country and extended their reach to all four corners. Electricity arrived to light up buildings and homes; oil supplemented kerosene and coal; and iron and steel production grew as demand soared to keep up with rapid economic expansion. Occasional booms and busts occurred, since the markets were unregulated and coordination was difficult.

Carnegie had led the growth of the American steel industry, and his ambition to snatch Rockefeller’s crown became more acute. One of the key industry developments involved the construction of a steel bridge to connect St. Louis and East St. Louis on opposite banks of the mighty Mississippi River. The Eads Bridge, named for its designer, engineer James B. Eads, relied heavily on steel for its revolutionary design. It was set to become the first significant bridge to use steel girders and a cantilever form.

A young Carnegie supplied the financing and the steel, despite skepticism over the sturdiness of the structure after it was completed. A man named John Robinson came up with a clever way to dispel any doubts. Elephants were believed to have good instincts about where they stepped, so Robinson borrowed a fully grown one from a traveling circus. On June 14, 1874, he led the beast across the length of the bridge, with crowds on both ends going wild. Later, a convoy of locomotives was driven back and forth as a further (and final) test of soundness.

On July 4, 1874, the bridge officially opened, with General William Tecumseh Sherman driving the last spike as 150,000 people looked on. Demand for steel exploded, forcing Carnegie to develop creative ways to boost production. One was a modified vertical production technique that maximized factory output. But that was still not enough. It became obvious that a 12-hour, six-day workweek was needed. The only problem was that workers’ health couldn’t keep up. Carnegie hired tough managers to impose the onerous schedule, and he left for Scotland to escape the critics. Later, his guilty conscience led him to an unprecedented binge of philanthropy after he sold the Carnegie Steel Company to J.P. Morgan for $480 million. It became U.S. Steel, the first billion-dollar corporation in the world.

John D. Rockefeller pursued an even more devious strategy in his domination of the oil-refining industry. In 1872, he formed a shell corporation: the South Improvement Company (SIC). He then struck an agreement with large railroad companies whereby they sharply raised freight rates for all oil refineries except those in the SIC (notably Standard Oil), which received substantial rebates – up to 50 percent off crude and refined oil shipments. Then came the deadliest innovation – SIC members also received “drawbacks” on shipments made by rival refineries. So when Standard Oil made shipments from Pennsylvania to Cleveland, it received a 40-cent rebate on every barrel, plus another 40 cents for every barrel shipped by every competitor!

It has been called “an instrument of competitive cruelty unparalleled in industry.” In fact, it was collusion on a scale never equaled in American history. And it was only one of several techniques employed. But it did help Mr. Rockefeller and his investors achieve a 90 percent share of the entire U.S. oil business.

All Bezos has is the internet.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].