Here’s why Benjamin Rush is an unsung hero of the American Revolution

A Dr. Benjamin Rush autograph manuscript titled “References to texts of Scripture related to each other upon particular Subjects” sold for $5,975 at a February 2006 Heritage auction.

By Jim O’Neal

In the spring of 1813, two former U.S. presidents grieved over the death of a mutual friend and colleague. Dr. Benjamin Rush had been responsible for reconciling the ex-presidents and healing the bitter rift that had grown worse after they left office. Now Dr. Rush was dead, and both John Adams and Thomas Jefferson were convinced that the eminent physician deserved to be honored for more than his enthusiasm for American liberty.

Benjamin Rush was born in 1746 in a small township a few miles outside of Philadelphia. Just 30 years later, he was one of the youngest of the 56 men who bravely signed the Declaration of Independence (Edward Rutledge, age 26, was the youngest). Benjamin was 5 years old when his father died, but in a stroke of pure providence, his mother took notice of his remarkable intellect and was determined to see that her precocious youngster got special tutoring. She sent him to live with an aunt and uncle, who enrolled him in a boarding school run by the Reverend Samuel Finley, an academic who founded the West Nottingham Academy (1744) and later served as president of the College of New Jersey (now Princeton University).

Rush (predictably) flourished in this rarefied intellectual atmosphere and at age 13 was admitted to Princeton. After graduating in just one year, he was apprenticed to Philadelphia’s foremost physician, Dr. John Redman. Eager to continue his studies, he sailed to Scotland in 1766 and entered the University of Edinburgh, rated the finest medical school in the British Empire. Again, serendipity reigned, for this was the flowering of the Scottish Enlightenment, part of the broader European movement that encouraged rational thought while resisting the traditional imposition of sovereign authority, especially from Great Britain. The American colonies were gradually drifting into similar territory, where taxation imposed from abroad was seen as undermining independent action and curtailing liberty.

During the next three years, Rush not only became a fully qualified doctor of medicine but was exposed to some of the greatest thinkers, politicians and artists of the age. When he returned to Philadelphia, his intellectual range continued to expand as he absorbed radical alternatives to conventional theories. He became obsessed with the concept of public service and a champion of the common man.

Establishing a medical practice was challenging: the poor were the equivalent of today’s middle class, and the wealthy naturally commanded the best and most experienced practitioners. Eager to help narrow social inequalities, Rush sought out the sick in the slums of Philadelphia and offered his services. He was forced to accept a position as professor of chemistry at the College of Philadelphia to bolster his income (his family had grown to 13) and, importantly, to provide an outlet for his prodigious medical papers.

He is credited with being the first to highlight the deleterious effects of alcohol and tobacco, but in the process he alienated both heavy users and most producers. Even more controversial was his anti-slavery position, with the South growing ever more reliant on slave labor as an integral part of its agrarian economy. With Great Britain seemingly intent on oppressing all Americans, the nation was inevitably drawn into war. Dr. Rush was eager to lend his medical skills to the military and was appointed Surgeon General of the Middle Department of the Continental Army. His broad experience produced a pamphlet called “Directions for Preserving the Health of Soldiers.” He keenly observed that “a greater proportion of men have perished with sickness in our armies than have fallen by the sword.” The Civil War and World War I would prove just how prescient he was.

Today, Dr. Benjamin Rush is generally forgotten or relegated to the second tier of Founding Fathers, an oversight that even Adams and Jefferson recognized when he died in 1813. It is a curious situation when one considers the sincere eulogies expressed by his colleagues and students. It’s estimated that he trained 3,000 doctors and his writings, both personal and technical, are astonishing in breadth and depth. Jefferson was effusive with his praise and John Adams declared he “knew of no one, living or dead, who had done more real good in America.” High praise from two such prominent men who were there to witness it.

Another man who benefited from his association with Rush was the firebrand Thomas Paine, who generally falls into the same category of neglect. His pamphlet Plain Truth was one of the most powerful forces behind the colonies’ quest for independence from the British Crown. Never heard of it? That’s not surprising, since Plain Truth became Common Sense after Dr. Rush, who had Paine read him every line before publication, persuaded him to change the title. It remains the best-selling book in American history and set the colonies firmly on the road to independence.

When I view the current political landscape, I’m persuaded that all that’s missing is … Common Sense!

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is the retired president and CEO of Frito-Lay International and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Today’s business tycoons would be wise to not forget the past

A statement on Union Iron Mills stationery signed by Andrew Carnegie and dated Sept. 29, 1870, sold for $6,572 at an April 2013 auction.

By Jim O’Neal

It may come as no surprise to learn that virtually all of California’s major cities were incorporated in the same year: 1850, the year California became the 31st state to join the United States. It is now the most populous state and supports a $3 trillion economy that ranks No. 5 in the world … larger than Great Britain’s. What you probably don’t know is that, in terms of land area, the state’s three largest cities are Los Angeles, San Diego and (surprisingly) California City.

This large chunk of land was assembled in the boomlet following World War II with the intent of rivaling Los Angeles. Southern California was flourishing thanks to temperate weather, Pacific Ocean beaches and nearby mountains. It seemed logical that with a large influx of people, all that was lacking was affordable housing, automobiles, streets, freeways and plenty of water – to drink and to irrigate the orange groves.

An ambitious land developer spotted this unique opportunity and bought 82,000 acres of desert land just north of the Southern California basin – the future California City. He commissioned an ambitious architectural master-planned community, with detailed maps of blocks, lots and streets, then hired a small army of 1,300 salesmen to sell lots to individuals while building a 26-acre artificial lake, two golf courses and a four-story Holiday Inn.

This was land speculation on a grand scale: 50,000 lots were sold for $100 million before the market dried up. Some reports claim that only 175 new homes were actually built. The fundamental reason was that Southern California development flowed primarily south along the coastline toward San Diego and the prime oceanfront property of Malibu, Long Beach and Orange County. Although the scheme failed, California City was finally incorporated in 1965. Today, its 15,000 inhabitants, many from Edwards Air Force Base, are sprinkled liberally over 204 square miles.

A prominent No. 4 on the list is San Jose, which narrowly escaped destruction in the 1906 earthquake that nearly leveled nearby San Francisco. When we lived there (1968-71), it was a small, idyllic oasis with plum trees growing in undeveloped lots in the shadow of the Santa Cruz Mountains. Nice beaches were an hour away, and for $14.15 PSA would fly you the 400 miles to LAX in 45 minutes. In addition to the short drive to San Francisco with all its wonders, Lake Tahoe, a mere 200 miles away, offered gambling – except when snow closed the Sierra Nevada passes.

Nobody dreamed that the miracle of Silicon Valley was on the horizon, or that the enormous impact of the Internet would produce the boom and bust of the dot-com era in the late 1990s. The stock market rose 400 percent and then fell 80 percent, wiping out most of the gains. Yet after 2002, the proliferation of the personal computer fueled another technology revolution that would create more wealth than any place in the history of the world.

Apple, Google, Facebook, eBay, Intel, Cisco and Instagram are at the core of a technological society that has revolutionized our economy and communications, our lives and, by extension, the world. Smartphones, search engines and social-media giants – plus a community of 2,000 tech firms and venture capitalists – have generated enormous fortunes. Electric vehicles are morphing into driverless cars and trucks that will bring more creative destruction. AI and robotics will render large swaths of production obsolete and raise privacy and antitrust concerns that will rival the early 20th century government action to break up the trusts.

Consider when Andrew Carnegie sold his Carnegie Steel Company to J.P. Morgan in 1901 for an astounding $303 million. He became the richest man in America, surpassing even John D. Rockefeller for several years. JPM then merged it with two other steel companies to form the first billion-dollar U.S. company, U.S. Steel. While Rockefeller continued to expand his oil monopoly, Carnegie devoted the last 18 years of his life to large-scale philanthropy. He literally lived by his credo: “The man who dies rich dies disgraced.” President Teddy Roosevelt would lead the trust-busting that became necessary.

Tim Cook, Mark Zuckerberg, the Google gang and Jeff Bezos would be wise to heed George Santayana’s aphorism: “Those who cannot remember the past are condemned to repeat it.” Especially when social media becomes more addictive than crack cocaine.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is the retired president and CEO of Frito-Lay International and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

LBJ exhibited ambition, decisiveness, a strong work ethic … and fear of failure

Lyndon B. Johnson artifacts, including signed photographs and a Civil Rights Bill signing pen, sold for $15,000 at an October 2018 Heritage auction.

By Jim O’Neal

Lyndon Baines Johnson was born in August 1908 in the Texas Hill Country nearly 112 years ago. (Tempus does fugit!). He shined shoes and picked cotton for pocket money, graduating from high school at age 15. Both his parents were teachers and encouraged the reading habits that would benefit him greatly for the rest of his life.

Tired of both books and study, he bummed his way to Southern California, where he picked peaches, washed dishes and did other odd jobs like a common hobo. The deep farm recession forced him back to Texas, where he borrowed $75 to earn a teaching degree from a small state college. Working with impoverished Mexican-American children gave him a unique insight into poverty. He loved to tell stories from that time in his life, especially when he was working on legislation to improve life for common people.

His real power developed when he electrified the rural Hill Country and created a pool of money from power companies that he doled out to politicians all over the country who needed campaign funds and were willing to barter their votes in Congress. The women and girls of the Hill Country were known as “bent women” from toting water – two buckets at a time – from wells to their homes. Electric pumps to draw the water meant the next generation of women would not be hump-backed. They said of LBJ, “He brought us light.” This caught FDR’s attention and led to important committee assignments.

He married 21-year-old Claudia Alta Taylor in 1934 (at birth, a nanny had exclaimed that she looked “just like a little lady bird”). A full-grown Lady Bird parlayed a small inheritance into an investment in an Austin radio station that grew into a multimillion-dollar fortune.

Robert Caro has written about LBJ’s ambition, decisiveness and willingness to work hard. But how does that explain his trepidation about running for president in 1960? He had been Senate Majority Leader, had accumulated lots of political support and had a growing reputation for his civil rights record. He even told his associates, “I am destined to be president. I was meant to be president. And I’m going to be president!” Yet in 1958, when he was almost perfectly positioned to make his move, he was silent.

His close friend, Texas Governor John Connally, had a theory: “He was afraid of failing.”

His father was a fair politician but ultimately failed, lost the family ranch, plunged into bankruptcy and became the butt of town jokes. In simple terms, LBJ was afraid to seek the nomination and lose. That explains why he didn’t announce until it was too late and JFK had it sewed up.

Fear of failure.

After JFK won the 1960 nomination at the Democratic National Convention in Los Angeles, he knew LBJ would be a valuable running mate on the Democratic ticket against Richard Nixon. Johnson’s Southwestern drawl expanded the base, and Texas’ 24 electoral votes were too tempting to pass up. They were all staying at the Biltmore Hotel in L.A., a mere two floors apart. Kennedy personally convinced LBJ to accept, despite brother Bobby’s three (obviously unsuccessful) attempts to get him to decline.

The 1960 election was incredibly close, with only about 100,000 votes separating Kennedy and Nixon. Insiders were sure that a recount would uncover corruption in Illinois and that Nixon would be declared the winner. But in a big surprise, RMN refused to demand a recount, to spare the country massive disruption. (Forty years later, Bush v. Gore demonstrated the chaos of the 2000 Florida “hanging chads” debacle and the stain on SCOTUS from halting the Florida recount.)

After the Kennedy assassination in November 1963, LBJ was despondent, convinced he would be remembered as the “accidental president.” But when he demolished Barry Goldwater in 1964, the old Lyndon was back. The Johnson-Humphrey ticket won by one of the greatest landslides in American history: LBJ got 61.1 percent of the popular vote and 486 electoral votes to Goldwater’s 52. More importantly, Democrats increased their majorities in both houses of Congress.

This level of domination gave LBJ the leverage to implement his full Great Society agenda with the help of the 89th Congress, which approved multibillion-dollar budgets. After LBJ ramrodded his liberal legislative programs through Congress in 1965-66, it seemed he might go down in history as one of the nation’s truly great presidents. But his failure to bring Vietnam to a successful conclusion, the riots in scores of cities in 1967-68, and the spirit of discontent that descended on the country turned his administration into a disaster.

On Jan. 22, 1973, less than a month after President Truman died, the 64-year-old Johnson died of a heart attack. His fear of failure, a silent companion.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is the retired president and CEO of Frito-Lay International and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

More than 50 years later, streets filled with a new generation, with new demands

A Time magazine signed by Martin Luther King Jr. realized $6,875 at an October 2013 Heritage auction. The issue is dated Feb. 18, 1957, two months after the end of the Montgomery Bus Boycott.

By Jim O’Neal

In the two decades following World War II, the African-American struggle for civil rights lacked focus and broad support. Americans had spent four long years fighting a bitter war to free the world from tyranny and were intent on resuming a peaceful recovery. Yet in the process of restoring normal life, many grew disgusted by racial segregation and the systemic exploitation of minorities. As important legal victories mounted, protesters and marchers helped change attitudes, laying the groundwork for faster, more sincere progress.

Alas, irrespective of the legal triumphs and the changes in public opinion, Jim Crow segregation remained deeply embedded in the Deep South and portions of the West, where discriminatory laws also targeted Latinos and American Indians in addition to African-Americans. State governments at every level often failed to honor court decisions, while civil rights workers were subjected to mob violence that even included law enforcement officers.

Finally, on Aug. 28, 1963, the largest civil rights protest in American history occurred when 250,000 people gathered on the National Mall in Washington, D.C., to begin “The March on Washington for Jobs and Freedom” (later known simply as “The March”). The highlight of the day was a 15-minute closing speech by the Rev. Martin Luther King Jr.

In it, King offered his version of the American Dream, drawing on the Declaration of Independence, the Constitution and the Emancipation Proclamation. It was quickly to become known as the “I Have a Dream” speech. The Dream included a hope that people “will not be judged by the color of their skin, but by the content of their character,” that black and white children could “join hands … and walk together as sisters and brothers,” and that “the sons of former slaves and the sons of former slave owners will be able to sit down together at the table of brotherhood.”

MLK cautioned the nation there was still a long way to go. He cited the broken promises Americans made after the Civil War. Slavery was gone, but vicious racism still existed in general society. He said bluntly, “One hundred years later, the life of the Negro is still sadly crippled by the manacles of segregation and the chains of discrimination.” King also highlighted the curse of widespread poverty during an era of postwar prosperity. He closed by exhorting white Americans to strive for a realization of the cherished phrase “Let Freedom Ring” and join in the old Negro spiritual that proclaimed “Free at last! Free at last! Great God a-mighty, we are free at last!”

The March on Washington was pivotal in the passage of the Civil Rights Act of 1964, which outlawed segregation in public accommodations and began dismantling Jim Crow practices.

On Nov. 22, 1963, Lyndon Baines Johnson became the eighth vice president to assume the nation’s highest office following the death of a president – and he was better prepared to take command than any of his predecessors. Despite the personal disrespect and abuse from the elitist Kennedy crowd (especially RFK), LBJ was a master politician, having served 12 years as a representative and 12 years as a senator. As Majority Leader, he was truly “Master of the Senate,” as biographer Robert Caro has written so carefully.

The new president stayed in the background as the nation grappled with the enormous grief that engulfed Kennedy’s family and the people during funeral services for the slain president. Then, five days after the assassination – on the day before Thanksgiving – Johnson addressed a special joint session of Congress. He challenged them to honor Kennedy’s memory by carrying forward the dead president’s New Frontier program, saying “Let us continue.”

He asked for early passage of a new civil rights bill “to eliminate from the nation every trace of discrimination and oppression that is based on color or race.” Congress responded by passing every law LBJ pleaded for. Yet, over 50 years later, our streets are filled with a new generation, with new demands, as Congress is deadlocked once again. So, no new laws…

Perhaps Cassius was right after all: “The fault, dear Brutus, is not in our stars, but in ourselves.” (William Shakespeare’s Julius Caesar).

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is the retired president and CEO of Frito-Lay International and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

As Sir Isaac Newton Noted, Vast Oceans of Truth Lie Undiscovered

Sir Isaac Newton’s autograph was among a group of three signatures by famous scientists that sold for $4,750 at a January 2017 Heritage auction.

By Jim O’Neal

Charles Eliot was president of Harvard from 1869 to 1909, taking charge at the surprisingly young age of 35. He made some surprising statements, too, starting with his inauguration speech, in which he matter-of-factly observed that the world knew very little about the natural mental capacities of women. Further, he plainly stated that the university was not the place to experiment with that notion, and that women would not be admitted to the regular college. He was also concerned about women living near the university and the obvious implications of their proximity to male students. He believed that once society resolved its issues of inequality, the question might become clearer. Yet even long after his retirement, he maintained his doubts, since women were “simply physically too fragile.”

Another insight into his perspective came when the school’s baseball team had a winning season. When he learned that one of the factors contributing to this success was the use of the curveball, he opined that this was a skill surely unethical and certainly not appropriate for Harvard players.

Fortunately, this was not a systemwide ethos, and he may have been unaware that one of his professors, Edward Charles Pickering (director of the Harvard College Observatory), had fired his entire staff of men over their apparent inability to keep up with all the data the observatory routinely generated. Instead, he simply hired his maid/housekeeper to handle the numbers, eventually employing 80-plus women who became better known as the Harvard Computers.

One of these women was a little-known Radcliffe College graduate named Henrietta Swan Leavitt, who was “allowed” to measure the brightness of stars using the observatory’s photographic plates (women were not allowed to actually operate the telescopes). Leavitt devised a novel way to measure how far away certain stars were, turning them into the “standard candles” still in common use today. Another of the computers, Annie Jump Cannon, created a new system of stellar classification. Together, their work would prove invaluable in answering two critical questions about the universe: How old is it, and how big?

The man who came up with the answers using their work was lawyer-turned-astronomer Edwin Powell Hubble. He was born in 1889 and lived until 1953. When he sat down to peer through the Mount Wilson (Calif.) Observatory’s 100-inch Hooker telescope (completed in 1917 and the world’s largest until 1949), there was exactly one known galaxy: our lonely little Milky Way. Hubble not only proved the universe consisted of additional galaxies, but also that it was still expanding. How much credit the Harvard Computers deserve is still an area of contention, though the debate is only over the degree to which their work contributed to these new discoveries.

Hubble was a handsome star athlete who won seven high school track events in one day and was also a skilled boxer. He never won a Nobel Prize, but he earned everlasting fame when NASA named its long-overdue space telescope in honor of his scientific contributions. The Hubble Space Telescope was carried into orbit on April 24, 1990, by the Space Shuttle Discovery. Since then, it has been repaired and upgraded five times by American astronauts, and the results have been nothing short of remarkable. NASA is now confident that Hubble’s replacement, due in four to five years, will be able to look far enough back into deep space to see nearly to the big bang that started everything 13.8 billion years ago.

For perspective, consider what has been learned since Sir Isaac Newton was born on Christmas Day in 1642, when the accepted theory was a heliocentric model of the universe, with Earth and the other planets orbiting our Sun – similar to what Copernicus had published at the end of his life in 1543. Since then, the great scientific minds focused on our galaxy, trying to prove the laws of motion, the theory of light and the effects of gravity in what they believed was the entire universe – all big, important concepts, but only as they relate to our infinitesimal little piece of real estate. And then along came quantum mechanics, which added the world of the very small, with its atoms, electrons, neutrons and other, smaller pieces too numerous to name as we bang particles into each other to see what flies off.

What Hubble has gradually exposed is that we have been gazing at our navel while the sheer magnitude of what lies “out there” has grown exponentially – and there may be no end, as the expansion of the universe continues to speed up. I seem to recall that some wild forecasters once thought there might be as many as 140 billion galaxies in the universe. Now, thanks to Hubble and lots of very smart people, the number of galaxies may be 2 trillion! If each averages 100 billion stars, that means the number of stars is now 200 sextillion – the number 2 followed by 23 zeros.

That is a big number, but what if we are living in a multiverse with as many as 11 more universes?

I recall a quote from Sir Isaac Newton: “To myself I am only a child playing on the beach, while vast oceans of truth lie undiscovered before me.”

Aren’t we all.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is the retired president and CEO of Frito-Lay International and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

‘Your Show of Shows’ a Classic Reminder of Television’s Potential

A Sid Caesar and Imogene Coca signed photo from Your Show of Shows went to auction in February 2003.

By Jim O’Neal

In the fall of 1949, inside a room on the 23rd floor of NBC’s Manhattan headquarters, an ensemble of comedy writers was preparing to give Americans a reason to stay home on Saturday nights, glued to their 10-inch televisions. Soon, their efforts would spread fear down Broadway, just as Hollywood was beginning to worry about television’s effect on box-office receipts.

But this seemed different. The sophisticated, rowdy and mainly 20-something staff was about to use the relatively new medium of television to create stay-at-home laughter that surpassed everything that came before. From its premiere on Feb. 25, 1950, the manic energy of Your Show of Shows, starring Sid Caesar, Imogene Coca, Carl Reiner and Howard Morris, turned Saturday nights into a showcase for pure comedic genius.

Aside from launching legendary careers for Woody Allen, Mel Brooks and Larry Gelbart, the 90-minute revue provided a beacon of hope and direction to a medium that needed both.

It had started when NBC’s television programming chief, Pat Weaver, pitched Your Show of Shows in 1949 to Max Liebman, a veteran producer. Since television’s debut in 1946, networks attracted advertisers by allowing sponsors to buy entire timeslots and produce their own shows. A prime example was the Texaco Star Theater featuring Milton Berle.

Weaver had an entirely different concept. His network would air its own shows and sell “spots” of airtime to multiple companies. Liebman had been the first to pair Caesar and Coca when he directed a sponsor-driven program, the Admiral Broadway Revue, and he agreed to produce this new NBC-owned show. He quickly decided to reunite Caesar and Coca and then form the writing team around them.

A dream team as it turned out. Sid Caesar described it best: “This writing staff was pure magic. We were all a little bit crazy, but it somehow produced terrific material.” All that talent converged in the smoke-filled “writers’ room.” It was where the 21-year-old Mel Brooks would punctuate his chronic lateness by screaming, “Lindy has landed!” – much to the open anger of the demanding Sid Caesar.

Lucille Kallen, the lone female writer, is quoted as saying the team literally lived Your Show of Shows, working seven days a week, 39 weeks a year from that office … and loving every minute.

In what would later be known as television’s Golden Age, the incomparable staff created a gallery of memorable sketches, brought to life by Caesar, Coca, Carl Reiner and Howard Morris. There were movie parodies like From Here to Obscurity, Mel Brooks’ 2000 Year Old Man, and the extraordinary chemistry between Coca and Caesar, displayed in the saga of Doris and Charlie Hickenlooper’s floundering marriage. And, of course, Caesar’s portrayal of the “Professor.”

The show was so popular that Broadway movie and theater owners, after experiencing a dramatic box-office decline, pleaded with NBC executives to move the TV show to midweek. Even critics loved it. The notoriously harsh Larry Wolters of the Chicago Tribune wrote, “Sid Caesar doesn’t steal jokes, he doesn’t borrow ideas or material. A gag is as useless to him as a fresh situation to Milton Berle.” Alfred Hitchcock said, “The young Mr. Caesar best approaches the great Chaplin of the early years.”

What effect did the show, which ran more than four years, have on television? Consider this: When it debuted in 1950, there were 4 million sets in American households. When the final curtain fell on the Hickenloopers and company, over half of the nation’s 48 million homes had a television.

Coincidence? Perhaps, but it’s also a reminder of what television at its very best could be … before the “vast wasteland” encroached.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is the retired president and CEO of Frito-Lay International and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

National Debt on Automatic Pilot to More Growth

A letter by President George W. Bush, signed and dated July 4, 2001, sold for $16,730 at an April 2007 Heritage auction.

By Jim O’Neal

In May 2001 – just 126 days after President George W. Bush took office – Congress passed his massive tax proposal. The Bush tax cuts had been reduced to $1.3 trillion from the $1.65 trillion submitted, but it was still a significant achievement from any historical perspective. It had taken Ronald Reagan two months longer to win approval of his tax cut and that was 20 years earlier.

Bush was characteristically enthusiastic, but the victory had come with a serious loss of political capital. Senator James Jeffords, a moderate from Vermont, announced his withdrawal from the Republican Party, tipping control of the Senate to the Democrats – the first time in history that had occurred as the result of a senator switching parties. In this instance, the switch was from Republican to Independent, but since Jeffords caucused with the Democrats, the practical effect was the same. Several months later (after the terrorist attacks on the World Trade Center and the Pentagon), there was a loud chorus of calls to reverse the tax cuts to pay for higher anticipated spending.

Bush had a counter-proposal: Cut taxes even more!

Fiscal conservatives worried about the usual increase in the size and power of the federal government, lamenting that such growth was the constant companion of hot wars. James Madison’s warning that “a crisis is the rallying cry of the tyrant” was cited against a centralization that would foster liberal ideas about the role of government and even more dependency on the federal system.

Ex-President Bill Clinton chimed in to say that he regretted not using the budget surplus (really only a forecast) to pay off the Social Security trust fund deficit. Neither he nor his former vice president had dispelled the myth of a “lock box” or explained the federal building in West Virginia built exclusively to hold the government’s IOUs to Social Security. In reality, they were simply worthless pieces of scrip, stored in unlocked filing cabinets. The only change that had ever occurred with Social Security funds was whether they were included in a “unified budget” or not; they had never been kept separate from the other revenues the federal government received.

But this was Washington, D.C., where, short of a revolution or civil war, change comes in small increments. Past differences, like family arguments, linger in the air like dust descending from the attic. All of the huge projected surpluses disappeared with a simple change in the forecast and have never been discussed since.

Back at the Treasury Department on 15th Street, a statue of Alexander Hamilton commemorates the nation’s first Treasury Secretary, a fitting honor for the man who created our fiscal foundation. On the other side of the building stands Albert Gallatin, President Thomas Jefferson’s Treasury Secretary, who struggled to pay off Hamilton’s debts and shrink the bloated bureaucracy he had built.

Hamilton also fared better than his onetime friend and foe, James Madison. The “Father of the Constitution” had no statue, no monument, no lasting tribute until 1981, when the new wing of the Library of Congress was named for him. It was a drought matched only by John Adams, the Revolutionary-era hero and ardent nationalist. Only after David McCullough’s laudatory 2001 biography did Congress commission a memorial to the nation’s second president.

Since the Bush tax cut and the new forecast, the national debt has ballooned to $20 trillion as 9/11, the wars in Iraq and Afghanistan, and the 2008 financial meltdown produced a steady stream of budget deficits in both the Bush and Barack Obama administrations. The Donald Trump administration is poised to approve tax reform, amid arguments over its stimulative effect on the economy and who will benefit. In typical Washington fashion, there is no discussion of the fact that the national debt is inexorably on automatic pilot to $25 trillion, irrespective of tax reform. But this is Washington, where your money (and all they can borrow) is spent almost without effort.

“Just charge it.”

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is the retired president and CEO of Frito-Lay International and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

News Reporting Has Come a Long Way, but Kinks Remain

A signed sketch of Edward R. Murrow by artist Johnny Raitt is among six sketches of famous newscasters that went to auction in July 2010.

By Jim O’Neal

In the past year, several prominent newspapers and TV networks have corrected or retracted provocative political stories that were factually wrong. Critics are prone to blame the insatiable appetite to feed the 24/7 news-cycle beast and, increasingly, news organizations’ rush to be first. This has been compounded by the steady transition from costly field correspondents to much less expensive panelists sitting around a table in the TV studio offering personal opinions.

Most of these discussions start with “I think” or “In my opinion,” which by definition blurs fact with subjective comment. Unbiased, factual reporting gets mixed into a lethal cocktail that blurs reality and has inexorably led to an environment where charges of “fake news” are routine. Social media further distort issues and reality, and people can now easily shop for whatever “facts” on TV or the internet support their opinions. However, the “need for speed” is not a recent phenomenon.

Triggered by the oldest of journalism’s preoccupations – the desire to be first with a dramatic story – Edward R. Murrow, William L. Shirer and their network, the Columbia Broadcasting System (CBS), made broadcast journalism history on March 13, 1938. What set the stage was CBS founder and CEO William S. Paley’s realization that his radio network had just been soundly beaten again by the National Broadcasting Company (NBC) and their reporter “Ubiquitous Max” Jordan, with his eyewitness account of Austria’s fall.

Worse, the fault was Paley’s. Until Jordan’s story and its effect on America, Paley had supported news director Paul White’s decision not to use network employees for hard-news reporting. To their increasing chagrin, men like Murrow and Shirer were forced to cover truly soft stories like concerts instead of Adolf Hitler and the Third Reich’s actions and intentions. Paley had had enough. He asked White to call Shirer and tell him, “We want a European roundup tonight.” The broadcast would cover the European reaction to the Nazis’ Austrian takeover. The players would include Shirer in London with a member of Parliament; Murrow in Vienna; and American newspaper correspondents in Paris, Berlin and Rome.

They had eight hours to put together what had never been done before. As Stanley Cloud and Lynne Olson describe in their book The Murrow Boys: “Never mind that it was five o’clock, London time, on a Sunday afternoon, which meant that all offices were closed and that all technicians and correspondents and members of Parliament they would need were out of town, off in the country or otherwise unreachable. Never mind the seemingly insuperable technical problems of arranging the lines and transmitters, of ensuring the necessity of split-second timing. Never mind any of that. That was what being a foreign correspondent was all about. It was part of the code of the brotherhood. When the bastards asked if you could do something impossible, the only acceptable answer was yes. Shirer reached for the phone and called Murrow in Vienna.”

Beginning at 8 p.m., with announcer Robert Trout’s words, “We take you now to London,” Murrow, Shirer and their comrades proved radio was not only able to report news as it occurred but also able to put it into context, to link it with news from elsewhere – and do it with unprecedented speed and immediacy. They set in motion with that 30-minute broadcast in March 1938 a chain of events that would lead, in only one year, to radio’s emergence as America’s chief news medium and to the beginning of CBS’s decades-long dominance of broadcast journalism. The broadcasts by Murrow and his team during the London blitz and over the entire course of the war set the standard for broadcast reporting style and eloquence.

We have come a long way since then, but it’s not clear to me if we’ve made any progress.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is the retired president and CEO of Frito-Lay International and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].