United States has been tested before, but history shows we shall not perish

A newly discovered 1823 Stone printing of the Declaration of Independence sold for $597,500 at a 2012 Heritage auction.

By Jim O’Neal

The 59th presidential election in 2020 was unusual in several respects. Despite the complications of a lethal pandemic, voter turnout was remarkable, with more than 159 million votes cast – about 66% of eligible voters. In recent decades, turnout of 50% to 60% has been considered normal, a major slump from the 73% of 1900. Joe Biden’s 51.3% was the highest share for a challenger unseating an incumbent since 1932 – the first of FDR’s four victories. Both candidates snared over 74 million votes, topping Barack Obama’s record of 69.5 million in 2008. (Biden’s 81 million is the most any presidential candidate has ever received.)

However, it is not unusual for an incumbent president to lose a bid for re-election. Ten presidents before President Trump suffered a similar fate, starting with John Adams in 1800 – the first election in which political parties played a decisive role, and one in which Adams’ own vice president, Thomas Jefferson, defeated him.

Another unusual facet of 2020 was the delay in getting all the votes counted and then certified, along with an unprecedented number of legal actions asserting irregularities or voter fraud. Post-election polls indicate that a high percentage of Republican voters still believe that their candidate won. This is unfortunate since the United States has a long, impeccable reputation for smooth, peaceful transfers of power.

By contrast, throughout recorded history – at least from ancient Rome to modern Britain – all great empires maintained their dominance with force of arms and raw political power. Then the United States became a global powerhouse and the first to dominate through the creation of wealth. It is a truly remarkable story, liberally sprinkled with adversity, financial panics, a horrendous Civil War and the Great Depression, all without an owner’s manual or quick-fix guide. However, in 1782, Congress adopted E PLURIBUS UNUM (“Out of many, one”) as the motto on the Great Seal, which, in combination with a culture of “can do,” bound us together and crowded out the skeptics and naysayers. In 1956, “In God We Trust” was adopted as the official national motto, just in case we needed a little divine help occasionally.

In the beginning, it was the land.

After Columbus stumbled into the New World while trying to reach Asia by sailing west, Europeans were eager to fund expeditions to this unknown land. Spain was aggressive and hit the lottery, first in Mexico and then in Peru. Portugal found a veritable gold mine by growing sugar in Brazil using slave labor. Even the French developed a remarkable trading empire deep in America, built on fur trading with American Indians in the Great Lakes region, and staked claims to broad sections of land. England was the exception, primarily because it was more focused on opportunities for colonization. The east coast of America had been generally ignored (too hot, too cold, no gold) until Sir Walter Raleigh tried (twice) to establish a viable colony in present-day North Carolina. The colony vanished, leaving only a word carved on a tree: Croatoan.

However, the English were still highly motivated to colonize by basic economic pressures. The population had grown from 3 million in 1500 to 6 million in 1650, without a corresponding increase in jobs. Hordes of starving people naturally gravitated to the large cities and the seaside. The largest was London, which swelled to 350,000 people by 1650. To exacerbate the situation, the influx of gold and silver into Europe spiked inflation, making a difficult situation unsustainable. In the 16th century, prices rose a staggering 400%.

But we are living proof that England’s colonization of the Atlantic coast finally succeeded in the 17th century. Colonial America grew and prospered as 13 colonies evolved into a quasi-nation on the verge of even greater accomplishments. By 1775, however, the greed of King George III had become too much to tolerate, and the colonists declared their independence from Great Britain. The American Revolutionary War lasted eight years (1775-83) and the United States of America was established … the first modern constitutional liberal democracy. Losing the war and the colonies both shocked and surprised Great Britain (and many others), and even today historians debate whether the outcome was “almost a miracle” or whether the odds favored the Americans from the start.

Then we began to expand across the vast unknown continent through a series of bold moves. President Jefferson doubled the size of the nation in 1803 with the remarkable “Louisiana Purchase.” After the annexation of the Texas Republic in 1845, President Polk engineered a war with Mexico that concluded quickly with the United States taking control of most of the Southwest. The discovery of gold in the San Francisco area attracted people from all over the world. Despite all of this, not enough has been written about the strategic era just after the end of the Revolutionary War.

The Treaty of Paris, signed on September 3, 1783, did far more than formalize the peace and recognize the new United States of America. Great Britain also ceded (despite the objections of France) all the land that comprised the immense Northwest Territory. This was a veritable wilderness northwest of the Ohio River totaling 265,878 square miles – similar to the existing size of America – containing the future states of Ohio, Indiana, Illinois, Michigan and Wisconsin, plus part of Minnesota. With this and, later, the Louisiana lands, the United States was several times its original size! In addition, the Northwest Ordinance of 1787 included three astounding conditions: 1. freedom of religion, 2. the encouragement of universal education and, importantly, 3. the prohibition of slavery.

Also consider that until that point, the United States did not technically own a single acre of land! Now we had an unsettled empire, double the size, north and west of the Ohio River, larger than all of France, with access to four of the five Great Lakes. And then there was the Ohio River itself, a great natural highway west!

This, my friends, is how you build a powerful nation: populate it with talent from all over the world, encourage innovation never seen before, and then trust the people to do the rest. Whenever we are temporarily distracted, have faith that this nation has been tested before and that government of the people, by the people, for the people shall not perish from the earth. As Aesop’s fables remind us … United we stand. Divided we fall.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

What would you do if you saw your obituary?

Francis H.C. Crick’s Nobel Prize Medal and Nobel Diploma, awarded in 1962 for his work related to DNA molecules, sold for $2.27 million at an April 2013 Heritage auction.

By Jim O’Neal

In 1888, a French newspaper published Alfred Nobel’s obituary with the following title: “Le marchand de la mort est mort” or “The merchant of death is dead.”

In reality, it was his brother Ludvig who had died, but Alfred was appalled that this kind of sendoff could tarnish his own legacy. One presumes that the only error was the mix-up in names, since the sobriquet seemed apt given Alfred’s contributions to the effectiveness of substances that resulted in death.

In a complicated maneuver, the inventor of dynamite attempted to rectify future obits by bequeathing the majority of his estate (94 percent) to the establishment of the Nobel Prizes, designed to cleanse his reputation of all the deaths resulting from his explosive products. The effort came too late to spare him controversy in life: He was accused of treason against France for selling Ballistite (a smokeless propellant composed of two explosives) to Italy, and the French forced him to leave Paris. He moved to Sanremo, Italy, where he died in 1896. There were five Nobel categories, with an emphasis on “peace” … for obvious reasons.

A native of Stockholm, Nobel made a fortune when he invented dynamite in 1867 as a more reliable alternative to nitroglycerin. As a chemist and engineer, he basically revolutionized the field of explosives; some accounts credit him with 355 patents. In 1895, a year before his death, he signed the final version of his will, which established the organization that would bear his name and “present prizes to those who, during the preceding year, shall have conferred the greatest benefit to mankind.”

Nobel’s family contested the will and the first prizes were not handed out until 1901. Among the first winners were German physicist Wilhelm Conrad Röntgen, who discovered X-rays, and German microbiologist Emil Adolf von Behring, who developed a treatment for diphtheria. The Nobel Prizes were soon recognized as the most prestigious in the world. Except for war-related interruptions, prizes have been awarded virtually every year. The category of economics was added in 1969.

The first American to receive a Nobel was President Theodore Roosevelt, who garnered the Peace Prize in 1906 after he helped mediate an end to the Russo-Japanese War. The German-born American scientist Albert Michelson claimed the physics prize the next year. However, the peace and literature prizes would become the most familiar to Americans and are among the most controversial. Critics voiced concerns over Roosevelt, Woodrow Wilson (1919), George Marshall (1953) and Secretary of State Henry Kissinger (1973). More recently, winners have included Al Gore (2007), after making an Oscar-winning documentary on climate change, and Barack Obama (2009) “for his extraordinary efforts to strengthen international diplomacy and cooperation between peoples.” (For more, see Obama’s Wars by Bob Woodward.)

William Faulkner, Ernest Hemingway, John Steinbeck and Toni Morrison generally have escaped criticism, as have multiple winners like Marie Curie (the first female laureate, in 1903, and a winner in two separate categories) and Linus Pauling, among others. The Red Cross has snagged three. From a personal standpoint, the most obvious non-winner is Mahatma Gandhi, or as someone quipped, “Gandhi can do without a Nobel Prize, but can the Nobel Committee do without Gandhi?”

I think not.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Yes, Presidential Elections Have Consequences

Chief Justice of the Supreme Court John Marshall is featured on this Fr. 375 Serial Number One $20 1891 Treasury Note, which sold for $114,000 at an April 2018 Heritage auction.

By Jim O’Neal

In theory, there is no mystery or debate regarding the intention of the Founding Fathers in the selection of members to serve on the Supreme Court.

The Constitution crisply explains, in the second paragraph of Article II, Section 2, that the president shall nominate, and by and with the advice and consent of the Senate, shall appoint judges of the Supreme Court. This provision means exactly what it says and has not been altered by any amendment since its adoption. That includes the Senate’s power, by simple majority vote, to grant such consent, to reject the nominee, or to refuse to act on the nomination at all.

One idea discussed, but not acted upon, was Benjamin Franklin’s explanation of the Scottish mode of appointment “in which the nomination proceeded from the lawyers, who always selected the ablest of the profession in order to get rid of him, and share his practice among themselves” – a uniquely clever way to eliminate superior competition.

What has changed is the use of the “nuclear option,” which allows cloture – the motion that ends a filibuster – to be invoked by a simple majority. Senate Majority Leader Harry Reid used it to great effect in 2013 for executive-branch and lower-court nominations while the Democrats were in the majority. Republicans expanded it to cover Supreme Court nominees in 2017, after they had regained the majority. Neil Gorsuch was confirmed to the Supreme Court under this new rule with a 54-45 Senate vote, picking up three anxious Democratic votes in the process. It’s widely assumed that current nominee Judge Brett Kavanaugh will be confirmed by a similar path, since his opponents appear helpless to stop him.

As President Obama once explained, in not too subtle fashion, “Elections have consequences.”

It now seems clear that the Founding Fathers did not foresee that political parties would gradually increase their influence, or that partisan considerations in the Senate would become more prominent than experience, wisdom and merit. This was magnified in the current effort to stymie a nomination, when the opposition announced it would oppose any candidate the Chief Executive chose. Period. That may not be a reasonable reading of the Constitution’s text, but it has gradually become routine and will only get worse (if that’s still possible).

It may astonish some to learn that no legal or constitutional requirements for a federal judgeship exist. President Roosevelt appointed James F. Byrnes as an associate justice in 1941, and Byrnes’ admission to practice had come by “reading law” – a now-obsolete custom that preceded the modern institutions specializing exclusively in law; Byrnes was the last justice to benefit from it. In his case, it’s not clear that he even had a high school diploma. But he was a governor and a member of Congress. He resigned 15 months later (the second-shortest tenure) to become head of the Office of Economic Stabilization. A trusted FDR advisor, he was the man many assumed would replace Vice President Henry Wallace as FDR’s running mate in 1944. That honor went instead to the little-known, high-school-educated Harry Truman, who would assume the presidency the following year when FDR died suddenly.

Thomas Jefferson never dreamed the Supreme Court would become more than a necessary evil to help balance the government in minor legal proceedings, and he would be more than astonished that it is now the final arbiter of what is or isn’t constitutional. The idea that six judges (who didn’t even have a dedicated building) would be considered equal to the president and Congress would have been anathema to him.

However, that was before ex-Secretary of State John Marshall became Chief Justice of the Supreme Court and, with his ruling in Marbury v. Madison in 1803, started the court’s long journey to final arbiter of the Constitution. There was a new sheriff in town, and the next 40 years witnessed the transformation of the court to the pinnacle of legal power. The justices even have their own building, thanks to William Howard Taft, who championed it as Chief Justice but died five years before it was completed in 1935. Someday, Netflix will persuade them to livestream their public sessions for all of us to watch, although I personally prefer C-SPAN, to eliminate the mindless talking heads that pollute cable television.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

National Debt on Automatic Pilot to More Growth

A letter by President George W. Bush, signed and dated July 4, 2001, sold for $16,730 at an April 2007 Heritage auction.

By Jim O’Neal

In May 2001 – just 126 days after President George W. Bush took office – Congress passed his massive tax proposal. The Bush tax cuts had been reduced to $1.3 trillion from the $1.65 trillion submitted, but it was still a significant achievement from any historical perspective. It had taken Ronald Reagan two months longer to win approval of his tax cut and that was 20 years earlier.

George W. Bush

Bush was characteristically enthusiastic about this, but it had come with a serious loss of political capital. Senator James Jeffords, a moderate from Vermont, announced his withdrawal from the Republican Party, tipping control of the Senate to the Democrats – the first time in history that had occurred as the result of a senator switching parties. In this instance, the switch was from Republican to Independent, but the practical effect was the same. Several months later (after the terrorist attacks on the World Trade Center and the Pentagon), there was a loud chorus of calls to reverse the tax cuts to pay for higher anticipated spending.

Bush had a counter-proposal: Cut taxes even more!

Fiscal conservatives worried that there would be the normal increase in the size and power of the federal government, lamenting that such growth was a constant, instinctive companion of hot wars. James Madison’s warning that “a crisis is the rallying cry of the tyrant” was cited against a centralization that would foster liberal ideas about the role of government and even more dependency on the federal system.

Ex-President Bill Clinton chimed in to say that he regretted not using the budget surplus (really only a forecast) to shore up the Social Security trust fund. Neither he nor his former vice president had dispelled the myth of a “lock box” or explained the federal building in West Virginia that had been built exclusively to hold the government’s IOUs to Social Security. In reality, they were simply worthless pieces of scrip, stored in unlocked filing cabinets. The only change that ever occurred with Social Security funds was whether they were included in a “unified budget” or not. They had never been kept separate from other revenues the federal government received.

But this was Washington, D.C., where, short of a revolution or civil war, change comes in small increments. Past differences, like family arguments, linger in the air like the dust that descends from the attic. All of the huge surpluses totally disappeared with the simple change in the forecast and have never been discussed since.

Back at the Treasury Department on 15th Street, a statue of Alexander Hamilton commemorates the nation’s first Treasury Secretary, a fitting honor for the man who created our fiscal foundation. On the other side of the building stands Albert Gallatin, President Thomas Jefferson’s Treasury Secretary, who struggled to pay off Hamilton’s debts and shrink the bloated bureaucracy he had built.

Hamilton also fared better than his onetime friend and foe, James Madison. The “Father of the Constitution” had no statue, no monument, no lasting tribute until 1981, when the new wing of the Library of Congress was named for him. That drought was matched only by John Adams, the Revolutionary-era patriot and ardent nationalist. It was only after David McCullough’s laudatory 2001 biography that Congress commissioned a memorial to the nation’s second president.

Since the Bush tax cut and the new forecast, the national debt has ballooned to $20 trillion as 9/11, the wars in Iraq and Afghanistan, and the 2008 financial meltdown produced a steady stream of budget deficits through both the Bush and Barack Obama administrations. The Donald Trump administration is poised to approve tax reform amid arguments over its stimulative effect on the economy and over who will benefit. In typical Washington fashion, there is no discussion of the fact that the national debt is inexorably on automatic pilot to $25 trillion, irrespective of tax reform. But this is Washington, where your money (and all they can borrow) is spent with almost no effort.

“Just charge it.”

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Harvard-Educated Adams Cracked Down on Non-Citizens, Free Speech

An 1805-dated oil on canvas portrait of John Adams, attributed to William Dunlap, sold for $35,000 at a May 2017 Heritage auction.

By Jim O’Neal

When Barack Obama was sworn in on Jan. 20, 2009, he became the eighth president to have graduated from Harvard, which has educated more U.S. presidents than any other university. Yale is second with five, with George W. Bush counting for both Yale and Harvard (where he earned an MBA).

The first of the “Harvard Presidents” goes all the way back to 1796, when John Adams narrowly defeated Thomas Jefferson 71 to 68 in the electoral vote count. It was the only election in history in which a president and a vice president were elected from opposing parties.

However, Jefferson bounced back four years later in a bitter campaign characterized by malicious personal attacks. Alexander Hamilton played a pivotal role in sabotaging President Adams’ attempt to win a second term by publishing a pamphlet that charged Adams was “emotionally unstable, given to impulsive decisions, unable to co-exist with his closest advisers, and was generally unfit to be president.”

When all the votes were counted in 1800, Adams actually finished third behind both Jefferson and Aaron Burr (who eventually became vice president). John and Abigail Adams took the loss hard, and it estranged them from Jefferson for more than 20 years. Adams departed the White House before dawn on Inauguration Day, skipping the inauguration ceremony entirely and heading home to Massachusetts. The two men ultimately reconciled near the end of their lives (both died on July 4, 1826).

Adams had come to the presidency as an experienced executive-office politician, having served eight years as George Washington’s vice president. However, his four years as president were controversial. The trouble started when the Federalist-dominated Congress passed four bills, collectively called the Alien and Sedition Acts, which President Adams signed into law in 1798. The Naturalization Act made it harder for immigrants to become citizens; the Alien Friends Act allowed the president to imprison and deport non-citizens deemed dangerous; the Alien Enemies Act did the same for natives of hostile nations; and, finally, the Sedition Act made it a crime to make false statements critical of the federal government.

Collectively, these bills invested President Adams with sweeping authority to deport resident non-citizens he considered dangerous, and they criminalized free speech, forbidding anyone to “write, print, utter or publish … any false, scandalous and malicious writing or writings against the government of the United States … or either house of the Congress of the United States … with intent to defame … or bring them into contempt or disrepute … or to excite against them, or either of them … the hatred of the good people of the United States.”

Editors were arrested and tried for publishing pieces the Adams administration deemed seditious. Editors were not the only targets. Matthew Lyon, a Vermont Congressman, was charged with sedition for a letter he wrote to the Vermont Journal denouncing Adams’ power grab. After he was indicted, tried and convicted, Lyon was sentenced to four months in prison and fined $1,000.

For Vice President Jefferson, the Alien and Sedition Acts were a cause of despair and wonderment. “What person, who remembers the times we have seen, could believe that within such a short time, not only the spirit of liberty, but the common principles of passive obedience would be trampled on and violated.” He suspected that Adams was conspiring to establish monarchy again.

It would not be the last time Americans would sacrifice civil liberties for the sake of national security. More on this later.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].