Were early presidents too jaded to solve divisive issue of slavery?

An Abraham Lincoln and Stephen Douglas pocket mirror issued to commemorate the 50th anniversary of the Lincoln-Douglas debates went to auction in 2015.

By Jim O’Neal 

The U.S. Constitution is generally considered the most revered document in world history. John Adams described it as “the greatest single effort of national deliberation that the world had ever seen.” It was a seminal event in the history of human liberty. Yet while the founding documents contained remarkable concepts — “All men are created equal” … “endowed with certain unalienable rights” … “consent of the governed” (instead of the will of the majority) — the founders proved incapable of reconciling the practice of slavery with these lofty ambitions.
 
In order to gain consensus, they deftly employed what has become known colloquially as “kicking the can down the road.” The Constitution allowed Congress to ban the international slave trade, but not for 20 years (until 1808), and it left slavery itself untouched. The assumption (hope) was that slavery would naturally phase out without the need for formal legislation. Then there was the obvious contradiction between men being born equal and slavery being allowed to continue. The explanation was a tortured rationale that “equal” meant equal under the law, not racial equality.
 
We now know that rather than phasing out, slavery flourished as Southern agrarian economies became even more dependent on slave labor, and geographic expansion raised the stakes. The dispute took on new dimensions as each new state entered the Union: Was it to be free or slave? The answer was up to a divided Congress to decide. In an effort to maintain harmony, Congress negotiated a series of compromises, first in 1820 and again in 1850 and 1854. Rather than continue the battle in Congress, Southern slave states turned to secession from the Union once it was clear they weren’t strong enough to rely on nullification alone.
 
What the Northern states needed desperately was a president with the willpower to keep the Union intact … with or without slavery.
 
His name was Abraham Lincoln, a little-known lawyer from Illinois. Today, most Americans know the major details of the life of the man who would become the 16th president of the United States: his humble upbringing in a pioneer family, his rise from lawyer to state legislator and presidential candidate, his wit and intelligence, his growth as a statesman to become the virtual conscience of the nation during the bloodiest rift in its history. Far fewer are familiar with the decisions and qualities that combined to create the most extraordinary figure in our political history.
 
In 1858, he challenged Illinois Senator Stephen Douglas in his bid for re-election. Although Lincoln lost, he gained national prominence when the two engaged in a series of high-profile debates, primarily over slavery. Lincoln was eloquent in his attacks from a moral standpoint, while Douglas was firm in his belief in states’ rights to decide important issues. Then came the presidential election of 1860, with the country poised for war and the outcome likely to be the determining factor. It was during that hotly contested campaign that the Democratic nominee Douglas would perform an epic act of “Nation over Party.”
 
Two years later, Douglas sensed that Lincoln would win the presidency as Pennsylvania, Ohio and Indiana swung to the Republicans. Douglas famously declared, “Mr. Lincoln is the next president. We must try to save the Union. I will go South!” Despite a valiant speaking tour to dissuade the South, it was too late. During the 16 weeks between Lincoln’s election in 1860 and the March 4, 1861, inauguration, seven states seceded from the Union and formed the Confederate States of America.
 
On June 3, 1861, the first land skirmish of the war occurred in (West) Virginia. Called the Battle of Philippi, it was a Union victory. A minor affair that lasted 20 minutes with few fatalities, it was nevertheless celebrated in the North with fanfare. Ironically, Senator Douglas died that same day at age 48. Seven weeks later, the Civil War exploded at the Battle of Bull Run and would continue for four long, bloody years.
 
One has to wonder if this could have been avoided if our remarkable founders had been more prescient about the slavery issue and ended it with the adoption of the Bill of Rights. Or were those early Virginia presidents Washington, Jefferson, Madison and Monroe too jaded or selfish to make the personal sacrifice?

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Here’s why Dwight D. Eisenhower admired Germany’s autobahn

A baseball signed by President Dwight D. Eisenhower circa 1960 sold for $9,588 at a December 2017 Heritage auction.

By Jim O’Neal

Tempus fugit!

When I think about President Dwight D. Eisenhower, my mind associates him with America in the 1950s. Images of sunny Southern California suburbs, rock music, languid days of surfing, backyard BBQs and my first car (acquired in 1953), a 1932 Ford Victoria four-banger. Long gone are any memories of ducking under a desk, the potential darkness of a nuclear war or concerns that communists were lurking in hidden corners.

It was a time of social confidence, with military men taking advantage of the G.I. Bill by returning to college or starting families in the tract homes that were proliferating. Good factory jobs were plentiful, with auto-assembly plants gradually replacing the shuttered shells of wartime aircraft factories. My posse knew the year and model of every Ford, Chevrolet, Buick or Oldsmobile that went whizzing by. Soon, I was a high school senior working the 3-to-11 shift at a General Motors plant in South Gate, assembling Buicks, Pontiacs and Oldsmobiles. Shoddy quality, but the emphasis was on quantity and the pay was staggering: $2.55 an hour with daily overtime, plus Saturdays. My cup runneth over.

In reality, Dwight David Eisenhower was a 19th-century man. Born in 1890 in Denison, Texas, he moved a year later to Abilene, Kan., and a small, two-story frame house. He recalled the 1896 election, when William McKinley defeated William Jennings Bryan, the golden-throated “Boy Orator of the Platte.” This was the first of Bryan’s three defeats; he and Henry Clay remain the only losing candidates to receive electoral votes in three separate presidential elections.

A good athlete, “Little Ike” yearned to attend the University of Michigan – the home of Coach Fielding Yost and his “point-a-minute” Wolverine football teams. Encouraged to take the service academy exam, he failed to qualify for the vaunted Naval Academy but squeaked into West Point, and he married Mamie Geneva Doud in 1916. Although eager to join the war in Europe, he ended up in San Antonio training with the 57th Infantry, followed by a stint in Gettysburg, Pa., with a crack tank unit. A military legend was gradually taking shape.

When the U.S. Army returned from Europe at the end of WWI, it sponsored the 1919 Transcontinental Motor Convoy. Ike joined some 300 other soldiers driving a group of 81 motorized vehicles from Washington, D.C., to San Francisco. The convoy wound along the Lincoln Highway for 3,251 miles. Because of the rudimentary, haphazard web of paved roads, it took an almost unbelievable 62 days. Eisenhower would long remember the impassable roads and tortoise-like pace, never realizing he would one day have an opportunity to rectify the issue.

For perspective, at the end of the 19th century there was only one motorized vehicle for every 18,000 people (today, we have about 300 million cars and trucks, or almost one per person). Also, the “roads” in 1900 were not asphalt or concrete; too often they were packed dirt or mud, depending on the time of year. Even worse, outside cities and towns there were few gas stations; rest stops were a convenience of the future. In 1910, the Boston Eagle newspaper observed that automobiling was not an easy way to get anywhere … “it is an adventure … the last call of the wild.”

With Henry Ford’s help, that was about to change … dramatically. When Ford introduced the Model T in 1908, Americans finally had a dependable, affordable car. Over the next 20 years, 15 million “Tin Lizzies” rolled off the Ford assembly lines, and, along with the offerings of every other manufacturer, the automobile evolved from a luxury into a necessity. With this transition to a “nation of drivers” came the inevitable question of who would pay for the indirect costs involved. The powerful automobile industry ultimately prevailed, with governments at all levels agreeing to pay for streets, signage, highways, bridges and all the other things we now take for granted. Taxing gasoline was an easy answer, but major infrastructure projects remain difficult to fund even today.

Fortunately, during WWII, Eisenhower was the Supreme Commander of Allied forces in Europe. He witnessed firsthand the genius of the German autobahn, a highly sophisticated and strategic network of highways. The Germans had used it to launch their Blitzkrieg attacks: waves of lightning-fast, motorized armored infantry that quickly subdued most of Europe in a matter of days or weeks. When he became president in 1953, Ike remembered both the fiasco of the transcontinental convoy and the devastation unleashed courtesy of the autobahn.

Voila! In 1954, he announced a plan to build a transcontinental interstate highway system for the United States. Naturally, this was not a new idea; Congress had passed the Federal-Aid Highway Act in 1944, authorizing construction of a 40,000-mile National System of Interstate Highways. The only thing lacking was the funding to pay for it. What President Eisenhower did was cleverly bundle the “critical need for speed” in case of atomic attack on our key cities (a national defense imperative) with a terrific rationale for a highway system that would benefit ordinary citizens: fresh produce from Florida to New England overnight, or year-round fruit and vegetables from California to anywhere.

On June 29, 1956, President Eisenhower signed the bill creating the National System of Interstate and Defense Highways, delivering a major economic stimulus via construction jobs followed by booms in numerous industries: trucking, petroleum, automotive, motels, restaurants. The list was endless. The nation’s interstate program stands as the largest public works project in world history. This time, there was funding, plus vastly improved state and local highways. The complaints would come later, as planners used eminent domain to seize land for roads. Thousands of farms were bifurcated by four-lane highways and scores of cities were leveled or divided, with poor and minority communities destroyed.

Just another chapter in our history, but without the EPA or federal court injunctions.


Another milestone in American history just a few months away

This 1840 Silk Campaign Flag for William Henry Harrison realized $87,500 at a June 2018 Heritage auction.

By Jim O’Neal

Every four years, Americans get an opportunity to choose who will be president of the United States. To vote, people must be citizens, 18 years old and registered. The actual direct voting is done by electors in the Electoral College, generally representing the Republican or Democratic parties. Since 1789, 44 different men have occupied the Oval Office, even though Donald Trump is counted as the 45th president. Grover Cleveland accounts for the difference: He was elected twice, once in 1884 (#22) and again in 1892 (#24), and is the only president to serve two non-consecutive terms.

Of these 44 presidents, only one is African-American and none are women. One … John Quincy Adams … was selected by the House of Representatives after the 1824 election, when no candidate received a majority of electoral votes. In this century, George W. Bush and Donald Trump lost the popular vote but won in the Electoral College; Al Gore and Hillary Clinton placed second. Four presidents died in office of natural causes and four others were assassinated.

The first to die was William Henry Harrison in 1841, after serving only 31 days. John Tyler became the first vice president to assume the presidency without an election. To preclude any constitutional uncertainty, Tyler immediately took the oath of office, moved into the White House and assumed full presidential powers. His political opponents argued (unsuccessfully) that he should be merely “acting president” until a new election was held. One president, Richard Nixon, resigned to avoid near-certain conviction in a Senate trial after the House Judiciary Committee approved three articles of impeachment.

Each time, the nation withstood the shock of an unanticipated change and a safe transition was managed, almost routinely.

It is quite instructive to broadly categorize the men who have served in this office by analyzing their relationship with the people and the development of the nation. There are interesting correlations with the evolving role and power of the chief executive as the Union grew more geographically diverse and kept expanding. The categories are at times arbitrary, since the changes were often contentious, but society has flourished despite political discord. A few examples are all that space allows, and the story keeps getting more complex.

First consider the first five, from George Washington to James Monroe. Four of them (Washington, Thomas Jefferson, James Madison and Monroe) were two-term presidents from Virginia; John Adams of Massachusetts was the exception. Washington was elected unanimously twice, something Monroe nearly matched until a single elector cast a dissenting vote to preserve GW’s record. Monroe served in the “Era of Good Feelings,” a time of harmony never to be replicated. These five presidents are easily labelled “formative” in every sense of the word. There were few precedents to follow and the Constitution was frustratingly vague on specifics.

Washington (1789-97) chose to meet primarily with the upper elite of society (eschewing the common man), even assiduously avoiding shaking hands. He rode in a yellow chariot decorated with gilded cupids and his coat of arms. His executive mansion was staffed with 14 white servants and seven slaves. A different man might easily have assumed the role of king, irrespective of the war for independence. After all, that fight had been against King George III, a greedy British Parliament and taxation without representation. Further, he had been elected by a small group of mature (older) white men, exclusively landowners, who numbered about 6 percent of the total population.

Washington was acutely aware of the precedents he was setting and their historical importance. In 1789, he appeared before the Senate and presented an Indian treaty for approval. When the Senate decided to study it before approval, Washington huffed out after vowing to never appear before Congress again. It was a vow he kept. Similarly, when he refused to comply with a Congressional demand for his papers on the controversial Jay Treaty, he reminded Congress that the Constitution did not require their approval! Thus were the roots of executive privilege established.

When Washington declined a third term in 1796, George III famously declared, “If he does that, he will be the greatest man in the world.” He did, and the precedent stood for 144 years, until Franklin Delano Roosevelt ran for a third term in 1940 (and won). From 1932 – with the Great Depression, the New Deal and the Second World War on the horizon – FDR had come to personify the federal government. To the common man, he epitomized the American landscape totally.

Other vivid examples include Jacksonian Democracy for the common man … the War with Mexico and the Western expansion of Manifest Destiny … Lincoln, his generals and the Civil War … Reconstruction without Lincoln’s wisdom … the Great War machine in the 20th century and the Cold War.

In a few months, we may have a chance to witness another inflection point in American history as a new generation goes to the ballot box. Unlike in Washington’s day, the voters will include women, blacks, Latinos, American Indians and Asians.

I plan to enjoy it.


‘Peace, commerce, honest friendship with all nations … entangling alliances with none!’

This haunting World War I recruitment poster (Boston Public Safety Committee, 1915), featuring art by Fred Spear, sold for $14,400 at a November 2018 Heritage auction.

By Jim O’Neal

World War I officially erupted in Europe on July 28, 1914. The following month, British commentator and author H.G. Wells wrote a series of articles blaming the Central Powers (Austria-Hungary, Germany, Bulgaria and the Ottoman Empire) for starting the war. Wells also argued that eliminating militarism in Germany was essential to avoiding future wars. The articles subsequently appeared in a small book titled The War That Will End War. The book’s title was far too optimistic, but Mr. Wells’ thesis about Germany’s military would prove to be eerily prophetic.

As the war inexorably spread throughout Europe, conventional wisdom dictated that the United States would never become directly involved, thanks to long-standing policies dating to its founding. George Washington’s famous Farewell Address of 1796 had warned us to “steer clear of permanent alliances,” and Thomas Jefferson echoed the sentiment: “Peace, commerce and honest friendship with all nations … entangling alliances with none!”

The Germans were confident America would remain on the sidelines. Their surprisingly broad network of spies in the United States kept reassuring them of the strong sentiment against foreign wars, misinterpreting pacifism as a sign of weakness. It had been only 49 years since the end of hostilities in the Civil War and the ashes were still warm. Further, the American army was small (ranking 17th in the world), had not been involved in any major operations, and lacked the modern equipment of the 20th century.

President Woodrow Wilson had been re-elected in 1916 under the slogan “He Kept Us Out of War,” and the promise of four more years of peace was comforting. It further emboldened the Germans, who became even more provocative by declaring “unrestricted” warfare for their fleet of U-boat submarines in the Atlantic. They pledged to attack any ship, irrespective of cargo or innocent civilians, to buy enough time to starve Great Britain into submission. The sinking of the Lusitania in 1915 had already inflamed American opinion, and the resumption of unrestricted submarine warfare in early 1917 proved to be one step too far.

On April 2, 1917 at precisely 8:30 p.m., President Wilson assembled both Houses of Congress, the Supreme Court and his Cabinet. In a 36-minute speech, he outlined the vicious attacks by Germany on our ships and the innocent lives lost. Finally, he concluded by formally requesting Congress to declare war on Germany (only). The final words were lost or unheard amid the boisterous cheering and flag-waving. Later, back at the White House, he expressed his feelings of wonderment and commented to his aides: “Just stop and think about what they were applauding…” Finally alone, he wept almost silently.

On April 6, Congress declared war on Germany, and by June 25 the first American Expeditionary Forces (AEF) had arrived in France, led by General John J. “Black Jack” Pershing. On July 4, Independence Day, elements of Pershing’s force paraded in Paris. Pershing holds the distinction of being the first living officer promoted to General of the Armies and allowed to select his own insignia; he chose four gold stars to distinguish his rank from generals who wore four silver stars. There is no record of any familial relationship to either of the Pattons.

Throughout the months that followed, fresh units continued to arrive, and World War I ended at the memorable moment of 11 a.m. on the 11th day of the 11th month of 1918. President Wilson was awarded the 1919 Nobel Peace Prize, but was unable to convince the U.S. Senate to approve joining the League of Nations. Absent the United States, there was not much hope of helping Europe avoid another war. It was time to bring the boys home. Among them was a young officer who would rise to prominence as the supreme commander of Allied forces when we returned 20-plus years later for the second round of fighting.

In comparison to the choices of today, I REALLY like Ike!


Madison is certainly the father of the Constitution

This 1809 James Madison commemorative Indian Peace Medal realized $24,000 at an August 2019 Heritage auction.

By Jim O’Neal

It is mildly amusing to listen to members of Congress invoke the Founding Fathers whenever they’re trying to validate a political point or opinion (“That’s exactly what the framers intended when they wrote the Constitution!”). They seem to believe that our Founders held a Constitutional Convention (partially true), quickly hammered out a list of sacred provisions and then had each state ratify them en masse. Naturally, the real story is much more complicated, and constitutional scholars still debate what various provisions mean. Even the Supreme Court struggles to reach consensus on “original intent.”

After the 13 American colonies tired of monarchical rule under King George III and Parliament, they decided to form an independent country. A committee was formed to start the ball rolling with a Declaration of Independence, and Thomas Jefferson was chosen to write the document. Years later (1822), John Adams wrote a letter to Timothy Pickering explaining how Jefferson was selected: “First, he had a reputation for literature, science and a talent of composition. Though silent in Congress, he was prompt, frank, explicit and decisive upon committee and in conversation. He seized my heart and I gave him my vote. When he asked my reasons, I said: ‘You are a Virginian and I am obnoxious and unpopular. Lastly, you can write 10 times better than I can!’”

Following the American Revolutionary War (1775-1783), the 13 original states ratified Articles of Confederation that served as the first Constitution. The primary principle of these articles was to preserve the sovereignty and independence of each individual state. A weak central government was formed, but great care was taken to ensure that it did not have any more power than previously assumed by the British King and Parliament. This issue of maintaining states’ rights would continue to perplex any efforts to federalize.

The states continued struggling under several different forms of Articles, Confederations and Conventions … all with loosely defined laws and regulations. Important issues like foreign policy, taxation, currency and basic commerce were hindered by competing state interests. Even the U.S. Army was under the direction of a Congress that was not well organized. These and other issues greatly worried the Founders, who believed the Union, as it existed in 1786, was in serious danger of breaking apart.

So it is true that we look to the Founding Fathers when we examine the great American experiment in democracy. But the question remains: To whom did they turn for wisdom and guidance? Many found inspiration from Great Britain in the previous century, when the conflict between the King and Parliament escalated into a civil war. The generally Puritan Parliament moved to abolish the monarchy, executed Charles I in 1649 for treason and boldly established England’s first and only republic. Oliver Cromwell became Lord Protector of England, Scotland and Ireland. However, his death in 1658 created a power vacuum that was filled by Charles’ eldest son, restoring the monarchy. So not much was really accomplished, and Britain reverted to a king-plus-Parliament arrangement whose deficiencies persist today.

By 1787, it was crystal clear that major changes were needed in America, and a Constitutional Convention was finally scheduled for May-September 1787 in Philadelphia. It was billed as an effort to revise the league of states, and many state delegates arrived assuming the purpose was to debate and draft improvements. However, powerful voices were determined to forge a strong new national government. Among this group were James Madison and Alexander Hamilton, who intended to create a new government rather than tinker with fixing the existing one.

After a long, hot summer of debate, 39 of the 55 delegates signed the new Constitution, and it was released to the public for debate and state ratification. It immediately hit a snag over the absence of a Bill of Rights. There had been discussions among the delegates over the need for such a bill, but the Convention had rejected it. The lack of a Bill of Rights became a rallying cry for the anti-Federalists until advocates for the Constitution (led by James Madison) agreed to add one in the first session of Congress. Ratified on Dec. 15, 1791, the first 10 amendments, called the Bill of Rights, place sweeping restrictions on the federal government to protect rights and limit powers. Delaware was the first state to ratify the Constitution; Rhode Island was the last.

I am solidly in the camp of those who regard James Madison as the Father of the U.S. Constitution. One need look no further. No other delegate was better prepared for the Federal Convention of 1787, and no one contributed more to shaping the ideas of the document and explaining its meaning. He was a proponent of a consolidated, central republic to replace the loose and dysfunctional alliance under the Articles of Confederation. The Virginia Plan he brought to Philadelphia became the basis for the Convention’s agenda, and his insistence on clearly establishing the sovereignty of the national government over the states has proven very durable. In more than 230 years, over 10,000 attempts have been made to amend the Constitution; as of now, only 27 have succeeded.

I rest my case.


Astonishing technologies will continue upending our world

Thomas Hart Benton’s ink, pencil and watercolor on paper titled “Poking Stick in Cotton Gin,” circa 1930, sold for $12,500 at a May 2018 Heritage auction.

By Jim O’Neal

“Luddite” is a somewhat obscure term loosely used to describe people who dislike new technology. After a superficial self-assessment, I’ve concluded I’m probably a modern-day Luddite at heart. The evidence is abundant: I’ve scrupulously avoided Facebook, have zero interest in posting pictures or video on Instagram, and consider Twitter an enormous thief of time. Social media is not where I intend to waste my remaining time.

That said, I don’t know how to account for my iPhone 11, iPad Pro or the 75-inch Sony 4K LED that dominates our den (or the six other cable boxes on three floors). With my iPad, I rarely use my desktop computer except to print documents. I abhor texting and still have a Netflix account that sends me DVDs by snail mail. After spending the past 60 years questing for ever-larger TV screens, the idea of squinting at a cellphone or watch-size TV program is mildly abhorrent. Even more annoying is the spate of robo-calls offering new Medicare options. I routinely turn off my devices for hours (ah … peace).

The original Luddites were British weavers and textile workers who objected to the increased use of mechanized looms and knitting frames. It is popularly claimed that they named themselves after Ned Ludd, a young apprentice who was rumored to have personally wrecked a textile apparatus (“in a fit of rage”) in 1779. There is no evidence Ludd actually existed, but he eventually became the mythical leader of the movement. They even issued manifestos and threatening letters under his name.

The first major instance of malicious machine breaking took place in 1811 in Nottingham. The British government moved to quash the uprisings by making machine breaking punishable by death. The unrest finally reached its peak in April 1812 when a few Luddites were gunned down during an attack on a mill near Huddersfield. Finally, the army deployed several thousand troops and dozens of Luddites were either hanged or transported to Australia.

In the intervening years, astonishing new technologies have increased productivity, lowered costs and created hundreds of millions of new jobs. A few of the more obvious include:

  • The cotton gin, which turned a marginally profitable farm crop into a bonanza by cutting labor by more than 90 percent, even as the slave population grew from 700,000 to 3.2 million. Historians point out that the South gained a 75 percent share of world demand, but the gin also doomed the region to remain an agricultural economy built on slavery. Some contend this single invention led directly to the Civil War.
  • Cyrus McCormick’s mechanical reaper helped convert millions of acres to food production and developed Midwest family farms with “wheat fields shining from sea to sea.” Presumably, American Indians, buffalo, dense forests and pristine rivers and lakes were unimpressed.
  • The Wright Brothers gave man the gifts of flight, aircraft factory jobs, cargo shipments and holiday travel for the masses. It also enabled two world wars and the ability to destroy entire cities.
  • Commercialization of the Bessemer process to supply the enormous steel demand for railroad tracks that crisscrossed the nation and enabled high-rise buildings with Otis elevators and office workers too numerous to count.
  • Henry Ford’s assembly line made automobiles affordable … in turn creating more jobs to keep up with demand and higher wages to buy the product. This was followed by oil and gas, tires, paved roads, motels and, eventually, Uber and Lyft. Also smog, toll roads and clogged freeways in every city.
  • The Internet is making retail stores and shopping malls obsolete, while enabling Apple, Google, Amazon and the global outsourcing that has lifted 500 million people out of poverty.

We are now challenged to reconcile population growth with climate change and plastic-choked oceans, and robots and artificial intelligence with displaced workers and a K-12 education system that is currently failing so many. Joseph Schumpeter’s 1942 theory of Creative Destruction is still valid.

London’s Mensa organization just accepted a 3-year-old with an IQ of 142-plus. There will be more Elon Musks, and they will figure it out. One suggestion is simply to operationalize fourth-generation nuclear power plants, which are 100 percent green (a plant can shut itself down if needed) and run on spent-fuel stockpiles. Imagine unlimited clean power that could desalinate seawater and gobble up current waste. Bill Gates is an investor in the technology.

Just a casual idea as we watch the Washington Circus and the people we rely on for leadership.


Here’s why Commodore Perry is known as ‘Father of the Steam Navy’

This silver Matthew Calbraith Perry “Treaty with Japan” medal, commissioned by a group of Boston merchants and struck at the U.S. Mint in Philadelphia in 1856, sold for $26,290 at a May 2011 Heritage auction.

By Jim O’Neal

Generally, James Watt (1736-1819) is credited with the invention of the steam engine. Perhaps this is due to the proximity of this brilliant Scottish engineer and chemist to Great Britain’s Industrial Revolution. His work certainly played a major role in the country’s transition to the world’s leading commercial nation in the early 19th century. However, Watt actually only improved existing steam engines by reducing waste and redesigning the basic technology of heating and cooling liquids.

The result was a dramatic improvement in cost-effectiveness that lowered production costs. England could deliver goods virtually anywhere at lower cost than local producers. In a relatively short time, England’s global trading empire stretched from Europe to the North American colonies, through the Caribbean and to the Indian subcontinent. In the process, the nation transformed from an agricultural economy into an industrial juggernaut. The old saying that “the sun never sets on the British Empire” has been used by historians to dramatize the vastness of land under British control. At its apex, it covered 25 percent of Earth’s landmass, and daylight was present somewhere in it at all times.

Then the vaunted British Empire began a long, slow descent into what has become a tired monarchy, with a sclerotic Parliament stuck in the mire of democratic socialism. The embarrassing Brexit erased the vestiges of the Thatcher era and raised the specter of disunion in Scotland and a divided Ireland. Recent events have inevitably raised questions about the durability of the royal family. I’m betting Queen Elizabeth II will remain unfazed and continue her remarkable 68-year reign, despite her children’s many escapades.

The actual story of “steam power” stretches back to Hero of Alexandria (circa 10-70 AD), a Greek scientist credited with developing the aeolipile – a rocket-like device that produced a rotary motion from escaping steam. For the next 1,800 years, the world’s inventors, mathematicians and scientists were busy making incremental improvements.

A prominent example is Matthew C. Perry (1794-1858), the first authentic commodore of the U.S. Navy. He was appointed commandant of the New York Navy Yard in June 1840 by Navy Secretary James Paulding (primarily a writer of note). Perry was an experienced seaman who recognized the critical need to improve the education of naval personnel. He helped design an apprenticeship system to train new sailors that eventually led to the establishment of the United States Naval Academy in 1845. In Annapolis, Md., the academy trains 800 to 1,000 plebes (from the Roman “plebeian”) annually to become midshipmen who represent the best traditions of America’s military elite.

Commodore Perry also earned the moniker “Father of the Steam Navy” after organizing the nascent corps of naval engineers and founding the U.S. Navy’s gunnery school on the New Jersey seashore. He took command of the USS Fulton, the nation’s second steam frigate, whose construction he had supervised; his deep naval experience provided an ideal platform to advocate for extensive modernization.

In 1852, President Millard Fillmore assigned Commodore Perry to carry out a strategic mission: Force the Japanese Empire to open its ports, which had been closed to foreigners for 250 years … using gunboat diplomacy if necessary. On July 8, 1853, the Perry Expedition sailed into Edo Bay (Tokyo) and opened trade negotiations. However, it took a second trip in February 1854, this time with 10 vessels and 1,600 men. Perry proceeded to land 500 men from 27 ships’ boats while bands played “The Star-Spangled Banner.”

Silently following along was the “Law of Unintended Consequences.” The Japanese quickly realized that Perry’s warships, armaments and technology so out-powered them that it would be prudent to throw open their markets to foreign technology. The feudal lord Shimazu Nariakira summarized it nicely by observing: “If we take the initiative, we can dominate; if we do not, we will be dominated!” They did take the initiative, and over the next century Japan defeated China and Russia, annexed Taiwan, and controlled the entire Korean Peninsula from 1910 forward.

Ironically, nearly a century later, on Sept. 2, 1945, our war with Japan formally ended. Days earlier, the battleship USS Missouri had glided into Tokyo Bay and anchored within cannon-shot range of Commodore Perry’s 1853 moorage. The Missouri’s deck was arranged with the surrender documents, and displayed above them was the 31-star flag that Perry had flown on the USS Mississippi, built under the commodore’s personal supervision; it has since been on display in the Naval Museum. From the Missouri’s flagstaff flew the 48-star flag that had flown over the Capitol dome in Washington, D.C., on Dec. 7, 1941. America and Japan were finally at peace.

Now we are ensconced in the Middle East with no visible exit, and the Navy is busy contending with China over Asian waters of questionable value. But we did sleep in a Holiday Inn after mooring a nuclear submarine.


Jackson arrived in D.C. and proceeded to upset the apple cart

A rare Andrew Jackson “pewter rim,” most likely dating to the War of 1812 and celebrating its heroes, sold for $20,000 at a June 2015 Heritage auction.

By Jim O’Neal

Andrew Jackson, the seventh president of the United States, was ready to go home. After serving eight years, he rejoiced that his vice president would succeed him. Occasionally, Jackson had considered resigning to ensure a smooth transition and, more importantly, a continuation of Jacksonianism. VP Martin Van Buren had consistently opposed this and finally “Old Hickory” dropped the idea. The president’s health was failing and many descriptions painted a picture of an old man (he was 70 years old and frail).

Eight years earlier, the president-elect had slipped into Washington, D.C. A welcoming salute had been cancelled since the counting of the electoral votes was still before Congress. Four years before (1825), Jackson had been denied the presidency despite winning a plurality of both the popular and electoral votes. Absent a majority of electoral votes, the election had been decided by the House of Representatives in accordance with the 12th Amendment. It chose John Quincy Adams.

Now, while waiting for the final count, Jackson was in deep mourning over the death of Mrs. Jackson a few days before Christmas. The cause was deep depression followed by a heart attack. A bitter controversy had erupted during the campaign when political enemies charged that their marriage was bigamous. Rachel Donelson Jackson was mortified to learn that the divorce from her prior marriage was in question. Winning the presidency had magnified the embarrassment, and she cried out to friends, “I would rather be a doorkeeper in the house of God than live in that palace in Washington.”

So began a new era in American politics as a strident, partisan president took office still seeking revenge. The new president was obsessed with attacking all special interest groups and their corrupt influence on Congress. Under his leadership, Democrats became the party of the common man. The mantle of populism rested easily on his shoulders and Washington politics would be transformed for an entire generation. The two-party system was now dominant as Democrats and Whigs shared power until the 1860 election.

The turbulence of Jackson’s life carried over into the presidency as he defined his policies not by enacting legislation, but by defiantly thwarting it! He vetoed more bills than the combined total of all six of his predecessors. He was a man in a hurry, and Cabinet members either followed his orders or were quickly dispatched. As an example, the national debt was $58 million when he assumed office in 1829, and by Jan. 1, 1835, it had been totally eliminated (for the first and ONLY time to this day).

Nothing was sacred from his reform crusade and that especially included the Bank of the United States (BUS). The original BUS was created by Alexander Hamilton in 1791 to get the new government operating despite heavy debts from the War of Independence. The bank had been chartered for 20 years with the expectation the charter would be renewed. A successor BUS was founded in 1816, again with another 20-year quasi-monopoly. Jackson believed the bank was unconstitutional (as had Jefferson). Jackson surprised everyone by attacking the bank in his very first message to Congress.

He then promptly vetoed the 1832 bill to renew the charter, saying, “It is to be regretted that the rich and powerful too often bend the acts of government to their selfish purposes.”

However, since the bank charter wouldn’t expire until 1836, Jackson decided not to wait. He ordered his Treasury Secretary, William Duane, to withdraw all government funds from the bank and deposit them with state-chartered banks. Congress had just legislated against this, and Secretary Duane refused Jackson’s edict. The president simply fired him and transferred Attorney General Roger Taney into the Treasury job. The Senate, now controlled by Whigs, was furious and refused to approve Taney’s nomination. But they were too slow and the damage was already done.

Totally frustrated, in March 1834 the Senate adopted a resolution of censure of Jackson, charging him with “assuming authority and power not conferred by the Constitution and laws, but in derogation of both.” It was viewed as an impeachment, but without the Constitutional process.

The Whigs responded, “The resolution, then, was in substance an impeachment of the president, and in passage amounts to a declaration by the majority of the Senate that he is guilty of an impeachable offence. As such, it is spread upon the journals of the Senate, published to the nation and to the world, made part of our enduring archives, and incorporated in the history of the age.”

That enduring “history of the age” lasted less than three years. In January 1837, Democrats, back in control of the Senate, voted to expunge the censure resolution, writing boldly across the original record, “EXPUNGED BY ORDER OF THE SENATE THIS 16TH DAY OF JANUARY, IN THE YEAR OF OUR LORD, 1837.”

Amen.


America, Spanish colonies took vastly different turns

A signed photographic print of Admiral George Dewey went to auction in June 2013.

By Jim O’Neal

On April 25, 1898, the U.S. Congress declared war on Spain, ostensibly because of the sinking of the battleship USS Maine on Feb. 15. The ship was docked in Cuba’s Havana Harbor, having been dispatched by President McKinley to protect American citizens and interests during an uprising of Cuban dissidents. The cause of the explosion is still a subject of debate today.

The first act of war was to prevent Spain’s warships in the Philippines from sailing to Cuba to join the pending fighting. When the U.S. Army arrived in Cuba, it won a series of battles. The most famous was San Juan Hill, featuring a group called the Rough Riders, primarily farmers and cowboys who made up the 1st U.S. Volunteer Cavalry, which also included future President Theodore Roosevelt. TR was posthumously awarded the Medal of Honor in 2001 for his actions in Cuba.

It was a short war (the actual fighting stopped by Aug. 13) and a formal treaty was signed on Dec. 10. Cuba achieved its long-sought independence, and the United States gained control of Guam, Puerto Rico and the Philippine Islands, an archipelago south of China. American business interests viewed the Spanish colony as a strategic gateway to lucrative trade with East Asian markets.

However, these islands had been under Spanish control since 1521, and Filipinos had been waging a war of independence since 1896. Gaining their support to oust Spain was critical. U.S. Navy forces were under the command of Commodore George Dewey, and the Battle of Manila Bay against the Spanish flotilla started early on May 1, 1898. The Spanish ships and harbor-fort guns lacked the range to reach the American fleet, and Dewey had superiority in armaments. After confirming the distance, he gave a famous command to the captain of the USS Olympia: “You may fire when you are ready, Gridley.” What followed was the destruction of eight Spanish warships, with only seven American seamen wounded. The entire battle was over in a single day.

Spain surrendered and sold the Philippines to the United States for $20 million.

Perhaps, not surprisingly, Filipino nationalists were not interested in trading one colonial master for another. In February 1899, fighting between Filipinos and the U.S. military started. In June, the Filipino Republic officially declared war on the invading U.S. forces. Suddenly, the United States had become mired in a war of colonial conquest. It would last for three years and become exceedingly vicious at times.

The United States controlled the capital of Manila, but Filipino revolutionaries, outgunned but with the advantage in manpower and home terrain, predictably resorted to guerrilla warfare. U.S. forces quickly moved civilians into internment camps to prevent them from helping or joining the guerrillas. It took until 1903 for the United States to prevail, with American troops suffering more than 4,000 deaths, 75 percent from tropical diseases. Roughly a quarter-million Filipinos perished, 90 percent of them innocent civilians. It was not until 1946 that the Treaty of Manila granted the Philippines full independence.

Looking back to the early 19th century, Spain’s colonies in North America were vastly superior to the young United States. This situation was totally reversed by 1900. In terms of territory, population and resources, the United States dominated the Western Hemisphere. It is a story of Protestant austerity, democracy and incursions led by American frontiersmen, farmers, shopkeepers, bankers and waves of European immigrants arriving on our shores, ready to make their fortunes.

The Spanish colonies fragmented as their primarily Catholic and tyrannical governments were unable to maintain coherence and viability. The transformation came in three distinct phases, starting with Florida and the Southeast by 1820, followed by Texas, California and the greater Southwest (1855), and finally Central America and the Caribbean as a direct result of the Spanish-American War. During these major annexation phases, Mexico lost half its territory and 75 percent of its mineral resources.

The story of how Manifest Destiny was achieved from “sea to shining sea” is embedded in these short episodes.


We’ve seen incredibly successful hucksters and three-ring circuses before

A 1913 poster promoting the Barnum & Bailey elephant baseball team sold for $9,600 at a February 2019 Heritage auction.

By Jim O’Neal

One of the world’s greatest hucksters died in 1891. He was born in Bethel, Conn., and died 80 years later on April 7 in Bridgeport, where he had been mayor in 1875-76. Earlier, he had served four terms in the Connecticut House of Representatives, without distinction. The three-ring circus of modern life with all its hustle and bustle had to start somewhere, so why not simply start with the man responsible for the actual three-ring circus?

Phineas Taylor Barnum had been a loyal Democrat until the 1854 Kansas-Nebraska Act, which supported slavery, was drafted by Democrats and signed by President Franklin Pierce. It effectively nullified the 1820 Missouri Compromise, escalated tensions over the slavery issue and led to a series of violent civil confrontations known as “Bleeding Kansas,” a political stain on American democracy.

Barnum promptly switched political parties, becoming a member of the new anti-slavery Republican Party, which was expanding rapidly with defecting abolitionists. John C. Frémont – “The Pathfinder” – was the first presidential candidate of the Republican Party, losing to Democrat James Buchanan in 1856. Abraham Lincoln prevailed in 1860 and 1864, and Republicans would dominate national politics for the rest of the 19th century.

Yes, we’re talking about that Barnum, who would become world famous as founder of “P.T. Barnum’s Grand Traveling Museum, Menagerie, Caravan & Hippodrome.” Most Americans know the name, though few know that “P.T.” stands for Phineas Taylor or that he did not enter the circus business until he was 60 years old. The name endures largely because of the extraordinary, eponymous circus formed when he and James Bailey teamed up in 1881.

Barnum was an energetic 70-year-old impresario. “The Greatest Show on Earth” may have been a slight exaggeration, but it’s not clear who would have rivaled them for the top spot. Clearly it was a distinctive assertion in a life filled with remarkable contradictions. Perhaps it is more precise to think of him as “the Greatest Showman on Earth,” or whatever other lofty title one desires. (He would undoubtedly have found an angle to exploit it to the fullest.)

His show-business career actually had a modest beginning, at age 25. He purchased a blind, nearly paralyzed black slave woman (Joice Heth) who purportedly was 161 years old and had been a nurse to a young George Washington. She sang hymns, told jokes and answered audience questions about “Little George.” Barnum cleverly worked around existing laws and exhibited her 10 to 12 hours a day to recoup his $1,000 investment.

As Barnum bribed newspaper editors for extra press coverage (always mentioning his name), he also co-produced a sensationalized biographical pamphlet to further hype the hoax. When Heth died in 1836, Barnum sold tickets to another “event” – a public autopsy to judge her actual age. More than 1,300 people eagerly attended the spectacle, which critics slammed as “morally specious.” At 50 cents a ticket, it provided a surprisingly nice profit. Barnum attempted to appease the abolitionists by claiming (falsely) that all proceeds from this flagrant exploitation would be used to buy her great-grandchildren’s freedom.

It is here that experts who study such arcane issues will argue it’s important to define the pejorative term “humbug,” using Barnum’s own precepts. To him, a humbug was a fake that delights audiences without scamming them: sleight of hand, not bait-and-switch. He called himself the “Prince of Humbugs.” Perhaps it is a distinction without a difference. However, Barnum, still searching for a code of ethics, distanced himself from this humbug. Even in his 1854 autobiography, he wrote that he wanted people to remember him for something other than Joice Heth. It would haunt him until his death.

By 1841, he was touring the country with magicians and jugglers. He bought John Scudder’s struggling American Museum in lower Manhattan, promptly renaming it with the Barnum brand. While displaying a cabinet of curiosities, he introduced pseudo-scientific exhibitions, live freaks and the normal hokums. Still struggling with his ethical bankruptcy, he gambled on backing a national tour for Jenny Lind, the most celebrated soprano in the world, offering her $1,500 for every performance. He calculated it would be worth losing $50,000 just to enhance his reputation.

Her virtuosic arias drew crowds in the thousands, as Barnum wishfully hoped his association with “the Swedish Nightingale” would lessen his reputational baggage. But driven by an outsize eagerness to enrich himself, he peddled spectacles like the “Feejee Mermaid,” the torso and head of a monkey and the back half of a fish, bound together by the clever art of taxidermy. He continued to worship at the altar of celebrity and the power of the press. He created attractions like General Tom Thumb, who, at 5, learned to drink wine and, at 7, was smoking cigars.

He parlayed an audience with President Lincoln into a European tour involving Queen Victoria, gambling that her subjects would be interested as well. The trip paid off big and was extended to include visits with the Tsar of Russia and other nobles. It is not surprising that in his quest for money and fame, his name itself conjured up qualities of audacity, greed and humbug. But how to account for, or judge, the value of excitement, entertainment and gentle controversy? Even as Charles Darwin was jolting the scientific and religious communities with evolution via his On the Origin of Species, P.T. Barnum introduced William Henry Johnson, a microcephalic black man who spoke a mysterious language … “solving” the quest to find the Missing Link of mankind.

Sadly, on May 21, 2017, Ringling Bros. and Barnum & Bailey Circus gave the last performance of its 146-year history after the elephants had vanished under pressure from animal rights activists. The audience rose for a standing ovation while singing Auld Lang Syne. Then it was over.

Except that it wasn’t!

P.T. Barnum, famous for grabbing headlines, reached up from the grave as Hugh Jackman lionized him in the movie The Greatest Showman. Recent one-word-titled books like Fraud, Hoax and Bunk have found analogies to today, while a generation of Madonnas, Warhols and Kardashians have mastered the media to enhance the power of celebrity. We now have the modern equivalent of a three-ring circus continuously playing on Twitter or any cable news channel 24/7. The Romans knew this when they built the Colosseum, and so did Walt Disney when Disneyland popped up in 1955.

I do miss the cotton candy.
