As court controversy rages, let’s not forget what we do best

A photograph of Franklin D. Roosevelt signed and inscribed to Eleanor Roosevelt sold for $10,000 at an October 2016 Heritage auction.

By Jim O’Neal

The Supreme Court was created by the Constitution, but the document wisely calls for Congress to decide the number of justices. This was vastly superior to a formula based on the number of states or population, which would have resulted in a large, unwieldy committee. The 1789 Judiciary Act established the initial number at six, with a chief justice and five associates all selected by President Washington.

In 1807, the number was increased to seven (to avoid tie votes) and in 1837 to nine, and then to 10 in 1863. The Judiciary Act of 1866 temporarily reduced the court to seven in response to post-Civil War politics and the Andrew Johnson presidency. Finally, the 1869 Act settled on nine, where it has remained to this day. The major concern has consistently been over the activities of the court and the fear it would inevitably try to create policy rather than evaluate it (ensuring that Congressional legislation was lawful and conformed to the intent of the Constitution).

The recent confirmation hearings are the latest example of both political parties vying for advantage by using the court to shape future policies, reflecting political partisanship at its worst. Even though the Supreme Court can’t enforce its decisions – Congress has the power of the purse and the president the power of force – the court has taken on a de facto legislative function through its deliberations. In a sharply divided nation, policymaking has become the victim on most issues, largely because Congress is unable to find consensus. The appellate process is simply a poor substitute for this legislative weakness.

We have been here before and it helps to remember the journey. Between 1929 and 1945, two great travails were visited on our ancestors: a terrible economic depression and a world war. The economic crisis of the 1930s was far more than the result of the excesses of the 1920s. In the 100 years before the 1929 stock-market crash, our dynamic industrial revolution had produced a series of boom-bust cycles, inflicting great misery on capital and on many people. Even the fabled Roaring ’20s had excluded great segments of the population, especially blacks, farmers and newly arrived immigrants. Who or what to blame?

“[President] Hoover will be known as the greatest innocent bystander in history, a brave man fighting valiantly, futilely, to the end,” populist newspaperman William Allen White wrote in 1932.

The same generation that suffered through the Great Depression was then faced with war in Europe and Asia, the rationing of common items, entrance to the nuclear age and, eventually, the responsibilities for rebuilding the world. Our basic way of life was threatened by a global tyranny with thousands of nukes wired to red buttons on two desks 4,862 miles apart.

FDR was swept into office in 1932 during the depths of the Great Depression and his supporters believed he possessed just what the country needed: inherent optimism, confidence, decisiveness, and the desire to get things done. We had 13 million unemployed, 9,100 banks closed, and a government at a standstill. As FDR declared in his first inaugural address: “This nation asks for action, and action now!”

In his first 100 days, Roosevelt swamped Congress with a score of carefully crafted legislative actions designed to bring about economic reforms. Congress responded eagerly. But the Supreme Court, now dubbed the “Nine Old Men,” said no to most New Deal legislation by votes of 6-3 or 5-4, making mincemeat of the proposals. Still, the economy improved, resulting in an even bigger landslide re-election in 1936. FDR won 60.3 percent of the popular vote and an astonishing 98.5 percent of the electoral votes, losing only Vermont and Maine.

In his 1937 inaugural address, FDR lamented that he saw “one-third of a nation ill-housed, ill-clad, ill-nourished.” He called for more federal support. However, Treasury Secretary Henry Morgenthau worried about business confidence and argued for a balanced budget, and in early 1937, Roosevelt, almost inexplicably, ordered federal spending reduced. Predictably, the U.S. economy went into decline. Industrial production fell 14 percent, and in October alone another half million people were thrown out of work. It was clearly now “Roosevelt’s Recession.”

Fearing that the Supreme Court would continue to nullify the New Deal, Roosevelt in his ninth Fireside Chat unveiled a new plan for the judiciary. He proposed that the president should have the power to appoint additional justices – up to a maximum of six, one for every member of the Supreme Court over age 70 who did not retire within six months. The Judicial Procedures Reform Bill of 1937 (known as the “court-packing plan”) hopelessly split the Democratic majority in the Senate, caused a storm of protest from bench to bar, and created an uproar among both Constitutional conservatives and liberals. The bill was doomed from the start; even the Senate Judiciary Committee reported it to the floor unfavorably, 10-14. The Senate vote was even worse … 70-20 to bury it.

We know how that story ended, as Americans united to fight a world war and then did what we do best: work hard, innovate and preserve the precious freedoms our forebears guaranteed us.

Unite vs. Fight seems like a good idea to me.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Peaceful transfer of presidential power is one of our strengths

Seven Days in May, starring Burt Lancaster and Kirk Douglas, is a 1964 movie about a military/political cabal’s planned takeover of the U.S. government.

By Jim O’Neal

It seems clear that one of the bedrock fundamentals that contributes to the stability of the U.S. government is the American presidency. Even considering the terrible consequences of the Civil War – 11 states seceding, 620,000 lives lost and widespread destruction – it’s important to remember that the federal government held together surprisingly well. The continuity of unbroken governance is a tribute to a success that is the envy of the world.

Naturally, the Constitution, our system of justice and the rule of law – along with all the other freedoms we cherish – are all critical contributors. But it’s been our leadership at the top that’s made it all possible. In fact, one could argue that having George Washington as the first president for a full eight years is equal in importance to all other factors. His unquestioned integrity and broad admiration, in addition to precedent-setting actions, got us safely on the road to success despite many of the governed being loyal to the British Crown.

Since that first election in 1789, 44 different men have held the office of president (Grover Cleveland for two separate terms), and six of them are alive today. I agree with Henry Adams, who argued, “A president should resemble a captain of a ship at sea. He must have a helm to grasp, a course to steer, a port to seek. Without headway, the ship would arrive nowhere and perpetual calm is as detrimental to purpose as a perpetual hurricane.” The president is the one who must steer the ship, as a CEO leads an organization, be it small or large.

In the 229 intervening years, there have been brief periods of uncertainty, primarily due to vague Constitutional language. The first occurred in 1800, when two Democratic-Republicans each received 73 electoral votes. It was assumed that Thomas Jefferson would be president and Aaron Burr would be vice president. The wily Burr spotted an opportunity and refused to concede, forcing the decision into the House. Jefferson and Burr remained tied for 35 ballots until Alexander Hamilton (convinced that Jefferson was the lesser of two evils) swayed a few votes to Jefferson, who won on the 36th ballot. This flaw was remedied by the 12th Amendment in 1804, which requires electors to cast separate ballots for president and vice president, removing any uncertainty.

A second blip occurred after William Henry Harrison and John Tyler defeated incumbent Martin Van Buren. At age 68, Harrison was the oldest to be sworn in as president, a record he held until Ronald Reagan’s inauguration in 1981 at age 69. Harrison died 31 days after his inauguration (also a record), the first time a president had died in office. A controversy arose over the successor. The Presidential Succession Act of 1792 specifically provided for a special election in the event of a double vacancy, but the Constitution was not specific about what should happen when only the presidency fell vacant.

Vice President Tyler, at age 51, would become the youngest man yet to assume the presidency. He was well educated, intelligent and experienced in governance. However, the Cabinet met and concluded he should bear the title of “Vice President, Acting as President” and addressed him as Mr. Vice President. Ignoring the Cabinet, Tyler was confident that the powers and duties fell to him automatically and immediately as soon as Harrison died. He moved quickly to make this known, but doubts persisted and many arguments followed until the Senate voted 38 to 8 to recognize Tyler as the president of the United States. (It was not until 1967 that the 25th Amendment formally stipulated that the vice president becomes president, as opposed to acting president, when a president dies, resigns or is removed from office.)

In July 1933, a group of disgruntled financiers held an extraordinary meeting with Gen. Smedley Butler, a recently retired, two-time Medal of Honor recipient. According to official Congressional testimony, Butler claimed the group proposed to overthrow President Franklin Roosevelt because his socialistic New Deal agenda would, they believed, create enormous federal deficits if allowed to proceed.

Smedley Darlington Butler was a U.S. Marine Corps major general – then the highest rank authorized – and the most decorated Marine in U.S. history. Butler (1881-1940) testified in a closed session that his role in the conspiracy was to issue an ultimatum to the president: FDR was to immediately announce he was incapacitated due to his crippling polio and needed to resign. If the president refused, Butler would march on the White House with 500,000 war veterans and force him out of power. Butler claimed he refused despite being offered $3 million and the backing of J.P. Morgan’s bank and other important financial institutions.

A special committee of the House of Representatives (a forerunner to the Committee on Un-American Activities) headed by John McCormack of Massachusetts heard all the testimony in secret, but no additional investigations or prosecutions were launched. The New York Times thought it was all a hoax, despite supporting evidence. Later, President Kennedy privately mused that he thought a coup d’état might succeed if a future president thwarted the generals too many times, as he had done during the Bay of Pigs crisis. He cited a military plot like the one in the 1962 book Seven Days in May, which was turned into a 1964 movie starring Burt Lancaster and Kirk Douglas.

In reality, the peaceful transfer of power from one president to the next is one of the most resilient features of the American Constitution and we owe a deep debt of gratitude to the framers and the leaders who have served us so well.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Notorious traitors? Let’s look at Benedict Arnold

A May 24, 1776, letter by Benedict Arnold, signed, to Gen. William Thompson, realized $23,750 at an April 2016 Heritage auction.

By Jim O’Neal

Vidkun Quisling is an obscure name from World War II. To those unfamiliar with some of the lesser-known details, “Quisling” has become a synonym for a traitor or collaborator. From 1942 to 1945, he was Prime Minister of Norway, heading a pro-Nazi puppet government after Germany invaded. For his role, Quisling was put on trial for high treason and executed by firing squad on Oct. 24, 1945.

Obviously better known are Judas Iscariot of Last Supper fame (30 pieces of silver); Guy Fawkes, who tried to assassinate King James I by blowing up Parliament (the Gunpowder Plot); and Marcus Junius Brutus, who stabbed Julius Caesar (“Et tu, Brute?”). In American history, it’s a close call between John Wilkes Booth and Benedict Arnold.


The irony concerning Benedict Arnold (1741-1801) is that his early wartime exploits had made him a legendary figure, but Arnold never forgot the slight he received in February 1777 when Congress bypassed him while naming five new major generals … all of them junior to him. Afterward, George Washington pledged to help Arnold “with opportunities to regain the esteem of your country,” a promise he would live to regret.

Unknown to Washington, Arnold had already agreed to sell secret maps and plans of West Point to the British via British Maj. John André. There have always been honest debates over Arnold’s real motives for this treacherous act, but it seems clear that purely personal gain was the primary objective. Heavily in debt, Arnold had brokered a deal that included having the British pay him 6,000 pounds sterling and award him a British Army commission for his treason. There is also little doubt that his wife Peggy was a full accomplice, despite a dramatic performance pretending to have lost her mind rather than her loyalty.

The history of West Point can be traced back to its occupation by the Continental Army after the Second Continental Congress (1775-1781) took charge of the Colonial war effort. West Point – first known as Fort Arnold and renamed Fort Clinton – was strategically located on high ground overlooking the Hudson River, with panoramic views extending all the way to New York City, ideal for military purposes. Later, in 1801, President Jefferson ordered plans to establish the U.S. Military Academy there, and West Point has since churned out many distinguished military leaders … first for the Mexican-American War and then for the Civil War, including both Ulysses S. Grant and Robert E. Lee. It is the oldest continuously operating Army post in U.S. history.

To understand this period in American history, it helps to start at the end of the Seven Years’ War (1756-63), which was really a global conflict that included every major European power and spanned five continents. Many historians consider it “World War Zero,” on the same scale as the two world wars of the 20th century. In North America, the skirmishes had started two years earlier in the French and Indian War, with Great Britain an active participant.

The Treaty of Paris in 1763 ended the conflict, with the British winning a stunning series of battles, France surrendering its Canadian holdings, and Spain ceding its Florida territories in exchange for the return of Cuba. Consequently, the British Empire emerged as the most powerful political force in the world. The only issue was that these conflicts had nearly doubled England’s debt, from 75 million to 130 million pounds sterling.

A young King George III and his Parliament quietly noted that the Colonies were nearly debt free and decided it was time for them to pay for the 8,000-10,000 Redcoats garrisoned in North America in peacetime. In April 1764, they passed legislation via the Currency Act and the Sugar Act. This limited inflationary Colonial currency and cut the trade duty on foreign molasses. In 1765, they struck again. Twice. The Quartering Act forced the Colonists to pay for billeting the king’s troops. Then the infamous Stamp Act placed direct taxes on Americans for the first time.

This was one step too far and inevitably led to the Revolutionary War, with armed conflict that involved hot-blooded, tempestuous individuals like Benedict Arnold. A brilliant military leader of uncommon bravery, Arnold poured his life into the Revolutionary cause, sacrificing his family life, health and financial well-being for a conflict that left him physically crippled. Sullied by false accusations, he became profoundly alienated from the American cause for liberty. His bitterness unknown to Washington, on Aug. 3, 1780, the future first president announced Arnold would take command of the garrison at West Point.

The appointed commander calculated that turning West Point over to the British, perhaps along with Washington as well, would end the war in a single stroke by giving the British control of the Hudson River. The conspiracy failed when André was captured with incriminating documents. Arnold fled to a British warship, and the British refused to trade him for André, who was hanged as a spy after pleading to be shot by firing squad instead. Arnold went on to lead British troops in Virginia, survived the war, and eventually settled in London. He quickly became the most vilified figure in American history and remains a symbol of treason to this day.

Gen. Nathanael Greene, often called Washington’s most gifted and dependable officer, summed it up after the war most succinctly: “Since the fall of Lucifer, nothing has equaled the fall of Arnold.”

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Franklin had Faults, but he Remains an Extraordinary American

Norman Rockwell’s illustration of Ben Franklin for a 1926 cover of The Saturday Evening Post sold for $762,500 at a May 2018 Heritage auction.

By Jim O’Neal

George Washington was the only United States president who lived his entire life in the 18th century (1732-1799). However, every early vice president – from John Adams on – spent a significant part of his life in the 19th century. Of the Founding Fathers, Benjamin Franklin (often considered the “grandfather” of the group) was born in 1706 and died in 1790 – nine years earlier than even Washington. In reality, he was a member of the previous generation and spent virtually all of his life as a loyal British subject.

As a result, Franklin didn’t have an opportunity to observe the nation he helped create as it struggled to function smoothly. Many were determined not to simply replicate the English monarchy, but to establish a more perfect union in which the common man could prosper (women, slaves and non-property owners would have to wait). Franklin was also a man of vast contradictions. He was really a most reluctant revolutionary and discreetly wished to preserve the traditions of the British Empire he had grown so familiar with. He secretly mourned the final break, even as he helped lead the fight for America’s independence.

Even while signing the Declaration of Independence and the Constitution, like many others who had long been loyal to the Crown, he hoped for some sort of reconciliation, a hopeless cause after so many careless British transgressions.

Fortunately, we have a rich history of this remarkable man’s life since he was such a prolific writer and his correspondence was greatly admired, and thus preserved by those lucky enough to receive it. Additionally, his scientific papers were highly respected and covered a vast breadth of topics that generated interest among the brightest minds in the Western world. He knew most of them personally, thanks to the many years he lived in France and England and his travels on the European continent. Government files are replete with the many letters he exchanged with heads of state.

Despite his passion for science, Franklin viewed his breakthrough experiments as secondary to his civic duties. He became wealthy as a young man and this provided the freedom to travel and assume important government assignments. Somehow, he was also able to maintain a pleasant marriage despite his extended absences, some for as long as 10 years. He rather quickly developed a reputation as a “ladies’ man” and his social life flourished at the highest levels of society.

Some historians consider him the best-known celebrity of the 18th century. Even today, we still see his portrait daily on our $100 bills – colloquially known as “Benjamins” – and earlier on common 50-cent pieces and various denominations of postage stamps. Oddly, he is probably better known today, by people of all ages, than he was 200 years ago. That is true stardom that very few manage to attain.

Every student in America generally knows something about Franklin flying a kite in a thunderstorm. They may not know that he proved the clouds were electrified and that lightning is a form of electricity. Or that Franklin’s work inspired Joseph Priestley to publish a comprehensive work, The History and Present State of Electricity, in 1767. And it would be exceedingly rare if they knew the prestigious Royal Society honored him with its Copley Medal for the advancement of scientific knowledge. But they do recognize Franklin in any picture.

Others may know of his connection to the post office, unaware that the U.S. postal system was established on July 26, 1775, by the Second Continental Congress at a time when virtually all mail traveled to and from Europe rather than between the Colonies. There were no post offices in the Colonies, and bars and taverns filled that role nicely. Today, there are about 40,000 post offices handling 175 billion pieces of mail a year (more than 5,000 every second), and they have an arrangement with Amazon to deliver its packages, even on Sundays. Mr. Franklin helped create this behemoth as the first Postmaster General.

Franklin was also a racist in an era when the word didn’t even exist. He finally freed his house slaves and later became a staunch opponent of slavery, even sponsoring legislation. But he literally envisioned a White America, most especially for the Western development of the country. He was alarmed about German immigrants flooding Philadelphia and wrote passionately about their not learning English or assimilating into society. He was convinced it would be better if blacks stayed in Africa. His dream was to replicate England since the new nation had so much more room for expansion than that tiny island across the Atlantic. But we are here to examine the extraordinary mind and curiosity that led to so many successful experiments. Franklin always bemoaned the fact that he had been born too early and dreamed about all the new wonderful things that would be 300 years in the future.

Dear Ben, you just wouldn’t believe it!

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Here’s Why We Owe a Lot to Second President John Adams

An 1805 oil-on-canvas portrait of John Adams attributed to William Dunlap sold for $35,000 at a May 2017 Heritage auction.

By Jim O’Neal

John Adams had the misfortune of being squeezed into the presidency of the United States (for a single term) between George Washington and Thomas Jefferson, two of the most famous presidents of all time. As a result, Adams (1735-1826) was often overlooked as one of America’s greatest statesmen and perhaps the most learned and penetrating thinker of his time. The importance of his role in the founding of America was noted by Richard Stockton, a delegate to the Continental Congress: “The man to whom the country is most indebted for the great measure of independence. … I call him the Atlas of American Independence.”

On the way to that independence, his participation started as early as 1761 when he assisted James Otis in defending Boston merchants against Britain’s enforcement of the Sugar Tax. When the American Revolution ended, Adams played a key role in the peace treaty that formally ended the war in 1783. In between those two bookends, he wrote many of the most significant essays and treatises, led the radical movement in Boston, and articulated the principles at the Continental Congress.

When the infamous Stamp Act was passed in 1765, Adams attacked it with a vengeance and wrote A Dissertation on the Canon and Feudal Law, asserting the act deprived the colonists of two basic rights: taxation by consent and trial by a jury of peers – both guaranteed to all Englishmen by the Magna Carta. Within a brief 10 years, he was acknowledged as one of America’s best constitutional scholars. When Parliament passed the Coercive Acts in 1774, Adams drafted the principal clause of the Declaration of Rights and Grievances; no man worked harder in the movement for independence and the effort to constitutionalize the powers of self-government.

After the Battles of Lexington and Concord, Adams argued for the colonies to declare independence, and in 1776, Congress passed a resolution recommending the colonies draft new constitutions and form new governments. Adams wrote a draft blueprint, Thoughts on Government, and four states used it to shape new constitutions. In summer 1776, Congress considered arguments for formal independence and John Adams made a four-hour speech that forcefully persuaded the assembly to vote in favor. Thomas Jefferson later recalled that “it moved us from our seats … He was our colossus on the floor.”

Three years later, Adams drafted the Massachusetts Constitution, which was copied by other states and guided the framers of the Federal Constitution of 1787.

He faithfully served two full terms as vice president for George Washington at a time when the office had only two primary duties: preside over the Senate and break any tie votes, and count the ballots for presidential elections. Many routinely considered the office to be part of Congress as opposed to the executive branch. He served one term as president and then lost the 1800 election to his vice president, Thomas Jefferson, as the party system (and Alexander Hamilton) conspired against his re-election. Bitter and disgruntled, he left Washington, D.C., before Jefferson was inaugurated and returned to his home in Massachusetts. His wife Abigail had departed earlier; their son Charles had died in November from the effects of chronic alcoholism.

Their eldest son, John Quincy Adams, served as the sixth president (for a single term) after a contentious election, and both father and son gradually sank into relative obscurity. This changed dramatically in 2001 when historian David McCullough published a wonderful biography that reintroduced John and Abigail Adams to a generation that only vaguely knew he had died on the same day as Thomas Jefferson, July 4, 1826 – the 50th anniversary of the signing of the Declaration of Independence. In typical McCullough fashion, it was a bestseller and led to an epic TV mini-series that snagged four Golden Globes and a record 13 Emmys in 2008.

Television at its very best!

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

As Nation Moved to Civil War, the North had the Financial Edge

Richard Montgomery was an Irish soldier who served in the British Army before joining the Continental Army.

By Jim O’Neal

Richard Montgomery (1738-75) was a little-known hero-soldier born in Dublin, Ireland, who joined the British Army in 1756 and rose to the rank of captain. Later, he became a major general in the Continental Army after the Continental Congress elected George Washington as Commander in Chief of the Continental Army in June 1775. This position was created specifically to coordinate the military efforts of the 13 Colonies in the revolt against Great Britain.

Montgomery was killed in a failed attack on Quebec City, an assault in which Benedict Arnold (before he defected) also led troops. Montgomery was mourned in both Britain and America, and his remains were interred at St. Paul’s Chapel in New York City.

A remarkably diverse group of schools, battleships and cities named in his honor remains today. Montgomery, Ala., is the capital and second-largest city in the state; it’s where Rosa Parks refused to give up her bus seat to a white passenger on Dec. 1, 1955, sparking the famous Montgomery bus boycott. Martin Luther King Jr. used Montgomery to great advantage in organizing the civil rights movement.

Montgomery was also the first capital of the Confederacy; the Provisional Congress of the Confederate States convened its first meeting there in February 1861. The first seven states to secede from the United States had hastily selected representatives to travel to the new Confederate capital. They arrived to find dirty hotels, dusty roads and noisy lobbyists overflowing the statehouse. Montgomery was not prepared to host any large group, especially a political convention.

Especially notable was that most of the South’s most talented men had already joined the Army or the Cabinet, or were headed for diplomatic assignments. By default, the least-talented legislators were given the responsibility of writing a Constitution, installing the new president (Jefferson Davis), and then authorizing a military force of up to 400,000 men. This conscription was for three years or the duration of the war. As in the North, virtually everyone was confident it would be a short, decisive war.

Jefferson Davis was a well-known name, having distinguished himself in the Mexican War and served as Secretary of War for President Franklin Pierce. Like many others, he downplayed the role of slavery in the war, seeing the battle as a long-overdue effort to overturn the exploitative economic system centered in the North. In his view, the evidence was obvious. The North and South were like two different countries: one a growing industrial power and the other stuck in an agricultural system that had not evolved since 1800, when 80 percent of its labor force was on farms and plantations. The South now had only 18 percent of the nation’s industrial capacity, and the share was trending down.

That mediocre group of lawmakers at the first Confederate meeting was also tasked with determining how to finance a war against a formidable enemy holding vast advantages in nearly every important respect. Even newly arrived immigrants were attracted to the North’s ever-expanding opportunities, as slave states fell further behind in manufacturing, canals, railroads and even conventional roads, all while the Southern banking system became weaker.

Cotton production was a genuine bright spot for the South (at least for plantation owners), but ironically, it generated even more money for the North with its vast network of credit, warehousing, manufacturing and shipping companies. The North manufactured a dominant share of boots, shoes, cloth and pig iron, and almost all the firearms … an ominous fact for people determined to fight a war. Several regions of the South were forced to import foodstuffs. Southern politicians had spoken often of the need to build railroads and manufacturing, but these were rhetorical, empty words. Cotton had become the powerful narcotic that lulled them into complacency. Senator James Hammond of South Carolina summed it up neatly in his “Cotton is King” speech of March 4, 1858: “Who can doubt, that has looked at recent events, that cotton is supreme?”

Southerners sincerely believed that cotton would rescue them from the war and “after a few punches in the nose,” the North would gladly surrender.

One of those men was Christopher G. Memminger, who was selected as Confederate States Secretary of the Treasury and responsible for rounding up gold and silver to finance the needs of the Confederate States of America (CSA). A lawyer and member of the South Carolina legislature, he was also an expert on banking law. His first priority was for the Treasury to get cash and he started in New Orleans, the financial center of the South, by raiding the mint and customs house.

He assumed there would be at least enough gold to coin money and commissioned a design for a gold coin with the goddess of liberty seated, bearing a shield and a staff flanked by bales of cotton, sugar cane and tobacco. Before any denominations were finalized, it was discovered there was not enough gold available and the mint was closed in June.

This was followed by another nasty surprise: All the banks in the South possessed only $26 million in gold, silver and coins from Spain and France. No problem. Memminger estimated that cotton exports of $200 million would be enough to secure hundreds of millions in loans. Oops. President Lincoln had anticipated this and blockaded all the ports after Fort Sumter in April 1861. No cotton, no credit, no guns.

In God we trust. All others pay cash.

One small consolation was that his counterpart in the North, Salmon P. Chase, was also having trouble raising cash and had to resort to the dreaded income tax. However, both sides managed to keep killing each other for four long years, leaving a legacy of hate.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

If President Jackson had Followed Through with a Threat…

This U.S. Colt Model 1877 Bulldog Gatling Gun, with five 18-inch barrels secured in brass casement, realized $395,000 at a December 2014 Heritage auction.

“An army travels on its stomach.”

By Jim O’Neal

Both Frederick the Great and Napoleon Bonaparte are credited with versions of this aphorism, which emphasizes that a well-provisioned military is critical to its performance. In 1795, France offered a 12,000-franc prize to anyone who could solve this persistent problem. In 1809, a confectioner named Nicolas Appert claimed the prize by inventing a heating, boiling and sealing system that preserved food much as modern canning does.

During the Revolutionary War, General Washington had to contend with this issue, as well as uniforms and ordnance (e.g. arms, powder and shot), which were essential to killing and capturing the enemy. Responsibilities were far too dispersed and decision-making overly reliant on untrained personnel.

By the dawn of the War of 1812, the War Department convinced Congress that all these activities should be consolidated under experienced military personnel. On May 14, 1812, the U.S. Army Ordnance Corps was established. Over the past 200-plus years, 41 different men (mostly generals) have held the title of Army Chief of Ordnance. The system has evolved slowly and is regarded as a highly effective organization at the center of military actions in many parts of the world.

However, when the Civil War started in 1861, the man in charge was General James Wolfe Ripley (1794-1870), a hardheaded, overworked old veteran whom Andrew Jackson had once threatened to hang for disobedience during the war with the Creek Indians. Ripley believed the North would make short work of the war and needed only an ample supply of orthodox weapons. He flatly refused to authorize the purchase of additional rifle-muskets for the infantry, primarily because of a large inventory of smoothbore muskets in various U.S. ordnance centers. Furthermore, he adamantly refused to allow the introduction of the more modern breech-loading repeating rifles due to a bizarre belief that ammunition would be wasted.

After two years of defiantly resisting the acquisition of new, modern weaponry, he was forced to retire. He was derided by the press as an old fogy, while some military historians claim he was personally responsible for extending the war by two years – a staggering indictment, if in fact true!

One prominent example occurred in early June 1861 when President Lincoln met the first-known salesman of machine guns: J.D. Mills of New York, who performed a demonstration in the loft of a carriage shop near the Willard Hotel. Lincoln was so impressed that a second demonstration was held for the president, five generals and three Cabinet members. The generals were equally impressed and ready to place an order on the spot. But Ripley stubbornly managed to delay any action.

Lincoln was also stubborn and personally ordered 10 guns from Mills for $1,300 each without consulting anyone. It was the first machine gun order in history.

Then, on Dec. 18, 1861, General George McClellan bought 50 of the guns on a cost-plus basis for $750 each. Two weeks later, a pair of these guns debuted in the field under Colonel John Geary, a veteran of the Mexican War, the first mayor of San Francisco and, later, governor of both Kansas and Pennsylvania. Surprisingly, he wrote a letter saying they were “inefficient and unsafe to the operators.” But the colorful explorer General John C. Fremont, who commanded in West Virginia, sent an urgent dispatch to Ripley demanding 16 of the new machine guns.

Ripley characteristically replied:

“Have no Union Repeating Guns on hand and am not aware that any have been ordered.”

After several other tests produced mixed results, Scientific American wrote a requiem for the weapon, saying, “They had proved to be of no practical value to the Army of the Potomac and are now laid up in a storehouse in Washington.”

Then, belatedly, came a gifted inventor, Richard J. Gatling, who patented a six-barrel machine gun on Nov. 4, 1862. Gatling tried to interest Lincoln, who had now turned to other new weapons. However, some managed to get into service and three were used to help guard The New York Times building in the draft riots of July 1863. The guns eventually made Gatling rich and famous, but it was more than a year after the end of the war – Aug. 14, 1866 – when the U.S. Army became the first to adopt a machine gun … Gatlings!

It is always fun to consider counterfactuals (i.e. expressing what might have happened under different circumstances). In this case, if Andrew Jackson had hanged Ripley, then the North would have had vastly superior weaponry – especially the machine gun – and the war would have ended two years earlier. Many battles would have been avoided … Gettysburg … Sherman’s March to the Sea. Lincoln would have made a quick peace, thereby avoiding the assassination on April 14, 1865.

If … if … if …

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

How Far Will We Go In Amending American History?

A collection of items related to the dedication of the Washington Monument went to auction in May 2011.

By Jim O’Neal

Four years ago, George Clooney, Matt Damon and Bill Murray starred in a movie titled The Monuments Men, about a group of almost 400 specialists who were commissioned to try to retrieve monuments, manuscripts and artwork that had been looted in World War II.

The Germans were especially infamous for this, shipping long strings of railroad cars filled with loot from all over Europe to German generals in Berlin. While they occupied Paris, they nearly stripped the city of its fabled collections of works by the world’s greatest artists. Small hoards of hidden art are still being discovered today.

In the United States, another generation of anti-slavery groups is doing the exact opposite: lobbying to have statues and monuments removed, destroyed or relocated to obscure museums to gather dust out of the public eye. Civil War flags and memorabilia on display were among the first to disappear, followed by Southern generals and others associated with the war. Now, streets and schools are being renamed. Slavery has understandably been the reason for the zeal to erase the past, but it sometimes appears the effort is slowly moving up the food chain.

More prominent names like President Woodrow Wilson have been targeted; for several years, protesters have pressured Princeton University over the way it still honors Wilson, asserting he was a Virginia racist. Last year, Yale removed John C. Calhoun’s name from one of its residential colleges because he was one of the more vocal advocates of slavery, opening the path to the Civil War by supporting states’ rights to decide the slavery issue in South Carolina (which is an unquestionable fact). Dallas finally got around to removing some prominent Robert E. Lee statues, although one of the forklifts broke in the process.

Personally, I don’t object to any of this, especially if it helps to reunite America. So many different things seem to end up dividing us even further and this only weakens the United States (“United we stand, divided we fall”).

However, I hope to still be around if (when?) we erase Thomas Jefferson from the Declaration of Independence and are only left with George Washington and his extensive slavery practices (John Adams did not own slaves and Massachusetts was probably the first state to outlaw it).

It would seem relatively easy to rename Mount Vernon or even Washington, D.C., the nation’s capital. But the Washington Monument may be an engineering nightmare. The Continental Congress proposed a monument to the Father of Our Country in 1783, even before the treaty conferring American independence was received. It was to honor his role as commander-in-chief during the Revolutionary War. But when Washington became president, he canceled it since he didn’t believe public money should be used for such honors. (If only that ethos were still around.)

But the idea for a monument resurfaced on the centennial of Washington’s birth in 1832 (Washington died in 1799). A private group, the Washington National Monument Society – headed by Chief Justice John Marshall – was formed to solicit contributions. However, they were not sophisticated fundraisers since they limited gifts to $1 per person a year. (These were obviously very different times.) The restriction was exacerbated by the economic depression that gripped the country, and the laying of the cornerstone was delayed until July 4, 1848. An obscure congressman by the name of Abraham Lincoln was in the cheering crowd.

Even by the start of the Civil War 13 years later, the unsightly stump was still only 170 feet high, a far cry from the 600 feet originally projected. Mark Twain joined in the chorus of critics: “It has the aspect of a chimney with the top broken off … It is an eyesore to the people. It ought to be either pulled down or built up and finished.” Finally, President Ulysses S. Grant got Congress to appropriate the money, construction resumed, and the monument ultimately opened in 1888. At the time, it was 555 feet tall and the tallest building in the world … a record that was eclipsed the following year when the Eiffel Tower was completed.

For me, it’s an impressive structure, with its sleek marble silhouette. I’m an admirer of the simplicity of plain, unadorned obelisks, since there are so few of them (only two in Maryland that I’m aware of). I realize others consider it on a par with a stalk of asparagus, but I’m proud to think of George Washington every time I see it.

Even so, if someday someone thinks it should be dismantled as the last symbol of a different period, they will be disappointed when they learn of all the other cities, highways, lakes, mountains and even a state that remain to go. Perhaps we can find a better use for all of that passion, energy and commitment and start rebuilding a crumbling infrastructure so in need of repairs. One can only hope.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Here’s Why Washington Remains Our Greatest President

A George Washington inaugural button, perhaps the earliest artifact that refers to Washington as the “Father of His Country,” realized $225,000 at a February 2018 Heritage auction.

By Jim O’Neal

Presidential scholars typically list George Washington, Abraham Lincoln and Franklin Delano Roosevelt as our finest presidents. I tend to favor Washington since without him, we would probably have a much different country in so many aspects. If there were any doubts about the feats of the “Father of Our Country,” they were certainly dispelled in 2005 when David McCullough’s 1776 hit bookstores, followed five years later by Ron Chernow’s masterful Washington: A Life, which examined the man in exquisite detail. They didn’t leave much ground uncovered, but a few tidbits haven’t become overused and remain interesting for those seeking fresh anecdotes.

For example, Washington wasn’t aware that on Nov. 30, 1782, a preliminary Treaty of Paris was signed that brought American Revolutionary hostilities to an end. The United States was prevented from dealing directly with Great Britain due to an alliance with France that stipulated we would not negotiate with Britain without them. Had he known, Washington would have been highly suspicious since King George III had vowed to “push the war as long as the nation will find men or money.” In a way, Washington would have been right since the United States had demanded full recognition as a sovereign nation, in addition to removal of all troops and fishing rights in Newfoundland. The king rejected this since he was still determined to keep the United States as a British colony, with greater autonomy. Ben Franklin naturally opposed this and countered by proposing that all of Canada be added to the United States. And so it went until May 12, 1784, when the documents bringing the Revolutionary War to an end were finally ratified and exchanged by all parties.

It was during these protracted negotiations that Washington worried the army might lose its fighting edge. He kept drilling the troops while issuing a steady stream of instructions: “Nothing contributes so much to the appearance of a soldier, or so plainly indicates discipline, as an erect carriage, firm step and steady countenance.” After all these years of hardship and war, Washington was still a militant committed to ending the haughty pride of the British. To help ensure the fighting spirit of his army, Washington introduced a decoration designated the Badge of Military Merit on Aug. 7, 1782. He personally awarded three and then authorized his subordinate officers to issue them in cases of unusual gallantry or extraordinary fidelity and essential service. Soldiers received a purple heart-shaped cloth, to be worn over the left breast. After a long lapse, it was redesigned as today’s Purple Heart medal, awarded to those wounded or killed in action. The modern medal was established on Feb. 22, 1932, the 200th anniversary of Washington’s birth.

The victorious conclusion of the Revolutionary War left many questions unanswered concerning American governance, prominently the relationship between the government and the military. At the end, army officers had several legitimate grievances. Congress was in arrears with pay and had not settled officer food and clothing accounts or made any provisions for military pensions. In March 1783, an anonymous letter circulated calling on officers to take a more aggressive stance, draw up a list of demands, and even possibly defy the new government! Washington acted quickly, calling for a meeting of all officers and at the last moment delivered one of the most eloquent and important speeches of his life.

After the speech, he drew a letter from a pocket that outlined Congressional actions to be undertaken. He hesitated and then fumbled in his pockets and remarked, “Gentlemen, you will permit me to put on my spectacles, for I have not only grown gray, but almost blind, in the service of my country.” By all accounts, the officers were brought to tears, and the potentially dangerous conspiracy collapsed immediately.

He gets my vote.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

After Independence, United States and South America Took Different Paths

This Banco de Venezuela 1000 Bolivares ND, circa 1890, proof, featuring Simón Bolívar, sold for $2,115 at a January 2017 Heritage auction.

By Jim O’Neal

Important scholars believe that Simón Bolívar should have been the George Washington of South America. He, too, overthrew an empire (Spain) – but obviously failed to create an Estados Unidos of South America. The American Revolution not only achieved unity for the former British colonies; independence also set the United States on the road to unsurpassed prosperity and power. South America ended up in a different place entirely due to a complex set of events.

Bolívar was born in Caracas in July 1783 to a prosperous, aristocratic family. He was an orphan by age 10 and a soldier by the tender age of 14. He studied in Spain and France, spending time in Paris even after all foreigners were expelled in response to a food shortage. He returned to Venezuela by 1807, inspired by Napoleon and disgusted with Spanish rule. Sent to London to seek British help, he met Francisco de Miranda, the veteran campaigner for Venezuelan independence.

On their return in July 1811, they boldly proclaimed the First Republic of Venezuela.

The Republic ended in failure: a flawed constitution excluded a disproportionate number of citizens from voting, and Bolívar betrayed Miranda to the Spanish before fleeing to New Granada. From there, he proclaimed a Second Republic – with himself in the role of dictator, earning the epithet El Libertador.

Eventually, Bolívar became master of what he termed Gran Colombia, which encompassed New Granada, Venezuela and Quito (modern Ecuador). José de San Martín, the liberator of Argentina and Chile, yielded political leadership to him. By April 1825, his men had driven the last Spanish forces from Peru, and Upper Peru was renamed Bolivia in his honor. The next step was to create an Andean Confederation of Gran Colombia, Peru and Bolivia. Why did Bolívar fail to establish this as the core of a United States of Latin America? The superficial answer – his determination to centralize power and the resistance of local warlords – misses much more complicated circumstances.

First is that South Americans had virtually no experience or history in democratic decision-making or representative government of the sort that had been normal in North America’s colonial assemblies. So Bolívar’s dream of democracy turned out to be dictatorship because, as he once said, “our fellow citizens are not yet able to exercise their rights … because they lack the political virtues that characterize true republicans.” Under the constitution he devised, Bolívar was to be dictator for life, with the right to nominate his successor. “I am convinced to the very marrow of my bones that America can only be ruled by an able despotism … We cannot afford to place laws above leaders and principles above men.”

For remaining skeptics, perhaps it is better to let Simón Bolívar explain in his own words, in a letter he wrote a month before his death in December 1830:

“I ruled for 20 years … and I derived only a few certainties:

  • South America is ungovernable.
  • Those who serve a revolution plough the sea.
  • The only thing one can do in America is to emigrate.
  • This country will fall inevitably into the hands of the unbridled masses.
  • Once we have been devoured by every crime and extinguished by utter ferocity, the Europeans will not even regard us as worth conquering.
  • If it were possible for any part of the world to revert to primitive chaos, it would be [South] America in her final hour.”

This was a painfully accurate prediction for the next 150 years of Latin American history, and the result was a cycle of revolution and counter-revolution, coup and counter-coup. One only needs to read about Venezuela today to grasp how dire the future remains.

By the way, much of this insight comes by way of the highly recommended Civilization: The West and the Rest by Niall Ferguson.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].