America’s greatest contribution to humanity remains ‘freedom’

An 1852 presentation copy of the Constitution of the United States, signed by President Millard Fillmore, sold for $15,000 at an April 2016 Heritage auction.

By Jim O’Neal

Constitutional scholars are prone to claim there is a direct and historic link between the First Commandment of the Old Testament and the First Amendment of the U.S. Constitution … that one leads inexorably to the other. The First Commandment explains the origins of freedom and the First Amendment defines the scope of freedom in broad categories.

Both point unmistakably to freedom as America’s greatest contribution to humanity. Not the automobile, jazz or Hollywood. Not the word processor, the internet or the latest smartphones. All of these are often described as America’s unique assets, but it is the awesome concept of “freedom” that is America’s ultimate symbol, attraction … and even export!

The First Commandment reads, “I am the Lord thy God, who brought thee forth out of Egypt, out of the house of bondage, and thou shalt have no other gods before me.” In this way, God sanctions an escape from bondage and puts people on a path toward the “promised land,” toward freedom. This powerful message ricocheted through history until it finally found a permanent home in Colonial America.

In the early 18th century, the trustees of Yale, many of them scholars who read scripture in the original Hebrew, designed a coat of arms for the college in the shape of a book open to two Hebrew words, Urim and Thummim, rendered in English as “Light” and “Truth.” The book depicted, of course, was the Bible.

Not too far north, Harvard graduates had to master three languages … Latin, Greek and Hebrew. True gentlemen in those days had to have more than a passing familiarity with the Old Testament. It was not a mere coincidence that carved into the Liberty Bell in Philadelphia, relaying a brave message to a people overthrowing British rule, was an uplifting phrase selected from chapter 25 of Leviticus: “Proclaim liberty throughout all the land unto all the inhabitants thereof.”

The Commandment that blessed an escape from oppression and embraced the pursuit of freedom led the Founding Fathers to pen the Bill of Rights. They had much on their minds, many interests to try to reconcile, but they agreed that the delineation of freedom was to be their primary responsibility. It became the First Amendment to the recently ratified Constitution, inspired in part by the First Commandment, and it reads, “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the government for a redress of grievances.”

Throughout American history, these freedoms have become intertwined with American life, one indistinguishable from the other. Just consider that in one small grouping of words, in a single amendment, resides more freedom for mankind than had ever existed in the history of the world. Somewhat remarkable, in my opinion, yet we take it for granted today, quibbling only over fine points of original intent in relatively minor cases.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Believe it or not, electing presidents has never been a pleasant affair

An 1889 letter in which Rutherford B. Hayes discusses his inauguration sold for $19,120 at an April 2007 Heritage auction.

By Jim O’Neal

One discouraging trend in American culture is treating everything from a partisan-political standpoint. I can recall a time, not too long ago, when after an election we would simply set aside our disagreements over the candidates and resume normal civility. Now nearly everything gets politicized, dividing the nation into continually warring tribes of Red and Blue. Some political pundits see the starting point as the 2000 Gore-versus-Bush election, with its hanging chads and the controversial Supreme Court decision to stop the vote recount in Florida. Others believe the feud between President Bill Clinton and Speaker Newt Gingrich exacerbated it.

However, to accept either theory requires ignoring the 1876 presidential election between Samuel Tilden and Rutherford B. Hayes.

Hayes, the Republican, was a lawyer from Ohio who distinguished himself during the Civil War as a brave soldier who was wounded five times and eventually promoted to a brevet major general. After the war, he served in Congress and was elected governor of Ohio three times.

Tilden also had a legal background and was the 25th governor of New York (1875-76). As the Democratic candidate for the presidency in 1876, he is still the only individual to win an outright majority (not just a plurality) of the popular vote, but lose the election … in a rather bizarre series of events. Four other candidates have lost the presidency despite having a plurality of the popular vote (Al Gore and Hillary Clinton are the most recent to suffer this fate).

It had generally been assumed that incumbent President Ulysses S. Grant would run for a third term, despite a troubled economy and numerous scandals that had been discovered during his two terms, which started in 1869. There was also the two-term precedent established by George Washington. In spite of these formidable barriers, Grant’s inner circle of advisors was eager to maintain political power. While Grant was on the verge of announcing his candidacy, the House of Representatives preempted him by passing a resolution, by an overwhelming 233-18 margin, affirming the two-term tradition as a safeguard against dictatorship. Grant reluctantly withdrew his name from consideration.

The Democrats proceeded with their National Convention in June 1876 in St. Louis (the first time a major political convention was held west of the Mississippi). They selected Tilden on the second ballot and added Thomas Hendricks for vice president, since he was the only one nominated. The Democrats were hungry for a win since they had been out of power since James Buchanan, who was elected a full 20 years earlier in 1856.

What followed was the most contentious presidential election in American history. On the first vote in the Electoral College, Tilden had 184 votes (only one short) while Hayes was stuck at 165. However, there were 20 votes being contested in four states (Florida, Louisiana, South Carolina and Oregon) and both parties were claiming victory. This impasse caused a Constitutional crisis and, finally, a beleaguered Congress passed a law on Jan. 29, 1877, to form a special 15-member Electoral Commission to settle the dispute. After a great debate, the commission awarded all 20 disputed votes to Hayes, who became president with 185 votes to Tilden’s 184.

In return, Republicans agreed, in the informal Compromise of 1877, to end Reconstruction and remove the remaining federal troops from the Southern states. Over the next 20 years, those states passed all kinds of laws and regulations that effectively nullified the provisions of the 14th and 15th Amendments granting numerous rights to the black population. It would take nearly another century, until LBJ was president, to restore those rights and finally crack the “Solid South” grip on national politics.

Maybe we are doomed to be a divided nation, but I suspect that strong leaders will emerge, eventually, and help us remember the advantages of a group of united states … E pluribus unum.

As court controversy rages, let’s not forget what we do best

A photograph of Franklin D. Roosevelt signed and inscribed to Eleanor Roosevelt sold for $10,000 at an October 2016 Heritage auction.

By Jim O’Neal

The Supreme Court was created by the Constitution, but the document wisely calls for Congress to decide the number of justices. This was vastly superior to a formula based on the number of states or population, which would have resulted in a large, unwieldy committee. The 1789 Judiciary Act established the initial number at six, with a chief justice and five associates all selected by President Washington.

In 1807, the number was increased to seven (to avoid tie votes) and in 1837 to nine, and then to 10 in 1863. The Judiciary Act of 1866 temporarily reduced the court to seven in response to post-Civil War politics and the Andrew Johnson presidency. Finally, the 1869 Act settled on nine, where it has remained to this day. The major concern has consistently been over the activities of the court and the fear it would inevitably try to create policy rather than evaluate it (ensuring that Congressional legislation was lawful and conformed to the intent of the Constitution).

The recent confirmation hearings are the latest example of both political parties vying for advantage by using the court to shape future policies, reflecting political partisanship at its worst. Although the Supreme Court cannot enforce its decisions, since Congress has the power of the purse and the president the power of force, the court has taken on a de facto legislative function through its deliberations. In a sharply divided nation, policy on most issues has become the victim, largely because Congress is unable to find consensus. The appellate process is simply a poor substitute for this legislative weakness.

We have been here before and it helps to remember the journey. Between 1929 and 1945, two great travails were visited on our ancestors: a terrible economic depression and a world war. The economic crisis of the 1930s was far more than the result of the excesses of the 1920s. In the 100 years before the 1929 stock-market crash, our dynamic industrial revolution had produced a series of boom-bust cycles, inflicting great misery on capital and on many people. Even the fabled Roaring ’20s had excluded great segments of the population, especially blacks, farmers and newly arrived immigrants. Who or what to blame?

“[President] Hoover will be known as the greatest innocent bystander in history, a brave man fighting valiantly, futilely, to the end,” populist newspaperman William Allen White wrote in 1932.

The same generation that suffered through the Great Depression was then faced with war in Europe and Asia, the rationing of common items, entrance to the nuclear age and, eventually, the responsibilities for rebuilding the world. Our basic way of life was threatened by a global tyranny with thousands of nukes wired to red buttons on two desks 4,862 miles apart.

FDR was swept into office in 1932 during the depth of the Great Depression and his supporters believed he possessed just what the country needed: inherent optimism, confidence, decisiveness, and the desire to get things done. We had 13 million unemployed, 9,100 banks closed, and a government at a standstill. “This nation asks for action, and action now,” he declared in his first inaugural address.

In his first 100 days, Roosevelt swamped Congress with a score of carefully crafted legislative actions designed to bring about economic reforms. Congress responded eagerly. But the Supreme Court, now dubbed the “Nine Old Men,” said no to most New Deal legislation by votes of 6-3 or 5-4, making mincemeat of the proposals. The economy improved nonetheless, carrying FDR to an even bigger landslide re-election in 1936: 60.8 percent of the popular vote and an astonishing 98.5 percent of the electoral votes, losing only Vermont and Maine.

In his 1937 second inaugural address, FDR declared, “I see one-third of a nation ill-housed, ill-clad, ill-nourished.” He called for more federal support. However, Treasury Secretary Henry Morgenthau worried about business confidence and argued for a balanced budget, and in early 1937, Roosevelt, almost inexplicably, ordered federal spending reduced. Predictably, the U.S. economy went into decline. Industrial production fell 14 percent, and in October alone another half million people were thrown out of work. It was now clearly “Roosevelt’s Recession.”

Fearing that the Supreme Court would continue to nullify the New Deal, Roosevelt in his ninth Fireside Chat unveiled a new plan for the judiciary. He proposed that the president be given the power to appoint additional justices, up to a maximum of six, one for every sitting justice over the age of 70½ who declined to retire. The Judicial Procedures Reform Bill of 1937 (known as the “court-packing plan”) hopelessly split the Democratic majority in the Senate, caused a storm of protest from bench to bar, and created an uproar among both Constitutional conservatives and liberals. The bill was doomed from the start; even the Senate Judiciary Committee reported it to the floor unfavorably, 10-14. The Senate vote was even worse … 70-20 to bury it.

We know how that story ended, as Americans were united to fight a Great War and then do what we do best: work hard, innovate and preserve the precious freedoms our forebears guaranteed us.

Unite vs. Fight seems like a good idea to me.

Notorious traitors? Let’s look at Benedict Arnold

A May 24, 1776, letter by Benedict Arnold, signed, to Gen. William Thompson, realized $23,750 at an April 2016 Heritage auction.

By Jim O’Neal

Vidkun Quisling is an obscure name from World War II. To those unfamiliar with some of the lesser-known details, “Quisling” has become a synonym for a traitor or collaborator. From 1942 to 1945, he was Prime Minister of Norway, heading a pro-Nazi puppet government after Germany invaded. For his role, Quisling was put on trial for high treason and executed by firing squad on Oct. 24, 1945.

Obviously better known are Judas Iscariot of Last Supper fame (30 pieces of silver); Guy Fawkes, who tried to assassinate King James I by blowing up Parliament (the Gunpowder Plot); and Marcus Junius Brutus, who stabbed Julius Caesar (“Et tu, Brute?”). In American history, it’s a close call between John Wilkes Booth and Benedict Arnold.

Arnold

The irony concerning Benedict Arnold (1741-1801) is that his early wartime exploits had made him a legendary figure, but Arnold never forgot the slight he received in February 1777, when Congress bypassed him while naming five new major generals … all of them junior to him. Afterward, George Washington pledged to help Arnold “with opportunities to regain the esteem of your country,” a promise he would live to regret.

Unknown to Washington, Arnold had already agreed to sell secret maps and plans of West Point to the British via British Maj. John André. There have always been honest debates over Arnold’s real motives for this treacherous act, but it seems clear that purely personal gain was the primary objective. Heavily in debt, Arnold had brokered a deal under which the British would pay him 6,000 pounds sterling and award him a commission in the British Army for his treason. There is also little doubt that his wife Peggy was a full accomplice, despite a dramatic performance pretending to have lost her mind rather than her loyalty.

The history of West Point can be traced to its occupation by the Continental Army after the Second Continental Congress (1775-1781) was designated to manage the Colonial war effort. West Point – first known as Fort Arnold and later renamed Fort Clinton – was strategically located on high ground overlooking the Hudson River, with commanding views extending all the way to New York City, ideal for military purposes. Later, in 1801, President Jefferson ordered plans to establish the U.S. Military Academy there, and West Point has since churned out many distinguished military leaders … first for the Mexican-American War and then for the Civil War, including both Ulysses S. Grant and Robert E. Lee. It is the oldest continuously operating Army post in U.S. history.

To understand this period in American history, it helps to start at the end of the Seven Years’ War (1756-63), which was really a global conflict that included every major European power and spanned five continents. Many historians consider it “World War Zero,” and on the same scale as the two 20th century wars. In North America, the skirmishes started two years earlier in the French and Indian War, with Great Britain an active participant.

The Treaty of Paris in 1763 ended the conflict, with the British winning a stunning series of battles, France surrendering its Canadian holdings, and the Spanish ceding its Florida territories in exchange for Cuba. Consequently, the British Empire emerged as the most powerful political force in the world. The only issue was that these conflicts had nearly doubled England’s debt, from 75 million to 130 million pounds sterling.

A young King George III and his Parliament quietly noted that the Colonies were nearly debt free and decided it was time for them to pay for the 8,000-10,000 Redcoat peacetime militia stationed in North America. In 1764, they passed legislation via the Currency Act and the Sugar Act. This limited inflationary Colonial currency and cut the trade duty on foreign molasses. In 1765, they struck again. Twice. The Quartering Act forced the Colonists to pay for billeting the king’s troops. Then the infamous Stamp Act placed direct taxes on Americans for the first time.

This was one step too far and inevitably led to the Revolutionary War, with armed conflict that involved hot-blooded, tempestuous individuals like Benedict Arnold. A brilliant military leader of uncommon bravery, Arnold poured his life into the Revolutionary cause, sacrificing his family life, health and financial well-being for a conflict that left him physically crippled. Sullied by false accusations, he became profoundly alienated from the American cause for liberty. With his bitterness unknown to Washington, on Aug. 3, 1780, the future first president announced that Arnold would take command of the garrison at West Point.

The newly appointed commander calculated that turning West Point over to the British, perhaps along with Washington himself, would end the war in a single stroke by giving the British control of the Hudson River. The conspiracy failed when André was captured with incriminating documents. Arnold fled to a British warship, and the British refused to trade him for André, who was hanged as a spy after his plea to be shot by firing squad was denied. Arnold went on to lead British troops in Virginia, survived the war, and eventually settled in London. He quickly became the most vilified figure in American history and remains a symbol of treason to this day.

Gen. Nathanael Greene, often called Washington’s most gifted and dependable officer, summed it up after the war most succinctly: “Since the fall of Lucifer, nothing has equaled the fall of Arnold.”

Selecting a justice has always been a messy, partisan process

This photograph, circa 1968, autographed by Chief Justice Earl Warren and the eight associate justices, sold for $2,031 at a June 2010 Heritage auction.

By Jim O’Neal

The Senate Judiciary Committee began hearings this week to consider the nomination of Judge Brett Kavanaugh to the Supreme Court in their “advise and consent” role to the president of the United States. Once considered a formality in the justice system, it has devolved into a high-stakes political process and is a vivid example of how partisanship has divided governance, especially in the Senate.

Fifty years ago, President Nixon provided a preview of politics gone awry as he attempted to reshape the Supreme Court to fit his vision of a judiciary. His problems actually started during the final year of Lyndon Johnson’s presidency. On June 26, 1968, LBJ announced that Chief Justice Earl Warren intended to resign the seat he had held since 1953. He also said that he intended to nominate Associate Justice Abe Fortas as his successor.

For the next three months, the Senate engaged in an acrimonious debate over the Fortas nomination. Finally, Justice Fortas asked the president to withdraw his nomination to stop the bitter partisan wrangling. Chief Justice Warren, who had been a keen observer of the Senate’s squabbling, decided to end the controversy in a different way. He withdrew his resignation and in a moment of pique said, “Since they won’t take Abe, they will have me!” True to his promise, Warren served another full term, until June 1969.

By then, there was another new president – Richard Nixon – and he picked Warren Burger to be Warren’s replacement. Burger was a 61-year-old judge on the U.S. Court of Appeals with impeccable Republican credentials, just as candidate Nixon had promised during the 1968 presidential election campaign. As expected, Burger’s confirmation was speedy and decisive … 74-3.

Jubilant over his first nomination confirmation to the court, Nixon had also received a surprise bonus earlier in 1969. In May, Justice Fortas had decided to resign his seat on the court. In addition to the bitter debate the prior year, the intense scrutiny of his record had uncovered a dubious relationship with Louis Wolfson, a Wall Street financier sent to prison for securities violations. To avoid another Senate imbroglio over some shady financial dealings, Fortas decided to resign. In stepping down, Fortas became the first Supreme Court justice to resign under threat of impeachment.

So President Nixon had a second opportunity to add a justice. After repeating his criteria for Supreme Court nominees, Nixon chose Judge Clement Haynsworth Jr. of the U.S. Court of Appeals, Fourth Circuit, to replace Fortas. Attorney General John Mitchell had encouraged the nomination since Haynsworth was a Harvard Law alumnus and a Southern jurist with conservative judicial views. He seemed like an ideal candidate since Nixon had a plan to gradually reshape the court.

However, to the president’s anger and embarrassment, Judiciary Committee hearings exposed clear evidence of financial and conflict-of-interest improprieties. Nothing rose to the level of illegality, but how could the Senate force Fortas to resign and then overlook essentially the same issues? The Judiciary Committee approved Haynsworth 10-7, but on Nov. 21, 1969, the full Senate rejected the nomination 55-45. A livid Nixon blamed anti-Southern, anti-conservative partisans for the defeat.

The president – perhaps in a vengeful mood – quickly countered by nominating Judge G. Harrold Carswell of Florida, a little-known, undistinguished former U.S. District Court judge with only six months’ experience on the Court of Appeals. The Senate seemed inclined to approve him, until suspicious reporters discovered a statement in a speech he had made to the American Legion more than 20 years earlier, in 1948: “I yield to no man as a fellow candidate or as a citizen in the firm, vigorous belief in the principles of White Supremacy and I shall always be so governed!”

Oops.

Even allowing for his youth and other small acts of racial bias, the worst was yet to come. It turned out that he was a lousy judge with a poor grasp of the law. His floor manager, U.S. Senator Roman Hruska, a Nebraska Republican, then made a fumbling, inept attempt to convert Carswell’s mediocrity into an asset: “Even if he were mediocre, there are a lot of mediocre judges and people and lawyers. They are entitled to a little representation, aren’t they, and a little chance?” This astonishing assertion was then compounded when it was seconded by Senator Russell Long, a Democrat from Louisiana! When the confirmation vote was taken on April 8, 1970, Judge Carswell’s nomination was defeated 51-45.

A bitter President Nixon, with two nominees rejected in less than six months, continued to blame it on sectional prejudice and philosophical hypocrisy. So he turned to the North and selected Judge Harry Blackmun, a close friend of Chief Justice Burger who urged his nomination. Bingo … he was easily confirmed by a vote of 94-0. At long last, the vacant seat of Abe Fortas was filled.

There would be no further vacancies for 15 months, but in September 1971, justices Hugo Black and John Harlan, both gravely ill, announced their retirements from the court. Nixon was finally able to develop a strategy to replace these two distinguished jurists, but only after a complicated and convoluted process. It would ultimately take Nixon eight tries to fill four seats, and the process has only become more difficult.

Before Judge Kavanaugh is able to join the court, as is widely predicted, expect the opposing party to throw up every possible roadblock. This process is now strictly political and dependent on partisan voting advantages. The next big event will probably involve Justice Ruth Bader Ginsburg, a member of the court for 25 years (since 1993) and only the second woman on the court, after Sandra Day O’Connor. At age 85, you can be sure that Democrats are wishing her good health until they regain control of the Oval Office and the Senate. If not, stay tuned for the Battle of the Century!

100 years before Rosa Parks, there was Octavius Catto

Rosa Parks refused to give up her seat on a segregated bus, sparking the Montgomery, Ala., bus boycott.

By Jim O’Neal

Most Americans are familiar with Rosa Parks and recall the heroic story of a weary black woman on her way home after a hard day at work who refused to give up her seat and “move to the back of the bus” to make room for white people. The date was Dec. 1, 1955, and the city was Montgomery, Ala.

She was arrested and fined $10, and her defiance sparked the Montgomery bus boycott, which lasted 381 days. She was ultimately vindicated by the U.S. Supreme Court, which ruled the segregation law unconstitutional. After her death, she became the first African-American woman to have her likeness depicted in National Statuary Hall in the U.S. Capitol.

Parks (1913-2005) earned her way into the pantheon of civil rights leaders, but few remember a remarkable man who preceded her by a century when streetcars were pulled by horses.

Catto

His name was Octavius Valentine Catto (1839-1871) and history was slow in recognizing his astonishing accomplishments. Even the epitaph on his tombstone shouts in bold letters “THE FORGOTTEN HERO.” One episode in his far-too-short but inspiring life is eerily similar to the events in Montgomery, only dramatically more so. Catto was a fierce enemy of the entire Philadelphia trolley car system, which banned black passengers. On May 18, 1865, The New York Times ran a story about an incident involving Catto that occurred the previous afternoon in Philadelphia, “The City of Brotherly Love” (at least for some).

Paraphrasing the story, it describes how a colored man (Catto) refused all attempts to get him to leave a strictly segregated trolley car. Frustrated, and fearing a fine if he physically ejected the passenger, the conductor cleverly sidetracked the car, detached the horses and left the defiant rider in the now-empty stationary car. Apparently, the stubborn man was still on board after spending the night. The incident caused a neighborhood sensation that led even more people to challenge the rules.

The following year, there was an important meeting of the Pennsylvania State Equal Rights League to protest the forcible ejection of several black women from Philadelphia streetcars. The intrepid Catto presented a number of resolutions that highlighted the inequities of segregation, the principles of freedom and civil liberty, and a heavily biased judicial system. He also boldly solicited support from fellow citizens in his quest for fairness and justice.

He got specific help from Pennsylvania Congressman Thaddeus Stevens, a leader of the “Radical Republicans” who had a fiery passion for desegregation and abolition of slavery, and who criticized President Lincoln for lack of more forceful action. Stevens is a major character in Steven Spielberg’s 2012 film Lincoln, with Tommy Lee Jones gaining an Oscar nomination for his portrayal of Stevens. On Feb. 3, 1870, the 15th Amendment to the Constitution guaranteed suffrage to black men (women of all colors would have to wait another 50 years, until 1920, to gain the right to vote in all states). It would also lead to Catto’s death. On Election Day, Oct. 10, 1871, Catto was out encouraging black men to vote for Republicans. He was fatally shot by white Democrats who wanted to suppress the black vote.

Blacks continued to vote heavily for Republicans until the early 20th century and were not even allowed to attend Democratic conventions until 1924. This was primarily because the white political establishments of the Southern states discouraged equal rights and supported Jim Crow laws that oppressed blacks. As comedian Dick Gregory (1932-2017) famously joked, when told at a white lunch counter, “We don’t serve colored people here,” he replied, “That’s all right. I don’t eat colored people … just bring me a whole fried chicken!”

Octavius Catto, who broke segregation on trolley cars and was an all-star second baseman long before Jackie Robinson, would have to wait until the 20th century to get the recognition he deserved. I suspect he would be surprised that we are still struggling to “start a national conversation” about race when that’s what he sacrificed his life for.

Usual fireworks expected with latest Supreme Court selection

This photograph, signed by Supreme Court Chief Justice William H. Taft and the eight associate justices, circa 1927, sold for $14,340 at a September 2011 Heritage auction.

By Jim O’Neal

It is that time again when the news will be filled with predictions of pestilence, war, famine and death (the Four Horsemen of the Apocalypse) as President Trump tees up his next candidate for the Supreme Court. One side will talk about the reversal of Roe v. Wade as an example of the terrible future that lies ahead. The other side will be quick to point out that this fear-mongering first started in 1981 when Sandra Day O’Connor (the first woman to serve on the court) was nominated by President Reagan and that nothing has happened in the intervening 37 years.

My prediction is that regardless of who is confirmed, the nominee’s record will contain no past opinions on “Roe,” and he or she will have been groomed by the “Murder Boards” to answer that it is settled law. Murder Boards are groups of legal experts who rehearse the nominee on how to answer every possible question the Senate Judiciary Committee might ask on any subject, not just Roe, in its advice-and-consent role. The result is what former Vice President Joe Biden, during his years in the Senate, described as a “Kabuki dance.”

The questioning does produce great public theater, but it is a tradition that dates to 1925 when nominee Harlan Stone actually requested he be allowed to answer questions about rumors of improper ties to Wall Street. It worked and he was confirmed by a vote of 71-6 and would later serve as Chief Justice (1941-46). In 1955, John Marshall Harlan II was next when Southern Senators wanted to know his views on public school desegregation vis-à-vis Brown v. Board of Education. He was also successfully confirmed 71-11 and since then, every nominee to the court has been questioned by the Senate Judiciary Committee. The apparent record is the 30 hours of grilling Judge Robert Bork experienced in 1987, when he got “Borked” by trying to answer every single question honestly. Few make that mistake today.

Roe v. Wade was a 1973 case on whether a state could constitutionally make it a crime to perform an abortion except to save the mother’s life. Abortion had a long legal history in America dating to the 1820s, when anti-abortion statutes began to appear that resembled an 1803 British law making abortion illegal after “quickening” (the start of fetal movements). The rationales varied: illegal sexual conduct, unsafe procedures and the state’s responsibility to protect prenatal life.

The criminalization accelerated from the 1860s, and by 1900 abortion was a felony in every state. Despite this, the practice continued to grow, and in 1921 Margaret Sanger founded the American Birth Control League. By the 1930s, licensed physicians were performing an estimated 800,000 procedures each year. In 1967, Colorado became the first state to decriminalize abortion in cases of rape, incest or permanent disability of the woman; by 1972, 13 states had similar laws. In 1970, Hawaii became the first state to legalize abortion at the request of the woman. So the legal situation prior to Roe was that abortion was illegal in 30 states and legal, under certain conditions, in the other 20.

“Jane Roe” was an unmarried pregnant woman who supposedly wished to terminate her pregnancy and instituted an action in the U.S. District Court for the Northern District of Texas. A three-judge panel found the Texas criminal statutes unconstitutionally vague and held that the right to choose whether to have children was protected by the 9th and 14th Amendments. All parties appealed, and on Jan. 22, 1973, the Supreme Court ruled the Texas statute unconstitutional. The court declined to define when human life begins.

Jane Roe’s real name was Norma McCorvey. She became a pro-life advocate before her death, maintaining that she never had the abortion and that she was the victim of two young, ambitious lawyers looking for a plaintiff. Henry Wade was district attorney of Dallas from 1951 to 1987, the longest-serving DA in United States history. He also led the prosecution of Jack Ruby for killing Lee Harvey Oswald. Ruby was convicted, appealed and had the verdict overturned, but he died of lung cancer before a retrial and is therefore constitutionally presumed innocent.

Stay tuned for the fireworks.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

As Nation Moved to Civil War, the North had the Financial Edge

Richard Montgomery was an Irish soldier who served in the British Army before joining the Continental Army.

By Jim O’Neal

Richard Montgomery (1738-75) was a little-known hero-soldier born in Dublin, Ireland, who became a captain in the British Army in 1756. Later, he became a major general in the Continental Army after the Continental Congress elected George Washington as Commander in Chief of the Continental Army in June 1775. This position was created specifically to coordinate the military efforts of the 13 Colonies in the revolt against Great Britain.

Montgomery was killed in a failed attack on Quebec City led by General Benedict Arnold (before he defected). Montgomery was mourned in both Britain and America, and his remains were interred at St. Paul’s Chapel in New York City.

A remarkably diverse group of schools, battleships and cities named in his honor survives to this day. Montgomery, Ala., is the capital and second-largest city in the state; it’s where Rosa Parks refused to give up her bus seat to a white passenger on Dec. 1, 1955, sparking the famous Montgomery bus boycott. Martin Luther King Jr. used Montgomery to great advantage in organizing the civil rights movement.

Montgomery was also the first capital of the Provisional Congress of the Confederate States when the first meeting was convened in February 1861. The first seven states that seceded from the United States had hastily selected representatives to visit the new Confederate capital. They arrived to find dirty hotels, dusty roads and noisy lobbyists overflowing the statehouse. Montgomery was not prepared to host any large group, especially a large political convention.

Especially notable was that most of the South’s most talented men had already joined the Army or the Cabinet, or were headed for diplomatic assignments. By default, the least-talented legislators were given the responsibility of writing a Constitution, installing the new president (Jefferson Davis) and authorizing a military force of up to 400,000 men. This conscription was for three years or the duration of the war. As in the North, virtually everyone was confident it would be a short, decisive fight.

Jefferson Davis was a well-known name, having distinguished himself in the Mexican War and served as Secretary of War for President Franklin Pierce. Like many others, he downplayed the role of slavery in the war, seeing the battle as a long-overdue effort to overturn the exploitive economic system that was central to the North. In his view, the evidence was obvious. The North and South were like two different countries: one a growing industrial power, the other stuck in an agricultural system that had not evolved since 1800, when 80 percent of its labor force was on farms and plantations. The South now held only 18 percent of the nation’s industrial capacity, and the share was trending down.

That mediocre group of lawmakers at the first Confederate meeting was also tasked with the challenge of financing a war against a formidable enemy with vastly superior advantages in nearly every important respect. New immigrants were drawn to the North’s ever-expanding opportunities, while the slave states fell further behind in manufacturing, canals, railroads and even conventional roads, and their banking system grew weaker.

Cotton production was a genuine bright spot for the South (at least for plantation owners), but ironically, it generated even more money for the North with its vast network of credit, warehousing, manufacturing and shipping companies. The North manufactured a dominant share of boots, shoes, cloth, pig iron and almost all the firearms … an ominous fact for a people determined to fight a war. Several regions of the South were forced to import foodstuffs. Southern politicians had spoken often of the need to build railroads and manufacturing, but these were rhetorical, empty words. Cotton had become the powerful narcotic that lulled them into complacency. Senator James Hammond of South Carolina summed it up neatly in his “Cotton is King” speech on March 4, 1858: “Who can doubt, that has looked at recent events, that cotton is supreme?”

Southerners sincerely believed that cotton would rescue them from the war and “after a few punches in the nose,” the North would gladly surrender.

One of those men was Christopher G. Memminger, who was selected as Confederate States Secretary of the Treasury and responsible for rounding up gold and silver to finance the needs of the Confederate States of America (CSA). A lawyer and member of the South Carolina legislature, he was also an expert on banking law. His first priority was for the Treasury to get cash and he started in New Orleans, the financial center of the South, by raiding the mint and customs house.

He assumed there would be at least enough gold to coin money and commissioned a design for a gold coin with the goddess of liberty seated, bearing a shield and a staff flanked by bales of cotton, sugar cane and tobacco. Before any denominations were finalized, it was discovered there was not enough gold available and the mint was closed in June.

This was followed by another nasty surprise: All the banks in the South possessed only $26 million in gold, silver and coins from Spain and France. No problem. Memminger estimated that cotton exports of $200 million would be enough to secure hundreds of millions in loans. Oops. President Lincoln had anticipated this and blockaded all the ports after Fort Sumter in April 1861. No cotton, no credit, no guns.

In God we trust. All others pay cash.

One small consolation was that his counterpart in the North, Salmon P. Chase, was also having trouble raising cash and had to resort to the dreaded income tax. However, both sides managed to keep killing each other for four long years, leaving a legacy of hate.


America has a Long History of Rough-and-Tumble Politics

A cabinet card photograph dated 1852, shortly after the marriage of Rutherford and Lucy Hayes, went to auction in October 2008.

By Jim O’Neal

A surprisingly high number of political pundits ascribe the current bitter partisan divide to the presidential election of 2000, when the Supreme Court ordered the recount of “under-votes” in Florida to cease. As a result, the previously certified election results would stand and George W. Bush would receive all 25 Florida electoral votes, thus providing him a 271-266 nationwide victory over Al Gore. Democrats almost universally believed the election had been “stolen” due to the seemingly unprecedented action by the Supremes.

Although obviously a factor in the situation today, it seems too simplistic to me, as I remember the Clinton Impeachment, the start of the Iraq War (and the president who lied us into war), and, of course, Obamacare – all of which were also major contributors to the long, slow erosion of friendly bipartisanship. Now, we’re in an era when each new day seems to drag up a new issue that Americans can’t agree on and the schism widens ever so slightly.

Could it be worse?

The answer is obviously “yes,” since we once tried to kill each other into submission during the Civil War. Another good example is the highly controversial presidential election of 1876, which resulted in Rutherford B. Hayes becoming president. The loser, Samuel J. Tilden, had such staunch supporters that they promised “blood would run in the streets” if their candidate lost. After a controversial decision threw the election to Hayes, Democrats continued to make wild threats, and public disturbances were rampant in New York City hotels, saloons, bars and any other venue where crowds gathered.

The unrest was so high that outgoing President Ulysses S. Grant gradually became convinced that a coup was imminent. This was the closest the Democrats had come to the White House since James Buchanan’s election 20 years earlier in 1856, and passions would not be calmed easily. The resentment was about much more than losing an election or the ascendancy of the Republican Party with its fierce abolitionists. It seems apparent even today that the election results had been politically rigged or, at a minimum, very cleverly stolen in a quasi-legalistic maneuver.

Grant’s primary concern was one of timing. The normal inauguration date of March 4 fell on a Sunday and tradition called for it to be held the next day, on Monday, March 5 (as with Presidents James Monroe and Zachary Taylor). Thus the presidency would be technically vacant from noon on Sunday until noon on Monday. The wily old military genius knew this would be plenty of time to pull off a coup d’état. He insisted Hayes not wait to take the oath of office.

In a clever ruse, the Grants made arrangements for a secret oath-taking on Saturday evening by inviting 38 people to an honorary dinner at the White House. While the guests were being escorted to the State Dining Room, Grant and Hayes slipped into the Red Room, where Chief Justice Morrison Waite was waiting with the proper documents. All went as planned until it was discovered there was no Bible available. No problem … Hayes was sworn in as the 19th president of the United States with a simple oath.

The passing of power has been one of the outstanding aspects of our constitutional form of governance.

Hayes was born on Oct. 4, 1822 – 2½ months after his father had died of tetanus, leaving his pregnant mother with two young children. From these difficult beginnings, the enterprising “Rud” got a first-rate education that culminated in an LLB degree from Harvard Law School. Returning to Ohio, he established a law practice, served in the Civil War and eventually won non-consecutive terms as governor of Ohio, which proved to be a steppingstone to the White House.

Most historians believe Hayes and his family were the richest occupants of the White House until Herbert and Lou Hoover showed up 52 years later. They certainly had a reputation for living on the edge of extravagance, and some cynics believe this was in large part due to the banning of all alcohol in the White House (presidents in those days paid for booze and wine personally). Incidentally, the nickname for the first lady, “Lemonade Lucy,” did not appear until long after they left the White House.

President Hayes kept his pledge to serve only one term; he died of a heart attack in 1893 at age 70. The first Presidential Library in the United States was built in his honor in 1916.


Here’s Why Scientists Like Joseph Lister Have Made Life Better for All of Us

A March 25, 1901, letter signed by Joseph Lister went to auction in October 2014.

By Jim O’Neal

In the 1880s, American physicist Albert Michelson embarked on a series of experiments that undermined a long-held belief in a luminiferous ether that was thought to permeate the universe and affect the speed of light ever so slightly. Embraced by Isaac Newton (and almost venerated by all others), the ether theory was considered an absolute certainty in 19th century physics in explaining how light traveled across the universe.

However, Michelson’s experiments (partially funded by Alexander Graham Bell) proved the exact opposite of the theory. In the words of author William Cropper, “It was probably the most famous negative result in the history of physics.” The fact was that the speed of light was the same in all directions and in every season, overturning an assumption about the ether that had stood unquestioned for 200 years. But not everyone agreed, for a long time.

The more modern scientist Max Planck (1858-1947) explained the resistance to accepting new facts in a rather novel way: “A scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die and a new generation grows up that is familiar with it.”

Even so, it is no easier to accept that the United States was the only nation “that remained unconvinced of the merits of Joseph Lister’s methods of modern antiseptic medicine.” In fact, Henry Jacob Bigelow (1818-1890), the esteemed Harvard professor of surgery and a fellow of the Academy of Arts and Sciences, derided antisepsis as “medical hocus-pocus.” This is even more remarkable when one considers that he was the leading surgeon in New England and that his contributions to orthopedic and urologic surgery are legendary.

But this short story begins with a sleight of hand by asking: In the 19th century, what do you think was the most dangerous place in the vast territories of the British Empire? The frozen wastes of the Northwest Passage or the treacherous savannas of Zululand? Or perhaps the dangerous passes of the Hindu Kush? The surprising answer is almost certainly the Victorian teaching hospital, where patients entered with a trauma and exited to a cemetery after a deadly case of “hospital gangrene.”

Victorian hospitals were described as factories of death, reeking with an unmistakable stench resembling rotting fish, cheerfully described as “hospital stink.” Infected wounds were considered normal, or even beneficial to recovery. Stories abound of surgeons operating on a continuous flow of patients in bloody smocks worn as badges of honor, evidence of their dedication to saving lives. The eminent surgeon Sir Frederick Treves (1853-1923) recalled, “There was one sponge to a ward. With this putrid article and a basin of once clear water, all the wounds in the ward were washed twice a day. By this ritual, any chance that a patient had of recovery was eliminated.”

Fortunately, Joseph Lister was born in 1827 and chose the lowly, mechanical profession of surgery over the more prestigious practice of internal medicine. In 1851, he was appointed one of four residents of surgery at London’s University College Hospital. The head of surgery was wrongly convinced that infections came from miasma, a peculiar type of noxious air that emanated from rot and decay.

Ever the skeptic, Lister scraped rotten tissue out of gangrenous wounds and applied mercury pernitrate to the healthy tissue. Thus began his lifelong investigation into the cause and prevention of infection through modern techniques. He spent the next 25 years in Scotland, becoming the Regius Professor of Surgery at the University of Glasgow. After Louis Pasteur confirmed that germs, not bad air, caused infections, Lister discovered that carbolic acid (a derivative of coal tar) could prevent many amputations by cleaning the skin and wounds.

He then took to the road, advocating his gospel of antisepsis, which was eagerly adopted by the scientific Germans and some Scots; the plodding, practical English surgeons took much longer. That left the isolated Americans who, like Dr. Bigelow, were too stubborn to admit the obvious.

Planck was right all along. It would take a new generation, but we are the generation that has derived the greatest benefits from the astonishing advances in 20th century medical breakthroughs, which only seem to be accelerating. It is a good time to be alive.

So enjoy it!
