Selecting a justice has always been a messy, partisan process

This photograph, circa 1968, autographed by Chief Justice Earl Warren and the eight associate justices, sold for $2,031 at a June 2010 Heritage auction.

By Jim O’Neal

The Senate Judiciary Committee began hearings this week to consider the nomination of Judge Brett Kavanaugh to the Supreme Court, part of the Senate’s “advice and consent” role. Once considered little more than a formality, the confirmation process has devolved into high-stakes politics and is a vivid example of how partisanship has divided governance, especially in the Senate.

Fifty years ago, President Nixon provided a preview of politics gone awry as he attempted to reshape the Supreme Court to fit his vision of the judiciary. His problems actually started during the final year of Lyndon Johnson’s presidency. On June 26, 1968, LBJ announced that Chief Justice Earl Warren intended to retire from the seat he had held since 1953, and that he intended to nominate Associate Justice Abe Fortas as Warren’s successor.

For the next three months, the Senate engaged in an acrimonious debate over the Fortas nomination. Finally, Fortas asked the president to withdraw his nomination to stop the bitter partisan wrangling. Chief Justice Warren, who had been a keen observer of the Senate’s squabbling, ended the controversy in a different way: He withdrew his resignation and in a moment of pique said, “Since they won’t take Abe, they will have me!” True to his word, Warren served another full term, finally stepping down in June 1969.

By then, there was another new president – Richard Nixon – and he picked Warren Burger to be Warren’s replacement. Burger was a 61-year-old judge on the U.S. Court of Appeals with impeccable Republican credentials, just as candidate Nixon had promised during the 1968 presidential election campaign. As expected, Burger’s confirmation was speedy and decisive … 74-3.

Jubilant over his first confirmation, Nixon had also received a surprise bonus earlier in 1969: In May, Justice Fortas resigned his seat on the court. Beyond the bitter debate of the prior year, intense scrutiny of his record had uncovered a dubious relationship with Louis Wolfson, a Wall Street financier sent to prison for securities violations. Rather than face another Senate imbroglio over shady financial dealings, Fortas stepped down – becoming the first Supreme Court justice to resign under threat of impeachment.

So President Nixon had a second opportunity to add a justice. After repeating his criteria for Supreme Court nominees, Nixon chose Judge Clement Haynsworth Jr. of the U.S. Court of Appeals, Fourth Circuit, to replace Fortas. Attorney General John Mitchell had encouraged the nomination since Haynsworth was a Harvard Law alumnus and a Southern jurist with conservative judicial views. He seemed like an ideal candidate since Nixon had a plan to gradually reshape the court.

However, to the president’s anger and embarrassment, Judiciary Committee hearings exposed clear evidence of financial and conflict-of-interest improprieties. There were no actual legal implications, but how could the Senate force Fortas to resign and then overlook essentially the same issues now? The Judiciary Committee approved Haynsworth 10-7, but on Nov. 21, 1969, the full Senate rejected the nomination 55-45. A livid Nixon blamed anti-Southern, anti-conservative partisans for the defeat.

The president – perhaps in a vengeful mood – quickly countered by nominating Judge G. Harrold Carswell of Florida, a little-known, undistinguished former U.S. District Court judge with only six months’ experience on the Court of Appeals. The Senate, weary of the fight, seemed inclined to approve him until suspicious reporters discovered a statement in a speech he had made to the American Legion in 1948, more than 20 years earlier: “I yield to no man as a fellow candidate or as a citizen in the firm, vigorous belief in the principles of White Supremacy and I shall always be so governed!”

Oops.

Even allowing for his youth and overlooking other small acts of racial bias, the worst was yet to come: It turned out that he was simply a lousy judge with a poor grasp of the law. His floor manager, U.S. Senator Roman Hruska, a Nebraska Republican, then made a fumbling, inept attempt to convert Carswell’s mediocrity into an asset: “Even if he is mediocre, there are lots of mediocre judges, people and lawyers. They are entitled to a little representation, aren’t they, and a little chance?” This astonishing assertion was then compounded when it was seconded by Senator Russell Long, a Democrat from Louisiana! When the confirmation vote was taken on April 8, 1970, Carswell’s nomination was defeated 51-45.

A bitter President Nixon, with two nominees rejected in less than six months, continued to blame sectional prejudice and philosophical hypocrisy. So he turned to the North and selected Judge Harry Blackmun – a close friend of Chief Justice Burger, who had urged his nomination. Bingo … Blackmun was easily confirmed, 94-0. At long last, the vacant seat of Abe Fortas was filled.

There would be no further vacancies for 15 months, but in September 1971, justices Hugo Black and John Harlan, both gravely ill, announced their retirements from the court. Nixon finally developed a strategy to replace these two distinguished jurists, but only after a complicated and convoluted process. It would ultimately take him eight tries to fill four seats, and the process has only become more difficult since.

Before Judge Kavanaugh is able to join the court, as is widely predicted, expect the opposing party to throw up every roadblock in its bag of tricks. The process is now strictly political and dependent on partisan voting advantages. The next big event will probably involve Justice Ruth Bader Ginsburg, who has served on the court since 1993 and was only the second woman appointed, after Sandra Day O’Connor. At age 85, you can be sure that Democrats are wishing her good health until they regain control of the Oval Office and the Senate. If not, stay tuned for the Battle of the Century!

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Copernicus, Galileo Helped Discover Our True Place in the Universe

A 1635 copy of Galileo’s Dialogo, containing the argument that Earth revolves around the sun, sold for $31,070 at a February 2010 Heritage auction.

“Then God said, ‘Let there be light,’ and there was light.” Genesis 1:3

By Jim O’Neal

About 4.6 billion years ago, an enormous swirl of gas and matter started a process of aggregation that ultimately formed our solar system. Nearly all of this mass was transformed into our sun and the remainder eventually coalesced into clumps that became our planetary system. Fortunately for us, one of these clumps became our current home, Earth. Then, a large clump approximately the size of Mars collided with Earth and knocked out enough material to form an object we call the moon.

Several other factors were also in our favor. First was the size of the sun: If it were larger, it might have burned out by now and taken us with it (as it is, we still have about 5 billion years to go). The second was the distance between the sun and Earth. We occupy a literal “Goldilocks position” – not too close and not too far. If we were 5 percent closer or 15 percent farther away, we would either be fried or trying to live on a giant ice ball. The odds against being exactly where we are are not pleasant to contemplate; then again, if we weren’t exactly here, I assume we wouldn’t be around to lament our bad luck. Even the tilt of Earth’s axis works in our favor, producing a nice blend of moderate winters and summers.

However, we are still subject to occasional collisions with other objects, like asteroids and comets. In the past, these have produced mass extinctions like the one that surprised the dinosaurs, who had enjoyed their time here for about 150 million years. Life has managed to adapt through five or six major extinction events and countless smaller ones. The modern form of humans (our ancestors) has only been here for about 200,000 years, but we’ve been clever enough to develop weapons that could destroy Earth (a first we should not be too proud of).

Throughout the early history of our residency, the conventional wisdom was that Earth was the center of everything, with the sun, moon and stars orbiting us. This “geocentric model” seemed like common sense: We feel no motion standing on the ground, and there was no observational evidence that our planet was moving. The system was widely accepted and became entrenched in classical philosophy via the combined works of Plato and Aristotle in the fourth century B.C.

However, when the ancient Greeks measured the movements of the planets, it became clear the geocentric model had too many discrepancies. To explain the complications, Greek astronomers introduced the concept of epicycles to reconcile their theories. These sub-orbits were refined by the great Greco-Roman astronomer Ptolemy of Alexandria. Despite competing ideas and debates, the Ptolemaic system prevailed, but with significant implications.

As the Roman Empire dwindled, the Christian Church inherited many long-standing assumptions, including the idea that Earth was the center of everything and, perhaps more importantly, that man was the pinnacle of God’s creation – with dominion over Earth. This central tenet held sway in Europe until the 16th century, even as the Ptolemaic model grew absurdly complicated. Then things changed … dramatically. The twin forces of the Renaissance and the Protestant Reformation challenged the old religious dogmas, and a Polish Catholic canon, Nicolaus Copernicus (1473-1543), put forth the first modern heliocentric theory, shifting the center of the universe to the sun.

Fortunately, Copernicus was followed by an Italian polymath from Pisa, Galileo Galilei (1564-1642), who helped transform natural philosophy into modern science and the scientific Renaissance into a scientific revolution. He championed heliocentrism despite powerful opposition from fellow astronomers and was tried by the Roman Inquisition, which found him “vehemently suspect of heresy” and forced him into house arrest. He spent the remainder of his life essentially imprisoned.

Now famous for his work as an astronomer, physicist, engineer, philosopher and mathematician, he received the ultimate honor of being called the father of modern science by both Stephen Hawking and Albert Einstein!

Not bad for someone in an ankle bracelet.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

100 Years Before Rosa Parks, There was Octavius Catto

Rosa Parks refused to give up her seat on a segregated bus, sparking the Montgomery, Ala., bus boycott.

By Jim O’Neal

Most Americans are familiar with Rosa Parks and recall the heroic story of a weary black woman on her way home after a hard day at work who refused to give up her seat and “move to the back of the bus” to make room for white people. The date was Dec. 1, 1955, and the city was Montgomery, Ala.

She was arrested, convicted and fined $10, and her defiance sparked the ensuing Montgomery bus boycott, which lasted 381 days. She was ultimately vindicated by the U.S. Supreme Court, which ruled the segregation law unconstitutional. After her death, she became the first African-American woman to have her likeness depicted in the National Statuary Hall in the U.S. Capitol.

Parks (1913-2005) earned her way into the pantheon of civil rights leaders, but few remember a remarkable man who preceded her by a century when streetcars were pulled by horses.

Catto

His name was Octavius Valentine Catto (1839-1871) and history was slow in recognizing his astonishing accomplishments. Even the epitaph on his tombstone shouts in bold letters “THE FORGOTTEN HERO.” One episode in his far-too-short but inspiring life is eerily similar to the events in Montgomery, only dramatically more so. Catto was a fierce enemy of the entire Philadelphia trolley car system, which banned black passengers. On May 18, 1865, The New York Times ran a story about an incident involving Catto that occurred the previous afternoon in Philadelphia, “The City of Brotherly Love” (at least for some).

Paraphrasing the story, it describes how a colored man (Catto) refused all attempts to get him to leave a strictly segregated trolley car. Frustrated, and fearful of being fined if he physically ejected the passenger, the conductor cleverly sidetracked the car, detached the horses and left the defiant passenger in the now-empty stationary car. Apparently, the stubborn man was still on board after spending the night. The incident caused a neighborhood sensation that led even more people to challenge the rules.

The following year, there was an important meeting of the Pennsylvania State Equal Rights League to protest the forcible ejection of several black women from Philadelphia streetcars. The intrepid Catto presented a number of resolutions that highlighted the inequities of segregation, invoked the principles of freedom and civil liberty, and condemned a heavily biased judicial system. He also boldly solicited support from fellow citizens in his quest for fairness and justice.

He got specific help from Pennsylvania Congressman Thaddeus Stevens, a leader of the “Radical Republicans,” who had a fiery passion for desegregation and the abolition of slavery and who criticized President Lincoln for not acting more forcefully. Stevens is a major character in Steven Spielberg’s 2012 film Lincoln, with Tommy Lee Jones earning an Oscar nomination for his portrayal of Stevens. On Feb. 3, 1870, the 15th Amendment to the Constitution was ratified, guaranteeing suffrage to black men (women of all colors would have to wait another 50 years, until 1920, to gain the right to vote in all states). It would also lead to Catto’s death: On Election Day, Oct. 10, 1871, Catto was out encouraging black men to vote for Republicans when he was fatally shot by white Democrats intent on suppressing the black vote.

Blacks continued to vote heavily for Republicans until the early 20th century and were not even allowed to attend Democratic conventions until 1924. This was primarily because Southern states had white governors who mostly discouraged equal rights and supported Jim Crow laws that were unfair to blacks. As comedian Dick Gregory (1932-2017) famously joked, a waitress at a white lunch counter told him, “We don’t serve colored people here,” and Gregory replied, “That’s all right. I don’t eat colored people … just bring me a whole fried chicken!”

Octavius Catto, who broke segregation on trolley cars and was an all-star second baseman long before Jackie Robinson, would have to wait until the 20th century to get the recognition he deserved. I suspect he would be surprised that we are still struggling to “start a national conversation” about race when that’s what he sacrificed his life for.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Twain’s Era Marked America’s Emergence on the World Stage

An 1876 first edition, first printing of Mark Twain’s The Adventures of Tom Sawyer sold for $13,750 at an August 2015 Heritage auction.

By Jim O’Neal

American writer and satirist Mark Twain was born on Nov. 30, 1835 – exactly two weeks after Halley’s Comet made its appearance. In 1909, he famously said: “I came in with Halley’s Comet in 1835. It is coming again next year and I expect to go out with it. It will be the greatest disappointment of my life if I don’t go out with Halley’s Comet. The Almighty has said, no doubt, ‘Now here are these two unaccountable freaks, they came in together, they must go out together.’” Twain died shortly after the comet returned, in April 1910.

Twain – real name Samuel Langhorne Clemens – co-wrote a novel with his friend Charles Dudley Warner titled The Gilded Age: A Tale of Today. It was the only time Twain wrote a novel with a collaborator, supposedly the result of a dare from their wives. Whatever the truth, the novel lent its name to the post-Civil War period, now widely known as the Gilded Age – an era the book skewered for its widespread corruption and the materialistic greed of a few at the expense of the downtrodden masses.

Twain

From a purely economic standpoint, the period from 1870 to 1890 was when the United States became the dominant economy in the world. For the majority of recorded history, China and India had been the global powerhouses, by some estimates accounting for 70 percent of world GDP. Until about 200 years ago, economic output was largely a function of population size; with the industrial revolution, followed by the information revolution, the significance of sheer numbers declined. While Europe was enjoying its resurgence after the Dark Ages, the Asian superpowers were divided into small kingdoms fighting one another.

The factors contributing to post-Civil War growth were concentrated in the North, where industrial expansion surged as the slave-labor system was abolished and cotton prices collapsed. New discoveries of coal in the Appalachian Mountains, oil in Pennsylvania and iron ore around Lake Superior fueled the growth of U.S. infrastructure. Railroad mileage more than tripled from 1860 to 1880, anchored by the Transcontinental Railroad (completed 1869), which linked remote areas – and their commercial farming, ranching and mining – to the large industrial hubs. London and Paris poured money into U.S. railroads, and American steel production surpassed the combined output of Britain, Germany and France. Technology flourished, with 500,000 patents issued for new inventions, as Thomas Edison and Nikola Tesla electrified the industrial world.

Capital investment increased by 500 percent and capital formation doubled. By 1890, the United States had surpassed Britain in manufacturing output, and by the beginning of the 20th century, American per-capita income was double that of Germany or France and 50 percent higher than Britain’s.

Then, inexplicably, Europeans started a world war, and 20 years later the European and Asian nations started another global conflict. The United States strategically entered both wars late, preserving its capital, military and human resources. Excluding a few ships here and there (e.g., Pearl Harbor), we kept our domestic infrastructure 100 percent intact. Excluding 9/11, we have probably damaged more of our own cities through domestic protests and rioting than all foreign enemies combined have in acts of war.

As Pogo wisely observed, “We have met the enemy and he is us.”

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Usual Fireworks Expected with Latest Supreme Court Selection

This photograph, signed by Supreme Court Chief Justice William H. Taft and the eight associate justices, circa 1927, sold for $14,340 at a September 2011 Heritage auction.

By Jim O’Neal

It is that time again when the news will be filled with predictions of pestilence, war, famine and death (the Four Horsemen of the Apocalypse) as President Trump tees up his next candidate for the Supreme Court. One side will talk about the reversal of Roe v. Wade as an example of the terrible future that lies ahead. The other side will be quick to point out that this fear-mongering first started in 1981 when Sandra Day O’Connor (the first woman to serve on the court) was nominated by President Reagan and that nothing has happened in the intervening 37 years.

My prediction is that regardless of who is confirmed, the nominee’s record will reveal no opinions on “Roe,” and he or she will have been groomed by the “Murder Boards” to answer that it is settled law. Murder Boards are groups of legal experts who rehearse the nominee on how to answer every possible question the Senate Judiciary Committee might ask – on any subject, not just Roe – in its advice-and-consent role. The result is what former Vice President Joe Biden, during his years in the Senate, described as a “Kabuki dance.”

The questioning does produce great public theater, but it is a tradition that dates only to 1925, when nominee Harlan Stone actually requested that he be allowed to answer questions about rumors of improper ties to Wall Street. It worked: He was confirmed by a vote of 71-6 and would later serve as Chief Justice (1941-46). In 1955, John Marshall Harlan II was next, when Southern senators wanted to know his views on public school desegregation vis-à-vis Brown v. Board of Education. He was confirmed 71-11, and since then, every nominee to the court has been questioned by the Senate Judiciary Committee. The apparent record is the 30 hours of grilling Judge Robert Bork endured in 1987, when he got “Borked” by trying to answer every single question honestly. Few make that mistake today.

Roe v. Wade was a 1973 case in which the issue was whether a state could constitutionally make it a crime to perform an abortion, except to save the mother’s life. Abortion had a long legal history dating to the 1820s, when anti-abortion statutes began to appear that resembled an 1803 British law making abortion illegal after “quickening” (the start of fetal movements), using various rationales such as illegal sexual conduct, unsafe procedures and the state’s responsibility for protecting prenatal life.

Criminalization accelerated from the 1860s, and by 1900 abortion was a felony in every state. Despite this, the practice continued to grow, and in 1921 Margaret Sanger founded the American Birth Control League. By the 1930s, licensed physicians were performing an estimated 800,000 procedures each year. In 1967, Colorado became the first state to decriminalize abortion in cases of rape, incest or permanent disability of the woman; in 1970, Hawaii became the first state to legalize abortion at the request of the woman; and by 1972, 13 states had laws similar to Colorado’s. So the legal situation prior to Roe was that abortion was illegal in 30 states and legal under certain conditions in the other 20.

“Jane Roe” was an unmarried pregnant woman who supposedly wished to terminate her pregnancy and instituted an action in the U.S. District Court for the Northern District of Texas. A three-judge panel found the Texas criminal statutes unconstitutionally vague and held that the right to choose whether to have children was protected by the 9th and 14th Amendments. All parties appealed, and on Jan. 22, 1973, the Supreme Court ruled the Texas statute unconstitutional. The court declined to define when human life begins.

Jane Roe’s real name was Norma McCorvey. Before she died, she became a pro-life advocate, maintained she never had the abortion, and claimed she was the victim of two young, ambitious lawyers looking for a plaintiff. Henry Wade was district attorney of Dallas from 1951 to 1987, the longest-serving DA in United States history. He was also involved in the prosecution of Jack Ruby for killing Lee Harvey Oswald. Ruby’s conviction was overturned on appeal, but he died of lung cancer before he could be retried – and is therefore constitutionally presumed innocent.

Stay tuned for the fireworks.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

As Nation Moved to Civil War, the North had the Financial Edge

Richard Montgomery was an Irish soldier who served in the British Army before joining the Continental Army.

By Jim O’Neal

Richard Montgomery (1738-75) was a little-known hero-soldier born in Dublin, Ireland, who joined the British Army in 1756 and rose to captain. He later became a major general in the Continental Army, which the Continental Congress created in June 1775 – electing George Washington commander in chief – specifically to coordinate the military efforts of the 13 Colonies in their revolt against Great Britain.

Montgomery was killed in a failed attack on Quebec City, a campaign he led jointly with Benedict Arnold (before Arnold defected). He was mourned in both Britain and America, and his remains were eventually interred at St. Paul’s Chapel in New York City.

A remarkably diverse collection of schools, battleships and cities still bears his name. Montgomery, Ala., is the state’s capital and second-largest city; it’s where Rosa Parks refused to give up her bus seat to a white passenger on Dec. 1, 1955, sparking the famous Montgomery bus boycott. Martin Luther King Jr. used Montgomery to great advantage in organizing the civil rights movement.

Montgomery was also the first capital of the Confederacy; the Provisional Congress of the Confederate States convened its first meeting there in February 1861. The first seven states to secede from the Union had hastily selected representatives to send to the new Confederate capital. They arrived to find dirty hotels, dusty roads and noisy lobbyists overflowing the statehouse. Montgomery was not prepared to host any large group, especially a political convention.

Especially notable was that most of the South’s most talented men had already joined the Army or the Cabinet, or were headed for diplomatic assignments. By default, the least-talented legislators were given the responsibility of writing a constitution, installing the new president (Jefferson Davis) and authorizing a military force of up to 400,000 men, conscripted for three years or the duration of the war. As in the North, virtually everyone was confident the war would be short and decisive.

Jefferson Davis was a well-known name, having distinguished himself in the Mexican War and served as Secretary of War under President Franklin Pierce. Like many others, he downplayed the role of slavery in the conflict, seeing the fight as a long-overdue effort to overturn the exploitive economic system centered in the North. In his view, the evidence was obvious: The North and South were like two different countries – one a growing industrial power, the other stuck in an agricultural system little changed since 1800, when 80 percent of its labor force was on farms and plantations. The South now held only 18 percent of the nation’s industrial capacity, and the share was trending down.

That mediocre group of lawmakers at the first Confederate meeting was also tasked with figuring out how to finance a war against a formidable enemy holding vastly superior advantages in nearly every important respect. Even new immigrants were drawn to the North’s ever-expanding opportunities as the slave states fell further behind in manufacturing, canals, railroads and even conventional roads, all while their banking system grew weaker.

Cotton production was a genuine bright spot for the South (at least for plantation owners), but ironically, it generated even more money for the North, with its vast network of credit, warehousing, manufacturing and shipping companies. The North manufactured a dominant share of the nation’s boots, shoes, cloth and pig iron, and almost all of its firearms … an ominous fact for a people determined to fight a war. Several regions of the South were forced to import foodstuffs. Southern politicians had spoken often of the need to build railroads and manufacturing, but these were empty, rhetorical words. Cotton had become the powerful narcotic that lulled them into complacency. Senator James Hammond of South Carolina summed it up neatly in his “Cotton is King” speech of March 4, 1858: “Who can doubt, that has looked at recent events, that cotton is supreme?”

Southerners sincerely believed that cotton would rescue them from the war and that, “after a few punches in the nose,” the North would gladly surrender.

One of those men was Christopher G. Memminger, who was selected as Confederate Secretary of the Treasury, responsible for rounding up gold and silver to finance the needs of the Confederate States of America (CSA). A lawyer, a member of the South Carolina legislature and an expert on banking law, he made getting cash the Treasury’s first priority – starting in New Orleans, the financial center of the South, by raiding the mint and customs house.

He assumed there would be at least enough gold to coin money and commissioned a design for a gold coin with the goddess of liberty seated, bearing a shield and a staff flanked by bales of cotton, sugar cane and tobacco. Before any denominations were finalized, it was discovered there was not enough gold available and the mint was closed in June.

This was followed by another nasty surprise: All the banks in the South together possessed only $26 million in gold, silver and coins from Spain and France. No problem. Memminger estimated that cotton exports of $200 million would be enough to secure hundreds of millions in loans. Oops. President Lincoln had anticipated this and blockaded the Southern ports after Fort Sumter fell in April 1861. No cotton, no credit, no guns.

In God we trust. All others pay cash.

One small consolation was that his counterpart in the North, Salmon P. Chase, was also having trouble raising cash and had to resort to the dreaded income tax. However, both sides managed to keep killing each other for four long years, leaving a legacy of hate.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

America has a Long History of Rough-and-Tumble Politics

A cabinet card photograph dated 1852, shortly after the marriage of Rutherford and Lucy Hayes, went to auction in October 2008.

By Jim O’Neal

A surprisingly high number of political pundits ascribe the current bitter partisan divide to the presidential election of 2000, when the Supreme Court ordered the recount of “under-votes” in Florida to cease. As a result, the previously certified election results would stand and George W. Bush would receive all 25 Florida electoral votes, thus providing him a 271-266 nationwide victory over Al Gore. Democrats almost universally believed the election had been “stolen” due to the seemingly unprecedented action by the Supremes.

Although obviously a factor in the situation today, it seems too simplistic to me, as I remember the Clinton Impeachment, the start of the Iraq War (and the president who lied us into war), and, of course, Obamacare – all of which were also major contributors to the long, slow erosion of friendly bipartisanship. Now, we’re in an era when each new day seems to drag up a new issue that Americans can’t agree on and the schism widens ever so slightly.

Could it be worse?

The answer is obviously “yes,” since we once tried to kill each other into submission during the Civil War. Another good example is the highly controversial presidential election of 1876, which resulted in Rutherford B. Hayes becoming president. The loser, Samuel J. Tilden, had such staunch supporters that they promised “blood would run in the streets” if their candidate lost. After a bitterly contested decision threw the election to Hayes, Democrats continued to make wild threats, and public disturbances were rampant across New York City’s hotels, saloons, bars and other venues where crowds gathered.

Unrest ran so high that outgoing President Ulysses S. Grant gradually became convinced a coup was imminent. This was the closest the Democrats had come to the White House since James Buchanan’s election 20 years earlier, in 1856, and passions would not be calmed easily. The resentment was about much more than losing an election or the ascendancy of the Republican Party with its fierce abolitionists. Even today, it seems apparent that the election had been politically rigged or, at a minimum, very cleverly stolen in a quasi-legalistic maneuver.

Grant’s primary concern was one of timing. The normal inauguration date of March 4 fell on a Sunday and tradition called for it to be held the next day, on Monday, March 5 (as with Presidents James Monroe and Zachary Taylor). Thus the presidency would be technically vacant from noon on Sunday until noon on Monday. The wily old military genius knew this would be plenty of time to pull off a coup d’état. He insisted Hayes not wait to take the oath of office.

In a clever ruse, the Grants made arrangements for a secret oath-taking on Saturday evening by inviting 38 people to an honorary dinner at the White House. While the guests were being escorted to the State Dining Room, Grant and Hayes slipped into the Red Room, where Chief Justice Morrison Waite was waiting with the proper documents. All went as planned until it was discovered there was no Bible available. No problem … Hayes was sworn in as the 19th president of the United States with a simple oath.

The peaceful passing of power has been one of the outstanding features of our constitutional form of governance.

Hayes was born on Oct. 4, 1822 – 2½ months after his father died of tetanus, leaving a pregnant mother with two young children. From these humble beginnings, the enterprising “Rud” got a first-rate education that culminated in an LLB from Harvard Law School. Returning to Ohio, he established a law practice, served with distinction in the Civil War and was elected governor of Ohio three times, a position that proved to be a steppingstone to the White House.

Most historians believe Hayes and his family were the richest occupants of the White House until Herbert and Lou Hoover showed up 52 years later. They certainly had a reputation for living on the edge of extravagance, which some cynics attribute in large part to the banning of all alcohol in the White House (presidents in those days paid for booze and wine personally). Incidentally, the first lady’s nickname, “Lemonade Lucy,” was not coined until long after the couple left the White House.

President Hayes kept his pledge to serve only one term; he died of a heart attack in 1893 at age 70. The first Presidential Library in the United States was built in his honor in 1916.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Here’s Why Scientists Like Joseph Lister Have Made Life Better for All of Us

A March 25, 1901, letter signed by Joseph Lister went to auction in October 2014.

By Jim O’Neal

In the 1880s, American physicist Albert Michelson embarked on a series of experiments that undermined a long-held belief in a luminiferous ether that was thought to permeate the universe and affect the speed of light ever so slightly. Embraced by Isaac Newton (and almost venerated by all others), the ether theory was considered an absolute certainty in 19th century physics in explaining how light traveled across the universe.

However, Michelson’s experiments (partially funded by Alexander Graham Bell) proved the exact opposite. In the words of author William Cropper, “It was probably the most famous negative result in the history of physics.” The speed of light turned out to be the same in all directions and in every season – overturning an assumption that had stood unquestioned for 200 years. Even so, not everyone was persuaded for a long time.

Max Planck (1858-1947) later explained the resistance to accepting new facts in a rather novel way: “A scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die and a new generation grows up that is familiar with it.”

Even if true, that makes it no easier to accept that the United States was the only nation “that remained unconvinced of the merits of Joseph Lister’s methods of modern antiseptic medicine.” In fact, Henry Jacob Bigelow (1818-1890), the esteemed Harvard professor of surgery and a fellow of the Academy of Arts and Sciences, derided antisepsis as “medical hocus-pocus.” This is all the more remarkable considering he was the leading surgeon in New England and his contributions to orthopedic and urologic surgery are legendary.

But this short story begins with a sleight of hand by asking: In the 19th century, what was the most dangerous place in the vast territories of the British Empire? The frozen wastes of the Northwest Passage? The treacherous savannas of Zululand? Perhaps the perilous passes of the Hindu Kush? The surprising answer is almost undoubtedly the Victorian teaching hospital, where patients entered with a trauma and exited to a cemetery after a deadly case of “hospital gangrene.”

Victorian hospitals were described as factories of death, reeking with an unmistakable stench resembling rotting fish, cheerfully described as “hospital stink.” Infectious wounds were considered normal or beneficial to recovery. Stories abound of surgeons operating on a continuous flow of patients, and bloody smocks were badges of honor or evidence of their dedication to saving lives. The eminent surgeon Sir Frederick Treves (1853-1923) recalled, “There was one sponge to a ward. With this putrid article and a basin of once clear water, all the wounds in the ward were washed twice a day. By this ritual, any chance that a patient had of recovery was eliminated.”

Fortunately, Joseph Lister was born in 1827 and chose the lowly, mechanical profession of surgery over the more prestigious practice of internal medicine. In 1851, he was appointed one of four residents in surgery at London’s University College Hospital. The head of surgery there was wrongly convinced that infections came from miasma, a peculiar type of noxious air emanating from rot and decay.

Ever skeptical, Lister scoured out rotten tissue from gangrene wounds using mercury pernitrate on the healthy tissue. Thus began Lister’s lifelong journey to investigate the cause of infection and prevention through modern techniques. He spent the next 25 years in Scotland, becoming the Regius Professor of Surgery at the University of Glasgow. After Louis Pasteur confirmed germs caused infections rather than bad air, Lister discovered that carbolic acid (a derivative of coal tar) could prevent many amputations by cleaning the skin and wounds.

He then went on the road advocating his gospel of antisepsis, which was eagerly adopted by the scientific Germans and some Scots; the plodding, practical English surgeons took much longer. That left the isolated Americans who, like Dr. Bigelow, were too stubborn to admit the obvious.

Planck was right all along. It took a new generation – but ours is the generation that has reaped the greatest benefits from the astonishing advances of 20th century medicine, advances that only seem to be accelerating. It is a good time to be alive.

So enjoy it!

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Bell’s Influence on National Geographic Society Often Overlooked

An archive of documents from the early days of Bell Telephone Company – including correspondence by Gardiner G. Hubbard, Alexander Graham Bell’s father-in-law – sold for $10,157 at an October 2012 Heritage auction.

By Jim O’Neal

In 2013, Nancy and I took a cruise from New York City to Montreal. On Sept. 23, we had the great pleasure of touring the Alexander Graham Bell museum in Baddeck, Nova Scotia. We were struck by its unusual design, which is based on the tetrahedron form used in his many flight experiments with kites. There were also numerous original artifacts, photographs and exhibits of his groundbreaking scientific accomplishments.

Alexander Graham Bell

Bell (1847-1922) was awarded patent No. 174,465 – for the first practical telephone, “the most valuable single patent ever issued” in any country – just four days after his 29th birthday. Our guide informed us that Bell would not allow a telephone in his study or laboratory, since he considered it a distraction from his reading and experiments. I was aware that both his mother and his wife were deaf, and that this had a profound effect on his passion for work on sound, speech and hearing. What surprised me was the breadth of his scientific achievements: He was awarded 18 patents and collaborated on another 12, spanning medicine, aeronautics, genetics, electricity, sound and marine engineering.

Another surprise was that his wife Mabel was the daughter of Gardiner Greene Hubbard, founder and first president of the National Geographic Society (founded in 1888) and also the first president of Bell Telephone Company (later AT&T). Although AGB (he got a middle name only after constantly nagging his father) was not a founder of National Geographic, he was its second president, following his father-in-law. This was organizational incest on a scale that rivaled the British monarchy.

But the result was an organization that has given several generations a certain sense of where we are and where we want to go. Commanders-in-chief, explorers, schoolchildren and even daydreamers have put their full trust in the splendid maps of the National Geographic Society and their brilliant cartographers. The elegant and clearly legible typefaces for place names, one source of the map’s mystique, were designed by the magazine’s staff in the 1930s.

The Society was founded in Washington, D.C., at the Cosmos Club, itself a venerable organization dating to 1878 and boasting among its members three presidents, two vice presidents, 12 Supreme Court justices, and 36 Nobel and 61 Pulitzer Prize winners (they don’t bother counting ordinary U.S. senators).

During World War II, National Geographic maps were at the epicenter of the action, thanks in part to a U.S. president deeply interested in geography. The Society had furnished Franklin D. Roosevelt with a map cabinet mounted on the wall behind the desk in his private White House study. Maps of continents and oceans could be pulled down by the president like window shades; they were in constant use throughout the war.

In the early winter of 1942, President Roosevelt urged the American people to have a world map available for his next fireside chat, scheduled for the evening of Feb. 23. FDR told his aides, “I’m going to speak about strange places that many have never heard of – places that are now the battleground for civilization. … I want to explain to the people something about geography – what our problem is and what the overall strategy of the war has to be. I want to tell it to them in simple terms of ABC so that they will understand what is going on and how each battle fits into the picture. … If they understand the problem and what we are driving at, I am sure that they can take any kind of news on the chin.”

There was an unprecedented run on maps and atlases. The audience, more than 80 percent of the country’s adult population, was the largest for any geography lesson in history.

The National Geographic Society went on expanding the scope of its focus, with maps ranging from Mount Everest to outer space and the ocean floor. As the Society’s former chief cartographer put it: “I like to think that National Geographic maps are the crown jewels of the mapping world.”

He was right – until Google Maps created a new technology in need of its own headwear.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Maybe a Simple Theory Explains Nature’s Mysteries

A Charles Darwin signature is among a set of autographs by famed scientists that sold for $4,750 at a January 2017 auction.

By Jim O’Neal

Cosmologists generally agree the universe is 13.8 billion years old, and Earth 4.6 billion. They also agree the universe is expanding at an ever-increasing rate, creating new space in the process. It is already so immense that even traveling at the speed of light, you would simply end up back where you started due to the curvature of space. That eliminates one of my lifelong desires: to poke my head out and “see what’s out there.” The answer is nothing – a hard concept to grasp … at least for me.

What no one seems to know is where, when or how life on Earth began. Or, for that matter, whether life (as we know it) exists anywhere other than on our tiny orb, tucked into a remote part of a modest galaxy at precisely the distance from the sun that permits our existence.

Author Bill Bryson writes about the work of Stanley Miller, a graduate student at the University of Chicago who in 1953 tried to synthesize life in a chemistry lab. He hooked up two flasks – one containing water, the other a mixture of methane, ammonia and hydrogen gases – and added electrical sparks to simulate atmospheric lightning, converting the concoction into a green and yellow broth of amino acids, fatty acids and other organic compounds. His euphoric professor, Nobel laureate Harold Urey, exclaimed: “If God didn’t do it this way, he missed a good bet!” It was subsequently pointed out, however, that the early Earth never had such noxious conditions – and 65 years later, we are no closer to creating life.

Others have speculated that life on Earth arrived when a meteorite crashed into the planet in a process known as panspermia. The problem with this theory is that it still doesn’t explain how life BEGAN and just moves the problem to some other remote place.

Since modern man dates back 200,000 years to Africa, I’m more curious why it took us so long to fly. It was only rather recently, on Dec. 17, 1903, that two brothers from Dayton, Ohio – Orville and Wilbur Wright – rose into the air at Kitty Hawk, N.C., landing 120 feet from the take-off point. Wilbur had tried first and stalled; then Orville took the controls and set off into a strong wind, with Wilbur running alongside to steady the wingtip.

They made three more flights that morning, the longest covering 852 feet. When a wind gust broke the airframe, they simply packed up the parts and went back to Dayton. What makes the achievement even more remarkable is that neither brother had any formal academic training in physics – neither, in fact, held a high school diploma. Today, the Flyer hangs proudly above the entrance of the Smithsonian’s National Air and Space Museum in Washington, beneath a long inscription that ends “…Taught Man to Fly and Opened the Era of Aviation.”

Of course, flying in the true sense has mostly been restricted to birds – which brings us to Charles Darwin. In his travels aboard the survey ship Beagle, he noted that finch beaks varied from island to island in the Galapagos, the better to exploit local resources. He speculated the birds had not originally been created this way, but had adapted to gain an advantage in acquiring scarce resources. They had indeed – though it should be noted that Darwin did not coin the phrase “survival of the fittest,” and even the word “evolution” did not appear until the sixth edition of On the Origin of Species. The book itself was delayed for years because his editor urged him to write about pigeons instead (“Everyone is interested in pigeons”).

A lot has been written about “locomotion,” with the flight of birds being the most interesting case … and the pterosaur of 100 million years ago especially so. With a wingspan of 16 feet and a weight of a mere 22 pounds, it was able to dominate what is now eastern England by staying aloft for extended periods on rising warm-air currents … presumably as a hovering predator.

Once again, we face the same questions. How it developed is a mystery, as is its anatomy, since it couldn’t manage take-off via traditional wing flapping. Perhaps it relied on gravity and thermals to become airborne. But this would have required plunging into the air from seaside cliffs, like modern frigate birds.

My theory to cover all these mysterious questions is more simplistic: Evolution is just “one damned thing after another.”

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].