Franklin had Faults, but he Remains an Extraordinary American

Norman Rockwell’s illustration of Ben Franklin for a 1926 cover of The Saturday Evening Post sold for $762,500 at a May 2018 Heritage auction.

By Jim O’Neal

George Washington was the only United States president who lived his entire life in the 18th century (1732-1799). However, every early vice president – from John Adams on – spent a significant part of his life in the 19th century. Of the Founding Fathers, Benjamin Franklin (often considered the “grandfather” of the group) was born in 1706 and died in 1790 – nine years before even Washington. In reality, he belonged to the previous generation and spent virtually all of his life as a loyal British subject.

As a result, Franklin never had the opportunity to observe the nation he helped create as it struggled to function smoothly. Many of the founders were determined not simply to replicate the English monarchy, but to establish a more perfect union in which the common man could prosper (women, slaves and non-property owners would have to wait). Franklin was also a man of vast contradictions. He was a most reluctant revolutionary who discreetly wished to preserve the traditions of the British Empire he had grown so familiar with. He secretly mourned the final break, even as he helped lead the fight for America’s independence.

Even while signing the Declaration of Independence and the Constitution, Franklin, like many others with lingering loyalist sympathies, hoped for some sort of reconciliation – a hopeless cause after so many careless British transgressions.

Fortunately, we have a rich record of this remarkable man’s life, since he was such a prolific writer and his correspondence was greatly admired – and thus preserved by those lucky enough to receive it. Additionally, his scientific papers were highly respected and covered a vast breadth of topics that generated interest among the brightest minds in the Western world. He knew most of them personally, thanks to the many years he lived in France and England while traveling the European continent. Government files are replete with the letters he exchanged with heads of state.

Despite his passion for science, Franklin viewed his breakthrough experiments as secondary to his civic duties. He became wealthy as a young man, and this provided the freedom to travel and assume important government assignments. Somehow, he was also able to maintain a pleasant marriage despite his extended absences, some for as long as 10 years. He rather quickly developed a reputation as a “ladies’ man” and his social life flourished at the highest levels of society.

Some historians consider him the best-known celebrity of the 18th century. Even today, we still see his portrait daily on our $100 bills – colloquially known as “Benjamins” – and earlier on common 50-cent pieces and various denominations of postage stamps. Oddly, he is probably better known today, by people of all ages, than he was 200 years ago. That is true stardom that very few manage to attain.

Every student in America generally knows something about Franklin flying a kite in a thunderstorm. They may not know that he proved clouds were electrified and that lightning is a form of electricity. Or that Franklin’s work inspired Joseph Priestley to publish a comprehensive work, The History and Present State of Electricity, in 1767. And it would be exceedingly rare if they knew the prestigious Royal Society honored him with its Copley Medal in 1753 for the advancement of scientific knowledge. But they do recognize Franklin in any picture.

Others may know of his connection to the post office, perhaps unaware that the U.S. postal system was established on July 26, 1775, by the Second Continental Congress, at a time when virtually all the mail went to Europe rather than between the colonies. There were no post offices in the Colonies, and bars and taverns filled that role nicely. Today, there are 40,000 post offices handling 175 billion pieces of mail a year (more than 5,000 every second), and they have an arrangement with Amazon to deliver its packages, even on Sundays. Mr. Franklin helped create this behemoth as the first Postmaster General.

Franklin was also a racist in an era when the word didn’t even exist. He eventually freed his house slaves and later became a staunch opponent of slavery, even sponsoring abolition legislation. But he envisioned a White America, especially in the Western development of the country. He was alarmed by German immigrants flooding Philadelphia and wrote passionately about their failure to learn English or assimilate into society. He was convinced it would be better if blacks stayed in Africa. His dream was to replicate England, since the new nation had so much more room for expansion than that tiny island across the Atlantic. But we are here to examine the extraordinary mind and curiosity that led to so many successful experiments. Franklin always bemoaned the fact that he had been born too early and dreamed about all the wonderful new things that would come 300 years in the future.

Dear Ben, you just wouldn’t believe it!

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Copernicus, Galileo Helped Discover Our True Place in the Universe

A 1635 copy of Galileo’s Dialogo, containing the argument that Earth revolves around the sun, sold for $31,070 at a February 2010 Heritage auction.

“Then God said, ‘Let there be light,’ and there was light.” Genesis 1:3

By Jim O’Neal

About 4.6 billion years ago, an enormous swirl of gas and matter started a process of aggregation that ultimately formed our solar system. Nearly all of this mass was transformed into our sun and the remainder eventually coalesced into clumps that became our planetary system. Fortunately for us, one of these clumps became our current home, Earth. Then, a large clump approximately the size of Mars collided with Earth and knocked out enough material to form an object we call the moon.

Several other factors were also in our favor. First was the size of the sun: if it were larger, it might have burned out by now, and us with it (we still have about 5 billion years to go). The second was the distance between the sun and Earth. We occupy a literal “Goldilocks position” – not too close and not too far. If we were 5 percent closer or 15 percent farther away, we would either be fried or trying to live on a large ice ball. The odds against being exactly where we are are not pleasant to contemplate. Then again, if we weren’t exactly here, I assume we wouldn’t be aware of our bad luck. Even our orbit around the sun produces a nice blend of moderate winters and summers.

However, we are still subject to occasional collisions with objects like asteroids and comets. In the past, these have produced mass extinctions, like the one that surprised the dinosaurs after they had enjoyed their time here for about 150 million years. Life has managed to adapt through five or six major incidents and countless smaller ones. The modern form of humans (our ancestors) has only been here for about 200,000 years, but we’ve been clever enough to develop weapons that could destroy Earth (a first we should not be too proud of).

Throughout the early history of our residency, the conventional wisdom was that Earth was the center of everything, with the sun, moon and stars orbiting us. This “geocentric model” seemed logical using common sense, as we don’t feel any motion standing on the ground and there was no observational evidence that our planet was in motion, either. This system was widely accepted and became entrenched in classical philosophy via the combined works of Plato and Aristotle in the fourth century B.C.

However, when the ancient Greeks measured the movements of the planets, it became clear the geocentric model had too many discrepancies. To explain the complications, Greek astronomers introduced the concept of epicycles to reconcile their theories. These sub-orbits were refined by the great Greco-Roman astronomer Ptolemy of Alexandria. Despite competing ideas and debates, the Ptolemaic system prevailed, but with significant implications.

As the Roman Empire dwindled, the Christian Church inherited many long-standing assumptions, including the idea that Earth was the center of everything and, perhaps more importantly, that man was the pinnacle of God’s creation – with dominion over Earth – a central tenet that held sway in Europe until the 16th century. However, by this point, the Ptolemaic model was becoming absurdly complicated. By the beginning of the 16th century, things were about to change … dramatically. The twin forces of the Renaissance and the Protestant Reformation challenged the old religious dogmas. Then a Polish Catholic canon, Nicolaus Copernicus (1473-1543), put forth the first modern heliocentric theory, shifting the center of the universe to the sun.

Fortunately, Copernicus was followed by an Italian polymath from Pisa, Galileo Galilei (1564-1642), who bravely helped transform natural philosophy into modern science and the scientific Renaissance into a scientific revolution. He championed heliocentrism, despite powerful opposition from fellow astronomers, and was subjected to a formal Roman Inquisition, which found him “vehemently suspect of heresy” and forced him into house arrest. He spent the remainder of his life essentially imprisoned.

Now famous for his work as an astronomer, physicist, engineer, philosopher and mathematician, he received the ultimate honor of being called the father of modern science by both Stephen Hawking and Albert Einstein!

Not bad for someone in an ankle bracelet.

Tension Between Federal, State Governments Lingers Even Today

Slave hire badges were likely hung from the necks of slaves who were leased out by their masters for short-term hire. This 1801 Charleston badge sold for $11,875 at a May 2015 Heritage auction.

By Jim O’Neal

There have always been disagreements about the real cause(s) of the American Civil War. The differences between the North and the South over the issue of slavery top most lists. Earlier academic arguments held that the cause was really the dividing line between states’ rights and the authority of the federal government. Where did one authority stop or become superseded by the other?

A third reason was the simple motivation to keep our hard-won United States intact and not splinter into states that were merely loosely confederated, as opposed to “One Nation of States United.” President Lincoln added fuel to this logic when he publicly wrote, “My paramount object in this struggle is to save the Union, and is not either to save or to destroy slavery. If I could save the Union without freeing any slave I would do it, and if I could save it by freeing all the slaves I would do it.”

In his historic Emancipation Proclamation, he literally freed only the slaves in the Confederate States, the only ones he had no authority over. Slaves in the so-called border states (slave states that did not declare secession) were unaffected by the Proclamation for fear of having more states secede. Maintaining the Union triumphed over all other objectives.

But what if the Civil War was really about something else, like, say, basic democracy? Sound crazy? Well, maybe, but consider the years after the American Revolution and the framing of the Constitution when the basic tenets of democracy were highly contentious. The Founding Fathers (both Federalists and Jeffersonians) clashed constantly and ferociously over the role of ordinary citizens in a new government of “We the People.” Who were these people and what rights did they have?

Even the ratification of the Constitution was delayed over the absence of a specific “Bill of Rights.” Opponents of such a list argued strenuously that the rights already existed, and that enumerating some could call into question others not specifically included. In the end, a compromise was necessary, and a special list was narrowed down to the ones in our current Bill of Rights. Somewhat ironically, however, they generally cover what the government cannot do, rather than enumerate specific rights for individual citizens.

The triumph of Andrew Jackson in 1828 modified this role on the national level, while city Democrats, anti-Masons, fugitive slaves and other Americans worked to carve out their interests on the local level. These cumulative decisions led inexorably to deepening regional differences. The free-labor Northern states and the slaveholding South – loosely linked by an evolving federal government – were in reality two distinct political systems with fundamentally antagonistic cultures. By the time of Jackson’s second inaugural address on March 4, 1833, he felt compelled to declare, “In the domestic policy of this government, there are two objects which especially deserve the attention of the people and their representatives, and which have been and will continue to be subjects of my increasing solicitude. They are the preservation of the rights of the several states and the integrity of the Union.”

The issue that smoldered and occasionally burst into flame was nullification. John Calhoun made himself the leader of the movement declaring that a state had the right to decide which federal laws it wished to observe and which to reject. Calhoun and his followers also held that a state had the right to secede from the Union. Jackson was adamant, publicly declaring, “Our Federal Union, it must be preserved!” National leaders from Martin Van Buren to Henry Clay were partially successful in bridging the ever-growing chasm between these diametrically opposed views. However, the election of Abraham Lincoln signaled an end to this delicate balance. The two almost alien cultures began to quickly unravel, as the firebrands in the South became convinced the federal government was determined to strip away their fundamental rights – and that secession was the way to stop the encroachment.

Rebellion in the form of secession led directly to the long-predicted armed conflict. Both sides were convinced they were right and were willing to sacrifice their lives.

Flash forward to today and we see the same fissures, which are likely to worsen. We’re seeing battles between the states and the federal government over emotionally charged issues like voting rights, immigration, travel bans, free speech, inequality, religion and health care … all seeking redress (nullification) from an overloaded Supreme Court, as a highly partisan Congress has grown more impotent.

United States Democracy, Act 2?

Efforts Under Way (Again) to Divide the Golden State

Albert Bierstadt’s 1872 oil on canvas Mount Brewer from King’s River Canyon, California, sold for $602,500 at a November 2012 auction.

By Jim O’Neal

On June 4, 1965, the California State Senate voted 27-12 to divide California into two separate states. To make the proposal effective, it required approval by the State Assembly, followed by both California voters and the U.S. Congress. The plan failed to generate enough support in the State Assembly and did not proceed. In 1992, the State Assembly passed a proposal to allow a referendum vote in each county to partition California into three states: North, South and Central California. In a twist, this time the proposal died in the Senate.

These were not new or even unique legislative actions. Since California became the 31st state to join the United States in 1850, there have been more than 220 similar attempts, obviously none successful. Even while California was under Spanish rule, the province was divided into Alta California (upper) and Baja California (lower). Alta California was the portion that entered the Union, while Baja remained a territory under Mexican rule. It is now one of Mexico’s 31 states … an extremely nice place for turistas to enjoy the sun, sand, fishing and golf.

After Lewis and Clark finished their historic expedition to map what we had actually acquired from France via the Louisiana Purchase, President Jefferson envisioned the Northwest eventually becoming an independent country … the “Pacific Empire.” He thought it might include a good chunk of Canada, along with what is now Washington, Oregon and Idaho. The war with Mexico trumped that idea, with the United States ultimately expanding to the Pacific coast from San Diego to the Canadian border, followed by the discovery of gold, which ensured a dramatic migration from east to west.

One of the more interesting episodes in carving up California in the 20th century is worth re-telling. In October 1941, the mayor of a small town in Oregon announced that four counties in Oregon planned to unite with three counties in Northern California to form the new state of Jefferson, in honor of the late president. On Nov. 27, 1941, a group of armed men stopped all traffic on U.S. Route 99 and handed out copies of their Proclamation of Independence for the State of Jefferson, along with their intent to secede from the Union. On Dec. 4, 1941, they selected local district attorney John Childs to be the governor of Jefferson. Alas, their efforts were foiled by Japanese bombers at Pearl Harbor three days later.

My friend Stan Delaplane won the 1942 Pulitzer Prize for a series of articles on the State of Jefferson that appeared in the San Francisco Chronicle. However, Stan’s bigger claim to fame was convincing the owner of the Buena Vista Café in San Francisco to introduce Irish coffee (after he had it at an airport in Ireland). The café is now one of the “must do” destinations in San Francisco, near the wharf area where the cable cars turn around. The BV claims to sell more Irish whiskey than anywhere else in the world: 100 bottles a day, enough for 2,000 glasses of its special-recipe Irish coffee. It is also a great people-watching place. Everyone you know will end up in the BV (if you are patient).

Stan worked for the Chronicle for 53 years and one of his favorite lines was, “Years ago, someone tilted the United States and all the loose nuts and bolts rolled to California.” What a terrific place for a writer, with new stuff happening every day!

Currently, there is another effort to create a New California, as authorized in Article 4, Section 3 of the U.S. Constitution. (The last time it was invoked was on June 20, 1863, when West Virginia detached from Virginia.) The current proposal envisions New California (population 15 million) alongside California (population 25 million); its backers issued a Declaration of Independence on Jan. 15, 2018, with the intent to form a 51st state. These plans have since been revamped to include an “autonomous Native American Nation.” The Calexit proposal (modeled after Brexit) would establish a non-reservation nation for American Indians through retrocession, primarily using federal land.

This form of reparation would see the state sliced down the middle, from Oregon to Mexico, with the coastal half remaining home to two-thirds of the existing population. All that’s left to get the proposal on the 2021 ballot is 365,000-plus signatures.

Let’s hope that what happens in California stays in California!

Here’s Why Linus Pauling is Among Our Greatest Scientists

A lot that included Linus Pauling’s signature was offered in January 2017.

By Jim O’Neal

Serious writers about Albert Einstein almost invariably include two episodes in his life. The first is the year 1905, when he published four stunning scientific papers. The first explained how to measure molecules in a liquid; the second explained how to determine their movement. The third was a revolutionary concept that described how light rays come in packets called photons. The fourth merely changed the world!

A second highlight deals with a “fudge factor” Einstein (1879-1955) called the “cosmological constant,” whose only purpose was to cancel out the troublesome cumulative effects of gravity in his masterful general theory of relativity. He would later call it “the biggest blunder of my life.” Personally, I prefer a much simpler observation that perfectly captures his nonchalance. The poet Paul Valéry (1871-1945) once asked him if he kept a notebook to track all his ideas. A rather amused Einstein quickly replied, “Oh, no. That’s not necessary. It is very seldom I have one.”

History is replete with examples of people who had a good year. For Yankees great Joe DiMaggio, it was 1941, when he hit in 56 consecutive games; for Babe Ruth, it was 1927 and his 60 home runs. For Bobby Jones, it was 1930, when he won all four of golf’s major championships. Some people have good days, like Isaac Newton when he observed an apple falling from a tree and instantly conceptualized his theory of gravity.

Linus Pauling was different. His entire life was filled with curiosity, followed by extensive scientific research to understand whatever had provoked him to wonder why. Pauling was born in 1901. His father died in 1910, leaving his mother to figure out how to support three children. Fortunately, a young school friend received an inexpensive chemistry set as a gift, and that was enough to spark Pauling’s passion for research. He was barely 13, but the next 80 years were spent delving into the world of the unknown and finding important answers to civilization’s most complex issues.

He left high school without a diploma (two credits short that a teacher wouldn’t let him make up), but then heard about quantum mechanics and in 1926 won a Guggenheim Fellowship to study the subject under top physicists in Europe. (He was eventually given an honorary high school diploma … after he won his first Nobel Prize.) By the late 1920s and early 1930s, Pauling was busy cranking out a series of landmark scientific papers explaining the quantum-mechanical nature of chemical bonds that dazzled the scientific community.

Eventually, he returned to the California Institute of Technology (with his honorary high school diploma) to teach the best and brightest of that era. Robert Oppenheimer (of the Manhattan Project) tried to recruit him to build the atomic bomb, but failed (presumably because he had also tried to seduce Pauling’s wife). However, Pauling did work on numerous wartime military projects … explosives, rocket propellants and an armor-piercing shell. It’s a small example of how versatile he was. In 1948, President Truman awarded him a Presidential Medal for Merit.

In 1954, he won the Nobel Prize in Chemistry for his research on the chemical bond and its application to the elucidation of the structure of complex substances … which I shall not try to explain. And along the way, he became a passionate pacifist, joining the Emergency Committee of Atomic Scientists, chaired by Einstein, in an effort “to warn the people of the pending dangers of nuclear weapons.” His reward was to be called a communist; he had his passport revoked and his patriotism challenged, along with many others, in the dark days of McCarthyism.

In 1958, he petitioned the United Nations, calling for an end to nuclear weapons testing. Circulated with his wife’s help, it was signed by more than 11,000 scientists from 50 countries. First ban the bomb, then ban nuclear testing, followed by a global treaty to end war, per se. That campaign earned him the Nobel Peace Prize for 1962, awarded in 1963 on the day the Partial Nuclear Test Ban Treaty took effect, making him one of only four people to win more than one prize – a group that includes Marie Curie in 1903 (physics) and 1911 (chemistry). His other awards are far too numerous to mention. As an aside, he died in one of my favorite places: Big Sur, Calif., at age 93.

Sadly, in later life, his reputation was damaged by his enthusiasm for alternative medicine. He championed the use of high-dose vitamin C as a defense against the common cold, a treatment subsequently shown to be ineffective (though there’s some evidence it may shorten a cold’s duration). I still take it, and I increasingly see scientific articles about infused vitamin C being tested in several cancer trials.

If he were still working on it, let’s say the smart money would be on Pauling.

100 Years Before Rosa Parks, There was Octavius Catto

Rosa Parks refused to give up her seat on a segregated bus, sparking the Montgomery, Ala., bus boycott.

By Jim O’Neal

Most Americans are familiar with Rosa Parks and recall the heroic story of a weary black woman on her way home after a hard day at work who refused to give up her seat and “move to the back of the bus” to make room for white people. The date was Dec. 1, 1955, and the city was Montgomery, Ala.

She was arrested and fined $10, and the ensuing Montgomery bus boycott lasted 381 days. She was ultimately vindicated by the U.S. Supreme Court, which ruled the segregation law unconstitutional. After her death, she became the first African-American woman to have her likeness depicted in National Statuary Hall in the U.S. Capitol.

Parks (1913-2005) earned her way into the pantheon of civil rights leaders, but few remember a remarkable man who preceded her by a century when streetcars were pulled by horses.

Catto

His name was Octavius Valentine Catto (1839-1871) and history was slow in recognizing his astonishing accomplishments. Even the epitaph on his tombstone shouts in bold letters “THE FORGOTTEN HERO.” One episode in his far-too-short but inspiring life is eerily similar to the events in Montgomery, only dramatically more so. Catto was a fierce enemy of the entire Philadelphia trolley car system, which banned black passengers. On May 18, 1865, The New York Times ran a story about an incident involving Catto that occurred the previous afternoon in Philadelphia, “The City of Brotherly Love” (at least for some).

Paraphrasing the story, it describes how a colored man (Catto) refused all attempts to get him to leave a strictly segregated trolley car. Frustrated, and fearing a fine if he physically ejected the passenger, the conductor cleverly switched the car onto a side track, detached the horses and left the defiant passenger in the now-empty stationary car. Apparently, the stubborn man was still aboard after spending the night. The incident caused a neighborhood sensation that led even more people to challenge the rules.

The following year, there was an important meeting of the Pennsylvania State Equal Rights League to protest the forcible ejection of several black women from Philadelphia streetcars. The intrepid Catto presented a number of resolutions that highlighted the inequities of segregation, invoked principles of freedom and civil liberty, and condemned a heavily biased judicial system. He also boldly solicited support from fellow citizens in his quest for fairness and justice.

He got specific help from Pennsylvania Congressman Thaddeus Stevens, a leader of the “Radical Republicans” who had a fiery passion for desegregation and the abolition of slavery, and who criticized President Lincoln for not taking more forceful action. Stevens is a major character in Steven Spielberg’s Oscar-nominated 2012 film Lincoln, with Tommy Lee Jones gaining an Oscar nomination for his portrayal of Stevens. On Feb. 3, 1870, the 15th Amendment to the Constitution guaranteed suffrage to black men (women of all colors would have to wait another 50 years, until 1920, to gain the right to vote in all states). It would also lead to Catto’s death. On Election Day, Oct. 10, 1871, Catto was out encouraging black men to vote for Republicans. He was fatally shot by white Democrats who wanted to suppress the black vote.

Blacks continued to vote heavily for Republicans until the early 20th century and were not even allowed to attend Democratic conventions until 1924. This was primarily due to the fact that Southern states had white governors who mostly discouraged equal rights and supported Jim Crow laws that were unfair to blacks. As comedian Dick Gregory (1932-2017) famously joked, he was at a white lunch counter where he was told, “We don’t serve colored people here,” and Gregory replied, “That’s all right. I don’t eat colored people … just bring me a whole fried chicken!”

Octavius Catto, who broke segregation on trolley cars and was an all-star second baseman long before Jackie Robinson, would have to wait until the 20th century to get the recognition he deserved. I suspect he would be surprised that we are still struggling to “start a national conversation” about race when that’s what he sacrificed his life for.

As Sir Newton Noted, Vast Oceans of Truth Lie Undiscovered

Sir Isaac Newton’s autograph was among a group of three signatures by famous scientists that sold for $4,750 at a January 2017 Heritage auction.

By Jim O’Neal

Charles Eliot was president of Harvard College from 1869 to 1909, taking charge at the surprisingly young age of 35. He made some surprising statements, too, starting with his inauguration speech. He matter-of-factly observed that the world knew very little about the natural mental capacities of women, plainly stating that the university was not the place to experiment with that notion and that women would not be admitted to the regular college. He was also concerned about women living near the university, given the obvious implications of proximity to male students. It was his firm belief that once society resolved issues of inequality, perhaps the question would become clearer. However, even long after his retirement, he maintained his doubts, since women were “simply physically too fragile.”

Another insight into his perspective occurred when the school’s baseball team had a winning season. When he learned that one of the factors that contributed to this success was the use of the curve ball, he opined this was a skill that was surely unethical and certainly not appropriate for Harvard players.

Fortunately, this was not a systemwide ethos, and he may have been unaware that one of his professors, Edward Charles Pickering (director of the Harvard College Observatory), fired his entire staff of men over their apparent inability to keep up with all the data being routinely generated. Instead, he simply hired his maid/housekeeper to handle the numbers, eventually employing 80-plus women, who became better known as the Harvard Computers.

One of these women was a little-known Radcliffe College graduate named Henrietta Swan Leavitt, who was “allowed” to measure the brightness of stars using the observatory’s photographic plates (women were not allowed to actually operate the telescopes). Leavitt devised a novel way to measure how far away certain stars were, turning them into “standard candles,” a term still in common use today. Another of the computers, Annie Jump Cannon, created a new system of stellar classification. Together, their work would prove invaluable in answering two critical questions about the universe: How old is it, and how big?

The man who came up with the answer using their inferences was lawyer-turned-astronomer Edwin Powell Hubble. He was born in 1889 and lived until 1953. When he sat down to peer through the Mount Wilson (Calif.) Observatory’s 100-inch Hooker telescope in 1917 (the world’s largest until 1949), there was exactly one known galaxy: our lonely little Milky Way. Hubble not only proved the universe consisted of additional galaxies, but that it was still expanding. How much credit the Harvard Computers deserve is still an area of contention – but only over the degree to which their work contributed to these new discoveries.

Hubble was a handsome, star athlete who won seven high school track events in one day and was also a skilled boxer. He never won a Nobel Prize, but won everlasting fame when NASA named its long-overdue space telescope in honor of his scientific contributions. The Hubble Space Telescope was launched into orbit on April 24, 1990, by the Space Shuttle Discovery. Since then, it has been repaired and upgraded five times by American astronauts and the results have been nothing short of remarkable. NASA is now confident that before Hubble’s replacement in four to five years, it will be capable of looking far enough back into deep space to see the earliest light following the big bang that started everything 13.8 billion years ago.

For perspective, consider what has been learned since Sir Isaac Newton was born on Christmas Day in 1642, when the heliocentric model of the universe – with Earth and the other planets orbiting our Sun – was gaining acceptance. It was similar to what Copernicus had published at the end of his life in 1543. Since then, all the great scientific minds have focused on our galaxy, trying to prove the laws of motion, the theory of light and the effects of gravity on what they believed was the entire universe. All big, important concepts (but only as they relate to our infinitesimal little piece of real estate). And then along came quantum mechanics, which added the world of the very small with its atoms, electrons, neutrons and other smaller pieces too numerous to name as we bang particles into each other to see what flies off.

What Hubble has gradually exposed is that we have been gazing at our navel while the sheer magnitude of what lies “out there” has grown exponentially – and there may be no end, since the expansion of the universe continues to speed up. I seem to recall that some wild forecasters thought there might be as many as 140 billion galaxies in the universe. Now, thanks to Hubble and lots of very smart people, the number of galaxies may be 2 trillion! If they each average 100 billion stars, that means the number of stars is now 200 sextillion – or the number 2 followed by 23 zeros.

That is a big number, but what if we are living in a multiverse with as many as 11 more universes?

I recall a quote by Sir Isaac Newton: “To myself I am only a child playing on the beach, while vast oceans of truth lie undiscovered before me.”

Aren’t we all.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Yes, Presidential Elections Have Consequences

Chief Justice of the Supreme Court John Marshall is featured on this Fr. 375 Serial Number One $20 1891 Treasury Note, which sold for $114,000 at an April 2018 Heritage auction.

By Jim O’Neal

In theory, there is no mystery or debate regarding the intention of the Founding Fathers in the selection of members to serve on the Supreme Court.

The Constitution crisply explains, in the second paragraph of Article II, Section 2, that the president shall nominate, and by and with the advice and consent of the Senate, shall appoint judges of the Supreme Court. This provision means exactly what it says and has not been modified since its adoption. That includes the Senate’s power, by simple majority vote, to grant such consent, to reject the nominee, or to refuse to take action on the presidential nomination.

One idea discussed, but not acted upon, was Benjamin Franklin’s explanation of the Scottish mode of appointment “in which the nomination proceeded from the lawyers, who always selected the ablest of the profession in order to get rid of him, and share his practice among themselves” – a uniquely clever way to eliminate superior competition.

What has changed is the use of the “nuclear option,” which allows a simple Senate majority to invoke cloture and end a filibuster of a nomination. Senate Majority Leader Harry Reid had used it to great effect for lower-court and executive-branch nominations while the Democrats were in the majority. Republicans expanded it to include Supreme Court nominees in 2017 after they had regained the majority. Neil Gorsuch was confirmed to the Supreme Court under this new rule with a 54-45 Senate vote, picking up three anxious Democratic votes in the process. It’s widely assumed that current nominee Judge Brett Kavanaugh will be confirmed to the Supreme Court following a similar path since his opponents appear helpless to stop him.

As President Obama once explained, in not too subtle fashion, “Elections have consequences.”

It now seems clear that the Founding Fathers did not foresee that political parties would gradually increase their influence and that partisan considerations in the Senate would become more prominent than experience, wisdom and merit. This was magnified in the current effort to stymie a nomination when the opposition announced it would oppose any candidate the Chief Executive chose. Period. It may not seem reasonable on a literal basis, but it has gradually become routine and will only get worse (if that’s still possible).

It may astonish some to learn that no legal or constitutional requirements for a federal judgeship exist. President Roosevelt appointed James F. Byrnes as an associate justice in 1941 and his admission to practice was by “reading law.” This custom is now obsolete – Byrnes was the last to benefit – having preceded modern institutions that specialize exclusively in law. In Byrnes’ case, it’s not clear that he even had a high school diploma. But he was a governor and member of Congress. He resigned 15 months later (the second-shortest tenure) in order to become head of the Office of Economic Stabilization; he was a trusted FDR advisor who many assumed would replace Vice President Henry Wallace as FDR’s running mate in 1944. That honor went to the little-known, high-school-educated Harry Truman, who would assume the presidency the following year when FDR died suddenly.

Thomas Jefferson never dreamed the Supreme Court would become more than just a necessary evil to help balance the government in minor legal proceedings and would be more than astonished that they now are the final arbiter of what is or isn’t constitutional. The idea that six judges (who didn’t even have a dedicated building) would be considered equal to the president and Congress would have been anathema to him.

However, that was before John Marshall, the former Secretary of State, became Chief Justice of the Supreme Court and started the court’s long journey to final arbiter of the Constitution with his ruling in Marbury v. Madison in 1803. There was a new sheriff in town, and the next 40 years witnessed the transformation of the court to the pinnacle of legal power. The justices even have their own building thanks to President William Howard Taft, who died before it was completed. Someday, Netflix will persuade them to livestream their public sessions for all of us to watch, although I personally prefer C-SPAN to eliminate the mindless talking heads that pollute cable television.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Here’s Why We Owe a Lot to Second President John Adams

An 1805 oil-on-canvas portrait of John Adams attributed to William Dunlap sold for $35,000 at a May 2017 Heritage auction.

By Jim O’Neal

John Adams had the misfortune of being squeezed into the presidency of the United States (for a single term) between George Washington and Thomas Jefferson, two of the most famous presidents of all time. As a result, Adams (1735-1826) was often overlooked as one of America’s greatest statesmen and perhaps the most learned and penetrating thinker of his time. The importance of his role in the founding of America was noted by Richard Stockton, a delegate to the Continental Congress: “The man to whom the country is most indebted for the great measure of independence. … I call him the Atlas of American Independence.”

On the way to that independence, his participation started as early as 1761 when he assisted James Otis in defending Boston merchants against Britain’s enforcement of the Sugar Tax. When the American Revolution ended, Adams played a key role in the peace treaty that formally ended the war in 1783. In between those two bookends, he wrote many of the most significant essays and treatises, led the radical movement in Boston, and articulated the principles at the Continental Congress.

When Parliament passed the infamous Stamp Act in 1765, Adams attacked it with a vengeance and wrote A Dissertation on the Canon and Feudal Law, asserting that the act deprived the colonists of two basic rights: taxation by consent and trial by a jury of peers – both guaranteed to all Englishmen by the Magna Carta. Within a brief 10 years, he was acknowledged as one of America’s best constitutional scholars. When Parliament passed the Coercive Acts in 1774, Adams drafted the principal clause of the Declaration of Rights and Grievances; no man worked harder in the movement for independence and the effort to constitutionalize the powers of self-government.

After the Battles of Lexington and Concord, Adams argued for the colonies to declare independence, and in 1776, Congress passed a resolution recommending the colonies draft new constitutions and form new governments. Adams wrote a draft blueprint, Thoughts on Government, and four states used it to shape new constitutions. In summer 1776, Congress considered arguments for formal independence, and John Adams made a four-hour speech that forcefully persuaded the assembly to vote in favor. Thomas Jefferson later recalled that “it moved us from our seats … He was our colossus on the floor.”

Three years later, Adams drafted the Massachusetts Constitution, which was copied by other states and guided the framers of the Federal Constitution of 1787.

He faithfully served two full terms as vice president for George Washington at a time when the office had only two primary duties: preside over the Senate and break any tie votes, and count the ballots for presidential elections. Many routinely considered the office to be part of Congress as opposed to the executive branch. He served one term as president and then lost the 1800 election to his vice president, Thomas Jefferson, as the party system (and Alexander Hamilton) conspired against his re-election. Bitter and disgruntled, he left Washington, D.C., before Jefferson was inaugurated and returned to his home in Massachusetts. His wife, Abigail, had departed earlier, as their son Charles had died in November from the effects of chronic alcoholism.

Their eldest son, John Quincy Adams, served as the sixth president (for a single term) after a contentious election, and they both gradually sank into relative obscurity. This changed dramatically in 2001 when historian David McCullough published a wonderful biography that reintroduced John and Abigail Adams to a generation that vaguely knew he had died on the same day as Thomas Jefferson, July 4, 1826 – the 50th anniversary of the signing of the Declaration of Independence. In typical McCullough fashion, it was a bestseller and led to an epic TV mini-series that snagged four Golden Globes and a record 13 Emmys in 2008.

Television at its very best!

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].