The Rocky Path to Proving Einstein’s Theory of Relativity

 

A first edition of Albert Einstein’s ‘Relativity: The Special and the General Theory, a Popular Exposition’ sold for $10,625 in a December 2021 Heritage auction.

By Jim O’Neal

While I was watching TV, someone mentioned Principe, and it piqued my interest, primarily because I knew so little about this relatively small island off the west coast of Africa in the Gulf of Guinea. After digging a little deeper, I discovered Principe was where Albert Einstein's theory of general relativity was corroborated by the gifted astronomer and mathematician Sir Arthur Eddington during a total solar eclipse.

The date was May 29, 1919.

To fully understand the sequence of events, it helps to start about 75 years earlier.

During the 19th century, the map of Europe was frequently redrawn as old empires crumbled and new powers emerged. You may not know that in 1850, the countries of Germany and Italy that we know so well today did not exist. Instead, there were many German-speaking and Italian-speaking states, each with their own leaders.

Then, in the 1860s and 1870s, ambitious politicians started merging these states through a combination of hostile actions and political agreements. A prime example was the Franco-Prussian War of 1870-71. The German states, led by Prussia's Otto von Bismarck (the "Iron Chancellor"), defeated France soundly. This ended France's domination of Europe and forced Napoleon III into exile in Britain.

The 25 formerly feuding German states were transformed into a new unified German Empire on January 18, 1871. The new power had an enormous army, a streamlined economy and intellectual institutions that dwarfed those of the rest of the European continent. The seeds of Germanic militarism would plague the world well into the 20th century. After a visit in 1878, the peripatetic Mark Twain wrote: "What a paradise this land is! Such clean clothes and good faces, what tranquil contentment, what prosperity, what genuine freedom, what superb government! And I am so happy, for I am responsible for none of it. I am only here to enjoy." High praise for a country that would wreak havoc upon the world twice in the next century.

By the turn of the 20th century, a peculiar and precarious balance had emerged, held together by a complicated network of alliances and rivalries. When the heir to the Austro-Hungarian throne, Archduke Franz Ferdinand, was assassinated by a Serbian nationalist in June 1914, Austria-Hungary declared war on Serbia. Then the others followed like a line of dominoes cascading helter-skelter. In Europe, the fighting took place on two fronts: the Western Front, stretching from Belgium to Switzerland, and the Eastern Front, from the Baltic to the Black Sea. The war quickly spread to the European colonies throughout the world. It raged for four and a half years, and more than 20 million people lost their lives.

In the United States, seemingly isolated by two great oceans, President Woodrow Wilson was re-elected in 1916 under the slogan "He kept us out of war." Five months later he asked Congress to declare war on Germany, and within three months American troops were landing in France. The distant sounds of guns had somehow morphed into the cacophony of a world war.

At the time, very few were aware of the effects the Great War had on Einstein's efforts to prove his general theory of relativity. The industrialized slaughter bled Europe from 1914 to 1918 without regard for the weak or the strong, the right or the wrong. Wars do not have a conscience or any pity for the young or old. Their objective is destruction. Men make these decisions.

At age 39, Einstein was working in Berlin, nearly starving because of the wartime food blockades. He lost 50 pounds in three months and was unable to communicate with his closest colleagues abroad. He was alone, honing his theory of relativity, which would soon be hailed as "one of the greatest — perhaps the greatest — of achievements in the history of human thought." It was the first complete revision of our conception of the universe since Sir Isaac Newton.

Scientists seeking to confirm Einstein's ideas were arrested as spies. Technical journals were banned as enemy propaganda. His closest ally was separated from him by barbed wire and U-boats: Sir Arthur Eddington, secretary of the Royal Astronomical Society, who obtained Einstein's papers through neutral channels and quietly championed the astonishing mathematics in Britain. Eddington was convinced that Einstein was correct. But how to prove it physically?

In May 1919, with Europe still in the chaos left by war, Eddington led a globe-spanning expedition to catch a fleeting solar eclipse on film. The photographs confirmed Einstein's boldest prediction: that gravity bends light, in this case starlight deflected as it passed close to the massive Sun. The result put Einstein's picture on front pages around the world!
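
For context (a sketch I'm adding here, not part of the original account), the general-relativistic prediction Eddington was testing, for starlight grazing the edge of the Sun, works out to

\[
\delta = \frac{4GM_{\odot}}{c^{2}R_{\odot}} \approx 1.75'' ,
\qquad \text{versus the Newtonian estimate } \frac{2GM_{\odot}}{c^{2}R_{\odot}} \approx 0.87''.
\]

Measurements consistent with the larger value are what vindicated Einstein.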

Later, on August 2, 1939, he would sign a letter to President Franklin D. Roosevelt warning him that an atomic bomb was feasible and that the Germans, with all their talented physicists, could be dangerously close to making one. He encouraged the president to secure a supply of uranium and make the research a priority.

We now know how this ended.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Einstein was right: Income taxes are hard to understand

A 1952 Topps trading card for oil industry magnate, industrialist and philanthropist John D. Rockefeller sold for $720 at a February 2019 auction.

By Jim O’Neal

Next year promises to be another year when a major political party has an unusually large number (perhaps 20) of eager aspirants wanting to become president. Republicans had to contend with a similar problem in 2016. Sixteen hopefuls filled out the requisite forms. Most of them withdrew during the primaries, and the last two – Ted Cruz and John Kasich – dropped out after Donald Trump won the Indiana primary in early May 2016, making Trump the presumptive Republican nominee.

There are advantages to having a broad, diverse group of candidates, but almost as many headaches. One that I suspect will become more prominent is "too many hogs at the trough." Every candidate is challenged to find a unique angle to woo prospective voters, but it is surprising how many are making promises that will be impractical to keep. Free health care, college debt forgiveness and free college tuition are fundamentally different from promises of lower taxes or new high-paying jobs.

Generally, populist proposals are “paid for” by taxing the rich with a “billionaire tax” or a radical wealth tax on existing assets. Since 70 percent of federal revenues come from the top 20 percent of taxpayers, the math just doesn’t work. This is especially true with 10,000 people retiring every day and existing entitlement projections squeezing out almost all other spending. One fact is certain: We can’t use the same tax dollar to pay for four different expenditures (unless we really crank up the printing presses … which are now just electronic gizmos).


Naturally, opponents are quickly stigmatizing all creative spending proposals as merely different variants of socialism. As “The Iron Lady” Margaret Thatcher wisely observed, “The trouble with Socialism is that eventually you run out of other people’s money.”

There may have been a time 150 years ago, during the Gilded Age, when some of these things made sense. America's industrial success produced an era of financial magnificence when many basked in dynastic wealth of inexhaustible dimensions. John D. Rockefeller made the equivalent of $1 billion and paid no income tax. No one else did either, since a federal income tax didn't exist in the United States at the time. Congress passed a 2 percent tax on earnings over $4,000 in 1894, but it was ruled unconstitutional by the Supreme Court. Nearly 20 years later, in 1913, a modest income tax was finally put in place. Albert Einstein said: "The hardest thing in the world to understand is the income tax." I agree.

The wealthy of the 19th and early 20th centuries found that spending all of their money became a full-time job. After emptying Europe of much of its fine art and artifacts, they built houses on a truly grand scale. The grandest of all were said to be the Vanderbilts: ten mansions on 5th Avenue (one with 137 rooms). Next was Newport, R.I., where magnificent homes were quaintly called "cottages." Then came George Washington Vanderbilt II (1862-1914), who, with the successful and influential landscape designer Frederick Law Olmsted, built the Biltmore House, an estate outside Asheville, N.C. The house is believed to be the largest domestic dwelling in the United States, with 250 rooms nestled on 125,000 acres.

Olmsted and his partner, Calvert Vaux, designed America's most famous park: New York City's Central Park. Olmsted's other projects are too extensive to list here, even by category. The most fitting description of his work comes from Daniel Burnham, the great American architect and urban designer: "An artist, he paints with lakes and wooded slopes, with lawns and banks and forest-covered hills, with mountainsides and ocean views."

Worthy of a great man’s epitaph!

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Copernicus, Galileo Helped Discover Our True Place in the Universe

A 1635 copy of Galileo’s Dialogo, containing the argument that Earth revolves around the sun, sold for $31,070 at a February 2010 Heritage auction.

“Then God said, ‘Let there be light,’ and there was light.” Genesis 1:3

By Jim O’Neal

About 4.6 billion years ago, an enormous swirl of gas and matter started a process of aggregation that ultimately formed our solar system. Nearly all of this mass was transformed into our sun and the remainder eventually coalesced into clumps that became our planetary system. Fortunately for us, one of these clumps became our current home, Earth. Then, a large clump approximately the size of Mars collided with Earth and knocked out enough material to form an object we call the moon.

Several other factors were also in our favor. First was the size of the sun: if it were much larger, it might already have burned out, and us with it (as it is, we have about 5 billion years to go). The second was the distance between the sun and Earth. We literally occupy a "Goldilocks position" – not too close and not too far. If we were 5 percent closer or 15 percent farther away, we would either be fried or trying to live on a large ice ball. The long odds against being exactly where we are aren't pleasant to contemplate. Then again, if we weren't exactly here, I assume we wouldn't be around to lament our bad luck. Even the tilt of Earth's axis, as we orbit the sun, produces a nice blend of moderate seasons of winter and summer.

However, we are still subject to occasional collisions with other objects, like asteroids and comets. In the past, these have produced mass extinctions, like the one that surprised the dinosaurs, who had enjoyed their time here for about 150 million years. Life has managed to adapt through five or six major incidents and countless smaller ones. Modern humans (our ancestors) have only been here for about 200,000 years, but we've been clever enough to develop weapons that could destroy Earth (a first we should not be too proud of).

Throughout the early history of our residency, the conventional wisdom was that Earth was the center of everything, with the sun, moon and stars orbiting us. This “geocentric model” seemed logical using common sense, as we don’t feel any motion standing on the ground and there was no observational evidence that our planet was in motion, either. This system was widely accepted and became entrenched in classical philosophy via the combined works of Plato and Aristotle in the fourth century B.C.

However, when the ancient Greeks measured the movements of the planets, it became clear the geocentric model had too many discrepancies. To explain them away, Greek astronomers introduced the concept of epicycles – sub-orbits that were later refined by the great Greco-Roman astronomer Ptolemy of Alexandria. Despite competing ideas and debates, the Ptolemaic system prevailed, and it held sway for well over a millennium.

As the Roman Empire dwindled, the Christian Church inherited many long-standing assumptions, including the idea that Earth was the center of everything and, perhaps more importantly, that man was the pinnacle of God's creation – with dominion over Earth – a central tenet that held sway in Europe until the 16th century. By then, however, the Ptolemaic model had become absurdly complicated, and things were about to change … dramatically. The twin forces of the Renaissance and the Protestant Reformation challenged the old religious dogmas. Then a Polish Catholic canon, Nicolaus Copernicus (1473-1543), put forth the first modern heliocentric theory, shifting the center of the universe to the sun.

Fortunately, Copernicus was followed by an Italian polymath from Pisa, Galileo Galilei (1564-1642), who bravely helped transform natural philosophy into modern science and the scientific Renaissance into a scientific revolution. He championed heliocentrism despite powerful opposition from astronomers and the Church, and was tried by the Roman Inquisition, which found him "vehemently suspect of heresy" and placed him under house arrest. He spent the remainder of his life essentially imprisoned in his own home.

Now famous for his work as an astronomer, physicist, engineer, philosopher and mathematician, he received the ultimate honor of being called the father of modern science by both Stephen Hawking and Albert Einstein!

Not bad for someone in an ankle bracelet.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Here’s Why Linus Pauling is Among Our Greatest Scientists

A lot that included Linus Pauling’s signature was offered in January 2017.

By Jim O’Neal

Serious writers about Albert Einstein almost invariably include two episodes in his life. The first is the year 1905, when he published four stunning scientific papers. The first explained how to measure molecules in a liquid; the second explained how to determine their movement. The third was a revolutionary concept that described how light rays come in packets called photons. The fourth merely changed the world!

A second highlight deals with a "fudge factor" Einstein (1879-1955) called the "cosmological constant," whose only purpose was to counteract the cumulative pull of gravity in his masterful general theory of relativity and keep the model universe static. He would later call it "the biggest blunder of my life." Personally, I prefer a much simpler observation that perfectly captures his nonchalance. The poet Paul Valéry (1871-1945) once asked him if he had a notebook to keep track of all his ideas. A rather amused Einstein quickly replied, "Oh, no. That's not necessary. It is very seldom I have one."

History is replete with examples of people who had a good year. For Yankees great Joe DiMaggio, it was 1941, when he hit in 56 consecutive games; for Babe Ruth, it was 1927 and his 60 home runs. For Bobby Jones, it was 1930, when he won all four of golf's major championships. Some people have a good day, like Isaac Newton when, as legend has it, he watched an apple fall from a tree and conceived his theory of gravity.

Linus Pauling was different. His entire life was filled with curiosity, followed by extensive scientific research to understand whatever had provoked him to wonder why. Pauling was born in 1901. His father died in 1910, leaving his mother to figure out how to support three children. Fortunately, a young school friend got an inexpensive chemistry set as a gift, and that was enough to spark Pauling's passion for research. He was barely 13, but the next 80 years were spent delving into the world of the unknown and finding important answers to civilization's most complex problems.

He left high school without a diploma (he was two credits short, and a teacher wouldn't let him make them up), but then heard about quantum mechanics and in 1926 won a Guggenheim Fellowship to study the subject under top physicists in Europe. (He was eventually given an honorary high school diploma … after he won his first Nobel Prize.) By the late 1920s and early 1930s, Pauling was cranking out a series of landmark scientific papers explaining the quantum-mechanical nature of chemical bonds that dazzled the scientific community.

Eventually, he returned to the California Institute of Technology (with his honorary high school diploma) to teach the best and brightest of that era. Robert Oppenheimer tried to recruit him for the Manhattan Project to build the atomic bomb, but failed (presumably, in part, because Oppenheimer had also tried to seduce Pauling's wife). Pauling did, however, work on numerous wartime military projects … explosives, rocket propellants and an armor-piercing shell. It's a small example of how versatile he was. In 1948, President Truman awarded him the Presidential Medal for Merit.

In 1954, he won the Nobel Prize in Chemistry for his research on the chemical bond and its application to the elucidation of the structure of complex substances … which I shall not try to explain. And along the way, he became a passionate pacifist, joining the Emergency Committee of Atomic Scientists, chaired by Einstein, in an effort “to warn the people of the pending dangers of nuclear weapons.” His reward was to be called a communist; he had his passport revoked and his patriotism challenged, along with many others, in the dark days of McCarthyism.

In 1958, he petitioned the United Nations, calling for an end to nuclear-weapons testing. In addition to his wife, it was signed by more than 11,000 scientists from some 50 countries. First ban nuclear testing, then ban the bomb, followed by a global treaty to end war, per se. He received a second Nobel Prize, the Peace Prize for 1962 (awarded in 1963), for that campaign against nuclear testing, making him one of only four people to win more than one prize – Marie Curie among them, in 1903 (physics) and 1911 (chemistry). His other awards are far too numerous to mention. As an aside, he died in one of my favorite places, Big Sur, Calif., at age 93.

Sadly, in later life his reputation was damaged by his enthusiasm for alternative medicine. He championed high-dose vitamin C as a defense against the common cold, a treatment subsequently shown to be ineffective (though there is some evidence it may shorten a cold's duration). I still take it, and I see scientific articles with increasing frequency about infused vitamin C being tested in several cancer trials.

If he were still working on it, let’s say the smart money would be on Pauling.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

It’s a Long Way from Hot Air Balloons to Lethal Drones

An 1887 complete set of 25 Lone Jack “Inventors and Inventions” cards, featuring Michael Faraday, sold for $3,107 at a November 2011 Heritage auction.

By Jim O’Neal

Wars fought from the air can be traced back to the rubber balloons first made by Professor Michael Faraday in 1824 for use in his experiments with hydrogen at the Royal Institution in London (where he would later become the first Fullerian Professor of Chemistry). However, it was his work with electricity and magnetism that earned him greater fame. Virtually all electricity produced today – whether from coal, oil, gas, nuclear, hydro or wind – is generated on Faraday's principles.

Faraday (1791-1867) declined an offer of knighthood since he believed the Bible forbade earthly accumulations of wealth or honor and stated that he “preferred to be plain Mr. Faraday to the end.” He also declined to assist the British government with the production of chemical weapons, on ethical grounds, for the Crimean War (1853-1856) … a position the current government now apparently agrees with given recent activities in Syria.

I can attest to the many memorials to Faraday around greater London today, and Albert Einstein kept a picture of Faraday on his study wall, next to one of Isaac Newton, in acknowledgment of their enormous contributions to our understanding of electromagnetic fields in space. Not bad for "plain" Mr. Faraday.

Faraday's work on balloons was preceded by that of the French Montgolfier brothers who, in 1783, perched on a hillside watching a bonfire, are said to have mused:

“I wonder what makes the smoke go up.”

“Perhaps warm air is lighter and the cold air pushes it up.”

“Then if we filled a bag with hot air, it would fly!”

Aeronautics was born.

Then, on June 18, 1861, a stunned audience in Washington watched a giant balloon, the Enterprise, rise 500 feet. The man aboard sent a telegram to President Lincoln: "Sir: From this point of observation we command an area of nearly 50 miles in diameter. I have the pleasure of sending you this first telegram ever dispatched from an aerial station… T.S.C. Lowe"

This was a prelude to the short-lived formal use of aerial observation by the armed forces. The first balloon bought for the American military was an $850 model of raw India silk built by John Wise of Lancaster, Pa. Neither side in the Civil War was able to use balloons for much more than observing troop positions, since any kind of armament was simply too heavy to be carried aloft. An aerial photography service was offered but never acted on. After viewing the First Battle of Bull Run, Lowe and other balloonists formed the Union Army Balloon Corps, but it was disbanded in August 1863. Confederate efforts were even more modest, and legend has it that (sadly) the very last silk dress in the entire Confederacy was used to try to make a balloon.

Then a man by the name of Billy Mitchell enlisted as a private in the Spanish-American War, where he became a member of the U.S. Army Signal Corps. He subsequently served in France during World War I and ultimately came to be regarded as the father of the U.S. Air Force. He stubbornly insisted that "the day has passed when armies on the ground or navies on the sea can be the arbiter of a nation's destiny in war. The main power of defense and the power of initiative against an enemy has passed to the air" (November 1918).

That insistence, plus a statement accusing senior leaders of the Army and Navy of incompetence and "almost treasonable administration of the national defense," got him court-martialed. The court, which included Major General Douglas MacArthur among the judges, found him guilty on Dec. 17, 1925, and suspended him from active duty for five years. Ironically, MacArthur suffered a somewhat similar fate decades later for publicly challenging his commander-in-chief.

We are now in an era where wars are fought with lethally armed unmanned aerial vehicles (UAVs) piloted from remote locations and with cruise missiles launched from ships up to 1,500 miles away. I was hoping we had the cyber-technology to destroy an enemy's power infrastructure, disable its communications and simply render its offensive and defensive capabilities useless.

Maybe that’s only feasible for presidential elections using Facebook.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Napoleonic-Era Book Explains Evolving Dark Art of War

A title lobby card for the 1927 silent French epic film Napoléon sold for $10,157 at a July 2008 Heritage auction.

By Jim O’Neal

It is generally accepted dogma that the French Revolution devoured its own children. Less often noted is that many of those who fought against it were literally children: Carl von Clausewitz was only 12 when he first saw action against the French.

A true warrior-scholar, Clausewitz (1780-1831) survived the shattering defeat at Jena-Auerstedt (in present-day Germany) in 1806, refused to fight with the French against the Russians in 1812 and saw action at Ligny in 1815. In his book Civilization: The West and the Rest, British historian Niall Ferguson argues that it was Clausewitz who, better than anyone (including Napoleon himself), understood how the Revolution had transformed the dark art of war.

The Prussian general's posthumously published masterpiece On War (1832) remains the single most important work on the subject produced by a Western author. Though in many ways timeless, On War is also, as Ferguson points out, the indispensable commentary on the Napoleonic era. It explains why war had changed in its scale, and the implications for those who chose to wage it.

Clausewitz declared that war is “an act of force to compel our enemy to do our will … (it is) not merely an act of policy but a true political instrument, a continuation of political intercourse, carried on with other means.” These are considered his most famous words, and also the most misunderstood and mistranslated (at least from what I have read … which is extensive).

But they were not his most important.

Clausewitz’s brilliant insight was that in the wake of the French Revolution, a new passion had arrived on the field of battle. “Even the most civilized of peoples [ostensibly referring to the French] can be fired with passionate hatred for each other…” After 1793, “war again became the business of the people,” as opposed to the hobby of kings, Ferguson writes. It became a juggernaut, driven by the temper of a nation.

This was new.

Clausewitz did acknowledge Bonaparte's genius as the driver of this new military juggernaut, yet his exceptional generalship was less significant than the new "popular" spirit that propelled his army – the first element of what Clausewitz called a paradoxical trinity: primordial violence, hatred and enmity. If that was true, it helps explain the many people's wars of the 19th century, but it is perplexing (at least to me) when applied to events a century later.

The Battle of the Somme, which began on July 1, 1916, is infamous primarily because of the 58,000 British casualties suffered that first day (one-third of them killed) – to this day a one-day record. It was the main Allied attack on the Western Front in 1916 and lasted until Nov. 18, when terrible weather brought it to a halt. The offensive resulted in over 620,000 British and French casualties; German casualties were estimated at 500,000. It is one of the bloodiest battles in human history.

The Allies gained a grand total of 12 kilometers of (non-strategic) ground!

It is hard to fit Clausewitz’s thesis into this form of military stupidity. I prefer the rationale offered by the greatest mind of the 20th century: “Older men declare war. But it is the youth that must fight and die,” said Albert Einstein.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

What Nature Separated, Explorers Brought Together

A letter Albert Einstein sent to Charles Hapgood in November 1954 sold for $3,125 at an April 2015 Heritage auction.

By Jim O’Neal

To my knowledge, the only trained geologist I have ever met is Simon Winchester, now a successful journalist and author. Most of the people I know tend to look up or back (into history) rather than down. I assume most geologists are normal, average people, but reading about 20th-century geology, and the difficulty the field had in reaching anything close to a consensus about the Earth, makes me wonder.

As early as 1912, a German scientist, Alfred Wegener, developed the theory that the continents had once formed a single landmass – he called it Pangaea – that later drifted apart. However, virtually everyone else argued that continents moved up and down, but definitely not "sideways," and they developed elaborate theories to explain away the evidence.

Just before he died in 1955, Albert Einstein enthusiastically endorsed a theory by Charles Hapgood, who wrote the book "Earth's Shifting Crust: A Key to Some Basic Problems of Earth Science." Hapgood systematically debunked the idea of continental drift and dismissed anyone who believed it as "gullible." When it was finally realized that the whole crust of the Earth was in motion – and not just the continents – it took a while to settle on a name for the new science.

It wasn’t until 1968, when the Journal of Geophysical Research published an article by three American seismologists, that the new science was dubbed plate tectonics. Still, as late as 1980, one in eight geologists didn’t believe in plate-tectonic theory. It is not clear if there are any continental drift-deniers left, but we know for sure that the pre-1492 American and Eurasian ecosystems existed in complete isolation for thousands of years. The arrival of the first Europeans in North and Central America reconnected them and started what was to become known as the great Columbian Exchange.

Lives and economies that had evolved gradually over centuries were suddenly transformed by the influx of new crops, animals, technology and pathogens. Many of the effects were unforeseen and generally misunderstood by both American Indians and Europeans; but once the first landing occurred, massive changes were inexorably under way. One small example: roughly 60 percent of all crops grown in the world today originated in the Americas. A more immediate and devastating impact was the introduction of new diseases into the Americas, which wiped out millions of Indians who had no biological defenses against smallpox, measles, malaria, chickenpox and yellow fever.

The dramatic and irrevocable changes brought about on both sides of the Atlantic by the Columbian Exchange continued to shape lives for centuries, just as the movement of the Earth's crust shapes our lives today through earthquakes, shifting oceans and other geological events.

We live on and within a planet that will continue to change, as our televisions remind us every day. The longer-term changes are still being debated, and the debate is now so politicized that prudent policies are paralyzed. It must be something in our DNA.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Atom Bombs: From Pop Culture Novelty to Unimaginable Threat

A First Day Cover postmarked July 28, 1955, and signed by six crew members of the Enola Gay, which dropped the first atomic bomb on Hiroshima, went to auction in April 2005.

By Jim O’Neal

As North Korea relentlessly pursues offensive atomic weapons – perhaps even a nuclear-tipped missile launched from a submersible vessel – the world is perplexed over how to respond. U.S. sanctions are ignored; China is permissive, complicit or both; and South Korea and Japan grow more anxious while the United Nations remains irrelevant, as usual.

Concurrently, polls indicate that attitudes about the use of atomic bombs against Japan to end World War II have grown less favorable. But this was not always the case.

At first, most people approved of the use of the bomb on Hiroshima and of the second bomb dropped on Nagasaki three days later. They agreed the bombs hastened the end of the war and saved more American lives than they had taken from the Japanese. Most people shared the view of President Truman and the majority of the defense establishment: The bomb was just an extension of modern weapons technology.

There had even been some giddiness about the Atomic Age. The bar at the National Press Club started serving an “atomic cocktail.” Jewelers sold “atomic earrings” in the shape of a mushroom cloud. General Mills offered an atomic ring and 750,000 children mailed in 15 cents and a cereal box top to “see genuine atoms split to smithereens.”

But the joking masked a growing anxiety that was slowly developing throughout our culture. In the months after it ended the war, the bomb also began to effect an extraordinary philosophical reassessment and generate a gnawing feeling of guilt and fear.

Then, the entire Aug. 31, 1946, issue of The New Yorker magazine was devoted to a 30,000-word article by John Hersey entitled, simply, “Hiroshima.” The writer described the lives of six survivors before, during and after the dropping of the bomb: a young secretary, a tailor’s wife, a German Jesuit missionary, two doctors and a Japanese Methodist minister.

The power of Hersey's reporting, devoid of any melodrama, brought a human dimension to an unimaginable tragedy, and the response was overwhelming. The magazine sold out. A book version became a runaway bestseller (still in print). Albert Einstein bought 1,000 copies and distributed them to friends. An audience of millions tuned in to hear the piece read, in its entirety, over the ABC radio network.

After Hersey’s book with its explicit description of the atomic horror (“Their faces wholly burned, their eye sockets were hollow, the fluid from their melted eyes had run down on their cheeks”), it was impossible to ever see the bomb as just another weapon. The only solace was that only America possessed this terrible weapon.

However, it soon became clear that it was only a matter of time before the knowledge would spread and atomic warfare between nations would become possible. People were confronted for the first time with the real possibility of human extinction. They finally grasped that the next war could indeed be what Woodrow Wilson had dreamed the First World War would be – a war to end all wars – though only because it would likely end life itself.

Let’s hope our world leaders develop a consensus about the Korean Peninsula (perhaps reunification) before further escalation. It is time to end this threat, before it has a chance to end us.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is President and CEO of Frito-Lay International [retired] and earlier served as Chairman and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Albert Einstein was Much More than a Scientist

This signed Albert Einstein photograph realized $17,500 at an October 2014 auction.

By Jim O’Neal

Mention the name Albert Einstein and the image of the iconic scientist – unruly hair, pensive expression – and the word "genius" instinctively spring to mind. His work as a theoretical physicist on general relativity produced a theory of gravitation that has evolved into a crucial tool of modern astrophysics and is foundational for current "black hole" research.

In popular culture, his mass-energy equivalence formula – energy equals mass multiplied by the speed of light squared (E = mc²) – is generally regarded as "the world's most famous equation."
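
To get a feel for what the formula says, here is a small illustrative calculation of my own (not from the article): converting a single gram of mass entirely into energy releases

\[
E = mc^{2} = (10^{-3}\,\text{kg}) \times (3.0\times10^{8}\,\text{m/s})^{2} \approx 9\times10^{13}\ \text{joules},
\]

roughly the yield of a 20-kiloton atomic bomb.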

Then, of course, there was Einstein the mortal man.

This aspect is understandably less well known, despite his empathy for mankind and the practical application of both his intellect and his celebrity to help improve the world and its inhabitants. He was an avowed pacifist who considered war a "disease" and even advocated a global democratic government with authority over individual nation-states (e.g., Nazi Germany).

He viewed racism in the United States as a multi-generational problem and joined the NAACP as an activist to help cure “America’s worst disease.”

An earlier incident in 1925 even led to a series of related activities that eventually helped defeat the Germans in World War II. While reading a local German newspaper, he saw a tragic story about a couple that had died from leaking gases used in early refrigerators.

Einstein collaborated with fellow physicist Leo Szilard and they received patent #1,781,541 for an improved, safer refrigerator. Although they later sold it to Electrolux for 3,150 DM ($10,000), Einstein’s basic motive was to simply improve living standards for common people. BTW, he later invented a hearing aid for the same reason.

When Szilard moved to London, he ran across a book by H.G. Wells, The World Set Free, which describes an invention that could accelerate radioactive decay, producing bombs that "continue to explode for days on end." This inspired Szilard to conceive of a nuclear chain reaction in 1933; years later, he patented the idea of a nuclear reactor with the famous Italian physicist Enrico Fermi. In effect, he held a patent on the idea at the heart of the atomic bomb.

But, in 1936 Szilard sold/assigned his chain-reaction patent to the British Admiralty to ensure its secrecy from the Germans or others considered untrustworthy.

He later suspected the Germans had a clandestine nuclear weapon project and on the eve of World War II drafted a letter to FDR to alert him to the potential development “of extremely powerful bombs of a new type.” He got Einstein to endorse it and to urge the United States to begin similar research.

This eventually led to the Manhattan Project, which preempted the Germans and, in the eyes of most experts, saved the world.

Although Einstein supported the development of nuclear weapons to defend the Allies, he denounced their use as an offensive force. And he never abandoned his pacifism, or his agnosticism.

In 1999, Time magazine named Albert Einstein their choice as “Person of the Century.”

I hope we get one for this century … soon.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is President and CEO of Frito-Lay International [retired] and earlier served as Chairman and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

The Blunder Before the Genius

This inscribed photograph of Albert Einstein, taken during his first visit to America, realized $26,290 at a February 2010 Heritage auction.

By Jim O’Neal

Albert Einstein called it “the greatest blunder of my life.”

Since he was not a cosmologist, he had accepted the prevailing wisdom that the universe was both fixed and eternal. As a result, when he was formulating his general theory, he dropped into his equations something called the "cosmological constant," designed to arbitrarily counter the effects of gravity and hold the universe static.
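
In modern notation (a sketch added here for context, not taken from the post), the cosmological constant Λ appears as an extra term bolted onto the field equations of general relativity:

\[
G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
\]

With Λ tuned just so, the equations allow a static universe; drop the term, and the model universe must either expand or contract.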

Typically, history books tend to forgive Einstein for this lapse but in reality, it was a terrible piece of scientific work … and he knew it.

Fortunately, Vesto Slipher at the Lowell Observatory in Arizona was taking spectrographic readings of distant spiral nebulae – what we now know to be galaxies – and noticed a Doppler shift. The light from most of them was shifted toward the red, meaning they were moving away from Earth, which implied an expanding universe. This was simply astounding and reversed all conventional thinking about a fixed universe.
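
For reference (my addition, using the standard definitions rather than anything in the original post), the shift Slipher measured is quantified as a redshift z, which for modest values translates directly into a recession velocity:

\[
z = \frac{\lambda_{\text{obs}} - \lambda_{\text{emit}}}{\lambda_{\text{emit}}}, \qquad v \approx c\,z \quad (z \ll 1).
\]

Hubble's later step was to pair such velocities with distance estimates, yielding the linear relation \(v = H_{0} d\) that sealed the case for expansion.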

Unfortunately, Edwin Hubble took all the credit for this remarkable discovery. It is what propelled him into becoming the most outstanding astronomer of the 20th century. (Maybe more on him later since he had such an inflated view of his importance.)

Now, however, flash back to a young Einstein and we find he was a mere assistant clerk in the Swiss patent office. He had no university affiliations, no access to a lab and only a modest library at the patent office.

He had been rejected for an assistant teaching position and was passed over for promotion until “he learned more about machine technology.”

He had a lot of spare time, which he used to gaze out his window and just think.

Then in 1905, he published a series of five scientific papers, of which three, according to C.P. Snow, “were among the greatest in the history of physics.”

The first would earn Einstein a Nobel Prize. The second provided proof that atoms DID exist – a fact that had been in dispute.

The third simply changed the world.

To learn more, read “Einstein: His Life and Universe” by Walter Isaacson (2007).

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is President and CEO of Frito-Lay International [retired] and earlier served as Chairman and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].