Here’s Why I Admire Helen Keller, Sir Christopher Wren, Mark Twain and Doctor Who

Peter Cushing starred in Dr. Who and the Daleks, a 1965 movie based on the TV series. A British “quad” poster for the film sold for $3,585 at a July 2017 Heritage auction.

By Jim O’Neal

Doctor Who was a popular sci-fi TV series in Britain that originally ran from 1963 to 1989 on the BBC. Myth has it that the first episode was delayed for 80 seconds due to an announcement of President Kennedy’s assassination in Dallas. We had the opportunity to watch a 1996 made-for-TV movie in London that co-starred Eric Roberts (Julia’s older brother). Alas, it failed to generate enough interest to revive the original Doctor Who series (at least until a new version was launched in 2005).

A 1982 episode from the first run of the show is still popular since the story claimed that aliens were responsible for the Great Fire of London in 1666 and mentioned Pudding Lane. Ever curious, I drove to Pudding Lane, a rather small London street where the fire began at Thomas Farriner’s bakery shortly after midnight on Sunday, Sept. 2, before raining terror down on one of the world’s great cities.

Pudding Lane also holds the distinction of being one of the first one-way streets in the world. Built in 1617 to alleviate congestion, it reminds one just how long Central London has been struggling with this issue that plagues every large city. Across from the bakery site is a famous landmark monument built in memory of the Great Fire. Not surprisingly, it was designed by the remarkable Sir Christopher Wren (1632-1723).

Wren was an acclaimed architect (perhaps the finest in history) who helped rebuild London under the patronage of King Charles II. This was no trivial task since 80 percent of the city was destroyed, including many churches, most public buildings and private homes … up to 80,000 people were rendered homeless. Even more shocking is that this disaster closely followed the Great Plague of 1665, when as many as 100,000 people died. A few experts have suggested that the 1666 fire and massive rebuilding helped the disease-ridden city by eliminating the vermin still infesting parts of London.

One of Wren’s most famous projects was the rebuilding of St. Paul’s Cathedral, perhaps the most recognizable sight in London to this day. Many high-profile events have been held there, including the funerals of Prime Ministers Winston Churchill and Margaret Thatcher, jubilee celebrations for Queen Victoria and Queen Elizabeth II, and the wedding of Prince Charles and Lady Diana … among many others.

Wren’s own tomb is in St. Paul’s Cathedral. It is truly a magnificent sight to read his epitaph:

“Here in its foundations lies the architect of this church and city, Christopher Wren, who lived beyond ninety years, not for his own profit but for the public good. Reader, if you seek his monument – look around you. Died 25 Feb. 1723, age 91.”

In addition to his reputation as an architect, Wren was renowned for his astounding work as an astronomer and was a co-founder of the elite Royal Society, where he discussed anything scientific with the likes of Sir Isaac Newton, Blaise Pascal, Robert Hooke and, importantly, Edmond Halley of comet fame. Halley’s Comet is the only known short-period comet that is regularly (every 75-76 years) visible to the naked eye. It last appeared in the inner solar system in 1986 and will return in mid-2061.

Samuel Langhorne Clemens (aka Mark Twain) was born shortly after the appearance of Halley’s Comet in 1835 and predicted he “would go out with it.” He died in 1910, one day after the comet reached perihelion on its return … presumably to pick up another passenger. We all know about Twain, Tom Sawyer and Huckleberry Finn. But far fewer know about his unique relationship with Helen Keller (1880-1968). She was a mere 14 when she met the world-famous Twain in 1894.

They became close friends and he arranged for her to go to Radcliffe College of Harvard University. She graduated in 1904 as the first deaf and blind person in the world to earn a Bachelor of Arts degree. She learned to read English, French, Latin and German in braille. Her friend Twain called her “one of the two most remarkable people in the 19th century.” Curiously, the other candidate was Napoleon.

I share his admiration for Helen Keller.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Here’s Why Linus Pauling is Among Our Greatest Scientists

A lot that included Linus Pauling’s signature was offered in January 2017.

By Jim O’Neal

Serious writers about Albert Einstein almost invariably include two episodes in his life. The first is the year 1905, when he published four stunning scientific papers. The first explained how to measure molecules in a liquid; the second explained how to determine their movement. The third was a revolutionary concept that described how light rays come in packets called photons. The fourth merely changed the world!

A second highlight deals with a “fudge factor” Einstein (1879-1955) called the “cosmological constant,” whose only purpose was to cancel out the troublesome cumulative effects of gravity on his masterful general theory of relativity. He would later call it “the biggest blunder of my life.” Personally, I prefer a much simpler observation that perfectly captures his nonchalance. The poet Paul Valéry (1871-1945) once asked him if he had a notebook to keep track of all his ideas. A rather amused Einstein quickly replied, “Oh, no. That’s not necessary. It is very seldom I have one.”

History is replete with examples of people who had a good year. For Yankees great Joe DiMaggio, it was 1941, when he hit in 56 consecutive games; for Babe Ruth, it was 1927, when he hit 60 home runs. For Bobby Jones, it was 1930, when he won all four of golf’s major championships. Some people have good days, like Isaac Newton when he observed an apple falling from a tree and instantly conceptualized his theory of gravity.

Linus Pauling was different. His entire life was filled with curiosity, followed by extensive scientific research to understand whatever had provoked him to wonder why. Pauling was born in 1901. His father died in 1910, leaving his mother to figure out how to support three children. Fortunately, a young school friend got an inexpensive chemistry set as a gift and that was enough to spark Pauling’s passion for research. He was barely 13, but the next 80 years were spent delving into the world of the unknown and finding important answers to civilization’s most complex issues.

He left high school without a diploma (he was two credits short and a teacher wouldn’t let him make them up), but then heard about quantum mechanics and in 1926 won a Guggenheim Fellowship to study the subject under top physicists in Europe. (He was eventually given an honorary high school diploma … after he won his first Nobel Prize.) By the late 1920s and early 1930s, Pauling was busy cranking out a series of landmark scientific papers explaining the quantum-mechanical nature of chemical bonds that dazzled the scientific community.

Eventually, he returned to the California Institute of Technology (with his honorary high school diploma) to teach the best and brightest of that era. Robert Oppenheimer (of the Manhattan Project) tried to recruit him to help build the atomic bomb, but failed (presumably because he had also tried to seduce Pauling’s wife). However, Pauling did work on numerous wartime military projects … explosives, rocket propellants and an armor-piercing shell. It’s a small example of how versatile he was. In 1948, President Truman awarded him the Presidential Medal for Merit.

In 1954, he won the Nobel Prize in Chemistry for his research on the chemical bond and its application to the elucidation of the structure of complex substances … which I shall not try to explain. And along the way, he became a passionate pacifist, joining the Emergency Committee of Atomic Scientists, chaired by Einstein, in an effort “to warn the people of the pending dangers of nuclear weapons.” His reward was to be called a communist; he had his passport revoked and his patriotism challenged, along with many others, in the dark days of McCarthyism.

In 1958, he petitioned the United Nations, calling for the cessation of nuclear weapons testing. The petition was signed by his wife and more than 11,000 scientists from some 50 countries. First ban the bomb, then ban nuclear testing, followed by a global treaty to end war, per se. He received a second Nobel Prize, the Peace Prize, in 1963 for his campaign against nuclear weapons testing, making him one of only four people to win more than one prize, a group that includes Marie Curie (physics in 1903 and chemistry in 1911). His other awards are far too numerous to mention. As an aside, he died in one of my favorite places: Big Sur, Calif., at age 93.

Sadly, in later life, his reputation was damaged by his enthusiasm for alternative medicine. He championed the use of high-dose vitamin C as a defense against the common cold, a treatment subsequently shown to be ineffective (though there is some evidence it may shorten the length of colds). I still take it, and I see scientific articles more and more frequently about infused vitamin C being tested in several cancer trials.

If he were still working on it, let’s say the smart money would be on Pauling.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

As Sir Isaac Newton Noted, Vast Oceans of Truth Lie Undiscovered

Sir Isaac Newton’s autograph was among a group of three signatures by famous scientists that sold for $4,750 at a January 2017 Heritage auction.

By Jim O’Neal

Charles Eliot was president of Harvard College from 1869 to 1909, taking charge at the surprisingly young age of 35. He made some surprising statements, too, starting with his inauguration speech. He matter-of-factly observed that the world knew very little about the natural mental capacities of women. Further, he plainly stated the university was not the place to be experimenting with that notion, and that women would not be admitted to the regular college. He was also concerned about women living near the university and the obvious implications of their proximity to male students. It was his firm belief that once society resolved issues of inequality, perhaps the issue would become clearer. However, even long after his retirement, he maintained his doubts, since women were “simply physically too fragile.”

Another insight into his perspective occurred when the school’s baseball team had a winning season. When he learned that one of the factors that contributed to this success was the use of the curve ball, he opined this was a skill that was surely unethical and certainly not appropriate for Harvard players.

Fortunately, this was not a systemwide ethos, and he may have been unaware that one of his professors, Edward Charles Pickering (director of the Harvard College Observatory), fired his entire staff of men due to their apparent inability to keep up with all the data that was routinely generated. Instead, he simply hired his maid/housekeeper to handle the numbers and eventually brought on 80-plus women, who became better known as the Harvard Computers.

One of these women was a little-known Radcliffe College graduate named Henrietta Swan Leavitt, who was “allowed” to measure the brightness of stars using the observatory’s photographic plates (women were not allowed to actually operate the telescopes). Leavitt devised a novel way to measure how far away certain stars were, expressing the values in “standard candles,” a term still in common use today. Another of the computers, Annie Jump Cannon, created a new system of stellar classifications. Together, their inferences would prove invaluable in answering two critical questions about the universe: How old is it and how big?

The man who came up with the answers using their inferences was lawyer-turned-astronomer Edwin Powell Hubble. He was born in 1889 and lived until 1953. When he sat down to peer through the Mount Wilson (Calif.) Observatory’s 100-inch Hooker telescope in 1917 (the world’s largest until 1949), there was exactly one known galaxy: our lonely little Milky Way. Hubble not only proved that the universe consisted of additional galaxies, but also that the universe was still expanding. How much credit should go to the Harvard Computers is still an area of contention, though the dispute is only over the degree to which their work contributed to these new discoveries.

Hubble was a handsome star athlete who won seven high school track events in one day and was also a skilled boxer. He never won a Nobel Prize, but won everlasting fame when NASA named its long-overdue space telescope in honor of his scientific contributions. The Hubble Space Telescope was launched into orbit on April 24, 1990, by the Space Shuttle Discovery. Since then, it has been repaired and upgraded five times by American astronauts, and the results have been nothing short of remarkable. NASA is now confident that Hubble’s replacement, due in four to five years, will be capable of looking far enough back into deep space to see nearly to the big bang that started everything 13.8 billion years ago.

For perspective, consider what has been learned since Sir Isaac Newton was born on Christmas Day in 1642, when the accepted theory was a heliocentric model of the universe with Earth and the other planets orbiting our Sun. It was similar to what Copernicus published at the end of his life in 1543. Since then, all the great scientific minds have been focused on our galaxy, trying to prove the laws of motion, the theory of light and the effects of gravity on what they believed was the entire universe. All big, important concepts (but only as they relate to our infinitesimal little piece of real estate). And then along came quantum mechanics, which added the world of the very small with its atoms, electrons, neutrons and other smaller pieces too numerous to be named as we bang particles into each other to see what flies off.

What Hubble has gradually exposed is that we have been gazing at our navel while the sheer magnitude of what lies “out there” has grown exponentially – and there may be no end as the vastness of an expansionary universe continues to speed up. I seem to recall that some wild forecasters thought there might be as many as 140 billion galaxies in the universe. Now, thanks to Hubble and lots of very smart people, the number of galaxies may be 2 trillion! If they each average 100 billion stars, that means the number of stars is now 200 sextillion – or the number 2 followed by 23 zeros.

That is a big number, but what if we are living in a multiverse with as many as 11 more universes?

I recall a quote by Sir Isaac Newton: “To myself I am only a child playing on the beach, while vast oceans of truth lie undiscovered before me.”

Aren’t we all.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Here’s Why Scientists Like Joseph Lister Have Made Life Better for All of Us

A March 25, 1901, letter signed by Joseph Lister went to auction in October 2014.

By Jim O’Neal

In the 1880s, American physicist Albert Michelson embarked on a series of experiments that undermined a long-held belief in a luminiferous ether that was thought to permeate the universe and affect the speed of light ever so slightly. Embraced by Isaac Newton (and almost venerated by all others), the ether theory was considered an absolute certainty in 19th century physics in explaining how light traveled across the universe.

However, Michelson’s experiments (partially funded by Alexander Graham Bell) proved the exact opposite of the theory. In the words of author William Cropper, “It was probably the most famous negative result in the history of physics.” The fact was that the speed of light was the same in all directions and in every season – overturning an assumption, dating back to Newton, that had stood for the past 200 years. But not everyone agreed for a long time.

The more modern scientist Max Planck (1858-1947) helped explain this resistance to accepting new facts in a rather novel way: “A scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die and a new generation grows up that is familiar with it.”

Even if true, that makes it no easier to accept the fact that the United States was the only nation “that remained unconvinced of the merits of Joseph Lister’s methods of modern antiseptic medicine.” In fact, Henry Jacob Bigelow (1818-1890), the esteemed Harvard professor of surgery and a fellow of the American Academy of Arts and Sciences, derided antisepsis as “medical hocus-pocus.” This is even more remarkable when one considers he was the leading surgeon in New England and his contributions to orthopedic and urologic surgery are legendary.

But this short story begins with a sleight of hand by asking: In the 19th century, what do you think was the most dangerous place in the vast territories of the British Empire? The frozen wastes of the Northwest Passage or the treacherous savannas of Zululand? Or perhaps the dangerous passes of the Hindu Kush? The surprising answer is almost undoubtedly the Victorian teaching hospital, where patients entered with a trauma and exited to a cemetery after a deadly case of “hospital gangrene.”

Victorian hospitals were described as factories of death, reeking with an unmistakable stench resembling rotting fish, cheerfully described as “hospital stink.” Infectious wounds were considered normal or beneficial to recovery. Stories abound of surgeons operating on a continuous flow of patients, and bloody smocks were badges of honor or evidence of their dedication to saving lives. The eminent surgeon Sir Frederick Treves (1853-1923) recalled, “There was one sponge to a ward. With this putrid article and a basin of once clear water, all the wounds in the ward were washed twice a day. By this ritual, any chance that a patient had of recovery was eliminated.”

Fortunately, Joseph Lister was born in 1827 and chose the lowly, mechanical profession of surgery over the more prestigious practice of internal medicine. In 1851, he was appointed one of four residents in surgery at London’s University College Hospital. The head of surgery was wrongly convinced that infections came from miasma, a peculiar type of noxious air that emanated from rot and decay.

Ever skeptical, Lister scraped the rotten tissue out of gangrenous wounds and applied mercury pernitrate to the healthy tissue beneath. Thus began Lister’s lifelong journey to investigate the cause of infection and its prevention through modern techniques. He spent the next 25 years in Scotland, becoming the Regius Professor of Surgery at the University of Glasgow. After Louis Pasteur confirmed that germs, rather than bad air, caused infections, Lister discovered that carbolic acid (a derivative of coal tar) could prevent many amputations by cleaning the skin and wounds.

He then went on the road, advocating his gospel of antisepsis, which was eagerly adopted by the scientific Germans and some Scots, though plodding and practical English surgeons took much longer. That left the isolated Americans who, like Dr. Bigelow, were too stubborn and unwilling to admit the obvious.

Planck was right all along. It would take a new generation, but we are the generation that has derived the greatest benefits from the astonishing medical breakthroughs of the 20th century, which only seem to be accelerating. It is a good time to be alive.

So enjoy it!

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

It’s a Long Way from Hot Air Balloons to Lethal Drones

An 1887 complete set of 25 Lone Jack “Inventors and Inventions” cards, featuring Michael Faraday, sold for $3,107 at a November 2011 Heritage auction.

By Jim O’Neal

Wars fought from the air can be traced back to the creation of rubber balloons, first made by Professor Michael Faraday in 1824 for use in his experiments with hydrogen at the Royal Institution in London (where he would later become the first Fullerian Professor of Chemistry). However, it was his work with electricity and magnetism that earned him greater fame. Virtually all electricity produced today is based on Faraday’s principles, whether it is generated from coal, oil, gas, nuclear, hydro or wind.

Faraday (1791-1867) declined an offer of knighthood since he believed the Bible forbade earthly accumulations of wealth or honor and stated that he “preferred to be plain Mr. Faraday to the end.” He also declined to assist the British government with the production of chemical weapons, on ethical grounds, for the Crimean War (1853-1856) … a position the current government now apparently agrees with given recent activities in Syria.

I can attest to the many honorific symbols scattered around greater London today, and Albert Einstein kept a picture of Faraday on his wall, next to one of Isaac Newton, to acknowledge their enormous contributions to the extension of electromagnetism in space. Not bad for “plain” Mr. Faraday.

His work on balloons had been preceded by that of the French Montgolfier brothers, who in 1783, while perched on a hillside watching a bonfire, reportedly had this exchange:

“I wonder what makes the smoke go up.”

“Perhaps warm air is lighter and the cold air pushes it up.”

“Then if we filled a bag with hot air, it would fly!”

Aeronautics was born.

Then on June 18, 1861, a stunned audience in Washington watched a giant balloon, the Enterprise, rise 500 feet. A man aboard sent a telegram to President Lincoln … “Sir: From this point of observation we command an area of nearly 50 miles in diameter. I have the pleasure of sending you this first telegram ever dispatched from an aerial station… T.S.C. Lowe”

This was a prelude to the short-lived formal use of aerial observation by the Armed Forces. The first balloon bought for the American military was an $850 model of raw India silk built by John Wise of Lancaster, Pa. Both sides in the Civil War were basically incapable of using balloons for anything more than observation of troop positions, since any kind of armament was simply too heavy to be carried aloft. Aerial photography service was offered but never acted on. After viewing the First Battle of Bull Run, Lowe and other balloonists formed the Union Army Balloon Corps, which was disbanded in August 1863. Confederate efforts were even more modest, and legend has it that (sadly) the very last silk dress in the entire Confederacy was used to try to make a balloon.

Then a man by the name of Billy Mitchell enlisted as a private in the Spanish-American War, where he became a member of the U.S. Army Signal Corps. Subsequently, he served in France during World War I and ultimately came to be regarded as the father of the U.S. Air Force. It was his stubborn insistence that “the day has passed when armies on the ground or navies on the sea can be the arbiter of a nation’s destiny in war. The main power of defense and the power of initiative against an enemy has passed to the air” (November 1918).

That and a statement accusing senior leaders in the Army and Navy of incompetence and “almost treasonable administration of national defense” got him court-martialed. The court, which included Major General Douglas MacArthur as one of the judges, found him guilty on Dec. 17, 1925, and suspended him for five years. Ironically, MacArthur suffered a similar fate decades later for challenging conventional military wisdom.

We are now in the era where wars are fought using lethally armed unmanned aerial vehicles (UAVs) piloted from remote locations, and cruise missiles launched from ships up to 1,500 miles away. I was hoping we had the cyber-technology to destroy an enemy’s power infrastructure, disable their communications and simply render their offensive and defensive capabilities useless.

Maybe that’s only feasible for presidential elections using Facebook.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Luftwaffe’s Incendiary Bombs Devastated British Treasures

A first edition of John Dalton’s A New System of Chemical Philosophy (Manchester: S. Russell, 1808-10) sold for $7,812.50 at an October 2013 Heritage auction.

By Jim O’Neal

“Peace for our time” was the phrase British Prime Minister Neville Chamberlain proudly announced after signing the Munich Pact in 1938. This agreement effectively conceded the annexation of the Sudetenland regions of Czechoslovakia to Nazi Germany in the hope it would quell Adolf Hitler’s appetite for European expansion. Today, it is universally regarded as a naive act of appeasement; Germany invaded Poland the following year.

A full year before war broke out, the British Museum had located a deserted, remote mine in which to store its priceless treasures. Other institutions like the Victoria and Albert Museum and the National Gallery joined in by relocating historic records, manuscripts and artwork. Steel racks were constructed to store boxes and other containers, while shelves were hollowed out of solid rock walls. Special consideration was given to maintaining proper humidity, temperature and atmospheric pressure. It turned out to be a prudent strategy.

However, despite all the frenzied planning, once the bombing started, there were simply too many British libraries to protect and the Germans were using special incendiary bombs designed to ignite buildings rather than destroy them. The effect was devastating and before the war ended more than one million rare volumes were destroyed.

One particularly perplexing example was the remarkable library of the Manchester Literary and Philosophical Society (the famous “Lit & Phil”), one of England’s oldest scientific societies. Alas, the losses included much of the work of one of the most fascinating and least-known scientists, John Dalton.

Dalton

Dalton was born in 1766 and was so exceptionally bright that he was put in charge of his Quaker school at the improbable age of 12. He was already reading one of the most difficult books to comprehend – Sir Isaac Newton’s Principia – in the original Latin! Later, at Manchester, he was an intellectual whirlwind, producing books and papers ranging from meteorology to grammar. But it was a thick tome titled A New System of Chemical Philosophy that established his lasting reputation. In a short chapter of just five pages (out of 900), people of learning first encountered atoms in something approaching their modern conception. His astounding insight was that at the root of all matter are exceedingly tiny, irreducible particles. Today, we call them atoms.

The great physicist Richard Feynman famously observed that the most important scientific knowledge is the simple fact that all things are made of atoms. They are everywhere and they constitute everything. Look around you. It is all atoms … and they are in numbers you really can’t conceive.

When Dalton died in 1844, about 40,000 people viewed the coffin and the funeral cortège stretched for two miles. His entry in the Dictionary of National Biography is one of the longest, rivalled only by those of Charles Darwin and a few others.

Shame on the Luftwaffe for destroying so much of his original work. It is somehow comforting to know those papers weren’t truly bombed out of existence, since their atoms are now merely part of something else … somewhere in our universe.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chairman and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Darwin Asked Basic Questions and Changed How We Look at Life

A first edition of Charles Darwin’s On the Origin of Species realized $83,500 at an April 2012 Heritage auction.

By Jim O’Neal

Charles Darwin is a rich source of interesting facts and one finds him in the most unusual of places. As the most versatile scientist of the 19th century, he originally intended to follow his father into medicine and was subsequently sent to Cambridge to train as an Anglican cleric. Endlessly curious, he was interested in almost any scientific question.

The publication of his book On the Origin of Species (1859) introduced a new understanding of what gradually came to be known as evolution. In it, he asked fundamental questions. The world teems with plant and animal life. Where had it all come from? How had it been created?

Darwin was far from the first to propose that a process of change over vast periods had produced this diversity, but he was the first to suggest an explanatory mechanism, which he called “natural selection.” At the core of Darwin’s idea was that all animal life derived from a single, common ancestor – that the ancestors of all mammals, humans included, were fish, for example. And in a natural world that was relentlessly violent, only those able to adapt would survive, in the process evolving into new species.

Charles Darwin

Darwin was honored many times in his lifetime, but never for On the Origin of Species or for The Descent of Man. When the Royal Society bestowed on him the prestigious Copley Medal, it was for his geology, zoology and botany work – not for his evolutionary theories. And the Linnean Society was also pleased to honor him, without a mention of his radical scientific work on evolutionary themes. His theories didn’t really gain widespread acceptance until the 1930s and 1940s with the advance of a refined theory called the Modern Synthesis, which combined his work with that of others.

He was never knighted, although he was buried in 1882 in Westminster Abbey – next to Sir Isaac Newton.

This seems exceptionally fitting given the combined versatility of these two remarkably gifted men with voracious appetites for knowledge. Surely, they must have found a way to communicate with each other after all this time. What a conversation to eavesdrop on!

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chairman and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].