King Charles’ Baker Reminds Us that Small Things Can Lead to Huge Events

This “Elephant & Castle” 1/2 Crown, showing British King Charles II and minted 15 years after the Great Fire of London, realized $35,250 at a September 2014 auction.

By Jim O’Neal

Thomas Farynor, baker for England’s King Charles II, usually doused the fires in his oven before going to bed. But on the night of Saturday, Sept. 1, 1666, he forgot, and at 2 a.m. he was awakened by fire engulfing his house.

Farynor lived on Pudding Lane near Thames Street, a busy thoroughfare lined with warehouses that ran along the river wharves. It was typical of London streets … very narrow and crammed with houses made of timber.

As the flames spread and people awoke and started scrambling to escape, nearby Fish Street Hill exploded into fire as piles of straw were ignited.

Samuel Pepys climbed to the top of the Tower of London to get a better view. At 7 a.m., he described how an east wind suddenly turned into a gale and whipped the fire into a raging conflagration. The Great Fire of London was out of control.

As early as 1664, writer John Evelyn had warned of the danger of such an event due to so many open fires and furnaces in such a “wooden … and inartificial congestion of houses on either side that seemed to lean over and touch each other.” Everyone was too busy to worry about it.

There were fire engines for emergencies, but they were rudimentary and privately owned. There was no official London fire brigade. In the chaos, any pumps that did get into service were hampered by large crowds clogging the streets dragging furniture in a vain attempt to salvage valuables.

The other strategy was fire breaks, which consisted of pulling down buildings with huge iron hooks and quickly clearing the debris to create barren areas. However, the fire was moving so quickly that it blazed through the debris before it could be cleared.

Back at the Tower of London, Pepys observed “an infinite great fire headed right at London Bridge.”

London Bridge spanned the Thames River and was an extraordinary structure … lined with homes and shops separated by a passageway only a few yards wide. The fire attacked the bridge greedily, leaping from rooftop to rooftop as people frantically fled.

By Sunday evening, boats carrying people swarmed across the river where onlookers lined the shore mesmerized by the enormous blaze.

On Monday, a powerful wind drove the fire through London. Houses, churches and buildings were all consumed as the blaze continued to rage. An East India warehouse full of spices blew up and the smoke carried the smell of incense across the city.

Finally, by Wednesday, the wind subsided and 200,000 Londoners looked in astonishment at their great city, now turned to ash … 13,000 houses, 87 churches, St. Paul’s Cathedral, the Royal Exchange, the Customs House, all the city prisons and the Great Post Office were destroyed.

The mystic Anthony Wood said, “All astrologers did use to say Rome would have an end and the Antichrist come, 1666, but the prophecie fell on London.”

All because a baker forgot to put out his oven.

We all know what Smokey the Bear would say.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is President and CEO of Frito-Lay International [retired] and earlier served as Chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

George Washington and That Unhappy Affair at Trenton

Emanuel Leutze’s Washington Crossing the Delaware resides in the Metropolitan Museum of Art in New York.

By Jim O’Neal

In New York’s Metropolitan Museum of Art hangs Emanuel Leutze’s Washington Crossing the Delaware, painted in 1851. It is one of the most famous pieces of American art and purports to depict the Dec. 25, 1776, crossing.

It is also infamous for a number of factual errors.

For example, it shows the crossing with a glowing horizon, when it actually happened in the middle of a dark, sleety night. The American flag is also wrong, since the Stars and Stripes did not exist at the time. Even the ice floes are wrong.

Despite these errors – and many more – it is considered memorable because it captures the determination, desperation and dignity of these men as they rowed into the fight of their lives.

The American Revolution had begun in April 1775 with skirmishing near Boston, followed by full-scale war. The Continental Army was pushed out of New York and into New Jersey and then Pennsylvania.

By December, half of Washington’s army had been killed, wounded or captured, which left 5,000-6,000 (including the injured).

British General William Howe planned to finish the job once the Delaware River froze, when he could capture the capital of Philadelphia and end the war. Instead, Washington began crossing at midnight, and at 8 a.m. he divided his troops and attacked Trenton, catching the garrison by surprise.

Everywhere, groups of Hessians were surrounded by Continental troops with fixed bayonets and they “struck their colors” (surrendered). Of the 1,500, about 900 were captured, 400 escaped and the rest killed or wounded.

Along with the prisoners, Washington captured six artillery pieces, 1,000 muskets and seven wagonloads of powder and ammunition. These supplies were badly needed and helped against counterattacks at Princeton on Jan. 2 and 3.

Though the triumph at Trenton was followed by greater battles, it was pivotal. Later, British Secretary of State Lord George Germain said, “All our hopes were dashed by that unhappy affair at Trenton.”

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is President and CEO of Frito-Lay International [retired] and earlier served as Chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Now that it’s Behind Us, Let’s Examine the Violent History of Valentine’s Day

Saturday Evening Post illustrator Edmund F. Ward (1892-1990) completed his own version of the St. Valentine’s Day Massacre. This oil on canvas went to auction in August 2010.

By Jim O’Neal

On Feb. 14, 1929, two uniformed policemen and two men in business suits entered a Chicago-area garage, lined up seven men against a wall and killed them. Two of the men used Thompson machine guns (“Tommy Guns”) and the deceased were members of the George “Bugs” Moran gang.

The two policemen marched the shooters out of the building and witnesses just assumed they were part of a rival gang that had been swiftly apprehended by the “heat.” In fact, all four were part of the Al Capone crew that routinely eliminated local competition.

Due to the date, the gangland killing was quickly dubbed “The St. Valentine’s Day Massacre” and 87 years later, its notoriety still persists. Primarily, this is due to the numerous movies, television shows and books based on the event. My personal favorite is the 1967 film The St. Valentine’s Day Massacre, starring Jason Robards, Bruce Dern, George Segal and John Agar (Shirley Temple’s troubled first husband).

Ironically, the first Valentine also died a violent death on Feb. 14, around 270 AD, during the reign of Roman Emperor Claudius II, who was involved in many unpopular and bloody military campaigns. As a result, he had difficulty recruiting soldiers because of their strong attachment to wives and children.

In a questionable effort to solve this chronic issue, “Claudius the Cruel” banned all wedding engagements in Rome. Valentine (granted sainthood posthumously) defied the emperor and continued performing marriages in secret ceremonies.

When discovered, he was arrested and dragged before the Prefect of Rome, who promptly condemned him to be beaten to death with clubs and then beheaded. Legend has him leaving a note to his jailer’s daughter signed “From your Valentine.”

In truth, there are several legends associated with various Valentines through history. According to the Catholic Encyclopedia, at least three different Valentines, all of them martyrs, are mentioned under the date of Feb. 14; one was a priest in Rome, the second a Bishop of Interamna (now Terni, Italy), and the third Valentine a martyr in the Roman province of Africa.

There is also uncertainty over how the martyrs’ names became connected with romance. However, we do know that Pope Gelasius decided to end pagan festivals of love and declared that Feb. 14 simply be celebrated as St. Valentine’s Day. Gradually, the practice of love letters, poems and flowers found their way back in.

Finally, in 1969, the Catholic Church discontinued liturgical veneration of him (or them?), although the name remains on the list of recognized saints. (Note: I’ve run across a dozen St. V’s and even a Pope Valentine.)

An incontrovertible fact is that St. Valentine is the patron saint of beekeepers and epilepsy.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is President and CEO of Frito-Lay International [retired] and earlier served as Chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Custer’s Last Stand Marked the Beginning of the End for American Indian Warriors

This large half-plate ambrotype of George Armstrong Custer was taken circa September 1863 by William Frank Browne. It realized $83,650 at a December 2012 Heritage auction.

By Jim O’Neal

T.J. Stiles’ new book Custer’s Trials: A Life on the Frontier of a New America is being praised for the author’s ability to cut through decades of “revisionist baggage,” change the camera’s angle and examine Custer’s life as actually lived … to better gauge the man, his times and his “larger meaning” (whatever that means).

I’m a skeptic, but since Stiles’ biography on Cornelius Vanderbilt was brilliant (winning the Pulitzer Prize and National Book Award), I will probably Kindle it anyway.

What I know is that on the morning of June 25, 1876, Lieutenant Colonel George Armstrong Custer and 210 members of the 7th Cavalry (including two brothers) were killed by the Lakota Sioux and Northern Cheyenne.

Custer was born in Ohio in 1839, was lucky not to be expelled from West Point (he finished last in his class of 34 cadets) and had a decent career in the Civil War. He was probably indifferent to the issue of slavery and appears to have been the type who thrived on war … like so many others of that period.

He undoubtedly loved being called “The Boy General.” With his long, blond hair, he was “the synonym of dashing gallantry and unfaltering fidelity” – at least according to The New York Times.

A failed business speculator, Custer found that war was a perfect fit for his ambition, and many wondered what he might do if he survived. The answer was quite simple: more war.

But this time, the Plains Indians were aggressively defending the land ceded to them by the Second Treaty of Fort Laramie.

In return for a cessation of attacks against miners and other settlers, the federal government gave the Sioux much of western South Dakota and eastern Wyoming. They also pledged to keep others away from the Sioux’s sacred Paha Sapa, or Black Hills.

However, in 1874 gold was discovered in the Black Hills, and soon a horde of 15,000 miners swarmed the territory. President Grant sent troops to push the Indians farther west, and this put the Sioux on a direct collision course with Custer.

The Battle of the Little Big Horn, or “Custer’s Last Stand,” is now legendary, but the larger point is that this single event marked the beginning of the end not just for the thousands of Sioux warriors involved, but for all of the Indian peoples of America.

Following the defeat, public outcry turned Custer into a martyr whose spilt blood had to be avenged. An expanded Army fiercely hounded the great Sioux leader Sitting Bull (he escaped to Canada) and his people. Most of the Sioux surrendered and ended up on reservations.

Within 15 years of Custer’s death, the battles had all faded into legend … waiting patiently to be revived by filmmakers, biographers and blog writers.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is President and CEO of Frito-Lay International [retired] and earlier served as Chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Edith Bolling Wilson Played the Role of First Woman President Long Before Hillary

This Wilson & Marshall jugate was offered at a June 2015 auction.

By Jim O’Neal

Former President Bill Clinton often reminded voters that if they voted for him, they would get “two for the price of one” – referring, of course, to Hillary. Little mention was made that Al Gore was included in the deal. One assumes that now Hillary has a similar promise tucked away for the appropriate time.

Quite the opposite was true when Woodrow Wilson won the presidency in 1912. His first choice for VP, House Majority Leader Oscar Underwood of Alabama, turned him down, and the delegates chose Thomas Riley Marshall, much to Wilson’s dismay.

Later, he treated his VP with disdain, not terribly uncommon, except in this instance, Wilson unfairly branded him an unworthy featherweight – “A small caliber man” … “brought along to deliver Indiana’s electoral votes and little more.”

Once in Washington, Marshall spoke his mind early and often, but quickly saw it was a waste of time. After Wilson literally forced him to move his office out of the White House, he settled into the tedium of his daily chores and practiced keeping his wit sharp as a well-paid public speaker.

However, he soon decided his role as President of the Senate was his primary constitutional duty and devoted most of his time there. Marshall sincerely believed the office of VP was an extension of the legislature as opposed to the executive branch. On March 8, 1917, he led an effort to impose a rule on senators to end filibusters if two-thirds of voting senators agreed. This helped eliminate anti-war efforts to block supplies for Europe.

One exception came when Wilson was in Europe for the peace negotiations after World War I. VP Marshall became the first to hold Cabinet meetings in the absence of the president. But this was short-lived.

After President Wilson was partially paralyzed and without any doubt incapacitated by a second stroke in October 1919, Vice President Marshall should have moved forcefully to assume the presidency.

He had the backing of Secretary of State Robert Lansing, Cabinet members and Congressional leaders. Instead, he allowed the First Lady, Wilson’s personal physician and Wilson’s cronies to conceal the president’s condition in an elaborate cover-up involving seclusion, forged signatures and false health reports.

This button and ribbon shows Woodrow Wilson and First Lady Edith Bolling Wilson.

And so the man who, as governor of Indiana, had personally laid the final “Golden Brick” to complete the Indy 500 Speedway in 1909 contented himself with press reports, senatorial oversight and some of the most scathingly delightful commentaries and one-liners ever uttered about the office of the vice president.

Voters got “one for the price of two,” but ironically that one was neither President Wilson nor VP Marshall. It was Edith Bolling Wilson – the First Lady and Wilson’s second wife.

P.S. Marshall’s only real claim to fame is the phrase “What this country needs is a good 5-cent cigar.” That line actually originated in Kin Hubbard’s comic strip Abe Martin of Brown County. Marshall saw it, repeated it on the Senate floor and myth became history (once again).

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is President and CEO of Frito-Lay International [retired] and earlier served as Chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

As We Pay Tribute to Scalia, Let’s Recall Landmark Appointment Case

After he was defeated in the 1800 presidential election, John Adams retired to Massachusetts as a gentleman farmer. A letter he wrote and signed 13 years later realized $46,875 at an October 2014 Heritage Auction.

By Jim O’Neal

The landmark case known as Marbury v. Madison arose from the bitter 1800 election, in which Vice President Thomas Jefferson defeated President John Adams, tied with Aaron Burr in the Electoral College, and eventually won in the House of Representatives when Alexander Hamilton swung Federalist support to him on the 36th ballot.

A bitter Adams made a last-minute attempt to pack the judiciary with Federalists by appointing 16 new circuit judges and 42 new Justices of the Peace for the District of Columbia. However, four of the new justices, including William Marbury, did not get their commissions before Adams’ last day in office.

Secretary of State James Madison refused to give the four men their commissions, so Marbury asked the Supreme Court to issue a writ of mandamus ordering Madison to do it. This put Chief Justice John Marshall (newly appointed by Adams) in a delicate situation. If the Supremes issued the writ, Madison might simply refuse and the Court had no means to enforce compliance.

Alternatively, if the Court did not, then he was risking surrendering judicial power to Jefferson and his Democratic-Republican Party (later to become the Democratic Party).

Marshall decided there was no middle ground and that left the choice of either declaring the Constitution to be superior and binding, or allowing the legislature to be an entity of unchecked power. Since the nation had established a written Constitution with fundamental principles to bind it in the future, it had to be both superior and binding law. And if the Constitution was the superior law, then an act “repugnant” must be invalid.

The decision was to discharge Marbury’s action because the Court lacked original jurisdiction: the provision of the Judiciary Act of 1789 on which Marbury based his petition was unconstitutional. The Court found that the Constitution specifically enumerated the cases in which it had original rather than appellate jurisdiction, and it concluded that the statute purporting to authorize writs of mandamus in original-jurisdiction cases was therefore void.

In more recent times, the Court has asserted a broad judicial-review power and the role of ultimate interpreter of the Constitution. Once a law is declared unconstitutional, the courts simply decline to enforce it. Judicial review was once controversial; even Judge Learned Hand felt it was inconsistent with the separation of powers. However, “Marbury” served to make the judiciary equal to the executive and legislative branches.

Most scholars and historians give full credit to Chief Justice Marshall for solidifying this principle of an equal tripartite government structure that has served us well for 200-plus years.

Author Harlow Giles Unger goes even further in his 2014 biography, John Marshall: The Chief Justice Who Saved the Nation, where he claims Marshall turned the Court into a bulwark against presidential and congressional tyranny and saved American democracy.

I tend to disagree, since the process for selecting members has been politicized to the point that the Court seems to be simply an extension of whichever party controls the levers of power. I suspect we will have a chance to see this phenomenon several times in the next four to eight years as turnover increases.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is President and CEO of Frito-Lay International [retired] and earlier served as Chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

 

‘Black Death’ is a Grim Reminder: Never Trust a Dirty Rat

A Folio Society 1999 edition of Philip Ziegler’s The Black Death went to auction in November 2012.

By Jim O’Neal

In five short years beginning in 1347, one-third of Europe – 25 million people – died of the bubonic plague. Many villages and towns lost 80 percent of their populations. A world that had just emerged from the Dark Ages and was moving into a new era was, suddenly, pockmarked with deserted farms, collapsed churches and zombie-like survivors.

Bubonic plague changed world history and mankind in ways that linger to this day. All subsequent epidemics – smallpox, cholera, influenza and AIDS – are grim reminders of the terror of the “Black Death” and the specter of a world strewn with bodies, its people defenseless against an invisible killer.

Although it wasn’t known at the time, the carrier was a rat flea, Xenopsylla cheopis (X. cheopis), a ravenous creature that lived on black rats and other rodents. X. cheopis carried the virulent plague bacillus, and the disease came in two forms, both deadly to humans.

One was bubonic, contracted directly via a flea bite, which was followed by blackish-purple bruising and a mortality rate of 60 percent within as little as five to seven days. The predominant form was pneumonic, which spread from person to person through the air, infecting the lungs and killing within two to three days.

It had started deep in Asia, where China was in a war with the Mongols that devastated great swaths of the countryside. Infected rats, no longer able to find food in the forests, headed to populated areas, where the disease spread rapidly.

By the 1330s, China had lost 35 million people out of 125 million. Then X. cheopis began to travel with traders across Mongolia and Central Asia. In 1345, the plague hit the lower Volga River, followed by the Caucasus and Crimea before finally arriving in Italy in the summer of 1347.

The disease arrived in London in November 1348 and went on to kill one-third to half of the total population; England and Wales then held about 6 million people.

After a quiet winter, it sprang up again in 1349, burning through England to Scotland, leaping to Ireland and crossing the sea to Scandinavia. After devastating Moscow in 1352, it finally exhausted itself on the barren, empty Russian steppes.

The plague returned in 1362 and in numerous smaller recurrences until the 1600s. Another wave swept through Asia in the 19th century, and it was then that the roles of both X. cheopis and the Y. pestis bacterium were discovered.

Although the last plague pandemic was contained, hundreds of plague cases are still reported each year, since X. cheopis survives on wild rodents in remote areas, perhaps with yet another strategy to plague us. Despite the advent of curative antibiotics, the Black Death is still lurking … somewhere.

Do you know what’s in your attic?

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is President and CEO of Frito-Lay International [retired] and earlier served as Chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

‘Liberty, Equality, Fraternity’ was Battle Cry of the French Revolution

This note signed by Marie Antoinette realized $7,170 at an October 2006 auction.

By Jim O’Neal

In 1788, France was ruled by a monarchy, aristocracy and clergy who lived in luxury, while many of the commoners starved. The Storming of the Bastille is now celebrated as the heroic uprising that started the French Revolution.

It occurred on July 14, 1789, and symbolizes the liberation from the French Crown’s oppressive reign of poverty and crushing taxes. When the mob broke through the gates of the infamous jail, the garrison capitulated. But the prison was almost empty. Unknown to the attackers, the government had scheduled the building to be demolished and only six prisoners were left in its cells.

Four of the prisoners were forgers and the other two insane.

Earlier, when King Louis XVI assumed the crown (1774), the country was in a major economic crisis, with a staggering national debt and a declining tax base. The Catholic Church (which owned 10 percent of all land) and the nobility took advantage of tax loopholes, leaving the tax burden to poor urban workers. Apparently, economic inequality is not a new situation.

The incident that sparked the Storming was the dismissal of Finance Minister Jacques Necker, who sympathized with the commoners. At dawn on July 14, the mob broke into the Hôtel des Invalides and captured 28,000 muskets and 10 cannons, but the ammunition had been moved to the Bastille … all 20,000 rounds.

Thus, the Bastille was not only a target for ammunition, it represented a symbol of long-standing autocratic political power and social systems. At 2 p.m., someone opened fire and the mob started pouring in.

Later, as the French Revolution went careening out of control, thousands of nobles were executed on any pretext, and eventually King Louis XVI and his wife, Queen Marie (“Let them eat cake”) Antoinette, were executed. This set off shock waves all over Europe, and nearby nations feared these wildly progressive ideas would spread like wildfire.

During the next decade, France was radically transformed as widespread mob violence ruled. This “Reign of Terror” would forever tarnish the ideals of the French Revolution. Yet today, Bastille Day is celebrated annually as the day the French people won their freedom.

Vive la France!

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is President and CEO of Frito-Lay International [retired] and earlier served as Chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

James Garfield Unique Among American Presidents

This autographed James Garfield cabinet card, dated a month before the president’s assassination, realized nearly $4,500 at a June 2010 auction.

By Jim O’Neal

James Garfield was the last of the Log Cabin Presidents (meaning he was born in one), and in 1880 he was simultaneously a member of the House, a senator-elect and the president-elect. He remains the only person to ever have this unique distinction.

However, he had not gone to the 1880 Republican convention seeking the nomination. Instead, his specific intent was to nominate John Sherman, who was President Rutherford B. Hayes’ Secretary of the Treasury. In fact, Garfield made the formal nominating speech and waited while Ulysses S. Grant and James G. Blaine battled it out. After 35 ballots, Garfield himself became the consensus candidate … and then won the election.

Sherman was eager to become president, but after three failed attempts he gave up. His brother was William Tecumseh Sherman, the general who made the famous “March to the Sea” from Atlanta to Savannah in a scorched-earth (total war) campaign that was devastating to Georgia and the Confederacy. His telegram to Abraham Lincoln on Dec. 25, 1864 – “I beg to present you as a Christmas gift the City of Savannah …” – was the death knell of the Confederacy, which collapsed four months later.

General Sherman was far less political than his brother and at the 1884 convention declared that if drafted he would not run, if nominated he would not accept, and if elected he would not serve. We still hear variations of this declaration today, some 130 years later.

P.S. Garfield was ambidextrous and could write Latin with one hand while writing Greek with the other. Since he favored his left, he is considered the first left-handed president.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is President and CEO of Frito-Lay International [retired] and earlier served as Chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

It’s Been 43 Years Since a Human has Been on the Moon

This Apollo 11-flown U.S. flag on a crew-signed presentation certificate sold for $71,875 at a November 2013 Heritage auction.

By Jim O’Neal

On July 16, 1969, three astronauts lay strapped on their backs in their space module atop a massive Saturn V rocket. Neil Armstrong, “Buzz” Aldrin and Michael Collins were going on a trip into the Florida sky headed for a landing on the moon.

The race to the moon had begun just eight years earlier. On April 12, 1961, Russian cosmonaut Yuri Gagarin became the first person in space and the first to orbit the Earth. That stirred President Kennedy’s competitive juices.

After Gagarin’s 90-minute orbit, JFK wrote to VP Lyndon Johnson – chairman of the National Space Council – asking: “Do we have a chance of beating the Soviets by putting a lab into space, or a trip around the moon … or by a rocket to go to the moon and back with a man?”

At the time, the American space program was not far behind – as Alan Shepard had traveled into space on May 5, 1961 – but lagged in the technology to reach the moon.

The Russians had already succeeded in launching three hard-landing rockets (unmanned spacecraft shot up with a goal of simply hitting the moon) and America was two years away from that.

So after Shepard’s feat, JFK issued his famous challenge while addressing Congress. “I believe this nation should commit itself to achieving the goal, before this decade is out, of landing a man on the moon and returning him safely to Earth.”

Back in Florida, Apollo 11 – with a mighty roar – lifted off into space to meet that challenge. Only 11 minutes after liftoff, it was in orbit with the three astronauts feeling the early stages of weightlessness.

Thirty-eight-year-old Neil Armstrong was the commander and would be accompanied by Aldrin on the moonwalk after the lunar module Eagle separated from the command module Columbia.

Michael Collins would not touch the moon’s surface, as he was responsible for making sure the Eagle launched and then re-docked for the journey back to Earth.

While only eight years had passed since JFK’s challenge, they had been difficult, turbulent ones. JFK was dead from an assassin’s bullet, as were brother Bobby and MLK Jr.

Riots in major cities and the Vietnam War had ripped at the nation’s fabric. The counterculture of drugs, sex and rock ’n’ roll was still in full throttle. (We were in San Jose and mildly surprised by the daily chaos just 45 miles up Highway 101 in San Francisco. Haight-Ashbury and Golden Gate Park were surreal.)

As millions of Americans watched Apollo 11 with awe and admiration, others felt it was a giant, expensive boondoggle designed to divert attention from widespread racial tensions and the 10 million people living below the poverty line.

Had America lost its mojo or were we entering a new, better phase? The jury was divided.

But nothing had distracted NASA except for a tragedy in 1967 when three astronauts on Apollo 1 died in a launch-pad fire. But they persevered and by July 1969 had made four successful manned flights, put spacecraft into orbit around the moon and tested the lunar module.

The Russian program unraveled when a chief scientist died and their highly secret N1 rockets exploded at least four times. Soviet politicians privately ceded the race to America and could only watch from the sidelines.

It took Apollo 11 three days to reach the moon and on July 19 the Columbia entered lunar orbit. On July 20, Armstrong and Aldrin entered the Eagle and landed it on the moon.

“Houston, Tranquility Base here. The Eagle has landed.” It was July 20, 1969.

There were six more manned missions to the moon, five of which landed (Apollo 13 was forced to abort). The last was in December 1972. Then the program was scrapped.

It has been 43 years since a human has been on the moon, and we now rely on Ridley Scott (The Martian) and other filmmakers to fill the gap as we struggle with overpopulation, geopolitics, terrorism and a resurgence of racial tension.

Progress is difficult.

P.S. A surprising number of people (6 percent to 20 percent by annual polling) believe the whole moon thing was a hoax, anyway.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is President and CEO of Frito-Lay International [retired] and earlier served as Chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].