Looking Back, President Ford’s Pardon Was the Right Thing to Do

A first edition copy of Gerald Ford’s 1979 autobiography A Time to Heal, inscribed to Caspar Weinberger, sold for nearly $900 at a February 2010 auction.

By Jim O’Neal

When Vice President Spiro Agnew resigned in 1973, pleading no contest to income tax evasion in lieu of facing trial on bribery charges, the door to the White House swung open to Gerald R. Ford.

The Constitution’s 25th Amendment, adopted in 1967, came into use for the first time. It provided that a vacancy in the office of vice president could be filled by nomination by the president and confirmation by both houses of Congress. President Richard Nixon, reeling from the twin blows of the Watergate scandals and the Agnew bribery charges, began a frantic scramble to fill the vacancy with someone acceptable to the public and whom Congress would quickly approve. He also needed someone he could trust as unquestionably loyal.

Ford’s nomination was announced by Nixon on Oct. 12, 1973, barely two weeks before the House Judiciary Committee began formal proceedings to determine whether Nixon should be impeached. Nobody in Congress could dig up a smidgen of impropriety regarding Ford, and the Senate approved his nomination 92-3 on Nov. 27, followed by the House, 387-35, on Dec. 6. During the hearings, Ford was asked whether he would pardon Nixon should he resign, and he replied, “I do not think the public would stand for it.”

A short but tumultuous eight months later, Ford became the 38th president of the United States in a moment of high drama at noon on Aug. 9, 1974. Shortly before, the nation had been glued to the TV as Nixon became the first president in history to resign. He departed the White House after a tearful farewell to his staff. A few minutes later, the cameras turned to Ford, the first vice president to ascend to the presidency by appointment.

Ford was sworn in on the same East Room platform where Nixon had stood moments earlier, although the White House was not the usual place for a swearing-in ceremony. Rutherford B. Hayes was sworn in before the fireplace in the Red Room, FDR’s fourth term began on the South Portico, and Harry S. Truman had taken the oath in the Cabinet Room. By then, the Nixons were aboard Air Force One, headed for San Clemente. When the clock struck noon and the presidency changed hands, the plane’s Air Force One designation was dropped.

Within a week of Ford’s swearing-in, documents were being hauled out by Nixon staffers in “suitcases and boxes” every day. Working late on Aug. 16, Benton Becker, Ford’s legal counsel, noticed a number of military trucks lined up on West Executive Avenue, between the West Wing and the Executive Office Building. Upon inquiry, he was told they were there “to load material that was to be airlifted from Andrews Air Force Base to San Clemente.” Sensing something was wrong, Becker got the Secret Service to intervene; the trucks were unloaded and the material was returned to the EOB. Soon, an armed guard was stationed there to protect the documents.

Perhaps it was one last ploy by Tricky Dick, à la the 18½ minutes of recording “accidentally” erased. We will never know. But we do know that Ford’s pardon of Nixon, which caused such national outrage at the time, has finally been judged the prudent thing to do.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is President and CEO of Frito-Lay International [retired] and earlier served as Chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Concerns Over Harry Truman Vanished as New President Exerted His Leadership

A 1945 White House press release signed by Harry S. Truman as president announcing the bombing of Hiroshima realized $77,675 at an October 2010 Heritage auction.

By Jim O’Neal

In February 1945, Franklin Delano Roosevelt traveled to Yalta, on the Crimean Peninsula, to discuss plans for peace with Winston Churchill and Joseph Stalin. He reported to Congress that plans had been arranged for an organization meeting of the United Nations on April 25, 1945. He said, “There, we all hope, and confidently expect, to execute a definite charter of organization under which the peace of the world will be preserved and the forces of aggression permanently outlawed.”

Upon his return, he looked tired and older than his 63 years. Late in March, he went to Warm Springs, Ga., for an overdue rest. On April 12, 1945, he was working at his desk as an artist painted his portrait when he suddenly complained of “a terrible headache.” A few hours later, at 4:45 p.m., he died of a cerebral hemorrhage. The last words he had written were “The only limit to our realization of tomorrow will be our doubts of today. Let us move forward with strong and active faith.”

His successor, Harry S. Truman – the first president to take office in the midst of a war – said he felt “like the moon, the stars and all the planets had fallen on me.” The nation and world wondered if he was capable of taking Roosevelt’s place. His background and even his appearance added to the nervous uncertainty. He was the first president in 50 years without a college education. He spoke the language of a Missouri dirt farmer and World War I artilleryman – both of which he had been. Instead of talking like a statesman, he looked like a bank clerk or haberdasher – both of which he had been. And worst of all, everyone knew that for more than 20 years he had been a lieutenant of Tom Pendergast, one of the most corrupt political bosses in the country.

What most people didn’t know was that he was scrupulously honest, knew his own mind and was one of the most knowledgeable students of history ever to enter the White House. Importantly, he understood the powers of the president, and knew why some men had been strong chief executives and others had been weak leaders.

When he learned about the atomic bomb, there was no soul-searching, no hand-wringing debate. He ordered it dropped on Japan because he was sure it would save American lives and quickly end World War II. It did not bother him in the least that years later, intellectuals would question whether one man should have made such an awesome decision alone. He knew in his heart that he was right … period.

Two of his well-known sayings capture the essence of “Give ’em Hell” Harry Truman: “The buck stops here” (a sign on his desk) and my favorite … “If you can’t stand the heat, stay the hell out of the kitchen!”

Leaders get paid to make tough decisions.

Historian Understood How Frontier Shaped Character of Americans

Frederick Jackson Turner’s address “The Significance of the Frontier in American History” had its first print appearance in “Proceedings of the State Historical Society of Wisconsin at its Forty-First Annual Meeting.” A copy of the 1894 book, original spine perished, realized $3,250 at a September 2016 auction.

By Jim O’Neal

Frederick Jackson Turner was a young, undistinguished American historian at the University of Wisconsin in 1893, yet he was invited to join a list of speakers at a conference of the American Historical Association being held at the World’s Columbian Exposition in Chicago. The odd location had prompted many of the nation’s best scholars to decline to attend, primarily because of concern they would be reading their latest papers over the din of an outsized carnival.

Of course, the entire decade of the 1890s was viewed as unusual by many, with a pervasive sense that something important was ending. It was also a tumultuous time. The Massacre at Wounded Knee (1890) had resulted in the deaths of several hundred Lakota Sioux – the last major armed conflict between American Indians and the U.S. Army. Ellis Island had opened as a U.S. immigration depot and over the next 20 years, 13 million immigrants would enter via the island. Wyoming became the 44th state to enter the union, the first with women’s suffrage. There was the Panic of 1893, when the government almost ran out of gold and had to get help from J.P. Morgan.

More importantly, the census of 1890 reported that the frontier had vanished! Many Americans had a powerful sense that they were running up against the end of their history and were on the verge of something new, but unknown.

For Turner, this was a totally unexpected opportunity – a career breakthrough – to expound on his pet theory, developed over years of study. An avid fisherman, hiker and proponent of the American West, he had concluded that American life and character owed a debt to the pursuit of the frontier. And now, the 1890 census declared it closed – all of the land explored, claimed and settled. The young Turner saw a nation facing a crisis of the unknown.

Turner’s thesis was simple and appealing. He declared that American pioneers were not simply transplanted Europeans, but a people unto themselves, shaped by their environment rather than by their history or institutions. Frontier hardship made them self-reliant and individualistic. Free land made them generous and optimistic. Frontier challenges required them to adapt, innovate and even cooperate democratically.

Frederick Jackson Turner on the significance of the frontier in American history:

“American democracy was born of no theorist’s dream. It came out of the American forest and gained new strength each time it touched a new frontier.

“The wilderness masters the colonist. … It takes from him the railroad car and puts him in a birch canoe. It strips off the garments of civilization and arrays him in the hunting shirt and moccasin. … Little by little he transforms the wilderness, but the outcome is not the old Europe. … Here is a new product that is American.”

Teddy Roosevelt and I both wholeheartedly agree with Turner’s thesis about the role of the frontier in shaping the character of Americans.

Principles Articulated by Founders Transcend Time and Technology

An edition of The Federalist: A Collection of Essays, Written in Favour of the New Constitution, as Agreed upon by the Federal Convention, September 17, 1787, in two volumes, by James Madison, Alexander Hamilton and John Jay, sold for $175,000 at a September 2016 Heritage auction.

By Jim O’Neal

In 1776, while Thomas Jefferson was putting the finishing touches on the Declaration of Independence, a committee headed by John Dickinson began meeting to draft the Articles of Confederation, a document designed to specify how this new government would work. Due to lots of internal debates (and the Revolutionary War), the Articles were not ratified until 1781, two years before the war ended.

Then a formal constitutional convention met in Independence Hall in May 1787. The delegates abandoned the Articles and wrote a new document, “The Constitution of the United States of America.” Fifty-five delegates attended; Vermont was not yet part of the Union, and Rhode Island – anti-federal and anti-union – didn’t bother to send a delegation to Philadelphia. Once the required nine states had ratified it, the Constitution took effect in 1788; ten amendments were then ratified on Dec. 15, 1791, and we call them the Bill of Rights.

James Madison, “The Father of the Constitution,” played a crucial role at every stage of the process … calling the convention, framing the Constitution and carefully deciding how the Bill of Rights would work in a practical sense. To an extraordinary degree, we rely on Madison for our basic insight into the original theories and ambitions of the Constitution.

Madison had come to the convention totally prepared to control the agenda in a very characteristic way – carefully and deeply. He had studied the fundamental problems of the Articles, the state constitutions and the lessons of history, including his personal experience in the Continental Congress and in Virginia.

The Declaration of Independence and the Constitution combine to address mankind’s most basic political questions and the principles of organization for a government. Thus, they were meant to serve not simply the 18th century, but succeeding generations, whatever their circumstances or the state of their social progress. Because the principles the Founders articulated transcend both time and technology, they will serve us well through the 21st century, but only if we understand them correctly and apply them consistently.

Government officials must respect their oaths to uphold the Constitution and we the people must be vigilant in seeing that they do. The Constitution will live only if it is alive in the hearts and minds of the American people. That perhaps is the most enduring lesson of our experiment in ordered liberty.

It doesn’t seem like too much to expect.

Eisenhower Crucial to ‘Greatest Engineering Project in World History’

A photograph of President Dwight D. Eisenhower’s inauguration on Jan. 20, 1953 – autographed by Eisenhower, Richard Nixon, Harry Truman and Herbert Hoover – realized $8,365 at an October 2006 Heritage auction.

By Jim O’Neal

As federal war-game planners considered their objectives in mobilizing a West Coast battle response, railroads were quickly ruled out because they could not carry the amount of equipment involved and some of the weapons, especially tanks, were too heavy for trains and tracks.

Since the Army already had plenty of wheeled and tracked vehicles, dispatching a test expedition by road and having the Motor Transport Corps drive the convoy could prove, once and for all, the superiority of wheels over hoofs or railways. Inexplicably, the planners failed to include any assumptions about the condition of the roads en route.

At the appointed time in 1919, the convoy gathered at a monument by the South Lawn of the White House. The column was three miles long and consisted of 79 vehicles, including 34 heavy trucks, oil and water pumpers, a mobile blacksmith shop, a tractor, staff observation cars, searchlight carriers, a mobile hospital and other wheeled necessities to support the actual war machines.

Nine vehicles were wrecked en route and 21 men were injured – leaving 237 soldiers, 24 officers and 15 observers, including then-Brevet Lt. Col. Dwight D. Eisenhower (who kept a concise daily diary). When they arrived at Lincoln Park in San Francisco 62 days later, it was undisputed that the roads – essentially non-existent west of the Missouri River – would preclude any timely defense of the West Coast, and that any Asian enemy would have been victorious in any battles along the way.

The journey left an indelible impression on the young officer from West Point, who would later be Commander-in-Chief of the nation. The Army and Eisenhower had indisputably proved what many in the capital had suspected. The American West had few, if any, roads that were even remotely usable for military or civilian use.

Only when they reached California and pushed beyond the state capital of Sacramento did the roads become great – with macadamized surfaces, proper drainage, road rules, gas stations and tire-repair depots … all in sufficient quantity to service existing needs.

But this did not mollify Eisenhower in the slightest. This great convoy, called into action to deal with a hypothetical threat to the country’s vital West Coast, had crossed 3,251 miles of the country at an average speed of 5.6 mph, making any potential response virtually useless. The vehicles were in fine shape and the men brave and intelligent, but the roads were deplorable. If nothing else, Eisenhower wrote, the experience of this expedition should spur the building – as a national effort – of a fast, safe and properly designed system of transcontinental highways.

This led to the creation of America’s Interstate Highway System – the greatest engineering project in world history … an intricate network of high-speed roads built with the sole purpose of uniting the corners, edges and center of this vast nation.

Fittingly, the system was authorized by the Federal-Aid Highway Act of 1956, signed during the first term of the 34th president of the United States, and was later designated “The Dwight D. Eisenhower National System of Interstate and Defense Highways.” “I LIKE IKE!”

Who Has Stronger Claim to ‘Father of the American Navy’?

A signature of John Paul Jones removed from a letter sold for $4,600 at an April 2005 Heritage auction.

By Jim O’Neal

John Barry was a naval officer during the Revolutionary War and was later issued commission No. 1 in the new U.S. Navy by President George Washington, which made him a commodore.

John Hancock – president of the Continental Congress – gave him a captain’s commission in 1776 and many consider Barry “the Father of the American Navy,” despite his relative anonymity today.

Another naval commander of the Revolutionary War, John Paul (he later added “Jones” to elude authorities after a duel), was born in Scotland in 1747 and became a sailor at age 13.

John Paul Jones joined the American Navy and made his fame in 1779. He was commanding an old French merchant ship refitted and renamed the USS Bonhomme Richard (in honor of Ben Franklin’s “Poor Richard”) when he engaged a British warship in the North Sea.

During the ensuing battle with the HMS Serapis, 300 of 375 American seamen were killed or wounded. Jones’ ship sustained such heavy damage that it sank the following day.

At some point in the battle, the British asked if the Americans were ready to surrender. It was then that Jones famously replied:

“I have not yet begun to fight!”

Perhaps this legendary quote is why he shares the title with John Barry as “Father of the American Navy.”

You decide.

Cal Rodgers’ Bizarre Flight Mostly Forgotten in Aviation History

Calbraith Perry Rodgers made the first transcontinental airplane flight in the Vin Fiz Flyer.

By Jim O’Neal

Before Ben Sliney made the decision to ground every aircraft in U.S. airspace on Sept. 11, 2001 (see yesterday’s post), most aeronautical efforts were focused on inventing flying machines that would go faster and higher.

Orville and Wilbur Wright were brothers from Ohio who worked on printing presses, motors and bicycles. On Dec. 17, 1903, near Kitty Hawk, N.C., they made the first controlled, sustained flight of a powered, heavier-than-air aircraft. Two years later they perfected controls to make fixed-wing powered flights feasible.

Less than eight years later, William Randolph Hearst offered a prize of $50,000 to the first flier to cross the United States between New York and Pasadena (going either way) within 30 days.

Three men actually tried. One was a race driver and another a jockey, but both failed. The third aspirant, a flamboyant, cigar-chomping showman named Calbraith Perry Rodgers, decided to try despite having just learned how to fly. His only lesson was a 90-minute session with Orville Wright, but it was enough for him to receive the 49th license to fly.

By chance, the Armour Meat Co. had developed a soft drink called Vin Fiz that was wildly unsuccessful. In desperation, the company hatched an equally bizarre marketing plan: sponsoring Cal Rodgers’ flight.

They named the plane Vin Fiz, plastered it with advertising signs and put an oversized bottle between the two wheels. Then they outfitted a special train to trundle along beneath the plane’s route, loaded with every possible spare part – and Cal’s wife!

People all over the country would be exposed to the Vin Fiz brand.

One minor detail was that the offer had a one-year expiration clause and by the time all the preparations were complete, Cal only had 43 days to make the entire trip. Undaunted, on Sept. 17, 1911, Rodgers climbed aboard, shorted the magneto, pulled the choke cable, released the brake and took off. Within 10 minutes, the speck in the sky was gone from view.

In the end, he failed, reaching Pasadena 19 days too late to win the prize money. However, he pressed on and dipped his wheels in the water at Long Beach, becoming the first man to fly from one coast to the other. Thousands more would follow in this true adventurer’s aerial footsteps, until Ben Sliney issued his famous order to all aircraft 90 years later. One man opened the skies and the other closed them, yet neither is well known.

The Vin Fiz Flyer is on display at the Smithsonian National Air and Space Museum and 12 Vin Fiz 25-cent stamps are known to exist. One sold for $88,000 in 1999. The Vin Fiz grape drink finally fizzled out.

Science Tends to March On, Despite Public Opinion Polls

A 1932 first edition of Aldous Huxley’s Brave New World, with its custom leather clamshell box, sold for $3,585 at an October 2008 Heritage auction.

By Jim O’Neal

The healthy, lusty cry that emanated from a delivery room in a British mill-town hospital at precisely 11:47 on a summer night in 1978 brought joy to Lesley and John Brown. Since their marriage in 1969, the couple had wanted a baby, and now they had one – thanks to $1,500 John had won betting on football and the brilliance of two British doctors, who became the first physicians to create a test-tube baby after 80 unsuccessful tries.

The formal term for the method that produced little 6-pound Louise Brown was IVF – “in vitro fertilization” (literally “in glass”), but “test tube” better fit the imagination that was running wild around the world. With the news, people began recalling Aldous Huxley’s 1932 novel Brave New World and its vision of a society where “babies are mass produced from chemical solutions in laboratory bottles.”

Actually, Louise was nothing close to this concept. She represented the union of John’s sperm and Lesley’s egg, and was carried to term by her mother as other babies were. Only the joining of the ingredients had been done in a lab. The incipient embryo was transferred back to Lesley, where it implanted itself on the wall of the hormone-prepared uterus.

The moral and legal implications touched off incendiary debates when the news from Britain spread. And the fact that the story came from a hysterical tabloid (the Browns sold the story rights to the Daily Mail for $500,000) took the episode further into the realm of science fiction.

The clergy were unanimously against “baby factories” and scientists “playing God” – but the issue was overtaken in the headlines by women’s rights, feminism, industrial abuses of the environment (Earth Day), fossil fuels and materialism. Battles over climate change and over income and wealth inequality were on the way.

In August 1998, I hosted a PepsiCo dinner for the Scottish scientists from the Roslin Institute (University of Edinburgh) who had just cloned the first mammal from an adult somatic cell, the famous Dolly the Sheep. Dolly was born on July 5, 1996, and the great controversy this time was “designer babies.” As I recall, they suspected the Koreans would be the first to attempt humans, but the only ones I’ve read about are pigs, deer, horses and bulls.

I think Dolly died just before her seventh birthday from a lung disease – living about half as long as hoped. I assume little Louise Brown must have 5 to 10 million IVF cousins by now.

Science marches on, despite public opinion polls.

Penicillin Changed Medicine — But Deadly Enemies Lurk

A photograph signed by Nobel Prize winner Alexander Fleming sold for $1,250 at an April 2016 auction.

By Jim O’Neal

In the fifth century B.C., Herodotus noted in his “History” that every Babylonian was an amateur physician, since the sick were laid out in the street so that any passerby could offer advice for a cure. For the next 2,400 years, that was as good an approach as any to curing infections; doctors’ remedies were universally useless.

Until the middle of the 20th century, people routinely died from infections. Children were killed by scarlet fever, measles and even tonsillitis. Mothers frequently died from infections following childbirth, and many who survived were taken later by pneumonia or meningitis.

Soldiers most commonly died from infections such as gangrene or septicemia, not from war injuries. Even a small cut could lead to a fatal infection. Bandaging a wound simply sealed in the infectious killers to carry out their deadly missions. Of the 10 million killed in World War I, 5 million died of infections.

There were few antidotes to infections … vaccination against smallpox with cowpox vaccine (Edward Jenner in 1796), introduction of antiseptics (Joseph Lister in 1865), and the advent of sulfa drugs in 1935. But there was no known cure for a stunning number of other deadly threats: typhoid fever, cholera, plague, typhus, scarlet fever, tuberculosis. The list seemed endless and most of these ended in death.

All of this changed in 1940.

Alexander Fleming’s discovery of penicillin – spotted as a stray mold in his London lab in 1928 – and its eventual development by a team at Oxford University ushered in the age of antibiotics, the most important family of drugs in the modern era. Before World War II ended, penicillin had saved the lives of hundreds of thousands and offered a viable cure for major bacterial scourges such as pneumonia, blood poisoning, scarlet fever, diphtheria, syphilis and gonorrhea.

The credit usually goes to Fleming, but Howard Florey, Ernst Chain, Norman Heatley and a handful of others on the Oxford team deserve a major share; it took their laboratory magic to turn the discovery into an effective, usable drug.

Neither Fleming nor Florey made a cent from their achievements, although Florey, Fleming and Chain did share a Nobel Prize. British pharmaceutical companies remarkably failed to grasp the significance of the discovery, so American companies – Merck, Abbott, Pfizer – quickly grabbed all the patents and proceeded to make enormous profits from the royalties.

The development of antibiotics is one of the most successful stories in the history of medicine, but it is unclear whether its ending will be a completely happy one. Fleming prophetically warned in his 1945 Nobel lecture that improper use of penicillin would lead to it becoming ineffective. The danger was not in taking too much, but in taking too little to kill the bacteria and thereby “[educating] them on how to resist it in the future.” Penicillin and the antibiotics that followed were prescribed too freely – for ailments they could cure, and for viral infections on which they had no effect at all. The result is strains of bacteria that are now unfazed by antibiotics.

Today, we face a relentless and deadly enemy that has demonstrated the ability to mutate at increasingly fast rates – breeding “superbugs” resistant to our best antibiotics. We must be sure to “keep a few steps ahead.”

Hear any footsteps?

Atom Bombs: From Pop Culture Novelty to Unimaginable Threat

A First Day Cover postmarked July 28, 1955, and signed by six crew members of the Enola Gay, which dropped the first atomic bomb on Hiroshima, went to auction in April 2005.

By Jim O’Neal

As North Korea continues to relentlessly pursue offensive atomic weapons – perhaps a weaponized missile delivered by a submersible vessel – the world is perplexed over how to respond. U.S. sanctions are ignored; China is permissive, complicit or both; and South Korea and Japan grow more anxious while the United Nations remains, as usual, irrelevant.

Concurrently, polls indicate that attitudes about the use of atomic bombs against Japan to end World War II are less favorable. But this was not always the case.

At first, most people approved of the use of the bomb on Hiroshima, followed a few days later by a second bomb on Nagasaki. They agreed the bombs hastened the end of the war and saved more American lives than they had taken from the Japanese. Most people shared the view of President Truman and the majority of the defense establishment: The bomb was just an extension of modern weapons technology.

There had even been some giddiness about the Atomic Age. The bar at the National Press Club started serving an “atomic cocktail.” Jewelers sold “atomic earrings” in the shape of a mushroom cloud. General Mills offered an atomic ring and 750,000 children mailed in 15 cents and a cereal box top to “see genuine atoms split to smithereens.”

But the joking masked a growing anxiety that was slowly developing throughout our culture. In the months after it ended the war, the bomb also began to effect an extraordinary philosophical reassessment and generate a gnawing feeling of guilt and fear.

Then, the entire Aug. 31, 1946, issue of The New Yorker magazine was devoted to a 30,000-word article by John Hersey entitled, simply, “Hiroshima.” The writer described the lives of six survivors before, during and after the dropping of the bomb: a young secretary, a tailor’s wife, a German Jesuit missionary, two doctors and a Japanese Methodist minister.

The power of Hersey’s reporting, devoid of any melodrama, brought human content to an unimaginable tragedy and the response was overwhelming. The magazine sold out. A book version became a runaway bestseller (still in print). Albert Einstein bought 1,000 copies and distributed them to friends. An audience of millions tuned in to hear the piece, in its entirety, over the ABC radio network.

After Hersey’s book with its explicit description of the atomic horror (“Their faces wholly burned, their eye sockets were hollow, the fluid from their melted eyes had run down on their cheeks”), it was impossible to ever see the bomb as just another weapon. The only solace was that only America possessed this terrible weapon.

However, it soon became clear that it was only a matter of time before the knowledge would spread and atomic warfare between nations would become possible. People were confronted for the first time with the real possibility of human extinction. They finally grasped the fact that the next war could indeed be what Woodrow Wilson had dreamed the First World War would be – a war to end all wars – although only because it would likely end life itself.

Let’s hope our world leaders develop a consensus about the Korean Peninsula (perhaps reunification) before further escalation. It is time to end this threat, before it has a chance to end us.
