Penicillin Changed Medicine — But Deadly Enemies Lurk

A photograph signed by Nobel Prize winner Alexander Fleming sold for $1,250 at an April 2016 auction.

By Jim O’Neal

In the fifth century B.C., Herodotus noted in his “History” that every Babylonian was an amateur physician, since the sick were laid out in the street so that any passerby could offer advice for a cure. For the next 2,400 years, that was as good an approach as any to curing infections; doctors’ remedies were universally useless.

Until the middle of the 20th century, people routinely died from infections. Children were killed by scarlet fever, measles and even tonsillitis. Mothers commonly died from infections following childbirth, and many who survived were taken later by pneumonia or meningitis.

Soldiers most commonly died from infections such as gangrene or septicemia, not from war injuries. Even a small cut could lead to a fatal infection. Bandaging a wound simply sealed in the infectious killers to carry out their deadly missions. Of the 10 million killed in World War I, 5 million died of infections.

There were few antidotes to infections … vaccination against smallpox with cowpox vaccine (Edward Jenner in 1796), the introduction of antiseptics (Joseph Lister in 1865), and the advent of sulfa drugs in 1935. But there was no known cure for a stunning number of other deadly threats: typhoid fever, cholera, plague, typhus, scarlet fever, tuberculosis. The list seemed endless, and most of these infections ended in death.

All of this changed in 1940.

Alexander Fleming’s discovery of penicillin while examining a stray mold in his London lab in 1928, and its eventual development by a team at Oxford University, gave rise to antibiotics, the most important family of drugs of the modern era. Before World War II ended, penicillin had saved hundreds of thousands of lives and offered a viable cure for major bacterial scourges such as pneumonia, blood poisoning, scarlet fever, diphtheria and syphilis/gonorrhea.

The credit usually goes to Fleming, but the team of Howard Florey, Ernst Chain, Norman Heatley and a handful of others at Oxford deserves a major share. It took their laboratory magic to prove the drug’s efficacy and turn it into a usable medicine.

Neither Fleming nor Florey made a cent from their achievements, although Florey, Fleming and Chain did share a Nobel Prize. British pharmaceutical companies remarkably failed to grasp the significance of the discovery, so American companies – Merck, Abbott, Pfizer – quickly grabbed all the patents and proceeded to make enormous profits from the royalties.

The development of antibiotics is one of the most successful stories in the history of medicine, but it is unclear whether its ending will be a completely happy one. Fleming prophetically warned in his 1945 Nobel lecture that the improper use of penicillin would lead to it becoming ineffective. The danger was not in taking too much, but in taking too little to kill the bacteria and “[educating] them on how to resist it in the future.” Penicillin and the antibiotics that followed were prescribed too freely, both for ailments they could cure and for viral infections on which they had no effect. The result is strains of bacteria that are now unfazed by antibiotics.

Today, we face a relentless and deadly enemy that has demonstrated the ability to mutate at increasingly fast rates – and these “super bugs” are capable of developing resistance. We must be sure to “keep a few steps ahead.”

Hear any footsteps?

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is President and CEO of Frito-Lay International [retired] and earlier served as Chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

President McKinley’s Popularity Soared Despite ‘Imperialist’ Charges

President McKinley’s beaver top hat and leather traveling case realized $17,925 at a December 2008 Heritage auction.

By Jim O’Neal

Exactly 115 years ago this week, on Sept. 14, 1901, the 25th president of the United States, William McKinley, died from an assassin’s bullet. He had been shot on Sept. 6 in Buffalo, N.Y., while attending the Pan-American Exposition.

As he stood shaking hands with a long line of well-wishers at the Temple of Music, a man approached with his right hand wrapped in a handkerchief. As McKinley extended his own hand, the man, Leon Czolgosz, a Polish-American anarchist, shot him twice with a concealed .32-caliber revolver. One bullet deflected off a suit button, but the other entered his stomach, passed through a kidney and lodged in his back.

When doctors operated, they were unable to locate the bullet and he died eight days later from the spread of gangrene throughout his body. It was eerily similar to the assassination of President James Garfield 20 years earlier. He had been shot on July 2, 1881, and did not die until Sept. 19. Again, his doctors were unable to locate the bullet and he suffered for over two months as they probed the wound with their unsanitary hands and instruments until they killed him.

Both men would easily have survived with the benefit of modern medicine. By the time of McKinley’s death, the X-ray had been invented, and doctors in the Balkan war of 1897 were using it to “see inside patients’ bodies.” However, the possible side effects of radiation were not yet recognized.

William McKinley had entered politics following the Civil War and, at age 34, became a member of the House of Representatives, serving for 14 years before losing his seat in 1890. He then served two terms as governor of Ohio and by 1896 was the leading Republican candidate for president. Aided by wealthy Ohio industrialist Mark Hanna, he easily defeated Democrat William Jennings Bryan by the largest margin since the Civil War.

During his first term, McKinley earned a reputation as a protectionist by advocating high tariffs to protect American business and labor from foreign imports. He was a staunch supporter of the gold standard to back up paper money. However, foreign policy became a major issue in April 1898 when the United States intervened in the Cuban struggle for independence from Spain. The Spanish-American War was over in a quick three months and Cuba became a U.S. protectorate. This was followed by the annexation of Puerto Rico, Guam and the Philippines.

Suddenly, the United States had become a colonialist power, with a big interest in Asia, especially China.

President McKinley’s popularity soared during these economic boom times, and despite charges of being an “imperialist,” his margin of victory over William Jennings Bryan was even greater in the 1900 presidential rematch. Theodore Roosevelt was selected as McKinley’s vice president – against Mark Hanna’s strong objections – and naturally became president after McKinley’s unfortunate death.

Teddy “The Rough Rider” Roosevelt, who had charged up San Juan Hill, would bring a new level of energy and spirit to the White House. All Mark Hanna could do was watch and grouse, “Now look! That damn cowboy is president!”

The nation seemed to do just fine.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is President and CEO of Frito-Lay International [retired] and earlier served as Chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Rights Granted to Barons in Medieval England Are Bedrocks of Modern Society

A painting by William Martin (1753-1836) shows Cardinal Langton, Archbishop of Canterbury, presenting to the barons the Charter of Liberties, issued in 1100. It’s considered a landmark document in English history and a forerunner of the Magna Carta.

By Jim O’Neal

During the heated debates between our Founding Fathers while formulating the Constitution and Bill of Rights, many invoked the principles of the Magna Carta to bolster their arguments. For almost 600 years, the Great Charter had stood as a beacon whenever men discussed their inherent rights.

The truth is somewhat more complicated.

On June 15, 1215, King John of England signed a charter at Runnymede, a meadow beside the Thames River near Windsor fortress. The Archbishop of Canterbury had proposed the charter as a means to make peace between the king and a group of rebel barons. The document – which eventually became the Magna Carta – promised access to swift justice, no illegal imprisonment and limitations on feudal payments.

When John was enthroned as king in 1199, England was a feudal society, a land-based hierarchy where the king owned all the land. The tenants-in-chief (i.e., barons) received land from the king in exchange for loyalty and military service. They, in turn, leased the land to their own retainers, who leased to peasants. However, English monarchs had been levying ever-higher taxes and financial burdens on the barons.

Starting with Henry I (1100-1135), the Crown established a series of Royal Courts that also raised revenue through fines and taxes. Discontent intensified under King John, who had lost a series of expensive campaigns against the French from 1200-1204 and had also lost his lands in Normandy, which put the squeeze on the entire system. In addition to being inept at war, King John broke his commitments to the barons, and both sides then disavowed any promises made.

One significant irony was that the Magna Carta originally only included the king and the barons. Ordinary citizens were totally ignored.

A new Magna Carta was issued in 1216 by Henry III and revised in 1226 to raise taxes once again. Finally, in 1297, Edward I agreed to rights that evolved into English statute law, with a broad array of rights granted to all citizens.

The Magna Carta has acquired almost mythical status as the constitutional bedrock of citizen rights. It contributed to the development of parliament from the 13th century and was used by 17th century rebels to argue against the divine right of kings. Several American colonies’ charters contained clauses modeled on it, while the design of the Massachusetts seal chosen at the start of the Revolutionary War depicts a militiaman with sword in one hand and the Magna Carta in the other.

Americans believed the Crown had breached the fundamental law enjoyed by all English citizens, which led to the U.S. Constitution enacted in 1789 and the Bill of Rights adopted two years later. We are fortunate to live in a country governed by the rule of law and limitations on the arbitrary power of a government against its citizens.

Much blood has been spilled by many in defense of these beliefs.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is President and CEO of Frito-Lay International [retired] and earlier served as Chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Atom Bombs: From Pop Culture Novelty to Unimaginable Threat

A First Day Cover postmarked July 28, 1955, and signed by six crew members of the Enola Gay, which dropped the first atomic bomb on Hiroshima, went to auction in April 2005.

By Jim O’Neal

As North Korea continues to relentlessly pursue offensive atomic weapons – perhaps a weaponized missile delivered by a submersible vessel – the world is perplexed over how to respond. U.S. sanctions are ignored, China is permissive, complicit or both, and South Korea and Japan grow more anxious as the United Nations is irrelevant, as usual.

Concurrently, polls indicate that attitudes about the use of atomic bombs against Japan to end World War II are less favorable. But this was not always the case.

At first, most people approved of the use of the bomb on Hiroshima, followed by a second bomb a few days later on Nagasaki. They agreed the bombs hastened the end of the war and saved more American lives than they had taken from the Japanese. Most people shared the view of President Truman and the majority of the defense establishment: The bomb was just an extension of modern weapons technology.

There had even been some giddiness about the Atomic Age. The bar at the National Press Club started serving an “atomic cocktail.” Jewelers sold “atomic earrings” in the shape of a mushroom cloud. General Mills offered an atomic ring and 750,000 children mailed in 15 cents and a cereal box top to “see genuine atoms split to smithereens.”

But the joking masked a growing anxiety that was slowly developing throughout our culture. In the months after it ended the war, the bomb also began to effect an extraordinary philosophical reassessment and generate a gnawing feeling of guilt and fear.

Then, the entire Aug. 31, 1946, issue of The New Yorker magazine was devoted to a 30,000-word article by John Hersey entitled, simply, “Hiroshima.” The writer described the lives of six survivors before, during and after the dropping of the bomb: a young secretary, a tailor’s wife, a German Jesuit missionary, two doctors and a Japanese Methodist minister.

The power of Hersey’s reporting, devoid of any melodrama, brought human content to an unimaginable tragedy and the response was overwhelming. The magazine sold out. A book version became a runaway bestseller (still in print). Albert Einstein bought 1,000 copies and distributed them to friends. An audience of millions tuned in to hear the piece, in its entirety, over the ABC radio network.

After Hersey’s book with its explicit description of the atomic horror (“Their faces wholly burned, their eye sockets were hollow, the fluid from their melted eyes had run down on their cheeks”), it was impossible to ever see the bomb as just another weapon. The only solace was that only America possessed this terrible weapon.

However, it soon became clear that it was only a matter of time before the knowledge would spread and atomic warfare between nations would become possible. People were confronted for the first time with the real possibility of human extinction. They finally grasped the fact that the next war could indeed be what Woodrow Wilson had dreamed the First World War would be – a war to end all wars – although only because it would likely end life itself.

Let’s hope our world leaders develop a consensus about the Korean Peninsula (perhaps reunification) before further escalation. It is time to end this threat, before it has a chance to end us.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is President and CEO of Frito-Lay International [retired] and earlier served as Chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

After Civil War, Centralization of Government Changed Fabric of Society

This Texas Confederate “Bonnie Blue” flag, carried by the 3rd Texas State Cavalry, is one of the rarest Confederate flags in existence. It realized $47,800 at a June 2007 Heritage auction.

By Jim O’Neal

On May 10, 1865, President Andrew Johnson announced that armed resistance to the federal government had officially ended. However, on May 12-13 in the Battle of Palmito Ranch, a modest force of several hundred Union cavalry attacked a Confederate outpost on the banks of the Rio Grande, 12 miles from Brownsville, Texas.

Confederate troops had done nothing to break an unofficial truce with the Union forces, but after two days of fighting, they forced Union soldiers to first withdraw and then retreat. The skirmish is generally recognized as the final battle of the Civil War.

Before all the Union Army went home, there was a Grand Review in Washington on May 23-24 when Johnson and General Ulysses S. Grant watched the march of the triumphant Union armies down Pennsylvania Avenue from the Capitol. This great procession of 150,000 men would take two full days, while thousands hoisted flags, hummed patriotic songs and showered the troops with flowers. Here was the titanic armada of the United States, the mightiest concentration of power in history. The first day was dominated by the Army of the Potomac, Washington’s own army. At 9 sharp the next day, General William Tecumseh Sherman’s great army took its turn. They were sunburned and shaggy in stark contrast to the crisp and well-kept group from the previous day.

The demobilization was completed very effectively. Within two months, more than 600,000 troops had been discharged, and a year later the million-man army was down to a mere 65,000 men. Further, the number of warships was reduced from 500 to 117 by the end of 1865. Thus, the armed forces did not remain a permanent power, and the mustered-out military readjusted to civilian life quite easily. This was much different from the experience of those returning from World War II or Vietnam, or of the 3 to 4 million still rotating through Afghanistan, Iraq and Syria (some on their fifth and sixth deployments in this l-o-n-g war).

Still, life after the Civil War was profoundly different. Aside from the human carnage and the dismal impoverishment of the South, the centralization of the government changed the fabric of society. Until 1861, the only direct contact most citizens had with the federal government was the postal service. Now, the War Department controlled state militias, direct taxes were imposed, national banking was instituted, and federal money was printed or minted.

The most radical change was naturally in the South. All seceded states were under martial law, an occupation force maintained law and order, and 4 million blacks were neither slaves nor citizens. The North imposed no organized vengeance; no Confederates were tried for treason – the only Southern war criminal was Henry Wirz, commander of the prisoner-of-war camp near Andersonville, Georgia, who was hanged in November 1865. And a military court dispensed swift justice to the Abraham Lincoln assassination conspirators, with four hanged at the Old Penitentiary on July 7.

However, reconstruction of the pre-war Union was under way, and Lincoln’s most fervent prayer – reunification – was finally a reality despite the horrendous loss of life involved. Peace had been restored.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is President and CEO of Frito-Lay International [retired] and earlier served as Chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Laws Curtailing Free Speech Rejected by Americans – 200 Years Ago

Thomas Jefferson reaffirmed the rights of Americans to “think freely and to speak and write what they think.”

By Jim O’Neal

After serving two full terms as president (1789-97), George Washington was more than ready to leave Philadelphia, where he had lived since relocating from New York City. He returned to his home in Mount Vernon with a profound sense of relief. The plantation had been losing money during his extended absence, leaving him in a financial quandary of being relatively wealthy but cash-poor. Virtually all of his assets were in non-liquid land and slaves.

However, little did he expect that his successor, Vice President John Adams, would allow relations with France to deteriorate to the point of a possible war. Diplomacy had failed and on July 4, 1798, President Adams was forced to offer Washington a commission as Lieutenant General and Commander-in-Chief of the Army with the responsibility of preparing for a potential war. Washington accepted, but wisely delegated the actual work to his trusted friend Alexander Hamilton. All of this happened a mere 17 months before his death at his beloved Mount Vernon.

War with France was avoided, but President Adams utilized some highly controversial tactics, including the Alien and Sedition Acts. These consisted of four bills passed by a highly partisan, Federalist-dominated Congress that were signed into law by Adams in 1798.

Although neither France nor the United States actually declared war, rumors of enemy spies (aliens) or a surprise French invasion frightened many Americans and the Alien Acts were designed to mitigate the risk. The first law, the Naturalization Act, extended the time it took immigrants to gain citizenship from five to 14 years. Another law provided that once war was declared, all male citizens of an enemy nation could be expelled. It was estimated that this would include 25,000 French citizens in the United States. The president also was authorized to deport any non-citizens suspected of plotting against the government during wartime or peace.

The Sedition Act was much more insidious. Sedition means inciting others to resist or rebel against lawful authority. The act first outlawed conspiracies “to oppose any measure of the government.” Further, it made it illegal for anyone to “express any false, scandalous and malicious writing against Congress or the President.” It included published words that had BAD INTENT to DEFAME the government or cause HATRED of the people toward it.

Secretary of State Timothy Pickering was in charge of enforcement and pored over newspapers looking for evidence. Numerous people were indicted, fined and jailed to the point that it became a major issue in the presidential election of 1800. Thomas Jefferson argued the laws violated the First Amendment.

The voters settled the debate by electing Jefferson.

In his inaugural address, Jefferson confirmed a new definition of free speech and press as the right of Americans “to think freely and to speak and write what they think.”

The U.S. Supreme Court never decided whether the Alien and Sedition Acts were constitutional. The laws, quite conveniently, expired on March 3, 1801, John Adams’ last day in office.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is President and CEO of Frito-Lay International [retired] and earlier served as Chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Proponents Were Convinced Prohibition Would End Violence, Create Jobs

This “Happy Days Are Here Again” Prohibition repeal pitcher and mug set (Stangl Company, 1934), showing caricatures of Franklin D. Roosevelt and other prominent Democrats, sold for $1,015.75 in June 2008.

“There is as much chance of repealing the 18th Amendment as there is for a hummingbird to fly to the planet Mars with the Washington Monument tied to its tail.” – U.S. Sen. Morris Sheppard of Texas, known as “The Father of Prohibition”

By Jim O’Neal

The crowd that assembled at the First Congregational Church in Washington, D.C., shortly before midnight on Jan. 16, 1920, was filled with anticipation of a new era. Summoned by the imminent arrival of Prohibition, as sanctioned by the 18th Amendment in 1919, thousands had gathered to usher out the sinful past and greet the arrival of a new nation.

At the stroke of midnight, one by one, speakers made their way to the pulpit, decrying the awful demon rum which, with God’s help, had finally been put to rest. And as they did, the wide-eyed audience dreamed of the world that would now emerge, a place where prisons would be turned into factories and slums would be nothing more than a memory.

“Men will walk upright now,” preacher Billy Sunday declared before a similar congregation in Virginia. “Women will smile and children will laugh. Hell will be forever for rent.” In the Washington audience was the beaming U.S. Sen. Morris Sheppard of Texas, author of the 18th Amendment of the U.S. Constitution establishing America as a “dry” country. Sheppard listened as Secretary of the Navy Josephus Daniels described their purpose as the greatest reform movement in the history of the world. The National Prohibition Act (aka the Volstead Act) was effective at midnight Jan. 17, 1920.

The first violation occurred 59 minutes later. In Chicago, six armed men stole $100,000 of medicinal whiskey by emptying two freight cars filled with booze.

And so it would continue throughout the 1920s. The advocates of alcohol prohibition thought they were making America a better place – an alcohol-free zone, a land without alcoholics or family violence, a land where ruined lives would be eliminated, a more stable society.

But they were wrong. Prohibition did little to reduce the demand and simply replaced legal brewers, distillers, vintners and liquor stores with moonshiners, bootleggers and smugglers willing to risk prison. People still wanted bars and restaurants that served alcohol, and such places continued to operate as speakeasies by paying off police, prosecutors and judges. The alcohol industry became the province of gangsters, and law enforcement was overwhelmed by illegal, wide-scale alcohol distribution. A new morality was easier to declare than to maintain, as Sen. Sheppard discovered when a moonshine still – churning out 130 gallons a day – was found on his Austin, Texas, ranch.

The “Noble Experiment” finally ended in 1933 when President Franklin D. Roosevelt signed an amendment to the Volstead Act and declared, “I think this would be a good time to have a beer.”

Cheers!

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is President and CEO of Frito-Lay International [retired] and earlier served as Chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Darwin Asked Basic Questions and Changed How We Look at Life

A first edition of Charles Darwin’s On the Origin of Species realized $83,500 at an April 2012 Heritage auction.

By Jim O’Neal

Charles Darwin is a rich source of interesting facts and one finds him in the most unusual of places. As the most versatile scientist of the 19th century, he originally intended to follow his father into medicine and was subsequently sent to Cambridge to train as an Anglican cleric. Endlessly curious, he was interested in almost any scientific question.

The publication of his book, On The Origin of Species (1859), introduced a new understanding of what gradually came to be known as evolution. In it, he asked fundamental questions. The world teems with plant and animal life. Where had it all come from? How had it been created?

Darwin was far from the first to propose that a process of change over vast periods had produced this diversity, but he was the first to suggest an explanatory theme, which he called “natural selection.” At the core of Darwin’s idea was that all animal life was derived from a single, common ancestor – that the ancestors for all mammals, humans included, for example, were fish. And in a natural world that was relentlessly violent, only those able to adapt would survive, in the process evolving into new species.

Darwin was honored many times in his lifetime, but never for On The Origin of Species or for The Descent of Man. When the Royal Society bestowed on him the prestigious Copley Medal, it was for his geology, zoology and botany work – not for his evolutionary theories. And the Linnean Society was also pleased to honor him, without a mention of his radical scientific work on evolutionary themes. His theories didn’t really gain widespread acceptance until the 1930s and 1940s with the advance of a refined theory called the Modern Synthesis, which combined his work with that of others.

He was never knighted, although he was buried in 1882 in Westminster Abbey – next to Sir Isaac Newton.

This seems exceptionally fitting given the combined versatility of these two remarkably gifted men with voracious appetites for knowledge. Surely, they must have found a way to communicate with each other after all this time. What a conversation to eavesdrop on!

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is President and CEO of Frito-Lay International [retired] and earlier served as Chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Trade Has Created Economic Opportunities for More than 100 Years

To celebrate the opening of the Panama Canal, the U.S. Mint produced this 1915-S $50 Panama-Pacific Octagonal. This example, graded MS67 NGC, realized $282,000 at an April 2014 Heritage auction.

By Jim O’Neal

The ceremonial opening on Nov. 17, 1869, of the Suez Canal, linking the Mediterranean and Red seas, was an emphatic declaration of European – specifically French – technological and financial means. It was also a significant illustration of a rapidly emerging and increasingly global economy and, simultaneously, a further boost to Europe’s imperial ambitions.

The Suez Canal reduced the sailing time between London and Bombay by 41 percent and the route to Hong Kong by 26 percent. The impact on trade was obvious, as it greatly simplified the defense of India and its critical markets, Britain’s key imperial goal. Trade in the Indian Ocean was now protected by 21 Royal Naval bases, making it a virtual monopoly.

An even more challenging project was the construction, begun in 1881, of the Panama Canal linking the Atlantic and Pacific oceans. It was a French initiative, but plagued by controversy and a consistently hostile climate that cost the lives of 22,000 laborers. The United States eventually completed the project in August 1914 after the French finally conceded defeat.

It was the largest and most expensive engineering project in the world.

It, too, dramatically reduced sailing times, shortening the Liverpool-to-San Francisco route by 42 percent and the San Francisco-to-New York time by 60 percent. The assumption of the project by the United States marked a crucial shift in attitudes toward both trade and the advancement of U.S. interests in foreign affairs. This started in 1898, when the United States itself became a colonial power by taking over the Philippines from Spain.

It then accelerated under President Teddy Roosevelt (1901-09), who actively advocated American military involvement, especially in Latin America, to ensure stability as a means of advancing American interests. A major consequence was the strengthening of the U.S. Navy and its “Great White Fleet,” which completed a circumnavigation of the globe between 1907 and 1909. This was followed by President William Howard Taft’s Dollar Diplomacy, by which American commercial interests – primarily in Latin America and East Asia – were secured by the backing of the U.S. government to encourage huge investments.

A hundred years later, we are still actively pursuing a variant of this strategy by advocating two-way investment with Brazil, China and India despite being on a short hiatus until the current political season ends. This is the only rational way to create the jobs we need and keep our trading partners’ markets open.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is President and CEO of Frito-Lay International [retired] and earlier served as Chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

When a Nation Faces Uncertainty, Good Leaders Do What They Do Best

Merritt Mauzey’s Depression-era oil on masonite, Uncle Fud and Aunt Boo, realized $77,675 at a December 2007 Heritage auction.

By Jim O’Neal

In January 1931, a 46-year-old tenant farmer drew the nation’s attention to a small event in rural Arkansas. Homer C. Coney harvested corn and cotton on land he rented for $8 an acre. But a tremendous drought the previous summer meant that most farmers had no crop, no money and no way to survive the winter.

Coney tried to sell his truck for $25 … no takers. So he and his family – trapped in a one-room shack – tried to exist on a Red Cross relief ration of $12/month. Coney, his wife and five sons lived on beans mixed with lard (to “give it flavor”).

A young mother from a neighboring farm visited the family, frantically seeking help because her children had not eaten for two days. Coney said, “Lady you wait here. I am a-going to get some food over at Bells – the Red Cross man that never give out nothing.” In England, Ark., Coney found a big crowd of people, hungry since the Red Cross office there was out of food vouchers. Soon, a crowd of 500 had gathered and confronted the mayor and chief of police: “We’re not beggars and will work for 50 cents a day, but we will not let our families starve.”

All over the country, people read about the brave souls who gathered to demand food. “500 Farmers Storm Arkansas Town Demanding Food for Their Children,” read the front page of The New York Times. “You let this country get hungry and they are going to eat, no matter what happens to budgets, income taxes or Wall Street values,” wrote populist Will Rogers in his newspaper column. “Washington mustn’t forget who rules when it comes to a showdown.”

The Great Southern Drought of 1930 was a catastrophe, to be sure. But this act of desperation was only a small part of the bigger issue in the new decade. In 1930, 26,000 businesses collapsed. In 1931, 28,000 more failed, and by the beginning of 1932, 3,500 banks, holding billions in uninsured savings, had gone under. Twelve million people (25 percent of the workforce) were unemployed, and real earnings fell by one-third. In some cities, it was worse: 50 percent of Chicago’s workforce was out of work, 80 percent of Toledo’s.

Soup lines stretched as far as the eye could see. America, the land of possibility, had become the land of despair. In 1931, the people of Cameroon in West Africa sent a check for $3.77 to the people of New York to aid the “starving.” About 20,000 veterans of WWI arrived at the U.S. Capitol to demand early payment of their bonuses. On July 28, 1932, General Douglas MacArthur – side by side with Major Dwight Eisenhower and a parade of infantry, cavalry and tanks – routed the squatters as ordered.

Today, it is hard to imagine the level of expectation that greeted President Franklin D. Roosevelt when he took the reins from the much-maligned Herbert Hoover. However, the Democratic platform in 1932 was much the same as Hoover’s: a balanced budget and a curb on spending. Even the term “New Deal” was a throwaway line from his nomination acceptance speech, until it surprised everyone and became popular.

But Roosevelt had a supreme confidence, enormous energy, and a determination equal to that of George Washington and Abraham Lincoln. He quickly cribbed a line from Henry David Thoreau (“Nothing is so much to be feared as fear”), began fireside chats with the American people from a room with no fireplace, and started leading.

That’s what good leaders do.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is President and CEO of Frito-Lay International [retired] and earlier served as Chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].