Airplanes added economic, psychological factors to warfare

Alexander Leydenfrost’s oil on canvas Bombers at Night sold for $4,182 at a February 2010 Heritage auction.

By Jim O’Neal

In August 1945, a historic event occurred: Foreign forces occupied Japan for the first time in recorded history. It was, of course, the end of World War II and cheering crowds were celebrating in the streets of major cities around the world as peace returned to this little planet.

A major factor in finally ending this long, costly war against the Axis powers of Germany and Japan was the use of strategic bombing. An essential element was the development of the B-29 bomber – an aircraft not yet in service when Japan attacked Pearl Harbor in 1941, forcing a reluctant United States into a foreign war. Maybe it was hubris or fate, but the attack was a deeply flawed decision that would not end well for the perpetrators.

The concept of war waged from the air dates to the 17th century, when several writers speculated about when and where it would begin. The answer turned out to be the Italo-Turkish War (1911-12): On Oct. 23, 1911, an Italian pilot flew the first aerial reconnaissance mission. A week later, the first aerial bomb was dropped on Turkish troops in Libya. The Turks responded by shooting down an airplane with rifle fire.

As World War I erupted seemingly out of nowhere, the use of airplanes became more extensive. For the most part, however, the real war was still being waged on the ground by static armies. One bitter legacy of this particular war was frustration over the futility and horror of trench warfare, which most armies employed. Many experts knew, almost intuitively, that airplanes could play a role in reducing the slaughter, and a consensus evolved that they could best be used as tactical support for armies on the ground.

However, in the 20-year pause between the two great wars, aviation technology improved much faster than other categories of weaponry. Small arms, tanks, submarines and amphibious craft underwent only incremental changes, while the airplane benefited from increased civilian use and major improvements in engines and airframes. The conversion from wood to all-metal construction quickly spread to wings, crew positions, landing gear and even the lowly rivet.

As demand for commercial aircraft expanded rapidly, increased competition led to significant improvements in speed, reliability, load capacity and, importantly, increased range. Vintage bombers were phased out in favor of heavier aircraft with modern equipment. A breakthrough occurred in February 1932 when the Martin B-10 incorporated all the new technologies into a twin-engine plane. The new B-10 was rated the highest performing bomber in the world.

Then, in response to an Air Corps competition for multi-engine bombers, Boeing produced a four-engine model that had its inaugural flight in July 1935. It was the vaunted B-17, the Flying Fortress. Henry “Hap” Arnold, who went on to lead the U.S. Army Air Forces, declared it a turning point in American airpower. The AAF had created a genuine air program.

Arnold left active duty in February 1946 and saw his cherished dream of an independent Air Force become a reality the following year. In 1949, his five-star rank was converted to General of the Air Force, making him the only airman ever to hold that rank. He died in 1950.

War planning evolved with the technology, and in Europe, strategic long-range bombing was producing results. By destroying cities, factories and enemy morale, the Allies hastened the German surrender. The strategy was comparable to Maj. Gen. William Tecumseh Sherman’s “March to the Sea” in 1864, which added economic and psychological factors to sheer force. Air power was gradually becoming independent of ground forces and was generally viewed as a faster, cheaper strategic weapon.

After V-E Day, it was time to force the end of the war by compelling Japan to surrender. The island battles that led toward the Japanese mainland in the Pacific had ended after the invasion of Okinawa on April 1, 1945, and 82 days of horrific fighting that cost some 250,000 lives. This had been preceded by the March 9-10 firebombing of Tokyo, which killed 100,000 civilians, destroyed 16 square miles and left an estimated 1 million people homeless.

Now for the mainland … and the choices were stark and unpleasant: either a naval blockade and massive bombings, or an invasion. Based on experience, many believed the Japanese would never surrender; they were acutely aware of the “Glorious Death of 100 Million” campaign, designed to convince every inhabitant that an honorable death was preferable to surrendering to “white devils.” The bombing option had the potential to destroy the entire mainland.

The decision to use the atomic bomb on Hiroshima (Aug. 6) and Nagasaki (Aug. 9) led to Japan’s offer of surrender on Aug. 10, paving the way for Gen. Douglas MacArthur to gain agreement to an armistice and an 80-month occupation by the United States. Today, that decision still seems prudent, despite the fact that we had only the two atomic bombs. Japan now has the third-largest economy in the world at $5 trillion and is a key strategic partner of the United States in the Asia-Pacific region.

Now about those ground forces in the Middle East…

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

How Far Will We Go In Amending American History?

A collection of items related to the dedication of the Washington Monument went to auction in May 2011.

By Jim O’Neal

Four years ago, George Clooney, Matt Damon and Bill Murray starred in The Monuments Men, a movie about a group of almost 400 specialists commissioned to recover monuments, manuscripts and artwork looted during World War II.

The Germans were especially infamous for this, shipping long strings of railroad cars from all over Europe to German generals in Berlin. During their occupation of Paris, they nearly stripped the city of its fabled collections of works by the world’s greatest artists. Small caches of hidden art are still being discovered today.

In the United States, a new generation of anti-slavery groups is doing the exact opposite: lobbying to have statues and monuments removed, destroyed or relocated to obscure museums to gather dust out of the public eye. Civil War flags and memorabilia on display were among the first to disappear, followed by Southern generals and others associated with the war. Now, streets and schools are being renamed. Slavery has understandably been the reason for the zeal to erase the past, but at times the effort appears to be slowly moving up the food chain.

More prominent names like President Woodrow Wilson have been targeted; for several years, protesters have objected to the way Princeton University still honors Wilson, asserting that he was a Virginia racist. Last year, Yale removed John C. Calhoun’s name from one of its residential colleges because he was among the most vocal advocates of slavery, opening the path to the Civil War by defending South Carolina’s right to decide the slavery issue for itself (which is an unquestionable fact). Dallas finally got around to removing some prominent Robert E. Lee statues, although one of the forklifts broke in the process.

Personally, I don’t object to any of this, especially if it helps to reunite America. So many different things seem to end up dividing us even further and this only weakens the United States (“United we stand, divided we fall”).

However, I hope to still be around if (when?) we erase Thomas Jefferson from the Declaration of Independence and are left with only George Washington and his extensive slaveholding (John Adams did not own slaves, and Massachusetts was probably the first state to outlaw the practice).

It would seem relatively easy to rename Mount Vernon, or even Washington, D.C., the nation’s capital. But the Washington Monument may be an engineering nightmare. The Continental Congress proposed a monument to the Father of Our Country in 1783, even before the treaty conferring American independence had been received, to honor his role as commander-in-chief during the Revolutionary War. But when Washington became president, he canceled it, since he didn’t believe public money should be used for such honors. (If only that ethos were still around.)

But the idea for a monument resurfaced on the centennial of Washington’s birth in 1832 (Washington died in 1799). A private group, the Washington National Monument Society – headed by Chief Justice John Marshall – was formed to solicit contributions. However, they were not sophisticated fundraisers, since they limited gifts to $1 per person per year. (These were obviously very different times.) The restriction was compounded by the economic depression that gripped the country after the Panic of 1837, and the cornerstone was not laid until July 4, 1848. An obscure congressman by the name of Abraham Lincoln was in the cheering crowd.

Even by the start of the Civil War 13 years later, the unsightly stump was still only 170 feet high, a far cry from the 600 feet originally projected. Mark Twain joined the chorus of critics: “It has the aspect of a chimney with the top broken off … It is an eyesore to the people. It ought to be either pulled down or built up and finished.” Finally, President Ulysses S. Grant got Congress to appropriate the money; work resumed, and the monument opened in 1888. At 555 feet, it was then the tallest structure in the world … a record eclipsed the following year when the Eiffel Tower was completed.

For me, it’s an impressive structure, with its sleek marble silhouette. I’m an admirer of the simplicity of plain, unadorned obelisks, since there are so few of them (only two in Maryland that I’m aware of). I realize others consider it on a par with a stalk of asparagus, but I’m proud to think of George Washington every time I see it.

Even so, if someday someone thinks it should be dismantled as the last symbol of a different period, they will be disappointed when they learn of all the other cities, highways, lakes, mountains and even a state that remain to go. Perhaps we can find a better use for all of that passion, energy and commitment and start rebuilding a crumbling infrastructure so in need of repairs. One can only hope.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Navajo Code Talkers Represented One of the Boldest Gambits of World War II

A gelatin silver print of Raising the Flag on Mount Suribachi, Iwo Jima, 1945, signed by photographer Joe Rosenthal, sold for $7,500 at an October 2016 Heritage auction.

By Jim O’Neal

Feb. 23, 1945, was a dramatic day in World War II: Six Marines raised the American flag on Mount Suribachi, an early and hard-won victory in the battle for Iwo Jima. Associated Press photographer Joe Rosenthal was there, and his photo “Raising the Flag on Iwo Jima” won him the Pulitzer Prize for Photography in 1945 – the only photograph ever to win the prize in the same year it was taken.

One of the Marines who hoisted the flag, Ira Hamilton Hayes, portrayed himself in the 1949 movie Sands of Iwo Jima, which was nominated for four Academy Awards. It starred John Wayne, who received his first Academy Award nomination for Best Actor. He would have to wait 20 years to actually win that award, for True Grit. Sadly, Hayes, an American Indian, died in 1955 at the tender age of 32 from alcohol-related causes.

The battle for Iwo Jima was the first U.S. attack on the Japanese home islands, and its defenders fought tenaciously for what they saw as the first stepping stone to the mainland. Of the 21,000 Japanese troops dug into tunnels and heavily fortified positions, 19,000 were killed, having made a sacred commitment to fight to the death. There were numerous reports of soldiers committing suicide rather than surrendering, although in one curious case, several hid in caves for years before finally giving up. The battle for the entire island lasted from Feb. 19 until March 26, and it was considered a major strategic victory.

The first word of this momentous news crackled over the radio in odd guttural noises and complex intonations. Throughout the war, the Japanese had been repeatedly baffled and infuriated by these bizarre sounds, which conformed to no linguistic system known to Japanese language experts. The curious sounds were the one form of U.S. military communication that master cryptographers in Tokyo were never able to decipher.

This seemingly perfect code was the language of the American Navajo Indian tribe. Its application in WWII as a clandestine system of communication was one of the 20th century’s best-kept secrets. After a string of cryptographic failures, the military in 1942 was desperate for lines of communication among troops that would not be easily intercepted. In the 1940s, there was no such thing as a “secure line.” All talk had to go out over the public airwaves. Standard codes were an option, but cryptographers in Japan had become adept at quickly cracking them. And there was another problem. The Japanese were also proficient at intercepting short-distance communications – walkie-talkies for example – and then having well-trained English-speaking soldiers either sabotage the message or send out false commands to set up an ambush.

That was the situation in 1942 when the Pentagon authorized one of the boldest gambits of the war by recruiting Navajo code talkers. Because the Navajo language lacked technical terms for modern weaponry, the men coined a number of neologisms specific to their task and their war. Thus, the term for a tank was “turtle,” a battleship was “whale,” a hand grenade was “potato” and plain old bombs were “eggs.”
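For readers curious how such a substitution lexicon works in principle, here is a minimal sketch in Python. Only the four term/code-word pairs above come from this column; the function names and the English stand-ins for spoken Navajo words are purely illustrative, and the real code was far richer (and spoken over the air, not written).

```python
# Toy illustration of a substitution lexicon like the one described above.
# Only the four term/code-word pairs come from the column; everything else
# is hypothetical scaffolding, not the actual Navajo code.
LEXICON = {
    "tank": "turtle",
    "battleship": "whale",
    "hand grenade": "potato",
    "bomb": "egg",
}
REVERSE = {code: term for term, code in LEXICON.items()}

def encode(message: str) -> str:
    """Replace each known military term with its code word.

    Naive substring replacement is fine for a toy demo, though a real
    system would work word by word (and in another language entirely).
    """
    for term, code in LEXICON.items():
        message = message.replace(term, code)
    return message

def decode(message: str) -> str:
    """Invert the substitution for a listener who holds the lexicon."""
    for code, term in REVERSE.items():
        message = message.replace(code, term)
    return message

if __name__ == "__main__":
    order = "send one tank and one battleship"
    sent = encode(order)   # "send one turtle and one whale"
    print(sent)
    print(decode(sent))    # recovers the original order
```

The point of the sketch is the design choice the passage describes: because the lexicon existed only in the memories of trained speakers of an unwritten language, an eavesdropper heard valid Navajo words ("turtle," "whale") with no codebook to capture or crack.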

It didn’t take long for the original 29 recruits to expand to an elite corps of Marines, numbering 425 Navajo code talkers, all from the American Southwest. The talkers were so valuable that they traveled everywhere with personal bodyguards. In the event of capture, they had all agreed to commit suicide rather than allow America’s most valuable tool to fall into the hands of the enemy. If a captured Navajo didn’t follow that grim instruction, the bodyguard was told to shoot and kill the code talker.

Their mission and every detail of their messaging was a secret not even their families knew about. It wasn’t until 1968, when the military was convinced the code would not be needed again, that America learned about the incredible contributions a handful of American Indians made to winning history’s biggest war. The Navajo code talkers, sending and receiving as many as 800 error-free messages a day, were widely credited with giving troops the decisive edge at Guadalcanal, Tarawa, Saipan, Iwo Jima and Okinawa.

Semper fi.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Yes, George C. Marshall Earned Title of ‘Greatest Living American’

A photograph of General George C. Marshall, signed, went to auction in October 2007.

By Jim O’Neal

In Harvard Yard, a venue carefully chosen as dignified and non-controversial, Secretary of State George C. Marshall’s 15-minute speech on June 5, 1947, painted a grim picture for the graduates. With words crafted and refined by the most brilliant minds in the State Department, Marshall outlined the “continuing hunger, poverty, desperation and chaos” in a Europe still devastated after the end of World War II.

Marshall, one of the greatest Secretaries of State the United States has ever produced, asserted unequivocally that it was time for a comprehensive recovery plan. The only caveat was that “the initiation must come from Europe.” His words were much more than typical boilerplate commencement rhetoric, and Great Britain’s wily Foreign Minister Ernest Bevin heard the message loud and clear. By July 3, he and his French counterpart, Georges Bidault, had invited 22 nations to Paris to develop a European Recovery Program (ERP).

Bevin had been alerted to the speech’s importance by Dean Acheson, Marshall’s Under Secretary of State. Acheson, point man for the old Eastern establishment, had already done a masterful job of laying the groundwork, making the public aware that European cities still looked as if the bombs had just stopped falling, that ports were still blocked, and that farmers were hoarding crops because they couldn’t get a decent price. Further, the Communist parties of France and Italy (on direct orders from the Kremlin) had launched waves of strikes, destabilizing already shaky governments.

President Harry S. Truman was adamant that any assistance plan be called the Marshall Plan, honoring the man he believed to be the “greatest living American.” Yet much of Congress still viewed it as “Operation Rat Hole,” pouring money into an untrustworthy socialist blueprint.

The Soviets and their Eastern European satellites refused an invitation to participate. Then, in February 1948, Joseph Stalin’s vicious coup in Prague crushed Czechoslovakia’s coalition government, which inspired speedy passage of the ERP. This dramatic action marked a significant step away from the FDR-era policy of non-commitment in European matters, especially expensive aid programs. The Truman administration had pragmatically accepted a stark fact: The United States was the only Western country with any money after WWII.

Shocked by reports of starvation in most of Europe and desperate to bolster friendly governments, the administration offered huge sums of money to any democratic country in Europe able to develop a plausible recovery scheme – even those in the Soviet sphere of influence – despite the near-maniacal resistance of the powerful and increasingly paranoid Stalin.

Without hesitation, on April 14, the freighter John H. Quick steamed out of Texas’ Galveston Harbor, bound for Bordeaux with 9,000 tons of American wheat. Soon, 150 ships were shuttling across the Atlantic carrying food, fuel, industrial equipment and construction materials – the essentials for rebuilding entire countries. The Marshall Plan’s most impressive achievement was its inherent magnanimity, for its very success returned Europe to a competitive position with the United States!

Winston Churchill wrote, “Many nations have arrived at the summit of the world, but none, before the United States, on this occasion, has chosen that moment of triumph, not for aggrandizement, but for further self-sacrifices.”

Truman may have been right about this greatest living American and his brief speech that altered a ravaged world and changed history for millions of people – who may have long forgotten the debt they owe him. Scholars are still studying the brilliant tactics involved.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Sanctions Didn’t Stop Germany from Roaring Back After WWI

A 1939 political cartoon by Charles Werner (1909-1997) for Time magazine comments on the worldwide mood 20 years after the Treaty of Versailles. The original art sold for $836 at a February 2006 Heritage auction.

By Jim O’Neal

From 1939 to the winter of 1941, the German military won a series of battles rarely equaled in the history of warfare. In rapid succession, Poland, Norway, France, Belgium, Holland, Yugoslavia, Denmark and Greece all fell victim to the armed forces of the Third Reich. In the summer and fall of 1941, the USSR came close to total defeat at the hands of the Wehrmacht, losing millions of soldiers on the battlefield and witnessing the occupation of large portions of Russia and Ukraine. The German air force, the Luftwaffe, played a central role in this remarkable string of victories.

It was even more startling to the countries that had fought in WWI and imposed draconian anti-war measures when it ended. This was simply something that was NEVER supposed to happen again, much less a mere 20 years later. How was it even possible?

The Allied powers had been so impressed with the combat efficiency of Germany’s air arm in WWI that they made a concerted effort to eliminate the country’s capability to wage war in the air, then crippled its civilian aviation capability just to be certain. The Allies demanded the immediate surrender of 2,000 aircraft and the rapid demobilization of the air service. Then, in May 1919, the Germans were forced to surrender vast quantities of aviation materiel, including 17,000 more aircraft and engines. Germany was permanently forbidden from maintaining a military or naval air force.

No aircraft or parts were to be imported and, in a final twist of the knife, Germany was not allowed to control its own airspace: Allied aircraft were granted free passage over Germany and unlimited landing rights. On May 8, 1920, the German air service was officially disbanded.

Other provisions of the Versailles Treaty dealt with the limits of the army and navy, which were denied tanks, artillery, poison gas, submarines and other modern weapons. Germany was to be effectively disarmed and rendered militarily helpless. An Inter-Allied Control Commission was given broad authority to inspect military and industrial installations throughout Germany to ensure compliance with all restrictions.

However, one critical aspect got overlooked in the zeal to impose such a broad set of sanctions. Left unsupervised was one of the most influential military thinkers of the 20th century: Hans von Seeckt, who would go on to command the postwar German army. He correctly analyzed the operational lessons of the war and accurately predicted the direction future wars would take. While Allied generals clung to outdated principles like using overwhelming force to overcome defensive positions, von Seeckt saw that maneuver and mobility would be the primary means of future warfare. Mass armies would become cannon fodder, and trench warfare would not be repeated.

The story of the transformation of the Luftwaffe is a fascinating one. Faced with total aerial disarmament in 1919, it was reborn only 20 years later as the most combat-effective air force in the world. German concepts of future air war, along with training and equipment, totally trumped an opposition that was looking backward … always fighting the last war.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

G.I. Bill Crucial to Creation of our ‘Greatest Generation’

Illustrator Mort Künstler’s depiction of D-Day, which began the liberation of German-occupied northwestern Europe, went to auction in May 2017.

By Jim O’Neal

By 1944, it was clear that World War II would end the following year and America had a difficult question to answer: What to do with the 16.35 million men and women serving in the armed forces when they came home from the war?

One estimate from the Department of Labor was that up to 15 million of them would be unemployed, since the winding-down economy would not be able to absorb them, especially in an orderly fashion. A similar post-war combination of lower production and a bulge of returning veterans had produced a sharp depression after WWI, from 1920 to 1921. To further complicate things, the world was in worse economic shape after the devastation this war had produced. The government had tried a cash bonus program after WWI, and it failed so miserably that many Americans were angry for the next decade.

President Franklin Roosevelt was well aware of the potential implications and determined to avoid a repeat performance. He proactively took to the nation’s airwaves, proposing a series of benefits for all the men and women who had sacrificed so much for the country. The veterans’ self-appointed lobby, the American Legion, grabbed onto the proposal with both hands – as did Hearst newspapers. Legion publicist Jack Cejnar came up with the term the “G.I. Bill of Rights,” officially passed as the Servicemen’s Readjustment Act of 1944.

Returning veterans could borrow up to $2,000 to buy a house, start a business or start a farm. They would receive $20 a week for up to 52 weeks while they looked for a job. There would be lifelong medical assistance, improved services for those disabled in action, and a de facto bonus of $1,300 in discharge benefits.

The effect of the program was substantial and immediate. By 1955, 4.3 million home loans worth $33 billion had been granted. Veterans were responsible for 20 percent of all new homes built after the end of the war. Instead of another depression, the country enjoyed unparalleled prosperity for a generation.

However, few veterans bothered to collect their $20-a-week unemployment checks. Instead, they used the money for the most significant benefits of all: education and vocational training. Altogether, 7.8 million vets received education and training benefits. Some 2.3 million went to college, receiving $500 a year for books and tuition, plus $50 a month in living expenses. The effect was to transform American education and help create a middle class.

College was sheer bliss to men used to trenches and K-rations. By 1946, over half the college enrollments in the country were vets, who bonded into close, supportive communities within the wider campuses. Countless G.I. Bill graduates would go on to occupy the highest ranks of business, government and the professions, and even win Nobel Prizes.

The number of degrees awarded by U.S. colleges and universities more than doubled between 1940 and 1950, and the percentage of Americans with bachelor’s degrees or higher rose from 4.6 percent in 1945 to 25 percent a half-century later. Joseph C. Goulden writes in The Best Years, 1945-1950 that the G.I. Bill “marked the popularization of higher education in America.” After the 1940s, a college degree was considered an essential passport for entrance into much of the business and professional world.

Thanks to the G.I. Bill, a successful entrance into that world was opened to the millions of men and women who kept our world free and assured its future. Along the way, they also helped rebuild a world that had been ravaged.

I offer you the Greatest Generation!

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Bataan Death March a Cruel Episode of an Already-Brutal War

The 1945 film Back to Bataan starring John Wayne tells the story of the U.S. Army Ranger raid at the Cabanatuan prisoner-of-war camp.

By Jim O’Neal

Last month marked the 75th anniversary of the Japanese attack on Pearl Harbor (Dec. 7, 1941), which brought the United States into World War II.

The Japanese war plan assumed that a quick strike that disabled American naval forces would deter the United States from interfering with their strategic objective of conquering Asia and acquiring rich natural resources.

They predicted a surprise victory would preclude a declaration of war and keep us focused on Europe, where Nazi Germany was on a rampage. The primary target was the Pacific Fleet, which included aircraft, battleships and aircraft carriers. They intentionally ignored the fuel depots and maintenance facilities since they would become superfluous (wrong!).

Ironically, U.S. war plans included a proviso “to avoid charging across the Pacific” … in stark contrast to the core Japanese rationale. Further, the Pacific Fleet’s three aircraft carriers (Enterprise, Lexington and Saratoga) were away from Pearl Harbor and escaped damage. So, quite perversely, these assets – the carriers, their aircraft and all the supporting infrastructure – were precisely what we used to respond. “Remember Pearl Harbor” became the rallying cry that gave Congress the cover to declare war, something the American public had opposed until the attack.

Hours later, in a less-familiar episode, the Japanese also began bombing the U.S. possessions of the Philippines and Guam. General Douglas MacArthur was in Manila the day the bombing started – in his cozy suite at the Manila Hotel – and inexplicably failed to act on the warning he had received hours before. He then relocated to the island of Corregidor in the mouth of Manila Bay, where he remained from December until March 1942, when FDR ordered him to Australia for his safety.

Bataan is a peninsula in the Philippines between Manila Bay and the South China Sea. It is a mountainous, hot, densely jungled place. It is also the location of one of the worst American defeats in WWII. On April 9, 1942, U.S. and Filipino forces on Bataan surrendered unconditionally to the Japanese after months of bombing and an invasion.

What followed was the infamous Bataan Death March.

More than 70,000 already-weakened Allied POWs were forced to walk more than 60 miles to Japanese prison camps; many were sent to the Cabanatuan prison camp in central Luzon. Thousands died en route from sickness, dehydration and the murderous acts of their Japanese captors. Conditions at the camp are almost too gruesome to repeat.

In addition to the ordinary conditions of malaria, dysentery, scurvy, pellagra, beriberi and rickets, the long-term effects of vitamin and mineral deprivation exposed an abyss of human physiology. When the last phantom residues burned away, prisoners lost their voices, hair, eyes, teeth and hearing. Even their skin fell off. It was a pseudo-human medical freak show.

Finally, after nearly three years of torturous living conditions, 121 hand-picked troops from the elite U.S. Army 6th Ranger Battalion slipped behind enemy lines in January 1945 and rescued the 513 American and British POWs still alive at Cabanatuan. It had been a long three years for these survivors, and it is almost miraculous that any made it.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Concerns Over Harry Truman Vanished as New President Exerted His Leadership

A 1945 White House press release signed by Harry S. Truman as president announcing the bombing of Hiroshima realized $77,675 at an October 2010 Heritage auction.

By Jim O’Neal

In February 1945, Franklin Delano Roosevelt traveled to Yalta, in the Crimea, to discuss plans for peace with Winston Churchill and Joseph Stalin. He reported to Congress that plans had been arranged for an organizational meeting of the United Nations on April 25, 1945. He said, “There, we all hope, and confidently expect, to execute a definite charter of organization under which the peace of the world will be preserved and the forces of aggression permanently outlawed.”

Upon his return, he looked tired and older than his 63 years. Late in March, he went to Warm Springs, Ga., for an overdue rest. On April 12, 1945, he was working at his desk while an artist painted his portrait when he suddenly complained of “a terrible headache.” A few hours later, at 3:35 p.m., he died of a cerebral hemorrhage. The last words he had written were: “The only limit to our realization of tomorrow will be our doubts of today. Let us move forward with strong and active faith.”

Truman

His successor, Harry S. Truman – the first president to take office in the midst of a war – said he felt “like the moon, the stars and all the planets had fallen on me.” The nation and the world wondered whether he was capable of taking Roosevelt’s place. His background and even his appearance added to the nervous uncertainty. He was the first president in 50 years without a college education. He spoke the language of a Missouri dirt farmer and a World War I artilleryman – both of which he had been. Instead of talking like a statesman, he looked like a bank clerk or a haberdasher – both of which he had also been. And worst of all, everyone knew that for more than 20 years he had been a lieutenant of Tom Pendergast, one of the most corrupt political bosses in the country.

What most people didn’t know was that he was scrupulously honest, knew his own mind and was one of the most knowledgeable students of history ever to enter the White House. Importantly, he understood the powers of the president, and knew why some men had been strong chief executives and others had been weak leaders.

When he learned about the atomic bomb, there was no soul-searching or handwringing debates. He ordered it dropped on Japan because he was sure it would save American lives and quickly end World War II. It did not bother him in the least that years later, intellectuals would question whether one man should have made such an awesome decision alone. He knew in his heart that he was right … period.

Two of his well-known sayings capture the essence of “Give ’em Hell” Harry Truman: “The Buck Stops Here” (a sign on his desk) and my favorite … “If you can’t stand the heat, stay the hell out of the kitchen!”

Leaders get paid to make tough decisions.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Siege of Leningrad was Devastating for Russian People

Nicolai Fechin’s Russian Girl, an oil on canvas laid on masonite, sold for $109,375 at a November 2015 Heritage auction.

By Jim O’Neal

The suffering brought on by World War II was enormous, but when the total picture is considered there is little doubt that the greatest pain was borne by the people who lived within the grasp of the century’s most vicious tyrants: Joseph Stalin and Adolf Hitler.

While Americans were busy managing the factories that made them the “Arsenal of Democracy” and focusing on Japan, the people of Central Europe and Western Russia were in a life-and-death struggle fought on the very streets of their cities.

Throughout the winter of 1941-42 and onward for nearly 900 days, the people of Leningrad suffered dreadfully. Concerned that his German army might take enormous losses in an all-out assault, Hitler ordered a blockade of the city. By starving its 3 million people, he hoped to break Russian morale and force a surrender.

Since Leningrad was closed off on the west by the Baltic Sea, on the east by the 80-mile-wide Lake Ladoga and on the north by the Finnish army, the Wehrmacht needed only to seal the southern flank to isolate the city. But even as the Germans closed ranks around them and began bombing warehouses and supply routes, the hardy citizens showed they would not be so easily defeated. Volunteers built thousands of air-raid shelters and pillboxes, and cut down trees to block the Germans’ path.

By late December 1941, Leningrad was down to a two-day supply of flour, and people made bread from cellulose, sawdust and floor sweepings of flour. Animal feed became human food, weeds were boiled into soup, and the dead were hidden so families could keep drawing their daily rations. Some 53,000 people perished that month, and by February another 200,000 would join them.

Somehow the city hung on.

Then came a breakthrough. Scientists discovered that Lake Ladoga had frozen deeply enough to support truck traffic, and convoys cautiously began crossing the “Road of Life.” In the first seven days, 40 trucks sank to the bottom, but dozens of others made it and returned with precious food. Women and children were evacuated, and the city limped along in darkness and silence; there was no oil to light the lamps, and even the birds were dead. In fact, every creature – living or dead, including the human corpses in the gutters – had been picked over by the hungry hordes.

Leningrad Radio, broadcasting off the generator of a ship frozen in the river, aired the sound of a metronome between programs to let listeners know the city was not dead – yet. By the time Leningrad was liberated in January 1944, nearly 1 million people had died.

There were more civilians dead than in any city, in any war, in the history of mankind.

During this siege, Hitler became obsessed with conquering Stalingrad, and that proved to be a fatal mistake that cost him the war. The onetime corporal from a Bavarian regiment proved to be a poor general.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

For a Moment, It Seemed Warfare as We Know it Was in Its Final Days

An original 1991 Desert Storm editorial cartoon by Bill Mauldin for the Chicago Sun-Times realized $418.25 at a November 2014 Heritage auction.

By Jim O’Neal

When it comes to naming military campaigns, few compare with “Desert Storm.” Besides its obvious evocations of sand-blown landscapes, the name could also work as the title of a pulp novel or B movie, even a video game. In early 1991, more than two dozen allied nations began an assault on Iraq in an attempt to drive its forces from neighboring Kuwait.

It was a classic military rout.

In just over 40 days of American air attacks, thousands of high-tech bombs (precision-guided munitions) rained down on Iraqi positions; fewer than 100 hours of ground fighting followed. Enemy troops were driven back to Baghdad and into international humiliation.

For the United States, the war was the first major conflict since the debacle in Vietnam, and the American public entered an anguished debate as President George H.W. Bush pushed for congressional approval. Who could know whether Iraq would become to the ’90s what Vietnam had been to the ’60s and ’70s?

Still, there was no denying these were different times. Among the allies standing with the U.S. against Saddam Hussein’s seizure of oil-rich Kuwait was the Soviet Union – the first time since World War II that Americans and Soviets stood on the same side of a war. It also positioned the allied nations as a quasi-international police force stopping acts of raw aggression.

World War I had advanced combat into the sphere of mechanized warfare. World War II had taken technology even further and made civilians targets. Now, in Iraq, computer technology advanced both the tools and the strategy until it resembled science fiction. Beginning with the launch of a Tomahawk missile from the deck of the USS Wisconsin on Jan. 17, 1991, Baghdad became the site of one of the most devastating air raids in history.

There was now no doubt that warfare had entered a new epoch. With satellites mapping the globe it seemed possible war would soon become as simple as deleting a computer file – scanning a battlefield, identifying a target and systematically destroying it.

It was a clean war, precise and efficient, fought so fast it hardly demanded attention. There were few American losses (148 dead vs. an estimated 200,000 Iraqis) and undeniable results … Iraq was out of Kuwait. Plus, we could tune in to CNN for the latest updates during an occasional coffee break.

The world was finally coming to its senses and if someone committed an act of aggression, it would only take a few coordinated responses to restore harmony. Finally, we could channel our energy and resources to eliminating disease, world hunger and a thorough cleansing of the air and oceans.

War was such a dumb idea. Why did it take us so long to recognize what a waste it was? The new millennium was impatiently waiting for us to get a fresh start.

Sigh.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].