As a ‘champion’ of the working man, Marx lived the high life

A second edition of the first volume of Karl Marx’s Das Kapital (Hamburg: Otto Meissner, 1872) sold for $3,500 at a March 2018 Heritage auction.

By Jim O’Neal

When Ho Chi Minh lived in London, training as a pastry chef under Auguste Escoffier at the Carlton Hotel, he reportedly used a copy of Das Kapital as a pillow. Fidel Castro claimed he read 370 pages (about half) in 1953 while in prison after his failed revolutionary attack on the Moncada Barracks in Santiago de Cuba. President Xi Jinping of China hailed its author as “the greatest thinker of modern times.”

It’s been 200 years since Karl Marx was born on May 5, 1818, in Trier, Germany. His book Das Kapital was published in 1867, or at least that was when Volume 1 made its way into print. His friend and benefactor Friedrich Engels edited Volumes 2 and 3 after Marx’s death.

Karl Marx

Engels (1820-1895) was born in Prussia, dropped out of high school and eventually made it to England to help run his father’s textile factory in Manchester. Along the way, he met Marx for the first time, but it would be a while before their friendship blossomed. Perhaps the catalyst was Engels’ 1845 book The Condition of the Working Class in England.

He had observed the slums of Manchester, the horrors of child labor, the utter impoverishment of laborers in general and the environmental squalor that was so pervasive. This was not a new indictment, since Thomas Robert Malthus (1766-1834) had written, albeit anonymously, about these abysmal conditions. However, Malthus had blamed the poor for their plight and opposed the concept of relief “since it simply increases their tendency to idleness.” He was particularly harsh on the Irish, writing that a “great part of the population should be swept from the soil.”

Not surprisingly, mortality rates soared, especially for the poor, and average life expectancy fell to an astonishing 18.5 years. Lifespans that short had not been seen since the Bronze Age; even in the healthiest areas, life expectancy was only in the mid-20s, and nowhere in Britain did it exceed 30 years.

Life expectancy had largely been guesswork until Edmond (the Comet) Halley obtained a cache of unusually complete birth and death records from the city of Breslau (in modern-day Poland) in 1693. Ever the tireless investigator of any and all scientific data, he realized he could calculate the odds of survival, and thus the life expectancy, of a person at any age. From these records he created the very first actuarial tables. Among their many other uses, this is what made life insurance viable as an industry.
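
To make the arithmetic concrete, here is a minimal sketch of how remaining life expectancy can be read out of a survivor table of the kind Halley built. The survivor counts and the helper function below are invented for illustration only; they are not Halley’s Breslau figures or his exact method.

```python
# Illustrative only: hypothetical survivor counts (members of a birth cohort of
# 1,000 still alive at each age), not Halley's actual Breslau data.
survivors = {0: 1000, 10: 661, 20: 598, 30: 531, 40: 445,
             50: 346, 60: 242, 70: 131, 80: 34, 90: 0}

def remaining_life_expectancy(age):
    """Estimate the average years of life left at `age`, assuming deaths
    within each interval occur, on average, at its midpoint."""
    ages = sorted(survivors)
    if age not in survivors or survivors[age] == 0:
        raise ValueError("age must be a table age with living survivors")
    person_years = 0.0
    for lo, hi in zip(ages, ages[1:]):
        if lo < age:
            continue
        deaths = survivors[lo] - survivors[hi]
        person_years += survivors[hi] * (hi - lo)   # full interval lived by those who survive it
        person_years += deaths * (hi - lo) / 2      # half an interval credited to those who die in it
    return person_years / survivors[age]

print(round(remaining_life_expectancy(20), 1))  # expected further years for a 20-year-old
```

The same calculation, done with quill and ledger, is what allowed Halley and, much later, insurers to put a price on an annuity or a life policy for a customer of any given age.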

One of the few who sympathized with the poor was the aforementioned Friedrich Engels, who spent his time embezzling funds from the family business to support his collaborator Karl Marx. They both passionately blamed the industrial revolution and capitalism for the miserable conditions of the working class. While diligently writing about the evils of capitalism, both men lived comfortably from the benefits it provided them personally. To label them as hypocrites would be far too mild a rebuke.

There was a stable of fine horses, weekends spent fox hunting, slurping the finest wines, a handy mistress, and membership in the elite Albert Club. Marx was an unabashed fraud, denouncing the bourgeoisie while living in excess with his aristocratic wife and his two daughters in private schools. In a supreme act of deception, he accepted a job in 1851 as a foreign correspondent for Horace Greeley’s New-York Tribune. Due to his poor English, he had Engels write the articles while he cashed the checks.

Even then, Marx’s extravagant lifestyle couldn’t be maintained and he convinced Engels to pilfer money from his father’s business. They were partners in crime while denouncing capitalism at every opportunity.

In the 20th century, Eugene Victor Debs ran for U.S. president five times as the candidate of the Socialist Party of America, the last time (1920) from a prison cell in Atlanta while serving time after being found guilty of 10 counts of sedition. His 1926 obituary told of him having a copy of Das Kapital and how “the prisoner Debs read it slowly, eagerly, ravenously.”

In the 21st century, Senator Bernie Sanders of Vermont ran for president in 2016, despite the overwhelming odds at a Democratic National Convention that used superdelegates to select his Democratic opponent. In a series of televised debates, he predictably promised free healthcare for all, a living wage for underpaid workers, free college tuition and other “free stuff.” I suspect he will be back in 2020 due to overwhelming support from Millennials, who seem to like the idea of “free stuff,” but he may face 10 to 20 other presidential hopefuls who’ve noticed that energy and enthusiasm.

One thing: You cannot call Senator Sanders a hypocrite like Karl Marx. In 1979, Sanders produced a documentary about Eugene Debs, and he hung Debs’ portrait in Burlington, Vt., City Hall when he became the city’s mayor after running as a Socialist.

As British Prime Minister Margaret Thatcher once said: “The problem with Socialism is that eventually you run out of other people’s money.”

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

‘Sesame Street’ inspired by the simple idea of making the world a better place

The original illustration for the July/August 1980 cover of Sesame Street Magazine, by Rick Brown, sold for $2,390 at a December 2017 Heritage auction.

By Jim O’Neal

Can you tell me how to get to Sesame Street?

It’s a question millions of children have been asking every day for the past 50 years. In a single television season, 1969, Sesame Street became the most famous street in America, eclipsing Wall Street, Madison Avenue and even (every town has one) Main Street. So how did we get here in the first place?

Television producer Joan Ganz Cooney recalls a “little dinner party” in 1966 with her husband, Tim Cooney, and Lloyd Morrisett – an executive at the Carnegie Corporation – who asked an innocent but sincere question: “Do you think television can be used to teach young children?” In his book Street Gang: The Complete History of Sesame Street, author Michael Davis writes about Morrisett watching his young daughter sit in front of their TV staring at a test pattern, a girl who nonetheless knew popular TV ad jingles from memory. That sparked a lightbulb question: “What else could she learn and remember?”

Joan Ganz Cooney (b.1929) had already become interested in working for educational television through her work as a documentary producer for WNET, New York’s first educational TV station. She thought about Morrisett’s dinner-party question and answered, “I don’t know, but I’d like to talk about it.” As it turned out, they did a lot more than simply talk.

She claims none of them had any idea when they started that Sesame Street and the Children’s Television Workshop (CTW) would grow into the international institutions they are today. Or that Sesame Street, winner of more Emmys than any other show (167 Emmys and eight Grammys), would also become the longest street in the world, benefiting more children in more countries than any program in history. None of them had any idea that the characters – wonderful, zany and vulnerable Muppets that taught children about letters, numbers and getting along – would become an integral part of American culture, or that they were creating a family that every child watching would feel a part of.

They only knew for sure that they wanted to make a difference in the lives of children and families, particularly low-income children. When they began their work in the late 1960s, this seemed eminently possible. It was a time when many believed they had the responsibility and power to make the world a better place, even if only a little better. Note: Some of us are still absolutely certain this will happen. For any doubters, just be aware that for the first time in recorded history, more than 50 percent of the world’s population is now living above the poverty line!

Sesame Street’s goal was more modest (and only in retrospect, revolutionary): to use television to help children learn (full stop)!

They knew children watched a great deal of television in the preschool years. They were well aware that children loved cartoons, game shows and situation comedies; that they responded to slapstick humor, music with a beat and, above all (sadly), fast-paced, oft-repeated commercials. If they could create an educational show that capitalized on some of commercial television’s most engaging traits – high production values, sophisticated writing – they could attract a sizeable audience that included, most importantly, low-income children.

There was a lot of betting against them because so many public television stations were on hard-to-receive UHF channels, especially in pre-cable days. But their basic instincts were almost impeccable. The wasteland was too vast and the yearning for something better too great. Sesame Street was an instant hit and remains so today.

On a personal note, I don’t recall ever watching it (I was 32 when it debuted), but I recall that Big Bird is 8-foot-2 and I knew Joan Ganz’s second husband, Peter G. Peterson, whom she married in 1980. He was an absolutely remarkable man who was CEO of Bell & Howell and co-founder of the Blackstone Group with Stephen Schwarzman in 1985, when I was heavily involved in minority business development for Frito-Lay. He was the one who first made me aware of the implications of the national debt and the cruel burden we are passing to the next generation. Peterson’s book, Running on Empty, is still a classic, even as the situation has only gotten worse. The only one who doesn’t seem to get it is New York Times columnist and Nobel Prize-winner Paul Krugman, who could easily be a character on Sesame Street if he ever learns his multiplication tables.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Childhood has always been tough, but let’s not go too far

A 1962 first edition, first printing of The Guns of August, signed by author Barbara Tuchman, sold for $625 at an April 2013 Heritage auction.

By Jim O’Neal

Simply mentioning the name Barbara Tuchman brings back fond memories of her wonderful book The Guns of August, which won the Pulitzer Prize in 1963. It brilliantly explains the complicated political intrigue that started innocuously in the summer of 1914 and then erupted into the horrors of the First World War. It is always a good reminder of how easily our foreign entanglements can innocently provoke another war, though survivors today would call it the Last World War.

Tuchman (1912-89) won a second Pulitzer for her biography of General “Vinegar Joe” Stilwell and his travails in China-Burma-India during World War II. He was faced with trying to manage/control aviator Claire “Old Leatherface” Chennault and his famous Flying Tigers. He also had to contend with Chiang Kai-shek, who went on to serve as president of the Republic of China on Taiwan (1950-75). This dynamic duo conspired to have General Stilwell removed and finally succeeded by badgering a tired and ailing FDR.

However, it was a totally different award-winning book Tuchman published in 1978 that was much more provocative and controversial (at least to me). In A Distant Mirror: The Calamitous 14th Century, she writes, “Of all the characteristics in which the medieval age differs from the modern, none is so striking as the comparative absence of interest in children.” She concluded, chillingly, that “a child was born and died and another took its place.”

Tuchman asserted that investing love in young children was so risky, so unrewarding, that everywhere it was seen as a pointless waste of energy. I politely refuse to accept that our ancestors were ever so jaded and callous. Surely, there was at least a twinge of sorrow, guilt or emotion.

Earlier, the French author Philippe Ariès, in his Centuries of Childhood, had made the remarkable claim that until the Victorian Age “the concept of childhood did not exist.” There is no doubt that children once died in great numbers. One-third died in their first year and 50 percent before their 5th birthday. Life was full of perils from the moment of conception, and the most dangerous milestone was birth itself, when both child and mother were at risk from a veritable catalog of dangerous practices and harmful substances.

And it was not just happening to poor or needy families. As Stephen Inwood notes in A History of London, death was a visitor in the best of homes and cities. English historian Edward Gibbon (The History of the Decline and Fall of the Roman Empire) lost all six of his siblings while growing up in the 1700s in Putney, a rich, healthy suburb of London. In his autobiography, Gibbon describes himself as “a puny child, neglected by my mother, starved by my nurse.” So death was apparently indifferent in choosing between rich and poor; but that is a far cry from not having a childhood per se.

To extend this into a claim that mothers became totally indifferent to young children because infant mortality was so high that it made little sense to invest emotionally in them defies logic. Obviously, parents adjusted and had many children in order to guarantee that a few would survive … just as a sea turtle lays 1,000 eggs so that a few will make it. To do otherwise is to invite extinction.

The world these respected people write so persuasively about would have been a sad, almost morbid place, with a landscape of tiny coffins. I say where is the proof?

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

President Lincoln can teach us a little about working together

A scarce copy of The Photographs of Abraham Lincoln featuring nearly 150 images relating to the president sold for $5,250 at a June 2015 Heritage auction.

By Jim O’Neal

No matter where you stand on the current controversy over the Supreme Court nomination process, it’s almost a certainty that irrespective of the outcome, you haven’t changed your opinion (much). There are just too many places to find others who totally agree with you and who validate your position. Technically, it’s called an “echo chamber” and we are surrounded by it.

Never fear, someone will share your viewpoint and increase your confidence level in believing that you are right … irrespective of what others think.

The 24/7 cable news business model was built on this premise: more eyeballs mean higher ratings, which drive higher advertising rates. They probably caught on from analyzing newspapers, which learned early that good news doesn’t sell as well, just as their street vendors learned that shouting “Read all about it (fill in the blank)” sold papers. Living in the U.K. for five years finally broke my habit, but it was mostly sensory overload from all the tabloids rather than my preference for a juicy story, regardless of the topic.

People who study the echo chamber have been writing about the increase in “tribalism,” in the sense that people are actually moving to communities, joining clubs and sharing social media with like-minded people at an accelerating rate. I suppose this will continue, but I haven’t found a tribe that will have me. In fact, quite the opposite, since I much prefer hearing a broadly diverse spectrum of ideas.

I relish hearing opinions about climate change, gun control, border security, health care, policing our cities, the Electoral College, the Iraq War, media bias and so on … especially from smart people who have fresh ideas … instead of stale recycled talking points borrowed from others. I regularly read both The New York Times and The Wall Street Journal to get basic balance. The only line I draw is at wild conspiracies, unless they’re packaged by people who are also highly entertaining (e.g. Oliver Stone and his JFK or Platoon).

Doris Kearns Goodwin’s Team of Rivals does a terrific job of explaining this concept, using the election of 1860 and how Abraham Lincoln leveraged his administration by filling three of his top Cabinet posts with his main election rivals. They became part of the solution rather than critics. In my opinion, the U.S. Congress should practice this to gain consensus rather than relying on an appellate system and the Supreme Court to shape our legal landscape.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Harriet Tubman was a ‘Moses’ to America’s slaves

Charles Wilbert White’s 1949 ink and pencil on board titled Harriet Tubman sold for $25,000 at a May 2015 Heritage auction.

By Jim O’Neal

In the 1830s, word started spreading about an Underground Railroad. A train with no tracks, no locomotive and no tickets needed to ride. From one plantation to the next, rumors spread about a “railroad to freedom” with no timetables and a crew of good citizens acting in the finest traditions of a young nation’s civil disobedience. Quakers, Protestants, Catholics and even American Indians helped fugitive slaves elude the slave-hunters working for their owners, bound together by a determination to see slavery abolished everywhere. It was a broad constituency of black and white abolitionists from the major cities in the North, including those in the border states.

It inevitably grew into a direct challenge to the Fugitive Slave Law of 1850, which specifically required the return of runaway slaves, even after they had safely reached non-slave states.

This law was possibly the most controversial aspect of the more comprehensive “Compromise of 1850,” and it quickly became nicknamed the “Bloodhound Law” for obvious reasons. Black newspapers went a step further by labeling it “manstealing,” from the Bible’s Exodus 21:16: “And he that stealeth a man, and selleth him, or if he be found in his hand, he shall surely be put to death.” The law also included a proviso to discourage anyone from obstructing the return of a slave by imposing a fine of $1,000 and imprisonment up to six months.

Naturally, many slaves who made the decision to escape merely walked away without any elaborate escape plans. Traveling at night and hoping to find strangers en route to assist them was more risky, but they were determined to follow their North Star. Others planned their escapes carefully by surreptitiously building up a supply of food and money and, if lucky, using a “conductor” who knew safe routes and people who would help. There also were slaves who decided not to risk escape, but were more than willing to aid and abet along the way.

Once a slave crossed the border into the North, there was a network of Underground Railroad people to assist. In addition to food, water and basic nursing, it was also possible to get “papers” that identified them as freedmen – then get directions to the next station. Churches, stables and even attics (a la Anne Frank) became good hiding places until they could get far enough north. Canada became a safe spot, just as it would be the next century during the Vietnam War.

Among the Underground Railroad’s more heroic engineers was the ex-slave Harriet Tubman (c.1822-1913), a native of Maryland’s Eastern Shore who escaped to Philadelphia in 1849. Once free, she wrote: “I looked at my hands to see if I was the same person. There was such a glory over everything … and I felt like I was in Heaven.” Her own escape made Tubman determined to rescue as many slaves from bondage as she could. Her trips were made during the winter months when nights were long. Escapes began on Saturday nights; the slaves would not be missed until Monday. When “wanted” posters went up, she paid black men to tear them down. She kept a supply of paregoric to put babies to sleep so their cries would not raise suspicion.

She carried a gun, not simply for protection, but as inspiration – to threaten anyone in her group feigning fatigue. For her, the welfare of the entire group was paramount. If pursuers got too close, she would hustle her people on a southbound train, a ruse that worked because authorities never expected fugitives to flee in that direction. In addition to slaves, she helped John Brown recruit men for the infamous Raid on Harpers Ferry and worked as a cook, nurse and an armed scout for the U.S. Army after the war started.

Prominent abolitionist William Lloyd Garrison compared her to “Moses,” who led the Hebrews to freedom from Egypt, but this Moses never lost a man. The Moses in Exodus spent 40 years wandering in the desert and then put the future Israelites in the only place in the Middle East with no oil!

Harriet “Moses” Tubman was asked why she was not afraid. She answered: “I can’t die but once.”

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

John Adams saw the White House as a home for ‘honest and wise men’

A vintage creamware punch bowl, commemorating “John Adams President of the United States,” sold for $15,535 at a March 2008 Heritage auction.

By Jim O’Neal

As the states prepared for the first presidential election under the new Constitution, it was clear that George Washington was the overwhelming favorite to become the first president of the United States.

Under the rules, each Elector would cast two votes, and at the February 1789 Electoral College all 69 Electors cast one of their votes for Washington, making him the unanimous choice of the 10 participating states. Two of the original Colonies (North Carolina and Rhode Island) had not yet ratified the Constitution, and New York had an internal dispute and did not choose Electors in time to participate. Eleven other men received a total of 69 votes, with John Adams topping the list with 34 votes, slightly less than 50 percent. He became the first vice president.

Four years later, there were 15 states (with the addition of Vermont and Kentucky) and the Electoral College had grown to 132 Electors. Again, Washington was elected president unanimously, with 132 votes. Adams was also re-elected, with 77 votes, besting George Clinton, Thomas Jefferson and Aaron Burr. All three runners-up would later become vice presidents, with Clinton serving under two different presidents (Jefferson and Madison). Jefferson had cleverly picked Clinton as his VP because of his age, correctly assuming Clinton would be too old to succeed him … thus ensuring that Secretary of State James Madison would be the logical choice. Clinton would actually become the first VP to die in office.

John Adams

Two-time Vice President John Adams would finally win the presidency on his third try after Washington decided not to seek a third term in 1796. Still, Adams barely squeaked by, defeating Jefferson 71-68. Jefferson would become vice president after finishing second. It was during the Adams presidency that the federal government would make its final move to the South after residing first in New York City and then Philadelphia.

This relocation was enabled by the 1790 Residence Act, a compromise that was brokered by Jefferson with Alexander Hamilton and James Madison, with the proviso that the federal government assume all remaining state debts from the Revolutionary War. In addition to specifying the Potomac River area as the permanent seat of the government, it further authorized the president to select the exact spot and allowed a 10-year window for completion.

Washington rather eagerly agreed to assume this responsibility and launched into it with zeal. He personally selected the exact spot, despite expert advice against it. He even set the stakes for the foundation himself and carefully supervised the myriad details of the actual construction. When the stone walls began rising, everyone on the project assembled to lay the cornerstone and affix an engraved plate. Once set in the mortar, the plate sank from view and has never been located since. An effort was made to find it on the 200th anniversary in 1992; all the old maps were pored over and the area was X-rayed … all to no avail. It remains undetected.

The project was completed on time and, with Washington 10 months in his grave, plans were made to move the president’s household from Philadelphia. The first resident, President John Adams, entered the President’s House at 1 p.m. on Nov. 1, 1800. It was the 24th year of American independence, and three weeks later he would deliver his fourth State of the Union address to a joint session of Congress. It was the last annual message delivered in person for 113 years. Thomas Jefferson discontinued the practice and it was not revived until 1913 (by Woodrow Wilson). With the advent of radio, followed by television, it was just too tempting an opportunity for any succeeding president to pass up.

John Adams was a fifth-generation American. He followed his father to Harvard and dabbled in teaching before becoming a lawyer. His best-known case was defending the British captain and eight soldiers involved in the Boston Massacre of March 5, 1770. He was not involved in the Boston Tea Party, but rejoiced at it, since he suspected it would inevitably lead to the convening of the First Continental Congress in Philadelphia in 1774.

He married Abigail Smith, the first woman to be both the wife of one president and the mother of another. Unlike Barbara Bush, she did not live to see it: She died in 1818, seven years before John Quincy Adams became president in 1825. Both father and son served only one term. Abigail had not yet joined the president at the White House, but the next morning he sent her a letter with a benediction for their new home: “I pray heaven to bestow the best blessing on this house and on all that shall hereafter inhabit it. May none but honest and wise men ever rule under this roof.” Franklin D. Roosevelt was so taken with it that he had it carved into the State Dining Room mantel in 1945.

Amen.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Benjamin Franklin’s basement was literally filled with skeletons

A pre-1850 folk art tavern sign depicting Benjamin Franklin sold for $11,250 at a May 2014 Heritage auction.

By Jim O’Neal

The Benjamin Franklin House is a formal museum in Central London near Trafalgar Square. The square is a popular location for kooky political speeches and peaceful demonstrations. Although anyone is free to speak about virtually anything, many visitors pay little attention, preferring instead to feed the pigeons. I never had the temerity to practice my public speaking there, although I’m sometimes tempted (“Going wobbly,” as my English friends would observe).

Known once as Charing Cross, Trafalgar Square now commemorates the British naval victory in October 1805 off the coast of Cape Trafalgar, Spain. Admiral Horatio Nelson defeated the Spanish and French fleets there, resulting in Britain gaining global sea supremacy for the next century.

The Franklin House is reputedly the only building still standing where Franklin actually lived … anywhere. He resided there for several years after accepting a diplomatic role from the Pennsylvania Assembly in pre-Revolutionary times. Derelict for most of the 20th century, the site caused a stir 20-plus years ago while it was being renovated: During the extensive excavation, a cache of several hundred human bones was unearthed.

Since anatomy was one of the few scientific things Franklin did not dabble in, the general consensus was that one of his colleagues did, at a time when privately dissecting cadavers was unlawful and those who did it were very discreet. I discovered the museum while riding a black cab on the way to the American Bar at the nearby Savoy Hotel. I may take the full tour if we ever return to London.

However, my personal favorite is likely to remain the Franklin Institute in the middle of Philadelphia. A large rotunda features the official national memorial to Franklin: a 20-foot marble statue sculpted by James Earle Fraser in 1938. It was dedicated by Vice President Nelson Aldrich Rockefeller in 1976. Fraser is well known in the worlds of sculpting, medals and coin collecting. He designed the Indian Head (Buffalo) nickel, minted from 1913-38; several key dates in high grade have sold for more than $100,000 at auction. I’ve owned several nice ones, including the popular 3-Leg variety that was minted in Denver in 1937. (Don’t bother checking your change!).

Fraser (1876-1953) grew up in the West, and his father, an engineer, was one of the men asked to help retrieve remains from Custer’s Last Stand. George Armstrong Custer needs no introduction, thanks to the famous massacre of his command by the Lakota, Cheyenne and Arapaho at the Battle of the Little Bighorn (Montana) in 1876 – the year Fraser was born. That upbringing helps explain Fraser’s empathy for American Indians as they were forced onto reservations. His famous statue titled End of the Trail depicts the despair in a dramatic and memorable way. The Beach Boys used it for the cover of their 1971 album Surf’s Up.

Another historic Fraser sculpture is 1940’s Equestrian Statue of Theodore Roosevelt at the American Museum of Natural History (AMNH) in New York City. Roosevelt is on horseback with an American Indian standing on one side and an African-American man on the other. The AMNH was built using private funds, including from TR’s father, and it is an outstanding world-class facility in a terrific location across from Central Park.

However, there is a movement to have Roosevelt’s statue removed, with activists claiming it is racist and emblematic of the theft of land by Europeans. Another group has been active throwing red paint on the statue while a commission appointed by Mayor Bill de Blasio studies how to respond to the seemingly endless efforts to erase history. Apparently, the city’s Columbus Circle and its controversial namesake have dropped off the radar screen.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Airplanes added economic, psychological factors to warfare

Alexander Leydenfrost’s oil on canvas Bombers at Night sold for $4,182 at a February 2010 Heritage auction.

By Jim O’Neal

In August 1945, a historic event occurred: Foreign forces occupied Japan for the first time in recorded history. It was, of course, the end of World War II and cheering crowds were celebrating in the streets of major cities around the world as peace returned to this little planet.

A major factor in finally ending this long, costly war against the Axis powers of Germany and Japan was, ultimately, the use of strategic bombing. An essential element was the development of the B-29 bomber – an aircraft not even in use when Japan attacked Pearl Harbor in 1941, forcing a reluctant United States into a foreign war. Maybe it was hubris or fate, but the attack was a highly flawed decision that would not end well for the perpetrators.

The concept of war being waged from the air dates to the 17th century, when several writers speculated on when and where it would begin. The answer turned out to be the Italo-Turkish War (1911-12), when an Italian pilot flew the first aerial reconnaissance mission on Oct. 23, 1911. A week later, the first aerial bomb was dropped on Turkish troops in Libya. The Turks responded by shooting down an airplane with rifle fire.

As World War I erupted seemingly out of nowhere, the use of airplanes became more extensive. However, for the most part, the real war was still being waged on the ground by static armies. One bitter legacy of this particular war was the frustration over the futility and horror of trench warfare, which was employed by most armies. Many experts knew, almost intuitively, that airplanes could play a role in reducing the slaughter of trench warfare and a consensus evolved that airplanes could best be used as tactical army support.

However, in the 20-year pause between the two great wars, aviation technology improved much faster than other categories of weaponry. Artillery, tanks, submarines and amphibious craft underwent only incremental changes. The airplane benefited from increased civilian use and from major improvements in engines and airframes. The conversion from wood to all-metal construction quickly spread to wings, crew positions, landing gear and even the lowly rivet.

As demand for commercial aircraft expanded rapidly, increased competition led to significant improvements in speed, reliability, load capacity and, importantly, increased range. Vintage bombers were phased out in favor of heavier aircraft with modern equipment. A breakthrough occurred in February 1932 when the Martin B-10 incorporated all the new technologies into a twin-engine plane. The new B-10 was rated the highest performing bomber in the world.

Then, in response to an Air Corps competition for multi-engine bombers, Boeing produced a four-engine model that had its inaugural flight in July 1935. It was the highly vaunted B-17, the Flying Fortress. Henry “Hap” Arnold, chief of the U.S. Army Air Forces, declared it was a turning point in American airpower. The AAF had created a genuine air program.

Arnold left active duty in February 1946 and saw his cherished dream of an independent Air Force become a reality the following year. In 1949, his five-star rank was redesignated General of the Air Force, making him the only airman ever to hold that rank. He died in 1950.

War planning evolved with the technology and in Europe, the effectiveness of strategic long-range bombing was producing results. By destroying cities, factories and enemy morale, the Allies hastened the German surrender. The strategy was comparable to Maj. Gen. William Tecumseh Sherman’s “March to the Sea” in 1864, which added economic and psychological factors to sheer force. Air power was gradually becoming independent of ground forces and generally viewed as a faster, cheaper strategic weapon.

After V-E Day, it was time to force the end of the war by compelling Japan to surrender. The island battles that led toward the Japanese mainland in the Pacific had ended after the invasion of Okinawa on April 1, 1945, and 82 days of horrific fighting that cost an estimated 250,000 lives. This had been preceded by the March 9-10 firebombing of Tokyo, which killed 100,000 civilians, destroyed 16 square miles and left an estimated 1 million homeless.

Now for the mainland … and the choices were stark and unpleasant: either a naval blockade and massive bombings, or an invasion. Based on experience, many believed that the Japanese would never surrender, acutely aware of the “Glorious Death of 100 Million” campaign, designed to convince every inhabitant that an honorable death was preferable to surrendering to “white devils.” The bombing option had the potential to destroy the entire mainland.

The decision to use the atomic bomb on Hiroshima (Aug. 6) and Nagasaki (Aug. 9) led to Japan’s offer of surrender on Aug. 10, paving the way for Gen. Douglas MacArthur to gain agreement to an armistice and an 80-month occupation by the United States. Today, that decision still seems prudent despite the fact that we had only the two atomic bombs. Japan now has the third-largest economy in the world at $5 trillion and is a key strategic partner of the United States in the Asia-Pacific region.

Now about those ground forces in the Middle East…

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

As court controversy rages, let’s not forget what we do best

A photograph of Franklin D. Roosevelt signed and inscribed to Eleanor Roosevelt sold for $10,000 at an October 2016 Heritage auction.

By Jim O’Neal

The Supreme Court was created by the Constitution, but the document wisely calls for Congress to decide the number of justices. This was vastly superior to a formula based on the number of states or population, which would have resulted in a large, unwieldy committee. The 1789 Judiciary Act established the initial number at six, with a chief justice and five associates all selected by President Washington.

In 1807, the number was increased to seven (to avoid tie votes) and in 1837 to nine, and then to 10 in 1863. The Judiciary Act of 1866 temporarily reduced the court to seven in response to post-Civil War politics and the Andrew Johnson presidency. Finally, the 1869 Act settled on nine, where it has remained to this day. The major concern has consistently been over the activities of the court and the fear it would inevitably try to create policy rather than evaluate it (ensuring that Congressional legislation was lawful and conformed to the intent of the Constitution).

The recent confirmation hearings are the latest example of both political parties vying for advantage by using the court to shape future policies, reflecting political partisanship at its worst. Despite the fact that the Supreme Court can’t enforce its decisions – Congress has the power of the purse and the president the power of force – the court has taken on a de facto legislative function through its deliberations. In a sharply divided nation, policy-making has become the victim, largely because Congress is unable to find consensus on most issues. The appellate process is simply a poor substitute for this legislative weakness.

We have been here before, and it helps to remember the journey. Between 1929 and 1945, two great travails were visited on our ancestors: a terrible economic depression and a world war. The economic crisis of the 1930s was far more than the result of the excesses of the 1920s. In the 100 years before the 1929 stock-market crash, our dynamic industrial revolution had produced a series of boom-bust cycles, inflicting great misery on capital and labor alike. Even the fabled Roaring ’20s had excluded great segments of the population, especially blacks, farmers and newly arrived immigrants. Who or what was to blame?

“[President] Hoover will be known as the greatest innocent bystander in history, a brave man fighting valiantly, futilely, to the end,” populist newspaperman William Allen White wrote in 1932.

The same generation that suffered through the Great Depression was then faced with war in Europe and Asia, the rationing of common items, entrance to the nuclear age and, eventually, the responsibilities for rebuilding the world. Our basic way of life was threatened by a global tyranny with thousands of nukes wired to red buttons on two desks 4,862 miles apart.

FDR was swept into office in 1932 during the depth of the Great Depression and his supporters believed he possessed just what the country needed: inherent optimism, confidence, decisiveness, and the desire to get things done. We had 13 million unemployed, 9,100 banks closed, and a government at a standstill. “This nation asks for action and action now!”

In his first 100 days, Roosevelt swamped Congress with a score of carefully crafted legislative actions designed to bring about economic reforms. Congress responded eagerly. But the Supreme Court, now dubbed the “Nine Old Men,” said no to most New Deal legislation by votes of 6-3 or 5-4. They made mincemeat of the proposals. Yet the economy did improve, and the result was an even bigger landslide re-election in 1936. FDR won 60.8 percent of the popular vote and an astonishing 98.5 percent of the electoral votes, losing only Vermont and Maine.

In his 1937 inaugural address, FDR lamented that he saw “one-third of a nation ill-housed, ill-clad, ill-nourished.” He called for more federal support. However, Treasury Secretary Henry Morgenthau worried about business confidence and argued for a balanced budget, and in early 1937, Roosevelt, almost inexplicably, ordered federal spending reduced. Predictably, the U.S. economy went into decline. Industrial production fell 14 percent, and in October alone another half million people were thrown out of work. It was now clearly “Roosevelt’s Recession.”

Fearing that the Supreme Court would continue to nullify the New Deal, Roosevelt in his ninth Fireside Chat unveiled a new plan for the judiciary. He proposed that the president should have the power to appoint additional justices – up to a maximum of six, one for every member of the Supreme Court over age 70 who did not retire within six months. The Judicial Procedures Reform Bill of 1937 (known as the “court-packing plan”) hopelessly split the Democratic majority in the Senate, caused a storm of protest from bench to bar, and created an uproar among both Constitutional conservatives and liberals. The bill was doomed from the start, and even the Senate Judiciary Committee reported it to the floor negatively, 10-14. The Senate vote was even worse … 70-20 to bury it.

We know how that story ended, as Americans were united to fight a Great War and then do what we do best: work hard, innovate and preserve the precious freedoms our forebears guaranteed us.

Unite vs. Fight seems like a good idea to me.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].