There were no winners or losers in the War of 1812

Portraits of James and Dolley Madison by Lawrence Williams went to auction in October 2007.

By Jim O’Neal

The White House was burned to a shell. The previous evening, British soldiers had found the president’s house abandoned and they feasted on the dinner and wine left there untouched due to the hasty exit of Dolley Madison and the entire staff. The date was Aug. 24, 1814, and the War of 1812 came directly to the young country’s capital. There was little doubt about the enemy’s intentions. Public buildings would be destroyed in retribution for the burning of both the legislature and governor’s residence in York (now Toronto), the capital of Upper Canada.

Someone (other than the First Lady) had rescued the Gilbert Stuart painting of George Washington by trimming it from its heavy frame. Executive papers and personal effects, along with silverware, were hurriedly spirited away by carriage for safekeeping. A torrential rain had mercifully helped minimize the damage.

Three days after the British departed, the Madisons returned to the ruins. The torching of the president’s house had mortified the populace, and political enemies accused Madison of cowardice for fleeing days before the incident. Even the press piled on, asserting that Dolley could have saved more, or worse, that the president could have prevented the entire affair. There was malicious gossip that this might finally reduce the excessive social entertaining of the First Lady.

Fortuitously, debris from the fire had fallen almost entirely within the stone walls of the White House; virtually none was scattered on the surrounding grounds. The city superintendent commissioned an assessment of all public buildings, and the consensus was that the White House was damaged more than the Capitol or the other executive buildings. Since the blackened shells were shameful symbols of defeat, a debate arose over whether the federal city should be rebuilt at all. New buildings in a different location could provide an opportunity for a fresh start.

Cincinnati was mentioned as a perfect candidate since it was more central to the country’s westward expansion; the Ohio River and new steamboat connections to St. Louis and New Orleans would facilitate commerce. It would also minimize the need to contend with crossing the mountains, and the re-centering rationale was similar to the arguments used to support the capital’s earlier move from Philadelphia to the banks of the Potomac. Fate intervened just in time with news of victory and the Treaty of Ghent, which ended the War of 1812 between the United States and the United Kingdom.

Congress hastily approved an appropriation of $500,000 to fund the restoration of all damaged buildings. Jubilant backers of the city implied promises of more money as needed, knowing that once construction was under way, Congress would have no option but to continue with the restoration. The capital had been saved and that was all that mattered.

A few months earlier, in September 1814, the formidable British Navy had attacked Fort McHenry in Baltimore. The fort’s defenders withstood 25 hours of bombardment, and the next morning they hoisted an enormous American flag, which inspired a poem by Francis Scott Key – “The Star-Spangled Banner” – that became an instant hit and, in 1931, the national anthem of the United States. British forces withdrew from Chesapeake Bay and organized for a campaign against New Orleans. That strategic location would provide access to the Mississippi River and the entire western part of the United States. They still hadn’t abandoned their ambition of establishing a British North America.

Colonel Andrew Jackson was 45 years old when the War of 1812 started – semiretired on his 640-acre Hermitage plantation – but still burning with ambition to get involved. His prayers were answered when he was assigned to take command of the defense of New Orleans. His ragtag force of free blacks, pirates (including Jean Lafitte) and loyal Tennessee Volunteers cleverly defeated the British. General Jackson was awarded the Congressional Gold Medal and, elected in 1828, would become a two-term president.

In a slight twist, the victory at New Orleans came a few weeks after the British had already signed the Treaty of Ghent. However, Jackson’s role in the war was absolutely critical to the future expansion of the country. Not only did he prevent an almost certain loss of territory in the Southwest, but he also cleared the air over the status of the Gulf Coast. Great Britain did not recognize American claims to the lands included in the Louisiana Purchase and disputed – correctly – the legality of the sale: France had no legal right to sell the territory to the United States, since the 1800 Treaty of San Ildefonso between Spain and France specifically stated that France would not sell without first offering to return it to Spain. By that reading, none of the lower Mississippi or the Gulf Coast belonged to the United States.

Their claims were blithely ignored and the Treaty of Ghent was silent on the entire issue. It has been said that there were no winners or losers in the little War of 1812 … except for American Indians. The United States signed 15 different treaties guaranteeing their lands and then proceeded to break every one of them.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

What would you do if you saw your obituary?

Francis H.C. Crick’s Nobel Prize Medal and Nobel Diploma, awarded in 1962 for his work related to DNA molecules, sold for $2.27 million at an April 2013 Heritage auction.

By Jim O’Neal

In 1888, a French newspaper published Alfred Nobel’s obituary with the following title: “Le marchand de la mort est mort” or “The merchant of death is dead.”

In reality, it was actually his brother Ludvig who had died, but Alfred was appalled that this kind of sendoff could tarnish his own professional legacy. One presumes that the only error was the mix-up in names since the sobriquet seemed apt given Alfred’s contributions to the effectiveness of substances that resulted in death.

In a complicated maneuver, the inventor of dynamite attempted to rectify future obits by bequeathing the majority of his estate (94 percent) to the establishment of the Nobel Prizes, designed to cleanse his reputation of all the deaths resulting from his explosive products. It was only partially successful, since he had been accused of treason against France for selling Ballistite (a smokeless propellant composed of two explosives) to Italy. The French forced him to leave Paris and he moved to Sanremo, Italy, where he died in 1896. There were five Nobel categories, with an emphasis on “peace” … for obvious reasons.

A native of Stockholm, Nobel made a fortune when he invented dynamite in 1867 as a more reliable alternative to nitroglycerin. As a chemist and engineer, he basically revolutionized the field of explosives. Some accounts give him credit for 355 inventions. In 1895, a year before his death, he signed the final version of his will, which established the organization that would bear his name and “present prizes to those who, during the preceding year, shall have conferred the greatest benefit to mankind.”

Nobel’s family contested the will and the first prizes were not handed out until 1901. Among the first winners were German physicist Wilhelm Conrad Röntgen, who discovered X-rays, and German microbiologist Emil Adolf von Behring, who developed a treatment for diphtheria. The Nobel Prizes were soon recognized as the most prestigious in the world. Except for war-related interruptions, prizes have been awarded virtually every year. The category of economics was added in 1969.

The first American to receive a Nobel was President Theodore Roosevelt, who garnered the Peace Prize in 1906 after he helped mediate an end to the Russo-Japanese War. The German-born American scientist Albert Michelson claimed the physics prize the next year. However, the peace and literature prizes would become the most familiar to Americans and are among the most controversial. Critics voiced concerns over Roosevelt, Woodrow Wilson (1919), George Marshall (1953) and Secretary of State Henry Kissinger (1973). More recently, winners have included Al Gore (2007) for making an Oscar-winning documentary on climate change, and Barack Obama (2009) “for his extraordinary efforts to strengthen international diplomacy and cooperation between peoples.” (For more, see Obama’s Wars by Bob Woodward.)

William Faulkner, Ernest Hemingway, John Steinbeck and Toni Morrison generally have escaped criticism, as have multiple winners like Marie Curie (the first woman to win, in 1903, and a winner in two separate categories) and Linus Pauling, among others. The Red Cross has snagged three. From a personal standpoint, the most obvious non-winner is Mahatma Gandhi, or as someone quipped, “Gandhi can do without a Nobel Prize, but can the Nobel Committee do without Gandhi?”

I think not.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Betty Ford set a standard that all who follow should study

A portrait of Betty Ford by Lawrence Williams went to auction in 2007.

By Jim O’Neal

Every presidential trivia fan knows that Eleanor Roosevelt’s birth name was Eleanor Roosevelt. She had married her father’s fifth cousin, Franklin. Although the couple had six children, Eleanor said she disliked intimacy with him and wrote she was ill-equipped to be a mother since she didn’t understand or even like small children.

They somehow managed to stay married for 40 years, until FDR died in 1945. Franklin did enjoy intimate relations, especially with Lucy Mercer, Eleanor’s social secretary. After a trove of love letters between Franklin and Lucy exposed their illicit relationship, he wanted a divorce, but his mother (who controlled the family money) would not allow it.

Eleanor skillfully leveraged her position as First Lady; many consider her the first truly modern First Lady because she personally championed so many women’s rights issues. She had an active public life and a serious relationship with reporter Lorena Hickok. Eleanor became well known during her long occupancy of the White House and was highly respected all over the world.

That was not true (initially) of Betty Ford, who became First Lady when Jerry Ford became president after Richard Nixon resigned in 1974. She was born Betty Bloomer and had divorced her first husband, William Warren, after a failed five-year marriage, having nursed the alcoholic through the final two years of their union.

She was a dancer before she married the man born Leslie Lynch King Jr. in 1913 (he legally changed his name to Gerald R. Ford Jr. in 1935). As a member of the renowned Martha Graham dance troupe, she had performed at Carnegie Hall, and she later earned the prestigious Presidential Medal of Freedom, presented by the recently deceased President George H.W. Bush in 1991.

Betty Ford (1918-2011) had been impressed by Eleanor Roosevelt since childhood. “She eventually became my role model because I admired her so. I loved her independence … a woman finally speaking out for herself rather than saying what would be politically helpful to her husband. That seemed healthy to me.” Others were quick to note the similarities between the two women. Major publications compared the willingness of both to offer bold, personal opinions on highly controversial issues. I would argue that Betty Ford set a higher standard for candor than any of her predecessors.

One small example was her very first press conference, held in the State Dining Room. Ford seemed to have no reservations about repeating her strong positions as a supporter of the Equal Rights Amendment and her pro-choice stance on abortion. She admitted she had consulted a psychiatrist, had been divorced, and used tranquilizers for physical pain. Any single one of these uttered today would instantly be “Breaking News” on the cable news channels so starved for fresh material (or innuendo).

Initially, Ford didn’t consider being First Lady a “meaningful position,” but rather than letting the role define her, she decided to change it. “I wanted to be a good First Lady … but didn’t feel compelled to emulate my predecessors.” She simply decided to be Betty Bloomer Ford … “and [I] might as well have a good time doing it.” She succeeded on both counts and the results were more than just surprising.

She talked about a “demanding privilege” and “a great opportunity,” but also about the “salvation” of finally having a genuine career of her own … and on a national level she’d never experienced before. The role helped reshape her into a likeable, broadly respected leader.

Her creative imagination rivaled Jackie’s. “This house has been a grave,” she said. “I want it to sing!” More women were seated at the president’s table, especially second-tier political women who needed a little boost. And they were round tables, which denoted equality. This was the instinct of a free, bohemian spirit, but not by contrivance. She had been a single woman who studied modern dance and introduced it to the ghettos of Grand Rapids, Mich. She spoke deliberately and was unafraid of listening to differing viewpoints.

There were occasional curious remarks about her drug and alcohol use, but they were easily rationalized by her well-known physical pain from severe arthritis and a pinched nerve, courtesy of her dancing. Not even nosy reporters questioned or sought to investigate the extent of her medications. It wasn’t until after the Fords left the White House that the drinking resulted in a family intervention.

In true Betty Ford fashion, after the denial, anger and resentment subsided, a positive outcome resulted: The Betty Ford Center was founded in Rancho Mirage, Calif. The center, known as Camp Betty, has helped celebrities and others overcome substance abuse. It offers treatment without shame and, although not a cure or panacea, gives people control over their lives. The response to today’s opioid crisis draws on some of the experience gained at Camp Betty.

However, her most lasting and important contribution concerns breast cancer. During the mid-1970s, television didn’t even allow the word “breast” until a determined Betty Ford decided to go very public with her condition. She had accompanied a friend who was having an annual checkup and the doctor suggested she do the same. After several more doctors got involved, a biopsy confirmed she had breast cancer. The White House press office squabbled over releasing information about her condition, but Betty spotted another opportunity.

By the time she was back in the White House two weeks later, women across America were having breast examinations and mammograms. The ensuing media coverage of her honest revelations was credited with saving the lives of thousands of women who had discovered breast tumors. The East Wing was flooded with 60,000 cards, letters and telegrams, 10 percent from women who had mastectomies. The First Lady told the American Cancer Society, “I just cannot stress enough how necessary it is for women to take an active interest in their own health and body … too many women are so afraid … they endanger their lives.”

Ford was a modern-day Abigail Adams, though she used a megaphone rather than letters, and in a practical way. Bravo to an under-appreciated First Lady who set a standard that all who follow should study.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

America’s greatest contribution to humanity remains ‘freedom’

An 1852 presentation copy of the Constitution of the United States, signed by President Millard Fillmore, sold for $15,000 at an April 2016 Heritage auction.

By Jim O’Neal

Constitutional scholars are prone to claim there is a direct and historic link between the First Commandment of the Old Testament and the First Amendment of the U.S. Constitution … that one leads inexorably to the other. The First Commandment explains the origins of freedom and the First Amendment defines the scope of freedom in broad categories.

Both point unmistakably to freedom as America’s greatest contribution to humanity. Not the automobile, jazz or Hollywood. Not the word processor, the internet or the latest smartphones. All of these are often described as America’s unique assets, but it is the awesome concept of “freedom” that is America’s ultimate symbol, attraction … and even export!

The First Commandment reads, “I am the Lord thy God, who brought thee forth out of Egypt, out of the house of bondage, and thou shalt have no other gods before me.” In this way, God sanctions an escape from bondage and puts people on a path toward the “promised land,” toward freedom. This powerful message ricocheted through history until it finally found a permanent home in Colonial America.

In the early 18th century, the trustees of Yale, many of them scholars who read scripture in the original Hebrew, designed a coat of arms for the college in the shape of a book open to two words, Urim and Thummim, which have come to be rendered in English as “Light” and “Truth.” The book depicted, of course, was the Bible.

Not too far north, Harvard graduates had to master three languages … Latin, Greek and Hebrew. True gentlemen in those days had to have more than a passing familiarity with the Old Testament. It was no mere coincidence that carved into the Liberty Bell in Philadelphia, relaying a brave message to a people overthrowing British rule, was an uplifting phrase selected from chapter 25 of Leviticus: “Proclaim liberty throughout all the land unto all the inhabitants thereof.”

The Commandment that blessed an escape from oppression and embraced the pursuit of freedom led the Founding Fathers to pen the Bill of Rights. They had much on their minds, many interests to try to reconcile, but they agreed that the delineation of freedom was to be their primary responsibility. It became the First Amendment to the recently ratified Constitution, inspired in part by the First Commandment, and it reads: “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.”

Throughout American history, these freedoms have become intertwined with American life, one indistinguishable from the other. Just consider that in one small grouping of words, in a single amendment, resides more freedom for mankind than had ever existed in the history of the world. Somewhat remarkable, in my opinion, yet we take it for granted today and merely fine-tune opinions on original intent in small, insignificant instances.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Believe it or not, electing presidents has never been a pleasant affair

An 1889 letter in which Rutherford B. Hayes discusses his inauguration sold for $19,120 at an April 2007 Heritage auction.

By Jim O’Neal

One discouraging trend in American culture is treating everything from a partisan political standpoint. I can recall a time, not too long ago, when after an election we’d simply set aside our disagreements over the candidates and resume normal civility. Now it seems that nearly everything gets politicized, dividing the nation into continually warring tribes of Red and Blue. Some political pundits see the starting point as the 2000 Gore-versus-Bush election, with its hanging chads and the controversial Supreme Court decision to stop the vote recount in Florida. Others believe the feud between President Bill Clinton and Speaker Newt Gingrich exacerbated it.

However, to accept either theory requires ignoring the 1876 presidential election between Samuel Tilden and Rutherford B. Hayes.

Hayes, the Republican, was a lawyer from Ohio who distinguished himself during the Civil War as a brave soldier who was wounded five times and eventually promoted to a brevet major general. After the war, he served in Congress and was elected governor of Ohio three times.

Tilden also had a legal background and was the 25th governor of New York (1875-76). As the Democratic candidate for the presidency in 1876, he is still the only individual to win an outright majority (not just a plurality) of the popular vote, but lose the election … in a rather bizarre series of events. Four other candidates have lost the presidency despite having a plurality of the popular vote (Al Gore and Hillary Clinton are the most recent to suffer this fate).

It had generally been assumed that incumbent President Ulysses S. Grant would run for a third term, despite a troubled economy and the numerous scandals uncovered during his two terms, which began in 1869. There was also the two-term precedent established by George Washington. In spite of these formidable barriers, Grant’s inner circle of advisors was eager to maintain political power. While Grant was on the verge of announcing his candidacy, the House of Representatives preempted him by passing a resolution, by an overwhelming 233-18 margin, declaring that departing from the two-term precedent would risk a dictatorship. Grant reluctantly withdrew his name from consideration.

The Democrats proceeded with their National Convention in June 1876 in St. Louis (the first time a major political convention was held west of the Mississippi). They selected Tilden on the second ballot and added Thomas Hendricks for vice president, since he was the only one nominated. The Democrats were hungry for a win since they had been out of power since James Buchanan, who was elected a full 20 years earlier in 1856.

What followed was the most contentious presidential election in American history. On the first vote in the Electoral College, Tilden had 184 votes (only one short) while Hayes was stuck at 165. However, there were 20 votes being contested in four states (Florida, Louisiana, South Carolina and Oregon) and both parties were claiming victory. This impasse caused a Constitutional crisis and, finally, a beleaguered Congress passed a law on Jan. 29, 1877, to form a special 15-member Electoral Commission to settle the dispute. After a great debate, the commission awarded all 20 disputed votes to Hayes, who became president with 185 votes to Tilden’s 184.

In return, Republicans agreed to end Reconstruction and withdraw federal troops from the Southern states. Over the next 20 years, those states passed all kinds of laws and regulations that effectively wiped out the provisions of the 14th and 15th Amendments granting numerous rights to the black population. It would take roughly another 60 years, with LBJ as president, to regain them and finally crack the “Solid South” grip on national politics.

Maybe we are doomed to be a divided nation, but I suspect that strong leaders will emerge, eventually, and help us remember the advantages of a group of united states … E pluribus unum.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Here’s why Foote, Faulkner are among our greatest writers

A 1929 first edition of William Faulkner’s The Sound and the Fury, in its original first state dust jacket, sold for $15,000 at a March 2018 Heritage auction.

By Jim O’Neal

Whenever the topic of “favorite author” inevitably comes up, I quickly steer the conversation to two categories. First is non-fiction, since it gives me an opportunity to nominate Shelby Foote for my all-time favorite subject, the Civil War. Second, I suggest that fiction favorites be limited to writers born in the great state of Mississippi.

Shelby Dade Foote Jr. (1916-2005) spent over 20 years working on his masterpiece, The Civil War: A Narrative, a three-volume, 3,000-page work that captivated me. However, like many others, it wasn’t until filmmaker Ken Burns aired his PBS documentary in 1990 that I realized just how much I truly appreciated it. Foote appeared in some 90 segments over the course of the series, and his sagacious comments and distinctive Southern drawl added a remarkable degree of authenticity to an already great production.

Legend has it that paperback sales of Foote’s trilogy jumped to 1,000 copies per day and that it went on to sell over 400,000 more copies – all as a result of his newfound celebrity. He reportedly remarked to Burns: “You have made me a millionaire.” A few critics complained that Foote had a Southern bias, citing a passage in which he stated that Abraham Lincoln and Confederate Army general Nathan Bedford Forrest were the two smartest men in the entire war; they tried to point out a few weaknesses of Forrest when what they really objected to was simply pairing him with the revered Lincoln.

As for fiction writers born in Mississippi, there are a lot more to choose from than you might expect. Consider Eudora Welty (The Optimist’s Daughter), Willie Morris (North Toward Home) and William Faulkner (As I Lay Dying), to name a few.

Of these, William Cuthbert Faulkner (1897-1962) didn’t give a damn about self-promotion. In fact, you could spell his name with or without the u. “Either way suits me,” he said quite often. As a boy, he was taken by his parents to meet the great Confederate general (and Robert E. Lee’s right arm) James Longstreet (1821-1904). Little William had the temerity to ask, “What was the matter with you at Gettysburg? You should have won!” By reputation, Faulkner had a prickly side his whole life, but it didn’t seem to affect the quality of his writing.

When asked about grants for writers, Faulkner replied, “I’ve never known anything good in writing to come from having accepted any free gift of money. The good writers never apply to a foundation. They’re too busy writing something.” Faulkner would have been unaware that Foote (himself born in Greenville, Miss.) accepted Guggenheim Fellowships (1955-57) and Ford Foundation grants to get him through the 20 years of writing his Civil War narrative. However, as much as Faulkner’s work was admired by other writers, by 1945 all but two of his books were out of print.

Yet just four years later, the usually myopic Nobel Prize Committee made an unusually clear-sighted decision. In 1949, it awarded Faulkner the Nobel Prize for Literature, making him the only Mississippi-born Nobel laureate. Two of his later works, A Fable (1954) and his final novel, The Reivers (1962), won the Pulitzer Prize for Fiction. Only two other writers have won the fiction Pulitzer twice: Booth Tarkington (1919 and 1922) and John Updike (1982 and 1991).

Ernest Hemingway was actually selected in both 1941 and 1953, but the president of Columbia University, Nicholas Murray Butler, found Hemingway’s For Whom the Bell Tolls too offensive and convinced the committee to reverse its decision, so no prize was awarded in 1941. However, the movie version was nominated for nine Academy Awards and is a good piece of film.

In June 1943, Faulkner finally opened a letter that had sat untouched for three months because he didn’t recognize the return address. It was a proposal from writer and literary critic Malcolm Cowley to publish a “Portable Faulkner” to keep him from falling into literary obscurity. Faulkner was working as a Hollywood screenwriter (The Big Sleep) and was in danger of seeing all his books go out of print. It was this effort that resuscitated Faulkner’s career and led directly to the 1949 Nobel Prize. Novelist and literary critic Robert Penn Warren called it the “great watershed moment,” for it saved Faulkner’s reputation and career.

True to style, when Cowley asked Faulkner to get Hemingway to write a preface, Faulkner refused. “It would be like asking one racehorse in the middle of the race to broadcast a blurb on another horse running in the same race.” He remained a prickly man to the end, and I suspect the prickliness and all his wonderful writing came out of the same Southern bourbon bottle.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

As a ‘champion’ of the working man, Marx lived the high life

A second edition of the first volume of Karl Marx’s Das Kapital (Hamburg: Otto Meissner, 1872) sold for $3,500 at a March 2018 Heritage auction.

By Jim O’Neal

When Ho Chi Minh lived in London, training as a pastry chef under Auguste Escoffier at the Carlton Hotel, he used it as a pillow. Fidel Castro claimed he read 370 pages (about half) in 1953 while he was in prison after the failed revolutionary attack on the Moncada barracks in Santiago de Cuba. President Xi Jinping of China hailed its author as “the greatest thinker of modern times.”

It’s been 200 years since Karl Marx was born on May 5, 1818, in Trier, Germany. His book Das Kapital was published in 1867, or at least that was when Volume 1 made its way into print. His friend and benefactor Friedrich Engels edited Volumes 2 and 3 after Marx’s death.

Karl Marx

Engels (1820-1895) was born in Prussia, dropped out of high school and eventually made it to England to help run his father’s textile factory in Manchester. On the trip over, he met Marx for the first time, but it would be a few years before their friendship blossomed. Perhaps the catalyst was Engels’ 1845 book The Condition of the Working Class in England.

He had observed the slums of Manchester, the horrors of child labor, the utter impoverishment of laborers in general, and the environmental squalor that was so pervasive. This was not a new indictment, since Thomas Robert Malthus (1766-1834) had written, albeit anonymously, about these abysmal conditions. However, Malthus had blamed the poor for their plight and opposed the concept of relief, “since it simply increases their tendency to idleness.” He was particularly harsh on the Irish, writing that a “great part of the population should be swept from the soil.”

Not surprisingly, mortality rates soared, especially for the poor, and average life expectancy fell to an astonishing 18.5 years. Lifespans that short had not been seen since the Bronze Age; even in the healthiest areas, life expectancy was only in the mid-20s, and nowhere in Britain did it exceed 30 years.

Life expectancy had been largely guesswork until Edmond (“the Comet”) Halley obtained a cache of unusually complete records from the city of Breslau (in modern-day Poland) in 1693. Ever the tireless investigator of any and all scientific data, he realized he could calculate the remaining life expectancy of a person of any age still living. From those records he created the very first actuarial tables which, among their many other uses, are what made life insurance viable as an industry.
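Halley’s insight is easy to reproduce. The sketch below, in Python, uses entirely hypothetical survivor counts (not Halley’s actual Breslau figures) to show how a simple life table turns survivorship data into an estimate of remaining life expectancy at a given age.

# A minimal life-table sketch with hypothetical numbers, not Halley's Breslau data.
# 'survivors' records how many members of an imaginary cohort of 1,000 births
# are still alive at each age.
survivors = {0: 1000, 10: 610, 20: 530, 30: 440, 40: 350,
             50: 250, 60: 150, 70: 60, 80: 10, 90: 0}

def remaining_life_expectancy(age):
    """Expected further years of life for someone alive at one of the table ages."""
    ages = sorted(a for a in survivors if a >= age)
    alive_now = survivors[ages[0]]
    if alive_now == 0:
        return 0.0
    # Person-years still to be lived by the cohort, estimated with the
    # trapezoidal rule over the survivorship curve, divided by those alive now.
    person_years = sum((survivors[lo] + survivors[hi]) / 2 * (hi - lo)
                       for lo, hi in zip(ages, ages[1:]))
    return person_years / alive_now

for a in (0, 20, 50):
    print(f"Expected further years at age {a}: {remaining_life_expectancy(a):.1f}")

Price an annuity or a life-insurance policy off the same table and you have, in miniature, the business that Halley’s chart made possible.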

One of the few who sympathized with the poor was the aforementioned Friedrich Engels, who spent his time embezzling funds from the family business to support his collaborator Karl Marx. They both passionately blamed the industrial revolution and capitalism for the miserable conditions of the working class. While diligently writing about the evils of capitalism, both men lived comfortably from the benefits it provided them personally. To label them as hypocrites would be far too mild a rebuke.

There was a stable of fine horses, weekends spent fox hunting, the finest wines, a handy mistress, and membership in the elite Albert Club. Marx was an unabashed fraud, denouncing the bourgeoisie while living in excess with his aristocratic wife and keeping his two daughters in private schools. In a supreme act of deception, he accepted a job in 1851 as a foreign correspondent for Horace Greeley’s New-York Tribune. Because his English was poor, he had Engels write the articles while he cashed the checks.

Even then, Marx’s extravagant lifestyle couldn’t be maintained and he convinced Engels to pilfer money from his father’s business. They were partners in crime while denouncing capitalism at every opportunity.

In the 20th century, Eugene Victor Debs ran for U.S. president five times as the candidate of the Socialist Party of America, the last time (1920) from a prison cell in Atlanta, where he was serving time after being found guilty of 10 counts of sedition. His 1926 obituary told of his acquiring a copy of Das Kapital, and how “the prisoner Debs read it slowly, eagerly, ravenously.”

In the 21st century, Senator Bernie Sanders of Vermont ran for president in 2016, despite the overwhelming odds at a Democratic National Convention that used superdelegates to help select his opponent. In a series of televised debates, he predictably promised free healthcare for all, a living wage for underpaid workers, free college tuition and other “free stuff.” I suspect he will be back in 2020, buoyed by overwhelming support from Millennials, who seem to like the idea of “free stuff,” though he may face 10 to 20 other presidential hopefuls who have noticed that energy and enthusiasm.

One thing: You cannot call Senator Sanders a hypocrite like Karl Marx. In 1979, Sanders produced a documentary about Eugene Debs and hung his portrait in the Burlington, Vt., City Hall, when he became its mayor after running as a Socialist.

As British Prime Minister Margaret Thatcher once said: “The problem with Socialism is that eventually you run out of other people’s money.”

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Childhood has always been tough, but let’s not go too far

A 1962 first edition, first printing of The Guns of August, signed by author Barbara Tuchman, sold for $625 at an April 2013 Heritage auction.

By Jim O’Neal

Simply mention the name Barbara Tuchman and it brings back fond memories of her wonderful book The Guns of August, which won the Pulitzer in 1963. It brilliantly explains the complicated political intrigue that started innocuously in the summer of 1914 and then erupted into the horrors of the First World War. It is always a good reminder of how easily foreign entanglements can innocently provoke another war – except that survivors today would call it the Last World War.

Tuchman (1912-89) won a second Pulitzer for her biography of General “Vinegar Joe” Stilwell and his travails in China-Burma-India during World War II. Stilwell was faced with trying to manage and control aviator Claire “Old Leatherface” Chennault and his famous Flying Tigers. He also had to contend with Chiang Kai-shek, who went on to serve as president of the Republic of China on Taiwan (1950-75). This dynamic duo conspired to have General Stilwell removed and finally succeeded by badgering a tired and sick FDR.

However, it was a totally different award-winning book Tuchman published in 1978 that was much more provocative and controversial (at least to me). In A Distant Mirror: The Calamitous 14th Century, she writes, “Of all the characteristics in which the medieval age differs from the modern, none is so striking as the comparative absence of interest in children.” She concluded, chillingly, that “a child was born and died and another took its place.”

Tuchman asserted that investing love in young children was so risky, so unrewarding, that everywhere it was seen as a pointless waste of energy. I politely refuse to accept that our ancestors were ever so jaded and callous. Surely, there was at least a twinge of sorrow, guilt or emotion.

Yet earlier, French author Philippe Ariès in his Centuries of Childhood made a remarkable claim that until the Victorian Age “the concept of childhood did not exist.” There is no doubt that children once died in great numbers. One-third died in their first year and 50 percent before their 5th birthday. Life was full of perils from the moment of conception and the most dangerous milestone was birth itself, when both child and mother were at risk due to a veritable catalog of dangerous practices and harmful substances.

And it was not just happening to poor or needy families. As Stephen Inwood notes in A History of London, death was a visitor in the best of homes and cities. English historian Edward Gibbon (The History of the Decline and Fall of the Roman Empire) lost all six of his siblings while growing up in the 1700s in Putney, a rich, healthy suburb of London. In his autobiography, Gibbon describes himself as “a puny child, neglected by my mother, starved by my nurse.” So death was apparently indifferent in choosing between rich and poor; but that is a far cry from not having a childhood per se.

To extend this into a claim that mothers became totally indifferent to young children – because infant mortality was so high that it made little sense to invest in them emotionally – defies logic. Obviously, parents adjusted and had many children in order to guarantee that a few would survive … just as a sea turtle lays 1,000 eggs so that a few will make it. To do otherwise is to invite extinction.

The world these respected people write so persuasively about would have been a sad, almost morbid place, with a landscape of tiny coffins. I say where is the proof?

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

As court controversy rages, let’s not forget what we do best

A photograph of Franklin D. Roosevelt signed and inscribed to Eleanor Roosevelt sold for $10,000 at an October 2016 Heritage auction.

By Jim O’Neal

The Supreme Court was created by the Constitution, but the document wisely calls for Congress to decide the number of justices. This was vastly superior to a formula based on the number of states or population, which would have resulted in a large, unwieldy committee. The 1789 Judiciary Act established the initial number at six, with a chief justice and five associates all selected by President Washington.

In 1807, the number was increased to seven (to avoid tie votes) and in 1837 to nine, and then to 10 in 1863. The Judiciary Act of 1866 temporarily reduced the court to seven in response to post-Civil War politics and the Andrew Johnson presidency. Finally, the 1869 Act settled on nine, where it has remained to this day. The major concern has consistently been over the activities of the court and the fear it would inevitably try to create policy rather than evaluate it (ensuring that Congressional legislation was lawful and conformed to the intent of the Constitution).

The recent confirmation hearings are the latest example of both political parties vying for advantage by using the court to shape future policy, reflecting political partisanship at its worst. Even though the Supreme Court can’t enforce its decisions – Congress holds the power of the purse and the president the power of force – the court has taken on a de facto legislative function through its deliberations. In a sharply divided nation, policy has become the victim on most issues, largely because Congress is unable to find consensus. The appellate process is simply a poor substitute for this legislative weakness.

We have been here before and it helps to remember the journey. Between 1929 and 1945, two great travails were visited on our ancestors: a terrible economic depression and a world war. The economic crisis of the 1930s was far more than the result of the excesses of the 1920s. In the 100 years before the 1929 stock-market crash, our dynamic industrial revolution had produced a series of boom-bust cycles, inflicting great misery on capital and on many people. Even the fabled Roaring ’20s had excluded great segments of the population, especially blacks, farmers and newly arrived immigrants. Who or what to blame?

“[President] Hoover will be known as the greatest innocent bystander in history, a brave man fighting valiantly, futilely, to the end,” populist newspaperman William Allen White wrote in 1932.

The same generation that suffered through the Great Depression was then faced with war in Europe and Asia, the rationing of common items, entrance to the nuclear age and, eventually, the responsibilities for rebuilding the world. Our basic way of life was threatened by a global tyranny with thousands of nukes wired to red buttons on two desks 4,862 miles apart.

FDR was swept into office in 1932 during the depth of the Great Depression and his supporters believed he possessed just what the country needed: inherent optimism, confidence, decisiveness, and the desire to get things done. We had 13 million unemployed, 9,100 banks closed, and a government at a standstill. “This nation asks for action and action now!”

In his first 100 days, Roosevelt swamped Congress with a score of carefully crafted legislative proposals designed to bring about economic reforms, and Congress responded eagerly. But the Supreme Court, now dubbed the “Nine Old Men,” said no to most New Deal legislation by votes of 6-3 or 5-4, making mincemeat of the proposals. Even so, the economy improved, resulting in an even bigger landslide re-election: FDR won 60.3 percent of the popular vote and an astonishing 98.5 percent of the electoral vote, losing only Vermont and Maine.

In his 1937 inaugural address, FDR emphasized that “one-third of the nation was ill-housed, ill-clad and ill-nourished.” He called for more federal support. However, Treasury Secretary Henry Morgenthau worried about business confidence and argued for a balanced budget, and in early 1937, Roosevelt, almost inexplicably, ordered federal spending reduced. Predictably, the U.S. economy went into decline. Industrial production had fallen 14 percent and in October alone, another half million people were thrown out of work. It was clearly now “Roosevelt’s Recession.”

Fearing that the Supreme Court would continue to nullify the New Deal, Roosevelt in his ninth Fireside Chat unveiled a new plan for the judiciary. He proposed that the president have the power to appoint additional justices – up to a maximum of six, one for every member of the Supreme Court over age 70 who did not retire within six months. The Judicial Procedures Reform Bill of 1937 (known as the “court-packing plan”) hopelessly split the Democratic majority in the Senate, caused a storm of protest from bench to bar, and created an uproar among both Constitutional conservatives and liberals. The bill was doomed from the start; even the Senate Judiciary Committee reported it to the floor with a negative recommendation, 10-14. The full Senate vote was even worse … 70-20 to bury it.

We know how that story ended, as Americans were united to fight a Great War and then do what we do best: work hard, innovate and preserve the precious freedoms our forebears guaranteed us.

Unite vs. Fight seems like a good idea to me.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Notorious traitors? Let’s look at Benedict Arnold

A May 24, 1776, letter by Benedict Arnold, signed, to Gen. William Thompson, realized $23,750 at an April 2016 Heritage auction.

By Jim O’Neal

Vidkun Quisling is an obscure name from World War II. To those unfamiliar with some of the lesser-known details, “Quisling” has become a synonym for a traitor or collaborator. From 1942 to 1945, he was Prime Minister of Norway, heading a pro-Nazi puppet government after Germany invaded. For his role, Quisling was put on trial for high treason and executed by firing squad on Oct. 24, 1945.

Obviously better known are Judas Iscariot of Last Supper fame (30 pieces of silver); Guy Fawkes, who tried to assassinate King James I by blowing up Parliament (the Gunpowder Plot); and Marcus Junius Brutus, who stabbed Julius Caesar (“Et tu, Brute?”). In American history, it’s a close call between John Wilkes Booth and Benedict Arnold.

Arnold

The irony concerning Benedict Arnold (1741-1801) is that his early wartime exploits had made him a legendary figure, but Arnold never forgot the slight he received in February 1777, when Congress bypassed him while naming five new major generals … all of them junior to him. Afterward, George Washington pledged to help Arnold “with opportunities to regain the esteem of your country,” a promise he would live to regret.

Unknown to Washington, Arnold had already agreed to sell secret maps and plans of West Point to the British via British Maj. John André. There have always been honest debates over Arnold’s real motives for this treacherous act, but it seems clear that personal gain was the primary objective. Heavily in debt, Arnold had brokered a deal that included having the British pay him 6,000 pounds sterling and award him a British Army commission for his treason. There is also little doubt that his wife Peggy was a full accomplice, despite a dramatic performance pretending to have lost her mind rather than her loyalty.

The history of West Point can be traced back to its occupation by the Continental Army after the Second Continental Congress (1775-1781) was designated to manage the Colonial war effort. West Point – first known as Fort Arnold and later renamed Fort Clinton – was strategically located on high ground overlooking the Hudson River, commanding the river approaches from New York City and ideal for military purposes. Later, in 1801, President Jefferson ordered plans to establish the U.S. Military Academy there, and West Point has since churned out many distinguished military leaders … first for the Mexican-American War and then for the Civil War, including both Ulysses S. Grant and Robert E. Lee. It is the oldest continuously operating Army post in U.S. history.

To understand this period in American history, it helps to start at the end of the Seven Years’ War (1756-63), which was really a global conflict that included every major European power and spanned five continents. Many historians consider it “World War Zero,” and on the same scale as the two 20th century wars. In North America, the skirmishes started two years earlier in the French and Indian War, with Great Britain an active participant.

The Treaty of Paris in 1763 ended the conflict, with the British having won a stunning series of battles, France surrendering its Canadian holdings, and Spain ceding its Florida territories in exchange for the return of Cuba. Consequently, the British Empire emerged as the most powerful political force in the world. The only issue was that these conflicts had nearly doubled England’s debt, from 75 million to 130 million pounds sterling.

A young King George III and his Parliament quietly noted that the Colonies were nearly debt-free and decided it was time for them to pay for the peacetime garrison of 8,000 to 10,000 Redcoats stationed in North America. In April 1764, Parliament passed the Currency Act and the Sugar Act, which limited inflationary Colonial currency and cut the trade duty on foreign molasses. In 1765, they struck again. Twice. The Quartering Act forced the Colonists to pay for billeting the king’s troops. Then the infamous Stamp Act placed direct taxes on Americans for the first time.

This was one step too far and inevitably led to the Revolutionary War, an armed conflict that drew in hot-blooded, tempestuous individuals like Benedict Arnold. A brilliant military leader of uncommon bravery, Arnold poured his life into the Revolutionary cause, sacrificing his family life, health and financial well-being for a conflict that left him physically crippled. Sullied by false accusations, he became profoundly alienated from the American cause for liberty. With his bitterness unknown to Washington, on Aug. 3, 1780, the future first president announced that Arnold would take command of the garrison at West Point.

The newly appointed commander calculated that turning West Point over to the British, perhaps along with Washington himself, would end the war in a single stroke by giving the British control of the Hudson River. The conspiracy failed when André was captured with incriminating documents. Arnold fled to a British warship, and the British refused to trade him for André, who was hanged as a spy after pleading to be shot by firing squad instead. Arnold went on to lead British troops in Virginia, survived the war, and eventually settled in London. He quickly became the most vilified figure in American history and remains the symbol of treason to this day.

Gen. Nathanael Greene, often called Washington’s most gifted and dependable officer, summed it up after the war most succinctly: “Since the fall of Lucifer, nothing has equaled the fall of Arnold.”

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].