Roosevelt Used Radio to Encourage, Hitler to Fuel Rage

A Franklin D. Roosevelt photograph, signed and inscribed to Eleanor Roosevelt, sold for $10,000 at an October 2016 Heritage auction.

By Jim O’Neal

Saul Bellow was a Canadian-born writer who became a naturalized U.S. citizen after discovering he had immigrated to the United States illegally as a child. He hit the big time in 1964 with his novel Herzog, which won the U.S. National Book Award for Fiction. Time magazine named it one of the 100 best English-language novels published since "the beginning of Time" (the magazine's founding on March 3, 1923).

Along the way, Bellow (1915-2005) also managed to squeeze in a Pulitzer Prize, the Nobel Prize for Literature, and the National Medal of Arts. He is the only writer to win the National Book Award for Fiction three times.


Bellow loved to describe his personal experience listening to President Roosevelt, an American aristocrat (Groton and Harvard educated), hold the nation together, using only a radio and the power of his personality. “I can recall walking eastward on the Chicago Midway … drivers had pulled over, parking bumper to bumper, and turned on their radios to hear every single word. They had rolled down the windows and opened the car doors. Everywhere the same voice, its odd Eastern accent, which in anyone else would have irritated Midwesterners. You could follow without missing a single word as you strolled by. You felt joined to these unknown drivers, men and women smoking their cigarettes in silence, not so much considering the president’s words as affirming the rightness of his tone and taking assurances from it.”

The nation needed the assurance of those fireside chats, the first of which was delivered on March 12, 1933. Between a quarter and a third of the workforce was unemployed. It was the nadir of the Great Depression.

The “fireside” was figurative; most of the chats emanated from a small, cramped room in the White House basement. Secretary of Labor Frances Perkins described the change that would come over the president just before the broadcasts. “His face would smile and light up as though he were actually sitting on the front porch or in the parlor with them. People felt this, and it bound them to him in affection.”

Roosevelt’s fireside chats and, indeed, all of his efforts to communicate contrasted with those of another master of the airwaves, Adolf Hitler, who fueled rage in the German people via radio and encouraged their need to blame, while FDR reasoned with and encouraged America. Hitler’s speeches were pumped through cheap plastic radios manufactured expressly to ensure complete penetration of the German consciousness. The appropriation of this new medium by FDR for reason and common sense was one of the great triumphs of American democracy.

Herr Hitler ended up committing suicide, after ordering that his body be burned to prevent the Allies from retrieving any of his remains. So ended the grand 1,000-year Reich he had promised … poof … gone with the wind.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Early Automotive Pioneers Among America’s Top Innovators

A Lincoln Motor Company stock certificate, issued in October 1918 and signed by Henry M. Leland, sold for $500 at an October 2013 auction.

By Jim O’Neal

Doctors called it a “chauffeur’s fracture” – a break of the radial styloid in the wrist that occurred when a driver tried to start a horseless carriage by turning the crank at the front of the car. If the engine backfired, the crank would spin backward, often breaking bones. Yet those early automobiles motoring down the streets of American cities were considered engineering marvels.

But what a challenge to start!

The two requirements were a blacksmith’s arm and a perfect sense of timing. The driver had to adjust the spark and the throttle before jumping out to turn the crank mounted on the car’s front grille. Once the spark caught and the motor fired, the driver dashed back to the controls to adjust the spark and throttle before the engine could die. Oh, and if the car started while in gear, it could lurch forward and run over the cranker!

Sound farfetched?

In 1908, tragedy struck when Byron Carter (1863-1908) – inventor of the Cartercar – died after trying to start a stalled car. It was a fluke: He had stopped to help a stranded motorist, and the driver forgot to retard the spark. The crank kicked back and hit him in the jaw. Whamo! Gangrene set in, and he later died of pneumonia.

The car involved was a new Cadillac, one of the premier luxury brands, and Carter was a good friend of the man who ran Cadillac, Henry Leland (who would later found Lincoln). When Leland found out his friend had been killed, he vowed: “The Cadillac car will kill no more men if we can possibly help it!” Cadillac engineers succeeded in building an electric self-starter, but were never able to scale it for commercial use.

Enter Charles Franklin Kettering (1876-1958), a remarkable man (in the same league as Thomas Edison) whose versatile skills included engineering and savvy business management. He was a prolific inventor with 186 notable patents. One of them was a self-starter small enough to fit under the hood of a car, running off a small storage battery. A New York inventor (Clyde J. Coleman) had applied for a patent in 1899 for an electric self-starter, but it was only a theoretical solution and never marketed.

After graduating from the Ohio State University College of Engineering, Kettering joined the inventions staff at the National Cash Register (NCR) Company. There he invented a high-torque electric motor to drive a cash register, allowing a salesperson to ring up a sale without turning a hand-crank. After five years at NCR, he set up his own laboratory in Dayton, Ohio, where, working with a group of engineers, mechanics and electricians, he developed the new ignition system for the Cadillac Automobile Company.

Leland sold Cadillac to General Motors in 1909 for $4.5 million, and there is no record of any Cadillac ever killing another person – at least not from turning a crank to start the engine! Cadillac had been formed from remnants of the Henry Ford Company (the second of Ford’s two failed ventures) and renamed for Antoine Laumet de La Mothe, sieur de Cadillac, the founder of Detroit 200 years earlier.

Later, Leland would sell Lincoln, his other luxury marque, to Ford Motor Company for a healthy $10 million, while Kettering and his crew formed the Dayton Engineering Laboratories Co., which became Delco, still a famous name in automotive electrical parts. Kettering went on to a long, sterling career and was featured on the cover of Time on Jan. 9, 1933 … the week after President-elect Franklin Delano Roosevelt was named the magazine’s Man of the Year (Jan. 2).

My only quibble is the work Kettering did with Thomas Midgley Jr. in developing Ethyl gasoline, which eliminated engine knock but loaded the air we breathe with lead (a potent neurotoxin) for the next 50 years. He also developed Freon … a much safer refrigerant, but a chlorofluorocarbon (CFC) that will keep destroying atmospheric ozone for the next 100-200 years.

I don’t recall ever personally turning an engine crank. My cars went from ignition keys to keyless and I plan to skip the driverless models and wait for a Jet-Cab … unless Jeff Bezos can provide an Uber-style version using one of his drones.

Things change.


Yes, George C. Marshall Earned Title of ‘Greatest Living American’

A photograph of General George C. Marshall, signed, went to auction in October 2007.

By Jim O’Neal

In Harvard Yard, a venue carefully chosen as dignified and non-controversial, Secretary of State George C. Marshall’s 15-minute speech on June 5, 1947, painted a grim picture for the graduates. With words crafted and refined by the most brilliant minds in the State Department, Marshall outlined the “continuing hunger, poverty, desperation and chaos” in a Europe still devastated after the end of World War II.

Marshall, one of the greatest Secretaries of State the United States has ever produced, asserted unequivocally that it was time for a comprehensive recovery plan. The only caveat was that “the initiation must come from Europe.” His words were much more than typical boilerplate commencement rhetoric, and Great Britain’s wily Foreign Minister Ernest Bevin heard the message loud and clear. By July 3, he and his French counterpart, Georges Bidault, had invited 22 nations to Paris to develop a European Recovery Program (ERP).

Bevin had been alerted to the speech’s importance by Dean Acheson, Marshall’s Under Secretary of State. Acheson was point man for the old Eastern establishment and had already done a masterful job of laying the groundwork for Marshall’s speech. He made the public aware that European cities still looked like the bombs had just stopped falling, ports were still blocked, and farmers were hoarding crops because they couldn’t get a decent price. Further, the Communist parties of France and Italy (on direct orders from the Kremlin) had launched waves of strikes, destabilizing already shaky governments.

President Harry S. Truman was adamant that any assistance plan be called the Marshall Plan, honoring the man he believed to be the “greatest living American.” Yet much of Congress still viewed it as “Operation Rat Hole,” pouring money into an untrustworthy socialist blueprint.

The Soviets and their Eastern European satellites refused an invitation to participate, and in February 1948, Joseph Stalin’s vicious coup in Prague crushed Czechoslovakia’s coalition government, inspiring speedy passage of the ERP. This dramatic action marked a significant step away from the FDR-era policy of non-commitment in European matters, especially expensive aid programs. The Truman administration had pragmatically accepted a stark fact – the United States was the only Western country with any money after WWII.

Shocked by reports of starvation in most of Europe and desperate to bolster friendly governments, the administration offered huge sums of money to any democratic country in Europe able to develop a plausible recovery scheme – even those in the Soviet sphere of influence – despite the near-maniacal resistance of the powerful and increasingly paranoid Stalin.

With no trepidation, on April 14, the freighter John H. Quick steamed out of Texas’ Galveston Harbor, bound for Bordeaux with 9,000 tons of American wheat. Soon, 150 ships were busy shuttling across the Atlantic carrying food, fuel, industrial equipment and construction materials – essential to rebuilding entire countries. The Marshall Plan’s most impressive achievement was its inherent magnanimity, for its very success returned Europe to a competitive position with the United States!

Winston Churchill wrote, “Many nations have arrived at the summit of the world, but none, before the United States, on this occasion, has chosen that moment of triumph, not for aggrandizement, but for further self-sacrifices.”

Truman may have been right about this greatest living American and his brief speech that altered a ravaged world and changed history for millions of people – who may have long forgotten the debt they owe him. Scholars are still studying the brilliant tactics involved.


National Debt on Automatic Pilot to More Growth

A letter by President George W. Bush, signed and dated July 4, 2001, sold for $16,730 at an April 2007 Heritage auction.

By Jim O’Neal

In May 2001 – just 126 days after President George W. Bush took office – Congress passed his massive tax proposal. The Bush tax cuts had been reduced to $1.3 trillion from the $1.65 trillion submitted, but it was still a significant achievement from any historical perspective. It had taken Ronald Reagan two months longer to win approval of his tax cut and that was 20 years earlier.


Bush was characteristically enthusiastic about this, but it had come at a serious cost in political capital. Senator James Jeffords, a moderate from Vermont, announced his withdrawal from the Republican Party to become an Independent, tipping control of the Senate to the Democrats – the first time in history that control had changed hands as the result of a senator switching parties. Several months later (after the terrorist attacks on the World Trade Center and the Pentagon), there was a loud chorus of calls to reverse the tax cuts to pay for higher anticipated spending.

Bush had a counter-proposal: Cut taxes even more!

Fiscal conservatives were worried that there would be the normal increase in the size and power of the federal government, lamenting that this was a constant instinctive companion of hot wars. James Madison’s warning that “A crisis is the rallying cry of the tyrant” was cited against centralization that would foster liberal ideas about the role of government and even more dependency on the federal system.

Ex-President Bill Clinton chimed in to say that he regretted not using the budget surplus (really only a forecast) to pay off the Social Security trust fund deficit. Neither he nor his former vice president had dispelled the myth of a “lock box” or explained the federal building in West Virginia built exclusively to hold the government’s IOUs to Social Security. In reality, they were simply worthless pieces of scrip, stored in unlocked filing cabinets. The only change that had ever occurred with Social Security funds was whether they were included in a “unified budget” or not; they had never been kept separate from other revenues the federal government received.

But this was Washington, D.C., where, short of a revolution or civil war, change comes in small increments. Past differences, like family arguments, linger in the air like the dust that descends from the attic. All of the huge surpluses totally disappeared with the simple change in the forecast and have never been discussed since.

Back at the Treasury Department on 15th Street, a statue of Alexander Hamilton commemorates the nation’s first Treasury Secretary, a fitting honor for the man who created our fiscal foundation. On the building’s other side stands Albert Gallatin, President Thomas Jefferson’s Treasury Secretary, who struggled to pay off Hamilton’s debts and shrink the bloated bureaucracy Hamilton had built.

Hamilton also fared better than his onetime friend and foe, James Madison. The “Father of the Constitution” had no statue, no monument, no lasting tribute until 1981, when the new wing of the Library of Congress was named for him. It was a drought matched only by John Adams, the Revolutionary-era patriot and ardent nationalist: Only after David McCullough’s laudatory 2001 biography did Congress commission a memorial to the nation’s second president.

Since the Bush tax cut and the new forecast, the national debt has ballooned to $20 trillion as 9/11, the wars in Iraq and Afghanistan, and the 2008 financial meltdown produced a steady stream of budget deficits in both the Bush and Barack Obama administrations. The Donald Trump administration is poised to approve tax reform, amid arguments over its stimulative effect on the economy and who will benefit. In typical Washington fashion, there is no discussion of the fact that the national debt is inexorably on automatic pilot to $25 trillion, irrespective of tax reform. But this is Washington, where your money (and all they can borrow) is spent with almost no effort.

“Just charge it.”


Truman Well Aware that Presidency was a Most Terribly Responsible Job

A Harry Truman signed and inscribed photograph, dated Jan. 17, 1953, sold for nearly $3,885 at a February 2006 Heritage auction.

By Jim O’Neal

The news broke shortly before 6 p.m. on April 12, 1945. President Franklin Roosevelt had died of a cerebral hemorrhage in Warm Springs, Ga. Within minutes, the bulletin had reached every part of the country. It was almost midnight in London, but in Berlin, it was already the next day, where it was (ominously) Friday the 13th. However, Joseph Goebbels interpreted it as a lucky turning point when he telephoned Adolf Hitler. He was already devising ways to turn this to Germany’s advantage, even as enemy troops closed in on the Third Reich.

By 7 p.m., Harry Truman, his wife Bess, his daughter Margaret and the Cabinet were assembled in the Cabinet Room, along with Chief Justice Harlan Stone, who administered the oath of office. Within hours of Roosevelt’s death, the country had a new president.

Then the family and the Cabinet were dismissed. Secretary of War Henry Stimson lingered to brief the new president on a matter of extreme urgency. He explained that a new weapon of almost inconceivable power had been developed, but offered no details. Truman had just learned about the existence of the atomic bomb. He canceled a date to play poker and went to bed. It had been a long day.

It was also a long day for America’s top generals: Dwight D. Eisenhower, George S. Patton and Omar Bradley. The shock of losing their trusted commander-in-chief was compounded by genuine concern over Truman’s lack of experience. To make matters worse, they had just seen their first Nazi death camp. All were depressed, but Patton was especially emphatic about his concerns for the future.

The next morning, President Truman arrived at the White House promptly at 9 a.m. It was now April 13, 1945 – 27 years to the day since he had landed at Brest, France (Brittany), as a lowly 1st Lieutenant in the American Expeditionary Forces in WWI. Now he was the United States’ commander in chief in the century’s second world war. Everything in the Oval Office was eerily just as FDR had left it. He sat in the chair behind the desk and quietly pondered the challenges he had inherited. Downstairs, the White House staff was frantically coping with the press, the jangle of telephones, and wondering what to do next.

After a routine update on the status of the war, Truman surprised everyone by announcing he was going to the Capitol to “have lunch with some of the boys” … 17 congressmen to be exact. After a few drinks and lunch, he told the group he felt overwhelmed and emphasized he would need their help. Then he stepped out to meet the assembled press and made his now famous remarks: “Boys, if you ever pray, pray for me now. I don’t know whether you fellows ever had a load of hay fall on you, but when they told me yesterday what had happened, I felt like the moon, the stars and all the planets had fallen on me.”

Less than four months later, in August 1945, the man from Independence, Mo., now confident and in control, dropped his own bombshell when he broadcast to the nation:

“Sixteen hours ago, an American airplane dropped one bomb on Hiroshima,” the president said, adding, “We are now prepared to obliterate more rapidly and completely every productive enterprise the Japanese have above ground in any city. We shall destroy Japan’s power to make war. … If they do not now accept our terms, they may expect a rain of ruin from the air the like of which has never been seen on this earth.”

The buck DID stop here, just as the little sign on his desk promised.


‘Huck Finn’ Established Enduring Hallmarks of Our National Sense of Humor

A first American edition, first issue of Mark Twain’s Adventures of Huckleberry Finn, with a tipped-in Twain signature, sold for $11,875 at an April 2016 Heritage auction.

By Jim O’Neal

Here’s a good literary rule of thumb: Any book that someone has seen fit to ban has got to be worth reading. And when a book has been banned for so long, so often and for so many reasons – as has The Adventures of Huckleberry Finn – that’s as compelling an endorsement as you can hope for.

If anything, Huckleberry Finn has become even more disturbing since its first appearance in 1885. It’s a book that, even when you reread it, is never what you expect – by turns raw, sweet, funny, deeply principled and deliberately shocking.


It has been a lightning rod for the squeamish and for naysayers and prigs of every stripe. Huck Finn was a victim of political correctness long before there was such an imprecise, overused and contentious term. The indictment is by now tiresomely familiar, and of some of the charges, at least, the book is plainly guilty. Yes, it uses the “n” word – 215 times, to be exact. People used the word back then and still do today, though who is allowed to use it seems to shift, with the rules enforced for maximum social effect. And Huck – not to mention most of the book’s other characters, black and white – does engage in what we would call racist thinking.

But Huck is not Mark Twain, remember, and early on we discover he has a lot to learn. The story of Huckleberry Finn is, in part, the story of Huck’s education and he is taught by none other than Jim, the runaway slave who is in fact the book’s wisest and most humane character. Set at a time when America was still riven and corrupted by slavery, Huckleberry Finn is a depiction of racism at its most virulent, but it is itself among the most anti-racist novels ever written.

The charge of racism is so specious that it invites us to wonder if some other agenda isn’t at work in the minds of those who raise it. And the same is true of those 19th century moralists like librarians, town fathers and custodians of the public weal who originally objected on grounds of vulgarity and sacrilegiousness. What really upsets people about Huck Finn is that it is so deeply skeptical – subversive even – of received wisdom and official pieties of every sort. It breaks all the rules and does so from its first sentence: “You don’t know about me, without you have read a book by the name of The Adventures of Tom Sawyer, but that ain’t no matter.”

There had never been a sentence like that in American literature before. It is vulgar, in the way that everyday speech is vulgar, and it introduces us not to the familiar narrator of 19th-century fiction, with his measured cadences and worldly wisdom, but to a 14-year-old boy, a wiseass who takes nothing on faith, especially not what he’s been told by his elders. It’s a voice so authentically American that it’s startling we never heard it in books until then. In an almost embarrassing way, it reveals as phony so much of what had been written before it.

Ernest Hemingway famously said of Huckleberry Finn that all American literature comes from it and that “there was nothing before” – which is a stretch, but not by much. What is certainly true is that all American comedy comes from Huck Finn and it established in an instant the two enduring hallmarks of our national sense of humor: a deadpan delivery and a take-no-prisoners attitude. We get taken on a tour down Twain’s beloved Mississippi through an America in the process of becoming a country. Filled with an incomparable gallery of rogues, swindlers and hypocrites, it’s Twain’s take on this nascent country … withering, but exact.

I suspect nothing would please Twain more – or surprise him less – than to learn that his book still has the power both to amuse us and to make us wince … things that are in short supply as we hurry from one outrage to the next, driven by people who will probably always be angry about something.


How We Record History Has Evolved Over the Ages

A 1935 copy of The History of Herodotus of Halicarnassus (Nonesuch Press) sold for $1,125 at an October 2013 auction.

By Jim O’Neal

We often fail to remember that history itself has a history. From the earliest times, all societies told stories from their past, usually imaginative tales involving the deeds of heroes or various gods. Later civilizations kept records inscribed on clay tablets or the walls of caves. However, ancient societies made no attempt to verify these records, and often failed to differentiate between real events and myth and legend.

This changed in the 5th century B.C., when historians like Herodotus and Thucydides explored the past through the interpretation of evidence, even though their accounts still included a mixture of myth (“history” means “inquiry” in Greek). Indeed, Thucydides’ account of the Peloponnesian War satisfies most criteria of modern historical study: It was based on interviews with eyewitnesses and attributed actual events to individuals rather than the intervention of gods.

Thus, Thucydides managed to create the most durable form of history: the detailed narrative of war, political conflict, diplomacy and decision-making. The subsequent rise of Rome to dominance of the Mediterranean encouraged other historians, like Polybius (a Greek) and Livy (a Roman), to develop narratives that captured a “big picture” and made sense of events over a longer time frame. Although restricted to the Roman world, this was the beginning of universal history – describing progress from origin to present, with the goal of giving the past a purpose.

In addition to making sense of events through narratives, a tradition was growing of examining the behavior of heroes and villains for moral lessons. We still attempt this today with a steady stream of studies of Lincoln, Churchill and Gandhi, as well as Stalin, Hitler and Mao.

But there was a big hiccup with the rise of Christianity in the late Roman Empire, which fundamentally changed the concept of history in Europe. Historical events came to be viewed as “divine providence,” the working of God’s will. Skeptical inquiry was usually neglected and miracles routinely accepted without question. Thankfully, the Muslim world was more sophisticated in medieval times; its historians rejected accounts of events that could not be verified.

However, neither Christians nor Muslims produced anything close to the chronicle of Chinese history published under the Song Dynasty in 1085. It recorded history spanning almost 1,400 years and filled 294 volumes. (I have no idea how accurate it is!)

By the 20th century, the subject matter of history – which had always focused on kings, queens, prime ministers, presidents and generals – increasingly expanded to embrace common people, whose role in historical events became more accessible. But most world history was still written as the story of the triumph of Western civilization until the second half of the century, when the notion of a single grand narrative simply collapsed. Instead, the post-colonial, modern world demanded the study of black history and women’s history, along with those of Asians, Africans and American Indians.

Now we are in another new place where it is increasingly difficult to know where to find reliable accounts of real events and a flood of “fake news” is competing for widespread acceptance. Maybe Henry Ford was right after all when he declared that “History is bunk!”

Personally, I don’t mind and still enjoy frequent trips to the past … regardless of factual flaws.


Wall Street was Booming Just Months Before the Great Depression

A vintage photograph shows Calvin Coolidge in Plymouth, Vt., shortly after he learned of President Warren G. Harding’s death.

By Jim O’Neal

After the 1928 election, President-elect Herbert Hoover met with incumbent Calvin Coolidge to make a special request. There were four months to go until inauguration and Hoover planned to use six weeks of that time to tour Latin America. He asked the president to place a battleship at his disposal since he wanted to include Mrs. Hoover, who spoke fluent Spanish.

Initially, Coolidge suggested a cruiser “since it does not cost so much,” but finally relented and gave Hoover the battleship USS Maryland for the voyage out and the USS Utah for the trip home from Montevideo, Uruguay. This was classic Calvin Coolidge, always looking for creative ways to avoid federal spending.

Then Coolidge dispatched his final annual message to Congress on Dec. 4. The document revealed the optimism felt by Coolidge and the nation as a whole: “No Congress of the United States, on surveying the State of the Union, has met with a more promising prospect than that which appears at the present time. In the domestic field, there is tranquility and contentment, harmonious relations between management and wage earner, freedom from industrial strife and the highest record of years of prosperity.”

In his budget address, read to Congress the following day, Coolidge said estimated revenues for 1929 were $3.831 billion with expenditures of $3.794 billion. Since the surplus was smaller than hoped for, he would not ask for yet another tax cut.

Calvin Coolidge – who assumed the presidency when Warren Harding died in 1923 – had a simple fiscal philosophy: hold the line on spending (and if possible reduce it) while cutting taxes. He believed this would produce greater personal freedom and a more moral population. Federal expenditures fell from $3.1 billion in 1923 to $3.0 billion by 1928. Despite the tax cuts, revenues held steady at $3.9 billion, and the national debt fell from $22.3 billion to $17.6 billion. The number of federal employees in Washington fell from 70,000 to 65,000.

By 1929, automobiles jammed the roads, spurring a major construction boom. The Ford Model A was enthusiastically greeted in 1927, but the talk of the industry was Walter Chrysler, who came from nowhere to build the third-largest company in the industry. Auto sales zoomed and the Federal Oil Conservation Board announced the country was in danger of running out of petroleum.

The front-page news of early 1929 was Britain’s ailing King George V, whose sons were rushing home to his bedside. But the business pages focused on RCA’s purchase of the Victor Talking Machine Company, following the acquisition of Keith-Albee-Orpheum, which was renamed RKO. The stock of RCA was now selling at a P/E of 26 and there was talk of a 5-for-1 stock split.

Wall Street was booming and dividends were at an all-time high. The Federal Reserve complained about banks using its money to fuel speculation, but the only response came from the small Dallas Reserve Bank, which raised its discount rate to 5 percent (yawn). A few months later, Wall Street crashed and the entire country spiraled down into the Great Depression, which would last more than 10 years.

Welcome to Washington, D.C., President Hoover. It’s all yours!


For Anthony and Women’s Rights, Failure was Impossible

An 1873 letter by Susan B. Anthony, written one month after her trial for voting illegally, realized $9,375 at a November 2015 Heritage auction.

By Jim O’Neal

On Nov. 5, 1872, Susan B. Anthony wrote to Elizabeth Cady Stanton: “Well, I have been & gone & done it!! – positively voted the Republican ticket – strait [sic] – this a.m. at 7 o’clock.”

Anthony had cast her ballot at a barbershop in Rochester, N.Y. She was one of 6,431,149 citizens who voted in the election between Ulysses S. Grant and Horace Greeley, an election Grant won decisively by more than 760,000 votes. Three weeks later, on Thanksgiving Day, Anthony and a handful of other women who voted with her were arrested and indicted for having “knowingly voted without having a lawful right to vote.”

The verdict at her trial was a foregone conclusion. The judge refused to let her take the witness stand and then instructed the all-male jury to find her guilty without any deliberation. Anthony succeeded in being heard, however, when the judge asked if she “had anything to say why sentence shall not be pronounced?” She quickly replied,

“Yes, your honor, I have many things to say, for in your ordered verdict of guilty, you have trampled underfoot every vital principle of our government. My natural rights, my civil rights, my political rights are all alike ignored. Robbed of the fundamental privilege of citizenship, I am degraded from the status of a citizen to that of a subject … doomed to political subjection.”

Susan B. Anthony

She then refused to pay the $100 fine the judge ordered, but he refused to imprison her, thereby preventing her from appealing to a higher court. Undeterred, Anthony took her case to the public and had thousands of copies of the trial proceedings printed and widely distributed.

Susan B. Anthony would find other ways to relentlessly press the cause of women’s suffrage. Brought up as a Quaker and active as an early supporter of temperance, she soon realized that until women could vote, politicians would not pay any attention to them. For more than 50 years, she urged lawmakers to enfranchise the other half of America’s citizens. She attended her first women’s rights convention in Syracuse in 1852, and with Elizabeth Cady Stanton founded the American Equal Rights Association in 1866. The two women published a feisty newspaper, The Revolution, whose masthead proclaimed “Men their rights and nothing more; women, their rights and nothing less.”

She appeared before every U.S. Congress between 1869 and 1906 to ask them to pass a suffrage amendment. She was as prepared as any modern-day lobbyist – her copy of the seating chart for all members of Congress has survived. Her speech to a Senate committee in 1904 reflected her frustration: “I never come here, and this is the seventeenth Congress I have attended, but with the feeling of injustice which ought not to be borne, because the women, one-half the people, are not able to get a hearing from the Representatives and Senators of the United States.”

Her combative tone did not mellow with age. When President Theodore Roosevelt sent congratulations in 1906 for her 86th birthday celebration, her response was indignant: “I wish the men would do something besides extend congratulations … I would rather have him say a word to Congress for the cause than to praise me endlessly.”

She ended that evening’s gathering, her final public appearance, with a ringing prophecy: “There have been others also just as true and devoted to the cause … but with such women consecrating their lives, failure is impossible!”

Less than a month later, on March 13, 1906, she died at her home in Rochester, N.Y. The rights for which she had worked so tirelessly were finally won when the Nineteenth Amendment, the “Susan B. Anthony Amendment,” passed on June 4, 1919, as women stood on the steps of the Capitol to cheer. The vote was close, only one more than the required two-thirds. To enable the passage, two Congressmen had come from hospitals to vote aye; a third left his suffragist wife’s deathbed to cast a vote, then returned for her funeral. When the State of Tennessee became the 36th state to ratify, the amendment was officially adopted on Aug. 18, 1920 – nearly half a century after Susan B. Anthony had illegally voted for Ulysses S. Grant.

A life. A cause. Finally accomplished.


What Nature Separated, Explorers Brought Together

A letter Albert Einstein sent to Charles Hapgood in November 1954 sold for $3,125 at an April 2015 Heritage auction.

By Jim O’Neal

To my knowledge, the only trained geologist I have met is Simon Winchester. He is now a successful journalist and author. Most of the people I know tend to look up or back (in history) as opposed to downward. I assume most geologists are average, normal people, except when I read about 20th century geology and the difficulties they had in reaching anything close to a consensus about Earth.

As early as 1912, a German – Alfred Wegener – developed the theory that the continents had once been joined in a single landmass (he called it Pangaea) before drifting apart. However, virtually everyone else argued that continents moved up and down, but definitely not “sideways.” They developed elaborate theories to explain all the evidence.

Just before he died in 1955, Albert Einstein enthusiastically endorsed a theory by geologist Charles Hapgood, who wrote the book “Earth’s Shifting Crust: A Key to Some Basic Problems of Earth Science.” Hapgood systematically debunked the theory of continental drift and dismissed anyone who believed it as “gullible.” When it was finally realized that the whole crust of the Earth was in motion – and not just the continents – it took a while to settle on a new name.

It wasn’t until 1968, when the Journal of Geophysical Research published an article by three American seismologists, that the new science was dubbed plate tectonics. Still, as late as 1980, one in eight geologists didn’t believe in plate-tectonic theory. It is not clear if there are any continental-drift deniers left, but we know for sure that the pre-1492 American and Eurasian ecosystems existed in complete isolation for thousands of years. The arrival of the first Europeans in North and Central America reconnected them and started what was to become known as the great Columbian Exchange.

Lives and economies that had evolved gradually over centuries were suddenly transformed by the influx of new crops, animals, technology and pathogens. Many of the effects were unforeseen and generally misunderstood by both American Indians and Europeans; but once the first landing occurred, massive changes were inexorably under way. One small example is that 60 percent of all crops grown in the world today originated in the Americas. A more immediate and devastating impact was the introduction of new diseases into the Americas that wiped out hundreds of thousands of Indians who had no biological defenses against smallpox, measles, malaria, chickenpox and yellow fever.

The dramatic and irrevocable changes brought about on both sides of the Atlantic by the Columbian Exchange continued to shape lives for centuries, just as the movement of the Earth’s crust shapes our lives today by earthquakes, ocean movements and other ecological events.

We live in and on things that will continue to change as our televisions remind us every day. The longer-term changes are still being debated and are now politicized to the point where prudent policies are paralyzed. It must be something in our DNA.
