LBJ exhibited ambition, decisiveness, a strong work ethic … and fear of failure

Lyndon B. Johnson artifacts, including signed photographs and a Civil Rights Bill signing pen, sold for $15,000 at an October 2018 Heritage auction.

By Jim O’Neal

Lyndon Baines Johnson was born in August 1908 in the Texas Hill Country nearly 112 years ago. (Tempus does fugit!). He shined shoes and picked cotton for pocket money, graduating from high school at age 15. Both his parents were teachers and encouraged the reading habits that would benefit him greatly for the rest of his life.

Tired of both books and study, he bummed his way to Southern California, where he picked peaches, washed dishes and did other odd jobs like a common hobo. The deep farm recession forced him back to Texas, where he borrowed $75 to earn a teaching degree from a small state college. Working with impoverished Mexican children gave him a unique insight into poverty. He loved to tell stories from that time in his life, especially when he was working on legislation that improved life for common people.

His real power developed when he electrified the rural Hill Country and created a pool of money from power companies that he doled out to politicians all over the country who needed campaign funds and were willing to barter their votes in Congress. The women and girls of the Hill Country were known as “bent women” from toting water – two buckets at a time – from wells to their homes. Having electricity to pump the water meant the next generation of women would not be hump-backed. They said of LBJ, “He brought us light.” This caught FDR’s attention and led to important committee assignments.

He married 20-year-old Claudia Alta Taylor in 1934 (at her birth, a nanny had exclaimed, “She looks just like a little lady bird”). A full-grown Lady Bird parlayed a small inheritance into an investment in an Austin radio station that grew into a multimillion-dollar fortune.

Robert Caro has written about LBJ’s ambition, decisiveness and willingness to work hard. But how does that explain his trepidation about running for president in 1960? He had been Senate Majority Leader, had accumulated lots of political support and had a growing reputation for his civil rights record. He even told his associates, “I am destined to be president. I was meant to be president. And I’m going to be president!” Yet in 1958, when he was almost perfectly positioned to make his move, he was silent.

His close friend, Texas Governor John Connally, had a theory: “He was afraid of failing.”

His father had been a fair politician, but he failed, lost the family ranch, plunged into bankruptcy and became the butt of town jokes. In simple terms, LBJ was afraid to run and lose. That explains why he didn’t announce until it was too late and JFK had it sewed up.

Fear of failure.

After JFK won the 1960 nomination at the Democratic National Convention in Los Angeles, he knew LBJ would be a valuable vice president on the Democratic ticket against Richard Nixon. Johnson’s Southwestern drawl expanded the base, and Texas’ 24 electoral votes were too tempting to pass up. They were all staying at the Biltmore Hotel in L.A., a mere two floors apart. Kennedy personally convinced LBJ to accept, despite brother Bobby’s three attempts to get him to decline (obviously unsuccessful).

The 1960 election was incredibly close, with only about 100,000 votes separating Kennedy and Nixon. Insiders were sure that a recount would uncover corruption in Illinois and that Nixon would be declared the winner. But in a big surprise, RMN refused to demand a recount to avoid massive disruption in the country. (Forty years later, Bush v. Gore demonstrated that chaos in the 2000 Florida “hanging chads” debacle, along with the stain on SCOTUS for stopping the Florida recount.)

After the Kennedy assassination in November 1963, LBJ was despondent since he was sure he’d be remembered as the “accidental president.” But when he demolished Barry Goldwater in 1964, the old Lyndon was back. The Johnson-Humphrey ticket won by one of the greatest landslides in American history. LBJ got 61.1 percent of the popular vote and 486 electoral votes to Goldwater’s 52. More importantly, Democrats increased their majorities in both houses of Congress.

This level of domination provided LBJ with the leverage to implement his full Great Society agenda with the help of the 89th Congress, which approved multibillion-dollar budgets. After LBJ ramrodded his liberal legislative programs through Congress in 1965-66, it seemed that he might go down in history as one of the nation’s truly great presidents. But his failure to bring Vietnam to a successful conclusion, the riots in scores of cities in 1967-68, and the spirit of discontent that descended on the country turned his administration into a disaster.

On Jan. 22, 1973, less than a month after President Truman died, the 64-year-old Johnson died of a heart attack. His fear of failure, a silent companion.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Fortunately for America, the primary issue in 1940 was staying out of war

A 1940 Wendell Willkie anti-FDR cartoon pin, featuring an image of the boy who would become Alfred E. Neuman, sold for $1,625 at a February 2020 auction.

By Jim O’Neal

In September 1940, The New York Times surprised many readers when it announced it would support Wendell Willkie for president. It was a critical time for America as Nazi Germany had swept across the democratic nations of Europe and soon would threaten England’s weak defenses. Once that domino fell, the United States would be exposed to direct attacks via the eastern routes of the northern Atlantic Ocean. It would also dispel the long-standing fallacy that our two great oceans provided insurmountable defenses. 
 
While the Times conceded that both presidential candidates were experienced leaders who recognized the magnitude of the threat and, short of direct intervention, clearly understood the major role America would ultimately be forced to play, it favored Willkie over FDR because his extensive business experience would better prepare him to mount a robust defense of America. His production experience would be invaluable in gearing up the industrial base that would be required. In this role, Willkie was the professional and Roosevelt clearly the amateur.
 
There was also an almost unspoken concern about whether the next president would be tough enough to defeat an enemy that had demonstrated a level of ruthlessness and cold-blooded efficiency rarely seen in modern times. Maybe it was the wheelchair that was discreetly hidden, or the soft, cozy fireside chats to bolster morale during seven years of hard economic times. But FDR’s smiling, cheerleading style paled in comparison to Willkie’s tough talk about “sweat and toil,” his emphasis on self-sacrifice and his radiant confidence that America could rebuild its earlier superiority.
 
The Times had supported FDR in 1932 and 1936, but the fiscal policies of the New Deal had failed disastrously and the national debt had more than doubled in seven years. A continuation, it feared, would lead the country to the precipice of bankruptcy. Looking back, it is now obvious that these concerns were totally misjudged. FDR turned out to be a wily, tough executive who managed Winston Churchill and Joseph Stalin with superior strategic skills and the courage to hammer out agreements without blinking. The United States war machine cranked out planes, tanks and military men at a remarkable rate. Public support was overwhelming as the entire nation joined in. The American Tobacco Company dropped a color and advertised “Lucky Strike Green Has Gone to War.” In the process, a new era of fiscal strength evolved and the gloom of the Great Depression faded in the glare of Rosie the Riveter’s sparks. War bond parades blossomed and my family stored bacon fat in coffee cans without really knowing why. I traded comic books for butter coupons and we started eating something called oleomargarine.
 
But in 1940, breaking the precedent of no third term established by George Washington in 1796 was viewed as duplicitous. Earlier, FDR had declared, “Last September it was my intention to announce clearly and simply at an early date that under no conditions would I accept re-election.” Now, this had morphed into merely having “no wish to be a candidate again.” Clearly, it was a bit of political spin that fit the revised situation. In the defeat of FDR and the election of Mr. Willkie, the Times saw an opportunity to safeguard a tradition with the wisdom of long experience behind it.
 
Fortunately for America, the primary issue in the campaign was staying out of war, and the isolationist crusade led by the America First Committee was having a dramatic effect on the nation. Many leading figures across a broad political spectrum vehemently demanded that America stay out. Famous aviator Charles Lindbergh was perhaps the most influential voice heard. FDR was again his usual cunning political self and promised the American people that American boys would not be fighting in any “foreign wars.” That was enough to allow him to win a substantial victory in 1940 and a coveted third term. Naturally, when the Japanese attacked Pearl Harbor on Dec. 7, 1941, the “foreign war” caveat became moot and Americans were eager to seek retribution against all enemies.
 
An interesting epilogue to 1940, when FDR defeated the only presidential candidate with no government experience, was the death of Wendell Willkie in 1944 at age 52. His health had been ruined by a poor diet, incessant smoking and hard drinking. Had he defeated FDR in 1940, he would have died right after D-Day but before the heavy fighting in the Battle of the Bulge, when victory was not yet assured. However, his vice presidential running mate, Senator Charles McNary of Oregon, had died eight months earlier, and for the only time in history, we would have been forced to elevate the secretary of state to president!

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell]. 

Moral arguments continue over the use of atomic weapons in WWII

A 1971 photograph of Emperor Hirohito and Empress Nagako, signed, sold for $8,125 at an April 2015 Heritage auction.

By Jim O’Neal

If history is any guide, the month of August will arrive right on schedule. Inevitably, it will be accompanied by yet another birthday (no. 82 if my math is correct) and intellectual debates over the use of atomic bombs dropped on two Japanese cities in August 1945. Despite the passage of 74 years and the fact that it ended World War II, it remains the most controversial decision of a long, bloody war.

As a reminder, President Franklin Roosevelt had died in April 1945 soon after the start of his record fourth term in office. Vice President Harry Truman had taken his place and the new president attended a conference in defeated Germany to discuss how to persuade or force Japan to surrender. Persuasion was not really an option since the Empire of Japan was firmly committed to continue even if it resulted in the annihilation of its people and the total destruction of their country.

They had demonstrated their resolve during the bloody island-by-island fighting that left the Japanese mainland as the final target. Another amphibious landing was ruled out due to the expected enormous loss of life and an oath of 100 million inhabitants to fight until killed. Estimates vary on how many Americans would die … but they were all too high.

One strategy was to simply blockade all their ports and use our overwhelming air superiority to bomb them until they relented. But President Truman had a secret weapon and was fully prepared to use it if Japan resisted.

On July 26, 1945, Truman, British Prime Minister Winston Churchill and President Chiang Kai-shek of the Republic of China signed the Potsdam Declaration that warned the Japanese that if they did not agree to an “unconditional surrender,” they would face “prompt and utter destruction.” In addition, 3 million leaflets were dropped on the mainland to be sure the people were aware of the stakes and perhaps help pressure the leadership.

Afterwards, critics of what became the nuclear option have argued it was inhumane and violated a wartime code of ethics, perhaps like mustard gas or the chemical weapons bans we have today. However, it helps to remember that the avoidance of attacking non-combatant civilians had long been discarded by the mass bombings of European cities (e.g. the infamous firebombing of Dresden) and then by the even more brutally systematic firebombing of Japanese cities. Destruction became the singular objective, on the theory that ending the war quickly would save more lives than any precision bombing.

Case in point is Air Force General Curtis LeMay, who arrived in Guam in January 1945 to take command of the 21st Bomber Command. His theory of war was eerily similar to General William Tecumseh Sherman’s “March to the Sea” in the Civil War. LeMay explained: “You’ve got to kill people, and when you kill enough, they stop fighting.” Precision bombing had given way to terror attacks that killed civilians indiscriminately.

Importantly, LeMay had just the right equipment to destroy Japan’s highly flammable cities filled with wooden houses. First was a highly lethal weapon called the M-69 projectile, developed by Standard Oil. It was a 6-pound bomblet filled with gelatinized gasoline that, once it stuck to a target and ignited, was nearly inextinguishable. Second was a fleet of B-29 Superfortresses, ideal for continental bombing. They were powered by four 2,200-horsepower engines, carried a crew of 11 and had a range of 4,000 miles. On March 9 … 344 B-29s began dropping M-69s over Tokyo in a crisscross pattern that merged into a sea of flames. The result was 90,000 dead and another million homeless. The victims died from fire, asphyxiation and buildings falling on them. Some were simply boiled to death in superheated canals or ponds where they sought refuge from the fire.

Over the next four to five months, they attacked 66 of Japan’s largest cities, killing another 800,000 and leaving 8 million homeless.

Despite this demonstration of power, the Japanese formal reply to the Potsdam Declaration included the word “mokusatsu,” which was interpreted as an imperial refusal. It was on this basis that Truman gave the order to proceed with bombing Hiroshima on Aug. 6. He left Potsdam and was at sea when the ship’s radio received a prearranged statement from the White House: “16 hours ago, an American airplane dropped one bomb on Hiroshima … it is an atomic bomb … it is harnessing the basic power of the universe.” Three days later on Aug. 9, a second bomb was dropped on Nagasaki.

Japanese Emperor Hirohito agreed to capitulate and an imperial rescript announcing the decision to the Japanese people was recorded for radio broadcast. Most Japanese had never heard the emperor’s voice.

As the moral arguments continue about the use of atomic weapons on people (in WWII), I find it to be a distinction without a difference … at least compared to having one of LeMay’s little M-69s stuck on my back.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Tornadoes, cash registers, indictments and pardons

One of the most famous tornadoes appeared in 1939’s The Wizard of Oz. This original half sheet promotional poster for the movie sold for $108,000 at a March 2019 Heritage auction.

By Jim O’Neal

On Easter Sunday 1913, a great tornado ripped through Omaha, Neb. As people scrambled for cover, a naked man was blown through a dining-room window. He grabbed the tablecloth to use as a toga and politely asked the startled family for a pair of trousers. The local newspaper dubbed him the “human meteorite” and went on to report how the twister sucked two babies out the window of the town orphanage. Another man reported that the body of a 4-year-old girl dropped out of the sky into his arms, while cows were impaled on fence posts and chickens were plucked clean. That night, another dozen funnel clouds raced across Iowa, Illinois, Missouri, Michigan and Indiana.

After the storms’ initial volley, heavy rains began to fall, swelling the Ohio River until its levees could no longer stem the angry waters and they were breached. The submerged cities included Fort Wayne, Columbus, Cincinnati and Indianapolis. However, Dayton, Ohio, was hardest hit as the Miami River rushed downtown, washing away homes and stranding residents on roofs, buildings and even telephone poles.

Dayton-based businessman John Patterson (1844-1922) immediately seized command and converted a factory assembly line to turn out rowboats for use in rescuing trapped inhabitants. A large plant cafeteria started baking bread and other foodstuffs. Most of Dayton’s provisions were either underwater or ruined by floodwaters. Many of the town’s residents owed their lives to Patterson’s quick actions. Still, over 300 people perished and damages topped $2 billion.

Patterson had launched his business career in December 1884 when he purchased the rights for “Ritty’s Incorruptible Cashier” for $6,500. James Ritty (1836-1918) was an Ohio bar owner who discovered what all bar/restaurant owners eventually learn: Employees inevitably start pilfering cash, booze or food. Many a chef has walked out with a ham or turkey under their coat on the way home. Or perhaps served friends and relatives drinks without keeping tabs.

Ritty’s invention was essentially an adding machine that kept track of sales and controlled the cash. Patterson improved the design by adding the now-familiar pop-up numbers, a cash drawer and a bell that rang when employees used it. He quickly recognized the potential profit in selling the machine to retail merchants of every kind – all were potential customers. Thus, the National Cash Register Company was formed and Patterson went to work developing a skilled sales organization. Trainees were enrolled in a “Hall of Industrial Education” and, after graduating, received their own exclusive territory to sell the new invention.

It was an immediate success and the company gained a reputation for generous commissions. A new factory was built with glass walls so the sun could shine through. This was the era when most factories were called “sweatshops” for good reason. In addition, Patterson included free medical clinics, a swimming pool and an employee cafeteria serving healthy food. The grounds were sculpted landscapes designed by landscape architect John Charles Olmsted.

Patterson was a demanding boss and the list of future prominent businessmen he fired was a long one. One was Thomas Watson Sr. (1874-1956), who had owned a butcher shop with a shiny NCR cash register. After the business failed, Watson went to work at NCR until Patterson fired him. Watson would go on to build International Business Machines (IBM) into a world-class institution. Another was Charles Kettering (1876-1958), a near-genius engineer who went to work for General Motors after being fired several times. He would head up GM’s engineering research department for 30 years. In addition to inventing the automobile’s electric self-starter, he recorded 186 patents and became a towering member of the National Inventors Hall of Fame.

Patterson was a ruthless competitor and built a “gloom room” filled with cash registers from all the competing companies he ruined. In 1912, the company was found guilty of violating the Sherman Antitrust Act after acquiring over 80 direct competitors and ending up with a 95 percent market share. Patterson and 26 of his executives were headed to jail for a year after President Wilson refused to pardon them.

But fate intervened and an appeal overturned the conviction, partially because of Patterson’s good deeds during the Great Flood of 1913. Dayton welcomed them with a giant parade.

Pardons can be tricky for presidents, but all have used the power. Franklin Roosevelt holds the record with more than 3,600 acts of clemency! Since they were spread over four terms, there was not much political criticism. President Clinton was not so fortunate. On Jan. 20, 2001, his last day in office, he granted 140 pardons. One was to Marc Rich, an international commodities trader indicted by U.S. Attorney Rudy Giuliani in 1983 on 65 criminal counts involving income tax evasion, wire fraud and racketeering. Rich fled the United States.

When it was revealed that Rich was still a fugitive and that the pardon had been handled by Jack Quinn, it caused an uproar from both prominent Democrats and Republicans (Quinn had been Clinton’s White House Counsel). Then things escalated when it was discovered that Rich’s ex-wife had made donations to the DNC, the Clinton library and Hillary’s Senate race. Attorney General John Ashcroft asked federal prosecutor Mary Jo White to investigate, but James Comey took the lead when White left the government. The probe was closed down after federal investigators ultimately found no evidence of criminal activity.

Hmmn. Giuliani, Clinton, Comey. Small town.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Prohibition one of the most improbable political achievements in U.S. history

In 1961, Dell was publishing a comic book based on The Untouchables television show.

By Jim O’Neal

In 1959, Desilu Productions aired the first episode of The Untouchables, a TV series starring Robert Stack. The storyline was from a book of the same title that recounted the exploits of an elite group of Prohibition agents working for the Treasury Department. They were battling the mafia in Chicago during Prohibition (1920-1933). “Untouchables” was a nickname for a small group of lawmen who refused to take bribes from bootleggers.

There was a real-life Eliot Ness (1903-1957), who had a career battling booze and bootleggers, but the 119 episodes on TV were generally fictional accounts. The show had a decent run over four years (1959-63) and the format used famous newspaper/radio commentator Walter Winchell as a narrator to make it seem more realistic. Winchell (who was paid $25,000 per program) and Robert Stack were the only two actors to appear in all 119 shows.

In 1987, there was a movie version starring Kevin Costner as Ness and Robert De Niro as a fictionalized Al Capone. Sean Connery played a regular street policeman and won the Academy Award for Best Supporting Actor.

Prohibition was one of the most improbable political achievements in American history. The idea that a democratic nation with millions of voting-age drinkers eager to slake their thirst, more than 300,000 taverns and saloons, and liquor as its fifth-largest industry would be unable to prevent this legislation was unthinkable. However, the battle had been raging since as far back as Colonial times. A temperance movement had been fighting alcohol consumption on the grounds that it would destroy the moral fiber of the nation. The effort was joined by political forces with booming voices (e.g. William Jennings Bryan) who were convincingly vociferous about the evils of John Barleycorn – with only limited success.

Pragmatic business leaders discouraged the use of alcohol, pointing out the negative effects on worker reliability, job-related injuries and, importantly, productivity and quality. But the problem was gradually becoming worse as the nation’s workforce refused to heed the warnings. Perhaps the loudest and most passionate opponents of alcohol were the wives and mothers left to contend with the impact on their families if the primary breadwinner was a drunk. Gradually, it seemed that the only people who were against prohibiting alcohol were the drinkers and all the people making money providing it.

Even the government seemed supportive of alcohol, since one-third of federal revenues was derived from taxes on its sale. However, the 16th Amendment, ratified in 1913, authorized a federal income tax that generated significantly higher revenues and made the alcohol taxes expendable. In the 1916 presidential election, neither President Woodrow Wilson nor his opponent Charles Evans Hughes even mentioned the issue. Both parties were leery of discussing it for fear of alienating either the “drys” or the “wets” in such a close race.

But the ratification of the 18th Amendment on Jan. 16, 1919, banned the manufacture, transportation and sale of liquor – introducing a period in America’s history we still call simply Prohibition. Of course, wealthy men and institutions stocked up before the ban went into effect. Curiously, the law did not prohibit consumption. President Wilson moved his extensive inventory into the White House, as did his successor, Warren Harding. The Yale Club in New York procured a supply that would last 14 years! Cynics were quick to point out that the president could have his martini each evening, but the working man could no longer get a beer and a free sandwich at his favorite saloon.

For the less affluent, 15,000 doctors and 57,000 pharmacies got licenses to supply medicinal alcohol. Drugstores were discreet sources, as evidenced by Walgreens, which grew from 22 stores to over 500 in short order. Another loophole was the exemption that allowed homemade “fruit juices” for consumption exclusively in the home. Wineries in California were supplanted by millions of home winemakers. If left untended, the alcohol content in grape juice could soar to 12 percent. Grape juice production exploded.

However, a more insidious source of the banned alcohol developed. As the population migrated to big cities, nightlife became synonymous with “speakeasies.” All one had to do was “speak easy,” act discreetly, and there was a convenient door where even women were now welcome. The battle over alcohol shifted to supply and distribution, and organized crime surged as gangs fought over territories. Law enforcement was badly outgunned or was paid to look the other way.

Welcome to the Roaring Twenties!

As the Great Depression deepened, President Franklin Roosevelt signed legislation on March 22, 1933, that legalized the sale of 3.2 percent beer, saying, “I think this would be a good time for a beer.”

Amen.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Peaceful transfer of presidential power is one of our strengths

Seven Days in May, starring Burt Lancaster and Kirk Douglas, is a 1964 movie about a military/political cabal’s planned takeover of the U.S. government.

By Jim O’Neal

It seems clear that one of the bedrock fundamentals that contributes to the stability of the U.S. government is the American presidency. Even considering the terrible consequences of the Civil War – 11 states seceding, 620,000 lives lost and widespread destruction – it’s important to remember that the federal government held together surprisingly well. The continuity of unbroken governance is a tribute to a success that is the envy of the world.

Naturally, the Constitution, our system of justice and the rule of law – along with all the other freedoms we cherish – are all critical contributors. But it’s been our leadership at the top that’s made it all possible. In fact, one could argue that having George Washington as the first president for a full eight years is equal in importance to all other factors. His unquestioned integrity and broad admiration, in addition to precedent-setting actions, got us safely on the road to success despite many of the governed being loyal to the British Crown.

Since that first election in 1789, 44 different men have held the office of president (Grover Cleveland for two separate terms), and six of them are alive today. I agree with Henry Adams, who argued, “A president should resemble a captain of a ship at sea. He must have a helm to grasp, a course to steer, a port to seek. Without headway, the ship would arrive nowhere and perpetual calm is as detrimental to purpose as a perpetual hurricane.” The president is the one who must steer the ship, as a CEO leads an organization, be it small or large.

In the 229 intervening years, there have been brief periods of uncertainty, primarily due to vague Constitutional language. The first occurred in 1800, when two Democratic-Republicans each received 73 electoral votes. It was assumed that Thomas Jefferson would be president and Aaron Burr would be vice president. The wily Burr spotted an opportunity and refused to concede, forcing the decision into the House. Jefferson and Burr remained tied for 35 ballots until Alexander Hamilton (convinced that Jefferson was the lesser of two evils) swayed a few votes to Jefferson, who won on the 36th ballot. This technical glitch was fixed by the 12th Amendment in 1804, which requires electors to cast separate ballots for president and vice president, removing any such uncertainty.

A second blip occurred after William Henry Harrison and John Tyler defeated incumbent Martin Van Buren. At age 68, Harrison was the oldest to be sworn in as president, a record he held until Ronald Reagan’s inauguration in 1981 at age 69. Harrison died 31 days after his inauguration (also a record), the first time a president had died in office. A controversy arose over the successor. The Presidential Succession Act of 1792 specifically provided for a special election in the event of a double vacancy, but the Constitution was not specific regarding just the presidency.

Vice President Tyler, at age 51, would be the youngest man to assume leadership. He was well educated, intelligent and experienced in governance. However, the Cabinet met and concluded he should bear the title of “Vice President, Acting as President” and addressed him as Mr. Vice President. Ignoring the Cabinet, Tyler was confident that the powers and duties fell to him automatically and immediately as soon as Harrison had died. He moved quickly to make this known, but doubts persisted and many arguments followed until the Senate voted 38 to 8 to recognize Tyler as the president of the United States. (It was not until 1967 that the 25th Amendment formally stipulated that the vice president becomes president, as opposed to acting president, when a president dies, resigns or is removed from office.)

In July 1933, an extraordinary meeting was held by a group of disgruntled financiers and Gen. Smedley Butler, a recently retired, two-time Medal of Honor recipient. According to official Congressional testimony, Butler claimed the group proposed to overthrow President Franklin Roosevelt because of the implications of his socialistic New Deal agenda, which would create enormous federal deficits if allowed to proceed.

Smedley Darlington Butler was a U.S. Marine Corps major general – the highest rank then authorized – and the most decorated Marine in U.S. history. Butler (1881-1940) testified in a closed session that his role in the conspiracy was to issue an ultimatum to the president: FDR was to immediately announce he was incapacitated due to his crippling polio and needed to resign. If the president refused, Butler would march on the White House with 500,000 war veterans and force him out of power. Butler claimed he refused the offer despite being offered $3 million and the backing of J.P. Morgan’s bank and other important financial institutions.

A special committee of the House of Representatives (a forerunner to the Committee on Un-American Activities) headed by John McCormack of Massachusetts heard all the testimony in secret, but no additional investigations or prosecutions were launched. The New York Times thought it was all a hoax, despite supporting evidence. Later, President Kennedy privately mused that he thought a coup d’état might succeed if a future president thwarted the generals too many times, as he had done during the Bay of Pigs crisis. He cited a military plot like the one in the 1962 book Seven Days in May, which was turned into a 1964 movie starring Burt Lancaster and Kirk Douglas.

In reality, the peaceful transfer of power from one president to the next is one of the most resilient features of the American Constitution and we owe a deep debt of gratitude to the framers and the leaders who have served us so well.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Yes, Presidential Elections Have Consequences

Chief Justice of the Supreme Court John Marshall is featured on this Fr. 375 Serial Number One $20 1891 Treasury Note, which sold for $114,000 at an April 2018 Heritage auction.

By Jim O’Neal

In theory, there is no mystery or debate regarding the intention of the Founding Fathers in the selection of members to serve on the Supreme Court.

The Constitution crisply explains, in the second paragraph of Article II, Section 2, that the president shall nominate, and by and with the advice and consent of the Senate, shall appoint judges of the Supreme Court. This provision means exactly what it says and is unchanged by any modifications since its adoption. That includes a simple majority vote of the Senate to grant such consent, to reject or refuse to take action on the presidential nominee.

One idea discussed, but not acted upon, was Benjamin Franklin’s explanation of the Scottish mode of appointment “in which the nomination proceeded from the lawyers, who always selected the ablest of the profession in order to get rid of him, and share his practice among themselves” – a uniquely clever way to eliminate superior competition.

What has changed is the adoption of the “nuclear option,” which allows cloture – ending a filibuster of a nominee – with a simple majority vote. Senate Majority Leader Harry Reid had used it to great effect in 2013 for executive and lower-court nominees while the Democrats were in the majority. Republicans expanded it to include Supreme Court nominees in 2017. Neil Gorsuch was confirmed to the Supreme Court under this new rule with a 54-45 Senate vote, picking up three anxious Democratic votes in the process. It’s widely assumed that current nominee Judge Brett Kavanaugh will be confirmed following a similar path since his opponents appear helpless to stop him.

As President Obama once explained, in not too subtle fashion, “Elections have consequences.”

It now seems clear that the Founding Fathers did not foresee that political parties would gradually increase their influence and that partisan considerations of the Senate would become more prominent than experience, wisdom and merit. This was magnified in the current effort to stymie a nomination when the opposition announced they would oppose any candidate the Chief Executive chose. Period. It may not seem reasonable on a literal basis, but it has gradually become routine and will only get worse (if that’s still possible).

It may astonish some to learn that no legal or constitutional requirements for a federal judgeship exist. President Roosevelt appointed James F. Byrnes as an associate justice in 1941 and his admission to practice was by “reading law.” This is an obsolete custom now – Byrnes was the last justice to benefit – that preceded modern institutions specializing exclusively in law. In Byrnes’ case, it’s not clear that he even had a high school diploma. But he had served in Congress and would later be governor of South Carolina. He resigned 15 months later (the second-shortest tenure) in order to become head of the Office of Economic Stabilization, and he was a trusted FDR advisor who many assumed would replace Vice President Henry Wallace as FDR’s running mate in 1944. That honor went to the little-known, high-school-educated Harry Truman, who would assume the presidency the following year when FDR died suddenly.

Thomas Jefferson never dreamed the Supreme Court would become more than a necessary evil to help balance the government in minor legal proceedings, and he would be more than astonished that it is now the final arbiter of what is or isn’t constitutional. The idea that six judges (who didn’t even have a dedicated building) would be considered equal to the president and Congress would have been anathema to him.

However, that was before former Secretary of State John Marshall became Chief Justice of the Supreme Court and started the court’s long journey to final arbiter of the Constitution with his ruling in Marbury v. Madison in 1803. There was a new sheriff in town, and the next 40 years witnessed the transformation of the court to the pinnacle of legal power. The justices even have their own building, thanks to William Howard Taft, who as Chief Justice pushed for it but died before it was completed. Someday, Netflix will persuade them to livestream their public discussions for all of us to watch, although I personally prefer C-SPAN to eliminate the mindless talking heads that pollute cable television.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Tremendous Challenges Awaited the Plainspoken Truman

Fewer than 10 examples of this Harry Truman “60 Million People Working” political pin are known to exist. This pin sold for $19,717 at an August 2008 Heritage auction.

By Jim O’Neal

When Franklin Roosevelt died on April 12, 1945, Harry Truman became the seventh vice president to move into the Oval Office after the death of a president. Truman had been born during the White House years of Chester Arthur, who had followed James Garfield after his assassination (1881). And in Truman’s lifetime, Teddy Roosevelt and Calvin Coolidge had ascended to the presidency after the deaths of William McKinley (1901) and Warren Harding (1923). However, none of these men had been faced with the challenges awaiting the plainspoken Truman.

FDR had been a towering figure for 12 years, first leading the country out of the Great Depression and then deftly steering the United States into World War II after being elected a record four times. Unfortunately, Truman had not been involved in several important decisions, and was totally unaware of several strategic secrets (e.g. the development of the atom bomb) or even side agreements made with others, notably Winston Churchill. He was not prepared to be president.

Even the presidents who preceded FDR only magnified the gap in Truman’s foreign-relations experience: Woodrow Wilson was a brilliant academic and Herbert Hoover a world-famous engineer. There were enormously important decisions to be made that would shape the world for the next half century. Truman himself had sincere doubts about being able to follow FDR, despite ample warning from the president’s rapidly failing health.

The significance of these decisions has gradually faded, but for Truman, they were foisted upon him in rapid order: April 12, FDR’s death; April 28, Benito Mussolini killed by Italian partisans; April 29, German forces in Italy surrendered; and the next day, Adolf Hitler committed suicide. The news from the Pacific was equally dramatic as troop landings on the critical island of Okinawa had apparently been unopposed by the Japanese. It was clearly the apex of optimism regarding the prospects for an unconditional surrender by Japan and the welcome return of world peace.

In fact, it was a miracle that turned out to be a mirage.

After victory in Europe (V-E Day), Truman was faced with an immediate challenge regarding the 3 million troops in Europe. FDR and Churchill had not trusted Joseph Stalin and were wary of what the Russians would do if we started withdrawing our troops. Churchill proved to be right about Russian motives, as the Soviets secretly intended to permanently occupy the whole of Eastern Europe and expand into adjacent territories at will.

Then the U.S. government issued a report stating that the domestic economy could make a smooth transition to pre-war normalcy once the voracious demands from the military war-machine abated. Naturally, the war-weary public strongly supported “bringing the boys home,” but Truman knew that Japan would have to be forced to quit before any shifts in troops or production could start.

There was also a complex scheme under way to redeploy the troops from Europe to the Pacific if the Japanese decided to fight on to defend their sacred homeland. It was a task that George Marshall would call “the greatest administrative and logistical problem in the history of the world.”

Truman pondered in a diary entry: “I have to decide the Japanese strategy – shall we invade Japan proper or shall we bomb and blockade? That is my hardest decision to date.” (No mention was made of “the other option.”)

The battle on Okinawa answered the question. Hundreds of Japanese suicide planes had a devastating effect. Even after 10 days of heavy sea and air bombardment of the island, 30 U.S. ships were sunk and 300 more damaged; 12,000 Americans were killed and 36,000 wounded. It was now obvious that Japan would defend every single island, regardless of its losses. Surrender would not occur and America’s losses would be extreme.

So President Truman made a historic decision that is still being debated today: Drop the atomic bomb on Japan and assume that the effect would be so dramatic that the Japanese would immediately surrender. On Aug. 6, 1945, “Little Boy” was dropped on Hiroshima with devastating effects. Surprisingly, the Japanese maintained their silence, perhaps not even considering that there could be a second bomb. That second bomb – a plutonium variety nicknamed “Fat Man” – was then dropped two days ahead of schedule on Aug. 9 on the seaport city of Nagasaki.

No meeting had been held and there was no second order given (other than by Enola Gay pilot Paul Tibbets). The directive that had ordered the first bomb simply said in paragraph two that “additional bombs will be delivered AS MADE READY.” However, two is all that was needed. Imperial Japan surrendered on Aug. 15, thus ending one of history’s greatest wars.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Roosevelt Used Radio to Encourage, Hitler to Fuel Rage

A Franklin D. Roosevelt photograph, signed and inscribed to Eleanor Roosevelt, sold for $10,000 at an October 2016 Heritage auction.

By Jim O’Neal

Saul Bellow was a Canadian-born writer who became a naturalized U.S. citizen when he discovered he had immigrated to the United States illegally as a child. He hit the big time in 1964 with his novel Herzog, which won the U.S. National Book Award for Fiction. Time magazine named it one of the 100 best novels in the English language since “the beginning of Time” (March 3, 1923).

Along the way, Bellow (1915-2005) also managed to squeeze in a Pulitzer Prize, the Nobel Prize for Literature, and the National Medal of Arts. He is the only writer to win the National Book Award for Fiction three times.

Bellow loved to describe his personal experience listening to President Roosevelt, an American aristocrat (Groton and Harvard educated), hold the nation together, using only a radio and the power of his personality. “I can recall walking eastward on the Chicago Midway … drivers had pulled over, parking bumper to bumper, and turned on their radios to hear every single word. They had rolled down the windows and opened the car doors. Everywhere the same voice, its odd Eastern accent, which in anyone else would have irritated Midwesterners. You could follow without missing a single word as you strolled by. You felt joined to these unknown drivers, men and women smoking their cigarettes in silence, not so much considering the president’s words as affirming the rightness of his tone and taking assurances from it.”

The nation needed the assurance of those fireside chats, the first of which was delivered on March 12, 1933. Between a quarter and a third of the workforce was unemployed. It was the nadir of the Great Depression.

The “fireside” was figurative; most of the chats emanated from a small, cramped room in the White House basement. Secretary of Labor Frances Perkins described the change that would come over the president just before the broadcasts. “His face would smile and light up as though he were actually sitting on the front porch or in the parlor with them. People felt this, and it bound them to him in affection.”

Roosevelt’s fireside chats and, indeed, all of his efforts to communicate contrasted with those of another master of the airwaves, Adolf Hitler, who fueled rage in the German people via radio and encouraged their need to blame, while FDR reasoned with and encouraged America. Hitler’s speeches were pumped through cheap plastic radios manufactured expressly to ensure complete penetration of the German consciousness. The appropriation of this new medium by FDR for reason and common sense was one of the great triumphs of American democracy.

Herr Hitler ended up committing suicide after ordering that his body be burned to prevent the Allies from retrieving any of his remains. So ended the grand 1,000-year Reich he had promised … poof … gone with the wind.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

President Lincoln Understood Technology and Adapted

This photograph of Abraham Lincoln was among 348 Civil War albumen images in a collection that sold for $83,650 at a December 2010 Heritage auction.

By Jim O’Neal

Presidents have always been challenged to communicate their policies and priorities to the public. As the political party system evolved, newspapers became more partisan depending on their level of editorial bias – usually due to strong-willed owners/editors – forcing administrations to devise creative ways to deliver unfiltered messages.

In the 20th century, President Wilson established the first presidential press conference in March 1913. All of his successors have continued using this innovation with only minor variations. FDR used “Fireside Chats” to help ease public concerns during the Great Depression, offering bromides like “The only thing we have to fear is fear itself” or explaining how the banking system worked to restore confidence in the financial system.

President Eisenhower preferred off-the-record sessions with reporters and heavily edited film clips.

Then by 1960, with 87 percent of households having televisions, people could tune in twice a month and see the young, telegenic JFK – live and uncut – deliver his aggressive agenda for America. Up until then, press conferences were strictly off the record to provide the opportunity to correct any gaffes or poorly phrased answers to difficult questions. President Truman once told reporters “the greatest asset the Kremlin has is Senator [Joe] McCarthy” … but the quote was reworded before being released!

President Trump has adopted modern technology to bypass the media and communicate directly with anyone interested (which includes his base and the frustrated media). Daily White House briefings have become increasingly adversarial as many in the media are in various stages of open warfare, especially The New York Times and CNN. The 24/7 news cycle allows viewers to choose media that are consistent with their personal opinions, and the result is a giant echo chamber.

In the 19th century, President Lincoln was often confronted with extreme press hostility, especially from the three large newspapers in NYC, which attacked him personally and assailed his failing Civil War policies, particularly after the draft riots. Lincoln retaliated with dramatic letters in 1862-63 – ostensibly to New York Tribune editor Horace Greeley, but strategically released to all newspapers to reach a far wider audience. At the very least, he reduced editorial influence, and in doing so he revolutionized the art of presidential communications.

And then it was suddenly Nov. 19, 1863, at Gettysburg, Pa. What Lincoln said that day has been analyzed, memorized and explained … but never emulated. The only flaw was the prediction that “The world will little note, nor long remember, what we say here …”

The compactness and concision of the Gettysburg Address have something to do with the mystery of its memorability. It was 271 words. It had 10 sentences, the final one accounting for a third of the entire length; 205 words had a single syllable; 46 had two; 20 had three syllables or more. The pronoun “I” was never uttered. Lincoln had admired the telegraph and seen at once its future; it required one to get to the point, with clarity. The telegraphic quality can be clearly heard in the speech – “We cannot dedicate, we cannot consecrate, we cannot hallow this ground.” Rhythm, compression, precision … all were emphasized.

Perhaps the most overshadowed speech in history was the one featured as the main event that day: Edward Everett’s oration. He was a Harvard man (later its president), a professor of Greek, governor of Massachusetts, and ambassador to England. Everett’s two-hour speech (13,607 words) was well received. Lincoln congratulated him.

Afterward, in a note to Lincoln, Everett wrote: “I should be glad to flatter myself that I came as near to the central idea of the occasion, in two hours, as you did in two minutes.” Lincoln’s grateful reply concluded with “I am pleased to know that in your judgment, the little I did say was not a failure.”

Not bad for a man traveling with the fever of a smallpox infection! 

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].