Immigrants have sparked controversy since the days of Benjamin Franklin

An interest certificate signed “B. Franklin” and dated Oct. 19, 1785, realized $13,145 at a 2012 Heritage auction.

By Jim O’Neal

Typically, any discussion of Ben Franklin will eventually include the question of why he was never president of the United States. He certainly earned the title of “The First American,” as well as an astonishing reputation as a leading politician, diplomat, scientist, printer, inventor, statesman and brilliant polymath. Biographer Walter Isaacson said it best: “Franklin was the most accomplished American of his age and the most influential in shaping the type of society America would become.”

He was certainly there in Philadelphia on May 5, 1775, representing Pennsylvania as a delegate to the Second Continental Congress, even as skirmishes escalated with the British military. The following June, he was an influential member of the Committee of Five that drafted the Declaration of Independence. In fact, he was the only one of the Founding Fathers to sign the four most important documents: the Declaration of Independence; the 1778 Treaty of Alliance with France (which brought the French into the war on the American side); the Treaty of Paris (1783), ending the war with Great Britain; and the historic United States Constitution in 1787, which remains the foundation of the nation.

He proved to be a savvy businessman who made a fortune as a printer and a prolific scientist who is still revered. His success gave him the personal freedom to spend time in England and France, trading ideas with the great minds of the world and enjoying the company of the social elite.

He was also a shrewd administrator with a unique talent for writing insightful observations on all aspects of life. The only significant issue that seemed to perplex him was the wave of German immigrants flooding many parts of Pennsylvania. In his opinion, the Crown had also dumped too many felons on the colonies, making cities unsafe; he suggested exporting one rattlesnake for every convict sent over. The Germans could not speak English, imported books in German, erected street signs in German and made no attempt to integrate into the great “melting pot.” Worse, in many places they represented one-third of the population, and he proposed voiding every deed or contract that was not written in English. Although the Pennsylvania Dutch (actually Germans) seemed an imbalance in the 18th century, by the 1850s they proved ideally suited to western expansion as skilled farmers.

When World War I erupted in 1914, most Americans viewed it with a sense of detachment and considered it just another European conflict. In a nation of immigrants focused on improving their personal lives, there was little time to root for the old homeland. This detachment helped Woodrow Wilson win a tight reelection in 1916, though it would not last much longer. And since the start of the war in Europe coincided neatly with the end of the 1913-1914 recession in America, it was also a perfect economic fit.

American exports to the belligerent nations rose rapidly, from $825 million in 1913 to $2.25 billion in 1917. In addition to supplying steel, arms and food, American banks made large loans to finance these shipments. Inevitably, the U.S. was drawn into the war as German submarines sank supply ships, including the Lusitania in 1915 with a healthy complement of American passengers aboard. When Germany then attempted to bribe Mexico into attacking America (the Zimmermann Telegram), the powder keg was lit and Wilson was forced to ask Congress to declare war.

After we were provoked into the largest and deadliest war the world had yet seen, Wilson decided that all Americans would be expected to support the war effort. The federal government opened its first propaganda bureau, the “Committee on Public Information” … thus the creation of the first true “fake news.” Most forms of dissent were banned and it was even unlawful to advocate pacifism. Yes, German-Americans experienced substantial repression as war hysteria rippled through the system. But it was nothing close to what Japanese-Americans suffered after Japan attacked Pearl Harbor.

On Feb. 19, 1942, President Franklin Delano Roosevelt issued Executive Order 9066, which authorized federal officials to round up Japanese-Americans, including U.S. citizens, and remove them from the Pacific Coast. By June, 120,000 Americans of Japanese descent in California, Oregon and Washington had been ordered to assembly centers like fairgrounds and racetracks … ringed with barbed wire … and then shipped to permanent internment camps. Then, astonishingly, they were asked to sign up for military service, and some men ages 18 to 45 did, since they were still subject to the draft.

The U.S. Supreme Court heard three separate cases on the constitutionality of the internment and upheld it as a wartime necessity.

In 1988, President Ronald Reagan signed the Civil Liberties Act, and President George H.W. Bush later signed letters of apology accompanying payments of $20,000 to surviving internees. A total of 82,219 Japanese-Americans eventually received $1.6 billion in reparations.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Today’s business tycoons would be wise to not forget the past

A statement on Union Iron Mills stationery signed by Andrew Carnegie and dated Sept. 29, 1870, sold for $6,572 at an April 2013 auction.

By Jim O’Neal

It may come as no surprise to learn that virtually all the major cities in California were incorporated in the same year: 1850, the year California became the 31st state to join the United States. It is now the most populous state and supports a $3 trillion economy that ranks No. 5 in the world … larger than Great Britain’s. However, you probably don’t know that, in terms of land area, the state’s three largest cities are Los Angeles, San Diego and (surprisingly) California City.

This large chunk of land was assembled in the boomlet following World War II with the intent of rivaling Los Angeles. Southern California was flourishing thanks to temperate weather, Pacific Ocean beaches and nearby mountains. It seemed logical that with a large influx of people, all that was lacking was plenty of affordable housing, automobiles, streets, freeways and water, both to drink and to irrigate the orange groves.

An ambitious land developer spotted this unique opportunity and bought 82,000 acres of prime land just north of the SoCal basin. He commissioned an architectural master plan for the community, with detailed maps of blocks, lots and streets. Next came hiring a small army of 1,300 salesmen to promote land sales to individuals, while building a 26-acre artificial lake, two golf courses and a four-story Holiday Inn.

This was land speculation on a grand scale; they sold 50,000 lots for $100 million before the market dried up. Some reports claim that only 175 new homes were actually built. The fundamental reason for the failure was that Southern California land development evolved primarily south along the coastline toward San Diego and the prime oceanfront property in Malibu, Long Beach and Orange County. Although the scheme failed, California City was finally incorporated in 1965. Today, its 15,000 inhabitants, many from Edwards Air Force Base, are sprinkled liberally over 204 square miles.

A prominent No. 4 on the list is San Jose, which narrowly escaped being destroyed in the 1906 earthquake that nearly leveled nearby San Francisco. When we lived there (1968-71), it was a small, idyllic oasis with plum trees growing in undeveloped lots in the shadow of the Santa Cruz Mountains. There were nice beaches an hour away, and for $14.15, PSA would fly you 400 miles to LAX in 45 minutes. In addition to the short drive to San Francisco with all its wonders, Lake Tahoe, a mere 200 miles away, offered gambling, except when snow closed the Sierra Nevada.

Nobody dreamed that the miracle of Silicon Valley was on the horizon, or that the enormous impact of the Internet would produce the boom and bust of the dot-com era in the late 1990s. The stock market went up 400 percent and then down 80 percent, wiping out most of the gains (an index that climbs from 100 to 500 and then loses 80 percent is right back at 100). However, after 2002, building on the proliferation of the personal computer, another technology revolution created more wealth than any place in the history of the world.

Apple, Google, Facebook, eBay, Intel, Cisco and Instagram are at the core of a technological society that has revolutionized our economy and communications, our lives and, by extension, the world. Smartphones, search engines and social-media giants – plus a community of 2,000 tech firms and venture capitalists – have generated enormous fortunes. Electric vehicles have morphed into driverless cars and trucks that will bring more creative destruction. AI and robotics will make large swaths of production obsolete and elevate privacy and antitrust concerns that will rival the early 20th century government actions to break up the trusts.

Consider when Andrew Carnegie sold his Carnegie Steel Company to J.P. Morgan in 1901 for an astounding $303 million. He became the richest man in America, surpassing even John D. Rockefeller for several years. JPM then merged it with two other steel companies to form the first billion-dollar U.S. company, U.S. Steel. While Rockefeller continued to expand his oil monopoly, Carnegie devoted the last 18 years of his life to large-scale philanthropy. He literally lived by his credo: “The man who dies rich dies disgraced.” President Teddy Roosevelt would lead the trust-busting that became necessary.

Tim Cook, Mark Zuckerberg, the Google gang and Jeff Bezos would be wise to heed George Santayana’s aphorism: “Those who cannot remember the past are condemned to repeat it.” Especially when social media becomes more addictive than crack cocaine.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Stop whatever you’re doing, grab the soap and scrub your hands

A group of three framed autographs by famed medical scientists, including Jonas Salk, went to auction in January 2017.

By Jim O’Neal

Franklin Delano Roosevelt (1882-1945) was a handsome, virile young man – 39 years old – when he contracted what was diagnosed as poliomyelitis. Polio is an odd infection whose history traces back to the earliest humans and is generally transmitted by water contaminated with human feces. It primarily affects children under age 5, with only 1 in 200 infections leading to irreversible paralysis.

In FDR’s case, the paralysis overtook all of his extremities, but he eventually regained use of his upper limbs and relied on a wheelchair or crutches for mobility. We know that he had a spectacular political career, culminating in being elected president of the United States four separate times. He is frequently ranked among the top three presidents, along with George Washington and Abraham Lincoln. Political commentator George Will observed that some of that steel he relied on must have found its way into his soul and political will.

Some researchers now believe that FDR did not have poliomyelitis, but Guillain-Barré syndrome. If true, it seems particularly irrelevant, since an effective treatment was not discovered until this century, just a wee bit late, and it’s still in development.

Importantly, major polio epidemics were uncommon until the early 1900s, when they began plaguing Europe. By 1910, frequent epidemics had become regular events throughout the developed world. Outbreaks peaked in the summer months, and by 1950 polio was responsible for death or paralysis in roughly 500,000 people every year.

To help fund research for a cure, in 1938 Roosevelt founded the National Foundation for Infantile Paralysis, soon to become the March of Dimes. Dr. Jonas Salk was picked to lead the search for a cure. Would FDR have made such a dramatic effort if he had something other than polio? Perhaps not, but it’s probably not practical to rely on this ploy to secure priority funding, since we’ve learned it’s harder to find good presidents than breakthrough medical cures.

During the war, Salk had pioneered a highly successful influenza vaccine and then taken a position at the University of Pittsburgh … in need of research funding. In danced the March of Dimes, and Salk began working on a polio vaccine in 1948. Since polio is a viral disease, humans build up immunity after exposure to the virus; the challenge was developing a safe strain along with a viable delivery system (a difficult task).

Salk’s work generated national attention since national panics were occurring every summer and swimming pools, movie theaters and other gathering places were routinely closed.

In the summer of 1952, he injected several dozen mentally handicapped children with an experimental version of his vaccine (try that today!). Also among the first people to be inoculated were his wife, their three children and Dr. Salk himself. Two years later (1954), the vaccine was ready for extensive field trials. In the interim, 100 million Americans had donated money to the March of Dimes (the 1950 census declared that the total U.S. population was only 150 million).

Next, a literal army of 20,000 public health workers, 64,000 school employees and 220,000 volunteers administered the vaccine to 1.8 million schoolchildren. On April 12, 1955, Salk’s polio vaccine was declared safe and effective. The announcement was broadcast nationally on television and around the world on radio. Polio was finally defeated in the United States. Dr. Salk was in a position to make an enormous amount of money. However, when asked in a television interview who owned the patent, he simply answered, “There is no patent. Could you patent the sun?”

I realize it is difficult to feel an increased sense of optimism about COVID-19 by retelling the story of polio. But one source of encouragement is to listen more carefully to what we’ve been told (ad nauseam) about the powerful defense everyone has access to: simple soap and water! The coronavirus that has changed our lives, perhaps forever, is enveloped in a fatty layer that is easily dissolved by detergents, exposing the core of the virus and causing it to perish.

So stop whatever you’re doing, get to the soap and water and scrub your hands (just like your mother told you before every meal). I do this frequently while I patiently wait for the miracle vaccine.

What do you have to lose?

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

LBJ exhibited ambition, decisiveness, a strong work ethic … and fear of failure

Lyndon B. Johnson artifacts, including signed photographs and a Civil Rights Bill signing pen, sold for $15,000 at an October 2018 Heritage auction.

By Jim O’Neal

Lyndon Baines Johnson was born in August 1908 in the Texas Hill Country nearly 112 years ago. (Tempus does fugit!). He shined shoes and picked cotton for pocket money, graduating from high school at age 15. Both his parents were teachers and encouraged the reading habits that would benefit him greatly for the rest of his life.

Tired of both books and study, he bummed his way to Southern California, where he picked peaches, washed dishes and did other odd jobs like a common hobo. The deep farm recession forced him back to Texas, where he borrowed $75 to earn a teaching degree from a small state college. Working with impoverished Mexican-American children gave him a unique insight into poverty. He loved to tell stories from that time in his life, especially when he was working on legislation that improved life for common people.

His real power developed when he electrified the rural Hill Country and created a pool of money from power companies that he doled out to politicians all over the country who needed campaign funds and were willing to barter their votes in Congress. The women and girls of the Hill Country were known as “bent women” from toting water – two buckets at a time – from the wells to their homes. Having electricity to pump the water spared a generation of women from becoming hump-backed. They said of LBJ, “He brought us light.” This caught FDR’s attention and led to important committee assignments.

He married 20-year-old Claudia Alta Taylor in 1934 (at birth, a nanny had exclaimed that she looked just like a “little lady bird”). A full-grown Lady Bird parlayed a small inheritance into an investment in an Austin radio station that grew into a multimillion-dollar fortune.

Robert Caro has written about LBJ’s ambition, decisiveness and willingness to work hard. But how does that explain his trepidation about running for president in 1960? He had been Senate Majority Leader, had accumulated lots of political support and had a growing reputation for his civil rights record. He even told his associates, “I am destined to be president. I was meant to be president. And I’m going to be president!” Yet in 1958, when he was almost perfectly positioned to make his move, he was silent.

His close friend, Texas Governor John Connally, had a theory: “He was afraid of failing.”

His father was a fair politician but ultimately failed, lost the family ranch, plunged into bankruptcy and became the butt of town jokes. In simple terms, LBJ was afraid to run for the nomination and lose. That explains why he didn’t announce until it was too late and JFK had it sewn up.

Fear of failure.

After JFK won the 1960 nomination at the Democratic National Convention in Los Angeles, he knew LBJ would be a valuable running mate on the Democratic ticket against Richard Nixon. Johnson’s Southwestern drawl expanded the base, and Texas’ electoral votes were too tempting to pass up. They were all staying at the Biltmore Hotel in L.A., a mere two floors apart. Kennedy personally convinced LBJ to accept, despite brother Bobby’s three attempts to get him to decline (obviously unsuccessful).

The 1960 election was incredibly close, with only about 112,000 votes separating Kennedy and Nixon. Insiders were sure that a recount would uncover corruption in Illinois and that Nixon would be declared the winner. But in a big surprise, Nixon refused to demand a recount to avoid massive disruption in the country. (Forty years later, Bush v. Gore demonstrated that chaos in the 2000 Florida “hanging chads” debacle, along with the stain on SCOTUS for stopping the Florida recount.)

After the Kennedy assassination in November 1963, LBJ was despondent, sure he’d be remembered as the “accidental president.” But when he demolished Barry Goldwater in 1964, the old Lyndon was back. The Johnson-Humphrey ticket won by one of the greatest landslides in American history. LBJ got 61.1 percent of the popular vote and 486 electoral votes to Goldwater’s 52. More importantly, Democrats increased their majorities in both houses of Congress.

This level of domination provided LBJ with the leverage to implement his full Great Society agenda with the help of the 89th Congress, which approved multibillion-dollar budgets. After LBJ ramrodded his liberal legislative programs through Congress in 1965-66, it seemed that he might go down in history as one of the nation’s truly great presidents. But his failure to bring Vietnam to a successful conclusion, the riots in scores of cities in 1967-68, and the spirit of discontent that descended on the country turned his administration into a disaster.

On Jan. 22, 1973, less than a month after President Truman died, the 64-year-old Johnson died of a heart attack. His fear of failure, a silent companion.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

More than 50 years later, streets filled with a new generation, with new demands

A Time magazine signed by Martin Luther King Jr. realized $6,875 at an October 2013 Heritage auction. The issue is dated Feb. 18, 1957, two months after the end of the Montgomery Bus Boycott.

By Jim O’Neal

In the two decades following WW2, the African-American struggle for civil rights lacked focus and broad support. Americans had spent four long years fighting a bitter war to free the world from tyranny and were now intent on resuming a peaceful recovery. However, in the process of restoring a normal life, they became disgusted by racial segregation and systemic exploitation of minorities. As important legal victories increased, protesters and marchers were helping change attitudes as the basis for faster, more sincere progress.

Alas, irrespective of the legal triumphs and the changes in public opinion, Jim Crow segregation was deeply embedded in the Deep South and portions of the West. Discriminatory laws generally targeted Latinos and American Indians, in addition to African-Americans. All levels of state government often failed to honor court decisions, while civil rights workers were subjected to mob violence that even included law enforcement officers.

Finally, on Aug. 28, 1963, the largest civil rights protest in American history occurred when 250,000 people gathered on the National Mall in Washington, D.C., to begin “The March on Washington for Jobs and Freedom” (later known simply as “The March”). The highlight of the day was a 15-minute closing speech by the Rev. Martin Luther King Jr.

In it, King offered his version of the American Dream, drawing on the Declaration of Independence, the Constitution and the Emancipation Proclamation. It quickly became known as the “I Have a Dream” speech. The Dream included a hope that people “will not be judged by the color of their skin, but by the content of their character,” that black and white children could “join hands … and walk together as sisters and brothers,” and that “the sons of former slaves and the sons of former slave owners will be able to sit down together at the table of brotherhood.”

MLK cautioned the nation there was still a long way to go. He cited the broken promises Americans made after the Civil War. Slavery was gone, but vicious racism still existed in general society. He said bluntly, “One hundred years later, the life of the Negro is still sadly crippled by the manacles of segregation and the chains of discrimination.” King also highlighted the curse of widespread poverty during an era of postwar prosperity. He closed by exhorting white Americans to strive for a realization of the cherished phrase “Let Freedom Ring” and join in the old Negro spiritual that proclaimed “Free at last! Free at last! Great God a-mighty, we are free at last!”

The March on Washington was pivotal in the passage of the 1964 Civil Rights Act, which outlawed segregation in public accommodations and began dismantling Jim Crow practices.

On Nov. 22, 1963, Lyndon Baines Johnson became the eighth vice president to assume the nation’s highest office following the death of a president, and he was better prepared to take command than any of his predecessors. Despite the personal disrespect and abuse from the elitist Kennedy crowd (especially RFK), LBJ was a master politician, having served 24 years in Congress – 12 as a representative and 12 as a senator. As Majority Leader, he was truly “Master of the Senate,” as biographer Robert Caro has written so carefully.

The new president stayed in the background as the nation grappled with the enormous grief that engulfed Kennedy’s family and the people during funeral services for the slain president. Then, five days after the assassination – on the day before Thanksgiving – Johnson addressed a special joint session of Congress. He challenged them to honor Kennedy’s memory by carrying forward the dead president’s New Frontier program, saying “Let us continue.”

He asked for early passage of a new civil rights bill “to eliminate from the nation every trace of discrimination and oppression that is based on color or race.” Congress responded by passing every law LBJ pleaded for. Yet, over 50 years later, our streets are filled with a new generation, with new demands, as Congress is deadlocked once again. So, no new laws…

Perhaps Cassius was right after all: “The fault, dear Brutus, is not in our stars, but in ourselves.” (William Shakespeare’s Julius Caesar).

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Twain’s ‘Gilded Age’ in retrospect resembles a warning – comeuppance wrapped in satire

A signed presentation copy of The Gilded Age by Mark Twain and Charles Dudley Warner sold for $5,750 at an October 2017 Heritage auction.

By Jim O’Neal

The Gilded Age was a fascinating time for millions of people. It is typically used as a metaphor for a period of time in Western history characterized by peace, economic prosperity and optimism. It is assumed to have started circa 1870 and extended until the horrors of World War I spread a plague of death, disease and destruction that consumed civilized nations and destroyed four empires.

In France, it was called La Belle Époque (Beautiful Era) dating from the end of the Franco-Prussian War (1871). In the United Kingdom, it overlapped the Victorian era, and in Spain the Restoration. In Australia, this period included several gold rushes that helped the “convict colonies” transform to semi-progressive cities. These are only a few of many examples.

Historian Robert Roswell Palmer (1909-2002) noted that “European civilization achieved its greatest power in global politics, and also exerted its maximum influence upon people outside Europe.” R.R. Palmer was a remarkable and distinguished historian, educated in Chicago, who taught at Princeton and Yale and published A History of the Modern World in 1950. I believe it has been continually updated, the last time in 2013. Although I’ve never actually seen a copy, it gets high marks. At a reported 1,000 pages and weighing five pounds, it is not on any of my wish lists. (His wife once commented that she felt sympathy for his students having to lug it around!)

In the United States, the Gilded Age is considered to have started following the Panic of 1873. There were a number of contributing factors. Naturally, the post-Civil War era benefited from the cessation of mindless destruction in the Southern states. Then the extensive rebuilding boosted economic activity at the same time Western expansion to the Pacific Ocean created widespread urbanization.

With workers’ wages in the United States significantly higher than in Europe, millions of immigrants were eager to come, and they provided the manpower to match the natural-resource opportunities. It was a perfect fit – unlimited land, vast forests, rivers, lakes and unknown quantities of gold, silver and coal. We had fur-bearing animals, unlimited fish, millions of bison and weather that was moderate and dependable. In a 30-year period, real wages grew 60 percent as the silhouette of a new world power took shape. All without the tyranny that was so prevalent elsewhere in the world.

Mark Twain (Samuel Langhorne Clemens) actually coined the term in a novel co-authored with Charles Dudley Warner in 1873. Their book – The Gilded Age: A Tale of Today – was a typical Twain satire that captured the widespread social problems masked by a thin gold gilding, which also obscured the massive corruption and the wealth it created for the perpetrators.

This was well before his better-known work The Adventures of Tom Sawyer (1876), which described life on the Mississippi River. The sequel, The Adventures of Huckleberry Finn (1884), was even more popular. First published in England, it exposed the prevalence of racism and the frequent use of the “n” word. The U.S. publication in 1885 only fanned the flames of racial debate.

It was banned in many schools and libraries, remaining controversial during the entire 20th century. As late as 2016, both Huckleberry Finn and To Kill a Mockingbird were banned in a Virginia school. Of course, there was some irony in the fact that our first black president was serving his second term in office.

In fact, Twain’s “Gilded Age” was meant as a pejorative and didn’t really enter contemporary usage until the 1920s. But it was an apt description of the period from the 1870s until the Wall Street crash at the end of the 1920s. With the Union off the gold standard, credit was readily available and the U.S. money supply was far larger than before the war.

Northerners, largely insulated from the actual fighting, sensed the near inevitability of an industrialized nation, with railroads spreading across the country like an iron spiderweb. The early days of the Gilded Age – before the name gained its truer historical meaning – were alive with optimism and speculation about America’s potential. It was a great run and established America as the greatest country in the history of man.

Twain’s Gilded Age looks in retrospect like a prescient warning – comeuppance wrapped in satire. The Great Depression quickly evaporated the hopes and dreams of millions and then consigned them back to poverty pending another cycle of war followed by prosperity. Andrew Carnegie noted for posterity his opinion on wealth creation: “The proper policy was to put all eggs in one basket and then watch that basket.”

It is hard to draw lessons from these cycles if we consider the current federal government, the U.K. and its Brexit, Africa, most of Latin America, virtually the entire Middle East, and the possible outcome of Hong Kong-China. But we keep trying.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

 

To borrow a phrase, Chief Justice Roberts looked like he came directly from central casting

A photograph of Chief Justice John G. Roberts Jr. and Associate Justice John Paul Stevens, taken following Roberts’ oath-taking and signed by both on the mat, went to auction in May 2017.

By Jim O’Neal

At noon on Sept. 12, 2005, I was glued to the TV to watch the start of the Senate Judiciary Committee hearings. Chairman Arlen Specter gaveled the committee to order to consider the nomination of Judge John Glover Roberts Jr. as Chief Justice of the Supreme Court. Day one included Judge Roberts’ introduction of his family and friends in attendance, followed by four short speeches by prominent senators advocating for his confirmation. This abbreviated session was designed to accommodate the short attention spans of Roberts’ two young children, and the meeting was adjourned for the day with the formal proceedings to resume the following morning.

As an amateur connoisseur of great speeches and the courtroom drama that testimony in front of Congress can engender, I was impatient for the next day. I was not disappointed. Judge Roberts, dressed in a black suit and starched white shirt and tie (but without his customary gold cuff links), looked like he came straight out of central casting. He assumed his seat after the customary swearing-in ritual. However, what was strikingly different was the starkness. The table where he sat and made his opening statement was devoid of items – not a single note, pen or even a glass of water. One man sat all alone, looking up at 18 senators (nearly half of them partisan enemies hoping to derail his career), while appearing totally relaxed, confident and alert.

Without notes or even cue cards, he politely thanked several senators and transitioned to his Indiana roots, referring to “the limitless fields punctuated only by a silo or barn.” It evoked an image of Middle America that effortlessly transported the entire committee back to their own memories of growing up. It was a flawless finesse that allowed him to exclude any reference to his life in the exclusive Long Beach community on Lake Michigan or his selective education at La Lumiere, a college prep school where a jacket and tie were required for classes and the dining hall. He graduated No. 1 in his class in 1973 and was the school’s first Harvard-bound student. Naturally, he graduated summa cum laude from Harvard in 1976 and magna cum laude from Harvard Law.

He would let others plump his resume, while he edited himself down to a plainspoken, modest Midwesterner. Janet Malcolm commented in The New Yorker: “Watching Roberts on television was like watching one of the radiantly wholesome heroes that Jimmy Stewart and Henry Fonda played. It was out of the question that such a man be denied a place on the Supreme Court.”

Roberts was aware of the value of including a vivid metaphor, quotable line or phrase to memorialize the event, and he picked a good one: “Judges are like umpires. Umpires don’t make the rules, they apply them. The role of a judge and an umpire is critical. They make sure everyone plays by the rules, but it is a limited role. Nobody ever went to a ballgame to see the umpire.” That umpire analogy is now commonly invoked.

It was a twist of fate that John Roberts was being interviewed for Chief Justice. On July 19, 2005, President Bush had nominated him to fill the vacancy created by the retirement of Justice Sandra Day O’Connor. While this nomination was still pending, Chief Justice William H. Rehnquist died of thyroid cancer on Sept. 3. During the process of selecting Justice O’Connor’s replacement, Bush had solicited the opinions of several young lawyers in the White House. One was Brett Kavanaugh, who had been nominated to the D.C. Circuit Court of Appeals. Kavanaugh told him that both Roberts and Samuel Alito would be solid choices, but the tiebreaker would be who was most capable of convincing colleagues through persuasion and strategic thinking. On this basis, Roberts was clearly the best.

After the Hurricane Katrina disaster that August, President Bush had no appetite for controversy. Reports on Judge Roberts’ interviews in the Senate were going so well that Bush changed Roberts’ nomination to Chief Justice. That would delay the O’Connor replacement for several months. This was fortunate, since a highly unqualified Harriet Miers, who worked for Bush in the White House, was the lead candidate to replace O’Connor … and with more time to consider her credentials, saner heads prevailed.

The next three days of hearings offered an exquisite buffet for addicts like me. It started with a round of 10 minutes per senator and it was mildly amusing when Senator Joe Biden’s pontificating took so long that he ran out of time before asking a single question. Judge Roberts displayed remarkable intellect – and a wry sense of humor – when discussing important Supreme Court cases. When Senator Lindsey Graham of South Carolina asked Roberts what he would like future historians to say about him, Roberts joked: “I’d like for them to start by saying, ‘He was confirmed!’”

Questions fell into a regular rhythm and Roberts answered them almost effortlessly. In addition to his education and experience, the “Murder Boards” – the phrase used for pre-hearing rehearsals – must have really fine-tuned every aspect of what was anticipated. Even today, prospective nominees study the tapes of his hearing as part of their preparation. I got the feeling he was being polite to a bunch of partisan senators (all lawyers) without seeming too condescending.

Senator Chuck Schumer of New York became so frustrated at one point he said, “Why don’t we just concede John Roberts is the smartest guy in the room.” In another memorable exchange, Schumer complained, “You agree we should be finding out your philosophy, and method of legal reasoning, modesty, stability, but when we try to find out what modesty and stability mean, what your philosophy means, we don’t get any answers. It’s as if I asked you what kind of movies you like. Tell me two or three good movies and you say, ‘I like movies with good acting. I like movies with good directing. I like movies with good cinema photography.’ And I ask, no, give me an example of a good movie, you don’t name one. I say, give me an example of a bad movie, you won’t name one, and I ask you if you like Casablanca, and you respond by saying lots of people like Casablanca.”

Senator Specter started to cut Schumer off when Roberts interrupted: “I’ll be very succinct. First, Doctor Zhivago, and North by Northwest.” Yes, there was laughter in the room.

Roberts made it out of the Judiciary Committee on a vote of 13-5 as Democrats found creative excuses to vote no. Then it was on to the full Senate, where he was confirmed 78-22. The 50-year-old Roberts became the youngest Chief Justice since 1801, when the venerable John Marshall (45) was selected.

The Chief Justice is mentioned only once in the Constitution, and not in Article III, which establishes the judiciary. It is in Article I, covering Congress, which says the Chief Justice presides over the Senate during any impeachment trial of the president (Article I, Section 3, Clause 6).

The framers vested the Senate with the “sole power to try impeachments” for several reasons. First, they believed senators would be better educated, more virtuous and more high-minded than members of the House. Second, it avoided the possible conflict of interest of a vice president presiding over the removal of the one official standing between him and the presidency. Of our 45 presidents and 17 Chief Justices, only Andrew Johnson and Bill Clinton have been impeached, with Chief Justices Salmon P. Chase and William Rehnquist presiding over their trials. Both presidents were acquitted.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Latest volume on political career of Johnson can’t come soon enough

A photo of Lyndon B. Johnson being sworn in as president, inscribed and signed by Johnson, sold for $21,250 at an August 2018 Heritage auction.

By Jim O’Neal

Like other reverential fans of author Robert Caro’s multi-volume biography of Lyndon Baines Johnson, I’m still waiting patiently for him to finish volume five. It will cover the entire span of LBJ’s presidency, with a special focus on the Vietnam War, the Great Society and the Civil Rights era. Caro’s earlier biography of Robert Moses, The Power Broker (1974), won a well-deserved Pulitzer.

In 2011, Caro estimated that his final volume on LBJ (his original trilogy had expanded to five volumes) would require “another two to three years to write.” In May 2017, he confirmed he had 400 typed pages completed and intended to actually move to Vietnam. In December 2018, it was reported Caro “is still several years from finishing.”

Since Caro (b.1935) is two years older than me, there may exist a certain anxiety that time may expire unexpectedly. However, it will still be worth the wait and I shall consume it like a fine 3-Star Michelin dinner in Paris. Despite all that’s been written about this period of time, Caro is certain to surprise with new facts and his unique, incomparable perspective.

Recall that planning for the 1964 presidential election was well under way by the autumn of 1963. The razor-thin victory of JFK over Richard Nixon in 1960 (about 112,000 votes) had largely been due to Vice President Johnson’s personal efforts to deliver Texas to the Democrats.

Others are quick to remind us that allegations of fraud in Texas and Illinois were obvious and that Nixon could have won if he had simply demanded a recount. New York Herald Tribune writer Earl Mazo had launched a series of articles about voter fraud. However, Nixon persuaded him to call off the investigation, telling him, “Earl, no one steals the presidency of the United States!” He went on to explain how disruptive a recount would be and how it would damage the United States’ reputation in foreign countries, which looked to us as the paragon of virtue in transferring power.

Forty years later, in Bush v. Gore, we would witness a genuine recount in Florida, with teams of lawyers, “hanging chads” and weeks of public scrutiny until the Supreme Court ordered Florida to stop the recount immediately. Yet today, many people think George W. Bush stole the 2000 presidential election. I’ve always suspected that much of today’s extreme partisan politics is partially due to the bitter rancor that resulted. His other sins aside, Nixon deserves credit for avoiding this, especially given the turmoil that was just around the corner in the tumultuous 1960s.

Back in 1963, Johnson’s popularity – especially in Texas – had declined to the point JFK was worried it would affect the election. Kennedy’s close advisers were convinced a trip West was critical, with special attention to all the major cities in Texas. Jackie would attend since she helped ensure big crowds. Others, like U.N. Ambassador Adlai Stevenson and Bobby Kennedy, strongly disagreed. They worried about his personal safety. LBJ was also opposed to the trip, but for a different reason. Liberal Senator Ralph Yarborough was locked in a bitter intraparty fight with Governor John Connally; the VP was concerned it would make the president look bad if they both vied for his support.

We all know how this tragically ended at Parkland Hospital on Nov. 22 in Dallas. BTW, Caro has always maintained that he’s never seen a scintilla of evidence that anyone other than Lee Harvey Oswald was involved … period. Conspiracy theorists still suspect the mob, Fidel Castro, Russia, the CIA or even the vice president. After 56 years, not even a whiff of doubt.

Lyndon Baines Johnson was sworn in as president in Dallas aboard Air Force One by Judge Sarah T. Hughes (who remains the only woman in U.S. history to have sworn in a president). LBJ was the third president to take the oath of office in the state where he was born. The others were Teddy Roosevelt in Buffalo, N.Y., following the McKinley assassination (1901) and Calvin Coolidge (1923) after Harding died. Coolidge’s first oath was administered by his father in their Vermont home. Ten years later, it was revealed that he’d taken a second oath in Washington, D.C., to avert any questions about his father’s authority as a Justice of the Peace to swear in a federal-level officer.

On her last night in the lonely White House, Jackie stayed up until dawn writing notes to every single member of the domestic staff, and then she slipped out. When the new First Lady walked in, she found a little bouquet and a note from Jackie: “I wish you a happy arrival in your new home, Lady Bird,” adding a last phrase, “Remember – you will be happy here.”

It was clear that the new president was happy! Just days before, he had been a powerless vice president who hated Bobby Kennedy and the rest of the Kennedy staff. They had mocked him as “Rufus Corn Pone” or “Uncle Corn Pone and his little pork chop.” Now in the Oval Office, magically, he was transformed back into the old LBJ, who was truly “Master of the Senate.” Lady Bird described him as a “bronze image,” revitalized and determined to pass the civil rights legislation that had been clogged in the Senate under Kennedy. Historians are now busy reassessing this period of his presidency rather than viewing it through the prism of the Vietnam quagmire.

LBJ would go on to vanquish Barry Goldwater, the conservative running as a Republican in 1964, with 61.1 percent of the popular vote, the largest margin since the almost uncontested race of 1820 when James Monroe won handily in the “Era of Good Feelings.” 1964 was the first time in history that Vermont voted Democratic and the first time Georgia voted for a Republican. After declining to run in 1968, LBJ died five years later of a heart attack. Jackie Kennedy Onassis died on May 19, 1994, and the last vestiges of Camelot wafted away…

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Clock ticking for one of America’s most influential retailers

A photograph signed by Richard W. Sears sold for $1,792 at an October 2009 Heritage auction.

By Jim O’Neal

This is a highly condensed story of an American retailing giant that now seems relevant only as another casualty of internet e-tailing. In cultural terms, it is generally portrayed as just another backwater wasteland. But this situation is oddly different, because Sears has been a central player in the story of American life.

For a long time, the retailer’s products, publications and people influenced commerce, culture and politics. And then, slowly, it was pulled into the gravitational field of bankruptcy that becomes relentless when corporate balance sheets weaken and finally fail. I selected it – rather than, say, Montgomery Ward, Atlantic & Pacific, or J.C. Penney – because of its long history and because its demise was like losing a century of exciting surprises to an indifferent bankruptcy judge who yawned and gaveled it into the deep, dark cemetery of obscurity.

Sears has roots dating to 1886, when a man named Richard W. Sears began selling watches to supplement his income as a railroad station agent in North Redwood, Minn. The next year, he moved the business to Chicago and hired Alvah C. Roebuck, a watchmaker, through a classified ad. Together, they sold watches and jewelry. The name was changed in 1893 to Sears, Roebuck & Company, and by 1894, sewing machines and baby carriages had been added to its flourishing mail-order business. Its famous catalogs soon followed.

Sears & Roebuck helped bring consumer culture to middle America. Think of the isolation of living in a small town 120 years ago. Before the days of cars, people had to ride several days in a horse and buggy to get to the nearest railroad station. What Sears did was make big-city merchandise available to people in small towns, desperate for news and yearning for new things. It made hard work worthwhile knowing that there was a surprise just over the horizon.

The business was transformed when Richard Sears harnessed two great networks – the railroads, which now blanketed the entire United States, and the mighty U.S. Postal Service. When the Postal Service commenced rural free delivery (RFD) in 1896, every homestead in America came within reach.

And Richard Sears reached them!

He used his genius for promotion and advertising to put his catalogs in the hands of 20 million Americans, at a time when the population was 76 million. Sears catalogs could be a staggering 1,500 pages with more than 100,000 items. When pants supplier and manufacturing wizard Julius Rosenwald became his partner, Sears became a virtual, vertically integrated manufacturer. Whether you needed a cream separator or a catcher’s mitt, or a plow or a dress, Sears had it.

The orders poured in from everywhere, as many as 105,000 a day at one point. The company had so much leverage that it could nearly dictate its own terms to manufacturers. Suppliers could flourish if their products were selected to be promoted. Competition was fierce and the Darwinism effect was in full play. Business boomed as the tech-savvy company built factories and warehouses that became magnets for suppliers and rivals as well. City officials complained that it was harming nearby small-town retailers (sound familiar?).

There was a time when you could find anything you wanted in a Sears catalog, including a house for your vacant lot. Between 1906 and 1940, Sears sold 75,000 build-from-a-kit houses, some undoubtedly still standing. The Sears catalog was second only to the Holy Bible in terms of importance in many homes.

In 1913, the company launched its Kenmore brand, first appearing on a sewing machine. Then came washing machines, dryers, dishwashers and refrigerators. As recently as 2002, Sears sold four out of every 10 major appliances, an astounding 40 percent share in one of the most competitive categories in retailing.

In 1925, Sears opened a bricks-and-mortar retail store in Chicago. This grew to 300 stores by 1941 and more than 700 in the 1950s. When postwar prosperity led to growth in suburbia, Sears was perfectly positioned to cash in on another major development: the shopping mall. A Sears store was an ideal fit for a large corner anchor with plenty of parking. Sears revenue topped $1 billion for the first time in 1945, and 20 years later it was the world’s largest retailer and, supposedly, unassailable.

Oops.

By 1991, Walmart had zipped by them … never bothering to pause and celebrate. For generations, Sears was an innovator in every area, including home delivery, product testing and employee profit-sharing, with 350,000 dedicated employees and 4,000 outlets. What went wrong?

The answer is many things, but among the most significant was diverting their considerable retail cash flow in an effort to diversify. Between 1981 and 1985, they went on a spending spree, first acquiring Dean Witter Reynolds, the fifth-largest stock brokerage, and then real estate company Coldwell Banker. They ended up selling the real estate empire and then spun off Dean Witter in a desperate effort to return to their retailing roots. This was after someone decided to build a 110-story, 1,450-foot skyscraper with 3 million square feet (the tallest building in the world at the time) to centralize all their Chicago people and then lease whatever was left over. You have to wonder what all these people were doing. (It wasn’t selling perfume or filling catalog orders!) The Sears Tower is now called Willis Tower (don’t ask).

They stopped the catalogs in 1993. One has to speculate what would have happened had they simply put their entire cornucopia of goodies online. I know timing is everything, but in 1995, on April 3, a scientist named John Wainwright bought a book titled Fluid Concepts and Creative Analogies: Computer Models of the Fundamental Mechanisms of Thought. He purchased it online from an obscure company called Amazon.

I will miss Sears when they gurgle for the last time. I cherished those catalogs when we lived in Independence, Calif. (the place Los Angeles stole water from via a 253-mile aqueduct). My Boy Scout buddies and I all made wish lists, while occasionally sneaking a peek at the lingerie section.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].

Are we ready to continue building this great nation?

Two weeks before the Treaty of Paris ended the Spanish-American War, Princeton Professor Woodrow Wilson in this letter to an anti-imperialist says it’s too late to protest and that the focus should be on the “momentous responsibilities” facing the nation.

By Jim O’Neal

Many historians believe that the European exploration of the Western Hemisphere (1500 to 1800) was one of the most transformative eras in the history of civilization. The great Scottish philosopher Adam Smith (1723-1790) took it a step further and labeled it “one of the two greatest and most important events recorded in the history of mankind.” Much of the modern world is a direct result of these three centuries of colonization and transference of culture. In the end, the world seemed inexorably on the way to what we now call “globalization.”

That seems overly dramatic to me (and it omits great chunks of transformative periods), but it is also unambiguously clear that – despite the broad participation of other important nations – the people of England and Spain had the most influence on the vast territories of the New World. However, these two genuinely great empires ultimately evolved into dramatically different societies. Also in the crystal-clear category is that, in the end, they both managed to dissipate the powerful advantages they had created.

A quick snapshot of the world today confirms this devolution. The once mighty Spanish Empire is reduced to a relatively small, unimportant European nation (with a shaky economy, disturbing brain drain and geographic unrest). The other powerful empire of even greater influence in the world is now back to being a small island, wracked with political dissent over further retreat from the European Union (Brexit) and a dangerously unstable government.

In their place is the most powerful, democratic, innovative nation in the history of the world. But even the remarkable United States has developed troubling signs that pose a real threat to a continuation of prosperity. If we don’t find a way to reverse the issues that divide us (basically almost every single issue of importance) and close the inequality gap, our future will inevitably end up like those that went before. An economic boomlet has masked deep, difficult issues that politicians are blithely hoping will somehow be solved by some unknown means. We lack leadership at a time when Waiting for “Superman” is not a prudent strategy.

Some believe we are in a steady decline and that China will surpass America in many important areas this century. However, that is pessimistic conjecture. It’s more useful to re-examine the factors that propelled us to a pinnacle of unprecedented prosperity. I find it more interesting to visit the past rather than speculate on a future with so many possible outcomes (e.g. extinction via asteroid collisions, interstellar travel or a billion robots with superior intellect). It is an unknowable with questionable benefits.

One simplistic way is to skip our story of independence from England and correlate the decline of the Spanish Empire with our annexation of the Spanish-speaking borderlands. It broadly occurred in three phases, starting with the annexation of Florida and the Southeast by 1820. This was followed by California, Texas and the greater Southwest by 1855, when Mexico lost 50 percent of its land and up to 80 percent of its mineral wealth. The final phase came with the Spanish-American War of 1898, which added possessions in the Caribbean and the Pacific to complete the New American Empire.

Virtually every American president was complicit in varying degrees, bookended by Thomas Jefferson and Teddy Roosevelt, who wrote as if this was preordained by a benevolent entity. With immigrants flowing into the East, the promise of free land and the lure of gold in California, the land between the oceans became steadily populated and blended. The short war with Spain was merely the capstone for a century of annexation, population growth and a perfect balance of territory, people and economic development. The motivation was clear (“sea to sea”) and the manipulation perfectly illustrated by this anecdote:

Publisher William Randolph Hearst (eager to have a war to sell more newspapers) hired Frederic Remington to illustrate the revolution erupting in Cuba. In January 1897, Remington wrote to Hearst, “Everything is quiet. There is no trouble. There will be no war. I wish to come home.” Hearst quickly responded, “Please remain. You furnish the pictures and I WILL FURNISH THE WAR.”

A year later, the Treaty of Paris was signed and Spain relinquished all claims of sovereignty and title to Cuba (long coveted by the U.S. for its sugar and labor), then ceded Puerto Rico and Guam to America. The Philippines was (much) more complicated. The islands had been under Spanish rule for more than three centuries and had been waging a war for independence since 1896. The U.S. Navy prevailed and Spain sold the Philippines to the U.S. for $20 million. However, Filipino nationalists had no interest in trading one colonial master for another, and they declared war on the United States. Finally, in 1946, the U.S. recognized the Philippines’ independence.

And that, dear friends, is how you build (and lose) an empire.

In a different time, we would simply annex the rest of Mexico, eliminate the border with Canada and create a North American juggernaut to counter China and end squabbling over a wall. We could help Mexico (now perhaps a few U.S. states), eliminate drug cartels, develop the entire Baja California coastline to match Malibu and take advantage of the outstanding Mexican labor force to rebuild infrastructure. All the wasted money on border security (DHS, ICE, asylum, deportations, etc.) would be spent rebuilding old stuff.

But, I will need your vote for 2020! (I feel certain Adam Smith would agree.)

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC Pizza Hut and Taco Bell].