Behind the Scenes of Opening Disneyland and the Discovery of Doritos

“Enchanted Tiki Room” Disneyland Park Entrance Poster (Walt Disney, 1967), “At the Gateway to Adventureland.” This poster sold for $19,200 in a 2018 HA.com auction.

By Jim O’Neal

On Sunday, July 17, 1955, the first Disneyland opened in Anaheim, Calif. The $17 million theme park fit nicely on 160 acres of former orange groves, and the 13 attractions were designed with the whole family in mind. Opening day was intended for special guests and a crowd of 15,000. However, the $1 admission tickets were heavily counterfeited and more than 28,000 people overwhelmed the Park.

Ronald Reagan, Art Linkletter and Bob Cummings co-hosted the ABC TV coverage of the event, and 70 million people tuned in from home. It was an unusually hot day, and Harbor Blvd. had a line of cars seven miles long, filled with parents and kids yelling the usual questions and in need of a rest stop. Later, the Park inevitably ran out of food and water (the plumbing wasn’t finished). Pepsi-Cola, however, was readily available, and the publicity was not good (some even implied a conspiracy). The day grew in notoriety and soon became known inside Disney as “Black Sunday.” But within a matter of weeks all the start-up kinks were resolved, and it has been a crowd-pleaser ever since.

Although the details are sketchy, 33 local companies (e.g. Carnation, Bank of America and Frito-Lay) were charter members that helped Disney with the financing and infrastructure involved. Eventually, this evolved into “Club 33,” which includes access to an exclusive fine-dining restaurant that serves alcoholic beverages (the only place in the Park). Today there’s a 5- to 10-year waiting list of companies and individuals eager to pony up $25,000 for the privilege of paying six-figure annual dues. There are several versions of the origin of the Club 33 name. One is rather obvious; another involves California Alcoholic Beverage Control regulations, which require a real address in order to have alcohol delivered. Go figure!

Irrespective of the fuzzy facts, when I joined Frito-Lay in 1966, F-L operated Casa de Fritos, a delightful open-air Mexican food restaurant. Visitors could enjoy a family-friendly sit-down lunch or dinner, or get a Taco Cup to go. The Cup served as a one-handed snack, easily portable while strolling around the park. Casa de Fritos was managed by a pleasant man named Joe Nugent, who had only one obvious weakness: Casa de Fritos was too profitable!

Let me briefly explain.

Once a year, the Flying Circus from Dallas came out to the Western Zone to review our Profit Plan for the next year. President Harold Lilley – a very direct man of few words – would publicly berate poor Joe: “Dammit, Joe, I told you we didn’t want to make any money at that Mexican joint. We want to promote Fritos corn chips so people will buy them at their local supermarket!” When I asked Joe about it, he said he had tried increasing the portion sizes and lowering the menu prices. But whenever he did, even more people would line up. This man may have invented volume price leverage without ever having a clue what he was doing!

Eventually, George Ghesquire – the VP/GM for SoCal – convinced the wise men in Dallas to let us test-market restaurant-style tortilla chips (RSTC). He had grown weary of seeing bags of Fritos abandoned over in the Mexican food section – proof that someone had initially picked up Fritos, then changed their mind and swapped them for a bag of restaurant-style tortilla chips. Dallas had previously been concerned that RSTC would cannibalize sales of Fritos, and they were absolutely right. However, the delicious irony was that it was already being done by our direct competition!

So our version of RSTC was finally authorized under the brand name Doritos (you may have seen it). It was to be a 39-cent, 6-oz. bag, and everyone was eager to get started. To ensure a fast start, we loaned some money to Alex Morales – the owner of Alex Foods and supplier of the Taco Cup – and gave him a contract to co-pack. It started with one truckload a week, but they were soon running around the clock. The product was so popular in the West/Southwest that a line had to be added to a plant in Tulsa and to the new plant in San Jose. The world of Frito-Lay would never be the same.

To compensate for the lack of salsa, we made a quick trip to a local Vons supermarket and bought some bags of Lawry’s taco seasoning (used at home to season the meat when preparing homemade tacos for dinner) … voilà, Doritos Taco-flavored chips. Roy Boyd, “Mr. Fritos” from Dallas, helped us equip Alex Foods with two cement mixers from Sears and a handheld oil sprayer to keep the taco powder adhering to the chips.

Now flash forward to 1973. Entering my office on the 4th floor of the Frito-Lay Tower near Love Field in Dallas, I noticed a large plastic bag filled with tortilla chips. When I asked Roy, he explained it was a new flavor being evaluated for test market. After tasting two or three chips, I remember saying, “Naw … too dry.” Soon it would become Doritos Nacho-flavored tortilla chips, one of the most successful new food products of the last quarter of the 20th century!

Looking back, I now realize that’s probably when I became one of the wise men in Dallas.

The next year (1974), Harry Chapin would sing “Cat’s in the Cradle,” which would earn him a Grammy nomination and eventually a place in the Grammy Hall of Fame (2011). I’d nominate the obscure, long-forgotten George Ghesquire for the Frito-Lay Hall of Fame, but I don’t think we have one. So until we do, maybe just “Father of Doritos” – a crown that now rests with Arch West, who absconded with it when he left the Company in 1968.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Immigrants have sparked controversy since the days of Benjamin Franklin

An interest certificate signed “B. Franklin” and dated Oct. 19, 1785, realized $13,145 at a 2012 Heritage auction.

By Jim O’Neal

Typically, any discussion of Ben Franklin will eventually include the question of why he was never president of the United States. He certainly earned the title of “The First American,” as well as an astonishing reputation as a leading politician, diplomat, scientist, printer, inventor, statesman and brilliant polymath. Biographer Walter Isaacson said it best: “Franklin was the most accomplished American of his age and the most influential in shaping the type of society America would become.”

He was certainly there in Philadelphia on May 5, 1775, representing Pennsylvania as a delegate to the Second Continental Congress, even as skirmishes escalated with the British military. The following June, he was an influential member of the Committee of Five that drafted the Declaration of Independence. In fact, he was the only one of the Founding Fathers to sign the four most important documents: the Declaration of Independence; the 1778 Treaty of Alliance with France (which brought France into the war as a military ally); the Treaty of Paris (1783), ending the war with Great Britain; and the historic United States Constitution in 1787, which is still firmly the foundation of the nation.

He proved to be a savvy businessman who made a fortune as a printer, and a prolific scientist who is still revered. His success provided him the personal freedom to spend time in England and France, trading ideas with the great minds of the world and enjoying the company of the social elite.

He was also a shrewd administrator and had a unique talent for writing or making insightful observations on all aspects of life. The only significant issue that seemed to perplex him was the waves of German immigrants flooding many parts of Pennsylvania. In his opinion, the Crown had also dumped too many felons on the Colonies, resulting in unsafe cities. The Germans could not speak English, imported books in German, erected street signs in German and made no attempt to integrate into the great “melting pot.” Worse was the fact that in many places they represented one-third of the population. He suggested that we export one rattlesnake for every immigrant and void every deed or contract that was not in English. Although the Pennsylvania Dutch (Germans) were seen as an imbalance in the 18th century, by the 1850s they were well suited to the western expansion and proved to be ideal agrarians.

When World War I erupted in 1914, most Americans viewed it with a sense of detachment, considering it just another European conflict. As a nation of immigrants focused on improving their personal lives, there was little time to root for a homeland. This policy of neutrality helped Woodrow Wilson win a tight reelection in 1916, though it would not survive long into his second term. And since the start of the war in Europe coincided neatly with the end of the 1913-1914 recession in America, it was a perfect economic fit.

American exports to belligerent nations rose rapidly, from $825 million in 1913 to $2.25 billion in 1917. In addition to supplying steel, arms and food, American banks made large loans to finance these supplies. Inevitably, the U.S. was drawn into the war. German submarines sank supply ships, the Lusitania went down with a healthy complement of American passengers, and Germany attempted to bribe Mexico into attacking America (the Zimmermann Telegram), finally lighting the powder keg. Wilson was forced to ask Congress to declare war.

After we were provoked into the largest and deadliest war the world had seen, Wilson decided that all Americans would be expected to support the war effort. The federal government opened its first propaganda bureau … the “Committee on Public Information.” Thus the creation of the first true “fake news.” Most forms of dissent were banned, and it was even unlawful to advocate pacifism. Yes, German-Americans experienced substantial repression as war hysteria rippled through the system. But it was nothing close to what Japanese-Americans suffered after Japan attacked Pearl Harbor.

On Feb. 19, 1942, President Franklin Delano Roosevelt issued Executive Order 9066, which authorized federal officials to round up Japanese-Americans, including U.S. citizens, and remove them from the Pacific Coast. By June, 120,000 Americans of Japanese descent in California, Oregon and Washington had been ordered to assembly centers like fairgrounds and racetracks … with barbed wire … and then shipped to permanent internment camps. Then, astonishingly, they were asked to sign up for military service, and some males 18-45 did, since they were subject to the draft.

The U.S. Supreme Court heard three separate cases on the constitutionality of the internment, and the court decided it was a wartime necessity.

In 1988, President Ronald Reagan signed the Civil Liberties Act, and President George H.W. Bush later signed the letters of apology that accompanied $20,000 payments to surviving internees. A total of 82,219 Japanese-Americans eventually received $1.6 billion in reparations.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

For 40 years, Horace Greeley was the busiest, boldest editor in America

This Horace Greeley 1872 campaign banner with albumen photo and gold-leaf trim sold for $40,000 at a December 2016 Heritage auction.

By Jim O’Neal

“Go West, young man, go West and grow up with the country.”

This widely known quote is directly associated with the concept of Manifest Destiny, as Americans inexorably expanded from being huddled along the Atlantic Ocean, across a vast continent, to the shores of the magnificent Pacific Ocean. What is less agreed upon is the source of this exuberant exhortation. A vast majority attribute it to a man who could easily be crowned the Nation’s Newsman: Horace Greeley. However, there is no definitive evidence for it in any of his prolific writings or plethora of speeches.

By 1831, a young (age 20) Horace Greeley had arrived in New York, devoid of most things – especially money – except a burning desire to exploit his skills as a journeyman printer. The following year, he set up a press and published his modest first newspaper, and his reputation expanded rapidly. At 23, he had a literary weekly and a relationship with the great James Gordon Bennett, founder of the New York Herald. The future beckoned the aspiring writer-orator to bring his encyclopedic skills to the masses in new and exciting ways.

Inevitably, using borrowed money, he started the New-York Tribune, publishing the first issue on April 10, 1841. Perhaps by coincidence or divine intervention, this was the same day New York City hosted a parade in honor of recently deceased President William Henry Harrison (“Tippecanoe and Tyler Too”), who had died on April 4. Harrison, the ninth president, had served only since March 4 – the shortest tenure of any U.S. president.

The 68-year-old Harrison was the oldest president to be inaugurated until Ronald Reagan was elected in 1980 at age 69 (both were young compared to the current president and president-elect). Harrison had given a lengthy two-hour inaugural address (8,445 words – even after Daniel Webster had edited out almost half), opted not to wear a coat to demonstrate his strength, caught pneumonia and died four weeks later. His wife, Anna, was at home, also sick, and, in a first, Congress awarded her a pension – a one-time payment of $25,000, equal to the president’s salary. Their grandson, Benjamin Harrison, would become the 23rd president in 1889.

The new Greeley newspaper was a mass-circulation publication with a distinctive tone reflecting Greeley’s personal emphasis on civic rectitude and moral persuasion. Despite the challenging competition of 47 other newspapers – 11 of them dailies – the Tribune was a spectacular success. Greeley quickly became the most influential newspaperman of his time. From his pen flowed a torrent of articles, essays and books; from his mouth, an almost equal amount. In the process, he revolutionized the conception of newspapers in form and content, in effect creating modern journalism.

Then, with the advent of steam-powered printing presses and a precipitous drop in prices from 6 cents to a penny, more people were clamoring for news. The common man, ever eager for information in any category, began to read about the financial markets and almost everything about everyone.

Greeley was intensely interested in Western emigration and encouraged others to take advantage of the opportunities he envisioned: “I hold that tens of thousands, who are now barely holding on at the East, might thus place themselves on the high road to competence and ultimate independence at the West.” Curiously, he made only one trip west, going to Colorado in 1859 during the Pike’s Peak Gold Rush, joining an estimated 100,000 gold-seekers in one of the greatest rushes in the history of North America. The participants, logically dubbed the “Fifty-Niners,” found enough gold and silver to compel Congress to authorize a mint in 1862; the new Denver Mint opened in 1906.

Greeley developed a large group of followers who found in his raw eloquence and political fervor a refreshing perspective that fueled their appetite for more. For 40 years, Greeley was the busiest and boldest editor in America. Both men and women were attracted to his fiery perspective and guidance in all the great issues of the time. He spared no one, suffered no favorites and seemed to never let the nation or himself rest.

After becoming the first president of the New York Printers’ Union, he led the fight for distribution of public land to the needy and poor. He was a fierce advocate for government rescues in times of social distress – a new role for officeholders and for the sovereign state as well. Others have remarked on the similarities between his response to the depression of 1837 and FDR’s New Deal a century later. Still others consider him a trust buster – 60 years before Teddy Roosevelt and his Big Stick threats.

Perhaps less skilled in the art of personal introspection, Greeley viewed himself as an “indispensable figure in achieving national consensus.” His lofty goal was nothing less than the eradication of political differences and a complete embrace of Whig principles and sensibilities. (We are still waiting for his version of transcendental harmony.) Alas, his yearning for consensus blunted his understanding of political events. He was surprisingly slow to grasp the moral dimension of slavery until violence erupted in the 1850s (i.e., Bleeding Kansas).

He abandoned his dream of consensus, at first saying “Let the erring states go in peace,” then endorsing the North’s use of its overwhelming strength to simply impose its will. He then turned to badgering President Lincoln to negotiate a peace to stop the bloodshed – basically preserving slavery. Lincoln’s letter to the editor on Aug. 22, 1862, says it all: “If I could save the Union without freeing any slave, I would do it; and if I could save it by freeing all the slaves I would do it; and if I could save it by freeing some and leaving others alone, I would also do that.” The subtle wisdom not to expand the war into any of the border states is a point often overlooked.

In 1872, the famously eccentric editor from New York ran for president against Ulysses S. Grant, lost badly, and then died before the electoral votes were counted. Lincoln had likened Greeley to an “old shoe — good for nothing now, whatever he has been,” and Greeley himself perceived his failure. “I stand naked before my God, the most utterly, hopelessly wretched and undone of all who ever lived.”

Personally, I think not. (Seek thee proof … simply look around us today.)

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

What would Jefferson think of New York’s population, skyscrapers and deadly plague?

A signed 1786 letter in which Thomas Jefferson writes about Shays’ Rebellion, the national debt and foreign policy sold for $32,500 at an October 2018 Heritage auction.

By Jim O’Neal

I made a mistake 30 years ago when I began reading the six-volume biography of Thomas Jefferson by Dumas Malone. The first volume was published in 1948 – the series won the Pulitzer in 1975 – and ended with volume six, The Sage of Monticello, in 1981. Malone wrote in a chronicle-narrative style that was catnip for readers. I felt compelled to pick up any volume … start on any page … and not wonder what came before.
 
Randomly picking up a volume to read was like opening a box of Cracker Jack and eagerly looking for the prize – in this case, one of Jefferson’s many exploits: first secretary of state … second vice president … third president … second governor of Virginia … principal author of the Declaration of Independence … envoy to France … part-time inventor … doubling the size of the United States for a mere $15 million (the Louisiana Purchase) … commissioning the Lewis and Clark Expedition … architect-builder of Monticello … or ending up on sculptor Gutzon Borglum’s Mount Rushmore with George Washington, Abraham Lincoln and Teddy Roosevelt.
 
Now, as I find myself addicted to Governor Cuomo’s daily Covid-19 status reports, old memories of Thomas Jefferson have stirred, and I wonder what he would think about New York City with its millions of people, tall skyscrapers, massive hospital networks and the plague that has immobilized this amazing city. Early American cities were walking cities, since only the affluent owned horses. As a result, beyond a few square miles they were generally impractical. With the advent of crude mass transit like the omnibus (a wagon or small bus pulled by horses), cities expanded into larger metro areas.
 
By 1855, there were 700 omnibus lines in a few cities, transporting 120,000 passengers a day on bumpy, hand-carved cobblestone roads. This soon improved with the introduction of steel rails. By the 1880s, there were 525 horse-drawn rail lines in 325 cities. However, in addition to street pollution, horses broke down in alarming numbers, bringing an end to the production of buggy whips.
 
The first electric trolley debuted in 1888 in Richmond, Va. Horses were quickly replaced by electricity, and as early as 1902, 97 percent of urban transit had been electrified. More than 2 billion passengers were riding on 22,000 miles of electric rails annually. Steam-powered railroads, first introduced in the 1830s, continued to play an important role in transportation, but their sheer size limited their use in cities with small, uneven roads. As the 19th century ended, electric trolleys dominated urban transportation, while steam-powered locomotives focused on regional and transcontinental routes.
 
Yet America’s largest cities, especially New York, had been trying to incorporate railroads as early as 1850. First came a rail line that followed the contours of the Hudson River and catered primarily to commuters. Then NYC introduced elevated platforms carrying full-size trains above the city streets, electrified with a third rail providing the power. Chicago and Boston tried similar versions until the 20th century introduced a new, modern concept for train transit.
 
New York City pioneered the first subway: full-size trains running through massive tunnels dug under the city streets. The maiden trip was on Oct. 27, 1904, and the system eventually expanded to include 468 stations and 656 miles of commercial track. Thus the world-famous NYC subway system that we know today … and now a detour to pose a question asked by historian Carl Becker: “What is still living in the political philosophy of Thomas Jefferson?” What follows is your host’s amateurish reply.
 
The first blow was the Civil War, which destroyed the political primacy of the South … slavery and the doctrine that the states were sovereign agents bound together in a federal compact. Then the 1890 census revealed that the frontier phase of America’s history, made possible by Jefferson, was gone. The 1920 census reported the majority of American citizens lived in urban rather than rural areas. These demographic changes transformed Jefferson’s agrarian vision into a nostalgic memory. 
 
Then the 1930s New Deal capped the urbanization, industrialization and increased density of the population. Roosevelt’s appropriation of Jefferson as a New Deal Democrat has been called “one of the most inspired acts of political thievery in American history.” In fact, the New Deal signaled the death knell for Jefferson’s cherished concept of a minimalist, centralized federal government. Undoubtedly, the massive military buildup to fight the Cold War was precisely the kind of “standing army” that Jefferson truly abhorred. 
 
Lastly, of course, was the modern Supreme Court’s 1954 decision in Brown v. Board of Education (Topeka, Kan.), followed by all the other decisions regarding an equal, multiracial society. The intrusion into regular order made Jefferson’s belief in the legal and physical separation of blacks and whites an anachronism. However, I suspect Mr. Jefferson, ever the pragmatic statesman, would observe that we should liberate ourselves from the dead hand of our ancestors’ ancient views and seek our own.
 
Personally, I prefer Ronald Reagan’s uplifting words that we “pluck a flower from Thomas Jefferson’s life and wear it on our soul forever.”

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

President Coolidge deserves credit for his guiding hand

An official inaugural medal for Calvin Coolidge, inscribed “Inauguration March 4, 1925,” sold for $16,250 at a May 2019 Heritage auction.

By Jim O’Neal

There must have been thousands of American veterans of World War I still alive when I was born in 1937. After all, it had been less than 19 years since the Armistice was signed in November 1918. Although the war started in Europe in 1914, the United States didn’t get directly involved until April 1917, after a series of events provoked President Wilson to ask Congress to declare war.

However, my only recollections are about the Second World War, when my father and five of my mother’s brothers went to strange-sounding places like Iwo Jima, Guadalcanal, Tarawa and Okinawa. My Saturdays at the movies were (seemingly) exclusively Westerns and war films. Of course, there were the newsreels narrated mostly by Lowell Thomas, the voice of Movietone News. This was the generation that suffered through the Great Depression and earned the title of “the Greatest Generation” (Tom Brokaw) for their courage, sacrifice and honor. I give them a lot of credit for a time in the 1950s that I fondly recall with television, my own car, more money than I could spend and unlimited basketball, baseball and surfing.

Still, historians agree that the First World War had a major impact in shaping the modern world. A war of unprecedented violence, it upended the Victorian Era’s peace and prosperity. It unleashed mechanized warfare and death on a staggering level. Concurrently, it fundamentally altered social norms in economics, psychology and liberalism that dated back to the Enlightenment. No one has developed an acceptable theory on the confluence of events that shattered the relationships of monarchies bound by blood and familial ties. The complicated web of treaties and alliances served as an obvious domino factor, but a single circuit breaker, employed early, could have defused the entire situation.

Yet not a single leader had the courage or foresight to simply call “Time out!” and stop the equivalent of a runaway train. This strategic void led directly to the loss of 10 million lives and the destruction of a continent that had slowly evolved a benevolent culture with so much potential. Fortunately, the war was primarily rural and most of the grand historic buildings were spared; fate would not be so kind to the next confrontation … with thousands of bombers, guided bombs and the destruction of entire cities.

Perhaps worse, though, was the post-war legacy of hatred that made the horrific second tragedy inevitable. Consider the mindset of Adolf Hitler on Sept. 18, 1922, when he warned: “It cannot be that 2 million Germans should have fallen in vain … No, we do not pardon, we demand … vengeance!” Are these the words of a sane man who would be satisfied to regroup, rebuild and start over? Or of a clever psychopath who would corrupt the minds of a people still struggling with the punishment required by the Treaty of Versailles, as the English and French exacted their revenge? Thousands of books have answered this with clarity.

Sadly, Americans and especially President Wilson would be seduced by the vague concepts of a “14-Point Peace Plan” and a “League of Nations” to prevent future wars, yet Wilson couldn’t even get them past an obstinate Congress. It was another academic chimera, followed by a disabling stroke. Wilson’s successor was a flawed man, surrounded by corrupt men and public scandal. President Harding’s death in 1923 was unexpected, but it provided the opportunity for his vice president to perform an overdue house cleaning.

Calvin Coolidge was just the man to address the scandal-ridden administration of Warren G. Harding. His list of accomplishments is still not well known, but it includes cutting taxes four times, a budget surplus every year in office and reduction of the national debt by a third. In many respects, he was a man of a bygone era. He wrote his own speeches, had only one secretary and didn’t even have a telephone on his presidential desk. Little wonder that President Reagan, who admired Coolidge’s efforts toward smaller government and lower taxes, placed Silent Cal’s portrait in the White House Cabinet Room next to Lincoln and Jefferson.

Today, it’s not clear precisely how many wars we are in, or how many have the exit strategy that Colin Powell considers essential to any military action (along with a clear objective and overwhelming force to ensure victory). I wish I’d heard more from those WWI veterans who prompted this lesson!

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Peaceful transfer of presidential power is one of our strengths

Seven Days in May, starring Burt Lancaster and Kirk Douglas, is a 1964 movie about a military/political cabal’s planned takeover of the U.S. government.

By Jim O’Neal

It seems clear that one of the bedrock fundamentals that contributes to the stability of the U.S. government is the American presidency. Even considering the terrible consequences of the Civil War – 11 states seceding, 620,000 lives lost and widespread destruction – it’s important to remember that the federal government held together surprisingly well. The continuity of unbroken governance is a tribute to a success that is the envy of the world.

Naturally, the Constitution, our system of justice and the rule of law – along with all the other freedoms we cherish – are all critical contributors. But it’s been our leadership at the top that’s made it all possible. In fact, one could argue that having George Washington as the first president for a full eight years is equal in importance to all other factors. His unquestioned integrity and broad admiration, in addition to precedent-setting actions, got us safely on the road to success despite many of the governed being loyal to the British Crown.

Since that first election in 1789, 44 different men have held the office of president (Grover Cleveland for two separate terms), and six of them are alive today. I agree with Henry Adams, who argued, “A president should resemble a captain of a ship at sea. He must have a helm to grasp, a course to steer, a port to seek. Without headway, the ship would arrive nowhere and perpetual calm is as detrimental to purpose as a perpetual hurricane.” The president is the one who must steer the ship, as a CEO leads an organization, be it small or large.

In the 229 intervening years, there have been brief periods of uncertainty, primarily due to vague Constitutional language. The first occurred in 1800, when two Democratic-Republicans – Thomas Jefferson and Aaron Burr – each received 73 electoral votes. It was assumed that Jefferson would be president and Burr vice president. The wily Burr spotted an opportunity and refused to concede, forcing the decision into the House. Jefferson and Burr remained tied for 35 ballots until Alexander Hamilton (convinced that Jefferson was the lesser of two evils) swayed a few votes to Jefferson, who won on the 36th ballot. This technical glitch was fixed by the 12th Amendment in 1804, which requires electors to cast separate votes for president and vice president, avoiding any such uncertainty.

A second blip occurred after William Henry Harrison and John Tyler defeated incumbent Martin Van Buren. At age 68, Harrison was the oldest to be sworn in as president, a record he held until Ronald Reagan’s inauguration in 1981 at age 69. Harrison died 31 days after his inauguration (also a record), the first time a president had died in office. A controversy arose over the successor. The Presidential Succession Act of 1792 specifically provided for a special election in the event of a double vacancy, but the Constitution was not specific regarding just the presidency.

Vice President Tyler, at age 51, would be the youngest man to assume leadership. He was well educated, intelligent and experienced in governance. However, the Cabinet met and concluded he should bear the title of “Vice President, Acting as President” and addressed him as Mr. Vice President. Ignoring the Cabinet, Tyler was confident that the powers and duties fell to him automatically and immediately as soon as Harrison had died. He moved quickly to make this known, but doubts persisted and many arguments followed until the Senate voted 38 to 8 to recognize Tyler as the president of the United States. (It was not until 1967 that the 25th Amendment formally stipulated that the vice president becomes president, as opposed to acting president, when a president dies, resigns or is removed from office.)

In July 1933, an extraordinary meeting was held by a group of disgruntled financiers and Gen. Smedley Butler, a recently retired, two-time Medal of Honor recipient. According to official Congressional testimony, Butler claimed the group proposed to overthrow President Franklin Roosevelt because of his socialistic New Deal agenda, which would create enormous federal deficits if allowed to proceed.

Smedley Darlington Butler (1881-1940) was a U.S. Marine Corps major general – the highest rank then authorized – and the most decorated Marine in U.S. history. He testified in a closed session that his role in the conspiracy was to issue an ultimatum to the president: FDR was to immediately announce he was incapacitated due to his crippling polio and needed to resign. If the president refused, Butler would march on the White House with 500,000 war veterans and force him out of power. Butler claimed he refused despite being offered $3 million and the backing of J.P. Morgan’s bank and other important financial institutions.

A special committee of the House of Representatives (a forerunner to the Committee on Un-American Activities) headed by John McCormack of Massachusetts heard all the testimony in secret, but no additional investigations or prosecutions were launched. The New York Times thought it was all a hoax, despite supporting evidence. Later, President Kennedy privately mused that he thought a coup d’état might succeed if a future president thwarted the generals too many times, as he had done during the Bay of Pigs crisis. He cited a military plot like the one in the 1962 book Seven Days in May, which was turned into a 1964 movie starring Burt Lancaster and Kirk Douglas.

In reality, the peaceful transfer of power from one president to the next is one of the most resilient features of the American Constitution and we owe a deep debt of gratitude to the framers and the leaders who have served us so well.

JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Usual Fireworks Expected with Latest Supreme Court Selection

This photograph, signed by Supreme Court Chief Justice William H. Taft and the eight associate justices, circa 1927, sold for $14,340 at a September 2011 Heritage auction.

By Jim O’Neal

It is that time again when the news will be filled with predictions of pestilence, war, famine and death (the Four Horsemen of the Apocalypse) as President Trump tees up his next candidate for the Supreme Court. One side will talk about the reversal of Roe v. Wade as an example of the terrible future that lies ahead. The other side will be quick to point out that this fear-mongering first started in 1981 when Sandra Day O’Connor (the first woman to serve on the court) was nominated by President Reagan and that nothing has happened in the intervening 37 years.

My prediction is that regardless of who is confirmed, there will be no past record of opinions on “Roe,” and he or she will have been groomed by the “murder boards” to answer that it is settled law. Murder boards are groups of legal experts who rehearse the nominee on how to answer every possible question the Senate Judiciary Committee might ask – on any subject, not just Roe – in its role of giving advice and consent. The result is what former Vice President Joe Biden, from his years in the Senate, described as a “Kabuki dance.”

The questioning does produce great public theater, but it is a tradition that dates only to 1925, when nominee Harlan Stone actually requested that he be allowed to answer questions about rumors of improper ties to Wall Street. It worked: He was confirmed by a vote of 71-6 and would later serve as Chief Justice (1941-46). In 1955, John Marshall Harlan II was next, when Southern Senators wanted to know his views on public school desegregation vis-à-vis Brown v. Board of Education. He was also successfully confirmed, 71-11, and since then every nominee to the court has been questioned by the Senate Judiciary Committee. The apparent record is the 30 hours of grilling Judge Robert Bork endured in 1987, when he got “Borked” by trying to answer every single question honestly. Few make that mistake today.

Roe v. Wade was a 1973 case in which the issue was whether a state could constitutionally make it a crime to perform an abortion, except to save the mother’s life. Abortion had a long legal history dating to the 1820s, when anti-abortion statutes began to appear that resembled an 1803 law in Britain making abortion illegal after “quickening” (the start of fetal movements), using various rationales such as illegal sexual conduct, unsafe procedures and the state’s responsibility to protect prenatal life.

The criminalization accelerated from the 1860s, and by 1900 abortion was a felony in every state. Despite this, the practice continued to grow, and in 1921 Margaret Sanger founded the American Birth Control League. By the 1930s, licensed physicians were performing an estimated 800,000 procedures each year. In 1967, Colorado became the first state to decriminalize abortion in cases of rape, incest or permanent disability of the woman. In 1970, Hawaii became the first state to legalize abortion at the request of the woman, and by 1972, 13 states had laws similar to Colorado’s. So the legal situation prior to Roe was that abortion was illegal in 30 states and legal under certain conditions in the other 20.

“Jane Roe” was an unmarried pregnant woman who supposedly wished to terminate her pregnancy and instituted an action in the U.S. District Court for the Northern District of Texas. A three-judge panel found the Texas criminal statutes unconstitutionally vague and held that the right to choose whether to have children was protected by the 9th through the 14th Amendments. All parties appealed, and on Jan. 22, 1973, the Supreme Court ruled the Texas statute unconstitutional. The court declined to define when human life begins.

Jane Roe’s real name was Norma McCorvey; before she died, she became a pro-life advocate and maintained that she never had the abortion and was the victim of two young, ambitious lawyers looking for a plaintiff. Henry Wade was district attorney of Dallas from 1951 to 1987, the longest-serving DA in United States history. He was also involved in the prosecution of Jack Ruby for killing Lee Harvey Oswald. After Ruby was convicted, he appealed and the verdict was overturned, but he died of lung cancer before he could be retried and thus remains constitutionally presumed innocent.

Stay tuned for the fireworks.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Head-Butting with Russia Dates Back Nearly 300 Years

This Nicholas II gold specimen Imperial of 10 Roubles, 1895, sold for $228,000 at a January 2018 Heritage auction.

By Jim O’Neal

In the 21st century, the United States-Russian Federation relationship has become more contentious and complex. The deterioration accelerated during the second Obama term, following disagreements over Ukraine and the reabsorption of Crimea in 2014, after Secretary of State Hillary Clinton was unable to hit the reset button. The U.S. presidential election of 2016 ended with reports of Russian interference and has steadily morphed into a major issue covered obsessively by the media.

Formal investigations include both Houses of Congress, the FBI-Justice Department and a Special Prosecutor who is delving into possible obstruction and conspiracy activities, plus an unknown series of related crimes. It has become a “cottage industry” that is either really big (e.g. subpoenas, indictments and impeachment) or simply another partisan distraction.

For perspective, it is useful to recall that difficulties involving Russia are not a recent or even a 20th-century phenomenon. They started well before we became an independent nation and continued until we jointly became the world’s two superpowers. From 1732 to 1867, there were a number of squabbles with Tsarist Russia that stretched from the Bay Area in Northern California up the Pacific Coast to Alaska.

Like other European nations, Russia was interested in expanding through a strategy of colonization. It had become powerful under Peter the Great (1672-1725), who needed to develop new territories rich in fur-bearing mammals after over-hunting depleted the stocks in Siberia. He dispatched cartographer Vitus Bering to explore Alaska, and the first permanent Russian settlement – a fur-trading post – was established in 1784. This was followed by the Russian-American Company (RAC), formed in an attempt to monopolize the fur trade and convert Alaskan natives into pseudo-Russian subjects to help with maritime fur trading.

The Russians then moved down to the pristine Northern California coast, where in 1812 they established an outpost called Fort Ross. It lasted until 1841 and is now a California Historical Landmark, an hour’s drive north of the bustling San Francisco Bay. However, the Russians were never able to make North America profitable, and Secretary of State William Seward negotiated the purchase of Alaska for $7.2 million in 1867. Originally scoffed at as “Seward’s Folly,” the territory was admitted as the 49th state on Jan. 3, 1959.

At two cents an acre, the state’s 663,268 square miles made it larger than the combined areas of Texas, California and Montana. Along with the Louisiana Purchase, it became one of the better land deals the United States ever made – excluding, of course, areas where we simply overpowered American Indians and took their land and anything else that was unoccupied.

Although we were allies with Russia during World War II, post-war Germany was up for grabs and Berlin became the next area of contention. In 1945, the city was divided, with the United States, the United Kingdom and France taking three sectors (West) and Russia the remainder (East). In 1952, the border was closed, and in 1961 the Berlin Wall went up to hinder defections to the West. It was here that President Reagan made his famous 1987 speech … “General Secretary Gorbachev … tear down this wall.” The wall did fall, on Nov. 9, 1989, and Germany was officially reunited on Oct. 3, 1990. By then, George Herbert Walker Bush was president.

Oh, yes, Reagan’s speech was made on June 12 … while then-Vice President Bush was celebrating his 63rd birthday. Happy birthday, President Bush! We miss you. Get well soon.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

National Debt on Automatic Pilot to More Growth

A letter by President George W. Bush, signed and dated July 4, 2001, sold for $16,730 at an April 2007 Heritage auction.

By Jim O’Neal

In May 2001 – just 126 days after President George W. Bush took office – Congress passed his massive tax proposal. The Bush tax cuts had been reduced to $1.3 trillion from the $1.65 trillion submitted, but it was still a significant achievement from any historical perspective. It had taken Ronald Reagan two months longer to win approval of his tax cut and that was 20 years earlier.


Bush was characteristically enthusiastic about this, but it had come with a serious loss in political capital. Senator James Jeffords, a moderate from Vermont, announced his withdrawal from the Republican Party, tipping control of the Senate to the Democrats, the first time in history that had occurred as the result of a senator switching parties. In this instance, it was from Republican to Independent, but the practical effect was the same. Several months later (after the terrorist attacks on the World Trade Center and the Pentagon), there was a loud chorus of calls to reverse the tax cuts to pay for higher anticipated spending.

Bush had a counter-proposal: Cut taxes even more!

Fiscal conservatives were worried that there would be the normal increase in the size and power of the federal government, lamenting that this was a constant instinctive companion of hot wars. James Madison’s warning that “A crisis is the rallying cry of the tyrant” was cited against centralization that would foster liberal ideas about the role of government and even more dependency on the federal system.

Ex-President Bill Clinton chimed in to say that he regretted not using the budget surplus (really only a forecast) to pay off the Social Security trust fund deficit. Neither he nor his former vice president had dispelled the myth of a “lock box” or explained the federal building in West Virginia that had been built exclusively to hold the government’s IOUs to Social Security. In reality, they were simply worthless pieces of scrip, stored in unlocked filing cabinets. The only change that had ever occurred with Social Security funds was whether they were included in a “unified budget” or not. They had never been kept separate from other revenues the federal government received.

But this was Washington, D.C., where, short of a revolution or civil war, change comes in small increments. Past differences, like family arguments, linger in the air like the dust that descends from the attic. All of the huge surpluses totally disappeared with the simple change in the forecast and have never been discussed since.

Back at the Treasury Department on 15th Street, a statue of Alexander Hamilton commemorates the nation’s first Treasury Secretary, a fitting honor for the man who created our fiscal foundation. But on the other side stands Albert Gallatin, President Thomas Jefferson’s Treasury Secretary, who struggled to pay off Hamilton’s debts and shrink the bloated bureaucracy he had built.

Hamilton also fared better than his onetime friend and foe, James Madison. The “Father of the Constitution” had no statue, no monument, no lasting tribute until 1981, when the new wing of the Library of Congress was named for him. That drought was matched only by John Adams, the Revolutionary-era hero and ardent nationalist. It was only after David McCullough’s laudatory 2001 biography that Congress commissioned a memorial to the nation’s second president.

Since the Bush tax cuts and the new forecast, the national debt has ballooned to $20 trillion as 9/11, the wars in Iraq and Afghanistan, and the 2008 financial meltdown produced a steady stream of budget deficits in both the Bush and Barack Obama administrations. The Donald Trump administration is poised to approve tax reform, amid arguments over its stimulative effect on the economy and who will benefit. In typical Washington fashion, there is no discussion of the fact that the national debt is inexorably on automatic pilot to $25 trillion, irrespective of tax reform. But this is Washington, where your money (and all they can borrow) is spent almost without effort.

“Just charge it.”

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Some Supreme Court Confirmation Hearings Would Make Great Pay-Per-View TV

This oversized photograph of the U.S. Supreme Court, circa 1984, is signed by all nine justices, including Lewis F. Powell Jr. It realized $4,481.25 at an April 2011 Heritage auction.

By Jim O’Neal

When Justice Lewis F. Powell Jr. unexpectedly announced his retirement in June 1987, no commentator failed to emphasize the implications for the future of the Supreme Court. The New York Times stated the obvious: “Powell’s resignation gives President Reagan a historic opportunity to shape the future of the Court.” Justice Powell had played a pivotal role as the tie-breaking vote on controversial issues such as abortion, affirmative action and separation of church and state.

Yet Powell was not merely a tie-breaker. Since he frequently swayed the court’s decision from one ideological camp to the other by virtue of his swing vote, he was viewed as mainstream. As a result, President Reagan attempted to portray Powell’s replacement, Robert Bork, as neither conservative nor liberal, stressing his “evenhanded and open-minded approach to the law.”

The president’s lack of success was immediately evident when Senator Edward Kennedy – only 45 minutes after Bork’s nomination was announced – fired the opening salvo against Bork’s record on abortion, civil rights and criminal justice. Kennedy declared: “Robert Bork’s America is a land in which women would be forced into back-alley abortions, blacks would be forced to sit at segregated lunch counters, rogue policemen could break down citizens’ doors in midnight raids, schoolchildren could not be taught about evolution, and artists could be censored at the whim of government.”

Once Kennedy unleashed these polemics, there was no turning back. Southern Senators were intimidated by the possible loss of black voters and liberals in the Senate were eager for a good fight after eight years of frustrating losses to conservatives.

Despite being confirmed unanimously for the U.S. Court of Appeals, Judge Bork was stepping into a veritable political hornets’ nest and he was the wrong person in the wrong spot at the wrong time! His copious scholarly writings – an asset in academia – and his lucidly crafted, elegantly penned opinions on the appellate bench were red meat in the hands of hostile interest groups.

Bork with President Ronald Reagan in 1987.

Moreover, Bork’s personal appearance and demeanor seemed as suspect as his ideology. His devilish beard and turgid academic discourses did not endear him to the public or to wavering Senators. His detailed, scholarly, lecture-like answers to every single question would be considered naive today, when nominees are well versed in the art of non-answers to tough questions, having been grilled by “murder boards” designed to prepare careful answers to virtually everything the nominee has written or spoken since puberty. Today’s Google/Facebook generation of staffers can unearth obscure facts that might be even slightly contentious.

Judge Bork’s nomination was rejected by a resounding 42-58 vote. Having been transfixed by the riveting testimony, I personally believe that even if Judge Bork were given another try today (he died in 2012), the outcome would be similar. He had such a high regard for his own superior legal acumen, and was so openly dismissive of the twits on the Senate Judiciary Committee, that it would be another verbal combat ending just as badly.

It would be a perfect scenario for a pay-per-view cable TV spectacle, especially for Supreme Court nerds like moi.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].