Bikes Symbolized Progress for a Nation Ready for Growth

A rare campaign button shows presidential candidate William McKinley riding a bicycle at the height of the bike boom of the 1890s.

By Jim O’Neal

As the bicycle became more popular in the latter part of the 1800s, it was inevitable that millions of new enthusiasts would soon be demanding better roads to accommodate this object of pleasure, so symbolic of progress. It was hailed as a democratizing force for the masses and neatly bridged the gap between the horse and the automobile, which was still a work in progress.

The popularity of this silent, steel steed had exploded with the advent of the “safety bicycle” (1885), which dramatically reduced the hazards of the giant “high wheelers.” The invention of the pneumatic tire in 1889 greatly enhanced the comfort of riding and further expanded the universe of users. However, this explosion in activity also increased the level of animosity as cities tried to cope by restricting hours of use, setting speed limits and passing ordinances that curtailed access to streets.

There were protest demonstrations in all major cities, but matters came to a head in 1896 in San Francisco. The city’s old dirt roads were crisscrossed with streetcar tracks, cable slots and abandoned street rail franchises. Designed for a population of 40,000, the nation’s third-wealthiest city was now a metropolis of 350,000 and growing. On July 25, 1896, advocates of good streets and organized cyclists paraded through downtown with 100,000 spectators cheering them on.

The “Bicycle Wars” were soon a relic of the past as attention shifted to a product that was destined to change the United States more than anything in its history: Henry Ford’s Model T. Production by the Ford Motor Company began in August 1908 and the new cars came rolling out of the factory the next month. It was an immediate success since it solved three chronic problems: automobiles were scarce, prohibitively expensive and consistently unreliable.

Voila, the Model T was easy to maintain, highly reliable and priced to fit the budgets of the vast number of Americans with only modest incomes. It didn’t start the Automobile Age, but it did take hold in the hearts and souls of millions of people eager to join in the excitement that accompanied the innovation. It accelerated the automobile’s entry into American society by at least a decade.

By 1918, 50 percent of the cars in the United States were Model Ts.

There were other cars pouring into the market, but Model Ts, arriving by the hundreds of thousands, gave a sharp impetus to the support structure – roads, parking lots, traffic signals, service stations – that made all cars more desirable and inexorably changed our daily lives. Automotive writer John Keats summed it up well in The Insolent Chariots: The automobile changed our dress, our manners, social customs, vacation habits, the shapes of our cities, consumer purchasing patterns and common tasks.

By the 1920s, one in eight American workers was employed in an automobile-related industry, be it petroleum refining, rubber making or steel manufacturing. The availability of jobs helped create the beginning of a permanent middle class and, thanks to the Ford Motor Company, most of these laborers made a decent living wage on a modern five-day, 40-hour work week.

Although 8.1 million passenger cars were registered by the 1920s, paved streets were more often the exception than the rule. The dirt roads connecting towns were generally rutted, dusty and often impassable. However, spurred by the rampant popularity of the Model T, road construction quickly became one of the principal activities of government and expenditures zoomed to No. 2 behind education. Highway construction gave birth to other necessities: the first drive-in restaurant (Kirby’s Pig Stand) in Dallas in 1921, the first “mo-tel” in San Luis Obispo in 1925, and the first public garage in Detroit in 1929.

The surrounding landscape changed with the mushrooming of gas stations from coast to coast, replacing the cumbersome practice of buying gas by the bucket from hardware stores or street vendors. Enclosed curbside pumps became commonplace, as did hundreds of brands, including Texaco, Sinclair and Gulf. The intense competition inspired dealers to distinguish themselves with identifiable stations and outlandish buildings. Then, in the 1920s, the “City Beautiful” movement produced gas stations styled as ancient Greek temples, log cabins or regional designs such as Colonial New England and California Spanish mission.

What a glorious time to be an American and be able to drive anywhere you pleased and see anything you wished. This really is a remarkable place to live and to enjoy the bountiful freedoms we sometimes take for granted.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

President Ford’s Primary Task was Healing a Nation

A letter by Gerald R. Ford, signed and dated April 16, 1979, sold for $5,078 at an April 2007 Heritage auction.

By Jim O’Neal

Gerald Rudolph Ford (Leslie Lynch King Jr. at birth) was an uncomplicated man tapped by destiny for one of the most complex jobs in history. The only president never elected to the presidency or vice presidency, and the first vice president appointed under the 25th Amendment and confirmed by Congress, he was tasked with healing the nation’s wounds from the Vietnam War and the severe divisions caused by the Watergate scandal. Unlike the driven personalities typical of the Oval Office, Ford restored calm and confidence to a nation while ushering in a period of renewal for American society.

A year before his inauguration, it would never have occurred to Ford (1913-2006) that he would be thrust into the presidency. The highest office he ever aspired to was Speaker of the House of Representatives, and that seemed out of reach because the Democratic Party had a stranglehold on the House. As a result, Ford had decided to retire after the November 1974 elections.


Suddenly, in October 1973, President Richard Nixon nominated him for vice president in the wake of Spiro Agnew’s resignation. “Remember, I’m a Ford, not a Lincoln,” he said modestly when he was sworn in on Dec. 6, 1973. He was at peace with himself and provided a sense of restored purpose, blissfully unaware of how quickly the presidency would collapse amid seemingly endless revelations of misconduct at high levels in the administration.

One bright spot was that even as it approached dissolution, the Nixon administration managed to navigate the Arab-Israeli War of 1973 and diminish the Soviet position in the Middle East by successfully sponsoring a complicated triangular diplomacy with Moscow and Beijing. The disintegration of executive power did not lead to a collapse of our international position. But Nixon’s prestige, built on five years of foreign policy, now rested increasingly on bluff, and the sleight of hand grew more difficult to sustain.

As impeachment proceedings gathered momentum, Nixon’s personal conduct began to mirror his political decline. He kept abreast of policy issues and made key decisions, but Watergate absorbed more and more of his intellectual and emotional capital. Routine business seemed increasingly trivial as his downfall came to appear inevitable. His tragedy was largely self-inflicted and the only question was, “How long can this go on?”

Then on July 31, it was revealed that one of the tapes the Supreme Court ordered to be turned over to the Special Prosecutor was the long-sought “smoking gun”— conclusive proof of Nixon’s participation in the cover-up. On the tape, Nixon was clearly heard instructing Chief of Staff H.R. Haldeman to use the CIA to thwart an FBI investigation into the Watergate burglary.

With the tape’s release, Ford took the unprecedented step on Aug. 6 of disassociating himself from the president at a Cabinet meeting. He would no longer defend the president and said he would not have done so earlier had he known. Publicly, he maintained silence as a “party in interest” (probably another first).

But it was the morning of Aug. 9, 1974, that witnessed one of the most dramatic moments in American history. At 9:30 in the East Room, Richard Nixon bade farewell to his staff. At 12:03 that same day, in the same room, Gerald R. Ford was sworn in as the 38th president of the United States.

Earlier, General Alexander Haig had handed Nixon’s formal resignation to Henry Kissinger in his role as Secretary of State. All presidential appointments are countersigned by the Secretary of State and, by the same token, resignations of a president and vice president are made to the Secretary of State as well. With the resignation of Spiro Agnew on Oct. 10, 1973, and Richard Nixon as president on Aug. 9, 1974, Kissinger achieved what we must hope will remain the permanent record for receiving high-level resignations … forever!

Our long national nightmare had finally come to an end.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

General Lee’s Decision Avoided the ‘Vietnamization of America’

Robert E. Lee declined President Lincoln’s offer to head up the Union Army since it would require him to bear arms against his home state of Virginia.

By Jim O’Neal

In late 1955, the Tappan Zee Bridge – spanning the Hudson River in New York – was opened with seven lanes for motor traffic. Two months ago, it was closed and is systematically being demolished. The deteriorating bridge, known in the governor’s office as the “hold-your-breath bridge,” was featured in the documentary The Crumbling of America, the story of the infrastructure crisis in the United States.

Also in this same category is the Arlington Memorial Bridge, which connects the Lincoln Memorial to Arlington National Cemetery and is often described as symbolically rejoining the North and South after the Civil War. First proposed in 1886 as a memorial to General Ulysses S. Grant, it was blocked in Congress until President Warren G. Harding got snarled in a three-hour traffic jam in 1921 en route to the dedication of the Tomb of the Unknown Soldier.

Congress quickly approved his request for $25,000 to build the bridge and it finally opened in January 1932.

Nearby is Arlington House, the Robert E. Lee Memorial. This was the home of the Lee family for 30 years and where Robert E. Lee made the fateful decision to resign his commission in the U.S. Army on April 21, 1861, and join the Confederate States. He had declined President Abraham Lincoln’s offer to head up the Union Army since it would require him to bear arms against his home state of Virginia.

In June 1862, Congress enacted a property tax on all “insurrectionary” land and added an amendment in 1863 requiring the tax to be paid in person. Ill and behind Confederate lines, Mary Lee was unable to comply and the Lees never slept there again. The property was auctioned off on Jan. 11, 1864, and the high bidder ($26,800) was the U.S. government.

Secretary of War Edwin Stanton approved the conversion of the Lee estate to a military cemetery in 1864. On May 13 of that year, the first soldier was buried on the grounds of what became Arlington National Cemetery, and more than 400,000 have joined him since, including President Taft, President Kennedy and my dear friend Roger Enrico.

For 15 years, driving to my Dallas office, I passed a statue of Robert E. Lee. It invariably evoked memories of the wisdom of this soldier who surrendered his army to General Grant at Appomattox in April 1865. Most of his top aides tried to dissuade Lee from surrendering, arguing they could disband into the familiar countryside and hold out indefinitely in a stalemate. Eventually, Northern soldiers would simply return to their homes and the South could regroup.

Thus did Robert E. Lee, so revered for his leadership in war, make his most historic contribution – to peace! By this one momentous decision, he spared the country the divisive guerrilla war that would have followed … a vile and poisonous conflict that would have fractured the country perhaps permanently. Or as newspaper columnist Tom Wicker deftly put it, “The Vietnamization of America.”

Alas, Dallas city leaders recently removed the Lee statue and I sincerely hope they find some relief from the anguish they have suffered from this piece of marble sequestered so long. However, I suspect they will just move on to some other injustice. It reminds me of feeding jellybeans to pacify a ravenous bear. When you (inevitably) run out of jellybeans, he eats you.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Chester Arthur Surprised His Critics, Overcame Negative Reputation

This ribbon with an engraved portrait of Chester Alan Arthur, issued as a souvenir for an Oct. 11, 1882, “Dinner to The President of the United States by The City of Boston,” sold for $437 at a November 2014 auction.

By Jim O’Neal

President Ulysses S. Grant appointed Chester Alan Arthur to the lucrative post of Collector of the Port of New York in 1871. Arthur held the job for seven years, and with an annual gross income of $50,000, was able to accumulate a modest fortune. He was responsible for the collection of about 75 percent of the entire nation’s duties from ships that landed in his jurisdiction, which included the entire coast of New York state, the Hudson River and ports in New Jersey.

In 1872, he raised significant contributions from Custom House employees to support Grant’s successful re-election for a second term. The spoils system was working as designed, despite occasional charges of corruption.

Five years later, the Jay Commission was created to formally investigate corruption in the New York Custom House, and (future president) Chester Arthur was the primary witness. The commission recommended a thorough housecleaning, and President Rutherford B. Hayes fired Arthur, then offered him an appointment as consul general in Paris. Arthur refused and returned to New York law and politics.

At the 1880 Republican National Convention, eventual nominee James Garfield first offered the VP slot to wealthy New York Congressman Levi Morton (later vice president under Benjamin Harrison), who refused. Garfield then turned to Chester Arthur, who, when he accepted, declared, “The office of the vice president is a greater honor than I ever dreamed of attaining.” It would be the only election he would ever win, but it was enough to propel him into the presidency.

The Garfield-Arthur ticket prevailed and, after he was sworn in on March 4, 1881, the 49-year-old Garfield’s first act was to turn and kiss his aged mother. It was the first time a president’s mother had ever been present at an inauguration. She would outlive her son by almost seven years. President James Polk (1845-1849) had also died before his mother – by three years – the first time that had happened.

On the morning of July 2, President Garfield was entering the Baltimore and Potomac Railroad Station in Washington, D.C., where he was to board a train to attend the 25th reunion of his class at Williams College. A mentally disturbed office seeker, Charles J. Guiteau, shot him twice. He died 80 days later and, for the fourth time in history, a man clearly meant only to be vice president ascended to the presidency.

“CHET ARTHUR PRESIDENT OF THE UNITED STATES! GOOD GOD!”

Although President Arthur’s greatest achievement may have been the complete renovation of the White House, he surprised even some of his harshest critics. Mark Twain may have summed it up best: “I am but one in 55 million, still in the opinion of this one-fifty-five millionth of the country’s population, it would be hard to better President Arthur’s administration.”

Faint praise, yet probably accurate. (First, do no harm.)

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Sanctions Didn’t Stop Germany from Roaring Back After WWI

A 1939 political cartoon by Charles Werner (1909-1997) for Time magazine comments on the worldwide mood 20 years after the Treaty of Versailles. The original art sold for $836 at a February 2006 Heritage auction.

By Jim O’Neal

From 1939 to the winter of 1941, the German military won a series of battles rarely equaled in the history of warfare. In rapid succession, Poland, Norway, France, Belgium, Holland, Yugoslavia, Denmark and Greece all fell victim to the armed forces of the Third Reich. In the summer and fall of 1941, the USSR came close to total defeat at the hands of the Wehrmacht, losing millions of soldiers on the battlefield and witnessing the occupation of a large portion of Russia and the Ukraine. The German air force, the Luftwaffe, played a central role in this remarkable string of victories.

It was even more startling to the countries that had fought in WWI and imposed draconian measures when it ended to make sure Germany could never wage war again. This was simply something that was NEVER supposed to happen again, much less a mere 20 years later. How was it even possible?

The Allied powers had been so impressed with the combat efficiency of German military aviation in WWI that they made a concerted effort to eliminate Germany’s capability to wage war in the air. Then they crippled its civilian aviation capability just to be certain. The Allies demanded the immediate surrender of 2,000 aircraft and the rapid demobilization of the German air service. Then, in May 1919, the Germans were forced to surrender vast quantities of aviation materiel, including 17,000 more aircraft and engines. Germany was permanently forbidden from maintaining a military or naval air force.

No aircraft or parts were to be imported, and in a final twist of the knife, Germany was not allowed to control its own airspace. Allied aircraft were granted free passage over Germany and unlimited landing rights. On May 8, 1920, the German air service was officially disbanded.

Other provisions of the Versailles Treaty dealt with the limits of the army and navy, which were denied tanks, artillery, poison gas, submarines and other modern weapons. Germany was to be effectively disarmed and rendered militarily helpless. An Inter-Allied Control Commission was given broad authority to inspect military and industrial installations throughout Germany to ensure compliance with all restrictions.

However, one critical aspect was overlooked in the zeal to impose such a broad set of sanctions. The Allies left unsupervised one of the most influential military thinkers of the 20th century … Hans von Seeckt, who would soon command the postwar German army. He was the rare figure who correctly analyzed the operational lessons of the war and accurately predicted the direction that future wars would take. Allied generals clung to outdated principles like using overwhelming force to overcome defensive positions, while von Seeckt saw that maneuver and mobility would be the primary means of the future. Mass armies would become cannon fodder and trench warfare would not be repeated.

The story of the transformation of the Luftwaffe is a fascinating one. Faced with total aerial disarmament in 1919, Germany reemerged only 20 years later with the most combat-effective air force in the world. Its concepts of future air war, along with its training and equipment, totally outclassed the opposition, which was looking backward … always fighting the last war.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Cuban Missile Crisis ‘News’ Gave Us a Preview of the Internet Age

An original October 1962 news photograph of President John F. Kennedy and Robert Kennedy taken as tensions grew during the Cuban Missile Crisis sold for $527 at an August 2015 Heritage auction.

By Jim O’Neal

“I am prepared to wait for my answer until hell freezes over.”

An unusual statement, especially at an emergency session of the somber United Nations Security Council, and uncharacteristically bellicose for the speaker, U.N. Ambassador Adlai E. Stevenson. It was simply the most dangerous time in the history of the world … the 1962 Cuban Missile Crisis.


Ambassador Stevenson was interrogating Soviet U.N. representative Valerian Zorin, accusing the Soviet Union of having installed nuclear missiles in Cuba, a mere 90 miles from the U.S. coastline. Tensions were sky high. The Joint Chiefs had recommended to President John F. Kennedy an airstrike, followed by an immediate invasion of Cuba by U.S. military troops.

Then with the world’s two superpowers eyeball to eyeball, as Dean Rusk commented, the other guy blinked. Cuba-bound Soviet ships stopped, turned back, and the crisis swiftly eased.

Over much of the world, and especially in Washington and New York, there was relief and rejoicing. With crucial backing from the United Nations and the Organization of American States (OAS), nuclear war was averted. Success in avoiding a war of potential global devastation has gradually clouded the fact that the United States came perilously close to choosing the military option.

The arguments of those who fought for time and political negotiation have gradually been obscured by the widespread euphoria. Even for Ambassador Stevenson, the sweet taste of success soon turned sour. First came the death of his dear friend Eleanor Roosevelt, quickly followed by a vicious personal attack from which he never fully recovered.

When Mrs. Roosevelt reluctantly entered the hospital, it was thought she was suffering from aplastic anemia. But on Oct. 25, 1962, her condition was diagnosed as rare and incurable bone-marrow tuberculosis. She was prepared and determined to die rather than end up a useless invalid. Her children reluctantly decided Stevenson should be allowed one last visit to his old friend, although daughter Anna warned she might not recognize him.

On Nov. 9, two days after her death, the U.N. General Assembly put aside other business and allowed delegate after delegate to express their personal grief and their country’s sorrow. It was the first time any private citizen had been so honored. Adlai told friends that his speech at the General Assembly and the one he gave at her memorial service were the most difficult and saddest times of his life.

Then came a harbinger of a brewing storm on Nov. 13, when Senator Barry Goldwater issued a sharp attack on Stevenson, implying he had been willing to take national security risks to avoid a showdown with the Soviets. The Saturday Evening Post followed with an article on Cuba that portrayed Stevenson as advocating a “Caribbean Munich.” A New York Daily News headline screamed: “ADLAI ON SKIDS OVER PACIFIST STAND ON CUBA.”

For months, Washington was abuzz with rumors that it was all a calculated effort by JFK and Bobby to force Stevenson to resign as U.N. ambassador. It was all innuendo, half-facts and untrue leaks, but it was still reverberating a quarter of a century later when the Sunday New York Times magazine, on Aug. 30, 1987, published a rehash of all the gossip.

In truth, all we were witnessing was a preview of things to come: the internet age of “Breaking News” (thinly veiled opinions parading as facts), 24/7 cable TV loaded with panels of “talking heads,” and a torrent of Twitter gibberish offering a full banquet of tasty goodies for any appetite.

Stevenson, born in Los Angeles in 1900 – the year his grandfather ran for vice president on a losing ticket with William Jennings Bryan – lost his own bid for the presidency twice (1952 and 1956). He died of a heart attack in 1965 in London while walking in Grosvenor Square – finally getting some peace.

The rest of us will have to wait.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Artists Recognized James Monroe as a True American Hero

A charcoal sketch of George Washington aide Lt. Col. Robert Hanson Harrison that artist John Trumbull did for his epic painting The Capture of the Hessians at Trenton sold for $8,962 at a May 2009 Heritage auction.

By Jim O’Neal

John Trumbull (1756-1843) deservedly earned the sobriquet “Painter of the Revolution.” He actually started out as an aide to General George Washington, but ended up in London, where he developed into a highly respected artist. One of his paintings, depicting the signing of the Declaration of Independence, graces the back of the $2 bill that features Thomas Jefferson. The bill was reissued in 1976 to observe the bicentennial of that historic event.

Another of his numerous works is The Capture of the Hessians at Trenton on Dec. 26, 1776. This one naturally features General Washington again, but it also depicts a future president, Lieutenant James Monroe, being treated for a nearly fatal wound to an artery.

An even more famous painting of the times is an 1851 oil on canvas that also features Washington – Washington Crossing the Delaware on Dec. 25-26, 1776. It was painted by Emanuel Leutze (1816-1868), a German-American immigrant. Once again, we find James Monroe holding the American flag – the Stars and Stripes – which critics are always quick to point out was not adopted until the following year, 1777. Some nitpickers also complain that the time of day is wrong, the boat is incorrect, and (sigh) even the chunks of ice in the river aren’t right.

But the role of James Monroe as a true hero is beyond any doubt.

Often called the “Last of the Founding Fathers,” he was the fifth president of the United States and like Washington, Jefferson and Madison, the son of a Virginia planter. It is sometimes overlooked that in the first 36 years of the American presidency, the Oval Office was occupied almost exclusively by men from Virginia. Somehow, John Adams (Massachusetts) managed to squeeze in a quick four years as president (1797-1801) before sneaking out of Washington, D.C., when Thomas Jefferson ousted him.

James Monroe entered politics after his service in the Revolutionary War and systematically worked his way up, starting in the Virginia legislature. He was a U.S. senator, a minister to France, and then governor of Virginia. After helping negotiate the Louisiana Purchase, he served as minister to Britain, followed by another stint as Virginia’s governor. But after only four months, President Madison offered him an appointment as secretary of state to help draft the recommendation to Congress that led to the declaration of war against Great Britain in 1812.

When the war got off to a poor start, Madison wisely appointed him secretary of war as well, and Monroe held both of these critical Cabinet positions until the war ended. After the war, the prosperity of the country improved dramatically and, with Madison’s strong support, Monroe was easily elected president in 1816.

Taking office when the country finally had no unusual problems, the 58-year-old Monroe was bold enough to declare during his inaugural address: “Never did a government commence under auspices so favorable, nor ever was success so complete. If we look to the history of other nations, ancient or modern, we find no example of a growth so rapid, so gigantic, of a people so prosperous and happy … the heart of every citizen must expand with joy … how near our government has approached to perfection…”

It was truly the “Era of Good Feelings!”

Things change … and they will again.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Cotton Gin Extended America’s Abhorrent Practice of Slavery

The 1796 patent signed by George Washington for “new machinery called the Cotton Gin” realized $179,250 at a May 2011 Heritage auction.

By Jim O’Neal

In 1776, Scottish economist, philosopher and teacher Adam Smith wrote The Wealth of Nations, a book that helped create a new understanding of modern economics. A pervasive theme was the idea that an economic system could be automatic and self-regulating if it was not burdened by monopolies or artificial trade barriers, a notion that has become widely known as “the invisible hand.” It heavily influenced my favorite economist, Milton Friedman, and the basic philosophy of his Free to Choose.

One highly topical insight was that slavery was not economically viable and contributed to inefficient markets. Aside from the obvious moral issue, Smith believed slave owners would benefit by switching to a wage-labor model, since it was much cheaper to hire workers than to own them and provide decent conditions. Owning slaves was much more costly due to the ongoing expenses of feeding, housing and caring for workers with a high mortality rate – workers who eventually would have to be replaced.

In the United States, there was also a major disconnect between the concept of all men being created equal and the cruel practice of slavery, which was especially prevalent in the agrarian states of the South. Although many sincerely believed that slavery would gradually die out, powerful Southern states needed some kind of assurance before they would agree to the new federal Constitution. Article 1, Section 9 of the Constitution barred any attempt to outlaw the slave trade before 1808. Other provisions prohibited states from freeing slaves who fled from other states, and further required them to return “chattel property” (slaves) to their owners. Kicking the issue down the road 20 years enabled the delegates to reach a consensus.

Historian James Oliver Horton wrote about the power slaveholder politicians had over Congress and the influence commodity crops had on the politics and economy of the entire country. A remarkable statistic: in the 72 years between the elections of George Washington (1788) and Abraham Lincoln (1860), the president of the United States was a slaveholder for 50 of those years, as was every two-term president.

The passage in 1807 of the Act Prohibiting Importation of Slaves in the United States, and of the Slave Trade Act in Great Britain, marked a radical shift in Western thinking. Even as late as the 1780s, the trade in slaves was still regarded as a natural economic activity. Both the American South and European colonies in the Caribbean depended on slave labor, which was relatively easily obtained in West Africa.

However, it was really the invention of the cotton gin by Eli Whitney in 1793 that dramatically extended the abhorrent practice of slavery. Cotton was suddenly transformed from a labor-intensive, low-margin commodity with limited demand into a highly lucrative crop. Production in Southern states exploded as demand skyrocketed. The number of slaves grew concurrently, from 700,000 in 1790 to 3.2 million by 1850. The United States quickly grew into the largest supplier in the world and snagged 80 percent of the market in Great Britain, whose appetite seemed insatiable.

As an economist, Adam Smith was undoubtedly right about hiring workers versus owning them, but everybody was too busy getting rich to worry about optimizing labor costs. And the more abolitionists in the industrializing North denounced slavery, the more determined Southern states became to retain it. It would take a bloody four-year Civil War and 630,000 casualties to settle the issue.

Harry Truman once explained why he preferred one-armed economists: It was because they couldn’t say “On the other hand…”

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Roosevelt’s Courage, Determination Made Him a Remarkable Man

A President Theodore Roosevelt “Equality” pin, produced after Booker T. Washington visited the White House in 1901, sold for $8,962 at a November 2010 Heritage auction.

By Jim O’Neal

President Theodore Roosevelt Jr. was born on Oct. 27, 1858. His mother, Martha Bulloch “Mittie” Roosevelt, was a Southern belle and socialite whose family were wealthy Southern planters, part of the Georgia elite. In 1850, they held more than 30 slaves, most of whom worked in the cotton fields. Many believe that the character Scarlett O’Hara in Gone With the Wind was at least partially based on Mittie.

The Roosevelt family moved north to New York; however, Mittie remained fiercely loyal to the South, and when the Civil War finally started, it caused a schism in the family. Mittie and her sister Anna, unbeknownst to Theodore Sr. or the neighbors, spent many afternoons putting together relief packets for relatives and friends in the South. They were shipped to the Bahamas and then carried by blockade-runner to Georgia.

Exactly 22 years later, in 1880, Teddy Roosevelt celebrated his birthday by marrying 19-year-old Alice Hathaway Lee, a cousin of a Harvard classmate. After spending a few weeks at the Roosevelt home in Oyster Bay, they moved to New York City along with Theodore’s now-widowed mother, Mittie. When Alice discovered in July 1883 that she was pregnant, T.R. was predictably thrilled, as he fully endorsed the traditional American ideal of large families. His life seemed complete, since his political career was going so well as a member of the state legislature in Albany.

However, he soon grew worried when Alice fell ill as her due date approached. The nature of her illness was hard to pinpoint, but the family doctor didn’t seem too concerned. Alice was well enough to worry more about Theodore’s mother than about herself. Mittie had contracted something virulent and was not improving. Her high fever raised the possibility of typhoid, which at the time was not treatable.

At 8:30 on the evening of Tuesday, Feb. 12, Alice gave birth to a healthy 8-pound girl. The good news was telegraphed to T.R. in Albany, who passed out cigars and proceeded to clean up some details before heading home. Then a second telegram arrived: Alice had taken a turn for the worse. T.R. dropped everything and rushed back to Manhattan on the next train. Arriving home, he was dismayed to find Mittie burning up with typhoid fever and Alice battling what was vaguely described as Bright’s disease (a potentially fatal kidney condition). A beleaguered Roosevelt spent the next 16 hours moving from one bedside to the other.

Mittie went first in the darkest predawn hours of Thursday, Feb. 14, and Alice breathed her last 11 hours later in the early afternoon on the same day. Stunned and disoriented, Roosevelt managed to inscribe a thick black X in his diary for Feb. 14, followed by a single sentence: “The light has gone out of my life.”

It is a testament to his courage and fierce determination that he was able to regroup after such tragedy, losing his wife and mother on the same day and in the same house. He was somehow able to resume his life, with his most important contributions yet to come.

Simply a truly remarkable man.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Harvard-Educated Adams Cracked Down on Non-Citizens, Free Speech

An 1805-dated oil on canvas portrait of John Adams, attributed to William Dunlap, sold for $35,000 at a May 2017 Heritage auction.

By Jim O’Neal

When Barack Obama was sworn in on Jan. 20, 2009, he became the eighth president to have graduated from Harvard, which has educated more U.S. presidents than any other university. Yale is second with five, with George W. Bush counting for both Yale and Harvard (where he earned an MBA).

The first of the “Harvard Presidents” goes all the way back to 1796, when John Adams narrowly defeated Thomas Jefferson 71 to 68 in the electoral vote count. It was the only election in history in which a president and a vice president were elected from opposing parties.

However, Jefferson bounced back four years later in a bitter campaign characterized by malicious personal attacks. Alexander Hamilton played a pivotal role in sabotaging President Adams’ attempt to win a second term by publishing a pamphlet that charged Adams was “emotionally unstable, given to impulsive decisions, unable to co-exist with his closest advisers, and was generally unfit to be president.”

When all the votes were counted in 1800, Adams actually ended up third behind both Jefferson and Aaron Burr (who eventually became vice president). John and Abigail Adams took the loss hard, and it estranged them from Jefferson for more than 20 years. Adams departed the White House before dawn on Inauguration Day, skipped the entire inauguration ceremony and headed home to Massachusetts. The two men ultimately reconciled near the end of their lives (both died on July 4, 1826).

Adams came to the presidency as an experienced executive-branch politician, having served eight years as George Washington’s vice president. However, his four years as president were controversial. The trouble started when the Federalist-dominated Congress passed four bills, collectively called the Alien and Sedition Acts, which President Adams signed into law in 1798. The Naturalization Act made it harder for immigrants to become citizens; the Alien Friends Act allowed the president to imprison and deport non-citizens deemed dangerous, while the Alien Enemies Act targeted those from hostile nations. And finally, the Sedition Act made it a crime to make false statements critical of the federal government.

Collectively, these bills invested President Adams with sweeping authority to deport resident non-citizens he considered dangerous; they criminalized free speech, forbidding anyone to “write, print, utter or publish … any false, scandalous and malicious writing or writings against the government of the United States … or either House of Congress of the United States … with intent to defame … or bring them into contempt or disrepute … or to excite against them or either of them … the hatred of the good people of the United States.”

Editors were arrested and tried for publishing pieces the Adams administration deemed seditious. Editors were not the only targets. Matthew Lyon, a Vermont Congressman, was charged with sedition for a letter he wrote to the Vermont Journal denouncing Adams’ power grab. After he was indicted, tried and convicted, Lyon was sentenced to four months in prison and fined $1,000.

For Vice President Jefferson, the Alien and Sedition Acts were a cause of despair and wonderment: “What person, who remembers the times we have seen, could believe that within such a short time, not only the spirit of liberty, but the common principles of passive obedience would be trampled on and violated?” He suspected that Adams was conspiring to re-establish a monarchy.

It would not be the last time Americans would sacrifice civil liberties for the sake of national security. More on this later.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].