Who will continue the story of the greatest country in history?

The myth of George Washington chopping down a cherry tree appears on this tin mechanical badge, which dates to 1889.

By Jim O’Neal

It is always a pleasant surprise to discover an obscure name that has been lost in the sands of America’s history. Charles Thomson (1729-1824) falls into that category. He served continuously as Secretary of the Continental Congress from 1774 until 1789, when the U.S. Constitution was firmly in place and the bicameral government was functioning.

He was also the only non-delegate to actually sign the Declaration of Independence. A surprisingly active participant in the Revolutionary War, he wrote a 1,000-page manuscript on the politics of the times. It included the formation of the Continental Congress and a daily recap of speeches and debates right up to the agreement to go forward. It was a remarkable contemporary record, unique except for the notes James Madison compiled. The public was not even allowed to attend the meetings in Philadelphia.

Alas, Thomson’s manuscript was never published; he destroyed it. Purportedly, he wanted to preserve the reputations of these heroes and assumed that others would write about these historic times. The obvious implication is that his candid verbatim notes might have tarnished some of his colleagues. Thomson receives credit for helping design the Great Seal of the United States, and since he personally chose what to include in the official journal of the Continental Congress, we’re left to wonder what he omitted or later burned … another reminder of why history is rarely a precisely accurate record.

Charles Thomson

As Secretary of the Congress, Thomson personally rode to Mount Vernon, Va., and delivered the news to George Washington that he had been elected president of the United States. He told Washington that Congress was delighted he’d agreed “to sacrifice domestic ease and private enjoyments to preserve the happiness of your country.” Washington, in turn, said he couldn’t promise to be a great president, but could promise only “that which can be done by honest zeal.”

Political pundits opine that the office of the president was perfectly suited for George Washington, especially during the early formative years of the nation. A well-known hero in the fight for independence, he was a national leader who gained power without compromising himself or his principles. Absent the burden of a political party, he could have easily assumed the kind of monarchical power the nation had fought against. But, like his hero Cincinnatus, he had laid down his sword and returned to the plow. Clearly, this was a case of the office seeking the man as opposed to the reverse.

Washington truly did not aspire to the presidency – perhaps unique compared to all the men who followed. In his own words, he considered those eight years a personal sacrifice. In that era, land was the ultimate symbol of wealth and prestige. Through inheritance, he had acquired Mount Vernon and roughly 2,000 acres. That did not come close to satisfying his ambitions, and he spent much of his private life in search of more … much more!

Historian John Clark called him an “inveterate land-grabber” and there’s plenty of evidence to support the claim. In 1767, he grabbed land set aside for Indians by the Crown by telling the surveyor to keep it a secret. This was followed by another 20,000 acres designated for soldiers in the French and Indian War. Washington arranged for officers to participate and then bought the land after telling the soldiers it was hilly, scrubby acreage. Washington would later boast that he had received “the cream of the country.”

Most biographies have been consistent in pointing out that land may have been a prime factor in his decision to court the widow Martha Dandridge Custis. They invariably point to his strong affection for Sally Fairfax, but she was his best friend’s wife. Martha was not without attractions. As one of the richest widows in North America, her marriage to George resulted in a windfall, since what was hers became his. In addition to nearly 100 slaves, her 6,000 acres made George a very rich man. Details of their relationship are not available since Martha burned their love letters after his death.

However, since slaves over 12 were taxed, there are public records. During the first year of their marriage (he was 26 and she was 28), he acquired 13 slaves, then another 42 between 1761 and 1773. From tax records, we know he personally owned 56 slaves in 1761 … 62 in 1762 … 78 in 1765 … and 87 in the 1770s. Washington, Jefferson, Madison and most Virginia planters openly acknowledged the immorality of slavery, while confessing an inability to abolish it without financial ruin.

Washington had a reputation for tirelessly providing medical treatment for his slaves. But was it out of regard for his property or for more humane considerations? I suspect the answer lies somewhere in between.

For the first president, the paramount issue – among the many priorities of his first term – was resolving the new government’s crushing debt. In 1790, the debt was estimated at $42 million. It was owed to common citizens of modest means and to thousands of Revolutionary War veterans whose IOUs had never been redeemed as stipulated by the Articles of Confederation. The war pension certificates they held had declined dramatically, to 15 to 20 percent of face value.

Raising taxes was too risky – states might rebel. Ignoring the debt, as had been the custom for several years, would further damage the federal government’s already weak reputation. The new president had to turn to his Cabinet for advice. He had an excellent eye for talent, and the brilliant Alexander Hamilton was his Treasury Secretary. Hamilton quickly formed a plan to create a new Bank of the United States (BUS). Since the bank would be backed by the federal government, people would feel safer about lending money and, as creditors, they would have a stake in both the bank and the government. Although Thomas Jefferson opposed the BUS, Washington prevailed in Congress.

Washington was re-elected four years later, again with a unanimous vote in the Electoral College. The popular vote was not recorded until 1824, and since that time, five presidential candidates have been elected despite losing the popular vote: John Quincy Adams (1824), Rutherford Hayes (1876), Benjamin Harrison (1888), George W. Bush (2000), and Donald J. Trump (2016).

It’s not easy starting a new country. There were no cherry trees to chop down as Parson Weems’ story describes. George Washington did not throw a silver dollar across the Rappahannock River. These are all fairy tales that grew over time. Yes, George Washington owned slaves and told a lie now and then. He was obsessed with land at one time. But, when it came to crunch time, he stepped up and committed eight years of his life to his country.

The big question now seems to be where we’ll find another man or woman to continue the story of the greatest country in history?

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Latest volume on political career of Johnson can’t come soon enough

A photo of Lyndon B. Johnson being sworn in as president, inscribed and signed by Johnson, sold for $21,250 at an August 2018 Heritage auction.

By Jim O’Neal

Like other reverential fans of author Robert Caro’s multi-volume biography of Lyndon Baines Johnson, I’m still waiting patiently for him to finish volume five. It will cover the entire span of LBJ’s presidency, with a special focus on the Vietnam War, the Great Society and the Civil Rights era. Caro’s earlier biography of Robert Moses, The Power Broker (1974), won a well-deserved Pulitzer Prize.

In 2011, Caro estimated that his final volume on LBJ (his original trilogy had expanded to five volumes) would require “another two to three years to write.” In May 2017, he confirmed he had 400 typed pages completed and intended to actually move to Vietnam. In December 2018, it was reported Caro “is still several years from finishing.”

Since Caro (b. 1935) is two years older than me, there may exist a certain anxiety that time may expire unexpectedly. However, it will still be worth the wait and I shall consume it like a fine three-star Michelin dinner in Paris. Despite all that’s been written about this period, Caro is certain to surprise with new facts and his unique, incomparable perspective.

Recall that by the autumn of 1963, planning for the 1964 presidential election campaign was well under way. The razor-thin victory of JFK over Richard Nixon in 1960 (112,000 votes or 0.12 percent) had largely been due to VP Johnson’s personal efforts to deliver Texas to the Democrats.

Others are quick to remind us that allegations of fraud in Texas and Illinois were obvious and that Nixon could have won if he had simply demanded a recount. New York Herald Tribune writer Earl Mazo had launched a series of articles about voter fraud. However, Nixon persuaded him to call off the investigation, telling him, “Earl, no one steals the presidency of the United States!” He went on to explain how disruptive a recount would be. It would damage the United States’ reputation in foreign countries, which looked to us as the paragon of virtue in transferring power.

Forty years later, in Bush v. Gore, we would witness a genuine recount in Florida, with teams of lawyers, “hanging chads” and weeks of public scrutiny until the Supreme Court ordered Florida to stop the recount immediately. Yet today, many people think George W. Bush stole the 2000 presidential election. I’ve always suspected that much of today’s extreme partisan politics is due in part to the rancor that resulted. His other sins aside, Nixon deserves credit for avoiding this, especially given the turmoil that was just around the corner in the tumultuous 1960s.

Back in 1963, Johnson’s popularity – especially in Texas – had declined to the point JFK was worried it would affect the election. Kennedy’s close advisers were convinced a trip West was critical, with special attention to all the major cities in Texas. Jackie would attend since she helped ensure big crowds. Others, like U.N. Ambassador Adlai Stevenson and Bobby Kennedy, strongly disagreed. They worried about his personal safety. LBJ was also opposed to the trip, but for a different reason. Liberal Senator Ralph Yarborough was locked in a bitter intraparty fight with Governor John Connally; the VP was concerned it would make the president look bad if they both vied for his support.

We all know how this tragically ended at Parkland Hospital in Dallas on Nov. 22, 1963. BTW, Caro has always maintained that he’s never seen a scintilla of evidence that anyone other than Lee Harvey Oswald was involved … period. Conspiracy theorists still suspect the mob, Fidel Castro, Russia, the CIA or even the vice president. After 56 years, still not even a whiff of evidence.

Lyndon Baines Johnson was sworn in as president in Dallas aboard Air Force One by Judge Sarah T. Hughes (who remains the only woman in U.S. history to have sworn in a president). LBJ was the third president to take the oath of office in the state where he was born. The others were Teddy Roosevelt in Buffalo, N.Y., following the McKinley assassination (1901) and Calvin Coolidge (1923) after Harding died. Coolidge’s first oath was administered by his father in their Vermont home. Ten years later, it was revealed that he’d taken a second oath in Washington, D.C., to avert any questions about his father’s authority as a Justice of the Peace to swear in a federal-level officer.

On her last night in the lonely White House, Jackie stayed up until dawn writing notes to every single member of the domestic staff, and then she slipped out. When the new First Lady walked in, she found a little bouquet and a note from Jackie: “I wish you a happy arrival in your new home, Lady Bird,” adding a last phrase, “Remember – you will be happy here.”

It was clear that the new president was happy! Just days before, he had been a powerless vice president who hated Bobby Kennedy and the rest of the Kennedy staff. They had mocked him as “Rufus Corn Pone” or “Uncle Corn Pone and his little pork chop.” Now in the Oval Office, magically, he was transformed back into the old LBJ, who was truly “Master of the Senate.” Lady Bird described him as having a “bronze image,” revitalized and determined to pass the Civil Rights legislation that had been clogged in the Senate under Kennedy. Historians are now busy reassessing this period of his presidency on its own terms, instead of through the prism of the Vietnam quagmire.

LBJ would go on to vanquish Barry Goldwater, the conservative Republican nominee, in 1964 with 61.1 percent of the popular vote, the largest share since the almost uncontested race of 1820, when James Monroe won handily in the “Era of Good Feelings.” It was the first time in history that Vermont voted Democratic and the first time Georgia voted for a Republican. After declining to run in 1968, LBJ died five years later of a heart attack. Jackie Kennedy Onassis died on May 19, 1994, and the last vestiges of Camelot wafted away…

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Clock ticking for one of America’s most influential retailers

A photograph signed by Richard W. Sears sold for $1,792 at an October 2009 Heritage auction.

By Jim O’Neal

This is a highly condensed story of an American retailing giant that today seems relevant only as another casualty of internet e-tailing. In cultural terms, it is generally portrayed as just another backwater wasteland. But this situation seems different, because Sears was for so long a central player in the story of American life.

For a long time, the retailer’s products, publications and people influenced commerce, culture and politics. And then, slowly, it was pulled into the relentless gravity of business bankruptcy that takes hold when corporate balance sheets weaken and finally fail. I selected it – rather than, say, Montgomery Ward, Atlantic & Pacific, or J.C. Penney – because of its long history and because its demise was like losing a century of exciting surprises to an indifferent bankruptcy judge who yawned and gaveled it into the deep, dark cemetery of obscurity.

Sears traces its roots to 1886, when a man named Richard W. Sears began selling watches to supplement his income as a railroad station agent in North Redwood, Minn. The next year, he moved the business to Chicago and hired Alvah C. Roebuck, a watchmaker, through a classified ad. Together, they sold watches and jewelry. The name was changed in 1893 to Sears, Roebuck & Company, and by 1894 sewing machines and baby carriages had been added to its flourishing mail-order business. Its famous catalogs soon followed.

Sears, Roebuck helped bring consumer culture to middle America. Think of the isolation of living in a small town 120 years ago. Before the days of cars, people had to ride several days in a horse and buggy to get to the nearest railroad station. What Sears did was make big-city merchandise available to people in small towns, desperate for news and yearning for new things. It made hard work worthwhile knowing that there was a surprise just over the horizon.

The business was transformed when Richard Sears harnessed two great networks – the railroads, which now blanketed the entire United States, and the mighty U.S. Postal Service. When the Postal Service commenced rural free delivery (RFD) in 1896, every homestead in America came within reach.

And Richard Sears reached them!

He used his genius for promotion and advertising to put his catalogs in the hands of 20 million Americans, at a time when the population was 76 million. Sears catalogs could be a staggering 1,500 pages with more than 100,000 items. When pants supplier and manufacturing wizard Julius Rosenwald became his partner, Sears became a virtual, vertically integrated manufacturer. Whether you needed a cream separator or a catcher’s mitt, or a plow or a dress, Sears had it.

The orders poured in from everywhere, as many as 105,000 a day at one point. The company had so much leverage that it could nearly dictate its own terms to manufacturers. Suppliers could flourish if their products were selected to be promoted. Competition was fierce and the Darwinian effect was in full play. Business boomed as the tech-savvy company built factories and warehouses that became magnets for suppliers and rivals as well. Civic officials complained that it was harming nearby small-town retailers (sound familiar?).

There was a time when you could find anything you wanted in a Sears catalog, including a house for your vacant lot. Between 1906 and 1940, Sears sold 75,000 build-from-a-kit houses, some undoubtedly still standing. The Sears catalog was second only to the Holy Bible in terms of importance in many homes.

In 1913, the company launched its Kenmore brand, first appearing on a sewing machine. Then came washing machines, dryers, dishwashers and refrigerators. As recently as 2002, Sears sold four out of every 10 major appliances, an astounding 40 percent share in one of the most competitive categories in retailing.

In 1925, the company opened its first bricks-and-mortar retail store, in Chicago. This grew to 300 stores by 1941 and more than 700 in the 1950s. When post-war prosperity led to growth in suburbia, Sears was perfectly positioned to cash in on another major development: the shopping mall. A Sears store was an ideal fit for a large, corner anchor store with plenty of parking. Sears revenue topped $1 billion for the first time in 1945, and 20 years later it was the world’s largest retailer and, supposedly, unassailable.

Oops.

By 1991, Walmart had zipped by them … never bothering to pause and celebrate. For generations, Sears was an innovator in every area, including home delivery, product testing and employee profit-sharing, with 350,000 dedicated employees and 4,000 outlets. What went wrong?

The answer is many things, but among the most significant was diverting their considerable retail cash flow in an effort to diversify. Between 1981 and 1985, they went on a spending spree, first acquiring Dean Witter Reynolds, the fifth-largest stock brokerage, and then real estate company Coldwell Banker. They ended up selling the real estate empire and then spun off Dean Witter in a desperate effort to return to their retailing roots. This was after someone decided to build a 110-story, 1,450-foot skyscraper with 3 million square feet (the tallest building in the world at the time) to centralize all their Chicago people and then lease whatever was left over. You have to wonder what all these people were doing. (It wasn’t selling perfume or filling catalog orders!) The Sears Tower is now called Willis Tower (don’t ask).

They stopped the catalogs in 1993. One has to speculate what would have happened had they simply put their entire cornucopia of goodies online. I know timing is everything, but in 1995, on April 3, a scientist named John Wainwright bought a book titled Fluid Concepts and Creative Analogies: Computer Models of the Fundamental Mechanisms of Thought. He purchased it online from an obscure company called Amazon.

I will miss Sears when they gurgle for the last time. I cherished those catalogs when we lived in Independence, Calif. (the place Los Angeles stole water from via a 253-mile aqueduct). My Boy Scout buddies and I all made wish lists, while occasionally sneaking a peek at the lingerie section.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Are we ready to continue building this great nation?

Two weeks before the Treaty of Paris ended the Spanish-American War, Princeton Professor Woodrow Wilson in this letter to an anti-imperialist says it’s too late to protest and that the focus should be on the “momentous responsibilities” facing the nation.

By Jim O’Neal

Many historians believe that the European exploration of the Western Hemisphere (1500 to 1800) was one of the most transformative eras in the history of civilization. The great Scottish philosopher Adam Smith (1723-1790) took it a step further and labeled it “one of the two greatest and most important events recorded in the history of mankind.” Much of the modern world is a direct result of those three centuries of colonization and transference of culture. In the end, the world seemed inexorably on the way to what we now call “globalization.”

That seems overly dramatic to me (and it omits great chunks of other transformative periods), but it is unambiguously clear that – despite the broad participation of other important nations – the people of England and Spain had the most influence on the vast territories of the New World. However, these two genuinely great empires ultimately evolved into dramatically different societies. Also in the crystal-clear category is that, in the end, they both managed to dissipate the powerful advantages they had created.

A quick snapshot of the world today confirms this devolution. The once-mighty Spanish Empire is reduced to a relatively small, unimportant European nation (with a shaky economy, a disturbing brain drain and regional unrest). The other, even more influential empire is now back to being a small island, racked with political dissent over its retreat from the European Union (Brexit) and a dangerously unstable government.

In their place is the most powerful, democratic, innovative nation in the history of the world. But even the remarkable United States has developed troubling signs that pose a real threat to continued prosperity. If we don’t find a way to resolve the issues that divide us (basically almost every single issue of importance) and close the inequality gap, our future will inevitably end up like those that went before. An economic boomlet has masked deep, difficult issues that politicians are blithely hoping will somehow be solved by some unknown means. We lack leadership at a time when Waiting for “Superman” is not a prudent strategy.

Some believe we are in a steady decline and that China will surpass America in many important areas this century. However, that is pessimistic conjecture. It’s more useful to re-examine the factors that propelled us to a pinnacle of unprecedented prosperity. I find it more interesting to visit the past rather than speculate on a future with so many possible outcomes (e.g. extinction via asteroid collisions, interstellar travel or a billion robots with superior intellect). It is an unknowable with questionable benefits.

One simplistic way is to skip our story of independence from England and correlate the decline of the Spanish Empire with our annexation of the Spanish-speaking borderlands. It broadly occurred in three phases, starting with the annexation of Florida and the Southeast by 1820. This was followed by California, Texas and the greater Southwest by 1855. Mexico lost 50 percent of its land and up to 80 percent of its mineral wealth. The final phase occurred with the Spanish-American War of 1898, which added Central America and the Caribbean to complete the New American Empire.

Virtually every American president was complicit in varying degrees, bookended by Thomas Jefferson and Teddy Roosevelt, who wrote as if this was preordained by a benevolent entity. With immigrants flowing into the East, the promise of free land and the lure of gold in California, the land between the oceans became steadily populated and blended. The short war with Spain was merely the capstone for a century of annexation, population growth and a perfect balance of territory, people and economic development. The motivation was clear (“sea to sea”) and the manipulation perfectly illustrated by this anecdote:

Publisher William Randolph Hearst (eager to have a war to sell more newspapers) hired Frederic Remington to illustrate the revolution erupting in Cuba. In January 1897, Remington wrote to Hearst, “Everything is quiet. There is no trouble. There will be no war. I wish to come home.” Hearst quickly responded, “Please remain. You furnish the pictures and I WILL FURNISH THE WAR.”

A year later, the Treaty of Paris was signed and Spain relinquished all claims of sovereignty and title to Cuba (long coveted by the U.S. for its sugar and labor), then ceded Puerto Rico and Guam to America. The Philippines was (much) more complicated. The islands had been under Spanish rule for more than three centuries and had been waging a war for independence since 1896. The U.S. Navy prevailed and Spain sold the Philippines to the U.S. for $20 million. However, Filipino nationalists had no interest in trading one colonial master for another. They declared war on the United States. Finally, in 1946, the U.S. recognized the Philippines’ independence.

And that, dear friends, is how you build (and lose) an empire.

In a different time, we would simply annex the rest of Mexico, eliminate the border with Canada and create a North American juggernaut to counter China and end squabbling over a wall. We could help Mexico (now perhaps a few U.S. states), eliminate drug cartels, develop the entire Baja California coastline to match Malibu and take advantage of the outstanding Mexican labor force to rebuild infrastructure. All the wasted money on border security (DHS, ICE, asylum, deportations, etc.) would be spent rebuilding old stuff.

But, I will need your vote for 2020! (I feel certain Adam Smith would agree.)

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Does America still have ‘the right stuff’ to continue this remarkable story?

This 1903 Louisiana Purchase Gold Dollar, Jefferson Design, minted to commemorate the centennial of the Louisiana Purchase, realized $37,600 at an April 2017 Heritage auction.

By Jim O’Neal

The 19th century in the United States was by any standard an unusually remarkable period. In 1800, John Adams was still president, but had lost his bid for re-election to Vice President Thomas Jefferson, the man behind the words of our precious Declaration of Independence. Alexander Hamilton had used his personal New York influence to break a tie with Aaron Burr, since Jefferson was considered the less disliked of the two political enemies. (Burr would kill Hamilton in a duel in 1804 by cleverly escalating a disagreement into a matter of honor.)

There were 16 states in 1800 (Ohio would join the Union as No. 17) and the nation’s population had grown to 5.3 million. Within weeks of becoming president, Jefferson learned that Spain had retroceded a large portion of its North American territory to France. Napoleon now owned 530 million acres, more than what the United States controlled. Fearful that losing control of New Orleans to our new French neighbor would lead to losing control of the strategically important Mississippi River, Jefferson developed a plan without including Congress.

He dispatched Robert Livingston and James Monroe to buy greater New Orleans for $10 million. They were pleasantly surprised when the French offered to sell 100 percent of their North American territories for $15 million cash … less several million in pending claims. Concerned that the French would change the offer before formal approval could be obtained, the envoys reached an agreement in principle (later formally approved by Congress after James Madison’s assurance of its constitutionality).

What a prize! 828,000 square miles for 3 cents an acre, virtually doubling the size of the United States and gaining control of the mighty Mississippi and shipping into the Gulf of Mexico. With this uncertainty removed, cotton production now expanded rapidly south and soon represented over 50 percent of total exports. With the aid of the cotton gin and slave labor, the United States now controlled 70 percent of the world’s production. Ominously, seeds of a great civil war were planted with each cotton plant.

For millions of people overseas, conquest or riches were not the primary ambition; escaping the clutches of famine trumped all other hardships of life. The year 1842 was the first in America’s history in which more than 100,000 immigrants arrived. Five years later, the number from Ireland alone exceeded that, as the Irish came to America to escape the scourge of the Great Famine. In the 1840s and ’50s, 20 percent of the entire population of Ireland crossed the Atlantic in search of a better life. In sharp contrast to the Pilgrims on the Mayflower – who were on a financial venture supplied with rations – the Catholic Irish left in rags to escape starvation, arriving in a mainly Protestant nation.

Concurrently, another wave from the European mainland was fleeing revolution and counterrevolution. In Germany, half-a-million left in a three-year period (1852-55) as a spirit of revolt captured the European continent. “We are sleeping on a volcano,” warned Alexis de Tocqueville. Meanwhile, two German thinkers (Karl Marx and Friedrich Engels) penned their intellectual nonsense, The Communist Manifesto, from the safety and luxury of London.

In the United States, just before the impending boomlet of immigration in 1846, total railroad mileage was a meager 5,000 miles. Ten years later, it had quintupled to 25,000 as the influx of labor to lay iron rails proved a perfect match for $400 million in capital. As famine and revolution ravaged Europe, its transplants were busy transforming their new homeland. The progression from coal to steam to steamboats scampering along newly dug connecting canals would inspire new communications like the telegraph and the Pony Express. While the country had been busy absorbing the wave of immigrants, it had also been in the throes of a decades-long internal migration west.

Thomas Jefferson had predicted it would be 1,000 years before the frontier reached the Pacific Ocean. Barely two decades after his death in 1826, gold was discovered in California and the fever to get rich started a westward movement that drew fortune-seekers from around the globe. Once under way, the richness of the soil and massive new resources of rivers, forests, fish and bison would expand the migration to include farmers and their families. Horace Greeley shouted, “Go West, young man,” and they did.

With room to grow and prosper, the population would expand nearly fifteenfold to 76 million by 1900. Americans then resided in 45 states, Utah having joined the Union as the 45th in 1896. Fulfilling the vision of Manifest Destiny (from sea to sea), the rural share of the population fell from 95 percent as urbanization grew to 40 percent, with industrialization and immigrant workers staffing the factories and cities. A short war with Mexico added California, Arizona and New Mexico, and President Polk’s annexation of Texas in 1845 filled in the contiguous states.

However, it was the railroads that created the permanence. With 30,000 miles of track in 1860, America had already surpassed every other nation in the world. The continued growth was phenomenal: 1870 (53,000), 1880 (93,000), 1890 (160,000) and by 1900 almost 200,000 … a six-fold increase in a mere 40-year period. Yes, there were problems – Illinois had 11 local time standards and Wisconsin 38 – but this was harmonized by 1883. Most importantly, the railroads connected virtually every city and town in America and employed 1 million people!

Throw in a few extras like electricity, oil wells, steel mills and voila! The greatest nation ever built from scratch. Today, we have 6 percent of the people on 6 percent of the land and 30 percent of the world’s economic activity … and we are celebrating the 50th anniversary of putting a man on the moon.

Do we still have “the right stuff” to continue this remarkable story? I say definitely, if we demand that our leaders remember how we got here.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Moral arguments continue over the use of atomic weapons in WWII

A 1971 photograph of Emperor Hirohito and Empress Nagako, signed, sold for $8,125 at an April 2015 Heritage auction.

By Jim O’Neal

If history is any guide, the month of August will arrive right on schedule. Inevitably, it will be accompanied by yet another birthday (no. 82 if my math is correct) and intellectual debates over the use of atomic bombs dropped on two Japanese cities in August 1945. Despite the passage of 74 years and the fact that it ended World War II, it remains the most controversial decision of a long, bloody war.

As a reminder, President Franklin Roosevelt had died in April 1945, soon after the start of his record fourth term in office. Vice President Harry Truman had taken his place, and the new president attended a conference in defeated Germany to discuss how to persuade or force Japan to surrender. Persuasion was not really an option, since the Empire of Japan was firmly committed to continuing the fight even if it resulted in the annihilation of its people and the total destruction of the country.

They had demonstrated their resolve during the bloody island-by-island fighting that left the Japanese mainland as the final target. Another amphibious landing was ruled out due to the expected enormous loss of life and an oath of 100 million inhabitants to fight until killed. Estimates vary on how many Americans would die … but they were all too high.

One strategy was to simply blockade all their ports and use our overwhelming air superiority to bomb them until they relented. But President Truman had a secret weapon and was fully prepared to use it if Japan resisted.

On July 26, 1945, Truman, British Prime Minister Winston Churchill and President Chiang Kai-shek of the Republic of China signed the Potsdam Declaration that warned the Japanese that if they did not agree to an “unconditional surrender,” they would face “prompt and utter destruction.” In addition, 3 million leaflets were dropped on the mainland to be sure the people were aware of the stakes and perhaps help pressure the leadership.

Afterwards, critics of what became the nuclear option have argued it was inhumane and violated a wartime code of ethics, much as mustard gas did then and chemical weapons do under today’s ban. However, it helps to remember that the principle of sparing non-combatant civilians had long been discarded in the mass bombings of European cities (e.g. the infamous firebombing of Dresden), and then in the even more brutally systematic firebombing of Japanese cities. Destruction became the singular objective, on the logic that ending the war would save more lives than any precision bombing.

Case in point is Air Force General Curtis LeMay, who arrived in Guam in January 1945 to take command of the 21st Bomber Command. His theory of war was eerily similar to General William Tecumseh Sherman’s “March to the Sea” in the Civil War. LeMay explained: “You’ve got to kill people, and when you kill enough, they stop fighting.” Precision bombing had given way to terror attacks that killed civilians indiscriminately.

Importantly, LeMay had just the right equipment to destroy Japan’s highly flammable cities filled with wooden houses. First was a highly lethal weapon called the M-69, a 6-pound bomblet developed by Standard Oil and filled with gelatinized gasoline that, once ignited and stuck to a target, was all but inextinguishable. Second was a fleet of B-29 Superfortresses, ideal for continental bombing, each powered by four 2,200-horsepower engines, with a crew of 11 and a range of 4,000 miles. On March 9, 1945, 344 B-29s began dropping M-69s over Tokyo in a crisscross pattern that merged into a sea of flames. The result was 90,000 dead and another million homeless. The victims died from fire, asphyxiation and buildings falling on them. Some were simply boiled to death in superheated canals or ponds where they sought refuge from the fire.

Over the next four to five months, they attacked 66 of Japan’s largest cities, killing another 800,000 and leaving 8 million homeless.

Despite this demonstration of power, the Japanese formal reply to the Potsdam Declaration included the word “mokusatsu,” which was interpreted as an imperial refusal. It was on this basis that Truman gave the order to proceed with bombing Hiroshima on Aug. 6. He left Potsdam and was at sea when the ship’s radio received a prearranged statement from the White House: “16 hours ago, an American airplane dropped one bomb on Hiroshima … it is an atomic bomb … it is harnessing the basic power of the universe.” Three days later on Aug. 9, a second bomb was dropped on Nagasaki.

Japanese Emperor Hirohito agreed to capitulate, and an imperial rescript announcing the decision to the Japanese people was recorded for radio broadcast. Most Japanese had never heard the emperor’s voice.

As the moral arguments continue about the use of atomic weapons on people in WWII, I find it to be a distinction without a difference … at least compared to having one of LeMay’s little M-69s stuck on my back.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Zimmermann Telegram the proverbial straw that broke America’s isolationism

A vintage postcard signed by U.S. General John J. Pershing (right), and also showing British Field Marshal Douglas Haig and French General Ferdinand Jean Marie Foch, went to auction in October 2006.

By Jim O’Neal

On Jan. 31, 1917, the German Secretary of State for the Imperial Navy addressed the nation’s parliament. “They will not come because our submarines will sink them.” He went on to state, categorically, “Thus, America, from a military point of view, means nothing … nothing!”

Strictly from an Army standpoint, Eduard von Capelle may have had a point. The U.S. Army had gradually declined in size to 107,641 – ranked No. 17 in the world. Additionally, the Army had not been involved in large-scale operations since the Civil War ended in 1865, over 50 years earlier.

The National Guard was marginally larger, with a total of 132,000. However, these were only part-time militia spread among the 48 states and they, quite naturally, varied considerably in readiness. Equipment was another issue, since they were armed with nothing heavier than machine guns. These shortcomings were rectified in 1917-18, when 20 million men were registered for military service.

Unknown to all but a few, British code-breakers had intercepted a diplomatic message two weeks earlier, on Jan. 16, sent by German Foreign Secretary Arthur Zimmermann. The “Zimmermann Telegram,” as it is known, was intended for Heinrich von Eckardt, the German ambassador to Mexico.

The missive gave the ambassador a set of highly confidential instructions to propose a Mexican-German alliance should the United States enter the war against Germany. Von Eckardt was to offer the president of Mexico generous military and financial support if Mexico were to form an alliance with Germany. In exchange, Mexico would be free to annex the “lost territory in Texas, New Mexico and Arizona.” In addition to distracting the United States, Mexico could assist in persuading Japan to join in.

At the start of World War I, Germany’s telegraph cables passing through the English Channel had been cut by a British ship. This forced the Germans to send messages via neutral countries. They had also convinced President Woodrow Wilson that keeping channels of communication open would help shorten the war, so the United States agreed to pass on German diplomatic messages from Berlin to the German Embassy in Washington, D.C.

The United States was still firmly committed to remaining neutral and not becoming entangled in foreign wars that did not pose a direct threat. Wilson had been re-elected in 1916 with the slogan “He kept us out of war!” But that did not prevent many individual citizens from joining the fight, and many were already serving in a variety of ways. Some had joined the British Army directly and others joined Canadian units already in Europe.

There were also groups in the French Foreign Legion and a special squadron in the French Air Force, the Lafayette Escadrille, formed in honor of the Marquis de Lafayette, a friend from our own war for freedom. Lafayette had fought in the American Revolution as a major general under George Washington. He was even present at Yorktown, Va., when British Army General Charles Cornwallis surrendered, effectively bringing an end to armed hostilities. When Lafayette died in Paris in 1834, President Andrew Jackson had both Houses of Congress draped in black for 30 days, and individual members of Congress wore mourning badges. We may well have lost the war with Britain absent the help of the French.

Back on the morning of Jan. 17, 1917, one of the British codebreakers (Nigel de Grey) entered Room 40 of the British Admiralty and asked his boss a question: “Do you want to bring America into the war? I’ve got something that might do the trick!” It was a decoded copy of the Zimmermann Telegram.

Room 40 was the home of the British cryptographic center, and its staff were acutely aware of the implications of disclosing their clandestine activities. They developed an elaborate plan to get a copy to President Wilson without revealing that they had been monitoring all transatlantic cables, including America’s (a practice that would continue for another 25 years). Wilson received a copy on Feb. 25 and by March 1, it was splashed on the front pages of newspapers nationwide.

Diplomatic relations with Germany had already been severed in early February, when Germany resumed unrestricted submarine warfare against shipping in the Atlantic. The Zimmermann Telegram became the proverbial straw that broke America’s isolationism, and on April 2, Wilson asked Congress to officially declare war, which it did four days later.

Remarkably, by June 17, the American Expeditionary Force had landed in France. General John J. Pershing and his troops soon paraded through Paris. By 1918, it was almost as though Von Capelle’s confident “They will not come” had been trumped in six months by America’s melodramatic “Lafayette, we are here!”

Many of the best Room 40 personnel would end up at Bletchley Park to work on cracking the German Enigma machine. Their work is captured brilliantly in the 2014 film The Imitation Game, with Benedict Cumberbatch in the Oscar-nominated role of English mathematics genius Alan Turing.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

How to explain why Americans remain so divided?

An oil-on-canvas portrait of George Washington by Philadelphia artist Robert Street (1796-1865) sold for $41,250 at a May 2019 Heritage auction.

By Jim O’Neal

The date was April 30, 1789, and the highly respected commander of the Continental Army during the Revolutionary War was ready to take the oath as the first president of the United States. The election had taken place much earlier, in January, but the electoral votes were not counted until April 6.

General George Washington, who had also served as the presiding officer at the Constitutional Convention of 1787, was unanimously elected to the highest office in the new nation. He had to travel from his home in Virginia to New York City for the formal inauguration.

Robert Livingston, chancellor of New York, administered the presidential oath of office at Federal Hall on Wall Street, across from where the New York Stock Exchange stands today. The site was home to the first Congress, the Supreme Court and the executive offices. President Washington then retired to the Senate Chamber, where he delivered the first inaugural address to a joint session of Congress. Observers commented that he seemed mildly embarrassed and noticed an almost imperceptible tremble, possibly due to the historic significance of the occasion.

Two blocks away, at 75 Wall Street, now stands a 42-story modern structure of marble, glass and steel. This luxurious condominium was converted from office space in 2008. It sits near the water’s edge of the East River, atop the site of an old slave market where people were bought and sold for over 50 years (1711-62).

For Dutch and Spanish slave-traders, who controlled the transatlantic trade at the time, there were far superior markets for the sale of slaves. For one, the sugar plantations in both Spanish America and Portuguese Brazil required hundreds of thousands of slaves.

Given the insatiable European demand for sugar, it made little sense for slave-traders to undertake the additional time to travel up the North American coast to service what was considered a small speculative market. A slave ship could make a round trip between West Africa and Brazil in the same time it would take just to reach Virginia one-way. Compounding the cost were the death rates a longer journey would impose on their human cargo. This was a thriving business and decisions were made based on profitability.

Another factor was that England’s American Colonies were not willing to pay a premium since there was an adequate supply of Europeans willing to serve as indentured servants. As a result, the transatlantic perimeter of the booming slave market essentially ended at the sugar-growing islands of the Caribbean.

In 1787-88, during the drafting and ratification of the proposed U.S. Constitution, the founders recognized that they would have to include language protecting slavery in order to get the necessary state votes for approval. Perhaps it was James Madison – “The Father of the Constitution” – who accomplished this without using either of the words “slaves” or “slavery” (they do not appear until the Civil War-era amendments). Instead, the reference was to a person “held to service” or “bound to service.”

In a compromise, the Constitution barred Congress from prohibiting the importation of slaves before a set date – originally proposed as 1800 and then extended to 1808, a gradual 20-year phase-out of the trade. The founders assumed (wrongly) that slavery would become uneconomic and just naturally die out. In fact, it continued to grow. The 1790 census reported a U.S. population of 4 million, including 700,000 slaves.

Then, in 1793, Eli Whitney invented the cotton gin, which fueled a massive increase in cotton production. Slave plantations were America’s first big business – not the railroads, as some believe. Ten of the first 12 presidents were slaveholders, as were two of the earliest Chief Justices.

The slave trade was one of Great Britain’s most profitable businesses. From 1791 to 1800, British ships made 1,300 trips across the Atlantic carrying 400,000 slaves, and from 1801 to 1807 they carried another 266,000. During the whole of the 18th century, the slave trade accounted for 6 million Africans, with Britain the worst transgressor, responsible for 2.5 million of the total.

In 1807, the U.S. Congress passed an act to “prohibit the transportation of slaves into any port … from any foreign place” (starting in 1808, the earliest date the Constitution allowed). However, it did not ban the trading of slaves within the United States. With the large enslaved population already in the country (plus children born into slavery), the result was a self-sustaining model that did not require the importation that was now abolished.

The nation continued to evolve, with a Southern agrarian society heavily dependent on slave labor while the North pursued industrialization. The constant debates over abolition simply shifted to how new territories and states would enter the Union … as free or slave. The delicate balance was not sustainable, virtually everyone knew it, and no permanent agreement was ever really possible.

We know the implications of the great Civil War that was required to permanently end slavery in the United States, and the difficulties of the post-war Reconstruction era. We can debate the progress made in civil rights during the 20th century. But how do we explain why we are still so divided racially?

I was eager to hear the recent presidential debates, expecting discussions about climate change, health care, immigration, inequality and impeachment. Instead, issues like reparations, asylum, abortion and even forced busing in the 1970s took center stage. Had I dialed 2020 and ended up in 1820 in a Twilight Zone episode?

Maybe Pogo was right after all: “We have met the enemy and he is us.”

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

History littered with leaders who underestimated the power of the people

A Silver Medal for Gallantry, at the Battle of Bunker Hill, from King George III to Captain Peter Ewing of the Royal Marines, dated June 17, 1775, realized $32,000 at a June 2012 Heritage auction.

By Jim O’Neal

It is fascinating to watch the mass demonstrations in Hong Kong and speculate how the central government in Beijing will quell the unrest. It could end up badly if President Xi Jinping decides to ensure there is no uncertainty over the ultimate authority that still rests comfortably on the mainland … and resorts to force.

Hong Kong Island became a colony of the British Empire after the First Opium War (1839-42); the colony expanded to include the Kowloon Peninsula after the Second Opium War in 1860, and Britain finally obtained a 99-year lease on the New Territories in 1898. Since the lease expired in 1997, Hong Kong has exploded into one of the major financial centers of the world. It is home to many of the world’s ultra-rich and has squeezed 7 million to 8 million residents into a city with 317 skyscrapers, the most of any city in the world. It has a first-class, 21st-century economic model that is widely admired.

The protesters’ beef is over governance, since Hong Kong has an autonomous system with executive, legislative and judicial powers devolved from the mainland (“one country, two systems”). They are resisting a proposed extradition law that they fear would dilute the judiciary by sending criminal suspects to the mainland for trial. Small wonder, since the conviction rate there is almost 100 percent. If President Xi does move forward (at least for now), it would be a classic mistake that many others have made. History is littered with leaders who underestimated the power of the people they govern.

Eight hundred years ago, King John of England faced a similar revolt, by the English nobility against his rule. The king met with the barons and affixed the Royal Seal to a peace agreement that became known as the Magna Carta (Great Charter). What the king actually agreed to was quite modest.

He guaranteed to respect feudal rights and privileges, to uphold the freedom of the church and maintain the nation’s laws. However, later generations have come to view it as the cornerstone of a democratic England.

King John had inherited the crown after the death of his brother, Richard the Lionheart, in 1199. By 1215, he was viewed as a failure, having raised taxes on the nobility to compensate for losing Normandy to the French. He also frequently quarreled with Pope Innocent III and even sold church properties to replenish the royal coffers. In response, he was formally excommunicated.

But importantly, by affixing his seal to the Magna Carta, the king implicitly conceded that he was obliged to follow certain laws, undercutting any future claim of absolutism. Several earlier monarchs had talked about the king having some sort of “divine immunity.” Then there was also clause 39 (of 63), which stated that “no freeman shall be arrested or imprisoned … or outlawed or exiled … except by the lawful judgment of his peers.” This clause has been celebrated as the forerunner of our own guarantees of a jury trial and habeas corpus.

In one sense, all of this was actually moot, since another civil war erupted almost immediately and both King John and the barons disregarded their commitments after the pope annulled the Magna Carta. Fortunately, King John died the following year (1216) and his 9-year-old son, Henry III, inherited the crown. Under the auspices of his guardian, William Marshal, the Magna Carta was revived and eventually entered English statute law.

Closer to home, it’s safe to assume that when 22-year-old King George III succeeded to the British Crown in 1760, most American Colonists considered themselves Britons and subjects of the king (but not of Parliament). However, this arrangement began to unravel after the Seven Years’ War of 1756-63, when Great Britain started imposing higher taxes on the Colonies. One glaring example was the large British garrisons established after the war; Colonists were required to pay all the costs of maintaining them.

This was followed by the Sugar Act of 1764 and the pervasive Stamp Act in 1765. Next was the Declaratory Act of 1766, which made it obvious that the Parliament of Great Britain was intent on extending its sovereign power into every nook and cranny of daily colonial life. Loyalty to the king was one thing, but allowing Parliament to impose new taxes at will, without any representation or discussion, was quite another. It proved a bridge too far, but no one in England was sympathetic to the whinges from across the ocean.

Enter a man named Thomas Paine, who believed the arguments over equality, excessive taxes, lack of representation and divided loyalty were beside the point. He helped shift the focus to one of separation and unrestricted independence. Gaining support for his views was difficult, given slow communications in the Colonies and arguments whose subtleties were lost on men of little education. Newspapers were notoriously inadequate due to erratic distribution and lack of coherence.

What distinguished Paine was his remarkable ability to synthesize the issues and offer ideas that the general population could grasp. Further, he was a pamphleteer extraordinaire who authored “Common Sense,” written so the masses could understand it. It was an immediate success – the 18th century equivalent of the social media that spawned the “Arab Spring” we witnessed in the Middle East. Suddenly, the momentum shifted to “Give Me Liberty or Give Me Death” instead of mere complaints.

Ever disdainful and out of touch, Lord Sandwich, First Lord of the Admiralty, pronounced to the House of Lords in March 1775: “Of the Colonies … they are raw, undisciplined cowardly men.” More famously, British Army officer James Grant proclaimed in the House of Commons that Americans could not fight because “they drink, they swear, they whore” and that he would “undertake to march from one end of the continent to the other with but 5,000 regular British soldiers.”

Pity King George (now 37), who had never been a soldier, had never been to America, and had never even set foot in Scotland or Ireland. But he clung with absolute certainty to his trust in providence and his high sense of duty. Nagged by his mother (“George, be a king!”), he was determined that America must be made to pay. Inevitably, war came on April 19, 1775, with first blood at Lexington and Concord and then, savagely, at Bunker Hill. On July 3, General George Washington took command of the “American Rabble.”

Game on!

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Throughout U.S. history, a lot of money has been made from tobacco

Peter Stackpole’s gelatin silver print titled Camel Cigarette Billboard Sign, Times Square, 1944, went to auction in 2013.

By Jim O’Neal

In 1976, while planning a new Frito-Lay plant for Charlotte, N.C., a small group of us made a trip to Winston-Salem to visit R.J. Reynolds Tobacco. They had spent $1 billion on a computer-integrated manufacturing (C.I.M.) plant that was recognized as the largest and most modern cigarette plant in the world. We were interested in the latest automation technologies available for possible application in our new plant.

My two most vivid memories: First, the pervasive odor outside the plant, which I (correctly) identified as menthol. This was not too tough, since the plant was the major producer of Salem brand cigarettes (I assumed the rest were Winstons, given the town we were in). Second, every single manager we met was a heavy smoker, the biggest clue being the distinctive deep-yellow stain between their index and middle fingers. It was like being in a 1940s Bette Davis movie.

We finished up with an enjoyable dinner with David P. Reynolds, chairman emeritus of Reynolds Metals, whom I had known since my days involving aluminum foil and beer cans. He amused the group by telling old company stories, including “Lucky Strike Green Goes to War.” It seems that in 1942, they wanted to change the package design by substituting white ink for the more familiar green. Both copper and chromium were expensive ingredients in the green ink, so it simply “went to war” (and never returned). There was another story involving Camel and Kaiser Wilhelm (the original name favored for the cigarette that debuted in 1913). I don’t remember the details, but the moral of the story was … never name a product after a living person.

Later, I learned about Operation Berkshire, a secret 1976 agreement between all tobacco CEOs to form a collective defense against anti-smoking legislation (anywhere). Each pledged to never concede that smoking had any adverse health effects. We all recall the “Seven Dwarfs” testifying in April 1994 to the U.S. Congress (under oath) that nicotine was not addictive and smoking did not cause cancer. Movie tip: The Insider starring Russell Crowe ranks No. 23 on AFI’s list of the “100 Greatest Performances of All Time.” It tells the tobacco story of today brilliantly.

Lucky Strike was introduced as chewing tobacco in 1871, evolving into a cigarette by the early 1900s.

More than 400 years ago, in 1604, King James I wrote a scathing rebuke of the evils of tobacco in A Counterblaste to Tobacco. He was the son of Mary, Queen of Scots, and ascended to the English throne when Elizabeth I died childless. He wrote of tobacco as “lothsome to the eye, hatefull to the Nose, harmefull to the braine, [and] dangerous to the Lungs.” He equated tobacco with “a branche of the sinne of drunkenness, which is the roote of all sinnes.”

Tobacco was late to arrive in England. Fifteenth-century European explorers had observed American Indians smoking it for medicinal and religious purposes. By the early 16th century, ships returning to Spain took back tobacco, touting its therapeutic qualities. The Iberian Peninsula eagerly adopted its use.

When English settlers arrived in Jamestown in 1607, they became the first Europeans on the North American mainland to cultivate tobacco. Spotting an opportunity in 1610, John Rolfe (of Pocahontas fame) shipped a cargo to England, but the naturally occurring plant in the Chesapeake region was considered too harsh and bitter. The following year, Rolfe obtained seeds of the milder Nicotiana tabacum from the Spanish West Indies and soon production was rapidly growing and spreading to Maryland. By the middle of the 18th century, Virginia and Maryland were shipping nearly 70 million pounds of tobacco to Britain.

Even though many Colonial leaders in America believed that smoking was evil and hazardous to health, that had little effect on the relentless spread of tobacco farming. By the eve of the Revolutionary War, tobacco was the leading cash crop produced by the Colonies. Exports to Britain rose to over 100 million pounds … 50 percent of all Colonial trade. Never was a marriage of soil and seed more bountiful.

But tobacco cultivation and manufacturing were extremely labor-intensive activities. Initially, white indentured servants were used to harvest the crop, and the inducement to come to America often took the form of a formal “indentured servitude” agreement. Typically, in exchange for agreeing to work for seven years, the servant would receive his own land to farm. This system was preferred over slavery; losing a slave was seen as more costly than losing an indentured servant.

Then the economics started shifting: land became scarcer and slaves more plentiful after King Charles II created the Royal African Company of England and granted it a monopoly on supplying slaves to the Colonies. Later, with the explosion of cotton production, there was an enormous demand for still more slaves.

A cynic might note that the formation of the United States was first led by men from Virginia and then governed by them. President Washington, followed by Thomas Jefferson, James Madison and, finally, James Monroe … four of the first five presidents … all from Virginia and all with slave plantations.

Throughout our history, there has been a lot of money made from tobacco. As the plant manager at that C.I.M. plant explained, “We ship about 800 rail cars filled with cigarettes every eight hours and they come back loaded with cash.”

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].