Couriers through history have toted a staggering volume of items

Alf Landon’s congratulatory postal telegram to Franklin D. Roosevelt on Nov. 4, 1936, realized $7,767 at a June 2008 Heritage auction.

By Jim O’Neal

The passing of information over great distances is an ancient practice that has relied on many clever techniques.

In 400 B.C., there were signal towers on the Great Wall of China; beacon lights or drumbeats also were used to relay information. By 200 B.C., the Han dynasty had developed a complex mix of lights and flags. Speed has always been a priority, and the quest for it has taken many forms, including at the U.S. Postal Service. “Neither snow nor rain nor heat nor gloom of night stays these couriers from the swift completion of their appointed rounds” is the unofficial motto of the Postal Service (probable source: The Persian Wars by Herodotus).

The Post Office Department was authorized by Congress in 1792, and the many forecasts of its imminent demise appear to be exaggerated. We still get incoming mail six days a week, and a post-office driver delivered a package to my place at 8 p.m. last week … one day after I ordered it from Amazon. The post office – renamed the U.S. Postal Service in 1971 – is now running prime-time ads claiming it delivers more e-commerce packages than anyone.

Through history, postal couriers have toted a staggering volume of items, as well as a few astonishing ones. A resourceful farmer once shipped a bale of hay from Oregon to Idaho. A real coconut was sent fourth class from Miami to Detroit with the address and postage affixed to the hull. Even sections of pre-fab houses have been mailed, delivered and then assembled into full-size homes.

Accounts vary as to where the 53-cent postage was affixed to pre-schooler Charlotte May Pierstorff the day her parents mailed her to see her grandmother in Idaho. In 1914, they had discovered it was cheaper to send her by U.S. mail than to pay the full fare the railroad charged for children traveling alone. At 48½ pounds, little May fell within parcel post’s 50-pound weight limit. She traveled in the train’s mail compartment and was safely delivered to grandma. She lived to be 78 and died in California. She’s featured in an exhibit at the Smithsonian’s National Postal Museum.

Another cheapskate shipped an entire bank building – 80,000 bricks, all in small packages – from Salt Lake City to Vernal, Utah, in 1916. This time, the postmaster put his foot down … no more buildings! But 9,000 tons of gold bars were transferred from New York to Fort Knox in 1940-41, and the post office collected $1.6 million in postage and insurance. My all-time favorite was when jeweler Harry Winston donated the famous Hope Diamond to the Smithsonian in 1958. He kept costs low by sending the 45.52-carat gem in a plain brown wrapper by registered first-class mail. (Note: It arrived safely.)

The quest for speed took a quantum leap on May 24, 1844, when Samuel F.B. Morse sent the first telegraph message. Standing in the chamber of the Supreme Court, Morse sent a four-word message to his assistant in Baltimore, who transmitted it back. Members of Congress watched the demonstration with fascination. At the time, the Supreme Court was housed in the Capitol building; it finally got its own building in 1935, after heavy lobbying by Chief Justice William Howard Taft.

For Americans at the turn of the 20th century, seeing a telegram messenger at the door usually meant bad news. Western Union and its competitors weren’t pleased that their reputation as bearers of bad news had spread. So in 1914, they started emphasizing good-news messages, sending them in bright, cheerful seasonal envelopes. Next came 25-cent fixed-text telegrams that gave senders pre-written sentiments in 50 categories, like Pep-Gram #1339: “We are behind you for victory. Bring home the Bacon!” Forgot Mother’s Day? Use #432: “Please accept my love and kisses for my father’s dearest Mrs.” Then came singing telegrams, but they became passé and in 2006, Western Union shut down its telegram service.

The message of that first Morse telegraph in 1844 poses a question we still ask after every innovation, whether it’s faxes, the internet, email or texts: “What hath God wrought?”

Perhaps next is mental telepathy. Who knows? It will be faster and awe us once again.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

How We Record History Has Evolved Over the Ages

A 1935 copy of The History of Herodotus of Halicarnassus (Nonesuch Press) sold for $1,125 at an October 2013 auction.

By Jim O’Neal

We often fail to remember that history itself has a history. From the earliest times, all societies told stories from their past, usually imaginative tales involving the acts of heroes or various gods. Later, civilizations kept records inscribed on clay tablets or the walls of caves. However, ancient societies made no attempt to verify those records, and often failed to differentiate between real events and myth or legend.

This changed in the 5th century B.C., when historians like Herodotus and Thucydides explored the past by interpreting evidence, even though their accounts still included a measure of myth (“history” comes from the Greek word for “inquiry”). Thucydides’ account of the Peloponnesian War satisfies most criteria of modern historical study: it was based on interviews with eyewitnesses and attributed events to the actions of individuals rather than the intervention of gods.

Thus, Thucydides created the most durable form of history: the detailed narrative of war, political conflict, diplomacy and decision-making. The subsequent rise of Rome to dominance of the Mediterranean encouraged other historians like Polybius (Hellenic) and Livy (Roman) to develop narratives that captured a “big picture,” making sense of events over a longer time frame. Although restricted to the Roman world, their work was the beginning of a universal history describing progress from origin to present, with the goal of giving the past a purpose.

In addition to making sense of events through narratives, there was a tradition growing to examine the behavior of heroes and villains for future moral lessons. We still attempt this today with a steady stream of studies of Lincoln, Churchill and Gandhi, as well as Stalin, Hitler and Mao.

But there was a big hiccup with the rise of Christianity in the late Roman Empire, which fundamentally changed the concept of history in Europe. Historical events came to be viewed as “divine providence,” the working of God’s will. Skeptical inquiry was usually neglected and miracles were routinely accepted without question. Thankfully, the medieval Muslim world was more sophisticated, rejecting accounts of events that could not be verified.

However, neither Christians nor Muslims produced anything close to the chronicle of Chinese history published under the Song Dynasty in 1085. It recorded history spanning almost 1,400 years and filled 294 volumes. (I have no idea how accurate it is!)

By the 20th century, the subject matter of history – which had always focused on kings, queens, prime ministers, presidents and generals – increasingly expanded to embrace common people, whose role in historical events became more accessible. Still, most world history was written as the story of the triumph of Western civilization, until the second half of the century, when the notion of a single grand narrative simply collapsed. Instead, the post-colonial, modern world demanded the histories of black people and women, as well as of Asians, Africans and American Indians.

Now we are in another new place, where it is increasingly difficult to know where to find reliable accounts of real events and a flood of “fake news” competes for widespread acceptance. Maybe Henry Ford was right after all when he declared that “History is bunk!”

Personally, I don’t mind and still enjoy frequent trips to the past … regardless of factual flaws.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Penicillin Changed Medicine — But Deadly Enemies Lurk

A photograph signed by Nobel Prize winner Alexander Fleming sold for $1,250 at an April 2016 auction.

By Jim O’Neal

In the fifth century B.C., Herodotus noted in his “History” that every Babylonian was an amateur physician, since the sick were laid out in the street so that any passerby could offer advice for a cure. For the next 2,400 years, that was as good an approach as any to curing infections; doctors’ remedies were universally useless.

Until the middle of the 20th century, people routinely died from infections. Children were killed by scarlet fever, measles and even tonsillitis. Mothers regularly died from infections following childbirth, and many who survived were taken later by pneumonia or meningitis.

Soldiers most commonly died from infections such as gangrene or septicemia, not from war injuries. Even a small cut could lead to a fatal infection. Bandaging a wound simply sealed in the infectious killers to carry out their deadly missions. Of the 10 million killed in World War I, 5 million died of infections.

There were few antidotes to infections … vaccination against smallpox with cowpox vaccine (Edward Jenner in 1796), the introduction of antiseptics (Joseph Lister in 1865), and the advent of sulfa drugs in 1935. But there was no known cure for a stunning number of other deadly threats: typhoid fever, cholera, plague, typhus, scarlet fever, tuberculosis. The list seemed endless, and most of these illnesses ended in death.

All of this changed in 1940.

Alexander Fleming’s discovery of penicillin while examining a stray mold in his London lab in 1928, and its eventual development by a team at Oxford University, ushered in the age of antibiotics – the most important family of drugs in the modern era. Before World War II ended, penicillin had saved hundreds of thousands of lives and offered a viable cure for major bacterial scourges such as pneumonia, blood poisoning, scarlet fever, diphtheria, syphilis and gonorrhea.

The credit usually goes to Fleming, but Howard Florey, Ernst Chain, Norman Heatley and a handful of others on the Oxford team deserve a major share. It was their laboratory magic that turned the discovery into an effective, usable drug.

Neither Fleming nor Florey made a cent from their achievements, although Florey, Fleming and Chain did share a Nobel Prize. British pharmaceutical companies remarkably failed to grasp the significance of the discovery, so American companies – Merck, Abbott, Pfizer – quickly grabbed all the patents and proceeded to make enormous profits from the royalties.

The development of antibiotics is one of the most successful stories in the history of medicine, but it is unclear whether its ending will be a completely happy one. Fleming prophetically warned in his 1945 Nobel lecture that the improper use of penicillin would lead to it becoming ineffective. The danger was not in taking too much, but in taking too little to kill the bacteria and “[educating] them on how to resist it in the future.” Penicillin and the antibiotics that followed were prescribed too freely, both for ailments they could cure and for viral infections on which they had no effect. The result is strains of bacteria that are now unfazed by antibiotics.

Today, we face a relentless and deadly enemy that has demonstrated the ability to mutate at increasingly fast rates, producing “super bugs” capable of developing resistance. We must be sure to “keep a few steps ahead.”

Hear any footsteps?

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chair and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].