Kennedy’s Court Appointments Kept World of Judiciary at Peace

This copy of PT 109, signed by John F. Kennedy, author Robert J. Donovan and surviving crew members, sold for $13,750 at a December 2016 Heritage auction. The book tells the story of one of the most important episodes in Kennedy’s life.

By Jim O’Neal

When war broke out in 1939, all the Rhodes Scholars in England were sent home, Byron “Whizzer” White among them. He went back to Yale and graduated from its law school with honors. Then, in 1942, he enlisted in the Navy, as so many others did. He was serving in the Solomon Islands as an intelligence officer with a PT boat squadron when John F. Kennedy was a PT boat skipper there. It was White who personally wrote the official account of the battle events later portrayed in the book and movie PT 109.

Flash forward 20 years to a famous photo of a smiling Kennedy, now president of the United States, pointing at the front-page headline of the New York Herald Tribune: “WHIZZER WHITE TO SUPREME COURT – LAWYER, NAVAL OFFICER, FOOTBALL STAR.” It was JFK’s first appointment to the Supreme Court.

In August 1962, President Kennedy got a second bite at the same apple. Justice Felix Frankfurter, once styled as “the most important single figure in our whole judicial system,” bowed to the effects of a stroke and announced his retirement. The president acceded to his request and called a press conference to announce he had chosen Secretary of Labor Arthur Goldberg as the replacement.

This was not a great surprise, since the 54-year-old labor expert was well qualified and eager to join the court. The only slight hesitation stemmed from Goldberg’s close personal relationship with the president and the loss of a highly valued cabinet officer. However, both Chief Justice Earl Warren and Frankfurter himself supported the decision, and it was made.

The nation’s reaction was universally favorable and the Senate Judiciary Committee was in total agreement. Goldberg was confirmed by the full Senate, with only Senator Strom Thurmond recording his opposition. Thus the new justice was able to take his seat on the court in time for the October 1962 term. The world of the judiciary was at peace, even after the tragic events in Dallas in November 1963 and the Warren Commission investigation that followed.

However, after a mere three years on the Supreme Court, President Lyndon B. Johnson decided that Justice Goldberg should resign from the court and become ambassador to the United Nations, succeeding Adlai Stevenson. It now seems clear that LBJ’s motive was the naive hope that somehow Goldberg might be able to negotiate an end to the nightmare in Vietnam. Goldberg was strongly opposed to the move, but as he explained to a confidant, “Have you ever had your arm twisted by LBJ?”

Supposedly, there was also a clearly implied understanding of an ultimate return to the court, which obviously never materialized. Neither did an LBJ suggestion that Goldberg might be a candidate for the 1968 vice-president slot – another false hope that was mooted by LBJ’s decision not to seek reelection.

Lost in all of this was the fact that Goldberg’s intended replacement on the court, Abe Fortas, had repeatedly declined LBJ’s offers to be a Supreme Court justice. In fact, poor Abe Fortas never said yes. The president simply invited him to the Oval Office and informed him that he was about to go to the East Wing “to announce his nomination to the Supreme Court” and that he could stay in the office or accompany him.

Fortas decided to accompany the president, but to the assembled reporters he appeared only slightly less disenchanted than the grim-faced Goldberg, with his tearful wife and son by his side. Goldberg had reluctantly agreed to become ambassador to the United Nations and commented to the assembled group, “I shall not, Mr. President, conceal the pain with which I leave the court.”

It was a veritable funereal ceremony – except for a broadly smiling LBJ, who had once again worked his will on others, irrespective of their feelings. The man certainly did know how to twist arms – and I suspect necks and other body parts – until he achieved his objectives.

He was so good at domestic politics that it seems sad he also had to deal with foreign affairs, where a different skill set was needed.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

We Have Lost Something Sacred in Today’s Judicial Nomination Process

John Jay (1745-1829) was the first Chief Justice of the United States.

By Jim O’Neal

The Supreme Court was created in 1789 by Article III of the U.S. Constitution, which stipulates “the judicial power of the United States shall be vested in one Supreme Court.” Congress organized it with the Judiciary Act of 1789.

John Jay of New York, one of the Founding Fathers, was the first Chief Justice of the United States (1789–95). Earlier, he was president of the Continental Congress (1778-79) and worked to ratify the U.S. Constitution by writing five of the Federalist Papers. Alexander Hamilton and James Madison wrote the other 80 essays; all 85 were published in two volumes called “The Federalist” (the title “The Federalist Papers” emerged in the 20th century).

Nearly 175 years later, in 1962, President John F. Kennedy nominated Byron Raymond “Whizzer” White to replace Associate Justice Charles Whittaker, who became chief legal counsel to General Motors (presumably with a nice salary increase). Whittaker had been the first person to serve as judge at all three levels: Federal District Court, Federal Court of Appeals, and the U.S. Supreme Court (a distinction matched by Associate Justice Sonia Sotomayor).

White was the Colorado state chair for JFK’s 1960 presidential campaign and had met both the future president and his father Joe in London, while attending Oxford University on a Rhodes Scholarship, when Joe Kennedy was ambassador to the Court of St. James’s. This was after White had graduated Phi Beta Kappa from the University of Colorado, where he was also a terrific athlete, playing football, basketball and baseball and finishing runner-up for the Heisman Trophy. He is unquestionably the finest athlete ever to serve on the Supreme Court.

He continued mixing scholarship and athletics at Yale Law School, where he graduated No. 1 in his class magna cum laude while playing three seasons in the National Football League, beginning with the Pittsburgh Pirates (now the Steelers). He was elected to the College Football Hall of Fame in 1954.

Justice White was in the minority in the now-famous Roe v. Wade landmark decision on Jan. 22, 1973. Coincidentally, a companion case that has been virtually forgotten, Doe v. Bolton (Mary Doe v. Arthur K. Bolton, Attorney General of Georgia, et al.), was decided on exactly the same day and on the identical issue (overturning Georgia’s abortion law). White was in the minority there, too.

White’s nomination was confirmed by a simple voice vote (i.e., by acclamation). He was the first person from Colorado to serve on the Supreme Court, and it appears that one of his former law clerks, Judge Neil Gorsuch, also from Colorado, most likely will become the second, although it is unlikely he will receive many Democratic votes, much less a voice vote.

Times have certainly changed in judicial politics and, sadly, for the worse. “Advice and consent” has morphed into a “just say no” attitude, and we have lost something sacred in the process.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Death of Last Astronaut on Moon Reminds Us to Press Forward

No more than 80 Silver Robbins Medallions were flown aboard Apollo 17, inscribed with the names of crew members Gene Cernan, Ron Evans and Harrison Schmitt. This example, from the personal collection of astronaut Alan Bean, sold for $59,375 at a May 2014 Heritage auction.

By Jim O’Neal

On Jan. 16, 2017, Eugene Andrew Cernan, the last NASA astronaut to walk on the surface of the moon, died in a Houston hospital. His historic flight on Apollo 17 lasted from Dec. 7 to Dec. 19, 1972, and man has not been back since then. Cernan was 82 years old and the first astronaut to be buried at Texas State Cemetery.

Eight space missions visited the moon between 1968 and 1972 as part of NASA’s Apollo program. Each mission carried three American astronauts inside a spacecraft launched by a Saturn V rocket. Apollo 8 was used to test the spacecraft as it orbited the moon. Then, in a dress rehearsal prior to landing, Apollo 10 flew close to the lunar surface.

Cernan

The first of the six missions that successfully landed on the moon was Apollo 11 in 1969. Astronauts Neil Armstrong and Buzz Aldrin touched down in July of that year, with Armstrong the first to actually walk on the lunar surface. In all, just 24 daredevil astronauts made that remarkable journey to the moon, and 12 of them walked on the cratered, lifeless surface.

The Apollo astronauts were blasted into space inside the nose cone of the largest rocket ever built, the Saturn V. It was designed by Wernher von Braun and Arthur Rudolph at Huntsville, Ala., and remains the tallest, heaviest and most powerful rocket ever brought into full operational status. Its development drew on “Operation Paperclip,” a special program approved by President Harry S. Truman in 1945 that brought German rocket engineers to the United States to leverage their expertise in building Nazi Germany’s V-2 rocket.

Von Braun had started out working for the U.S. Army after World War II and then transferred when the National Aeronautics and Space Administration was established in 1958 in response to the Russian Sputnik panic. He then became director of NASA’s Marshall Space Flight Center, where his team designed the Saturn V. After President John F. Kennedy’s promise to land a man on the moon, von Braun and his team ensured that the United States would win the space race against the Soviet Union.

The giant Saturn V rocket – roughly 60 feet taller than the full Statue of Liberty – consisted of three rockets in one. The first two stages lifted the Apollo spacecraft into space and the third stage put Apollo on course for the moon after reaching low Earth orbit. Apollo itself had three sections: command, service and lunar modules. All were linked together for the 250,000-mile journey. Once there, the lunar module took two astronauts to the moon’s surface and back. All three astronauts then returned to Earth in the command module, whose conical shape allowed it to withstand the heat of reentry into Earth’s atmosphere before splashdown.

Each night, the moon looms over Earth, peering down and wondering when to expect the next visitors. Perhaps it will be Mars instead. Space … the final frontier!

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

McNamara a Fascinating Executive with a Fascinating Career

A large photograph of John F. Kennedy and his original cabinet, signed by cabinet members including Robert McNamara (fourth from left), sold for $7,500 at a December 2016 Heritage auction.

By Jim O’Neal

Lieutenant Colonel Robert McNamara had planned to return to Harvard after his stint in the military, since he truly enjoyed the Cambridge lifestyle and teaching statistics was his first love. However, in a bizarre twist of coincidence, he and his wife Margaret both contracted polio. They were still hospitalized in August 1945 when World War II ended. His mentor, Tex Thornton, persuaded him to consider a new, higher-paying career in the private sector to help with the family’s hospital bills.

McNamara and the other Whiz Kids excelled at the Ford Motor Company by using the skills they had honed in the Army: control the organization by converting facts and numbers into meaningful, actionable information. This was particularly valuable at Ford, with its archaic operations, pitted against its main competitor, General Motors, and GM’s classic style of highly accountable, decentralized profit centers. McNamara became the group’s unofficial leader when Thornton left Ford for greener fields in aerospace.

McNamara rose quickly, as Henry Ford II was new and unsure of himself. To Ford, McNamara offered reassurance; when questions arose, he always had answers, not vague estimates, but certitudes, facts and numbers … and a lot of them. On Nov. 9, 1960, McNamara was promoted to president at Ford. It was the first time someone outside the Ford family was in charge.

As fate would have it, the prior day, on Nov. 8, John F. Kennedy became president-elect of the United States. Their careers would soon be joined in a truly unexpected way.

Kennedy sent Sargent Shriver to offer McNamara either the Secretary of Treasury or Secretary of Defense cabinet position. McNamara was disdainful of Treasury, but eager to take on something much more exciting, assuming his boss would agree (it had been only six weeks since he had taken the reins at Ford).

We all know how this turned out, but perhaps not the financial sacrifice involved. By accepting the Defense position, McNamara walked away from $3 million in stock options.

Robert Strange (his mother was Clara Nell Strange) McNamara served as Secretary of Defense under two presidents (JFK and Lyndon B. Johnson) from 1961 to 1968, the longest tenure in history (10 days longer than Donald Rumsfeld), and during the important build-up years in Vietnam. In 1968, he sent a letter to LBJ advising him that the war was unwinnable and recommending the United States end it. The president never replied and McNamara was finished.

Later, he told his friend, Washington Post publisher Katharine Graham, he wasn’t sure if he quit or was fired. She replied, “Are you crazy? Of course you were fired!”

In 2003, Errol Morris produced the documentary The Fog of War, which captures these war years, including a poignant ceremony when McNamara retired and LBJ awarded him the Medal of Freedom. McNamara was so emotional that he had to defer on his acceptance remarks. It is a good flick and recommended since it uses archival film with contemporary comments from McNamara.

A fascinating man and career. He served as president of the World Bank from 1968 to 1981 before dying in 2009 at age 93.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Americans Have Turned Inward Before, in the Days of Richard Nixon

Chicago Sun-Times political cartoonist Bill Mauldin drew this piece shortly before President Nixon resigned in 1974. The original art sold for $2,748.50 at a November 2014 Heritage auction.

By Jim O’Neal

On Jan. 20, 1973, surrounded by happy perjurers, Richard M. Nixon celebrated his second inauguration in a three-day, $4 million extravaganza, organized by political operative Jeb Stuart Magruder. Named by his Civil War-buff father after Southern General J.E.B. Stuart, Magruder would later serve seven months in prison for perjury involving Watergate.

The rhetoric of the inaugural address was less a promise of what the government would do than of what it wouldn’t. Twelve years earlier, another president of the same generation had vowed that we would “pay any price, bear any burden, meet any hardship, support any friend, oppose any foe, in order to assure the survival and the success of liberty.”

Now, Nixon declared that, “The time has passed when America will make every other nation’s conflict our own … or presume to tell the people of other nations how to manage their own affairs.” At the same time, he prepared to liquidate the domestic programs of liberal administrations. Paraphrasing President Kennedy’s most memorable line, Nixon said, “Let each of us ask — not just what will government do for me, but what can I do for myself?” (Lyndon B. Johnson would die two days later, but presumably from other causes).

As Nixon paused for effect, a faint sound could be heard from several blocks away. A group of youths was chanting “Murderer,” “Out now,” and “End racism.” A woman from Iowa told a New York Times reporter, “Just disgusting. Why can’t they do something about those kids!”

It was certainly indecorous, yet these demonstrations, like the counterculture of the time, were an expression of the deep divisions in America and they had to be endured. There is no practical way to stifle dissent in an open society; if there were, I suspect Magruder and his allies would have tried to use it.

The chanters – somewhere between 500 and 1,000 yippies, militants and Maoist activists – were the smallest and rudest contingent among the multitude of demonstrators.

So it was – after intervening in foreign conflicts for a third of a century – that the people of the United States turned inward once more, seeking comfort and renewal in isolation. “So we beat on, boats against the current, borne back ceaselessly into the past.” (Last line from The Great Gatsby by F. Scott Fitzgerald.)

Maybe someday.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Even with United Nations, War and Terrorism Persist

This 1945 copy of Charter of the United Nations and Statute of the International Court of Justice, signed by John F. Kennedy, Henry Cabot Lodge and Adlai E. Stevenson, sold for $2,375 at an April 2014 auction.

“… To save succeeding generations from the scourge of war …” — From the United Nations Charter

By Jim O’Neal

Edward Stettinius, chair of the U.S. delegation to the United Nations conference in San Francisco, signed the U.N. Charter in 1945. President Harry Truman was in attendance and later signed the document by which he ratified the charter of the United Nations.

The charter established the structure of the United Nations and outlined its guiding principles to prevent war, affirm fundamental human rights, facilitate international peace and security, promote improved living standards, and support social progress and economic advancements (whew!).

The United States, Britain and the USSR were the primary designers of the decision-making structure. The General Assembly consisted of all member countries. The Security Council, which was responsible for international peace and security, originally had 11 members, six of which were elected to two-year terms. Five – the United States, Britain, USSR, France and China – were permanent members, and each had veto power on Security Council resolutions.

Disagreements based on national interests plagued the discussions at the April conference, but they did not prevent the formal U.N. formation. There was also considerable debate about the voting process and veto provisions. Finally, on June 25, the delegates unanimously adopted the charter and the next day they all signed the document.

After the permanent members of the Security Council and most other members ratified the charter, the United Nations was officially established on Oct. 24, 1945. The world had entered a new period of international collaboration determined to avoid a repeat of the two wars that had caused so much devastation in the first half of the 20th century.

Alas, these lofty aims did not last long as the Cold War soon started, followed by major conflicts in Korea, Vietnam (twice), Afghanistan, etc. When we look around the world today, it’s estimated that the United States has Special Forces in over 70 countries (at least) and ad hoc terrorism is a routine, daily occurrence in many places. A new Cold War is gradually taking shape and even nuclear proliferation is back in the news.

Maybe conflict is in our DNA.

One thing is certain: assuming the United Nations survives, it will have plenty to do for a long time.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

For President Johnson, Goal was Reached with ‘Great Society’ Legislation

A complete set of 50 pens President Johnson used to sign “Great Society” legislation in 1965 sold for $18,750 at a November 2015 Heritage auction.

By Jim O’Neal

Whether Lyndon B. Johnson intended to run a second time for the presidency (after his 1964 election) is uncertain. Many of his predecessors had made it clear that one elected term was enough.

Theodore Roosevelt made a campaign promise not to run again for president and regretted it so much that he later ran anyway (in 1912). Rutherford B. Hayes never intended to run more than once (and was happy he hadn’t), and neither did Harry Truman or Calvin Coolidge. Except for TR, these men were no longer popular by the end of their first elected term, and running again most likely would have been a waste of time.

So it was with LBJ. On March 31, 1968, he took the nation by surprise when he announced abruptly in a televised address from his office, “I shall not seek, and I will not accept, the nomination of my party for another term as your president.”

Johnson had even spoken of resigning, but if anything deterred him, it was the fear of losing his “Great Society” programs in Congress. Even the media-fueled support for Robert Kennedy was threatening, because Johnson never trusted him and doubted that Kennedy had the clout with Congress to make sure the programs got enacted. Johnson cared more about his agenda than the presidency.

President Johnson signs legislation.

Then, shortly after his retirement speech, came the assassinations of Dr. Martin Luther King Jr. (April) and Kennedy (June), which stirred even more violence in the streets. The military was on stand-by and ready to pour into Washington if rioting was too much for the police. For the man in the White House, the outside world was a horror show and the idea of returning to his ranch grew more appealing. A long-time colleague from the old days, Congressman Jack Brooks, said the president did not seek reelection because he “kind of wanted to get back home,” adding for those who might not understand, “It’s not so bad out on the ranch, you know.”

Some presidents depart the White House invigorated, but most leave exhausted. For LBJ, the office had drained his vigor and confidence. He also believed that history would never give him credit for achieving the most powerful social agenda since Roosevelt’s New Deal. It was Johnson’s political skill that made it happen, not JFK’s, yet Johnson believed the applause would inevitably go to his more popular predecessor. Sadly, he was right, although in recent years a more balanced narrative has evolved.

Republicans nominated Richard Nixon in August 1968 and the Democrats chose Vice President Hubert Humphrey. LBJ did not attend the convention to share Humphrey’s triumph, since he didn’t want to add any Vietnam War baggage to the ticket. During the campaign, the war flared on and LBJ remained determined to end it. On Oct. 31, just days before the election, he even announced a halt to the bombing, but it was too late.

On Jan. 14, 1969, President Johnson delivered his final State of the Union to Congress. It was strong, pragmatic and well-received by his old Senate colleagues – and in a venue where he was very comfortable.

Then it was time to pack up and head back to Texas.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

Reagan’s Last Christmas in Office Marked by Memorable Snowy Fairyland

A photograph signed by Ronald Reagan with the inscription “Win one for the Gipper” sold for $8,365 at a November 2014 Heritage auction. It’s considered the most famous line Reagan spoke on the silver screen, in 1940’s Knute Rockne, All American.

By Jim O’Neal

In 1980, Ronald Wilson Reagan became the oldest man (69) to be elected president. He extended his record in 1984 when he was reelected at age 73. For their last Christmas in the White House, the Reagans wanted to make a splash. The East Room was transformed into a snowy fairyland, with full-size trees and a gift-filled sleigh occupied by carolers and drawn by lifelike horses, all powdered with glittery “snow.” It was a vintage Hollywood image.

Thousands of visitors filed by and looked on in both delight and amazement at the dazzling scene. Nothing remotely like this had ever been seen in the White House. It was a playful farewell by two whose roots were as firmly planted in Hollywood as John F. Kennedy’s were in Boston or Lyndon B. Johnson’s on the banks of the Pedernales River.

On his final day in office, Jan. 20, 1989, President Reagan went to the Oval Office early and met with his chief of staff, Ken Duberstein, and his national security advisor, General Colin Powell. Both of them said reassuringly, “Mr. President, the world is quiet today.” After they left, Reagan also left the office, stopping at the door for one last look. George and Barbara Bush were arriving in the entrance hall below.

On the route from the Capitol to the White House, the incoming President George H.W. Bush and first lady took a cue from the Carters, leaving their car from time to time to walk along Pennsylvania Avenue to greet the crowds. They walked up the driveway on the same path all their predecessors had followed since James Monroe’s second term, 168 years before.

History linked the inauguration of George H.W. Bush and George Washington. It had been exactly 200 years since the first president began serving his first term.

President Bush had an extensive background that included two terms in Congress, ambassador to the United Nations, director of the CIA, liaison to China, and eight full years as vice president. He had easily defeated Michael Dukakis to win the presidency, but in the process famously declared “Read my lips. No new taxes!” – words that would haunt him.

Although favored for reelection in 1992, he got caught in a buzz saw when third-party candidate Ross Perot siphoned off nearly 19 percent of the popular vote and a young governor from Arkansas won with a plurality of 43 percent. William Jefferson Clinton and Al Gore Jr. became the youngest president-and-vice-president team in history.

George H.W. Bush became the 10th incumbent president to lose in a bid for reelection after becoming the first sitting vice president to be elected president since Martin Van Buren in 1836.

The strange world of presidential politics. We love it.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

 

Apollo XI Reminds Us What’s Important, and Why the Stars Beckon

The historic first photo of Earth from deep space signed by all 29 Apollo astronauts realized $38,837.50 at a June 2011 Heritage auction.

By Jim O’Neal

Today is a special date.

On the night of July 20, 1969, thousands of people descended upon Central Park in New York and other public venues to bear witness to the greatest technological achievement in the history of mankind. At the long stretch of green known as Sheep Meadow stood three 9-by-12-foot television screens. At precisely 10:56 p.m. EDT, the fuzzy image of a man in a space suit moved down a ladder until the moment his boot struck the fine-grained surface of the moon.

Apollo XI was the amazing coda of the amazing ’60s. The story of the astronauts – Alan Shepard’s simple arc, the dramatic orbit of John Glenn, the tragedy that killed Gus Grissom, Ed White and Roger Chaffee – had run parallel with the decade’s other dramas. But the long series of space shots had become routine and many had begun to question the priority of space discovery in a time of so much domestic strife.

Apollo XI changed all that … for a short time.

Newspaper publishers ordered up their “Second Coming” type, as Time magazine described it. This was no mere piece of news; this was history, big enough to challenge some of the best stories in the Bible.

The plan to go to the moon had been hatched in a conference room of the Cold War, after Sputnik embarrassed American science in 1957, and moved into high gear when John F. Kennedy audaciously promised a moon landing in 1961.

Among those at the crowded Apollo XI launch site was the heroic 1920s pilot Charles Lindbergh, now 67, who later wrote to crew member Michael Collins (the one who didn’t walk on the moon): “I believe you will find that it lets you think and sense with greater clarity.”

An Apollo 11 framed photo signed by Neil Armstrong, Michael Collins and Buzz Aldrin realized $10,755 at an October 2009 Heritage auction.

It had been only 42 years since Lindy had conquered the Atlantic Ocean solo, and now mankind had conquered space. But the space program, like other artifacts of the ’60s, gradually evaporated, because no matter where you stood, the ’60s were messy and hard to understand clearly.

Yet from out there, in the dark eternity of the universe, our little home projected a picture of harmony, an essentially beautiful orb, and so utterly still.

Personally, just seeing Earth from space, so tranquil, helps me keep perspective on what is truly important. I do hope we keep reaching for the stars. Eternity is a long time.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].

China’s Fall to Communists Launched Dark Period in American History

Andy Warhol’s screenprint Mao (With Orange Face), 1972, realized $47,500 at a May 2015 Heritage auction.

By Jim O’Neal

On April 4, 1949, the day the United States and 11 other nations signed the North Atlantic Treaty creating NATO, a Communist general named Chu Teh began massing a million of Mao Tse-tung’s seasoned troops on the north bank of the Yangtze River. This was the last natural barrier between Mao and the few southern provinces still loyal to Chiang Kai-shek’s Nationalist Party, or Kuomintang (KMT).

Three weeks later, Chu Teh’s veterans stormed across the Yangtze, but met only token resistance. Chiang had withdrawn 300,000 of his most reliable soldiers to form a rear-guard perimeter around Shanghai. A week later, Chiang fled across the Formosa Strait to Taiwan, along with a cadre of KMT loyalists, and it seemed clear that China was a lost cause.

Mao Tse-tung proclaimed Red China’s sovereignty on Sept. 21, 1949 – the same day West Germany declared its sovereignty – and this was followed by Chiang announcing the formation of his new government in Taipei. Chinese politician Sun Yat-sen’s 50-year-old vision for a democratic China was dead, and the U.S. expectation that Chiang would establish the non-communist world’s eastern anchor died with it.

The world now had two Chinas!

The American response was slow. Newspapers had carried regular accounts of the Chinese Communists and the KMT’s slow disintegration, but China was so vast, the geography so unfamiliar, and the movements of the unmechanized armies so slow that Americans had lost interest in these distant battles.

However, when the KMT collapsed, U.S. Secretary of State Dean Acheson decided to lay out the entire situation before the American people. On Aug. 5, 1949, the State Department issued a 1,054-page white paper conceding that the world’s most populous nation had fallen into communist hands. The chain of events leading to this tragic end was also explained, including the $2 billion in aid that had been largely wasted and the 75 percent of American arms shipments that had fallen into Mao’s hands.

The American people were stunned by this admission. Everything American diplomats had achieved in Europe – the Truman Doctrine, the Marshall Plan, NATO – seemed to have been annulled by this disaster in Asia.

The burning question was … who was responsible for losing China?

Richard Nixon of California flatly blamed the Democrats. On Feb. 21, a young congressman from Massachusetts, John F. Kennedy, said that at Yalta, a “sick” Franklin Roosevelt had given strategic places to the USSR. This, Kennedy concluded, “is the tragic story of China, whose freedom we fought to preserve. What our young men saved, our diplomats and our presidents have frittered away.”

Thus began one of the darkest periods in American history. President Harry S. Truman’s Executive Order 9835 created the “Loyalty Order” program and in 1947, the FBI began stalking “disloyal and subversive persons” by conducting name checks on 2 million federal employees and background checks on 500,000 annual applicants for government jobs. During the program’s five years, the FBI screened over 3 million Americans and conducted 10,000 field interviews. Preliminary indictments were filed against 9,977, of whom 2,961 were arraigned.

Seth Richardson, chairman of the Subversive Activities Control Board, summed up his findings for a Congressional committee: “Not one single case or evidence directing toward a case of espionage has been found by the FBI indicating that a particular case involves a question of espionage.”

In the entertainment industry, “blacklisting” became a form of blackmail and took its toll on a small group for a full decade.

Time has blurred the sharp contours of the Age of Suspicion, but it was a dark period that must never be allowed to recur.

We still don’t know, or agree on, who lost China.

Intelligent Collector blogger JIM O’NEAL is an avid collector and history buff. He is president and CEO of Frito-Lay International [retired] and earlier served as chairman and CEO of PepsiCo Restaurants International [KFC, Pizza Hut and Taco Bell].