Part 6 Raven Rock. Alarming quotes from leaders

Preface. This is the sixth part of my book review of: Graff, G.M. 2018. Raven Rock: The Story of the U.S. Government's Secret Plan to Save Itself–While the Rest of Us Die. Simon & Schuster.

These are some of the things presidents, generals, and other leaders said in the book that struck me.

Alice Friedemann, author of "When Trucks Stop Running: Energy and the Future of Transportation" (2015, Springer) and "Crunch! Whole Grain Artisan Chips and Crackers". Podcasts: Derrick Jensen, Practical Prepping, KunstlerCast 253, KunstlerCast 278, Peak Prosperity, XX2 report



Truman stopped one Oval Office debate over civilian versus military control of the bombs cold, saying, “You have got to understand that this isn’t a military weapon. It is used to wipe out women and children and unarmed people, and not for military uses. So we have got to treat this differently from rifles and cannons and ordinary things like that.”


As Eisenhower said in one meeting, if war happened, the nation didn’t have “enough bulldozers to scrape the bodies off the street.”   “The destruction,” Eisenhower told his cabinet at one point, “might be such that we might have ultimately to go back to bows and arrows.”

Eisenhower gave a speech where he told the public that “The jet plane that roars over your head costs three-quarters of a million dollars. That is more money than a man earning ten thousand dollars every year is going to make in his lifetime. What world can afford this sort of thing for long? We are in an armaments race. Where will it lead us? At worst to atomic warfare. At best, to rob every people and nation on earth of the fruits of their own toil. Every gun that is made, every warship launched, every rocket fired signifies, in the final sense, a theft from those who hunger and are not fed, those who are cold and are not clothed.” This speech became one of the best known of the Cold War.

Eisenhower, during an NSC meeting, said that postwar planning was useless. None of the world’s nations would exist as we knew them, he argued, let alone be able to rise to the occasion of building a postwar peace. After a nuclear war, every single nation, including the United States, would emerge with a dictatorship.


Kennedy never forgot the impression the nuclear drill made, and called it “chilling.” We were a nation preparing for our own destruction.

The idea of community shelters clashed with Republican Nelson Rockefeller’s push for private home shelters. Rockefeller bordered on the obsessive about civil defense—he’d led a study panel in 1958 that pushed for shelters and had since adopted his own rhetoric. He had shelters built at the New York governor’s mansion and his own Fifth Avenue residence. He proselytized every chance he could. Indian prime minister Jawaharlal Nehru, after meeting the governor during a visit to New York, remarked: “Governor Rockefeller is a very strange man. All he wants to talk about is bomb shelters. Why does he think I am interested in bomb shelters?”

When the Kennedy administration had pushed General Power to modify the existing “overkill” strategy to focus solely on military targets, Power had objected: “Why are you so concerned with saving their lives? The whole idea is to kill the bastards.” His next conclusion had abruptly ended the discussion: “At the end of the war, if there are two Americans and one Russian, we win!”


As the noon hour passed on January 20, 1969, Johnson relaxed. “When Richard Nixon took the oath,” President Johnson said later, “the greatest burden lifted from me that I have ever carried in my life.” As he explained, “Never a day went by that I wasn’t frightened or scared that I might be the man that started World War III.”


FEMA official William Chipman had optimistically pointed to the experience of Europeans during the bubonic plague, which had wiped out a third of the population during the Middle Ages. “It was horrifying at the time, and yet six or eight years later, not only had English society rebounded but, by God, those people went out on an expeditionary force to France,” he explained. What he called the “post-attack United States” would, with time, resemble the pre-attack United States and “eventually” even restore traditional institutions and a democratic government: “As I say, ants will eventually build another anthill.”

T.K. Jones, the deputy undersecretary of defense for strategic and nuclear forces, brushed off concerns about the threat of war with the Soviet Union: “Everybody’s going to make it if there are enough shovels to go around. Dig a hole, cover it with a couple of doors, and then throw three feet of dirt on top. It’s the dirt that does it.” Jones believed nuclear war was not only survivable but that if we were prepared for it, destruction would be very limited: “With protection of people only, your recovery time to prewar GNP levels would probably be six or eight years. If we used the Russian methods for protecting both the people and the industrial means of production, recovery time could be two to four years.”

Around the same time, an official in the Office of Civil Defense wrote, “A nuclear war could alleviate some of the factors leading to today’s ecological disturbances that are due to current high-population concentrations and heavy industrial populations.”




Part 5 Raven Rock. Hidey holes for government and military officials to carry on democracy after nuclear war destroys the planet

Preface. This is the fifth part of my book review of: Graff, G.M. 2018. Raven Rock: The Story of the U.S. Government's Secret Plan to Save Itself–While the Rest of Us Die. Simon & Schuster. There are so many doomsday shelters listed in this book that I gave up trying to list all of them, and there must be dozens if not hundreds not in the book because they're Top Secret.

I’m interested in the government’s plans for a nuclear war because I have always assumed the government would have plans for the permanent emergency of declining fossil fuels. After reading this book, I doubt it. If they wouldn’t try to save the public for just two weeks after a nuclear war, they certainly aren’t planning for peak oil, climate change, or everything else for that matter. On the other hand, there aren’t any solutions. But I’d hoped they’d soften it a bit with rationing plans, preventing mass migrations, distributing food, and so on. After all, in the 1980s, when the government believed Peak Oil had arrived, there was a rationing plan (see my summary here).

Excerpts / summary of the book:

Only top government officials, staff, and some private experts were to be saved after nuclear doomsday, deep underground.  The public would be on their own, though theoretically we could all be saved — the National Speleological Society estimated the nation possessed enough caves to protect all Americans.

Underground shelters would help some, but the AEC concluded that even those who made it to a shelter would suffer a sorry fate. “When the survivors emerged from hiding, they would wander helplessly through a useless city.” Congressman Chet Holifield proposed publicly creating an alternate seat of government to ensure that the “nerve center of our nation” couldn’t be “paralyzed” by a Soviet atomic bomb.

In addition to the bunkers, a great deal has also been spent on infrastructure such as massive communication systems, Marine helicopters, Air Force One, armored limousines, and screaming motorcades.

Many of those who were guaranteed a slot in a bunker refused, including all of the presidents, the Supreme Court, House Speaker Tip O’Neill, national security advisor Zbigniew Brzezinski, and countless people from all agencies, since their families weren’t allowed to accompany them.

Many staff had no idea who would be evacuated, because having such a list was seen as a security risk, making it easier for an enemy to know whom to target. And in fact, private sector workers often had priority over government workers.

On top of that, in the annual drills, the Pentagon or White House official playing the role of the president never pressed the button, choosing instead, even after an enemy missile strike, to watch the United States be obliterated. And this despite everyone knowing it was a drill and that nothing would happen.

In the real world I think this is even more likely to happen. It takes a submarine missile launched off the Atlantic coast only about 14 minutes to hit its target, not enough time for anyone to make a decision, especially since there have been an alarming number of false alarms in the past.
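A back-of-envelope sketch of that warning time. Both figures below are my illustrative assumptions, not numbers from the book:

```python
# Rough flight-time estimate for a submarine-launched missile.
# Both numbers are assumptions for illustration, not from the book.
range_km = 2000        # assumed distance from a sub off the Atlantic coast to its target
avg_speed_km_s = 2.4   # assumed average speed over the whole flight

flight_time_min = range_km / avg_speed_km_s / 60
print(f"~{flight_time_min:.0f} minutes of warning")  # ~14 minutes
```

Any similar combination of launch distance and average speed lands in the same ballpark: far too little time for a considered decision.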

Here’s a partial list of bunkers. They’re well known to the Soviets and Chinese, and most would not survive a direct nuclear hit, yet the government is still spending billions on them every year.

After 9/11, most of the bunkers got bigger and better, and the communications and transportation to get government employees to shelters were improved as well.

Not all agencies were created equal: the USDA got to save 62 people, including 3 from the Forest Service to work on fires in rural areas, 3 from the Food and Nutrition Service to oversee distribution of USDA-donated foods and emergency food stamps, and 2 from the Soil Conservation Service to work on the radiological contamination of water and soil.

Department of the Interior: grounds of a former college in Harpers Ferry.

Federal Reserve: Mount Pony, 70 miles south of Washington. Employees would share their bunker with four billion dollars in cash to provide money, credit, and liquidity in the aftermath. The $2 bill introduced in 1976 was so unpopular that many of them also ended up in emergency bunkers. Each of the 12 regional Federal Reserve branches also had a relocation facility. Gordon Grimwood, the Fed’s emergency planning officer, said he couldn’t guarantee their plans would work; he feared that after a nuclear war everyone would take to the hills and we’d be back to tribal warfare, with no hope for national survival and recovery.

Intelligence agencies: Peters Mountain, Charlottesville, Virginia.

Not all agencies were assigned to bunkers. A large facility was set up at a 6,000-acre USDA cattle research station with 58 buildings, 75 miles from D.C. in Front Royal, Virginia. This was where 1,200 State Department employees would go, or to nearby motels and apartments.

For a while bunker fever reigned, with at least 58 federal relocation sites for civilian agencies, and probably hundreds more that are still considered top secret.

One of the best of the hundreds of retreat bunkers was Mount Weather, dug out of a mountain and protected by nearly a quarter mile of granite. This was where the top officials of the federal departments (Agriculture, Commerce, HEW, HUD, Interior, Labor, State, Transportation, and Treasury) and federal agencies (the Postal Service, FCC, Federal Reserve, Selective Service, Federal Power Commission, Civil Service Commission, and Veterans Administration) would go. There were tensions over how many people each agency got to send: the Bureau of the Budget initially tried to lay claim to 400 of the 1,900 available spots at Mount Weather, but didn’t succeed.

Equally if not more impressive were two other mountain bunkers, Raven Rock and Cheyenne Mountain.

The irony is that none of these three massive under-mountain bunkers was likely to survive a nuclear attack, and the Soviets certainly knew of their existence.

Even cities got in on the craze: Portland, Oregon, built a $670,000 bunker on top of Kelly Butte, about six and a half miles east of downtown. So did some corporations: AT&T had an underground bunker in Netcong, New Jersey, and Westinghouse Electric, based in Pittsburgh, kept its own relocation facility in an old limestone mine.

The Greenbrier luxury resort in White Sulphur Springs was where 1,000 members of Congress and staff were slated to go. But telling all 535 members of Congress the evacuation location was seen as an unnecessary security risk; the Greenbrier was not built to withstand a direct attack, so if its location became publicly known, its security would be ruined and its safety nullified. Thus, the Office of Defense Mobilization decided Congress simply wouldn’t be told in advance where to evacuate in an emergency. Later drills made clear that few if any members of Congress were likely to be told about their hidey-hole, and it was even less likely they’d be able to get there via a six-hour train ride or in their cars.



Part 4 Raven Rock. The government abandons plans to aid the public; only the government is to survive


Preface. This is the fourth part of my book review of: Graff, G.M. 2018. Raven Rock: The Story of the U.S. Government's Secret Plan to Save Itself–While the Rest of Us Die. Simon & Schuster. Clearly, if the U.S. government abandoned plans to help Americans survive for two weeks until they could emerge from their bomb shelters, there are certainly no plans to help the public survive peak oil and peak everything else, a permanent emergency.

Excerpts and summary of civilian aid:

Even as the government publicly portrayed nuclear war as survivable and not all that bad, the sentiment among the nation’s leaders was that the public had virtually no hope of surviving.

As Eisenhower said in one meeting, if war happened, the nation didn’t have “enough bulldozers to scrape the bodies off the street.”

Eisenhower and VP Richard Nixon resisted an expansion of the national shelter program.  Nixon argued that “If 40 million were killed, the United States would be finished.” He did not believe the country would survive such a disaster. Eisenhower seemed to agree, later writing, “so far as I am personally concerned, I am not sure whether I would really want to be living if this country of ours should ever be subjected to a nuclear bath.”

After the 1962 Cuban Missile Crisis, the plans for a widespread public civil defense effort were abandoned when it became clear that they would be futile. Instead, the only priority would be saving the government.

Kennedy tried for a while to enact an ambitious fallout shelter plan, but that ended with Vietnam and the public’s distrust of government. By 1979 the government had abandoned the pretense of providing civilian aid; across the country, the stockpiles and shelters of the Eisenhower and Kennedy years were mildewing, forgotten and ignored.

In 1979, New York City abandoned its efforts to give away the remaining supplies socked away inside its 10,800 fallout shelters and began hiring contractors at $38 a ton to transport the stockpiles to landfills.

What began in the 1950s as an all-encompassing, nationwide push for civil defense, to ready every household and workplace, every village and city, for a Soviet attack, shrank decade by decade, until by 9/11, there was just one aspect of the grand plans left in operation: the evacuation of the nation’s leaders to bunkers hidden under mountains.

Efforts to protect civilian life fell by the wayside and a fourth and final grim phase of nuclear reality settled over the United States. The grandiose plans gradually shrank to just a single, all-consuming governmental goal: protect the idea of a democratic leadership and preserve the National Command Authorities, that virtually never-ending succession line of officials authorized to launch the nation’s nuclear weapons.

Rather than remake the entire society, the government would protect itself and let the rest of us die. That way, there was a chance that democracy could one day again blossom.



Venezuela collapse: looting, hunger, blackouts

[Venezuela is experiencing a double whammy of drought and low oil prices, which has led to blackouts and an inability to import food. It just keeps getting worse and worse.]

July 16, 2018. Keith Johnson. How Venezuela Struck It Poor.

…”Venezuela’s murder rate, meanwhile, now surpasses that of Honduras and El Salvador, which formerly had the world’s highest levels, according to the Venezuelan Violence Observatory. Blackouts are a near-daily occurrence, and many people live without running water. According to media reports, schoolchildren and oil workers have begun passing out from hunger, and sick Venezuelans have scoured veterinary offices for medicine. Malaria, measles, and diphtheria have returned with a vengeance, and the millions of Venezuelans fleeing the country — more than 4 million, according to the International Crisis Group — are spreading the diseases across the region, as well as straining resources and goodwill.”

…”Thanks to their geology, Venezuela’s oil fields have enormous decline rates, meaning the country needs to spend more heavily than other petrostates just to keep production steady. ”

2017-10-22 Oil Quality Issues Could Bankrupt Venezuela. The next few weeks will be crucial for Venezuela as it struggles to meet a huge stack of debt payments. Reports that the nation’s oil is deteriorating in quality raise a new cause for concern for the crumbling South American nation. Reuters reported that its oil shipments are “soiled with high levels of water, salt or metals that can cause problems for refineries,” which has led to $200 million in cancellations of oil contracts, making Venezuela even less able to make debt payments, since oil is the only source of revenue barely keeping the nation afloat. Many experienced oil workers have fled the country to find food and escape violence. Because of these problems, and Trump-imposed sanctions, U.S. imports have dropped from roughly 700,000 barrels per day to 250,000 bpd.

2017-5-2 Venezuela Is Heading for a Soviet-Style Collapse. A few lessons from the last time an oil economy crashed catastrophically

2017-2-27 ASPO Peak Oil Review: A new survey shows that 75% of Venezuelans may have lost an average of 19 pounds in the last year as widespread food shortages continue. Nearly a third of the population are now eating two meals a day or less. The survey also shows that the average shopper spends 35 hours a month waiting in line to buy food and other necessities. A sense of hopelessness has engulfed the country, and most no longer have an incentive or the strength to protest against the government and its policies as was happening two years ago. Government roundups of opposition politicians continue. Venezuela is clearly well on its way to becoming a failed state.

2016-11-1 Venezuela is telling hungry city dwellers to grow their own food. Washington Post

2016-10-21 Planet Money Podcast #731: How Venezuela Imploded

2016-8-23 Venezuela’s latest response to food shortages: Ban lines outside bakeries

2016-05-04 Hungry Venezuelans Hunt Dogs, Cats, Pigeons as Food Runs Out. Economic Crisis and Food Shortages Lead to Looting and Hunting Stray Animals  

Sabrina Martín. April 27, 2016. Looting On the Rise As Venezuela Runs Out of Food, Electricity. PanAmPost.

Food Producers Alert They Have Only 15 Days Left of Inventory amid Rampant Inflation

“Despair and violence is taking over Venezuela. The economic crisis sweeping the nation means people have to withstand widespread shortages of staple products, medicine, and food.  So when the Maduro administration began rationing electricity this week, leaving entire cities in the dark for up to 4 hours every day, discontent gave way to social unrest.

On April 26, people took to the streets in three Venezuelan states, looting stores to find food.

Maracaibo, in the western state of Zulia, is the epicenter of thefts: on Tuesday alone, Venezuelans raided pharmacies, shopping malls, supermarkets, and even trucks with food in seven different areas of the city.

Although at least nine people were arrested, and 2,000 security officers were deployed in the state, Zulia’s Secretary of Government Giovanny Villalobos asked citizens not to leave their homes. “There are violent people out there that can harm you,” he warned.

In Caracas, the Venezuelan capital, citizens reported looting in at least three areas of the city. Twitter users reported that thefts occurred throughout the night in the industrial zone of La California, Campo Rico, and Buena Vista.  The same happened in Carabobo, a state in central Venezuela.

Supermarket employees from Valencia told the PanAm Post that besides no longer receiving the same amount of food as before, they must deal with angry Venezuelans who come to the stores only to find out there’s little to buy.

Purchases in supermarkets are rationed through a fingerprint system that does not allow Venezuelans to acquire the same regulated food for two weeks.

Due to the country’s mangled economy, millions must stand in long lines for hours just to purchase basic products, which many resell for extra income, as the country’s minimum wage is far from enough to cover a family’s needs.

On Wednesday, the Venezuelan Chamber of Food (Cavidea) said in a statement that most companies only have 15 days worth of stocked food.

According to the union, the production of food will continue to dwindle because raw materials as well as local and foreign inputs are depleted.

In the statement, Cavidea reported that they are 300 days overdue on payments to suppliers and it’s been 200 days since the national  government last authorized the purchase of dollars under the foreign currency control system.

The latest Survey of Living Conditions (Encovi) showed that more than 3 million Venezuelans eat only twice a day or less. Rampant inflation and low wages make it increasingly difficult for people to afford food.

“Fruits and vegetables have disappeared from shopping lists. What you buy is what fills your stomach more: 40 percent of the basic groceries is made up of corn flour, rice, pasta, and fat”.

But Venezuelans cannot count on even that incomplete diet, because those food products are hard to come by. Since their prices are controlled by the government, they are scarce and in high demand.

The survey also notes the rise of diseases such as gastritis, with an increase of 25 percent in 2015, followed by poisoning (24.11 percent), parasites (17.86 percent), and bacteria (10.71 percent).

The results of this study are consistent with the testimony of Venezuelan women, who told the PanAm Post that because “everything is so expensive” that they prefer to eat twice a day and leave lunch for their children. That way they can make do with the little portions they can afford.”



Part 3 Raven Rock. The government’s plans for after a nuclear holocaust

This is the third part of my book review of: Graff, G.M. 2018. Raven Rock: The Story of the U.S. Government's Secret Plan to Save Itself–While the Rest of Us Die. Simon & Schuster.

This book stresses that a full nuclear attack might destroy civilization at best and drive humans extinct at worst. So to spend hundreds of billions to strike back with even more nuclear weapons, as well as to continue Democracy and the Non-negotiable American Way of Life, is bizarre, preposterous. America will have gone from amber waves of grain, a fruited plain, and purple mountain majesties to a vast radioactive scorched earth and nuclear winter.

The plans to carry on as usual are the Continuity of Government (COG), Continuity of Operations (COOP), and the most top secret level, Enduring Constitutional Government (ECG).

Here are just a few of the responsibilities government agencies would have after a nuclear apocalypse. They range from the Truman administration to today, so some of these plans may have been discontinued or the responsibilities transferred to the Department of Homeland Security and FEMA.

Defense Resources Act: This lays out a new structure and roles for how the government would function after a national emergency that suspends the Constitution and Bill of Rights. It includes rationing, price controls, media censorship, property confiscation, the creation of emergency government agencies, the seizure of private industries, and other powers considered too politically toxic ever to be discussed in peacetime.

Department of Agriculture: distribute rationed food.

Department of Health, Education, and Welfare: handle civilian refugees from the war zone

Emergency Food Agency: vast powers to dictate the production and distribution of foodstuffs

Emergency Transport Agency: seize control of the nation’s entire merchant vessel fleet and all of the nation’s highways, dictating who could drive on what roads when. In addition, local governments could supplement local stockpiles from any retail store. And much more: the plans were hundreds of pages long.

FBI: attorneys general always had an emergency briefcase nearby full of documents about the Emergency Detention Program, which included sections on the “Alien Control Program,” “Internment of Diplomatic Personnel of Enemy Nations Program,” “Handling of Dangerous Non-enemy Aliens Attached to International Organizations,” and much more. These documents gave the FBI the power to arrest, search, detain, and deport thousands of people.

Federal Civil Defense Administration (now FEMA): Although the FCDA may no longer exist, perhaps its solution to the problem of not having enough medical aid to support attacked areas for more than three days is still in place. It believed that unscathed states needed to be forced to help, by military force if necessary.

Federal Communications Commission (FCC): seize and shut down all the nation’s broadcasters.

FEMA, now part of Homeland Security, would help the nation rebuild from regional bunkers in places like Denton, Texas, and Maynard, Massachusetts. Although the public believes its main role is aid after natural disasters, much of its budget goes to COG, Continuity of Government: tracking the 20 government officials in the presidential line of succession around the clock, and the helicopters and staff to whisk them away.

Highway Transport Division: coordinate the nation’s 10 million commercial vehicles in an attack.

Interior Department: ensure fuel supplies for attacked areas and restore electrical service to damaged cities.

Internal Revenue Service: After studying from the 1960s to the 1980s how taxes could be levied, the IRS and Treasury concluded it would be very hard to collect taxes after a Soviet strike. Aside from taxpayers being dead, finding W-2s for those still alive would be impossible, and it would be unfair to assess taxpayers on the pre-attack value of their property, much of it reduced to ashes or otherwise damaged. The IRS calculated that over $2 trillion in property might disappear after a large strike, and recommended that the government assume the underlying mortgages to keep the banking system from collapsing. In the end, the only way to raise taxes would be a national sales tax as high as 24 percent.

National Park Service: run refugee camps.

Pentagon and FEMA: alert the nation if a Soviet attack is underway, though as a civil defense expert pointed out, the people who hear the warnings will run into a building and still be turned into sand seconds later. FEMA attempted to rate how disastrous the aftermath would be and estimated that about one out of six Americans would face the worst scenario, known as BOS #9, HIRAD HIFIRE: severe radiation and uncontrollable fires.

Post Office: register the nation’s dead, and develop a national registration program to determine who was still alive, locate refugees, and register foreign aliens so government agencies could round up anyone who might be subversive. In 2009 it was assigned the additional duty of distributing vaccines after a biological attack.

Private citizens: during the Eisenhower administration nine men were chosen to remake the private sector by seizing assets across the nation and starting new bureaucracies to control every aspect of American life until a recovery was made.

Selective Service System: mail letters after a nuclear attack so the military could draft 84,600 new recruits within 30 days.

Wartime Information Security Program (WISP), formerly the Office of Censorship: this agency would have a Press Division to oversee censorship of newspapers, magazines, books, and other publishers; a Broadcast Division to censor radio and TV; a Telecommunications Division to censor telephone and telegraph systems; a Postal and Travelers Division to censor the mail; and a Special Analysis Division, which would be in charge of collecting and determining what information should be censored in the first place. One of the Watergate burglars, James W. McCord Jr., worked with the Office of Emergency Planning to draft a national watch list of troublesome Americans.



Part 2 Raven Rock. The U.S. government’s plans to save civilians from nuclear war

This is city planner Oscar Newman’s probably tongue-in-cheek vision of an enormous spherical underground replica of Manhattan located thousands of feet below the city itself, to be switched into action in the event of a nuclear attack. Source:

Graff, G.M. 2018. Raven Rock: The Story of the U.S. Government's Secret Plan to Save Itself–While the Rest of Us Die.

At the outset of the Cold War, scientists proposed that to prevent nuclear wars nationalism had to go away, which could only be done by having the entire world come under one government that owned all weaponry. That idea was agreed to be impractical.

So the next Big Idea was to go underground and bury urban populations inside mountains linked by subterranean railroads. Life magazine gave this a positive spin: “Consider the ant, whose social problems much resemble man’s. Constructing beautiful urban palaces and galleries, many ants have long lived underground in entire satisfaction.” The military jumped on the underground bandwagon: “After all, sunlight isn’t so wonderful. You have to be near a window to benefit by it. With fluorescent fixtures, you get an even light all over the place.”

Edward Teller, father of the H-bomb, tried to convince the Kennedy White House that American life should be moved underground, especially public buildings like schools, libraries, and museums, since they could help save lives and preserve “some of the chief reasons for living.”

If not underground, then cities needed to be much smaller and dispersed into narrow, low-density ribbons; splitting them up would cost less than going to war. MIT professors proposed encircling urban centers with “life belts” of large paved beltways to speed evacuations. The land around the “life belt” roads would be reserved as parks, ready to host large tent cities erected after a nuclear war to shelter refugees.

Real estate agents took advantage of bomb fears, using titles such as “Small Farms—Out Beyond Range of Atomic Bombs.”

Cities made their own attempts to cope:

  • New York City distributed two million dog tags to schoolchildren to help officials both identify bodies and reunite separated families after an attack.
  • Chicago recommended parents tattoo children’s blood type under their armpit—but not on the arm itself in case it was blown off—which would help speed blood transfusions to treat radiation poisoning. The city also mapped which students lived close enough to school to make it home versus who should remain at school to experience nuclear war with their teachers.
  • Jacksonville, Florida distributed the best escape routes for those seeking to flee into Georgia.

At the state level, Kansas officials calculated they could assemble two million pounds of food that might last two months. In addition, citizens could consume wildlife: officials estimated that there were 11 million “man-days” of rabbits, 10 million “man-days” of wild birds, five million “man-days” of edible fish, and nearly 20 million “man-days” of meat in residential pets. The state also planned to confiscate household vitamins for the good of everyone and ration the state’s 28-day supply of coffee.
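The quoted wildlife figures sum to roughly 46 million “man-days.” As a rough back-of-envelope sketch (my own arithmetic, not the book’s; the ~2.18 million figure for Kansas’s 1960 population is my assumption), that works out to about three weeks of food per Kansan:

```python
# Back-of-envelope using the man-day figures quoted above.
# The 1960 Kansas population (~2.18 million) is an assumption, not from the book.
man_days = {
    "rabbits": 11_000_000,
    "wild birds": 10_000_000,
    "edible fish": 5_000_000,
    "residential pets": 20_000_000,
}

total = sum(man_days.values())      # 46,000,000 man-days of food
population = 2_180_000              # approximate 1960 census figure (assumption)
days_per_person = total / population

print(f"{total:,} man-days ≈ {days_per_person:.0f} days of food per Kansan")
```

In other words, the state’s wildlife-and-pets inventory would stretch the two-month food supply only modestly.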

To help cities, the federal government:

  • advised cities to create mortuary zones to identify the bodies and inter them, using Post Office mail trucks to carry casualties
  • trained hundreds of police officers across the country in “Emergency Traffic Control” to aid urban evacuations, built special civil defense rescue trucks, and planned in 1958 to distribute radiology monitoring kits to all 15,000 American high schools, so that science classes could begin to teach students how to monitor fallout levels after an attack.

The federal government also made “survival crackers” with 3 million bushels of wheat, enough for Nabisco and other companies to make 150 million pounds of crackers. “This is one of the oldest and most proven forms of food known to man,” explained Deputy Assistant Secretary of Defense Paul Visher. “It has been the subsistence ration for many portions of the earth for thousands of years. Its shelf life has been established by being edible after 3,000 years in an Egyptian pyramid.” The specially made bulgur crackers were indeed nearly unchanging: A 50-page USDA report on the crackers and their chemistry found merely a “discernible but inconsequential decrease” in flavor after 52 months of storage, though the highest compliment ever paid was that they tasted like cardboard. (So for your next natural disaster or Apocalypse, you might want to try making my crackers instead, from my book “Crunch! Whole grain artisan chips and crackers”.)

The government encouraged people to build their own shelters, and provided free plans for how to construct them, plus examples of shelters were shown at conventions, state fairs, and local fairs. But the truth is these shelters would do no good.  Cities would be leveled after a hydrogen bomb was dropped and the only shelters that would survive would need to be hundreds or thousands of feet underground.

The government encouraged households to keep a week of food on hand, and much like Costco sells prepper food today, decades ago Sears, Roebuck & Co. exhibited “Grandma’s Pantry” in 500 of its stores, while women’s magazines ran articles titled “Take these steps now to save your family”.

Many Americans remained stubbornly disinterested in nuclear war preparation because they didn’t like to be reminded regularly of how tenuous their daily existence was in the age of nuclear weapons. It was simply difficult to keep up the fear, hard to keep up the psychological pressure that the world might end at any moment.

But plenty of patriotic Americans were paying attention.  One program at its peak in 1956 had 380,000 volunteers on the lookout for Russian planes at 17,000 posts, many staffed around the clock.

In the late 1950s the federal government had stockpiled over 3.2 million square feet of burn dressings, paper blankets, 307,000 litters, and 1,400 gas masks, most of them stored in facilities ten to 50 miles from a major target.

For a brief time, there was a shelter craze. IBM promised its 60,000 employees loans to build shelters and sold them construction materials at cost. Jails gave inmates one day off their sentences for every day they lived in an underground shelter, and one experiment recruited 7,000 volunteers who lived an average of three days each in a shelter.

But then the public sobered up and realized that only a few people had shelters, and a debate arose about the morality of the shelter haves versus the have-nots.

Those with shelters, such as Charles Davis in Austin, Texas said he was prepared to defend his shelter from the inside and recapture it if others got there first. Pointing to his cache of five guns and a four-inch-thick door, he said, “This isn’t to keep radiation out, it’s to keep people out.”

The very few homeowners nationwide who had taken the advice to build a shelter found themselves quite popular among friends and neighbors: One Valley Forge, Pennsylvania, shelter owner reported receiving half-joking offers of up to $16,000 from neighbors eager to share his shelter. In the back of everyone’s mind, though, was the “Gun Thy Neighbor” debate from the year before. If people were willing to pay money to reserve a spot in a neighbor’s shelter, what panic would possess them in a real attack?

Upping the game, in 1961 Las Vegas’s civil defense leader, J. Carlton Adair, suggested forming a 5,000-man local militia to repel by force the California refugees who would likely pour across the border after a nuclear attack “like a swarm of locusts.” Along the same lines, Beaumont, California, between Los Angeles and Phoenix, encouraged households to pack pistols in their nuclear survival kits to fend off the 150,000 Los Angeles refugees who would pass through Riverside County as they fled.

These morality debates made many Americans realize that they’d be on their own in an attack. The government had no plans for sheltering the entire country. One civil defense official said defensively, “I do not think it is the government’s responsibility to take care of you from the minute you’re born until you die.”

Not that anyone would get enough warning of an attack to reach their shelter or evacuate. The warning sirens were ineffective in the Kennedy years, and an attempt to remedy that with $10 home buzzers in a multi-million-dollar program didn’t work out, since the buzzer couldn’t give homeowners any specifics about the attack’s timing, duration, or scale, or any instructions on what to do afterward. The next attempt, another $10 device to be installed in TV sets, was terminated after the Watergate scandal: the public had lost trust in the government and wasn’t about to put something controlled by the government in their TVs.

One of the last plans considered before the government abandoned trying to save the public was evacuation. The plan for New York City, for example, included moving the 4.8 million residents who were carless via subway, train, ferry, barge, cruise ship, aircraft, and bus. In Washington, DC, different sections of the city were each assigned routes to designated host areas, meaning hundreds of thousands of refugees would descend on remote country towns that had neither shelters nor food available. But it didn’t take long to see these plans couldn’t possibly work, so they were abandoned.

If there’s no plan to help the public survive the two weeks of shelter needed from radiation, then there’s certainly no plan for peak oil and other energy and resource crises – unless it’s to stop mass migrations, which would probably happen at the state or local level, not the federal.

After our house burned down, I was quite interested in emergency planning, and managed to talk to one of the executives in the state of California. I asked if they had food, tents, and other emergency supplies stockpiled. No, she told me: that would open a political can of worms and expose the state to criticism that it wasn’t distributing its stockpile fairly.

Homeland Security and FEMA aren’t preparing our nation for the Permanent Emergency – and how could they, and why would they, if they abandoned plans to save the public for just two weeks (after which it was believed to be safe to emerge)? You’re on your own, folks!

Alice Friedemann, author of “When Trucks Stop Running: Energy and the Future of Transportation” (2015, Springer) and “Crunch! Whole Grain Artisan Chips and Crackers”. Podcasts: Derrick Jensen, Practical Prepping, KunstlerCast 253, KunstlerCast 278, Peak Prosperity, XX2 report


Part 1 Intro. Raven Rock: The Story of the U.S. Government’s Secret Plans to Save Itself After a Nuclear War and Let the Rest of Us Die

This is a book review of Graff’s 2018 “Raven Rock: The Story of the U.S. Government’s Secret Plan to Save Itself–While the Rest of Us Die”.

This book exposes the government’s plans, begun in the Cold War and continuing today, to carry on government and democracy after a nuclear war with Russia. But the rest of us, civilians, would not be saved.

This is a patently ridiculous plan because our nation, and human life itself, would be destroyed, as Graff explains below:

“The result of the president launching the nation’s bombers, submarines, and missiles against Russia would almost surely destroy not only both countries but all human life on the entire planet. There are some 30,000 nuclear weapons in the U.S. arsenal, the rough equivalent, one CIA director calculated, of 55 billion traditional 500-pound TNT bombs from World War II. Enough, as he said, to carpet each state in the union with a billion bombs—and still have five billion bombs left over.”
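The figures in that quote hang together arithmetically. A quick sanity check (my own back-of-envelope sketch, not from the book; treating each 500-pound bomb as a quarter ton of TNT):

```python
# Sanity check of the CIA director's back-of-envelope figures quoted above.
WARHEADS = 30_000
EQUIV_BOMBS = 55_000_000_000       # equivalent 500-pound TNT bombs
BOMB_TNT_TONS = 500 / 2000         # 500 lb expressed in tons of TNT (assumption)
STATES = 50
PER_STATE = 1_000_000_000          # "a billion bombs" carpeting each state

# Total yield and average per warhead
total_megatons = EQUIV_BOMBS * BOMB_TNT_TONS / 1_000_000
avg_kilotons = total_megatons * 1000 / WARHEADS

# Bombs remaining after carpeting all 50 states
leftover = EQUIV_BOMBS - STATES * PER_STATE

print(f"total yield ≈ {total_megatons:,.0f} megatons")       # ≈ 13,750 Mt
print(f"average per warhead ≈ {avg_kilotons:,.0f} kilotons") # ≈ 458 kt
print(f"left over after 50 states: {leftover:,}")            # 5,000,000,000
```

The leftover matches the quote exactly, and the implied average yield of roughly 458 kilotons per warhead is plausible for a Cold War era arsenal.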

In the initial plans that began during the Truman administration, civilians were to be saved, but as time went on it became clear this couldn’t be done (that will be the subject of a later post).

So if the government wasn’t planning on helping the American people survive for two weeks after a nuclear war, then Homeland Security and FEMA surely don’t have any plans to help us cope or survive the permanent emergency ahead of energy and resource decline, which they certainly know about as shown in my posts from the congressional record under menu item experts, government (and especially military).

After a nuclear war starts, only top government officials will be able to go to the deep mountain shelters built at a cost of hundreds of billions of taxpayer dollars. Those protected are mostly from the executive branch; it was decided not to tell Congress anything, since someone was likely to spill the beans. And the Supreme Court didn’t want a shelter, since the justices’ families couldn’t come with them (many other departments and people felt this way as well).

It’s known that tens if not hundreds of billions of dollars have been spent on shelters, evacuation aircraft and helicopters, vehicles, keeping bomb shelters running, communications systems, and much more, but since so much of the money is in the top secret black budget, it may never be known how much has been spent.

And what a waste of money! Chapter 17 has an amazing recounting of the utter fiasco that ensued after 9/11, much of it new information to me. All the plans made over five decades fell apart. Anything that could go wrong did: evacuation plans failed; no one could locate the 20 or so officials in the presidential line of succession in a decapitation event; there was miscommunication, or no communication at all; confusion reigned across government and military agencies; hardly anyone knew what the Continuity of Government (COG) plans were or how to activate them; all the top FEMA employees were at a conference in Montana and unable to help; and President Bush was unable to speak to the people, or anyone else for that matter, from the airplane he was flying in. It was such a mess that we’d all be dead now if this had been a nuclear war.

But were these crazy plans scrapped after 9/11? Heck no. If anything, the government is spending even more money on bomb shelters and on airplanes from which to run a nuclear war in the sky, since the U.S. land below will be ruined and devastated.

Here’s how the author describes his book:

“This book is meant to be the first definitive tour of the hidden architecture of the Cold War’s shadow government. It is a history of “how.” How nuclear war would have actually worked—the nuts and bolts of war plans, communication networks, weapons, and bunkers—and how imagining and planning for the impact of nuclear war actually changed the “why,” as leaders realized the horrors ahead and altered the course of the Cold War at several key points in response.

Since the Truman administration and hundreds of billions of dollars later, the federal government’s basic plan hasn’t changed a bit from the initial plan of run and hide in the Appalachian mountains of Virginia, Maryland, and West Virginia.

This story is a challenge to tell because much of the COG machinery remains shrouded in secrecy. Many of my requests to declassify even 50-year-old reports and memos were denied. The National Archives refused my request to release, for instance, four Kennedy-era memos from 1962 and 1963 dealing with civilian emergency planning—arguing that all four still remain so vital to national security that not a single word of them could be declassified.

The terrorist attacks of September 11, 2001, and the subsequent anthrax attack on the U.S. Congress restarted a focus on COG and COOP planning that continues to this day. Today, this secret world still exists, just beneath the surface of our country. In many ways, it’s actually more expansive, powerful, and capable today than it ever was during the twentieth century.

A generation before Ashton Carter took over the Pentagon as President Obama’s fourth defense secretary, he had helped design the E-4B “Doomsday plane” meant to help the commander-in-chief run a nuclear war from the sky—the same plane is still in use today to ferry defense secretaries around the world. It sits every day on a runway in Omaha, Nebraska, fully staffed, with its engines turning, ready to launch in just minutes and run a nuclear war from the sky.

Today, blue-and-gold Air Force helicopters still practice evacuating officials from Washington each day in the skies over the capital. Today, each time a major event like a presidential inauguration or State of the Union speech occurs, there’s still a “designated survivor” from the established line of succession, who skips the event and stays in a secure facility under guard until the event ends without incident.

Throughout the 1970s the Oak Ridge National Laboratory—working with OEP’s computer facility in Olney, Maryland—devoted significant resources to calculating which natural resources and infrastructure would likely survive various attacks; it estimated, for example, that 136 of the nation’s 224 oil refineries would be destroyed in a widespread nuclear attack—most of the rest, though, would be inoperable due to the loss of electricity.”




Book review of “White Trash. The 400-year untold history of class in America”

Preface. This book makes the case that the poor arrived 400 years ago, when America was first settled, and most of them never rose to the middle or upper classes because “land was the principal source of wealth, and those without any had little chance to escape servitude. It was the stigma of landlessness that would leave its mark on white trash from this day forward”. The poor and their descendants remained in the underclass for the most part, and still do today.

Britain saw sending the poor and criminals to distant lands as a good way to get rid of them. This was often done forcefully, as Bailyn’s book “The Barbarous Years: The Peopling of British North America: The Conflict of Civilizations, 1600-1675” explains in great detail.

English writer Richard Hakluyt (1553-1616) envisioned America as becoming a workhouse, “a place where the surplus poor, the waste people of England, could be converted into economic assets. The land and the poor could be harvested together, to add to—rather than continue to subtract from—the nation’s wealth. Among the first waves of workers were the convicts, who would be employed at heavy labor, felling trees and burning them for pitch, tar, and soap ash; others would dig in the mines for gold, silver, iron, and copper. The convicts were not paid wages. As debt slaves, they were obliged to repay the English commonwealth for their crimes by producing commodities for export. In return, they would be kept from a life of crime, avoiding, in Hakluyt’s words, being “miserably hanged,” or packed into prisons to “pitifully pine away” and die.”

“During the 1600s, far from being ranked as valued British subjects, the great majority of early colonists were classified as surplus population and expendable “rubbish,” a rude rather than robust population. The English subscribed to the idea that the poor dregs would be weeded out of English society in four ways. Either nature would reduce the burden of the poor through food shortages, starvation, and disease, or, drawn into crime, they might end up on the gallows. Finally, some would be impressed by force or lured by bounties to fight and die in foreign wars, or else be shipped off to the colonies. Such worthless drones as these could be removed to colonial outposts that were in short supply of able-bodied laborers and, lest we forget, young “fruitful” females. Once there, it was hoped, the drones would be energized as worker bees.”

“The colonists were a mixed lot. On the bottom of the heap were men and women of the poor and criminal classes. Among these unheroic transplants were roguish highwaymen, mean vagrants, Irish rebels, known whores, and an assortment of convicts shipped to the colonies for grand larceny or other property crimes, as a reprieve of sorts, to escape the gallows. Not much better were those who filled the ranks of indentured servants, who ranged in class position from lowly street urchins to former artisans burdened with overwhelming debts. They had taken a chance in the colonies, having been impressed into service and then choosing exile over possible incarceration within the walls of an overcrowded, disease-ridden English prison. Labor shortages led some ship captains and agents to round up children from the streets of London and other towns to sell to planters across the ocean—this was known as “spiriting.” Young children were shipped off for petty crimes. One such case is that of Elizabeth “Little Bess” Armstrong, sent to Virginia for stealing two spoons. Large numbers of poor adults and fatherless boys gave up their freedom, selling themselves into indentured servitude, whereby their passage was paid in return for contracting to anywhere from four to nine years of labor. Their contracts might be sold, and often were, upon their arrival. Unable to marry or choose another master, they could be punished or whipped at will. Owing to the harsh working conditions they had to endure, one critic compared their lot to “Egyptian bondage.” Discharged soldiers, also of the lower classes, were shipped off to the colonies.”

“At all times, white trash reminds us of one of the American nation’s uncomfortable truths: the poor are always with us. A preoccupation with penalizing poor whites reveals an uneasy tension between what Americans are taught to think the country promises—the dream of upward mobility—and the less appealing truth that class barriers almost invariably make that dream unobtainable.”

The Sydney Morning Herald in Australia reports that the top think tank in China is studying the book “White Trash” to understand Trump. What a great idea. This book presents a new understanding of our American history and how entrenched and denigrated the poor have always been, with little chance of emerging from poverty due to the class structure in the U.S.

What follows are the rest of my Kindle Notes.  Toss out everything you learned in school, it was Fake History.



Nancy Isenberg. 2016. White Trash.  The 400-Year Untold History of Class in America.  Penguin Books.

Beyond white anger and ignorance is a far more complicated history of class identity that dates back to America’s colonial period and British notions of poverty. In many ways, our class system has hinged on the evolving political rationales used to dismiss or demonize (or occasionally reclaim) those white rural outcasts seemingly incapable of becoming part of the mainstream society.  Their history starts in the 1500s, not the 1900s. It derives from British colonial policies dedicated to resettling the poor, decisions that conditioned American notions of class and left a permanent imprint. First known as “waste people,” and later “white trash,” marginalized Americans were stigmatized for their inability to be productive, to own property, or to produce healthy and upwardly mobile children—the sense of uplift on which the American dream is predicated. The American solution to poverty and social backwardness was not what we might expect. Well into the 20th century, expulsion and even sterilization sounded rational to those who wished to reduce the burden of “loser” people on the larger economy.

In Americans’ evolving attitudes toward these unwanted people, perhaps the most dramatic language attached to the mid-19th century, when poor rural whites were categorized as somehow less than white, their yellowish skin and diseased and decrepit children marking them as a strange breed apart. The words “waste” and “trash” are crucial to any understanding of this powerful and enduring vocabulary. Throughout its history, the United States has always had a class system. It is not only directed by the top 1 percent and supported by a contented middle class. We can no longer ignore the stagnant, expendable bottom layers of society in explaining the national identity.

The poor, the waste, the rubbish, as they are variously labeled, have stood front and center during America’s most formative political contests. During colonial settlement, they were useful pawns as well as rebellious troublemakers, a pattern that persisted amid mass migrations of landless squatters westward across the continent. Southern poor whites figured prominently in the rise of Abraham Lincoln’s Republican Party, and in the atmosphere of distrust that caused bad blood to percolate among the poorer classes within the Confederacy during the Civil War. White trash were dangerous outliers in efforts to rebuild the Union during Reconstruction; and in the first two decades of the twentieth century, when the eugenics movement flourished, they were the class of degenerates targeted for sterilization. The poor were not only described as waste, but as inferior animal stocks too.

As evidenced in the popularity of the “reality TV” shows Duck Dynasty and Here Comes Honey Boo Boo in recent years, white trash in the 21st century remains fraught with the older baggage of stereotypes of the hopelessly ill bred.

This book tells many stories, then. One is the importance of America’s rural past. Another, and arguably the most important, is the one we as a people have trouble embracing: the pervasiveness of a class hierarchy in the United States. It begins and ends with the concepts of land and property ownership: class identity and the material and metaphoric meaning of land are closely connected. For much of American history, the worst classes were seen as extrusions of the worst land: scrubby, barren, and swampy wasteland. Home ownership remains today the measure of social mobility.

British colonists promoted a dual agenda: one involved reducing poverty back in England, and the other called for transporting the idle and unproductive to the New World. After settlement, colonial outposts exploited their unfree laborers (indentured servants, slaves, and children) and saw such expendable classes as human waste. The poor, the waste, did not disappear, and by the early eighteenth century they were seen as a permanent breed. This way of classifying human failure took hold in the United States. Every era in the continent’s vaunted developmental story had its own taxonomy of waste people—unwanted and unsalvageable. Each era had its own means of distancing its version of white trash from the mainstream ideal.

Long before they were today’s “trailer trash” and “rednecks,” they were called “lubbers” and “rubbish” and “clay-eaters” and “crackers”—and that’s just scratching the surface.

Let us recognize the existence of our underclass. It has been with us since the first European settlers arrived on these shores. It is not an insignificant part of the vast national demographic today.

In grand fashion, promoters imagined America not as an Eden of opportunity but as a giant rubbish heap that could be transformed into productive terrain. Expendable people—waste people—would be unloaded from England; their labor would germinate a distant wasteland. Harsh as it sounds, the idle poor, dregs of society, were to be sent thither simply to throw down manure and die in a vacuous muck. Before it became that fabled “City upon a Hill,” America was in the eyes of 16th-century adventurers a foul, weedy wilderness—a “sink hole” suited to ill-bred commoners.

The idea of America as “the world’s best hope” came much later. Historic memory has camouflaged the less noble origins of “the land of the free and the home of the brave.”

Class is the most outstanding, if routinely overlooked, element in presuppositions about early settlement. Even now, the notion of a broad and supple middle class functions as a mighty balm, a smoke screen. We cling to the comfort of the middle class, forgetting that there can’t be a middle class without a lower.

It is only occasionally shaken up, as when the Occupy Wall Street movement of recent years shone an embarrassing light on the financial sector and the grotesque separation between the 1 percent and the 99 percent. And then the media giants find new crises and the nation’s inherited disregard for class reboots, as the subject recedes into the background again.

Above all, we must stop declaring what is patently untrue, that Americans, through some rare good fortune, escaped the burden of class that prevailed in the mother country of England. Far more than we choose to acknowledge, our relentless class system evolved out of recurring agrarian notions regarding the character and potential of the land, the value of labor, and critical concepts of breeding.  Embarrassing lower-class populations have always been numerous, and have always been seen on the North American continent as waste people.

Beyond the web of stories the founding generation itself wove, our modern beliefs have most to do with the grand mythmakers of the 19th century. The inspired historians of that period were nearly all New Englanders; they outpaced all others in shaping the historical narrative, so that the dominant story of origins worked in their favor. That is how we got the primordial Puritan narrative of a sentimental community and a commendable work ethic. Of course, the twin attributes of religious freedom and hard work erase from the record all those settlers who did not live up to these high ideals. The landless, the impoverished, the progenitors of future generations of white trash conveniently disappear from the founding saga.

The compression of history, the winnowing of history, may seem natural and neutral, but it is decidedly not. It is the means by which grade school history becomes our standard adult history. And so the great American saga, as taught, excludes the very pertinent fact that after the 1630s, less than half came to Massachusetts for religious reasons.

Americans do not like to talk about class. It is not supposed to be important in our history. It is not who we are. Instead, we have the Pilgrims (a people who are celebrated at Thanksgiving, a holiday that did not exist until the Civil War), who came ashore at Plymouth Rock (a place only designated as such in the late eighteenth century).

The quintessential American holiday was associated with the native turkey to help promote the struggling poultry industry during the Civil War. The word “Pilgrim” was not even popularized until 1794.

The 1607 founding of Jamestown may lack a national holiday, but it does claim a far sexier fable in the dramatic rescue of John Smith by the “Indian princess” Pocahontas. As the story goes, in the middle of an elaborate ceremony, the 11-year-old “beloved daughter” of “King” Powhatan rushed forward and placed her head over Smith, stopping tribesmen from smashing his skull with their clubs. A magical bond formed between the proud Englishman and the young naïf, cutting through all the linguistic and cultural barriers that separated the Old and New Worlds.

Scholars have debated whether the rescue of Smith ever took place, since only his account exists and its most elaborate version was published years after Pocahontas’s death. Smith was a military adventurer, a self-promoter, a commoner, who had the annoying habit of exaggerating his exploits. His rescue story perfectly mimicked a popular Scottish ballad of the day in which the beautiful daughter of a Turkish prince rescues an English adventurer who is about to lose his head. Though an Anglican minister presided over Princess Pocahontas’s marriage to the planter John Rolfe, one member of the Jamestown council dismissed her as the heathen spawn of a “cursed generation” and labeled her a “barbarously-mannered” girl. Even Rolfe considered the union a convenient political alliance rather than a love match.

The Pocahontas story requires the princess to reject her own people and culture. This powerful theme has persisted, as the historian Nancy Shoemaker observes, because it contributes to the larger national rationale of the Indians’ willing participation in their own demise. Yet this young girl did not willingly live at Jamestown; she was taken captive. In the garden paradise of early Virginia that never was, war and suffering, greed and colonial conquest are conveniently missing. Class and cultural dissonance magically fade from view in order to remake American origins into a utopian love story.

Here was England’s opportunity to thin out its prisons and siphon off thousands; here was an outlet for the unwanted, a way to remove vagrants and beggars, to be rid of London’s eyesore population. Those sent on the hazardous voyage to America who survived presented a simple purpose for imperial profiteers: to serve English interests and perish in the process. In that sense, the “first comers,” as they were known before the magical “Pilgrims” took hold, were something less than an inspired lot. Dozens who disembarked from the Mayflower succumbed that first year to starvation and disease linked to vitamin deficiency; scurvy rotted their gums, and they bled from different orifices. By the 1630s, New Englanders reinvented a hierarchical society of “stations,” from ruling elite to household servants. In their number were plenty of poor boys, meant for exploitation. Some were religious, but they were in the minority among the waves of migrants that followed Winthrop’s Arbella. The elites owned Indian and African slaves, but the population they most exploited were their child laborers. Even the church reflected class relations: designated seating affirmed class station.

Virginia was even less a place of hope. Here were England’s rowdy and undisciplined, men willing to gamble their lives away but not ready to work for a living. England perceived them as “manure” for a marginal land. All that these idle men understood was a cruel discipline when it was imposed upon them in the manner of the mercenary John Smith, and the last thing they wanted was to work to improve the land. All that would keep the fledgling colony alive was a military-style labor camp meant to protect England’s interests in the country’s ongoing competition with the equally designing Spanish, French, and Dutch governments. That a small fraction of colonists survived the first twenty years of settlement came as no surprise back home—nor did London’s elite much care. The investment was not in people, whose already unrefined habits declined over time, whose rudeness magnified in relation to their brutal encounters with Indians. The colonists were meant to find gold, and to line the pockets of the investor class back in England. The people sent to accomplish this task were by definition expendable.

So now we know what happens to our colonial history. It is whitewashed.

To put class back into the story where it belongs, we have to imagine a very different kind of landscape. Not a land of equal opportunity, but a much less appealing terrain where death and harsh labor conditions awaited most migrants.

A firmly entrenched British ideology justified rigid class stations with no promise of social mobility.

So, welcome to America as it was. The year 1776 is a false starting point for any consideration of American conditions. Independence did not magically erase the British class system, nor did it root out long-entrenched beliefs about poverty and the willful exploitation of human labor. An unfavored population, widely thought of as waste or “rubbish,” remained disposable well into modern times.

Whether barren or empty, uncultivated or rank, the land acquired a quintessentially English meaning. The English were obsessed with waste, which was why America was first and foremost a “wasteland” in their eyes. Wasteland meant undeveloped land, land that was outside the circulation of commercial exchange and apart from the understood rules of agricultural production. To lie in waste, in biblical language, meant to exist desolate and unattended; in agrarian terms, it was to be left fallow and unimproved.

Like other Englishmen of his day, Hakluyt equated wastelands with commons, forests, and fens—those lands that 16th-century agrarian improvers eyed for prospective profits. Wasteland served the interest of private owners in the commercial marketplace: the commons could be enclosed so that sheep and cattle grazed there; forests could be cut down for timber and cleared for settlements; fens or marshes could be drained and reconstituted as rich, arable farmland. It was not just land that could be waste. People could be waste too.

This brings us to our most important point of embarkation: Hakluyt’s America required what he classified as “waste people,” the corps of laborers needed to cut down the trees, beat the hemp (for making rope), gather honey, salt and dry fish, dress raw animal hides, dig the earth for minerals, raise olives and silk, and sort and pack bird feathers.  He pictured paupers, vagabonds, convicts, debtors, and lusty young men without employment doing all such work. The “fry [young children] of wandering beggars that grow up idly and hurtfully and burdenous to the Realm, might be unladen and better bred up.” Merchants would be sent to trade with the Indians, selling trinkets, venting cloth goods, and gathering more information about the interior of the continent. Artisans were needed: millwrights to process the timber; carpenters, brick makers, and plasterers to build the settlement; cooks, launderers, bakers, tailors, and cobblers to service the infant colony.

The bulk of the labor force was to come from the swelling numbers of poor and homeless. They were, in Hakluyt’s disturbing allusion, “ready to eat up one another,” already cannibalizing the British economy. Idle and unused, they were waiting to be transplanted to the American land to be better (albeit no more humanely) put to use.

This view of poverty was widely shared. One persistent project, first promoted in 1580 but never realized, involved raising a fleet of 100-ton fishing vessels manned by 10,000 men, half of whom were to be impoverished vagrants. The galley labor scheme was designed to beat the famously industrious Dutch at the fishing trade. Leading mathematician and geographer John Dee was another who imagined a maritime solution to poverty. In 1577, as the British navy expanded, he proposed converting the poor into sailors. Others wished for the indigent to be swept from the streets, one way or another, whether gathered up as forced laborers building highways and fortifications or herded into prisons and workhouses. London’s Bridewell Prison was chartered in 1553, the first institution of its kind to propose reformation of vagrants. By the 1570s, more houses of correction had opened their doors. Their founders offered to train the children of the poor to be “brought up in labor and work,” so they would not follow in the footsteps of their parents and become “idle rogues.”

As Hakluyt saw it, the larger reward would be reaped in the next generation. By importing raw goods from the New World and exporting cloth and other commodities in return, the poor at home would find work so that “not one poor creature” would feel impelled “to steal, to starve, and beg as they do.” They would prosper along with the growth of colonial trade. The children of “wandering beggars,” having been “kept from idleness, and made able by their own honest and easy labor,” would grow up responsibly, “without surcharging others.” Children who escaped pauperism, no longer burdens on the state, might reenter the workforce as honest laborers. The poor fry sent overseas would now be “better bred up,” making the lot of the English people better off, and the working poor more industrious. It all sounded perfectly logical and realizable.

Seeing the indigent as wastrels, as the dregs of society, was certainly nothing new. The English had waged a war against the poor, especially vagrants and vagabonds, for generations. A series of laws in the 14th century led to a concerted campaign to root out this wretched “mother of all vice.” By the 16th century, harsh laws and punishments were fixed in place. Public stocks were built in towns for runaway servants, along with whipping posts and cages variously placed around London. Hot branding irons and ear boring identified this underclass and set them apart as a criminal contingent. An act of 1547 allowed for vagrants to be branded with a V on their breasts and enslaved. While this unusual piece of legislation appears never to have been put into practice, it was nonetheless a natural outgrowth of the widespread vilification of the poor.

Slums enveloped London. As one observer remarked in 1608, the heavy concentrations of poor created a subterranean colony of dirty and disfigured “monsters” living in “caves.” They were accused of breeding rapidly and infecting the city with a “plague” of poverty, thus figuratively designating unemployment a contagious disease. Distant American colonies were presented as a cure. The poor could be purged. In 1622, the famous poet and clergyman John Donne wrote of Virginia in this fashion, describing the new colony as the nation’s spleen and liver, draining the “ill humors of the body . . . to breed good blood.” Others used less delicate imagery. American colonies were “emunctories,” excreting human waste from the body politic.

As masterless men, detached and unproductive, the vagrant poor would acquire colonial masters. For Hakluyt and others, a quasi-military model made sense. It had been used in Ireland. In the New World, whether subduing the Native population or contending with other European nations with colonial ambitions, fortifications would have to be raised, trenches dug, gunpowder produced, and men trained to use bows. Militarization served other crucial purposes. Ex-soldiers formed one of the largest subgroups of English vagrants. Sailors were the vagrants of the sea, and were often drawn into piracy. The style of warfare most common in the sixteenth century involved attacks on nearly impregnable fortifications, and required prolonged sieges and large numbers of foot soldiers. Each time war revived, the poor were drummed back into service, becoming what one scholar has called a “reserve army of the unemployed.”

The life of the early modern soldier was harsh and unpredictable. Disbanded troops often pillaged on their way home. In the popular literature of the day, soldiers-turned-thieves were the subjects of a number of racy accounts. John Awdeley’s The Fraternity of Vagabonds (1561) and others of its kind depicted the wandering poor as a vast network of predatory gangs. Ex-soldiers filled empty slots in the gangs as “uprightmen,” or bandit leaders.  Sending veteran soldiers and convicts to America would reduce crime and poverty in one masterstroke.

When Jamestown, the English outpost along the Chesapeake Bay, was finally founded in 1607, the hardships its settlers experienced proved the general flaw in Hakluyt’s blueprint for creating real-life colonies. Defenders of the Virginia Company of London published tracts, sermons, and firsthand accounts, all trying to explain away the many bizarre occurrences that haunted Jamestown. Social mores were nonexistent. Men defecated in public areas within the small garrison. People sat around and starved. Harsh laws were imposed: stealing vegetables and blasphemy were punishable by death. Laborers and their children were virtual commodities, effectively slaves. One man murdered his wife and then ate her.

Jamestown’s was a slow, painful birth, attended by scant confidence in its future. Early on, a lopsided Indian attack nearly wiped out the entire population. The pervasive traumas throughout Jamestown’s early years are legend. Before 1625, colonists dropped like flies, 80% of the first 6,000 dying off. Several different military commanders imposed regimes of forced labor that turned the fledgling settlement into a prison camp. Men drawn to Jamestown dreamt of finding gold, which did little to inspire hard work. Not even starvation awoke them from the dream. A new group arrived in 1611, and described how their predecessors wallowed in “sluggish idleness” and “beastial sloth.” Yet they fared little better. There were few “lusty men” in Virginia, to repeat Hakluyt’s colorful term. It remained difficult to find recruits who would go out and fell trees, build houses, improve the land, fish, and hunt wild game. The men of early Jamestown were predisposed to play cards, to trade with vile sailors, and to rape Indian women. Bad decisions and failed recruitment strategies left the colony with too few ploughmen and husbandmen to tend the fields and feed the cattle that were being shipped from England.

Tobacco was at once a boon and a bane. Though it saved the colony from ruin, it stunted the economy and generated a skewed class system. The governing council jealously guarded what soon became the colony’s most precious resource: laborers. The only one of Hakluyt’s lessons to be carefully heeded was the one they applied with a vengeance: exploiting a vulnerable, dependent workforce.

The governor and members of his governing council pleaded with the Virginia Company to send over more indentured servants and laborers, who, like slaves, were sold to the highest bidder. Indentured servants were hoarded, overworked, and their terms unfairly extended. Land was distributed unequally too, which increased the class divide.

Contracts of indenture were longer than servant contracts in England—four to nine years versus one to two years. According to a 1662 Virginia law, children remained servants until the age of 24. Indentures were unlike wage contracts: servants were classified as chattels, as movable goods and property. Contracts could be sold, and servants were bound to move where and when their masters moved. Like furniture or livestock, they could be transferred to one’s heirs.

The leading planters in Jamestown had no illusion that they were creating a classless society. From 1618 to 1623, a good many orphans from London were shipped to Virginia; most indentured servants who followed in their train were adolescent boys. As a small privileged group of planters acquired land, laborers, and wealth, those outside the inner circle were hard-pressed to escape their lower status. Those who did become poor tenants found that little had changed in their condition; they were often forced to do the same work they had done as servants. A sizable number did not survive their years of service. Or as John Smith lamented in his 1624 General History of Virginia . . . , “This dear bought Land with so much blood and cost, hath only made some few rich, and all the rest losers.”

The leaders of Jamestown had borrowed directly from the Roman model of slavery: abandoned children and debtors were made slaves. When indentured adults sold their anticipated labor in return for passage to America, they instantly became debtors, which made their orphaned children a collateral asset.

The transportation of female cargo would “tie and root the Planters minds to Virginia by the bonds of wives and children.” Sexual satisfaction and heirs to provide for would make slothful men into more productive colonists. All that was required of the women was that they marry. Their prospective husbands were expected to buy them, that is, to defray the cost of passage and provisions. Each woman was valued at 150 pounds of tobacco.

The Puritan family was at no time the modern American nuclear family, or anything close. It was often composed of children of different parents, because one or another parent was likely to die young, making remarriage quite common. Winthrop fathered 16 children with 4 different wives, the last of whom he married at age 59, two years before his death. Most households also contained child servants who were unrelated to the patriarch; during harvest season, hired servants were brought in as temporary workers, and poor children were purchased for longer terms as menial apprentices for domestic service or farm-work. The first slave cargo arrived in Boston in 1638. Winthrop, for his part, owned Indian slaves; his son purchased an African.

While servants were expected to be submissive, few actually were. Numerous court cases show masters complaining of their servants’ disobedience, accompanied by charges of idleness, theft, rudeness, rebelliousness, pride, and a proclivity for running away. In 1696, the powerful minister Cotton Mather published A Good Master Well Served, which was an unambiguous attempt to regulate the Bay Colony’s disorderly servant population. Directing his words toward those who served, he insisted, “You are the Animate, Separate, Active Instruments of other men.” In language that is impossible to misunderstand, he reaffirmed, “Servants, your Tongues, your Hands, your Feet, are your Masters, and they should move according to the Will of your Masters.”

One had to know his or her place in Puritan Massachusetts. Church membership added a layer of privilege before the courts and elsewhere to an already hierarchical regime. Expulsion from the church carried a powerful stigma. Heretics such as Anne Hutchinson and Mary Dyer were physically banished, cut off and ostracized. Only those who begged forgiveness and humbled themselves before the dual authority of court and church returned to the community. Dyer returned unrepentant, determined to challenge the ruling order. Between 1659 and 1661, she and three other Quakers were charged with “presumptuous & incorrigible contempt” of civil authority. After trial, they were summarily hanged.

Anne Hutchinson was excommunicated from the Boston congregation and expelled from the Bay Colony in 1638 for refusing to bend to the authority of the town fathers. She was sternly advised: “You have rather been a Husband than a Wife and a preacher than a Hearer, and a Magistrate than a Subject.” Hutchinson had held religious classes in her home, and had acquired a large following. Turning the social order upside down, she had undermined the carefully orchestrated moral geography of the Puritan meetinghouse. Male dominance was unquestioned, and ranks so clearly spelled out, that no one could miss the power outlined in something so simple as a seating chart. Members and nonmembers sat apart; husbands and wives were divided; men sat on one side of the room, women on the other. Prominent men occupied the first two rows of benches: the first was reserved exclusively for magistrates, the second for the families of the minister and governor, as well as wealthy merchants. The more sons a man had, the better his pew. Age, reputation, marriage, and estate were all properly calculated before a church seat was assigned.

Breeding had a place in more than one market. In Virginia and elsewhere in the Chesapeake region in the early 17th century, a gender imbalance of six to one among indentured servants gave women arriving from England an edge in the marriage exchange. Writing of Maryland in 1660, former indentured servant George Alsop claimed that women just off the boat found a host of men fighting for their attention. Females could pick and choose: even servants had a shot at marrying a well-heeled planter.

Women and land were for the use and benefit of man. Land held power because of its extent, potential for settlement, and future increase. Knowing how to master the land’s fruitfulness was the true definition of class power. It is important that we understand Bacon’s Rebellion for what it revealed: the most promising land was never equally available to all. The “Parasites” who encircled Governor Berkeley held a decided advantage. Inherited station was mediated by political connections or the good fortune of marrying into a profitable inheritance. By 1700, indentured servants no longer had much of a chance to own land. They had to move elsewhere or become tenants. The royal surveyors made sure that large planters had first bids on new, undeveloped land, and so the larger tracts were increasingly concentrated in fewer hands. Then, as more shipments of slaves arrived in the colony, these too were monopolized by the major landholding families.

Locke undoubtedly had a decisive hand in drafting the inherently illiberal Fundamental Constitutions. The Fundamental Constitutions did more than endorse slavery. It was a manifesto promoting a semi-feudalistic and wholly aristocratic society. Much ink was spilled in devising a colonial kingdom that conferred favor upon titled elites and manor lords. It was on the basis of a fixed class hierarchy that the precious commodity of land was allocated. Each new county was divided into sections: one-fifth of the land was automatically reserved for proprietors, another fifth for the colonial nobility, and three-fifths for untitled manor lords and freeholders.

Governing powers were left in the hands of the Grand Council, run by the local nobility and the proprietors, and it was this body that had sole authority for proposing legislation. A top-heavy colonial parliament consisted of proprietors or their deputies, all of the hereditary nobility of the colony, and one freeholder from each precinct. The constitution made clear that power rested at the top and that every effort had been made to “avoid erecting a numerous democracy.”

Even the faux nobility was not as strange as another feature of the Locke-endorsed Constitutions. That dubious honor belongs to the nobility and manor lord’s unique servant class, ranked above slaves but below freemen. These were the “Leet-men,” who were encouraged to marry and have children but were tied to the land and to their lord. They could be leased and hired out to others, but they could not leave their lord’s service. Theirs, too, was a hereditary station: “All the children of Leet-men shall be Leet-men, so to all generations,” the Constitutions stated. The heirs of estates inherited not just land, buildings, and belongings, but the hapless Leet-men as well.

More than some anachronistic remnant of the feudal age, Leet-men represented Locke’s awkward solution to rural poverty. Locke did not call them villeins, though they possessed many of the attributes of serfs. He instead chose the word “Leet-men,” which in England at this time meant something very different: unemployed men entitled to poor relief. Locke, like many successful Britons, felt contempt for the vagrant poor in England. He disparaged them for their “idle and loose way of breeding up,” and their lack of morality and industry. There were poor families already in Carolina, as Locke knew, who stood in the way of the colony’s growth and collective wealth. In other words, Locke’s Leet-men would not be charity cases, pitied or despised, but a permanent and potentially productive peasant class—yet definitely an underclass.

But did Leet-men ever exist? Shaftesbury’s Carolina plantation, which was run by his agent, had slaves, indentured servants, and Leet-men of a sort. In 1674, the absent owner instructed his agent to hire laborers as “Leet-men,” emphasizing that by their concurrence to this arrangement he could retain rights to the workers’ “progeny.” In this way, Shaftesbury saw children as key to his hereditary class system—as did his colonial predecessors in Virginia and Massachusetts. The Fundamental Constitutions was really a declaration of war against poor settlers.

The first surveyor reported that most of the Virginia émigrés in Carolina territory were not legitimate patent holders at all. They were poor squatters. The surveyor warned that the infant Carolina colony would founder if more “Rich men” were not recruited, that is, men who could build homes and run productive plantations. Landless trespassers (who were not servants) promised only widespread “leveling,” by which the surveyor meant a society shorn of desirable class divisions.

The proprietors definitely did not want a colony overrun with former indentured servants. They did not want Virginia’s refuse. In their grand scheme, Leet-men were intended to take the place of those who lived off the land without contributing to the coffers of the ruling elite. Serfs, in short, were better than those “lazy lubbers,” meaning stupid, clumsy oafs, the word that came to describe the vagrant poor of Carolina.

Locke’s invention of the Leet-men explains a lot. It enables us to piece together the curious history of North Carolina, to demonstrate why this colony lies at the heart of our white trash story. The difficult terrain that spanned the border with Virginia, plus the high numbers of poor squatters and inherently unstable government, eventually led Carolina to be divided into two colonies in 1712. South Carolinians adopted all the features of a traditional class hierarchy, fully embracing the institution of slavery, just as Locke did in the Fundamental Constitutions. The planter and merchant classes of South Carolina formed a highly incestuous community: wealth, slaves, and land were monopolized by a small ruling coterie. This self-satisfied oligarchy were the true inheritors of the old landgraves, carrying on the dynastic impulses of those who would create a pseudo-nobility of powerful families.

North Carolina, which came to be known as “Poor Carolina,” went in a very different direction from its sibling to the south. It failed to shore up its elite planter class. Starting with Albemarle County, it became an imperial renegade territory, a swampy refuge for the poor and landless. Wedged between proud Virginians and upstart South Carolinians, North Carolina was that troublesome “sink of America” so many early commentators lamented. It was a frontier wasteland resistant (or so it seemed) to the forces of commerce and civilization. Populated by what many dismissed as “useless lubbers” (conjuring the image of sleepy and oafish men lolling about doing nothing), North Carolina forged a lasting legacy as what we might call the first white trash colony. Despite being English, despite having claimed the rights of freeborn Britons, lazy lubbers of Poor Carolina stood out as a dangerous refuge of waste people, and the spawning ground of a degenerate breed of Americans.

Without a major harbor, and facing burdensome taxes if they shipped their goods through Virginia, many Carolinians turned to smuggling. Hidden inlets made North Carolina attractive to pirates. Along trade routes from the West Indies to the North American continent, piracy flourished in the late seventeenth and early eighteenth centuries. Several of Albemarle’s governors were accused of sheltering these high-seas thieves and personally profiting from the illicit trade. The notorious Blackbeard (a.k.a. Edward Teach, or Edward Thatch) made a home here, as did the Barbados gentleman turned pirate, Major Stede Bonnet. Supposedly, both were warmly welcomed into the humble homes of North Carolinians.

The settlers refused to pay their quitrents (land tax), which was one of the ways the proprietors hoped to make money. By 1729, when the proprietors sold their original grant to the British government, North Carolina listed 3,281 land grants, and 309 grantees owned almost half the land. This meant that in a population of nearly 36,000 people, the majority received small or modest grants, or owned no land at all. Most poor households lacked slaves, indentured servants, or even sons working the land. In 1709, squatters in the poorest district in Albemarle petitioned “your honors” for tax relief, pointing out that their land was nothing more than sand. A few months later, an Anglican minister reported in disgust that the colonists “were so careless and uncleanly” that there was “little difference between the corn in the horse’s manger and the bread on their tables.” The entire North Carolina colony was “overrun with sloth and poverty.”

Over the years, colonial officials rarely succeeded in collecting customs duties. The proprietors faced resistance in collecting quitrents. Disorder ruled. A British possession in name only, Albemarle County was routinely able to escape imperial rule.

When Byrd identified the Carolinians as residents of “Lubberland,” he drew upon a familiar English folktale that featured one “Lawrence Lazy,” born in the county of Sloth near the town of Neverwork. Lawrence was a “heavy lump” who sat in his chimney corner and dreamt. His dog was so lazy that he “lied his head agin the wall to bark.” In Lubberland, sloth was contagious, and Lawrence had the power to put all masters under his spell so that they fell into a deep slumber. As applied to the rural poor who closed themselves off to the world around them, the metaphor of sleep suggested popular resistance to colonial rule. Byrd found the people he encountered in Carolina to be resistant to all forms of government: “Everyone does what seems best in his own eyes.”

Byrd’s views, if colorfully expressed, were by no means his alone. An Anglican minister named John Urmston reported that his poor white charges loved their hogs more than they did their minister. They let the hogs into their churches to avoid the heat, leaving “dung and nastiness” on the floor. In 1737, Governor Gabriel Johnson of North Carolina referred to his people as “the meanest, most rustic and squalid part of the species.”

Shocking as it is for us to contemplate, large numbers of early American colonists spent their entire lives in such dingy, nasty conditions. The sordid picture conveyed here is an unavoidable part of the American past. Yet there’s more. They walked around with open sores visible on their bodies; they had ghastly complexions as a result of poor diets; many were missing limbs, noses, palates, and teeth. As a traveler named Smyth recorded, the ignorant wretches he encountered wore “cotton rags” and were “enveloped in dirt and nastiness.”

The poor of colonial America were not just waste people, not simply a folk to be compared to their Old World counterparts. By reproducing their own kind, they were, to contemporaneous observers, in the process of creating an anomalous new breed of human.


The colony of free laborers offered a ready boundary (and slave-free zone) that would protect the vulnerable planter class from Native tribes and Spanish settlers in Florida, who might otherwise offer a haven to their runaway slaves. Georgia was a remarkable experiment.

Georgia was founded as a charitable venture, designed to uplift poor families and to reform debtors. One of the most important minds behind it belonged to James Edward Oglethorpe. Oglethorpe was a military adventurer who, with permission of Parliament and the colony’s trustees, traveled to the American colony and helped to plant settlers.

He saw this venture as a unique opportunity to reconstruct class relations. It was a charitable endeavor, one meant to reform debtors and rescue poor men, by offering society a decidedly more humane alternative to Locke’s servile Leet-men.  In refusing to permit slavery, the Georgia colony promised that “free labor” would replace a reliance on indentured servants as well as African bondsmen.

Unique among the American settlements, Georgia was not motivated by a desire for profit. Receiving its charter in 1732, the southernmost colony was the last to be established prior to the American Revolution. Its purpose was twofold: to carve out a middle ground between the extremes of wealth that took hold in the Carolinas, and to serve as a barrier against the Spanish in Florida. As such, it became the site of an unusual experiment.

Conservative land policies limited individual settlers to a maximum of 500 acres, thus discouraging the growth of a large-scale plantation economy and slave-based oligarchy such as existed in neighboring South Carolina. North Carolina squatters would not be found here either.

Poor settlers coming from England, Scotland, and other parts of Europe were granted fifty acres of land, free of charge, plus a home and a garden. Distinct from its neighbors to the north, Georgia experimented with a social order that neither exploited the lower classes nor favored the rich. Its founders deliberately sought to convert the territory into a haven for hardworking families. They aimed to do something completely unprecedented: to build a “free labor” colony.  Two “peculiar” customs stood out: both alcohol and dark-skinned people were prohibited. “No slavery is allowed, nor negroes,” Moore wrote. As a sanctuary for “free white people,” Georgia “would not permit slaves, for slaves starve the poor laborer.” Free labor encouraged poor white men in sober cultivation and steeled them in the event they had to defend the land from outside aggression. It also promised to cure settlers of that most deadly of English diseases, idleness.

A trustee, Oglethorpe never held the office of governor, nor did he even purchase land to enrich himself. Though a highly educated member of Parliament, he traveled without a servant and lived simply. Having fought as an officer under Prince Eugene of Savoy in the Austro–Turkish War of 1716–18, he understood military discipline. This was how he came to trust in the power of emulation; he believed that people could be conditioned to do the right thing by observing good leaders. He shared food with those who were ill or deprived. Visiting a Scottish community north of Savannah, he refused a soft bed and slept outside on the hard ground with the men. More than any other colonial founder, Oglethorpe made himself one of the people, promoting collective effort.

One young believer in the colony, 16-year-old Philip Thicknesse, wrote to his mother in 1735 that “a man may live here upon his own improvements, if he be industrious.” In his grand plan, Oglethorpe wanted a colony of orderly citizen-soldiers; he subscribed to the classical agrarian ideal that virtue was acquired by cultivating the soil and achieving self-sufficiency. Productive, stable, healthy farming families were meant to anchor the colony. As he wrote in 1732, women provided habits of cleanliness and “wholesome food,” and remained on hand to nurse the sick. Unlike others before him, Oglethorpe felt the disadvantaged could be reclaimed if they were given a fair chance. Far more radical was his calculation that a working wife and eldest son could replace the labor of indentured servants and slaves. He claimed that a wife and one son equaled the labor value of an adult male. He was clearly not fond of the practice of indenture, considering it the same as making “slaves for years.” While Georgia’s trustees did not prohibit the use of white servants, Oglethorpe made sure their tenures were limited. Oddly, it turned out that the colonists best suited to the Georgia experiment were not English but Swiss, German, French Huguenot, and Scottish Highlander, all of whom seemed prepared for lives of hardship, arriving as whole communities of farming families.

William Byrd weighed in on the ban against slavery in Georgia in a letter to a Georgia trustee. He saw how slavery had sparked discontent among poor whites in Virginia, who routinely refused to “dirty their hands with Labor of any kind,” preferring to steal or starve rather than work in the fields. Slavery ruined the “industry of our White People,” he confessed, for they saw a “Rank of Poor Creatures below them,” and detested the thought of work out of a perverse pride, lest they might “look like slaves.”

Oglethorpe waged a war of words with proslavery settlers, whom he called “Malcontents.” At the height of the controversy, in 1739, he argued that African slavery should never be introduced into his colony, because it went against the core principle of the trustees: “to relieve the distressed.” Instead of offering a sanctuary for honest laborers, Georgia would become an oppressive regime, promoting “the misery of thousands in Africa” by permitting a “free people” to be “sold into perpetual Slavery.”

He had written similarly about English sailors back in 1728. Strange though it might seem to us, Oglethorpe’s argument against slavery was drawn from his understanding of the abuse sailors faced as a distinct class. In the eighteenth century, seamen were imagined as a people naturally “bred” for a life at sea, whose very constitution was amenable to a hard life in the British navy. In his tract protesting the abuse of sailors, the more enlightened Oglethorpe rejected claims that men were born to such an exploited station. For him, seamen literally functioned as “slaves,” deprived of the liberties granted to freeborn Britons. As poor men, they were dragged off the streets by press gangs, thrown into prison ships, and sold into the navy. Poorly fed, grossly underpaid, and treated as “captives,” they were a brutalized class of laborers, and in every way coerced.

According to Georgians who petitioned for slaves, Negroes were “bred up” for hard labor in the same way as sailors. Africans would survive in damp, noxious swamps as well as in the sweltering heat. They were cheap to feed and clothe. A meager subsistence diet of water, corn, and potatoes was thought adequate to keep them alive and active. One outfit and a single pair of shoes would last an entire year. White indentured servants were fundamentally different. They demanded English dress for every season. They expected meat, bread, and beer on the table, and if denied this rich diet felt languid and feeble and would refuse to work. If forced to labor as hard as African slaves through the grueling summer months, or so the petitioners claimed, white servants would run away from Georgia as if escaping a “charnel house” (a repository for rotting corpses).

Oglethorpe now charged that the Georgians who fled to South Carolina preferred “whipping Negroes” to regular work. He pointed to those settlers who were not afraid of labor, who knew how to “subsist comfortably” without clamoring for slaves. They were the Scottish Highlanders and German settlers who had petitioned the trustees to keep slavery out of the colony. Oglethorpe felt that these folks were hardier and their predisposition to work was superior to that of Englishmen. But the truth lay in an ability to work collectively, a desire to understand and appreciate the demands of subsistence farming—a commitment to long-term survival in a sparsely settled colony.

Despite his good intentions, the colony failed to eliminate all class divisions. In addition to the fifty acres allotted to charity cases, settlers who paid their own way might be granted as many as five hundred acres. They were expected to employ between four and ten servants. But five hundred acres was the maximum limit for freeholders. The trustees wanted settlers to occupy the land, not to speculate in land. Absentee landholders were not welcome. Georgia also instituted a policy of keeping the land “tail-male,” which meant that land descended to the eldest male child. This feudal rule bound men to their families. The tail-male provision protected heirs whose poor fathers might otherwise feel pressure to sell their land. Many settlers disliked the practice. Hardworking families worried about the fate of their unmarried daughters, who might be left with nothing. One such complaint came from Reverend Dumont, a leader of French Protestants interested in migrating to Georgia. What would happen to widows “too old to marry or beget children,” he asked. And how could daughters survive, especially those “unfit for Marriage, either by Sickness or Evil Construction of their Body”? Dumont’s questions went to the core of Oglethorpe’s and the trustees’ philosophy. Young widows and daughters were seen as breeders of the next generation of free white laborers.

Alas, Oglethorpe was fighting a losing battle. Many of the men demanding slaves were promised credit to buy slaves from South Carolinian traders. Slaves were a lure, dangled before poorer men in order to persuade them to put up their land as collateral. That is why Oglethorpe believed that a slave economy would have the effect of depriving vulnerable settlers of their land. Keeping out slavery went hand in hand with preserving a more equitable distribution of land. If the colony allowed settlers to have “fee simple” land titles (so they could sell their land at will), large-scale planters would surely come to dominate. He predicted in 1739 that, left to their own devices, the “Negro Merchants” would gain control of “all the lands in the Colony,” leaving nothing for “all the laboring poor white Men.”

Oglethorpe left the colony in 1743, never to return. Three years earlier, a soldier had attempted to murder him, the musket ball tearing through his wig. He survived, but his dream for Georgia died. Over the next decade, land tenure policies were lifted, rum was allowed to flow freely, and slaves were sold surreptitiously. In 1750, settlers were formally granted the right to own slaves. A planter elite quickly formed, principally among transplants from the West Indies and South Carolina. By 1788, Carolinian Jonathan Bryan was the most powerful man in Georgia, with thirty-two thousand acres and 250 slaves. He set up shop there in 1750, the very year slavery was made legal, and his numerous slaves entitled him to large tracts of land. But to build his empire he had to pull the strings of Georgia’s Executive Council, whose chief duty was distributing land. A long tenure on the council ensured that he acquired the most fertile land, conveniently situated along major trade routes. By 1760, only 5 percent of white Georgians owned even a single slave, while a handful of families possessed them in the hundreds.

Oglethorpe’s ideas did not entirely disappear. Both Benjamin Franklin and Thomas Jefferson agreed that slave-owning corrupted whites. The idea of promoting a free white labor buffer zone went into Jefferson’s draft of what became the Northwest Ordinance (1787), a blueprint for the admission of new states to the Union. Franklin and Jefferson were equally passionate about mobilizing the forces of reproduction. They saw population growth as a sign of national strength. Slavery, too, was to be measured as a numbers game. As Reverend Bolzius had observed, if slaves were encouraged to “breed like animals,” then poor whites could not reproduce at the same rate and hold on to their land or their freedom.

Land was the principal source of wealth, and remained the true measure of liberty and civic worth. Hereditary titles may have gradually disappeared, but large land grants and land titles remained central to the American system of privilege. When it came to common impressions of the despised lower class, the New World was not new at all.

As the colonies’ leading man of science, Franklin popularized the latest theories. Of primary interest here are his efforts to apply scientific knowledge to that most perplexing of all subjects: the creation of classes. It was an article of faith in eighteenth-century British thought that civilized societies usually formed out of the fundamental human need for security to ensure survival, but the same societies were gradually corrupted by a preoccupation with luxuries, which resulted in decadence. The rise and fall of the Roman Empire stood behind such theorizing; what Franklin did was to shift the focus to human biology. Underneath all human endeavors were gut-level animal instincts—and foremost for Franklin was the push and pull of pain and pleasure. Too much pleasure produced a decadent society; too much pain led to tyranny and oppression. Somewhere in between was a happy medium, a society that channeled humanity’s better animal instincts. Did North America offer the environment to achieve this happy medium? Franklin thought so. Its unique environment could strip away the unnatural conditions of the Old World system. The vast continent would give Americans a demographic advantage in breeding quickly and more fruitfully than their English counterparts. Freed from congested cities, as well as the swelling numbers of unemployed and impoverished, Americans would escape the extremes of great wealth and grinding poverty. Instead of a frantic competition over resources, the majority would be perfectly content to occupy a middling stage, what he called a “happy mediocrity.”

In “Observations Concerning the Increase of Mankind” (1751), one of his most important treatises, Franklin predicted that Americans would double in population in twenty years. Idleness would be bred out of the English constitution. Large families encouraged parents to be industrious. Children would be put to work, imitating their parents, and spurred on by the will to survive. Class formation would occur, but it would be in a state of flux and adjustment, as people spread outward and filled the available territory. People needed incentives to produce more children.

Franklin reminded his readers in “Observations” that in the Roman Empire, fruitful women had been rewarded for the number of offspring they produced. Slave women were rewarded with their liberty, while freeborn widows with large broods earned property rights and the autonomy ordinarily reserved for freeborn men. His point was that great empires needed large populations (strength came in numbers) in order to people and settle new territories. The incentives that America offered were of a different kind than elsewhere: an abundance of land and the liberty to marry young.

In the American colonies and in England, the unmarried man of means was a scandalous figure. He was ridiculed as a hermaphrodite, as half man, half woman; his prescribed punishment, as one New York newspaper demanded, should be to have half of his beard shaved from his face to indicate his diminished manliness. Others felt he should lose his inheritance. In the same way that land could be left fallow, human fertility could be wasted. Having no children, wasting their seed, bachelors indulged in the worst kind of reproductive idleness.

Franklin deplored the racial imbalance in the West Indies, which kept the population of laboring whites at artificially low numbers. Slave-owners, who didn’t perform their own labor, suffered from physical defects: they were “enfeebled, and therefore not so generally prolific.” In short, he concluded that slavery made Englishmen idle and impotent.

Franklin also believed that slavery taught children the wrong lessons: “White Children become proud, disgusted with Labor, and being educated in Idleness, are rendered unfit to get a Living by Industry.”

On a larger scale than Oglethorpe, Franklin was fashioning a free-labor zone for the northern colonies. The magic elixir to achieve his idealized British America was, in a word, breeding. In his imagination, a continental expanse populated by fertile settlers would create a more stable society. Children would replace indentured servants and slaves as laborers, mirroring the system of labor that Oglethorpe had tried but failed to permanently institute in Georgia.

Franklin’s own experiences belied his optimism as to the ease with which colonists moved from one place to another. As a teenager, he had run away from Boston to Philadelphia, cutting short the full term of an apprenticeship he had been contracted to serve with his elder brother. A fugitive and vagrant, he was part of the large class of servants on the lam. His movement, like so many others, was haphazard, less methodical than the ants he studied. William Moraley, who arrived in Philadelphia in the same decade as young Franklin and wrote a memoir about his experiences, may have said it best when he described himself as a “Tennis-ball of fortune,” bouncing from one new master to the next. Despite his literary skills, training as a law clerk and watchmaker, the un-Franklinesque Moraley seemed to migrate in circles and never up the social ladder. There was no guarantee that restlessness ensured social mobility.

Poverty was increasingly common as the eighteenth century wore on. Philadelphia had its economic slumps, brutally cold winter weather, and shortages of wood that caused the poor nearly to freeze to death. In 1784, one man who was part of the working poor in the city wrote to the local newspaper that he had six children, and though he “strove in all his power,” he could not support them. Hard work by itself was not the magic balm of economic self-sufficiency, nor was Franklin correct that big families were always a boon. He was even wrong about his tabulations on American birthrates. Infant mortality in Philadelphia was surprisingly high, and comparable to English rates, proving that Franklin’s prediction of a healthy and happy population was more rhetorical than it was demographic fact.

The quintessential self-made man was not self-made. The very idea is ludicrous given the inescapable network of patron-client relationships that defined the world of Philadelphia. To cushion his rise, Franklin relied on influential patrons, who provided contacts and loans that enabled him to acquire the capital he needed to set up his print shop and invest in costly equipment. For Franklin to obtain patronage and navigate contending political factions was a tricky enterprise. Pennsylvania’s class structure had some unusual quirks. At the top were the proprietors, members of William Penn’s family, who owned vast tracts of land and collected quitrents. Next came the wealthy Quaker landowners and merchants, bound together by family and religious ties. In the eighteenth century, the Society of Friends disowned any member who married outside the faith.

Franklin was neither a Quaker nor a quasi-Quaker, but he did develop strong personal relationships with several cosmopolitan and highly educated Friends in Philadelphia and in England. He relied on Quaker patrons, especially in the early days of his business.

Class status was still based on family name in Pennsylvania, for the top tier was dominated by the Penn, Pemberton, and Logan families—the proprietors and Quaker elites. Below them was a growing transatlantic merchant class that set itself apart by engaging in a conspicuous display of wealth. These families owned slaves and servants, and silver tea sets; they wore rich fabrics, had grand homes, and drove carriages. At the time Franklin retired from his printing operations in 1748, he was in the top tenth percentile in wealth, owning a horse and chaise and having invested in a large tract of land. Even among the plain Quakers, known for their simple dress, carriages were a status symbol. In 1774, in a city of fifteen thousand, only eighty-four Philadelphians owned a carriage. Class was about more than wealth and family name; it was conveyed through appearances and reputation. Franklin understood this.

A legal distinction existed between the free and the unfree, the latter including not only slaves but also indentured servants, convict laborers, and apprentices. As dependents, they were all classified as mean, servile, and ill-bred. Thousands of unfree laborers flooded Philadelphia, so that as early as 1730, Franklin was complaining about “vagrants and idle persons” entering the colony. He wrote these words after having escaped impoverished circumstances not many years before. He had arrived in Philadelphia in 1723 as a runaway, meanly dressed in filthy, wet clothing.

Franklin was a man of his time, expressing a natural discomfort with unrestrained social mobility. For most Americans of the eighteenth century, it was assumed impossible for a servant to shed his lowly origins; the meaner sort, as one newspaper insisted, could never “wash out the stain of servility.” There were fears that the meaner sort were treading too close on the heels of those above them. Franklin certainly never endorsed social mobility as we think of it today, despite his own experience. To be accurate, he fantasized that the continent would flatten out classes, but it was clear that this condition was contingent upon keeping poor people in perpetual motion. Franklin’s militia plan expressed a conservative impulse. Giving the accomplished middling sort a feeling of public respect and a sense of civic duty would yield them the solid contentment of happy mediocrity.

Contentment might actually reduce the desire of more ambitious men to rise up the social ladder too quickly or recklessly.

Franklin was not blind to the fact that North America’s frontier settlers would not be composed solely of the finest British stock. He was quick to call those who inhabited the Pennsylvania backcountry the “refuse” of America. But at the same time, he hoped that the forces of nature would carry the day, that the demands of survival would weed out the slothful, and that the better breeders would supplant the waste people.

Paine’s sleight of hand in concealing class reflected his preference for talking about breeds. His overarching argument was that European-descended Americans were a new race in the making, one specially bred for free trade instead of the state machinery of imperial conquest. His critique of the British political economy was centered on the enormous debts it incurred through expensive military adventures, which he blamed on the frivolous ambitions of English royalty. Over time, kings and queens had become wasteful heads of state, in and of themselves a social liability.

The American colonies, meanwhile, were being “drained” of their collective manpower and wealth, merely to underwrite new overseas wars. Independence would allow America to “begin the world over again,” Paine declared dramatically. The new nation would signal a new world order. Unburdened by constant debt and a large military, it would be a vibrant continental power erected on the ideals of free trade and global commerce.

The Swedish botanist Carl von Linné, better known to history as Linnaeus, organized all of plant and animal life, and divided Homo sapiens, the word he coined for humans, into four varieties. The European type he said was sanguine, brawny, acute, and inventive; the American Indian he deemed choleric and obstinate, yet free; the Asian was melancholic and greedy; and the African was crafty, indolent, and negligent. This grand (and ethnocentric) taxonomy served Paine’s purpose in justifying the American Revolution. To “begin the world over again,” Americans of English and European descent had to be a new race in the making—perhaps a better one—as they laid claim to North America. In Paine’s simple formulation, breeding was either conditioned by nature or it was corrupted through superstition. The first possibility allowed a people’s fullest potential to be unleashed, while the latter only reduced their ability to grow and improve themselves. Again, he was not alone in equating monarchy with bad breeding.

There was nothing sacred about a royal breed. Blind allegiance to what enlightened critics had reduced to a barnyard custom exposed how an intelligent, civilized people might lose their grip on reality. The natural order was greatly out of alignment: British kings were exalted above everyone else for no logical reason. Americans had a unique opportunity to break free from the relics of the past and to set a true course for a better future, one unburdened by the deadweight of kings and queens.

It was this antiauthoritarian idea that made Paine’s pamphlet most radical. If kings could be seen as “ignorant and unfit,” then why not royal governors, Quaker proprietors, or the “Better Sort” riding in their carriages? If monarchy was not what it was supposed to represent, other customary forms of power could be questioned too. Class appearances might be similarly seen as mere smoke and mirrors. This is why Paine was careful to downplay the distinction between the rich and the poor. He wanted his American readers to focus on distant kings, not local grandees. He wanted them to break with the Crown, not to disturb the class order.

For like reasons, he turned a blind eye to slavery. Paine’s America was above all else an “asylum” for future-directed Europeans. No one else need apply. He argued against the inherited notion that America was a dumping ground for lesser humans. It was only a sanctuary for able, hardworking men and women. This overly sanguine portrait cleaned up class and ignored what was unpleasant to look at. Indentured servitude and convict labor were still very much in evidence as the Revolution neared, and slavery was a fact of life. Philadelphia had a slave auction outside the London Coffee House, at the center of town on Front and Market Streets, which was directly across from Paine’s lodgings. In Common Sense, the propagandist mentioned “Negroes” and “Indians” solely to discredit them for being mindless pawns of the British, when they were incited to harass and kill white Americans and to undermine the worthy cause of independence. The English military had “stirred up Indians and Negroes to destroy us.” Us against them. Civilized America was being pitted against the barbarous hordes set upon them by the “hellish” power of London. Paine’s purpose was to remind his readers of America’s greatness, drawing on the visual comparison of the continent in its size and separation from the tiny island that ruled it.

Paine gave consideration to one more element that impinges on our study of class. He was thoroughly convinced that independence would eliminate idleness. Like Franklin, he projected a new continental order in which poverty was diminished. “Our present numbers are so happily proportioned to our wants,” he wrote, “that no man need be idle.” There were enough men to raise an army and engage in trade: enough, in other words, for self-sufficiency.

Thomas Jefferson thought about class in continental terms. His greatest accomplishment as president was the 1803 acquisition of Louisiana, a vast territory that more than doubled the size of the United States.

The Louisiana Territory, as he envisioned it, would encourage agriculture and forestall the growth of manufacturing and urban poverty—that was his formula for liberty. It was not Franklin’s “happy mediocrity” (a compression of classes across an endless stretch of unsettled land), but a nation of farmers large and small.

Eighteenth-century Virginia was both an agrarian and a hierarchical society. By 1770, fewer than 10 percent of white Virginians laid claim to over half the land in the colony; a small upper echelon of large planters each owned slaves in the hundreds. More than half of white men owned no land at all, working as tenants or hired laborers, or contracted as servants. Land, slaves, and tobacco remained the major sources of wealth in Jefferson’s world, but the majority of white men did not own slaves. That is why Mr. Jefferson wafted well above the common farmers who dotted the countryside that extended from his celebrated mountaintop home. By the time of the Revolution, he owned at least 187 slaves, and by the Battle of Yorktown he held title to 13,700 acres in six different counties in Virginia. Pinning down Jefferson’s views on class is complicated by the seductiveness of his prose. His writing could be powerful, even poetic, while reveling in rhetorical obfuscation. He praised “cultivators of the earth” as the most valuable of citizens; they were the “chosen people of God,” and they “preserved a republic in vigor” through their singularly “useful occupation.” And yet Jefferson’s pastoral paragon of virtue did not describe any actual Virginia farmers, and not even he could live up to this high calling. Despite efforts at improving efficiency on his farms, he failed to turn a profit or rescue himself from mounting debts. In a 1796 letter, he sadly admitted that his farms were in a “barbarous state” and that he was “a monstrous farmer.” Things continued downhill from there.

Virginians were far behind the English in the use of fertilizers, crop rotation, and harvesting and ploughing methods. It was common for large planters and small farmers alike to deplete acres of soil and then leave it fallow and abandoned. “We waste as we please,” was how Jefferson phrased it.

Jefferson’s various reform efforts were thwarted by those of the ruling gentry who had little interest in elevating the Virginia poor. Even more dramatically, his agrarian version of social mobility was immediately compromised by his own profound class biases, of which he was unaware. To imagine that Jefferson had some special insight into the anxious lives of the lower sort, or that he truly appreciated the unpromising conditions tenant farmers experienced, is to fail to account for the wide gulf that separated the rich and poor in Virginia.

Revolutionary Virginia was hardly a place of harmony, egalitarianism, or unity. The war effort exacerbated already simmering tensions between elite Patriots and those below them. In British tradition, the American elite expected the lower classes to fight their wars. In the Seven Years’ War, for example, Virginians used the infamous practice of impressment to round up vagabonds to meet quotas. During the Revolution, General Washington stated that only “the lower class of people” should serve as foot soldiers. Jefferson believed that class character was palpably real. As a member of the House of Delegates, he came up with a plan to create a Virginia cavalry regiment specifically for the sons of planters, youths whose “indolence or education, has unfitted them for foot-service.”

As early as 1775, landless tenants in Loudoun County, Virginia, voiced a complaint that was common across the sprawling colony: there was “no inducement for the poor man to Fight, for he had nothing to defend.” Many poor white men rebelled against recruitment strategies, protested the exemptions given to the overseers of rich planters, and were disappointed with the paltry pay. Such resistance led to the adoption of desperate measures. In 1780, Virginia assemblymen agreed to grant white enlistees the bounty of a slave as payment for their willingness to serve until the end of the war. Here was an instant bump up the social ladder. Here was the social transfer of wealth and status from the upper to the lower class. But even this gruesome offer wasn’t tempting enough, because few took the bait. Two years later, when the Battle of Yorktown decided the outcome of the war, the situation was unchanged. Of those fighting on the American side, only a handful hailed from Virginia.

The committee considered a proposal granting each freeborn child a tract of seventy-five acres as an incentive to encourage poorer men to marry and have children. Jefferson’s freeholders needed children to anchor them to the land and as an incentive to turn from idleness. But reform did not take easily. Virginia’s freehold republic failed to instill virtue among farmers, the effect that Jefferson had fantasized. The majority of small landowners sold their land to large planters, mortgaged their estates, and continued to despoil what was left of the land. They looked upon it as just another commodity, not a higher calling. Jefferson failed to understand what his predecessor James Oglethorpe had seen: the freehold system (with disposable land grants) favored wealthy land speculators. Farming was arduous work, with limited chance of success, especially for families lacking the resources available to Jefferson: slaves, overseers, draft animals, a plough, nearby mills, and waterways to transport farm produce to market. It was easy to acquire debts, easy to fail. Land alone was no guarantee of self-sufficiency.

Virginia’s ruling gentry were quite content to dump the poor into the hinterland. With the opening up of the land office in 1776, a new policy was adopted: anyone squatting on unclaimed land in western Virginia and Kentucky could claim a preemption right to buy it. Like the long-standing British practice of colonizing the poor, the Virginians sought to quell dissent, raise taxes, and lure the less fortunate west. This policy did little to alter the class structure. In the end, it worked against poor families. Without ready cash to buy the land, they became renters, trapped again as tenants instead of becoming independent landowners.

Jefferson, too, wanted Americans tied to the land, with deep roots to their offspring, to future generations. Agrarian perfection would germinate: a love of the soil, no less than a love of one’s heirs, instilled amor patriae, a love of country. He was not promoting a freewheeling society or the rapid commercial accumulation of wealth; nor was he advocating a class system marked by untethered social mobility. Jefferson’s husbandmen were of a new kind of birthright station, passed from parents to children. They were not to be an ambitious class of men on the make.

Jefferson’s idealized farmers were not rustics either. They sold their produce in the marketplace, albeit on a smaller scale. There was room enough for an elite gentry class, and gentleman farmers like himself. Using the latest husbandry methods, improving the soil, the wealthier farmers could instruct others, the less skilled beneath them. Education and emulation were necessary to instill virtue. American farmers required an apprenticeship of a sort, which was only possible if they were planted in the right kind of engineered environment.

He had a lot to defend in the aftermath of the American Revolution. The war years had taken their toll. A postwar depression created widespread suffering. States had acquired hefty debts, which caused legislatures to increase taxes to levels far higher, sometimes three to four times higher, than before the war. Most of these tax dollars ended up in the hands of speculators in state government securities that had been sold to cover war expenses. Many soldiers were forced to sell their scrip and land bounties to speculators at a fraction of their value. Wealth was being transferred upward, from the tattered pockets of poor farmers and soldiers to the bulging purses of a nouveau riche of wartime speculators and creditors—a new class of “moneyed men.”

“No distinction between man and man has ever been known in America,” he insisted. Among private individuals, the “poorest labourer stood on equal ground with the wealthiest Millionary,” and the poor man was favored when the rights of the rich and poor were contested in the courts. Whether the “shoemaker or the artisan” was elected to office, he “instantly commanded respect and obedience.” With a final flourish, Jefferson declared that “of distinctions by birth or badge,” Americans “had no more idea than they had of existence in the moon or planets.” Though Jefferson sold Europeans on America as a classless society, no such thing existed in Virginia or anywhere else. In his home state, a poor laborer or shoemaker had no chance of getting elected to office. Jefferson wrote knowing that semiliterate members of the lower class did not receive even a rudimentary education. Virginia’s courts meticulously served the interests of rich planters. And wasn’t slavery a “distinction between man and man”? Furthermore, Jefferson’s freehold requirement for voting created “odious distinctions” between landowners and poor merchants and artisans, denying the latter classes voting rights. One has to wonder at Jefferson’s blatant distortion.

Americans not only scrambled to get ahead; they needed someone to look down on. “There must be one, indeed, who is the last and lowest of the human species,” Adams concluded, and even he needed his dog to love him. He also sarcastically acknowledged that while Jefferson and his brand of republicans might disdain titles and stations, they had no intention of disturbing private forms of authority; the subordinate positions of wives, children, servants, and slaves were left safely intact.

Jefferson’s model of breeding generated an “accidental aristocracy” of talent. Class divisions would form through natural selection. Men would marry women for more than money; they would consciously and unconsciously choose mates with other favorable traits. It was all a matter of probability: some would marry out of sheer lust, others for property, but the “good and wise” would marry for beauty, health, virtue, and talents. If Americans had enough native intelligence to distinguish the natural aristocrats from the pseudo-aristocrats in choosing political leaders, then they had reasonable instincts for selecting spouses. A “fortuitous concourse of breeders” would produce a leadership class—one that would sort out the genuinely talented from the ambitious men on the make. The question that Jefferson never answered was this: What happened to those who were not part of the talented elite? How would one describe the “concourse of breeders” living on the bottom layer of society?

Conjuring a potent topographical metaphor, Jefferson contended that the colony had had a stagnant class system, whose social order resembled a slice of earth on an archeological dig. The classes were separated into “strata,” which shaded off “imperceptibly, from top to bottom, nothing disturbing the order of their repose.” Jefferson divided the top tier of supposed social betters into “Aristocrats, half breeds, pretenders.” Below them was the “solid independent yeomanry, looking askance at those above, yet not ventured to jostle them.” On the bottom rung he put “the lowest feculum of beings called Overseers, the most abject, degraded and unprincipled race.” Overseers were tasked to keep slaves engaged in labor on southern plantations. By pitting the honest yeomanry against the “feculum” of overseers, Jefferson harshly invoked the old English slur of human waste. That wasn’t enough. He portrayed overseers as panderers, with their “cap in hand to the Dons”; they were vicious men without that desirable deposit of virtue, who feigned subservience in order to indulge the “spirit of domination.”

In this strange sleight of hand, slaves became invisible laborers outside his tripartite social ranking. Jefferson made them victims of overseers, not of their actual owners.

He presented the upper class as an odd collection of breeds: great planters (pure-blooded Aristocrats) sat at the top, but their children might marry down and produce a class of “half breeds.” The pretenders were outsiders who dared claim the station of the leading families, where they were never really welcomed. Despite his pose in his exchange with John Adams two years earlier, Jefferson’s brief natural history of Virginia’s classes proved that elites and upstarts married the “wellborn.” The Virginia upper class was a creation of marrying for money, name, and station, in which kinship and pedigree were paramount.

The western territories were for all intents and purposes America’s colonies. Despite the celebratory spirit in evidence each Fourth of July beginning in 1777, many anxieties left over from the period of the English colonization revived. Patriotic rhetoric aside, it was not at all clear that national independence had genuinely ennobled ordinary citizens. Economic prosperity had actually declined for most Americans in the wake of the Revolution. Those untethered from the land, who formed the ever-expanding population of landless squatters heading into the trans-Appalachian West, unleashed mixed feelings. To many minds, the migrant poor represented the United States’ re-creation of Britain’s most despised and impoverished class: vagrants. During the Revolution, under the Articles of Confederation (the first founding document before the Constitution was adopted), Congress drew a sharp line between those entitled to the privileges of citizenship and the “paupers, vagabonds, and fugitives from justice” who stood outside the national community.

The presumptive “new man” of the squatter’s frontier embodied the best and the worst of the American character. The “Adam” of the American wilderness had a split personality: he was half hearty rustic and half dirk-carrying highwayman. In his most favorable cast as backwoodsman, he was a homespun philosopher, an independent spirit, and a strong and courageous man who shunned fame and wealth. But turn him over and he became the white savage, a ruthless brawler and eye-gouger. This unwholesome type lived a brute existence in a dingy log cabin, with yelping dogs at his heels, a haggard wife, and a mongrel brood of brown and yellow brats to complete the sorry scene.

Both crackers and squatters—two terms that became shorthand for landless migrants—supposedly stayed just one step ahead of the “real” farmers, Jefferson’s idealized, commercially oriented cultivators. They lived off the grid, rarely attended a school or joined a church, and remained a potent symbol of poverty. To be lower class in rural America was to be one of the landless. They disappeared into unsettled territory and squatted down (occupied tracts without possessing a land title) anywhere and everywhere. If land-based analogies were still needed, they were not to be divided into grades of soil, as Jefferson had creatively conceived, but spread about as scrub foliage or, in bestial terms, mangy varmints infesting the land.

Both “squatter” and “cracker” were Americanisms, terms that updated inherited English notions of idleness and vagrancy. “Squatter,” in one 1815 dictionary, was a “cant name” among New Englanders for a person who illegally occupied land he did not own. An early usage of the word occurred in a letter of 1788 from Federalist Nathaniel Gorham of Massachusetts, writing to James Madison about his state’s ratifying convention. Identifying three classes of men opposed to the new federal Constitution, he listed the former supporters of Shays’ Rebellion in the western counties, the undecided who might be led astray by opinionated others, and the constituents of Maine: this last group were “squatters” who “lived upon other people’s land” and were “afraid of being brought to account.” Not yet a separate state, Maine was the wooded backcountry of Massachusetts, and Gorham was about to become one of the most powerful speculators in the unsettled lands of western New York State. In 1790, “squatter” appeared in a Pennsylvania newspaper, but written as “squatters,” describing men who inhabited the western borderlands of that state, along the Susquehanna River. They were men who “sit down on river bottoms,” pretend to have titles, and chase off anyone who dares to usurp their claims.

Interlopers and trespassers, unpoliced squatters and crackers grew crops, cut timber, hunted and fished on land they did not own. They lived in temporary huts beyond the reach of the civilizing forces of law and society and often in close proximity to Native Americans. In Massachusetts and Maine, squatters felt they had a right to the land (or should be paid) if they made improvements, that is, if they cleared away the trees, built fences, homes, and barns, and cultivated the soil. Their de facto claims were routinely challenged; families were chased off, their homes burned. Squatters often refused to leave, took up arms, and retaliated.

Even the threat of the gallows did not stop the flow of migrants across the Susquehanna, down the Ohio, and as far south as North Carolina and Georgia.

The motley caravan of settlers that gathered around encampments such as Fort Pitt (the future Pittsburgh), at the forks of the Ohio, Allegheny, and Monongahela Rivers, served as a buffer zone between the established colonial settlements along the Atlantic and Native tribes of the interior. A semi-criminal class of men, whose women were dismissed as harlots by the soldiers, they trailed in the army’s wake as camp followers, sometimes in the guise of traders, other times as whole families. Colonial commanders such as Swiss-born colonel Henry Bouquet in Pennsylvania treated them all as expendable troublemakers, but occasionally employed them in attacking and killing so-called savages. Like the vagrants rounded up in England to fight foreign wars, these colonial outcasts had no lasting social value. In 1759, Bouquet argued that the only hope for improving the colonial frontier was through regular pruning. For him, war was a positive good when it killed off the vermin and weeded out the rubbish. They were “no better than savages,” he wrote, “their children brought up in the Woods like brutes, without any notion of Religion, [or] Government.” Nothing man could devise “improved the breed.”

“Crackers” first appeared in the records of British officials in the 1760s and described a population with nearly identical traits. In a letter to Lord Dartmouth, one colonial British officer explained that the people called “crackers” were “great boasters,” a “lawless set of rascals on the frontiers of Virginia, Maryland, the Carolinas and Georgia, who often change their places of abode.” As backcountry “banditti,” “villains,” and “horse thieves,” they were dismissed as “idle stragglers” and “a set of vagabonds often worse than the Indians.”

An Anglican minister, Charles Woodmason, who traveled for six years in the Carolina wilderness in the 1760s, offered the most damning portrait of the lazy, licentious, drunken, and whoring men and women whom he adjudged the poorest excuses for British settlers he had ever met.

The “cracking traders” of the 1760s were described as noisy braggarts, prone to lying and vulgarity. One could also “crack” a jest, and crude Englishmen “cracked” wind. Firecrackers gave off a stench and were loud and disruptive as they snapped, crackled, and popped. A “louse cracker” referred to a lice-ridden, slovenly, nasty fellow.

Another significant linguistic connection to the popular term was the adjective “crack brained,” which denoted a crazy person and was the English slang for a fool or “idle head.” Idleness in mind and body was a defining trait. In one of the most widely read sixteenth-century tracts on husbandry, Thomas Tusser offered the qualifying verse, “Two good haymakers, worth twenty crackers.” As the embodiment of waste persons, they whittled away time, producing only bluster and nonsense. American crackers were aggressive. Their “delight in cruelty” meant they were not just cantankerous but dangerous.

The persistence of the squatter and cracker allows us to understand how much more limited social mobility was along the frontier than loving legend has it. In the Northwest (Ohio, Indiana, Illinois, Michigan, and Wisconsin Territories), the sprawling upper South (Kentucky, Tennessee, Missouri, and Arkansas Territories), and the Floridas (East and West), classes formed in a predictable manner. Speculators and large farmers—a mix of absentee land investors and landowning gentry—had the most power and political influence, and usually had a clear advantage in determining how the land was parceled out. The middling landowners had personal or political connections to the large landowning elite.

With this flood of new settlers, squatters made their presence known. Sometimes identified as families, at other times as single men, they were viewed as a distinct and troublesome class. In the Northwest Territory, they were dismissed as unproductive old soldiers, rubbish that needed to be cleared away before a healthy commercial economy could be established. President Jefferson termed them “intruders” on public lands. Some transients found subsistence as hired laborers. All of them existed on the margins of the commercial marketplace.

Educated observers feared social disorder, particularly after the financial panic of 1819, when political writers predicted in the West a “numerous population, in a state of wretchedness.” Increasing numbers of poor settlers and uneducated squatters were “ripe for treason and spoil”—a familiar refrain recalling the language circulated during Shays’ Rebellion in 1786. In the wake of the panic, the federal government devised a program of regulated land sales that kept prices high enough to weed out the lowest classes.

Federal laws for purchasing land were weighted in favor of wealthier speculators.

Americans tend to forget that Andrew Jackson was the first westerner elected president. Tall, lanky, with the rawboned look of a true backwoodsman, he wore the harsh life of the frontier on his face and literally carried a bullet next to his heart. Ferocious in his resentments, driven to wreak revenge against his enemies, he often acted without deliberation and justified his behavior as a law unto himself. His controversial reputation made him the target of attacks that painted him as a Tennessee cracker.

Jackson’s personality was a crucial part of his democratic appeal as well as the animosity he provoked. He was the first presidential candidate to be bolstered by a campaign biography. He was not admired for statesmanlike qualities, which he plainly lacked in comparison to his highly educated rivals John Quincy Adams and Henry Clay. His supporters adored his rough edges, his land hunger, and his close identification with the Tennessee wilderness. As a representative of America’s cracker country, Jackson unquestionably added a new class dimension to the meaning of democracy. But the message of Jackson’s presidency was not about equality so much as a new style of aggressive expansion. In 1818, General Andrew Jackson invaded Florida without presidential approval; as president, he supported the forced removal of the Cherokees from the southeastern states and willfully ignored the opinion of the Supreme Court. Taking and clearing the land, using violent means if necessary, and acting without legal authority, Jackson was arguably the political heir of the cracker and squatter.

Though by the 1830s he would come to be known as a bear hunter and “Lion of the West,” David Crockett was a militia scout and lieutenant, justice of the peace, town commissioner, state representative, and finally a U.S. congressman. He was first elected to the House of Representatives in 1827. What makes the historic David Crockett interesting is that he was self-taught, lived off the land, and (most notably for us) became an ardent defender of squatters’ rights—for he had been a squatter himself. As a politician he took up the cause of the landless poor.

In Congress he opposed large planters’ engrossment of vast tracts of land. He championed a bill that would have sold land directly from the federal government to squatters at low prices. He also opposed the practice of having courts hire out insolvent debtors to work off fees—an updated variation on indentured servitude.

Representative Crockett may have compared speculators to sneaky coons in an 1824 speech before the Tennessee House, but he never lost sight of the legal ploys used to trick poorer settlers out of their land warrants. In the end, the man, not the legend, did a better job of exposing class conflict in the backcountry, where real speculators were routinely pitted against real squatters.

Democrat Andrew Jackson’s stormy relationship with Crockett was replicated again and again with any number of contemporaries over the course of a career that was built on sheer will and utter impulse.

Whether supporters portrayed him as the conquering hero or his enemies labeled him King Andrew, all focused on his volatile emotions. He certainly lacked the education and polite breeding of his presidential predecessors. His political rise came through violence, having slaughtered the Red Stick faction of the Creek Nation in the swamps of Alabama in 1813–14, while leaving hundreds of British soldiers dead in the marshes of New Orleans in January 1815. Jackson bragged about the British death toll, as did American poets. And it was no exaggeration. Bodies floated in rivers and streams, and bones of the vanquished were found by travelers decades later.

“Boisterous in ordinary conversation, he makes up in oaths what he lacks in arguments.” Not known for his subtle reasoning, Jackson was blunt in his opinions and quick to resent any who disagreed with him. Shouting curses put him in the company of both common soldiers and uncouth crackers.

Jackson’s aggressive style, his frequent resorting to duels and street fights, his angry acts of personal and political retaliation seemed to fit what one Frenchman with Jacksonian sympathies described as the westerner’s “rude instinct of masculine liberty.”

After New Orleans, Jackson led his army into Spanish Florida in 1818. He began by raising troops in Tennessee without waiting for the governor’s approval, then invaded East Florida under the guise of arresting a handful of Seminole Indians who were accused of attacking American settlers. When he attacked the fortified Spanish at Pensacola, what had begun as a foray to capture Indians quickly turned into a full-scale war and occupation.

Jackson went beyond squatting on Spanish soil. He violated his orders and ignored international law. After overtaking several Florida towns and arresting the Spanish governor, he executed two British citizens without real cause. The British press had a field day, calling the U.S. major general a “ferocious Yankee pirate with blood on his hands.” In a devastating caricature, Jackson appeared as a swarthy, swaggering bandit flanked by a corps of militiamen who were no more than ragged, shoeless brutes, beating drums with bones and wearing skulls instead of hats.

The pirate who doubled as a backcountry cracker bruiser was unrestrained and unrestrainable. In the Florida invasion, he was reportedly aided by squatters dressed up as “white savages,” who may in fact have been the true catalyst behind Jackson’s controversial action. The Florida conflict had all the signs of a squatters’ war. Soldiers reported that Seminole warriors only attacked “cracker houses,” leaving those of British or northern settlers untouched.

Prominent critics insisted on a congressional investigation. The powerful Speaker of the House, Henry Clay, demanded the rogue general’s censure. Jackson went to Washington, damned the established legal authorities, and told Secretary of State John Quincy Adams that the entire matter of Florida was between President Monroe and himself—and no one else. Rumors circulated that Jackson had threatened to cut off the ears of some senators because they had dared to investigate—and humiliate—him on the national stage.

In Jackson’s crude lexicon, territorial disputes were to be settled by violent means, not by words alone. He explained his Indian policy as the right of “retaliatory vengeance” against “inhumane bloody barbarians.” In 1818, he was heralded in a laudatory biography as a kind of backcountry Moses, administering justice with biblical wrath. To those who protested his lack of regard for international law or constitutional details, defenders claimed that he was “too much a patriot in war, to suffer the scruples of a legal construction.”

Few of Jackson’s critics were buying the chivalrous portrait his defenders presented. He was not protecting women and children so much as opening up Florida lands to squatters and roughs and other uncivilized whites. But unlike Crockett, Jackson was never a champion of squatters’ rights. When ordered to remove them, he used the military to do the job. Yet at the same time he favored white possession of the land in the same way squatters had always defended their claims: those who cleared and improved the land were worthy occupants. Jackson’s thinking shaped his Indian removal policy as president. He argued that Indians should not be treated as sovereign nations with special claims on the public domain, but as a dependent class.

Could the members of the investigation committees fully appreciate the difficulties while sitting at home, their families safe from harm? The men censuring Jackson, whom the Kentucky congressman mocked as the “young sweet-smelling and powdered beau of the town,” were out of their league. With this clever turn of phrase, he recast Jackson’s foes as beaus and dandies, the classic enemies of crackers and squatters. Walker had tapped into a dominant class motif of cracker democracy, dating back at least to 1790, when the cracker-versus-beau plotline began to take shape.

The beau was an effete snob, and his ridicule an uncalled-for taunt. The real men of America were Jacksonian, the hearty native sons of Tennessee and Kentucky. They fought the wars. They opened up the frontier through their sacrifice and hardship. They fathered the next generation of courageous settlers. Defensive westerners thus attached to Jackson their dreams and made him a viable presidential candidate.

When John Quincy Adams supporters circulated a note written by Jackson filled with misspellings and bad grammar, Jacksonians praised him as “self-taught.” If his lack of diplomatic experience made him “homebred,” this meant that he was less contaminated than the former diplomat Adams by foreign ideas or courtly pomp. The class comparison could not be ignored: Adams had been a professor of rhetoric at Harvard, while his Tennessee challenger was “sprung from a common family,” and had written nothing to brag about. Instinctive action was privileged over unproductive thought.

The candidate’s private life came under equal scrutiny. His irregular marriage became scandalous fodder during the election of 1828. His intimate circle of Tennessee confidants scrambled to find some justification for the couple’s known adultery. John Overton, Jackson’s oldest and closest friend in Nashville, came up with the story of “accidental bigamy,” claiming that the couple had married in good conscience, thinking that Rachel’s divorce from her first husband had already been decreed. But the truth was otherwise. Rachel Donelson Robards had committed adultery, fleeing with her paramour Jackson to Spanish-held Natchez in 1790. They had done so not out of ignorance, and not on a lark, but in order to secure a divorce from her husband. Desertion was one of the few recognized causes of divorce.

In the ever-expanding script detailing Jackson’s misdeeds, adultery was just one more example of his uncontrolled passions. Wife stealing belonged to the standard profile of the backwoods aggressor who refused to believe the law applied to him. In failing to respect international law, he had conquered Florida; in disregarding his wife’s first marriage contract, he simply took what he wanted.

Jackson’s candidacy changed the nature of democratic politics. One political commentator noted that Jackson’s reign ushered in the “game of brag.” Jacksonians routinely exaggerated their man’s credentials, saying he was not just the “Knight of New Orleans,” the country’s “deliverer,” but also the greatest general in all human history.

Bragging had a distinctive class dimension in the 1820s and 1830s. In a satire published in Tennessee, a writer took note of the strange adaptations of the code of chivalry in defense of honor. The story involved a duel between one Kentucky “Knight of the Red Rag” and a “great and mighty Walnut cracker” of Tennessee. The nutcracker gave himself an exalted title: “duke of Wild Cat Cove, little and big Hog Thief Creek, Short Mountain, Big Bore Cave and Cuwell’s Bridge.” So what did this kind of posturing mean? Like certain masters of gangsta rap in the twenty-first century, crackers had to make up for their lowly status by dressing themselves up in a boisterous verbal garb.

While Jackson had little interest in squatters’ rights, his party did shift the debate in their favor. Democrats supported preemption rights, which made it easier and cheaper for those lacking capital to purchase land. Preemption granted squatters the right to settle, to improve, and then to purchase the land they occupied at a “minimum price.” The debate over preemption cast the squatter in a more favorable light. For some, he was now a hardworking soul who built his cabin with his own hands and had helped to clear the land, which benefited all classes.

Thomas Hart Benton, in quitting Tennessee and moving to Missouri, buried the hatchet with Jackson. As an eminent senator during and after Jackson’s two terms in office, he pushed through preemption laws, culminating in the “Log Cabin Bill” of 1841. But Benton’s thinking was double-edged: yes, he wished to give squatters a chance to purchase a freehold, but he was not above treating them as an expendable population. In 1839, he proposed arming squatters, giving them land and rations as an alternative to renewing the federal military campaign against the Seminoles in Florida. By this, Benton merely revived the British military tactic of using squatters as an inexpensive tool for conquering the wilderness.

The presidential campaign of 1840 appears to be the moment when the squatter morphed into the colloquial common man of democratic lore. Both parties now embraced him. Partisans of Whig presidential candidate William Henry Harrison claimed that he was from backwoods stock. This was untrue. Harrison was born into an elite Virginia planter family, and though he had been briefly a cabin dweller in the Old Northwest Territory, by the time he ran for office that cabin had been torn down and replaced with a grand mansion. Kentuckian Henry Clay, who vied with him for the Whig nomination, celebrated his prizewinning mammoth hog—named “Corn Cracker,” no less. The new class politics played out in trumped-up depictions of log cabins, popular nicknames, hard-cider drinking, and coonskin caps.

The squatter may have been tamed, at least in the minds of some, but political equality did not come to America in the so-called Age of Jackson. Virginia retained property qualifications for voting until 1851; Louisiana and Connecticut until 1845; North Carolina until 1857. Tennessee did not drop its freehold restriction until 1834—after Jackson had already been elected to a second term. Eight states passed laws that disenfranchised paupers, the urban poor.

Dirt-poor southerners living on the margins of plantation society became even more repugnant as “sandhillers” and pathetic, self-destructive “clay-eaters.” It was at this moment that they acquired the most enduring insult of all: “poor white trash.” The southern poor were not just lazy vagrants; now they were odd specimens in a collector’s cabinet of curiosities, a diseased breed, and the degenerate spawn of a “notorious race.” A new nomenclature placed the lowly where they would become familiar objects of ridicule in the modern age. Though “white trash” appeared in print as early as 1821, the designation gained widespread popularity in the 1850s.

The shift seemed evident in 1845 when a newspaper reported on Andrew Jackson’s funeral procession in Washington City. As the poor crowded along the street, it was neither crackers nor squatters lining up to see the last hurrah of Old Hickory. Instead, it was “poor white trash” who pushed the poor colored folk out of the way to get a glimpse of the fallen president. What made the ridiculed breed so distinctive? Its ingrained physical defects. In descriptions of the mid-nineteenth century, ragged, emaciated sandhillers and clay-eaters were clinical subjects, the children prematurely aged and deformed with distended bellies. Observers looked beyond dirty faces and feet and highlighted the ghostly, yellowish white tinge to the poor white’s skin—a color they called “tallow.” Barely acknowledged as members of the human race, these oddities with cotton-white hair and waxy pigmentation were classed with albinos. Highly inbred, they ruined themselves through their dual addiction to alcohol and dirt. In the 1853 account of her travels in the South, Swedish writer Fredrika Bremer remarked that in consuming the “unctuous earth,” clay-eaters were literally eating themselves to death.

White trash southerners were classified as a “race” that passed on horrific traits, eliminating any possibility of improvement or social mobility. If these Night of the Living Dead qualities were not enough, critics charged that poor whites had fallen below African slaves on the scale of humanity. They marked an evolutionary decline, and they foretold a dire future for the Old South. If free whites produced feeble children, how could a robust democracy thrive? If whiteness was not an automatic badge of superiority, a guarantee of the homogeneous population of independent, educable freemen, as Jefferson imagined, then the ideals of life, liberty, and the pursuit of happiness were unobtainable.

The Republican Party (1854) declared that poor whites were proof positive of the debilitating effects of slavery on free labor. A slave economy monopolized the soil, while closing off opportunities for non-slaveholding white men to support their families and advance in a free-market economy. Slavery crushed individual ambition, inviting decay and death, and draining vitality from the land and its vulnerable inhabitants. Poor whites were the hapless victims of class tyranny and a failed democratic inheritance.

Proslavery southerners took a different ideological turn, defending class station as natural. Conservative southern intellectuals became increasingly comfortable with the notion that biology was class destiny. In his 1860 Social Relations in Our Southern States, Alabamian Daniel Hundley denied slavery’s responsibility for the phenomenon of poverty, insisting that poor whites suffered from a corrupt pedigree and cursed lineage. Class was congenital, he believed.

Hundley’s ideology appealed broadly. Many northerners, even those who opposed slavery, saw white trash southerners as a dangerous breed. No less an antislavery symbol than Harriet Beecher Stowe agreed with the portrait penned by the Harvard-educated future Confederate Hundley. Though she became famous (and infamous) for her bestselling antislavery novel Uncle Tom’s Cabin (1852), Stowe’s second work told a different story. In Dred: A Tale of the Great Dismal Swamp (1856), she described poor whites as a degenerate class, prone to crime, immorality, and ignorance.

By the time of annexation, Anglo-Texans routinely ridiculed the dark-skinned, lower-class Tejanos as a sign of degradation among the native population. Here again, common language underscored the degradation of bloodlines. Increasingly, Mexicans were thrown together with blacks and Indians and contemptuously dismissed by Americans in general as a “mongrel race.” “Mongrel” was just another word for “half-breeds” or “mulattoes,” those of a “polluted” lineage. In 1844, Pennsylvania senator and future president James Buchanan crudely described an “imbecile and indolent Mexican race,” insistent that no Anglo-Saxon should ever be under the political thumb of his inferior. His colleague from New Hampshire, former treasury secretary Levi Woodbury, elevated the Texas Revolution into a racial war of liberation: “Saxon blood had been humiliated, and enslaved to Moors, Indians, and mongrels.” Such rhetoric had appeal far beyond the bloviated oratory of politicians. One Texas woman confidently wrote to her mother, “You feel the irresistible necessity that one race must subdue the other,” and “they, of the superior race, can easily learn to look upon themselves as men of Destiny.”

California’s early history had been as grim as that of Texas. Both of these extensive territories were overrun with runaway debtors, criminal outcasts, rogue gamblers, and ruthless adventurers who thrived in the chaotic atmosphere of western sprawl. The California gold rush attracted not only grizzled gold diggers but also prostitutes, fortune hunters, and con men selling fraudulent land titles.

Built tall and rail thin, Hinton Rowan Helper must have stood out among the motley assortment of émigrés. He spent three long years in California and came away hating the state. Despite all the harsh things he had to say about almost everyone he met, he was obliged to admit that most imported women had little choice but prostitution if they wished to survive in the unruly town of San Francisco.

The new campaign turned the squatter into an entitled freeman. To be a homesteader was to be of the American people—who collectively owned as their inalienable “birthright” all the public land in the territories. Unfortunately, blocked by southern votes in Congress, the “inalienable homestead” would not become law until 1862, after secession.

The new Republicans revived the old critique of Washington and Jefferson: southern agriculture depleted the soil and turned the land into waste. Helper published tables proving the North’s greater productivity over the South.

All knew that poor whites were cursed because they were routinely consigned to the worst land: sandy, scrubby pine, and swampy soil. This was how they became known in the mid-nineteenth century as “sandhillers” and “pineys.” Forced to the margins, often squatting on land they did not own, they were regularly identified with the decaying soil.

Suffrage could be stripped away from any freeman by the planter-controlled courts. In the 1840s and 1850s, North Carolina, South Carolina, Louisiana, and Virginia kept poor whites at bay by retaining property qualifications for holding office. Social ostracism was an even greater mark of shame, as planters forced poor whites to use the back door when entering the master’s house. Slaves called them “stray goats” when they came begging for food or supplies. Southern reformers were just as disparaging. In a speech before the South Carolina Institute in 1851, industrial advocate and cotton mill owner William Gregg underscored the evolutionary argument in saying that “our poor white people . . . are suffered to while away an existence in a state but one step in advance of the Indian of the forest.”

Few white trash squatters had any access to free soil or to homesteads. They lived instead like scavengers, vagrants, and thieves—at least according to reports by wealthy southerners. But the truth is more complicated. Many worked as tenants and day laborers alongside slaves; during harvesttime, poor men and women worked day and night for paltry wages. In cities such as Baltimore and New Orleans, some of the most backbreaking labor—working on the railroads, paving streets, dray driving, ditch building—was chiefly performed by underpaid white laborers.

In the 1850s, poor whites had become a permanent class. As non-slaveholders, they described themselves as “farmers without farms.” Small-scale slaveholders tended to be related to large planters, a reminder of how much pedigree and kinship mattered. Slave owners had unusual financial instruments that situated them above non-slaveholders: they raised slave children as an investment, as an invaluable source of collateral and credit when they sought to obtain loans. Whether they stayed put or moved west, poor whites occupied poor land. Nearly half left the Atlantic South for Texas, Arkansas, Mississippi, and elsewhere, and still poor whites as a percentage in the original slave states remained fairly constant.

Ten years before he became president of the Confederacy, Senator Jefferson Davis of Mississippi had argued that the slave states enjoyed greater stability. Recognizing that “distinctions between classes have always existed, everywhere, and in every country,” he observed that two distinct labor systems coexisted in the United States. In the South, the line between classes was drawn on the basis of “color,” while in the North the boundary had been marked “by property, between the rich and poor.” He insisted that “no white man, in a slaveholding community, was the menial servant of anyone.” Like many other proslavery advocates, Davis was convinced that slavery had elevated poor whites by ensuring their superiority over blacks. He was wrong: in the antebellum period, class hierarchy was more extreme than it ever had been.

Jefferson Davis and James Hammond spoke the same language. Confederate ideology converted the Civil War into a class war. The South was fighting against degenerate mudsills and everything they stood for: class mixing, race mixing, and the redistribution of wealth. By the time of Abraham Lincoln’s election, secessionists claimed that “Black Republicans” had taken over the national government, promoting fears of racial degeneracy. But a larger danger still loomed. As one angry southern writer declared, the northern party should not be called “Black Republicans,” but “Red Republicans,” for their real agenda was not just the abolition of slavery, but inciting class revolution in the South.

Class mattered for another reason. Confederate leaders knew they had to redirect the hostility of the South’s own underclass, the non-slaveholding poor whites, many of whom were in uniform. Charges of “rich man’s war and poor man’s fight” circulated throughout the war, but especially after the Confederate Congress passed the Conscription Act of 1862, instituting the draft for all men between the ages of eighteen and thirty-five. Exemptions were available to educated elites, slaveholders, officeholders, and men employed in valuable trades—leaving poor farmers and hired laborers the major target of the draft. Next the draft was extended to the age of forty-five, and by 1864 all males from seventeen to fifty were subject to conscription.

The Union army and Republican politicians advanced a strategy aimed at further exploiting class divisions between the planter elite and poor whites in the South. Generals Ulysses S. Grant and William T. Sherman, as well as many Union officers, believed they were fighting a war against a slaveholding aristocracy, and that winning the war and ending slavery would liberate not only slaves but also poor white trash. In his memoir, Grant voiced the class critique of the Union command. There would never have been secession, he wrote, if demagogues had not swayed non-slaveholding voters and naïve young soldiers to believe that the North was filled with “cowards, poltroons, and negro-worshippers.”

Not surprisingly, the evidence shows that southern whites lagged behind northerners in literacy by a margin of at least six to one. Prominent southern men defended the disparity in educational opportunity.




Not enough lithium to electrify transportation

Preface. Breaking news: the New York Times just reported that only 25 years of zinc reserves remain, and that lithium reserves are even smaller, just 5% of zinc reserves (Penn 2018). The authors of the paper below decline to give a date when lithium reserves will run short, but the Times figure implies that shortages could arrive within a decade or less.

This is by far the best paper I’ve found explaining lithium reserves, lithium chemistry, recycling, political implications, and more. I’ve left out the charts, graphs, references, and much of the text; to see them, go to the original paper via the link in the title of the article.

I personally don’t think that electric cars, and certainly not trucks, will ever be viable, because battery development is too slow and oil can be hundreds of times more energy dense than a battery per unit weight or volume. Batteries will never be light enough for trucks: lithium is already the 3rd lightest element, and we certainly can’t build batteries out of hydrogen and helium. The bottom line is that the laws of physics come into play, not only for weight in trucks but for any hope of approaching the energy density of diesel fuel. The best possible battery, with the greatest potential difference between oxidation and reduction, would be lithium-fluorine, but after many decades of trying, scientists are far from developing such a battery; indeed, there are so many problems it may be impossible.

I explain these issues in my post Who Killed the Electric Car and in my book When Trucks Stop Running: Energy and the Future of Transportation. In order for solar and wind to penetrate the grid more fully, their energy needs to be stored for when the sun isn’t shining and the wind isn’t blowing. The main technology for doing this is lithium-ion batteries, which use orders of magnitude more lithium than a car or even a truck battery. A few quotes from my book:

Li-ion energy storage batteries are more expensive than PbA or NaS, can be charged and discharged only a discrete number of times, can fail or lose capacity if overheated, and preventing overheating is expensive. Lithium does not grow on trees. The amount of lithium needed for utility-scale storage is likely to deplete known resources (Vazquez 2010).

To provide enough energy for 1 day of storage for the United States, li-ion batteries would cost $11.9 trillion, take up 345 square miles, and weigh 74 million tons (DOE/EPRI 2013).
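As a rough sanity check on the scale of those DOE/EPRI numbers, one can divide them by US daily electricity consumption. The ~11 TWh/day figure below is my assumption (about 4,000 TWh/yr), not a number from the report:

```python
# Implied unit cost and mass of li-ion storage sized for one day of US demand.
cost_usd = 11.9e12               # $11.9 trillion (DOE/EPRI 2013)
mass_tons = 74e6                 # 74 million tons (DOE/EPRI 2013)
daily_demand_kwh = 11e9          # ~11 TWh/day US consumption (my assumption)

usd_per_kwh = cost_usd / daily_demand_kwh           # ≈ $1,080 per kWh of storage
kg_per_kwh = mass_tons * 1000 / daily_demand_kwh    # ≈ 6.7 kg/kWh, i.e. ~150 Wh/kg
```

Both implied figures are in the plausible range for turnkey li-ion systems of that era, which suggests the trillion-dollar total is driven by the sheer scale of demand rather than by pessimistic unit costs.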

Barnhart et al. (2013) looked at how much materials and energy it would take to make batteries that could store up to 12 hours of average daily world power demand, 25.3 TWh. Eighteen months of world-wide primary energy production would be needed to mine and manufacture these batteries, and material production limits were reached for many minerals even when energy storage devices got all of the world’s production (with zinc, sodium, and sulfur being the exceptions). Annual production by mass would have to double for lead, triple for lithium, and go up by a factor of 10 or more for cobalt and vanadium, driving up prices. The best to worst in terms of material availability are: CAES, NaS, ZnBr, PbA, PHS, Li-ion, and VRB (Barnhart 2013).
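For context on Barnhart's 25.3 TWh target, dividing it back out shows the implied average world power demand and per-person storage (a simple consistency check of my own, not from the paper; the population figure is an assumption):

```python
storage_twh = 25.3               # 12-hour storage target in Barnhart et al. (2013)
hours = 12
implied_avg_power_tw = storage_twh / hours      # ≈ 2.1 TW average world power demand

world_population = 7e9           # assumed ~7 billion people (circa 2013)
kwh_per_person = storage_twh * 1e9 / world_population   # ≈ 3.6 kWh of storage each
```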

Anyone not yet convinced that lithium is too limited to replace petroleum should also consider the tremendous environmental harm done by lithium mining and its demands on water and other scarce resources (Katwala 2018).

Alice Friedemann, author of “When Trucks Stop Running: Energy and the Future of Transportation”, 2015, Springer, and “Crunch! Whole Grain Artisan Chips and Crackers”. Podcasts: Derrick Jensen, Practical Prepping, KunstlerCast 253, KunstlerCast 278, Peak Prosperity, XX2 report

Vikström, H., Davidsson, S., Höök, M. 2013. Lithium availability and future production outlooks. Applied Energy, 110(10): 252-266. 28 pages

Lithium is a highly interesting metal, in part due to the increasing interest in lithium-ion batteries. Several recent studies have used different methods to estimate whether lithium production can meet an increasing demand, especially from the transport sector, where lithium-ion batteries are the most likely technology for electric cars. Reserve and resource estimates of lithium vary greatly between studies, and the question of whether annual lithium production rates can meet a growing demand is seldom adequately explained. This study presents a review and compilation of recent estimates of the quantities of lithium available for exploitation and discusses the uncertainty of, and differences between, these estimates. Mathematical curve-fitting models are also used to estimate possible future annual production rates. These possible production rates are compared to the potential increase in lithium demand if the International Energy Agency’s Blue Map Scenarios for electrification of the car fleet are fulfilled. We find that the availability of lithium could in fact be a problem for fulfilling this scenario if lithium-ion batteries are to be used. This indicates that other battery technologies might have to be implemented to enable an electrification of road transport.


  • Review of reserves, resources and key properties of 112 lithium deposits
  • Discussions of widely diverging results from recent lithium supply estimates
  • Forecasting future lithium production by resource-constrained models
  • Exploring implications for future deployment of electric cars


Global transportation relies mainly on a single fossil resource, petroleum, which supplies 95% of its total energy [1]. In fact, about 62% of all world oil consumption takes place in the transport sector [2]. Oil prices have oscillated dramatically over the last few years: the price of oil reached $100 per barrel in January 2008 before skyrocketing to nearly $150/barrel in July 2008. A dramatic price collapse followed in late 2008, but oil prices have at present returned to over $100/barrel. Concerns about peak oil, and the imminent oil production limits it implies, have also been voiced by various studies [3–6].

It has been found that continued oil dependence is environmentally, economically and socially unsustainable [7].

The price uncertainty and decreasing supply might result in severe challenges for different transporters. Nygren et al. [8] showed that even the most optimistic oil production forecasts implied pessimistic futures for the aviation industry. Curtis [9] found that globalization may be undermined by peak oil’s effect on transportation costs and reliability of freight.

Barely 2% of the world’s electricity is used by transportation [2], and most of that is consumed by trains, trams, and trolley buses.

A high future demand for Li in battery applications may arise if society chooses to employ Li-ion technologies to decarbonize the road transport sector.

Batteries are at present the second most common use, and this use is increasing rapidly as li-ion batteries for portable electronics [12], as well as for electric and hybrid cars, become more frequent. For example, lithium consumption for batteries in the U.S. increased by 194% from 2005 to 2010 [12]. Relatively few academic studies have focused on the abundance of the raw materials needed to supply a potential increase in Li demand from the transport sector [13]. Lithium demand is growing, and it is important to investigate whether this could lead to a shortfall in the future.

[My comment: according to Barnhart (2013), “utility scale energy storage batteries in commercial production are lithium, and if this continues, this sector alone would quickly consume all available lithium supplies.”]

Aim of this study

Recently, a number of studies have investigated future supply prospects for lithium [13–16]. However, these studies reach widely different results in terms of available quantities, possible production trajectories, as well as expected future demand. The most striking difference is perhaps the widely different estimates for available resources and reserves, where different numbers of deposits are included and different types of resources are assessed. It has been suggested that mineral resources will be a future constraint for society [17], but a great deal of this debate is often spent on the concept of geological availability, which can be presented as the size of the tank. What is frequently not reflected upon is that society can only use the quantities that can be extracted at a certain pace and be delivered to consumers by mining operations, which can be described as the tap. The key concept here is that the size of the tank and the size of the tap are two fundamentally different things.
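To make the tank-versus-tap distinction concrete, here is a deliberately crude sketch. The per-vehicle lithium figure is my own illustrative assumption, not a number from this study:

```python
# Even a very large "tank" (reserves) is irrelevant if the "tap"
# (annual production) flows slowly.
annual_production_t = 25_000     # ~25 kt/yr, recent world lithium production
li_per_ev_kg = 8                 # assumed kg of lithium per EV battery (illustrative)

evs_supported_per_year = annual_production_t * 1000 / li_per_ev_kg
# ≈ 3.1 million cars per year of lithium supply — a small fraction of the
# tens of millions of cars sold annually, no matter how many Mt sit in
# the geological "tank".
```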

This study attempts to present a comprehensive review of known lithium deposits and the estimated quantities of lithium available for exploitation, and to discuss the uncertainty in and differences among published studies, in order to bring clarity to the subject. The estimated reserves are then used as a constraint in a model of possible future lithium production, and the results of the model are compared to the possible future demand from an electrification of the car fleet. The forecasts are based on open, public data and should be used for estimating long-term growth and trends. This is not a substitute for short-term economic prognoses, but rather a complementary vision.

Data sources

The United States Geological Survey (USGS) has been particularly useful for obtaining production data series, but the Swedish Geological Survey (SGU) and the British Geological Survey (BGS) also deserve honourable mention for providing useful material. Kushnir and Sandén [18] and Tahil [19, 20], along with many other recent lithium works, have also been useful. Kesler et al. [21] helped to provide a broad overview of general lithium geology.

Information on individual lithium deposits has been compiled from numerous sources, primarily building on the tables found in [13–16]. In addition, several specialized articles about individual deposits have been used, for instance [22–26]. Public industry reports and annual yearbooks from mining operators and lithium producers, such as SQM [27], Roskill [28] or Talison Lithium [29], also helped to create a holistic data base.

In this study, we collected information on global lithium deposits. Country of occurrence, deposit type, main mineral, and lithium content were gathered, as well as published estimates of reserves and resources. Some deposits had detailed data available for all parameters, while others had very little information available. Widely diverging estimates of reserves and resources could sometimes be found for the same deposit; in such cases, the full interval between the minimum and maximum estimates is presented. Deposits without reserve or resource estimates are included in the data set but do not contribute to the total. Only data and information available in the public and academic spheres were compiled in this study. It is likely that undisclosed and/or proprietary data could add to the world’s lithium volume, but no conclusion about the extent could be drawn from the available data.

Geological overview

In order to properly estimate global lithium availability, and a feasible reserve estimate for modelling future production, this section presents an overview of lithium geology. Lithium is named after the Greek word “lithos” meaning “stone”, represented by the symbol Li and has the atomic number 3. Under standard conditions, lithium is the lightest metal and the least dense solid element. Lithium is a soft, silver-white metal that belongs to the alkali group of elements.

Like all alkali elements, Li is highly reactive and flammable. For this reason, it never occurs freely in nature and appears only in compounds, usually ionic compounds. The nuclear properties of Li are peculiar, since its nuclei verge on instability and its two stable isotopes have among the lowest binding energies per nucleon of all stable nuclides. Due to this nuclear instability, lithium is less abundant in the solar system than 25 of the first 32 chemical elements [30].

Resources and reserves

An important frequent shortcoming in the discussion on availability of lithium is the lack of proper terminology and standardized concepts for assessing the available amounts of lithium. Published studies talk about “reserves”, “resources”, “recoverable resources”, “broad-based reserves”, “in-situ resources”, and “reserve base”.

A wide range of mineral reporting systems exists, such as NI 43-101, USGS, CRIRSCO, SAMREC, and the JORC code; further discussion and references concerning this can be found in Vikström [31]. The definitions and classifications used are often similar, but not always consistent, adding to the confusion when aggregating data. Consistent definitions may be used in individual studies, but figures from different methodologies are frequently combined, as there is no universal and standardized framework. In essence, the published literature is a jumble of inconsistent figures. If one does not know what the numbers really mean, they are not simply useless – they are worse, since they tend to mislead.

Broadly speaking, resources are generally defined as the geologically assured quantity that is available for exploitation, while reserves are the quantity that is exploitable under current technical and socioeconomic conditions. The reserves are what matter for production, while resources are largely an academic figure with little relevance for real supply. For example, usually less than one tenth of coal resources are considered economically recoverable [32, 33]. Kesler et al. [21] stress that available resources need to be converted into reserves before they can be produced and used by society. Still, some analysts seemingly use the terms ‘resources’ and ‘reserves’ synonymously.

It should be noted that actual reserves are dynamic and vary depending on many factors, such as available technology, economic demand, political issues, and social factors. Technological improvements may increase reserves by opening new deposit types for exploitation or by lowering production costs. Deposits that have been mined for some time can increase or decrease their reserves owing to the difficulty of determining ore grade and tonnage in advance [34]. Depletion and decreasing concentrations may increase recovery costs, thus lowering reserves. Declining demand and prices may also reduce reserves, while rising prices or demand may increase them. Political decisions, legal issues, or environmental policies may prohibit exploitation of certain deposits, despite the fact that significant resources may be available.

For lithium, resource/reserve classifications were typically developed for solid ore deposits. However, brine – presently the main lithium source – is a fluid and commonly used definitions can be difficult to apply due to pumping complications and varying concentrations.

Houston et al. [35] describe the problem in detail and suggest a change to NI 43-101 to account for these problems. If better standards were available for brines, estimates could be more reliable and accurate, as discussed by Kushnir and Sandén [18].

Environmental aspects and policy changes can also significantly influence recoverability. Introduction of clean air requirements and public resistance to surface mining in the USA played a major role in the decreasing coal reserves [33].

It is entirely possible that public outcry against surface mining, or concern for the environment in lithium-producing regions, will lead to restrictions that affect reserves. As an example, the water consumption of brine production is very high: Tahil [19] estimates that brine operations consume 65% of the fresh water in the Salar de Atacama region. [My comment: the Atacama gets only 0.6 inches of rain a year.]

Regarding future developments in recoverability, Fasel and Tran [36] simply assume that increasing lithium demand will result in more reserves being found as prices rise. So-called cumulative availability curves are sometimes used to estimate how reserves will change with changing prices, displaying the estimated amount of resource against the average unit cost, ranked from lowest to highest. This method is used by Yaksic and Tilton [14] to address lithium availability. The concept has its merits for describing theoretical availability, but the fact that it is based on average cost rather than marginal cost has been described as a major weakness: cumulative availability curves disregard the real cost structure and have little – if any – relevance for future prices and production rates [37].

Production and occurrence of lithium

The high reactivity of lithium makes its geochemistry complex and interesting. Lithium minerals are generally formed in magmatic processes. Lithium’s small ionic size makes it difficult for it to be included in the early stages of mineral crystallization; as a result, lithium remains in the molten parts, where it becomes enriched until it can solidify in the final stages [38].

At present, over 120 lithium-containing minerals are known, but few of them contain high concentrations or are frequently occurring. Lithium can also be found in naturally occurring salt solutions as brines in dry salt lake environments. Compared to the fairly large number of lithium mineral and brine deposits, few of them are of actual or potential commercial value. Many are very small, while others are too low in grade [39]. This chapter will briefly review the properties of those deposits and present a compilation of the known deposits.

Lithium mineral deposits

Lithium extraction from minerals is primarily done with minerals occurring in pegmatite formations. However, pegmatite is rather challenging to exploit due to its hardness, in conjunction with the generally problematic access to the belt-like deposits in which it usually occurs. Table 1 describes some typical lithium-bearing minerals and their characteristics. Australia is currently the world’s largest producer of lithium from minerals, mainly from spodumene [39]. Petalite is commonly used for glass manufacture due to its low iron content, while lepidolite, earlier used as a lithium source, has lost its importance due to its high fluorine content. Exploitation must generally be tailor-made for a particular mineral, as they differ quite significantly in chemical composition, hardness, and other properties [13]. Table 2 presents some mineral deposits and their properties.

Recovery rates for mining typically range from 60 to 70%, although significant treatment is required to transform the produced Li into a marketable form. For example, [40, 41] describe how lithium is produced from spodumene. The costs of acid, soda ash, and energy are a very significant part of the total production cost, but may be partially alleviated by the market demand for the sodium sulphate by-products.

Lithium brine deposits

Lithium can also be found in salt lake brines that have high concentrations of mineral salts. Such brines can be reached directly at the surface or deep underground in saline expanses located in very dry regions that allow salts to persist. High-concentration lithium brine is mainly found at high altitude, in locations such as the Andes and south-western China. Chile, the world’s largest lithium producer, derives most of its production from brines located at the large salt flat of Salar de Atacama.

Lithium has ionic properties similar to those of magnesium, since their ionic sizes are nearly identical, which makes it difficult to separate lithium from magnesium. A low Mg/Li ratio in a brine means that it is easier, and therefore more economical, to extract the lithium.

The ratio differs significantly among currently producing brine deposits, ranging from less than 1 to over 30 [14]. The lithium concentration in known brine deposits is usually quite low, ranging from 0.017% to 0.15%, with significant variability among the known deposits of the world (Table 3).
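Those grades mean that enormous volumes of brine must be pumped and evaporated per ton of lithium recovered. A back-of-envelope sketch (my illustration, using the grade range quoted above):

```python
# Lithium contained in one million tonnes of brine at the quoted grade range.
brine_mass_t = 1_000_000
grade_low, grade_high = 0.017 / 100, 0.15 / 100   # 0.017%–0.15% Li by mass

li_low_t = brine_mass_t * grade_low     # ≈ 170 t of lithium
li_high_t = brine_mass_t * grade_high   # ≈ 1,500 t of lithium
```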

Exploitation of lithium brines starts with the brine being pumped from the ground into evaporation ponds. The actual evaporation is enabled by incoming solar radiation, so it is desirable for the operation to be located in sunny areas with low annual precipitation rate. The net evaporation rate determines the area of the required ponds [42].

It can easily take between one and two years before the final product is ready to be used, and even longer in cold and rainy areas.

The long timescales required for production can make brine deposits ill suited to sudden changes in demand.

Table 3. Properties of known brine deposits in the world.

Lithium from sea water

The world’s oceans contain a number of metals, such as gold, lithium, and uranium, dispersed at low concentrations. The mass of the world’s oceans is approximately 1.35×10^12 Mt [47], making vast amounts of theoretical resources seemingly available. Eckhardt [48] and Fasel and Tran [36] state that more than 2,000,000 Mt of lithium is available from the seas, essentially making it an “unlimited” source given its geological abundance. Tahil [20] also notes that the oceans have been proclaimed an unlimited Li source since the 1970s.

The world’s oceans and some highly saline lakes do in fact contain very large quantities of lithium, but whether it will become practical and economical to produce lithium from this source is highly questionable.

For example, consider gold in sea water – in total nearly 7,000,000 tons. This is an enormous amount compared to the cumulative world production of 0.17 Mt accumulated since the dawn of civilization [49]. There are also several technical options available for gold extraction. However, the average gold concentration ranges from <0.001 to 0.005 ppb [50]. This means that one km3 of sea water would give only 5.5 kg of gold. The gold is simply too dilute to be viable for commercial extraction, and it is not surprising that all attempts to achieve success – including those of the Nobel laureate Fritz Haber – have failed to date.
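The gold comparison is easy to reproduce; the sketch below uses the ocean mass and the upper concentration bound quoted above (the paper's 5.5 kg/km3 corresponds to a slightly higher concentration within the quoted range):

```python
# Dissolved gold in seawater at the upper concentration bound.
ocean_mass_kg = 1.35e21          # ≈ 1.35*10^12 Mt of seawater [47]
au_mass_fraction = 5e-12         # 0.005 ppb gold by mass [50]
km3_seawater_kg = 1e12           # 1 km^3 of water ≈ 10^12 kg

gold_per_km3_kg = km3_seawater_kg * au_mass_fraction    # ≈ 5 kg per km^3
ocean_gold_mt = ocean_mass_kg * au_mass_fraction / 1e9  # ≈ 6.8 million tonnes total
```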

The average lithium concentration in the oceans has been estimated at 0.17 ppm [14, 36]. Kushnir and Sandén [18] argue that it is theoretically possible to use a wide range of advanced technologies to extract lithium from seawater – just as in the case of gold. However, no convincing methods have been demonstrated thus far. A small-scale Japanese experiment managed to produce 750 g of lithium metal by processing 4,200 m3 of water, with a recovery efficiency of 19.7% [36]. This approach has been described in more detail by others [51–53].

Grosjean et al. [13] point to the fact that even after decades of improvement, recovery from seawater is still 10–30 times more costly than production from pegmatites and brines. It is evident that huge quantities of water would have to be processed to produce any significant amount of lithium. Bardi [54] presents theoretical calculations on this, stating that a production volume of lithium comparable to present world production (~25 kt annually) would require 1.5×10^3 TWh of electrical energy for pumping through separation membranes, in addition to colossal volumes of seawater. Furthermore, Tahil [20] estimated that a seawater processing flow equivalent to the average discharge of the River Nile – 300,000,000 m3/day, or over 22 times the global petroleum industry’s flow of 85 million barrels per day – would give only 62 tons of lithium per day, or roughly 20 kt per year. A significant amount of fresh water and hydrochloric acid would also be required to flush out unwanted minerals (Mg, K, etc.) and extract the lithium from the adsorption columns [20].
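Tahil's Nile-scale figure follows almost directly from the 0.17 ppm concentration: a river-sized daily flow simply does not contain much more lithium than that, even at near-total recovery. A reproduction of the arithmetic, with an assumed seawater density:

```python
# Total lithium contained in a Nile-scale seawater flow at 0.17 ppm.
li_mass_fraction = 0.17e-6       # 0.17 ppm Li in seawater [14, 36]
flow_m3_per_day = 3.0e8          # Nile-scale throughput considered by Tahil [20]
seawater_density = 1025          # kg/m^3 (assumed typical value)

water_kg_per_day = flow_m3_per_day * seawater_density
li_content_t_per_day = water_kg_per_day * li_mass_fraction / 1000  # ≈ 52 t/day
li_kt_per_year = li_content_t_per_day * 365 / 1000                 # ≈ 19 kt/yr
```

In other words, the ~62 t/day and ~20 kt/yr figures already assume recovering essentially every lithium atom in the flow.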

In summary, extraction from seawater does not appear feasible and should not be considered viable in practice, at least not in the near future.

Estimated lithium availability

From data compilation and analysis of 112 deposits, this study concludes that 15 Mt is a reasonable reference case for global reserves in the near and medium term. 30 Mt is taken as a high-case estimate for available lithium reserves, a number also found in the upper range of the literature. These two estimates are used as constraints in this study's models of future production.

Estimates of world reserves and resources vary significantly among published studies. One main reason is likely that different deposits, as well as different numbers of deposits, are aggregated in different studies. Many studies, such as those presented by the USGS, do not explicitly state the number of deposits included and present only aggregated figures at a national level. Even when the number and identity of the deposits used are specified, analysts can arrive at widely different estimates (Table 5). It should be noted that a trend towards increasing reserves and resources over time can generally be found, in particular in USGS assessments. Early reports, such as Evans [56] or USGS [59], excluded several countries from the reserve estimates due to a lack of available information. This was mitigated in USGS [73], where reserve estimates for Argentina, Australia, and Chile were revised based on new information from governmental and industry sources. However, there are still relatively few assessments of reserves, in particular for Russia, and much future work is required to address this shortcoming. Gruber et al. [16] noted that 83% of global lithium resources can be found in six brine, two pegmatite and two sedimentary deposits. From our compilation it can also be seen that the distribution of global lithium reserves and resources is very uneven.

Three quarters of the total can typically be found in the ten largest deposits (Figures 1 and 2). USGS [12] notes that 85% of the global reserves are situated in Chile and China (Figure 3) and that Chile and Australia accounted for 70% of the world production of 28,100 tonnes in 2011 [12]. From Tables 2 and 3, one can note a significant spread in estimated reserves and resources for the deposits. This divergence is much smaller for minerals (5.6–8.2 Mt) than for brines (6.5–29.4 Mt), probably resulting from the difficulty of estimating brine accumulations consistently. Evans [75] also points to the problem of applying these frameworks to brine deposits, which are fundamentally different from solid ores.

Table 5. Comparison of published lithium assessments.


One factor that may or may not have large implications for future production is recycling. The projections presented in the production model of this study describe production of lithium from virgin materials. The total supply of lithium could increase significantly if high recycling rates of used lithium were achieved, as many studies note.

USGS [12] states that recycling of lithium has been insignificant historically, but that it is increasing as the use of lithium for batteries grows. However, recycling of lithium from batteries is still more or less non-existent, with a collection rate of used Li-ion batteries of only about 3% [93]. When Li-ion batteries are in fact recycled, it is usually not the lithium that is recovered, but other more precious metals such as cobalt [18].

Whether this will change in the future is uncertain and highly dependent on future metal prices, but it is still commonly assumed that lithium recycling will grow significantly in the near future. Goonan [94] claims that recycling rates for vehicle batteries will increase because such recycling systems already exist for lead-acid batteries. Kushnir and Sandén [18] argue that large automotive batteries will be technically easier to recycle than smaller batteries and claim that economies of scale will emerge as vehicle battery use increases. According to the IEA [95], full recycling systems are projected to be in place sometime between 2020 and 2030. Similar assumptions are made by more or less all studies dealing with future lithium production and use for electric vehicles. Kushnir and Sandén [18] state that recycling is commonly assumed to take place, enabling recycled lithium to cover a large part of demand, but they also conclude that the future recycling rate is highly uncertain.

There are several reasons to question the likelihood of high recycling shares for Li-ion batteries. Kushnir and Sandén [18] state that the economics of lithium recycling are currently poor and could worsen in the future. Sullivan and Gaines [96] argue that Li-ion battery chemistry is complex and still evolving, making it difficult for the industry to develop profitable pathways. Georgi-Maschler [93] highlights that two established processes exist for recycling Li-ion batteries, but one of them loses most of the lithium while recovering the other valuable metals. Ziemann et al. [97] state that lithium recovery from rechargeable batteries is not efficient at present, mainly due to the low lithium content of around 2% and the rather low price of lithium.

In this study we choose not to include recycling in the projected future supply, for several reasons. In the short term, towards 2015–2020, it cannot be considered likely that any considerable amount of lithium will be recycled from batteries, since it is currently not economical to do so and no proven methods for doing it at a large industrial scale appear to exist. Even if recycling lithium from batteries becomes economical, it will take time to build the recycling capacity. Also, battery lifetime is often projected to be 10 years or more, so expecting any significant amounts of lithium to be recycled within this period is not realistic for that reason either.

The recycling capacity is expected to be far from reaching significant levels before 2025, according to Wanger [92]. It is also important to distinguish the recycling rate of a product from the recycled content in new products. Even if a share of the product is recycled at the end of its life cycle, this is no guarantee that the share of recycled content in new products will be as high. The use of Li-ion batteries is projected to grow fast. If the growth is linear and high recycling rates are achieved, recycling could come to constitute a large part of lithium demand; but if the growth is exponential, recycling can never catch up with the growth that occurs during the roughly ten-year lag of the battery lifetime.

In a longer time perspective, the inclusion of recycling could be argued for on the grounds of expected technological refinement, but technology development is highly uncertain. Still, most studies include recycling as a major part of future lithium production, which can have very large implications for the results and conclusions drawn. Kushnir and Sandén [18] suggest that an 80% lithium recovery rate is achievable over a medium time frame. The scenarios in Gruber et al. [16] assume recycling participation rates of 90%, 96% and 100%; in their scenario with the highest assumed recycling, the quantity of lithium that needs to be mined decreases to only about 37% of demand. Wanger [92] looks at a shorter time perspective and estimates that a 40% or 100% recycling rate would reduce lithium consumption by 10% or 25%, respectively, by 2030. Mohr et al. [15] assume that the recycling rate starts at 0% and approaches a limit of 80%, with recycled lithium making up significant parts of production only several decades into the future. IEA [95] projects that full recycling systems will be in place around 2020–2030.
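The lag argument can be made concrete. If demand grows exponentially at rate g and batteries only return for recycling after a lifetime of L years, the material coming back today was sold L years ago, which caps the recycled share of current demand. A hedged sketch (our illustration, not a calculation from any of the cited studies):

```python
import math

def recycled_share(collection_rate, growth_rate, lifetime_years):
    """Recycled material as a fraction of current demand when demand grows
    exponentially: r * D(t - L) / D(t) = r * exp(-g * L)."""
    return collection_rate * math.exp(-growth_rate * lifetime_years)

# Even with perfect collection, 20%/yr demand growth and a 10-year battery
# lifetime cap the recycled share at exp(-2), about 13.5% of current demand.
cap = recycled_share(1.0, 0.20, 10)
```

The faster the market grows, the smaller the fraction of demand recycling can cover, regardless of how good the collection system is.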

The impact of assumed recycling rates can indeed be very significant, so such assumptions should be handled with care and be well motivated.

Future demand for lithium

To estimate whether the projected future production levels will be sufficient, it is interesting to compare possible production levels with potential future demand. The use of lithium is currently dominated by ceramics and glass, closely followed by batteries. The current lithium demand for different markets can be seen in Figure 7. USGS [12] states that lithium use in batteries has grown significantly in recent years as lithium batteries in portable electronics have become increasingly common.

Figure 7. Global lithium demand for different end-use markets (ceramics and glass 29%, batteries 27%, other uses 16%, lubrication greases 12%, continuous casting 5%, air treatment 4%, polymers 3%, primary aluminum production 2%, pharmaceuticals 2%). Source: USGS [12]

USGS [12] states that total lithium consumption in 2011 was between 22,500 and 24,500 tonnes. This is often projected to grow, especially as the use of Li-ion batteries in electric cars could increase demand significantly. This study presents a simple example of possible future lithium demand, assuming constant demand for other uses and demand for electric cars growing according to a scenario of future electric car sales. The current car fleet consists of about 600 million passenger cars, and about 60 million new passenger cars were sold in 2011 [98]. This existing vehicle fleet is almost entirely dependent on fossil fuels, primarily gasoline and diesel, but also natural gas to a smaller extent. Increasing oil prices, concerns about a possible peak in oil production, and anthropogenic global warming make it desirable to move away from fossil energy dependence. As a mitigation measure and pathway to fossil-free mobility, cars running partially or entirely on electrical energy are commonly proposed. These include electric vehicles (EVs), hybrid vehicles (HEVs) and plug-in hybrid vehicles (PHEVs), all on the verge of large-scale commercialization and implementation. IEA [99] concluded that a total of 1.5 million hybrid and electric vehicles were sold worldwide between 2000 and 2010.

Both the expected number of cars and the amount of lithium required per vehicle are important. As can be seen from Table 9, estimates of lithium demand for PHEVs and EVs differ significantly between studies. Also, some studies do not differentiate between technical options and give only a single lithium-consumption estimate for an “electric vehicle”, for instance the 3 kg/car found by Mohr et al. [15]. The mean values from Table 9 are 4.9 kg for an EV and 1.9 kg for a PHEV.

Since battery size determines a vehicle's range, range is likely to continue to increase in the future, which could raise lithium demand. On the other hand, it is also reasonable to assume that the technology will improve, reducing lithium requirements. This study assumes a lithium demand of 160 g Li/kWh, an assumption discussed in detail by Kushnir and Sandén [18]. Typical battery capacities are then assumed to be 9 kWh in a PHEV and 25 kWh in an EV, giving a lithium requirement of 1.4 kg for a PHEV and 4 kg for an EV, which is used as the estimate in this study. Many current electrified cars have a lower capacity than this, but to become more attractive to consumers the range of the vehicles will likely have to increase, creating a need for larger batteries [104]. It should be added that the values used are at the lower end compared to other assessments (Table 9) and should most likely not be seen as overestimates of future lithium requirements.
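The per-vehicle figures follow directly from the assumed specific demand times battery capacity. A trivial sketch (the constant and function names are ours):

```python
# 160 g of lithium per kWh of battery capacity, after Kushnir and Sandén [18].
LI_KG_PER_KWH = 0.160

def lithium_per_vehicle_kg(battery_kwh):
    """Lithium requirement (kg) for a battery of the given capacity (kWh)."""
    return LI_KG_PER_KWH * battery_kwh

phev = lithium_per_vehicle_kg(9)   # 9 kWh PHEV  -> ~1.4 kg
ev = lithium_per_vehicle_kg(25)    # 25 kWh EV   -> 4.0 kg
```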

Figure 8 shows the span of the different production forecasts up to 2050 made in this study, together with an estimated demand based on demand staying constant at the high estimate of 2010–2011, plus the demand created by the electric car projections of IEA [101]. This is a very simplistic estimate of future demand, but compared with the production projections it indicates that lithium availability should not be automatically disregarded as a potential issue for future electric car production. The number of electric cars could very well be smaller or larger than in this scenario, but the scenario used does not assume complete electrification of the car fleet by 2050, and such scenarios would mean even larger lithium demand. It is likely that lithium demand for other uses will also grow in the coming decades, so total demand might increase more than indicated here. This study does not attempt to estimate the evolution of demand for other uses, and the demand estimate for other uses can be considered a conservative one.

Figure 8. The total lithium demand of a constant current lithium demand combined with growth of electric vehicles according to IEA's Blue Map scenario [101], assuming a demand of 1.4 kg of lithium per PHEV and 4.0 kg per EV. The span of forecasted production levels ranges from the base case Gompertz model

Concluding discussion

Potential future production of lithium was modeled with three different production curves. In the short term, until 2015–2020, the three models do not differ much, but in the longer term the Richards and logistic curves grow at a vastly higher pace than the Gompertz curve. The Richards model gives the best fit to the historical data, lies between the other two, and may represent the most likely development. Growth faster than the logistic model cannot be ruled out but should be considered unlikely, since the logistic curve usually mimics plausible free-market exploitation [89]. Other factors, such as decreased lithium concentration in mined material, economics, and political and environmental problems, could also limit production.
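For reference, the three cumulative-production curves can be written down explicitly. This is one common parameterization of these growth curves; the exact forms fitted in the paper may differ:

```python
import math

# Cumulative production Q(t) for the three curve types, where URR is the
# ultimately recoverable resource that each curve approaches asymptotically.
def gompertz(t, urr, b, k):
    """Slow, drawn-out approach to URR (the most conservative curve here)."""
    return urr * math.exp(-b * math.exp(-k * t))

def logistic(t, urr, k, t0):
    """Symmetric S-curve: production peaks at t0, where Q = URR / 2."""
    return urr / (1.0 + math.exp(-k * (t - t0)))

def richards(t, urr, k, t0, a):
    """Generalized logistic; a = 1 recovers the logistic curve, while
    a -> 0 approaches Gompertz-like behaviour, so it lies in between."""
    return urr / (1.0 + a * math.exp(-k * (t - t0))) ** (1.0 / a)
```

In practice the parameters (URR, growth rate, inflection year) are fitted to historical production data, with URR constrained to the 15 Mt or 30 Mt reserve cases.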

It can be debated whether this kind of forecasting should be used for short-term projections, and actual production in the coming years may well differ from our models, but it does at least indicate that lithium availability could be a problem in the coming decades. In a longer time perspective, up to 2050, the projected lithium demand for alternative vehicles far exceeds our most optimistic production prognoses.

If 100 million alternative vehicles, as projected by IEA [101], were produced annually using lithium battery technology, the lithium reserves would be exhausted in just a few years, even if production could be ramped up faster than in the models of this study. This indicates that other battery technologies should be investigated as well.

It should be added that these projections do not consider potential recycling of lithium, which was discussed earlier in this paper. On the other hand, it appears highly unlikely that recycling will become common as soon as 2020, while total demand may exceed maximum production around that date. If, when, and to what extent recycling will take place is hard to predict, although high recycling rates appear more likely for electric car batteries than for other uses.

Much could change before 2050. The spread between the different production curves is much larger by then, and it is hard to estimate what happens with technology over such a long time frame. However, the Blue Map scenario would in fact create a lithium demand higher than the peak production of the logistic curve in the standard case, and close to the peak production in the high-URR case.

Improved efficiency can decrease the lithium demand in the batteries, but as Kushnir and Sandén [18] point out, there is a minimum amount of lithium required tied to the cell voltage and chemistry of the battery.

IEA [95] acknowledges that technologies that are not available today must be developed to reach the Blue Map scenarios and that technology development is uncertain. This does not quite coincide with other studies claiming that lithium availability will not be a problem for production of electric cars in the future.

It is also possible that other uses will raise the demand for lithium even further. One industry that in a longer time perspective could potentially increase the demand for lithium is fusion, where lithium is used to breed tritium in the reactors. If fusion were commercialized, which currently seems highly uncertain, it would demand large volumes of lithium [36].

Further problems for the lithium industry are that production and reserves are concentrated in a few countries (USGS [12], in Mt: Chile 7.5, China 3.5, Australia 0.97, Argentina 0.85, other 0.135). One can also note that most of the lithium is concentrated in a fairly small number of deposits; nearly 50% of both reserves and resources can be found in Salar de Atacama alone. Kesler et al. [21] note that Argentina, Bolivia, Chile and China hold 70% of the brine deposits. Grosjean et al. [13] even point to the ABC triangle (i.e. Argentina, Bolivia and Chile) and its control of well over 40% of world resources, and raise concerns about resource nationalism and monopolistic behavior. Even though Bolivia has large resources, there are many political and technical problems in need of solutions, such as transportation and the limited amount of available fresh water [18].

Regardless of global resource size, the high concentration of reserves and production in very few countries does not bode well for future supplies. The world is currently largely dependent on OPEC for oil, which creates the potential for political conflict. The lithium reserves are situated mainly in two countries, and it could be considered problematic for countries like the US to be dependent on Bolivia, Chile and Argentina for political reasons [105]. Abell and Oppenheimer [105] discuss the absurdity of switching from one dependence to another, since resources are finite. Kushnir and Sandén [18] also discuss the problems of depending on a few producers: if a problem unexpectedly occurs at a production site, it may not be possible to continue production and demand cannot be satisfied.

Final remarks

Although there are quite a few uncertainties in the projected production of and demand for lithium for electric vehicles, this study indicates that possible lithium production could be a limiting factor for how many electric vehicles can be produced, and how fast. If large parts of the car fleet are to run on electricity and rely on lithium-based batteries in the coming decades, it is possible, and maybe even likely, that lithium availability will be a limiting factor.

To reduce this risk, as much lithium as possible must be recycled, and battery technologies not relying on lithium may need to be developed. It is not certain how large the world's recoverable lithium reserves are, and estimates in different studies differ significantly; the estimates for brines in particular need further investigation. Some estimates include production from seawater, making the reserves more or less infinitely large. We suggest that it is very unlikely that seawater or lakes will become a practical and economic source of lithium, mainly due to the high Mg/Li ratio and low concentrations of lithium, meaning that large quantities of water would have to be processed. Until proven otherwise, lithium from seawater and lakes should not be included in reserve estimates. Although the reserve estimates differ, this appears to have marginal impact on the resulting production projections, especially in a shorter time perspective. What is limiting is not the estimated reserves but the likely maximum annual production, which is often missed in similar studies.

If electric vehicles with Li-ion batteries are used to a very high extent, there are other problems to account for. Instead of being dependent on oil, we could become dependent on lithium for Li-ion batteries, with lithium reserves mainly located in two countries. It is important to plan for this to avoid bottlenecks or unnecessarily high prices. Lithium is a finite resource and production cannot grow without limit, due to geological, technical and economic constraints. The lithium concentration in mined material appears to be decreasing, which could make lithium more expensive and difficult to extract in the future. To enable a transition towards a car fleet based on electrical energy, other types of batteries should also be considered, and continued development of battery types using less lithium and/or other metals is encouraged. High recycling rates should also be aimed for if possible, and continued investigations of recoverable resources and possible production of lithium are called for.

Acknowledgements

We would like to thank Steve Mohr for helpful comments and ideas. Sergey Yachenkov has our sincerest appreciation for providing assistance with translation of Russian material.


Vikström, H., Davidsson, S., Höök, M., 2013. Lithium availability and future production outlooks. Applied Energy 110, 252–266.


What percent of Americans are rational?

Preface. Why does rationality matter? What's the harm in believing there's a fat old "Santa Claus" God in the sky, noting down every time you're naughty or nice on trillions of inhabited planets, every second of the day, as he has been for trillions of years? Perhaps none at all; people have always believed odd things.

But sometimes there is harm. Evangelicals are trying to force the rest of us to see the world their way and are voting for irrational candidates. They, and others who can't tell fake news from real news and who believe in conspiracy theories, threaten democracy; the consequences could be as grave as the launch of nuclear weapons.

For example, 81% of evangelicals voted for Trump, and they are 26% of voters. No other religious or non-religious group delivered as many votes to Trump: 42 million (versus 27.8 million from mainstream Christians and 16.8 million from white Catholics). And this despite knowing that he stiffed thousands of workers, grabbed women's asses, and hung out with gangsters (which should have cost him his casino license), laundered money for the Russian mafia, and much more (Johnson 2016).

Andersen (2017) estimates that only a third of us are more or less solidly reality-based.

The polls below show Andersen may be too kind. One poll concludes that only 27% of us are rational.

The true number may be even lower than that, because no single survey covers the paranormal, the supernatural, and basic knowledge of the world all at once. For example, the National Science Foundation survey of basic knowledge of the world found that 26% of Americans think the sun revolves around the Earth, and only 48% believe in evolution, that is, that human beings developed from earlier species of animals.

Alice Friedemann, author of "When Trucks Stop Running: Energy and the Future of Transportation", 2015, Springer, and "Crunch! Whole Grain Artisan Chips and Crackers". Podcasts: Derrick Jensen, Practical Prepping, KunstlerCast 253, KunstlerCast 278, Peak Prosperity, XX2 report

What percent of Americans are rational?

Strictly speaking, none of us is 100% rational, due to cognitive biases, framing effects, fallacies, and so on (Wikipedia lists over 250 of these). To be human is to be irrational. But we're all capable of improving our critical thinking skills and our understanding of the world.

So I'll stick to the measurable: the paranormal, pseudoscience, scientific knowledge, and conspiracy beliefs.

A 2005 Gallup poll, "Three in Four Americans Believe in Paranormal," found that 73% believe in at least one of the following: ESP, haunted houses, ghosts, telepathy, clairvoyance, astrology, communication with the dead, witches, reincarnation, or channeling. Only 27% of Americans thought none of them were true.

And that 27% might have been even lower if the list of irrational beliefs had been expanded to include conspiracy theories, poor scientific understanding, denial of evolution and climate change, creationism, the Devil, Hell, angels, miracles, and other beliefs.

Paranormal and supernatural beliefs. 

Multiple numbers reflect results from several surveys:

  1. Angels: 77%, 72%, 72%; 88% of Christians, 95% of evangelical Christians
  2. Astrology: 25%, 26%, 29%
  3. Channeling: 9%
  4. Civil War wasn't about slavery but states' rights: 48%
  5. Climate change not due to human activities: 40%
  6. Clairvoyance: 26%
  7. Communication with the dead is possible: 21%
  8. Creationism: 36%
  9. Devil: 61%, 60%, 58%
  10. ESP: 41%
  11. Ghosts: 34%, 42%, 42%
  12. Haunted Houses: 37%
  13. Heaven: 71%, 75%
  14. Hell: 64%, 61%
  15. Jesus born of a virgin: 73%, 61%, 57%
  16. Jesus is God or son of God: 73%, 68%
  17. Jesus’s resurrection: 70%, 65%
  18. Life after death: 71%, 64%
  19. Miracles: 76%, 72%
  20. Reincarnation: 21%, 20%, 24%
  21. Sun revolves around the Earth: 25%
  22. Telepathy: 31%
  23. UFOs: 34%, 32%, 36%; extraterrestrial beings have visited Earth: 24%
  24. Vaccines cause autism: 56%
  25. Witches: 21%, 23%, 26%

Conspiracy theories  (Chapman 2016)

So what is a conspiracy theory? It’s (1) a group (2) acting in secret (3) to alter institutions, usurp power, hide truth, or gain utility (4) at the expense of the common good.

There's no way to stereotype people who believe conspiracy theories; they exist across gender, age, race, income, political affiliation, educational level, and occupational status.

Education makes a difference, though: 42% of those without a high school diploma had a high predisposition to conspiratorial beliefs. A much lower, but still shockingly high, 23% of those with postgraduate degrees did as well (Uscinski 2014).

Only 26% of Americans disagreed with all 9 conspiracy theories below, and 33% even believed in a made-up conspiracy the researchers called "The North Dakota Crash". The percentages are of those who said that the government is concealing what it knows about:

  1. The 9/11 attacks 54.3%
  2. The JFK assassination 49.6%
  3. Alien encounters 42.6%
  4. Global warming 42.1%
  5. Plans for a one world government  32.9%
  6. Obama’s birth certificate shows he’s a foreigner 30.2%
  7. The origin of the AIDS virus 30.1%
  8. Death of supreme court justice Scalia 27.8%
  9. The moon landing  24.2%

People who believed in the most conspiracies were also more likely to believe that "The World Will End in My Lifetime" (uh-oh, those evangelists again), more likely to be fearful of government, less trusting of other people, and more likely to take actions such as buying a gun to overcome their fears.


National Science Foundation Questions 2014

Each question below is followed by the correct answer and the percentage who got it right:

  1. The center of the Earth is very hot. True 84%
  2. The continents have been moving their location for millions of years and will continue to move. True 83%
  3. Does the Earth go around the sun, or does the sun go around the Earth? Earth around sun 74%
  4. All radioactivity is man-made. True or false? False 72%
  5. Electrons are smaller than atoms. True or false? True 53%
  6. Lasers work by focusing sound waves. True or false? False 47%
  7. The universe began with a huge explosion. True or false? True 39%
  8. It’s the father’s gene that decides whether the baby is a boy or girl. True or false? True 63%
  9. Antibiotics kill viruses as well as bacteria. True or false? False 51%
  10. Human beings, as we know them today, developed from earlier species of animals. True or false? True 48%

Not surprisingly, the higher the education level the greater the number of correct answers.

The world

Ipsos (2017) polled over 17,000 adults around the world, asking whether they think religion does more harm in the world than good. In my opinion, answering YES, DOES MORE HARM is a sign of rationality. If you agree, then the rational nations are Belgium, Germany, Spain, Australia, India, Sweden, Great Britain, France, Canada, Hungary, Argentina, Poland, Italy, Serbia, Mexico, and Turkey; all 16 of these nations scored higher than the U.S. But congratulations to the 44% of Americans who answered correctly.

Related Posts:

Critical Thinking

Posts showing good critical thinking

Surveys, references

Andersen, K. 2017. Fantasyland. How America Went Haywire. A 500-Year History. Random House.

AP/GfK. December 8-12, 2011. Associated Press poll. 1,000 interviews; margin of error +/- 4%.

Baylor. 2017. American values, mental health, and using technology in the age of trump. Baylor religion survey.

Chapman. October 11, 2016. What aren’t they telling us? Chapman University Survey of American Fears.

Gallup. 2005. Paranormal beliefs come (Super)naturally to some.

Gallup. 2005. Three in Four Americans believe in Paranormal.

Gallup. 2016. Most Americans still believe in God.

Harris Poll. 2009. What People Do and do not believe in.

Harris Poll. 2013. What do Americans Believe?

IPSOS. July 2017. Ipsos global poll: Two in three Australians think religion does more harm than good in the world.

Johnson, D. 2016. The Making of Donald Trump. Penguin.

National Science Foundation. 2015. Belief in the paranormal or pseudoscience. In: Science and technology: public attitudes and public understanding.

Politico. August 3, 2017. How the CIA came to doubt the official story of JFK's murder: Newly released documents from long-secret Kennedy assassination files raise startling questions about what top agency officials knew and when they knew it.

Reardon, S. October 18, 2016. The scientists who support Donald Trump. Nature.

Uscinski, J.E., et al. 2014. American Conspiracy Theories. Oxford University Press.


Posted in Critical Thinking, Critical Thinking and Scientific Literacy, Religion