William E. Rees: Don’t Call Me a Pessimist on Climate Change. I Am a Realist

Preface. William E. Rees is professor emeritus of human ecology and ecological economics at the University of British Columbia. He’s one of my favorite ecological writers and has written about energy, limits to growth, overshoot, sustainability and other ecological topics for many years.

Alice Friedemann  www.energyskeptic.com  Author of "Life After Fossil Fuels: A Reality Check on Alternative Energy", "When Trucks Stop Running: Energy and the Future of Transportation", "Barriers to Making Algal Biofuels", and "Crunch! Whole Grain Artisan Chips and Crackers".  Women in ecology.  Podcasts: WGBH, Jore, Planet: Critical, Crazy Town, Collapse Chronicles, Derrick Jensen, Practical Prepping, Kunstler 253 & 278, Peak Prosperity.  Index of best energyskeptic posts

***

William E. Rees. 2019. Don't Call Me a Pessimist on Climate Change. I Am a Realist. To see our fate clearly, we must face these hard facts about energy, growth and governance. Part one of two. thetyee.ca

No one wants to be the downer at the party, and some would say that I am an unreformed pessimist. But consider this — pessimism and optimism are mere states of mind that may or may not be anchored in reality. I would prefer to be labeled a realist, someone who sees things as they are, who has a healthy respect for good data and solid analysis (or at least credible theory).

Why is this important? Well, if Greta Thunberg and followers are to inspire more than emotional release about climate change, the world needs to face some hard facts that suggest we are headed toward catastrophe. At the same time, skepticism is the hallmark of good science; realists too must be open to the challenge posed by new facts.

So, today, and in a piece to follow, I present an unpopular but fact-based argument in the form of two “Am I wrong?” queries. If you accept my facts, you will see the massive challenge we face in transforming human assumptions and ways of living on Earth.

I welcome being told what crucial facts I might be missing. Even a realist — perhaps especially a realist in present circumstances — occasionally wants to be proved incorrect.

Question 1: The modern world is deeply addicted to fossil fuels and green energy is no substitute. Am I wrong?

We can probably agree that techno-industrial societies are utterly dependent on abundant cheap energy just to maintain themselves — and even more energy to grow. The simple fact is that 84 per cent of the world’s primary energy today is derived from fossil fuels.

It should be no surprise, then, that carbon dioxide from burning fossil fuels is the greatest metabolic waste by weight produced by industrial economies. Climate change is a waste management problem!

Cheap fossil energy enabled the world to urbanize, and this process is continuing. The UN expects the urban population to rise to 6.7 billion — 68 per cent of humanity — by 2050. There will be 43 mega-cities with more than 10 million inhabitants each as early as 2030, mostly in China and other Asian countries.

Building out these and hundreds more large cities will require much of the remaining allowable carbon budget. Moreover, the current and future inhabitants of every modern city depend absolutely on the fossil-fuelled productivity of distant hinterlands and on fossil-fuelled transportation for their daily supplies of all essential resources, including water and food.

Fact: Urban civilization cannot exist without prodigious quantities of dependable energy.

All of which generates a genuine emergency. By 2018, the combustion of fossil fuels alone was pumping 37.1 billion tonnes of carbon dioxide into the atmosphere. Add to this the net carbon emissions from land clearing (soil oxidation) and more vigorous forest fires, and we can see why atmospheric carbon dioxide concentrations reached an all-time high of 415 parts per million in early 2019. This is 48% above pre-industrial levels, and concentrations are rising exponentially.
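A quick sanity check of that percentage, assuming the commonly cited pre-industrial baseline of about 280 ppm (a figure not stated in the article):

```python
# Sanity check of the "48% above pre-industrial" claim.
# Assumption (not in the article): pre-industrial CO2 was ~280 ppm.
PREINDUSTRIAL_PPM = 280.0
CURRENT_PPM = 415.0  # early-2019 peak cited in the text

increase_pct = (CURRENT_PPM - PREINDUSTRIAL_PPM) / PREINDUSTRIAL_PPM * 100
print(f"CO2 is about {increase_pct:.0f}% above pre-industrial levels")  # ~48%
```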

And, of course, everyone with an active brain cell is aware that CO2 is the main human-related driver of global warming and associated climate change.

Cue the techno-optimists’ chorus: “Not to worry, all we have to do is transition to green renewable energy!”

In fact, there is plenty of superficial support for the notion that green tech is our saviour. We are told repeatedly that the costs of providing renewable energy have fallen so low that it will soon be practically free. Australian professors Andrew Blakers and Matthew Stocks say “Solar photovoltaic and wind power are rapidly getting cheaper and more abundant — so much so that they are on track to entirely supplant fossil fuels worldwide within two decades.” Luckily, the transition won’t even take up much space: UC Berkeley professor Mehran Moalem argues that “an area of the Earth 335 kilometres by 335 kilometres with solar panels… will provide more than 17.4 TW power…. That means 1.2 per cent of the Sahara desert is sufficient to cover all of the energy needs of the world in solar energy.” (Someone should remind Prof. Moalem that, even if such an engineering feat were possible, a single sandstorm would bury the world’s entire energy supply.)

The first problem with such claims is that despite rapid growth in wind and solar generation, the green energy transition is not really happening. The chart below shows that in most recent years (except 2009, following the 2008 global financial crisis), the uptick in global demand for electrical energy exceeded the total output of the world’s entire 30-year accumulation of solar power installations. Between 2017 and 2018, the demand increase outpaced total solar supply by 60 per cent; two years’ demand increase absorbs the entire output of solar and wind power combined.

The annual increase in demand for electricity exceeds the entire output of photovoltaic electricity installations. Graph courtesy of Pedro Prieto, with permission.

As long as the growth in demand exceeds additions to supply from renewables, the latter cannot displace fossil fuels even in electricity generation — and remember, electricity is still less than 20 per cent of total energy consumption, with the rest being supplied mostly by fossil fuels.

Nor is any green transition likely to be cheap. The cost of land is substantial and, while the prices of solar panels and wind turbines have declined dramatically, that decline does not touch the high costs of transmission, grid stabilization and systems maintenance. Consistently reliable wind and solar electricity requires integrating these sources into the grid using battery or pumped-hydro storage, back-up generation (e.g., gas turbines or cruise-ship-scale internal combustion engines) and other measures that make the system more expensive.

Also problematic is the fact that wind/solar energy is not really renewable. In practice, the life expectancy of a wind turbine may be less than 15 years. Solar panels may last a few years longer, but with declining efficiency, so both turbines and panels have to be replaced regularly at great financial, energy and environmental cost. Consider that building a typical wind turbine requires 817 tonnes of energy-intensive steel, 2,270 tonnes of concrete and 41 tonnes of non-recyclable plastic. Solar power also requires large quantities of cement, steel and glass as well as various rare earth metals.

World demand for rare-earth elements — and Earth-destroying mining and refining — would rise 300 per cent to 1,000 per cent by 2050 just to meet the Paris goals. Ironically, the mining, transportation, refining and manufacturing of material inputs to the green energy solution would be powered mainly by fossil fuels (and we’d still have to replace all the machinery and equipment currently running on oil and gas with their electricity-powered equivalents, also using fossil fuel). In short, even if the energy transition were occurring as advertised, it would not necessarily be reflected in declining CO2 emissions.

If we divide 2018 into energy segments, oil, coal and natural gas powered the globe for 309 out of 365 days, hydro and nuclear energy gave us 41 days, and non-hydro renewables (solar panels, wind turbines, biomass) a mere 15 days. If the race is towards a decarbonized finish line by 2050, we’re still pretty much stalled at the gate.
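The "days of the year" framing is just each source's share of 2018 primary energy scaled to 365 days. A minimal sketch of the arithmetic, using the day counts from the paragraph above:

```python
# Express each energy source's share of 2018 primary energy as
# "days of the year powered", as the paragraph does.
DAYS_IN_YEAR = 365
days_by_source = {
    "oil, coal and natural gas": 309,
    "hydro and nuclear": 41,
    "non-hydro renewables": 15,
}

for source, days in days_by_source.items():
    share_pct = days / DAYS_IN_YEAR * 100
    print(f"{source}: {days} days, about {share_pct:.0f}% of primary energy")

# The three segments cover the whole year.
assert sum(days_by_source.values()) == DAYS_IN_YEAR
```

The fossil share comes out near 85 per cent, consistent with the 84 per cent figure cited earlier in the article.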

Fact: Despite the hype about the green energy revolution and enhanced efficiency, the global community in 2019 remains addicted to fossil energy and no real cure is on the horizon.

As I say, please do tell me I’m wrong.


Northeast apple production suffering from Climate Change

Preface. Although this article is only about one crop in one area, it portends a darker future for food production, with each region facing its own issues (e.g., drought in California). It's only a matter of time before climate change reduces the food supply.


***

Newburger, E. 2019. ‘We’re fighting for our lives’ – US apple farmers endure major crop and profit losses as climate changes. CNBC.com.

Climate change has become an impossible financial burden for many farmers.

Warm spells in the winter force trees to flower prematurely and expose the buds to unpredictable winter frosts, hail and other extreme weather. In some cases, entire trees can become stressed and die.

Hurricanes snap and topple apple trees.

The Hudson Valley has a long and robust history of apple growing. New York is the second-largest apple producer in the U.S., and about 22% of the state's annual apple production comes from the Valley. But the apple-lush region is under serious pressure. In the past 50 years, the Valley has experienced a 71% increase in heavy rains, with unpredictable downpours and flooding.

Ryan, who operates several orchards in the Hudson Valley, has experienced three major crop losses due to bad weather in the last decade.

In 2012, apples in New York bloomed about a month earlier than usual when temperatures rose up to 70 degrees in February and then dropped back down. Half of the state’s apple crop was destroyed and farmers endured millions of dollars in losses.

It’s a significant economic problem, as apple production in the Northeast U.S. is valued at more than $400 million.

Between 2012 and 2017, more than 2,000 farms closed in the state, representing a 6% drop in the number of farms statewide, according to the New York Farm Bureau.

Lack of winter chilling and early bloom in the spring might seem like obvious problems, but climate change also poses a slew of other hardships for apple growers. For instance, warmer nights lead to the spread of pests and other diseases, and hot days in the winter sunburn the apple’s plant tissue.

Warming temperatures also cause significant defects and pigment damage to developed fruit, making it impossible for growers to sell them on the market. If the nights don’t cool down enough, an apple that is supposed to turn red will turn brown or pink instead.

The ramifications of climate change on the entire food system will be immense. But the problem is not yet obvious to most U.S. consumers, in part because a destroyed harvest in one region of the country can be masked by a better harvest in another.


When wood is again our main energy source, how long will it last?

Preface. Just when civilization is only decades away from returning to wood as its main energy source (due to peak oil in 2018), climate change is allowing invasive beetles to survive winters and kill trees, with drought and wildfires compounding the damage.

Although the high prices of wood in 2021 are being blamed on inflation, the true cause is drought, invasive beetles, climate change, and wildfires. In the U.S., as the first article shows, a main reason is tree dieoff in Canada.

After that come statistics that give an idea of how wood stacks up against other energy resources. Wood supplies about 1% of the fuel used to directly heat U.S. buildings, with 50-66% of that wood burned in wood stoves rather than fireplaces (USDOE 2008).

In 2014, about 2.5 million households (2.1%) across the country used wood as the main fuel for home heating, up from 1.9 million households (1.7%) in 2005. An additional 9 million households (7.7%) use wood as a secondary heating fuel. This combination of main and secondary heating accounts for about 500 trillion British thermal units (Btu) of wood consumption per year in the residential sector, about the same as propane consumption and slightly less than fuel oil consumption (U.S. EIA, March 17, 2014, "Increase in wood as main source of household heating most notable in the Northeast").


***

Meyer R (2021) Why Dead Trees Are ‘the Hottest Commodity on the Planet’. The Atlantic.

In North America, lumber is typically traded in units of 1,000 board feet; builders need about 15,000 board feet, on average, to construct a single-family home. From 2015 to 2019, lumber traded at $381 per 1,000 board feet. This month, it reached an all-time high of $1,104 for the same amount. The lumber shortage has added at least $24,000 to the cost of a new home.

Since 2018, a one-two punch of environmental harms worsened by climate change has devastated the lumber industry in Canada, the largest lumber exporter to the United States. A catastrophic, multi-decade outbreak of bark-eating beetles, followed by a series of historic wildfire seasons, has led to lasting economic damage in British Columbia, a crucial lumber-providing province. Americans have, in effect, made a mad dash for lumber at the exact moment Canada is least able to supply it.

In 2017, British Columbia recorded the worst wildfire season in its history. Fires cleared 1.2 million hectares of land, or more than 1 percent of the province's area. That record was surpassed the following year, when 1.3 million hectares burned.

In an ugly feedback loop, the beetle outbreak contributed to wildfires, because when conifers are attacked by pests, they secrete more pitch in self-defense, which is extremely flammable. When trees are drought-prone and filled with pitch, it's like fire starter on the landscape. (Nor is wildfire the only risk of pitch: British Columbia sawmills and pulp mills have exploded while processing pitch-loaded wood.)

Among builders, the preferred "species" of wood for framing homes is called Canadian SPF, or Canadian spruce-pine-fir. As its hyphenated name gives away, SPF is not a single species of tree, but a catchall industry name for conifers grown in the northern boreal forest. If you're in a relatively new American home or low-rise building right now—or if you can see one out the window—there's a good chance it's made of SPF imported from Canada, specifically British Columbia or Alberta.

Canadian SPF is grown in orderly tracts of forest that span much of Canada's northern belt. Starting in 1999, an outbreak of bark-eating mountain pine beetles has ravaged conifer forests across the American and Canadian West. It has been especially bad in British Columbia, which exports about half of its lumber to the U.S. The beetle has devoured 18 million hectares of forest in British Columbia alone, killing 60 percent of its merchantable pine.

Across North America, the woodland affected by the beetle—a tract stretching from Montana to Saskatchewan—totals 27 million hectares, an area more than three-quarters the size of Germany.

Trees take a long time to grow in the harsh climes of British Columbia. With its bountiful sunlight and warm, wet weather, Florida can grow a pine to merchantable size in 15 years; in British Columbia, a 15-year-old tree may be only 6 feet tall. Canadian forests take 40 to 60 years to reach maturity.

In the past two years, about 30 sawmills have closed in British Columbia. Other factors in 2019 made the local industry's crash especially sharp: a sluggish housing market and American import duties helped suppress demand too.
***

FIREWOOD ONLY, NO FOSSIL FUELS

[Chart: how many years it would take to deplete the forests in each state if wood alone were used for heat. Source: Nate Hagens' 2007 Oil Drum post "Old Sunlight vs Ancient Sunlight – An Analysis of Home Heating and Wood", which he derived from many other sources.]

***

Wood takes up about 5 times as much space as coal for the same amount of BTUs generated. Coal and wood have different energy densities, so this is a very rough estimate.

Energy Content of Fuels (excerpt from the Energy Fun Facts table)

Coal                  25 million BTU/ton (2,103 to 2,243 lbs = 1 cubic yard)
Crude Oil             5.6 million BTU/barrel
Oil                   5.78 million BTU/barrel = 1,700 kWh/barrel
Gasoline              5.6 million BTU/barrel (a barrel is 42 gallons) = 1.33 therms/gallon
Natural gas liquids   4.2 million BTU/barrel
Natural gas           1,030 BTU/cubic foot
Wood                  20 million BTU/cord (8 x 4 x 4 feet = 128 cubic feet = 4.7 cubic yards)

Source: http://www.physics.uci.edu/~silverma/units.html
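The "about 5 times as much space" estimate can be checked from the table's round numbers (coal: 25 million BTU per ton at roughly one cubic yard; wood: 20 million BTU per cord at 4.7 cubic yards). A rough sketch:

```python
# Volume required per unit of energy, using the table's round numbers.
COAL_BTU_PER_TON = 25e6
COAL_YD3_PER_TON = 1.0        # ~2,100-2,250 lbs of coal per cubic yard
WOOD_BTU_PER_CORD = 20e6
WOOD_YD3_PER_CORD = 4.7       # a cord is 128 cubic feet

coal_yd3_per_million_btu = COAL_YD3_PER_TON / (COAL_BTU_PER_TON / 1e6)
wood_yd3_per_million_btu = WOOD_YD3_PER_CORD / (WOOD_BTU_PER_CORD / 1e6)

ratio = wood_yd3_per_million_btu / coal_yd3_per_million_btu
print(f"wood needs about {ratio:.1f}x the volume of coal per BTU")  # ~5.9x
```

With these round numbers the ratio comes out closer to 6 than 5, consistent with the caveat that this is a very rough estimate.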

wiki: One seasoned (dry) cord of red oak has the heating equivalent of 108 US gallons

Nate Hagens. Home Heating in the USA: A Comparison of Forests with Fossil Fuels. The Oil Drum.

USDOE 2008. U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy. 2008 Buildings Energy Data Book buildingsdatabook.eere.energy.gov/.

THE ENERGY IN WOOD

Freshly cut wood is over 60% moisture, so much of its combustion energy goes to boiling off water rather than delivering heat from the wood fibers. Seasoned wood approaches 20% moisture content and releases about 6,400 BTUs per pound. (Pure bone-dry wood tops 8,000 BTUs per pound but is not practical for home use.) Almost all wood types yield about the same 6,400 BTUs per pound, but differ, depending on density and other properties, in how many pounds make up one cord. Some examples are:

Hickory => 4,327 lbs per cord => 27.7 million BTUs per cord
Red Maple => 2,924 lbs per cord => 18.7 million BTUs per cord
Cottonwood => 2,108 lbs per cord => 13.5 million BTUs per cord
Cedar => 1,913 lbs per cord => 12.2 million BTUs per cord

A complete list of wood types and BTU content per cord can be found here
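The per-cord figures above follow directly from pounds per cord multiplied by roughly 6,400 BTUs per pound of seasoned wood:

```python
# BTU per cord = pounds per cord x ~6,400 BTU per pound (seasoned wood).
BTU_PER_POUND = 6400
pounds_per_cord = {
    "Hickory": 4327,
    "Red Maple": 2924,
    "Cottonwood": 2108,
    "Cedar": 1913,
}

for species, lbs in pounds_per_cord.items():
    million_btu = lbs * BTU_PER_POUND / 1e6
    print(f"{species}: {million_btu:.1f} million BTUs per cord")
```

This reproduces the 27.7, 18.7, 13.5 and 12.2 million BTU figures listed above.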

This analysis assumes one cord of wood typically weighs about 2,400 pounds. We then arrive at 2,400 x 6,400 BTUs = 15,360,000 BTUs per cord. Therefore, across the 50 US states, we have 34.7 million cords of annual volume growth of wood available times 15.36 million BTUs per cord => 533 trillion BTUs that can presently be accessed sustainably from hardwoods. (If we eschew all other forest products, this number roughly doubles, and if we include softwoods, it roughly doubles again.)

PUTTING THE PIECES TOGETHER

Heating with wood is not as efficient as heating with natural gas or #2 heating oil. A significant portion of the heat generated from burning escapes up the flue and dissipates into the atmosphere. Wood stoves and furnaces average about 55% efficiency, compared to 85% for natural gas furnaces and 80% for furnaces burning #2 heating oil or kerosene. (The lower the efficiency rating, the more BTUs of heat are lost rather than delivered to the targeted areas.)

So, of the 5,030 trillion BTUs generated by natural gas furnaces in 2004, 85%, or 4,275 trillion BTUs, went directly to heating, and 15%, or 755 trillion BTUs, dissipated as waste heat. Similarly, of the 998 trillion BTUs generated by heating oil, roughly 80%, or 799 trillion BTUs, went directly to heating.

Of the 533 trillion BTUs that could be generated annually from forest growth, approximately 55%, or 293 trillion BTUs, would end up as actual heat. Natural gas and heating oils collectively generated 5,074 trillion BTUs of actual heat. Thus, this analysis indicates that we could sustainably replace 293 / 5,074 trillion BTUs, or about 5.8%, of fossil fuel home heating with heating from wood. Alternatively, the entire United States forest stock of hardwoods contains 364 billion cubic feet of wood, or 2.84 billion cords, which at 55% stove efficiency would throw off 24,024 trillion BTUs of heat (note: this is only 24% of the total annual energy usage of the country). So the good news is that if we were really cold and sans fossil fuels, we could chop down trees for at least 4 years before the US would resemble Easter Island (24,024 / 5,074 = 4.74 years). On a state-by-state basis, the distribution would look like the chart shown earlier.
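The chain of arithmetic above can be reproduced in a few lines. Note that the 24,024 trillion BTU figure appears to be the stock's delivered heat after applying the 55% stove efficiency (2.84 billion cords x 15.36 million BTUs x 0.55), an inference not spelled out in the text:

```python
# Reproduce the wood-heating arithmetic above, using the same round numbers.
BTU_PER_CORD = 2400 * 6400            # 15.36 million BTUs per cord
WOOD_EFFICIENCY = 0.55                # wood stoves and furnaces

# Sustainable case: burn only annual hardwood growth.
annual_growth_cords = 34.7e6
sustainable_heat = annual_growth_cords * BTU_PER_CORD * WOOD_EFFICIENCY
fossil_heat = 5074e12                 # 'actual heat' from natural gas + heating oil
print(f"replaces {sustainable_heat / fossil_heat * 100:.1f}% of fossil heating")  # ~5.8%

# Unsustainable case: burn the entire hardwood stock.
stock_cords = 364e9 / 128             # 364 billion cubic feet / 128 ft3 per cord
stock_heat = stock_cords * BTU_PER_CORD * WOOD_EFFICIENCY
print(f"stock: {stock_cords / 1e9:.2f} billion cords, "
      f"{stock_heat / 1e12:,.0f} trillion BTUs of heat")  # ~2.84 billion, ~24,024
print(f"lasting about {stock_heat / fossil_heat:.1f} years")  # ~4.7
```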

What’s a British Thermal Unit (Btu)?

1 Btu = about 1,055 joules = 1 match tip = amount of energy to raise temperature of 1 pint of water 1 degree Fahrenheit
4 Btus  = 1 food calorie
36 Btus  = 9 calories = energy expended by the average female jogging for 1 minute
70 Btus  = energy to make 1 cup of coffee
2,545 Btus  = energy required for 1 horsepower for one hour
7,200 Btus  = minimum average energy requirement per person = 1,800 calories
11,000 Btus  = 1 lb of coal
16,000 Btus  = energy in 1 lb of fat
125,000 Btus = 1 gallon of gasoline
149,000 Btus = 1 gallon of heating oil, diesel fuel, or jet fuel (JP 4)
840,000 Btus = energy in 1 cubic foot of coal
995,000 Btus = energy in 1 cubic foot of crude oil
1,000,000 Btus = 90 pounds of coal = 8 gallons of gasoline

22,000,000 Btus = 1 cord of wood (8 x 4 x 4 feet), typically about 1.5 to 2 tons

36,000,000 Btus = electricity used by the average American in a year
104,000,000 Btus = yearly residential energy use of average U.S. household
323,000,000 Btus = per person energy consumption in the U.S. = 54 barrels of oil
1,000,000,000 Btus = electricity used by 30 average Americans in a year = 47 cords of wood
100,000,000,000 Btus = electricity used by 3,000 average Americans in a year
1,000,000,000,000 Btus  = energy contained in 450 railroad cars of coal (100 tons each)
= train 4 1/4 miles long
= 8 million gallons of gas = 166,000 barrels of crude oil
= enough energy to heat 20,000 single family homes per year in the U.S.
5 trillion Btus  = enough energy to heat 100,000 single family homes per year in the U.S.
7.3 trillion Btus = amount of oil being carried by the Exxon Valdez when its oil spill occurred
36 trillion Btus  = yearly electricity consumption of all Arkansas residences
135 trillion Btus = energy use by all residential dishwashers in the U.S. for one year
734 trillion Btus = energy used for residential TV viewing in the U.S. for one year
1 quadrillion Btus = 1,000,000,000,000,000 Btus
= 170 million barrels of crude oil
= 60 million tons of dried wood
= 19 days of U.S. petroleum imports
= 25 days of U.S. motor gasoline use
= 25 hours of world energy use (1990)
= fill a football field 3.7 miles high with oil
= 450,000 railroad coal cars (100 tons each)
= train 4,300 miles long (from New York to San Francisco and half way back)
1.4 quadrillion Btus = energy used by all residential air conditioning in the U.S. for one year
= energy used by all residential refrigeration in the U.S. for one year
5 quadrillion Btus  = total energy consumed by U.S. chemical industry yearly
= 850 million barrels of crude oil
= enough oil to fill 83 Sears Towers
= 225 million tons of coal
= energy used by all residential appliances in the U.S. for one year
33 quadrillion Btus  = total energy consumed by U.S. industry yearly
= 5.6 billion barrels of crude oil
= enough oil to fill 550 Sears Towers
= fill 1,400 of the world’s largest supertankers
= 1.5 billion tons of coal
88 quadrillion Btus  = total energy consumed in U.S. yearly
= 15 billion barrels of crude oil
= enough oil to cover a soccer field 230 miles deep
= fill 3,700 of the world’s largest supertankers
= 4 billion tons of coal
= 40 million railroad cars of coal (100 tons each)
= coal train long enough to reach to the moon and half way back
363 quadrillion Btus  = worldwide energy production, 1995

Source: Energy Fun Facts. Energy, Environmental, and Economics (E3) Handbook Appendix F. U.S. Department of Energy EERE industrial technologies program.
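For readers who think in kilowatt-hours, a small converter (1 kWh = 3,412 Btu, a standard conversion) puts a few of these figures in more familiar units:

```python
# Convert Btu figures from the list above into kilowatt-hours.
BTU_PER_KWH = 3412  # standard conversion: 1 kWh = 3,412 Btu

def btu_to_kwh(btu):
    return btu / BTU_PER_KWH

# 36 million Btus: yearly electricity use of the average American.
print(f"{btu_to_kwh(36e6):,.0f} kWh")           # ~10,551 kWh
# 22 million Btus: one cord of wood.
print(f"{btu_to_kwh(22e6):,.0f} kWh per cord")  # ~6,448 kWh
```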

Common Heating Fuels and their Energy Content (from PG&E)

Energy Source          Unit                       Energy Content (Btu)
Electricity            1 kilowatt-hour            3,412
Butane                 1 cubic foot               3,200
Coal                   1 ton                      28,000,000
Crude Oil              1 barrel (42 gallons)      5,800,000
Fuel Oil no. 1         1 gallon                   137,400
Fuel Oil no. 2         1 gallon                   139,600
Fuel Oil no. 3         1 gallon                   141,800
Fuel Oil no. 4         1 gallon                   145,100
Fuel Oil no. 5         1 gallon                   148,800
Fuel Oil no. 6         1 gallon                   152,400
Diesel Fuel            1 gallon                   139,000
Gasoline               1 gallon                   124,000
Natural Gas            1 cubic foot               950 to 1,150
Heating Oil            1 gallon                   139,000
Kerosene               1 gallon                   135,000
Pellets                1 ton                      16,500,000
Propane                1 gallon                   91,330
Propane                1 cubic foot               2,550
Residual Fuel Oil      1 barrel (42 gallons)      6,287,000
Wood (air dried)       1 cord                     20,000,000
Wood (air dried)       1 pound                    8,000

Generating electricity with biomass at utility-scale in California limited to direct combustion in small 50 MW plants

Preface. It’s obviously much easier and more energy efficient to set logs on fire for heat and electricity than to turn them into ethanol.  

Burning biomass can’t do much to solve our energy crisis.  To produce just 10% of U.S. electricity (405 TWh) would require wood plantations the size of Minnesota (Smil 2015).  Every year.

Unfortunately, burnt biomass is not cleaner and greener than fossil fuels, and its emissions are just as hard and expensive to control. Burning wood emits carcinogens, NOx, carbon monoxide, sulfur dioxide, carbon dioxide, antimony, arsenic, cadmium, chromium, copper, dioxins, furans, lead, manganese, mercury, nickel, polycyclic aromatic hydrocarbons (PAHs), selenium, vanadium, and zinc. If the wood was chemically treated, the range and amount of pollutants grows. Burning biomass worsens our health further by depositing fine particles in our lungs. In addition, it prevents essential agricultural nutrients like nitrogen and phosphate from being returned to the soil to grow food (Biofuelwatch 2014, Reijnders 2006).

The energy gained from burning biomass to generate electricity pales in comparison to all the energy that went into building the power station and emission controls, planting and logging tree plantations, trucking the biomass to the power station, chipping it into smaller bits, and burning it at only 35% efficiency or less.

When steam engines ruled, forests disappeared over the horizon from rivers and rail tracks. Biomass power stations have a similar problem: they are only cost-effective using biomass from within about 100 miles, and once a forest is burned, it takes 50 years to regenerate.


***

CEC. 2015. Estimated cost of new renewable and fossil generation in California. California Energy Commission. CEC-200-2014-003-SD. 254 pages.

CHAPTER 8: Biomass Technology

Biomass technologies are plants that use biological resources, such as forestry waste or farming by-products, to produce electricity through thermal and chemical processes. Biomass technologies are in limited production here in California. While these technologies are designed to harness biological by-products sustainably, they suffer from:

  1. The limitation of requiring large, reliable fuel sources to produce energy economically.
  2. The high cost of transporting the fuel from the origination site to the generation site. This limitation exposes the producer to the volatile market for diesel or other petroleum fuels, which can unexpectedly add significant costs.

Biomass is plant-based material, agricultural vegetation, or agricultural wastes used as fuel, and has three primary technology pathways:

  • Pyrolysis: transformation of biomass feedstock materials into fuel (often liquid "biofuel") through the application of heat in the presence of a catalyst.
  • Combustion: transformation of biomass feedstock materials into useful energy through the direct burning of those feedstocks using a variety of burner/boiler technologies also used for burning materials such as coal, oil, and natural gas.
  • Gasification: transformation of biomass feedstock materials into synthetic gas through the partial oxidation and decomposition of those feedstocks in a reactor vessel.

Of these technology pathways, only direct combustion of biomass is commercially available for utility-scale plants.

Gasification methods are used in some small-scale applications but are not yet viable for utility-scale applications. Active research into pyrolysis for biofuel production is ongoing but is not used for electricity production.

Combustion technologies are widespread and include the following general approaches:

  • Stoker boiler combustion uses technology similar to coal-fired stoker boilers to combust biomass materials, using either a traveling grate or a vibrating bed.

  • Fluidized bed combustion suspends the biomass fuel in a mix of silica and limestone by forcing air through the silica/limestone bed, similar to technology used in newer coal-fired boilers. Fluidized bed combustion boilers are classified as either bubbling fluidized bed (BFB) or circulating fluidized bed (CFB) units.
  • Biomass co-firing burns biomass fuel in conjunction with coal in the pulverized-coal boiler technology currently used in utility-scale electricity production.

Recent sources of data and analysis have focused on fluidized bed technology, which is also the most likely biomass technology to be installed in California. The remainder of this chapter will focus on fluidized bed technology.

The inherent fuel versatility of fluidized bed systems provides a plant operator the ability to burn many biomass resource types, including those feedstocks with significant moisture variations.

Biomass fuel type and uniformity: The type and uniformity of the delivered biomass fuel supply are a primary cost driver for any biomass technology. Given the variation of the delivered moisture content and heating value of biomass fuel feedstocks, along with fuel processing issues, the handling and processing costs of biomass fuels can vary greatly. As a result, the type and characteristics of the different biomass fuels can have a material impact on the capital cost of the boiler design, as well as the overall fuel handling and operations cost.

Fuel transport and handling costs: The availability of sufficient biomass fuel resources near the plant location is a critical driver for operating cost. Most biomass fuel is transported by truck to a plant site. To maintain commercially reasonable prices, the effective economic radius from the plant location to the aggregate fuel supply is limited to about 100 miles. The varied nature of biomass fuel feedstocks also necessitates special handling equipment and larger numbers of dedicated staff than are needed for coal-fired combustion power plants of equivalent size. As a result, the typical maximum size of biomass plants is limited to about 50 MW in California (McCann, et al., 1994).

Small biomass facilities lose a great deal of power over transmission lines.

[Table omitted: Interconnection Loss Estimates for Generation Tie-Lines]

Boiler island cost - The capital cost of the boiler island is a critical cost driver that can account for roughly 40 to 60 percent of overall plant cost, depending on the type of biomass combusted and the need for post-combustion pollution controls. The choice of source and type of fuels to be combusted is an important cost driver. In addition, escalation trends for the raw materials used in manufacturing the boiler island, primarily steel, can influence the delivered boiler island cost.

Long-term fuel supply contracts - Most current biomass fuel supply contracts are of short duration and can entail varying fuel qualities. A key cost barrier to promoting biomass circulating fluidized bed combustion in California is the difficulty of developing, and achieving performance on, long-term (for example, 5 years and longer) fuel supply contracts for available fuel sources.

Plant scale - While current CFB technology has been proven in utility-scale applications of up to 300 MW, fuel supply availability limits potential plant scale. Steam-generator scale economies are substantial: a 50 MW biomass plant is likely to cost substantially more per kW than a 500 MW coal-fired plant of the same technology (McCann, et al., 1994).

Emissions control costs - The cost of emission controls needed to satisfy air quality and permitting requirements can increase the cost of biomass plants. Post-combustion emissions control technologies, such as selective catalytic reduction / selective non-catalytic reduction for NOx control, and additional particulate matter controls, are important cost drivers that can significantly increase the capital and operating costs of biomass plants.


Steam powered farm tractors

Preface. Steam engines for transportation were never very efficient, reaching 10 to 20% at best, which is why they began to disappear around 1920 when oil-powered engines came along. They were almost universally powered by coal, mostly in locomotives and in stationary engines in factories. Only in America, for a few decades, was wood so abundant that steamboats and locomotives burned wood rather than coal, until deforestation east of the Mississippi forced a switch to coal. A few decades later, around 1920, oil combustion engines proved more powerful, more efficient, far less dangerous, and cheaper. The steam engines that remain today are steam turbines used to generate electricity, not to move vehicles.

Steam turbines used to generate electricity are very efficient: the best can reach 45%. But since trucks run on diesel fuel and can't be electrified with batteries or catenary, and biomass doesn't scale up enough to produce electricity, I am mainly interested in steam engines burning biomass as a long-term replacement for diesel engines in trucks.

Biomass won't scale up for vehicle and factory steam engines in the future for the same reason it didn't in the past: the wood will run out quickly. Forests can take decades to grow back, since photosynthesis is inefficient, adding only about half a percent of new biomass per year.

A further limit on biomass steam engines is that, post-carbon, they will depend on horses, as they did in the past, to haul fuel and water to the engine. Each horse needs an average of 5 acres to provide its food, land which is now used to produce human food.

Meanwhile, post fossil fuels, wood will also be needed for nearly everything – heating and cooking, homes and buildings, furniture and flooring, tools, roofs, and so on. It will be needed to make charcoal to make iron, bricks, metals, ceramics, and so on.

In fact, biomass depletion (especially deforestation) is one of the main reasons civilizations have failed over the past 5,000 years (see one of my favorite books, John Perlin's "A Forest Journey"). And not just because there isn't enough biomass: when you cut down forests, topsoil blows and washes away, and agricultural production declines. Warships can no longer be built to trade for, or steal, trees elsewhere.

Steam-powered vehicles are so inefficient that I wasn't ever going to cover them. But after several interviews about my book "When Trucks Stop Running," readers commented that we'll use steam engines in the future. I agree: until the forests are depleted, civilization crashes again, the forests grow back, and steam engines are made again, for a while.

Which reminded me of one reason the energy crisis isn't feared by anyone: it's like Whac-A-Mole. Even if you succeed in convincing someone that solar PV won't be able to replace oil because it has a low EROI (too low to replace itself, let alone provide power for everything else), is too seasonal, requires too much non-existent energy storage, and so on, most people will reason: but there's still wind power, hydrogen, geothermal, wave and tidal, hydropower, natural gas and so on. Given the reduction of news and conversation to ten-second soundbites, the pressure to be optimistic about everything all the time (the scientists will come up with something!), and the lack of scientific education, I don't expect to ever make a dent in the general ignorance of energy and natural resource matters, but I don't mind. This site is meant for the very small percentage of people who, like me, want to understand reality regardless of how depressing it may be. An even smaller subset of them will actually make different choices about career and where to live than they might have otherwise, choices that may save their lives in the bottleneck ahead. Good luck to anyone who has read this far!


***

Brandon Knapp. 2000. How Steam Power Revolutionized the Farm in America. Yesterday’s Tractors.

As the American farm entered the 1800s, its main sources of power were three animals: the horse, the mule, and the ox. The average farm worked by horses was 100 acres, and a farmer walked 8 miles per acre to plow his fields (with a walking plow) at an average speed of 1.5 mph. With 100 acres, the farmer walked 800 miles to plow his fields. And he still had to plant the crop, and cultivate! For wheat and other crops, the grain had to be separated from the chaff with a machine called the thresher, which was driven by a power sweep turned by horses. Everything depended on the strength and durability of humans and horses.
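
The plowing arithmetic above is easy to verify, and also gives the hours behind the plow, which the article doesn't state:

```python
# Verify the plowing arithmetic quoted above.
acres = 100
miles_per_acre = 8      # distance walked behind a walking plow, per acre
speed_mph = 1.5

total_miles = acres * miles_per_acre
hours = total_miles / speed_mph

print(total_miles, "miles walked")             # 800 miles
print(round(hours), "hours behind the plow")   # ~533 hours
```

At 533 hours, a farmer plowing ten-hour days would spend nearly two months just plowing.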

In 1849, things began to change. Some of the first portable steam engines for farm use were built that year in Philadelphia. They provided only belt power for machines like the thresher. There were three sizes: 4, 10, and 30 horsepower. The 4-hp model sold for $625 and the 30-hp model for $2,300. That was a lot of money back then! These machines were also heavy; the 4-hp model weighed two tons, or 1,000 pounds per horsepower!
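
The price and weight figures above reduce to per-horsepower numbers as follows:

```python
# Price and weight per horsepower of the 1849 portable engines described above.
prices = {4: 625, 30: 2300}     # horsepower -> price in 1849 dollars
for hp, price in prices.items():
    print(f"{hp} hp: ${price / hp:.0f} per hp")   # $156/hp and $77/hp

weight_lb = 2 * 2000            # the 4-hp model weighed two tons
print(weight_lb / 4, "lb per horsepower")         # 1000.0
```

Note that the 30-hp engine was half the price per horsepower, an early hint of the scale economies that later pushed farmers into shared "threshing rings."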

These machines were pulled from field to field by horses. The steam engine provided steady power, didn't tire after hard work, and was "fed" only when it worked, instead of all year round like animals. Yet these machines were still crude, and a low steam pressure of 50 to 90 p.s.i. limited the amount of work that could be done.

Over the next few years, steam pressure steadily increased with better-quality materials and boiler construction. But the greatest change would make the steam engine unforgettable for the next 150 years: "self-propelled" steam engines made their debut in 1855. At first they were just a normal "portable" engine with chains or gears connecting the crankshaft and the rear wheels. They couldn't even steer; they still needed horses to turn. But the self-propelled engine could pull its thresher behind it.

If a steam engine could pull a thresher, it could also pull another type of load: the plow. In 1855, a "steam plow" was used by its inventor, Obed Hussey. In 1858, Joseph Fawkes used his 30-hp engine, named "Lancaster," for a plowing demonstration at the Illinois State Fair. The engine and plow were then taken to the U.S. Agricultural Society's contest in Chicago, where they won the championship. The steam engine that could be used for plowing, pulling, belt work, or other uses became known as the steam traction engine.

Then development of the steam engine slowed as the Civil War began, with most industry turned to producing weapons. The armies required more food, yet at the same time took many men from their farms. The few men and women left on the farms needed technology to keep up with demand, so the small number of steam engines (mostly portable types) became more popular. Yet the war kept farmers from getting the technology they wanted. It would have to wait until after the war.

After the war, steam engines steadily improved in technology and quality, and many different types and manufacturers sprang up. Case, one of the largest manufacturers of steam engines, made its first engine in 1876. Port Huron began in 1882. In 1880, a patent was issued for a steering device; the steam engine could turn itself! Then came the invention of the clutch (very high technology!). Steam pressures of 150 p.s.i. became commonplace. Work was easier for the farmer as the steam engine pulled the plow and turned the belt to thresh the grain.

The steam traction engine's popularity soared during the 1890s. But so did the horse's. Just as Eli Whitney's cotton gin increased the demand for slaves, the steam engine required more horses. The steam traction engine could plow, haul huge loads, and power the threshing machine all day, but it needed plenty of fuel and water, which was brought by horses. The increased amount of tilled land needed to be planted and cultivated, which the steam engine was too big to do. Although the steam engine made horses unneeded for some big jobs, more horses were needed for many others.

Groups of farmers formed "threshing rings" to share the costs of an engine and thresher. It was very expensive; a 110-hp engine from Case could cost over $3,000! Farmers began to realize that the steam engine, while useful, still didn't keep expenses down enough (once you add horses to the bill) to be practical for the small farmer. Only larger farms could afford them. As the "newfangled" gasoline engines became more reliable and smaller, they began to cut into the steam engine's market. From 1900 on, the steam engine became less popular. In 1924 came the Farmall, a gas tractor that could do all the jobs on the farm. It was the final nail in the coffin. Steam production stopped a few years later. A few steam engines worked until World War Two; then many were lost in the scrap drives. Not many are around today, and you can only see them at antique tractor shows. Yet when they are there, you notice them. Just look for the plumes of coal or wood smoke, and listen for the whistles. They are still impressive!

Interesting Information

President Abraham Lincoln said in 1859: "The successful application of steam power to farm work is a desideratum, especially a steam plow. To be successful, it must, all things considered, plow better than can be done by animal power. It must do all the work as well, and cheaper, or more rapidly, so as to get through more perfectly in season; or in some way afford an advantage over plowing with animals, else it is no success."

Horsepower in steam engines was first measured with a formula of 1 hp for every 10-14 square feet of boiler surface. This formula was made obsolete by the increase of steam pressures, yet it was used until 1911. Then came a new measurement: brake horsepower, measured on a Prony brake dynamometer. An engine from 1908 advertised as 30 hp might be advertised as a 100-hp engine in 1912! Some ads gave both ratings, such as 30-100 hp.

Steam engines exploded every day in the U.S. in the early 1900s. For a plain steam traction engine, the boiler holds 52 cubic feet of water and 26 cubic feet of steam at 150 psi. That 26 cubic feet of steam weighs 9.73 lbs but holds 1,300,000 foot-pounds of energy. The 52 cubic feet of water, at 366°F, holds 38,000,000 foot-pounds. When the boiler fails, it releases enough energy (from the steam and water) to send a one-pound object straight up 7,500 miles (into orbit), or a 7,500-pound object (the traction engine itself) one mile up!
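
The energy figures in that paragraph are internally consistent, which a few lines of arithmetic confirm. This is simply height = energy / weight, ignoring air resistance and the falloff of gravity, so "7,500 miles" is rhetorical rather than real ballistics:

```python
# Sanity-check the boiler-failure energy figures quoted above.
FT_PER_MILE = 5280
steam_ftlb = 1_300_000    # energy in 26 ft³ of steam at 150 psi
water_ftlb = 38_000_000   # energy in 52 ft³ of water at 366 °F
total_ftlb = steam_ftlb + water_ftlb

# height = energy / weight, ignoring drag and the falloff of gravity
print(total_ftlb / 1 / FT_PER_MILE, "miles for a 1-lb object")   # ~7,443
print(total_ftlb / 7500, "ft for a 7,500-lb engine")             # 5240.0, about a mile
```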

Some reader comments:

  • The early farms were not 100 acres in size; most were much closer to 40 acres, and probably less than 20 acres was actually plowed, so the figure of a farmer walking 800 miles just to get the plowing done is a bit far-fetched. A farm of 100 acres would have been quite an operation and would have required several hired men.
  • This was a good report, but the dates are misleading. Plowing with traction engines did not start until about 1876, when Case introduced the first traction engine.



Biomass charcoal to create high heat for manufacturing

Preface. The following industries need heat of 1800 to 3275°F: Chemicals, Forest products, Iron and Steel, Plastics & Rubber, Fabricated metals, Transport Equipment, Computers, electronics & equipment, Aluminum, Cement, Glass, Machinery, Foundries. For nearly all of these products, there is no commercial electric process. Hydrogen-based steelmaking is still at the prototype stage and far from commercial.

Chapter 9 of my book "Life After Fossil Fuels" lists the top heat available from nuclear (572), geothermal (400) and others; none of them is hot enough, not even the proposed small nuclear reactors. So that leaves charcoal. According to Wikipedia, "Charcoal briquettes can burn up to approximately 1,260 °C (2,300 °F) with a forced air blower forge."
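
As a quick check, the Wikipedia figure converts cleanly, and can be compared against the 1800-3275°F range of industrial heat requirements given above:

```python
def c_to_f(celsius):
    """Convert degrees Celsius to Fahrenheit."""
    return celsius * 9 / 5 + 32

charcoal_f = c_to_f(1260)          # forced-air charcoal forge temperature
print(charcoal_f)                  # 2300.0, matching the quoted figure

# Industry above needs roughly 1800-3275 °F; forced-air charcoal
# reaches only the lower part of that range.
print(1800 <= charcoal_f <= 3275)  # True
```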

De Decker (2011) explains that “A large share of energy consumed worldwide is by heat. Cooking, space heating and water heating dominate domestic energy consumption. In the UK, these activities account for 85% of domestic energy use, in Europe for 89% and in the USA for 61%. Heat also dominates industrial energy consumption. In the UK, 76% of industrial energy consumption is heat. In Europe, this is 67%. Few things can be manufactured without heat.

Although it is perfectly possible to convert electricity into heat, as in electric heaters or electric cookers, it is very inefficient to do so. It is often assumed that our energy problems are solved when renewables reach ‘grid parity’ – the point at which they can generate electricity for the same price as fossil fuels. But to truly compete with fossil fuels, renewables must also reach ‘thermal parity‘.  It still remains significantly cheaper to produce heat with oil, gas or coal than with a wind turbine or a solar panel.”

In today’s solar thermal plants, solar energy is converted into steam (via a steam boiler), which is then converted into electricity (via a steam turbine that drives an electric generator). This process is just as inefficient as converting electricity into heat: two-thirds of energy gets lost when converted from steam to electricity. 

But the good news is that you won't starve: the food, beverage and textile industries don't need high heat.

But oh dear, at what a cost to the planet. In the past, the massive production of charcoal, employing hundreds of thousands of workers, was a major cause of deforestation.


***

Muhumuza, R. 2019. Africa’s charcoal trade is decimating region’s fragile forest cover. Associated Press.

The machete-wielding men lodge themselves deep inside forests for weeks at a time, felling trees that will be incinerated into pieces of charcoal. Because they often work at night and target seemingly idle public land, they operate with relative impunity while decimating forests in parts of Africa.

Fires in Brazil’s Amazon rainforest have underscored the challenges of conserving the Earth’s forest cover, a substantial amount of which is found in Africa. After the Amazon, the Congo basin tropical rainforest — covering territory the size of Western Europe — is the world’s second largest, often referred to as the Earth’s second lung.

The world’s poorest continent, home to over 1.2 billion people, has long struggled to protect its forests amid a population explosion that fuels demand for plant-based energy sources seen by many as cheap, especially charcoal.

Some 25% to 35% of climate-changing greenhouse gas emissions come from so-called biomass burning, which also includes seasonal fires intentionally set to clear land for agriculture, according to the European Space Agency. The majority of those fires occur in tropical regions of Africa.

Reliance on charcoal or firewood is highest in Africa and Asia, according to a 2018 report by the U.N. Food and Agriculture Organization, with some African cities almost entirely dependent on charcoal for cooking. In Kinshasa, the capital of Congo, 90% of residents rely mainly on it, the report said.

In Somalia, ravaged by extremist violence, the cutting of trees to sustain an illicit charcoal trade is so widespread that the U.N. has warned that desertification there threatens stability.

The value of the charcoal export trade from the Horn of Africa nation to the Middle East and elsewhere — though banned — is estimated at over $360 million per year. Some 8.2 million trees were felled for charcoal between 2011 and 2017, according to U.N. figures.

In Uganda, an East African nation whose lush vegetation once inspired Winston Churchill to call it “the pearl of Africa,” authorities have long warned about the unsustainable nature of the charcoal trade, which persists despite the extension of the power grid deep into the country. Hydroelectric power remains too expensive for many people even in the capital, Kampala, as middle-class families run charcoal stoves to keep electricity bills down.

Edwin Muhumuza, an environmental protection activist who runs the Kampala-based civic group Youth Go Green, said demand for charcoal has turned it into a precious commodity much like gold or coffee.

“We are really concerned,” he said. “What annoys is they cut down the trees but they don’t replace them.”

Now the National Environment Management Authority, a government agency, is urging authorities to remove consumption taxes on liquid petroleum gas, an alternative source of cooking energy, to save forests from the charcoal business.

Figures show a dire situation. Uganda’s forest cover as a percentage of total land stood at 9% in 2015, down from 24% in 1990, according to government data.

But authorities in northern districts such as Gulu, which provides much of the charcoal entering Kampala, are fighting back in a campaign that has yielded scores of impounded charcoal trucks since 2015.

Gulu chairman Martin Mapenduzi organizes raids in hopes of arresting charcoal burners.

“Illegal logging has gone down but the destruction of forests for charcoal burning is still high,” Mapenduzi said. “It’s something that is giving us a lot of headache, but we are fighting.”

The price of a bag of charcoal, which can sustain a small family for several weeks, has been rising steadily in Kampala, reaching about $28 in August largely because of reduced supply from places such as Gulu. A whole bag is unaffordable for many who instead buy it daily in smaller quantities.

The expense is still far too much for families, said Rose Twine, an entrepreneur who sells her version of an eco-stove while warning against what she calls the unsustainable reliance on charcoal.

References

De Decker, K. 2011. The bright future of solar thermal powered factories. Low Tech magazine.


Why self-driving cars may not be in your future

Preface. Below are excerpts from several articles about why a completely automated vehicle is unlikely.  Heaven forbid they are invented. Researchers have found that people will drive 76% more miles, stop using bicycles and mass transit, waste a considerable amount of energy, and increase congestion.

In December of 2021 I heard Ralph Nader interviewed on Science Friday about car safety. When asked about self-driving, he laughed and said it was such a fantasy: it requires very detailed, up-to-date imagery of every street, the software can be thrown off when anything changes or when the white and yellow lane lines aren't clearly visible, and it would cost tens of billions of dollars to maintain highways to the standards the software needs. California appears to be aware of this and plans to improve striping on 395,000 miles of highway (Snibbe 2018).

Self-driving cars in the news:

Metz C (2021) The Costly Pursuit of Self-Driving Cars Continues On. And On. And On. New York Times. Seven years ago Waymo discovered that spring blossoms made its self-driving cars get twitchy on the brakes. So did soap bubbles. And road flares. Matching the competence of human drivers was elusive. The cluttered roads of America, it turned out, were a daunting place for a robot. The wizards of Silicon Valley said people would be commuting to work in self-driving cars by now. Instead, there have been court fights, injuries and deaths, and tens of billions of dollars spent on a frustratingly fickle technology that some researchers say is still years from becoming the industry’s next big thing. Only the deepest-pocketed outfits like Waymo, a subsidiary of Google’s parent company, Alphabet; auto giants; and a handful of start-ups are managing to stay in the game.


***

Quain, J. R. September 26, 2019. Autonomous Cars are still learning to see. New York Times.

Consensus is growing among designers that self-driving cars just aren’t perceptive enough to make them sufficiently safe.

The problem is that cars can’t begin to figure out what’s around them — separating toddlers from traffic cones, for example — if they can’t see the objects in the first place.  Autonomous cars still can’t see well enough to safely maneuver in heavy traffic or see far enough ahead to handle highway conditions in any kind of weather.

Until now, the standard model for autonomous cars has used some combination of four kinds of sensors — video cameras, radar, ultrasonic sensors and lidar. It’s the approach used by all the leading developers in driverless tech, from Aptiv to Waymo. However, as dozens of companies run trials in states like California, deployment dates have drifted, and it has become increasingly obvious that the standard model may not be enough.

Video cameras can be foiled by glare. Standard radar can judge the relative speed of objects but has Mr. Magoo-like vision. Ultrasonic sensors can sense only nearby objects — and not very clearly. Lidar (formally, light detection and ranging), while able to create 3-D images of people and street signs, has distance limitations and can be stymied in heavy rain. And even the most sophisticated artificial intelligence software can’t help if it doesn’t have the perceptual data to begin with.

My comment: according to the article, other sensing technologies are being developed, but read the articles below before thinking this is an easy technical problem to solve soon.  Just be glad that some of this technology is making conventional cars safer to drive.

Mervis, J. December 15, 2017. Not so fast. We can’t even agree on what autonomous vehicles are, much less how they will affect our lives. Science.

Human drivers aren’t as unsafe as they’re made out to be.  A fatal crash now occurs once every 3.3 million hours of vehicle travel. It will be hard for an automated system to beat that. The public will be much less accepting of crashes caused by software glitches or malfunctioning hardware rather than human error. “Society now tolerates a significant amount of human error on our roads,” Pratt told a congressional panel earlier this year. “We are, after all, only human.”
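
That hours figure converts to miles given an assumed average speed; the 30 mph below is my assumption, not a number from the article:

```python
# Convert "one fatal crash per 3.3 million vehicle-hours" into vehicle-miles.
HOURS_PER_FATAL_CRASH = 3.3e6
AVG_SPEED_MPH = 30        # assumed mix of city and highway driving

miles_per_fatal_crash = HOURS_PER_FATAL_CRASH * AVG_SPEED_MPH
print(f"{miles_per_fatal_crash:.2e} miles per fatal crash")   # 9.90e+07
```

That is roughly one fatal crash per 100 million vehicle-miles, the commonly cited U.S. benchmark an automated system would have to beat.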

While developers amass data on the sensors and algorithms that allow cars to drive themselves, research on the social, economic, and environmental effects of Automated Vehicles (AVs) is sparse. Truly autonomous driving is still decades away according to most transportation experts.

In the dystopian view, driverless cars add to many of the world’s woes. Freed from driving, people rely more heavily on cars—increasing congestion, energy consumption, and pollution. A more productive commute induces people to move farther from their jobs, exacerbating urban sprawl. At the same time, unexpected software glitches lead to repeated recalls, triggering massive travel disruptions. Wealthier consumers buy their own AVs, eschewing fleet vehicles that come with annoying fellow commuters, dirty back seats, and logistical hassles. A new metric of inequality emerges as the world is divided into AV haves and have-nots.

Companies have good reason for painting the rosiest scenario for their technology, Shladover, a transportation engineer at the California Partners for Advanced Transportation Technology program in Richmond says. “Nobody wants to appear to be lagging behind the technology of a competitor because it could hurt sales, their ability to recruit top talent, or even affect their stock price,” he says.

As a result, it’s easy for the public to overestimate the capabilities of existing technology. In a fatal crash involving a Tesla Model S and a semitrailer in May 2016, the driver was using what Tesla describes as the car’s “autopilot” features—essentially an advanced cruise control system that can adjust the car’s speed to sync with other vehicles and keep the car within its lane. That fits the definition of a level-two vehicle, which means the driver is still in charge. But he wasn’t able to react in time when the car failed to detect the semi.

Shladover believes AV companies need to be much clearer about the “operational design” of their vehicles—in other words, the specific set of conditions under which the cars can function without a driver’s assistance. “But most of the time they won’t say, or they don’t even know themselves,” he says.

But progress will likely be anything but steady. Level three, for example, signifies that the car can drive itself under some conditions and will notify drivers when a potential problem arises in enough time, say 15 seconds, to allow the human to regain control. But many engineers believe that such a smooth hand-off is all but impossible because of myriad real-life scenarios, and because humans aren’t very good at refocusing quickly once their minds are elsewhere. So many companies say they plan to skip level three and go directly to level four—vehicles that operate without any human intervention.

Even a level-four car, however, will operate autonomously only under certain conditions, say in good weather during the day, or on a road with controlled access.

Rural communities might need government subsidies to give residents of a sparsely populated area the same access to AVs that their urban neighbors enjoy. And advocates for mass transit, bicycling, and carpooling are likely to demand that AV fleets enhance, rather than compete against, these sustainable forms of transportation.

Pavlus, John. July 18, 2016. What NASA Could Teach Tesla about Autopilot’s Limits. Scientific American.

Decades of research have warned about the human attention span in automated cockpits

After a Tesla Model S in autopilot mode crashed into a truck and killed its driver, the safety of self-driving cars was called into question: the autopilot system didn't see the truck coming, the driver didn't notice it either, and so neither applied the brakes.

Who knows the dangers better than NASA, where automation in cockpits (the space shuttle, airplanes) has been studied for decades? NASA describes how connected a person is to a decision-making process as being "in the loop": driving a car yourself means you Observe, Orient, Decide, and Act (OODA). But if your car is on autopilot and you can still intervene, to brake or steer, you are "ON the loop."

Airplanes fly automated, with pilots observing. But this is very different from a car: if something goes wrong, the pilot has many minutes to react, because the plane is 8 miles up in the air.

But in a car, you have just ONE SECOND. That requires faster reflexes than a test pilot's; there is almost no margin for error. This means you might as well be driving manually, since you still have to pay full attention while the car is on autopilot, not sit in the back seat reading a book.

Tesla tries to get around this by having the autopilot check that the driver's hands are on the wheel, triggering visual and audible alerts if they are not.

But NASA has found this doesn't work, because the better the autopilot, the less attention the driver pays to what's going on. It is tiring and boring to monitor a process that performs well for a long time, an effect called the "vigilance decrement" as far back as 1948. Experiments back then showed that vigilance drops off after just 15 minutes.

So the better the system, the more likely we are to stop paying attention. But no one would want to buy a self-driving car that they might as well be driving themselves. The whole point is that the dangerous stuff we're already doing now, like changing the radio, eating, and talking on the phone, would be less dangerous in autopilot mode.

These findings expose a contradiction in systems like Tesla’s Autopilot. The better they work, the more they may encourage us to zone out—but in order to ensure their safe operation they require continuous attention. Even if Joshua Brown was not watching Harry Potter behind the wheel, his own psychology may still have conspired against him.

Tesla’s plan assumes that automation advances will eventually get around this problem.

Transportation experts have defined 6 levels of automation.

What the car does at each of the 6 levels:

  • 0: nothing
  • 1: accelerates, brakes, OR steers
  • 2: accelerates, brakes, AND steers
  • 3: assumes full control within narrow parameters, such as on the freeway, but not during merges or exits
  • 4: everything, but only under certain conditions (e.g. specific locations, speed, weather, time of day)
  • 5: everything: goes everywhere, any time, under all conditions

What the driver does:

  • 0: everything
  • 1: everything, but with some assistance
  • 2: remains in control, monitors and reacts to conditions
  • 3: must be capable of regaining control within 10-15 seconds
  • 4: nothing under certain conditions, but everything at other times
  • 5: nothing, and unable to assume control

Our take on the prospects:

  • 0: the older fleet
  • 1: the present fleet
  • 2: now in testing
  • 3: might never be developed
  • 4: where the industry wants to be
  • 5: never
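The three lists above can be folded into a single lookup table. A sketch, paraphrasing the entries rather than quoting any official SAE wording:

```python
# Automation levels 0-5, paraphrased from the lists above:
# (what the car does, what the driver does, prospects).
AUTOMATION_LEVELS = {
    0: ("nothing", "everything", "older fleet"),
    1: ("accelerates, brakes, OR steers", "everything, with assistance", "present fleet"),
    2: ("accelerates, brakes, AND steers", "monitors and reacts to conditions", "now in testing"),
    3: ("full control within narrow parameters", "must regain control within 10-15 seconds", "might never be developed"),
    4: ("everything, under certain conditions", "nothing sometimes, everything otherwise", "where the industry wants to be"),
    5: ("everything, everywhere, any time", "nothing, and unable to assume control", "never"),
}

def describe(level: int) -> str:
    car, driver, prospect = AUTOMATION_LEVELS[level]
    return f"Level {level}: car: {car}; driver: {driver}; prospect: {prospect}"

print(describe(3))
```

Level 3 is the awkward middle: the car does almost everything, yet the driver must stay ready to take over, which is exactly the vigilance problem NASA describes.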

Computers do not deal well with the unexpected, with sudden and unforeseen events. Self-driving cars can obey the rules of the road, but they cannot anticipate how other drivers will behave. Without super-accurate GPS, automation relies on seeing lines on the pavement to stay in its lane, but snow, rain, and fog can make those lines disappear. Self-driving cars also rely on special, detailed maps of the locations of intersections, on-ramps, stop signs, and so on; very few roads are mapped to this degree or updated for construction, detours, conversions to roundabouts, new stop lights, and the like. The cars don’t detect potholes, puddles, or oil spots well and can be confounded by the shadows of overpasses. And if a collision is unavoidable, does the car run over the child or swerve into a light pole and potentially kill the driver? (Boudette 2016).

John Markoff. January 17, 2016. For Now, Self-Driving Cars Still Need Humans. New York Times.

Self-driving cars will require human supervision. On many occasions, the cars will tell their human drivers, “Here, you take the wheel,” when they encounter complex driving situations or emergencies.  In the automotive industry, this is referred to as the hand-off problem, and automotive engineers say there is no easy solution to make a driver who may be distracted by texting, reading email or watching a movie perk up and retake control of the car in the fraction of a second that is required in an emergency. The danger is that by inducing human drivers to pay even less attention to driving, the safety technology may be creating new hazards. The ability to know if the driver is ready, and if you’re giving them enough notice to hand off, is a really tricky question.

The Tesla performed well in freeway driving, but on city streets and country roads, Autopilot performance could be described as hair-raising. The car, which uses only a camera to track the roadway by identifying lane markers, did not follow the curves smoothly or slow down when approaching turns.  On a 220-mile drive to Lake Tahoe from Palo Alto, Calif., Dr. Thrun said he had to intervene more than a dozen times.

Like the Tesla, the new autonomous Nissan models will require human oversight, and even their most advanced models aren’t autonomous in snow, heavy rain, and some nighttime driving.

You could propose various fixes, but none of them gets around the one second the driver has to react. That is not fixable.

Massachusetts Institute of Technology, CSAIL. 2018. Self-driving cars for country roads: Most autonomous vehicles require intricate hand-labeled maps, but new system enables navigation with just GPS and sensors. ScienceDaily.

Uber’s recent self-driving car fatality underscores the fact that the technology is still not ready for widespread adoption. One reason is that there aren’t many places where self-driving cars can actually drive. Companies like Google only test their fleets in major cities where they’ve spent countless hours meticulously labeling the exact 3D positions of lanes, curbs, off-ramps and stop signs.

Indeed, if you live along the millions of miles of U.S. roads that are unpaved, unlit or unreliably marked, you’re out of luck. Such streets are often much more complicated to map, and get a lot less traffic, so companies are unlikely to develop 3D maps for them anytime soon. From California’s Mojave Desert to Vermont’s White Mountains, there are huge swaths of America that self-driving cars simply aren’t ready for.

Additional references

Boudette N (2016) 5 Things That Give Self-Driving Cars Headaches. New York Times, June 4.

Snibbe K (2018) New Road Striping in California Meant to Help Self-Driving Vehicles. The Orange County Register. https://www.govtech.com/fs/new-road-striping-in-california-meant-to-help-self-driving-vehicles.html

Really great short story about self-driving cars: Boyle TC (2019) Asleep at the Wheel. New Yorker.


Foreign Policy: The limits of clean energy

Preface. This article appeared in the magazine Foreign Policy. Some key points:

  • Powering the world with renewables would require 34 million metric tons of copper, 40 million tons of lead, 50 million tons of zinc, 162 million tons of aluminum, and no less than 4.8 billion tons of iron.
  • The batteries for power storage when the sun isn’t shining and the wind isn’t blowing will require 40 million tons of lithium requiring a 2,700 percent increase over current levels of extraction. Lithium is an ecological disaster. It takes 500,000 gallons of water to produce one ton of lithium. Most lithium is in dry areas, and mining companies are using up the groundwater, leaving nothing for farmers to irrigate their crops with, while chemical leaks from lithium mines have poisoned thousands of miles of rivers, killing entire freshwater ecosystems.
  • We’ll also need to replace 2 billion vehicles with electric vehicles, leading to even more mind-boggling amounts of materials.
  • Ecologists estimate that even at present rates of global material use, we are overshooting sustainable levels by 82 percent.

Better yet, read this overview of the requirements, the best of all appraisals I’ve seen in the past 22 years:

Michaux (2022) Assessments of the Physical requirements to globally phase out fossil fuels


***

Hickel J (2019) The limits of clean energy. If the world isn’t careful, renewable energy could become as destructive as fossil fuels. Foreign policy.

The conversation about climate change has been blazing ahead in recent months. Propelled by the school climate strikes and social movements like Extinction Rebellion, a number of governments have declared a climate emergency, and progressive political parties are making plans—at last—for a rapid transition to clean energy under the banner of the Green New Deal.

This is a welcome shift, and we need more of it. But a new problem is beginning to emerge that warrants our attention. Some proponents of the Green New Deal seem to believe that it will pave the way to a utopia of “green growth.” Once we trade dirty fossil fuels for clean energy, there’s no reason we can’t keep expanding the economy forever.

This narrative may seem reasonable enough at first glance, but there are good reasons to think twice about it. One of them has to do with clean energy itself.

The phrase “clean energy” normally conjures up happy, innocent images of warm sunshine and fresh wind. But while sunshine and wind are obviously clean, the infrastructure we need to capture them is not. Far from it. The transition to renewables is going to require a dramatic increase in the extraction of metals and rare-earth minerals, with real ecological and social costs.

We need a rapid transition to renewables, yes—but scientists warn that we can’t keep growing energy use at existing rates. No energy is innocent. The only truly clean energy is less energy.

In 2017, the World Bank released a little-noticed report that offered the first comprehensive look at this question. It models the increase in material extraction that would be required to build enough solar and wind utilities to produce an annual output of about 7 terawatts of electricity by 2050. That’s enough to power roughly half of the global economy. By doubling the World Bank figures, we can estimate what it will take to get all the way to zero emissions—and the results are staggering: 34 million metric tons of copper, 40 million tons of lead, 50 million tons of zinc, 162 million tons of aluminum, and no less than 4.8 billion tons of iron.
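The doubling arithmetic can be reproduced directly. In this sketch the 7-terawatt figures are simply half the quoted totals; they are inferred, not copied from the World Bank report itself:

```python
# Inferred figures for ~7 TW of solar/wind, in Mt (megatonnes);
# iron is listed in Mt too (2,400 Mt = 2.4 billion tons).
seven_tw_mt = {"copper": 17, "lead": 20, "zinc": 25, "aluminum": 81, "iron": 2400}

# Doubling approximates a full zero-emissions build-out.
full_transition_mt = {metal: 2 * mt for metal, mt in seven_tw_mt.items()}

for metal, mt in full_transition_mt.items():
    print(f"{metal}: {mt} Mt")
# copper: 34 Mt ... iron: 4800 Mt (= 4.8 billion tons), matching the text.
```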

In some cases, the transition to renewables will require a massive increase over existing levels of extraction. For neodymium—an essential element in wind turbines—extraction will need to rise by nearly 35 percent over current levels. Higher-end estimates reported by the World Bank suggest it could double.

The same is true of silver, which is critical to solar panels. Silver extraction will go up 38 percent and perhaps as much as 105 percent. Demand for indium, also essential to solar technology, will more than triple and could end up skyrocketing by 920 percent.

And then there are all the batteries we’re going to need for power storage. To keep energy flowing when the sun isn’t shining and the wind isn’t blowing will require enormous batteries at the grid level. This means 40 million tons of lithium—an eye-watering 2,700 percent increase over current levels of extraction.

That’s just for electricity. We also need to think about vehicles. This year, a group of leading British scientists submitted a letter to the U.K. Committee on Climate Change outlining their concerns about the ecological impact of electric cars. They agree, of course, that we need to end the sale and use of combustion engines. But they pointed out that unless consumption habits change, replacing the world’s projected fleet of 2 billion vehicles is going to require an explosive increase in mining: Global annual extraction of neodymium and dysprosium will go up by another 70 percent, annual extraction of copper will need to more than double, and cobalt will need to increase by a factor of almost four—all for the entire period from now to 2050.

The problem here is not that we’re going to run out of key minerals—although that may indeed become a concern. The real issue is that this will exacerbate an already existing crisis of overextraction. Mining has become one of the biggest single drivers of deforestation, ecosystem collapse, and biodiversity loss around the world. Ecologists estimate that even at present rates of global material use, we are overshooting sustainable levels by 82 percent.

Take silver, for instance. Mexico is home to the Peñasquito mine, one of the biggest silver mines in the world. Covering nearly 40 square miles, the operation is staggering in its scale: a sprawling open-pit complex ripped into the mountains, flanked by two waste dumps each a mile long, and a tailings dam full of toxic sludge held back by a wall that’s 7 miles around and as high as a 50-story skyscraper. This mine will produce 11,000 tons of silver in 10 years before its reserves, the biggest in the world, are gone.

To transition the global economy to renewables, we need to commission up to 130 more mines on the scale of Peñasquito. Just for silver.
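A quick scale check makes clear what silver tonnage that claim implies:

```python
# Implied silver demand behind "130 more mines on the scale of Peñasquito".
penasquito_silver_tons = 11_000   # over the mine's ~10-year reserve life
mines_needed = 130
implied_demand_tons = penasquito_silver_tons * mines_needed
print(f"~{implied_demand_tons:,} tons of silver")
# ~1,430,000 tons -- about 1.4 million tons for the transition.
```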

Lithium is another ecological disaster. It takes 500,000 gallons of water to produce a single ton of lithium. Even at present levels of extraction this is causing problems. In the Andes, where most of the world’s lithium is located, mining companies are burning through the water tables and leaving farmers with nothing to irrigate their crops. Many have had no choice but to abandon their land altogether. Meanwhile, chemical leaks from lithium mines have poisoned rivers from Chile to Argentina, Nevada to Tibet, killing off whole freshwater ecosystems. The lithium boom has barely even started, and it’s already a crisis.

And all of this is just to power the existing global economy. Things become even more extreme when we start accounting for growth. As energy demand continues to rise, material extraction for renewables will become all the more aggressive—and the higher the growth rate, the worse it will get.

It’s important to keep in mind that most of the key materials for the energy transition are located in the global south. Parts of Latin America, Africa, and Asia will likely become the target of a new scramble for resources, and some countries may become victims of new forms of colonization. It happened in the 17th and 18th centuries with the hunt for gold and silver from South America. In the 19th century, it was land for cotton and sugar plantations in the Caribbean. In the 20th century, it was diamonds from South Africa, cobalt from Congo, and oil from the Middle East. It’s not difficult to imagine that the scramble for renewables might become similarly violent.

If we don’t take precautions, clean energy firms could become as destructive as fossil fuel companies—buying off politicians, trashing ecosystems, lobbying against environmental regulations, even assassinating community leaders who stand in their way.

Some hope that nuclear power will help us get around these problems—and surely it needs to be part of the mix. But nuclear comes with its own constraints. For one, it takes so long to get new power plants up and running that they can play only a small role in getting us to zero emissions by midcentury. And even in the longer term, nuclear can’t be scaled beyond about 1 terawatt. Absent a miraculous technological breakthrough, the vast majority of our energy will have to come from solar and wind.

Reducing energy demand not only enables a faster transition to renewables, but also ensures that the transition doesn’t trigger new waves of destruction.


Airplanes are energy gluttons. Finite oil should be used for ships, locomotives, & trucks

Preface. As oil declines and the energy crisis worsens, airplanes ought to be the first to go, since they are 600 times less energy efficient than large cargo ships (roughly 30,000 vs. 50 kJ per tonne-km), 50 to 120 times less efficient than trains, and 7.5 to 15 times less efficient than trucks.

[Figure: freight transportation efficiency by mode, in kJ per tonne per km]
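The ratios in the preface can be reproduced from per-mode energy intensities. The figures below (kJ per tonne-km) are assumed values chosen only to be consistent with the stated ratios, not measurements from a specific source:

```python
# Approximate freight energy intensities, kJ per tonne-km, as (low, high).
INTENSITY_KJ_PER_TONNE_KM = {
    "cargo ship": (50, 50),
    "train": (250, 600),
    "truck": (2000, 4000),
    "airplane": (30000, 30000),
}

def times_less_efficient_than_plane(mode: str) -> tuple:
    """How many times more energy a plane uses per tonne-km (low, high)."""
    lo, hi = INTENSITY_KJ_PER_TONNE_KM[mode]
    return (30000 / hi, 30000 / lo)

print(times_less_efficient_than_plane("cargo ship"))  # (600.0, 600.0)
print(times_less_efficient_than_plane("train"))       # (50.0, 120.0)
print(times_less_efficient_than_plane("truck"))       # (7.5, 15.0)
```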

Airplanes will continue to fly, though, because they use kerosene, which is a fraction of every barrel of crude oil, and much of it can’t be converted to diesel (for trucks, rail, and ships) or gasoline (for cars). Ships, however, can burn just about anything, and since they carry 90% of all global trade, perhaps kerosene could keep them going a bit longer.

Kerosene is the fraction of choice for airplanes for a number of reasons. First, it has a much higher flash point than gasoline. Second, kerosene has a freezing point of -57°F, and temperatures at the altitudes planes fly are commonly even lower, usually around -67°F. The kerosene doesn’t freeze, though, because the plane is moving through this cold air mass at hundreds of miles per hour: the speed of the air over the wings creates friction, heating the surfaces (Page 2020).

United States air carriers burn through 17 billion gallons of jet fuel annually. The amount of fuel an airliner needs depends on many factors, including aircraft type, weight and direction of travel. Here are some price estimates for 2019 (Stewart 2019):

  • Los Angeles International to Tokyo Narita: This trans-Pacific hop uses an estimated 9,500 gallons of jet fuel at an estimated price of $19,190.
  • New York JFK to Los Angeles International: This popular transcon flight uses an estimated 5,325 gallons of jet fuel at an estimated price tag of $10,757. The return to New York uses slightly less fuel at 5,075 gallons with an estimated price of $10,252.
  • Chicago O’Hare to Miami International: Heading southbound, this flight operates using an estimated 2,350 gallons, with an estimated cost of $4,747. On the northbound return, the cost estimate soars to $7,201, using an estimated 3,565 gallons.
  • Denver International to San Francisco International Airport: This particular route can see significant variances in fuel quantity and pricing because of the variety of aircraft operating between the two cities. On average, aircraft fill up with an estimated 3,500 gallons of jet fuel, costing an estimated $7,070. However, price can vary from $4,040 on the low end to $14,140 on the high end.
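Dividing each quoted cost by its gallons shows the estimates all assume the same underlying fuel price:

```python
# Back out the per-gallon jet fuel price implied by each estimate.
routes = {
    "LAX -> NRT": (9_500, 19_190),
    "JFK -> LAX": (5_325, 10_757),
    "ORD -> MIA": (2_350, 4_747),
    "DEN -> SFO": (3_500, 7_070),
}
for route, (gallons, cost_usd) in routes.items():
    print(f"{route}: ${cost_usd / gallons:.2f}/gal")
# Every route works out to about $2.02 per gallon (2019 prices).
```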

It’s also been found that if airplanes flew in birdlike formations, the way geese fly in a V, up to 10% of their energy could be saved and emissions lowered (Khadilkar 2021).


* * *

Nygren, E., et al. 2009. Aviation fuel and future oil production scenarios. Energy Policy 37/10: 4003-4010

Jet fuel is extracted from the middle distillates fraction and competes with the production of diesel. Aviation fuel is almost exclusively extracted from the kerosene fraction of crude oil.

Today global oil production is roughly 81.5 million barrels per day (Mb/d), which is equivalent to an annual output of 3905.9 Mt.

Aviation fuels include both jet fuel for turbine engines and aviation gasoline for piston engines. The dominant fuel is jet fuel originating from crude oil as it is used in all large aircraft. Jet fuel is almost exclusively extracted from the kerosene fraction of crude oil, which distills between the gasoline fraction and the diesel fraction. The IEA estimated the world’s total refinery production in 2006 was 3861 million tonnes (Mt). The aviation fuel part was 6.3%, implying an annual aviation fuel production of 243 Mt (corresponding to about 5 Mb/d), including both jet fuel and aviation gasoline.
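The paper’s figures can be reproduced. The barrels-per-day conversion needs a fuel density, which I assume here (roughly 0.80 t/m³ for kerosene); that value is mine, not the paper’s:

```python
# Reproduce the aviation-fuel share arithmetic.
refinery_output_mt = 3861       # world refinery production, 2006 (IEA)
aviation_share = 0.063          # aviation fuel fraction of output
aviation_mt = refinery_output_mt * aviation_share
print(f"{aviation_mt:.0f} Mt per year")   # ~243 Mt

# Convert Mt/year to million barrels/day:
# assumed density 0.80 t/m3 times 0.159 m3 per barrel.
TONNES_PER_BARREL = 0.80 * 0.159
mb_per_day = aviation_mt * 1e6 / TONNES_PER_BARREL / 365 / 1e6
print(f"~{mb_per_day:.1f} million barrels per day")   # ~5 Mb/d, as in the text
```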

Figure 3 shows how the world’s refinery production is divided into different fractions.

Figure 3: Distribution of world refinery production in 2006. The total production was 3861 Mt. Source: International Energy Agency, 2008b. Key World Energy Statistics 2008 and previous editions, see also: http://www.iea.org/textbase/nppdf/free/2008/key_stats_2008.pdf


If a refinery wants to increase jet fuel production, diesel production must decrease. Over the course of the year the proportion of diesel to jet fuel changes, and whichever fuel is most profitable at the moment is produced.

In Sweden, Environmental Class 1 ultra-low-sulphur diesel is a prioritized product, with the consequence that no jet fuel at all is manufactured there: the kerosene fraction is blended directly into the diesel fraction to provide the correct viscosity. Having fewer products is a way to increase a refinery’s efficiency.

The conclusion to be drawn is that aviation fuel production is not a fixed percentage of refinery output. In 2006, aviation fuel was 6.3% of world refinery production. The kerosene fraction averages 8-10% of crude oil, but not all kerosene becomes jet fuel or diesel: kerosene can also be used to decrease the viscosity of the heavy fractions of crude oil, and it serves as lamp oil in certain parts of the world.

The environmental parameters that define the operating envelope for aviation fuels such as pressure, temperature and humidity vary dramatically both geographically and with altitude. Consequently, aviation fuel specifications have developed primarily on the basis of simulated performance tests rather than defined compositional requirements. Given the dependence on a single source of fuel on an aircraft and the flight safety implications, aviation fuels are subject to stringent testing and quality assurance procedures. The fuel is tested in a number of certified ways to be sure of obtaining the right properties following a specification of the international standards from, for example, IATA guidance material, ASTM specifications and UK defense standards (Air BP, 2000). Tests are done several times before the fuel is finally used in an airplane.

Today, the increasing addition of biofuels to diesel is a problem for the aviation industry. One of the more common biodiesels is FAME (fatty acid methyl ester). FAME is not a hydrocarbon, and no non-hydrocarbons are allowed in jet fuel except for approved additives as defined in the various international specifications such as DEF STAN 91-91 and ASTM D 1655. Consequently, biofuel-contaminated jet fuel cannot be used under jet fuel standards. The problem with FAME is that it is absorbed by metal surfaces. Diesel and jet fuel are often transported in a joint transport system, making it possible for FAME stuck in tanks, pipelines, and pumps to desorb into the jet fuel. The limit for FAME contamination of jet fuel is 5 ppm. Because FAME can be picked up at any point in the supply chain, 5 ppm is a difficult limit to meet, and the introduction of biofuels into the diesel fraction has therefore had a negative impact on jet fuel supply security.

References

Ashby MF (2015) Materials and Sustainable Development, table A.14.

Khadilkar D (2021) Tight Flight: A birdlike formation would save fuel for planes. Scientific American.

Page C (2020) How do pilots know how much fuel to take on a flight? thepointsguy.com

Smil V (2010) Prime Movers of Globalization: The History and Impact of Diesel Engines and Gas Turbines, pp 160, 204.

Stewart M (2019) How much does it cost to fuel up an airliner? thepointsguy.com


Vaclav Smil on natural gas (ethane) and plastics

Preface. Vaclav Smil doesn’t mention using plastic for heat, but in a letter to The Guardian, David Reed suggests:

“The effort of collecting, transporting and cleaning plastics for possible recycling has largely failed, created much more pollution and contributed massively to climate change. The idea of burning plastics and using the energy to heat our homes was proposed by the plastics company Dow more than 30 years ago: it suggested treating all plastics as “borrowed oil”. At that time, ordinary domestic waste had a calorific value of low-grade coal, so the suggestion was that this waste should be burned in efficient plants with heat recovery and treatment of the gases produced, perhaps even trapping the carbon dioxide produced, rather than trying to recycle the complex (and dirty) mix of plastics.  Today, with higher use of more complex plastics, this makes even more sense. Mixed plastics cannot really be recycled: they are long-chain molecules, like spaghetti, so if you reheat and reprocess them, you inevitably end up with something of lower performance; it’s called down-cycling.”

While this could be polluting if not done right, people will certainly turn to burning plastic and anything else they can get their hands on at some point in energy decline. Better to do it correctly now in an incinerator than in backyards in the future, and to protect our land and waterways from plastic pollution right now.

Thermal recycling processes require temperatures of between 300 °C and 900 °C (572 °F to 1,650 °F), consuming a whole lot of energy (Nakaji 2021).

Fracked shale oil and gas have created a boom in plastics in the U.S., with billions of dollars invested in new plants to take advantage of this very temporary bonanza (of the 8 major tight oil basins, only the Permian is not in decline). Fracked oil is too light to be used as a transportation fuel.

2022-4-6 Ethane to outpace growth in all other U.S. petroleum product consumption through 2023. U.S. Energy Information Administration: Ethane mainly serves as a petrochemical feedstock to produce ethylene, which is used to make plastics and resins. Consumption of ethane has grown every year since 2010 in the U.S. More is now consumed than jet fuel or propane. Consumption of ethane, which we estimate using product supplied, grew by 50,000 barrels per day (b/d) in 2021, according to data from our March 2022 Petroleum Supply Monthly. We forecast in our March 2022 Short-Term Energy Outlook (STEO) that by 2023, U.S. consumption of ethane will grow by another 340,000 b/d.


***

Vaclav Smil. 2013. Making the Modern World: Materials and Dematerialization.  Wiley.

Polyethylene (PE) is by far the most important thermoplastic (it accounted for 29% of the world’s aggregate plastic output, or roughly 77 Mt, in 2010), polypropylene (PP) comes next (with about 19% or 50 Mt in 2010), followed by polyvinyl chloride (PVC, about 12% or 32 Mt in 2010).
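As a consistency check, each share and tonnage pair implies roughly the same 2010 world plastics total:

```python
# Implied world plastics output from each (share, tonnage in Mt) pair.
plastics = {"PE": (0.29, 77), "PP": (0.19, 50), "PVC": (0.12, 32)}
for name, (share, mt) in plastics.items():
    print(f"{name}: implied total {mt / share:.0f} Mt")
# All three imply a 2010 world total of roughly 265 Mt.
```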

In 2010, packaging consumed almost 40% of the total (mostly as various kinds of PE and PP), construction about 20% (mostly for plastic sheets used as vapor barriers in wall and ceiling insulation), the auto industry claimed nearly 8% (interior trim, exterior parts), and the electrical and electronic industry took about 6% (mostly for insulation of wires and cables).

All of these products begin as ethane. In North America and the Middle East, ethane is separated from natural gas, and low gas prices and abundant supply led to surplus production for export and favored construction of new capacity: in 2012 Qatar launched the world’s largest LDPE plant, and, largely as a result of shale gas extraction, new ethylene capacities are planned in the USA (Stephan, 2012). In Europe, where prices of imported natural gas are high, the dominant feedstock for ethylene is instead naphtha derived by the distillation of crude oil.

Plastics have a limited lifespan in terms of functional integrity: even materials that are not in contact with earth or water do not remain in excellent shape for decades. Service spans are no more than 2–15 years for PE, 3–8 years for PP, and 7–10 years for polyurethane; among the common plastics only PVC can last two or three decades and thick PVC cold water pipes can last even longer (Berge, 2009).

Some products made out of plastic:

  • transparent or opaque bags (sandwich, grocery, or garbage)
  • sheets (for covering crops and temporary greenhouses)
  • wraps (Saran, Cling)
  • squeeze bottles (for honey)
  • HDPE garbage cans
  • containers (for milk, detergents, motor oil)
  • HDPE house wraps (Tyvek) and water pipes
  • PEX water pipes and insulation for electrical cables
  • UHMWPE knee and hip replacements
  • massive LDPE water tanks
  • indoor-outdoor carpeting
  • lightweight fabrics woven from PP yarn, used particularly for outdoor apparel
  • insulated wires, water and sewage pipes, food wraps, car interiors and body undercoating
  • disposable and surgical gloves
  • flexible tubing for feeding, breathing, and pressure monitoring; catheters
  • blood bags
  • IV containers
  • sterile packaging
  • trays
  • basins
  • bed pans and rails
  • thermal blankets
  • lab ware
  • construction (house sidings, window frames)
  • outdoor furniture
  • water hoses
  • office gadgets
  • toys

References

Nakaji Y et al (2021) Low-temperature catalytic upgrading of waste polyolefinic plastics into liquid fuels and waxes. Applied Catalysis B: Environmental.

