Clouds may not curb global warming as much as hoped for

The following article, “Clouds Play Lesser Role in Curbing Warming, Study Finds,” is from climatecentral.org.

Analysis of the first seven years of data from a NASA cloud-monitoring mission suggests clouds are doing less to slow the warming of the planet than previously thought, and that temperatures may rise faster than expected as greenhouse gas pollution worsens — perhaps 25 percent faster.

Clouds can play an important role in slowing global warming by reflecting energy back into space. As temperatures rise, clouds contain more liquid water and fewer ice crystals, making them brighter, meaning they reflect more sunlight.

The new research, however, suggests climate models have overestimated how much ice is in clouds, meaning less is available to be converted to liquid as temperatures rise.

“When carbon dioxide concentrations and temperatures rise, then mixed-phase clouds will increase their liquid water content,” said Ivy Tan, a PhD candidate at Yale University who led the research, which investigated common clouds that contain both ice and water. “Many models are overestimating how much ice is in the mixed-phase clouds.”

The repercussions of the findings, which were published Thursday in Science, could make it harder to hold warming to limits set during recent United Nations climate negotiations.

The coldest clouds are full of ice; the warmest are full of water. Modeling experiments by Tan and two other scientists focused on the in-betweeners — mixed-phase clouds, such as undulating stratiform and fluffy stratocumulus clouds, which are abundant over the vast Southern Ocean and around the Northern Hemisphere north of New York.

For their study, the researchers used the NASA data to guide the modification of a popular earth model. They added more liquid and less ice to the clouds in their model simulations, striving to create more realistic conditions.

Because there was less ice, cloud brightness increased more slowly than it did in the unmodified model, since fewer ice crystals were replaced with reflective liquid as temperatures warmed.

One of climate science’s great quests is to project how much earth warms when carbon dioxide concentrations double — something known as climate sensitivity. When carbon dioxide levels were doubled in the modified model, temperatures rose by at least a quarter more than they did when the unmodified model was used — to at least 5°C (9°F).
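
A minimal arithmetic sketch of that comparison, assuming the unmodified model's sensitivity is roughly 4°C (the baseline is only implied by the article's figures; the 25 percent factor and the 5°C/9°F results come from the article):

```python
# A minimal arithmetic sketch. The ~4 C baseline sensitivity of the unmodified
# model is an assumption implied by the article's figures; only the "quarter
# more" factor and the 5 C (9 F) result come from the article.
unmodified_sensitivity_c = 4.0                            # assumed deg C per CO2 doubling
modified_sensitivity_c = unmodified_sensitivity_c * 1.25  # "at least a quarter more"

print(modified_sensitivity_c)          # 5.0 deg C, the article's "at least 5 degrees C"
print(modified_sensitivity_c * 9 / 5)  # 9.0 deg F
```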

What the findings might actually mean for earth will depend heavily on how much carbon dioxide, methane and other greenhouse gases are yet to be billowed into the atmosphere, and how quickly. But the discovery suggests impacts from climate change will be worse, and will get worse more quickly, than earth models had previously indicated.

Isaac Held, a National Oceanic and Atmospheric Administration climate scientist, said he agreed with the researchers about “the importance of getting the ice-liquid ratio in mixed-phase clouds right,” but he doesn’t agree that global climate models generally underestimate climate sensitivity.

Based on past observations, Held, who was not involved with the study, said the climate sensitivity of 5°C or more shown by the new research may be implausible.

“Admittedly, it is a rather high estimate, which may reflect the fact that the model used is already on the sensitive side,” said Mark Zelinka, a cloud modeling expert at Lawrence Livermore National Laboratory who worked with Tan on the research. But, based on Zelinka’s interpretation of historical data, he said it “seems premature” to dismiss it as implausible.

Tan, meanwhile, said it would be a mistake to focus too closely on the exact number. The sensitivity result from the modeling experiments should be taken “with a grain of salt,” she said.

That’s because the study was based on a single model. A main point in conducting the experiments was to show that climate models contain a bias that could be corrected. The group hopes other scientists will conduct similar experiments using different models to help home in on a more reliable measure of climate sensitivity.

Michael Mann, a meteorology professor at Penn State who was not involved with the study, said it’s “speculative” but “plausible” that global climate models have been underestimating climate sensitivity by assuming too much cloud glaciation.

“This is one of several recent studies that provide sobering evidence that earth’s climate sensitivity may lie in the upper end of the current uncertainty range,” Mann said in an email. “That means that avoiding dangerous 2°C warming might be an even greater challenge.”

The new findings underscore the urgency of taking steps to slash rates of greenhouse gas pollution, Mann said.

Carbon dioxide levels have risen more than 40 percent to 400 parts per million since before the Industrial Revolution, and they continue to rise at a hastening pace.
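
A quick check of that figure, assuming the commonly cited pre-industrial baseline of about 280 ppm (the baseline is an assumption; the 400 ppm value is from the article):

```python
# Quick check of the "more than 40 percent" rise. The ~280 ppm pre-industrial
# baseline is a commonly cited figure and an assumption here; the 400 ppm value
# is from the article.
preindustrial_ppm = 280.0
current_ppm = 400.0

increase = (current_ppm - preindustrial_ppm) / preindustrial_ppm
print(f"{increase:.0%}")  # ~43%, consistent with "more than 40 percent"
```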

The increase in carbon dioxide levels recorded so far has played the most important role in pushing average global temperatures up by 1°C (1.8°F) during the last 200 years. That has worsened heat waves, floods and droughts, leading to record-breaking temperatures in 2014 and then again in 2015.

A U.N. pact negotiated in Paris in December set a goal of limiting warming to well below 2°C. However, plans to protect the climate by the world’s biggest polluters, including China, the U.S. and Europe, so far fall well short of the measures needed to achieve that goal.

Benefits of promoting soil health in agriculture: U.S. House hearing 2014

[Figure: Consequences of degraded soils]
[ At last, 10 years after I first published “Peak soil: Why biofuels destroy ecosystems and civilizations”, Congress held a hearing that educated House members on why preserving topsoil is so essential for food production for future generations. But no rules came out of this session to encourage or require farmers and ranchers to adopt proven techniques that would preserve soil for future generations.

James Harbach, a farmer in Pennsylvania, argued that preserving topsoil requires changing the incentives: “We need to have a premium structure that promotes soil building techniques, and, conversely, provides a disincentive for soil degrading practices. Taxpayers should not be on the hook for supporting production agriculture that exports more topsoil nutrients and soil carbon than actual crop products. The benefits of healthy soil need to be acknowledged in the regulatory process. We need regulatory agencies to recognize that well managed farms with healthy soils are the key to reducing agricultural problems. Our government programs need to motivate farmers to adopt soil health principles. Many do the opposite; they enable poor stewardship.”

A few quotes from this U.S. House hearing:

Glenn Thompson, Pennsylvania, Chairman: We owe it to future generations to do what we can today, and understand and recognize the importance of healthy soil.

Jason Weller, Chief, Natural Resources Conservation Service, U.S. Department of Agriculture:  In order to feed an additional two billion people over the next 40 years we are going to have to grow as much food in the next 40 years as we have, as a world civilization, over the last 500 years.

If you think back to before Magellan everyone knew the Earth was round, before he circumnavigated the globe, and all lifetimes and all the food that was grown over that time period, we are going to have to grow in half a lifetime the same amount of food to feed that surging world population, and we are having to do that on a smaller land base.

Just here in the United States, in the last 30 years, upwards of 43 million acres of land have been converted from agricultural lands to non-agricultural lands, and of those 43 million acres, a land area the size of the State of Washington, 14 million acres of that were prime soils, the most food productive soils on Earth, the size of West Virginia, paved over and converted to other uses.

Shanon Phillips, Director, Water Quality Division, Oklahoma Conservation Commission:  The importance of protecting our national soil resources, which were built over geologic time and heavily impacted through settlement and development of our continent, is relatively obvious as it relates to the promotion of a strong agricultural industry, which, in turn, is critical for a healthy national economy. Scientists estimate that as much as 60% of carbon has been lost from agricultural soils since the 1800s. This loss in organic matter affects a soil’s capacity to absorb and hold nutrients and water, which are critical for production of crops and livestock forage. Protection of our soil resources is also mandatory for protection of the nation’s water resources.

Erosion of soil particles, washing of compounds from the soil, and changes in soil structure which affect water infiltration are some of the most significant sources of water quality problems in the U.S. According to the U.S. Environmental Protection Agency, approximately 777,759 or 67% of impaired miles of U.S. streams and rivers and 9,794,360 or 40% of impairments to lakes, reservoirs and ponds are caused by pollutants related to soil erosion or leaching of pollutants from soils such as excess nutrients, sedimentation, turbidity (suspended particles), pathogens, and pesticides.

Alice Friedemann, www.energyskeptic.com, author of “When Trucks Stop Running: Energy and the Future of Transportation,” 2015, Springer]

House 113-23. September 18, 2014. The benefits of promoting soil health in agriculture and rural America. U.S. House of Representatives. 72 pages.

GLENN THOMPSON, PENNSYLVANIA , CHAIRMAN.  I want to welcome everyone to this hearing of the Conservation, Energy, and Forestry Subcommittee on the topic of healthy soils, which is critically important to American agriculture and strong farming communities. Congress has long recognized the importance of promoting soil health across the country, starting with the establishment of the Soil Conservation Service in 1935 as a permanent part of USDA. The need for this agency came in response to a persistent problem of soil erosion across the country, particularly in the Dust Bowl region. Now, the Soil Conservation Service, which eventually became the Natural Resources Conservation Service, plays an important role in preserving soil health across the country by providing producers with voluntary assistance in monitoring and assessing soil conditions on their land.

As our predecessors did for us in the past, we owe it to future generations to do what we can today, and understand and recognize the importance of healthy soil. The Earth’s population is projected to grow to roughly nine billion people by the year 2050. Given the growing demands on farmland everywhere, we must invest in the necessary resources, and best practices, to be certain that producers have the capacity to meet this growing need.

It is heartening to see how farmers, ranchers, and foresters across the country have made promoting the health and sustainability of soil a fundamental priority. For example, I see this all the time across the 5th District of Pennsylvania, where farmers are engaging in innovative practices, including no-till practices, cover cropping, and adhering to other best practices in order to preserve the nutrients in the soil. Additionally, it is important for us to remember that soil health is closely linked with water quality. In addition to the great work being done at the state and county levels, I am proud that so many of the farmers and foresters in Pennsylvania have taken voluntary steps to promote soil health in order to do their part to assist in the recovery of the Chesapeake Bay.

Whether it is protecting our drinkable water supply, keeping nutrients for the next crop year, or maintaining a supply of forage for livestock, there is no shortage of reasons why we must continue to innovate when it comes to promoting soil health.

JASON WELLER, CHIEF, NATURAL RESOURCES CONSERVATION SERVICE, U.S. DEPARTMENT OF AGRICULTURE, WASHINGTON, D.C.  Our renewed focus is a return to the past in some respects. Mr. Lucas referred to our almost 80-year history as an organization; we are coming back to our roots, literally, on soil. There is palpable excitement and energy, not only within NRCS, but also among our brothers and sisters in the Soil and Water Conservation districts across the United States, and among farmers and ranchers themselves. This approach to managing our soils as a living ecosystem, as well as the physical and chemical properties of the soil, is something that we are really excited about.

This is not about NRCS. Overall we play a small role in this movement. This is a broad coalition, from farmers and ranchers to land-grant universities, extension services, ag retailers, foundations, private individuals, nonprofit groups, you name it. There is a huge constellation of groups who are working on this topic of soil health. They are bringing tremendous innovation, new ideas, excitement, resources.

Soil health is the capacity of soil to function as a vital, living ecosystem that sustains plants, animals, and humans. So what we are doing when we focus on soil health is we are trying to protect that ecosystem, the soil ecosystem, to support the life within the ecosystem, as well as the other properties of the soils, ultimately to benefit life above the soil, the plants that grow the food and fiber we depend upon. And soil is really important because it provides some key functions that we rely on.

For example, healthy soil regulates the flow of water. When it rains or snows, or when the irrigation water is applied, what happens to that water? Does it leave the fields in runoff? Does it infiltrate in with the soil structure? As it gets into the soil structure, do the soils help to buffer and filter that water?

It provides a crucial component for helping to clean the waters. Soils also help cycle the nutrients that are applied to farm fields through fertilizers, manures, and other sources, as well as the nutrients that are available from the environment itself: decomposition of the plant matter, and deposition from the atmosphere.

A really critical function of soil is that it sustains life. At the end of the day, all life on Earth depends upon soil and the functions soils provide to grow the food, the fuel, and the fiber we need. Soil helps plants transform the energy from the sun itself; that conversion of the sun’s energy into energy that we can then use as life depends upon the soil medium.

Speaking about life, for many years our organization, and many in our culture, have been focused on the physical and chemical properties of soil. Some have never lost sight. Many farmers and ranchers know that the soil is alive. But we are increasingly becoming aware of what is actually happening in that ecosystem below the surface of the Earth, what is happening in the soils. And what we are learning is that the life in the soils is among the most diverse on Earth. There are millions of species of soil microorganisms, and billions of organisms in the soil itself that are all interrelated in an ecosystem, a web. They depend on each other, feed each other, transfer nutrients and energy between themselves. In just 1 teaspoon of soil, there are more microorganisms than there are people on Earth.

The bacteria that live in the rhizosphere, around the root structure, help feed nutrients into the plants, and in return the plants feed carbohydrates, literally sugar, to those bacteria, of which you could put 40 million on the head of a pin. An incredible array of life. That life includes bacteria, fungi, algae, arthropods and other insects, protozoa, and larger vertebrate animals, and they are all there working together to bring energy, and life, and food throughout this whole structure. And, at the end of the day, having a robust life force within the soil itself helps support the production of cropland, whether it is for food or just for vegetation for wildlife.

The role of those life forms is really critical. They help shred the biomass that resides in the soils. They help decompose that biomass to turn it into humus, a really rich organic structure of the soils themselves. They create micro-pores and macro-pores for water and air to infiltrate into the soil structure. They create room for the microbes and organisms to live. They help cycle nutrients out of the atmosphere, in the soil itself, and from fertilizers that are applied to the soils. They help make the soil more efficient in the processing of those nutrients. And, ultimately, they also help clean the water as it moves through the soil structure. So the organisms are crucial overall, not just to the soils, but to the health and quality of our environment.

In order to feed an additional two billion people over the next 40 years we are going to have to grow as much food in the next 40 years as we have, as a world civilization, over the last 500 years.

If you think back to before Magellan everyone knew the Earth was round, before he circumnavigated the globe, and all lifetimes and all the food that was grown over that time period, we are going to have to grow in half a lifetime the same amount of food to feed that surging world population, and we are having to do that on a smaller land base.

Just here in the United States, in the last 30 years, upwards of 43 million acres of land have been converted from agricultural lands to non-agricultural lands, and of those 43 million acres, a land area the size of the State of Washington, 14 million acres of that were prime soils, the most food productive soils on Earth, the size of West Virginia, paved over and converted to other uses.

So we have a massive challenge to grow food, we have less land to do it on, how are we going to do it?

This is a call to action to help support farmers and ranchers to begin investments today in improving and protecting the vibrancy and the health of their soils so they can be sustainable and grow that food and fiber, not just maintain yield, but boost yield, for decades to come.

Here are some of the benefits of healthy soil:

  1. Healthy soil helps improve water infiltration.
  2. When it rains, water doesn’t run off the field; it gets into the soil.
  3. It improves the water holding capacity of the soil.
  4. By increasing the organic matter and the porosity of the soil, you get more water in the root zone, where the crops can get at that water.
  5. Healthy soil helps improve water quality, protects our streams, rivers, and aquifers.
  6. It increases the nutrient availability of fertilizers, manure, and decomposing biomass.
  7. It helps cycle nutrients, makes them available again to grow food.
  8. It helps save energy. Producers can be more efficient with their use of their farm equipment and irrigation pumps.
  9. It helps save wear and tear on their equipment, which, at the end of the day, saves them money.
  10. It helps improve the health of the plants, the crops themselves.
  11. It makes crops more drought resistant.
  12. It makes crops more tolerant of high water events.
  13. It makes crops more resistant to pests and disease.

We view soil as a living factory, and when that factory is optimized, when you have all those critters working together, helping to feed the crops, you can then optimize the yield coming off those crops, off the farm fields. As a producer put it, anything can have quality, but only living things can have health. And that is what we are focused on: the health, and how you nurture the health of the microorganisms, the ecosystem below the surface of the soil. We have four basic principles when we talk about soil health from a macro perspective when we work with a farmer or rancher.

  1. Minimize disturbance of soil: the physical, the biological and the chemical
  2. Maximize the diversity of the plants in the soil. As we have learned from ecologists, ecosystems that are diverse in their populations are more resilient to stress, to drought, to pests, to disease. The more diverse the crops, the more diverse the microorganisms in the soil.
  3. Keep your soils covered as much as possible to protect them from the erosive effects of wind and water.
  4. Have living roots in the soil for as long as possible. Instead of a bare field, keep living roots in the soil to capture the energy of solar radiation and feed the organisms in the soil for as long as possible.

NO-TILL.  One of the key practices is no-till, which leaves last year’s crop residues on the soil; at planting time, seeds are drilled into the soil. This addresses two of the key principles above: minimizing disturbance of the soil and maintaining a residue on the soil. We estimate that across the U.S. there are about 67 million acres of cropland in continuous no-till. That is nearly a quarter of the cropland.

In terms of avoiding carbon lost to the atmosphere, and keeping the carbon in the soils where it helps protect the ability of the soils to grow crops, about 8.8 million metric tons of CO2-equivalent emissions are avoided by keeping the carbon in the soil. That is equivalent to the emissions from burning 990 million gallons of gasoline, or from 1.9 million passenger vehicles per year; that carbon is instead being kept in the soils, helping to protect crops and maintain the organic content of the soils.
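
A hedged check of those equivalencies, using EPA-style conversion factors that are assumptions here (roughly 8.887 kg CO2 per gallon of gasoline and about 4.6 metric tons CO2 per typical passenger vehicle per year):

```python
# Hedged check of the gasoline and vehicle equivalencies quoted above. The
# conversion factors are assumptions (EPA-style figures of ~8.887 kg CO2 per
# gallon of gasoline and ~4.6 metric tons CO2 per passenger vehicle per year);
# the 8.8 million metric ton figure comes from the testimony.
avoided_co2_tonnes = 8.8e6            # metric tons CO2-equivalent kept in the soil
kg_co2_per_gallon_gasoline = 8.887    # assumed combustion factor
tonnes_co2_per_vehicle_year = 4.6     # assumed typical passenger vehicle

gallons = avoided_co2_tonnes * 1000 / kg_co2_per_gallon_gasoline
vehicles = avoided_co2_tonnes / tonnes_co2_per_vehicle_year
print(f"{gallons / 1e6:.0f} million gallons of gasoline")  # ~990 million
print(f"{vehicles / 1e6:.1f} million passenger vehicles")  # ~1.9 million
```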

So we view no-till practices as one of the greatest approaches to improving the health of the soils. No-till has a number of benefits.

  1. It protects those soils from the erosive power of wind and water.
  2. It helps shade out weeds.
  3. It keeps the soil cool. When soil is bare in summer heat, you can actually cook and kill the microbes in the soil. Bare soils also get dried out, which stresses the plants.
  4. No-till mulch can serve as the biomass that fungi, mites, and other critters decompose into organic matter. So that also serves as a feedstock to help boost the organic matter in your soils, which boosts the water holding capacity of your soils.
  5. More organic matter means more water. You create a reservoir in your fields to hold water when it rains or snows, or when you irrigate. A rough rule of thumb: for every one percent increase in organic matter, you increase the water holding capacity of an acre by 25,000 gallons of water. So soil health helps capture water when you irrigate or when it rains, creating more drought resiliency for your crop fields (see the quick calculation after this list).
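
To put that rule of thumb in context, here is a small sketch; the acre and gallon conversion factors are standard, and only the 25,000-gallon figure comes from the testimony:

```python
# Context for the 25,000-gallon rule of thumb. The acre and gallon conversion
# factors are standard; the 25,000-gallon figure is the rule of thumb quoted
# in item 5 above.
sqft_per_acre = 43_560
gallons_per_cubic_foot = 7.48052

gallons_per_acre_inch = sqft_per_acre * (1 / 12) * gallons_per_cubic_foot
print(f"one acre-inch of water: {gallons_per_acre_inch:,.0f} gallons")  # ~27,154

rule_of_thumb_gallons = 25_000  # per acre, per 1% increase in organic matter
print(f"stored rainfall equivalent: {rule_of_thumb_gallons / gallons_per_acre_inch:.2f} acre-inches")  # ~0.92
```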

Without no-till, there is nothing to stop tons of good topsoil from washing off the field, along with water and fertilizer that would otherwise have penetrated the field and been stored underground for a long time. Replacing that water and those nutrients later on costs the farmer money and energy.

Another key practice that we use is cover crops. [In his presentation he has a slide of a rye crop growing between corn, flattened down before it was harvested, acting as a mulch, with the root system below providing food for microorganisms, keeping soil from washing away, and so on.]

What does this mean for actual production of food? If you remember, back in 2012 we had one of the worst droughts in half a century, which impacted almost every state. So some partners at the Sustainable Ag Research and Education Program at USDA, as well as the Conservation Technology Information Center, surveyed 759 producers in 7 Central Midwest states and asked them if they used cover crops, and if so, what happened to their yield. We learned that cover crops improved corn yield by 11% compared with folks who didn’t use them as part of the rotation. And for soybeans, it was even higher — an over 14 percent increase in yield.

In addition to survey information from producers, NRCS has learned of positive soil health results directly from individual producers. For example, Steve Groff farms 225 acres in Lancaster County, Pennsylvania, where he grows corn, soybeans and small grains, as well as pumpkins. Through more than 30 years of using no-till and multi-species cover crops, Mr. Groff reports that he has increased his soil organic matter from two percent to almost five percent, and has obtained yields that exceed local averages by ten percent.

Gabe Brown, who farms about 2,000 acres near Bismarck, North Dakota, keeps soil covered with dense, diverse plants and cover crops, while also integrating livestock into his soil health management system. Mr. Brown reports that he has more than doubled his soil’s organic matter content, and these healthy soils have resulted in higher than county-average yields.

Ray Styer, who grows corn silage and multi-species cover crops on 80 acres in Rockingham County, North Carolina, reports that he has more than tripled his soil organic matter and obtained yields that are 4 tons per acre above the county average.

In Carroll, Ohio, Dave Brandt farms a corn-soybean-wheat rotation on 1,500 acres. For more than 35 years, he has used a soil health management system with no-till, diverse cover crop mixes, and crop rotations; and has increased his soil’s organic matter from 2% to over 5%. Even during the drought of 2012, Mr. Brandt reported that he averaged 170 bushels of corn per acre, which was nearly twice the yield of his conventional farming neighbors.

At the end of the day, in my view, cover crops are one of the best risk management tools we can offer a farmer or rancher from our quiver of conservation practices, but this is a great risk management tool. It helps not only maintain yield, but also protects farmers in periods of stress.

Another core practice is nutrient management – how you manage the nutrients of your soils. There are the nutrients you apply through fertilizer, manure, and poultry litter, and the available nutrients that are left over from last year’s application of fertilizers. There is the biomass, the residue that is left over from the crops. There is the deposition of nitrogen from the atmosphere, and how the microbes incorporate those nutrients. So a nutrient management plan, using the 4R’s of the right source, at the right rate, at the right time, and in the right place, is part and parcel of an effective soil health management system. This is something NRCS is working on very closely with foundations, the research community, certified crop advisors, institutes, you name it. There is a broad array of folks who are working with producers to incorporate effective nutrient management into soil health systems.

Improving soil health on range and pasture.  We talked about how tillage is a disturbance to the soil. Overgrazing also is a disturbance of the soil, a biological disturbance. When you overgraze, you are stressing the plants. You actually have smaller root mass below the soils. You get more opportunity, then, to heat the soils, to cook those microbes, to create drought stress in the plants, and to provide more opportunity for weed and invasive species penetration.

The best practice is rotational grazing, so livestock don’t overgraze. The plants that aren’t grazed have roots that go down several feet or more, which create pores for water to infiltrate, give nutrients better access deeper into the soil profile, and provide more habitat for microbes. So at the end of the day, rotationally grazed soils are going to be more drought resistant and more resistant to invasive species and weeds. And so, overall, you are going to have a more sustainable yield of forage off those lands than you would off a continuously grazed system.

At NRCS, because of our focus on soil health, I wanted to share something we are very excited about, and it is a big deal. For the first time that I am aware of, anywhere on Earth, we have actually mapped out the soil carbon stock for a continent. Over the last couple of years we completed a survey across the United States. And then, using our knowledge of soils, we mapped out the current stock of organic matter across the United States.

Why is this important? A conservation planner, an agronomist, or a farmer wants to understand: what is my organic matter content? Now that we have that baseline understanding of the soils’ organic content, we also know the carbon carrying capacity for those soils. Then we can prescribe the most effective soil health management system to not only protect the organic matter, but also boost the organic matter in your soils.

Ricky and Russell Wiggins are cotton and peanut farmers. They farm about 2,700 acres in Alabama. When they started on the soil health journey, their soil organic matter was around 0.75 percent, less than one percent. They decided to try conservation tillage systems using high-residue cover crops, and left rye grass in the fields to provide high residue to protect their soils and build organic matter. Over several years they have increased their soil organic matter to over 3%, making their soil easier to work with, more pliable, and easier to plant peanuts in. They burn less fuel and replace the tips of their plows less often. Their terraces need less maintenance. When it rains, the terraces flow clean, so there is less maintenance in their ditches and water canals. So overall it is saving them money, saving them wear and tear, saving them fuel. And, in periods of drought like they experienced a couple years ago, when other producers in their area were not able to plant because the soils were dry, they could plant. They got a crop in the ground and got a harvest, because of their approach to soil health.

Weather resilience in soils has always been important and will continue to be even more so as we work to improve our natural defenses against climate change and extreme weather, such as extended droughts and severe storms, as well as indirect effects such as changing threats from pest populations and plant diseases. Healthy soils will be a key component for agricultural producers to successfully adapt to these challenges and will help ensure that we can continue to meet the food demands of a growing population.

Our Conservation Effects Assessment Project, which has now evaluated conservation impacts covering over 300 million acres of cropland, has estimated that the same practices we use to enhance soil health—such as no-till, cover crops, and crop rotation—have reduced edge-of-field sediment loss by 47–73%, phosphorus loss by 33–59%, and nitrogen loss in runoff by 35–58%.

Yet, there is more to be done. Events like drought in Texas and California and algal blooms in Lake Erie and Lake Champlain, coupled with the need to meet the demand for food, fiber, and fuel for a growing population, tell us the time is now to enhance the health of our nation’s soils.

I will conclude by saying that I believe improving the health of our nation’s soils is one of the most important things that we can do for this and for future generations. That is because improving soil health not only supports growing the food, fiber, and fuel needed by a rapidly expanding world population, but it also allows us to simultaneously address some of our nation’s most pressing natural resource needs. It allows us to increase resiliency to extreme weather events, improve water quality, increase carbon sequestration, enhance habitat for pollinators and other wildlife, increase farm profitability, and we believe also reduce economic risk associated with crop production.

The CHAIRMAN.  In some cases, it is folks who want to give it a try, but they are leery of the cost, or of the risk it may entail. It is a transition to a different cropping system. We can help offload some of that risk, and allow them to experiment with different nutrient management approaches, cover crops, and tillage practices to help them get up to speed. I think we are well equipped for it.

Mr. LUCAS, OKLAHOMA. From first arrival of Europeans until a century ago, the concept of mining the soil existed. It was a resource that you utilized, then you moved on. Starting in the 1930s, perhaps, with a particular focus in my region and the southern, the Great Plains, and the east side of the Rockies, and the evolvement of the Soil Conservation Service, the predecessor to the NRCS, the focus began to shift that this was something that was not to be used and thrown away, but it was to be truly nurtured. And now we are apparently taking the next step, so progress is a positive thing, and we are in that direction.

BOB GIBBS, OHIO.  In northwestern Ohio the soils are heavier; in some parts it is really heavy. Is no-till hard to adapt in those heavy soils, and does it work as well?

Mr. WELLER, chief of NRCS.  It is my understanding that it is effective, that folks are using no-till even in those heavier soils. It is perhaps a different management approach, and it takes some adjustment, but it is something that can be adapted, even in a heavier soil environment.

The CHAIRMAN.  We are not 100 percent on board with some of these practices. They haven’t really been embraced. Why do you think there is resistance? It is not new science. It is old science that has resurfaced. What do you see as the barriers to enlisting more farmers and ranchers?

Mr. WELLER. The old adage, don’t fix what is not broken, if they can get a crop, they are making a living, they are doing okay, why introduce what could potentially be a risk to trying something different?

JOHN LARSON, CHIEF EXECUTIVE OFFICER, NATIONAL ASSOCIATION OF CONSERVATION DISTRICTS, WASHINGTON, D.C.   NACD is a nonprofit organization that represents America’s 3,000 conservation districts, their state and territory associations, and the 17,000 men and women who serve on their governing boards.  Districts are the local government part of the conservation delivery system, and work with millions of cooperating land owners and operators to help them manage and protect land and water resources on all private lands, and many public lands, in the United States, utilizing that voluntary, incentive-based approach. I like to think of the conservation districts as the original pioneers of soil health. Soil health is, and has been, one of the top priorities of conservation districts across the nation since their creation in the 1930s. In fact, soil health is the very reason that districts were created.

Long term nationwide conservation and production practices have resulted in better protection of our precious soil and water resource base, the foundation of our nation’s food supply. Conservation districts play a key role in this process by working with local producers and land owners to implement critical conservation practices on the ground. In Indiana, districts are key members of the multi-partner Conservation Cropping System Initiative that has vaulted that state to a leading position in the soil health movement. In North Dakota, the Burleigh County Conservation District adopted soil health as its major focus 20 years ago, and today national and international visitors have come to the district for soil health tours and workshops. Other districts are renting no-till drills, supplying cover crop seed, facilitating farmer-led soil health advocates, providing no-till test plots, and much, much more.

Through these and other efforts, conservation districts across the nation are helping producers and land owners get the tools that they need to continue caring for the land and provide food, feed, fiber, and fuel for the world. We firmly believe that it is better to invest in long term conservation practices today than to be forced to pay the escalated cost of repair in the future.

The benefits of improving soil health reach far beyond the farm. Healthy soils lead to higher water quality, by allowing for better nutrient cycling and reducing sediment runoff; a better ability to manage water and reduce flood damage; and an increase in the amount of carbon the soil itself sequesters. Nutrient management for soil fertility within a productive and healthy cropping system is another soil health practice that is assisting producers. As the Chief mentioned, the 4R’s nutrient stewardship approach, in which producers apply the right source of nutrients at the right rate, at the right time, and in the right place, fits perfectly into soil health.

While we are seeing improvements nationwide in both the recognition of and the need for the adoption of best management practices for soil health, there is still work to be done. Specifically, we see five main areas of need for the future: (1) developing specific soil health conservation practice criteria; (2) increasing soil health research, both scientific and economic; (3) training NRCS, conservation district, and other partner employees; (4) ensuring farm bill programs facilitate healthy soils adoption; and (5) communicating the benefits of soil health to both agriculture and urban audiences.

World population is expected to hit nine billion by 2050. We believe that the widespread adoption of soil health practices is what will make us successful in meeting that need. If we act now, we have the chance to make a difference on the land that will last for generations.

In the early 1930s, along with the greatest depression this nation ever experienced, came an equally unparalleled ecological disaster known as the Dust Bowl. Following a severe and sustained drought in the Great Plains, the region’s soil began to erode and blow away, creating huge black dust storms that blotted out the sun and swallowed the countryside. Thousands of “dust refugees” left the black fog to seek better lives. But the storms stretched across the nation as soil blown from the Great Plains reached east to New York. Dust even sifted into the White House and onto the desk of President Franklin D. Roosevelt. On Capitol Hill, while testifying about the erosion problem, soil scientist Hugh Hammond Bennett threw back the curtains to reveal a sky blackened by dust. Congress unanimously passed legislation declaring soil and water conservation a national policy and priority, and creating the Soil Conservation Service to fight erosion. Because nearly 3/4 of the continental United States is privately owned, Congress realized that only active, voluntary support from landowners would guarantee the success of conservation work on private land.

As many of you will remember, 2 years ago, our nation experienced a drought of proportions we haven’t seen since the 1930s and 1950s. However, despite this extreme drought, we didn’t enter into a modern-day Dust Bowl situation. There’s a good reason for that—and it’s something that all of us in the conservation community can be proud of: careful, long-term nationwide conservation and production practices that started mainly in response to the Dust Bowl of the 1930s. The implementation of these practices has resulted in better protection of our precious soil and water resource base—the foundation of our nation’s food supply.

“Soil health” is defined as “the continued capacity of soil to function as a vital living ecosystem that sustains plants, animals, and humans.” Healthy soil ecosystems allow for increased water infiltration, improved water-holding capacity, enhanced nutrient cycling and sequestration, and increased biodiversity. Historically, soil management activities focused on the physical and chemical functions of the soil. Today’s emphasis on soil health recognizes the critical importance of biological function in the soil. “Soil Ecology” emphasizes that soil is a living ecosystem. This ecosystem is impacted by chemical (e.g., fungicides), biological (e.g., monocultures), and physical (e.g., tillage) disturbances that diminish soil function. There are four key management principles to improve soil ecosystem function: (1) minimize chemical, biological, and physical disturbance of the soil; (2) keep the soil covered as much as possible throughout the year; (3) maintain a living root, growing for as long as possible, to feed the soil microbes and transfer more solar energy into the soil; and (4) increase crop diversity above ground to add biological diversity to the soil. These basic management activities are central to improving soil health.

The benefits of improved soil health reach far beyond the farm. Healthy soils lead to higher water quality, by allowing for better nutrient cycling and reducing sediment runoff; a better ability to manage water and reduce flood damage; and an increase in the amount of carbon sequestered in the soil itself. Due to its increased water-holding capacity, healthy soil is more resilient against drought; it is also naturally less prone to disease and pest problems, thereby allowing farmers to optimize their use of crop protectants. And because healthy soil requires fewer petroleum-based products for tillage it also saves on energy use and costs.

SHANON PHILLIPS, DIRECTOR, WATER QUALITY DIVISION, OKLAHOMA CONSERVATION COMMISSION, OKLAHOMA CITY, OK. I think many of us appreciate how critical soil health is towards a strong agricultural industry, but, unfortunately, fewer people seem to appreciate how critical it also is towards supporting and protecting the nation’s water supplies.

And we have already talked about how soil health can reduce water pollution, because healthier soils have greater infiltration rates, which means there is less runoff of pollutants that then enter our nation’s waterways. Healthier soils require less supplemental fertilization, which also means fewer opportunities for nutrient pollution of our nation’s waterways. And, finally, healthy soils are living soils, which promote a multitude and variety of microbial communities that break down some pollutants into compounds that are less problematic when they enter our waterways. And we know, from data that states have provided to the U.S. EPA, that at least 60% of the impairments to our nation’s water bodies are related to pollutants that come from soils.

Water bodies are recognized as being impaired when they are not meeting the Clean Water Act goals, which means that they are not fishable, they are not safe for swimming, or don’t provide safe drinking water. Toledo this summer  had to turn off the taps due to toxic algae blooms, which are happening all over the nation, from New York State, to Wisconsin, to Oregon, down to Texas, Kansas, and Oklahoma. A July fourth holiday bloom in 2011 in Grand Lake, Oklahoma dramatically impacted the local community and made the news for sickening one of our Senators. At least 38 waterbodies in New York had suspected or confirmed blue-green algae blooms this summer, and toxin production above safe levels was confirmed in at least seven of those systems. Algae blooms occur and persist when a waterbody receives more nutrients than it can naturally assimilate. These excessive nutrients are often related to soil erosion and the washing of pollutants from land surfaces. Agriculture, although certainly not the only source, is one of the most significant sources of nutrients in the U.S. The good news is that we know and have demonstrated how to reduce these nutrient and sediment-related impacts from agriculture. These successes have been demonstrated all over the nation and many of them are chronicled on the EPA Nonpoint Source Success Story Website at: http://water.epa.gov/polwaste/nps/success319/. This website highlights at least 508 waterbodies across the nation where water pollution problems have been solved.

Oklahoma is one of the most successful states in the nation at demonstrating how these voluntary conservation programs bring a partnership of local landowners, conservation districts, USDA, NRCS, and FSA, and the State Conservation Agency together to implement conservation practices and solve water quality problems. One of the reasons we have been so successful in Oklahoma is that we have also made EPA a part of that partnership. I recognize that that makes a lot of people very nervous, but what we are doing with EPA is utilizing their funds from the Section 319 Clean Water Act program to support water quality monitoring. We are using their technical support to design that water quality monitoring, and we are using that data to prove to EPA and others that these conservation programs not only assist farmers in maintaining their operations, but also solve water quality problems without additional regulation.

The importance of protecting our national soil resources, which were built over geologic time and heavily impacted through settlement and development of our continent, is relatively obvious as it relates to the promotion of a strong agricultural industry, which, in turn, is critical for a healthy national economy. Scientists estimate that as much as 60% of carbon has been lost from agricultural soils since the 1800s. This loss in organic matter affects a soil’s capacity to absorb and hold nutrients and water, which are critical for production of crops and livestock forage. Protection of our soil resources is also mandatory for protection of the nation’s water resources.

Erosion of soil particles, washing of compounds from the soil, and changes in soil structure which affect water infiltration are some of the most significant sources of water quality problems in the U.S. According to the U.S. Environmental Protection Agency, approximately 777,759 or 67% of impaired miles of U.S. streams and rivers and 9,794,360 or 40% of impairments to lakes, reservoirs and ponds are caused by pollutants related to soil erosion or leaching of pollutants from soils such as excess nutrients, sedimentation, turbidity (suspended particles), pathogens, and pesticides.

Most of these successes relied on voluntary conservation programs to help states and local partners clean up waterbodies affected by pollution which resulted from soil erosion or the washing of pollutants from the soil. Nonpoint source pollution results when rainfall or snowmelt washes pollutants off or out of the land and into streams. It is much more difficult to measure or control than point source pollution, which is generally thought of as pollution from a defined source, such as a pipe at a waste-water treatment plant. In states like Oklahoma, where the majority of land is privately held and used for agricultural production, conservation programs to protect and reduce the impacts from agriculture have been very successful.

Without the EPA partnership, there would not be a Nonpoint Source Program in Oklahoma, nor would there be any documented Nonpoint Source Success Stories. We would not be able to prove that voluntary programs can successfully address water quality problems on our agricultural lands because our limited state and Federal funds from other programs are focused on other purposes. Finally, the EPA oversight and technical support for the 319 program is both beneficial for the overall program and critical toward legitimizing program results.

JAMES HARBACH, FARM MANAGER, SCHRACK FARMS, LOGANTON, PA.  I am very fortunate to have been part of agriculture for more than 40 years. On our operation, I have witnessed the transition from conventionally plowed ground to no-till. Some of our fields have not been plowed for 40 years. We have seen firsthand the transformation of our soils, and the positive results when you farm in nature’s image. In the last decade, with the addition of cover crops, and the belief that plants feed the soil, instead of soil feeding the plants, we have seen incredible results. Some examples include organic matter increases of one percent in 3 years, and steady state infiltration rates that average 4.5 inches per hour. I have no fancy degrees, no financial incentives to be here today, and I don’t enjoy public speaking, but I have a passion for our soils, and the land around the world. I am not an organic farmer, although we no longer use insecticide or fungicides, and only a fraction of the herbicides and fertilizers that we once applied. I used to be part of the group of traditional thinking farmers, but by attending national conferences, field days, and visiting open-minded farmers around the country, I now have an understanding of the important symbiotic relationships that are achieved when you farm in nature’s image. Our farm is part of a like-minded nationwide soil health community that believes that soil health holds the answers to all of our problems.

Agriculture today is farming a degraded resource, and has accepted this as normal. Despite our best efforts, our soils have lost the ability to effectively absorb rainwater, are void of biological life, and are depleted of nutrients. Our soils are so degraded that we must rely on industrial inputs to keep our farmlands productive. We now have a broken water cycle as a result of a broken carbon cycle. The loss of soil organic matter has contributed to carbon dioxide levels in the atmosphere because we have robbed the soils of their carbon. Soil organic matter has many, many functions: water infiltration, water holding capacity, groundwater recharge, and the ability to cycle and store nitrogen along with other nutrients.

On our farms we no longer have water leaving our fields. Even with 4 to 4.5 inches of rainfall, we don’t see any erosion.

Conservation programs have historically reacted to resource concerns, instead of being proactive to address the source of the problem. We need to start promoting proactive conservation, instead of reactive conservation. NRCS has embraced soil health as one of their core programs. It is a good start, but what we need is a mammoth soil health education program to teach farmers, Federal and state agencies, regulators, universities, children, and the general public. Farmers need to understand how the soil functions before they will value it as a resource, but government programs need to motivate farms to adopt soil health principles.

We need to have a premium structure that promotes soil building techniques, and, conversely, provides a disincentive for soil degrading practices. Taxpayers should not be on the hook for supporting production agriculture that exports more topsoil nutrients and soil carbon than actual crop products. The benefits of healthy soil need to be acknowledged in the regulatory process. We need regulatory agencies to recognize that well managed farms with healthy soils are the key to reducing agricultural problems. Our government programs need to motivate farmers to adopt soil health principles. Many do the opposite; they enable poor stewardship.

More independent, government-funded studies need to be conducted on the effects of fertilizers, herbicides, GMOs and pesticides on the soil community and human health. We cannot rely on industry to fund these studies and produce unbiased results. Each state needs to have long-term no-till farms that exhibit improvements in soil health. These farms need to be central in soil health research and education programs. Soil health farms need to monitor improvements in profitability, water infiltration and retention, soil organic matter increases and soil generation. Can agriculture sequester enough soil carbon to make a measurable difference in atmospheric CO2 concentrations? In the book Cows Save the Planet and Other Improbable Ways of Restoring Soil to Heal the Earth, Dr. Christine Jones states that every 1 ton increase in soil organic carbon represents 3.67 tons of CO2 sequestered from the atmosphere. Can healthy soils significantly reduce rainwater runoff? Do healthy soils leak nutrients, or does this only occur in poorly structured and poorly managed soils? Changing weather patterns are linked to soil management. Bare, exposed, dry soils put more heat into the air and change flow patterns above the fields. Bare soils do not cycle the water, lowering the ability of the plants to contribute to local moisture. New soil testing technologies like the Haney Soil Health Tool and Solvita CO2 Burst that measure biological life and nutrient availability need to be promoted and incorporated into crop nutrient recommendations. What will motivate farms to achieve good soil health and increase soil organic matter—regulations, or education and a better understanding?
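
The 3.67 figure attributed to Dr. Jones follows directly from the molar masses of carbon and carbon dioxide; a quick check:

```python
# Check of the 3.67 figure attributed to Dr. Christine Jones: converting a mass
# of soil organic carbon (C) into the mass of CO2 it represents uses the ratio
# of molar masses, CO2 / C.
molar_mass_c = 12.01                              # g/mol
molar_mass_o = 16.00                              # g/mol
molar_mass_co2 = molar_mass_c + 2 * molar_mass_o  # 44.01 g/mol

print(f"tons of CO2 per ton of soil carbon: {molar_mass_co2 / molar_mass_c:.2f}")  # ~3.66, the ~3.67 cited
```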

If you promote soil health principles, be prepared for a huge pushback from the agriculture industries that sell products to farmers. Once a farm restores healthy soils, few of these products are needed, and that will reduce industry sales. Farmers who contract with NRCS and are given incentive payments for installing practices (EQIP, CREP, CRP, WRP) should be required to attend soil health trainings and education programs.

TIMOTHY J. WALZ, MINNESOTA.  In 2012, average corn yield was 126.2 bushels where cover crops were employed, and 115 bushels without them. So we know that cover crops work.
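
The percentage difference implied by those figures (derived here, not stated in the hearing; per-acre yields are assumed, as is conventional for corn):

```python
# Percentage implied by the bushel figures above; the derived numbers are not
# stated in the hearing, and per-acre yields are assumed (the convention for corn).
yield_with_cover_crops = 126.2     # bushels per acre, 2012, with cover crops
yield_without_cover_crops = 115.0  # bushels per acre, 2012, without cover crops

difference = yield_with_cover_crops - yield_without_cover_crops
print(f"difference: {difference:.1f} bushels per acre")                                    # 11.2
print(f"relative increase: {yield_with_cover_crops / yield_without_cover_crops - 1:.1%}")  # ~9.7%
```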

JILL L. SACKETT, EXTENSION EDUCATOR, AGRICULTURE PRODUCTION SYSTEMS, UNIVERSITY OF MINNESOTA EXTENSION REGIONAL OFFICE, MANKATO, MN.  I started my professional career with cover crops basically the minute I started my professional career with Extension. I did my best to hit the ground running, and, thankfully enough, I had some really good resources out there on a national level, like SARE, on a regional level, like the Midwest Cover Crop Council, and then probably about 20 years off and on of cover crop research from the state, some great soil and water conservation districts, and some wonderful farmers. At that time, adoption rates of cover crops were very low. But we all forged ahead, and we were hosting education events. We were doing demonstration plots, and we were working with some amazing farmers that had innovative ideas. We quickly figured out that there are two main focuses for why Minnesota farmers are interested in cover crops, the first of which is soil health, which is why we are here today. Now, when you ask a farmer, why are you interested in trying cover crops, the phrase soil health may not actually be what comes out of his or her mouth, but what they do describe definitely makes up quality, healthy, productive soils. They share with us that they would like to see a decrease in soil erosion, an increase in soil organic matter, an increase in natural nutrient cycling. They want to see more water infiltration. During drought, they want to be able to have higher water holding capacity in that soil, all of which are part of what makes a soil healthy.

The second point, and not surprising for the Land of 10,000 Lakes, is water quality. What we noticed in Minnesota the last few years, unfortunately, is that we tend to have far too much water in the spring, and far too little water in the summer and fall. Cover crops are one of the few practices that actually allow us to deal with both of those issues. The use of cover crops allows us to take up excess water, to take up excess nutrients, and the roots help hold our soil in place, all of those things we want in our fields, where we need them most. On the flipside, when it comes to drought, a living plant can actually shade the soil, a dead cover crop can mulch the soil, both of which help decrease soil evaporation.

This brings us to an excellent story that shows the interrelationship of cover crops, soil health, and water quality. In 2013 it wouldn’t stop snowing and raining, and a lot of our farmers in southeast Minnesota weren’t even able to plant their cash crops. Some of them decided to plant cover crops, many for the first time ever, in order to keep their soil in place, and to keep any nutrient that they had put on that soil in that field. The farmer in question here decided to use oilseed radish, and in September he called and said, I don’t know what to do. It is really dry, and yet these plants are growing robustly. The roots are 3 inches in diameter, the leaves are bushy and about 2.5 feet tall. What do I do? My neighbors are telling me that I need to just plow it under and get rid of it. My NRCS person is telling me, no, leave it be, you need to protect that soil. He had gotten my phone number from his soil and water conservation district, and he gave me a call because he knew I had had some experience with oilseed radish. And, in my experience, the chemical makeup of this particular plant allows it to decompose quickly in the spring, so I encouraged him to not listen to his neighbors, and listen to his NRCS agent instead, and leave it be. And that is where we left our conversation. This June I was pleasantly surprised to find an e-mail from him, where he told me that the oilseed radish had basically dissolved by planting time, and that the soil conditions were some of the best he had ever seen in his 41 years of planting. But my favorite line from the e-mail was, “We showed them.”

I had the expertise of the Midwest Cover Crop Council; local research from University of Minnesota, Minnesota Department of Agriculture, and USDA Agricultural Research Service; and a few experienced Soil and Water Conservation District personnel and farmers to guide me in my efforts. Some of these resources already had 15 or more years of cover crop experience. In the beginning there was some polite skepticism, some eye-rolling and, in some cases, actual sleeping in the back row. But, I can honestly say there was also genuine interest in the message I was sharing.

Minnesota is known as the ‘‘Land of 10,000 Lakes.’’ We actually have 11,842. And, we’re home to 6,594 rivers and streams. All told, we have just over 13 million acres of surface water. ‘‘Water, water, everywhere,’’ and yet we also have had to deal with our fair share of drought conditions. Too often lately, there are places in Minnesota that deal with flooding in the spring and then drought in the summer and fall. Needless to say, Minnesota knows water, and we’re well aware of how important it is to our 26 million acres of agricultural land as well as to our drinking needs and recreational activities. Research and common sense show that having growing plants on the land as long as possible helps to use excess water and nutrients and also helps keep the soil in place. During dry periods, the shading action of a green plant or the mulching action of a dead plant can help decrease soil water evaporation. With their potential to assist in water quality and quantity, cover crops are definitely starting to draw attention.

Cover crops are only a piece of the puzzle, however. We also need to see an increase in conservation tillage practices like strip till or no-till; additional crops in our rotation instead of only one or two; and an increased use of best management practices.

Mr. LUCAS. What are the greatest challenges? Is it lack of cooperation between entities back home? Is it persuading the very traditional farmers to try something new? Just what are your chief challenges in moving your message forward?

Mr. HARBACH. We are very active in Pennsylvania with the PA No-Till Alliance, and we spend a lot of our time on education, seminars, and field days. And the biggest problem we have is just getting people to understand. There is a lack of understanding. I live in a valley with a large Amish community, and they are not ones to come out to this kind of stuff, so reaching them is difficult. But, basically, it is just the lack of understanding from the farmers’ perspective. It took me several years to get where I am today, and it is really hard to expect a farmer to get there overnight. It is going to take a long time to get that level of understanding.

Mr. LUCAS. In my hometown, we had 14,000 people on the Census roll in 1930. After the Dust Bowl of the Depression era and the droughts of the 1950s, people left, and we are not quite back to 4,000 people yet, so the quality of the soil will impact the ability of your citizens, your fellow neighbors, to create a livelihood. You can see many ulcerations if you fly over much of my part of western Oklahoma, in spite of all of the decades of effort.

The CHAIRMAN. You mentioned that you no longer use insecticides and fungicides, and only a fraction of the herbicides and fertilizer that you once used. What are the direct benefits or challenges you see with using so much less fertilizer and herbicides?

Mr. HARBACH. It is not really a challenge. Initially it is, to make that first big step, but there is so much research done. And I have to say, from ARS, there are a lot of good things coming out of there, because that is research that is not industry funded, and we need to increase that. By not using insecticides or as many chemicals, it is a systems approach. Once you achieve soil health, you have the beneficial bugs that take out the critters that you would otherwise apply an insecticide, or maybe a fungicide, for. Fungicides are very hard on soils, and we need to get people to understand that once you have achieved a certain level of soil health, some of these other input costs can go away.

CHRIS JAHN, PRESIDENT, THE FERTILIZER INSTITUTE. The Fertilizer Institute (TFI) is the leading voice of the fertilizer industry, representing the public policy, communication and statistical needs of producers, manufacturers, retailers and transporters of fertilizer. The Institute’s members play a key role in producing and distributing vital crop nutrients, such as nitrogen, phosphorus and potassium, which are used to replenish soils throughout the United States that in turn produce healthy and abundant supplies of food, fiber and fuel. Commercial fertilizers have a critical role to play in boosting crop production to the levels necessary to meet the demands of this rapidly growing world population. Crop nutrients such as nitrogen, phosphorus, potassium, and secondary and micronutrients such as calcium, zinc and iron are responsible for between 40 and 60% of today’s total food production. For a high yielding corn production system (250 bu/acre), a soil with 2.5% organic matter could provide 20% of the annual recommended nitrogen. The average estimate of available nitrogen, used by agronomists, is 20 pounds of available nitrogen for every 1% of organic matter.
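As a rough check on those numbers (the ~1 pound of nitrogen per bushel guideline below is my own illustrative assumption, not part of TFI’s testimony), the arithmetic can be sketched like this:

    # Hypothetical back-of-envelope check of the organic-matter nitrogen credit quoted above.
    om_percent = 2.5         # soil organic matter (%)
    n_per_om = 20            # lb available N per 1% organic matter (agronomists' rule of thumb)
    share_of_need = 0.20     # fraction of the recommended N said to be supplied

    n_from_om = om_percent * n_per_om                    # 50 lb N/acre from organic matter
    implied_recommendation = n_from_om / share_of_need   # 250 lb N/acre total recommendation
    print(n_from_om, "lb N/acre supplied by soil organic matter")
    print(implied_recommendation, "lb N/acre implied total recommendation")

With these numbers the implied recommendation works out to about 250 lb N/acre for a 250 bu/acre corn crop, consistent with a rough guideline of about 1 pound of nitrogen per bushel.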

Posted in Agriculture, Biomass, Food, Infrastructure, Pesticides, Soil, Water

Twice as many El Niños in 21st century

Expect more drought, flooding, and other crazy weather

In Nature Climate Change (doi.org/q4c), researchers predict that El Niños will become twice as common: about once a decade in the future, versus about once every 20 years over the past century.

Another recent study showed that even normal El Niños will bring more severe drought and rain (Nature, doi.org/n9n).

Extreme El Niños can kill tens of thousands of people by causing a tenfold increase in rain in South America, flooding in the Americas, and drought in Australia, Africa, and elsewhere.

Until now, scientists weren’t sure whether climate change would affect El Niño, because it wasn’t known whether temperatures in the Pacific would vary more in the future. But since the eastern Pacific is warming faster than the western Pacific, this will heat up the east and shift rainfall.

Other scientists have found that El Niños grew more intense during 1979-2009 than during 1590-1880 (Climate of the Past, doi.org/q28).

Posted in Extreme Weather

Global warming spreads disease in the Arctic

[ A summary of the spread of disease in the Arctic  in the August 2014 issue of Scientific American follows ]

Pathogens moving northward:

  • Aleutian Islands, Alaska. A distemper virus that infects seals in the North Atlantic Ocean now attacks sea otters in the North Pacific.
  • Saint Lawrence Island, Bering Sea. Avian cholera 200 kilometers off the Alaskan mainland has killed hundreds of northern fulmars, murres, and crested auklets, seabirds that were not afflicted before.
  • Sahtu, Northwest Territories. Ticks, never observed here before, have been discovered on moose hides.
  • Victoria Island, Arctic Archipelago. Lungworm has spread several hundred kilometers north across the musk ox population.
  • Hay Island, Nova Scotia. Ringed seals have passed a parasite to gray seals; it killed 400 pups in 2012.
  • Sweden. Mosquitoes have spread the tularemia bacterium to people across the country; it can cause severe fever, inflammation, and death. Sweden also has more hantavirus infections because of warming.
  • Arkhangelsk Oblast, Russia. Tick-borne encephalitis cases in humans rose 50-fold from the decade 1980-1989 to the decade 2000-2009.

A warming climate helps parasites mature faster. The cold summers that used to keep parasites in check now do so less often; a tipping point has been crossed, and warmth lasts longer. Parasites like the lungworm now mature in one summer instead of two, which has caused musk ox populations to decline dramatically.

In Russia, forests are advancing into the tundra at a rate of about 1 km per year, and ticks are advancing along with them, affecting 4 million people.

Summary: For eons, Arctic cold kept a cap on disease, and wildlife adapted to that low disease burden. It’s thought that one reason birds migrate north is to avoid disease and devote their energy to raising young. Viruses, fungi, and parasites are invading not just the north but also tropical and temperate ecosystems as the climate heats up.

[ This is the transcript from NPR’s July 20th, As Polar Icebox Shrinks, Infectious Pathogens Move North. ]

Science writer Chris Solomon tells NPR’s Arun Rath that global warming has caused an influx of new diseases in animals that could eventually spread to humans.

ARUN RATH, HOST: Infectious diseases may be spreading more quickly, thanks to global warming. Viruses that were kept in check by the polar ice box are being released. And as some animals move north to keep cool, they’re bringing all sorts of parasites with them, from microbes to ticks. Christopher Solomon has written about this in the August issue of “Scientific American.”  Christopher, you wrote about sea otters off the coast of Alaska’s Aleutian Islands that are now infected with a virus from halfway around the world. What happened?

SOLOMON: There is something called phocine distemper virus, and it’s a relative of canine distemper. Phocine distemper has killed 50,000 seals over the last 25 years in the North Atlantic. And as scientists were trying to figure out why sea otters splashing in the Aleutian Islands were not doing so well, they found evidence of phocine distemper in them, and it became a detective story. And they said, well, what’s it doing in the North Pacific? And their theory is that it has made its way through the fabled Northwest passage via a seal or its feces and met animals on the other side due to the dramatic level of sea ice reduction.

RATH: So in addition to opening up lanes for shipping, warming has opened up a highway for viruses?

SOLOMON: Yes. In essence, disease is finding new lanes of travel. Existing disease up there is becoming invigorated. And new disease is hitchhiking on all sorts of wildlife, whether it’s fish or wild boars or ticks that are moving north in search of new habitat that’s cooler.

RATH: Wow. And in terms of land animals, I know with your article there is a photo of a big herd of very serious looking musk oxen. And they’ve been affected as well?

SOLOMON: Yes, this is another interesting case. Musk oxen – people may be able to visualize from a Disney or Pixar movie – they’re those smelly, kind of shaggy, horned relics of the Ice Age. And they’ve had a relationship with this parasitical lung worm for eons. It gave them a bit of a smoker’s cough. But the lung worm was always kept in check because it never was able to thrive in the brutal Arctic environment too well. And now, with essentially longer, warmer summers, the lung worm can complete its life cycle in one summer instead of two. And it has proliferated and has expanded its range up to where 30% of the world population of musk oxen live. And this is not good for the declining number of musk oxen in the far north.

RATH: Now, diseases can sometimes jump from animals to humans. How much is there for people to worry about, beyond animal populations?

SOLOMON: Well, that’s the interesting point in all of this. Since 1940, 60% of the new infectious diseases we’ve discovered in humans have come from animals. We’ve knocked down the borders between the natural world and the man-made world. Or, in these cases, the borders are simply melting away.

As one parasitologist Michael Grigg at the National Institutes of Health told me – he said, if the animals get sick, we can get sick. So we really need to pay attention to what’s happening out there. I’m not saying that the Arctic is collapsing under the weight of contagion right now. But things are happening that the scientists are really only starting to grasp in the north. And we need to pay attention to these flares that are going up.

Posted in Disease

Plants are sucking streams dry thanks to more CO2

Slezak, M. October 24, 2015. Carbon emissions make Earth greener but are also drying it out. NewScientist.

Source: Ukkola, A. M., et al. October 19, 2015 Reduced streamflow in water-stressed climates consistent with CO2 effects on vegetation. Nature Climate Change 6, 75–78 (2016)

The carbon dioxide we’ve been pumping into the atmosphere is fertilizing plants and making them grow faster – but now those plants are sucking our streams dry.

Australia is already a parched country and will only become drier as the planet warms and rainfall decreases.

But now it turns out that Australia has lost about a quarter of its stream flow over the past 30 years as plants given an extra boost by our carbon emissions grow faster and use more water.

How that extra growth would affect water uptake has been a matter of debate. That’s because the extra CO2 has two opposing effects, says Anna Ukkola from Macquarie University in Sydney, Australia.

Plants have a waxy seal over their leaves that stops them from losing too much water to the air. To get access to CO2 in the air, which they need to photosynthesize, they have to open little pores in that seal. But they also lose water: CO2 goes in, water out.

Since there is a lot more carbon in the air than there used to be, plants can close their pores partially and still get the same amount of CO2 while losing less water, says Ukkola.  Early models concluded that stream flow would increase. If plants lose less water, they reasoned, then there will be more in the streams.
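To make that trade-off concrete, here is a minimal sketch (mine, not from the article) using the standard leaf gas-exchange approximations A = gs·(Ca − Ci) for CO2 uptake and E = 1.6·gs·VPD for water loss; every number in it is an illustrative assumption:

    # Illustrative sketch of why higher ambient CO2 can mean less water loss for the same CO2 uptake.
    def conductance_for_same_uptake(A, Ca, Ci):
        """Stomatal CO2 conductance needed to sustain assimilation A at ambient CO2 Ca."""
        return A / (Ca - Ci)

    def transpiration(gs_c, vpd):
        """Water loss; water vapour diffuses about 1.6x faster than CO2 through stomata."""
        return 1.6 * gs_c * vpd

    A = 15e-6      # mol CO2 m-2 s-1, assumed assimilation rate
    vpd = 0.015    # mol H2O per mol air, assumed vapour pressure deficit
    for ca_ppm in (280, 400):          # pre-industrial vs. present-day CO2
        ca = ca_ppm * 1e-6             # mol CO2 per mol air
        ci = 0.7 * ca                  # assume internal CO2 stays ~70% of ambient
        gs = conductance_for_same_uptake(A, ca, ci)
        print(ca_ppm, "ppm ->", round(transpiration(gs, vpd) * 1e3, 2), "mmol H2O m-2 s-1")

With these assumed values, the same CO2 uptake at 400 ppm needs about 30 percent less stomatal conductance, and therefore about 30 percent less water loss, than at 280 ppm.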

But later models disagreed, showing that it depends on exactly how the plants’ growth is affected: if they become more leafy, then they will lose more water to the air. Researchers have tried to sort this out by growing experimental plots. “But in the experiments, the changes in water use is varied – it’s all over the shop,” says Randal Donohue from the CSIRO in Canberra, Australia.

Boost plant cover

Donohue and colleagues were the first to show in 2013 that increased carbon dioxide levels were boosting plant cover around the world, by examining satellite images and removing the effects of other factors such as changes in rainfall and land use change.

Using similar methodology, Ukkola and colleagues repeated that analysis for Australia, and then compared the carbon-driven greening in 190 river basins with the changes in stream flow over that time.

After allowing for other factors like changes in rainfall, they found that a significant drop in stream flow was associated with the greening of the landscape. In areas that were greening more, stream flow was also diminishing more.
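A toy illustration of that kind of attribution analysis (a hedged sketch with synthetic numbers, not the authors’ code or data) might look like this:

    # Synthetic example: across many basins, regress the stream-flow trend on the
    # greening trend while controlling for the rainfall trend, as described above.
    import numpy as np

    rng = np.random.default_rng(0)
    n_basins = 190
    greening   = rng.normal(0.05, 0.02, n_basins)   # fractional increase in leaf area
    rain_trend = rng.normal(0.00, 0.10, n_basins)   # fractional change in rainfall
    # fabricate an "observed" stream-flow change: rainfall helps, greening hurts
    flow_trend = 1.2 * rain_trend - 5.0 * greening + rng.normal(0, 0.05, n_basins)

    X = np.column_stack([np.ones(n_basins), rain_trend, greening])
    coef, *_ = np.linalg.lstsq(X, flow_trend, rcond=None)
    print("effect of greening on stream flow, rainfall held constant:", round(coef[2], 2))

The sign and size of the greening coefficient, once rainfall is held constant, is the kind of result behind the 24 to 28 percent figure reported below.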

Overall, the CO2-induced greening was responsible for a stream flow reduction of between 24 and 28%, says Ukkola.

“The plants are growing more and bigger leaves,” she says, which evaporate even more water.

Ukkola says the results can probably be extrapolated to places with similar climates to Australia like the Mediterranean.

Posted in CO2 and Methane, Water

Abrupt Impacts of Climate Change

NRC. 2013. Abrupt Impacts of Climate Change: Anticipating Surprises. National Research Council, The National Academies Press.

[ Excerpts from this 223-page document, with a table of possible abrupt changes, follow. My interest in climate change for many years was whether it could trigger an event like the Permian extinction. One of the biggest spikes that might induce that would be from methane. But this study says that methane from hydrates and from permafrost would take centuries to millennia to be released, that the release would not be abrupt, and that it is not likely to happen in this century.

On the positive side, there is a good chance that abrupt climate change can be avoided. This is because conventional oil peaked in 2005 and the unconventional oil peak is not far away. We are about to emit a LOT less CO2 and methane after peak oil, and we are also near peak coal and natural gas. Since all fossil fuels will be declining soon, the levels projected by the IPCC are not likely (they use out-of-date, incorrect data from 1998 for estimates of the coal, oil, and natural gas deposits). See the posts here for peer-reviewed papers on this topic.

We are locked into climate change for thousands of years, but the good news is that many scientists believe that peak fossil fuels will at worst reach levels in the lowest few IPCC projections. The decline of fossil fuels will kill far more people, far sooner, than climate change. And then crazy weather, rising sea levels, and disease will make life harder for those who survive. Alice Friedemann www.energyskeptic.com ]

“Abrupt climate change is generally defined as occurring when some part of the climate system passes a threshold or tipping point resulting in a rapid change that produces a new state lasting decades or longer (Alley et al., 2003). In this case “rapid” refers to timelines of a few years to decades.

“Abrupt climate change can occur on a regional, continental, hemispheric, or even global basis. Even a gradual forcing of a system with naturally occurring and chaotic variability can cause some part of the system to cross a threshold, triggering an abrupt change. Therefore, it is likely that gradual or monotonic forcings increase the probability of an abrupt change occurring.

Climate is changing, forced out of the range of the last million years by levels of carbon dioxide and other greenhouse gases not seen in Earth’s atmosphere for a very long time.

It is clear that the planet will be warmer, sea level will rise, and patterns of rainfall will change. But the future is also partly uncertain—there is considerable uncertainty about how we will arrive at that different climate. Will the changes be gradual, allowing natural systems and societal infrastructure to adjust in a timely fashion? Or will some of the changes be more abrupt, crossing some threshold or “tipping point” to change so fast that the time between when a problem is recognized and when action is required shrinks to the point where orderly adaptation is not possible?

A study of Earth’s climate history suggests the inevitability of “tipping points”— thresholds beyond which major and rapid changes occur when crossed—that lead to abrupt changes in the climate system.

The history of climate on the planet—as read in archives such as tree rings, ocean sediments, and ice cores—is punctuated with large changes that occurred rapidly, over the course of decades to as little as a few years.

There are many potential tipping points in nature, as described in this report, and many more that we humans create in our own systems. The current rate of carbon emissions is changing the climate system at an accelerating pace, making the chances of crossing tipping points all the more likely.

Scientific research has already helped us reduce this uncertainty in two important cases: potential abrupt changes in ocean deep water formation and the release of carbon from frozen soils and ices in the polar regions, which were once of serious near-term concern, are now understood to be less imminent, although still worrisome as slow changes over longer time horizons. In contrast, the potential for abrupt changes in ecosystems, weather and climate extremes, and groundwater supplies critical for agriculture now seems more likely, severe, and imminent.

In addition to a changing climate, multiple other stressors are pushing natural and human systems toward their limits, making them more sensitive to small perturbations that can trigger large responses. Groundwater aquifers, for example, are being depleted in many parts of the world, including the southeast of the United States. Groundwater is critical for farmers to ride out droughts, and if that safety net reaches an abrupt end, the impact of droughts on the food supply will be even larger.

Levels of carbon dioxide and other greenhouse gases in Earth’s atmosphere are exceeding levels recorded in the past millions of years, and thus climate is being forced beyond the range of the recent geological era.

The paleoclimate record—information on past climate gathered from sources such as fossils, sediment cores, and ice cores—contains ample evidence of abrupt changes in Earth’s ancient past, including sudden changes in ocean and air circulation, or abrupt extreme extinction events. One such abrupt change was at the end of the Younger Dryas, a period of cold climatic conditions and drought in the north that occurred about 12,000 years ago. Following a millennium-long cold period, the Younger Dryas abruptly terminated in a few decades or less and is associated with the extinction of 72 percent of the large-bodied mammals in North America. Some abrupt climate changes are already underway, including the rapid decline of Arctic sea ice over the past decade due to warmer polar temperatures.

Scientific research has advanced sufficiently that it is possible to assess the likelihood of some abrupt changes; for example, the probability of a rapid shutdown of the Atlantic Meridional Overturning Circulation (AMOC) within this century is now understood to be low.

Human infrastructure is built with certain expectations of useful life expectancy, but even gradual climate changes may trigger abrupt thresholds in their utility, such as rising sea levels surpassing sea walls or thawing permafrost destabilizing pipelines, buildings, and roads.

The primary timescale of concern is years to decades. A key characteristic of these changes is that they can come faster than expected, planned, or budgeted for, forcing more reactive, rather than proactive, modes of behavior.

Table S.1 summarizes the state of knowledge about potential abrupt changes. This table includes potential abrupt changes to the ocean, atmosphere, ecosystems, and high-latitude regions that are judged to meet the above criteria. For each abrupt change, the Committee examined the available evidence of potential impact and likelihood. Some abrupt changes are likely to occur within this century—making these changes of most concern for near-term societal decision making and a priority for research.

[ Table S.1 graphics (S-1 abrupt CC 1–4) are not reproduced here. The table’s footnotes follow: ]

1 Change could be either abrupt or non-abrupt.

2 Committee assesses the near-term outlook that sea level will rise abruptly before the end of this century as Low; this is not in contradiction to the assessment that sea level will continue to rise steadily, with estimates of between 0.26 and 0.82 m by the end of this century (IPCC, 2013).

3 Methane is a powerful but short-lived greenhouse gas.

4 Limited by ability to predict methane production from thawing organic carbon.

5 No mechanism proposed would lead to abrupt release of substantial amounts of methane from ocean methane hydrates this century.

6 Limited by uncertainty in hydrate abundance in near-surface sediments, and the fate of CH4 once released.

7 Species distribution models (Thuiller et al., 2006) indicate between 10–40% of mammals now found in African protected areas will be extinct or critically endangered by 2080 as a result of modeled climate change. Analyses by Foden et al. (2013) and Ricke et al. (2013) suggest 41% of bird species, 66% of amphibian species, and between 61% and 100% of corals that are not now considered threatened with extinction will become threatened due to climate change sometime between now and 2100.

Disappearance of Late-Summer Arctic Sea Ice

Recent dramatic changes in the extent and thickness of the ice that covers the Arctic sea have been well documented. Satellite data for late summer (September) sea ice extent show natural variability around a clearly declining long-term trend (Figure S.1). This rapid reduction in Arctic sea ice already qualifies as an abrupt change with substantial decreases in ice extent occurring within the past several decades. Projections from climate models suggest that ice loss will continue in the future, with the full disappearance of late-summer Arctic sea ice possible in the coming decades. The impacts of rapid decreases in Arctic sea ice are likely to be considerable. More open water conditions during summer would have potentially large and irreversible effects on various components of the Arctic ecosystem, including disruptions in the marine food web, shifts in the habitats of some marine mammals, and erosion of vulnerable coastlines. Because the Arctic region interacts with the large-scale circulation systems of the ocean and atmosphere, changes in the extent of sea ice could cause shifts in climate and weather around the northern hemisphere. The Arctic is also a region of increasing economic importance for a diverse range of stakeholders, and reductions in Arctic sea ice will bring new legal and political challenges as navigation routes for commercial shipping open and marine access to the region increases for offshore oil and gas development, tourism, fishing and other activities.

Increases in Extinction Threat for Marine and Terrestrial Species

The rate of climate change now underway is probably as fast as any warming event in the past 65 million years, and it is projected that its pace over the next 30 to 80 years will continue to be faster and more intense. These rapidly changing conditions make survival difficult for many species. Biologically important climatic attributes—such as number of frost-free days, length and timing of growing seasons, and the frequency and intensity of extreme events (such as number of extremely hot days or severe storms)—are changing so rapidly that some species can neither move nor adapt fast enough.

The distinct risks of climate change exacerbate other widely recognized and severe extinction pressures, especially habitat destruction, competition from invasive species, and unsustainable exploitation of species for economic gain, which have already elevated extinction rates to many times above background rates. If unchecked, habitat destruction, fragmentation, and over-exploitation, even without climate change, could result in a mass extinction within the next few centuries equivalent in magnitude to the one that wiped out the dinosaurs. With the ongoing pressures of climate change, comparable levels of extinction conceivably could occur before the year 2100; indeed, some models show a crash of coral reefs from climate change alone as early as 2060 under certain scenarios. Loss of a species is permanent and irreversible, and has both economic impacts and ethical implications. The economic impacts derive from loss of ecosystem services, revenue, and jobs, for example in the fishing, forestry, and ecotourism industries. Ethical implications include the permanent loss of irreplaceable species and ecosystems as the current generation’s legacy to the next generation.

Abrupt Changes of Unknown Probability

Destabilization of the West Antarctic Ice Sheet

The volume of ice sheets is controlled by the net balance between mass gained (from snowfall that turns to ice) and mass lost (from iceberg calving and the runoff of meltwater from the ice sheet). Scientists know with high confidence from paleo-climate records that during the planet’s cooling phase, water from the ocean is traded for ice on land, lowering sea level by tens of meters or more, and during warming phases, land ice is traded for ocean water, raising sea level, again by tens of meters and more. The rates of ice and water loss from ice stored on land directly affect the speed of sea level rise, which in turn directly affects coastal communities. Of greatest concern among the stocks of land ice are those glaciers whose bases are well below sea level, which includes most of West Antarctica, as well as smaller parts of East Antarctica and Greenland. These glaciers are sensitive to warming oceans, which help to thermally erode their base, as well as rising sea level, which helps to float the ice, further destabilizing them. Accelerated sea level rise from the destabilization of these glaciers, with sea level rise rates several times faster than those observed today, is a scenario that has the potential for very serious consequences for coastal populations, but the probability is currently not well known.

Research to understand ice sheet dynamics is particularly focused on the boundary between the floating ice and the grounded ice, usually called the grounding line (see Figure S.3). The exposed surfaces of ice sheets are generally warmest on ice shelves, because these sections of ice are at the lowest elevation, furthest from the cold central region of the ice mass and closest to the relatively warmer ocean water. Locations where meltwater forms on the ice shelf surface can wedge open crevasses and cause ice-shelf disintegration—in some cases, very rapidly.

Because air carries much less heat than an equivalent volume of water, physical understanding indicates that the most rapid melting of ice leading to abrupt sea-level rise is restricted to ice sheets flowing rapidly into deeper water capable of melting ice rapidly and carrying away large volumes of icebergs. In Greenland, such deep water contact with ice is restricted to narrow bedrock troughs where friction between ice and fjord walls limits discharge. Thus, the Greenland ice sheet is not expected to destabilize rapidly within this century. However, a large part of the West Antarctic Ice Sheet (WAIS), representing 3–4 m of potential sea-level rise, is capable of flowing rapidly into deep ocean basins. Because the full suite of physical processes occurring where ice meets ocean is not included in comprehensive ice-sheet models, it remains possible that future rates of sea-level rise from the WAIS are underestimated, perhaps substantially.

Abrupt Changes Unlikely to Occur This Century

These include disruption to the Atlantic Meridional Overturning Circulation (AMOC) and potential abrupt changes of high-latitude methane sources (permafrost soil carbon and ocean methane hydrates). Although the Committee judges the likelihood of an abrupt change within this century to be low for these processes, should they occur even next century or beyond, there would likely be severe impacts. Furthermore, gradual changes associated with these processes can still lead to consequential changes.

However, it is important to keep a close watch on this system, to make observations of the North Atlantic to monitor how the AMOC responds to a changing climate, for reasons including the likelihood that slow changes will have real impacts, and to update the understanding of the slight possibility of a major event.

Potential Abrupt Changes due to High-Latitude Methane

Large amounts of carbon are stored at high latitudes in potentially labile reservoirs such as permafrost soils and methane-containing ices called methane hydrate or clathrate, especially offshore in ocean marginal sediments. Owing to their sheer size, these carbon stocks have the potential to massively affect Earth’s climate should they somehow be released to the atmosphere. An abrupt release of methane is particularly worrisome because methane is many times more potent than carbon dioxide as a greenhouse gas over short time scales. Furthermore, methane is oxidized to carbon dioxide in the atmosphere, representing another carbon dioxide pathway from the biosphere to the atmosphere.

According to current scientific understanding, Arctic carbon stores are poised to play a significant amplifying role in the century-scale buildup of carbon dioxide and methane in the atmosphere, but are unlikely to do so abruptly, i.e., on a timescale of one or a few decades.

Although comforting, this conclusion is based on immature science and sparse monitoring capabilities. Basic research is required to assess the long-term stability of currently frozen Arctic and sub-Arctic soil stocks, and of the possibility of increasing the release of methane gas bubbles from currently frozen marine and terrestrial sediments, as temperatures rise.

The Committee examined a number of other possible changes. These included sea level rise due to thermal expansion or ice sheet melting (except WAIS—see above), decrease in ocean oxygen (expansion in oxygen minimum zones (OMZs)), changes to patterns of climate variability, changes in heat waves and extreme precipitation events (droughts/floods/ hurricanes/major storms), disappearance of winter Arctic sea ice (distinct from late summer Arctic sea ice—see above), and rapid state changes in ecosystems, species range shifts, and species boundary changes.

Early studies of ice cores showed that very large changes in climate could happen in a matter of a few decades or even years, for example, local to regional temperature changes of a dozen degrees or more, doubling or halving of precipitation rates, and dust concentrations changing by orders of magnitude.

What has become clearer recently is that the issue of abrupt change cannot be confined to a geophysical discussion of the climate system alone. The key concerns are not limited to large and abrupt shifts in temperature or rainfall, for example, but also extend to other systems that can exhibit abrupt or threshold-like behavior even in response to a gradually changing climate. The fundamental concerns with abrupt change include those of speed—faster changes leave less time for adaptation, either economically or ecologically—and of magnitude—larger changes require more adaptation and generally have greater impact.

This report offers an updated look at the issue of abrupt climate change and its potential impacts, and takes the added step of considering not only abrupt changes to the climate system itself, but also abrupt impacts and tipping points that can be triggered by gradual changes in climate. This examination of the impacts of abrupt change brings the discussion into the human realm, raising questions such as: Are there potential thresholds in society’s ability to grow sufficient food? Or to obtain sufficient clean water? Are there thresholds in the risk to coastal infrastructure as sea levels rise?

Bark beetles are a natural part of forested ecosystems, and infestations are a regular force of natural change. In the last two decades, though, the bark beetle infestations that have occurred across large areas of North America have been the largest and most severe in recorded history, killing millions of trees across millions of hectares of forest from Alaska to southern California (Bentz, 2008); see Figure B. Bark beetle outbreak dynamics are complex, and a variety of circumstances must coincide and thresholds must be surpassed for an outbreak to occur on a large scale. Climate change is thought to have played a significant role in these recent outbreaks by maintaining temperatures above a threshold that would normally lead to cold-induced mortality.

When there are consecutive warm years, this can speed up reproductive cycles and increase the likelihood of outbreaks (Bentz et al., 2010). Similar to many of the issues described in this report, climate change is only one contributing factor to these types of abrupt climate impacts, with other human actions such as forest history and management also playing a role.

They noted that events that did not meet the common criterion of a semi-permanent change in state could still force other systems into a permanent change, and thus qualify as an abrupt change. For example, a mega-drought may be followed by the return of normal precipitation rates, such that no baseline change occurred, but if that drought caused the collapse of a civilization, a permanent, abrupt change occurred in the system impacted by climate.

The 2002 NRC study introduced the important issue of gradual climate change causing abrupt responses in human or natural systems, noting “Abrupt impacts therefore have the potential to occur when gradual climatic changes push societies or ecosystems across thresholds and lead to profound and potentially irreversible impacts.” The 2002 report also noted that “…the more rapid the forcing, the more likely it is that the resulting change will be abrupt on the time scale of human economies or global ecosystems” and “The major impacts of abrupt climate change are most likely to occur when economic or ecological systems cross important thresholds.”

Changes occurring over a few decades, i.e., a generation or two, begin to capture the interest of most people because it is a time frame that is considered in many personal decisions and relates to personal memories. Also, at this time scale, changes and impacts can occur faster than the expected, stable lifetime of systems about which society cares. For example, the sizing of a new air conditioning system may not take into consideration the potential that climate change could make the system inadequate and unusable before the end of its useful lifetime (often 30 years or more). The same concept applies to other infrastructure, such as airport runways, subway systems, and rail lines. Thus, even if a change is occurring over several decades, and therefore might not at first glance seem “abrupt,” if that change affects systems that are expected to function for an even longer period of time, the impact can indeed be abrupt when a threshold is crossed. “Abrupt” then, is relative to our “expectations,” which for the most part come from a simple linear extrapolation of recent history, and “expectations” invoke notions of risk and uncertainty. In such cases, it is the cost associated with unfulfilled expectations that motivates discussion of abrupt change. Finally, changes occurring over one to a few years are abrupt, and for most people, would also be alarming if sufficiently large and impactful.

The rate of greenhouse gas addition to the atmosphere continues to increase, with many policies in place to accelerate rising greenhouse gases (IMF, 2013). It is sobering to consider that about one-fifth of all fossil fuels ever burned were burned since the 2002 report was released. The sum of global emissions from 1751 through 2009 inclusive is 355,676 million metric tons of carbon; the sum of global emissions from 2002 through 2009 inclusive is 64,788 million metric tons of carbon (Boden et al., 2011). Carbon emissions for 2002-2009 are thus greater than 18% of the total for 1751-2009.
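The share quoted above can be checked directly from the Boden et al. (2011) totals:

    # Check of the emissions share quoted above (cumulative totals in million metric tons of carbon).
    total_1751_2009 = 355_676
    total_2002_2009 = 64_788
    print(f"{total_2002_2009 / total_1751_2009:.1%}")   # about 18.2%, i.e. roughly one-fifth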

Abrupt Changes of Primary Concern

These changes are singled out either because they are currently believed to be the most likely and the most impactful, because they are predicted to potentially cause severe impacts but with uncertain likelihood, or because they are considered unlikely to occur but have been widely discussed in the literature or media.

It is very unlikely that the AMOC will undergo an abrupt transition or collapse in the 21st century. Delworth et al. (2008) pointed out that for an abrupt transition of the AMOC to occur, the sensitivity of the AMOC to forcing would have to be far greater than that seen in current models. Alternatively, significant ablation of the Greenland ice sheet greatly exceeding even the most aggressive of current projections would be required. As noted in the ice sheet section later in this chapter, Greenland ice has about 7.3m equivalent of sea level rise, which, if melted over 1000 years, yields an annual rise rate of 7 mm/yr, about 2 times faster just from Greenland than today’s rate from all sources, and more than 10 times faster than the rate from Greenland over 2000–2011 (Shepherd et al., 2012). Although neither possibility can be excluded entirely, it is unlikely that the AMOC will collapse before the end of the 21st century because of global warming.

Rising sea level increases the likelihood that a storm surge will overtop a levee or damage other coastal infrastructure, such as coastal roads, sewage treatment plants, or gas lines—all with potentially large, expensive, and immediate consequences.

A separate but key question is whether sea-level rise itself can be large, rapid and widespread. In this regard, rate of change is assessed relative to the rate of societal adaptation. Available scientific understanding does not answer this question fully, but observations and modeling studies do show that a much faster sea-level rise than that observed recently (~3 mm/yr over recent decades) is possible (Cronin, 2012). Rates peaked more than 10 times faster in Meltwater Pulse 1A during the warming from the most recent ice age, a time with more ice on the planet to contribute to the sea-level rise, but slower forcing than the human-caused rise in CO2 (Figures 2.5 and 2.6). One could term a rise “rapid” if the response or adaptation time is significantly longer than the rise time. For example, a rise rate of 15 mm/yr (within the range of projec…

Projections of sea-level rise remain notably uncertain even if the increase in greenhouse gases is specified accurately, but many recently published estimates include within their range of possibilities a rise of 1 m by the end of this century (reviewed by Moore et al., 2013). For low-lying metropolitan areas, such as Miami and San Francisco, such a rise could lead to significant flooding.

Thirty-nine percent of the population lives in coastal shoreline counties. This population grew by 39 percent between 1970 and 2010, and is projected to grow by 8.3 percent by 2020. The population density of coastal counties is 446 people per square mile, which is over 4 times that of inland counties. Just under half of the annual GDP of the United States is generated in coastal shoreline counties, an annual contribution that was $6.6 trillion in 2011. If counted as their own country, these counties would rank as the world’s third largest economy, after the United States and China. Some portions of these counties are well above sea level and not vulnerable to flooding (e.g., Cadillac Mountain, Maine, in Acadia National Park, at 470 m). But the interconnected nature of roads and other infrastructure within political divisions means that sea-level rise would cause problems even for the higher parts of these counties. The following statistics, from NOAA’s State of the Coast, highlight the wealth and infrastructure at risk from rising seas:

  • $6.6 trillion: Contribution to GDP of the coastal shoreline counties, just under half of US GDP in 2011.

  • 446 persons/mi2: Average population density of the coastal watershed counties (excluding Alaska). Inland density averages 61 persons per square mile.

In many cases, such areas would be difficult to defend by dikes and dams, and such a large sea level rise would require responses ranging from potentially large and expensive engineering projects to partial or near complete abandonment of now-valuable areas as critical infrastructure such as sewer systems, gas lines, and roads are disrupted, perhaps crossing tipping points for adaptation (Kwadijk et al., 2010). Miami was founded little more than one century ago, and could face the possibility of sea level rise high enough to potentially threaten the city’s critical infrastructure in another century (Strauss et al., 2013). In terms of modern expectations for the lifetime of a city’s infrastructure, this is abrupt. If sometime in the coming centuries sea level should rise 20 to 25 m, as suggested for the Pliocene Epoch, 3 to 5 million years ago (see Figure 2.5), when CO2 is estimated to have had levels similar to today of roughly 400 parts per million, most of Delaware, the first State in the Union, would be under water without very large engineering projects (Figure B). In terms of the expected lifetime of a State, this could also qualify as abrupt.

FIGURE B The long-term worst-case sea-level rise from ice sheets could be more than 60 m if all of Greenland and Antarctic ice melts. A 20 m rise, equivalent to loss of all of Greenland’s ice, all of the ice in West Antarctica, and some coastal parts of East Antarctica, is shown here. This may approximate the sea level during the Pliocene period (3–5 million years ago), the last time that CO2 levels are thought to have been 400 ppm. This figure emphasizes the large areas of coastal infrastructure that are potentially at risk if substantial ice sheet loss were to occur. SOURCE: http://geology.com/sea-level-rise/washington.shtml

In addition, compaction following removal of groundwater or fossil fuels, or possibly inflation from injection of fluids, may change land elevation.

Most mountain glaciers worldwide are losing mass, contributing to sea-level rise. However, the amount of water stored in this ice is estimated to be less than 0.5 m of sea-level equivalent (Lemke et al., 2007), so the contribution to sea-level rise cannot be especially large before the reservoir is depleted. On the other hand, the reservoir in the polar ice sheets is sufficient to raise global sea level by more than 60 m (Lemke et al., 2007).

Beyond some threshold of a few degrees C warming, Greenland’s ice sheet will be almost completely removed. However, the timescale for this is expected to be many centuries to millennia. This still could result in a relatively rapid rate of sea-level rise. Greenland ice has about 7.3 m equivalent of sea-level rise (Lemke et al., 2007), which, if melted over 1000 years (a representative rather than limiting case), yields an annual rise rate of 7 mm/yr just from Greenland, slightly more than twice as fast as the recent rate of rise from all sources including melting of Greenland’s ice.
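A quick arithmetic check of that Greenland figure (the ~3 mm/yr recent rate from all sources is the value given in the sea-level discussion earlier in this excerpt):

    # Back-of-envelope check of the Greenland melt-rate comparison quoted above.
    greenland_sle_m = 7.3      # metres of sea-level equivalent stored in Greenland ice
    melt_time_yr = 1000        # representative (not limiting) melt duration used by the report
    recent_rate_mm = 3.0       # recent observed rise from all sources, ~3 mm/yr

    rate_mm_per_yr = greenland_sle_m * 1000 / melt_time_yr   # 7.3 mm/yr from Greenland alone
    print(rate_mm_per_yr, "mm/yr, about", round(rate_mm_per_yr / recent_rate_mm, 1),
          "times the recent rate from all sources")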

Mass loss by flow of ice into the ocean is less well understood, and it is arguably the frontier of glaciological science where the most could be gained in terms of understanding the threat to humans of rapid sea-level rise. Increased ice-sheet flow can raise sea level by shifting non-floating ice into icebergs or into floating-but-still-attached ice shelves, which can melt both from beneath and on the surface. Rapid sea-level rise from these processes is limited to those regions where the bed of the ice sheet is well below sea level and thus capable of feeding ice shelves or directly calving icebergs rapidly, but this still represents notable potential contributions to sea-level rise, including the deep fjords in Greenland (roughly 0.5 m; Bindschadler et al., 2013), parts of the East Antarctic ice sheet (perhaps as much as 20 m; Fretwell et al., 2013), and especially parts of the West Antarctic ice sheet (just over 3 m; …).

The loss of land ice, particularly from marine-based ice sheets such as the West Antarctic Ice Sheet—possibly in response to gradual ocean warming—could trigger sea-level rise rates that are much higher than the ongoing rate. Paleoclimatic rates at least 10 times larger than recent rates have been documented, and similar or possibly higher rates cannot be excluded in the future. This time scale is also roughly that of human-built infrastructure such as roads, water treatment plants, tunnels, homes, etc. Deep uncertainty persists about the likelihood of a rapid ice-sheet “collapse” contributing to a major acceleration of sea-level rise; for the coming century, the probability of such an event is generally considered to be low but not zero.

The impacts of ocean acidification on ocean biology have the potential to cause rapid (over multiple decades) changes in ecosystems and to be irreversible when contributing to extinction events. Specifically, the increase in CO2 and HCO3– availability might increase photosynthetic rates in some photosynthetic marine organisms, and the decrease in CO32– availability for calcification makes it increasingly difficult for calcifying organisms (such as some phytoplankton, corals, and bivalves) to build their calcareous shells and affects pH-sensitive physiological processes (NRC, 2010c, 2013). As such, ocean acidification could represent an abrupt climate impact when thresholds are crossed below which organisms lose the ability to create their shells by calcification, or pH changes affect survival rates.

Of more immediate concern is the expansion of Oxygen Minimum Zones (OMZs). Photosynthesis in the sunlit upper ocean produces O2, which escapes to the atmosphere; it also produces particles of organic carbon that sink into deeper waters before they decompose and consume O2. The net result is a subsurface oxygen minimum typically found from 200–1000 meters of water depth, called an Oxygen Minimum Zone. Warming ocean temperatures lead to lower oxygen solubility. A warming surface ocean is also likely to increase the density stratification of the water column (i.e., Steinacher et al., 2010), altering the circulation and potentially increasing the isolation of waters in an OMZ from contact with the atmosphere, hence increasing the intensity of the OMZ. Thus, oxygen concentrations in OMZs fall to very low levels due to the consumption of organic matter (and associated respiration of oxygen) and weak replenishment of oxygen by ocean mixing and circulation. Furthermore, a hypothetical warming of 1ºC would decrease the oxygen solubility by 5 µM (a few percent of the saturation value). This would result in the expansion of the hypoxic zone by 10 percent, and a tripling of the extent of the suboxic zone (Deutsch et al., 2011). With a 2ºC warming, the solubility would decrease by 14 µM resulting in a large expansion of areas depleted of dissolved oxygen and turning large areas of the ocean into places where aerobic life disappears.

Hypoxia is the environmental condition when dissolved water column oxygen (DO) drops below concentrations that are considered the minimal requirement for animal life. Suboxia is even further depletion of oxygen, and anoxia is the condition of no oxygen at all. Paleo records have shown the extinctions of many benthic species during past periods of hypoxia. These periods have coincided with both a rise in temperature and sea level. Records also indicate long recovery times for ecosystems affected by hypoxic events (Danise et al., 2013). In addition, when the oxygen in seawater is depleted, bacterial respiration of organic matter turns to alternate electron-acceptors with which to oxidize organic matter, such as dissolved nitrate (NO3–). A by-product of this “denitrification” reaction is the release of N2O, a powerful greenhouse gas with an atmospheric lifetime of about 150 years. Low-oxygen environments, in the water column and in the sediments, are the main removal mechanism for nitrate from the global ocean. An intensification of oxygen depletion in the ocean therefore also has the potential to alter the global ocean inventory of nitrate, affecting photosynthesis in the ocean. However, the lifetime of nitrate in the global ocean is thousands of years, so any change in the global nitrate inventory would also take place on this long time scale.

Likelihood of Abrupt Changes

Changes in global ocean oxygen concentrations have the potential to be abrupt because of the threshold to anoxic conditions, under which the region becomes uninhabitable for aerobic organisms including fish and benthic organisms. Once this tipping point is reached in an area, anaerobic processes would be expected to dominate, resulting in a likely increase in the production of the greenhouse gas N2O. Some regions, like the Bay of Bengal, already have low oxygen concentrations today.

OMZs have also been intensified in many areas of the world’s coastal oceans by runoff of plant fertilizers from agriculture and incomplete wastewater treatment. These ‘dead zones’ have spread significantly since the middle of the last century and pose a threat to coastal marine ecosystems (Diaz and Rosenberg, 2008). This expansion of OMZs due to nutrient runoff makes the ocean more vulnerable to the decreasing solubility of O2 in a warmer ocean. Indeed, as warming of the ocean intensifies, the decrease in oxygen availability might become non-linear, as indicated by the expansion of the size of the oxygen minimum zones.

ABRUPT CHANGES IN THE ATMOSPHERE

Atmospheric Circulation

The climate system exhibits variability on a range of spatial and temporal scales. On large (i.e., continental) scales, variability in the climate system tends to be organized into distinct spatial patterns of atmospheric and oceanic variability that are largely fixed in space but fluctuate in time. Such patterns are thought to owe their existence to internal feedbacks within the climate system. Prominent patterns of large-scale climate variability include:

  • the El Niño/Southern Oscillation (ENSO),
  • the Madden-Julian Oscillation (MJO),
  • the stratospheric Quasi-Biennial Oscillation,
  • the Pacific-North American pattern, and
  • the Northern and Southern annular modes (the Northern …

Given the definition of abrupt change in this report (see Box 1.2), there is little evidence that the atmospheric circulation and its attendant large-scale patterns of variability have exhibited abrupt change, at least in the observations. The atmospheric circulation exhibits marked natural variability across a range of timescales, and this variability can readily mask the effects of climate change (e.g., Deser et al., 2012a, 2012b). As noted above, patterns of large-scale variability in the extratropical atmospheric wind field exhibit variations on timescales from weeks to decades (Hartmann and Lo, 1998; Feldstein, 2000).

Weather and Climate Extremes

Extreme weather and climate events include heat waves, droughts, floods, hurricanes, blizzards, and other events that occur rarely.

Extreme weather and climate events are among the most deadly and costly natural disasters. For example, tropical cyclone Bhola in 1970 caused about 300,000-500,000 deaths in East Pakistan (Bangladesh today) and West Bengal of India. Hurricane Katrina caused more than 1,800 deaths and $96-$125 billion in damages to the Southeast U.S. in 2005. Worldwide, more than 115 million people are affected and more than 9,000 people are killed annually by floods, most of them in Asia (Figure 2.9 or see, for example, the Emergency Events Database). Heat waves contributed to more than 70,000 deaths in Europe in 2003 (e.g., Robine et al., 2008) and more than 730 deaths and thousands of hospitalizations in Chicago in 1995 (Chicago Tribune, July 31, 1995; Centers for Disease Control and Prevention, 1995). Heat waves are one of the largest weather-related sources of mortality in the United States annually.

TABLE 2.1 Billion-dollar weather and climate disasters in the United States from 1980 to 2011 by type. Total damages are in consumer-price-index-adjusted 2012 dollars. Note that the impacts of droughts are difficult to determine precisely, so those figures may be underestimated.

The potential for abrupt regime shifts was raised in NRC (2002), which highlighted the transitions into and out of the 1930s Dust Bowl as prime examples.

The impacts of extreme events on societal tipping points have been more clearly appreciated (Lenton et al., 2008; Nel and Righarts, 2008).

Extreme warm temperatures in summer can greatly increase the risks of mega-fires in temperate forests, boreal forests, and savanna ecosystems, leading to abrupt changes in species dominance and vegetation type, regional water yield and quality, and carbon emission (e.g., Adams, 2013), before the gradual increase of surface temperature crosses the threshold for abrupt ecosystem collapse.

Extreme events could lead to a tipping point in regional politics or social stability. In Africa, extreme droughts and high temperatures have been linked to an increased risk of civil conflict and large-scale humanitarian crises.

Generally, extreme climate events alone do not cause conflict. However, they may act as an accelerant of instability or conflict, placing a burden to respond on civilian institutions and militaries around the world (NRC, 2012b). For example, the devastating tropical cyclone Bhola in 1970 heightened the dissatisfaction with the ruling government and strengthened the Bangladesh separatist movement. This led eventually to civil war and independence of Bangladesh in 1971.

Historically, extreme climate events such as decadal mega-droughts may have triggered the collapse of civilizations, such as the Maya (Hodell et al., 1995; Kennett et al., 2012) or large scale civil unrest that ended the Ming dynasty (Shen et al., 2007).

ABRUPT CHANGES AT HIGH LATITUDES

Potential Climate Surprises Due to High-Latitude Methane and Carbon Cycles

Interest in high-latitude methane and carbon cycles is motivated by the existence of very large stores of carbon (C), in potentially labile reservoirs of soil organic carbon in permafrost (frozen) soils and in methane-containing ices called methane hydrate or clathrate, especially offshore in ocean marginal sediments. Owing to their sheer size, these carbon stocks have potential to massively impact the Earth’s climate, should they somehow be released to the atmosphere. An abrupt release of methane (CH4) is particularly worrisome as it is many times more potent as a greenhouse gas than carbon dioxide (CO2) over short time scales. Furthermore, methane is oxidized to CO2 in the atmosphere representing another CO2 pathway from the biosphere to the atmosphere in addition to direct release of CO2 from aerobic decomposition of carbon-rich soils.

Permafrost Stocks

Frozen northern soils contain enough carbon to drive a powerful carbon cycle feedback to a warming climate (Schuur et al., 2008). These stocks across large areas of Siberia comprise mainly yedoma (an ice-rich, loess-like deposit averaging ~25 m deep [Zimov et al., 2006b]), peatlands (i.e., histels and gelisols), and river delta deposits. Published estimates of permafrost soil carbon have tended to increase over time, as more field datasets are incorporated and deposits deeper than 1 m are considered. Estimates of the total soil-carbon stock in Arctic permafrost range from 1,700 to 1,850 Gt C (Gt C = gigatons of carbon; Tarnocai et al., 2009).

To put the Arctic soil carbon reservoir into perspective, the carbon it contains exceeds current estimates of the total carbon content of all living vegetation on Earth (approximately 650 Gt C), the atmosphere (730 Gt C, up from ~360 Gt C during the last ice age and 560 Gt C prior to industrialization; Denman et al., 2007), and proved reserves of recoverable conventional oil and coal (about 145 Gt C and 632 Gt C, respectively), and it even approaches geological estimates of all fossil fuels contained within the Earth (~1,500–5,000 Gt C). It represents more than two and a half centuries of our current rate of carbon release through fossil fuel burning and the production of cement (nearly 9 Gt C per year; Friedlingstein et al., 2010). These vast deposits exist largely because microbial breakdown of organic soil carbon is slow in cold climates and virtually halts when the soil is frozen in permafrost. Despite slow rates of plant growth in the Arctic and sub-Arctic latitudes, massive deposits of peat have accumulated there since the last glacial maximum (Smith et al., 2004; MacDonald et al., 2006).

Potential Response to a Warming Climate

Permafrost soils in the Arctic have been thawing for centuries, reflecting the rise of temperatures since the last glacial maximum (~21 kyr ago) and the Little Ice Age (1350-1750).

FIGURE 2.12 Top: Approximate inventories of carbon in various reservoirs (see text for references).

Thawing has accelerated in recent decades and can be attributed to human-induced warming (Lemke et al., 2007). Under business-as-usual climate forcing scenarios, much of the upper permafrost is projected to thaw within a time scale of about a century (Camill, 2005; Lawrence and Slater, 2005). Exactly how this will proceed is uncertain.

It is clear that the time scale for deep permafrost thaw is measured in centuries, not years. Furthermore, unlike methane hydrates (see below), the very large stocks of permafrost soil carbon (i.e., the 1,672 Gt C of Tarnocai et al., 2009) must first undergo anaerobic microbial fermentation to produce methane, itself a gradual decomposition process. There are no currently proposed mechanisms that could liberate a climatically significant amount of methane or CO2 from frozen permafrost soils within an abrupt time scale of a few years, and it appears that gradual increases in carbon release from warming soils can be at least partially offset by rising vegetation net primary productivity.

A related idea is the possibility of rising soil temperatures triggering a “compost bomb instability” (Wieczorek et al., 2011), possibly including combustion, which would be a prime example of a rate-dependent tipping point (Ashwin et al., 2012). Such a scenario would represent a rapid breakdown of the Arctic’s very large soil carbon stocks and warrants further research. Even absent an abrupt or catastrophic mobilization of CO2 or methane from permafrost carbon stocks, it is important to recognize that Arctic emissions of these critical greenhouse gases are projected to increase gradually for many decades to centuries, helping to drive the global climate system more quickly towards the other abrupt thresholds examined in this report.

Methane Hydrates in the Ocean

Stocks

Under conditions of high pressure, high methane concentration, and low temperature, water and methane can combine to form icy solids known as methane hydrates or clathrates in ocean sediments.

Throughout most of the world ocean, a water depth of about 700 m is required for hydrate stability. In the Arctic, due to colder-than-average water temperatures, only about 200 m of water depth is required, which increases the vulnerability of those methane hydrates to a warming Arctic Ocean. The Arctic is also a focus of concern because of the wide expanse of continental shelf (25 percent of the world’s total), much of which is still frozen owing to its exposure to the frigid atmosphere during the lowered sea levels of the last glacial maximum (see above). The inventory of methane in ocean margin sediments is large but not well constrained, with a generally agreed-upon range of 1,000-10,000 Gt C (Archer, 2007; Boswell, 2007; Boswell et al., 2012). One inventory places the total Arctic Ocean hydrates at about 1,600 Gt C by extrapolation of an estimate from Shakhova et al. (2010a) to the entire Arctic shelf region (Isaksen et al., 2011) (see Figure 2.12). The geothermal increase in temperature with depth in the sediment column restricts methane hydrate to within a few hundred meters of the upper surface of the sediments.

Warming bottom waters in deeper parts of the ocean, where surface sediment is much colder than the local hydrate melting temperature and the hydrate stability zone is relatively thick, would not thaw hydrates near the sediment surface; rather, downward heat diffusion into the sediment column would thin the stability zone from below, causing basal hydrates to decompose and release gaseous methane. The time scale for this mechanism of hydrate thawing is on the order of centuries to millennia, limited by the rate of anthropogenic heat diffusion into the deep ocean and sediment column.
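To see why "centuries to millennia" is the right order of magnitude, a back-of-envelope diffusion estimate helps: the time for heat to penetrate a sediment layer of thickness L scales roughly as L squared divided by the thermal diffusivity. The sketch below is illustrative only; the diffusivity and layer thicknesses are assumed, typical-order values rather than figures from the report.

```python
# Back-of-envelope check of the "centuries to millennia" time scale above,
# using the diffusive time scale t ~ L^2 / kappa. The diffusivity and the
# sediment thicknesses are assumed, typical-order values (not report figures).

SECONDS_PER_YEAR = 3.15e7
kappa = 1e-6      # thermal diffusivity of water-saturated sediment, m^2/s (assumed)

for depth in (50, 100, 200):   # metres of sediment the warming must penetrate
    t_years = depth**2 / kappa / SECONDS_PER_YEAR
    print(f"Heat diffusing through {depth:3d} m of sediment: ~{t_years:,.0f} years")
```

Even a 100 m thick stability zone implies a few centuries for bottom-water warming to reach its base, and a few hundred metres implies millennia, consistent with the statement above.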

The proportion of this gas production that will reach the atmosphere as CH4 is likely to be small. To reach the atmosphere, the CH4 would have to avoid oxidation within the sediment column (a chemical trap) and re-freezing within the stability zone shallower in the sediment column (a cold trap).

 

Most of the methane gas that emerges from the sea floor dissolves in the water column and oxidizes to CO2 instead of reaching the atmosphere. Bubble plumes tend to dissolve over a height scale of tens of meters even in the cold Arctic Ocean, and because methane hydrate is only stable below about 200 m water depth, this makes for an inefficient pathway to the atmosphere at best.

Over time scales of centuries and millennia, the ocean hydrate pool has the potential to be a significant amplifier of the anthropogenic fossil fuel carbon release. Because the chemistry of the ocean equilibrates with that of the atmosphere (on time scales of decades to centuries), methane oxidized to CO2 in the water column will eventually increase the atmospheric CO2 burden (Archer and Buffett, 2005). As with decomposing permafrost soils, such release of carbon from the ocean hydrate pool would represent a change to the Earth’s climate system that is irreversible over centuries to millennia.

Impacts of Arctic Methane on Global Climate

Although attention is often focused on methane when considering a potential Arctic carbon release, methane is a short-lived gas in the atmosphere (CH4 oxidizes to CO2 within about a decade), so ultimately a methane problem is a CO2 problem. It does matter how rapidly the methane is released, and the impacts of a spike versus chronic emissions are discussed in Box 2.4. Because methane emissions from permafrost degradation will also be accompanied by larger fluxes of CO2, Arctic carbon stores clearly have the potential to be a significant amplifier of the human release of carbon.

Speculations about potential methane releases in the Arctic have ranged up to about 75 Gt C from the land (Isaksen et al., 2011) and 50 Gt C from the ocean (Shakhova et al., 2010a). A release of 50 Gt C of methane from the Arctic to the atmosphere over 100 years would increase Arctic CH4 emissions by about a factor of 25, and would make the present-day permafrost area, on average, about twice as productive of CH4 as the world’s wetlands are today. Postulating such a release over a more abrupt 10-year time scale, the emission rates from present-day permafrost would have to exceed those from wetlands by a seemingly implausible factor of 20, supporting a longer, century time scale for this process and making methane emission from polar regions an unlikely candidate for a tipping point in the climate system. Nonetheless, as can be seen in Box 2.4, releasing 50 Gt C of methane over 100 years would have a significant impact on Earth’s climate. The atmospheric CH4 concentration would roughly quadruple, with a resulting total radiative forcing from CH4 of about 3 Watts/m2. The magnitude of this forcing is comparable to that from doubling the atmospheric CO2 concentration, but the impact of the methane forcing would be strongly attenuated by its short duration (see Box 2.4).
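The arithmetic behind this scenario is simple enough to check. The sketch below converts the hypothesized 50 Gt C release over 100 years into an annual CH4 flux (using only the molar masses of carbon and methane) and backs out the present-day Arctic flux implied by the stated factor-of-25 increase; it is a rough consistency check, not a result from the studies cited above.

```python
# Rough arithmetic behind the 50 Gt C release scenario discussed above.
# The factor-of-25 comparison comes from the text; the 16/12 conversion is
# just the molar mass ratio of CH4 to C.

release_c = 50e9          # tonnes of carbon released (50 Gt C)
years = 100

rate_c = release_c / years            # 0.5 Gt C per year
rate_ch4 = rate_c * 16.0 / 12.0       # ~667 Tg CH4 per year

implied_present_arctic = rate_ch4 / 25   # ~27 Tg CH4/yr, the present-day Arctic
                                         # flux implied by a factor-of-25 increase

print(f"Release rate: {rate_c/1e9:.1f} Gt C/yr = {rate_ch4/1e6:.0f} Tg CH4/yr")
print(f"Implied present-day Arctic CH4 flux: {implied_present_arctic/1e6:.0f} Tg CH4/yr")
```

Compressing the same release into 10 years would multiply the required flux by another factor of 10, which is the basis for the "seemingly implausible" judgment in the text.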

Summary and the Way Forward

Arctic carbon stores are poised to play a significant amplifying role in the century-timescale buildup of CO2 and methane in the atmosphere, but are unlikely to do so abruptly, on a time scale of one or a few decades.

Boreal forests appear susceptible to rapid transition to sparse woodland or treeless landscapes as temperature and precipitation patterns shift.

At the global scale, observations show that the transitions from forests to savanna and from savanna to grassland tend to be abrupt when annual rainfall ranges from 1,000 to 2,500 mm and from 750 to 1,500 mm, respectively (Hirota et al., 2011; Mayer and Khalyani, 2011; Staver et al., 2011). Such rainfall regimes cover nearly half of the global land area, where either a gradual climate change across the ecosystem thresholds or a strong perturbation due to extreme climate events, land use, or diseases could trigger abrupt ecosystem changes. These changes could in turn amplify the original climate change in areas where land-surface feedback on climate is important.

 

Amazon forests represent the world’s largest terrestrial biome and potentially the tropical ecosystem most vulnerable to abrupt change in response to future climate change in concert with agricultural development (e.g., Cox et al., 2000; Lenton et al., 2008).

The forests are characterized by a tall canopy of broadleaved trees, 30-40 m high, sometimes with impressive emergent trees up to 55 m or taller. The Brazilian portion of the Amazon comprises about 4 × 10^6 km2, less than 1 percent of global land area, but is disproportionately important in terms of aboveground terrestrial biomass (15 percent of global terrestrial photosynthesis [Field et al., 1998]) and number of species (~25 percent; Dirzo and Raven, 2003). Direct human intervention via deforestation represents an existential threat to this forest: despite recent moderation of rates of deforestation, the Amazon forest is on track to be 50 percent deforested within 30 years, arguably by itself an abrupt change of global importance (Fearnside).

Lenton et al. (2008) and Nobre and Borma (2009) have summarized current understanding of “tipping points” in Amazonian forests. Global and regional models do indeed simulate hysteresis and collapse of Amazonian forests. Models exhibit these shifts for a range of perturbations: temperature increases of 2-4°C, precipitation decreases of ~40 percent (1,100 mm, according to Lenton et al., 2008), and/or deforestation that replaces large swathes of the forest with agriculture.

Thresholds may occur much closer to current conditions, for example, if precipitation falls below 1,600-1,700 mm (Nobre and Borma, 2009). Indeed, long-lasting damage to Amazonian forests may have occurred after the single severe drought in 2005.

The committee concludes that credible possibilities of thresholds, hysteresis, indirect effects, and interactions amplifying deforestation make abrupt (~50-year) change plausible in this globally important system. Rather modest shifts in climate and/or land cover may be sufficient to initiate significant migration of the ecotone defining the limit of equatorial closed-canopy forests in Amazonia, potentially affecting large areas.

In the context of this report, extinction is recognized as “abrupt” in two respects. First, the numbers of individuals and populations that ultimately compose a species may fall below critical thresholds such that the likelihood for species survival becomes very low. This kind of abrupt change is often cryptic, in that the species at face value remains alive for some time after the extinction threshold is crossed, but becomes in effect a “dead clade walking” (Jablonski, 2001). Such losses of individuals that take species towards critical viability thresholds can be very fast—within three decades or less, as already evidenced by many species now considered at risk of extinction due to causes other than climate change by the International Union for the Conservation of Nature.15

The abrupt impact of climate change on causing extinctions of key concern, therefore, is its potential to deplete population sizes below viable thresholds within just the next few decades, whether or not the last individual of a species actually dies.

From the late 20th to the end of the 21st century, climate has been and is expected to continue changing faster than many living species, including humans and most other vertebrate animals, have experienced since they originated. Consequently, the predicted “velocity” of climate change—that is, how fast populations of a species would have to shift in geographic space in order to keep pace with the shift of the organisms’ current local climate envelope across the Earth’s surface—is also unprecedented (Diffenbaugh and Field, 2013; Loarie et al., 2009).
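The velocity concept is easy to make concrete: divide the local rate of warming (degrees C per year) by the local spatial temperature gradient (degrees C per km) to get the speed at which a climate envelope moves across the landscape. The numbers in the sketch below are purely illustrative assumptions, chosen only to show why flat terrain demands much faster range shifts than mountainous terrain; they are not values from the studies cited above.

```python
# Illustrative calculation of the "velocity of climate change" described above:
# the local warming rate divided by the local spatial temperature gradient gives
# the speed at which an isotherm (and hence a climate envelope) moves.
# All numbers below are assumed for illustration, not values from the report.

warming_rate = 0.03       # degrees C per year (assumed local warming trend)
spatial_gradient = 0.005  # degrees C per km (assumed, e.g., gentle lowland terrain)

velocity = warming_rate / spatial_gradient   # km per year
print(f"Required range shift: {velocity:.1f} km/yr "
      f"({velocity * 100:.0f} km per century)")
# Flat terrain (small spatial gradient) -> large required velocity;
# steep mountainsides (large gradient) -> populations can track climate by
# moving short distances uphill.
```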

Climate change now is proceeding “at a rate that is at least an order of magnitude and potentially several orders of magnitude more rapid than the changes to which terrestrial ecosystems have been exposed during the past 65 million years.”

Moreover, the overall temperature of the planet is rapidly rising to levels higher than most living species have experienced (Figure 2.19). Consequently, all the populations in some species, and many populations in others, will be exposed to local climatic conditions they have never experienced (so-called “novel climates”), or will see the climatic conditions that have been an integral part of their local habitats disappear (“disappearing climates”) (Williams et al., 2007). Models suggest that by the year 2100, novel and disappearing climates will affect up to a third and up to a half of Earth’s land surface, respectively (Williams et al., 2007), as well as a large percentage of the oceans.

Thus, many species will experience unprecedented climatic conditions across their geographic range. If those conditions exceed the tolerances of local populations, and those populations cannot migrate or evolve fast enough to keep up with climate change, extinction will be likely. These impacts of rapid climate change will moreover occur within the context of an ongoing major extinction event that has up to now been driven primarily by anthropogenic habitat destruction.

Recent work suggests that up to 41 percent of bird species, 66 percent of amphibian species, and between 61 percent and 100 percent of corals that are not now considered threatened with extinction will become threatened due to climate change sometime between now and 2100 (Foden et al., 2013; Ricke et al., 2013), and that in Africa, 10-40 percent of mammal species now considered not to be at risk of extinction will move into the critically endangered or extinct categories by 2080, possibly as early as 2050.

A critical consideration is that the biotic pressures induced by climate change will interact with other well-known anthropogenic drivers of extinction to amplify what are already elevated extinction rates. Even without putting climate change into the mix, recent extinction has proceeded at least 3-80 times above long-term background rates (Barnosky et al., 2011) and possibly much more (Pimm and Brooks, 1997; Pimm et al., 1995; WRI, 2005), primarily from human-caused habitat destruction and overexploitation of species. The minimally estimated current extinction rate (3 times above the background rate), if unchecked, would in as little as three centuries result in a mass extinction equivalent in magnitude to the one that wiped out the dinosaurs (Barnosky et al., 2011) (see Box 2.4). Importantly, this baseline estimate assumes no effect from climate change. A key concern is whether the added pressure of climate change would substantially increase overall extinction rates such that a major extinction episode would become a fait accompli within the next few decades, rather than something that potentially would play out over centuries. Known mechanisms by which climate change can cause extinction include the following.

1. Direct impact of an abrupt climatic event—for example, flooding of a coastal ecosystem by storm surges as seas rise to levels discussed earlier in this report.

2. Gradually changing a climatic parameter until some biological threshold is exceeded for most individuals and populations of a species across its geographic range—for example, increasing ambient temperature past the limit at which an animal can dissipate metabolic heat, as is happening with pikas at higher elevations in several mountain ranges (Grayson, 2005). Populations of ocean corals (Hoegh-Guldberg, 1999; Mumby et al., 2007; Pandolfi et al., 2011; Ricke et al., 2013) and tropical forest ectotherms (Huey et al., 2012) also inhabit environments close to their physiological thermal limits and may thus be vulnerable to climate warming. Another potential threshold phenomenon is decreasing ocean pH to the point that the developmental pathways of many invertebrate (NRC, 2011a; Ricke et al., 2013) and vertebrate species are disrupted, as is already beginning to happen (see examples below).

3. Interaction of pressures induced directly by climate change with non-climatic anthropogenic factors, such as habitat fragmentation, overharvesting, or eutrophication, that magnify the extinction risk for a given species—for example, the checkerspot butterfly subspecies Euphydryas editha bayensis became extinct in the San Francisco Bay area as housing developments destroyed most of its habitat, followed by a few years of locally unfavorable climate conditions in its last refuge at Jasper Ridge, California (McLaughlin et al., 2002).

4. Climate-induced change in biotic interactions, such as loss of mutualist partner species, increases in disease or pest incidence, phenological mismatches, or trophic cascades through food webs after decline of a keystone species. Such effects can be intertwined with the intersection of extinction pressures noted in mechanism 3 above. In fact, the disappearance of checkerspot butterflies from Jasper Ridge came about because unusual precipitation events altered the timing of overlap of the butterfly larvae and their host plants (McLaughlin et al., 2002).

BOX 2.4 MASS EXTINCTIONS

Mass extinctions are generally defined as times when more than 75 percent of the known species of animals with fossilizable hard parts (shells, scales, bones, teeth, and so on) become extinct in a geologically short period of time (Barnosky et al., 2011; Harnik et al., 2012; Raup and Sepkoski, 1982). Several authors suggest that the extinction crisis is already so severe, even without climate change included as a driver, that a mass extinction of species is plausible within decades to centuries. This possible extinction event is commonly called the “Sixth Mass Extinction,” because biodiversity crashes of similar magnitude have happened previously only five times in the 550 million years that multicellular life has been abundant on Earth: near the end of the Ordovician (~443 million years ago), Devonian (~359 million years ago), Permian (251 million years ago), Triassic (~200 million years ago), and Cretaceous (~66 million years ago) Periods. Only one of the past “Big Five” mass extinctions (the dinosaur extinction event at the end of the Cretaceous) is thought to have occurred as rapidly as would be the case if currently observed extinction rates were to continue at their present high pace (Alvarez et al., 1980; Barnosky et al., 2011; Robertson et al., 2004; Schulte et al., 2010), but the minimal span of time over which past mass extinctions actually took place is impossible to determine, because geological dating typically has error bars of tens of thousands to hundreds of thousands of years. After each mass extinction, it took hundreds of thousands to millions of years for biodiversity to build back up to pre-crash levels.

Data also indicate that continued climate change at its present pace would be detrimental to many species of marine clams and snails, fish, tropical ectotherms, and some species of plants (examples and citations below). For such species, continuing the present trajectory of climate change would very likely result in extinction of most, if not all, of their populations by the end of the 21st century. The likelihood of extinction from climate change is low for species that have short generation times, produce prodigious numbers of offspring, and have very large geographic ranges. However, even for such species, the interaction of climate change with habitat fragmentation may cause the extirpation of many populations. Even local extinctions of keystone species may have major ecological and economic impacts.

The interaction of climate change with habitat fragmentation has high potential for causing extinctions of many populations and species within decades (before the year 2100 if not sooner). The paleontological record and historical observations of species indicate that in the past species have survived climate change by their constituent populations moving to a climatically suitable area or, if they could not move, by evolving adaptations to the new climate. The present condition of habitat fragmentation limits both responses under today’s shifting climatic regime. More than 43 percent of Earth’s currently ice-free lands have been changed into farms, rangelands, cities, factories, and roads (Barnosky et al., 2012; Foley et al., 2011; Vitousek et al., 1986, 1997), and in the oceans many continental-shelf areas have been transformed by bottom trawling (Halpern et al., 2008; Jackson, 2008; Hoekstra et al., 2010). This extent of habitat destruction and fragmentation means that even if individuals of a species can move fast enough to cope with ongoing climate change, they will have difficulty dispersing into suitable areas because adequate dispersal corridors no longer exist. If individuals are confined to climatically unsuitable areas, the likelihood of population decline is enhanced, resulting in a high likelihood of extinction if population size falls below critical values, through processes such as random fluctuations in population size.

Novel climates are those that are created by combinations of temperature, precipitation, seasonality, weather extremes, etc., that exist nowhere on Earth today. Disappearing climates are combinations of climate parameters that will no longer be found anywhere on the planet. Modeling studies suggest that by the year 2100, between 12 percent and 39 percent of the planet will have developed novel climates, and current climates will have disappeared from 10 percent to 48 percent of Earth’s surface (Williams et al., 2007). These changes will be most prominent in what are today’s most important reservoirs of biodiversity.

 

The end-Permian extinction started from a different continental configuration and global climate, so an exact reproduction is not to be expected.

The climatic warming at the last glacial-interglacial transition was coincident with the extinction of 72 percent of the large-bodied mammals in North America and 83 percent of the large-bodied mammals in South America—in total, 76 genera including more than 125 species for the two continents. Many of these extinctions occurred within and just following the Younger Dryas, and generally they are attributed to an interaction between climatic warming and human impacts. The magnitude of climatic warming, about 5°C, was about the same as currently-living species are expected to experience within this century, although the end-Pleistocene rate of warming was much slower. Also similar to today, the end-Pleistocene extinction event played out on a landscape where human population sizes began to grow rapidly, and when people began to exert extinction pressures on other large animals. The main differences today, with respect to extinction potentials, are that anthropogenic climate change is much more rapid and is moving global climate outside the bounds living species evolved in, and that the global human population, and the pressures people place on other species, are orders of magnitude higher than was the case at the last glacial-interglacial transition (Barnosky et al., 2012).

Many of the extinction impacts in the next few decades could be cryptic, that is, reducing populations to below-viable levels and destining the species to extinction even though extinction does not take place until later in the 21st or following century. The losses would have high potential for changing the function of existing ecosystems and degrading ecosystem services (see Chapter 3). The risk of widespread extinctions over the next three to eight decades is high in at least two critically important kinds of ecosystems where much of the world’s biodiversity is concentrated: tropical/sub-tropical areas, especially rainforests, and coral reefs. The risk of climate-triggered extinctions of species adapted to high, cool elevations and high-latitude conditions also is high.

Abrupt climate impacts may have detrimental effects on ecological resources that are critical to human well-being. Such resources are called “ecosystem services” (Box 3.1), which basically are attributes of ecosystems that fulfill the needs of people. For example, healthy diverse ecosystems provide the essential services of moderating weather, regulating the water cycle and delivering clean water, protecting and keeping agricultural soils fertile, pollinating plants (including crops), providing food (particularly seafood), disposing of wastes, providing pharmaceuticals, controlling the spread of pathogens, sequestering greenhouse gases from the atmosphere, and providing recreational opportunities.

 

Largely due to water-delivery issues related to climate change, cereal crop production is expected to fall in areas that now have the highest population density and/or the most undernourished people, notably most of Africa and India (Dow and Downing, 2007). In the United States, key crop-growing areas, such as California, which provides half of the fruits, nuts, and vegetables for the United States, will experience uneven effects across crops, requiring farmers to adapt rapidly by changing what they plant.

Fisheries

Degradation of coral reefs by ocean warming and acidification will negatively affect fisheries, because reefs are required as habitat for many important food species, especially in poor parts of the world. For example, in the poorest countries of Africa and south Asia, fisheries largely associated with coral reefs provide more than half of the protein and mineral intake for more than 400 million people (Hughes et al., 2012). On a broader scale, many fisheries around the world can be expected to experience changes as ocean temperatures, acidity, and currents change (Allison et al., 2009; Jansen et al., 2012; Powell and Xu, 2012), with attendant socio-economic impacts (Pinsky and Fogarty, 2012). One study suggests climate change, combined with other pressures on fisheries, may result in a 30–60 percent reduction in fish production by 2050 in areas such as the eastern Indo-Pacific and those areas fed by the northern Humboldt and the North Canary Currents (Blanchard et al., 2012). Because other pressures, notably over-fishing, already stress fisheries, a small climatic stressor can contribute strongly to hastening collapse.

Forest diebacks (Anderegg et al., 2013) and reduced tree biodiversity (Cardinale et al., 2012) can be expected to have major impacts on timber production. Such is already the case for the vast areas of beetle-killed forests throughout the American West. Drought-enhanced desertification of dryland ecosystems may cause famines and migrations of environmental refugees.

Regulatory Services

Also of concern is the potential loss of regulatory services, which buffer the effects of environmental change (Reid et al., 2005). For example, tropical forest ecosystems slow the rate of global warming both by absorbing atmospheric carbon dioxide and through latent heat flux (Anderson-Teixeira et al., 2012). Coastal saltmarsh and mangrove wetlands buffer shorelines against storm surge and wave damage (Gedan et al., 2011). Grassland biodiversity stabilizes ecosystem productivity in response to climate variation (see Cardinale et al., 2012 and references therein). Climate change has the clear potential to exacerbate losses of these critical ecosystem services (for instance, shrinking rainforests, desertification), with attendant impacts on human societies.

Direct Economic Impacts

Some species currently at risk of extinction, and some of those which will be further imperiled by ongoing climate change, provide significant economic benefits to people who live in the surrounding areas, as well as significant aesthetic and emotional benefits to millions of others, primarily through ecotourism, hunting, and fishing. At the international level, for example, ecotourism—largely to view elephants, lions, cheetahs, and other threatened species—supplies around 14 percent of Kenya’s GDP as of 2013 (USAID, 2013) and supplied 13 percent of Tanzania’s in 2001 (Honey, 2008). Yet in a single year, 2009, an extreme drought decimated the elephant population and populations of many other large animals in Amboseli Park, Kenya. Increased frequency of such extreme weather events could erode the ecotourism base on which the local economies depend. Other international examples include ecotourism in the Galapagos Islands—driven in large part by the opportunity to view unique, threatened species—which contributed 68 percent of the 78 percent growth in GDP of the Galapagos that took place from 1999–2005 (Taylor et al., 2008). Within the United States, direct economic benefits of ecosystem services also are substantial; for example, commercial fisheries provide approximately one million jobs and $32 billion in income nationally (NOAA, 2013). Ecotourism also generates substantial revenues and jobs in the United States—visitors to national parks added $31 billion to the national economy and supported more than 258,000 jobs in 2010 (Stynes, 2011).

Less obviously, there are also systems whose useful lifetimes are cut short by gradual changes in baseline climate. Such systems are experiencing abrupt impacts if they are built to last a certain period of time, and priced such that they can be amortized over that lifetime, but their actual lifetime is artificially shortened by climate change. One example would be a large air conditioning system for computer server rooms. If maximum high temperatures rise faster than planned for, the lifetime of such systems would be cut short, and new systems would need to be installed at added cost to the owner of the servers.
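A simple amortization calculation illustrates the point. The sketch below uses a standard capital-recovery factor; the system cost, discount rate, and lifetimes are hypothetical values chosen only to show how an early, climate-forced retirement raises the annualized cost of the asset.

```python
# Hedged illustration of the "shortened lifetime" point above, using a standard
# capital-recovery (amortization) calculation. All numbers are assumptions made
# for illustration only.

def annualized_cost(capital, rate, years):
    """Annual payment that amortizes `capital` over `years` at discount `rate`."""
    crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
    return capital * crf

capital = 2_000_000   # assumed cost of a large server-room cooling system ($)
rate = 0.05           # assumed discount rate

planned = annualized_cost(capital, rate, 20)   # amortized over a planned 20 years
actual = annualized_cost(capital, rate, 12)    # retired early at 12 years

print(f"Planned annualized cost: ${planned:,.0f}/yr")
print(f"Actual annualized cost:  ${actual:,.0f}/yr "
      f"(+{100 * (actual / planned - 1):.0f}%)")
```

Retiring the system eight years early in this example raises its annualized cost by roughly 40 percent, a loss that falls on the owner even though nothing about the equipment itself changed.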

Another example is storm runoff drains in cities and towns. These systems are sized to handle large storms that precipitate a certain amount of water in a certain period of time. Rare storms, such as a 1000-year event, are typically not considered when choosing the size of pipes and drains, but the largest storms that occur annually up to once per decade or so are considered. As the atmosphere warms and can hold more moisture, the amount of rain per event is increasing (Westra et al., 2013), changing the baseline used to size storm runoff systems, and thus their utility, generally long before the systems are considered to have reached the end of their design lifetimes.
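One way to see the sensitivity is with the rational method, a standard rule of thumb for sizing urban drainage, in which peak runoff is proportional to rainfall intensity. The runoff coefficient, catchment area, and design intensity below are assumed, illustrative values, not figures from any particular design code.

```python
# Sketch of why heavier design storms matter for storm drains, using the
# standard rational method for peak runoff, Q = C * i * A. The specific numbers
# (runoff coefficient, intensity, area) are illustrative assumptions.

def peak_runoff_cfs(c, intensity_in_per_hr, area_acres):
    """Rational method: Q (cfs) ~= C * i (in/hr) * A (acres)."""
    return c * intensity_in_per_hr * area_acres

c = 0.8          # runoff coefficient for a mostly paved urban catchment (assumed)
area = 50        # acres (assumed)
i_design = 2.0   # design-storm intensity, in/hr (assumed, e.g., a 10-year storm)

q_design = peak_runoff_cfs(c, i_design, area)          # what the pipes were sized for
q_future = peak_runoff_cfs(c, i_design * 1.2, area)    # same storm class, 20% more intense

print(f"Design peak flow: {q_design:.0f} cfs; with 20% heavier rainfall: {q_future:.0f} cfs")
# Peak flow scales linearly with intensity, so a 20% increase in rainfall per
# event overloads pipes sized to the old baseline by the same 20%.
```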

Another type of infrastructure problem associated with abrupt change is the infrastructure that does not exist but will need to after an abrupt change. The most glaring example today is the lack of US infrastructure in the Arctic as the Arctic Ocean becomes more and more ice-free in the summer. For example, the United States lacks sufficient icebreakers that can patrol waters that, while seasonally open in many places, will still have extensive wintertime ice cover. Servicing and protecting our activities in this resource-rich region is now a challenge, one that only recently, and abruptly, emerged. This challenge has illustrated a time-scale issue associated with abrupt change: it will take years to rebuild our fleet of icebreakers, but because of the rapid loss of sea ice in 2007 and more recently, the need for these ships is now (NRC, 2007; O’Rourke, 2013).

Coastal Infrastructure

Globally, about 40 percent of the world’s population lives within 100 km of the world’s coasts. While complete inventories are lacking, the accompanying infrastructure—from the obvious, such as roads and buildings, to the less obvious but no less critical, such as underground services (e.g., natural gas and electric lines)—is easily valued in the trillions of dollars, and this does not include ecosystem services such as fresh water supplies, which are threatened as sea level rises. A nearly equal percentage of the US population lives in Coastal Shoreline Counties. In addition, coastal counties are more densely populated than inland ones. The National Coastal Population Report, Population Trends from 1970 to 2020 (NOAA, 2013), reports that coastal county population density is over six times that of inland counties (Figure 3.1). Consequently, the United States has a large amount of physical assets located near coasts and currently vulnerable to sea level rise and storm surges exacerbated by rising seas (see Chapter 2 and especially Box 2.1 for additional discussion of this issue). For example, the National Flood Insurance Program (NFIP) currently has insured assets of $527 billion in the coastal floodplains of the United States, areas that are vulnerable to sea level rise and storm surges.

Nearly half of the US gross domestic product, or GDP, was generated in the Coastal Shoreline Counties along the oceans and Great Lakes (see NOAA State of the Coast). Despite the ongoing rise of sea level, and the frequent, high-profile illustrations of the value and vulnerability of coastal assets at risk, there is no systematic, ongoing, and updated cataloging of coastal assets that are in harm’s way as sea level rises. Overall, there is a need to shift to more holistic planning, investment, and operation for global sea ports (Becker et al., 2013).

Permafrost, or permanently frozen ground, is ubiquitous around the Arctic and sub-Arctic latitudes, the continental interiors of eastern Siberia and Canada, the Tibetan Plateau, and alpine areas. As such, it is a substrate upon which numerous pipelines, buildings, roads, and other infrastructure have been (or could be) built, so long as these structures are properly designed not to thaw the underlying permafrost. For areas underlain by ice-rich permafrost, severe damage to permanent infrastructure can result from settlement of the ground surface as the permafrost thaws (Nelson).

Over the past 40 years, significant losses (>20 percent) in ground load-bearing capacity have been computed for large Arctic population and industrial centers, with the largest decrease to date observed in the Russian city of Nadym where bearing capacity has fallen by more than 40 percent (Streletskiy et al., 2012). Numerous structures have become unsafe in Siberian cities, where the percentage of dangerous buildings ranges from at least 10 percent to as high as 80 percent of building stock in Norilsk, Dikson, Amderma, Pevek, Dudina, Tiksi, Magadan, Chita, and Vorkuta (ACIA, 2005).

The second way in which milder winters and/or deeper snowfall reduce human access to cold landscapes is through reduced viability of winter roads (also called ice roads, snow roads, seasonal roads, or temporary roads). Like permafrost, winter roads are negatively impacted by milder winters and/or deeper snowfall (Hinzman et al., 2005; Prowse et al., 2011). However, the geographic range of their use is much larger, extending to seasonally frozen land and water surfaces well south of the permafrost limit. They are most important in Alaska, Canada, Russia, and Sweden, but are also used to a lesser extent (mainly river and lake crossings) in Finland, Estonia, Norway, and the northern US states. These are seasonal features, used only in winter when the ground and/or water surfaces freeze sufficiently hard to support a given vehicular weight. They are critically important for trucking, construction, resource exploration, community resupply, and other human activities in remote areas. Because the construction cost of a winter road is <1 percent that of a permanent road (e.g., ~$1300/km versus $0.5–1M/km, Smith, 2010), winter roads enable commercial activity in remote northern areas that would otherwise be uneconomic. Since the 1970s, winter road season lengths on the Alaskan North Slope have declined from more than 200 days/year to just over 100 days/year (Hinzman et al., 2005). Based on climate model projections, the world’s eight Arctic countries are all projected to lose significant land areas (losses of 11 percent to 82 percent) currently possessing climates suitable for winter road construction (Figure 3.3), with Canada (400,000 km2) and Russia (618,000 km2) experiencing the greatest losses in absolute land area terms.

Although the prospect of such trans-Arctic routes materializing has attracted considerable media attention (and indeed, 46 vessels transited the Northern Sea Route during the 2012 season), it is important to point out that these routes would operate only in summer, and numerous other non-climatic factors remain to discourage trans-Arctic shipping, including lack of services, infrastructure, and navigation control; poor charts; high insurance and escort costs; the unknown competitive response of the Suez and Panama Canals; and other economic factors.

This section briefly describes several other human health-related impacts—heat waves, vector-borne and zoonotic diseases, and waterborne diseases—but there are others, including potential impacts from reduced air quality, impacts on human health and development, impacts on mental health and stress-related disorders, and impacts on neurological diseases and disorders.

Heat waves cause heat exhaustion, heat cramps, and heat stroke; heat waves are one of the most common causes of weather-related deaths in the United States (USGCRP, 2009). Summertime heat waves will likely become longer, more frequent, more severe, and more relentless, with decreased potential to cool down at night. Increases in heat-related deaths due to climate change are likely to outweigh decreases in deaths from cold snaps (Åström et al., 2013; USGCRP, 2009). In general, heat waves and the associated health issues disproportionately affect more vulnerable populations such as the elderly, children, those with existing cardiovascular and respiratory diseases, and those who are economically disadvantaged or socially isolated (Portier et al., 2010). Increasing temperature and humidity levels can cross thresholds beyond which it is unsafe for individuals to perform heavy labor (a direct physiological limit). Recent work has shown that environmental heat stress has already reduced labor capacity in the tropics and mid-latitudes during peak months of heat stress by 10 percent, and another 10 percent decrease is projected by 2050 (Dunne et al., 2013), with much larger decreases further into the future.

Areas of Concern for Humans from Abrupt Changes

Heavy rainfall and flooding can enhance the spread of water-borne parasites and bacteria, potentially spreading diseases such as cholera, polio, Guinea worm, and schistosomiasis. “Outbreaks of waterborne diseases often occur after a severe precipitation event (rainfall, snowfall). Because climate change increases the severity and frequency of some major precipitation events, communities—especially in the developing world—could be faced with elevated disease burden from waterborne diseases” (Portier et al., 2010).

Vector-borne diseases are those in which an organism carries a pathogen from one host to another. The carrier is often an insect, tick, or mite, and well-known examples include malaria, yellow fever, dengue, murine typhus, West Nile virus, and Lyme disease. Zoonotic diseases are those that are transmitted from animals to humans either by contact with the animals or through vectors that carry zoonotic pathogens from animals to humans; examples include avian flu and H1N1 (swine flu). Changes in climate may shift the geographic ranges of carriers of some diseases. For example, the geographic range of ticks that carry Lyme disease is limited by temperature. As air temperatures rise, the range of these ticks is likely to continue to expand northward (Confalonieri et al., 2007).

National Security

The topic of climate and national security has been addressed in a number of studies, including a recent review entitled Climate and Social Stresses: Implications for Security Analysis (NRC, 2012b), as well as the excellent discussion of this topic by Schwartz and Randall (2003).

 

Conflicts over water issues may become more numerous as droughts become more frequent. In addition, famine and food scarcity have the potential to cause international humanitarian issues and even conflicts, as do health security issues from epidemics and pandemics (also see previous section). These impacts from climate change may present national security challenges through humanitarian crises, disruptive migration events, political instability, and interstate or internal conflict. The impacts on national security are likely to present themselves abruptly, in the sense that the eruption of any crisis represents an abrupt change.

An example of an abrupt change that affects the national infrastructure of a number of countries is the opening of shipping lanes in the Arctic as a result of the retreating sea ice. There are geopolitical ramifications related to possible shipping routes and territorial claims, including potential oil, mineral, and fishing rights.

Rapid or catastrophic methane release from sea-floor or permafrost reservoirs has also been shown to be much less worrisome than first considered possible.

Fast changes in atmospheric methane concentration in ice cores from glacial times were correlated with abrupt climate changes (e.g., Chappellaz et al., 1993). However, subsequent research has revealed that the variations in methane through the glacial cycles (1) originated in large part from low-latitude wetlands, and were not dominated by high-latitude sources that could potentially be much larger, and (2) produced a relatively small radiative forcing relative to the temperature changes, serving as a small feedback to climate changes rather than a primary driver.

Methane was also proposed as the origin of the Paleocene–Eocene Thermal Maximum event, 55 million years ago, in which carbon isotopic compositions of CaCO3 shells in deep-sea sediments reflect the release of some isotopically light carbon source (like methane or organic carbon), and various temperature proxies indicate warming of the deep ocean and hence the Earth’s surface. But the longevity of the warm period has shown that CO2 was the dominant active greenhouse gas, even if methane was one of the important sources of this CO2. Moreover, the carbon isotope spike shows that if the primary release reservoir were methane, the amount of CO2 produced would be insufficient to explain the extent of warming, unless the climate sensitivity of the Earth was much higher then than it is today (Pagani et al., 2006).

The collected understanding of these threats is summarized in Table 4.1. For example, the West Antarctic Ice Sheet (WAIS) is a known unknown, with at least some potential to shed ice at a rate that would in turn raise sea level at a pace that is several times faster than is happening today. If WAIS were to rapidly disintegrate, it would challenge adaptation plans, impact investments into coastal infrastructure, and make rising sea level a much larger problem than it already is now. Other unknowns include the rapid loss of Arctic sea ice and the potential impacts on Northern Hemisphere weather and climate that could potentially come from that shift in the global balance of energy, the widespread extinction of species in marine and terrestrial systems, and the increase in the frequency and intensity of extreme precipitation events and heat waves.

Anticipating the potential for climatically-induced abrupt change in social systems is even more difficult, given that social systems are actually extremely complex systems, the dynamics of which are governed by a network of interactions between people, technology, the environment, and climate. The sheer complexity of such systems makes it difficult to predict how changes in any single part of the network will affect the overall system, but theory indicates that changes in highly-connected nodes of the system have the most potential to propagate and cause abrupt downstream changes. Climate connects to social stability through a wide variety of nodes, including availability of food and water, transportation (for instance, opening Arctic seaways), economics (insurance costs related to extreme weather events or rising sea level, agricultural markets, energy production), ecosystem services (pollination, fisheries), and human health (spread of disease vectors, increasing frequency of abnormally hot days that cause physiological stress). Reaching a climatic threshold that causes rapid change in any one of these arenas therefore has high potential to trigger rapid changes throughout the system.

 

 


How climate change will affect energy production and transportation

[ Climate change and extreme weather will harm oil and gas exploration and production, fuel transport, thermoelectric power generation (coal, natural gas, nuclear, geothermal, concentrated solar power), hydropower, biofuels, wind, solar PV, and the electric grid, and will increase energy demand, via sea level rise, heat, drought, floods, more storms, and more extreme weather.  Extreme heat and population growth will exacerbate these problems, as demand increases at times when energy production needs to decrease due to lack of cooling water (drought) and so on.

Alice Friedemann   www.energyskeptic.com  author of “When Trucks Stop Running: Energy and the Future of Transportation,” 2015, Springer]

USDOE. July 2013.  U.S. Energy Sector Vulnerabilities to Climate Change and Extreme Weather. U. S. Department of Energy. 80 Pages. 

Summary.  Natural disasters and climate change are already affecting our ability to produce and deliver energy from oil, natural gas, and coal. Climate change will make matters worse:

  1. Energy infrastructure is at or past its design lifetime, yet it is expected to operate under conditions it wasn’t designed for.
  2. Heat, drought, and floods reduce power output for both fossil fuel and renewable energy generation. Heat increases wildfires, which reduce power output.
  3. Energy infrastructure along the coast is at risk from sea level rise, increasing intensity of storms, and higher storm surge and flooding, potentially disrupting oil and gas production, refining, and distribution, as well as electricity generation and distribution. Sea level rise will flood roads and rail lines, halting receipt or delivery from ships at ports.
  4. We need 40% more power generation, oil, coal, and natural gas for the 442 million people the U.S. Census Bureau predicted in 2012 may be here in 2060.
  5. Competition for water: Electric generation plants use 13% of water in the USA. Drought can shut power plants down. Agriculture will need even more water as temperatures rise.
  6. Longer and more extreme droughts will lower lake and river levels, making barge delivery of coal, ethanol, food, etc., more difficult or impossible.
  7. Thawing permafrost in the Arctic will make oil drilling and exploration more difficult, and may ruin the Alaska pipeline.
  8. Oil and gas production is vulnerable to decreasing water availability, since large volumes of water are needed for enhanced oil recovery, hydraulic fracturing, and refining.

Examples of Impacts from extreme storms, floods, and sea level rise 

Figure 1. Selected events over the last decade illustrate the U.S. energy sector’s vulnerabilities to climatic conditions.

Figure 1 illustrates some of the many ways in which the U.S. energy sector has recently been affected by climatic conditions. These types of events may become more frequent and intense in future decades.

#21 February 2013: Over 660,000 customers lost power across 8 states in the Northeast after a winter storm brought snow, heavy winds, and coastal flooding, resulting in significant damage to the electric transmission system

#26 July 2011: ExxonMobil’s Silvertip pipeline, buried beneath the Yellowstone River in Montana, was torn apart by flood-caused debris, spilling oil into the river and disrupting crude oil transport in the region.

#27 June 2011: Missouri River floodwaters surrounded Fort Calhoun Nuclear Power plant in Nebraska. The plant was closed all summer due to persistent flood waters

#28 May 2011: Nearly 20% of barge terminals along the Ohio River were closed due to flooding, impacting coal and petroleum transport. Flooding along the Ohio and Mississippi rivers threatened oil refineries and infrastructure from Tennessee to Louisiana

#29 2005: Hurricanes Katrina & Rita inflicted significant damage on the Gulf Coast, destroying 115 offshore platforms and damaging 52 others, damaging 535 pipeline segments, which caused a near-total shutdown of the Gulf’s offshore oil and gas production for several weeks. Nine months after the hurricanes, 22% of oil production and 13% of gas production remained shut-in, equating to the loss of 150 million barrels of oil and 730 billion cubic feet of gas from domestic supplies

Climate change and implications for the energy sector

Table ES-1. Climate change and implications for the energy sector   

Oil and gas exploration and production

Climate change: 1) Thawing permafrost in Arctic Alaska, 2) decreasing water availability, 3) Increasing intensity of storms, sea level rise, and storm surge

Impact: 1) Damaged infrastructure, 2) impacts on drilling, production, and refining, 3) increased risk of physical damage and disruption to offshore and coastal facilities

Fuel transport

Climate change: reduction in river levels, increasing intensity and frequency of floods

Impact: disruption of barge transport of crude oil, petroleum products, and coal

Electric power generation (Coal, natural gas, nuclear, geothermal and solar CSP)

Climate change: 1) increasing heatwaves 2) increasing water temperatures 3) decreasing water availability 4) increasing intensity of storms, sea level rise, and storm surge 5) increasing intensity and frequency of flooding

Impact: 1) reduction in plant efficiency and generation capacity, 2) impacts on coal, natural gas, and nuclear supply chains, 3) physical damage and disruption of coastal facilities

Hydropower

Climate Change: 1) increasing temperatures and evaporation losses 2) changes in precipitation and decreasing snowpack 3) increasing intensity and frequency of flooding

Impact: 1) reduction in generation capacity 2) physical damage

Bioenergy and biofuel production

Climate change: 1) increasing heat 2) decreasing water, sea level rise and increasing intensity and frequency of flooding

Impact: increased irrigation demand and risk of crop damage from extreme heat, risk of decreased production

Interdependencies (risk of cascading failures)

The nexus of energy, water, and land systems:

Moving and treating water accounts for 4% of total electricity consumption in the United States; if energy used by end users of water is included, the figure rises to 13% of total primary energy consumption.

  • Energy needs Water for fuel processing, cooling of power plants, and fracking.
  • Water needs Energy to be pumped, transported, and treated.
  • Land needs Water for agriculture, ecosystems, homes, and businesses.
  • Water needs Land for storage and ground cover vegetation.

Agriculture and energy compete for water. Irrigation requires energy to pump water, and energy systems need water for cooling.

  • Land needs Energy for transportation, agriculture, commercial and residential uses
  • Energy needs Land for bioenergy crops, energy infrastructure such as dams, mines, power plants, power lines, pipelines, and refineries

Transportation uses energy, and energy resources (coal, ethanol, etc.) need transportation to reach power plants and end users.

  • Communication systems need electricity.
  • The electric grid needs communications systems to monitor itself (i.e. smart grid).

Most countries, including those the United States imports electricity and fuels from, will face similar impacts, which may in turn affect U.S. energy security.

Disruptions in one energy sector can lead to cascading failures, as happened after Hurricane Sandy, which caused a storm surge in New York Harbor 9 feet above average high tide and power outages affecting 8 million customers in 21 states. The outages shut down gasoline fuel pumps and two oil refineries, and another four refineries had to reduce output. Ports and several power plants in the Northeast, including nuclear power units, petroleum/natural gas refineries and pipelines, and petroleum terminals, were either damaged or had temporary shutdowns from high winds and floods.

Wildfires

  • The wildfire season has increased by nearly 80 days in the past 30 years.
  • The average length of large fires has increased nearly five-fold, from 7.5 days to 37 days, and their size has also increased
  • Climate change is expected to increase the frequency, intensity, and total acres burned by wildfires, especially in Alaska and parts of the West
  • Warmer temperatures and drought stress forests and make stands vulnerable to mortality from pest infestations such as the pine beetle, increasing wildfire risk

Arctic oil and gas

Oil and gas in Arctic Alaska are very vulnerable to climate change because temperatures in the Arctic are increasing twice as fast as the global average. The region may have up to 90 billion barrels of oil, 1,669 trillion cubic feet of natural gas, and 44 billion barrels of natural gas liquids, about 22% of the world’s undiscovered oil and gas resources.

Climate change will make it difficult to get at oil and gas in the Arctic because:

  1. Thawing permafrost could damage oil and gas infrastructure
  2. Risks to onshore fossil fuel development could include the loss of access roads built on permafrost, inability to create new roads, problems due to frost heave and settlement of pipelines set on pilings or buried in permafrost, and reduced load-bearing capacity of buildings and structures
  3. The trans-Alaska oil pipeline was constructed with thousands of pipes that remove heat from the permafrost; these are having problems caused by increasing temperatures.
  4. Drilling wastes rely on the permafrost to prevent subsurface movement of the wastes into the environment; thawing permafrost could require alternative waste disposal methods.
  5. To protect the tundra, the Alaska Department of Natural Resources limits the amount of travel on the tundra, and over the past 30 years, the number of days when travel is permitted has dropped from more than 200 to 100, reducing the number of days that oil and gas exploration and extraction equipment can be used
  6. Reduced sea ice coverage limits ice-based infrastructure and transportation
  7. Sea ice melting can result in more icebergs, a risk to oil and gas operations in the Arctic since icebergs can damage rigs and vessels
  8. Climate change may increase the frequency of polar storms further disrupting drilling, production, and transportation
  9. In addition to the thawing of permafrost, other risks could increase, including lightning strikes, tundra fire, storm surge, and coastal erosion

Climate change effect on power generation

Higher air and water temperatures plus drought are likely to reduce electricity generation capacity during the summer months. For example, the average summer capacity at thermoelectric power plants by 2031–2060 is projected to decrease by between 4.4% and 16%, depending on climate scenario, water availability, and cooling system type, as compared to the end of the 20th century

Hydropower: Increasing temperatures increase evaporative water losses and water use, decreasing water availability for hydropower, and heat intensifies stratification of reservoirs behind dams, depleting dissolved oxygen in reservoirs and downstream, and degrading habitat for fish and other wildlife.

Agriculture. Increasing temperatures will increase evapotranspiration (ET) rates, which increases water demand; if increased water demand is not met by increased irrigation (or rain), the increased ET rates could reduce average crop yields. Extreme heat could damage crops, and extended periods of drought could destroy entire harvests.

Wind Energy.  Average annual wind speeds in the United States could decrease by 1 to 3% by 2050, and by as much as 3 to 14% in the Northwest.

Solar Energy.  Increasing temperatures could reduce the potential generation capacity of solar PV by 6%. Photovoltaic (PV) output and efficiency could be lowered by hotter temperatures and by changes in cloud cover, haze, and humidity.

Electric Grid The U.S. electric grid is a large and complex system that consists of more than 9,200 electric generating units with more than 1,000 GW of generating capacity connected to more than 300,000 miles of transmission lines. Hotter temperatures will affect the grid significantly in many ways:

  1. Increasing temperatures are expected to increase transmission losses (a rough sketch of the underlying physics follows this list)
  2. Reduce the current-carrying capacity of transmission lines
  3. Increase stresses on the distribution system
  4. Decrease substation and transmission efficiency and lifespan
  5. About 7% of power is currently lost in transmission and distribution, and even more will be lost as temperatures increase
  6. The effects of high temperatures are exacerbated when wind speeds are low or nighttime temperatures are high, preventing transmission lines from cooling. Nighttime temperatures have been increasing at a faster rate than daytime temperatures, and they are expected to continue to increase
  7. System transmission losses during a heat wave could be significant and contribute to electric power interruptions and power outages
  8. Hot temperatures cause overhead transmission lines to sag due to thermal expansion, which can cause fires and power outages when lines contact trees or the ground
  9. More frequent and severe wildfires increase the risk of physical damage to electricity transmission infrastructure and decrease available transmission capacity. The heat, smoke, and particulate matter also lower the capacity of a transmission line. The soot accumulates on the insulators that attach transmission lines to towers, causing leakage currents, and ionized air in the smoke acts as a conductor, causing arcing between lines, which can cause an outage. Fire retardant can foul transmission lines. The probability of exposure to wildfires for some lines in California is projected to increase by 40% by the end of the century.
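
On item 1, the link between hotter conductors and higher line losses is basic resistive physics: conductor resistance rises roughly linearly with temperature, and resistive loss scales as I²R. The Python sketch below is a generic illustration only; the aluminum temperature coefficient (~0.0039 per °C) is a textbook value, and the line length, resistance, and current are assumed example figures, not numbers from this report.

```python
# Generic illustration of why hotter conductors mean higher I^2*R line losses.
# Assumed example values (not from this report): aluminum temperature
# coefficient ~0.0039 per deg C, a 100 km line at 0.08 ohm/km (20 C), 500 A load.

ALPHA_AL = 0.0039          # 1/degC, approximate for aluminum conductors
R_PER_KM_20C = 0.08        # ohm/km at 20 C (assumed example line)
LINE_KM = 100.0
CURRENT_A = 500.0

def line_loss_mw(conductor_temp_c):
    """Resistive loss (MW) of the example line at a given conductor temperature."""
    resistance = R_PER_KM_20C * LINE_KM * (1 + ALPHA_AL * (conductor_temp_c - 20.0))
    return CURRENT_A**2 * resistance / 1e6

for temp in (40, 60, 80):
    extra = 100 * (line_loss_mw(temp) / line_loss_mw(40) - 1)
    print(f"{temp} C conductor: {line_loss_mw(temp):.2f} MW lost ({extra:+.1f}% vs 40 C)")
```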

One study estimates that 34 GW of additional generating capacity will need to be constructed in the western region alone by 2050 to meet the increased peak load due to higher temperatures. Add in another 22% more power for the population growth expected by 2035, and at least $45 billion would need to be spent in western states alone.

Water & Drought

The largest declines in rainfall from climate change are expected in the summer. In western states, snow water equivalent declined by as much as 75% between 1950 and 2000, and snow is melting earlier.

Droughts have become more common and widespread over the past 40 years in the Southwest, southern Great Plains, and Southeast.

In the Midwest, evaporation rates are projected to increase, water levels decrease, and the frequency, intensity, and duration of droughts are likely to increase.

This combination of more intense droughts and reduced summer rain and stream flows may substantially impact water availability in the summer.

Groundwater depletion is occurring across the United States, including in the High Plains (the location of the Ogallala aquifer) and in the California Central Valley. Future impacts on groundwater resources will result from a combination of changes in precipitation patterns, increases in evaporation rates, increases in droughts, and increasing competition for water among various sectors (e.g., energy, agriculture, industry, and residential). These impacts are expected to continue to decrease groundwater availability, particularly in the central and western regions, as heavily utilized aquifers experience reduced recharge rates.

Implications for the Energy Sector. Decreasing water supplies directly impacts nearly all aspects of energy supply: how electricity is produced; where future capacity may be sited; the cost of producing electricity; the types of generation or cooling technologies that are cost-effective; and the costs and methods for extracting, producing, and delivering fuels. Limited water available for cooling at thermoelectric facilities can affect power plant utilization. Increased evaporation rates or changes in snowpack may affect the volume and timing of water available for hydropower. Decreased water availability can affect bioenergy production. In regions where water is already scarce, competition for water between energy production and other uses will also increase. Future conditions will stress energy production infrastructure in all regions—particularly those with the most water-intensive generation portfolios.

Oil and Gas Exploration and Production. Water is required in many different stages of the oil and gas value chain, from exploration to processing to transport, with the largest volume of water used in the refining process. In exploration and production processes, the largest volume of water is used as a supplemental fluid in the enhanced recovery of petroleum resources. Water is required to a lesser extent for other activities, including drilling and completion of oil or gas wells; work-over of an oil or gas well; creation of underground hydrocarbon storage caverns through solution mining of salt formations; as gas plant cooling and boiler water; as hydrostatic test water for pipelines and tanks; as rig wash water; and as coolant for internal combustion engines for rigs, compressors, and other equipment. Water is not only used in conventional oil and gas exploration and production, but significant volumes of impaired water are produced in the process. This produced water is the largest-volume by-product associated with oil and gas exploration and production. The total volume of produced water in 2007 was estimated to be 21 billion barrels, or 2.4 billion gallons per day. More than 98% of this produced water is injected underground: approximately 59% is injected into producing formations to enhance production and about 40% is injected into non-producing formations for disposal.

Coal and uranium mining. Water is needed for mining operations. Mining tailings can harm surface and groundwater quality.

Coal slurry pipelines.  Water is used to transport the slurry, which harms water quality.

Coal-bed methane, tight gas sands, shale oil and gas use huge amounts of water and contaminate it.

Decreasing water availability impact on oil refining. Conventional oil refining requires 0.5 to 2.5 gallons of water per gallon of gasoline equivalent. Additional water may be consumed if reforming and hydrogenation steps are required.

Fuel Transport. Decreased water levels in rivers and ports can cause interruptions and delays in barge and other fuel delivery transportation routes. Crude oil and petroleum products are transported by rail, barge systems, pipelines, and tanker trucks. Coal is transported by rail, barge, truck, and pipeline. Corn-based ethanol, blended with gasoline, is largely shipped by rail, while bioenergy feedstock transport relies on barge, rail, and truck freight. A complex web of crude oil and petroleum product pipelines delivers petroleum from domestic oil fields and import terminals to refineries and from refineries to consumption centers across the United States. The shale oil revolution in areas such as the Bakken in North Dakota and Montana will likely increase barge traffic, with crude oil being transported by barge along the Missouri River.

Reductions in river levels could impede barge transport of crude oil, petroleum products, and coal, resulting in delivery delays and increased costs. In August 2012, the U.S. Army Corps of Engineers reported groundings of traffic along the Mississippi River due to low water depths from drought. This disrupted the transportation of commodities delivered by barges, including coal and petroleum products. Petroleum exports through New Orleans were valued at about $1.5 billion per month in 2012. When river levels decrease, barge operators reduce their loads. A tow (chain of barges pulled or pushed as a group) on the upper Mississippi, Illinois, and Ohio rivers typically has 15 barges, each capable of carrying more than 1,000 tons. A one-inch (2.5 cm) drop in river level can reduce tow capacity by 255 tons. Likewise, the typical tow on the lower Mississippi has 30–45 barges, resulting in decreased capacity of up to 765 tons for just a one-inch decrease in river level.
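
The tow figures above imply roughly 17 tons of lost cargo per barge for each one-inch drop in river level (255 tons divided by 15 barges). A minimal sketch of that arithmetic, using only the numbers quoted in the paragraph:

```python
# Sketch of the barge-capacity arithmetic quoted above: a one-inch drop in
# river level costs roughly 17 tons of cargo per barge (255 tons / 15 barges).

TONS_PER_BARGE_PER_INCH = 255 / 15   # ~17 tons, derived from the figures above

def tow_capacity_loss(num_barges, inches_of_drop):
    """Approximate cargo capacity lost (tons) for a tow of num_barges."""
    return num_barges * TONS_PER_BARGE_PER_INCH * inches_of_drop

print(tow_capacity_loss(15, 1))   # upper Mississippi tow: 255 tons per inch
print(tow_capacity_loss(45, 1))   # large lower Mississippi tow: 765 tons per inch
```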

Most coal in the United States is mined in 3 regions: Appalachia, the Midwest, and  western states. Barges carry 11% of U.S. coal to power plants. According to the EIA, 63% of coal production is projected to originate from western states by 2030 compared to 54% in 2011, meaning an even larger share of coal produced would be transported long distances. Continued transportation of fossil fuels by barge would make coal power plants vulnerable to supply chain failure from drought-reduced river levels in the future.

Electric power generation uses the largest amount of freshwater in the United States

  • 200 billion gallons per day, or 40% of all freshwater withdrawals.
  • 90% of thermoelectric power generation in the United States requires water for cooling. Low flow conditions in rivers and low lake levels—due to drought, increased evaporation, or changes in precipitation and runoff patterns—pose an operational risk to thermoelectric facilities
  • About 25% of electric generation is in counties projected to be at high or moderate water supply sustainability risk in 2030.
  • 350 coal-fired power plants in the United States (out of 580) are located in areas subject to water stress (i.e., limited water supply and/or competing water demand from other sectors).
  • Increasing power needs for the growing U.S. population could increase thermoelectric water consumption by as much as 27% by 2035
  • Carbon capture and storage (CCS) technologies would increase water consumption; coal and natural gas plants with CCS consume roughly twice as much water as plants without it

Water and Coal Power Generation.  Decreasing water availability could affect the coal supply chain. Coal provides over 40% of the electric power generated in the United States and uses water for many stages, from extraction to processing and transport. Coal can be mined from deep underground caverns, surface pits, or mountaintops. Coal mining processes can use significant amounts of water: an estimated 70–260 million gallons of water per day, or approximately 50–59 gallons of water for every short ton of coal mined.  Water is used at several different stages, including for cooling or lubricating cutting and drilling equipment, dust suppression, fuel processing, and revegetation when mining and extraction are complete. One short ton of coal generates about 1,870 kWh of electricity. Depending on its quality, coal may need to be “washed” with water and chemicals to remove sulfur and impurities before it can be burned in a power plant.
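
Combining two of the figures above gives a rough water intensity for coal mining per unit of electricity: at 50–59 gallons per short ton and about 1,870 kWh per ton, mining alone works out to roughly 27–32 gallons per MWh (plant cooling water, at hundreds of gallons per MWh, is a separate and much larger use). A minimal sketch of that derivation:

```python
# Rough derivation from the figures above: gallons of water used in coal
# MINING per MWh of electricity eventually generated. Plant cooling water
# (hundreds of gal/MWh) is a separate, much larger use and is not included.

GAL_PER_TON_LOW, GAL_PER_TON_HIGH = 50, 59   # mining water per short ton
KWH_PER_TON = 1870                           # electricity per short ton of coal

def mining_gal_per_mwh(gal_per_ton):
    return gal_per_ton / (KWH_PER_TON / 1000)

print(f"Mining water intensity: {mining_gal_per_mwh(GAL_PER_TON_LOW):.0f}-"
      f"{mining_gal_per_mwh(GAL_PER_TON_HIGH):.0f} gallons per MWh")
```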

Water and Nuclear Power.   Nuclear energy provides about 20% of the electricity in the United States. Over the last decade, U.S. uranium mines have supplied less than 10% of the uranium fuel powering the nuclear fleet, with the rest imported. Water use for uranium mining has traditionally been comparable to the estimates for underground and surface coal mining: between one and six gallons per million British thermal units (MMBtu). Uranium fuel processing requires additional water (45 to 150 gallons per MWh).

Water and Renewable Energy Resources. The water demand associated with renewable energy technologies varies significantly. Water consumption for thermoelectric power generation based on solar CSP plants or geothermal technologies using once-through or recirculating cooling can be comparable to, or even greater than, that of fossil or nuclear thermoelectric power plants.

While photovoltaic (PV) power generation consumes little water, Concentrating Solar Power (CSP) uses steam generation and water cooling, requires significant volumes of water, and consumes more water than a natural gas, coal-fired, or nuclear power plant. A typical parabolic trough CSP plant with recirculating cooling uses over 800 gal/MWh, most of it for cooling, with less than 2% for mirror washing. These values compare to less than 700 gal/MWh for a nuclear power plant, 500 gal/MWh for a supercritical coal-fired power plant, and 200 gal/MWh for a combined cycle natural gas plant.
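
To put those gal/MWh figures in context, the sketch below scales them to a hypothetical 100 MW plant over a year. Annual totals depend on hours of operation as well as water intensity; the capacity factors used here are illustrative assumptions, not values from the text.

```python
# Illustrative comparison of annual cooling-water use, scaling the gal/MWh
# figures quoted above to a hypothetical 100 MW plant. The capacity factors
# are assumptions for illustration only, not figures from the text.

WATER_GAL_PER_MWH = {"CSP trough (recirculating)": 800, "Nuclear": 700,
                     "Supercritical coal": 500, "NG combined cycle": 200}
CAPACITY_FACTOR = {"CSP trough (recirculating)": 0.25, "Nuclear": 0.85,
                   "Supercritical coal": 0.85, "NG combined cycle": 0.55}
PLANT_MW = 100

for tech, gal_per_mwh in WATER_GAL_PER_MWH.items():
    mwh_per_year = PLANT_MW * 8760 * CAPACITY_FACTOR[tech]
    million_gal = gal_per_mwh * mwh_per_year / 1e6
    print(f"{tech:28s} ~{million_gal:,.0f} million gallons/year")
```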

Hydropower. Changing precipitation and decreasing snowpack could decrease available hydropower generation capacity. Higher temperatures, less snowpack, and decreasing water availability have reduced the Colorado River’s flow and left Lake Mead more than 100 feet (30 meters) below full storage capacity.  Hoover Dam loses 5–6 MW of capacity for every foot (0.3 meter) decline in Lake Mead, because at lower water levels there is less water pressure to drive the turbines as well as a greater potential for air bubbles to form and flow through with the water causing the turbines to lose efficiency. Studies on the effects of stream flow on available hydropower generation in the Colorado River Basin suggest that for each 1% decrease in stream flow, power generation decreases by 3%. For several California rivers, summer hydropower potential is projected to decrease 25% because runoff is projected to occur two weeks earlier under a climate scenario of 3.6°F warming.
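
A minimal sketch of the two hydropower sensitivities quoted above: Hoover Dam's roughly 5–6 MW of capacity lost per foot of Lake Mead decline, and the approximately 3-to-1 ratio of generation loss to stream-flow loss in the Colorado River Basin. The ~2,080 MW nameplate capacity used for context is an assumed reference value, not a figure from the text.

```python
# Sketch of the two hydropower sensitivities quoted above.
# Assumption (not from the text): Hoover Dam nameplate capacity ~2,080 MW.

MW_LOST_PER_FOOT = (5, 6)        # capacity lost per foot of Lake Mead decline
GEN_LOSS_PER_PCT_FLOW = 3.0      # % generation lost per 1% stream-flow decrease
HOOVER_NAMEPLATE_MW = 2080       # assumed reference value for context

def mead_capacity_loss(feet_of_decline):
    """(low, high) MW of Hoover Dam capacity lost for a given lake-level drop."""
    return tuple(rate * feet_of_decline for rate in MW_LOST_PER_FOOT)

low, high = mead_capacity_loss(100)   # e.g., the >100-foot decline cited above
print(f"100-ft Lake Mead decline: {low}-{high} MW lost "
      f"(~{100 * high / HOOVER_NAMEPLATE_MW:.0f}% of nameplate)")
print(f"10% flow decrease -> ~{GEN_LOSS_PER_PCT_FLOW * 10:.0f}% less generation")
```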

Bioenergy and Biofuel Production. Changes in precipitation and runoff may affect bioenergy production. Drought and other changes in the hydrologic cycle may diminish feedstock production efficiency for both traditional and second-generation bioenergy. Increasing competition for water, particularly in times when (and locations where) water is scarce, will affect energy and food production alike. On average, producing one gallon of corn ethanol requires 17–239 gallons of water for irrigation and conversion. One study found that 520–3,281 gallons of freshwater is currently required to produce one gallon of biodiesel from microalgae.

Increasing Storms, Flooding, and Sea Level Rise

  • Increasing intensity of storm events, sea level rise, and storm surge put coastal and offshore oil and gas facilities at increased risk of damage or disruption
  • Increasing intensity of storm events increases the risk of damage to electric transmission and distribution lines.
  • Increasing intensity of storm events, sea level rise, and storm surge poses a risk to coastal thermoelectric facilities, while increasing intensity and frequency of flooding poses a risk to inland thermoelectric facilities.
  • Increasing intensity and frequency of flooding increases the risk to rail and barge transport of crude oil, petroleum products, and coal.

As atmospheric temperatures increase, so does the water-holding capacity of the air—generally by about 7% per 1.8°F increase in temperature. As a result, rainstorms become more intense and a greater fraction of precipitation falls during heavy rainfall events, increasing flooding risk. Recent projections indicate that globally, the heaviest precipitation events are likely to occur twice as frequently as they do today by the end of the century. In the United States, high-rainfall events which today occur once every 20 years may occur once every 4-15 years by 2100, depending on location. Such events are also expected to become more intense, with 10%–25% more precipitation falling in the heaviest events. The greatest increases are expected in parts of the Northeast, Midwest, Northwest, and Alaska.
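
The roughly 7% per 1.8°F (i.e., per 1°C) figure is the Clausius–Clapeyron scaling of the atmosphere's saturation vapor pressure; compounding it over several degrees of warming shows why the heaviest rain events intensify. A minimal sketch (the warming amounts below are illustrative, not projections from the text):

```python
# Clausius-Clapeyron-style scaling sketch: the air's water-holding capacity
# rises roughly 7% per degree C (1.8 F) of warming. The warming amounts
# below are illustrative only, not projections from the text.

RATE_PER_DEG_C = 0.07

def moisture_capacity_increase_pct(warming_c):
    """Approximate % increase in water-holding capacity after a given warming."""
    return ((1 + RATE_PER_DEG_C) ** warming_c - 1) * 100

for delta_t in (1, 2, 4):
    print(f"+{delta_t} C warming -> ~{moisture_capacity_increase_pct(delta_t):.0f}% "
          f"more moisture-holding capacity")
```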

Changes in the timing and amount of precipitation consequently shift the frequency, intensity, and duration of floods

Since the 1970s, the intensity of hurricanes and tropical storms has increased. According to the IPCC, the frequency of Category 4 and 5 hurricanes is expected to increase by 80% in 2081–2100 compared to the present.

Sea Level Rise

Globally, absolute sea level rose at an average rate of 0.07 inches per year from 1880 to 2011, but from 1993 to 2011 the average rate was 0.11–0.13 inches per year. The rate of global sea level rise over the last 20 years is roughly double the rate observed over the last century. Sea level is projected to rise at an even faster rate over the rest of this century.
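
A quick unit check on those rates: 0.07 inches per year is about 7 inches (roughly 18 cm) per century, while 0.11–0.13 inches per year is roughly 11–13 inches (28–33 cm) per century, which is why the recent rate is described as roughly double the long-term rate. A minimal sketch of the conversion:

```python
# Unit-conversion sketch for the sea-level-rise rates quoted above.

IN_TO_CM = 2.54

def per_century(inches_per_year):
    """Convert a rate in inches/year to (inches, cm) per century."""
    inches = inches_per_year * 100
    return inches, inches * IN_TO_CM

for rate in (0.07, 0.11, 0.13):
    inches, cm = per_century(rate)
    print(f"{rate} in/yr -> about {inches:.0f} in (~{cm:.0f} cm) per century")

print(f"Recent vs. long-term rate: about {0.13 / 0.07:.1f}x")
```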

In coastal areas, storm events combined with sea level rise will contribute to greater storm surge impacts, increasing over time as both storm intensity and sea level rise increase

Sea level rise will exacerbate existing vulnerabilities to hurricanes and storm surge because storms damage wetlands and other natural and man-made features that help protect coastal infrastructure from flooding and storm surge.

The second-costliest year for weather and climate disasters in the United States was 2012, with estimated damage of approximately $115 billion (NOAA 2013a). These events include severe weather and tornados, tropical storms, droughts, and wildfires. The two major drivers of damage costs in 2012 were Hurricane Sandy ($65 billion) and an extended drought ($30 billion).

Heavy rainfall and flood events in the Midwest and Northeast threaten inland facilities and infrastructure and may impede the transportation of coal to power plants. More intense hurricanes pose a particular risk to ports and energy infrastructure in coastal regions.

The Gulf Coast region exemplifies the high-volume, high-value, complex system of resources, infrastructure, and transportation networks required to convert raw materials such as natural gas and crude oil into fuels. With nearly 4,000 active oil and gas platforms, over 30 refineries, and 25,000 miles of pipeline, the Gulf region produces approximately 50% of U.S. crude oil and natural gas and accounts for about half of U.S. refining capacity; it also hosts the U.S. Strategic Petroleum Reserve, which holds about 700 million barrels of crude oil.

Increasing intensity of storm events, sea level rise, and storm surge put coastal and offshore oil and gas facilities at increased risk of damage or disruption. In 2005, Hurricanes Katrina and Rita shut down or damaged hundreds of oil drilling and production platforms and offshore drilling units, and damaged 457 offshore oil and gas pipelines and significantly damaged onshore oil refining, gas processing, and pipeline facilities, which impacted oil and gas production for months.  As energy sector development in the Gulf Coast has proceeded over the last 50 years, including the deployment of deep-water rigs costing as much as a billion dollars, the potential for significant damage from storm events in the region has increased. In addition to causing physical damage to energy infrastructure, an increase in the intensity of storms can decrease fuel supplies. Storm-related disruptions to extraction, processing, refining, and generation also cause losses for downstream businesses and industries.

Business Interruption Costs. The economic impacts of combined sea level rise and storm surge damages to the energy industry in the Gulf region could average $8 billion per year by 2030.   Increasing intensity of storm events, sea level rise, and storm surge could impact oil storage facilities and operations.

Fuel Transport

  1. More frequent heavy rainfall events will increase flood risk across the United States, particularly in the Northeast and Midwest.
  2. Increased frequency and intensity of flooding will affect water levels in rivers and ports and could wash out rail lines.
  3. Flooding events could also cause interruptions and delays in fuel and petrochemical feedstock deliveries.
  4. Increasing intensity and frequency of flooding increases the risk to rail and barge transport of crude oil, petroleum products, and coal.
  5. Intense storms and flooding can impede barge travel and wash out rail lines, which in many regions follow riverbeds.
  6. Flooding of rail lines has already been a problem both in the Appalachian region and along the Mississippi River. In 2011, severe flooding throughout the Powder River Basin disrupted trains. Rerouting of trains due to flooding can cost millions of dollars and delay coal deliveries. Approximately 71% of the nation’s coal is transported by rail lines, with the remainder transported by barge, truck, and pipeline.
  7. As heavy precipitation events become more frequent and the risk of flooding increases, so will the risk of disruptions to coal deliveries. Delivery disruptions could, in turn, interrupt electricity generation at some power plants.
  8. The amount of crude oil and petroleum products transported by U.S. railways during the first half of 2012 increased by 38% over the same period in 2011. Although the majority of oil is transported by pipeline, railroads play an increasingly important role in transporting U.S. crude oil to refineries. This is especially true for North Dakota’s Bakken formation, which has limited pipeline infrastructure. The formation has more than tripled oil production in the last 3 years, making North Dakota the second-largest oil-producing state in the United States.
  9. The United States produces and transports more than one billion short tons of coal every year. While coal is produced in 25 states, the Powder River Basin, largely in Wyoming, accounted for 468 million tons of production in 2010, or 43% of U.S. coal production

Thermoelectric Power Generation

Numerous thermoelectric power plants line the coasts of the United States.

  • 10% are nuclear reactors
  • 15% are coal-fired plants
  • 75% are oil or natural gas-fired plants.

Many inland thermoelectric power plants are located in low-lying areas or flood plains. Increasing intensity of storm events, sea level rise, and storm surge poses a risk to coastal thermoelectric facilities.

The Atlantic Coast from Hampton Roads, Virginia, and further north, and the Gulf Coast are considered to be particularly vulnerable to sea level rise because the land is relatively flat and, in some places, subsiding.

An increase in sea level of 2 feet would affect more than 60% of the port facilities on the Gulf Coast.

An increase of 4 feet would affect 75% of port facilities.

In addition, assuming higher range projections for sea level rise combined with future 100-year floods in California, up to 25 thermoelectric power plants could be flooded by the end of the century, as well as scores of electricity substations and natural gas storage facilities.

Increasing intensity and frequency of flooding poses a risk to inland thermoelectric facilities. The intake structures, buildings, and other infrastructure at thermoelectric generation facilities that draw cooling water from rivers are vulnerable to flooding and, in some cases, storm surge.

Renewable Energy Resources

Increasing intensity and frequency of flooding could impact the operation of hydropower facilities in some regions. Flooding has the potential to increase river flows and hydropower generation. In extreme cases, floods can prove destructive to dams. The large sediment and debris loads carried by floodwaters can block dam spillways, and powerful masses of water can damage important structural components

Sea level rise and increasing intensity and frequency of flooding could inhibit bioenergy production. In 2008, major corn producing states in the upper Midwest experienced extreme flooding due to heavy rainfalls over an extended period of weeks. This flooding affected early-season planting operations. In coastal agricultural regions, sea level rise and associated saltwater intrusion and storm surge flooding can harm crops through diminished soil aeration, salinization, and direct damage.

Electric Grid.  Increasing intensity of storm events increases the risk of damage to electric transmission and distribution lines.

Strong winds associated with severe storms, including tropical storms and hurricanes, can be particularly damaging to energy infrastructure and result in major outages. In addition, heavy snowfall and snowstorms, which have increased in frequency in the Northeast and upper Midwest, and decreased in frequency in the South and southern Midwest, can also damage and disrupt electricity transmission and distribution.

Costs from Power Outages. A Congressional Research Service report estimates that storm-related power outages cost the U.S. economy $20–$55 billion annually. Whether from aging infrastructure, increasing development, or increasing storm intensity and frequency, outages from weather-related events are increasing.

A study that looked only at Texas, Louisiana, and coastal communities in Mississippi and Alabama predicted that by 2030 nearly $1 trillion in energy assets could be at risk from rising sea levels and more intense hurricanes.

Wildfire trends: In 2012, more than 9.2 million acres burned nationwide, with fires setting records in many states both in terms of acres burned (e.g., New Mexico) and economic damages (e.g., Colorado). The wildfire activity of 2012 supplanted 2011 as the year with the 3rd most acres burned, behind 2006 and 2007. Although the number of fires was below average, the size of the fires notably increased. In the western states, the wildfire season has increased by nearly 80 days during the past 3 decades, and the average duration of large fires has increased nearly fivefold, from 7.5 days to 37 days. These increases are attributed both to changes in forest management practices and to increasing temperatures coupled with earlier spring snowmelt, drying soils, and vegetation.

Projected changes: The frequency of wildfires is projected to increase in some parts of the United States, particularly Alaska and parts of the West. Annual mean area burned in the western United States is projected to increase by 54% by the 2050s compared to the present day, and by as much as 175% in the Pacific Northwest.

Cloud Cover. Historic trends: Cloud cover data from more than 100 stations indicate that, from 1970–2004, total cloud cover increased by approximately 1.4%. Increases occurred in nearly all parts of the United States except the Northwest.

Snowpack Water

Millions of people in the West depend on the springtime melting of mountain snowpack for power generation, irrigation, and domestic and industrial use. Runoff, excess water from rainfall or snowmelt that does not evaporate, flows over the land and ends up as stream flow. Stream flow influences the amount of water available for power generation, irrigation, domestic supply, and other competing uses. The fraction of precipitation falling as rain rather than snow has increased in many parts of the United States during the past 50 years, reducing total snowpack and increasing the risk of water shortages in the summer and fall. Total seasonal snowfall has generally decreased in southern and some western areas, increased in the northern Plains and Great Lakes, and not changed in other areas, such as the Sierra Nevada. In 2012, the nation experienced the third smallest winter snow cover extent in recorded history. Below average snowpack was observed for much of the western United States (NOAA 2013c). This is particularly relevant to the energy sector in areas with snowmelt-driven watersheds, such as the West, where the fraction of precipitation falling as rain increased by almost 10% over the past five decades.

As a result of earlier snowmelt, since the mid-20th century seasonal runoff has been occurring up to 20 days earlier in the West and up to 14 days earlier in the Northeast. The lack of snowfall across the Rockies, Great Plains and Midwest was a precursor to the record breaking droughts that impacted two-thirds of the United States during the summer and fall of 2012.

Snowpack in the mountains of the western and southwestern states is projected to decrease significantly by mid-century. Due to reductions in snowpack, earlier snowmelt, and changes in snowfall patterns, average winter and spring stream flows are projected to increase in the western states, summer stream flows are projected to decrease, and peak runoff is projected to continue to occur earlier. Under a higher emissions scenario (A2), peak runoff at the end of the century in snowmelt-driven streams is projected to occur as much as 25 to 35 days earlier compared to 1951–1980.

The greatest decreases in low river flows (reduced river flow for the lowest 10% of daily river flows) are projected for southern and southeastern regions of the United States, where flows are projected to decrease by more than 25% in a lower emissions scenario.

During the past 40 years, much of the Southwest, southern Great Plains, and Southeast experienced an increase in drought conditions, whereas the Northeast, Great Plains, and Midwest experienced a decrease. The first decade of the 21st century was particularly dry in the western states. In 2012, more than 60% of the contiguous United States experienced drought conditions.

Projected changes: A greater risk of drought is expected in the future, with drier summers and longer periods between rainfall events. Under higher emissions scenarios, widespread drought is projected to become more common over most of the central and southern United States. Overall, the frequency, intensity, and duration of droughts are likely to increase and water levels are likely to decrease.

Groundwater Levels: In many parts of the United States, groundwater is being depleted at rates faster than it is being recharged, including:

  • High Plains Ogallala aquifer
  • California Central Valley
  • Chicago-Milwaukee area
  • West-central Florida
  • Desert Southwest

Other areas are experiencing depletion as well. In parts of Kansas, Oklahoma, and Texas, groundwater levels were more than 130 feet (40 meters) lower in 2007 than in 1950.

A combination of changes in precipitation and increases in evaporation rates, droughts, and competition for water may decrease groundwater availability, particularly in the central and western states, as heavily utilized aquifers experience reduced recharge rates. By the end of the century, natural groundwater recharge in the Ogallala aquifer is projected to decrease by more than 20%, under warming of 4.5°F or greater.

Heavy Precipitation and Downpours: Heavy downpours have increased, and the fraction of rainfall coming from intense single-day events has also increased. Since the beginning of the 20th century, total rainfall during the most intense precipitation events in the United States has increased by about 20%. Since 1991, the amount of rain falling in intense precipitation events has been above average throughout the continental United States. There are clear trends toward very heavy precipitation for the nation as a whole, particularly in the Northeast and Midwest.

Measurements of stream gauges with historical records of at least 85 years show that the greatest increases in peak stream flows have occurred in the upper Midwest (specifically, the Red River of the North), and in the Northeast (especially in New York, New Jersey, and eastern Pennsylvania). However, stream flows in the Rocky Mountains and the Southwest have shown significant declines.

Examples of Impacts from increasing heat

  1. August 2012: Dominion Resources’ Millstone Nuclear Power Station in Connecticut shut down a reactor for two weeks because the temperature of the intake cooling water was too high, resulting in the loss of 255,000 MWh of power
  2. September 2011: High temperatures and electricity demand tripped a transformer and transmission line near Yuma, Arizona, starting a chain of events that led to shutting down the San Onofre nuclear power plant. 2.7 million customers in San Diego County lost power for up to 12 hours.

Examples of Impacts from not enough water

  1. July 2012: In the midst of one of the worst droughts in American history, natural gas and oil companies that use hydraulic fracturing faced higher water costs or were denied access to water for 6 weeks or more in many states
  2. Summer 2012: Drought and low river depths disrupted the transportation of commodities, such as petroleum and coal, delivered by barges. The U.S. Army Corps of Engineers reported grounding of traffic along the Mississippi River
  3. Summer 2012: Reduced snowpack in the mountains of the Sierra Nevada and low precipitation levels reduced California’s hydroelectric power generation by 38% compared to the prior summer
  4. September 2010: Water levels in Nevada’s Lake Mead dropped to levels not seen since 1956, prompting the Bureau of Reclamation to reduce Hoover Dam’s generating capacity by 23%.

 


OPEC’s policies are a threat to the U.S. economy. U.S. House 2000

[ Perhaps when the energy crisis has struck and rationing grows ever tighter, people will not be traveling much, will have more free time, and will take more interest in the history of energy policy. So here’s a bit of what was said back in 2000.

Alice Friedemann   www.energyskeptic.com  author of “When Trucks Stop Running: Energy and the Future of Transportation,” 2015, Springer]

House 106–197. June 27, 2000. OPEC’s policies: a threat to the U.S. economy.  U.S. House of Representatives.

Excerpts from these 85 pages follow.

Benjamin A. Gilman, New York, Chairman.   Today’s hearing is the third in our series on the impact of the price fixing schemes by the Organization of Petroleum Exporting Countries on the American homeowner, on the small businessman, on our commuters, on our aviation industry, on the truck drivers and the policy maker who sits in your seat and must manage this uneasy and very troubled relationship.

We look forward to holding additional meetings of our Committee to explore additional issues related to the energy crisis facing the American people, including a sustainable energy strategy and a review of the profits of the major oil companies that are up some $7 billion over the past year and the OPEC nations whose revenues have doubled over the past 2 years. I would also note that the General Accounting Office released a report over the weekend reviewing areas where existing controls over foreign travel of our nuclear scientists can be and should be strengthened.

The administration’s laissez faire approach has sent a clear signal to OPEC that price fixing is okay by us, that production cutbacks are not so bad after all, and that as long as you keep trying to aim at a reasonable price for crude oil, you can overshoot $30-a-barrel oil with not so much as a slap on the wrist. Our government has become the victim of the manipulation of the oil market by OPEC.

The legislation I introduced last week, the Foreign Trust Busting Act and the International Energy and Fair Pricing Act of 2000 will ensure that this administration adopts a consistent and a comprehensive policy of opposition to OPEC and to other similar cartels. In the ongoing energy crisis facing our Nation, we can help keep the spotlight where it belongs, on this international energy cartel. With the enactment of this measure, the administration will no longer be able to go back to business as usual in supporting any back room arrangements and cartel-like behavior.

Sam Gejdenson, Connecticut.  The failure to act on our energy independence really starts here in the Congress. If you think of the initiatives of the Republican-led Congress over the last 6 years, I think one of its earliest initiatives was simply to abolish the Energy Department. But it got worse. When we take a look at where we are today as a Nation, this Congress has continuously prohibited the administration from increasing the standards of efficiency on automobiles. This is not simply as bad as living with the status quo, because as Americans moved from cars to trucks, it actually reduced our overall fleet average fuel economy.

For those of you who think this would somehow infringe on our personal freedoms, think about this. When I was a teenager, a Corvette got 9 miles to the gallon. Today that same car, more powerful and faster, gets 27 miles to the gallon, because after the energy crisis Congress and the administration forced the automobile industry to improve by increasing CAFE standards, not, as this Congress has done, by blocking the administration from increasing CAFE standards.

The Congress ought to pass a new CAFE standard, demanding more efficient standards for trucks and cars.  It’s high time that we started enacting sensible legislation and supporting the Administration’s efforts to reduce our dependence on foreign oil.

Unfortunately, for 6 years, the majority in Congress has failed to make the necessary investments in energy efficiency, renewables, and conservation. Since Fiscal Year 1996, the majority has slashed the President’s proposed investments in energy supply research and development by approximately $2 billion and conservation programs by $1.1 billion.  The majority also consistently blocked any effort to improve the fuel efficiency of our cars and trucks. If we had more efficient vehicles on the road today, high gasoline prices would be less of an issue. Unfortunately, since 1996, Congress has barred the National Highway Traffic Safety Administration (NHTSA) from even studying whether or not fuel efficiency standards for cars and light trucks should be increased. As a result, we have not increased average fuel efficiency standards (Corporate Average Fuel Economy—CAFE, for short) since 1985. This is extremely short-sighted because raising the average fuel economy of our cars and trucks just by one mile per gallon will save about 250 million gallons of gasoline each year— that’s 12.5 million barrels of oil per year.

Kevin Brady, Texas.  The inescapable fact is America is addicted to foreign oil and we are falling deeper into addiction every day. Many have chosen to blame OPEC, the dealers of the oil, for not selling to us at a fair street price, which is ludicrous. America needs to kick its habit, its dependence on foreign oil, and that is one of the questions Secretary Richardson needs to answer today: why we fail to address the real problem.

We talk about Africa and Caspian and Latin America, but why aren’t we doing more to significantly increase the responsibility America takes for our energy needs? Is it the conflict between our environmental goals and our energy goals? Is it the unwillingness to stand up to special interests and say we have to have a long-term energy policy that allows us to be more independent? What is it going to take to get a responsible energy policy that all of America is engaged in?

Edward R. Royce, California.  We have heard a defense of the Department of Energy after the disastrous guarding of our nuclear secrets, after we have seen the inability of the Energy Department to formulate an energy strategy. And let me just say this for the record, it is not for the lack of spending. We spent $17.8 billion over in the Department of Energy. Is this really the record we wish to defend? The answer which we have heard here is to raise taxes, to spend money on new subsidies for alternative fuels.

The world is awash in oil reserves, and it is a matter of using our diplomatic clout to increase production out of OPEC, and yet what we have here is a call for more funds into the Department of Energy.

This administration has been able to push up the gas taxes to the point where they are 60 cents a gallon State and local. That is the hit now. I just want to share with you the words, a quote: ‘‘The United States should start by gradually imposing a higher gasoline tax, hiking it by 1 or 2 cents per month, until gasoline costs $2.50 to $3.00 per gallon, comparable to prices in Europe and Japan.’’ That is what Paul and Ann Ehrlich said in their book, and this is what Vice President Gore said: ‘‘The time for action is due and past due. The Ehrlichs have written the prescription.’’ Now, it was Vice President Gore who was the chief advocate of the energy tax, arguing it was good for the economy, good for the environment, and I would urge you to read George Stephanopoulos’ book ‘‘All Too Human’’ about that. This administration has pursued this goal.

What we would like to do is get some focus on the question of OPEC and getting some leverage on OPEC to break that cartel. I would just like to say as Chairman of the Africa Subcommittee, I have listened to the Nigerians explain that they would like to double their production of oil. I think it would be wise for the administration to get behind that effort. You know, new technology is allowing for deeper offshore drilling. West Africa is one of the top regions for oil prospecting. Frankly, their known reserves dwarf anything in the Caspian Sea. We need to have a focused energy policy on breaking up this OPEC cartel and taking those countries that want to develop more production on their reserves and encouraging them to do so. I hope we end today’s hearing with some commitment that we will focus on the pieces of legislation that the Chairman of this Committee has introduced in order to try to go after that OPEC cartel and break it up.

Bill Richardson, Secretary, Department of Energy.  I continue to believe that markets should set prices, but while we import 22% less oil from OPEC today than we did around our last gas crunch, which was in 1977, it remains clear that actions by major oil producing nations still significantly affect oil supply. That is why this spring I spent a great deal of my time talking with energy ministers and leaders from the oil producing nations, Saudi Arabia, Kuwait, Mexico, Norway and Venezuela, often getting great criticism from one side that I wasn’t tough enough, from the other side that we were too pressure oriented. Each of these nations is well aware of the special economic and energy relationships between their country and the United States, as well as to other importing countries. Each of these nations agrees that stability is our common goal and that volatility in the oil markets is undesirable.

We believe in engaging OPEC. And if you look at the record, for instance, Saudi Arabia has been forthcoming. They have been leaders in increasing production. Kuwait has also, and I think Chairman Gilman effectively made a case with Kuwait earlier and was helpful. So there have been countries, Algeria is another country that has taken some surprising positive positions in increases in production. What we try to do with OPEC is engage them, convince them, make our arguments on economic grounds, not political grounds. It doesn’t pay, I have found, to coerce or threaten, but to be forceful. As you know, a lot of OPEC countries were not happy when I made those visible trips and when I advocated very strongly for our position. This last time we took a more low key approach. But it still involved a number of telephone calls and quiet visits that took place. That is how I think we should deal with OPEC. OPEC is a reality. They are going to be around. As a nation, we need to reduce our reliance on imported oil. I think that is message number one. This is where, together, in a bipartisan fashion, we can deal with renewable energy and those tax credits and the Home Heating Oil Reserve and helping domestic oil and gas production.

Steve Chabot, Ohio.  I agree with [others] about the ingratitude of both Kuwaitis and Saudi Arabia in the fact that we sent our men and women in harm’s way over there. This is a real slap in the face to the United States that they have cooperated in this collusion, in this unholy alliance of countries withholding oil from the market and driving up these gas prices to the extent that they have been, particularly in the Midwest, where we live.

Consumers in my district are getting gouged, or perhaps I should say gored, at the gas pumps. Working families are being priced off the highways. Small businesses are feeling the squeeze. Frankly, your administration is rapidly losing credibility. In February, when our constituents felt the first major spike in gas prices, you said, ‘‘It is obvious that the Federal Government was not prepared. We were caught napping. We got complacent.’’ Now it is late June and those taxpayers are still waiting for relief.

Many of my constituents have asked me if there isn’t something the Clinton administration can do when it engages in dialogue with the price fixing oil cartels. After all, it hasn’t been so long ago that American servicemen and women laid their lives on the line for some of those oil producing nations that are now threatening our economy with cutbacks and production and higher prices.

I have to ask the same question: What goes on at those meetings? I note that you traveled to Saudi Arabia in February 1999, oil was then selling for $12 a barrel. In March you went to the OPEC meeting in Vienna, the price jumped to $14.68 per barrel. In July, you hosted the Western Hemisphere Energy Ministers Conference, and the cost of a barrel of oil soared to $20. In August a trip to Nigeria, $21 a barrel. By December 1999, when you hosted the African Energy Ministers Conference, the price went to $26 a barrel. After you traveled to Saudi Arabia, Kuwait, Mexico, Norway and Venezuela in February of this year, the price of oil rose to nearly $30 a barrel. Apparently whatever our government was doing during those meetings wasn’t working very well. Do you think it is perhaps time for the Clinton administration to take a different approach? Do you think perhaps we can send a strong message to the price fixing oil cartels that we take a dim view of this criminal behavior and that our President will finally respond to this crisis by exercising the power he has as chief executive? Can we tell them to look elsewhere for assistance, perhaps in the area of arms sales? Mr. Secretary, the working people of my district in Cincinnati and all over the Midwest and in fact all over the country are growing angrier by the day. They want their government, the government they pay for, to lend them a hand. The time for complacency is over.

Bill Richardson, Secretary, Department of Energy.  I believe our policy of engagement with OPEC is working. Now, let me just tell you a little bit about OPEC, and you know this very well. There are some countries there in OPEC that we don’t have strong relations with, Iraq, Iran, Libya. There are other countries that we have strong relations with, Saudi Arabia, Kuwait, Venezuela, Nigeria and Indonesia. OPEC operates by consensus, and I engage them, every minister, intensively. I did not travel this last time, but telephoned incessantly, making our case, saying ‘‘keep an open mind,’’ and we think the results were positive.

Christopher H. Smith, New Jersey.  I have read Vice President Gore’s book, Earth in the Balance, and The Population Explosion by Paul Ehrlich. It is a book of pseudoscience, extreme exaggeration, a book filled with worst case scenarios. As a matter of fact, I went back and looked at some of the things in The Population Bomb. They haven’t happened. Yet that was used to drive policy for years, and those worst case scenarios were nothing but worst case scenarios that didn’t even come close to happening. Hyperbole like that is very dangerous when it has such an impact on policy. Now, in looking at The Population Explosion, there is a quote, and again I read the book, so I am very well acquainted with it, but one quote from it: ‘‘The United States can start by gradually imposing a higher gasoline tax, hiking it by 1 or 2 cents per month, until gasoline costs $2.50 to $3.00 per gallon, comparable to prices in Europe and Japan.’’ That is on pages 219 to 220. As we all know, the Vice President wrote the promo for that and said, ‘‘The time for action is due and passed due. The Ehrlichs have written the prescription.’’ If that is not an endorsement of higher gasoline prices, I don’t know what is.

The Vice President has clearly made it clear that he would like to see higher prices as a way of mitigating consumption as an environmental issue.

Bill Richardson, Secretary, Department of Energy.  I know that the Vice President cares about how we can make automobiles and trucks more fuel efficient, and still ensure that Americans have a free choice in buying them. I just heard today that SUV sales have been increasing dramatically in the last 2 weeks, more than ever; the SUV is the most-sold automobile.

I remember going to Saudi Arabia when prices were $10 a barrel and there was great concern in Saudi Arabia, there was great concern in America’s oil patch, in New Mexico and Texas, and in California and Arizona and many other States, Louisiana, because our domestic oil and gas industry was hurting. Our policy has been to say that $10 is too low, $30 is too high. It is now over $30, $31, I think, and we are saying it is too high. Now, given that, what has been our policy with OPEC? Our policy with OPEC has been to forcefully engage it. When they had the production cuts, we expressed strong concerns. We are against artificially set prices. We think the market should dictate.

We think Nigeria has enormous potential for more oil and gas production, and we are working with them to bring more technology, to bring more American investment. We have got substantial investment there. They have had some infrastructure problems, as you know, because of some of the political issues there. There was a lot of corruption; instead of revenues coming in from energy production for other capacities, they went elsewhere. What we want to do is develop—we have a 3-pronged strategy: Develop oil and gas resources in three key regions: in Africa, in Latin America, and in the Caspian. We think that we bring our leadership in that area, especially in Nigeria, where there is a pro-market, pro-democracy government that is doing the best it can to get the economy back and bring some true democracy, and is having some good effects; we are very bullish about Nigeria. The problem still is their infrastructure, their pipelines. We also support a West Africa gas pipeline. We have been very involved in spurring the production of that with both some energy companies and some of the governments there, in Chad and Nigeria and other nations that are key to that. So we think that Africa is a real untapped resource, not just for itself, but for our country.

Gregory W. Meeks, New York.  Let me just say maybe something that might not be as popular to say, but I think we just need to be mindful and always believe in counting our blessings. Though we are going through a crisis here in America right now with reference to oil and gas prices, still, as I was walking over here with my intern, she mentioned to me, you know, aren’t we still getting gas and oil cheaper than anyplace else in the world, and that is probably true, and we should count our blessings for that. But it does not mean that we should take it easy, and there is enough blame to go around with reference to the crisis we are currently in. Clearly… there is blame on the consumer’s part. We have not been smart consumers. There is blame on the administration, there is blame on Congress. And we can sit here until we are blue in the face, blaming one another and pointing fingers at one another.

Donald M. Payne, New Jersey.  I think what we need to do is stop being so dependent. I think what we need to do is stop buying all of those sports vehicles, as you mentioned. We need to talk about ways to reduce the consumption of these gas guzzlers that have been reintroduced into our country, and I believe that what we need to do is to start looking at ourselves.

I was shocked as Congressman Delahunt read off the 8 or 9 oil companies and their profits, starting at 600%; they were making profits all along. I mean, that is on top of what was going on. That is egregious. I mean, here we are all bashing OPEC, and we should, but no one, especially from the other side, no one is talking about what is happening with these oil companies, and the mergers, which is happening in banking, which is happening in transportation, which is happening in the airlines. We are going right back to the Standard Oils of the turn of the century, with the robber barons and the big mega companies that are there, and they are so large that they are almost too big for the government to even have an impact on.

It is really naively optimistic to think that we can do something to make OPEC change. I mean, people say we need to bust up OPEC. I just would like to know how you bust OPEC up. We should bust up the diamond cartel. As a matter of fact, they take diamonds from civil wars and bandits and dictators and continue to sell them. We ought to look at busting that up too. It is great to say that, but how do you go about breaking up a group that comes together? I think that we need to have alternative sources, we need to stop being dependent. As long as we go to bigger cars and more gas guzzlers, with more disregard than we had 10 or 15 years ago when we went to smaller cars and people were more fuel efficient, we will have gone back to our old habits of consumption, and that will just go on and on and on until we have alternative energy sources.

When we reduce our dependence on OPEC, they will simply reduce the prices. That will weaken the cartel. That is the only way I think we are going to have a real impact

Dana Rohrabacher, California.  We do have some fundamental questions about administration policy. What we have seen is that the Clinton-Gore administration has not had a responsible energy policy, and perhaps this is due to the fact that it is being unduly influenced by looney environmental ideas that have been espoused by the Vice President for decades. The Vice President has been the number one advocate of higher gas prices in order to achieve his environmental goals for decades. Now, are you or are you not here telling us that the Vice President has or has not abandoned his commitment to dramatically raising the price of gasoline in America?

Bill Richardson:  Congressman, the Vice President does not favor higher gasoline prices for consumers. Let me just state that.

Mr. ROHRABACHER. He has always advocated that. That is not even debatable.

Bill Richardson: That is not the case. He wants to see tax credits for families to purchase fuel efficient cars.

Mr. ROHRABACHER: No, he has advocated in his writing, he has advocated in speeches, that Americans, that we are at fault because we want to use our cars too much, because the price of gas is too low. Does that mean the administration has backed off of its commitment to higher gas prices through the Kyoto agreement? Has the administration backed off from that?

Secretary RICHARDSON. Congressman, we have never been for that. Let me just tell you what the Vice President wants to do. You mentioned automobiles. It is through him that the big 3 and the Department of Energy and other agencies are trying to make SUVs more fuel efficient, 40 miles per gallon, 80 miles per gallon. That is his objective.  

Brad Sherman, California.  We are being told that oil prices would be lower if we just got rid of all environmental concerns, drilled everywhere, eliminated any attempt to reduce air pollution, and nothing could be further from the truth. I want to thank the administration and the Secretary for standing firm on environmental concerns.

We should, instead, focus on the fact that we went to war in the Gulf, we could have experienced thousands of casualties, and we had an opportunity to turn to Saudi Arabia and to turn to Kuwait and say in return for your continued existence as countries, we insist that you leave OPEC and produce oil at a reasonable economic rate. Instead, we returned Kuwait to its Sultan or its Emir, and, let’s face it, Saudi Arabia would not be an independent state today had we not acted. Without asking for a single concession for the American consumer or motorist, and in doing so, we not only failed to overthrow Saddam Hussein, we failed to break OPEC.

Those who blame the environmentalists should recognize that if it wasn't for environmental concerns, we would be getting 12 miles a gallon in our cars and 8 or 6 miles a gallon in our trucks and SUVs, and I think that if we want to break OPEC we need to go further toward fuel efficiency standards and fuel efficiency research. We are told that America is addicted to foreign oil, so the solution is huge subsidies for big domestic oil producers. Yet we, as motorists, pay the same price whether we are buying oil from Saudi Arabia or from Texas; domestically produced oil sells for no less. So when OPEC forces the price of oil up, the producers in Texas do just as well as those in Kuwait, and yet we are told we are supposed to give more subsidies, more tax breaks, to those who are already getting huge prices for their oil. The key is not foreign oil versus domestic oil; it is the total world supply of oil.

Howard Metzenbaum, Chairman of the Consumer Federation of America and former Senator from Ohio

I think history will record that the failure of Saudi Arabia and Kuwait to show some sense of appreciation is probably one of the most ungracious, ignominious acts of any nation toward another. We were there when they needed us; we were there with our men and women, who went there to save those countries. The Kuwaiti leadership left the country while our men and women were there saving it from being overtaken by the Iraqis, and in appreciation, what comes about? The highest oil prices and restrictions on the production of oil. They ought to be ashamed of themselves.

I believe what is happening now is a serious threat, not to our Nation's security, but to the lives and the economic welfare of literally millions of Americans. The price of gasoline may not matter much to those who have the wherewithal, but it is a very serious threat to working people who have to use their automobiles to get to work, and to mothers who have to leave their children in child care so that their child may be safe while the mother is working. It is a challenge for many who are living a very meager existence to get along with the extra costs brought about by increased gasoline prices. It is unfair, unreasonable, and illogical for us not to be releasing oil from the Strategic Petroleum Reserve.

Benjamin A. Gilman, New York.  Today's hearing is the third in our series on the impact of the price-fixing schemes of the Organization of Petroleum Exporting Countries on the American homeowner, the small businessman, the commuter, the truck driver, the consumer, and the policymaker who sits in your seat and must manage this uneasy and very troubled relationship. Our policy is hard to discern, and harder still to explain to the average American, who has seen gasoline prices rise some 60 cents over the past year and a half to record levels in the Northeast and Midwest. Oil prices today are higher than at any time since the Iraqi invasion of Kuwait. Continued high prices for gasoline and other fuels are now beginning to stunt our own economic growth and curtail global growth prospects as well. In addition, they are stoking the flames of inflation, inducing bankers to raise rates and curtail lending.

How has the Administration reacted to this growing threat to our pocketbook and our prosperity? Remarkably passive in the face of OPEC’s continued assault on our free market system and antitrust norms, this Administration is still firing blanks when it should be making an all-out attack on the production allocation system which has kept oil at $30 a barrel for much of the year. The producers are in clover with multi-billion dollar profits while consumers are in hock to a cartel that is turning our economy’s soft landing into an abrupt free fall with no rip cords left to pull. I am still waiting for the answers I raised at our first hearing: What has the Administration done to systematically review our policies toward OPEC and its member states? Why has the Administration failed to weigh in strongly enough with OPEC last year to prevent a continuation of production cutbacks? And how can we begin to take effective action against its continued production cutbacks and price fixing behavior?

The Administration's laissez-faire approach has sent the clear signal to OPEC that price-fixing is fine by us, that production cutbacks are not so bad after all, and that as long as you keep trying to aim at a reasonable price for crude oil, you can overshoot your mark with $30 a barrel oil without so much as a slap on the wrist. Uncle Sam is being played for ‘‘Uncle Sucker.’’ The legislation I introduced last week, the ‘‘Foreign Trust Busting Act’’ and the ‘‘International Energy Fair Pricing Act of 2000,’’ will ensure that this Administration adopts a consistent and comprehensive policy of opposition to OPEC and other similar cartels. In the ongoing energy crisis facing this nation, it keeps the spotlight where it belongs—on this international energy cartel. With the enactment of these measures, the Administration will no longer be able to go back to business as usual in supporting back room arrangements and cartel-like behavior. The first measure would allow lawsuits to be brought against foreign energy cartels. The second would specifically direct the President to make a systematic review of our bilateral and multilateral policies and those of all international organizations and international financial institutions to ensure that they are not directly or indirectly promoting the oil price-fixing activities, policies, and programs of OPEC. It would require the Administration to launch a policy review of the extent to which international organizations recognize and/or support OPEC and to take this relationship into account in assessing the importance of our relationship to these organizations. It would set up a similar review of the programs and policies of the Agency for International Development to ensure that this agency has not indirectly or inadvertently supported OPEC programs and policies. Finally, it would examine the relationship between OPEC and the multilateral development banks and the International Monetary Fund and mandate that the U.S. representatives to these institutions use their voice and vote to oppose any lending or financial support to any country that provides support for OPEC activities and programs.

Paul Gillmor, Ohio. Another factor in the high gas prices has been the fact that the United States has placed many areas ‘‘off limits’’ to domestic petroleum exploration and production. While there may be some valid reasons for doing so, the fact is this has made the United States more dependent on foreign energy and much more vulnerable to the international cartel.

Robert Menendez, New Jersey.  Over the past five years, Republicans in Congress have funded only 12% of the Administration's requests for new investments in renewable sources of energy and energy efficiency initiatives. This measly and irresponsible level of funding has been nearly $2 billion short of Clinton Administration requests.

I don’t think, Mr. Chairman, it is appropriate to claim here today that the Administration has no energy policy.

Republicans not only have failed to build up the Strategic Petroleum Reserve when fuel was cheap, but before we faced this crisis, they proposed getting rid of the Energy Department and selling off the reserve—policies that would have been extremely detrimental if carried out as proposed.

Cynthia A. McKinney, Georgia.  I want to bring something to your attention that I feel is very, very important and that I am sure you are not aware of. It has to do with the situation of African-American workers at the Savannah River site. I just want to list some of the things that are alleged to have taken place there. There is a work area where African Americans primarily work. That area is referred to as ‘‘Coonsville.’’ Nooses have been placed on African Americans' work stations, and electricians brought a noose to the site and demonstrated the historical value of a noose. The ‘‘N’’ word is reportedly regularly used by both management and staff. African Americans at the Savannah River site have 1.7 to 1.8 times the exposure to radiation of their white counterparts. African-American employees feel that management places African Americans in the work site to get the radiation. Twenty percent of the total workforce at the Savannah River site is African American, yet 40% of the staff in the areas of exposure to radiation are African American. Two percent of the upper management at Westinghouse are African American. There has never been an African-American vice president at the Savannah River site. A machine named ‘‘the manipulator’’ is referred to as the slave master. Finally, I would just like to say I had the president of Westinghouse Savannah River site, Mr. Buggy, in my congressional office, and while there, Mr. Buggy actually used the ‘‘N’’ word in my presence, in my office. That is the kind of leadership that exists at the Savannah River site under Westinghouse, under contract to DOE. Now, I also have a letter from Maryanne Sullivan, general counsel of the Department of Energy, dated May 15, 2000, in which she says that litigation expenses are considered to be costs of doing business. My question to you, Mr. Secretary, is why should the U.S. taxpayers foot the bill for litigation expenses against poor employees who have already been victimized by that kind of management and that kind of an environment? And why should that be condoned by the Department of Energy?

Secretary RICHARDSON. Congresswoman, I will get back to you on these issues. Let me just say that after that 60 Minutes report came out, and I think you are aware of that, I sent a team down there to look at some of those allegations. I also sent my ombudsman, somebody I appointed in the Department to find problems of racial profiling; we have had some problems with Asian Americans in the suspect case at Los Alamos, and I wanted to send a message that we don't tolerate racial profiling. I will have somebody come see you, or I will come to see you myself, to look into some of these issues that you have raised with me.

Edward J. Curran, Director of Counter-Intelligence, Department of Energy

I am a current FBI employee assigned to review the counterintelligence program within DOE, prepare a 90-day study with recommendations, and improve the counterintelligence program.

What we found is that the counterintelligence program at the Department of Energy was almost nonexistent. It didn't even meet minimal standards. We said that, and we said we have a lot of things to do here. The 48 recommendations were very controversial within the Department and the laboratories. There is a great deal of resistance to those recommendations. We broke them down into tiers 1, 2 and 3. Tier 1 recommendations were those we need to implement right now to fix the problem at DOE. One of those recommendations was to enhance our pre-briefing and debriefing programs for scientists who are traveling overseas; we acknowledged 2 years ago that they are targets of foreign intelligence services, just like anybody else in the government, DOD or other government agencies, including private industry. As for the GAO study, we worked very closely with GAO over the past 8 months while they were preparing it. We gave them complete access to the database where we record the information from our pre-briefings. These were pre-briefings that the Secretary approved in November. Despite the resistance, he approved all 48 of the recommendations.

We agree totally that all our scientists are at risk, no matter where they are outside the United States, whether it be economic espionage or the targeting of proprietary information. What we have to address first, though, are the sensitive countries that have a track record, whose intelligence services have been identified as carrying out activities that immediately threaten our national security. Our scientists often get pre-briefs, personal briefings, before they go overseas. We gather this type of information. We know which countries do what to us, and it is a defensive mechanism; whether we do this or not, that targeting is going to take place overseas. We feel that to have a structured program to prepare these people to go over is of tremendous interest to counterintelligence. If I could just read one paragraph from the GAO study, which was unfortunately leaked to the news media last week; I think some elements in the news media believed that anything that is leaked must be critical of the Department of Energy. You need to read it thoroughly, though, to see that this is not a critical report. On page 3 of the GAO study, it says that DOE and its laboratories have instituted several national security controls over official foreign travel by laboratory employees. They include threat assessment and analysis provided by DOE's office of counterintelligence; security and counterintelligence awareness training; a review and approval process for foreign travel requests; face-to-face or written pre-travel briefings; classification review of publications and presentations; and face-to-face or written post-travel debriefings and trip reports prepared by the traveler.

What we try to do is this: if you do these pre-briefings early, we can determine whether a particular employee is being targeted or singled out, whether because of the science he happens to be working on or because he may show some vulnerabilities. Once we determine that, if we consider an employee to be in harm's way or unusually targeted, we will take him out of that country. Every scientist within DOE who travels to a foreign country is required to have a pre-brief with a counterintelligence officer. Every employee going overseas is required by the Secretary to have at least an annual briefing on awareness training and counterintelligence security issues.

 


U.S. House 2000 Ensuring adequate supplies of natural gas and crude oil

[The usual partisan nonsense can be found here — Republicans blaming Democrats for not solving the energy crisis with "Drill Baby Drill" and Democrats berating Republicans for cutting energy efficiency programs. There are a few biophysical acknowledgements that energy underlies our economy (rather than money, as economists would have you believe). So however clueless our representatives can appear to be, they are aware that the U.S. is totally dependent on a finite amount of imported oil from unstable countries, and that this puts the U.S. at risk of an energy shortage. Which of course implies endless oil wars abroad, though this is only spoken of by the military leaders invited to these hearings.

When the energy crisis strikes, I think historians will be struck by how much both political parties seek to solve the crisis with more energy — more biofuels, more gas, more oil, more wind, more solar. Not energy efficiency, or a strategy to inform the public and ask them to do what they can to conserve energy (especially since Americans consume 5 times more energy per capita than the rest of the world), or making birth control and abortion more easily available here and in the rest of the world to get population numbers more in line with carrying capacity as fossil fuels decline. So what if that's political suicide; it's the right thing to do for the long-term good of the United States and the rest of the world.

Alice Friedemann   www.energyskeptic.com   author of "When Trucks Stop Running: Energy and the Future of Transportation," 2015, Springer]

House 106-147. May 24, 2000. National energy power: ensuring adequate supply of natural gas and crude oil. U.S. House of Representatives.

Excerpts from this 133 page document follow.

Mr. BARTON. We are going to have a serious discussion today on what our energy policy should be. One could argue that for the last 7 years we have been adrift in terms of a comprehensive, coherent, coordinated energy policy. Today's hearing is the first of three. We are going to focus on the oil and natural gas industry and some of the issues that face us in that arena. This past winter we got a wakeup call that gasoline is not forever going to stay below $1 a gallon at the pump. OPEC reemerged as a force to be reckoned with. And we had the situation where our Secretary of Energy, a former member of this subcommittee, was running around the world trying to get OPEC to raise their production quotas, which was not a pretty picture in my mind. As bad as the situation was this past winter, several of our witnesses today will paint a much more dismal picture for the future. And according to them, and I happen to share their concerns, unless we do a course correction on our national energy policy, then, as the EIA representative will tell us, by the year 2020 the United States will be dependent on foreign suppliers for 64% of its petroleum consumption, and over 50% of that is expected to come from the OPEC countries.

I think that those numbers indicate that it is time for us to do this course correction sooner rather than later with respect to domestic oil and gas exploration and production policies in our Nation. We are a country rich in resources. There is no reason that we can't minimize that dependence. Yesterday the Energy Information Administration released a report that estimated that there are 10.3 billion barrels of oil that could be recovered from the Arctic National Wildlife Refuge, which we commonly call ANWR. Similarly, vast areas of the Outer Continental Shelf have been put off limits for natural gas exploration and production in the last decade. The Destin Dome near Florida's coast is estimated to contain at least 2.6 trillion cubic feet of natural gas. If this is true, it would be one of the largest natural gas fields in the Gulf of Mexico.

Neither the Congress nor the administration has spent significant time on this subject over the last few years. I do not intend this statement to be a criticism; it is simply a fact. And it is somewhat understandable when the economy is doing well and at a time when energy prices are generally low. I share the concern regarding our growing dependency on foreign sources of energy. And this subcommittee is an appropriate forum to explore the various energy policies which exist today and to look at proposals that are aimed at promoting a higher level of energy self-sufficiency.

Mr. HALL. And your decision to hold hearings on the state of our national energy policy comes as energy policy is once again in vogue, as it has been so many times before. It seems like we go up and down this way. The development and implementation of a coherent and flexible energy policy tends to recede into the background when energy prices are stable, but comes roaring to the front when prices go up, or when the public debates the prices as they rise, as they have over the last year. I don't believe there is a more important element of our economy on which Federal policy needs to be developed and carried out more consistently than energy. We do need an energy policy, and that energy policy can be very easy and simple: an incentive to look for it and a reward for finding it. The little ones look for it; they borrow money from the big banks to look for it, and when the little ones find it, the big ones buy it and redistribute it, make money, and live happily ever after financially, while the little ones go back to looking for a bank that will loan them some money. Even in high times banks won't loan money because there is no consistency.

Mr. PALLONE.   The majority leadership’s idea of a national energy policy involves drilling the Arctic National Wildlife Refuge and unfortunately, this is not sound policy. If we open the Arctic refuge to oil and gas development, we will only have the equivalent of 6 more months’ worth of oil supply.  In the process we would destroy one of our Nation’s greatest natural resources forever. And drilling the Arctic refuge will do nothing to increase our energy security or lower prices at the pump. Instead of drilling in the Arctic refuge, we should be banning exports of Alaskan oil to other Nations.

A sound energy policy entails a comprehensive approach that includes promoting and funding commonsense programs to conserve energy and develop alternative energy sources. Such programs also would reduce our reliance on polluting fossil fuels and on oil imports from foreign nations. Regrettably, the Republican leadership has harmed the Nation’s energy security by cutting funding for energy efficiency, renewable energy, weatherization and alternative fuel programs during the past several years.

After taking control of Congress, Republicans cut energy efficiency programs by 26%.

We have to help ensure our Nation’s short- and long-term energy security independence by using mass transit, bicycles, and other fuel-efficient vehicles. There are a lot of things we can do in terms of conservation measures even here in the House of Representatives. And I just would like to see a more proactive effort so that we can go home and tell our residents and our children that we are working to protect the Nation’s energy security as well as their pocket books as well as our resources.

Mr. LARGENT.  Fossil energy, specifically natural gas and crude oil, has to and will continue to dominate United States energy supply. According to the Energy Information Administration, petroleum consumption in the United States is projected to increase at an annual rate of 1.3%. However, also according to the EIA, our domestic petroleum supply, including natural gas, is projected to remain nearly flat. What does that mean? It means that instead of relying on the rest of the world for 52% of our domestic oil and gas consumption, as we do today, in 20 years we will depend on foreign imports for 64% of our domestic consumption needs. Frankly, I think that estimate is conservative.
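[A quick back-of-the-envelope check of the 52%-to-64% projection. The sketch below is not the EIA model; it simply assumes, per the testimony, that consumption grows 1.3% per year for 20 years while domestic supply stays flat, and asks what import share results.

```python
# Rough reproduction of the import-dependence arithmetic cited above.
# Assumptions: 1.3%/yr consumption growth for 20 years, flat domestic supply,
# imports covering the difference. Not the EIA's actual model.

GROWTH = 0.013            # annual growth in petroleum consumption (per testimony)
YEARS = 20
IMPORT_SHARE_NOW = 0.52   # current import share of consumption (per testimony)

consumption_now = 1.0                         # normalize today's consumption to 1
domestic_supply = consumption_now * (1 - IMPORT_SHARE_NOW)

consumption_2020 = consumption_now * (1 + GROWTH) ** YEARS
imports_2020 = consumption_2020 - domestic_supply
import_share_2020 = imports_2020 / consumption_2020

print(f"Consumption growth over {YEARS} years: {consumption_2020 - 1:.0%}")
print(f"Projected import share: {import_share_2020:.0%}")
# Prints roughly 29% growth and a ~63% import share, close to the 64% cited.
```
]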

DANIEL YERGIN.  If prices are ever in the $30-plus range for any period of time, we think that the impact of slowing economies will provide an inevitable correction to those prices. But this wide band of prices that we have seen, from $10 to $34 a barrel, in this most basic commodity over a little more than a year, underlines the inherent volatility in what is probably still the world's most important commodity.

Let me in the last couple minutes turn to something that this subcommittee has considered a lot—and Congressman Sharp and I were talking about this for a very long time—which is natural gas prices. In a sense, this subcommittee is the home of consideration of that. So much of the energy consideration has been focused on oil recently. But the United States natural gas supply system right now is characterized by very tight supplies, and we are going to hear a lot more about that in the next few months. The spring market and the recent heat waves have shown how tight.

Indeed, the United States is making a very big bet on the adequacy of future gas supplies without realizing it. Fifteen percent of our electricity today is generated with natural gas. In terms of proposed new capacity, that number goes up to 96%. We have seen a slow supply response, partly because of the oil and gas price collapse the last couple of years.

There are big challenges ahead. No. 1 is to reverse the decline in supply. We estimate that in round numbers we need half a trillion dollars of investment in the upstream natural gas business to get the kind of supply that we need in 10 years. We need to add 50% more reserves in this decade than in the last decade. We will need to connect new frontiers of gas development, including, as Admiral Watkins pointed out, the Arctic, and reduce the pressure on the gas infrastructure system in our country. We need to recognize the very large reliance we are placing on natural gas to power the growth of our new economy.

The attention on oil when prices reached $30-plus underlines the fact that oil prices are one of the few prices—along with the price of labor and the price of money (interest rates)—that can move the economy and spook the stock market.

One of the major changes from the oil turmoil years of the 1970s is the relative absence of the confrontation that characterized that earlier period. No longer is there a North-South struggle. Instead, we are in an era of ‘‘emerging markets.’’ Countries like Mexico and Saudi Arabia recognize how integrated their economies have become with the United States. Mexico worries about oil prices. But, especially post-NAFTA, it also worries a great deal about the health of the United States economy.

Exporters also have assimilated the great lesson of the 1980s—that customers count and you do not want to risk losing market share. They could see that prices at recent levels could well damage their interests in two ways. First, they could lead to a slowing of economies, reducing demand and thus creating new problems for the exporters. Secondly, although the oil industry has been cautious in its spending, persisting high prices could end up stimulating the development of a lot of new supply. And the exporters have no interest in seeing oil prices turn into a big campaign issue in the United States.

The message is similar for the United States. We import about half of our total oil supplies, and our dependence will grow. We have a dense web of interdependence with many of the oil-exporting countries, of which oil is but one, though a most important element. The current supply picture is taut, which means that the market could be subject to a great deal of volatility. With low inventories, prices could be driven up again by everything from another Iraqi showdown with the United Nations, political or technical problems in a major exporting country, or political tensions among exporters, to the pace of Asia’s recovery and the strength of the U.S. and European economies. Continuing demand recovery in Asia and Latin America combined with further gains in North America are expected to propel world demand growth through 2002. We are looking at a 2.2% worldwide annual average increase in demand between 2000 and 2002 as long as economies remain on the track on which they are. If prices are in the $30 or higher range for any period of time, we think that the impact of slowing economies will provide the inevitable if unfortunate corrective. Many of the exporting countries suggest that an ‘‘appropriate’’ range for oil prices is $20-25 a barrel—a price that is often said to be good for consumers and good for producers. But the much wider band of $10 to $34 a barrel within 1 year underlines the inherent volatility in the world’s most important commodity market and the difficulty in getting the price ‘‘right.’’

U.S. Natural Gas.  Much of the energy attention in recent months has been focused on oil. But the U.S. natural gas system is also characterized by very tight supply. The spring market and recent heat wave have shown how tight: Prices today are 70% higher than they were this time last year. In CERA's view, for the first time in many years, there is real uncertainty about the ability of North American supply to meet demand without sharp price rises. Demand pressures are intensifying, while there are few signs yet of growth in supply. The differences are being made up by withdrawing supplies from storage, where the levels are very low. There is a similarity here to the oil market. Pressures on inventories are driving the market. But there are major differences, too. In contrast to the oil-exporting nations, there are no suppliers withholding supplies from the market. And there is no capability to ramp up production quickly to reduce the pressure in the market. In the months ahead, we may see prices at new, higher levels that have not been seen since the emergence of natural gas spot markets in the mid-1980s. How high will prices go this summer? That depends on how hot the summer is. We could well expect $3.50 to $4.00 levels. With a hot summer, prices could spike to $5.

In examining the outlook for the industry in our new study, ‘‘The Future of North American Natural Gas,’’ we see major challenges. Natural gas is a critical fuel for the United States for both energy and environmental reasons. It currently provides 23% of our total energy. It heats 53 million homes; it is a major feedstock for industry; and, increasingly, it will be the key to our future electricity supplies. Electricity will continue to become ever more central to our economy; the digital economy depends upon a very high quality electricity supply system. But that system, in turn, will more and more depend upon natural gas. Indeed, the United States is making a major bet on future gas supplies—without realizing it. Currently, just 15% of our electric generating capacity is fired by natural gas. However, 96% of proposed new generating capacity is gas-fired. The reasons are cost, flexibility, technology, and environmental attractiveness. Demand for natural gas is currently very strong, owing to the completion of new gas-fired power generation plants, and as economic growth stimulates the demand for the power that both new and existing gas-fired generation units can produce. At the same time, high decline rates in existing natural gas production require higher levels of drilling to maintain supply—let alone keep pace with demand.

The supply response—new exploration and development—has been slow in coming for many reasons. One is the continuing impact of the 1998-99 price collapse, which devastated the cash flows of the upstream oil and gas industry and continues to leave many companies cautious and capital-constrained. An industry that has been hurt by boom-and-bust cycles is leery of setting off another one. The industry has downsized so much in response to lean times that it faces a shortage of labor.

At least until recently, capital that might otherwise have flowed into the industry instead went into the technology sector of the stock market—although that may well be ending. The result is that US supply is likely to be down this year, while western Canadian supplies are only now beginning to grow, and then only modestly. Reversing the recent declines in US wellhead supply requires offsetting higher decline rates in existing production and moving beyond the current plateau of drilling. Greater investment is needed in exploration as well as in development areas. Nevertheless, CERA expects supply to begin to show year-over-year increases in the United States toward the end of 2000, and in Canada supply growth is at last expected to be evident this spring. CERA does believe that there is the gas supply potential to meet the challenges of increased demand from power generation at a price that would not discourage that market development. It is very important to avoid short-term government intervention in the market that would discourage investment in supply.

There are big challenges ahead. Number one is to reverse the decline in supply. To meet the target of 30 TCF in ten years, compared to the 22 TCF today, will require something on the order of half a trillion dollars in U.S. upstream development. We will need to add 300 to 350 TCF of new reserves in this decade, which is 50% more than we added in the 1990s. We will need to connect to new frontiers of gas development, including the Arctic, and reduce the pressure on the gas infrastructure system in this country. And we need to recognize the very large bet that we are making on natural gas to power the growth of our new economy.
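[Purely illustrative arithmetic combining the round numbers above; the implied cost per Mcf is a derived figure, not one from the testimony.

```python
# Combining the figures cited above. None of the derived numbers appear
# in the hearing record.

TARGET_TCF = 30                 # annual production target in ten years (per testimony)
CURRENT_TCF = 22                # annual production today (per testimony)
NEW_RESERVES_TCF = (300, 350)   # reserve additions needed this decade (per testimony)
UPSTREAM_INVESTMENT = 0.5e12    # roughly half a trillion dollars (per testimony)

print(f"Required growth in annual output: {TARGET_TCF / CURRENT_TCF - 1:.0%}")  # ~36%

for tcf in NEW_RESERVES_TCF:
    mcf = tcf * 1e9             # 1 TCF = 1 billion Mcf
    print(f"Implied spend per Mcf of new reserves: ${UPSTREAM_INVESTMENT / mcf:.2f}")
# Roughly $1.40 to $1.70 per Mcf of reserve additions.
```
]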

PHIL SHARP, Harvard electricity policy group, John F. Kennedy school of Government, Harvard University.  I am a Lecturer in Public Policy and recently chaired the Secretary of Energy’s Electric System Reliability Task Force.  From 1975 to 1995, I was a Member of Congress from Indiana and for 8 years had the honor of serving as Chairman of the Energy and Power Subcommittee. During those twenty years, I participated in nearly all major legislative efforts regarding energy policy and clean air policy as well. During that time I supported policies which proved effective and others which did not. Fortunately, for me, no one is keeping score.

Our basic energy policy for assuring adequate supplies of oil and natural gas is reliance on the competitive market. Stick with it. That has been the policy for two decades. After years of intense political and ideological dispute, after several ‘‘energy crises,’’ after experimenting with extensive economic regulation, a broad consensus emerged toward the end of the 1970’s that market forces, not the government, should determine the price and allocation of oil and gas supplies—that market forces would be the main determinant of how we produce, distribute, and use oil and gas. It is a bi-partisan policy. It is an effective policy. It is not well understood. Our policy has never been, however, simply one of laissez-faire. For reasons of equity, security, and environmental protection, we have a host of supplementary policies that seek to shape those market forces. Some of these supplementary policies have big impacts on the oil and gas markets and should be periodically re-examined. The unsettling price swings of this past year naturally raise doubts about our market policy; and they inevitably give rise to political calls for short term-policies that will stabilize prices to avoid short term pain—either for producers or for consumers. The overriding lesson from the past: ‘‘just say no’’ to proposals for controlling prices—directly or indirectly. Be wary of the siren song that lures the government to try to smooth the price path in a turbulent market. No one yet is calling for direct price controls. Perhaps, we have truly learned how ineffective, counterproductive and costly oil and gas controls can be for consumers and the economy. Perhaps, there is also memory of how politically difficult it was to change or abandon them. Like many others in Congress, I underwent the metamorphosis from supporting controls to helping end them. Today, however, we hear proposals aimed at controlling price spikes—not directly, but indirectly: release crude oil from the nation’s strategic petroleum reserve; create a regional product reserve in New England to quell future spikes; require private suppliers of fuel oil to maintain stocks at levels set by the government.

While not nearly as draconian as direct price controls, these proposals suffer some of the same disabilities. They assume the government can regularly out-guess the complex and rapidly changing market place. They are not likely to produce the desired result. They seldom can be invoked in a timely fashion. They often produce unexpected and undesirable consequences. Earlier this year, the President was wise not to draw down the Strategic Petroleum Reserve; and the Congress was wise not to collectively press for such action. Like it or not, in our enormously complex economy, it is the change in prices which stimulates added production, which moves products to where they are needed (like fuel oil to New England), and which encourages consumers to take seriously energy efficiency.

The pain of price swings and the benefits of the market, of course, are not evenly distributed. That is why supplementary policies such as low-income energy assistance and weatherization are the compassionate courses to pursue, rather than efforts to control the price level.

We are energy ‘‘interdependent’’—integrally connected to world oil and gas markets. While we Americans today are far more cognizant of ‘‘globalization’’ than we were in the 1970’s, it is important to remind ourselves that we live in an energy interdependent world which is very difficult for most of us to understand. This may explain why some of our rhetoric about energy policy is so at odds with reality and why so many policy proposals miss their mark. One of the major lessons from the 1970’s: there is no set of acceptable import reduction policies which can achieve anything close to oil ‘‘independence.’’ In nearly every conceivable way, we are a part of the international market—in terms of products, prices, capital investment, environmental impact, etc. There certainly are benefits to us and to the world market if we can lessen our reliance on oil. But when judging proposals that purport to cut our imports, it is important to carefully ascertain the real benefits and carefully weigh them against the real costs—economic and environmental. Cutting US imports, for example, by several million barrels—would diminish our drain on the world export market and diminish the potential impact on parts of our economy if there were a disruption in Persian Gulf production. But such a reduction in imports would in no way end our strategic concern with world oil markets in general and Persian Gulf supplies in particular. US crude oil prices are largely set by the world market; the economies of our major trading partners rely heavily on oil; and nearly 2/3 of the world’s proven oil reserves are located in the volatile Persian Gulf region.

As the Congress considers proposals to facilitate production and distribution, it should also consider proposals that advance efficiency in the use of energy. In the 1970’s, there was much rhetorical fighting over whether we could ‘‘produce’’ our way out of the crisis or ‘‘conserve’’ our way out—as if we had an ‘‘either-or’’ choice. In the end, of course, the market dictated both; and policies were adopted in the name of doing both. Efficiency improvements have played and will play a major role in helping us economically fuel the economy in environmentally acceptable ways. Since the 1970’s, significant efficiency gains have been made in nearly every sector of consumption. While market prices and market forces have been, and should be, the central driver of efficiency, government policies undoubtedly contributed to those gains—through R & D, tax incentives, jawboning, and, in a few instances, minimum efficiency standards. In the United States, if we are talking oil, we are talking automobiles—that is, passenger vehicles. It is disturbing that the projections for fleet fuel economy improvements are so dismal. This must be a matter of public concern. One of the chief reasons for moving to competition in the electric utility industry is to accelerate adoption of efficiency innovations throughout the system—from the generator to the customer.

William F. Martin, Chairman, Washington policy & analysis.   Natural Gas Consumption in the US Can Increase by 60% in the Next Twenty Years. Our latest study Fueling the Future: Natural Gas & New Technologies for a Cleaner 21st Century, reveals that consumption of natural gas could increase by almost 60% over current levels, from 22 quadrillion Btus (quads) in 1998 to 35 quads by 2020.

A Business as Usual Scenario for the US Energy Future Emphasizes Coal, Oil and Natural Gas. Since the first WPA study on natural gas almost twelve years ago, the prospects for natural gas have improved, due in part to political and economic support for the natural gas industry and energy sector deregulation. Our energy economy has improved significantly over the last twenty years in terms of efficiency, and our domestic energy resources have also expanded—especially coal, nuclear energy and natural gas.

This led WPA to the conclusion that both coal and nuclear power remain important for electricity generation. In fact, we expect that nuclear and coal capacity will not decline as precipitously by 2020 as many forecasts predict. Our projections assume that approximately two-thirds of all nuclear plants scheduled for retirement before 2020 extend their licenses and remain operational. Natural gas consumption is also seen growing from 22 quads in 1998 to 29.7 quads by the year 2020, primarily in the electrical sector. We also see a significant increase in oil imports, due in part to higher demand and declining domestic production. Under these assumptions, natural gas maintains market share in the electrical sector but makes relatively few inroads into growing end-use demand within the transportation, commercial, residential and industrial sectors. Foreseeable problems related to supply and demand constrain the expansion of natural gas usage. On the supply side, WPA assumes that much of domestic natural gas reserves, both onshore and offshore, remains restricted or off-limits to exploration and production. Additionally, we assume that pipeline growth is constrained by factors including siting problems and inadequate capital investment, based in large part on uncertainties about future demand in specific areas.

Natural Gas Can Play a Larger, More Direct and Dynamic Role in Meeting Our Energy Needs

We see steady penetration of gas into the electrical market, but the key to the success of this scenario is the penetration of end-use markets, including vehicles powered in a variety of ways by natural gas, gas cooling, and increased use by key industrial sectors. This projection foresees greater use of gas for distributed generation providing site-based power for the industrial and commercial sectors, and by 2020 even the residential sector will see growth in this category. The increasing share of distributed generation is reflected in WPA's projections by end-use sector. In the electricity sector, coal and oil show little or no growth in market share but grow in absolute terms. In our Current Trajectory scenario, coal exceeds 61% of generation share, while in the High Gas Use scenario it remains well below that, at under 56%. Under this scenario natural gas consumption in 2020 is nearly 6 quads above the Current Trajectory. Roughly half of the increase is attributable to the residential and commercial sectors, where more new customers choose gas and more customers convert from other fuels to gas. This scenario also exhibits continued expansion in a number of successful new markets such as residential gas fireplaces and commercial gas cooling. Additionally, distributed generation in the form of reciprocating engines, microturbines and fuel cells advances, accounting for roughly 20% of all new electricity generating capacity and 5% of total capacity by 2020.

Industrial gas demand is roughly 2.5 quads higher, continuing the robust growth of the past 10 to 15 years. Although the cogeneration market becomes saturated, other forms of distributed generation are expected to prosper, and highly efficient heating, cooling and process equipment continues to evolve, enabling gas to remain the dominant industrial energy source.

Natural gas cars, trucks and buses consume over 1 additional quad. Although these vehicles account for less than 1% of the overall vehicular market in 2020, they can make significant contributions to air quality and operational economics, primarily in fleet applications in congested urban areas.

There are Adequate US Reserves of Natural Gas to Meet a 35 Quad Future at Reasonable Prices

The decade of the 1990's has demonstrated the vast and diverse nature of the gas resource base. Further, the resource base continues to ‘‘expand,’’ as estimates today are larger than those made in the early 1990's by the same estimators—despite the fact that we have produced and consumed over 150 trillion cubic feet in this period. Some components of today's gas supply were not even acknowledged 10 to 15 years ago. Coalbed methane, for example, which now accounts for 6% of domestic gas production, was not included in most resource base estimates prior to 1988. There were tremendous technological advances in the past 10 years, from 3D seismology to horizontal drilling and innumerable computer-related breakthroughs. Similar advances will be required, and should be anticipated over the next 20 years, in order to satisfy a 35 quad demand level. Such advances will enable domestic production to increase from over 19 quads today to over 29 quads in 2020. Canada will contribute a slightly greater share in the future, increasing its exports from 3 quads per year to roughly 5 quads. Abundant worldwide and Alaskan gas resources offer mid-term insurance, while methane hydrates and other more exotic sources provide longer-term potential.
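[A rough consistency check: the supply pieces cited above can be totaled against the 35-quad demand level. The residual attributed to other sources is an inference, not a figure from the WPA study.

```python
# Totaling the 2020 supply figures quoted above (quads per year).

domestic_production = 29    # "over 29 quads in 2020" (per testimony)
canadian_imports = 5        # "roughly 5 quads" (per testimony)
demand_target = 35          # the 35-quad High Gas Use scenario

identified = domestic_production + canadian_imports
residual = demand_target - identified

print(f"Identified supply: {identified} quads")
print(f"Residual to reach {demand_target} quads: {residual} quad(s)")
# The ~1 quad balance would presumably come from LNG and the "worldwide and
# Alaskan" resources mentioned above -- an inference, not a WPA figure.
```
]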

A Shift to Greater and More Direct Use of Natural Gas Provides Substantial National Benefits.

There are many reasons why the country should capitalize on this powerful national asset in order to clean up the environment, spur economic growth, reduce oil imports and conserve energy. Several key advantages of the High Gas Use scenario follow. First, natural gas is inherently cleaner-burning than coal or oil. Switching from those fuels to gas will reduce greenhouse gas emissions, acid rain, smog, solid waste and water pollution.

The efficiency of the natural gas system helps conserve the nation’s energy resources. When the entire energy cycle of producing, processing and transporting energy is measured, natural gas is delivered to the consumer with a total energy efficiency of about 90%, compared with 27% for electricity.
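[The 90%-versus-27% comparison is a full-fuel-cycle claim. The sketch below shows how chained stage losses produce numbers of that order; the stage efficiencies are assumptions chosen for illustration, not figures from the WPA study.

```python
# Illustrative full-fuel-cycle efficiency comparison. All stage efficiencies
# below are assumptions for this sketch, not WPA's numbers.

def chain(*stages):
    """Multiply stage efficiencies to get end-to-end delivered efficiency."""
    total = 1.0
    for s in stages:
        total *= s
    return total

# Natural gas: production/processing losses, then pipeline compressor fuel and leakage.
gas_delivered = chain(0.97, 0.93)

# Electricity: upstream fuel supply, ~33% average fossil generation efficiency
# (roughly typical of the late-1990s fleet), then transmission and distribution.
electricity_delivered = chain(0.95, 0.33, 0.92)

print(f"Gas delivered to the consumer:         {gas_delivered:.0%}")
print(f"Electricity delivered to the consumer: {electricity_delivered:.0%}")
# Prints roughly 90% vs 29%, the same order as the 90% and 27% cited above.
```
]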

Natural gas is a highly reliable North American form of energy. About 85% of the gas consumed each year in the United States is produced domestically; the balance is imported from Canada. In comparison, roughly 60% of the oil used in the United States is imported, primarily from the members of OPEC. We project that oil imports could be reduced by about 2.6 million barrels a day (equivalent to Venezuela's current total production levels). Billions of dollars could be saved over the next decades as distributed generation accounts for approximately 20% of new electricity generation capacity, avoiding the need to build approximately 150 capital-intensive large-scale power plants. GNP and trade balance improvements would occur as global gas demand spurs US exports of gas-using technologies. The US leads the world in terms of its natural gas infrastructure, and US companies are providing their equipment to countries in Latin America, Europe and the Far East that are just beginning to develop natural gas systems.

Melanie A. Kenderdine, Acting Director of Policy, U.S. Department of Energy.  Increasing the average fuel efficiency of America's automobiles by just 3 miles per gallon would save us almost a million barrels of oil per day. This demonstrates the value of fuel-efficient vehicles and why we have focused a great deal of effort on our PNGV program to produce a prototype 80-mile-per-gallon vehicle by 2004.

Jay Hakes and the EIA Office predict that we are going to need 1,000 new power plants in this country by 2020, and 900 of them will be powered by natural gas.

The potential for increased savings in energy demand in the U.S. economy remains enormous.  And to meet growing energy demand, it remains essential.

ENERGY POLICY. The Administration’s ‘‘First Principle’’: Reliance on Market Forces.

The Clinton/Gore Administration has published two statements of its national energy policy in the last several years: Sustainable Energy Strategy (July 1995) and The Comprehensive National Energy Strategy (CNES, April 1998). Both documents provide a guide to energy policies proposed and implemented by the Administration, and seek to ensure that energy policy is well integrated into the Nation’s economic and national security policies.

We have completed two scientific reviews of energy-related technology development, Federal Energy Research and Development for the Challenges of the 21st Century, in 1997 and, more recently, Powerful Partnerships in 1999. These provide an analysis of energy technologies being developed by the Department, and make recommendations on how to best utilize these technologies both domestically and internationally. Finally, the Department has, over the last several years, engaged in numerous road-mapping exercises with industry, government, and academic stakeholder groups, and two extensive energy portfolio exercises, in which we matched our energy R&D investments against larger strategic goals of the CNES. This process, followed by an analysis of the portfolio, has helped us to identify gaps in our portfolio and opportunities for additional investments in energy technology. The Comprehensive National Energy Strategy, which DOE released in 1998, identified five overarching energy goals:

  • Improving the efficiency of the energy system;
  • Ensuring against energy supply disruptions;
  • Promoting energy production and use in ways to protect human health and the environment;
  • Expanding future energy choices; and
  • Cooperating internationally on energy issues.

The Nation's Energy Challenges. In addition to identifying five energy goals, the CNES highlighted three major energy challenges for policy-makers. These are:

  • Maintaining America's energy security in global markets;
  • Harnessing the forces of competition in restructured energy markets; and
  • Mitigating the environmental impacts of energy use.

While each of these challenges warrants different Government actions, there is a need to invest in the development of alternatives and longer-term technologies to meet our future energy needs. We have added an additional long-term energy responsibility to the three CNES challenges:

  • Ensuring a diverse set of reliable and affordable energy sources for America, now and in the future.

Challenge #1: Maintaining America’s Energy Security in Global Markets. The United States remains heavily dependent on crude oil. Since 1985, domestic crude oil production has declined by 34%, while domestic oil consumption has increased by more than 22%. In 1974, net imports of crude oil and products supplied about 35% of U.S. consumption. In 1999, net imports supplied about 50% of U.S. consumption. The Administration’s response to the important role of oil in our economy and the increase in net imports recognizes the following:

  • Consumption of oil continues to grow.
  • The cost of oil production in the U.S. is high relative to other producing nations.
  • The price of oil is a world price. High or low prices of oil worldwide will mean high or low prices domestically.
  • Reducing volatility in oil prices will spur investment and match supply to demand.
  • Global capacity must be increased if we are to meet domestic and international demand for oil.

To spur domestic production and lower the costs of doing business—without imposing quotas on imported oil, which would raise costs to consumers—the President has proposed tax incentives for 100% expensing of geological and geophysical (G&G) costs, and allowing the expensing of delay rental payments. G&G expensing will encourage exploration and production. Delay rental expensing will lower the cost of doing business on federal lands. The Administration has also supported and promoted virtually all significant energy legislation enacted by the Congress over the last seven years. This includes legislation for: Deepwater Royalty Relief; lifting the ban on the export of Alaska North Slope Oil; Royalty Simplification; privatization of the Elk Hills Naval Petroleum Reserve; the transfer and lease of Naval Oil Shale Reserves One and Three for production; Alternative Minimum Tax (AMT) and percentage depletion tax relief for small operators; and creation of a guaranteed loan program for small domestic oil and gas producers. The Administration has also proposed legislation to transfer Naval Oil Shale Reserve Two to the Ute Indian Tribe for production; USGS estimates that there may be as much as 0.6 tcf of gas on this property.

To ensure that we are not overly reliant on imports from a single region of the world, we have diversified our sources of supply. Although our oil imports have increased, our sources of these imports have changed significantly over the last two decades. Last year, we imported 4.85 million barrels of oil per day from OPEC nations, down 22% from the 6.19 million barrels of oil per day in 1977. Our imports now come from over 40 countries.

To help the world develop its oil resources and increase world capacity, Secretary Richardson has actively promoted investment and development of the world’s energy resources. Most notably, Secretary Richardson has held two international energy summits—the Western Hemisphere Energy Ministers Summit in New Orleans and the African Energy Ministers Summit in Tucson, to discuss energy issues and plot a course for global energy development and future uses. In addition, the Secretary has traveled to virtually all the major energy producing regions of the world—the Caspian, Russia, the Middle East, Nigeria, Norway, Mexico, and Venezuela—to encourage energy production and business for U.S. energy companies.

We are also investing in reducing net oil imports by focusing on demand side technologies and policies. More than 60% of our oil consumption is for transportation, making vehicle fuel efficiency a ripe target for reducing the consumption side of the net import equation. Specifically, the Department’s transportation program is:

  • developing an 80 mile-per-gallon (mpg) prototype sedan by 2004 through our Partnership for Next Generation Vehicles Program;
  • improving light truck fuel efficiency by 35% while meeting newly issued EPA Tier 2 emission standards by 2004;
  • developing technologies to increase fuel economy of the largest heavy trucks from 7 to 10 mpg (nearly 50%) by 2004;
  • increasing domestic ethanol production to 2.2 billion gallons per year by 2010;
  • developing production prototype vehicles that will double the fuel efficiency of tractor-trailer trucks and triple the efficiency of heavy-duty pickups;
  • supporting tax credits for hybrid vehicles.

Increasing the average fuel economy for cars and light-duty vehicles by just 3 miles per gallon would save almost a million barrels of oil per day. This represents over 15% of current U.S. daily production. Investing in fuels and more fuel-efficient vehicles could substantially reduce our reliance on imported oil while contributing to a cleaner, healthier environment. Without minimizing the importance of increased oil production, it is clear that even a small commitment to greater vehicle efficiency will net significant gains in reducing net oil imports, without compromising pristine onshore or offshore environmental ecosystems.
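[The million-barrel figure is easy to reproduce with back-of-the-envelope arithmetic. The fleet figures below (about 8 million barrels per day of light-duty gasoline use, a roughly 20 mpg fleet average, and roughly 5.8 million barrels per day of U.S. crude production around 2000) are assumptions for the sketch, not numbers from the testimony.

```python
# Back-of-the-envelope check of "3 mpg saves almost a million barrels a day."
# Fleet figures are assumptions roughly representative of 2000, not from the record.

FLEET_GASOLINE_BPD = 8.0e6   # assumed light-duty gasoline use, barrels/day
AVG_MPG = 20.0               # assumed fleet-average fuel economy
IMPROVEMENT_MPG = 3.0        # the improvement under discussion
US_PRODUCTION_BPD = 5.8e6    # assumed U.S. crude production, barrels/day

# For the same miles driven, fuel use scales with 1/mpg.
savings_bpd = FLEET_GASOLINE_BPD * (1 - AVG_MPG / (AVG_MPG + IMPROVEMENT_MPG))

print(f"Savings: {savings_bpd / 1e6:.2f} million barrels/day")
print(f"Share of assumed domestic production: {savings_bpd / US_PRODUCTION_BPD:.0%}")
# Roughly 1 million barrels/day and ~18% of production, consistent with the
# "almost a million barrels" and "over 15%" figures cited above.
```
]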

Ensuring the reliability of the energy grid is a growing focus of the Administration's R&D efforts. While the electricity system powers other infrastructures, it will also be increasingly dependent on natural gas as a fuel source for both central power stations and small, distributed generation. EIA's Annual Energy Outlook 2000 projects annual growth of 4.3% in the use of natural gas for electricity generation through 2020.

In addition, our energy delivery systems are becoming increasingly reliant on telecommunications and computing systems for fast, efficient operation. These trends will likely result in increased efficiencies and a range of new consumer products, but can also potentially increase physical and cyber threats to our energy infrastructure. To ensure the reliability and security of the electricity and natural gas infrastructures, the Administration has proposed a new Energy Infrastructure Reliability initiative with three components:

  1. electric reliability, which will focus on regional grid control, distributed resources and microgrids, information system analysis, possible offsetting of peak summertime electric load (with distributed generation and natural gas cooling technologies, for example), and high-capacity transmission;
  2. natural gas infrastructure reliability, to include storage, pipeline and distribution R&D; and
  3. secure energy infrastructures, including vulnerability assessments, interdependency analysis, risk analysis, and the development of protection and mitigation technologies.

The President’s Bioenergy and Biobased Products Initiative is intended to address this growing need. Recent scientific advances in bioenergy and biobased products have created enormous potential to enhance U.S. energy security, help manage carbon emissions, protect the environment, and develop new economic opportunities for rural America. This nation has abundant biomass resources (grasses, trees, agricultural wastes) that have the potential to provide power, fuels, chemicals and other biobased products. The President has set a goal of tripling U.S. use of biobased products and bioenergy by 2010, which would generate as much as $20 billion a year in new income for farmers and rural communities, while reducing greenhouse gas emissions by as much as 100 million tons a year—the equivalent of taking more than 70 million cars off the road.

DOE’s Carbon Sequestration Program is designed to develop technologies and practices to sequester carbon that: are effective and cost-competitive; provide stable, long-term storage; and are environmentally benign. Increased carbon emissions are expected unless energy systems reduce the carbon load to the atmosphere. Accordingly, carbon sequestration—carbon capture, separation and storage or reuse—must play a major role if we are to continue to enjoy the economic and energy security benefits which fossil fuels bring to the nation’s energy mix.

The U.S. uses 94 quads of primary energy a year. The nation’s 100 million households and 4.6 million commercial buildings consume 36% of the total. Buildings also use two thirds of all electricity generated nationally. Energy consumption in buildings is a major cause of acid rain, smog and greenhouse gases, representing 35% of carbon dioxide emissions, 47% of sulfur dioxide emissions and 22% of nitrogen oxide emissions. Clearly, more efficient buildings will pay big dividends in reduced energy use and a cleaner environment.
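For scale, the buildings share quoted above can be restated in absolute terms; this is a simple arithmetic restatement of the figures in the paragraph, not an additional estimate:

```python
# Restating the buildings figures quoted above in absolute terms.
TOTAL_PRIMARY_QUADS = 94.0   # U.S. primary energy use per year (from testimony)
BUILDINGS_SHARE = 0.36       # households + commercial buildings (from testimony)

buildings_quads = TOTAL_PRIMARY_QUADS * BUILDINGS_SHARE
print(f"Buildings consume roughly {buildings_quads:.0f} quads of primary energy per year")
# ~34 quads/year, comparable to the ~35 quads the testimony attributes
# to the industrial sector in 1997.
```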

Research and development areas for buildings include: heating, ventilation, and air conditioning; building materials and envelope; building design and operation; lighting; appliances; and on-site generation. To use energy more efficiently, we are working to develop ‘‘intelligent building’’ control systems, more efficient appliances, and fuel cells to power commercial buildings. Standards to improve the energy efficiency of fluorescent lighting in commercial and industrial applications, proposed this March, are expected to save between 1.2 and 2.3 quadrillion BTUs of energy over 30 years, enough energy to supply up to 400,000 homes per year over the same time period. We have recently proposed an update to the efficiency standards for water heaters, and expect to issue proposals for clothes washers and central air conditioners in the near future—each of which is likely to produce even greater energy and environmental benefits.

The industrial sector consumed almost 35 quads of primary energy in 1997—about 38% of all energy used in the United States. The industrial sector contains extraction industries, as well as materials processing and product manufacturing industries. Over 80% of the energy consumed in manufacturing (including feedstocks) occurs in only seven process industries: aluminum, steel, metal casting, forest products, glass, chemicals, and petroleum. These major process industries are becoming more capital-intensive, and their markets are continuing to become more competitive globally.

The Department’s goals are to:

  • increase the average fuel efficiency of new cars and light trucks by 20% by 2010;
  • reduce the annual energy consumed by buildings; and
  • by 2010, reduce energy consumption in federal facilities by 35% relative to the 1985 consumption level, saving taxpayers $12 billion from 2000-2010.

These reductions in energy demand will result in comparable reductions in greenhouse gas emissions, as well as reductions of other environmental impacts associated with energy use. Of course, none of this can be achieved without the active support of other agencies, industry and consumers. DOE looks forward to working with the Congress to develop and fund programs to increase the efficiency of our transportation, commercial, manufacturing and building sectors in order to save energy, increase the competitiveness of U.S. industry, and reduce our reliance on imported oil.

Investing in Renewable Power Sources.  Renewable resources such as wind, solar, photovoltaics, geothermal, biomass, hydrogen, and hydroelectric are abundant. These alternatives are used for power generation, and their primary advantage is that they produce virtually no emissions or solid wastes. Their primary disadvantages are the cost of producing power (except hydro) compared to coal and natural gas, and the need to create the infrastructure required to deliver this power to market.

To take advantage of the environmental benefits of renewable power, the Department has focused on decreasing its costs and tackling infrastructure issues. The most feasible approach to lowering cost and delivering renewable power appears to be through distributed generation—alternatives to central power stations, where power is generated locally or on-site. Distributed generation technologies are a major R&D focus at DOE.

The Government’s Commitment: Ensuring a Diverse, Reliable and Affordable Set of Energy Sources for the Future.  The energy options within our portfolio are oil, gas, coal, energy efficiency, renewables, hydropower, fission, and fusion. We must strategically manage energy R&D with this understanding about the energy world as we know it: there is no single silver bullet that will solve all our energy needs, making science and technology—and a broad-based energy R&D portfolio—key to meeting our long-term energy needs. Without energy technologies, a ton of coal, a barrel of oil, a cubic foot of natural gas, a ton of uranium ore, a stiff breeze, or the sun’s warmth cannot directly contribute to the prosperity of modern society. With the very best technologies, however, society can use energy resources efficiently and responsibly and with great economic and environmental gain. While economic and security challenges continue to demand investment in a robust energy research and development (R&D) program, environmental challenges provide additional impetus for increased focus on energy-related science and technology during the coming years.

DOE’s energy resources R&D portfolio is organized in three broad strategic areas: reliable and diverse energy supply ($170 million, FY01 request); clean and affordable power ($542 million, FY01 request); and efficient and productive energy use ($437 million, FY01 request). In addition, the Department has a basic science portfolio ($1.2 billion, FY01 request) which supplies the foundation for much of the applied R&D in the energy areas. A number of reviews and studies have been conducted that provide valuable information on the adequacy and focus of this portfolio. Overall, these studies have confirmed that our energy portfolio is generally well-focused on the nation’s strategic energy goals. However, the studies also have identified a number of deficiencies in how fully these goals are addressed by the portfolio and made a number of recommendations for important portfolio changes or additions, including:

  • Significantly enhanced R&D funding
  • Renewed emphasis on electric power systems reliability
  • A Nuclear Energy Research Initiative
  • Carbon management R&D
  • Increased bioenergy R&D
  • Methane hydrate R&D
  • Hydrogen R&D
  • Clean fuels R&D
  • Integration of fuel cell R&D efforts

The Administration strongly supports the increased use of natural gas. Several of these recommended changes or additions to our portfolio relate directly or indirectly to natural gas—power systems reliability, carbon management, methane hydrates, clean fuels, and fuel cells all involve the development of technologies to increase the supply of, improve the delivery of, or improve the environmental performance of natural gas.

Because it is abundant and relatively clean, natural gas will be the fuel of choice to meet the nation’s future power generation needs. Of the 1,000 power plants the Energy Information Administration (EIA) projects the U.S. will need by 2020, 900 will probably be natural gas power plants. If we are able to produce the gas to meet this need, we will also need the means to distribute it safely and efficiently. Right now, there are 85 proposed pipeline projects just for the years 2000 through 2002, yet significant impediments exist to pipeline and storage siting.

Investments in Natural Gas.  The Clinton/Gore Administration has invested roughly $1.5 billion in natural gas R&D. DOE’s joint efforts with industry have helped produce the fuel cells, microturbines, reciprocating engines, and other enabling technologies to power the gas industry of the future. DOE’s request for natural gas R&D funding in FY 2001 is around $215 million and, as I mentioned earlier, includes an initiative for energy infrastructure reliability. The natural gas portion of this initiative specifically focuses on methane leakage, aging and corroding pipelines, and natural gas storage, to improve the safety and reliability of the natural gas distribution network.

We need to encourage increased natural gas supply. The National Petroleum Council’s recent study on natural gas projects consumption of natural gas increasing to 29 trillion cubic feet (Tcf) in 2010 and 31 Tcf by 2015. At the same time, EIA estimates that in 1998, reserve additions of natural gas were only 83% of production. To meet this demand, we will need to ensure that we have an adequate supply of natural gas.

Clearly, energy is the engine that drives our economy.

In the utility industry, only one third of all thermal energy from coal or gas is actually transformed into electricity in a typical power plant. In the transportation sector, which accounts for 60% of the nation’s demand for oil, the fuel economy of passenger vehicles has actually declined in recent years due to the increasing market share of SUVs and minivans. Net imports of oil continue to increase and domestic oil production has declined.

Mr. LARGENT. Dr. Yergin, I wanted to talk to you about your book “The Prize: The Epic Quest for Oil, Money and Power”. What I find missing from the discussion are the historical views of World War II and the demise of the Japanese and the Germans and its relationship to the lack of supply of petroleum products. And one of the things that I found missing on this particular panel is the discussion of our dependence on this importation of oil as it relates to our national security. And I would just like you to make some comments about whether there are some parallels that we need to be aware of in what occurred to Japan, and to Germany to a lesser extent, in terms of their national security and the lack of a consistent, reliable energy source.

Mr. YERGIN.  It was for me one of the amazing revelations, over the many years I spent researching that story, to look at World War II differently and to see the degree to which oil had been among the triggers of the war. Why did Hitler invade the Soviet Union? Partly to get the oil, in, of all places, Baku and the Caspian, which are so much in attention now. And our embargo against the Japanese, embargoing them on oil prior to the Second World War, was connected to the Japanese march toward what became Indonesia. And of course, looking at the conduct of the war, you see that ultimately, in many ways, both the Japanese and the German military machines were immobilized by lack of fuel. And it was a great strength that we had, as Congressman Hall mentioned, the vitality and the creativity and the energy of the U.S. oil industry in terms of really supplying the entire Allied effort.

Mr. LARGENT. At what point does the red light on the dashboard go off in terms of national security issues? As we see these forecasts going from 52% to 64%, at what point does the light go off and say, wait a second; this is a real problem. How are we going to address this problem? Where is that line?

Mr. YERGIN. That is talked about a lot. Basically, that line keeps moving up as our imports keep moving up. A lot of it has to do with how you assess the relationship with the countries that are major suppliers of oil. If Iraq had kept Kuwait and was dominating the Gulf, we would have looked at any of these numbers as very alarming. So I think it has to be seen within the context of overall relationships.

Mr. MARTIN.  The red line is an international number and involves all of the world’s dependence on the Persian Gulf. I think we are getting very close to that red line. If we just look in isolation at the U.S., that really doesn’t do much. Indeed, this is an international strategy we are involved with. That, again, is why the IEA is so important. I spent 4 years in the IEA during the second crisis, the Iranian revolution. We may have it all right here, but if other countries are demanding more and more oil then we lose, too. I note, for example, that it is remarkable how much Asian demand is increasing on the Persian Gulf, far more than ours at the moment. So what do we do with the Japanese, the Koreans, even the Chinese? And why are the Chinese sending missiles to Iran when they are getting oil from the Gulf?

Michael L. Johnson, VP & General Manager, CONOCO, INC.  Producers are highly optimistic about the long-term supply of natural gas. A recent National Petroleum Council study estimates the recoverable natural gas resource base in the lower 48 States is 1,400 trillion cubic feet. That is enough for many decades into the future. Today’s estimate is substantially higher than in 1992, and that is in addition to the large volume of natural gas we produced between those years. New technologies are permitting us to increase recoverable gas resources faster than we are consuming them. One reason for this increased resource base is an expansion into frontier areas such as the Mackenzie Delta, the Beaufort Sea and the North Slope. Tapping into these resources is not inexpensive, however. Climate and terrain are frequently hostile. We must constantly balance cost, risk, and potential to keep natural gas prices competitive.

Today we are winning the battle to keep natural gas prices competitive, and the robust supply of natural gas makes us confident of our ability to meet future market demand, demand that we all agree will increase substantially in response to our Nation’s need to fuel our economy. Natural gas producers are also faced with a significant challenge: fueling the majority of new electric power generators that will be added in the next 20 years. We are confident that we can meet that challenge and are preparing to serve that growing demand in addition to the traditional markets that we have always served.

It would be nice to think that these wrenching industry changes would be enough to ensure continuing gas supplies at competitive prices to all Americans, but they are only part of the solution. Just as vital is a policy climate that permits our companies to produce resources in ways that are cost effective and efficient. We have the knowledge and the technologies to do that, we have the resource base, but we do not have the Federal policies we need to bring natural gas to the American public long-term at competitive prices.

There are people in positions of power today who are misleading the American public. They refuse to acknowledge our industry’s proven ability to produce natural gas in ways that respect and preserve the environment. They refuse to credit our technological breakthroughs. They refuse to respect the Nation’s need for natural gas. As a consequence, they are trying to prohibit natural gas exploration and production in some of our richest resource areas, both onshore and offshore, which continue to be locked up. There are two inevitable consequences.

Conoco is an integrated, international energy company headquartered in Houston, Texas, and is among the top dozen or so U.S. gas producers. The company had revenues of $27 billion in 1999 and operates in more than 40 countries. Conoco’s natural gas and gas products operations include the gathering, processing, distribution, and marketing of natural gas and natural gas liquids in North America, the U.K., Norway and Trinidad. In 1999, Conoco marketed natural gas volumes in excess of 4.4 billion cubic feet per day in the U.S. and Europe.

Let me address the supply of natural gas. I want to emphasize producers’ optimism about the long-term supply of natural gas.

Today, we supply about 23% of the energy America consumes. That’s about 19 quadrillion BTUs (or ‘‘quads’’) of domestic gas, and an additional 3 to 4 quads from independent and affiliated companies in Canada.

There is no doubt that, in the future, the U.S. could—if we chose to do so—dramatically increase the amount of natural gas marketed domestically. The resource is there. There are many estimates of U.S. natural gas supply. All are highly positive. At the request of the Department of Energy, the National Petroleum Council (or ‘‘NPC’’) undertook a study in 1999 that estimates the recoverable natural gas resource base in the Lower-48 states at 1,466 trillion cubic feet (or ‘‘tcf’’). That estimate is significantly stronger than the estimate the NPC made in 1992. In fact, the new study finds a strong probability of at least 171 tcf more than it found in 1992. That’s in addition to the 124 tcf we produced between 1992 and 1999.

The reasons for this growth are the new technologies and new methods of locating resources that the industry has developed and implemented. Our strides in these areas are so rapid that, as the NPC numbers show, we are increasing recoverable resources faster than we are consuming existing reserves. Part of this expansion of the resource base involves a re-exploration and re-assessment of areas that had been assumed to be in decline, such as California. We are also expanding our reach out of the Lower-48 states to new frontiers such as offshore eastern Canada, where experts predict upwards of 45 tcf of natural gas lie. Western Canada, the Mackenzie Delta and Beaufort Sea, and the North Slope offer additional possibilities, through reservoirs in traditional formations and through our greatly increased ability to tap coal-bed methane. Tapping into the resources in these frontier areas is not inexpensive. Climate and terrain are frequently hostile. We must constantly balance cost, risk, and potential in an energy market that is frequently unpredictable.

At the same time as natural gas demand projections are rising above previous expectations, the entire energy industry is becoming more competitive. Restructuring of wholesale and retail markets in natural gas and electricity is well advanced. New technologies like distributed power are giving residential and business customers new options. The result has been high volatility in energy prices. It has been difficult for U.S. companies to predict revenue streams accurately and to plan capital investments. Wall Street has at times been wary of our industry, making it difficult for some of our companies to raise the capital required to expand exploration and production. You only need to look at our stock prices to see that. At the same time, we’re finding that reservoirs in some areas, such as the Western Gulf of Mexico, deplete more rapidly than originally projected. That puts additional pressure on our capital budgets.

As I have explained here today, there is no question that the U.S. has vast natural gas resources and that our producers can bring these resources to market. That information does not, however, answer the question: How much will that gas cost? Will the policies of the U.S. government cause the American people to pay unreasonably high prices for the clean-burning natural gas they need? Mr. Chairman, you have consistently demonstrated your concern for energy costs and competitive markets. You have supported federal policies for the natural gas industry that have reduced the unreasonable regulatory costs that have burdened our industry—and our customers—in the past. You have moved through your committee a bill on electricity restructuring that shows a deep concern for the budgets of American families and for the competitiveness of American industries.

Unfortunately, not everyone in government shares your sharp eye for policy consequences. There are those in positions of power today who are misleading the American people. They are endorsing a position that locks up increasing amounts of land—to prohibit natural gas exploration and production in our richest resource areas, both on- and off-shore. And they are misleading our citizens into believing that can be done without economic ramifications. America’s richest natural gas resources—the resources we can produce most cost-effectively—lie under onshore and offshore federal lands. Our industry can produce this gas in ways that are environmentally sensitive, and we are committed to that goal. Advances in our industry have reduced the impact of gas production on the environment. And dozens of environmentally sensitive technologies are being employed by the industry. Yet, we hear constantly from a number of highly placed federal policymakers who oppose domestic natural gas production and transport. I do not know if they are merely misinformed, or if there is some other reason for their statements. What I do know is that they are trying to convince the American people that it is in their best interest to prohibit natural gas production and transport across much of this nation.

Catherine Good Abbott, Interstate Natural Gas Association of America.  INGAA is the trade association that represents interstate natural gas pipelines in the United States, the inter-provincial pipelines in Canada and PEMEX in Mexico. These pipeline systems transport 90% of the natural gas consumed in the United States.

The Department of Energy’s Energy Information Administration estimates that natural gas use will increase from about 22 Tcf today to 30 Tcf shortly after 2010, reflecting a 36% increase in natural gas use. The largest area of growth, about 60% of this total, is expected in the electric generation market. In some areas, the anticipated growth is even more significant. In the Northeast, for example, demand for natural gas used for power generation is expected to increase by 250%!

To meet the demands of a 30 Tcf market, interstate pipelines will need to construct a significant number of new facilities. A study called ‘‘Pipeline and Storage Infrastructure Requirements for a 30 Tcf U.S. Gas Market’’ was conducted by Energy and Environmental Analysis, Inc. for the INGAA Foundation. This study found that approximately 2,100 miles of new pipeline will be needed every year between now and 2010 to have the capacity necessary to serve this increased demand. The pipeline construction process has become increasingly complex. As you may be aware, it is simply getting more difficult to build any type of new facility. In the U.S., we must obtain and coordinate multiple state and federal environmental permits. We also must consider the concerns of landowners, who are becoming more interested and involved in our projects. To keep our projects economic, we must keep costs under control. And to keep the costs under control, we must promote more efficient review and approval processes—within our companies and among the regulatory agencies that ultimately decide the fate of the project.

As I mentioned earlier, between now and 2010, approximately 2,100 miles of new gas transmission must be added each year to accommodate market growth to 30 Tcf. To accomplish this, the natural gas pipeline industry will need to invest upwards of $32 billion for pipeline transmission and storage facilities.

[my note: $32,000,000,000 / 2,100 miles = $15.238 million per mile of a single year’s construction. Spread over the roughly 21,000 miles to be added between now and 2010, it works out to about $1.5 million per mile, and the total must also include storage and other infrastructure.]
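A quick arithmetic sketch of both readings of that note, assuming roughly ten years of construction at the stated 2,100 miles per year (the exact construction period is not spelled out in the testimony):

```python
# Two ways to read the $32 billion investment figure against the
# 2,100 miles/year of new pipeline cited in the testimony.
TOTAL_INVESTMENT = 32e9     # dollars, transmission + storage (from testimony)
MILES_PER_YEAR = 2100.0     # new pipeline miles per year (from testimony)
YEARS = 10.0                # assumed construction period, roughly 2000-2010

per_mile_one_year = TOTAL_INVESTMENT / MILES_PER_YEAR
per_mile_full_period = TOTAL_INVESTMENT / (MILES_PER_YEAR * YEARS)
print(f"Against one year's mileage: ${per_mile_one_year / 1e6:.1f} million per mile")
print(f"Against ~{MILES_PER_YEAR * YEARS:,.0f} miles over {YEARS:.0f} years: "
      f"${per_mile_full_period / 1e6:.2f} million per mile")
# ~$15.2 million/mile vs. ~$1.5 million/mile; the latter is the more
# meaningful figure, and it still includes storage and other infrastructure.
```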

Pipelines must compete for investment capital in the marketplace with S&P 500 companies with similar risk profiles for the same capital. In most cases, pipelines also have to compete for capital within their own organizations. A fundamental tenet of this decision-making process is that increasing risk requires a return commensurate with that risk. If returns on pipeline investments are not commensurate with the risks inherent in the pipeline business, less capital will be invested in pipeline projects relative to investments in other businesses that have a better risk/return profile.

Costs and risks for the pipeline industry have increased substantially in this decade and will likely continue to do so. These increased risks include: (1) expiration of long-term gas utility transportation contracts and the prospect of non-renewal; (2) movement to shorter-term contracts with new non-gas utility customers; (3) competition from unregulated marketing firms who can buy capacity at regulated rates and sell it at unregulated rates; (4) state commission restructuring of gas and electric services; and (5) federal electric restructuring initiatives which cause uncertainty in the market that offers our best opportunity for future growth.

Roger B. Cooper, Executive VP, Policy & Planning, American Gas Association.  AGA represents 189 local natural gas distribution companies, which deliver natural gas to 60 million customers in the United States. As coal was the dominant fuel of the 19th Century and oil during the 20th Century, we believe that natural gas will be the fuel of the 21st Century.

Today you have heard the testimony of the author of the American Gas Foundation’s study Fueling the Future: Natural Gas & New Technologies for a Cleaner 21st Century. Bill Martin described to you a scenario for an increase in gas demand from 22 quadrillion Btus (about 21.4 Tcf) today to 35 ‘‘quads’’ (almost 34 Tcf) in 2020. He told you about the enormous environmental, energy security and efficiency benefits that will result. If gas consumption in 2020 is 60% higher than today (35 quads) we can:

  • Reduce CO2 by 930 million tons per year.
  • Reduce oil imports by 2.6 million barrels per day.
  • Reduce national energy consumption by 6%.
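The quads-to-Tcf conversions and the 60% growth figure in that summary are mutually consistent, assuming the usual conversion of roughly 1.03 quadrillion Btu per Tcf of natural gas (a standard factor, not stated in the testimony); a minimal check:

```python
# Consistency check of the AGA demand scenario quoted above.
QUADS_PER_TCF = 1.027          # assumed conversion, ~1,027 Btu per cubic foot

demand_today_quads = 22.0      # from testimony
demand_2020_quads = 35.0       # from testimony

print(f"Today:  {demand_today_quads / QUADS_PER_TCF:.1f} Tcf")   # ~21.4 Tcf
print(f"2020:   {demand_2020_quads / QUADS_PER_TCF:.1f} Tcf")    # ~34.1 Tcf
growth = demand_2020_quads / demand_today_quads - 1.0
print(f"Growth: {growth:.0%}")                                   # ~59%, i.e. "60% higher"
```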

The recent report issued by the U.S. Department of Energy’s Power Outage Study Team emphasized the value of natural gas technologies in easing strain on the electric power generation grid. The report noted that ‘‘distributed generation’’—that is, small electric-power generation units that are installed on or near the customer’s premises—can help relieve the demand on the electric grid, especially during peak demand. Natural gas fuel cells and microturbines are among these new distributed generation technologies. Substituting natural gas cooling for electric air-conditioning can also help to level summer electric peaks. Since natural gas use typically peaks in the winter months, this is truly a win-win scenario for both the natural gas and electric industries.

Sophisticated and highly efficient combined cooling, heat and power systems (CHP) are being installed around the country, and are especially popular at universities and large conference centers, such as Opryland in Nashville and the huge McCormick Center in Chicago. Texas is one of the leading states in terms of industrial CHP capacity and is fourth behind New York, California and Pennsylvania in commercial capacity.

Let me give you three examples of how natural gas technologies—which are commercially available—will help meet the nation’s energy goals. In 1994, Thomason Hospital in El Paso, which was undergoing a major renovation/expansion, had to find a way to meet energy needs in a cost-effective manner while locked into the highest electric rates in Texas. Its solution was a combined heat and power plant that includes gas-fired reciprocating engines, dual-fueled boilers, gas engine-driven chillers, and single-effect absorption chillers. This CHP system provides electricity, low-pressure steam (heating and hot water), and cooling for the hospital complex. Benefits to the hospital include reduced electric demand (330 kW), a levelized electric load and $460,000 in annual savings.

My second example describes a ‘‘save-the-day’’ situation at the Brookfield Zoo in suburban Chicago. During the winter of 1998, the local electric utility suffered a power failure that could have resulted in the suffering or death of 200 creatures, ranging from a massive Pacific walrus to jellyfish. But the Zoo had installed a natural gas cogeneration system that kept the facility running. There’s a financial benefit as well: by operating in parallel with the local utility during peak demand hours, the Brookfield Zoo expects to generate a positive cash flow in excess of $700,000 during the next decade. That can feed a lot of walruses.

A final example comes from the retail sector. A Walgreen’s drug store in Chesterton, Indiana, became an energy pioneer last year by installing a natural gas ‘‘microturbine.’’ This unit—about the size of a commercial refrigerator—operates quietly and efficiently, using natural gas as a fuel to turn a small turbine that generates electrical power. The unit is also equipped with a ‘‘desiccant dehumidification’’ system that pulls moisture from the air, keeping customers comfortable while they shop. The biggest benefit is reliability. Because this system runs on natural gas it is independent of the electric grid and therefore is not affected by brownouts or blackouts caused by weather extremes. Imagine how relieved you’d feel if you needed to buy medicine during a local power outage—and the familiar Walgreen’s is the only open business you see in a sea of darkness.

Barry Russell, Independent Petroleum Association of America and the National Stripper Well Association.  I am testifying on behalf of the IPAA, the National Stripper Well Association, and 32 cooperating associations of the IPAA that represent state and regional interests. These organizations represent independent petroleum and gas producers, the segment of the industry that is damaged the most by the lack of a domestic energy policy that recognizes the importance of our own national resources. NSWA represents the small business operators in the petroleum and natural gas industry, producers with ‘‘stripper’’ or marginal wells. Today’s hearing addresses a fundamental issue—National Energy Policy: Ensuring Adequate Supply of Natural Gas and Crude Oil. This testimony will focus first on several key factors that influence this issue and second on actions that should be taken to improve the future domestic supply.

There are many factors that affect the development of a sound national energy policy. This testimony will focus on several key issues in crafting a sound policy to address adequate supply of essential natural gas and petroleum.

Fossil energy—particularly natural gas and petroleum based energy—will continue to dominate energy supply in the United States. According to the National Petroleum Council’s Natural Gas study, natural gas and petroleum account for 64.8% of national energy needs. Future projections show significant growth in the use of these fuels as domestic energy demand continues to increase. The U.S. economy is driven by the availability of adequate energy supplies whether consumed by manufacturing, by transportation to and from jobs, or by the expanding role of computer use and the Internet. It ignores this reality to suggest that an equally robust economy can be sustained without substantial energy growth. And, it ignores this reality to suggest that natural gas and petroleum will not be the dominant share of this growth.

Regardless of changes in world politics, energy supply remains a national security issue. A decade ago, energy supply from foreign sources was viewed as a national security risk in the context of the Cold War—supply routes at risk and energy sources subject to control by adversaries. Today’s national security issue is different, but it is nonetheless significant. Currently, we import over 55% of our nation’s petroleum. It comes from diverse sources, but diversity is not security. In 1973 the OPEC oil embargo crippled this country. Yet, we now import over twice as much petroleum on a tonnage basis from the OPEC countries that embargoed us—and neither Iran nor Iraq participated in that embargo. Whether we like to address it or not, our sources of petroleum come from countries with a history of instability. We are currently importing approximately 500,000 barrels per day from Iraq. Clearly, this is not a reliable source. Saudi Arabia is ruled by a monarchy in a world without ruling monarchs; it is constantly subject to subversion by radical religious elements. Even Venezuela is ruled by a government that has dramatically shifted that country’s priorities over the past two years and continues to be difficult to predict. We must recognize that shifts in any of these suppliers can dramatically and adversely affect our nation and our national economic security.

The past three years have demonstrated how susceptible the U.S. energy supply can be to foreign actions. The precipitous drop in petroleum prices in late 1997 through early 1999 posed a catastrophic threat to domestic petroleum production and a substantial threat to domestic natural gas production. As a result of the extended low petroleum prices in 1998-99, capital investment in petroleum production throughout the world declined. Existing production was lost. In the U.S., production dropped from 6.5 million B/D to less than 6.0 million B/D. Natural gas production suffered as well because the two commodities are linked. This year, the country has seen the inevitable consequences of lost capital in the exploration and production industry—as worldwide demand has increased, worldwide supply capacity has not kept up. This year, petroleum prices have reached levels not seen since the Persian Gulf war. Different segments of the economy have been threatened. In each case, the price and supply issues have largely been defined by the actions of foreign producer nations. Our policies must recognize this vulnerability.

Future domestic natural gas and petroleum exploration and production will increasingly depend on independent producers. Domestic exploration and production of natural gas and petroleum have changed dramatically since 1986—the time of the last petroleum price crisis and major revisions to the federal tax code. Since that time the role of independent producers has increased. Generally, for example, domestic petroleum production is divided roughly 60% from the lower 48 states onshore, 20% from the offshore, and 20% from Alaska. Since 1986, the share of onshore lower 48 states production by independents has increased from about 45% to over 60%. Independents are also increasingly active in the offshore. In the aggregate, independents drill over 85% of the wells in the U.S., produce 45% of the petroleum, and produce over 65% of the natural gas.

This is a trend that will continue. The reasons are straightforward. Large, integrated petroleum companies are driven by their need to generate adequate shareholder returns. In the ‘‘Dot Com’’ world we are living in, this requires finding and developing large ‘‘elephant’’ fields. Mature fields that yield more limited quantities of natural gas and petroleum characterize most of the U.S. Many of the U.S. large-field prospects, such as the Arctic National Wildlife Refuge (ANWR), are not available for development. In the offshore, moratoriums limit many options; those that are left are largely in the ‘‘ultra-deep’’ portions of the Gulf of Mexico. So, compelled to fill their refineries, major integrated companies focus their development funds on the deep Gulf of Mexico and overseas. This leaves the brunt of future domestic resource development to independent producers—large and small. Independents need different policies than integrated companies. Independents rely on revenues generated solely in the upstream and are more susceptible to price swings that strip away critical financial resources.

Natural gas is an increasingly important element of domestic energy supply. The National Petroleum Council Natural Gas study concluded that domestic natural gas demand will increase from the current 22 trillion cubic feet per year (Tcf/yr) to 29 Tcf/yr by 2010. Most of this increase will be needed to fuel expanding electricity generation. The study concluded that: U.S. gas demand will be filled with U.S. production, along with increasing volumes from Canada and a small, but growing, contribution from liquefied natural gas (LNG) imports… Two regions—deepwater Gulf of Mexico and the Rockies—will contribute most significantly to the new supply… U.S. production is projected to increase from 19 Tcf in 1998 to 25 Tcf in 2010, and could approach 27 Tcf in 2015. Deeper wells, deeper water, and nonconventional sources will be key to future supply. Importantly, this study concludes that these future natural gas needs can be met through domestic resources supplemented by other North American resources. Equally important, it identified key issues that had to be addressed to meet these needs.
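The figures quoted above imply the size of the supply gap to be filled by Canadian imports and LNG; a short sketch using only those numbers (the demand and production figures refer to slightly different base years, so treat the result as approximate):

```python
# Implied non-U.S. supply (Canadian imports plus LNG) in the NPC projection.
demand_now_tcf, demand_2010_tcf = 22.0, 29.0            # U.S. demand (from testimony)
production_1998_tcf, production_2010_tcf = 19.0, 25.0   # U.S. production (from testimony)

gap_now = demand_now_tcf - production_1998_tcf
gap_2010 = demand_2010_tcf - production_2010_tcf
print(f"Current gap: ~{gap_now:.0f} Tcf   2010 gap: ~{gap_2010:.0f} Tcf")
# The gap grows from roughly 3 Tcf to roughly 4 Tcf, consistent with the
# study's expectation of increasing Canadian volumes plus a small but
# growing LNG contribution.
```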

It is equally important to recognize that a larger aspect of access to natural resources involves opening access to that which is not now available and halting the trend of further embargoes of western lands. Unfortunately, the Administration avoids dealing with the clear need to open government lands to exploration and production. It hides behind an environmental sensitivity argument that is proven wrong by its own DOE report. It focuses on arguments against opening ANWR and avoids dealing with access issues offshore and in the Rockies, where its own National Petroleum Council Natural Gas study concludes that over 200 trillion cubic feet of natural gas is either off limits or difficult to permit. It is important to understand that access issues differ between these areas.

ANWR and offshore activity off of California, the Eastern Gulf of Mexico, and the Atlantic are constrained by policy decisions, both executive and legislative, through prohibitions and moratoriums. These are based on outdated reactions to spills occurring in the past.

Access in the Rockies won’t be resolved by a single act. Here, we are dealing with a mosaic of limitations. Some involve land that is completely excluded from natural gas and petroleum exploration and production. The Antiquities Act of 1906 has been used to declare areas as national monuments, placing land completely off limits. In other areas, the Department of Agriculture is proposing to expand roadless areas in national forests in ways that will preclude natural gas and petroleum development. Some national forests, like the Lewis and Clark National Forest, projected to be a world-class natural gas source, have been administratively closed to natural gas and petroleum development. Wilderness areas have been created without an understanding of the resources that might be lost.

We must also deal with permitting limitations and other indirect actions of federal agencies. Because these are government lands, it is necessary that federal agencies issue permits for exploration and production activities. These agencies are charged with the task of developing environmental management plans for areas under the National Environmental Policy Act (NEPA). NEPA can be used to create effective, environmentally sound management plans, or it can be used to delay and deny access. Frequently, the results reflect the attitude of the agency and its leaders.

Other factors identified in the NPC Natural Gas study, relating to access to technology and to human resources, must not be overlooked. Domestic natural gas and petroleum development have changed dramatically during the past two decades through the application of new technologies such as 3D and 4D seismic analysis, horizontal drilling, and the use of advanced offshore technologies. The widespread availability of these and other technologies will be critical to meeting future challenges as well. Similarly, the industry has suffered further declines in employment. During the 1998-99 low petroleum price crisis, the natural gas and petroleum extraction industry lost 65,000 jobs, of which only about 7,000 have returned. Employment has dropped below 300,000 from levels that exceeded 600,000 in 1984. These are highly skilled jobs at both the rig operator and engineering level. Many of the domestic industry’s workers are Hispanic. But once people leave the industry, it is hard to attract them back. Attracting new workers is equally difficult. For example, enrollment in petroleum engineering has consistently fallen over the past several years. While these are not issues that are dominated by federal policy decisions, they are nonetheless essential to meeting future natural gas and petroleum demand.

Study after study has shown that hydraulic fracturing is an environmentally sound process involving the brief injection and subsequent removal of fluids to place proppants necessary to open natural gas and petroleum formations for development. Some analysts believe that over 60% of the natural gas wells that will be needed to meet the projected 2010 demand will require hydraulic fracturing. It is exactly this type of poorly targeted regulation that must be avoided.

Mr. BARTON. On the back of your testimony you have a chart, you do not really allude to it in your testimony, but along one axis, it says Wildlife Restrictions is the heading and then big game winter range, sage grouse lek,  sage grouse nesting, mountain plover breeding, mountain plover nesting, raptor nesting, burrowing owl, prairie dog avoidance. Are these all local or Federal restrictions in the Rocky Mountains that you cannot drill during those times?

Mr. RUSSELL. A lot of them are. But it goes to the point that you were talking about before with the pipelines, which is, if you take an area of land and look at all the various environmental restrictions, this is just an example of some of the endangered species’ kinds of restrictions there that block access during certain periods of time.

Mr. BARTON. The prairie dog is not an endangered species, is it?

Mr. RUSSELL. Yes, in some areas it is.

Mr. BARTON. Not in Texas.

Mr. RUSSELL. You would be surprised at the list that we would show you. But what I was going to say is it is not only this list but if you go through and look at not only endangered species but some aspects of the Clean Air Act—the permitting, the monitoring, some of the things that you talked about before which really go to this procedural aspect and the lack of coordination between State and Federal agencies, that what happens is that whole areas are essentially taken off the table for development. That is what we were trying to show there, and I can give you more supporting detail on that.

Red Cavaney, President & CEO, American Petroleum Institute.  First, and most important, there is a clear link between the use of petroleum and economic development. Gasoline-powered automobiles have been the dominant mode of transport for the past century. There is no evidence that consumers are ready to change that. Regardless of fuel, the automobile—likely configured far differently from today—will remain the consumer’s choice for personal transport. The freedom of mobility and the independence it affords are highly valued. While substitutes for gasoline may someday change this reality, any such wholesale change is more than several decades away—the amount of time required to fully retire the existing and still growing fleet of automobiles powered by gasoline and to deploy any replacement fuel source throughout the world.

THE ENORMOUS COST OF VOLATILITY. While historically our industry and our nation have been able to survive the disruptions and nervousness brought on by market volatility, they have done so at a price. Volatility is in no one’s best interest. Our companies suffer because it is much more difficult to undertake the types of long-term planning and investment needed for sustainability. When crude oil prices are rock-bottom, as they were last year, companies simply do not have the resources needed to invest in projects that could cost billions of dollars over many years. When oil prices are higher, as they are today, companies are reluctant to take the kind of risks necessary for future growth because they have no guarantee that prices will not plummet again. As one pipeline supply company executive told us recently, ‘‘Uncertainty leads to paralysis.’’

Our suppliers—companies that make pipeline and other equipment needed for exploration and production—have to put their own interests first, so when producers quit buying their products because of low crude oil prices, they have to search elsewhere for customers. By the time our companies are ready to buy equipment again, those suppliers may no longer be making those products because they’ve found more reliable customers. Similarly, both producers and their suppliers are too often forced to lay off many of their employees, and those employees get other jobs—many in other industries—and are thus unavailable when the producers are once again ready to expand. In the highly specialized area of petroleum engineering, the number of students entering those programs at our universities has steadily decreased. In addition, if the hard times last long enough, many of our smaller producers and their suppliers will simply close up shop, never to return.

But there are other casualties as well. The first is the confidence Americans have always had in our industry to provide them with the products when they need them at a reasonable cost. Our companies have strived hard to live up to that trust, and for the most part they have succeeded. However, there can only be so much of this volatility and its resulting uncertainty regarding supply and prices before that confidence is brought into question. And perhaps even more important are the national security implications of this volatility. As the world’s only remaining superpower, the United States has inherited a tremendous burden, at home and abroad. The question we must ask is this: how dependent on a reliable source of the products our companies make are our armed forces and those of our strategic allies? The answer, of course, is extremely dependent. Any serious disruption of any segment of our industry—from production to transportation to refining to delivery—could place severe strains on our armed forces’ ability to do their jobs. Without a sound energy policy that encourages greater domestic production and lifts the stifling effects of over-regulation, we place too much at risk.

Looking to the future, technology could well permit the economic development of the largest known source of hydrocarbons—methane hydrates, methane frozen in ice. Located in deep water under intense pressure, this methane could provide the world with several more centuries of available clean energy. The U.S. Geological Survey has estimated that the United States has 320,000 TCF of methane in hydrates, which is 200 times the size of conventional gas reserves.
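For scale, and hedging on what ‘‘conventional gas reserves’’ refers to here: if the comparison is against the lower-48 recoverable resource base of roughly 1,400 to 1,466 Tcf cited elsewhere in this hearing (an assumption, since the testimony does not say which baseline is meant), the arithmetic works out as follows:

```python
# Rough scale check of the methane hydrate estimate quoted above.
HYDRATE_TCF = 320_000.0                 # USGS estimate (from testimony)
CONVENTIONAL_TCF = (1_400.0, 1_466.0)   # NPC lower-48 resource estimates cited earlier

for base in CONVENTIONAL_TCF:
    print(f"vs. {base:,.0f} Tcf: {HYDRATE_TCF / base:.0f}x")
# ~230x and ~220x, in the same ballpark as the "200 times" figure; a
# comparison against proved reserves (a much smaller number) would give
# a considerably larger multiple.
```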

Currently, many of these areas have been placed off-limits by the federal government.  The search for new domestic offshore oil and natural gas is limited to the Gulf of Mexico and Alaskan waters because of the congressional moratoria that have placed off-limits most of the rest of our coastal waters. Onshore, the President has repeatedly used his executive powers to limit oil and gas activity on vast regions of government lands. Congress has refused to authorize exploration on that small section of the Arctic National Wildlife Refuge that was specifically set aside by law for possible exploration in 1980. And most recently, the U.S. Forest Service moved to make it more difficult for our companies to explore for natural gas and oil on government lands when it announced a plan to bar road building in virtually all of the large areas in the forest system, spanning a total of 43 million acres in 39 states. Our industry supplies the energy to keep America going strong, but to continue to produce domestic oil and natural gas, we must have improved access to federal and state lands.

Even with greater access and flexibility, the United States will continue to need to rely on foreign oil supplies. Thus, it is important that we maximize the diversity of those supplies to help ensure the reliability of a continuous flow of oil imports. Unfortunately, U.S. unilateral trade sanctions and the constant threat of sanctions narrow our sources of supply, frustrating achievement of this important objective. In recent years, unilateral economic sanctions have increasingly become the policy tool of choice in the conduct of U.S. foreign policy. One of the favorite targets of these recent sanctions has been major oil-producing countries. The U.S. currently has sanctions in place against countries comprising over 10% of world oil production and 16% of estimated remaining oil resources. There is little evidence that unilateral sanctions produce desired outcomes. There is a better way: working with the governments in which they do business, some of our companies are adopting their own human rights and environmental standards. In short, U.S. policymakers face a dilemma. Growing supplies of crude oil will be required to sustain world economic prosperity, and diverse, ample foreign supplies are needed to help ensure our own country’s economic growth. The drive to impose unilateral sanctions is an obstacle to both of these objectives.

Government must also resist the siren song of politically popular short-term solutions that can have devastating long-term implications. For instance, the unexpectedly severe cold spell in the Northeast last winter and the resulting spike in home heating oil prices led to efforts to create a regional home heating oil reserve for New England. We are now hearing similar calls for a 1.5 million-barrel oil reserve for California. In both cases, the motivation behind the proposals is understandable. Public officials have a sincere interest in seeing that their constituents are not inconvenienced or harmed by supply disruptions that could lead to shortages or higher prices. We share that interest. However, we must caution against the creation of such reserves because they would have the ultimate effect of the government becoming a heavy player in the marketplace, something no one wants.

If the federal government is in the marketplace, many operators in the private sector will choose to take their business elsewhere rather than compete head-to-head with the government. We need only look to the 1970s to see the nature of adverse impacts when the federal government plays a direct role in the daily marketplace.

With its deep pockets and no need to turn a profit, government has the freedom to buy oil at high prices and sell low to keep retail prices artificially low.

In closing, we share your concern for the future of the industry and the security of our nation. America’s oil and natural gas companies have a long and proud history of providing this country’s consumers—including our armed forces—with a reliable and affordable supply of energy to make their homes comfortable and take them where they need to go, when they want to go. Through good and lean years, U.S. suppliers of natural gas and petroleum products have kept America’s armed forces mobile and its factories running, and have provided the fuel to move goods from manufacturers to retailers and, ultimately, into America’s homes and offices.

WATKINS.  As you are well aware on this committee, and from my experience as Secretary of Energy in helping Congress craft the Energy Policy Act of 1992, the entire complex of energy sources, both production and consumption, needs to be analyzed and integrated in order to address any one of the individual components, like oil and gas, that aggregate to make up the whole. As a consequence, I have prepared and submitted a long formal statement for the record which goes back nearly 10 years to the formulation of the National Energy Strategy by the Bush Administration in 1991. Lessons learned from that process, and a review of all components that made up the energy strategy, indicate that interconnectivity and balance between these components are demanded when considering any one of them.

The 1991 Bush strategy was the culmination of nearly 2 years of intense effort: holding hearings around the country to listen to all interested parties, developing a strategy which emerged therefrom, and drafting a report for the Congress. That report was reviewed by various analytically capable outside entities, including the Office of Technology Assessment here on the Hill before it was abolished, and was published and sent to the Hill to inspire the needed legislation. The strategy also included and complemented a number of Bush Administration initiatives which I enumerated on page 3 of my longer statement. The strategy was then used by both the Congress and the administration as the template for constructing and passing the Energy Policy Act. This act, signed into law by President Bush in October 1992, was among the most comprehensive energy bills ever enacted. The Energy Policy Act affected every aspect of the way this Nation produces and uses energy, including reshaping Federal regulation of the Nation’s energy sector to spur competition and investment in new technologies.

It stands today as landmark legislation, but it must be periodically reviewed and updated if it is to remain relevant. For example, an updated energy strategy is badly needed today to provide the broad framework within which so many other related pieces of legislation, like the Clean Air Act, the Public Utility Holding Company Act, the Public Utility Regulatory Policies Act, the Nuclear Waste Policy Act, and many others, find themselves, and all of these related pieces have to be periodically updated themselves, but within some kind of overarching umbrella strategy because of their inherent interconnectivity.

Our strategy in the early 1990s, for example, predicted about $30 a barrel in crude oil prices by the year 2000, recognizing cyclic variations en route. Sure enough, that is where we are today. Additionally, the act passed by Congress had the potential to reduce oil imports by about 4.7 million barrels per day by the year 2010. This represented at the time a one-third cut in the projected level of petroleum imports. Not only would this enhance energy security, but it would also positively influence the balance of trade, effect considerable energy cost savings for consumers, and result in significant reductions in energy demand.

Whether we are on track to achieve this objective I have no idea, but knowing how we are doing relative to that objective is germane to answering one of your questions. In this connection, before leaving office in January 1993, we had established a complex tracking program called the Energy Policy Act Information System to comply with provisions of the 1992 act. This system had been used successfully in development of the National Energy Strategy and had also been reviewed by the Congressional Office of Technology Assessment and found to contain reasonable models and predictions. I prepared my final DOE posture statement in January 1993 and reported the degree of compliance with the act to date. I also passed information regarding this tracking system to my successor in the new administration team.

The first required DOE report, entitled ‘‘Implementation Status Report,’’ was then published in April 1993, shortly after I left office, based on our tracking system. It is very well detailed, tracking every single section of the act to see how we are doing against the advocacy that was contained in it. To the best of my knowledge, results of the detailed tracking system were not reported out again. Rather, our system was replaced in July 1995 with what I would call a puff piece, called the National Energy Policy Plan, allegedly the new Clinton Administration’s energy policy. The latter’s direct relationship with, or even reference to, the 1992 Energy Policy Act is not clear. I have no information which would indicate the degree of congressional satisfaction in either the specificity or the quality of this or any of the follow-on biennial reports required under section 801 of the DOE Organization Act, which demands submission of a national energy policy plan.

You asked in your letter whether or not the direction was the right one regarding energy policy and security in general, and oil and gas in specific. My answer, admittedly rendered from a distance without detailed data, is a simple no. If we had a clue as to the direction in which we were actually headed, we could have predicted the current oil crisis well ahead of time and might have had a chance to influence our national attitude toward oil, for this crisis was in part generated by our own failure to enhance U.S. oil production as initially recommended in the National Energy Strategy 10 years ago.

 

Ten years from now, I predict we will be decrying the fact that we are being held hostage on an energy security problem to foreign imports of LNG from states like Libya. Yes, the potential exists to mitigate spikes in gasoline prices through enhanced oil production and reduced consumption at home. However, with no sense of urgency on upgrading our national energy policy, only sporadic attention being given to the trade and diplomacy aspects of our dependency on OPEC’s whims, opposition to any increase in U.S. oil or gas production, and an unwillingness on the part of DOE to track what we are already supposed to be doing under existing legislation and make recommendations as to new strategic direction where needed, there is little hope of avoiding a repetition of unannounced oil and gasoline spikes in the years ahead.

 

Sadly, I now think we need to start the process all over again, similar to the one we put into place 10 years ago, and establish a new baseline. The new administration should be tasked by Congress to do this. If we get serious about energy policy again and follow such a path, we might be able to answer your question, Mr. Chairman: Are we going in the right direction or are changes needed?

 

The Strategy acknowledged that the U.S. was part of an energy-interdependent world. It recognized that it was not in our interest to adopt measures that might reduce imports but inflict severe economic or environmental damage. Therefore, the National Energy Strategy balanced economic, environmental, and energy security objectives. We believed that, over the next 20 years, such a balanced approach to production and conservation would power a larger U.S. economy while using less energy. At the same time, the U.S. would produce more of the energy it uses. If pursued aggressively, the National Energy Strategy would, by the year 2010, have:

  • Reduced domestic oil demand by 3.4 million barrels per day, below projected levels.
  • Increased domestic oil production by 3.8 million barrels per day above projected levels.
  • Increased the electricity produced from renewable sources, such as solar, hydropower, and geothermal, by 16%.
  • Raised the use of alternative transportation fuels, such as compressed natural gas, ethanol and methanol, thereby reducing the need for up to 2.5 million barrels of oil per day.
  • Reduced growth in electricity demand by unlocking market forces through elimination of costly regulation in existing law, thereby saving consumers approximately $27 billion per year in electricity costs.

 

With its proposals to increase the use of clean coal technology, natural gas, and nuclear energy to generate electricity, as well as to develop new energy efficient technologies, the Strategy would also have:

  • Held U.S. emissions of greenhouse gases by the year 2010 at or below 1990 levels.
  • Improved air quality by reducing emissions of pollutants that contribute to acid rain and smog.
  • Mitigated solid waste problems by reducing coal ash waste by 25 million tons per year, and by lowering coal cleaning wastes by 50 million tons per year.

 

The Strategy incorporated and complemented a number of Bush Administration initiatives. These included: (1) the 1990 amendments to the Clean Air Act; (2) natural gas well-head decontrol legislation; (3) incentives provided to domestic renewable and fossil energy producers in the fiscal year 1991 budget agreement; (4) the energy research and development initiatives announced in the President's FY-92 budget; (5) the Administration's domestic energy supply and demand measures adopted in response to the Iraqi oil disruption; and (6) the Administration's science and mathematics education initiatives.

 

To meet the challenges that lay ahead, the National Energy Strategy called on Federal, State, and local governments to work together to encourage energy conservation and new energy production through reduced regulation and streamlined licensing procedures, particularly in the natural gas, oil and gas pipeline, and hydropower areas. At the Federal level, the Administration intended to lead by improving the energy efficiency of Federal buildings and Federal housing and by accelerating the purchase of alternative fuel vehicles for the Federal fleet. In February 1992, one year after the National Energy Strategy was initially promulgated, the Strategy was adjusted to fit the reality of the latest and continually changing data base of knowledge that drove the first projections in 1991. This updated baseline was then used by the Congress as the template for constructing and passing the Energy Policy Act of 1992. This Act, signed into law by President Bush in October 1992, was among the most comprehensive energy bills ever enacted.

 

The Energy Policy Act affected every aspect of the way this nation produces and uses energy, including reshaping Federal regulation of the nation's energy sector to spur competition and investment in new technologies. It still stands today as landmark legislation, but it must be periodically reviewed and updated if it is to remain relevant. For example, it is badly needed to provide the broad framework within which so many other related pieces of legislation, like the Clean Air Act; the Public Utility Holding Company Act (PUHCA); the Public Utility Regulatory Policies Act (PURPA); the Nuclear Waste Policy Amendments Act; and the like, find themselves. All of these related pieces have to be periodically updated themselves, but within some overarching umbrella strategy because of their inherent connectivity thereto.

 

I present all this background before addressing your question as to whether or not our oil and gas policies are adequate in view of the recent shock to the nation of high gasoline prices at the pump. As our National Energy Strategy clearly articulated, the inter-coupling of all energy generation and conservation mechanisms must be continuously melded into a sensible, comprehensive, integrated, and balanced relationship. If they are not, we will continue to have unannounced and unwanted perturbations when we could have better predicted and mitigated them.

Additionally, the Act passed by Congress had the potential to reduce oil imports by about 4.7 million barrels per day by the year 2010. This represented a one-third cut in the projected level of petroleum imports. Not only would this enhance energy security, but it would also positively influence the balance of trade, yield considerable energy cost savings for consumers, and significantly reduce energy demand.

Finally, before leaving office in January 1993, we had established a complex tracking program called the Energy Policy Act Information System to comply with provisions of the 1992 Energy Policy Act. This System had been used successfully in development of the National Energy Strategy and was then reviewed by the Congressional Office of Technology Assessment (OTA) and found to contain reasonable models and predictions. I prepared a final Posture Statement in January 1993, reported the degree of compliance with the Act to date, and passed information regarding this tracking system to my successor and the new Administration team. The first required report was then submitted to the Congress in April 1993, shortly after I left office, including an assessment using the tracking system. To the best of my knowledge, the detailed tracking system was not used again and was replaced by issuance of a Strategic Plan in April 1994, followed a year later, in July 1995, by a National Energy Policy Plan which was allegedly the new Clinton Administration's energy policy. This Plan's direct relationship with, or even reference to, the 1992 Energy Policy Act is not clear. I have no information which would indicate the degree of Congressional satisfaction with either the specificity or quality of the biennial reports required under Section 801 of the DOE Organization Act, which demands submission of a National Energy Policy Plan every two years.

In my opinion, therefore, it is time to rebuild the National Energy Strategy as we did ten years ago, using a similar approach, i.e., hold hearings around the country and listen to all parties, with the goal of making recommended amendments to the 1992 Energy Policy Act so as not to lose the Act's relevance. Unless this is done, I do not believe the nation will be well served by a Department of Energy (DOE) that seems to put out energy ‘‘puff pieces’’ that have no associated implementing strategy for their execution or assessment. I have in mind here the Administration's ‘‘Sustainable Energy Strategy, Clean and Secure Energy for a Competitive Economy’’ report to the Congress in July 1995, which was its first real response to the two-year reporting requirement. This 73-page document arrived on the Hill with little fanfare and had little impact. The 1995 report pays only lip service to the importance of energy policy. In reality, this so-called ‘‘Sustainable Energy Strategy’’ is short on energy policy and long on wishful thinking.
To take just two examples, among the five strategies set forth for a Sustainable Energy Policy are ‘‘Reinvent Environmental Protection’’ and ‘‘Engage the International Market.’’ Then, later, during his confirmation hearings before the Senate Energy Committee in 1997, Secretary Pena promised that there would be a new National Energy Policy Plan (NEPP) by the fall of that year to answer growing Congressional concern about whether this nation has in place a program that addresses its energy and environmental needs for the next century. Subsequently, it was announced that this new NEPP would be delayed another six months, well past the December 1997 meeting in Kyoto at which commitments constituting a de facto energy policy would be made. Whether it ever got submitted, I have no idea. But Secretary Pena departed in the summer of 1998, and I can't imagine his parting shot at a strategy was very effective. I don't know what the Congress has received since, but I can't believe there has been much substantive clarity or direction in whatever the DOE is doing in this area.

Further, you may recall in 1997 that President Clinton casually remarked to assembled students at American University here that the U.S. could reduce its emissions of carbon dioxide and other greenhouse gases by 20% immediately, at no cost to the economy, ‘‘if we just changed the way we do things.’’ This bold assertion was a shock even to his staff, since it was an off-the-cuff remark. No one, not even the President, had a clue what he wanted us to change or how. Certainly, his remarks had nothing to do with studies emanating from within his own Executive Branch that would justify such an assertion. His own 1995 Sustainable Energy Strategy said there were no easy ways to reduce greenhouse gas emissions by 20%, immediately or over the next 10-15 years, unless the President had in mind truly profound changes in the world in which we work and live, induced by carbon taxes, rising energy prices, and slower economic growth. With a realistic National Energy Strategy, the magnitude of such changes would be clear, along with their effects on individual states and regions. Without a well-substantiated strategy, then, enforced through negotiated Congressional adjustments to some logical baseline, like the existing Energy Policy Act of 1992, we are victims of the rhetoric of whatever advocacy-only minstrels wander by.

 

This crisis was in part generated by our own failure to enhance U.S. oil production as initially recommended in the Bush Strategy ten years ago. East and west coast offshore oil and gas moratoria continue; the Arctic National Wildlife Refuge remains closed; we are importing 10% more oil than we were 10 years ago; and the worst is not over. We are utilizing more and more natural gas in our utilities, and this consumption will continue to grow exponentially. As a result, we are now ‘‘dusting off’’ our liquefied natural gas (LNG) depots again. Ten years from now we will be decrying the fact that we are held hostage to foreign imports of LNG from states like Libya. Further, we have not even maximized oil recovery from existing U.S. oil reserves to the extent possible. Yes, the potential still exists to mitigate spikes in gasoline prices through enhanced oil production at home.

 

Mr. SHIMKUS. I would also be remiss not to mention the ethanol aspects of having another renewable fuel option. The reality is we have an overreliance on foreign oil. And I do believe that we can safely go into the Arctic Wildlife Refuge. I think that was proven with the Alaskan pipeline. My now-deceased father-in-law worked on that pipeline, and that program has a tremendous record of environmental safety. I think we should also look at the continental shelves and at the marginal well issue as part of a national energy portfolio, along with renewable fuels.

 

Mr. BOUCHER. In 1998, the United States imported approximately 22% of the energy that we consumed in that year. And the Department of Energy predicts that our dependence on foreign energy sources will continue to grow. In past years the Congress has taken prudent measures to address these concerns. For example, in 1980 Congress acted to encourage the domestic production of oil and natural gas by establishing the section 29 tax credit for companies that sought to produce oil and gas from unconventional sources such as tight rock formations, coal beds, and biomass. Congress recognized the benefit to the Nation of developing oil and natural gas from locations that are very difficult to reach and not economic to produce from using conventional means. The section 29 tax credit is scheduled to expire on December 31, 2002. Allowing the credit to expire would result in the abandonment of many wells that are producing today by virtue of the section 29 tax credit, and I would encourage, Mr. Chairman, that we recommend that the section 29 tax credit be extended, perhaps during the balance of this Congress, rather than waiting until the next Congress, at the conclusion of which the credit is scheduled to expire.


Nuclear waste will last a lot longer than climate change

[ Anyone who survives peak fossil fuels, and after that rising sea levels and extreme weather from climate change, will still be faced with nuclear waste as both a deadly pollutant and a potential weapon. The worst of it lasts an awfully long time. According to Wikipedia, of particular concern in nuclear waste management are two long-lived fission products, Tc-99 (half-life 220,000 years) and I-129 (half-life 15.7 million years), which dominate spent fuel radioactivity after a few thousand years. The most troublesome transuranic elements in spent fuel are Np-237 (half-life two million years) and Pu-239 (half-life 24,000 years). (A short decay sketch follows this note to show what those half-lives mean in practice.)

This is a summary (excerpts and paraphrase) of the 7 March 2012 New Scientist article Resilient reactors: Nuclear built to last centuries by Fred Pearce. Nuclear waste can last thousands to hundreds of thousands of years, yet nothing has been done to protect future generations from these toxic pollutants. Yucca Mountain remains shut down with no new repositories in sight (the most comprehensive account of the nuclear waste debacle is Alley's "Too Hot to Touch"). After reading Alley's book, I don't have much hope that the waste will ever be stored.

Conventional oil peaked in 2005, and once the exponential decline begins, unconventional oil will be unable to fill the gap. Add on a corrupt financial system about to break, and it is clear we will be too poor, in both energy and money, to take on this task in the future. Governments will be too busy trying to feed people to prevent social unrest and fighting wars to get more oil, so the bulk of the remaining oil will be diverted to agriculture, the military, fixing infrastructure, heating homes and buildings, and a million other things. Cleaning up nuclear waste is not likely to be on the "to do" list. Alice Friedemann www.energyskeptic.com ]
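
To make those half-lives concrete, here is a minimal Python sketch (my own illustration, not from the article or Wikipedia) that applies the standard decay law N(t) = N0 x (1/2)^(t / t_half) to the isotopes quoted in the note above; the chosen time points are arbitrary, illustrative assumptions.

```python
# Minimal sketch (illustrative, not from the article): how the half-lives
# quoted in the note above translate into the fraction of the original
# radioactive inventory still present after a given number of years.

HALF_LIVES_YEARS = {
    "Tc-99": 220_000,       # long-lived fission product
    "I-129": 15_700_000,    # long-lived fission product
    "Np-237": 2_000_000,    # transuranic
    "Pu-239": 24_000,       # transuranic
}

def remaining_fraction(half_life_years: float, elapsed_years: float) -> float:
    """Standard decay law: N(t)/N0 = (1/2) ** (t / half_life)."""
    return 0.5 ** (elapsed_years / half_life_years)

if __name__ == "__main__":
    for years in (1_000, 10_000, 100_000, 1_000_000):
        row = ", ".join(
            f"{iso}: {remaining_fraction(hl, years):.3f}"
            for iso, hl in HALF_LIVES_YEARS.items()
        )
        print(f"after {years:>9,} years: {row}")
```

Running it shows, for example, that after a million years roughly 70 percent of the Np-237 and about 96 percent of the I-129 are still present, while the Pu-239 is essentially gone after a few hundred thousand years; that persistence is the point the half-life figures are making.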

All nuclear plants have to be shut down within a few decades because neutron bombardment makes their components both radioactive and so brittle that they are likely to crumble.

Decommissioning can take longer than the time the plant was operational. This is why only 17 reactors have been decommissioned so far, while hundreds are waiting to be decommissioned (110 commercial plants, 46 prototypes, and 250 research reactors), yet we keep building more of them.

Building longer-lasting new types of nuclear power plants

[ This section offers potential techno-fixes for stronger materials. Whether such materials are ever discovered AND are cheap enough to use at this very late date remains to be seen; see the article if that interests you. I seriously doubt new plants of any kind will be built, because the billions in capital required and the many years of approval and construction are far more than what a new natural gas plant requires. The liability is so costly that companies would have to have it waived to get the funding. Nuclear power cannot balance variable wind and solar power; only natural gas and hydropower are suited for that. So many nuclear plants are falling apart that, on top of the cost to dismantle them, there is the chance of a failure and of the public adamantly fighting new plants, as happened in the 1980s. There are other issues with alternative reactors as well. ]

Fast-breeders were among the first research reactors, but they have rarely been used for commercial power generation. There is just one problem: Burke says the new reactors are not being designed with greater longevity in mind, and the intense reactions in a fast-breeder could reduce its lifetime to just a couple of decades. A critical issue is finding materials that can better withstand the stresses created by the chain reactions inside a nuclear reactor. Uranium atoms are bombarded with neutrons that they absorb. The splitting uranium atoms create energy and more neutrons to split yet more atoms, a process that eventually erodes the steel reactor vessel and plumbing.

The breakdown that leads to a reactor’s decline happens on the microscopic level when the steel alloys of the reactor vessels undergo small changes in their crystalline structures. These metals are made up of grains, single crystals in which atoms are lined up, tightly packed, in a precise order. The boundaries between the grains, where the atoms are slightly less densely packed, are the weak links in this structure. Years of neutron bombardment jar the atoms in the crystals until some lose their place, creating gaps in the structure, mostly at the grain boundaries. The steel alloys – which contain nickel, chromium and other metals – then undergo something called segregation, in which these other metals and impurities migrate to fill the gaps. These migrations accumulate until, eventually, they cause the metal to lose shape, swell, harden and become brittle. Gases can accumulate in the cracks, causing corrosion.

A reactor that does not need to be shut down after a few decades will do a lot to limit the world's stockpile of nuclear waste. But eventually even such reactors will need to be decommissioned, a process that generates vast volumes of what the industry calls "intermediate-level" waste.

Despite its innocuous name, intermediate-level waste is highly radioactive and will one day have to be packaged and buried in rocks hundreds of meters underground while its radioactivity decays over thousands of years. It becomes radioactive through the same mechanism that erodes the machinery in a nuclear power plant, namely neutron bombardment.

Toxic legacy

Nuclear waste is highly radioactive, remains lethal for thousands of years, and is without doubt nuclear energy's biggest nightmare. Efforts to "green" nuclear energy have focused almost exclusively on finding ways to get rid of it. The most practical option is disposal in repositories deep underground. Yet, seven decades into the nuclear age, not one country has built a final resting place for its most toxic nuclear junk. So, along with the legacy waste of cold-war-era bomb making, it will accumulate in storage above ground, unless the new reactors can turn some of that waste back into fuel.

Without a comprehensive clean-up plan, the wider world is unlikely to embrace any dreams of a nuclear renaissance.
