[ Most, but not all, of what follows comes from the NPC paper below. I’ve reworded it and interjected some comments as well. As you can see in Figure 1 below, light cars and trucks are the main guzzlers of petroleum. When oil shocks hit, a rational, well-run society would ration fuel so that freight, agricultural, delivery, infrastructure maintenance, and other essential medium- and heavy-duty trucks could keep society running. But time is running out. We’re probably going to be stuck with our existing, extremely wasteful, inefficient fleet of cars and trucks when oil shocks hit, starting sometime between 2016 and 2024 depending on geological depletion, social unrest and war, declining exports from producing nations, financial crashes, etc.
- Hybrid electric trucks are very different from HEV cars
- Electric truck range is less in cold weather
- All Electric Trucks. Probably not going to happen. Ever. Why not?
- Who Killed the Electric Car?
Alice Friedemann, www.energyskeptic.com, author of “When Trucks Stop Running: Energy and the Future of Transportation,” Springer, 2015]
NPC. 2012. Advancing Technology for America’s Transportation Future. Chapter 13. National Petroleum Council, 69 pages
As for all-electric (BEV) trucks, they only make sense for a small subset: trucks that stop often and travel less than 100 miles a day, such as delivery trucks in classes 4 and 5 and garbage trucks in class 8a, which can capture regenerative braking energy.
At current rates of purchase, it will take over 35,000 years to replace America’s fleet of 247 million gasoline cars with plug-in vehicles.
The difference between a diesel combustion engine car and an all-electric car
| VW Lupo 3L 1.2 TDI car | Diesel ICE weight kg / lbs | Battery electric weight kg / lbs |
| --- | --- | --- |
| Vehicle chassis minus power train | 595 / 1312 | 595 / 1312 |
| Engine + gearbox + drive shafts | 180 / 397 | 85 / 187 |
| Cooling (radiator, hoses, coolant, etc.) | 10 / 22 | 7 / 15 |
| Exhaust | 15 / 33 | |
| Power electronics (inverter, charger, DC-DC converter) | | 20 / 44 |
| Fuel tank + cooler + filter | 9 / 20 | |
| Diesel (7 L) | 6 / 13 | |
| Battery pack (kWh: front 8.3, center 7.7, rear 11) | | 273 / 602 |
| Total for powertrain | 245 / 540 | 435 / 959 |
| Curb weight | 840 / 1852 | 1030 / 2271 |
Figure 1. As you can see, the drivetrain of the ICE version is 29% of overall weight, but in the BEV it is 42%, with the battery alone 26.5% of overall weight in this tiny diesel car converted to all-electric. Source: Besselink, I.J.M., et al. 2010. Design of an efficient, low weight battery electric vehicle based on a VW Lupo 3L. 25th World Battery, Hybrid and Fuel Cell Electric Vehicle Symposium & Exhibition.
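The weight fractions in the caption follow directly from the table; a quick Python sketch, using the kg figures from Besselink et al., checks them:

```python
# Drivetrain and battery weight as shares of curb weight, using the kg
# figures from the VW Lupo 3L table above (Besselink et al. 2010).

ice_powertrain_kg, ice_curb_kg = 245, 840     # diesel ICE version
bev_powertrain_kg, bev_curb_kg = 435, 1030    # battery electric version
battery_pack_kg = 273

print(f"ICE drivetrain: {ice_powertrain_kg / ice_curb_kg:.0%} of curb weight")
print(f"BEV drivetrain: {bev_powertrain_kg / bev_curb_kg:.0%} of curb weight")
print(f"Battery alone:  {battery_pack_kg / bev_curb_kg:.1%} of curb weight")
```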
When all-electric cars and trucks break down
A key lesson learned during the previous period when electric vehicles were on the market from 1996–2003 is that the inability to diagnose and remedy a situation when the vehicle fails to charge is a source of tremendous customer dissatisfaction. This is most important for BEVs, as they do not have any gasoline or other secondary power source on-board.
The critical need is the ability to identify whether the fault lies with the charging device (EVSE) or the vehicle, without having to send a technician. Enabling this requires multiple components: a “smart” EVSE that includes communications technology, a service entity to either receive an automatic message or field the customer inquiry, and remote access to the EVSE by the service entity to diagnose the fault. Ideally, the result is the dispatch of either an EVSE service technician, if the fault is with the EVSE, or a tow truck, if the fault is with the vehicle.
Plug-in hybrid (PHEV) and battery-only electric vehicle (BEV) configurations
Degradation & Longevity
There are two facets to battery longevity.
- The actual calendar life of the battery. It is currently unknown whether batteries used in PEVs will last for the life of the vehicle, and battery replacement is likely to remain a significant expense.
- The degradation of power and energy storage capacity that occurs over time.
The gasoline engine in PHEVs can compensate for this, but BEVs will experience reduced power and vehicle range.
Battery innovation (improved energy density, reduced degradation, and a predictable calendar life) is therefore necessary for the wide-scale adoption of BEVs.
PHEVs can easily recharge the battery overnight using a standard 110V outlet, but BEVs will most likely need to charge at a higher power level (240V). This requires the purchase and installation of a separate charging unit, which could be a barrier to vehicle purchase if the expense is high, for example if new panel capacity is needed or there is no existing 240V connection in the garage.
No garage: Drivers in urban areas with on-street parking, and drivers who live in multiple dwelling units such as apartments, will find both types of charging (110V and 240V) difficult to realize for either a PHEV or a BEV, as the installation cost can be high and the driver typically lacks the authority to install a charging unit.
A pickup or large SUV would require a very large battery, resulting in a very-high-priced vehicle. Additionally, a vehicle with limited range would not align with the typical duty cycles and use cases, such as carrying passengers and long-distance recreational driving.
To scale up to the hundreds of volts required for automotive powertrain use, many cells are assembled in series to form a battery pack, which connects and contains the cells, and also includes a battery management system (BMS)—an electronic control system that uses various sensors to monitor the state of each cell within the pack (e.g., for voltage, temperature, internal resistance) and to control electrical flows to and from the battery. Most packs include an integrated thermal management system to moderate the pack temperature. The most basic thermal management uses passive air-cooling of the pack from the outside, while more sophisticated systems employ liquid or refrigerant cooling. Such monitoring and thermal control are critical to vehicles that use grid-supplied power for driving.
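The series-string arithmetic and the BMS’s per-cell monitoring can be sketched as follows; the cell count, nominal voltage, and safety limits are illustrative assumptions, not figures from the NPC report:

```python
# A hypothetical ~350 V automotive pack: many cells in series, with a
# BMS-style check that flags any cell outside its safe voltage window.

NOMINAL_CELL_V = 3.7    # typical lithium-ion nominal cell voltage (assumed)
CELLS_IN_SERIES = 96    # assumed cell count for a ~350 V pack

pack_voltage = NOMINAL_CELL_V * CELLS_IN_SERIES
print(f"Nominal pack voltage: {pack_voltage:.0f} V")

def cells_out_of_range(cell_voltages, v_min=3.0, v_max=4.2):
    """Return indices of cells outside safe limits, as a BMS would flag
    before allowing further charge or discharge."""
    return [i for i, v in enumerate(cell_voltages) if not v_min <= v <= v_max]

# One weak cell compromises the whole series string:
readings = [3.7] * CELLS_IN_SERIES
readings[42] = 2.8      # simulated over-discharged cell
print("Faulty cells:", cells_out_of_range(readings))
```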
Battery systems add a lot of extra weight, for example:
- If a battery system has a cell mass of 40 kg and a non-cell mass of 40 kg, the mass burden is 100%. Therefore, if the cell-based specific energy is 200 Wh/kg, the actual metric the vehicle designer must consider is 200/2.00 or 100 Wh/kg. It is not unusual for a battery system volume burden to be well in excess of 100%.
- A doubling of battery pack gravimetric performance requires not only a 2X improvement in cell based specific energy but also a 50% reduction in the mass of non-cell components.
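The mass-burden arithmetic in the bullets above can be written out explicitly; a minimal sketch using the same example numbers:

```python
# Pack-level specific energy after accounting for non-cell mass ("burden"),
# using the example figures from the bullet above.

cell_mass_kg = 40.0
non_cell_mass_kg = 40.0               # casing, BMS, thermal management, etc.
cell_specific_energy_wh_kg = 200.0

burden = non_cell_mass_kg / cell_mass_kg               # 100% mass burden
pack_specific_energy = cell_specific_energy_wh_kg / (1 + burden)

print(f"Mass burden: {burden:.0%}")
print(f"Pack-level specific energy: {pack_specific_energy:.0f} Wh/kg")
```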
As you can see below, there is no winning battery. Not one has all of the essential performance, cost, and safety characteristics required. Cycle life must be high for the battery to last long enough to be acceptable; energy density must be high to move the vehicle for hours; power density must be high for acceleration; and cost must be low for the vehicle to be affordable.
| Cathode | Anode | Abbrev. | Energy Density | Power Density | Cycle Life | Safety | Cost |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Lithium Cobalt Oxide | Graphite | LCO | High | Fair | Fair | Fair | High |
| Nickel Cobalt Aluminum Oxide | Graphite | NCA | High | High | Fair | Fair | High |
| Lithium Iron Phosphate | Graphite | LFP | Low | High | High | Very good | Fair |
| Lithium Manganese Oxide | Graphite | LMO | High | High | Fair | Very good | Fair |
| Lithium Manganese Oxide Spinel | Graphite | LMO | High | High | Fair | Good | Low |
| Lithium Manganese Oxide Spinel Polymer | Graphite | LMO | High | High | Fair | Good | Low |
| Manganese Nickel Cobalt Oxide | Graphite | MNC | High | Fair | Low | Fair | High |
| Lithium Manganese Oxide Spinel | Lithium Titanate Oxide | LMO-LTO | Low | Low | High | Good | High |
| Lithium Nickel Oxide | Graphite | LNO | High | Fair | Fair | Fair | Fair |
| Lithium Manganese Nickel Oxide Spinel | Graphite | LMNS | High | High | Fair | Fair | Low |
| Lithium Manganese Nickel Oxide Spinel | Lithium Titanate Oxide | LMNS-LTO | Fair | High | High | Good | Low |
Source: Shmuel De-Leon, “High Power Rechargeable Lithium Battery Market,” presented at IFCBC Meeting, February 4, 2010.
Below are the batteries used in autos as of February 2012:
Battery Energy Density
Energy Density is often used as a generic reference to the mass or volume that the cells or battery system occupy within the vehicle compared to the gross number of energy units, typically watt-hours (Wh), that can be stored in them. In reality, two different energy metrics must be considered when selecting the appropriate electrochemistry and battery size for a particular application, as either metric can be the key design driver.
Battery designs strive to strike a balance between the energy and power requirements of the battery. In a vehicle application, power density is needed to provide sufficient acceleration as well as optimal ability to capture regenerative braking energy, while specific energy is the primary determinant of the vehicle’s all-electric range. Both specific energy and power density are fundamental to optimal battery pack design. Depending on the context of the discussion, power density may reference power delivered by mass or by volume, measured in watts per kilogram and watts per liter, respectively. The former, specific power, is typically referenced when discussing battery performance while the latter is relevant in the context of vehicle packaging.
As the gasoline engine provides the necessary driving range, the battery packs of PHEVs with shorter all-electric range, or with blended-mode operation, are generally optimized to meet peak power demands. Conversely, BEV packs are optimized around the vehicle’s energy demands. This may, however, come at the expense of power. Specific energy, rather than specific power, is the primary determinant of total battery cost.
- Specific, or gravimetric, energy refers to the amount of stored energy per unit mass, typically watt-hours per kilogram
- Energy, or volumetric, density refers to the amount of stored energy per unit volume, typically watt-hours per liter. In PHEV cars, battery volume matters most because there is little room for a battery pack, electric motor(s), and power control electronics, since they are crammed into existing car models that already have an internal combustion engine, transaxle, and fuel tank.
- In BEVs, however, the battery can become the primary constraint as it becomes a substantial fraction of the total vehicle mass, significantly affecting energy requirements and overall vehicle dynamics.
The total amount of energy that a particular battery technology can deliver or accept is a function of the rate (power) requirement on discharge, recharge, or during energy recuperation—e.g., regenerative braking events. Therefore, in addition to understanding the energy density versus specific energy capabilities, the vehicle designer must also understand the functionality between power and energy for a particular battery type (chemistry). A conventional graphic method used to illustrate the relationship between power and energy for a particular battery type is the Ragone plot, which is a logarithmic curve of the energy available versus power demand. The Ragone relationship can be expressed on a gravimetric or volumetric basis. As shown above, compared to other chemistries, lithium-ion is relatively insensitive to power demand, while for LiM Polymer, increasing specific energy comes with a significant decrease in specific power.
Ragone plots express cell level performance. The mass and volume of the cells and the incremental mass and volume, or burden, of the non-cell battery system components (battery casing, thermal management system, etc.), however, are extremely important in determining whether or not the targeted goals for battery system energy density and specific energy are achievable. Although there is no universally accepted convention for calculating burden, it is typically defined as the ratio of the non-cell mass or volume to the cell mass or volume.
Batteries are very heavy
To go 350 miles, a battery needs to be so huge that the vehicle weighs about 3 times more than a gas car
The battery has to be over-sized to reach the range promised by the manufacturer. A typical vehicle battery is controlled so that it never discharges fully, thus the total installed, or nominal, capacity is greater than the capacity actually used. This approach extends battery life, allows for sufficient power at low states of charge (important for BEVs), mitigates the risk associated with cell-to-cell variation in high-voltage packs, and builds in engineering margin to enable the original equipment manufacturer (OEM) to promise a certain driving range for a certain time period.
Currently, the depth of discharge for a BEV is in the 70% range, with the battery’s state of charge (SOC) ranging from, for example, 20 to 90%. The effect of this is that a cost figure derived from useable capacity is greater than a cost based on total energy. A $600/kWh cost based on total capacity translates to an $857/kWh cost based on useable capacity when used in a BEV with a 70% SOC swing ($600 divided by 0.7).
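The useable-capacity conversion above is a single division; as a sketch:

```python
# Total-capacity cost vs useable-capacity cost for a BEV battery,
# using the $600/kWh and 70% SOC-swing figures from the text.

cost_per_kwh_total = 600.0
soc_swing = 0.70          # useable fraction of nominal capacity

cost_per_kwh_useable = cost_per_kwh_total / soc_swing
print(f"${cost_per_kwh_useable:.0f}/kWh on a useable-capacity basis")
```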
Battery Degradation and Longevity
All batteries experience power and capacity fade over time as functions of cycling, time, and temperature. The mechanisms that degrade battery power and capacity vary with battery chemistry, the operating profile and ambient conditions. Instead of calendar life—the age of the battery in years—battery life is typically described by the number of times the battery can be charged and discharged, referred to as cycle life. Some battery chemistries are more sensitive than others to the number of charge-discharge cycles.
The cycle life of a battery is fundamentally determined by the reversibility of the electrochemical reaction(s) responsible for the energy storage function. In other words, degradation of battery life results from the loss of electrochemical reversibility upon charge-discharge cycling. The key factors responsible for cycle life, which are more pronounced at elevated temperatures, are:
1) Mechanical/structural fatigue or failure of the active materials, especially at the microscopic level. Materials deteriorate due to stress, especially cyclic stress-induced fatigue upon charge-discharge cycling.
2) Side reactions of positive and negative electrodes with the electrolyte, which results in the formation of highly resistive interfacial layers that impede the electrochemical reaction(s) and a loss of active materials (lithium, anode material, cathode material, and electrolyte), which leads to loss of capacity.
In most commercial lithium-ion chemistries, the primary cause of the decrease in battery function is the undesirable side reactions between the electrolyte and active materials on the electrodes. These reactions consume lithium, thereby limiting the lithium available to participate in the desirable discharge/recharge reactions. These irreversible side reactions also result in the formation of films on the active materials that impede ionic and interfacial transfer.
In laboratory “accelerated cycle” testing, current lithium-ion battery technologies have been shown to provide several thousand deep cycles. The number of cycles before end of life is different for each chemistry: Lithium Manganese Oxide, Lithium Iron Phosphate, and Lithium Nickel Cobalt Aluminate have demonstrated over 1,000, 5,000, and 4,000 cycles respectively. If used in a BEV100, these levels of cycle life could translate into 3 to 15 years of useful battery life. In real-world driving, however, it is difficult to apply the notion of a “cycle.” In addition to the broad charge-discharge cycles, there are also millions of “micro” cycles associated with regeneration and acceleration, which, in some chemistries, are equally impactful in degrading battery performance.
The vehicle performance ramifications of year-to-year decreases in capacity and power are most significant for BEVs. As battery capacity decreases over time, the allowable SOC swing must increase in order for the battery to deliver the same number of all-electric miles. As explained in the previous section, expanding the SOC swing can accelerate the degradation and decrease battery life. Further, at a low SOC, the battery may not be able to deliver sufficient power to the vehicle.
Example: Consider a 20 kWh battery with a 60% SOC swing at beginning of life, which equates to 12 kWh of useable energy. Assuming 3 miles of electric driving range per kWh, 12 kWh of useable energy would provide 36 miles of electric driving range. If 10 years later, the total capacity has degraded by 30% to 14 kWh, in order for the battery to deliver the same 36 miles of driving range, the SOC swing must increase to over 85% (12 kWh needed capacity for 36 miles, divided by the total capacity of 14 kWh).
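The worked example can be reproduced in a few lines:

```python
# How much the SOC swing must widen as a 20 kWh pack fades by 30%,
# to keep delivering the same 36 miles of range (figures from the example).

pack_kwh = 20.0
soc_swing = 0.60
mi_per_kwh = 3.0

useable_kwh = pack_kwh * soc_swing           # 12 kWh useable at beginning of life
range_mi = useable_kwh * mi_per_kwh          # 36 miles of electric range

degraded_kwh = pack_kwh * (1 - 0.30)         # 14 kWh after 30% capacity fade
required_swing = useable_kwh / degraded_kwh  # swing needed for the same range

print(f"Range at beginning of life: {range_mi:.0f} miles")
print(f"Required SOC swing after fade: {required_swing:.0%}")
```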
Temperature Effects. Battery life is extremely sensitive to the time-temperature characteristics of the vehicle environment during both operation and storage, i.e., when the vehicle is parked. Battery life versus temperature is functionally described by the Arrhenius equation, which is exponential. A “rule of thumb” employed by battery engineers is that for every 10°C (18°F) increase in average temperature, battery life will be reduced by 50%.
Example: A battery operated at an average temperature of 70°F (21°C) that demonstrates 15 years of calendar life will, if operated at an average temperature of 106°F (41°C), yield at best 3.8 years of calendar life. It is imperative that the battery cells within a pack be exposed to the same thermal history. If some cells in a string degrade more rapidly than others due to temperature non-uniformities, the entire pack will be compromised and life will be negatively affected. Further, extreme heat or sustained temperatures over 120°F can be fatal to the battery.
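The rule of thumb is easy to encode; a sketch reproducing the example above:

```python
# "Life halves for every 10°C rise in average temperature" rule of thumb.

def calendar_life_years(base_life_years, base_temp_c, avg_temp_c):
    """Estimated calendar life, halving for each 10°C above the base temperature."""
    return base_life_years * 0.5 ** ((avg_temp_c - base_temp_c) / 10.0)

# 15 years at 21°C (70°F) shrinks to under 4 years at 41°C (106°F):
print(f"{calendar_life_years(15, 21, 41):.2f} years")
```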
Limiting SOC Swing. The predominant countermeasure being employed by automakers to ensure battery longevity is to counter the expected degradation by increasing the nominal (total) capacity of the battery and limiting the SOC swing, as discussed in the previous section on total energy versus useable energy.
While sufficient laboratory cycle life has been demonstrated for some of the commonly used chemistries for batteries in automotive use, much uncertainty remains about the calendar life of these batteries when used in real-world driving and conditions. Battery management systems, power controls, and thermal management techniques can extend the life of the battery, but substantial investment in research and development is needed to accurately evaluate and improve the calendar life of batteries.
Battery performance is strongly affected by extreme cold. In extreme cold weather, the power capability of batteries decreases. This occurs because the ionic and chemical processes that govern the internal battery processes are “thermally motivated.” Thus the key chemical reactions and ionic transport mechanisms happen more slowly at lower temperatures, thereby limiting both the amount of instantaneous power available as well as the overall amount of energy that can be delivered. The driver experiences reduced power and greatly reduced vehicle range. This issue is most significant for BEVs, as they are entirely dependent on the battery for propulsion.
The generic term “battery cost” is imprecise. Many cost references are at the individual cell level, but moving from the cell to the module to the battery pack increases the cost—anywhere from 25 to 65%.7 Vehicle OEMs generally expect to purchase a battery pack from a supplier, integrate this battery with the vehicle, and provide it to their retail sales channel.8 Consequently, in addition to understanding the basic cost components of a battery pack, it is important to differentiate whether “costs” are based on the manufacturing costs for the battery cells, the costs for the battery pack supplier, the price the vehicle OEM would pay the battery supplier for the pack, or the amount of cost the vehicle OEM passes on to the retailers or end consumers, as each level within the supply chain adds costs. The “battery” could be a cell or a pack; the “cost” could be the cost to the OEM, the retailer, or the end consumer; the “cost” per kilowatt-hour could refer to either useable capacity or total capacity; and the pack may or may not include an active thermal management system.
From this point forward, “battery cost” will refer to the complete battery pack, including the battery management system but not including an active thermal management system, at the battery supplier-to-vehicle OEM level, on the basis of total capacity.
Vehicle Charging Infrastructure
Level 1 Charging—Level 1 (L1) charging is low-power charging at 120 volts (V) of alternating current (AC) at a rate of approximately 1.4 kW.
Level 2 Charging—Level 2 (L2) charging is medium-power charging at 240V AC at a rate of approximately 3 kW, up to 19 kW. An L2 EVSE is larger than an L1 EVSE, and includes more sophisticated circuitry in order to ensure safety. (See Figure 13-12.)
Level 3 DC (Direct Current) Fast Charging —“DC Fast Charging” is high-power charging, with the electricity supplied by an off-board (off-vehicle) charger at variable DC voltages and currents. There are different power levels possible with DC, but for vehicle charging, the power is typically supplied at 200–450V, at a rate of up to 90 kW, depending on the requirements of the vehicle. Charging is controlled by the vehicle’s battery management system, which ensures that the battery is not charged at a rate in excess of predetermined limits.
The charging times above are for a trip of 25 miles. Trucks need to go at least twice that far, and their vehicle weight is much heavier, so batteries would need to be much larger and take much longer to charge. It is possible that fast charging shortens and degrades battery life, so fast charging isn’t necessarily a solution.
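For a sense of scale, an approximate full-pack charge time is simply pack energy divided by charge power; the 24 kWh pack size and per-level power figures below are assumptions for illustration, and real charging is slower due to losses and end-of-charge taper:

```python
# Rough full-pack charge time = pack energy / charge power, ignoring
# losses and taper near full charge. Pack size and power levels assumed.

pack_kwh = 24.0   # roughly a 2012-era BEV pack (assumption)
levels_kw = {
    "Level 1 (1.4 kW)": 1.4,
    "Level 2 (6.6 kW)": 6.6,
    "DC fast (50 kW)": 50.0,
}

for name, kw in levels_kw.items():
    print(f"{name}: {pack_kwh / kw:.1f} hours")
```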
The benefits of DC Fast Charging have led some to conclude that even higher charging rates would be beneficial, so that charge times could be reduced to be similar to refueling of current gasoline vehicles, about 5 minutes. In addition to the vehicle- and battery-related issues that would need to be solved, much higher charge rates would pose significant challenges to the grid. Ultra-fast charging of a 25 kWh battery pack in 5 minutes would require a power flow rate of approximately 300 kW, which is approximately equivalent to the peak power requirements for a 100,000 square foot office building. The power use for this load would have a sharp profile similar to an industrial load like a sawmill. Achieving the same charge time for larger batteries, such as those for a heavier or longer-range BEV, would require even more power. Although loads of this type can be provisioned, the equipment to supply this load without disrupting surrounding loads would be bulky and expensive, and the low utilization of these assets would make cost recovery difficult except in very high-traffic areas. It would be possible to use on-site energy storage to reduce the grid demands for this load, but this equipment is also bulky and expensive, particularly if the station is designed to accommodate back-to-back recharges.
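The 300 kW figure above follows from dividing energy by time:

```python
# Grid power needed to charge a 25 kWh pack in 5 minutes (from the text).

pack_kwh = 25.0
charge_minutes = 5.0

power_kw = pack_kwh / (charge_minutes / 60.0)
print(f"Required power: {power_kw:.0f} kW")
```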
One of the largest uncertainties for vehicle manufacturers is the impact of DC Fast Charging on battery life. Charging at high rates will likely increase the rate of battery degradation for near-term chemistries, which will increase the likelihood of warranty replacements and negative customer perceptions. Additionally, potential charge patterns, such as fast charging at high ambient temperatures or charging multiple times per day will likely have an additional negative impact on battery life. Real world data will be needed to measure the magnitude of the impacts from fast charging.
Where is battery money coming from?
The American Recovery and Reinvestment Act of 2009 (ARRA):
- $2.4 billion in loans to three electric vehicle factories in Tennessee, Delaware, and California.
- $2 billion in grants (with 100% matching funds by private industry) to support 30 factories that produce batteries, motors, and other electric vehicle components. These grants are intended to build the capacity to produce 50,000 BEV/PHEV batteries annually by the end of 2011 and 500,000 batteries annually by December 2014.
- $400 million in grants (with 100% matching funds by private industry and/or state/municipal entities) to install 22,000 EVSEs in 20 U.S. cities.
- $2,500 to $7,500 per vehicle tax credits for the purchase of BEVs and PHEVs. The amount of the credit is based on the size of the battery
- Funded the “EV Project,” which included approximately 400 DC fast chargers.
Over 40 U.S. states have adopted other measures promoting electric-drive vehicle usage, including access to high occupancy vehicle lanes (with a single occupant BEV or PHEV), waived emission inspections, tax credits, rebates, and other programs.
The U.S. government and some U.S. state governments also fund extensive R&D efforts on batteries and other electric vehicle components.
Other regulatory programs, such as the Zero-Emission Vehicle program that applies in over 10 U.S. states, mandate the sale of substantial numbers of BEVs, PHEVs, and Fuel Cell Electric Vehicles (FCEVs).
What follows is from: February 2015. Status and Issues for Plug-in Electric Vehicles and Hybrid Electric Vehicles in the United States. Alternative Fuel and Advanced Vehicle Technology Market Trends. Argonne National Laboratory
Subsidies & incentives, state and private for PEV:
Cold Weather effects on driving range
Extreme losses of range for BEVs parked outdoors off the grid during cold snaps combined with snowstorms are particularly problematic.
Evidence of the negative effect of cold weather on BEV range is receiving attention (Allen, Lohse-Busch, Rosack, Santini, Yuksel). The Volt Stats! website (Rosack), which includes an option to plot monthly mpg, shows a 14% drop in Volt electric drive “mpg equivalent” from the best month (September) to worst (January), on a national average basis. Allen shows that range declines as great as 50% (relative to results in fall or spring) are possible for BEVs when temperatures are well below freezing. Coldest-day drops for a BEV in a snowstorm in a cold state are the worst case (Santini). This case may be worse than any of the above analyses have estimated, since driving in a snowstorm was not examined.
Allen, M. (2014). Electric Range for the Nissan Leaf & Chevrolet Volt in Cold Weather. Fleet Carma. Waterloo, Canada. http://www.fleetcarma.com/nissan-leaf-chevrolet-volt-cold-weather-range-loss-electric-vehicle/
Lohse-Busch, H. et al. 2012. Advanced Powertrain Research Facility AVTA Nissan Leaf testing and analysis. Argonne National Laboratory
Rosack, M. 2015 Volt Stats! http://www.voltstats.net/
Santini et al. 2014. Daytime Charging–What is the Hierarchy of Opportunities and Customer Needs? – A Case Study Based on Atlanta Commute Data. TRB 14-5337. Presented at the Annual Meeting of the Transportation Research Board, Washington, DC
Yuksel, T., et al. 2015. Effects of Regional Temperature on Electric Vehicle Efficiency, Range, and Emissions in the United States. Working paper. Preliminary results presented at the Annual Meeting of the Transportation Research Board, Washington, DC. Jan. Carnegie Mellon University. http://pubs.acs.org/doi/abs/10.1021/es505621s