The EV battery conundrum

Erik J Spek, Chief Engineer, TÜV SÜD Canada Newmarket, Ontario, Canada

What kind of developments will lead to EV batteries that are safer and deliver more energy?

The requirement for EV safety becomes a high-profile topic when a tragic event reaches the front pages. Attention focuses on the battery and high-voltage electrics of any EV involved in an accident or fire. Photographs of an electric vehicle burned to its frame rails and sitting on its wheel rims are a magnet for horror-inducing headlines. Reactions range from, “Why are we wasting money on these death traps,” to, “Accidents can happen,” and, “We’re working to eliminate them completely.”

These incidents can severely undermine all the organizations involved, as well as the battery and electric vehicle industry as a whole. Thus it is important to engineer out battery hazards that abuse can provoke. The challenge for battery and vehicle engineers is to protect the battery and its component cells from abuse, not only by making the cells themselves as benign under abuse as possible, but also by having the vehicle structure and systems provide primary protection to the battery, much as the vehicle structure protects a gasoline tank.

To engineer out hazards we need to understand what causes them. Why do batteries burn? Are they any more dangerous than gasoline? It has been postulated that were gasoline to be invented today, it would never gain acceptance, given its dangers. The challenge of containing the risks for batteries is the same as that for gasoline: it lies in understanding the risk mechanisms and then finding ways of controlling them.

To understand battery dangers, it is helpful to consider the fire triangle and battery construction. A battery of 300 to 400 V is built up from about 100 individual cells, each typically at 3 to 4 V for the electrochemistries that come under the lithium ion label. Each cell has the three attributes of the fire triangle -- fuel, oxidizer and heat -- under certain abnormal conditions. Control the heat buildup under some of those conditions and the fuel and oxidizer will remain benign, thus minimizing the risk of fire.
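To make the arithmetic concrete, here is a minimal sketch (Python, illustrative numbers only; the function name is our own) of how the roughly 100-cell count follows from the pack and cell voltages quoted above:

```python
import math

# Illustrative only: a 300-to-400-V pack built from 3-to-4-V lithium-ion
# cells needs on the order of 100 cells in series, as the text notes.
def cells_in_series(pack_voltage, cell_voltage):
    """Whole number of series-connected cells needed to reach pack_voltage."""
    return math.ceil(pack_voltage / cell_voltage)

print(cells_in_series(360, 3.6))  # 100 cells for a nominal 360-V pack
```

Real packs also parallel cells for capacity, so total cell counts can be far higher than the series count alone.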

It can be challenging to control those potential heat buildups. Designers can approach the problem by making the cell itself tolerate higher temperatures or by reducing the potential for outside heat to affect the pack. High temperature tolerance is partly a matter of cell materials engineering with the focus on reducing the oxidizing potential of the active materials in the cell cathode. Here, battery designers strive to make the liquid and/or solid electrolyte less volatile and to reduce the possibility of unintended current paths within the cells.

Battery-pack engineering is responsible for keeping hot ambient environments away from the cells and for removing internally generated heat from cells that are used heavily. For every design change there is usually an unintended consequence. Making cells that better tolerate higher temperatures may require separators made of thicker plastics. But thicker plastic weighs more, which reduces range and increases cost. An alternative is the use of better quality (and thinner) plastics that withstand higher temperatures, but this will boost the cost of an already expensive battery.

Even with these changes, designers want to know what can be done to verify that the resulting cell is more robust. For example, can we rely on computer models to predict the thermal qualities of lithium ion cells when abused? To date, though some interesting work is underway, there are no good, reliable models that can predict how cells respond when electrically, thermally or mechanically abused. Only physical testing will predict these outcomes today.

What are the battery failure modes and safety risks? Let’s examine what a bystander could detect if a cell in a battery were to undergo a thermal runaway ending in a fire or explosion. Gases may vent from within the cell. The pressure rise induced by the rising temperature may force the gases outside the battery pack and bystanders may note its unique smell. This gas typically has corrosive components and should not be inhaled.

If the cell does proceed to thermal runaway, there may be a fire, which may spread to the rest of the pack and will be visible. If the gases do not immediately catch fire, they may ignite with explosive force in the presence of a random spark once gas volume builds up. Pack construction should prevent gas from accumulating inside through use of escape paths to the outside. Conventional firefighting techniques could then deal with a fire, if it happens.

Modern lithium-ion battery packs use a large amount of plastic to reduce weight and cost. This practice is a natural consequence of plastics’ versatility, but it is also an Achilles heel for battery pack design. The main problem is that most plastics have a softening temperature that is in the same range as the thermal runaway condition. So a cell that vents gases and reaches 150°C may also soften the plastic structural components. These components keep high-voltage bus bars and cables from touching each other and accidentally generating large sustained arcs similar to those of a welding machine.

Consequently, there is not only a chance of a cell-driven fire but also a possibility of arcing and high voltage on the external case. The time scale of these events can be quite short once a single cell reaches about 120 to 150°C. However, not all events necessarily happen quickly. There have been instances of batteries experiencing a slow internal electrical leak that led to unintended internal circulating currents which resulted in internal fires or sustained arcs. The time scales for this failure mode have ranged from hours to weeks.

EV battery fires are among the new phenomena first responders will encounter. The outside of the vehicle or a battery pack on a workbench may appear to be normal with no obvious excess heat, no unusual smells or noise. The battery management system may even report normal operation.

However, a slow internal current path may provoke a slow thermal runaway. Slow internal electrical leaks, with resulting unintended current paths, can also result from enclosure seals that fail to keep out humidity, conductive dusts and in coastal areas, salt fog. The launching of pleasure craft from trailers towed by pickup trucks is one scenario where salty water might compromise seals and start internal electrical leaks, potentially causing a fire. An added complication is that the conductive water exposed to the high voltage can dissociate into gases that include chlorine and hydrogen. The chlorine may be concentrated high enough to poison bystanders, and hydrogen (5%) in combination with chlorine (95%) can be explosive.

Testing, standards, construction

TÜV SÜD subjected this early full EV battery of the NaS (sodium-sulfur) type to a ‘crush’ test in the early 1990s to simulate what could happen in a collision.

There are a number of abuse tests designed specifically for evaluating a cell or battery pack’s ability to survive abuse in automotive applications. In North America, these tests have been developed by Sandia National Labs and SAE and have been in use since 1999 with periodic updates. They include SAE J2464, SAE J2929, SAND 2005-3123 and SAND99-0497.

The battery management system manages the pack for maximum energy storage and homogeneous temperature distribution. But it also detects internal battery defects that can put voltage potentials on the outside enclosure. If the enclosure is conductively connected to the car body, the result is a voltage on the car. If it notes such conditions, the battery management system will keep the battery from operating and will warn of a fault.

If the battery management system cannot function after an accident, either because of accident damage or loss of 12-V power, there is then no means of detecting the potential hazard. Work is underway at standards organizations to deal with this scenario by addressing the need for field-discharging damaged battery packs to reduce risks.

There is also research underway to assess how far a battery pack must discharge to be considered benign. Is 50% SOC enough? 0% SOC? Or must the battery completely discharge to 0 V? What is straightforward in a test lab can be a challenge in the field. Almost all batteries in commercially available vehicles today can’t be discharged without the use of an external device.

A similar dilemma can arise when shipping a damaged battery pack. How can safe transport be assured? Must the pack be completely discharged? That decision relies on information about what is happening inside the battery pack post accident. Every pack has a battery management system of some description, but they are all different. So it is not yet possible for them to converse with the tools first responders or fire fighters might use.

In almost every case the “quiet before the storm” is undetectable if the battery management system is not communicating. There is no observable smell, smoke, fumes, noise, heat or vibration before thermal runaway sets in. Similarly, there may be no visible distortion on the outside of a battery pack. And in almost every case, the pack is not visible unless the car is on a hoist.

Unlike a gasoline fire where all available fuel ignites and burns at once, a battery pack fire is characterized by a reaction from each runaway cell separated by a few minutes between “pops.” Consequently, a complete burning of the pack takes longer than a gasoline fire.

Widespread R&D is underway to determine how batteries and cells can be made more robust. Researchers are trying to determine whether EVs can be made to handle at least as much abuse as petroleum-fueled vehicles.

Battery designs are tested to establish their limits. Once researchers know these limits, they can evaluate alterations to cell and pack designs. The tests mimic what might be expected to happen to vehicles in various scenarios that arise over their lives. These scenarios include extreme environmental conditions (potholes, bad roads, heat, cold, humidity, corrosive environments, etc.), accidents, equipment failure (chargers, battery management systems, powertrain systems, active safety controls, etc) and handling mishaps (puncture by forklifts, drops from truck floors, high altitudes in air transport, cells or packs shorting to each other, etc).

Researchers have devised tests that replicate these potential abuse conditions as closely as possible. Starting at the individual cell level, the abuse tests include overcharge, overdischarge, short circuit, penetration by a nail, crushing, impact, thermal shock and thermal stability (that is, how hot can a cell be before it malfunctions or runs away thermally).

At the module level (many cells combined in a block) and battery pack level, additional tests examine what happens to the cells when one cell runs away thermally, what happens when the assembly is dropped, how the pack reacts to a gasoline fire beneath it, the effect of a rollover, and what happens when a pack is immersed. These are in addition to the penetration, crush, overcharge and overdischarge tests, which also take place at the pack level.

Over the last three years, TÜV SÜD has conducted hundreds of tests at the cell, module and pack level at its three battery testing locations in Newmarket, Canada; Auburn Hills, Mich.; and Munich, Germany. The samples typically come from several OEMs, battery manufacturers and cell manufacturers.

TÜV SÜD-generated data is helping to answer questions about emerging trends and whether cells and battery packs are becoming more robust. Cells tested there are showing steady improvement in abuse tests such as nail penetration, overcharge and overdischarge. TÜV SÜD is also examining trends emerging for short circuit and crush tests.

It is fair to ask how researchers characterize battery robustness. For the most part, OEMs, and ultimately anyone in the vicinity of the car or battery, want units that at worst just stop functioning or vent gases. Flames, fires, or explosions are unacceptable. Over the last three years we have witnessed a steady rise in the proportion of cells that react acceptably. Thus the evidence is that cells are becoming more robust.

The open question remains: How robust do they need to be? Of necessity, the test methods are a condensed representation of the many variations of abuse that batteries can encounter in the real world. But the correlation between test methods and real-world abuse so far is weak. So passing an abuse test may not be enough evidence that a battery will hold up in a real-world calamity.

Mature products such as electrical consumer appliances and smaller consumer battery cells undergo a standard array of tests used for a “pass-fail” judgment. It has taken many decades to see this kind of rigor in testing. Battery testing for automotive applications has not had the benefit of such an extensive learning period and in fact has only been in serious use for about 15 years. A handful of standards such as SAE J2929, UL 2580, IEC 62133 and a few others have pass-fail criteria currently. The rest are a work in progress.

To put this in perspective, lithium-ion battery development has only been underway for about two decades. Work on large format cells sized from 5 to 40 A-hr for automotive applications has progressed for half that time. (For reference, consumer cells are in the range of 0.5 to about 3 A-hr.)

During the 1980s and 1990s, the most intensive battery development work for vehicles took place on sodium (Na) systems, with test fleets numbering in the low hundreds of vehicles. Na systems had much higher temperature thresholds for runaway conditions (450°C). There were no fires when they failed, just sustained electrical arcs leading to the melting of aluminum and glass. These systems had no organic materials and no plastics internally to burn.

Even so, there were two Na battery-caused overheating incidents with smoke and vehicle damage, caused by ineffective controls and basic weaknesses. Controls today are improved with much better software and hardware technologies. However, sodium-sulfur (NaS) batteries have a fatal weakness of thermal runaway caused by electrolyte defects. This has relegated them to stationary applications such as grid support rather than EVs. Sodium nickel chloride (NaNiCl2) batteries, while similar to NaS, lack this problem and are being produced by Fiamm and GE for applications that need long life, high energy density, and low cost.

All in all, better technology and investments in development have not been able to solve all battery challenges. But the sharp development tools we have available today plus lessons-learned from history may do the job for lithium ion.

Wants and needs

Test results show EV batteries are becoming safer, despite the fact that battery fires still make headlines. In the graph, HSL stands for hazard severity level. It refers to how a sample reacts to the test. An HSL of 0 to 2 is usually considered an acceptable response for a cell to be included in a pack design; in other words, nothing drastic happened in the abuse test. HSL = 3 is borderline and would be accepted only with further abuse testing. An HSL above 3, up to the maximum of 7, would not be acceptable in a normal vehicle.
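That acceptance rule can be sketched in code. Only the thresholds (0 to 2 acceptable, 3 borderline, above 3 unacceptable) come from the text; the function name and verdict wording here are illustrative:

```python
# Hedged sketch of the HSL acceptance rule described above. Only the
# numeric thresholds come from the article; names and strings are ours.
def hsl_verdict(hsl):
    if not 0 <= hsl <= 7:
        raise ValueError("hazard severity level runs from 0 to 7")
    if hsl <= 2:
        return "acceptable"
    if hsl == 3:
        return "borderline: more abuse testing required"
    return "unacceptable for a normal vehicle"

print(hsl_verdict(2))  # acceptable
print(hsl_verdict(5))  # unacceptable for a normal vehicle
```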

The end-user requirements for an EV battery are usually assumed to be the same as those for vehicles powered by internal combustion engines. Thus they may include a consistent vehicle range of several hundred kilometers, power to spare for accelerating to top speed, and an acquisition cost comparable to that of a conventional vehicle. Moreover, EV buyers typically expect to see lower operating costs and no new risks or hazards.

So far, EV makers haven’t been able to supply all these attributes at the beginning of life. It will be even more difficult for them to satisfy such expectations at the end of life. In addition, end-users assume that, as with normal vehicles, you can hand the keys of an EV to anyone holding a driver’s license. The vehicle should handle any weather conditions or terrain and must survive for many years with few breakdowns. The consumer expects all this while ignoring the short five-year history of modern, commercially available PHEVs and EVs – nothing like the one hundred-plus years it has taken for internal combustion-powered vehicles to reach their present maturity. Hopefully, the saying “great things are the result of unreasonable people or consumers” will apply.

However, the situation with EVs is not all doom and gloom. Requirements for EV power are largely already met. EV performance, as measured in time-to-distance, already challenges that of the best ICE vehicles. Examples include the Fisker Karma, Bill Dube’s A123-powered Killacycle, and the Tesla Model S. The “glorified golf cart” label no longer applies. Of course, an EV battery must do more than just supply power. We also need it to accept the high current generated during both regenerative braking and fast charging. There is good evidence that both of these can be accommodated.

Range is still a problem. In most EVs it is typically far less than 200 km on the best day. Extremes in weather, terrain, driving style, and age of the battery pack will drop this figure to well below 100 km. Cost is also a barrier to adoption. Today the battery makes up a far larger fraction of the total vehicle cost than an engine.

We can overcome the range deficiency with a relatively small ICE in a plug-in hybrid electric vehicle (PHEV); as batteries improve, the need for the ICE will diminish. A phrase often heard is that electric vehicles are an energy crisis on wheels. This phrase will likely remain apropos until batteries can store much more usable energy per kilogram and do so more economically.

Thus there are three areas where batteries need significant work before EVs can replace ICE vehicles: energy density (and thus EV range) as measured in W-hr/kgm, cost in $/kW-hr, and improved tolerance to abuse. These are all major challenges requiring sobering levels of investment and a development plan that can easily take a decade to deliver a commercial product.

First consider where improvements in energy density might arise. Today the most energy-dense batteries have a value around 100 W-hr/kgm at the battery pack level and come from one of two electrochemical families, lithium-ion and sodium-based chemistries. Of these, the most predominant is lithium ion. There is a good deal of fine print qualifying the 100 W-hr/kgm value pertaining to how fast energy is extracted, ambient temperature and the ability of the battery to deal with it, battery age, and the degree of balance amongst the battery cells.

What does 100 W-hr/kgm mean to vehicle range? The answer to that question comes from the relationship between vehicle range in electric mode and three parameters: the amount of battery capacity in the vehicle as measured in kilowatt-hours; external influences such as geography, weather, tires, and driving style; and battery energy density in watt-hours-per-kilogram. Many exhaustive digital tools include all these factors and can predict vehicle range. But there is also a simple rule-of-thumb that gives a first-cut range estimation: Range in kilometers, R, equals the battery energy density in W-hr/kgm, E, divided by the combination of external influences, e, and multiplied by the fraction of vehicle mass contributed by the battery, fb. In equation form this is: R = E/e × fb.

e is also called specific weighted energy consumption. Its value ranges from 0.04 for steel-on-steel rail vehicles to as high as 0.40 for high-performance passenger vehicles driven to their potential. Its units are kW-hr/tonne-km, where a tonne = 1 metric ton, or 1,000 kgm.

In a typical passenger car, the body structure including the drive system accounts for about half the total mass, leaving the other half for the payload of driver, passengers, freight and battery system. For a 1,500-kg car, this leaves 750 kg to split between payload and battery system. If we allow 375 kgm for a payload of four people, there is 375 kgm left for the batteries, making the battery mass fraction fb = 375/1,500 = 0.25.

At 100 W-hr/kgm there will then be 37.5 kW-hr of battery on board. For reference, this is about the same as for a Coda Electric sedan. With an aggressive e of 0.2, the equation then yields a range of (100/0.2) × 0.25 = 125 km. This result assumes no battery energy goes into cabin A/C, heating, or other accessories. When more favorable driving conditions are factored in with e = 0.1, range goes to 100/0.1 × 0.25 = 250 km. If we were to discover a much better battery technology that could produce an E of 200 W-hr/kgm, even the aggressive e of 0.2 would still yield a range of 200/0.2 × 0.25 = 250 km. Thus range is directly proportional to the energy density of the battery system. Simply put, make more energy-dense batteries and range improves directly.
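The rule-of-thumb and the worked examples above can be reproduced with a short Python function; the numbers simply restate the text's arithmetic:

```python
# Rule-of-thumb range estimate consistent with the worked examples:
# R = E/e * fb, with E the pack-level energy density in W-hr/kgm,
# e the specific weighted energy consumption, and fb the battery's
# fraction of total vehicle mass.
def range_km(E, e, fb):
    """First-cut electric range estimate in kilometers."""
    return E / e * fb

print(range_km(100, 0.2, 0.25))  # about 125 km: aggressive driving
print(range_km(100, 0.1, 0.25))  # about 250 km: favorable conditions
print(range_km(200, 0.2, 0.25))  # about 250 km: doubled energy density
```

Note that the estimate ignores accessory loads such as cabin heating and A/C, as the text cautions.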

How can the energy density of the battery be improved? This is one of the most important questions in battery development today. Most announcements of improvements in battery performance do not deal with this question. Rather, they trumpet more battery power output or discharge power, faster charging, or the ability to absorb more regenerative power during braking.

In one way or another, announcements touting better battery performance essentially refer to a means of increasing the reactive area of the active material making up electrodes in each cell. The widespread availability of nano-sized and other exotic materials makes it easier to boost active surface area. It can be argued that though these efforts might be interesting, they do not address the persistent issue of range anxiety.

The road to better energy density and performance starts at the cell level. However, energy density can only improve through storage of more energy in a given cell mass or reducing the amount of materials that do not contribute to energy storage. The former boils down to putting more electrochemically active material into the electrodes of the cell. The latter focuses on stripping away mass from every cell component.

The electrochemical challenge seems to have few takers. Two current exceptions are Envia Systems Inc., Newark, Calif., and IBM. Envia Systems has reported improved cathode and anode electrochemistries with a reported potential for 400 W-hr/kgm at the cell level yielding perhaps 200 W-hr/kgm at the pack level. IBM has reported progress on a lithium-air electrochemistry with cell level potential of 1,000 W-hr/kgm.
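A rough sketch of the cell-to-pack derating implied by those figures follows; the 0.5 factor is inferred from the 400-to-200 W-hr/kgm example quoted above, not a universal constant:

```python
# Illustrative cell-to-pack derating. The 0.5 default is inferred from
# the figures in the text (400 W-hr/kgm cells yielding perhaps
# 200 W-hr/kgm packs); real overhead varies with pack design.
def pack_energy_density(cell_wh_per_kg, packaging_factor=0.5):
    """Estimate pack-level energy density from cell-level density."""
    return cell_wh_per_kg * packaging_factor

print(pack_energy_density(400))   # 200.0, matching the Envia projection
print(pack_energy_density(1000))  # 500.0, if lithium-air cells reached packs
```

The packaging factor bundles the mass of enclosure, busbars, cooling, and electronics that store no energy, which is why stripping overhead mass is the second improvement path discussed below.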

Both efforts are encouraging. However, history shows that moving them from laboratory cells to working packs will likely require a multiyear effort supported by resources eventually measured in hundreds of millions of dollars.

The other approach of reducing battery mass while keeping energy capacity constant is possibly more of a challenge. Lithium-ion cells are already a thin-film technology. The critical component thicknesses are measured in a few tens of micrometers. The anode, cathode and separator have all become thinner. As a result, cell-level energy density can now reach 200 W-hr/kgm. With dimensions this small, even airborne contaminants in clean rooms can degrade cell life and perhaps tolerance to abuse. Given what we know today, the low-hanging fruit for thinner materials appears to be behind us.

The preceding comments apply to cylindrical and prismatic cells. Most lithium-ion cell development today for automotive use focuses on large-format soft-case prismatic cells, more commonly known as pouch cells. This type has the lowest percentage of overhead material such as the case, external terminals and connection components and thus has the highest energy density.

Keeping in mind that pouch cells are the subject of most development activity, perhaps an opportunity lies just outside the cell. Conventional practice brings electronic current from the copper anode backing foil and aluminum cathode backing foil out to positive and negative terminals. There is another approach known as the bipolar method. Here, every cell needn’t have external terminals. Instead, cell current passes from cell to cell directly rather than out at right angles to external terminals. The usual approach is to construct modules of cells that are physically joined as a monobloc.

To make this technique work, cells are stacked so that the negative electrode of one cell also serves as the positive electrode of the next. An electronically conducting membrane between these back-to-back electrodes carries the current while preventing ionic flow.

Historically, there have been numerous attempts at this construction style, especially in lead-acid and nickel-metal-hydride cells. Few have been successful. The reward for getting this right is less cell pouch material and fewer components for a given capacity, lower cell ohmic resistance, and a less expensive assembly process. The problem with this approach is that it throws present single-cell production processes into upheaval. Anode and cathode production may continue largely as they now exist, but the process of assembling a monobloc of cells would necessitate a complete retooling of assembly and ancillary operations.

Nevertheless, at some point in time, the benefit of bipolar monobloc lithium-ion may justify the cost of the necessary changes. We wait for that time and its accompanying improvement in vehicle range.

Resources

TÜV SÜD Canada, www.tuvcanada.com
For an excellent overview of applicable standards, recommended practices and transportation requirements for electric vehicles, batteries, charging and charging infrastructure, review the document entitled Electric Vehicle Standards Panel organized by the American National Standards Institute (ANSI), http://www.ansi.org/standards_activities/standards_boards_panels/evsp/overview.aspx?menuid=3

© 2012 Penton Media Inc. 

http://eetweb.com/The_EV_battery_conundrum/