[ Here are a few of the points made in this 170-page document about improving the nation’s water system security (excerpts follow):
- There are many potential threats to water infrastructure, including terrorism, failure of aging infrastructure, flooding, hurricanes, earthquakes, cyber-security breaches, chemical spills, a pandemic causing widespread absenteeism of water treatment employees, and intentional release of chemical, biological, and radiological agents.
- Preventing a terrorist attack on the nation’s water infrastructure may be impossible because of the number and diversity of utilities, the multiple points of vulnerability, the high number of false positives, and the expense of protecting an entire system.
- Drinking water and sewage treatment depend on electricity. In a power outage, water could not be treated or distributed, so natural or deliberately set fires would be hard to put out. Explosive attacks in sewer collection systems could also damage co-located communications infrastructure.
Alice Friedemann, www.energyskeptic.com, author of “When Trucks Stop Running: Energy and the Future of Transportation” (Springer, 2015) and “Crunch! Whole Grain Artisan Chips and Crackers”. Podcasts: KunstlerCast 253, KunstlerCast 278, Peak Prosperity]
NRC. 2007. Improving the Nation’s Water Security: Opportunities for Research. Committee on Water System Security Research, National Research Council, National Academies Press. 170 pages
An attack on the water infrastructure could cause mortality, injury, or sickness; large-scale environmental impacts; and a loss of public confidence in the safety and quality of drinking water supplies.
An important overarching issue that remains unresolved is making water security information accessible to those who might need it. The problem of information sharing in a security context is one of the most difficult the EPA faces. Currently, some important information on priority contaminants and threats that could improve utilities’ response has been classified and cannot be shared with utilities, even through secure dissemination mechanisms.
While contingency plans have existed for decades within the water and wastewater utilities industry to handle power interruptions or natural events such as flooding, new security concerns include disruption of service by physical attack (e.g., explosives), breaches in cyber security, and the intentional release of contaminants (including chemical, biological, and radiological agents).
Both drinking water and wastewater systems are vulnerable to terrorist attack. The consequences of security threats involve potential mortality, injury, or sickness; economic losses; extended periods of service interruption; and a loss of public confidence in the safety and quality of drinking water supplies—a major concern even without a serious public health consequence.
Flushing a drinking water distribution system in response to intentional chemical contamination could transport contaminants to the wastewater system and, unless removed by wastewater treatment, into receiving waters; thus, large-scale environmental impacts could also result from water security events.
Security threats to wastewater systems, while posing a less direct impact on public health, are nevertheless serious concerns. Chemical or microbial agents added in relatively small quantities to a wastewater system could disrupt the treatment process, and a physical attack on a wastewater collection system could create local public health concerns and potentially large-scale environmental impacts.
Wastewater collection systems (e.g., large-diameter sewer mains) may also serve as conduits for malicious attacks via explosives that could cause a large number of injuries and fatalities.
An attack on a wastewater system could also create public health concerns if untreated wastewater were discharged to a river used as a downstream drinking water supply or for recreational purposes (e.g., swimming, fishing).
Infrastructure Interdependencies: Electricity, firefighting, communications, natural disasters, epidemics
Threats to water security also raise concerns regarding cross-sector interdependencies of critical infrastructures. Water utilities are largely dependent upon electric power to treat and distribute water. Likewise, electric power is essential to collect and treat wastewater.
The firefighting ability of municipalities would be seriously weakened without an adequate and uninterrupted supply of water, and intentional fires could be set as part of a terrorist attack to further exacerbate this impact. Explosive attacks in wastewater collection systems could affect other critical co-located infrastructures, such as communications.
Many of the principles used to prepare for and to respond to water security threats are directly applicable to natural hazards. Hurricane Katrina reminded the nation that natural disasters can cause both physical damage and contamination impacts on water and wastewater systems.
Moreover, natural disasters (e.g., earthquakes, floods) and routine system problems (e.g., aging infrastructure, nonintentional contamination events) are far more likely to occur than a terrorist attack.
An epidemic or pandemic illness could also create failures in smaller water or wastewater utilities if supply chains become compromised due to widespread absenteeism or if essential personnel are incapacitated. Thus, threats from intentional attacks are not the only threats to the integrity of the nation’s water systems.
The municipal wastewater industry has over 16,000 plants that are used to treat a total flow on the order of 32,000 million gallons per day (Mgal/d). More than 92% of the total existing flow is handled by about 3,000 treatment plants that each have a treatment capacity of 1 Mgal/d or greater, although more than 6,000 plants treat a flow of 100,000 gallons per day or less. Nearly all of the wastewater treatment plants provide some form of secondary treatment and more than half provide some form of advanced treatment using a diversity of treatment processes and configurations. Thus, crafting a wastewater security research strategy that is suitable for all wastewater treatment plants is difficult.
Protecting a very large number of utilities against the consequences of the wide range of possible threats is a daunting, perhaps impossible, task. The development of a workable security system to prevent physical attacks against commercial airline flights is difficult and is still a work in progress, and the comparable problem for water systems is vastly more complex. Security technologies for one type of system might not work for another, and many systems might require custom designs. Further, no systems are immune from concern about an attack. A chemical or biological attack on a system that serves only a few thousand people would still be significant in terms of loss of life, economic damage, or the amount of fear and loss of confidence it would cause. In addition, smaller systems tend to be less protected and more vulnerable to a malicious attack. Approximately 160,000 drinking water systems and 16,000 wastewater systems operate simultaneously 24 hours a day, 7 days a week, with the largest systems each servicing millions of customers, and each is capable of being attacked by many different means requiring different methods of prevention. Expecting utilities to harden water and wastewater infrastructure to eliminate all vulnerabilities is unreasonable. The costs of security for the industry would be borne by the end users, and these users may not be willing to bear the costs of developing and implementing technologies that could prevent even a limited range of terrorist attacks over the entire nation’s water and wastewater systems.
Clearly, the earlier a contaminant is detected, the greater the likelihood that its public health impact can be reduced. Thus, an initial research interest has focused on developing early detection systems for chemical or biological agents that might intentionally be introduced into water or wastewater. Any such effort, however, will have to overcome some significant challenges to fashion advanced technologies into a workable system, considering the challenge of the number and diversity of water and wastewater systems and potential contaminants.
Detecting intruders and chemicals: too many false positive alarms
Let us assume, for example, a very high rate of one such intentional attack per year among the largest 10,000 drinking water systems. To detect such an attack, sensors would have to be placed throughout the systems and take frequent measurements. If a generic intrusion detector samples once every 10 minutes and there are on average 20 detectors per system (a reasonable assumption for one of the 10,000 largest systems, although one might expect more for a very large system and fewer for a very small system), this adds up to roughly a million sampling intervals per system per year. Assuming a false positive rate of one in 10 million measurements (an extraordinarily small rate if also maximizing sensitivity), this would still produce about 1,000 false positives per year among these 10,000 water systems. Since only one true positive (one actual attack) per year is expected against those 1,000 false alarms, almost every time the alarm goes off (99.9 percent of the time) it is a false positive. As a result, operators are likely to disconnect, ignore, or simply choose not to install the detection system. If detectors are ignored or not maintained, they cannot practically serve their purpose, whether to prevent, warn, or treat.
The problem is compounded when considering the installation of detectors for each of a large number of potential biothreat agents. Meinhardt published a table of 28 selected agents in 8 broad categories identified by multiple governmental, military, and medical sources as possible biowarfare agents that might present a public health threat if dispersed by water. Assuming success in constructing a 100% sensitive and extremely specific detector for the eight broad agent categories (e.g., viral pathogen, marine biotoxin) and assuming each broad category has an equal probability of being employed in an attack, the probability of a true alarm is reduced by almost another order of magnitude. In other words, the additional analysis of multiple categories of agents requires an order-of-magnitude reduction in the false positive rate of a detector just to get back to the unsatisfactory baseline of a system for a generic intrusion detector. The fundamental problem relates to the rarity of an attack on any particular system. Detectors can be made with high sensitivity and specificity (low false positive and false negative rates), but when applied in situations where the event to be detected is uncommon, the predictive value of an alarm can be very small.
A false positive alarm every few years might conceivably be acceptable to some communities that consider themselves high-risk targets, assuming there is an agreed-upon response plan in place for a positive signal.
(The calculations were conducted as follows: 10,000 water systems × 20 detectors/system × 6 measurements/detector/hour × 8,760 hours/year = 10,512,000,000 measurements/year across all 10,000 systems. Given the assumptions in this scenario of a false positive rate of one in 10 million measurements and an attack rate of one per year across the 10,000 drinking water systems, there will be approximately 1,000 false positives and only one true positive (one attack) per year.)
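To make the arithmetic concrete, here is a minimal sketch in Python that reproduces the committee’s numbers under the assumptions stated above. The eight-category extension at the end is only an illustration of the Meinhardt argument (assuming one equally specific detector per broad agent category), not a calculation taken from the report.

```python
# Sketch of the committee's false-positive arithmetic, using the assumptions
# stated above: 10,000 systems, 20 detectors/system, one sample per 10 minutes,
# a 1-in-10-million false positive rate, and one real attack per year.

N_SYSTEMS = 10_000          # largest drinking water systems
DETECTORS_PER_SYSTEM = 20   # average generic intrusion detectors per system
SAMPLES_PER_HOUR = 6        # one measurement every 10 minutes
HOURS_PER_YEAR = 8_760
FP_RATE = 1 / 10_000_000    # false positives per measurement
ATTACKS_PER_YEAR = 1        # assumed true positives across all 10,000 systems

measurements = N_SYSTEMS * DETECTORS_PER_SYSTEM * SAMPLES_PER_HOUR * HOURS_PER_YEAR
false_positives = measurements * FP_RATE
ppv = ATTACKS_PER_YEAR / (ATTACKS_PER_YEAR + false_positives)

print(f"measurements/year:     {measurements:,.0f}")   # 10,512,000,000
print(f"false alarms/year:     {false_positives:,.0f}") # ~1,051
print(f"share of alarms false: {1 - ppv:.1%}")          # ~99.9%

# Illustration (an assumption, not from the report): if a separate, equally
# specific detector is fielded for each of the 8 broad agent categories, alarm
# opportunities grow ~8x while true attacks do not, so the predictive value of
# any single alarm drops by almost another order of magnitude.
false_positives_8 = false_positives * 8
ppv_8 = ATTACKS_PER_YEAR / (ATTACKS_PER_YEAR + false_positives_8)
print(f"PPV with 8 agent categories: {ppv_8:.2%}")       # ~0.01%
```

Run as written, the sketch prints roughly 1,000 false alarms against a single true positive per year, i.e., a predictive value of about 0.1 percent for any individual alarm.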
Improved event detection architecture could possibly reduce the number of false positives. In this approach, a water system would install an array of sensors linked in a way that only triggers an alarm when a statistically significant number of sensors detect abnormal levels. This should reduce or eliminate the false positives caused by independent sensor malfunctions, but it would also increase the false negative rate (i.e., decrease sensitivity) and the cost of the detection system. The cost of purchasing and maintaining such detection instruments over a period of years needs to be considered in evaluating the likelihood of implementation.
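The report does not describe a specific aggregation scheme. As one hedged illustration, the sketch below assumes a simple “k-of-n voting” rule over co-located sensors with independent malfunctions; the 5-sensor array and the per-sensor false positive probability of 1e-4 are hypothetical numbers chosen only to show how quickly requiring agreement suppresses false alarms.

```python
from math import comb

def k_of_n_false_alarm_prob(n: int, k: int, p_fp: float) -> float:
    """Probability that at least k of n independent sensors false-alarm in the same window."""
    return sum(comb(n, i) * p_fp**i * (1 - p_fp)**(n - i) for i in range(k, n + 1))

# Hypothetical illustration: 5 co-located sensors, each with a per-window false
# positive probability of 1e-4, and an alarm raised only when k of them agree.
single = 1e-4
print(f"1 of 5 sensors: {k_of_n_false_alarm_prob(5, 1, single):.2e}")  # ~5.0e-04
print(f"2 of 5 sensors: {k_of_n_false_alarm_prob(5, 2, single):.2e}")  # ~1.0e-07
print(f"3 of 5 sensors: {k_of_n_false_alarm_prob(5, 3, single):.2e}")  # ~1.0e-11
```

The same rule also raises the false negative rate: a real contamination event seen clearly by only one sensor in the array would no longer trigger an alarm, which is the sensitivity cost noted above.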
Disease surveillance systems have been proposed as another method to detect a drinking water contamination event. The detection of a water-related event using a human-disease-based surveillance system with an appropriate epidemiologic follow-up investigation is insensitive to any but the largest outbreak events and would occur too late to prevent illness. However, disease surveillance systems could be used to mitigate further exposure and implement treatment or prophylaxis (detect to treat), especially if linked to contaminant monitoring systems. The problems associated with in situ detection systems, discussed in the previous section, apply with even more force to disease surveillance systems designed to detect specific syndromes related to bioterror agents, because disease surveillance systems have only modest sensitivities and specificities. The body’s immune system reacts generically to many insults, which is why the same nonspecific “flu-like” symptoms are seen in so many different diseases at first presentation. The implementation of enhanced disease surveillance systems is costly and has inherent false positive and false negative rates. For example, not every case of waterborne disease will eventually be diagnosed as such. Therefore it has been argued that the benefits of such enhanced systems may not outweigh the costs in the general case. Public health researchers have argued that “it is challenging to develop sensible response protocols for syndromic surveillance systems because the likelihood of false alarms is so high, and because information is currently not specific enough to enable more timely outbreak detection or disease control activities” (Berger et al., 2006).
The EPA faces risks in providing water security information and risks in withholding it, and there is no easy solution to a problem that involves risks on both sides. As an example, if research were to find an unforeseen but easy way to contaminate a system, this information might change how utilities protect themselves and improve their ability to recognize that an attack has taken place. At the same time, this information can be used for malicious purposes. As a result, there is a delicate balance between alerting a significant number of water operators of a danger, while minimizing the potential for suggesting a route of attack to a malefactor.
Preventing a terrorist attack on the nation’s water infrastructure may be impossible because of the number and diversity of utilities, the multiple points of vulnerability, and the expense of protecting an entire system.
Overall, the EPA efforts in physical and cyber security are limited in scope, reflecting the relatively low priority of the topic to the EPA. The committee is concerned that the potential seriousness of physical attacks on a drinking water system is being overlooked, and therefore, contingencies and recovery options for physical attacks are not being addressed adequately in the research agenda. The lack of in-house expertise on the topics of physical and cyber security further limits the EPA’s ability to take a leadership role in this area, because contract management alone offers limited guidance and oversight to the work being performed.
Two classified reports have been developed that are related to, but not directly associated with, Section 3.2 of the Action Plan: the Threat Scenarios for Buildings and Water Systems Report and the Wastewater Baseline Threat Document. The first report, as described previously in this chapter, ranked the most likely contamination threats to drinking water.
Disaggregation of large water and wastewater systems should be an overarching theme of innovation. Large and complex systems have developed in the United States following the pattern of urban and suburban sprawl. While there are clear economies of scale for large utilities in construction and system management, there are distinct disadvantages as well. The complexity of large systems makes security measures difficult to implement and complicates the response to an attack. For example, locating the source of an incursion within the distribution system and isolating contaminated sections are more difficult in large and complex water systems. Long water residence times are also more likely to occur in large drinking water systems, and, as a result, disinfectant residual may be lacking in the extremities of the system because of the chemical and biological reactions that occur during transport. From a security perspective, inadequate disinfectant residual means less protection against intentional contamination by a microbial agent.